Skip to content

feat: Implementing minimal feature scope for Spring AI integration in OpenAI #526

New issue

Have a question about this project? Sign up for a free GitHub account to open an issue and contact its maintainers and the community.

By clicking “Sign up for GitHub”, you agree to our terms of service and privacy statement. We’ll occasionally send you account related emails.

Already on GitHub? Sign in to your account

Open
wants to merge 58 commits into
base: main
Choose a base branch
from
Open
Show file tree
Hide file tree
Changes from all commits
Commits
Show all changes
58 commits
Select commit Hold shift + click to select a range
9ff262b
first draft
Jonas-Isr Jun 16, 2025
b14d70c
Align with docs
Jonas-Isr Jun 17, 2025
b32b999
Codestyle
Jonas-Isr Jun 17, 2025
32d1bb2
Merge branch 'main' into agent-workflow-examples
Jonas-Isr Jun 17, 2025
5720581
Merge branch 'main' into agent-workflow-examples
Jonas-Isr Jun 19, 2025
64c8b0e
Merge branch 'main' into agent-workflow-examples
Jonas-Isr Jun 24, 2025
98ab1af
Merge branch 'main' into agent-workflow-examples
CharlesDuboisSAP Jul 3, 2025
0e59667
feat: [OpenAI] Spring AI integration
CharlesDuboisSAP Jul 3, 2025
71898c5
test
CharlesDuboisSAP Jul 3, 2025
b056414
Merge branch 'agent-workflow-examples' into openai-springai
CharlesDuboisSAP Jul 3, 2025
025eba1
test
CharlesDuboisSAP Jul 4, 2025
d402c64
Merge branch 'agent-workflow-examples' into openai-springai
CharlesDuboisSAP Jul 4, 2025
7ed6baa
Merge branch 'main' into openai-springai
CharlesDuboisSAP Jul 24, 2025
3b0273b
Fixing errors in OpenAiChatOptions according to "Upgrade to Spring AI…
n-o-u-r-h-a-n Jul 25, 2025
8c2a638
Fixing errors in OpenAiChatOptions according to "Upgrade to Spring AI…
n-o-u-r-h-a-n Jul 25, 2025
8eb5459
Fixing SpringAiAgenticWorkflowService according to "Upgrade to Spring…
n-o-u-r-h-a-n Jul 25, 2025
ab1f795
Implementation of completion and streamChatCompletion in SpringAiOpen…
n-o-u-r-h-a-n Jul 29, 2025
9d4bdcd
Removing a comment
n-o-u-r-h-a-n Jul 29, 2025
72198d9
Removing a comment
n-o-u-r-h-a-n Jul 31, 2025
88904c8
Chat Memory test working.
n-o-u-r-h-a-n Aug 1, 2025
47d1bc0
Formatting for SpringAiOpenAiService.
n-o-u-r-h-a-n Aug 3, 2025
443e464
Removing unneccessary imports in SpringAiOpenAiService.
n-o-u-r-h-a-n Aug 3, 2025
00ba4ad
Updating the toOpenAiRequest in OpenAiChatModel.java.
n-o-u-r-h-a-n Aug 3, 2025
55c85d9
Implementing the new approach
n-o-u-r-h-a-n Aug 4, 2025
56cda93
Editing the approach.
n-o-u-r-h-a-n Aug 5, 2025
0183c83
Fix compilation and format and annotations and javadoc
newtork Aug 5, 2025
02d5f54
Merge remote-tracking branch 'origin/main' into chatcompletion-for-sp…
newtork Aug 5, 2025
ee94941
Remove unrelated code
newtork Aug 5, 2025
dec85d6
implementation hint
newtork Aug 5, 2025
d99dc0e
Formatting
bot-sdk-js Aug 5, 2025
c073c8d
Updating OpenAiChatOptions.java with our Config Object.
n-o-u-r-h-a-n Aug 6, 2025
6ed3a13
Formatting
bot-sdk-js Aug 6, 2025
9f4a403
Passing our Config Object as an input parameter for OpenAiChatOptions()
n-o-u-r-h-a-n Aug 6, 2025
687fe2e
Fixing NullPointerException in toOpenAiRequest method for ToolCallng …
n-o-u-r-h-a-n Aug 6, 2025
efa3831
Adding topK for the Config Class ??
n-o-u-r-h-a-n Aug 6, 2025
817d3cb
Formatting
bot-sdk-js Aug 6, 2025
ad4241a
Update foundation-models/openai/src/main/java/com/sap/ai/sdk/foundati…
n-o-u-r-h-a-n Aug 7, 2025
203eba6
Failing Test of testToolCallingWithoutExecution() in SpringAiOpenAiTe…
n-o-u-r-h-a-n Aug 7, 2025
f7c7ece
Resolving Reviewed Issues.
n-o-u-r-h-a-n Aug 7, 2025
84546c0
Resolving Reviewed Issues.
n-o-u-r-h-a-n Aug 7, 2025
c52720e
--> still having testToolCallingWithoutExecution() in SpringAiOpenAiT…
n-o-u-r-h-a-n Aug 7, 2025
cd3501c
format
n-o-u-r-h-a-n Aug 8, 2025
7f447d7
format
n-o-u-r-h-a-n Aug 8, 2025
9e1760c
Removing wild cards imports
n-o-u-r-h-a-n Aug 8, 2025
222924c
Sucessful build of OpenAi
n-o-u-r-h-a-n Aug 8, 2025
de9ef56
Sucessful build of Spring Boot app.
n-o-u-r-h-a-n Aug 8, 2025
9fb6321
Merge branch 'main' into chatcompletion-for-springopenai
n-o-u-r-h-a-n Aug 8, 2025
9284fe3
Formatting
bot-sdk-js Aug 8, 2025
8d40dfa
Removing this test for now.
n-o-u-r-h-a-n Aug 8, 2025
be51dd3
Fix nullcheck
newtork Aug 8, 2025
d0ea158
Merge remote-tracking branch 'origin/chatcompletion-for-springopenai'…
newtork Aug 8, 2025
3c161c0
Fix unit test
newtork Aug 8, 2025
35d89cb
Merge branch 'main' into chatcompletion-for-springopenai
n-o-u-r-h-a-n Aug 11, 2025
c186aea
chore: Reduce constructor visibility in OpenAI / SpringAI PR (#531)
newtork Aug 12, 2025
75a2361
Replacing config.toolsExecutable with getter-usage + adding tolerate …
n-o-u-r-h-a-n Aug 12, 2025
ddcb88c
Merge remote-tracking branch 'origin/chatcompletion-for-springopenai'…
n-o-u-r-h-a-n Aug 12, 2025
9f27cb4
Merge branch 'main' into chatcompletion-for-springopenai
n-o-u-r-h-a-n Aug 13, 2025
3e3d41a
Merge branch 'main' into chatcompletion-for-springopenai
n-o-u-r-h-a-n Aug 13, 2025
File filter

Filter by extension

Filter by extension


Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
4 changes: 2 additions & 2 deletions foundation-models/openai/pom.xml
Original file line number Diff line number Diff line change
Expand Up @@ -38,11 +38,11 @@
</scm>
<properties>
<project.rootdir>${project.basedir}/../../</project.rootdir>
<coverage.complexity>72%</coverage.complexity>
<coverage.complexity>70%</coverage.complexity>
<coverage.line>80%</coverage.line>
<coverage.instruction>76%</coverage.instruction>
<coverage.branch>70%</coverage.branch>
<coverage.method>83%</coverage.method>
<coverage.method>75%</coverage.method>
<coverage.class>84%</coverage.class>
</properties>
<dependencies>
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -8,6 +8,7 @@
import com.sap.ai.sdk.foundationmodels.openai.generated.model.ChatCompletionRequestAssistantMessage;
import com.sap.ai.sdk.foundationmodels.openai.generated.model.ChatCompletionRequestAssistantMessageContent;
import com.sap.ai.sdk.foundationmodels.openai.generated.model.ToolCallType;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import javax.annotation.Nonnull;
Expand Down Expand Up @@ -52,6 +53,21 @@ public class OpenAiAssistantMessage implements OpenAiMessage {
@Nonnull
List<OpenAiToolCall> toolCalls;

/**
* Creates a new assistant message with the given content and additional tool calls.
*
* @param toolCalls the additional tool calls to associate with the message.
* @return a new assistant message with the given content and additional tool calls.
* @since 1.10.0
*/
@Nonnull
public OpenAiAssistantMessage withToolCalls(
@Nonnull final List<? extends OpenAiToolCall> toolCalls) {
final List<OpenAiToolCall> newToolCalls = new ArrayList<>(this.toolCalls);
newToolCalls.addAll(toolCalls);
return new OpenAiAssistantMessage(content, newToolCalls);
}

/**
* Creates a new assistant message with the given single message as text content.
*
Expand Down
Original file line number Diff line number Diff line change
@@ -0,0 +1,130 @@
package com.sap.ai.sdk.foundationmodels.openai;

import com.sap.ai.sdk.foundationmodels.openai.generated.model.ChatCompletionStreamOptions;
import com.sap.ai.sdk.foundationmodels.openai.generated.model.ChatCompletionTool;
import com.sap.ai.sdk.foundationmodels.openai.generated.model.CreateChatCompletionRequestAllOfResponseFormat;
import java.math.BigDecimal;
import java.util.List;
import java.util.Map;
import javax.annotation.Nullable;
import lombok.AccessLevel;
import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.With;

/** Configuration for OpenAI chat completion requests. */
@With
@NoArgsConstructor
@AllArgsConstructor
@Getter
public class OpenAiChatCompletionConfig {

/** Upto 4 Stop sequences to interrupts token generation and returns a response without them. */
@Nullable List<String> stop;

/**
* Controls the randomness of the completion.
*
* <p>Lower values (e.g. 0.0) make the model more deterministic and repetitive, while higher
* values (e.g. 1.0) make the model more random and creative.
*/
@Nullable BigDecimal temperature;

/**
* Controls the cumulative probability threshold used for nucleus sampling. Alternative to {@link
* #temperature}.
*
* <p>Lower values (e.g. 0.1) limit the model to consider only the smallest set of tokens whose
* combined probabilities add up to at least 10% of the total.
*/
@Nullable BigDecimal topP;

/**
* Controls the number of top tokens to consider for sampling.
*
* <p>Higher values (e.g. 50) allow the model to consider more tokens, while lower values (e.g. 1)
* restrict it to the most probable token.
*/
@Nullable Integer topK;

/** Maximum number of tokens that can be generated for the completion. */
@Nullable Integer maxTokens;

/**
* Maximum number of tokens that can be generated for the completion, including consumed reasoning
* tokens. This field supersedes {@link #maxTokens} and should be used with newer models.
*/
@Nullable Integer maxCompletionTokens;

/**
* Encourage new topic by penalising token based on their presence in the completion.
*
* <p>Value should be in range [-2, 2].
*/
@Nullable BigDecimal presencePenalty;

/**
* Encourage new topic by penalising tokens based on their frequency in the completion.
*
* <p>Value should be in range [-2, 2].
*/
@Nullable BigDecimal frequencyPenalty;

/**
* A map that adjusts the likelihood of specified tokens by adding a bias value (between -100 and
* 100) to the logits before sampling. Extreme values can effectively ban or enforce the selection
* of tokens.
*/
@Nullable Map<String, Integer> logitBias;

/**
* Unique identifier for the end-user making the request. This can help with monitoring and abuse
* detection.
*/
@Nullable String user;

/** Whether to include log probabilities in the response. */
@Nullable Boolean logprobs;

/**
* Number of top log probabilities to return for each token. An integer between 0 and 20. This is
* only relevant if {@code logprobs} is enabled.
*/
@Nullable Integer topLogprobs;

/** Number of completions to generate. */
@Nullable Integer n;

/** Whether to allow parallel tool calls. */
@Nullable Boolean parallelToolCalls;

/** Seed for random number generation. */
@Nullable Integer seed;

/** Options for streaming the completion response. */
@Nullable ChatCompletionStreamOptions streamOptions;

/** Response format for the completion. */
@Nullable CreateChatCompletionRequestAllOfResponseFormat responseFormat;

/**
* Tools the model may invoke during chat completion (metadata only).
*
* <p>Use {@link #withToolsExecutable} for registering executable tools.
*/
@Nullable List<ChatCompletionTool> tools;

/**
* Tools the model may invoke during chat completion that are also executable at application
* runtime.
*
* @since 1.8.0
*/
@Getter(value = AccessLevel.PACKAGE)
@Nullable
List<OpenAiTool> toolsExecutable;

/** Option to control which tool is invoked by the model. */
@Nullable OpenAiToolChoice toolChoice;
}
Loading