Open coderphonui opened 3 days ago
Adding

```java
.withSchemaType(FunctionCallbackWrapper.Builder.SchemaType.OPEN_API_SCHEMA)
```

will fix the issue. The final code should look like this:

```java
@Bean
public FunctionCallback weatherFunctionInfo() {
    return FunctionCallbackWrapper.builder(new MockWeatherService())
            .withName("weatherFunction")
            .withDescription("Get the weather in location")
            .withSchemaType(FunctionCallbackWrapper.Builder.SchemaType.OPEN_API_SCHEMA)
            .build();
}
```
I see this is mentioned in the documentation: https://docs.spring.io/spring-ai/reference/api/chat/functions/vertexai-gemini-chat-functions.html
However, I am not satisfied with this solution. In my opinion, FunctionCallbackWrapper should be abstract enough to handle schema compatibility for each model.
Let me map this to my scenario:
I am building an agent system that lets users configure agents with a prompt and an LLM. I also built some built-in functions, which I load into Spring AI using FunctionCallbackWrapper. When implementing these built-in functions, I don't know which LLM the user will configure for the agents that use them. Fortunately, OPEN_API_SCHEMA covers most cases today, but what happens if another LLM (in the future) requires only JSON_SCHEMA?
So I would suggest removing withSchemaType from the builder; the core logic of Spring AI should handle this compatibility itself.
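To make the suggestion concrete, here is a minimal, self-contained sketch of how the framework could resolve the schema flavor from the provider internally, so callers never pass `withSchemaType(...)` themselves. All names here (`SchemaTypeResolver`, the provider keys, the mapping) are hypothetical illustrations, not Spring AI API; only the `SchemaType` values mirror `FunctionCallbackWrapper.Builder.SchemaType`.

```java
import java.util.Map;

// Hypothetical sketch: NOT Spring AI API. It shows a provider -> schema-type
// lookup the framework could own, instead of pushing the choice onto users.
public class SchemaTypeResolver {

    // Mirrors FunctionCallbackWrapper.Builder.SchemaType in Spring AI
    public enum SchemaType { JSON_SCHEMA, OPEN_API_SCHEMA }

    // Each ChatModel implementation would register the schema flavor it
    // expects; the mapping below is an illustrative assumption.
    private static final Map<String, SchemaType> PROVIDER_SCHEMAS = Map.of(
            "vertex-ai", SchemaType.OPEN_API_SCHEMA,
            "openai", SchemaType.JSON_SCHEMA);

    public static SchemaType resolve(String providerName) {
        // Fall back to JSON Schema, the more common flavor
        return PROVIDER_SCHEMAS.getOrDefault(providerName, SchemaType.JSON_SCHEMA);
    }

    public static void main(String[] args) {
        System.out.println(resolve("vertex-ai")); // OPEN_API_SCHEMA
        System.out.println(resolve("unknown"));   // JSON_SCHEMA
    }
}
```

With something like this inside the framework, a function bean would stay provider-agnostic and the schema would be chosen when the model is wired up.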
For now, I ended up with a solution that wraps the function definition as below:

```java
import java.util.function.Function;

import org.springframework.ai.model.function.FunctionCallbackWrapper;

import lombok.AllArgsConstructor;
import lombok.Builder;
import lombok.Data;

@Data
@Builder
@AllArgsConstructor
public class FunctionDefinition<I, O> {

    private String name;
    private String description;
    private Function<I, O> function;

    // Defer the schema-type choice until the target ChatModel is known
    public FunctionCallbackWrapper<I, O> toFunctionCallbackWrapper(FunctionCallbackWrapper.Builder.SchemaType schemaType) {
        return FunctionCallbackWrapper.builder(function)
                .withName(name)
                .withDescription(description)
                .withSchemaType(schemaType)
                .build();
    }
}
```
I added a hook to build the ChatModel for each provider I need to support, rebuilding the FunctionCallback list before passing it to the model constructor. Below is an example:
```java
@Slf4j
@Component
public class VertexGeminiChatClientBuilder implements ChatClientBuilder {

    private final VertexAI vertexAi;
    private final List<FunctionCallback> toolFunctionCallbacks;
    private final ApplicationContext context;
    private final List<FunctionDefinition> functionDefinitions;

    public VertexGeminiChatClientBuilder(VertexAI vertexAi, List<FunctionCallback> toolFunctionCallbacks,
            ApplicationContext context, List<FunctionDefinition> functionDefinitions) {
        this.vertexAi = vertexAi;
        this.toolFunctionCallbacks = toolFunctionCallbacks;
        this.context = context;
        this.functionDefinitions = functionDefinitions;
    }

    @Override
    public String getProviderName() {
        return "vertex-ai";
    }

    @Override
    public ChatModel buildChatModel(AgentConfig agentConfig) {
        if (agentConfig == null || agentConfig.getLlmConfig() == null) {
            return null;
        }
        FunctionCallbackContext functionCallbackContext = this.springAiFunctionManager(context);
        List<FunctionCallback> clonedToolFunctionCallbacks = new ArrayList<>(toolFunctionCallbacks);
        if (functionDefinitions != null) {
            // Bind the schema type this model expects, at build time
            functionDefinitions.forEach(definition -> clonedToolFunctionCallbacks.add(
                    definition.toFunctionCallbackWrapper(FunctionCallbackWrapper.Builder.SchemaType.OPEN_API_SCHEMA)));
        }
        return new VertexAiGeminiChatModel(vertexAi, buildVertexGeminiChatOptions(agentConfig),
                functionCallbackContext, clonedToolFunctionCallbacks);
    }

    @Override
    public ChatOptions buildChatOptions(AgentConfig agentConfig) {
        return buildVertexGeminiChatOptions(agentConfig);
    }

    private VertexAiGeminiChatOptions buildVertexGeminiChatOptions(AgentConfig agentConfig) {
        if (agentConfig == null) {
            return null;
        }
        LLMConfig modelConfig = agentConfig.getLlmConfig();
        if (modelConfig == null) {
            return null;
        }
        if (modelConfig.getModelName() == null) {
            modelConfig.setModelName(VertexAiGeminiChatModel.ChatModel.GEMINI_1_5_FLASH.getValue());
        }
        VertexAiGeminiChatOptions.Builder builder = VertexAiGeminiChatOptions.builder()
                .withModel(modelConfig.getModelName())
                .withMaxOutputTokens(modelConfig.getMaxTokens())
                .withTemperature(modelConfig.getTemperature());
        if (!agentConfig.getTools().isEmpty()) {
            agentConfig.getTools().forEach(builder::withFunction);
        }
        return builder.build();
    }

    private FunctionCallbackContext springAiFunctionManager(ApplicationContext context) {
        FunctionCallbackContext manager = new FunctionCallbackContext();
        manager.setSchemaType(FunctionCallbackWrapper.Builder.SchemaType.OPEN_API_SCHEMA);
        manager.setApplicationContext(context);
        return manager;
    }
}
```
With this approach, the SchemaType is bound exactly where it is needed: when the ChatModel is initialized.
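For context, the builders above are looked up per provider at runtime. The following is a minimal, self-contained sketch of that lookup (the `BuilderRegistry` class and its nested `ChatClientBuilder` interface are simplified stand-ins I wrote for illustration, not Spring AI types):

```java
import java.util.List;
import java.util.Optional;

// Hypothetical sketch: resolve the right per-provider builder from a list,
// the way Spring can inject all ChatClientBuilder beans as a List.
public class BuilderRegistry {

    // Simplified stand-in for the ChatClientBuilder interface in the post
    public interface ChatClientBuilder {
        String getProviderName();
    }

    private final List<ChatClientBuilder> builders;

    public BuilderRegistry(List<ChatClientBuilder> builders) {
        this.builders = builders;
    }

    // Pick the builder whose provider name matches the agent's LLM config
    public Optional<ChatClientBuilder> forProvider(String provider) {
        return builders.stream()
                .filter(b -> b.getProviderName().equals(provider))
                .findFirst();
    }
}
```

Each builder (such as the VertexGeminiChatClientBuilder above, which reports "vertex-ai") is matched against the provider name the user configured on the agent.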
Bug description
I am using the snapshot build from the master branch, and I got this error when trying to make a function call using Vertex Gemini.
Full error log:
Environment
Spring AI snapshot version built from master branch.
Steps to reproduce
Code to reproduce:
Test case
Configuration
MockWeatherService
Expected behavior
The function call should work normally with the FunctionCallbackWrapper approach.