spring-projects / spring-ai

An Application Framework for AI Engineering
https://docs.spring.io/spring-ai/reference/index.html
Apache License 2.0

Functions specified in Chat Options are not returned by the getToolCalls() method #1362

Open iAMSagar44 opened 1 month ago

iAMSagar44 commented 1 month ago

Bug description
I have two functions specified in Chat Options. I can see from the logs that these functions are being invoked by the LLM, but when I try to retrieve the function details using the getToolCalls() method, they are not returned (even though the functions have been invoked by the LLM).

Code snippet -

    Prompt prompt = new Prompt(List.of(systemMessage, userMessage),
            OpenAiChatOptions.builder().withTemperature(0.7f)
                    .withModel("gpt-4o")
                    .withFunction("findPapers")
                    .withFunction("summarizePaper")
                    .withParallelToolCalls(false)
                    .build());

    Flux<ChatResponse> chatResponseStream = chatModel.stream(prompt);

    chatResponseStream.map(response -> response.getResult().getOutput().getToolCalls())
            .doOnNext(toolCalls -> {
                // Returns an empty list even when the function has actually been invoked.
                logger.info("Tool calls: {}", toolCalls);
            })
            .onErrorContinue((e, o) -> logger.error("Error occurred while processing chat response", e))
            .subscribe();
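One way to narrow this down (a diagnostic sketch, not part of the original report; it reuses the chatModel, prompt, and logger from the snippet above) is to log the finish reason of each streamed chunk alongside its tool calls, to see whether a tool-call chunk ever surfaces in the stream:

    chatModel.stream(prompt)
            .doOnNext(response -> {
                // Some streamed chunks may carry no generation or metadata yet, so guard against null.
                var result = response.getResult();
                if (result != null && result.getMetadata() != null) {
                    logger.info("finishReason={}, toolCalls={}",
                            result.getMetadata().getFinishReason(),
                            result.getOutput().getToolCalls());
                }
            })
            .blockLast();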

Environment
Spring AI version - 1.0.0-M2

Steps to reproduce
Please see the sample code above.

Expected behavior
Expecting the getToolCalls() method to return the list of functions invoked by the LLM.
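It may also be worth checking whether the behavior differs on the blocking path (again a sketch reusing the prompt, chatModel, and logger above), since a single ChatResponse is easier to inspect than a stream:

    // Non-streaming comparison: does the synchronous path surface tool calls?
    ChatResponse response = chatModel.call(prompt);
    logger.info("Blocking-call tool calls: {}",
            response.getResult().getOutput().getToolCalls());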

JogoShugh commented 1 week ago

How did you find logs to show you that? I am having the same problem. A few months ago, I was auto-generating JSON schemas for some Kotlin data classes and populating them into my system message, but I figured I would try the function support that is baked into Spring AI and auto-generates the schema as well. I'm seeing the same issue as you and am trying to troubleshoot it now.
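For reference, one way invocations can show up in logs is to log inside the registered function bean itself, so every call the model makes is visible. A minimal sketch, assuming the @Bean/@Description registration style from the Spring AI reference docs; the findPapers request/response types here are hypothetical stand-ins, not from the original report:

    import java.util.List;
    import java.util.function.Function;

    import org.slf4j.Logger;
    import org.slf4j.LoggerFactory;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.context.annotation.Description;

    @Configuration
    class ToolConfig {

        private static final Logger logger = LoggerFactory.getLogger(ToolConfig.class);

        // Hypothetical request/response types for illustration only.
        record FindPapersRequest(String topic) {}
        record FindPapersResponse(List<String> titles) {}

        @Bean
        @Description("Finds research papers for a given topic")
        Function<FindPapersRequest, FindPapersResponse> findPapers() {
            return request -> {
                // This line appears in the logs whenever the LLM invokes the tool.
                logger.info("findPapers invoked with topic={}", request.topic());
                return new FindPapersResponse(List.of());
            };
        }
    }

With this in place, a log line from inside the function confirms the invocation even when getToolCalls() on the final response comes back empty.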