Does the Ollama API not support function calling at the moment, or did I miss an update?
Ollama doesn't support function calling at the moment, but there are a few related feature requests to introduce it, tracked in https://github.com/ollama/ollama/issues/4386
Ok, let's see when they'll support functions :crossed_fingers:
Hello, Mistral 0.3 supports function calling with Ollama:
Ollama now supports function calling, and we need it! (reference)
That one is a custom wrapper implementing a workaround where "tool" messages are handled as "assistant" messages, tricking Ollama into accepting them (see here). I wonder if a similar trick would work in Spring AI.
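For illustration, here is a minimal sketch of what that trick could look like before the request is sent to Ollama. The `Message` record and role names are illustrative, not Spring AI's actual types:

```java
import java.util.List;

public class ToolRoleWorkaround {

    record Message(String role, String content) {}

    // Remap "tool" messages to "assistant" messages so that Ollama's
    // Chat Completion API, which has no "tool" role, accepts the history.
    static List<Message> remapToolMessages(List<Message> messages) {
        return messages.stream()
                .map(m -> "tool".equals(m.role())
                        ? new Message("assistant", m.content())
                        : m)
                .toList();
    }

    public static void main(String[] args) {
        var history = List.of(
                new Message("user", "What's the weather in Rome?"),
                new Message("assistant", "{\"name\":\"currentWeather\",\"arguments\":{\"city\":\"Rome\"}}"),
                new Message("tool", "{\"temperature\":31,\"unit\":\"C\"}"));
        System.out.println(remapToolMessages(history));
    }
}
```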
Ollama doesn't yet support function calling via its Chat Completion API, even though the models themselves do. There is a raw mode available, as mentioned by @tchoteau, that allows working with functions, but that requires a different implementation and design in Spring AI compared to all the other chat completion integrations.
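For reference, a sketch of the raw-mode approach: POST to Ollama's `/api/generate` endpoint with `"raw": true` and Mistral's function-calling template embedded directly in the prompt. The tool definition and prompt wording here are illustrative; the caller must parse the model's `[TOOL_CALLS]` output and execute the function itself:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaRawFunctionCall {

    public static void main(String[] args) throws Exception {
        // Raw mode bypasses Ollama's prompt templating, so the Mistral
        // [AVAILABLE_TOOLS]/[INST] tags must be written out by hand.
        String body = """
                {
                  "model": "mistral",
                  "raw": true,
                  "stream": false,
                  "prompt": "[AVAILABLE_TOOLS][{\\"type\\":\\"function\\",\\"function\\":{\\"name\\":\\"currentWeather\\",\\"description\\":\\"Get the current weather for a city\\",\\"parameters\\":{\\"type\\":\\"object\\",\\"properties\\":{\\"city\\":{\\"type\\":\\"string\\"}},\\"required\\":[\\"city\\"]}}}][/AVAILABLE_TOOLS][INST] What is the weather in Rome? [/INST]"
                }
                """;

        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/generate"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body());
    }
}
```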
It looks like they added support for function calling in Ollama for the models that support tools, such as mistral or llama3-groq-tool-use. This PR was merged a few days ago, and you can already make calls in Ollama using function calling, as shown in the comments here.
It would be nice to add support for function calling for Ollama in Spring AI (see the sketch below).
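A sketch of the merged native support: `/api/chat` now accepts a `tools` array with an OpenAI-style function schema, and a tool-capable model responds with `tool_calls` in the assistant message. The tool definition is illustrative:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class OllamaNativeToolCall {

    public static void main(String[] args) throws Exception {
        String body = """
                {
                  "model": "mistral-nemo",
                  "stream": false,
                  "messages": [
                    {"role": "user", "content": "What is the weather in Rome?"}
                  ],
                  "tools": [{
                    "type": "function",
                    "function": {
                      "name": "currentWeather",
                      "description": "Get the current weather for a city",
                      "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"]
                      }
                    }
                  }]
                }
                """;

        var request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/chat"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        // A tool-capable model returns the function name and arguments in
        // "message.tool_calls" instead of plain text content.
        System.out.println(HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString()).body());
    }
}
```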
Hi. Fantastic, thanks for taking the time to update the issue @Ricard-Kollcaku! Not sure we can get it in time for M2, but we'll take a look once we review the final list of issues.
It's worth distinguishing these three different options for function calling in Ollama:

1. Prompt engineering: instruct the model in the prompt to emit tool calls as structured JSON and parse the output yourself, without any dedicated API support.
2. Raw mode: bypass Ollama's prompt templating and send the model's native function-calling template (e.g. Mistral's [AVAILABLE_TOOLS] tags) via /api/generate with raw=true.
3. Native tool support: pass a tools array to Ollama's Chat Completion API (/api/chat), as added by the recently merged PR.
One more important thing to mention is that only a few models in the Ollama Library support function calling via options 2 and 3 (whereas more models might work with option 1). For example, you can use function calling with mistral-nemo but not with llama3, because the latter has not been trained for function calling.
This first PR adds function calling support at the API level: https://github.com/spring-projects/spring-ai/pull/1103. A second PR will extend function calling support to the ChatModel level.
Among other things, it should be possible to specify a function in the OllamaChatOptions, along the lines of the sketch below.
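A hypothetical sketch only: the ChatModel-level PR has not landed, so the builder methods and the `OllamaChatOptions` shape shown here are assumptions modeled on the function-calling support of the other Spring AI chat models (e.g. `FunctionCallbackWrapper`); names may differ in the merged API:

```java
import java.util.List;
import java.util.function.Function;

public class OllamaFunctionOptionsSketch {

    record WeatherRequest(String city) {}
    record WeatherResponse(double temperature, String unit) {}

    void configure() {
        // Plain java.util.function.Function backing the tool.
        Function<WeatherRequest, WeatherResponse> weather =
                request -> new WeatherResponse(31.0, "C"); // stubbed lookup

        // Assumed builder API, mirroring the existing FunctionCallbackWrapper
        // pattern used by the other chat model integrations.
        var options = OllamaChatOptions.builder()
                .withModel("mistral-nemo")
                .withFunctionCallbacks(List.of(
                        FunctionCallbackWrapper.builder(weather)
                                .withInputType(WeatherRequest.class)
                                .withName("currentWeather")
                                .withDescription("Get the current weather for a city")
                                .build()))
                .build();
    }
}
```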