spring-projects / spring-ai

An Application Framework for AI Engineering
https://docs.spring.io/spring-ai/reference/index.html
Apache License 2.0

Support invoking functions in models that support it via Ollama #720

Closed joshlong closed 4 months ago

joshlong commented 6 months ago

Among other things, it should be possible to specify a function in the OllamaChatOptions.
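A sketch of what that could look like. `OllamaChatOptions` and its builder-style `withFunction` method are hypothetical stand-ins here, modeled loosely on the function support in Spring AI's other chat options classes; the real API may differ.

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical stand-in for a future OllamaChatOptions with function support.
// This only illustrates the requested shape; it is not the actual Spring AI API.
class OllamaChatOptions {
    private final List<String> functions = new ArrayList<>();

    static OllamaChatOptions create() {
        return new OllamaChatOptions();
    }

    // Register a function (e.g. by bean name) the model is allowed to call.
    OllamaChatOptions withFunction(String name) {
        functions.add(name);
        return this;
    }

    List<String> getFunctions() {
        return functions;
    }
}

public class FunctionOptionsSketch {
    public static void main(String[] args) {
        OllamaChatOptions options = OllamaChatOptions.create()
                .withFunction("currentWeather");
        System.out.println(options.getFunctions()); // [currentWeather]
    }
}
```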

Grogdunn commented 6 months ago

The Ollama API doesn't support function calling at the moment, or did I miss some update?

ThomasVitale commented 6 months ago

Ollama doesn't support function calling at the moment, but there are a few related feature requests to introduce that feature, mentioned in https://github.com/ollama/ollama/issues/4386

Grogdunn commented 6 months ago

Ok, let's see when they will support functions :crossed_fingers:

tchoteau commented 5 months ago

Hello, Mistral 0.3 supports function calling with Ollama (screenshot attached in the original comment).

jidaojiuyou commented 5 months ago

Ollama now supports function calling. We need it! (reference)

ThomasVitale commented 5 months ago

That one is a custom wrapper implementing a workaround where "tool" messages are handled as "assistant" messages, tricking Ollama into accepting them (see here). I wonder if a similar trick would work in Spring AI.
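A minimal illustration of that trick in plain Java, independent of Spring AI's actual message types (`Message` here is a hypothetical record, not a Spring AI class): before sending the conversation, every "tool" message is relabeled as an "assistant" message so the API accepts it.

```java
import java.util.List;

public class ToolRoleWorkaround {

    // Hypothetical stand-in for a chat message: a role plus its content.
    record Message(String role, String content) {}

    // Rewrite every "tool" role to "assistant", leaving other messages untouched.
    static List<Message> adaptForOllama(List<Message> conversation) {
        return conversation.stream()
                .map(m -> "tool".equals(m.role())
                        ? new Message("assistant", m.content())
                        : m)
                .toList();
    }

    public static void main(String[] args) {
        List<Message> adapted = adaptForOllama(List.of(
                new Message("user", "What's the weather in Rome?"),
                new Message("tool", "{\"temperature\": 28}")));
        System.out.println(adapted);
    }
}
```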

Ollama doesn't yet support function calling via its Chat Completion API, even though the models themselves do. There is a raw mode available, as mentioned by @tchoteau, that allows working with functions, but that would require a different implementation and design in Spring AI compared to all the other chat completion integrations.
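To illustrate why raw mode is a different design: it bypasses Ollama's prompt templating, so the caller must emit the model's own control tokens itself. The sketch below builds a Mistral-style prompt with an `[AVAILABLE_TOOLS]` block; the exact token format is model-specific and an assumption here, so it should be checked against the model's documentation before use.

```java
public class RawModePrompt {

    // Build a raw prompt embedding the tool definitions using Mistral-style
    // control tokens. With raw mode, these tokens are the caller's
    // responsibility rather than Ollama's.
    static String buildRawPrompt(String toolsJson, String userMessage) {
        return "[AVAILABLE_TOOLS]" + toolsJson + "[/AVAILABLE_TOOLS]"
                + "[INST]" + userMessage + "[/INST]";
    }

    public static void main(String[] args) {
        String tools = "[{\"type\":\"function\",\"function\":{\"name\":\"get_weather\","
                + "\"parameters\":{\"type\":\"object\",\"properties\":"
                + "{\"city\":{\"type\":\"string\"}}}}}]";
        System.out.println(buildRawPrompt(tools, "What's the weather in Rome?"));
    }
}
```

The resulting string would then be posted to Ollama's `/api/generate` endpoint with `"raw": true`, skipping the usual template handling.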

Ricard-Kollcaku commented 4 months ago

It looks like they added support for function calling in Ollama for the models that support tools, like mistral or llama3-groq-tool-use. This PR was merged a few days ago, and you can already make calls in Ollama using function calling, as the comments here show.

It would be nice to add support for function calling for Ollama in Spring AI.

markpollack commented 4 months ago

Hi. Fantastic, thanks for taking the time to update the issue @Ricard-Kollcaku ! Not sure we can get it in time for M2, but we'll take a look once we review the final list of issues.

ThomasVitale commented 4 months ago

It's worth distinguishing these three different options for function calling in Ollama:

  1. Raw mode. This option supports function calling already. I don't see it as a good fit for Spring AI to support, since it bypasses the Ollama APIs entirely and acts at a lower level. Example: https://github.com/ollama/ollama/issues/1729#issuecomment-1937763369
  2. Ollama API. In the past few weeks, several changes have been delivered to the Ollama project to support function calling through the Ollama API. I'm currently working on a PR for supporting this also in Spring AI.
  3. OpenAI-Compatible API. This option supports function calling already. It will work with the Spring AI OpenAI integration as soon as this bug in Ollama gets fixed: https://github.com/ollama/ollama/issues/5796.
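For option 3, once the linked bug is fixed, pointing the Spring AI OpenAI starter at Ollama's OpenAI-compatible endpoint should be enough. A hedged sketch as application properties; the property names follow current Spring AI conventions and the model name is just an example:

```properties
# Point the Spring AI OpenAI integration at Ollama's OpenAI-compatible endpoint.
# Ollama ignores the API key, but the starter requires a value to be set.
spring.ai.openai.base-url=http://localhost:11434
spring.ai.openai.api-key=ollama
spring.ai.openai.chat.options.model=mistral-nemo
```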

One more important thing to mention is that there are only a few models in the Ollama Library that support function calling via options 2 and 3 (whereas more models might work with option 1). For example, you can use function calling with mistral-nemo but not with llama3, because the latter has not been trained for function calling.

ThomasVitale commented 4 months ago

A first PR adds function calling support at the API level: https://github.com/spring-projects/spring-ai/pull/1103. A second PR will extend the function calling support to the ChatModel level.
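For reference, this is the general shape of a tool-enabled request to Ollama's native `/api/chat` endpoint that the API-level support targets. The field layout mirrors the OpenAI tools schema; the `get_current_weather` function is an illustrative example, not part of the PR:

```json
{
  "model": "mistral-nemo",
  "messages": [
    { "role": "user", "content": "What is the weather in Rome?" }
  ],
  "tools": [
    {
      "type": "function",
      "function": {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
          "type": "object",
          "properties": {
            "city": { "type": "string" }
          },
          "required": ["city"]
        }
      }
    }
  ],
  "stream": false
}
```

When the model decides to call a tool, the response carries the call in the assistant message's `tool_calls` field, which the client then executes and feeds back as a follow-up message.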