lmos-ai / arc

The goal of the Arc project is to utilize the power of Kotlin DSL and Kotlin Scripting to define a language optimized for building LLM-powered solutions.

Support tools in Ollama client #40

Open hsudbrock opened 4 days ago

hsudbrock commented 4 days ago

Currently, the Ollama client does not support tools (cf. https://github.com/lmos-ai/arc/blob/main/arc-ollama-client/src/main/kotlin/OllamaClient.kt#L94). Ollama itself, in principle, supports tools for quite a few models, though not for all (cf. https://ollama.com/blog/tool-support for a brief description of tool support in Ollama's API, and https://ollama.com/search?c=tools for a list of supported models).
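For reference, here is a minimal standalone sketch of how a tool is passed to Ollama's `/api/chat` endpoint, following the blog post linked above. The model name and the weather tool are only illustrative, and this is not Arc code:

```kotlin
// Standalone sketch of Ollama's /api/chat tool calling, independent of Arc.
// The model name and the weather tool are illustrative assumptions.
import java.net.URI
import java.net.http.HttpClient
import java.net.http.HttpRequest
import java.net.http.HttpResponse

fun main() {
    val payload = """
    {
      "model": "llama3.1",
      "messages": [{ "role": "user", "content": "What is the weather in Berlin?" }],
      "tools": [{
        "type": "function",
        "function": {
          "name": "get_current_weather",
          "description": "Get the current weather for a city",
          "parameters": {
            "type": "object",
            "properties": { "city": { "type": "string" } },
            "required": ["city"]
          }
        }
      }],
      "stream": false
    }
    """.trimIndent()

    val request = HttpRequest.newBuilder()
        .uri(URI.create("http://localhost:11434/api/chat"))
        .header("Content-Type", "application/json")
        .POST(HttpRequest.BodyPublishers.ofString(payload))
        .build()

    // When the model decides to use a tool, the response message carries
    // "tool_calls" (tool name plus arguments) instead of plain text content.
    val response = HttpClient.newHttpClient()
        .send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())
}
```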

It would be nice to use Ollama's tool support to equip Ollama-based agents in Arc with tools (similar to how other Arc clients, such as the AzureAIClient, already support tools).

I have experimented a little with a version of Arc's OllamaClient into which I patched tool support via Ollama's API, and from what I saw it worked well. Would you be interested in a pull request adding tool support to the OllamaClient? (I would have to clean up my patch first, hence asking here before doing unnecessary work, e.g., in case someone else is already working on this, or the topic does not fit Arc's roadmap...)

patwlan commented 4 days ago

Hi, that is definitely a good idea. As we have now started to use LangChain4j, creating a new Ollama client with tool support should be very straightforward.
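For illustration, the LangChain4j side roughly looks like this. This is a sketch assuming the `dev.langchain4j:langchain4j-ollama` module and a tool-capable model; the exact tool API differs between LangChain4j versions, and this is not Arc's bridge code:

```kotlin
// Sketch only: assumes the langchain4j-ollama module and a tool-capable
// model; the exact tool API differs between LangChain4j versions.
import dev.langchain4j.agent.tool.JsonSchemaProperty
import dev.langchain4j.agent.tool.ToolSpecification
import dev.langchain4j.data.message.UserMessage
import dev.langchain4j.model.ollama.OllamaChatModel

fun main() {
    val model = OllamaChatModel.builder()
        .baseUrl("http://localhost:11434")
        .modelName("llama3.1") // assumption: any model from Ollama's tools list
        .build()

    // Describe the tool to the model; name and parameter are illustrative.
    val weatherTool = ToolSpecification.builder()
        .name("get_current_weather")
        .description("Get the current weather for a city")
        .addParameter("city", JsonSchemaProperty.STRING)
        .build()

    val response = model.generate(
        listOf(UserMessage.from("What is the weather in Berlin?")),
        listOf(weatherTool),
    )

    // If the model chose to call the tool, the answer carries tool
    // execution requests rather than plain text.
    val message = response.content()
    if (message.hasToolExecutionRequests()) {
        message.toolExecutionRequests().forEach {
            println("${it.name()}(${it.arguments()})")
        }
    }
}
```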

hsudbrock commented 4 days ago

OK; do I understand you correctly that I should not adapt the existing OllamaClient class, because it will become obsolete anyway once LangChain4j is used?

patwlan commented 3 days ago

Correct. If you want, you can take a look at the LangChain4j bridge we currently have. The LangChain4j abstraction is quite good, but it requires creating a new client every time the settings change. Maybe you have some ideas on improving what we currently have.
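One direction would be memoizing clients per settings, so a new instance is only built when the configuration actually differs. A rough sketch, where the `Settings` type and `ModelCache` are hypothetical placeholders and not our current code:

```kotlin
// Sketch of one possible improvement, not Arc's current code: memoize
// clients per settings so equal configurations reuse one instance.
// The Settings type and its fields are hypothetical placeholders.
import dev.langchain4j.model.chat.ChatLanguageModel
import dev.langchain4j.model.ollama.OllamaChatModel
import java.util.concurrent.ConcurrentHashMap

data class Settings(val baseUrl: String, val modelName: String, val temperature: Double)

object ModelCache {
    private val cache = ConcurrentHashMap<Settings, ChatLanguageModel>()

    // Data-class equality makes identical settings hit the same cache entry.
    fun modelFor(settings: Settings): ChatLanguageModel =
        cache.computeIfAbsent(settings) {
            OllamaChatModel.builder()
                .baseUrl(it.baseUrl)
                .modelName(it.modelName)
                .temperature(it.temperature)
                .build()
        }
}
```

Whether caching, rebuilding on change, or something else fits best probably depends on how often the settings actually vary per request.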