sgomez / ollama-ai-provider

Vercel AI Provider for running LLMs locally using Ollama
https://www.npmjs.com/package/ollama-ai-provider

ollama tool calling with Vercel's AI SDK #1

Closed mfshaho closed 6 months ago

mfshaho commented 6 months ago

The AI SDK docs say that in order to call tools, the model needs to support tool calling.

Using this Ollama provider, which Ollama models are capable of tool calling?

sgomez commented 6 months ago

Hi, @mfshaho. I will convert this issue to a discussion. Anyway, the problem is that the Ollama API doesn't support function calling yet, so right now it is not possible to do it directly.

It is possible to do it with prompt injection, and I am making some progress along those lines. I hope to have a way to do it soon.
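The prompt-injection approach mentioned above can be sketched roughly as follows: describe the available tools in the system prompt and ask the model to answer with a JSON object when it wants to invoke one, then parse that JSON out of the reply. This is a minimal illustration, not the provider's actual implementation; the tool shape, prompt wording, and expected JSON format are all assumptions.

```typescript
// Hypothetical sketch of tool-call emulation via prompt injection.
// The tool description format and the JSON reply convention below are
// assumptions for illustration, not part of ollama-ai-provider's API.

type Tool = {
  name: string;
  description: string;
  parameters: Record<string, string>; // parameter name -> type hint
};

// Build a system prompt that lists the available tools and instructs the
// model to reply with a JSON object when it wants to call one.
function buildToolPrompt(tools: Tool[]): string {
  const toolList = tools
    .map(
      (t) =>
        `- ${t.name}: ${t.description} (parameters: ${JSON.stringify(t.parameters)})`
    )
    .join("\n");
  return [
    "You have access to the following tools:",
    toolList,
    'To call a tool, reply ONLY with JSON: {"tool": "<name>", "arguments": { ... }}',
  ].join("\n");
}

// Parse the model's reply: return the tool call if the model emitted the
// agreed-upon JSON shape, or null if it answered in plain text.
function parseToolCall(
  reply: string
): { tool: string; arguments: Record<string, unknown> } | null {
  try {
    const parsed = JSON.parse(reply);
    if (
      typeof parsed.tool === "string" &&
      typeof parsed.arguments === "object" &&
      parsed.arguments !== null
    ) {
      return parsed;
    }
  } catch {
    // Not JSON: the model answered normally, no tool call.
  }
  return null;
}
```

The caller would send `buildToolPrompt(...)` as the system message, run `parseToolCall` on each completion, and feed the tool's result back to the model in a follow-up message. Real tool-calling support in the Ollama API (and in this provider) would replace all of this with structured function-call fields.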