whisper-bye closed this issue 1 month ago
Sorry, but I need more information. Can you share a tiny repository so I can check what is happening? Please note that only generate-text and generate-stream support tool calling natively with Ollama. Tool calling over streams is an experimental feature of this provider that may not work with all models.
Anyway, I tested the model with the examples/ai-core/src/generate-text/ollama-tool-call.ts example and it works.
So there is probably nothing we can do, because the tool response does not depend on the provider but on the Ollama engine.
I think this is related to how the qwen2:7b model was adapted.
So I am going to close the issue. If you have more information about the execution, or a PoC showing that there is a bug, I can reopen it.
Is your feature request related to a problem? Please describe. I'm experiencing issues when using ollama-ai-provider with ai-sdk to call tools with the newly supported qwen2:7b model in Ollama. Despite proper integration, the tool does not return the correct results as expected.
Describe the solution you'd like I would like ollama-ai-provider to properly support tool calls with qwen2:7b via ai-sdk and return the correct results.
Additional context https://ollama.com/library/qwen2:7b/blobs/77c91b422cc9