Open wenzel-felix opened 1 month ago
Clio requires function-calling support, which ollama just added today or yesterday, I think. We haven't tested Clio against ollama at all. I'm not sure why you'd be seeing a GET request. Clio is powered by GPTScript, which has been extensively tested against different OpenAI implementations, so I don't believe we are doing anything wrong. Please first try with the latest ollama and a model that supports function calling.
Hi @ibuildthecloud, thanks for the fast reply. I just tried running it with the newest Ollama version; it was not clear to me that clio requires function calling.
But I got it working now, awesome!
Please do not close the issue yet, I'm going to create a PR to add an example in the documentation.
Ok, I was a bit too euphoric. It looks like there is an issue with the way tools work in Ollama + Llama3.1 after all. @ibuildthecloud, any idea?
My current environment:

Environment variables:

```
CLIO_OPENAI_API_KEY=ollama
CLIO_OPENAI_BASE_URL=http://127.0.0.1:11434/v1
# gpt-3.5-turbo is just an alias for llama3.1:8b
CLIO_OPENAI_MODEL=gpt-3.5-turbo
```

clio version: v0.1.3
ollama version: 0.3.0
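For reference, a sketch of how such an alias can be set up with the ollama CLI (`ollama cp` registers an existing model under a second name; the model names here match the ones used above):

```shell
# Pull the base model, then copy it under a second name so that clients
# hard-coded to "gpt-3.5-turbo" resolve to llama3.1:8b.
ollama pull llama3.1:8b
ollama cp llama3.1:8b gpt-3.5-turbo
```

This requires a running ollama daemon; afterwards `ollama list` should show both names.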
I see similar behavior on gptscript as well using the configuration from the official demo.
Should I move this to a new issue in the gptscript repo? @ibuildthecloud
Looks like `export GPTSCRIPT_INTERNAL_OPENAI_STREAMING=false` fixes it for gptscript, but as far as I can tell clio does not offer a similar environment variable - at least setting it does not seem to have any effect.
Hi there,
I was trying to use it with ollama running locally. Ollama itself works fine for chat completions via the new OpenAI-compatible API: https://github.com/ollama/ollama/blob/main/docs/openai.md
The issue I'm facing seems to be caused by clio using GET instead of POST to reach the endpoint. The ollama endpoint is still experimental, but from my perspective the issue is not on the ollama side.
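To illustrate what a correct request looks like, here is a minimal sketch of an OpenAI-style chat-completions body with one tool attached. The `get_weather` tool and its schema are hypothetical placeholders; the point is the shape of the body that must be sent via POST:

```python
import json

# Minimal chat-completions payload with a tool definition, following the
# OpenAI-compatible schema. The tool name and parameters are made up for
# illustration only.
payload = {
    "model": "llama3.1:8b",
    "messages": [{"role": "user", "content": "What is the weather in Berlin?"}],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "get_weather",
                "description": "Look up the current weather for a city",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }
    ],
}

# This JSON string is what should arrive in the POST body at
# /v1/chat/completions.
body = json.dumps(payload)
print(body)
```

Sending this body with an HTTP POST to `http://127.0.0.1:11434/v1/chat/completions` (e.g. via `curl -d`) is the expected call pattern; a GET against the same path carries no body, which matches the failure described above.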
Here are the logs I see on the ollama side (line 1 is a curl POST and line 2 is clio apparently using GET):
The ENVs are as follows ("gpt-3.5-turbo" is just an alias for another model because I wanted to make sure there were no issues with the model name):
Would appreciate any help or input.