Closed: MikeBirdTech closed this issue 2 weeks ago
Try gemma2; it will work.
I am having the same error. Switching to gemma2 solved it for me, but I'd still like to know if anyone has found a way to get llama3.1 working.
Inform Open Interpreter that the language model you're using does not support function calling by passing --no-llm_supports_functions:
interpreter --api_base "http://localhost:11434" --model ollama/llama3.1 --no-llm_supports_functions
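The same workaround applies when using the Python library; here is a minimal sketch (the llm.supports_functions setting name is from the 0.3.x settings docs and is worth verifying against your installed version):

from interpreter import interpreter

# Point Open Interpreter at the local Ollama server
interpreter.llm.model = "ollama/llama3.1"
interpreter.llm.api_base = "http://localhost:11434"

# Declare that the model does not support function calling;
# the library-side equivalent of --no-llm_supports_functions
interpreter.llm.supports_functions = False

interpreter.chat("Hello")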
Also, make sure the version of Ollama you're running supports llama3.1 (check with ollama --version); you might have to update.
Had the same problem. Running interpreter --local --no-llm_supports_functions works as expected.
Describe the bug
When trying to use Ollama to power Open Interpreter (via the CLI or the Python library), it errors out before I am able to send a prompt.
Reproduce
interpreter --local
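The same error reproduces through the Python library; a minimal sketch, assuming llama3.1 served by Ollama on its default port, as in the thread above:

from interpreter import interpreter

interpreter.llm.model = "ollama/llama3.1"
interpreter.llm.api_base = "http://localhost:11434"

# Errors out here, before any prompt is actually sent
interpreter.chat("Hello")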
Expected behavior
No error
Screenshots
No response
Open Interpreter version
Open Interpreter 0.3.7 The Beginning (Ty and Victor)
Python version
Python 3.11.7
Operating System name and version
macOS
Additional context
ollama version is 0.3.6