robusta-dev / holmesgpt

On-Call/DevOps Assistant - Get a head start on fixing alerts with AI investigation
MIT License

Trying holmesgpt with local ollama3.1 fails w. KeyError: 'name' #129

Open suukit opened 1 week ago

suukit commented 1 week ago

Hi, I wanted to give this a try and installed Ollama locally. I can use the Ollama API at `http://localhost:11434/api/generate` with curl. I set `export OLLAMA_API_BASE=http://localhost:11434`, installed holmes via brew as described in the README, and started `holmes ask --model ollama/llama3.1:latest "what issues do I have in my cluster" -v`, which results in

```
APIConnectionError: litellm.APIConnectionError: 'name'
Traceback (most recent call last):
  File "litellm/main.py", line 2422, in completion
  File "litellm/llms/ollama.py", line 293, in get_ollama_response
KeyError: 'name'
```


Any hints on how to debug this? I already tried `-v`, which gives no further helpful output. I changed `OLLAMA_API_BASE` to `http://localhost:11434/ZZ` and then received a 404 as expected, so I assume `OLLAMA_API_BASE` is working. I also switched to the `ollama/llama2` model, with the same result.

Thank you in advance, Max
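One way to narrow this down (a debugging sketch; the model name and base URL are taken from the report above, and the `HOLMES_LIVE_TEST` environment variable is a hypothetical guard, not a holmes feature) is to call litellm directly and check whether the error reproduces outside holmes:

```python
import os

def build_completion_kwargs(model: str, api_base: str, prompt: str) -> dict:
    """Assemble the arguments for a direct litellm.completion() call."""
    return {
        "model": model,
        "api_base": api_base,
        "messages": [{"role": "user", "content": prompt}],
    }

# Guarded so the sketch only hits the network when explicitly requested;
# requires litellm installed and an Ollama server running at api_base.
if os.environ.get("HOLMES_LIVE_TEST"):
    import litellm

    kwargs = build_completion_kwargs(
        "ollama/llama3.1:latest",
        os.environ.get("OLLAMA_API_BASE", "http://localhost:11434"),
        "hello",
    )
    resp = litellm.completion(**kwargs)
    print(resp.choices[0].message.content)
```

If the same `KeyError: 'name'` appears here, the bug sits in litellm's Ollama handler rather than in holmes or the local Ollama setup, which would make it worth reporting against litellm with this minimal reproduction.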