Closed tofgun closed 3 weeks ago
Hey there @synesthesiam, mind taking a look at this issue as it has been labeled with an integration (ollama) you are listed as a code owner for? Thanks!
(message by CodeOwnersMention)
ollama documentation, ollama source (message by IssueLinks)
Problem solved with llama3.2.
The problem
The Ollama integration fails to process any request to the Ollama server: Sorry, I had a problem talking to the Ollama server: llama runner process has terminated: signal: killed
The identical request launched locally with curl is successful: curl http://localhost:11434/api/generate -d '{ "model": "llama3.1", "prompt": "Quelle heure est il ?" }'
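For reference, the same non-streaming call to the Ollama REST API can be reproduced outside Home Assistant with a short Python sketch (stdlib only; the base URL, model name, and `stream: false` flag mirror the curl command above, and the timeout value is an arbitrary choice):

```python
import json
import urllib.request


def build_generate_payload(model: str, prompt: str) -> bytes:
    """Serialize a non-streaming /api/generate request body."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")


def generate(base_url: str, model: str, prompt: str, timeout: float = 120.0) -> str:
    """POST to /api/generate and return the model's text response."""
    req = urllib.request.Request(
        f"{base_url}/api/generate",
        data=build_generate_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=timeout) as resp:
        return json.load(resp)["response"]


if __name__ == "__main__":
    # Same request the issue reports as succeeding via curl.
    print(generate("http://localhost:11434", "llama3.1", "Quelle heure est il ?"))
```

If this succeeds from the host while the integration fails, the problem is likely specific to the integration's environment (for example, memory limits causing the llama runner to be killed) rather than the model itself.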
What version of Home Assistant Core has the issue?
core-2024.9.1
What was the last working version of Home Assistant Core?
core-2024.8.3
What type of installation are you running?
Home Assistant OS
Integration causing the issue
Ollama
Link to integration documentation on our website
https://www.home-assistant.io/integrations/ollama
Diagnostics information
2024-09-12 21:06:54.674 DEBUG (MainThread) [homeassistant.components.ollama.conversation] Prompt: Current time is 21:06:54. Today's date is 2024-09-12. You are a voice assistant for Home Assistant. Answer questions about the world truthfully. Answer in plain text. Keep it simple and to the point.
2024-09-12 21:06:54.676 DEBUG (MainThread) [homeassistant.components.ollama.conversation] Tools: None
2024-09-12 21:07:06.723 ERROR (MainThread) [homeassistant.components.ollama.conversation] Unexpected error talking to Ollama server: llama runner process has terminated: signal: killed
Example YAML snippet
No response
Anything in the logs that might be useful for us?
Additional information
No response