Open Rudd-O opened 2 hours ago
Hey there @synesthesiam, mind taking a look at this issue as it has been labeled with an integration (ollama) you are listed as a code owner for? Thanks!
(message by CodeOwnersMention)
ollama documentation · ollama source (message by IssueLinks)
The problem
Ollama correctly analyzes a question like "is the office ceiling light on?" (responding it's off).
But when asked to turn it on, here is the relevant snippet of the full debug log from Ollama:
Pay close attention to the last two lines. It's attempting to set the name "Office ceiling light, luz de la oficina", which is very clearly not the name of any light in my home, hence the `MatchFailedError`. That makes sense, since the integration itself gave it that "name" in the system prompt:

Given that Ollama (Llama 3.1) fails to understand the nuance of `names` vs. `name`, the YAML prompt given to Ollama should be reengineered to separate the aliases from the name, rather than unifying both in a single field. Consider this as an alternative:

Alternatively, we the users should be given the ability to customize the prompt template again (this feature was removed a few releases ago).
What version of Home Assistant Core has the issue?
core-2024.9.0
What was the last working version of Home Assistant Core?
No response
What type of installation are you running?
Home Assistant Core
Integration causing the issue
ollama
Link to integration documentation on our website
No response
Diagnostics information
No response
Example YAML snippet
No response
Anything in the logs that might be useful for us?
No response
Additional information
No response