acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Llama 3 8B and Mistral latest return "Failed to parse" #151

Closed: KC2021 closed this issue 2 weeks ago

KC2021 commented 1 month ago

Describe the bug
I have Groups set up, such as Outdoor Lights. Saying "lights on" works but only turns my living room lights on/off; "all lights on" somehow works, but "outdoor lights on" fails. The normal HA assistant has no issue with this, nor does the OpenAI integration. I am using the Ollama API.

Expected behavior
The response itself doesn't look malformed. In the example log it appears to be missing the service value string, but that seems expected when the LLM is just "reading" state from the system prompt data. It also seems to error when asking for "all" lights to be turned off, where it tries to use "target_devices" as an array, unless that is expected and it is failing for some other reason. Without seeing exactly what the LLM is seeing and doing step by step, it's hard for me to deduce what's happening myself.

Additionally, I think the Home LLM assistant should always be filtered (as a setting, for us voice users) so it never reads out such an error containing characters like { and }; instead the error should be sent to the log and the assistant should say something like "Sorry, there was an error." Hearing an error log read aloud is noisy over voice. The output could also be post-processed by linting and pretty-printing it, since the integration sometimes returns things like "Turned the lights off.Turned the lights off.Turned the lights off.", which is not only repetitious but also has no space after each period. A rough sketch of that kind of clean-up follows.
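For illustration, a hypothetical helper like this could tidy the spoken reply before it goes to TTS (just a sketch of the idea, not code from the integration):

    import re

    def clean_spoken_response(text: str) -> str:
        # Drop JSON punctuation that should never be read aloud.
        text = re.sub(r'[{}\[\]"]', "", text)
        # Split into sentences, drop immediate repeats, and rejoin with a
        # proper space after each period.
        sentences = [s.strip() for s in text.split(".") if s.strip()]
        deduped = [s for i, s in enumerate(sentences) if i == 0 or s != sentences[i - 1]]
        return ". ".join(deduped) + "." if deduped else ""

    print(clean_spoken_response("Turned the lights off.Turned the lights off.Turned the lights off."))
    # -> Turned the lights off.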

If I edit the response_examples CSV down to single-word responses, I also get replies like "light.turn_on(),Okay", so I presume this could be fixed by removing characters other than alphanumerics and "." when initially parsing the input, before the prompt is sent to the LLM. Though this is a different issue, it further indicates that better pre-processing of the prompt is important where parsing issues are concerned. A quick sketch of that kind of sanitising is shown below.
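For example, a hypothetical pre-processing step (again only a sketch, not part of home-llm) could be:

    import re

    def sanitize_user_input(text: str) -> str:
        # Keep letters, digits, spaces and "."; drop everything else before
        # the utterance is placed into the prompt.
        return re.sub(r"[^A-Za-z0-9. ]+", "", text).strip()

    print(sanitize_user_input("outdoor lights {on}!"))  # -> outdoor lights on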

Logs
If applicable, please upload any error or debug logs output by Home Assistant.

Failed to parse call from '{"to_say": "The outdoor lights are currently on with an orange color and at 50% brightness.", "service": "", "target_device": "light.outdoor_lights"}'!
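For comparison, a call that parses correctly should look something like the following (a guess based only on the fields visible in the log above, with the service filled in; the exact schema may differ):

    {"to_say": "Turning on the outdoor lights.", "service": "light.turn_on", "target_device": "light.outdoor_lights"}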
acon96 commented 1 month ago

The model is outputting junk tokens at the end of the line, which prevents the regex from detecting the JSON call. Either enable JSON output mode for the model or use a backend that supports JSON mode.
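If switching backends or enabling JSON mode is not an option, one purely illustrative workaround (not the integration's actual parser) is to decode the first complete JSON object in the line and ignore any trailing junk tokens:

    import json

    def extract_tool_call(line: str):
        # Decode one JSON object starting at the first "{" and ignore
        # whatever junk tokens follow it.
        start = line.find("{")
        if start == -1:
            return None
        try:
            obj, _end = json.JSONDecoder().raw_decode(line, start)
            return obj
        except json.JSONDecodeError:
            return None

    call = extract_tool_call('{"to_say": "Done.", "service": "light.turn_on", "target_device": "light.outdoor_lights"} </s>')
    print(call["service"])  # -> light.turn_on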

janstadt commented 1 month ago

I'm running into this as well with https://ollama.com/library/mistral:instruct. I have JSON mode enabled. Any other suggestions?
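For reference, requesting JSON output from Ollama directly looks roughly like this, assuming Ollama's standard format parameter; this is presumably what the JSON mode setting corresponds to at the API level:

    import requests

    # Assumes a local Ollama server on the default port.
    resp = requests.post(
        "http://localhost:11434/api/chat",
        json={
            "model": "mistral:instruct",
            "messages": [{"role": "user", "content": "Turn on the outdoor lights."}],
            "format": "json",   # ask Ollama to constrain the reply to valid JSON
            "stream": False,
        },
        timeout=60,
    )
    print(resp.json()["message"]["content"])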

acon96 commented 3 weeks ago

I believe this has been fixed in v0.3. The issue was an incorrect regular expression for detecting JSON blocks.