acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

"Unexpected error during intent recognition" Ollama as Backend #165

Closed: hsman2 closed this issue 2 weeks ago

hsman2 commented 3 weeks ago

I keep getting an error saying "Unexpected error during intent recognition". I'm on the new 0.3.2 version that dropped last night, and I'm using Ollama as the backend with llama3:latest. Everything was working before the 2024.6 update to Home Assistant, but after that and the updates to this integration I now get this error. I saw another person on a closed issue hitting this as well (it looks like it was closed because it was fixed in 0.3.2), but that issue was for LocalAI, so maybe it's still affecting other backends? Things I've tried so far: re-downloading this integration (and setting it back up) and rebooting HA.

Attachments: log file, assist box

NathanKun commented 3 weeks ago

I have the exact same error with the Ollama backend on 0.3.2.

Unexpected error during intent recognition
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 994, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 108, in async_converse
    result = await method(conversation_input)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 275, in async_process
    message = self._generate_system_prompt(raw_prompt, llm_api)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 665, in _generate_system_prompt
    exposed_attributes = expose_attributes(attributes)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 648, in expose_attributes
    value = F"{closest_color(value)} {value}"
               ^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/utils.py", line 31, in closest_color
    for key, name in webcolors.CSS3_HEX_TO_NAMES.items():
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'webcolors' has no attribute 'CSS3_HEX_TO_NAMES'

mikeperalta1 commented 3 weeks ago

+1, just got this same thing, but with llama.cpp as my backend.

cfpandrade commented 3 weeks ago

same here! :(

apedance commented 3 weeks ago

I get the same error with both the llama.cpp backend and the Ollama server.

neutrinotek commented 3 weeks ago

Same error: Home LLM with the Ollama backend.

sl33pydog commented 3 weeks ago

Same issue. It worked on 2024.6.0 and 2024.6.1, though with intermittent intent issues. On 2024.6.2, intent recognition fails completely.

mikeperalta1 commented 3 weeks ago

For anyone wanting to get this working ASAP, I wrapped the offending code in a try/except block and it seems to be fine for me now. Here's my diff of custom_components/llama_conversation/utils.py:

<     min_colors = {}
<     for key, name in webcolors.CSS3_HEX_TO_NAMES.items():
<         r_c, g_c, b_c = webcolors.hex_to_rgb(key)
<         rd = (r_c - requested_color[0]) ** 2
<         gd = (g_c - requested_color[1]) ** 2
<         bd = (b_c - requested_color[2]) ** 2
<         min_colors[(rd + gd + bd)] = name
---
>     try:
>         min_colors = {}
>         for key, name in webcolors.CSS3_HEX_TO_NAMES.items():
>             r_c, g_c, b_c = webcolors.hex_to_rgb(key)
>             rd = (r_c - requested_color[0]) ** 2
>             gd = (g_c - requested_color[1]) ** 2
>             bd = (b_c - requested_color[2]) ** 2
>             min_colors[(rd + gd + bd)] = name
>     except AttributeError:
>         return requested_color
208c211
<     return f"{'https' if ssl else 'http'}://{hostname}{ ':' + port if port else ''}{path}"
\ No newline at end of file
---
>     return f"{'https' if ssl else 'http'}://{hostname}{ ':' + port if port else ''}{path}"
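
For anyone applying the workaround by hand, the patched helper in custom_components/llama_conversation/utils.py ends up reading roughly as follows. This is a sketch assembled from the diff and the traceback above; the final return line is not shown in the diff and is assumed here.

import webcolors

def closest_color(requested_color):
    # Map squared RGB distance -> CSS3 color name, then pick the nearest.
    # If the installed webcolors release (24.x) no longer exposes the
    # CSS3_HEX_TO_NAMES mapping, fall back to the raw RGB tuple unchanged.
    try:
        min_colors = {}
        for key, name in webcolors.CSS3_HEX_TO_NAMES.items():
            r_c, g_c, b_c = webcolors.hex_to_rgb(key)
            rd = (r_c - requested_color[0]) ** 2
            gd = (g_c - requested_color[1]) ** 2
            bd = (b_c - requested_color[2]) ** 2
            min_colors[(rd + gd + bd)] = name
    except AttributeError:
        return requested_color
    return min_colors[min(min_colors.keys())]  # assumed closing line
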
acon96 commented 3 weeks ago

I looked into this and the upstream library (webcolors) released a major update recently that broke compatibility. I updated the integration's requirements to install the correct version of the library.

The develop branch has the fix and I'll push an update this weekend with some other fixes.
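
For context on the breakage: the 24.x releases of webcolors drop the module-level name mappings like CSS3_HEX_TO_NAMES in favor of lookup functions. A forward-compatible closest_color could look something like the sketch below, assuming a webcolors version that provides names() and name_to_rgb(); note the actual fix described above pins the requirement instead, so the shipped code may differ.

import webcolors

def closest_color(requested_color):
    # Nearest CSS3 color name for an (r, g, b) tuple, written against the
    # newer webcolors API rather than the removed CSS3_HEX_TO_NAMES mapping.
    distances = {}
    for name in webcolors.names("css3"):
        r_c, g_c, b_c = webcolors.name_to_rgb(name, "css3")
        rd = (r_c - requested_color[0]) ** 2
        gd = (g_c - requested_color[1]) ** 2
        bd = (b_c - requested_color[2]) ** 2
        distances[rd + gd + bd] = name
    return distances[min(distances)]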

kshoichi commented 2 weeks ago

Deleting the "Additional attribute to expose in the context: RGB" in the model's options will allow you to use the integration until the new version is available.

acon96 commented 2 weeks ago

This should be fixed in v0.3.3.