Teagan42 opened this issue 2 months ago
Same issue here
Can you check the Home Assistant log for errors related to conflicting dependencies? My best guess is that another integration you are using requires the most recent version of webcolors, which no longer exposes those dictionaries.
Same issue here. HASS 2024.8.2, home-llm 0.3.6, LLM Model 'fixt/home-3b-v3:q4_k_m' (remote).
Temporarily fixed by removing the additional attribute "rgb_color" (which is included by default).
Same issue, here is the traceback:
Logger: homeassistant.components.assist_pipeline.pipeline
Source: components/assist_pipeline/pipeline.py:1015
integration: Assist pipeline ([documentation](https://www.home-assistant.io/integrations/assist_pipeline), [issues](https://github.com/home-assistant/core/issues?q=is%3Aissue+is%3Aopen+label%3A%22integration%3A+assist_pipeline%22))
First occurred: 16:52:27 (6 occurrences)
Last logged: 17:01:00
Unexpected error during intent recognition
```
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/assist_pipeline/pipeline.py", line 1015, in recognize_intent
    conversation_result = await conversation.async_converse(
                          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/agent_manager.py", line 108, in async_converse
    result = await method(conversation_input)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/conversation/entity.py", line 47, in internal_async_process
    return await self.async_process(user_input)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/conversation.py", line 365, in async_process
    message = self._generate_system_prompt(raw_prompt, llm_api)
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/conversation.py", line 776, in _generate_system_prompt
    exposed_attributes = expose_attributes(attributes)
                         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/conversation.py", line 759, in expose_attributes
    value = F"{closest_color(value)} {value}"
               ^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/utils.py", line 31, in closest_color
    for key, name in webcolors.CSS3_HEX_TO_NAMES.items():
                     ^^^^^^^^^^^^^^^^^^^^^^^^^^^
AttributeError: module 'webcolors' has no attribute 'CSS3_HEX_TO_NAMES'
```
Home-Assistant: 2024.8.1, llama-conversation: 0.3.6, webcolors: 1.13
The exact same issue was marked as fixed in 0.3.3 but appears not to be: https://github.com/acon96/home-llm/issues/165. What is odd is that I looked at the webcolors package inside of Home Assistant, and the constant definitely exists and is exported.
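For reference, a version-tolerant sketch of the failing helper. The function and variable names here are assumptions based on the traceback, not the integration's actual code; newer webcolors releases (24.x) removed the module-level `CSS3_HEX_TO_NAMES` dict in favor of a `names()` / `name_to_hex()` API, so a shim along these lines should work on either version:

```python
import math

def closest_color(rgb, hex_to_names):
    """Return the color name whose hex value is nearest (squared RGB distance) to rgb."""
    best_name, best_dist = None, math.inf
    for hex_value, name in hex_to_names.items():
        # Parse "#rrggbb" into integer channels.
        r, g, b = (int(hex_value[i:i + 2], 16) for i in (1, 3, 5))
        dist = (r - rgb[0]) ** 2 + (g - rgb[1]) ** 2 + (b - rgb[2]) ** 2
        if dist < best_dist:
            best_name, best_dist = name, dist
    return best_name

def css3_hex_to_names():
    """Build the hex -> name mapping on both old and new webcolors releases."""
    import webcolors
    try:
        # webcolors <= 1.13 exposed the mapping as a module-level dict.
        return webcolors.CSS3_HEX_TO_NAMES
    except AttributeError:
        # webcolors >= 24.6 removed the dicts; rebuild the mapping from the
        # names() API (exact call signatures may differ by version -- verify
        # against the installed release).
        return {
            webcolors.name_to_hex(name, spec="css3"): name
            for name in webcolors.names("css3")
        }
```

With a shim like this, the lookup in `utils.py` would iterate `css3_hex_to_names()` instead of touching `webcolors.CSS3_HEX_TO_NAMES` directly, avoiding the `AttributeError` regardless of which webcolors version another integration has pulled in.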
**Describe the bug**
When performing a chat completion via the Assist Pipeline, the integration raises an `AttributeError`.

**Expected behavior**
The Assist Pipeline should be able to determine the intent.

**Logs**
If applicable, please upload any error or debug logs output by Home Assistant.