acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

"Unexpected error" when trying to add the Integration to Devices #97

Closed: thisIsLoading closed this issue 6 months ago

thisIsLoading commented 6 months ago

Describe the bug
When trying to add the integration and configure it for my LocalAI instance, which runs on a different server on the same network, I get the following error:

[screenshot: "Unexpected error" dialog in the config flow]

What's interesting is that the dialog text reads "Provide the connection details for an instance of text-generation-webui that is hosting the model." even though I selected LocalAI on the first screen:

[screenshot: backend selection screen with LocalAI selected]

Expected behavior
It should add the integration and start communicating with the LocalAI instance.

Logs
There isn't much in the logs:

2024-03-24 08:40:37.962 ERROR (MainThread) [custom_components.llama_conversation.config_flow] Unexpected exception
Traceback (most recent call last):
  File "/config/custom_components/llama_conversation/config_flow.py", line 481, in async_step_remote_model
    if error_message:
       ^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'error_message' where it is not associated with a value
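
For context, the traceback shows a classic Python pitfall: error_message in async_step_remote_model is apparently only assigned on some code paths (presumably the failure branches), so the if error_message: check raises UnboundLocalError whenever those branches are skipped. A minimal sketch of the pattern and the usual fix; validate_connection is an illustrative stand-in, not the actual config_flow.py code:

# Minimal reproduction of the UnboundLocalError from the traceback above.

def validate_connection(ok: bool) -> str:
    """Buggy pattern: error_message is only bound on the failure path."""
    if not ok:
        error_message = "could not reach backend"
    # When ok is True, the name was never assigned, so this check raises.
    if error_message:
        return error_message
    return "connected"


def validate_connection_fixed(ok: bool) -> str:
    """Fix: initialize the variable before any conditional assignment."""
    error_message = None
    if not ok:
        error_message = "could not reach backend"
    if error_message:
        return error_message
    return "connected"


print(validate_connection_fixed(True))   # connected
print(validate_connection_fixed(False))  # could not reach backend
try:
    print(validate_connection(True))
except UnboundLocalError as ex:
    print("Bug reproduced:", ex)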

What I can tell is that there is ZERO attempt to connect to LocalAI either (I don't know if there should be one during the config phase), as there is nothing in the LocalAI logs indicating any communication.
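
To rule out network problems independently of the integration, the LocalAI endpoint can be probed directly from the Home Assistant host. A quick sketch using only the Python standard library; the address is a placeholder, and the /v1/models path assumes LocalAI's OpenAI-compatible API:

# Reachability check against a LocalAI instance.
# BASE_URL is a placeholder; /v1/models assumes LocalAI's
# OpenAI-compatible API surface.
import json
import urllib.request

BASE_URL = "http://192.168.1.100:8080"

try:
    with urllib.request.urlopen(f"{BASE_URL}/v1/models", timeout=5) as resp:
        models = json.load(resp)
        print("Reachable; models:", [m.get("id") for m in models.get("data", [])])
except OSError as ex:
    print("Not reachable:", ex)

In this case, though, the traceback suggests the config flow crashes before any connection attempt is made, which would explain the empty LocalAI logs.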

acon96 commented 6 months ago

This should be fixed in v0.2.10