Describe the bug
When trying to add the integration and configure it for my LocalAI instance, which runs on a different server on the same network, I get the following error:
What's interesting is that the text reads "Provide the connection details for an instance of text-generation-webui that is hosting the model." even though I selected LocalAI on the first screen:
Expected behavior
It should add the integration and start communicating with it.
Logs
There isn't much in the logs:
2024-03-24 08:40:37.962 ERROR (MainThread) [custom_components.llama_conversation.config_flow] Unexpected exception
Traceback (most recent call last):
File "/config/custom_components/llama_conversation/config_flow.py", line 481, in async_step_remote_model
if error_message:
^^^^^^^^^^^^^
UnboundLocalError: cannot access local variable 'error_message' where it is not associated with a value
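For what it's worth, the traceback looks like the classic Python pattern where a variable is only bound on some branches. A minimal sketch of what I suspect is happening (the function and the backend check are my guesses, not the actual config_flow.py code; only `error_message` and the UnboundLocalError come from the log):

```python
# Hypothetical reduction of the bug. Everything except `error_message`
# and the UnboundLocalError is an assumption, not the real integration code.

def step_remote_model(selected_backend: str) -> str:
    if selected_backend == "text-generation-webui":
        error_message = "missing_model"  # only bound on this branch

    # When another backend (e.g. LocalAI) is selected, `error_message` was
    # never assigned, so this check raises UnboundLocalError before any
    # connection attempt is made, which would also explain the empty
    # LocalAI logs.
    if error_message:
        return error_message
    return "ok"


def step_remote_model_fixed(selected_backend: str) -> str:
    error_message = None  # bound on every code path
    if selected_backend == "text-generation-webui":
        error_message = "missing_model"

    if error_message:
        return error_message
    return "ok"


print(step_remote_model_fixed("localai"))  # prints "ok"
# step_remote_model("localai") raises UnboundLocalError, matching the log
```

If that's roughly what line 481 is doing, initializing `error_message = None` before the branches should be enough to fix the crash.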
What I can tell is that there is ZERO attempt to connect to LocalAI either (I don't know if there should be one during the config phase, though), as there's nothing in the LocalAI logs indicating any communication.