acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

v0.2.14 - integration failed to configure #135

Closed pbn42 closed 2 months ago

pbn42 commented 2 months ago

Hi,

Describe the bug

I just tried your new version (v0.2.14). Installation works fine, but when I finished creating the integration, I got a "failed to configure" message.

I tried using these two methods:

Expected behavior
Integration shall start and should appear as a conversation agent

Logs
If applicable, please upload any error or debug logs output by Home Assistant.

```
Logger: homeassistant.config_entries
Source: config_entries.py:575
First occurred: 22:01:06 (3 occurrences)
Last logged: 22:18:11

Error setting up entry LLM Model 'acon96/Home-3B-v3-GGUF' (llama.cpp) for llama_conversation
Error setting up entry LLM Model 'Home-3B-v3.q3_k_m.gguf' (llama.cpp) for llama_conversation
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 575, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 67, in async_setup_entry
    agent = await hass.async_add_executor_job(create_agent, backend_type)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 63, in create_agent
    return agent_cls(hass, entry)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 137, in __init__
    self._load_model(entry)
  File "/config/custom_components/llama_conversation/agent.py", line 540, in _load_model
    validate_llama_cpp_python_installation()
  File "/config/custom_components/llama_conversation/utils.py", line 80, in validate_llama_cpp_python_installation
    multiprocessing.set_start_method('spawn') # required because of aio
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/multiprocessing/context.py", line 247, in set_start_method
    raise RuntimeError('context has already been set')
RuntimeError: context has already been set
```
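For context on the failure above: `multiprocessing.set_start_method()` may only be called once per process, so if the Home Assistant runtime (or anything else) has already fixed the start method, a second plain call raises `RuntimeError: context has already been set`. A minimal reproduction (standalone, outside Home Assistant):

```python
import multiprocessing

# First call: force=True overrides any start method already set in this
# process, so this line always succeeds.
multiprocessing.set_start_method("spawn", force=True)

try:
    # Second call without force=True fails, because the multiprocessing
    # context is now fixed for the lifetime of the process.
    multiprocessing.set_start_method("spawn")
    error_message = None
except RuntimeError as exc:
    error_message = str(exc)

print(error_message)  # → context has already been set
```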

Thanks a lot for your work and your help !

pbn42 commented 2 months ago

And I didn't change any parameters.

acon96 commented 2 months ago

This should be fixed in v0.2.15.
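A defensive pattern for this class of error (a sketch of the general fix, not necessarily what v0.2.15 actually does) is to only set the start method when no context has been established yet, which makes the call idempotent:

```python
import multiprocessing

# Only set the start method if nothing has fixed the context yet;
# allow_none=True returns None instead of lazily picking a default.
if multiprocessing.get_start_method(allow_none=True) is None:
    multiprocessing.set_start_method("spawn")
```

Alternatively, `set_start_method("spawn", force=True)` always succeeds, but it silently overrides whatever the host process chose, which can be riskier inside an embedding application like Home Assistant.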