acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Failed to properly initialize llama-cpp-python. (Exit code 1.) #176

Open 912-Cireap-Bogdan opened 2 weeks ago

912-Cireap-Bogdan commented 2 weeks ago

Describe the bug
Installation from HACS worked fine; however, when initializing the integration I get a "Failed to set up" error in the HA UI.

Expected behavior
The integration would install and configure properly with the basic default settings.

Logs

Traceback (most recent call last):
  File "<string>", line 1, in <module>
  File "/usr/local/lib/python3.12/multiprocessing/spawn.py", line 122, in spawn_main
    exitcode = _main(fd, parent_sentinel)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/multiprocessing/spawn.py", line 132, in _main
    self = reduction.pickle.load(from_parent)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 7, in <module>
    import homeassistant.components.conversation as ha_conversation
  File "/usr/src/homeassistant/homeassistant/components/conversation/__init__.py", line 11, in <module>
    from homeassistant.config_entries import ConfigEntry
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 30, in <module>
    from .components import persistent_notification
  File "/usr/src/homeassistant/homeassistant/components/persistent_notification/__init__.py", line 14, in <module>
    from homeassistant.components import websocket_api
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/__init__.py", line 14, in <module>
    from . import commands, connection, const, decorators, http, messages  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/components/websocket_api/http.py", line 15, in <module>
    from homeassistant.components.http import KEY_HASS, HomeAssistantView
  File "/usr/src/homeassistant/homeassistant/components/http/__init__.py", line 44, in <module>
    from homeassistant.helpers.network import NoURLAvailableError, get_url
  File "/usr/src/homeassistant/homeassistant/helpers/network.py", line 9, in <module>
    from hass_nabucasa import remote
  File "/usr/local/lib/python3.12/site-packages/hass_nabucasa/__init__.py", line 30, in <module>
    from .remote import RemoteUI
  File "/usr/local/lib/python3.12/site-packages/hass_nabucasa/remote.py", line 22, in <module>
    from .acme import AcmeClientError, AcmeHandler, AcmeJWSVerificationError
  File "/usr/local/lib/python3.12/site-packages/hass_nabucasa/acme.py", line 13, in <module>
    from acme import challenges, client, crypto_util, errors, messages
  File "/usr/local/lib/python3.12/site-packages/acme/challenges.py", line 24, in <module>
    from acme import crypto_util
  File "/usr/local/lib/python3.12/site-packages/acme/crypto_util.py", line 23, in <module>
    from acme import errors
  File "/usr/local/lib/python3.12/site-packages/acme/errors.py", line 52, in <module>
    class MissingNonce(NonceError):
  File "/usr/local/lib/python3.12/site-packages/acme/errors.py", line 62, in MissingNonce
    def __init__(self, response: requests.Response, *args: Any) -> None:
                                 ^^^^^^^^^^^^^^^^^
AttributeError: module 'requests' has no attribute 'Response'
2024-06-20 12:20:12.797 ERROR (MainThread) [homeassistant.config_entries] Error setting up entry LLM Model 'acon96/Home-3B-v3-GGUF' (llama.cpp) for llama_conversation
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 594, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 83, in async_setup_entry
    await agent._async_load_model(entry)
  File "/config/custom_components/llama_conversation/agent.py", line 201, in _async_load_model
    return await self.hass.async_add_executor_job(
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 805, in _load_model
    validate_llama_cpp_python_installation()
  File "/config/custom_components/llama_conversation/utils.py", line 132, in validate_llama_cpp_python_installation
    raise Exception(f"Failed to properly initialize llama-cpp-python. (Exit code {process.exitcode}.)")
Exception: Failed to properly initialize llama-cpp-python. (Exit code 1.)
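For anyone debugging this: the first traceback comes from a child interpreter started with the "spawn" method, and it dies because `requests` resolves to something that lacks `Response` in that fresh process (a shadowing file or a partially initialized module can cause this). A minimal diagnostic sketch, run with the same Python interpreter Home Assistant uses, is below; the helper names `module_origin` and `child` are mine, not part of the integration:

```python
import importlib.util
import multiprocessing

def module_origin(name: str) -> str:
    # Report which file Python actually resolves for the given module;
    # a stray requests.py shadowing the real package would show up here.
    spec = importlib.util.find_spec(name)
    return spec.origin if spec and spec.origin else "not found"

def child(name: str) -> None:
    # Runs in a freshly spawned interpreter, mimicking how the failing
    # traceback's multiprocessing.spawn child imports its modules.
    print(name, "->", module_origin(name))

if __name__ == "__main__":
    # Use the same "spawn" start method seen in the traceback, so the
    # child re-imports everything from scratch instead of forking state.
    ctx = multiprocessing.get_context("spawn")
    p = ctx.Process(target=child, args=("requests",))
    p.start()
    p.join()
    print("child exit code:", p.exitcode)
```

If the printed path points anywhere other than the real `requests` package under `site-packages`, that shadowing is a likely cause of the `AttributeError`.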

Thanks in advance!