acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Fail to load `llama_cpp` on a fresh install #144

Closed: benbender closed this 3 weeks ago

benbender commented 1 month ago

Describe the bug
After a fresh install of home-llm, the integration fails to load: it cannot find the llama_cpp module, even though the log reports that the module was installed successfully.

Home Assistant is running inside Podman on a small server with an Intel(R) Celeron(R) N5105 @ 2.00GHz CPU.
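For reference, this is how I checked that the module really isn't importable from the container's Python after the reported install (a minimal sketch; it assumes you run it with the same interpreter the Home Assistant image ships, so paths and invocation may differ on your setup):

# Quick check from inside the Home Assistant container's Python.
import importlib.util
import sys

print(sys.version)                            # the image ships Python 3.12, matching the cp312 wheel tag
print(importlib.util.find_spec("llama_cpp"))  # None means the module is not on sys.path for this interpreter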

Expected behavior
The integration should load successfully :)

Logs

2024-05-11 20:17:20.609 INFO (SyncWorker_36) [custom_components.llama_conversation.agent] Using model file '/config/media/models/models--acon96--Home-3B-v3-GGUF/snapshots/1f20ec6ddf2cbf9e6996c9f8a524bc5d80abb42e/Home-3B-v3.q3_k_m.gguf'
2024-05-11 20:17:22.868 INFO (SyncWorker_36) [homeassistant.util.package] Attempting install of https://github.com/acon96/home-llm/releases/download/v0.2.17/llama_cpp_python-0.2.70-cp312-cp312-musllinux_1_2_x86_64.whl
2024-05-11 20:17:30.569 INFO (SyncWorker_36) [custom_components.llama_conversation.utils] llama-cpp-python successfully installed from GitHub release
2024-05-11 20:17:32.774 ERROR (MainThread) [homeassistant.config_entries] Error setting up entry LLM Model 'acon96/Home-3B-v3-GGUF' (llama.cpp) for llama_conversation
Traceback (most recent call last):
  File "/config/custom_components/llama_conversation/agent.py", line 546, in _load_model
    self.llama_cpp_module = importlib.import_module("llama_cpp")
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/util/loop.py", line 144, in protected_loop_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp'

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/config_entries.py", line 575, in async_setup
    result = await component.async_setup_entry(hass, self)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 67, in async_setup_entry
    agent = await hass.async_add_executor_job(create_agent, backend_type)
            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/concurrent/futures/thread.py", line 58, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/__init__.py", line 63, in create_agent
    return agent_cls(hass, entry)
           ^^^^^^^^^^^^^^^^^^^^^^
  File "/config/custom_components/llama_conversation/agent.py", line 139, in __init__
    self._load_model(entry)
  File "/config/custom_components/llama_conversation/agent.py", line 554, in _load_model
    self.llama_cpp_module = importlib.import_module("llama_cpp")
                            ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/src/homeassistant/homeassistant/util/loop.py", line 144, in protected_loop_func
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/importlib/__init__.py", line 90, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "<frozen importlib._bootstrap>", line 1387, in _gcd_import
  File "<frozen importlib._bootstrap>", line 1360, in _find_and_load
  File "<frozen importlib._bootstrap>", line 1324, in _find_and_load_unlocked
ModuleNotFoundError: No module named 'llama_cpp'
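For anyone else hitting this, the quickest way I found to see whether the downloaded wheel actually ended up in the interpreter Home Assistant uses is to ask pip directly (sketch only; assumes pip is available in the container image, and the package name is taken from the install log above):

# Ask pip, for the same interpreter, whether llama-cpp-python is installed and where.
import subprocess
import sys

subprocess.run([sys.executable, "-m", "pip", "show", "llama-cpp-python"], check=False)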

PS: Thanks for your awesome work for the community! <3

benbender commented 3 weeks ago

Duplicate of #140