acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

404 when adding the integration #128

Closed toxic0berliner closed 2 months ago

toxic0berliner commented 2 months ago

Describe the bug
Unable to add the llama integration

Environment
HAOS VM on PVE 8 with 4 GB RAM. Added HACS, installed the integration from HACS, and rebooted. Then added the llama integration and selected "download from HuggingFace".

Logs
The download fails with an HTTP 404 at the exact URL below:

Unable to install package https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl: ERROR: HTTP error 404 while getting https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl ERROR:
wilcomir commented 2 months ago

I have the same issue. Something must be slightly broken in the link generation for the .whl file; the correct link should contain the `-fix` suffix:

https://github.com/acon96/home-llm/releases/download/v0.2.13-fix/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl
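Comparing the two URLs suggests the mismatch: the integration builds the download link from its own version string (`v0.2.13`), while the wheel is actually attached to the release tagged `v0.2.13-fix`. A minimal illustrative sketch of this failure mode (the `wheel_url` helper is hypothetical, not the integration's actual code):

```python
# Hypothetical sketch of the wheel-URL construction that likely broke here.
# The GitHub release *tag* (v0.2.13-fix) differs from the integration
# *version* (v0.2.13); building the URL from the version alone yields a 404.

BASE = "https://github.com/acon96/home-llm/releases/download"

def wheel_url(tag: str, llama_cpp_version: str, platform_tag: str) -> str:
    """Build the download URL for a llama-cpp-python wheel attached to a release tag."""
    wheel = f"llama_cpp_python-{llama_cpp_version}-{platform_tag}.whl"
    return f"{BASE}/{tag}/{wheel}"

plat = "cp312-cp312-musllinux_1_2_x86_64"

# Broken: version string used as the tag -> the 404 URL from the log above
bad = wheel_url("v0.2.13", "0.2.64", plat)

# Working: the actual release tag carries the -fix suffix
good = wheel_url("v0.2.13-fix", "0.2.64", plat)
```

Pinning the release tag independently of the reported version (or downgrading, as below) avoids the mismatch.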

I will try downgrading to 0.2.12; that should make things work.

acon96 commented 2 months ago

This should be fixed in v0.2.14