Closed: juan11perez closed this issue 4 months ago.
Same error here:

2024-02-16 16:36:24.605 ERROR (SyncWorker_9) [homeassistant.util.package] Unable to install package /config/custom_components/llama_conversation/llama_cpp_python-0.2.42-cp311-cp311-musllinux_1_2_x86_64.whl: ERROR: llama_cpp_python-0.2.42-cp311-cp311-musllinux_1_2_x86_64.whl is not a supported wheel on this platform.
2024-02-16 16:36:24.606 WARNING (MainThread) [custom_components.llama_conversation.config_flow] Failed to install wheel: False
I'm on a NUC (Intel Celeron N5095) running HAOS in a Proxmox VM. Both wheels from https://github.com/acon96/home-llm/tree/develop/dist are in the custom_components/llama_conversation/ folder.
Is there any way to build a custom wheel locally?
Thanks for your amazing work!
Home Assistant 2024.2.1 updated from Python 3.11 to Python 3.12. I have published new wheels compatible with 3.12 in the /dist folder.
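To see why a cp311 wheel is rejected on Python 3.12, you can check which wheel tags pip will accept. A quick diagnostic sketch (standard Python/pip commands, assumed to be run inside the Home Assistant container):

```shell
# Print the interpreter version; a cp311 wheel only matches Python 3.11.
python3 --version

# List the wheel tags pip accepts on this platform. The cp311 musllinux
# wheel fails because no cp311 tag appears in this list on Python 3.12.
python3 -m pip debug --verbose | grep -i "cp31"
```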
@pbn42 You can build your own wheel via the /dist/run_docker.sh script, with Docker installed. It builds the wheel inside the HA core image to ensure compatibility at runtime.
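A sketch of those build steps, assuming Docker is installed; the script name comes from the repo, but the clone URL and the destination path are assumptions you should adapt:

```shell
# Clone the project and run the wheel-build helper, which compiles the
# wheel inside the HA core image so it matches HAOS's runtime.
git clone https://github.com/acon96/home-llm.git
cd home-llm/dist
./run_docker.sh

# Copy the resulting wheel next to the integration so the config flow
# can find it (the exact wheel filename varies with versions).
cp ./*.whl /config/custom_components/llama_conversation/
```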
@acon96 Thank you. What do I type in the Local File Name box in the config flow?
@juan11perez It should be wherever you placed the model file on the Home Assistant filesystem.
I put mine in a /config/models folder that I made, so the path would be /config/models/home-3b-v2.q5_k_m.gguf (replace with the quant level you downloaded).
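For example (paths are assumptions; substitute the quant level you actually downloaded):

```shell
# Create the models folder and move the downloaded GGUF into it.
mkdir -p /config/models
mv ./home-3b-v2.q5_k_m.gguf /config/models/

# Then enter this path in the "Local File Name" box:
#   /config/models/home-3b-v2.q5_k_m.gguf
```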
Alternatively, if you choose the Llama.cpp (Huggingface) backend, it will ask for a Repo ID and quantization level instead and will download the model for you.
@acon96 Thank you. I was able to install it with your instructions, but as soon as I invoked it, it crashed my server.
The only thing I know of that could cause that is llama.cpp failing to load the GBNF grammar file; that hard-segfaults the entire Home Assistant process. I need to figure that out too, since it's not a good user experience.
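One way to keep a bad grammar file from taking down the whole process is to probe it in a child process first, where a segfault only kills the child. This is a minimal sketch of that pattern, not the integration's actual code; the check inside `_load_grammar` is a stand-in for the real native llama.cpp call:

```python
import multiprocessing as mp


def _load_grammar(path: str) -> None:
    # Stand-in for the native call (e.g. llama_cpp.LlamaGrammar.from_file);
    # the real call can segfault on a malformed GBNF file.
    with open(path) as f:
        text = f.read()
    if "::=" not in text:
        raise ValueError("not a GBNF grammar")


def grammar_loads_safely(path: str, timeout: float = 30.0) -> bool:
    # Run the risky load in a child process: a segfault there shows up
    # as a nonzero/negative exitcode instead of killing the caller.
    proc = mp.Process(target=_load_grammar, args=(path,))
    proc.start()
    proc.join(timeout)
    if proc.is_alive():
        proc.terminate()
        proc.join()
        return False
    return proc.exitcode == 0
```

The caller can then refuse to start the conversation agent when the probe fails, instead of crashing at first invocation.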
Thank you again @acon96.
You're welcome. Closing as finished.
I installed the integration and manually downloaded/installed the local model.
When trying to add the integration to homeassistant and selecting 'local model' I get this error:
Pip returned an error while installing the wheel
Thank you