acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

v2.13-fix breaks setup #133

Closed Portagoras closed 2 months ago

Portagoras commented 2 months ago

Describe the bug
Links still point to the v0.2.13 release assets, which you have deleted. So you can't really set up the addon at the moment.

Expected behavior
Setup works and the .whl files are found in the release assets on GitHub.

Steps to reproduce the issue

  1. Install the HACS Addon (fresh install)
  2. Try to set it up
  3. See the 404 error when it tries to acquire the following file: 'https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl'

Logs

The following entries appeared, in order:
Logger: homeassistant.util.package
Source: util/package.py:123
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Unable to install package https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl: ERROR: HTTP error 404 while getting https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl ERROR: Could not install requirement llama-cpp-python==0.2.64 from https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl because of HTTP error 404 Client Error: Not Found for url: https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl for URL https://github.com/acon96/home-llm/releases/download/v0.2.13/llama_cpp_python-0.2.64-cp312-cp312-musllinux_1_2_x86_64.whl

Logger: custom_components.llama_conversation.utils
Source: custom_components/llama_conversation/utils.py:118
integration: LLaMA Conversation (documentation)
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Error installing llama-cpp-python. Could not install the binary wheels from GitHub for platform: x86_64, python version: 3.12. Please manually build or download the wheels and place them in the `/config/custom_components/llama_conversation` directory. Make sure that you download the correct .whl file for your platform and python version from the GitHub releases page.

Logger: custom_components.llama_conversation.config_flow
Source: custom_components/llama_conversation/config_flow.py:364
integration: LLaMA Conversation (documentation)
First occurred: 12:50:02 AM (5 occurrences)
Last logged: 12:55:27 AM
Failed to install wheel: False
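As a manual workaround, the second log entry suggests downloading the matching wheel yourself and placing it in `/config/custom_components/llama_conversation`. A minimal sketch of building the expected wheel filename for your platform, assuming the naming pattern and version numbers inferred from the 404 logs above (`wheel_filename` is a hypothetical helper, not part of the integration):

```python
def wheel_filename(py_tag: str, arch: str, llama_cpp_version: str = "0.2.64") -> str:
    """Build the llama-cpp-python wheel filename the integration looks for.

    py_tag: CPython ABI tag, e.g. "cp312" for Python 3.12.
    arch:   machine architecture, e.g. "x86_64" or "aarch64".
    The musllinux platform tag is taken from the URL in the logs above.
    """
    return (
        f"llama_cpp_python-{llama_cpp_version}"
        f"-{py_tag}-{py_tag}-musllinux_1_2_{arch}.whl"
    )

# For Home Assistant on x86_64 with Python 3.12, this reproduces the
# filename from the failing URL in the logs:
print(wheel_filename("cp312", "x86_64"))
```

Download that file for your platform from the repository's releases page and drop it into the directory named in the error message.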

acon96 commented 2 months ago

This should be fixed now. I pushed an updated tag that references the correct GitHub URL.