acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Llama Conversation won't start after upgrading from 0.2.6 to 0.2.7 #83

Closed · mikey60 closed 4 months ago

mikey60 commented 4 months ago

Describe the bug
The only thing I did was upgrade from 0.2.6 to 0.2.7 and restart HA. The addon was working correctly before this. The error I see when I hover over the service is "Ollama server does not have the provided model: fixt/home-3b-v2:q5_k_m". I tried to add a new service with the same model but got the same error.

Additional information:

- I am running HA version 2024.2.1.
- I am running a local Ollama server. When I query the Ollama API it reports that the model is there (see the query sketch after the code block below).
- The model works fine when I use the Ollama integration to query the model.

Update: I was able to get it to start and run normally by commenting out the additional check added in the 0.2.7 update (see below):

        try:
            headers = {}
            if self.api_key:
                headers["Authorization"] = f"Bearer {self.api_key}"

            currently_downloaded_result = requests.get(
                f"{self.api_host}/api/tags",
                headers=headers,
            )
            currently_downloaded_result.raise_for_status()

        except Exception as ex:
            _LOGGER.debug("Connection error was: %s", repr(ex))
            raise ConfigEntryNotReady("There was a problem connecting to the remote server") from ex

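        # NOTE: /api/tags reports model names with their tag included
        # (e.g. "fixt/home-3b-v2:q5_k_m"), but the check below strips
        # everything after the colon before comparing against the
        # configured model name, so a configured name that includes a
        # tag can never match.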
        if not any(x["name"].split(":")[0] == self.model_name for x in currently_downloaded_result.json()["models"]):
            raise ConfigEntryNotReady(f"Ollama server does not have the provided model: {self.model_name}")

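For reference, the Ollama tags endpoint can be queried directly to see exactly which names the server reports. A minimal sketch, assuming a local Ollama server on its default port 11434:

    import requests

    # List the model names a local Ollama server reports via /api/tags.
    response = requests.get("http://localhost:11434/api/tags")
    response.raise_for_status()
    for model in response.json()["models"]:
        print(model["name"])  # names include the tag, e.g. "fixt/home-3b-v2:q5_k_m"

The reported names include the tag, which is what trips up the check above.
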
Expected behavior
I expected the addon to continue working after the upgrade, as it did before.

Logs
I enabled logging, but no errors were logged in the HA log.


acon96 commented 4 months ago

Thanks for the report. I wasn't properly handling the colon in the Ollama model name. I'll push an update to fix this in a bit.
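
For illustration, a colon-aware check could look something like this (just a sketch; the shipped fix may differ). If the configured name includes a tag, compare it against the full name:tag reported by /api/tags; otherwise compare base names only:

    def model_available(configured_name: str, server_models: list) -> bool:
        """Return True if the configured model exists on the Ollama server.

        server_models is the "models" list from /api/tags; each entry's
        "name" includes the tag, e.g. "fixt/home-3b-v2:q5_k_m".
        """
        for model in server_models:
            full_name = model["name"]            # "fixt/home-3b-v2:q5_k_m"
            base_name = full_name.split(":")[0]  # "fixt/home-3b-v2"
            if configured_name in (full_name, base_name):
                return True
        return False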

acon96 commented 4 months ago

Should be fixed in v0.2.8