acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Unexpected error when configuring llama conversation #148

Closed Overbite5117 closed 1 month ago

Overbite5117 commented 1 month ago


Describe the bug
Configuring the llama conversation integration to connect to Ollama fails with an unexpected error and resets the input form.

Expected behavior
Configuring the llama conversation integration should let me add the selected Ollama model (llama3:8b in this case) to Home Assistant.

Logs
Home Assistant: llmerror4, llmerror2 (screenshot attachments); Ollama docker log: llmerror3 (screenshot attachment)
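Since the log screenshots above aren't reproduced here, one basic check that rules out a networking problem between Home Assistant and the Ollama container is to query Ollama's `/api/tags` endpoint, which lists the models the server has pulled. The sketch below uses only the Python standard library; the host and port are the Ollama defaults and may need adjusting for your setup.

```python
import json
import urllib.error
import urllib.request


def list_ollama_models(base_url="http://localhost:11434"):
    """Return the model names an Ollama server reports, or None if unreachable."""
    try:
        # /api/tags is Ollama's endpoint for listing locally available models.
        with urllib.request.urlopen(f"{base_url}/api/tags", timeout=5) as resp:
            data = json.load(resp)
        return [m["name"] for m in data.get("models", [])]
    except (urllib.error.URLError, OSError):
        # Connection refused, DNS failure, timeout, etc.
        return None


models = list_ollama_models()
if models is None:
    print("Ollama is not reachable; check the host/port and Docker networking.")
else:
    print("Available models:", models)
```

If this returns a list containing `llama3:8b`, the server side is fine and the failure (as the resolution below suggests) is more likely a Home Assistant version incompatibility than a connectivity issue.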
acon96 commented 1 month ago

What version of Home Assistant is this running on?

Overbite5117 commented 1 month ago

> What version of Home Assistant is this running on?

This is running on version 2023.11.0

Overbite5117 commented 1 month ago

Upgraded Home Assistant to 2024.5.4 and I'm now able to get things configured.