home-assistant / core

:house_with_garden: Open source home automation that puts local control and privacy first.
https://www.home-assistant.io
Apache License 2.0

Ollama 0.1.40 integration seems to have stopped working in 2024.6.0 #118972

Closed: carroarmato0 closed this issue 2 weeks ago

carroarmato0 commented 4 months ago

The problem

The Ollama integration fails to connect to a local running instance of Ollama 0.1.40

What version of Home Assistant Core has the issue?

core-2024.6.0

What was the last working version of Home Assistant Core?

core-2024.5.0

What type of installation are you running?

Home Assistant Container

Integration causing the issue

ollama

Link to integration documentation on our website

https://www.home-assistant.io/integrations/ollama

Diagnostics information

No response

Example YAML snippet

No response

Anything in the logs that might be useful for us?

No response

Additional information

If I use an external tool like open-webui I am able to interface with Ollama, confirming that it works. My instance is reachable from Home Assistant at the URL http://ollama.apps.lan. Unfortunately, I seem unable to turn on debug logging, since the integration cannot be created without a working URL.

My ollama instance already has 2 models preinstalled.

I verified that Ollama is reachable from within the Home Assistant container by curl'ing the address:

home-assistant:/config# ip add
1: lo: <LOOPBACK,UP,LOWER_UP> mtu 65536 qdisc noqueue state UNKNOWN qlen 1000
    link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00
    inet 127.0.0.1/8 scope host lo
       valid_lft forever preferred_lft forever
    inet6 ::1/128 scope host 
       valid_lft forever preferred_lft forever
197: eth0@if2: <BROADCAST,MULTICAST,UP,LOWER_UP,M-DOWN> mtu 1500 qdisc noqueue state UP 
    link/ether 02:42:c0:a8:32:0b brd ff:ff:ff:ff:ff:ff
    inet 192.168.50.11/24 brd 192.168.50.255 scope global eth0
       valid_lft forever preferred_lft forever
    inet6 2a02:1811:c40d:7780:42:c0ff:fea8:320b/64 scope global dynamic flags 100 
       valid_lft 85929sec preferred_lft 29607sec
    inet6 2a02:1811:c40d:7780::5/64 scope global flags 02 
       valid_lft forever preferred_lft forever
    inet6 fe80::42:c0ff:fea8:320b/64 scope link 
       valid_lft forever preferred_lft forever
home-assistant:/config# curl http://ollama.apps.lan
Ollama is running
home-assistant:/config# 
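One caveat worth noting: curl'ing the server root only proves the HTTP server is up. The config flow calls the Ollama REST API's model-list endpoint, `/api/tags`, and expects JSON back, so a closer check (assuming the same `http://ollama.apps.lan` base URL as above) would be:

```shell
# client.list() in the integration hits /api/tags, not the root page.
# -s silences progress output, -f makes curl exit non-zero on HTTP errors.
curl -sf --max-time 5 http://ollama.apps.lan/api/tags \
  || echo "api/tags not reachable or returned an HTTP error"
```

If this prints plain text or an HTML page instead of a JSON document with a `models` key, something other than Ollama (a reverse proxy, for example) is answering at that URL.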
home-assistant[bot] commented 4 months ago

Hey there @synesthesiam, mind taking a look at this issue as it has been labeled with an integration (ollama) you are listed as a code owner for? Thanks!

Code owner commands

Code owners of `ollama` can trigger bot actions by commenting:

- `@home-assistant close` Closes the issue.
- `@home-assistant rename Awesome new title` Renames the issue.
- `@home-assistant reopen` Reopens the issue.
- `@home-assistant unassign ollama` Removes the current integration label and assignees on the issue; add the integration domain after the command.
- `@home-assistant add-label needs-more-information` Adds a label (needs-more-information, problem in dependency, problem in custom component) to the issue.
- `@home-assistant remove-label needs-more-information` Removes a label (needs-more-information, problem in dependency, problem in custom component) from the issue.

(message by CodeOwnersMention)


ollama documentation ollama source (message by IssueLinks)

shadynafie commented 3 months ago

Same issue here

carroarmato0 commented 3 months ago

Suddenly it seems to work, and I'm not sure what happened; the HA version didn't change. I did notice the UI updated (a browser reload was necessary), and I'm wondering if that hadn't happened after the upgrade.

shadynafie commented 3 months ago

I opened an incognito browser window, removed the integration, added it again, and restarted Home Assistant. But I'm still stuck.

jjnj023 commented 3 months ago

I see the following error in the log when trying to set up a new Ollama instance:


Logger: homeassistant.components.ollama.config_flow
Source: components/ollama/config_flow.py:89
Integration: Ollama (documentation, issues)
First occurred: 18:04:19 (1 occurrence)
Last logged: 18:04:19

Unexpected exception
Traceback (most recent call last):
  File "/usr/src/homeassistant/homeassistant/components/ollama/config_flow.py", line 89, in async_step_user
    response = await self.client.list()
               ^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/ollama/_client.py", line 618, in list
    return response.json()
           ^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/site-packages/httpx/_models.py", line 764, in json
    return jsonlib.loads(self.content, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/__init__.py", line 346, in loads
    return _default_decoder.decode(s)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 337, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/usr/local/lib/python3.12/json/decoder.py", line 355, in raw_decode
    raise JSONDecodeError("Expecting value", s, err.value) from None
json.decoder.JSONDecodeError: Expecting value: line 1 column 1 (char 0)
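The `JSONDecodeError` at the bottom of this traceback means the body the server returned wasn't JSON at all: `client.list()` parses whatever comes back from the list endpoint. A minimal sketch of that failure mode (`parse_model_list` is a hypothetical helper, only to illustrate the parsing step):

```python
import json

def parse_model_list(body: str):
    """Parse a response body the way client.list() ultimately does."""
    try:
        return json.loads(body)
    except json.JSONDecodeError as exc:
        return f"not JSON: {exc}"

# A real model-list payload is a JSON object with a "models" list.
print(parse_model_list('{"models": []}'))
# A plain-text banner or an HTML error page fails on the very first
# character, matching "Expecting value: line 1 column 1 (char 0)" above.
print(parse_model_list("Ollama is running"))
```

So a likely culprit is that the configured URL serves something other than the Ollama API, e.g. a proxy error page or the plain-text status banner.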
carroarmato0 commented 3 months ago

> Suddenly it seems to work, and I'm not sure what happened; the HA version didn't change. I did notice the UI updated (a browser reload was necessary), and I'm wondering if that hadn't happened after the upgrade.

Never mind. For some reason adding the integration briefly worked, but it doesn't actually function.

denisjoshua commented 3 months ago

Hi all, I have the same error (I think). In my log viewer I see this:

  File "/usr/local/lib/python3.12/contextlib.py", line 158, in __exit__
    self.gen.throw(value)
  File "/usr/local/lib/python3.12/site-packages/httpx/_transports/default.py", line 86, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.ReadError

Thanks in advance, Denis

issue-triage-workflows[bot] commented 3 weeks ago

There hasn't been any activity on this issue recently. Due to the high number of incoming GitHub notifications, we have to clean some of the old issues, as many of them have already been resolved with the latest updates. Please make sure to update to the latest Home Assistant version and check if that solves the issue. Let us know if that works for you by adding a comment 👍 This issue has now been marked as stale and will be closed if no further activity occurs. Thank you for your contributions.