jekalmin / extended_openai_conversation

A Home Assistant custom conversation agent component that uses OpenAI to control your devices.

Problems connecting to Ollama on local network machine #163

Open · griffindodd opened this issue 4 months ago

griffindodd commented 4 months ago

I have Ollama running models on a local machine on my network (192.x.x.101:11434). It is exposed to the network, so I can reach it from my Home Assistant machine and other devices.

When I try to add it as a service using the following info, I only get errors.

Name: LocalAi-Ollama
API Key:
base_url: http://192.x.x.101:11434/v1

I have included some log output to see if that helps...

2024-02-28 12:37:10.600 ERROR (MainThread) [custom_components.extended_openai_conversation.config_flow] Unexpected exception
Traceback (most recent call last):
  File "/config/custom_components/extended_openai_conversation/config_flow.py", line 140, in async_step_user
    await validate_input(self.hass, user_input)
  File "/config/custom_components/extended_openai_conversation/config_flow.py", line 113, in validate_input
    await validate_authentication(
  File "/config/custom_components/extended_openai_conversation/helpers.py", line 150, in validate_authentication
    await client.models.list(timeout=10)
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 265, in _get_page
    return await self._client.request(self._page_cls, self._options)
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1306, in request
    return await self._request(
  File "/usr/local/lib/python3.12/site-packages/openai/_base_client.py", line 1358, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.NotFoundError: 404 page not found

2024-02-28 12:37:27.912 ERROR (MainThread) [custom_components.extended_openai_conversation.config_flow] Unexpected exception
(same traceback as above)
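For reference, the validation step that fails here is essentially a models.list() call made by the OpenAI client against the configured base_url. A minimal standalone sketch that reproduces the same request outside Home Assistant (the IP is the one from this report; the API key is just a non-empty placeholder, since Ollama ignores it):

from openai import OpenAI

# Point the OpenAI client at the Ollama host's OpenAI-compatible endpoint.
client = OpenAI(
    base_url="http://192.x.x.101:11434/v1",
    api_key="ollama",  # placeholder; any non-empty string
)

# This is the same request the config flow makes during validation.
# A 404 here indicates the /v1 endpoint is not being served at that URL.
for model in client.models.list():
    print(model.id)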

danktankk commented 4 months ago

I have the same issue

Turns out this is the issue:

https://github.com/jekalmin/extended_openai_conversation/issues/158#issuecomment-1970575804

and I can't get LocalAI to install on my Unraid server :|
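If the linked issue points at the same root cause (an Ollama build that doesn't serve the OpenAI-compatible /v1 endpoint, so /v1/models returns 404), a quick way to confirm the server itself is reachable is to query Ollama's native API instead. A stdlib-only sketch, assuming the same host as above:

import json
import urllib.request

OLLAMA = "http://192.x.x.101:11434"  # host from the original report

# Ollama's native endpoint for listing installed models.
# If this works but <OLLAMA>/v1/models returns 404, the OpenAI-compatible
# API is likely missing from that Ollama build and an upgrade is needed.
with urllib.request.urlopen(f"{OLLAMA}/api/tags", timeout=10) as resp:
    tags = json.load(resp)

print([m["name"] for m in tags.get("models", [])])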

pajeronda commented 4 months ago

Take a look at this project: https://github.com/cheshire-cat-ai