acon96 / home-llm

A Home Assistant integration & Model to control your smart home using a Local LLM

Unable to setup to use openai API #147

Closed wilcomir closed 1 month ago

wilcomir commented 1 month ago

Describe the bug
I am trying to set up the component to use the "Generic OpenAI API Compatible API" backend, but I am unable to do so; here are my settings:

[screenshot of the integration settings]

I leave all the options as default.

When I try to use the conversation agent via the chat I get the following error:

Failed to communicate with the API! 404 Client Error: Not Found for url: https://api.openai.com:443/v1/completions

I tried setting the port to 0 and I get the following:

Failed to communicate with the API! 404 Client Error: Not Found for url: https://api.openai.com/v1/completions

Expected behavior
The conversation agent should be able to communicate with the API

I believe I am just dumb and not configuring this correctly - but what am I missing? Thanks!

acon96 commented 1 month ago

The OpenAI Compatible API backend isn't actually compatible with OpenAI's hosted GPT-4 endpoints. It is for using local AI models that implement the legacy OpenAI API spec (`/v1/completions`). If you would like to use ChatGPT to control your Home Assistant installation, I would recommend you check out the extended_openai_conversation integration.
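For reference, the request this backend sends is a legacy completions call, roughly like the sketch below. The localhost URL, model name, and parameter values are illustrative assumptions, not values from this thread:

```python
# A minimal sketch of the legacy /v1/completions request that the
# "Generic OpenAI Compatible API" backend expects a *local* server
# (e.g. llama.cpp's server or text-generation-webui) to implement.
import json
import urllib.request

def build_completion_request(prompt: str, model: str = "home-llm") -> dict:
    # Legacy completions take a raw "prompt" string, unlike the newer
    # chat API, which takes a list of "messages".
    return {
        "model": model,
        "prompt": prompt,
        "max_tokens": 128,
        "temperature": 0.1,
    }

def legacy_completion(prompt: str, base_url: str = "http://localhost:5000") -> str:
    # POST to the local server's legacy completions endpoint. OpenAI's
    # hosted API answers this path with 404 for chat-only models, which
    # matches the error reported above.
    req = urllib.request.Request(
        f"{base_url}/v1/completions",
        data=json.dumps(build_completion_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    # The completion text lives in choices[0]["text"], not in a chat "message".
    return body["choices"][0]["text"]
```

Pointing this at api.openai.com therefore fails for GPT-4, since that model is only served through the chat completions endpoint.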

wilcomir commented 1 month ago

Thanks a lot! For some reason I was under the impression it was compatible.

I truly appreciate your help; we can mark this one as solved.