Hi. When I try to use a custom endpoint (https://github.com/xtekky/gpt4free/) for the OpenAI chat, I get the error "Chat setup incomplete: The LLM endpoint is missing or not supported".

Only OpenAI, Gemini, and Ollama are supported. See the help file or the wiki pages for the endpoints supported by Chat and Assistant. For example, in Chat the endpoint needs to be one of:

Is there any way to allow custom endpoints to be used? There are many free models that accept OpenAI-compatible API requests, and it would be very convenient to point at a custom endpoint when working with such models.

Closed: iG8R closed this 2 weeks ago.
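For reference, the "OpenAI API-like requests" mentioned above share one wire format: a POST to `<base_url>/chat/completions` with a JSON body containing `model` and `messages`. A minimal sketch of that request shape, assuming a hypothetical local gpt4free-style proxy at `http://localhost:1337/v1` (the URL, model name, and helper function here are illustrative, not part of the plugin):

```python
import json
import urllib.request

# Assumed base URL for a local OpenAI-compatible proxy (illustrative only).
BASE_URL = "http://localhost:1337/v1"

def build_chat_request(base_url, model, messages, api_key="not-needed"):
    """Build (but do not send) an OpenAI-style chat-completions request,
    showing the wire format a custom endpoint would have to accept."""
    payload = {"model": model, "messages": messages}
    return urllib.request.Request(
        url=f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

req = build_chat_request(
    BASE_URL,
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Hello"}],
)
print(req.full_url)                    # http://localhost:1337/v1/chat/completions
print(json.loads(req.data)["model"])   # gpt-4o-mini
```

Any server that accepts this POST shape could in principle be used by pointing the plugin's OpenAI endpoint setting at its base URL, which is why allowing arbitrary endpoints (rather than a fixed allow-list) would cover these free models.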