Closed thelege2nd closed 7 months ago
At the moment, this option is not supported.
However, since letmedoit can be used with local models, it should be easy to change the API endpoint for openai too.
It makes sense; I will implement it in https://github.com/eliranwong/freegenius
Alternatively, a pull request to letmedoit is welcome.
You can use Gemini and Ollama for chat features without OpenAI in LetMeDoIt AI.
To fully support Gemini, Ollama, and Llama.cpp, i.e. for both chat and task execution, use the sibling project https://github.com/eliranwong/freegenius instead.
For using with open-source LLMs, please check: https://github.com/eliranwong/freegenius
I have not installed it yet, but I didn't see any documentation on how to change the API base URL.
I want to send the requests to my localhost:PORT, as that's where I am running a self-hosted API for GPT-4.
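For reference, the official openai Python client (v1+) accepts a base_url argument, so any OpenAI-compatible server can be targeted that way. Since letmedoit itself does not document such an option in this thread, here is a minimal stdlib-only sketch of what "pointing at a local endpoint" amounts to: building a /chat/completions request against a local base URL. The port 8000, the placeholder API key, and the build_chat_request helper are all assumptions for illustration, not letmedoit's actual configuration mechanism.

```python
import json
import urllib.request

# Assumed local OpenAI-compatible server; adjust the port to
# wherever your self-hosted API is listening.
BASE_URL = "http://localhost:8000/v1"

def build_chat_request(prompt: str, model: str = "gpt-4") -> urllib.request.Request:
    """Build an OpenAI-compatible /chat/completions request
    aimed at the local base URL instead of api.openai.com."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            # Placeholder key; most self-hosted servers ignore it.
            "Authorization": "Bearer sk-local",
        },
        method="POST",
    )

req = build_chat_request("Hello")
print(req.full_url)  # http://localhost:8000/v1/chat/completions
```

With the openai SDK the equivalent is OpenAI(base_url="http://localhost:PORT/v1", api_key=...), or setting the OPENAI_BASE_URL environment variable, provided the application exposes a way to pass those through.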