Open cztomsik opened 9 months ago
Would be great if Ava supported a remote OpenAI API. This would allow us to reuse the server and avoid loading the model multiple times when we are also using it from a different app.
Would be really great if Ava supported using an already running Ollama instance via its API!
Yes, this is in the works, but not finished yet.
Just a small update: the UI part has been rewritten and we now have a `/api/chat/completions` endpoint which is mostly compatible with OpenAI, so hopefully we are really close to closing this.
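For anyone who wants to poke at the new endpoint, here is a minimal sketch of building an OpenAI-style chat request against it. The payload shape is assumed to follow the OpenAI chat API (the comment says "mostly compatible"), and the host/port in the usage example is hypothetical, not Ava's documented default:

```python
import json
import urllib.request

def build_chat_request(base_url, messages, model=None):
    """Build a POST request for Ava's /api/chat/completions endpoint.

    The body fields ("messages", optional "model") mirror the OpenAI
    chat completions payload; this is an assumption based on the
    endpoint being described as mostly OpenAI-compatible.
    """
    body = {"messages": messages}
    if model is not None:
        body["model"] = model
    return urllib.request.Request(
        base_url.rstrip("/") + "/api/chat/completions",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
```

Usage would be something like `urllib.request.urlopen(build_chat_request("http://localhost:4321", [{"role": "user", "content": "hi"}]))`, with the port adjusted to wherever the local instance is listening.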
What's missing:
- `<ModelSelect>` needs to offer remote models (it's not yet clear which ones, and how to configure that)
- deciding whether the client should call the remote API directly or go through the `/api` endpoint (more work), because the first option would make the API key visible in the web browser devtools panel
- a `/api/chat/completions` endpoint which will just wrap what we do client-side; then, if we are using a remote endpoint, we can just proxy
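To make the proxy idea concrete, here is a small sketch of the routing decision: if a remote OpenAI-compatible endpoint is configured, the server forwards the request and attaches the API key itself, so the key never reaches the browser; otherwise the request is handled locally. All names here are illustrative, not Ava's actual internals:

```python
import json

def proxy_chat_request(body, remote_base=None, api_key=None):
    """Decide how to serve a chat completions request.

    Returns a dict describing either local handling or the outgoing
    proxied request. Keeping the Authorization header server-side is
    the point: the browser only ever talks to Ava's own /api endpoint.
    """
    if remote_base is None:
        # No remote endpoint configured: run the local model as before.
        return {"mode": "local", "body": body}
    headers = {"Content-Type": "application/json"}
    if api_key:
        headers["Authorization"] = f"Bearer {api_key}"
    return {
        "mode": "proxy",
        "url": remote_base.rstrip("/") + "/chat/completions",
        "headers": headers,
        "data": json.dumps(body).encode("utf-8"),
    }
```

The same shape would cover the Ollama request from this thread, since Ollama also exposes an OpenAI-compatible endpoint that such a proxy could point at.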