Open Coriolan-Bataille opened 4 months ago
Interesting idea. I could generalise the app to support multiple APIs beyond Ollama.
For OpenAI/Ollama compatible APIs, an optionally configurable API key would be great and probably is all that's needed.
The header `Authorization: Bearer $API_KEY` is a common way to send the API key to the endpoint. For example, Mistral, Fireworks.ai, and [OpenAI](https://platform.openai.com/docs/api-reference/authentication) itself do it this way.
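For illustration, here is what attaching that header looks like with Python's standard library. The URL and key are placeholders; any OpenAI-compatible endpoint would work the same way.

```python
import urllib.request

API_KEY = "my-secret-key"  # placeholder, not a real credential

# Build (but don't send) a request carrying the Bearer token.
req = urllib.request.Request(
    "https://api.mistral.ai/v1/chat/completions",  # any OpenAI-compatible endpoint
    headers={"Authorization": f"Bearer {API_KEY}"},
)
```

Calling `urllib.request.urlopen(req)` would then send the request with the key attached.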
This would also make it possible to use Ollama over the internet behind an API-key-secured endpoint. I'm currently working on an article that describes how to do this with Cloudflare Tunnels, Caddy, and Docker Compose.
Hi! Wondering if this is still being considered? I'd love to be able to use the app with any OpenAI-equivalent API by defining the URL and, if needed, the API key. That way, the app could be used broadly across deployments.
Agreed, an API-key-based OpenAI API option would be best and most general, since many servers (e.g. vLLM) support the OpenAI API, and Ollama supports it as well.
As a rule of thumb, it would be good to be able to switch between OpenAI, Claude, Groq, or Ollama depending on the use case.
I just got my local box to run:

```shell
python3 -m llama_cpp.server --model Mistral-7B-Instruct-v0.2/Mistral-7B-Instruct-v0.2.gguf --host 0.0.0.0
```
And I would love to be able to connect to it via Enchanted.
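Since `llama_cpp.server` exposes an OpenAI-compatible API, a generic client would only need the base URL. A minimal sketch, assuming the server's default port 8000 and the `/v1/chat/completions` route:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"  # llama_cpp.server's default port


def build_chat_request(base_url: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request for a local server."""
    payload = {
        "model": "Mistral-7B-Instruct-v0.2",  # model name depends on the server
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )


req = build_chat_request(BASE_URL, "Hello")
# urllib.request.urlopen(req) would send it once the server is up.
```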
It would be nice to be able to connect to the Mistral AI API to use their servers. The settings would look like: URL: `https://api.mistral.ai/v1`, API key: xxxxxxxxxxx
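Those two settings would map onto requests roughly like this. A sketch assuming an OpenAI-style endpoint, where the `Authorization` header is added only when a key is actually configured (a local Ollama needs none; Mistral's hosted API requires one; the model names are hypothetical):

```python
import json
import urllib.request
from typing import Optional


def build_request(base_url: str, api_key: Optional[str] = None) -> urllib.request.Request:
    """Build a chat request; attach a Bearer token only if a key is set."""
    headers = {"Content-Type": "application/json"}
    if api_key:  # optional: local servers usually need no key
        headers["Authorization"] = f"Bearer {api_key}"
    body = json.dumps({
        "model": "mistral-small-latest",  # hypothetical model name
        "messages": [{"role": "user", "content": "Hi"}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions", data=body, headers=headers
    )


# Against Mistral's hosted API (key required):
hosted = build_request("https://api.mistral.ai/v1", api_key="xxxxxxxxxxx")
# Against a local Ollama instance (no key):
local = build_request("http://localhost:11434/v1")
```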