Bin-Huang / chatbox

User-friendly Desktop Client App for AI Models/LLMs (GPT, Claude, Gemini, Ollama...)
https://chatboxai.app
GNU General Public License v3.0
20.63k stars 2.09k forks

Maintain custom LLM endpoints without manually swapping them around each time #1199

Open mountee32 opened 6 months ago

mountee32 commented 6 months ago

Problem Description: I want to use multiple LLM API endpoints, some local and some hosted like openrouter.ai. There's currently no way to store the configs for all of these at the same time; each one has to be manually edited every time I want to switch.

Proposed Solution: Allow adding MULTIPLE custom LLM API configurations.

Additional Context: Ideally these MULTIPLE custom API configurations would be editable in the UI, but an interim solution of manually editing a config file could work.
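For the interim config-file idea, a stored-endpoints layout might look something like the sketch below. All field names (`activeEndpoint`, `apiHost`, etc.) are invented for illustration and are not Chatbox's actual settings schema:

```typescript
// Hypothetical settings layout for keeping several endpoints at once
// (illustrative only; not Chatbox's real config format).
const endpointsConfig = {
  // which stored endpoint is currently in use
  activeEndpoint: "local-ollama",
  endpoints: [
    {
      id: "local-ollama",
      apiHost: "http://localhost:11434",
      model: "llama3",
      apiKey: "" // local endpoints may not need a key
    },
    {
      id: "openrouter",
      apiHost: "https://openrouter.ai/api/v1",
      model: "anthropic/claude-3-opus",
      apiKey: "sk-or-..." // a separate key per endpoint
    }
  ]
};
```

Switching endpoints would then just mean changing `activeEndpoint`, with every stored config left intact.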

miaotouy commented 6 months ago

Same here.

rushikb commented 6 months ago

+1 — openrouter configuration makes it a lot easier to experiment with other / newer custom models!

22GNUs commented 5 months ago

+1

kaptainkangaroo commented 5 months ago

+1

montoriusz commented 5 months ago

I think the best would be to support multiple "User Models", each model being a combination of:

- the type of API
- the endpoint URL
- the API key

The user could add and remove their User Models. This way it's possible to easily use the same type of API with different URLs, and also different API keys with the same API and provider. Some providers (e.g. Mistral) expose an OpenAI-like API, which can be used even though their native API is not available in Chatbox. With this solution it would be possible to keep configurations for both Mistral and OpenAI, which is one of my use cases. This approach is common in file transfer clients like FileZilla, Cyberduck, etc.
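The "User Model" combination described above could be sketched as a small profile type plus a lookup by name. The `UserModel` interface, `findProfile` helper, and all example values are hypothetical, not Chatbox's actual code:

```typescript
// Hypothetical "User Model" profile: one saved combination of
// API type, endpoint URL, and key (names are illustrative).
interface UserModel {
  name: string;                            // display label
  apiType: "openai" | "azure" | "ollama";  // which API dialect to speak
  baseUrl: string;                         // endpoint URL, differs per profile
  apiKey: string;                          // separate key per profile
  model: string;                           // model identifier sent in requests
}

// Several profiles can coexist, including two that speak the same
// OpenAI-style API against different providers (the Mistral use case).
const profiles: UserModel[] = [
  {
    name: "OpenAI",
    apiType: "openai",
    baseUrl: "https://api.openai.com/v1",
    apiKey: "sk-...",
    model: "gpt-4"
  },
  {
    name: "Mistral (OpenAI-compatible)",
    apiType: "openai",
    baseUrl: "https://api.mistral.ai/v1",
    apiKey: "msk-...",
    model: "mistral-large-latest"
  }
];

// Switching models is then a lookup, not a settings rewrite.
function findProfile(name: string): UserModel | undefined {
  return profiles.find((p) => p.name === name);
}
```

This mirrors how FileZilla or Cyberduck store site bookmarks: each bookmark bundles protocol, host, and credentials, and switching between them never destroys the others.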

coljac commented 3 months ago

I think this is necessary. I imagine a common use case would be to use one model for general queries, but specialised models for code or experiments. Easily swapping, and having the model persist with the chat, would facilitate this.

dreamshit commented 2 weeks ago

I have three Azure OpenAI endpoints....