lobehub / lobe-chat

🤯 Lobe Chat - an open-source, modern-design LLMs/AI chat framework. Supports Multi AI Providers( OpenAI / Claude 3 / Gemini / Ollama / Bedrock / Azure / Mistral / Perplexity ), Multi-Modals (Vision/TTS) and plugin system. One-click FREE deployment of your private ChatGPT chat application.
https://chat-preview.lobehub.com

[Request] Please add an option to configure more than one OpenAI model in settings, because LiteLLM users run multiple local proxy servers, each serving a different LLM behind the OpenAI API format. #1515

Open HakaishinShwet opened 4 months ago

HakaishinShwet commented 4 months ago

🥰 Feature Description

Right now the interface settings allow only a single OpenAI model to be configured; once it is set, we can't create another custom OpenAI model. I am using LiteLLM, so I need more than one of these slots. Please make this flexible rather than limiting us to a single OpenAI model. LiteLLM is another great open-source project that lets us run different LLMs behind the OpenAI API structure; you can read more about it on their GitHub. I am using LiteLLM to create local proxy servers for two Groq models, for Gemini, and for some local models, so it is very convenient to connect these different LLMs to projects like lobe-chat or babyAGI (another similar project).

🧐 Proposed Solution

Provide a way to configure multiple OpenAI LLM instances, because a single one is not sufficient when using LiteLLM, which many people rely on nowadays.
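For context, the LiteLLM setup described above routes several providers through one OpenAI-compatible proxy. A minimal sketch of such a `config.yaml` (model names and environment variable names here are illustrative, not from the original issue) might look like:

```yaml
# LiteLLM proxy config: expose multiple providers behind OpenAI-style endpoints.
model_list:
  - model_name: groq-llama3          # name the client (e.g. lobe-chat) will request
    litellm_params:
      model: groq/llama3-70b-8192    # provider/model routed by LiteLLM
      api_key: os.environ/GROQ_API_KEY
  - model_name: gemini-pro
    litellm_params:
      model: gemini/gemini-pro
      api_key: os.environ/GEMINI_API_KEY
```

Started with something like `litellm --config config.yaml --port 4000`, each such proxy (or each model on one proxy) looks like a separate OpenAI endpoint, which is why a single OpenAI slot in the settings is limiting.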

📝 Additional Information

No response

lobehubbot commented 4 months ago

👀 @HakaishinShwet

Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.

arvinxx commented 1 month ago

We will implement it here: https://github.com/lobehub/lobe-chat/discussions/2040