[Open] Greatz08 opened this issue 9 months ago
👀 @HakaishinShwet
Thank you for raising an issue. We will investigate the matter and get back to you as soon as possible. Please make sure you have given us as much context as possible.
We will implement it here: https://github.com/lobehub/lobe-chat/discussions/2040
🥰 Feature Description
Right now, the interface settings offer only a single slot for an OpenAI provider; after configuring it, we can't add another custom OpenAI-compatible model. I am using LiteLLM, so I need more than one such entry. Please make this a flexible option rather than limiting us to a single OpenAI configuration. LiteLLM is another great open-source project that lets us run different LLM models behind an OpenAI-compatible API structure; you can read more about it on their GitHub. I am using LiteLLM to run a local proxy server for two Groq models, for Gemini, and for some local models, so it is very convenient to use LiteLLM and then connect those different LLMs to a project like lobe-chat or babyAGI (another project similar to this one).
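For context, a LiteLLM proxy like the one described above is typically defined in a `config.yaml` that lists several upstream models behind one OpenAI-compatible endpoint. This is a minimal sketch; the specific model names and environment variables below are illustrative, not taken from my actual setup:

```yaml
# Hypothetical LiteLLM proxy config: several upstream providers
# exposed through a single OpenAI-compatible endpoint.
model_list:
  - model_name: groq-llama3              # alias clients request
    litellm_params:
      model: groq/llama3-70b-8192        # upstream Groq model
      api_key: os.environ/GROQ_API_KEY   # read key from environment
  - model_name: gemini-pro
    litellm_params:
      model: gemini/gemini-pro
      api_key: os.environ/GEMINI_API_KEY
```

With something like `litellm --config config.yaml`, every `model_name` becomes selectable through the standard OpenAI `/v1/chat/completions` API on the same base URL, which is exactly why being able to register more than one OpenAI-compatible entry in lobe-chat would help.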
🧐 Proposed Solution
Provide a way to configure multiple OpenAI-compatible LLM instances, because a single one is not sufficient when using LiteLLM, which is widely used nowadays.
📝 Additional Information
No response