Provide Chat Model support for Gemini

Closed: supmo668 closed this issue 2 weeks ago.

With the recent advances in Gemini and its long-context-window features, I would like to add Gemini as a chat model that administrators can offer as a chat-model choice. I would also like this issue to be assigned to me, since I want to use the feature myself.

Hey @supermomo668! First of all, I agree with your feature request; I want that too, haha. Implementation-wise, we've been thinking that the best approach to supporting additional models is to provide instructions for running a proxy server such as LiteLLM, and to let you set a custom OpenAI base URL in the chat configuration. If we take that up as the recommendation, we should certainly improve our documentation as well.

Halfway there! Gemini chat is now supported as of #902. I'll look into the vision integration in a week or two.
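To illustrate the custom-base-URL idea discussed above: an OpenAI-compatible client only needs its base URL redirected at the proxy, and everything else stays the same. This is a minimal sketch using only the standard library; the proxy address (`http://localhost:4000/v1`) and the model route (`gemini/gemini-pro`) are assumed placeholders, not values from this project.

```python
import json

# The official OpenAI endpoint that clients normally target.
DEFAULT_BASE_URL = "https://api.openai.com/v1"


def chat_completion_request(base_url: str, model: str, messages: list) -> tuple:
    """Build the endpoint URL and JSON payload for a chat-completion call.

    Swapping base_url is all it takes to route the same request through a
    LiteLLM-style proxy instead of OpenAI directly.
    """
    url = base_url.rstrip("/") + "/chat/completions"
    payload = json.dumps({"model": model, "messages": messages})
    return url, payload


# Point the same client code at an assumed local LiteLLM proxy:
url, payload = chat_completion_request(
    "http://localhost:4000/v1",          # hypothetical proxy address
    "gemini/gemini-pro",                  # model route configured in the proxy
    [{"role": "user", "content": "Hello"}],
)
print(url)  # http://localhost:4000/v1/chat/completions
```

The proxy then translates the OpenAI-shaped request into a Gemini API call, so the chat application never needs Gemini-specific client code.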