comeback01 opened 3 weeks ago
If the base URL is configurable, I think local LLMs (Ollama, etc.) will be supported automatically through the BerriAI/litellm proxy.
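A minimal sketch of what that would look like, assuming the tool talks to an OpenAI-compatible API (the proxy address and model name are illustrative):

```python
from openai import OpenAI

# Point the standard OpenAI client at a local litellm proxy (hypothetical address).
client = OpenAI(
    base_url="http://localhost:4000",
    api_key="anything",  # the litellm proxy handles real credentials itself
)

# The proxy routes this to a local Ollama model (illustrative model name).
resp = client.chat.completions.create(
    model="ollama/llama3",
    messages=[{"role": "user", "content": "Hello"}],
)
print(resp.choices[0].message.content)
```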
Oh yeah, we can, but what about vision capabilities? Do the services you suggested have a vision mode? Should we separate the vision mechanism?
I will add support for setting a custom LangChain LLM class. With this, we can integrate many of the models available through LangChain.
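As a sketch of that idea (the `Agent` entry point is hypothetical, just to show where the custom LLM would be passed in):

```python
from langchain_ollama import ChatOllama

# Any LangChain chat model could be plugged in; ChatOllama is one example.
llm = ChatOllama(model="llama3")

# agent = Agent(llm=llm)  # hypothetical: hand the custom LangChain LLM to the tool
```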
I wanted to suggest the possibility of customizing the endpoint.
This allows the use of OpenAI-compatible services (see the sketch after this list) such as:

- https://openrouter.ai
- https://fireworks.ai
- https://www.together.ai
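For example, with OpenRouter this could be as simple as overriding the standard OpenAI environment variables (a sketch; the key is a placeholder):

```python
import os

# The openai v1 SDK picks these variables up automatically when no
# explicit base_url/api_key are passed to the client.
os.environ["OPENAI_BASE_URL"] = "https://openrouter.ai/api/v1"
os.environ["OPENAI_API_KEY"] = "sk-or-..."  # placeholder OpenRouter key
```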
The advantage of OpenRouter is that they also offer free endpoints.
Thank you for the help.