Closed — Soein closed this 2 weeks ago
Just added support for all Ollama models through https://www.litellm.ai/! In your .env, set CUSTOM_MODEL= to any provider/model from this list: https://litellm.vercel.app/docs/providers.
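For example, a minimal .env sketch (the model string `ollama/llama2` is just an illustration; any provider/model string from the LiteLLM providers list should work the same way):

```shell
# .env — illustrative values, not defaults
CUSTOM_MODEL=ollama/llama2   # LiteLLM "provider/model" string for a locally served Ollama model
```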
I also added Bing Search support. You can set it up in your .env like so:

```
BING_API_KEY=...
SEARCH_PROVIDER=bing
```
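As a hypothetical sketch of how an app might validate these search settings at startup — `pick_search_provider` and the fallback provider name are illustrative assumptions, not the project's actual code:

```python
import os


def pick_search_provider(env=os.environ):
    """Return the configured search provider, validating required keys.

    Illustrative only: the key names mirror the .env variables above,
    and the "duckduckgo" fallback is an assumed default.
    """
    provider = env.get("SEARCH_PROVIDER", "duckduckgo").lower()
    if provider == "bing" and not env.get("BING_API_KEY"):
        # Bing's Web Search API requires an API key, so fail fast here.
        raise ValueError("SEARCH_PROVIDER=bing requires BING_API_KEY to be set")
    return provider
```

With the .env values above loaded into the environment, this would return `"bing"`; setting `SEARCH_PROVIDER=bing` without a key raises an error instead of failing later at query time.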
Can you add the option to choose between multiple Ollama models, such as the 70B model, and also the ability to use custom search providers, like integrating Bing Search?