rashadphz / farfalle

🔍 AI search engine - self-host with local or cloud LLMs
https://www.farfalle.dev/
Apache License 2.0

Can you add custom search functionality and the ability to choose different Ollama models? #35

Closed: Soein closed this issue 2 weeks ago

Soein commented 1 month ago

Can you add the option to choose between different Ollama models, such as the 70B model, and also the ability to perform custom searches, for example by integrating Bing search?

rashadphz commented 2 weeks ago

Other models

Just added support for all Ollama models through https://www.litellm.ai/!

In your .env, set CUSTOM_MODEL to any provider/model from this list: https://litellm.vercel.app/docs/providers.
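For example, to point it at a 70B Llama model served locally by Ollama, the .env entry could look like the line below (the exact model tag is only an illustration; use whichever model you have pulled, written in LiteLLM's provider/model format):

CUSTOM_MODEL=ollama/llama3:70b

Under the hood that string is resolved by LiteLLM, roughly as in this minimal standalone sketch (not farfalle's code; it assumes an Ollama server running on its default local port):

import litellm

# Provider-prefixed model string from LiteLLM's providers list;
# "ollama/llama3:70b" is just an example tag.
response = litellm.completion(
    model="ollama/llama3:70b",
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)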

Bing

I also added Bing Search support. You can set it up like so in your .env:

BING_API_KEY=...
SEARCH_PROVIDER=bing
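
For reference, a Bing-backed web search boils down to a call against the Bing Web Search v7 REST API, roughly like the standalone sketch below (this only illustrates what the BING_API_KEY is used for, not farfalle's actual implementation):

import os
import requests

def bing_search(query: str, count: int = 10) -> list[dict]:
    # Bing Web Search v7 endpoint; the key is the same value set as
    # BING_API_KEY in the .env above (sketch only).
    resp = requests.get(
        "https://api.bing.microsoft.com/v7.0/search",
        headers={"Ocp-Apim-Subscription-Key": os.environ["BING_API_KEY"]},
        params={"q": query, "count": count},
        timeout=10,
    )
    resp.raise_for_status()
    # Each result has fields like "name", "url", and "snippet".
    return resp.json().get("webPages", {}).get("value", [])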