Closed: psmukhopadhyay closed this 4 months ago
Hey! When you change the provider it doesn't automatically switch the base URL (I was debating doing that). You will still need to enter the Ollama host your models are served at, typically localhost:11434/v1. The provider selection buttons have more to do with internal routing than with setting the base URL. Sorry for the confusion, I'll make this more straightforward. Let me know if that solves it for you!
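For anyone else tripped up by this, here is a minimal sketch of what that base URL amounts to: an OpenAI-compatible client pointed at the local Ollama server. The model name below is a placeholder for whatever you have actually pulled with `ollama pull`.

```python
# Sketch only: point an OpenAI-compatible client at a local Ollama server.
from openai import OpenAI

client = OpenAI(
    base_url="http://localhost:11434/v1",  # Ollama's OpenAI-compatible endpoint
    api_key="ollama",  # Ollama ignores the key, but the client requires a value
)

response = client.chat.completions.create(
    model="llama3",  # placeholder: use a model you have pulled locally
    messages=[{"role": "user", "content": "Hello!"}],
)
print(response.choices[0].message.content)
```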
Trying this version (1.0) on Ubuntu 22.04.
Changing the versions of these two packages (datasets and fsspec) allowed me to run the command gradio app.py. But after it initializes the output folder 20240713-154301, when I select Ollama in the LLM settings and Embedding settings, no models are shown. If I refresh and don't select any of the options, it does show me LLM and embedding models. What am I missing?
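In case it helps with debugging, a quick way to check that Ollama is actually up and listing models on the default port (a sketch, assuming the stock localhost:11434 setup):

```python
# Sketch: list the models a local Ollama server exposes via its
# OpenAI-compatible /v1/models endpoint.
import requests

resp = requests.get("http://localhost:11434/v1/models", timeout=5)
resp.raise_for_status()
for model in resp.json()["data"]:
    print(model["id"])  # e.g. "llama3:latest"
```

If this prints nothing, the app has no models to populate the dropdowns with, regardless of which provider button is selected.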