🚀 The feature
Since pandasai already supports local LLM managers like Ollama, it would be great if it could also support UI/API frameworks built on top of Ollama, such as Open WebUI.
Motivation, pitch
Open WebUI (https://docs.openwebui.com/) adds features on top of Ollama such as authentication and load balancing, so it would be super helpful if pandasai could target it directly, e.g.:

LocalLLM(api_base = <Open WebUI API URL>)

Example API URL for Open WebUI:
http://localhost:3000/ollama/api/chat
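A minimal sketch of what targeting Open WebUI could look like. This only builds the endpoint URL and an Ollama-style chat payload; the bearer-token header, model name, and `API_KEY` placeholder are assumptions (Open WebUI requires authentication, which is part of what this request asks pandasai to handle), and the actual `LocalLLM` wiring is left as a comment since the parameter support is exactly what is being requested:

```python
# Sketch (assumptions noted below): how a pandasai LocalLLM backed by
# Open WebUI might be configured. Open WebUI proxies the Ollama API
# under /ollama, so the chat endpoint sits below the Web UI base URL.
OPEN_WEBUI_BASE = "http://localhost:3000"
chat_url = f"{OPEN_WEBUI_BASE}/ollama/api/chat"

# Assumption: Open WebUI authenticates requests with a per-user API key
# sent as a bearer token. The key below is a placeholder, not a real key.
API_KEY = "<your Open WebUI API key>"
headers = {"Authorization": f"Bearer {API_KEY}"}

# An Ollama-style chat payload, which Open WebUI forwards unchanged.
# The model name "llama3" is illustrative.
payload = {
    "model": "llama3",
    "messages": [{"role": "user", "content": "Summarize this dataframe"}],
}

# With the requested support in place, the pandasai side might be simply:
#   llm = LocalLLM(api_base=chat_url, model="llama3")
# plus some way to pass the auth header, which LocalLLM lacks today.
```

Sending the request would then be a plain `requests.post(chat_url, json=payload, headers=headers)`; the point of the feature is that pandasai's `LocalLLM` would do this wiring (URL plus auth) itself.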
Alternatives
No response
Additional context
No response