Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, Polars, MongoDB, NoSQL, etc.). PandasAI makes data analysis conversational using LLMs (GPT-3.5/4, Anthropic, VertexAI) and RAG.
https://pandas-ai.com

Ollama models are not working with LocalLLM #1358

Open · anilmadishetty2498 opened this issue 1 week ago

anilmadishetty2498 commented 1 week ago

System Info

python=3.11.7

🐛 Describe the bug

from pandasai import Agent
from pandasai.llm.local_llm import LocalLLM

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")

# scm_vc is the reporter's DataFrame (its definition is not shown)
db = Agent(scm_vc, config={"llm": llm})

This setup is not working.
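For anyone trying to reproduce this, here is a self-contained sketch of the same setup, assuming a small stand-in DataFrame in place of scm_vc (not shown above) and an Ollama server on its default port:

import pandas as pd
from pandasai import Agent
from pandasai.llm.local_llm import LocalLLM

# Hypothetical stand-in for scm_vc; the original data is not shown
scm_vc = pd.DataFrame({"vendor": ["A", "B"], "spend": [100, 250]})

llm = LocalLLM(api_base="http://localhost:11434/v1", model="llama3")
agent = Agent(scm_vc, config={"llm": llm})

# The failure presumably surfaces on the first query, e.g.:
print(agent.chat("Which vendor has the highest spend?"))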

dosubot[bot] commented 1 week ago

Ollama models do not work with LocalLLM out of the box because LocalLLM is written against OpenAI's API: the class initializes an OpenAI client and uses it to generate chat completions, with no Ollama-specific implementation or support anywhere in the class.

from openai import OpenAI

class LocalLLM(LLM):
    def __init__(self, api_base: str, model: str = "", api_key: str = "", **kwargs):
        # Local servers typically do not require a real API key,
        # so fall back to a placeholder value
        if not api_key:
            api_key = "dummy"

        self.model = model
        # Bind the client to the OpenAI-style chat completions endpoint
        # at the supplied base URL
        self.client = OpenAI(base_url=api_base, api_key=api_key).chat.completions
        self._invocation_params = kwargs

Here self.client is bound to OpenAI(base_url=api_base, api_key=api_key).chat.completions, meaning it is configured to call an OpenAI-style chat completions endpoint. Therefore, it cannot work with Ollama models without modification to support Ollama's API [1].
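Worth noting as a diagnostic: current Ollama releases also expose an OpenAI-compatible chat completions API under /v1, which is the kind of endpoint the snippet above targets. A minimal sketch to check whether that endpoint responds independently of pandasai, assuming Ollama is running on its default port and llama3 has been pulled:

from openai import OpenAI

# Point the stock OpenAI client at Ollama's OpenAI-compatible endpoint
client = OpenAI(base_url="http://localhost:11434/v1", api_key="dummy")

response = client.chat.completions.create(
    model="llama3",
    messages=[{"role": "user", "content": "Reply with one word."}],
)
print(response.choices[0].message.content)

If this call succeeds, the Ollama side is healthy and the failure lies in how the agent is configured or invoked; if it fails, check the api_base URL and that the model name appears in the output of ollama list.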
