Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, polars, MongoDB, NoSQL, etc.). PandasAI makes data analysis conversational using LLMs (GPT-3.5/4, Anthropic, VertexAI) and RAG.
https://pandas-ai.com

Unable to pass n_ctx when using Ollama. #1117

Closed. Kanishk-Kumar closed this issue 5 months ago.

Kanishk-Kumar commented 5 months ago

System Info

Title says it all. The snippet below, on the other hand, doesn't raise an error, but it doesn't work either:

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="mistral", temperature=0, max_tokens=32768)

But a call like this one does raise an error:

from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="mistral", temperature=0, n_ctx=32768)

Isn't this because it's "somehow" using the OpenAI client to run my local Ollama?

https://github.com/Sinaptik-AI/pandas-ai/blob/14dc3f45ec98320c21d194ff776beec538521883/pandasai/llm/local_llm.py#L21
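
For reference, a paraphrased sketch of what the linked line does (editor's addition; not verbatim, consult the permalink for the exact code):

from openai import OpenAI

# Paraphrased sketch of pandasai/llm/local_llm.py: LocalLLM is a thin
# wrapper around the OpenAI SDK, re-pointed at the local endpoint.
class LocalLLM:
    def __init__(self, api_base: str, model: str = "", api_key: str = "dummy", **kwargs):
        # The "client" is the OpenAI SDK's chat-completions interface,
        # merely aimed at Ollama's OpenAI-compatible server.
        self.client = OpenAI(base_url=api_base, api_key=api_key).chat.completions
        self.model = model
        self._invocation_params = kwargs  # extra kwargs kept for later

    def chat_completion(self, messages):
        # The stored kwargs are forwarded verbatim, so anything outside
        # the Chat Completions schema (like n_ctx) fails inside the SDK.
        return self.client.create(model=self.model, messages=messages, **self._invocation_params)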

Thanks.

🐛 Describe the bug

Traceback (most recent call last):
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/chat/generate_chat_pipeline.py", line 283, in run
    output = (self.code_generation_pipeline | self.code_execution_pipeline).run(
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/pipeline.py", line 137, in run
    raise e
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/pipeline.py", line 101, in run
    step_output = logic.execute(
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/pipelines/chat/code_generator.py", line 33, in execute
    code = pipeline_context.config.llm.generate_code(input, pipeline_context)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/base.py", line 196, in generate_code
    response = self.call(instruction, context)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/local_llm.py", line 45, in call
    return self.chat_completion(self.last_prompt, memory)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/pandasai/llm/local_llm.py", line 36, in chat_completion
    response = self.client.create(**params)
  File "/home/user_name/jupyter/venv/lib/python3.10/site-packages/openai/_utils/_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
TypeError: Completions.create() got an unexpected keyword argument 'n_ctx'
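
The traceback shows the failure mode: the extra keyword arguments are forwarded verbatim to the OpenAI SDK, whose Completions.create() accepts only the Chat Completions parameters. A minimal reproduction outside pandas-ai (hypothetical snippet, same error) would be:

from openai import OpenAI

client = OpenAI(base_url="http://localhost:11434/v1", api_key="dummy")

# n_ctx is not part of the Chat Completions schema, so the SDK rejects it
# before any request reaches Ollama:
client.chat.completions.create(
    model="mistral",
    messages=[{"role": "user", "content": "hi"}],
    n_ctx=32768,  # TypeError: unexpected keyword argument 'n_ctx'
)

(The SDK does offer an extra_body argument for non-standard fields, but the server still has to honor them, and it is not clear that Ollama's OpenAI-compatible endpoint maps such fields onto its options. Similarly, max_tokens in the first snippet is accepted because it is part of the schema, but it caps the completion length rather than the context window, which is presumably why it "doesn't work" as a stand-in for n_ctx.)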
gventuri commented 5 months ago

@Kanishk-Kumar I can confirm it is using the OpenAI-compatible API for Ollama, so at the moment only the parameters supported by that standard can be passed.
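
Editor's note, a common workaround not mentioned in this thread (verify against the current Ollama docs): set the context window on the Ollama side with a Modelfile, so nothing non-standard has to travel through the OpenAI-compatible endpoint. The derived model name mistral-32k below is made up for the example.

Modelfile:

FROM mistral
PARAMETER num_ctx 32768

Create the derived model, then point LocalLLM at it:

ollama create mistral-32k -f Modelfile

from pandasai.llm.local_llm import LocalLLM

ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="mistral-32k", temperature=0)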