anilmadishetty2498 opened this issue 1 month ago
I found a similar open issue related to using local Llama models with the pandasai library. The issue was discussed in this thread, where it was noted that the problem might be due to the experimental nature of Ollama's compatibility with the OpenAI API. The documentation states that Ollama's compatibility is experimental (see docs). This suggests that the integration might not be fully stable or reliable, which could result in no results being returned.
Additionally, ensure that you have installed the necessary dependencies for pandasai and configured it correctly. Here is the relevant code snippet for using Ollama with a local model:
```python
from pandasai import SmartDataframe
from pandasai.llm.local_llm import LocalLLM

# Point LocalLLM at the local Ollama server's OpenAI-compatible endpoint
ollama_llm = LocalLLM(api_base="http://localhost:11434/v1", model="codellama")

# Load the CSV into a SmartDataframe backed by the local model
df = SmartDataframe("data.csv", config={"llm": ollama_llm})
```
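Assuming the setup above works (the CSV path and the question text here are only placeholders), a minimal sketch of querying the dataframe would look like this:

```python
# Ask a natural-language question against the dataframe
response = df.chat("Which rows have the highest Revenue?")
print(response)
```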
Make sure you have installed pandasai and any optional dependencies required for your use case:

```bash
pip install pandasai
```
Given the experimental status, it is recommended to verify the local server setup and ensure it adheres to the OpenAI API specifications. Additionally, check the configuration settings in pandasai to ensure they are correctly set up for your environment [1][2].
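As a rough sanity check on the server side (this assumes Ollama is running on its default port 11434 and that the codellama model has already been pulled), you can call the OpenAI-compatible endpoint directly:

```python
import requests

# Minimal request against Ollama's OpenAI-compatible chat endpoint
resp = requests.post(
    "http://localhost:11434/v1/chat/completions",
    json={
        "model": "codellama",
        "messages": [{"role": "user", "content": "Say hello"}],
    },
    timeout=60,
)
print(resp.status_code)
print(resp.json()["choices"][0]["message"]["content"])
```

If this request fails or returns an unexpected payload, the problem is likely with the local server setup rather than with pandasai.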
To continue talking to Dosu, mention @dosu.
System Info
python=3.11.7
🐛 Describe the bug
```python
import pandas as pd
from pandasai import SmartDataframe
from langchain_community.llms import Ollama

# Sample DataFrame
data = {
    'Month': ['January', 'February', 'March', 'April'],
    'Revenue': [1000, 1500, 1300, 1700],
    'Profit': [200, 300, 250, 400]
}
df = pd.DataFrame(data)

# Initialize Ollama LLM
llm = Ollama(model="mistral")

# Create a SmartDataframe with the LLM
db = SmartDataframe(df, config={"llm": llm})
```
No results are returned.
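For reference, the snippet above only constructs the SmartDataframe; a minimal sketch of actually running a query (the question string is just an example) would be:

```python
# Without a .chat() call, no answer is produced
result = db.chat("Which month had the highest Revenue?")
print(result)
```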