Open Rumeysakeskin opened 3 months ago
```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, AwqConfig
from llama_index.llms.huggingface import HuggingFaceLLM  # import path in recent llama-index versions

model_id = "hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4"

llm = HuggingFaceLLM(
    context_window=8192,  # 4096
    max_new_tokens=512,
    generate_kwargs={"temperature": 0, "do_sample": False},
    system_prompt=system_prompt,
    query_wrapper_prompt=query_wrapper_prompt,
    tokenizer_name=model_id,
    model_name=model_id,
    device_map="auto",
    tokenizer_kwargs={"max_length": 8192},  # 4096
)
```
```python
import pandas as pd
from pandasai import PandasAI

# LangchainLLM is the custom wrapper class defined in an earlier cell (Cell In[16] in the traceback)
langchain_llm = LangchainLLM(langchain_llm=llm)
pandas_ai = PandasAI(llm=langchain_llm)

df = pd.read_csv("data/deneme.csv")
result = pandas_ai.run(df, "question??")
```
```
---------------------------------------------------------------------------
AttributeError                            Traceback (most recent call last)
Cell In[17], line 10
      6 pandas_ai = PandasAI(llm=langchain_llm)
      8 df = pd.read_csv("data/deneme.csv")
---> 10 result = pandas_ai.run(df, "question??")
     12 result

File /usr/local/lib/python3.10/dist-packages/pandasai/__init__.py:298, in PandasAI.run(self, data_frame, prompt, is_conversational_answer, show_code, anonymize_df, use_error_correction_framework)
    278 """
    279 Run the PandasAI to make Dataframes Conversational.
    280 (...)
    293
    294 """
    296 self._start_time = time.time()
--> 298 self.log(f"Running PandasAI with {self._llm.type} LLM...")
    300 self._prompt_id = str(uuid.uuid4())
    301 self.log(f"Prompt ID: {self._prompt_id}")

Cell In[16], line 60, in LangchainLLM.type(self)
     58 @property
     59 def type(self) -> str:
---> 60     return f"langchain_{self.langchain_llm._llm_type}"

AttributeError: 'HuggingFaceLLM' object has no attribute '_llm_type'
```
I noticed that PandasAI is generally used with OpenAI's LLMs. Am I getting this error because I am using it with a HuggingFace model? How can I resolve this issue?
If using HuggingFace is not a must, using the Llama 3 model with Ollama or Groq may work.
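The traceback also points at the cause: `_llm_type` is a property defined on LangChain LLM classes, but the `HuggingFaceLLM` being passed in here is LlamaIndex's class, so the `LangchainLLM` wrapper has nothing to read. A rough, untested sketch if HuggingFace is a must, assuming `langchain-community` is installed, the same PandasAI version/API as in your snippet, and reusing your `LangchainLLM` wrapper:

```python
# Untested sketch: use LangChain's own HuggingFace wrapper (which has _llm_type)
# instead of LlamaIndex's HuggingFaceLLM.
from langchain_community.llms import HuggingFacePipeline

hf_llm = HuggingFacePipeline.from_model_id(
    model_id=model_id,  # same AWQ model id as above; loading it needs autoawq installed
    task="text-generation",
    pipeline_kwargs={"max_new_tokens": 512, "do_sample": False},
)
pandas_ai = PandasAI(llm=LangchainLLM(langchain_llm=hf_llm))  # your wrapper from Cell 16
result = pandas_ai.run(df, "question??")
```

And the Ollama route suggested above would look roughly like this (assumes a local Ollama server with the llama3 model already pulled):

```python
# Untested sketch: Ollama is also exposed as a LangChain LLM, so it has _llm_type.
from langchain_community.llms import Ollama

ollama_llm = Ollama(model="llama3", temperature=0)
pandas_ai = PandasAI(llm=LangchainLLM(langchain_llm=ollama_llm))
result = pandas_ai.run(df, "question??")
```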