Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, Polars, MongoDB, NoSQL, etc.). PandasAI makes data analysis conversational using LLMs (GPT-3.5/4, Anthropic, VertexAI) and RAG.
https://pandas-ai.com

LLM Inference Limit Reached! #1063

Status: Closed (closed by bindu0702 3 months ago)

bindu0702 commented 3 months ago

System Info

OS version: Windows 11
Python version: 3.9.10
pandasai version: 2.0.23

🐛 Describe the bug

Code -

import numpy as np
import pandas as pd
from pandasai import Agent  # import was missing from the original snippet

# Generate sample data
np.random.seed(0)  # for reproducibility

months = ['jan', 'feb', 'mar', 'apr', 'may', 'jun',
          'jul', 'aug', 'sep', 'oct', 'nov', 'dec']
prices = np.random.randint(10, 100, size=(10, 12))

# Create DataFrame
df = pd.DataFrame(prices, columns=months)

items = ['onion', 'carrot', 'tomato', 'potato', 'cabbage',
         'lettuce', 'bell pepper', 'spinach', 'broccoli', 'radish']
df.insert(0, "items", items)

agent2 = Agent(df)
agent2.chat("which item has highest item price")

Error -

 Traceback (most recent call last):
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\pipelines\chat\generate_chat_pipeline.py", line 283, in run
    output = (self.code_generation_pipeline | self.code_execution_pipeline).run(
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\pipelines\pipeline.py", line 137, in run
    raise e
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\pipelines\pipeline.py", line 101, in run
    step_output = logic.execute(
                  ^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\pipelines\chat\code_generator.py", line 33, in execute
    code = pipeline_context.config.llm.generate_code(input, pipeline_context)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\llm\base.py", line 196, in generate_code
    response = self.call(instruction, context)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\llm\bamboo_llm.py", line 18, in call
    response = self._session.post("/llm/chat", json=data)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\helpers\request.py", line 37, in post
    return self.make_request("POST", path, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\ProgramData\anaconda3\envs\forpandasai\Lib\site-packages\pandasai\helpers\request.py", line 71, in make_request
    raise PandasAIApiCallError(data["message"])
pandasai.exceptions.PandasAIApiCallError: LLM Inference Limit Reached!
gventuri commented 3 months ago

@bindu0702 this is not a bug. BambooLLM comes with 100 free requests per month. Once you hit that limit, you can switch to any other LLM or upgrade to a pro license (details: https://sinaptik.notion.site/LLM-Inference-Limit-Reached-10d8fce36b0d4b19844230668cd51e0b?pvs=4).

Hope it helps!
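As an aside, the specific question in the report ("which item has highest item price") can also be answered with plain pandas, no LLM call required. A minimal sketch using the same sample data as the report (`idxmax` locates the row with the largest value):

```python
import numpy as np
import pandas as pd

# Same sample data as in the bug report
np.random.seed(0)
months = ['jan', 'feb', 'mar', 'apr', 'may', 'jun',
          'jul', 'aug', 'sep', 'oct', 'nov', 'dec']
prices = np.random.randint(10, 100, size=(10, 12))
df = pd.DataFrame(prices, columns=months)
items = ['onion', 'carrot', 'tomato', 'potato', 'cabbage',
         'lettuce', 'bell pepper', 'spinach', 'broccoli', 'radish']
df.insert(0, "items", items)

# Each item's highest monthly price, then the item that attains the overall max
max_price = df[months].max(axis=1)
top_item = df.loc[max_price.idxmax(), "items"]
print(top_item, max_price.max())
```

This avoids spending an inference request on a query the library itself would answer with one line of generated pandas code.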