To resolve the error `pandasai.exceptions.NoCodeFoundError: No code found in the response` while using Pandasai 2.2.12 with Python 3.10.14 and Amazon Bedrock - Anthropic Claude 2 as the LLM, you need to ensure that the response from the LLM contains valid Python code. The `_extract_code` method in the `LLM` class is responsible for extracting and validating the code from the response.

Here is the relevant part of the code:
```python
def _extract_code(self, response: str, separator: str = "```") -> str:
    """
    Extract the code from the response.

    Args:
        response (str): Response
        separator (str, optional): Separator. Defaults to "```".

    Raises:
        NoCodeFoundError: No code found in the response

    Returns:
        str: Extracted code from the response
    """
    code = response

    # If separator is in the response then we want the code in between only
    if separator in response and len(code.split(separator)) > 1:
        code = code.split(separator)[1]
    code = self._polish_code(code)

    # Even if the separator is not in the response, the output might still be valid python code
    if not self._is_python_code(code):
        raise NoCodeFoundError("No code found in the response")

    return code
```
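To see why a prose-only reply trips this check, here is a minimal standalone sketch of the same extraction logic. It is a simplification, not the library code: `_polish_code` is approximated by stripping a leading language tag, and `_is_python_code` by a bare `ast.parse`:

```python
import ast

TICKS = "`" * 3  # the default separator (three backticks), built programmatically

def extract_code(response: str, separator: str = TICKS) -> str:
    """Simplified mirror of the library logic shown above."""
    code = response
    # Take the text between the first pair of separators, if any
    if separator in response and len(code.split(separator)) > 1:
        code = code.split(separator)[1]
    # Rough stand-in for _polish_code: drop a leading "python" language tag
    if code.lstrip().startswith("python"):
        code = code.lstrip()[len("python"):]
    # Rough stand-in for _is_python_code: the result must parse as Python
    try:
        ast.parse(code)
    except SyntaxError:
        raise ValueError("No code found in the response")  # NoCodeFoundError in pandasai
    return code

fenced = "Here you go:\n" + TICKS + "python\nprint(1 + 1)\n" + TICKS
print(extract_code(fenced))   # extracts and validates print(1 + 1)

prose = "Sure! To add two numbers, define a function that returns a + b."
# extract_code(prose)         # raises: the reply contains no parseable Python
```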
To resolve the error, you should:

1. **Ensure the LLM response contains code**: Make sure that the response from the LLM (Anthropic Claude 2) includes Python code enclosed within the specified separator (default is `` ``` ``). For example:

   ```python
   print("Hello, World!")
   ```

2. **Check the separator**: If the code is not enclosed within the default separator, you might need to adjust the `separator` parameter when calling `_extract_code` (see the sketch after this list).

3. **Validate the code**: Ensure that the extracted code is valid Python code. The `_is_python_code` method uses `ast.parse` to check for syntax errors.
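For instance, if your model fences code with a different marker, the separator can be passed explicitly. A sketch only; `llm` and `response` are assumed to be your pandasai LLM instance and the raw model reply, and `_extract_code` is a private method whose signature is shown above:

```python
# Assumed: `llm` is a pandasai LLM instance, `response` is the raw model reply.
# If the model wraps code in ~~~ fences instead of backticks:
code = llm._extract_code(response, separator="~~~")
```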
If the response does not contain valid Python code or is not properly formatted, the `_extract_code` method will raise the `NoCodeFoundError`.
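A quick way to confirm which case you are in is to call the LLM directly and inspect the raw reply before pandasai tries to parse it. A sketch, assuming the `langchain_llm` object from the bug report below:

```python
# Assumed: `langchain_llm` is the BedrockLLM instance from the bug report below.
raw = langchain_llm.invoke(
    "Return only Python code, inside a Markdown code fence, that prints 2 + 3."
)
print(raw)  # if no fenced (or bare) Python code appears here, _extract_code will raise
```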
Here is an example of how you might call the `generate_code` method, which internally uses `_extract_code`:
```python
from pandasai.exceptions import NoCodeFoundError  # import paths may vary by version
from pandasai.pipelines.pipeline_context import PipelineContext
from pandasai.prompts.base import BasePrompt

instruction = BasePrompt("Generate a Python function to add two numbers.")
context = PipelineContext()  # Assuming you have a context object

try:
    code = llm.generate_code(instruction, context)
    print("Generated code:", code)
except NoCodeFoundError as e:
    print("Error:", e)
Ensure that the LLM is correctly configured and that the prompt provided to the LLM is likely to generate a response containing valid Python code. Additionally, verify that the `BedrockClaude` class is properly set up with the correct model and parameters. Here is a snippet from the `BedrockClaude` class for reference:
```python
class BedrockClaude(LLM):
    """Bedrock Claude LLM
    Generates text using Anthropic Claude Messages API.

    Attributes:
        bedrock_runtime_client: The boto3 bedrock runtime client.
        max_tokens: Max number of tokens to generate.
        model: The Bedrock Claude model to use.
        temperature: (Optional) The amount of randomness injected into the response.
        top_p: (Optional) Use nucleus sampling. In nucleus sampling, Anthropic Claude
            computes the cumulative distribution over all the options for each
            subsequent token in decreasing probability order and cuts it off once it
            reaches a particular probability specified by top_p. You should alter
            either temperature or top_p, but not both.
        top_k: (Optional) Only sample from the top K options for each subsequent token.
        stop_sequences: (Optional) Custom text sequences that cause the model to stop
            generating. Anthropic Claude models normally stop when they have naturally
            completed their turn, in this case the value of the stop_reason response
            field is end_turn. If you want the model to stop generating when it
            encounters custom strings of text, you can use the stop_sequences
            parameter. If the model encounters one of the custom text strings, the
            value of the stop_reason response field is stop_sequence and the value of
            stop_sequence contains the matched stop sequence.
    """

    _supported__models = [
        "anthropic.claude-3-opus-20240229-v1:0",
        "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "anthropic.claude-3-sonnet-20240229-v1:0",
        "anthropic.claude-3-haiku-20240307-v1:0",
    ]
    _valid_params = [
        "max_tokens",
        "model",
        "temperature",
        "top_p",
        "top_k",
        "stop_sequences",
    ]

    max_tokens: int = 1024
    model: str = "anthropic.claude-3-sonnet-20240229-v1:0"
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    top_k: Optional[float] = None
    stop_sequences: Optional[str] = None
    client: Any

    def __init__(self, bedrock_runtime_client, **kwargs):
        for key, val in kwargs.items():
            if key in self._valid_params:
                setattr(self, key, val)

        self.client = bedrock_runtime_client
        if self.model not in self._supported__models:
            raise UnsupportedModelError(self.model)

        invoke_model = getattr(self.client, "invoke_model", None)
        if not callable(invoke_model):
            raise APIKeyNotFoundError

    @property
    def _default_params(self) -> Dict[str, Any]:
        """Get the default parameters for calling text generation inference API."""
        return {
            "max_tokens": self.max_tokens,
            "model": self.model,
            "temperature": self.temperature,
            "top_p": self.top_p,
            "top_k": self.top_k,
            "stop_sequences": self.stop_sequences,
        }

    def call(self, instruction: BasePrompt, context: PipelineContext = None) -> str:
        prompt = instruction.to_string()
        memory = context.memory if context else None

        messages = []
        system_prompt = ""
        if memory:
            if memory.agent_info:
                system_prompt = memory.get_system_prompt()
            for message in memory.all():
                if message["is_user"]:
                    if messages and messages[-1]["role"] == "user":
                        messages[-1]["content"].append(
                            {"type": "text", "text": message["message"]}
                        )
                    else:
                        messages.append(
                            {
                                "role": "user",
                                "content": [
                                    {"type": "text", "text": message["message"]}
                                ],
                            }
                        )
                else:
                    messages.append(
                        {
                            "role": "assistant",
                            "content": [{"type": "text", "text": message["message"]}],
                        }
                    )

        # adding current prompt as latest query message
        if messages and messages[-1]["role"] == "user":
            messages[-1]["content"].append({"type": "text", "text": prompt})
        else:
            messages.append(
                {"role": "user", "content": [{"type": "text", "text": prompt}]},
            )

        params = {
            "anthropic_version": "bedrock-2023-05-31",
            "system": system_prompt,
            "messages": messages,
        }
        for key, value in self._default_params.items():
            if key != "model" and value is not None:
                params[key] = value

        body = json.dumps(params)
        # print(body)
        response = self.client.invoke_model(modelId=self.model, body=body)
        response_body = json.loads(response.get("body").read())

        self.last_prompt = prompt
        # print(response_body["content"][0]["text"])
        return response_body["content"][0]["text"]

    @property
    def type(self) -> str:
        return "bedrock-claude"
```
Ensure that the `BedrockClaude` class is correctly instantiated and configured to use the appropriate model and parameters [1][2].
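Note that the `_supported__models` list above contains only Claude 3 model IDs. A hedged sketch of a working setup follows; the import path, the `bedrock-runtime` client name, and the `data` DataFrame (from the bug report below) are assumptions to verify against your pandasai version:

```python
import boto3
from pandasai import SmartDataframe
from pandasai.llm import BedrockClaude  # assumed export path; adjust to your version

# Assumed: a standard boto3 Bedrock runtime client
bedrock_runtime = boto3.client("bedrock-runtime", region_name="us-west-2")

llm = BedrockClaude(
    bedrock_runtime_client=bedrock_runtime,
    model="anthropic.claude-3-sonnet-20240229-v1:0",  # must be in _supported__models
    max_tokens=2000,
)
df = SmartDataframe(data, config={"llm": llm})  # `data` as loaded in the report below
```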
To continue talking to Dosu, mention @dosu.
System Info
I encountered the error `pandasai.exceptions.NoCodeFoundError: No code found in the response` as per the attached file.

- Pandasai 2.2.12
- Python 3.10.14
- LLM: Amazon Bedrock - Anthropic Claude 2
🐛 Describe the bug
```python
!pip install -q hdbcli pandas pandasai langchain langchain_aws sagemaker polars pydantic==1.10.8
```

Setup boto3 client to access Bedrock in a shared AWS account:

```python
import json
import os
import sys

import boto3
import botocore

module_path = "/home/sagemaker-user/myutils.zip"
sys.path.append(os.path.abspath(module_path))
from myutils import bedrock, print_ww

# ---- ⚠️ Un-comment and edit the below lines as needed for your AWS setup ⚠️ ----
os.environ["AWS_DEFAULT_REGION"] = "us-west-2"
os.environ["AWS_PROFILE"] = ""
os.environ["BEDROCK_ASSUME_ROLE"] = "arn:aws:iam:::role/Crossaccountbedrock"  # E.g. "arn:aws:..."

boto3_bedrock = bedrock.get_bedrock_client(
    assumed_role=os.environ.get("BEDROCK_ASSUME_ROLE", None),
    region=os.environ.get("AWS_DEFAULT_REGION", None),
    runtime=True,
)
```

Import your dependencies:

```python
from hdbcli import dbapi
import pandas as pd
from pandasai import SmartDataframe
from langchain_aws import BedrockLLM
```

Initialize your connection:

```python
conn = dbapi.connect(
    address='xxxx.hana.trial-us10.hanacloud.ondemand.com',
    port='443',
    user='xxxx',
    password='xxxx',
    encrypt=True,
    sslValidateCertificate=True,
)

# If no errors, print connected
print('connected\n')

schema = "USER1"
tablename = "ALL_RESERVATIONS"
data = pd.read_sql(f'select * from {schema}.{tablename}', conn)
print(data)
```

Instantiate an LLM:

```python
model_parameter = {"temperature": 0, "max_tokens_to_sample": 2000}
langchain_llm = BedrockLLM(model_id="anthropic.claude-v2", model_kwargs=model_parameter, client=boto3_bedrock)
df = SmartDataframe(data, config={"llm": langchain_llm})

df.chat("List the hotel name sorted by its total number of nights reserved")
df.chat("What is the sum of nights for all the hotels ?")
df.chat("Provide Full Analysis of this data for a Hotel Manager.")
```
pandasai29072024.zip