Sinaptik-AI / pandas-ai

Chat with your database (SQL, CSV, pandas, polars, mongodb, noSQL, etc). PandasAI makes data analysis conversational using LLMs (GPT 3.5 / 4, Anthropic, VertexAI) and RAG.
https://pandas-ai.com

OpenAI error when using PandasAI (openai.BadRequestError: Error code: 400) #1365

Open · xiaozhi-ai20 opened this issue 1 month ago

xiaozhi-ai20 commented 1 month ago

System Info

Unfortunately, I was not able to answer your question, because of the following error:

Error code: 400 - {'object': 'error', 'message': 'Conversation roles must alternate user/assistant/user/assistant/...', 'type': 'BadRequestError', 'param': None, 'code': 400}

Traceback (most recent call last):
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\pipelines\chat\generate_chat_pipeline.py", line 335, in run
    ).run(input)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\pipelines\pipeline.py", line 137, in run
    raise e
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\pipelines\pipeline.py", line 101, in run
    step_output = logic.execute(
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\pipelines\chat\code_generator.py", line 33, in execute
    code = pipeline_context.config.llm.generate_code(input, pipeline_context)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\llm\base.py", line 201, in generate_code
    response = self.call(instruction, context)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\llm\base.py", line 391, in call
    self.chat_completion(self.last_prompt, memory)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\pandasai\llm\base.py", line 365, in chat_completion
    response = self.client.create(**params)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\openai\_utils\_utils.py", line 275, in wrapper
    return func(*args, **kwargs)
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\openai\resources\chat\completions.py", line 667, in create
    return self._post(
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\openai\_base_client.py", line 1213, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\openai\_base_client.py", line 902, in request
    return self._request(
  File "E:\users\xxx\.conda\envs\llm_new\lib\site-packages\openai\_base_client.py", line 993, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.BadRequestError: Error code: 400 - {'object': 'error', 'message': 'Conversation roles must alternate user/assistant/user/assistant/...', 'type': 'BadRequestError', 'param': None, 'code': 400}

🐛 Describe the bug

Error code: 400 - {'object': 'error', 'message': 'Conversation roles must alternate user/assistant/user/assistant/...', 'type': 'BadRequestError', 'param': None, 'code': 400}
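For context: the {'object': 'error', ...} payload shape and this particular message typically come from an OpenAI-compatible backend (for example a vLLM-style server hosting a model whose chat template rejects histories where the same role appears twice in a row), rather than from api.openai.com itself. A minimal reproduction sketch, with placeholder base URL, API key, and model name (none of these values come from the original report):

from openai import OpenAI

# Placeholder endpoint, key, and model for illustration only.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

messages = [
    {"role": "system", "content": "You are a data analysis assistant."},
    {"role": "user", "content": "Describe the dataframe."},
    {"role": "user", "content": "Now plot the first column."},  # two user turns in a row
]

# On backends that enforce strict user/assistant alternation this raises
# openai.BadRequestError with the same "Conversation roles must alternate" message.
client.chat.completions.create(model="my-model", messages=messages)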

dosubot[bot] commented 1 month ago

Hello, @52phm! I'm here to help you out with any bugs, questions, or contributions you have. Let's tackle this together!

To resolve OpenAI error code 400 ('Conversation roles must alternate user/assistant/user/assistant/...') when using the pandasai library, ensure that the conversation messages alternate correctly between the user and assistant roles. The Memory class in pandasai/helpers/memory.py provides a to_openai_messages method that formats the stored messages for the OpenAI API.

Here is the relevant code snippet from the Memory class:

def to_openai_messages(self):
    """
    Returns the conversation messages in the format expected by the OpenAI API
    """
    messages = []
    if self.agent_info:
        messages.append(
            {
                "role": "system",
                "content": self.get_system_prompt(),
            }
        )
    for message in self.all():
        if message["is_user"]:
            messages.append({"role": "user", "content": message["message"]})
        else:
            messages.append({"role": "assistant", "content": message["message"]})
    return messages

Ensure that the messages added to the Memory instance alternate between the user and assistant roles. If they do not, adjust the logic where messages are added to the Memory instance so that the alternation is maintained, for example by merging consecutive same-role messages as sketched below [1].
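The helper below is not part of pandasai; it is a hypothetical post-processing step that collapses consecutive messages sharing a role, so the payload sent to the API satisfies the alternation rule without losing any content:

def merge_consecutive_roles(messages: list[dict]) -> list[dict]:
    """Collapse consecutive messages that share a role into a single message."""
    merged: list[dict] = []
    for msg in messages:
        if merged and merged[-1]["role"] == msg["role"]:
            # Same role twice in a row: append the text to the previous turn
            # instead of emitting a second message with that role.
            merged[-1]["content"] += "\n" + msg["content"]
        else:
            merged.append(dict(msg))
    return merged

print(merge_consecutive_roles([
    {"role": "user", "content": "Describe the dataframe."},
    {"role": "user", "content": "Now plot the first column."},
    {"role": "assistant", "content": "Here is the plot."},
]))
# [{'role': 'user', 'content': 'Describe the dataframe.\nNow plot the first column.'},
#  {'role': 'assistant', 'content': 'Here is the plot.'}]

Running the output of to_openai_messages() through such a helper (for example in a subclass or a patched copy of Memory) keeps the history intact while satisfying backends that enforce strict alternation.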

To continue talking to Dosu, mention @dosu.

xiaozhi-ai20 commented 1 month ago

My installed source code is already the same as the code you provided. What replacement code would actually resolve the openai.BadRequestError?

dosubot[bot] commented 1 month ago

To handle the BadRequestError from OpenAI, you need to ensure that the conversation roles alternate correctly between user and assistant. The error message indicates that the roles in the conversation history are not alternating as required.

Here is a revised version of the to_openai_messages method in the Memory class that validates the role order:

def to_openai_messages(self):
    """
    Returns the conversation messages in the format expected by the OpenAI API
    """
    messages = []
    if self.agent_info:
        messages.append(
            {
                "role": "system",
                "content": self.get_system_prompt(),
            }
        )
    last_role = None
    for message in self.all():
        current_role = "user" if message["is_user"] else "assistant"
        if current_role == last_role:
            raise ValueError("Conversation roles must alternate between user and assistant.")
        messages.append({"role": current_role, "content": message["message"]})
        last_role = current_role
    return messages

This code checks that the roles alternate and raises a ValueError if they do not, so the malformed history surfaces before the request is sent to OpenAI. You can integrate this into your Memory class to catch the condition that leads to the BadRequestError [1].
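For illustration, here is the same validation loop run on a hand-built history (a standalone sketch; it does not construct a real Memory object, whose constructor is not shown in this thread):

# Two consecutive user turns now fail fast with ValueError
# instead of reaching the server as a 400.
history = [
    {"is_user": True, "message": "Describe the dataframe."},
    {"is_user": True, "message": "Now plot the first column."},
]

last_role = None
for message in history:
    current_role = "user" if message["is_user"] else "assistant"
    if current_role == last_role:
        raise ValueError("Conversation roles must alternate between user and assistant.")
    last_role = current_role

Note that this check only reports the bad history earlier; it does not change the messages that are sent, so a non-alternating history still has to be fixed where messages are added (or merged, as in the earlier sketch).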

To continue talking to Dosu, mention @dosu.

xiaozhi-ai20 commented 1 month ago

No effect; it still reports the same error.