run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai

[Question]: OpenAILike chat has a problem. #15298

Open · cxycxm opened this issue 3 months ago

cxycxm commented 3 months ago


Question

Code:

messages = []
message = ChatMessage(role="user", content="你是谁?")  # "Who are you?"
messages.append(message)
OpenAILike(model="apus", api_base="http://10.11.50.38:8082/v1/chat", temperature=temperature).chat(messages)

Problem:

File "C:\Users\chenxumin\AppData\Local\anaconda3\envs\agent\Lib\site-packages\openai\_base_client.py", line 1042, in _request
    raise self._make_status_error_from_response(err.response) from None
openai.UnprocessableEntityError: Error code: 422 - {'error': "[{'loc': ('body', 'messages'), 'msg': 'field required', 'type': 'value_error.missing'}]"}

dosubot[bot] commented 3 months ago

To resolve the UnprocessableEntityError reporting that the messages field is required but missing, make sure the messages list is built correctly and passed to the chat call. Here is an example of how you might structure your request:

from llama_index.llms.openai_like import OpenAILike
from llama_index.core.base.llms.types import ChatMessage, MessageRole

# Create the messages list
messages = [ChatMessage(role=MessageRole.USER, content='你是谁?')]

# Initialize the OpenAILike instance
chat = OpenAILike(
    model="apus",
    api_base="http://10.11.50.38:8082/v1/chat",
    temperature=0.7  # replace with your desired temperature value
)

# Send the chat request
response = chat.chat(messages)
print(response)

Ensure that the ChatMessage and MessageRole are correctly imported and used to create the messages list. This should resolve the UnprocessableEntityError [1].

To continue talking to Dosu, mention @dosu.

cxycxm commented 3 months ago

The bot's reply is wrong; please give me a correct answer.

dosubot[bot] commented 3 months ago

To fix the UnprocessableEntityError in your OpenAILike chat implementation, ensure that the messages field is correctly formatted and included in your request. Here's an example of how to structure your request:

from llama_index.llms.openai_like import OpenAILike
from llama_index.core.llms import ChatMessage, MessageRole

# Create the messages list
messages = [ChatMessage(role=MessageRole.USER, content='你是谁?')]

# Initialize the OpenAILike instance
chat = OpenAILike(
    model="apus",
    api_base="http://10.11.50.38:8082/v1",  # API root, without the /chat suffix
    api_key="fake",        # the OpenAI client requires an api_key value even if the server ignores it
    is_chat_model=True,    # tell OpenAILike to call the chat completions endpoint
    temperature=0.7        # replace with your desired temperature value
)

# Send the chat request
response = chat.chat(messages)
print(response)

Ensure that:

  1. ChatMessage and MessageRole are imported from the correct module.
  2. The api_base URL points at the API root, e.g. "http://10.11.50.38:8082/v1", not at a specific endpoint path [1] (see the sketch after this list).
  3. The messages list is not empty and contains at least one ChatMessage object [2].
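
If the 422 persists, it can help to confirm outside of llama_index that the server exposes an OpenAI-compatible chat endpoint at that base URL. The OpenAI client used by OpenAILike appends /chat/completions to api_base, so a minimal sketch with requests (assuming the same host and model name, and that the server needs no authentication) would be:

import requests

api_base = "http://10.11.50.38:8082/v1"  # assumed API root, without the /chat suffix

payload = {
    "model": "apus",
    "messages": [{"role": "user", "content": "你是谁?"}],  # "Who are you?"
    "temperature": 0.7,
}

# The OpenAI client posts to {api_base}/chat/completions, so probe that URL directly.
resp = requests.post(f"{api_base}/chat/completions", json=payload, timeout=30)
print(resp.status_code)
print(resp.json())

A 200 response here but a 422 through OpenAILike points at the client-side configuration; a 422 here as well points at the request schema the server expects.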

This should resolve the UnprocessableEntityError.

To continue talking to Dosu, mention @dosu.