Closed · dwgeneral closed 5 days ago
I found a similar closed issue that might be relevant to your problem: bad_request: The browser (or proxy) sent a request that this server could not understand. The solution was to escape the newline character in the text [1].
Regarding the specific issue with Claude LLM requiring at least one user message: this comes from the implementation of the MinimaxChatCompletionPro class, which explicitly checks for a user message after processing the system message. If only a system message is provided, it raises a BadRequestError [2].
The relevant code snippet is as follows:

```python
# check if there is a system message
if len(prompt_messages) == 0:
    raise BadRequestError("At least one message is required")

if prompt_messages[0].role == MinimaxMessage.Role.SYSTEM.value:
    if prompt_messages[0].content:
        bot_setting["content"] = prompt_messages[0].content
    prompt_messages = prompt_messages[1:]

# check if there is a user message
if len(prompt_messages) == 0:
    raise BadRequestError("At least one user message is required")
```
This requirement ensures proper interaction flow with the AI model. Other models might not have this constraint due to different implementation details.
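To make the behavior concrete, here is a minimal, self-contained sketch of that validation flow. `Message`, `Role`, `BadRequestError`, and `validate_and_strip_system` are simplified stand-ins for the real Dify classes, not the actual implementation:

```python
from dataclasses import dataclass
from enum import Enum


class BadRequestError(Exception):
    """Stand-in for Dify's BadRequestError."""


class Role(Enum):
    SYSTEM = "system"
    USER = "user"


@dataclass
class Message:
    role: str
    content: str


def validate_and_strip_system(prompt_messages, bot_setting):
    # reject a completely empty prompt
    if len(prompt_messages) == 0:
        raise BadRequestError("At least one message is required")
    # a leading system message is moved into bot_setting
    if prompt_messages[0].role == Role.SYSTEM.value:
        if prompt_messages[0].content:
            bot_setting["content"] = prompt_messages[0].content
        prompt_messages = prompt_messages[1:]
    # after stripping the system message, a user message must remain
    if len(prompt_messages) == 0:
        raise BadRequestError("At least one user message is required")
    return prompt_messages


# A system-only prompt reproduces the reported error:
try:
    validate_and_strip_system([Message(Role.SYSTEM.value, "You are helpful")], {})
except BadRequestError as e:
    print(e)  # At least one user message is required
```

This shows why a prompt containing only a system message fails: the system message is consumed into `bot_setting`, leaving an empty list for the second check.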
Your pull request likely addresses this by ensuring that there is at least one user message and removing any empty system messages from the prompt messages list. This is a common approach to mitigate such issues [3].
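As an illustration of that mitigation pattern (not the actual code in the pull request; `sanitize_prompt_messages`, `Message`, and `Role` are hypothetical names), a pre-processing step could look like:

```python
from dataclasses import dataclass
from enum import Enum


class Role(Enum):
    SYSTEM = "system"
    USER = "user"


@dataclass
class Message:
    role: str
    content: str


def sanitize_prompt_messages(messages):
    # drop system messages whose content is empty
    cleaned = [
        m for m in messages
        if not (m.role == Role.SYSTEM.value and not m.content)
    ]
    # guarantee at least one user message so the downstream
    # validation does not reject the request
    if not any(m.role == Role.USER.value for m in cleaned):
        cleaned.append(Message(Role.USER.value, ""))
    return cleaned
```

Running this before the validation above would turn a system-only prompt into one that passes the "at least one user message" check.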
Dify version
v0.8.3
Cloud or Self Hosted
Self Hosted (Source)
Steps to reproduce
✔️ Expected Behavior
Should return the correct response without an error, like this.
I already have an MR to mitigate this issue; please check it out.
❌ Actual Behavior
Got an "at least one message is required" error.