langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

"At least one message is required" error when there is no user message for Claude LLM on workflow app #8788

Closed. dwgeneral closed this issue 5 days ago.

dwgeneral commented 1 month ago

Self Checks

Dify version

v0.8.3

Cloud or Self Hosted

Self Hosted (Source)

Steps to reproduce

  1. Created a workflow app with a Claude LLM node.
  2. Provided only a system message on the LLM node (see the attached screenshot).
  3. Got an "at least one message is required" error (see the attached screenshot).
  4. Other models do not hit this error in the same case (see the sketch after this list).
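
For background, the sketch below (outside Dify, using the anthropic Python SDK directly) shows why a system-only prompt is a problem specifically for Claude: the Messages API takes the system prompt as a separate system parameter and requires at least one entry in messages. The model name is illustrative, the exact error wording may differ, and an ANTHROPIC_API_KEY is assumed to be set.

import anthropic

# Illustrative only: Claude's Messages API keeps the system prompt in a separate
# "system" parameter, so a "system message only" prompt leaves "messages" empty,
# which the API rejects.
client = anthropic.Anthropic()  # reads ANTHROPIC_API_KEY from the environment

try:
    client.messages.create(
        model="claude-3-5-sonnet-20240620",  # illustrative model name
        max_tokens=256,
        system="You are a helpful assistant.",
        messages=[],  # no user message -> request is rejected
    )
except anthropic.BadRequestError as error:
    # Expect an error along the lines of "at least one message is required".
    print(error)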

✔️ Expected Behavior

There should be no error; the correct response should be returned, as shown in the attached screenshot.

I have already opened an MR to mitigate this issue; please check it out.

❌ Actual Behavior

Got an "at least one message is required" error.

dosubot[bot] commented 1 month ago

I found a similar closed issue that might be relevant to your problem: "bad_request: The browser (or proxy) sent a request that this server could not understand." The solution there was to escape the newline character in the text [1].

Regarding the specific issue with Claude LLM requiring at least one user message, this is due to the implementation of the MinimaxChatCompletionPro class, which explicitly checks for the presence of a user message after processing the system message. If only a system message is provided, it raises a BadRequestError [2].

The relevant code snippet is as follows:

# check if there is a system message
if len(prompt_messages) == 0:
    raise BadRequestError("At least one message is required")
if prompt_messages[0].role == MinimaxMessage.Role.SYSTEM.value:
    if prompt_messages[0].content:
        bot_setting["content"] = prompt_messages[0].content
    prompt_messages = prompt_messages[1:]

# check if there is a user message
if len(prompt_messages) == 0:
    raise BadRequestError("At least one user message is required")

This requirement ensures proper interaction flow with the AI model. Other models might not have this constraint due to different implementation details.

Your pull request likely addresses this by ensuring that there is at least one user message and removing any empty system messages from the prompt messages list. This is a common approach to mitigate such issues [3].
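
For illustration, here is a minimal, self-contained sketch of that approach. The PromptMessage type below is a hypothetical stand-in rather than Dify's actual entity classes, and the sketch is not necessarily what the pull request implements:

from dataclasses import dataclass

@dataclass
class PromptMessage:
    # Hypothetical stand-in for Dify's prompt message entities, for illustration only.
    role: str      # "system", "user", or "assistant"
    content: str

def sanitize_prompt_messages(prompt_messages: list[PromptMessage]) -> list[PromptMessage]:
    """Drop empty system messages and ensure at least one user message exists."""
    # Remove system messages whose content is empty or whitespace-only.
    cleaned = [
        message for message in prompt_messages
        if not (message.role == "system" and not message.content.strip())
    ]
    # If no user message remains, append a placeholder user turn so providers
    # that require one (e.g. Claude) do not reject the request. A real fix would
    # need to choose meaningful placeholder content.
    if not any(message.role == "user" for message in cleaned):
        cleaned.append(PromptMessage(role="user", content="."))
    return cleaned

# Example: a system-only prompt gains a placeholder user turn.
print(sanitize_prompt_messages([PromptMessage(role="system", content="You are a helpful assistant.")]))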

To continue talking to Dosu, mention @dosu.