langgenius / dify

Dify is an open-source LLM app development platform. Dify's intuitive interface combines AI workflow, RAG pipeline, agent capabilities, model management, observability features and more, letting you quickly go from prototype to production.
https://dify.ai

Node re-run and iteration judgment termination #8489

Open suwubee opened 2 weeks ago

suwubee commented 2 weeks ago

Self Checks

1. Is this request related to a challenge you're experiencing? Tell me about your story.

The LLM always generates the content we specify in one pass, but in a long process something may go wrong at an intermediate step, or we may need to adjust the prompt or regenerate based on an intermediate result. The Iteration feature is indeed useful for long processes, but it lacks a way to act on the output after evaluating a condition. I found a similar feature in Coze: it can insert a question-and-answer step inside the loop to interact with the user and terminate the loop separately. There is still a gap, though: ideally the user's answers from that Q&A step could be added as new auxiliary prompts when the loop runs again, so the LLM can regenerate with the new suggestions. (screenshot attached)

2. Additional context or comments

I saw this mentioned as early as a year ago, but no one seems to be paying attention to this feature.

3. Can you help us with this feature?

laipz8200 commented 2 weeks ago

I've outlined your request as follows:

  1. A node similar to the Answer Node that offers the user multiple choices.
  2. A break function in the Iteration Node.

Please let me know if this aligns with your needs!

suwubee commented 2 weeks ago

> I've outlined your request as follows:
>
>   1. A node similar to the Answer Node that offers the user multiple choices.
>   2. A break function in the Iteration Node.
>
> Please let me know if this aligns with your needs!

1. Yes. 2. Not only a break; when the matching conditions are met, it should also be possible to re-run (restart) the node.

What I hope to achieve is that, at each key node, the user can decide whether to continue or to re-run that node. This would produce higher-quality output that better follows the interaction logic, instead of generating everything along a fixed process, as it does now, with no room for adjustment.
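To make the requested control flow concrete, here is a minimal sketch of the loop behavior described above: run a step, show the result, and let the user continue, re-run the step with a new suggestion, or stop the whole iteration. All names (`run_node`, `ask_user`, `interactive_iteration`) are hypothetical placeholders for illustration only, not part of Dify's or Coze's APIs.

```python
def run_node(prompt: str, extra_hint: str | None = None) -> str:
    """Stand-in for an LLM node run; a real workflow would call the model here."""
    hint = f" (with hint: {extra_hint})" if extra_hint else ""
    return f"generated output for '{prompt}'{hint}"


def ask_user(question: str, choices: list[str]) -> str:
    """Stand-in for the proposed 'multiple choice' answer node."""
    print(question)
    for i, choice in enumerate(choices, 1):
        print(f"  {i}. {choice}")
    picked = input("choice number: ").strip()
    return choices[int(picked) - 1]


def interactive_iteration(prompts: list[str]) -> list[str]:
    """Run each step, letting the user continue, re-run with a new hint, or break."""
    results = []
    for prompt in prompts:
        hint = None
        while True:
            output = run_node(prompt, hint)
            print(f"\n--- step output ---\n{output}\n")
            decision = ask_user(
                "How should the workflow proceed?",
                ["continue", "re-run with a new suggestion", "stop the iteration"],
            )
            if decision == "continue":
                results.append(output)
                break
            if decision == "re-run with a new suggestion":
                # Re-run this node with the user's extra hint folded into the prompt.
                hint = input("new suggestion for the model: ").strip()
                continue
            return results  # early break out of the whole iteration


if __name__ == "__main__":
    interactive_iteration(["outline chapter 1", "outline chapter 2"])
```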

I built this with a Coze workflow, but unfortunately, when the loop restarts, the subsequent conversation records are not carried over; only the variables are. (screenshot attached)
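A small sketch of the state-carry-over point above, assuming re-runs should receive the accumulated conversation history in addition to the loop variables. Everything here (`LoopState`, `rerun_step`) is illustrative, not Coze's or Dify's API.

```python
from dataclasses import dataclass, field


@dataclass
class LoopState:
    variables: dict                               # what is currently carried across restarts
    history: list = field(default_factory=list)   # what this request asks to carry as well


def rerun_step(state: LoopState, user_suggestion: str) -> str:
    # Record the user's new suggestion so the next generation can see it.
    state.history.append({"role": "user", "content": user_suggestion})
    # A real implementation would send state.history plus state.variables to the LLM.
    output = f"regenerated with {len(state.history)} prior messages and vars {state.variables}"
    state.history.append({"role": "assistant", "content": output})
    return output


state = LoopState(variables={"chapter": 2})
print(rerun_step(state, "make the tone more formal"))
print(rerun_step(state, "shorten the opening paragraph"))
```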

suwubee commented 2 days ago

@laipz8200 Flowith has created a new Oracle model; that is what I need.