Closed QuietRocket closed 1 month ago
I think using workflows as tools would compound the benefit of this feature. Many times I take a query and fan it out as multiple individual requests to a cheaper LLM, rather than packing everything into a single completion. Or I might want to execute multiple searches via the SearXNG tool in parallel, then aggregate the responses.
At the moment, there is no way to do this without moving it outside of Dify, where the token counts are no longer available for understanding the total workflow cost.
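To make the pattern concrete, here is a minimal sketch of the fan-out-then-aggregate step done externally with asyncio. The `run_query` helper is a hypothetical stand-in for a SearXNG search or a cheap-LLM call; nothing here is Dify API:

```python
import asyncio

async def run_query(query: str) -> str:
    # Hypothetical stand-in for one SearXNG search or cheap-LLM request.
    await asyncio.sleep(0.01)  # simulate network latency
    return f"result for {query!r}"

async def fork_join(queries: list[str]) -> str:
    # Fork: launch all sub-queries concurrently.
    results = await asyncio.gather(*(run_query(q) for q in queries))
    # Join: aggregate the individual responses into one context.
    return "\n".join(results)

print(asyncio.run(fork_join(["a", "b", "c"])))
```

The drawback, as noted above, is that token usage from these external calls never shows up in Dify's workflow cost accounting.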
A parallel-nodes feature would be very useful; it's a common use case. Here is a mock-up of the kind of workflow I have in mind, which would make workflows more efficient and flexible if the parallel node could connect to any kind of node.
It probably needs a mechanism for rate and concurrency limiting, but a fork-join or map-reduce pattern would make many use cases for the new iteration block run much faster.
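As a rough illustration of the concurrency-limiting point, a map-reduce over branches can be throttled with a semaphore so a fork of N branches never issues more than a fixed number of upstream requests at once. This is a minimal asyncio sketch; the limit of 4 and the doubling "work" are arbitrary placeholders:

```python
import asyncio

async def map_reduce(n: int, limit: int = 4) -> int:
    # Cap in-flight branches; a real node would make this configurable.
    sem = asyncio.Semaphore(limit)

    async def branch(i: int) -> int:
        async with sem:  # at most `limit` branches run concurrently
            await asyncio.sleep(0.01)  # simulated upstream call
            return i * 2  # placeholder per-branch work

    # Map (fork): schedule all branches; the semaphore throttles them.
    mapped = await asyncio.gather(*(branch(i) for i in range(n)))
    # Reduce (join): combine branch outputs.
    return sum(mapped)

print(asyncio.run(map_reduce(10)))  # → 90
```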
Any update on ETA for this feature?
Eagerly waiting for this feature
This would be very useful.
This will greatly enhance the efficiency of my workflows. I look forward to seeing this feature implemented soon.
Self Checks
1. Is this request related to a challenge you're experiencing?
Some workflow configurations have high response latency because they require multiple orthogonal requests that currently execute sequentially. This latency can be greatly reduced with simple fork-join parallelism, for example by running multiple LLM generations in parallel with HTTP requests.
2. Describe the feature you'd like to see
Add Fork and Join blocks.
3. How will this feature improve your workflow or experience?
Improves response latency by performing orthogonal tasks in parallel.
4. Additional context or comments
Fork Join diagram courtesy of this paper.
5. Can you help us with this feature?
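To show why fork-join helps latency, here is a small sketch (not Dify code): sequential execution costs roughly the sum of the branch latencies, while a fork-join costs roughly the slowest branch. The fixed 0.05s delays stand in for orthogonal LLM/HTTP calls:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def orthogonal_task(delay: float) -> float:
    time.sleep(delay)  # stand-in for one LLM generation or HTTP request
    return delay

delays = [0.05, 0.05, 0.05]

# Sequential: total latency is roughly the sum of the branches (~0.15s).
t0 = time.perf_counter()
for d in delays:
    orthogonal_task(d)
sequential = time.perf_counter() - t0

# Fork-join: total latency approaches the slowest branch (~0.05s).
t0 = time.perf_counter()
with ThreadPoolExecutor() as pool:
    list(pool.map(orthogonal_task, delays))
parallel = time.perf_counter() - t0

print(f"sequential={sequential:.2f}s parallel={parallel:.2f}s")
```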