run-llama / llama_index

LlamaIndex is a data framework for your LLM applications
https://docs.llamaindex.ai
MIT License

[Question]: Orchestrate multiple workflows. #15590

Open yb-sid opened 2 weeks ago

yb-sid commented 2 weeks ago

Question

I tried using llama-agents to build a multi-agent text-to-SQL system. Logan from the team suggested using LlamaIndex's Workflows instead (https://docs.llamaindex.ai/en/stable/module_guides/workflow/).

I was able to build a working text-to-SQL-and-execution workflow. Now I want to extend this to multiple workflows, i.e. a workflow for SQL generation, a workflow for document Q/A, and maybe a workflow for general Q/A on the database.

How can I orchestrate and control outputs from multiple workflows?

So a user could give any request and it would be routed to the appropriate workflow.

Thanks in advance for any inputs!

dosubot[bot] commented 2 weeks ago

Hello, @yb-sid! I'm here to assist you with any questions or issues you may have. Let's dive into your query and work towards a solution.

To orchestrate and control outputs from multiple workflows, such as SQL generation, document Q/A, and general Q/A on a database using LlamaIndex's workflow, you can leverage the framework's ability to combine agents, data connectors, and other tools into an event-driven, step-based system. Here is a conceptual example in Python:

import asyncio

from llama_index.core import VectorStoreIndex, SimpleDirectoryReader
from llama_index.core.workflow import Workflow, StartEvent, StopEvent, step

# Load documents for document Q/A
documents = SimpleDirectoryReader("data").load_data()
index = VectorStoreIndex.from_documents(documents)
query_engine = index.as_query_engine()

# Define a workflow whose single step dispatches to a different branch per task
class MultiTaskWorkflow(Workflow):
    @step
    async def dispatch(self, ev: StartEvent) -> StopEvent:
        if ev.task == "doc_qa":
            # Document question answering over the loaded index
            response = await query_engine.aquery(ev.query)
            return StopEvent(result=str(response))
        if ev.task == "sql":
            # Placeholder: plug in your text-to-SQL generation logic here
            return StopEvent(result=f"SQL for: {ev.query}")
        # Fallback: general question answering
        return StopEvent(result=f"Answer for: {ev.query}")

# Execute the workflow for each task and print the results
async def main():
    workflow = MultiTaskWorkflow(timeout=60)
    for task, query in [
        ("sql", "Generate SQL for retrieving user data"),
        ("doc_qa", "What is the revenue for Q1?"),
        ("general_qa", "What is the capital of France?"),
    ]:
        result = await workflow.run(task=task, query=query)
        print(result)

asyncio.run(main())

This example shows how a single workflow can handle several kinds of tasks and run them in sequence. Each branch of the dispatch step is responsible for a specific type of task, and the workflow orchestrates execution and returns each result.

LlamaIndex's workflow system is event-driven and step-based, allowing you to break down your application into smaller, manageable pieces called Steps, which are triggered by Events. This approach makes it easier to manage complex flows, including loops and branches, and encapsulate logic in a maintainable way.
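For reference, here is a minimal sketch of that step/event pattern using the llama_index.core.workflow API; the event names, step names, and the keyword-based branching condition below are invented purely for illustration:

from typing import Union

from llama_index.core.workflow import Event, StartEvent, StopEvent, Workflow, step

class SQLQueryEvent(Event):
    query: str

class DocQueryEvent(Event):
    query: str

class BranchingFlow(Workflow):
    @step
    async def route(self, ev: StartEvent) -> Union[SQLQueryEvent, DocQueryEvent]:
        # Branch by emitting different event types; the step whose input
        # annotation matches the emitted event runs next
        if "sql" in ev.query.lower():
            return SQLQueryEvent(query=ev.query)
        return DocQueryEvent(query=ev.query)

    @step
    async def handle_sql(self, ev: SQLQueryEvent) -> StopEvent:
        return StopEvent(result=f"SQL branch handled: {ev.query}")

    @step
    async def handle_docs(self, ev: DocQueryEvent) -> StopEvent:
        return StopEvent(result=f"Document branch handled: {ev.query}")

Running await BranchingFlow().run(query="...") returns whichever result the chosen branch produced.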

Additionally, LlamaIndex supports various advanced techniques for question-answering (QA) over different types of data, such as unstructured documents, structured data (like SQL databases), and combining multiple data sources. For example, you can use Retrieval Augmented Generation (RAG) for semantic search and summarization over unstructured documents, or text-to-SQL for querying structured data. This flexibility allows you to tailor your workflow to handle diverse data sources and complex queries effectively [1][2][3].
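As a concrete sketch of the text-to-SQL piece mentioned above, using llama_index's NLSQLTableQueryEngine (the SQLite URL and table name here are placeholders, not details from this thread):

from sqlalchemy import create_engine
from llama_index.core import SQLDatabase
from llama_index.core.query_engine import NLSQLTableQueryEngine

# Placeholder connection string and table name
engine = create_engine("sqlite:///example.db")
sql_database = SQLDatabase(engine, include_tables=["users"])

# Translates a natural-language question into SQL, runs it, and synthesizes an answer
sql_query_engine = NLSQLTableQueryEngine(sql_database=sql_database, tables=["users"])
response = sql_query_engine.query("How many users signed up last month?")
print(response)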

To continue talking to Dosu, mention @dosu.

yb-sid commented 2 weeks ago

This is not what I am talking about, @dosu.

I have multiple workflows:

sql_flow = SQLFlow()
db_qna_flow = DBQnaFlow()
document_flow = DocumentFlow()

A user could give any request.

How can I orchestrate among these multiple workflows?

Please don't give me AI-generated answers from the documentation.

dosubot[bot] commented 2 weeks ago

This is a tough nut to crack; I could really use your help on this, @logan-markewich.

To continue talking to Dosu, mention @dosu.

mezamateoj commented 2 days ago

@yb-sid did you find a way to achieve this? I'm trying to implement something like this. Also, I found an example that uses multiple workflows, but they are deployed as separate services: https://github.com/run-llama/llama_deploy/tree/main/examples/python_fullstack

yb-sid commented 15 hours ago

@mezamateoj Unfortunately no. I've been using llama-index's selector prompt and some OpenAI function calling to perform the orchestration.

The example you shared does not serve our purpose.
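For anyone who lands here looking for a starting point, below is a minimal sketch of the selector-based routing described above. The three workflow classes are stand-ins for the real SQLFlow / DBQnaFlow / DocumentFlow, and LLMSingleSelector is llama_index's built-in LLM selector; this is one possible wiring, not an official pattern.

import asyncio

from llama_index.core.selectors import LLMSingleSelector
from llama_index.core.tools import ToolMetadata
from llama_index.core.workflow import StartEvent, StopEvent, Workflow, step

# Stand-ins for the real workflows from this thread; each just echoes its branch
class SQLFlow(Workflow):
    @step
    async def generate_sql(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result=f"[SQLFlow] would generate SQL for: {ev.query}")

class DBQnaFlow(Workflow):
    @step
    async def answer(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result=f"[DBQnaFlow] would answer: {ev.query}")

class DocumentFlow(Workflow):
    @step
    async def answer(self, ev: StartEvent) -> StopEvent:
        return StopEvent(result=f"[DocumentFlow] would answer: {ev.query}")

# Describe each workflow so the LLM selector can choose among them
choices = [
    ToolMetadata(name="sql", description="Generate and execute SQL against the database"),
    ToolMetadata(name="db_qna", description="General Q/A about the database"),
    ToolMetadata(name="docs", description="Q/A over unstructured documents"),
]
flows = [SQLFlow(), DBQnaFlow(), DocumentFlow()]

async def route(query: str):
    # Ask the LLM which workflow best matches the request, then run only that one
    selector = LLMSingleSelector.from_defaults()
    decision = await selector.aselect(choices, query=query)
    chosen = flows[decision.selections[0].index]
    return await chosen.run(query=query)

print(asyncio.run(route("How many orders were placed last week?")))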