:mag: AI orchestration framework to build customizable, production-ready LLM applications. Connect components (models, vector DBs, file converters) to pipelines or agents that can interact with your data. With advanced retrieval methods, it's best suited for building RAG, question answering, semantic search or conversational agent chatbots.
I have multiple document stores, each holding a different corpus, and I want to create a router that sends each query to the appropriate retriever. Each sub-branch will use a `PipelineComponent` with the following sub-components: prompt builder, LLM, validator.
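To make the intended flow concrete, here is a minimal, library-free Python sketch of the routing idea. It does not use the real Haystack API: `Branch`, `route`, and the keyword classifier are all hypothetical stand-ins for the router, prompt builder, LLM, and validator described above, with the LLM stubbed out as a plain function.

```python
from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class Branch:
    """Hypothetical sub-branch: prompt builder -> LLM -> validator."""
    build_prompt: Callable[[str], str]
    call_llm: Callable[[str], str]      # stand-in for a real LLM call
    validate: Callable[[str], bool]

    def run(self, query: str) -> str:
        prompt = self.build_prompt(query)
        answer = self.call_llm(prompt)
        if not self.validate(answer):
            raise ValueError("validator rejected the answer")
        return answer


def route(query: str,
          branches: Dict[str, Branch],
          classify: Callable[[str], str]) -> str:
    """Pick the branch for the query's corpus and run its chain."""
    return branches[classify(query)].run(query)


# Toy setup: one branch per corpus, a keyword-based classifier,
# and an "LLM" that just echoes its prompt.
branches = {
    "legal": Branch(lambda q: f"[legal docs] {q}",
                    lambda p: f"answer({p})",
                    lambda a: bool(a)),
    "tech": Branch(lambda q: f"[tech docs] {q}",
                   lambda p: f"answer({p})",
                   lambda a: bool(a)),
}
classify = lambda q: "legal" if "contract" in q else "tech"

print(route("What does the contract say?", branches, classify))
```

In real Haystack code the classifier step would typically be a routing component (e.g. a `ConditionalRouter`) and each branch a small sub-pipeline, but the control flow stays the same: classify once, then run exactly one branch end to end.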