One query to rule them all
| Documentation | Blog | Discord |
Korvus is a search SDK that unifies the entire RAG pipeline in a single database query. Built on top of Postgres with bindings for Python, JavaScript and Rust, Korvus delivers high-performance, customizable search capabilities with minimal infrastructure concerns.
https://github.com/postgresml/korvus/assets/19626586/2b697dc6-8c38-41a7-8c8e-ef158dacb29b
Korvus is an all-in-one, open-source RAG (Retrieval-Augmented Generation) pipeline built for Postgres. It combines LLMs, vector memory, embedding generation, reranking, summarization and custom models into a single query, maximizing performance and simplifying your search architecture.
Korvus provides SDK support for multiple programming languages, allowing you to integrate it seamlessly into your existing tech stack: Python, JavaScript, and Rust.
Korvus stands out by harnessing the full power of Postgres for RAG operations:
Postgres-Native RAG: Korvus leverages Postgres' robust capabilities, allowing you to perform complex RAG operations directly within your database. This approach eliminates the need for external services and extra API calls, significantly reducing latency and complexity.
Single Query Efficiency: With Korvus, your entire RAG pipeline - from embedding generation to text generation - is executed in a single SQL query. This "one query to rule them all" approach simplifies your architecture and boosts performance.
Scalability and Performance: By building on Postgres, Korvus inherits its excellent scalability and performance characteristics. As your data grows, Korvus grows with it, maintaining high performance even with large datasets.
Korvus utilizes PostgresML's pgml extension and the pgvector extension to compress the entire RAG pipeline inside of Postgres.
To use Korvus, you need a Postgres database with pgml and pgvector installed. You have two options:
Self-hosted: Set up your own database with pgml and pgvector.
Hosted Service: Use our managed Postgres service with pgml and pgvector pre-installed.
1. Install Korvus

```shell
pip install korvus
```
2. Set the `KORVUS_DATABASE_URL` environment variable:

```shell
export KORVUS_DATABASE_URL="{YOUR DATABASE CONNECTION STRING}"
```
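Before running the examples, it can help to confirm the variable is actually visible to your process. This is a plain-Python helper, nothing Korvus-specific; `check_database_url` is just an illustrative name:

```python
# Plain-Python helper (not part of Korvus) to confirm the connection
# string is set before running the examples below.
import os


def check_database_url(env=os.environ):
    """Return the configured connection string, or None if it is missing."""
    url = env.get("KORVUS_DATABASE_URL")
    return url if url else None


if check_database_url() is None:
    print("KORVUS_DATABASE_URL is not set")
```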
3. Create a Collection and add a Pipeline

```python
from korvus import Collection, Pipeline
import asyncio

collection = Collection("korvus-demo-v0")
pipeline = Pipeline(
    "v1",
    {
        "text": {
            "splitter": {"model": "recursive_character"},
            "semantic_search": {"model": "Alibaba-NLP/gte-base-en-v1.5"},
        }
    },
)


async def add_pipeline():
    await collection.add_pipeline(pipeline)


asyncio.run(add_pipeline())
```
4. Upsert documents

```python
async def upsert_documents():
    documents = [
        {"id": "1", "text": "Korvus is incredibly fast and easy to use."},
        {"id": "2", "text": "Tomatoes are incredible on burgers."},
    ]
    await collection.upsert_documents(documents)


asyncio.run(upsert_documents())
```
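As the name suggests, `upsert_documents` inserts or updates: documents are keyed by their `id`, so re-upserting an existing id replaces that document rather than creating a duplicate. A plain-Python sketch of that behavior (no database involved, not the Korvus implementation):

```python
# Plain-Python sketch of upsert-by-id semantics; just an illustration
# of the behavior, not how Korvus implements it.
store = {}


def upsert(documents):
    for doc in documents:
        store[doc["id"]] = doc  # a new id inserts, an existing id overwrites


upsert([{"id": "1", "text": "Korvus is incredibly fast and easy to use."}])
upsert([{"id": "1", "text": "Korvus unifies the RAG pipeline in one query."}])
print(len(store))  # prints 1: the second upsert replaced the first
```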
5. Perform RAG
```python
async def rag():
    query = "Is Korvus fast?"
    print(f"Querying for response to: {query}")
    results = await collection.rag(
        {
            "CONTEXT": {
                "vector_search": {
                    "query": {
                        "fields": {"text": {"query": query}},
                    },
                    "document": {"keys": ["id"]},
                    "limit": 1,
                },
                "aggregate": {"join": "\n"},
            },
            "chat": {
                "model": "meta-llama/Meta-Llama-3-8B-Instruct",
                "messages": [
                    {
                        "role": "system",
                        "content": "You are a friendly and helpful chatbot",
                    },
                    {
                        "role": "user",
                        "content": f"Given the context:\n{{CONTEXT}}\nAnswer the question: {query}",
                    },
                ],
                "max_tokens": 100,
            },
        },
        pipeline,
    )
    print(results)


asyncio.run(rag())
```
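To make the query spec above less magical, here is a plain-Python sketch (no database) of what the `CONTEXT` step produces: the chunks returned by `vector_search` are joined with the `"aggregate": {"join": "\n"}` separator, and the result is substituted for `{CONTEXT}` in the user message before the chat model runs.

```python
# Plain-Python illustration of the CONTEXT substitution; the real work
# happens inside Postgres, this just mirrors the shape of the query spec.
retrieved_chunks = [
    "Korvus is incredibly fast and easy to use.",  # vector_search result, limit 1
]
context = "\n".join(retrieved_chunks)  # mirrors "aggregate": {"join": "\n"}

query = "Is Korvus fast?"
user_message = f"Given the context:\n{context}\nAnswer the question: {query}"
print(user_message)
```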
While Korvus provides a high-level interface in multiple programming languages, its core operations are built on optimized SQL queries. This approach keeps data processing inside the database, close to your data, and lets Postgres plan and optimize the entire pipeline.
Don't worry if you're not a SQL expert: Korvus's intuitive API abstracts away the complexity while still letting you harness the full power of SQL-based operations.
For comprehensive documentation, including API references, tutorials, and best practices, visit our official documentation.
Join our community to get help, share ideas, and contribute:
We welcome contributions to Korvus! Please read our Contribution Guidelines before submitting pull requests.
Korvus is maintained by PostgresML. For enterprise support and consulting services, please contact us.