Haystack is a great framework for building RAG pipelines. Haystack provides an API, and LibreChat is a great UI with solid user and document management already implemented. Do you see an opportunity to access Haystack from LibreChat via the API?
https://haystack.deepset.ai
Haystack offers numerous features that could significantly enhance LibreChat, while LibreChat would be the perfect complement, giving Haystack a user-friendly interface.
More details
Here is a small step-by-step guide to illustrate the idea:
If Ollama is not already installed and running, visit https://ollama.com.
For Linux installation:
curl -fsSL https://ollama.com/install.sh | sh
Get the model used in the code (it's a small one, for testing purposes):
ollama pull mistral
After installing the necessary dependencies, I have provided the following code to give a small glimpse of what I mean.
# Importing required libraries
from haystack import Document, Pipeline
from haystack.document_stores.in_memory import InMemoryDocumentStore
from haystack.components.retrievers import InMemoryBM25Retriever
from haystack.components.builders import PromptBuilder
from haystack_integrations.components.generators.ollama import OllamaGenerator
import gradio as gr
# Write documents to InMemoryDocumentStore
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="My name is Jean and I live in Paris."),
    Document(content="My name is Mark and I live in Berlin."),
    Document(content="My name is Giorgio and I live in Rome.")
])
# Initialize retriever
retriever = InMemoryBM25Retriever(document_store)
# Define prompt template
template = """
Given the following information, answer the question.
Context:
{% for document in documents %}
{{ document.content }}
{% endfor %}
Question: {{question}}
Answer:
"""
# Initialize prompt builder
prompt_builder = PromptBuilder(template=template)
# Initialize Ollama generator
generator = OllamaGenerator(
    model="mistral",
    url="http://localhost:11434/api/generate",
    generation_kwargs={
        "num_predict": 100,
        "temperature": 0.9,
    }
)
# Create and configure pipeline
basic_rag_pipeline = Pipeline()
basic_rag_pipeline.add_component("retriever", retriever)
basic_rag_pipeline.add_component("prompt_builder", prompt_builder)
basic_rag_pipeline.add_component("llm", generator)
basic_rag_pipeline.connect("retriever", "prompt_builder.documents")
basic_rag_pipeline.connect("prompt_builder", "llm")
# Visualize pipeline (optional)
# basic_rag_pipeline.draw("basic-rag-pipeline.png")
# Define function to run pipeline with Gradio
def ask_question(question):
    response = basic_rag_pipeline.run(
        {
            "retriever": {"query": question},
            "prompt_builder": {"question": question}
        }
    )
    return response["llm"]["replies"][0]
# Create Gradio interface
gr_interface = gr.Interface(
    fn=ask_question,
    inputs=gr.components.Textbox(lines=2, placeholder="Enter your question here..."),
    outputs="text"
)
gr_interface.launch()
Run the script:
python example.py
Then access the Gradio UI at http://127.0.0.1:7860
and ask a question about the given information, such as:
My name is Jean, where do I live?
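For clarity, here is roughly the prompt text that PromptBuilder renders from the Jinja2 template above for that question, assuming the retriever returns all three documents. This plain-Python loop only illustrates the resulting string; the real rendering is done by Haystack's PromptBuilder with Jinja2.

```python
# Illustration only: emulate the rendered prompt that the Jinja2 template
# above would produce (Haystack's PromptBuilder does the real rendering).
documents = [
    "My name is Jean and I live in Paris.",
    "My name is Mark and I live in Berlin.",
    "My name is Giorgio and I live in Rome.",
]
question = "My name is Jean, where do I live?"

# The template loops over documents and appends the question.
context = "\n".join(documents)
prompt = (
    "Given the following information, answer the question.\n"
    "Context:\n"
    f"{context}\n"
    f"Question: {question}\n"
    "Answer:\n"
)
print(prompt)
```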
Instead of Gradio, it would be very nice to use LibreChat, as it offers so many more possibilities.
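To make the integration idea concrete: LibreChat can talk to OpenAI-compatible endpoints, so one option would be to put the Haystack pipeline behind a small HTTP service that returns responses in the chat-completion shape. The sketch below only builds that response body; the make_chat_completion helper and its field choices are my own assumptions for illustration, not part of Haystack or LibreChat.

```python
# Sketch: wrap a RAG answer (e.g. from the Haystack pipeline above) in a
# minimal OpenAI-style chat-completion payload, so an OpenAI-compatible
# client such as LibreChat could consume it. Helper name is hypothetical.
import json
import time
import uuid

def make_chat_completion(reply: str, model: str = "mistral") -> dict:
    """Build a minimal OpenAI-compatible chat-completion response body."""
    return {
        "id": f"chatcmpl-{uuid.uuid4().hex[:12]}",
        "object": "chat.completion",
        "created": int(time.time()),
        "model": model,
        "choices": [{
            "index": 0,
            "message": {"role": "assistant", "content": reply},
            "finish_reason": "stop",
        }],
    }

# Example: wrap an answer the pipeline might produce.
body = make_chat_completion("You live in Paris.")
print(json.dumps(body, indent=2))
```

A real service would also need to accept the incoming messages array and stream tokens, but this shows the response side of the contract.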
Thanks for reading all of this, you are very nice! And I send some nice greetings to all of you in the community!
Cheers! Matthias :)
Which components are impacted by your request?
Endpoints
Pictures
Code of Conduct
[X] I agree to follow this project's Code of Conduct