curiousily / ragbase
Completely local RAG. Chat with your PDF documents (with an open LLM) through a UI that uses LangChain, Streamlit, Ollama (Llama 3.1), Qdrant, and advanced methods like reranking and semantic chunking.
https://www.mlexpert.io/bootcamp/ragbase-local-rag
MIT License
54 stars · 18 forks
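The repository description above names the stack (LangChain, Ollama, Qdrant). As a rough orientation, here is a minimal local-RAG sketch in that spirit; it is not ragbase's actual code, and the model names, file path, collection name, and local Qdrant URL are assumptions. ragbase itself adds reranking and semantic chunking on top of this kind of pipeline.

```python
# Minimal local RAG sketch (illustrative, not the ragbase implementation).
# Assumes: Ollama running locally with "llama3.1" and "nomic-embed-text" pulled,
# a Qdrant server on localhost:6333, and pypdf installed for the loader.
from langchain_community.document_loaders import PyPDFLoader
from langchain_text_splitters import RecursiveCharacterTextSplitter
from langchain_ollama import ChatOllama, OllamaEmbeddings
from langchain_qdrant import QdrantVectorStore

# Load a PDF and split it into chunks ("example.pdf" is a placeholder path).
docs = PyPDFLoader("example.pdf").load()
chunks = RecursiveCharacterTextSplitter(
    chunk_size=1000, chunk_overlap=100
).split_documents(docs)

# Embed locally with Ollama and index the chunks into Qdrant.
embeddings = OllamaEmbeddings(model="nomic-embed-text")
store = QdrantVectorStore.from_documents(
    chunks,
    embedding=embeddings,
    url="http://localhost:6333",
    collection_name="documents",
)
retriever = store.as_retriever(search_kwargs={"k": 4})

# Retrieve context for a question and answer with the local Llama 3.1 model.
llm = ChatOllama(model="llama3.1")
question = "What is this document about?"
context = "\n\n".join(d.page_content for d in retriever.invoke(question))
answer = llm.invoke(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
print(answer.content)
```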
Updated libraries to the latest #4
Open · muthuka opened this issue 3 months ago

muthuka commented 3 months ago
Upgraded the libraries to the latest versions and set Ollama to the local instance in Config.
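For context, "set Ollama to the local instance in Config" likely amounts to a setting like the sketch below; the class layout, attribute names, and defaults here are hypothetical illustrations, not the repository's actual Config.

```python
# Hypothetical Config sketch: point the app at a locally running Ollama server.
# Attribute names and defaults are illustrative assumptions.
import os
from dataclasses import dataclass


@dataclass
class Config:
    # Base URL of the local Ollama server (11434 is Ollama's default port).
    ollama_base_url: str = os.getenv("OLLAMA_BASE_URL", "http://localhost:11434")
    # Local model served by Ollama.
    model: str = os.getenv("MODEL", "llama3.1")


config = Config()
print(config.ollama_base_url, config.model)
```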