asim / reminder

A reminder to the whole world
https://reminder.dev

TODO: Pass index query results as context to LLM to ask question #6

Closed · asim closed this 3 hours ago

asim commented 4 hours ago

Using the RAG example from chromem-go, we can pass the query results from our pre-generated index as context when asking an LLM questions. This would be useful if we wanted to run a small local LLM rather than using OpenAI.

https://github.com/philippgille/chromem-go/tree/main/examples/rag-wikipedia-ollama
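A minimal sketch of the retrieval step, assuming a persisted chromem-go index on disk at `./index`, a collection named `reminder`, and OpenAI embeddings (all hypothetical names; the real index may be built differently):

```go
package main

import (
	"context"
	"fmt"
	"log"
	"os"

	"github.com/philippgille/chromem-go"
)

func main() {
	ctx := context.Background()

	// Open the pre-generated index from disk (path is an assumption).
	db, err := chromem.NewPersistentDB("./index", false)
	if err != nil {
		log.Fatal(err)
	}

	// Same embedding function that was used to build the index.
	embed := chromem.NewEmbeddingFuncOpenAI(os.Getenv("OPENAI_API_KEY"), chromem.EmbeddingModelOpenAI3Small)
	col, err := db.GetOrCreateCollection("reminder", nil, embed)
	if err != nil {
		log.Fatal(err)
	}

	// Vector search: return the 5 documents most similar to the question.
	results, err := col.Query(ctx, "What is patience?", 5, nil, nil)
	if err != nil {
		log.Fatal(err)
	}
	for _, r := range results {
		fmt.Printf("%.3f %s\n", r.Similarity, r.Content)
	}
}
```

The `Similarity` score on each result could also be used to drop weak matches before they are passed to the model.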

asim commented 3 hours ago

Not using a local model, but it works well with GPT-4o mini.

Example below. Note that the context we pass is the result of a vector search against the Quran, hadith, and names of Allah index we built, so this should maintain far higher accuracy than passing a question straight to ChatGPT and letting it pull information from anywhere on the internet.

[Screenshot 2024-11-16 22:03:40 — example question and answer]
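For completeness, a rough sketch of the final step, assuming the go-openai client (`github.com/sashabaranov/go-openai`) is used to pass the retrieved documents to gpt-4o-mini; `askWithContext` is a hypothetical helper and the real prompt wording may differ:

```go
package rag

import (
	"context"
	"strings"

	"github.com/philippgille/chromem-go"
	openai "github.com/sashabaranov/go-openai"
)

// askWithContext joins the vector-search results into a single context block
// and asks gpt-4o-mini to answer the question using only that context.
// (Hypothetical helper; the real service may structure the prompt differently.)
func askWithContext(ctx context.Context, client *openai.Client, question string, results []chromem.Result) (string, error) {
	var b strings.Builder
	for _, r := range results {
		b.WriteString(r.Content)
		b.WriteString("\n\n")
	}

	resp, err := client.CreateChatCompletion(ctx, openai.ChatCompletionRequest{
		Model: "gpt-4o-mini",
		Messages: []openai.ChatCompletionMessage{
			{
				Role:    openai.ChatMessageRoleSystem,
				Content: "Answer using only the provided context from the Quran, hadith and names of Allah index. If the context does not contain the answer, say so.",
			},
			{
				Role:    openai.ChatMessageRoleUser,
				Content: "Context:\n" + b.String() + "\nQuestion: " + question,
			},
		},
	})
	if err != nil {
		return "", err
	}
	return resp.Choices[0].Message.Content, nil
}
```

Restricting the system prompt to the provided context is what keeps answers grounded in the index rather than in whatever the model learned elsewhere.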