run-llama / chat-llamaindex

https://chat.llamaindex.ai
MIT License

[Feature] Connect to Python-based llamaindex instance #81

Open PanCakeConnaisseur opened 3 months ago

PanCakeConnaisseur commented 3 months ago

I built a small RAG with a local embedding model using the regular Python-based LlamaIndex. How do I use this React-based chat application with my Python-based chat engine? Or, what is the idiomatic way to get a GUI chat for Python-based LlamaIndex?

marcusschiesser commented 3 months ago

I think you have two options:

  1. Continue the work on the Python backend for Chat LlamaIndex, see https://github.com/run-llama/chat-llamaindex/issues/30#issuecomment-1987846940
  2. Use create-llama to generate a FastAPI backend with a Next.js frontend, and integrate your Python RAG code into it.