run-llama / create-llama

The easiest way to get started with LlamaIndex
MIT License

Use reflex for Python full-stack #375

Open marcusschiesser opened 2 days ago

marcusschiesser commented 1 day ago

could be based on https://github.com/reflex-dev/reflex-examples/pull/269

elviskahoro commented 11 hours ago

@marcusschiesser this might be better: https://github.com/reflex-dev/reflex-llamaindex-template

I still have a bunch of features I want to add to the chat app.

marcusschiesser commented 4 hours ago

@elviskahoro thanks for the input! Our goal is to make the Reflex template as similar as possible to the NextJS template, so we will first extract some UI components from the NextJS template: https://github.com/run-llama/create-llama/issues/382

Then, we can use them in our upcoming Reflex template. So, the structure of our Reflex template will be based on your code, but we will use the chat components from https://github.com/run-llama/create-llama/issues/382.

You might even consider using them yourself, as they come with a lot of features, e.g. widgets for LLM tools, image and document uploads, viewers, etc.