run-llama / llama_deploy

Deploy your agentic workflows to production
https://docs.llamaindex.ai/en/stable/module_guides/llama_deploy/
MIT License

[EPIC] Maintain context/session state between runs #273

Open logan-markewich opened 2 months ago

logan-markewich commented 2 months ago

Currently, the code that was supposed to maintain context/state between runs is all commented out.

We need to bring this feature back by figuring out how to serialize the context, which means deciding exactly what state needs to be serialized.

In this case, I don't think it's enough to maintain just the global dict. We need all of it, especially to support future use cases where runs of a workflow can be stepwise, span days, support undo/rewind, etc.
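The "global dict isn't enough" point can be sketched with a minimal stand-in. Everything below is hypothetical, not llama-index API: the `Context` class, its field names, and the `to_dict`/`from_dict` methods are illustrative assumptions about what a full serialization would have to cover.

```python
import json


class Context:
    """Hypothetical stand-in for a workflow context. Besides the global
    data dict, it tracks per-step event queues and in-flight steps that
    must also survive a round trip through storage."""

    def __init__(self, globals_=None, queues=None, currently_running=None):
        self.globals = globals_ or {}                       # the "global dict"
        self.queues = queues or {}                          # buffered events per step
        self.currently_running = currently_running or []    # in-flight steps

    def to_dict(self) -> dict:
        # Everything here must be JSON-serializable; real event objects
        # would need their own (de)serialization hooks.
        return {
            "globals": self.globals,
            "queues": self.queues,
            "currently_running": self.currently_running,
        }

    @classmethod
    def from_dict(cls, data: dict) -> "Context":
        return cls(data["globals"], data["queues"], data["currently_running"])


# Round-trip through JSON, as a persistence layer would:
ctx = Context({"user": "alice"}, {"step_a": ["event_1"]}, ["step_a"])
restored = Context.from_dict(json.loads(json.dumps(ctx.to_dict())))
assert restored.queues == {"step_a": ["event_1"]}
```

Serializing only `globals` would lose the queues and in-flight steps, which is exactly what stepwise or resumable runs need back.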

This will require figuring out how to do it in llama-index itself first, of course.

jonpspri commented 1 month ago

I'm looking first at the simple case of managing a session-level context in the WorkflowService. I see a framework is in place but not yet implemented. The basic idea is easy: create a context for each session and persist it (in memory initially, eventually on storage).

The simple approach is to wire that into the existing WorkflowService or a subclass, but I'm wondering whether a separate ContextManager class is appropriate. I'll probably hold that in reserve as a later refactoring.
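The separate-class idea could look something like this. This is a sketch, not anything in llama_deploy: the `ContextManager` name, its methods, and the in-memory dict store are all assumptions about how the session-to-context mapping might be factored out of the WorkflowService.

```python
from typing import Dict


class ContextManager:
    """Hypothetical ContextManager: owns the session-id -> context mapping
    so the WorkflowService doesn't have to. In-memory for now; the same
    interface could later sit over a persistent key-value store."""

    def __init__(self) -> None:
        self._contexts: Dict[str, dict] = {}

    def get_or_create(self, session_id: str) -> dict:
        # Each session gets exactly one context, created lazily on first use.
        return self._contexts.setdefault(session_id, {})

    def persist(self, session_id: str, context: dict) -> None:
        # In-memory "persistence"; a storage-backed subclass would
        # serialize the context here instead.
        self._contexts[session_id] = context

    def drop(self, session_id: str) -> None:
        # Discard a session's context, e.g. when the session ends.
        self._contexts.pop(session_id, None)


manager = ContextManager()
ctx = manager.get_or_create("session-1")
ctx["count"] = 1
manager.persist("session-1", ctx)
assert manager.get_or_create("session-1") == {"count": 1}
```

Keeping this behind its own interface would make the eventual in-memory-to-storage swap a subclass change rather than a WorkflowService rewrite, which is presumably the appeal of the refactoring.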