microsoft / durabletask-mssql

Microsoft SQL storage provider for Durable Functions and the Durable Task Framework
MIT License

Recommendations on storing large payloads #206

Closed gauravvgat closed 9 months ago

gauravvgat commented 10 months ago

We are building an orchestration where the client will create/update multiple linked documents. The payload will consist of document metadata as well as the content. It is safe to assume the total size could go up to 100 MB.

So my question is: what problems can we expect if we store this payload in Azure SQL? Is it a good idea? Or would it be better to store the documents in external storage and keep only references in SQL?

One problem I'm concerned about is the network traffic of reading that large a payload every time the orchestration resumes.

cgillum commented 10 months ago

If the payload will make its way into the orchestration history, then the biggest problem you'll run into is memory usage because each orchestration will need to pull that document into its memory (perhaps multiple times), which often takes up more memory than the actual size of the document content on disk. Database disk and network I/O will also be issues for the reasons you mentioned. If you decide to go this route, you'll want to be sure to carefully throttle your concurrency to avoid problems with resource exhaustion.

All that said, the best practice is to pass around references to data in your orchestration and have activities use those references to access the data directly from external storage.
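The reference-passing approach described above (often called the claim-check pattern) can be sketched in plain Python. Note that `blob_store`, `upload_document`, and `process_document` below are hypothetical stand-ins for external storage (e.g. Azure Blob Storage) and an activity function; they are not part of durabletask-mssql or the Durable Functions SDK:

```python
# Hypothetical sketch of the claim-check pattern: the orchestration payload
# carries only a small reference, and activities resolve that reference to
# fetch the large content directly from external storage.
import uuid

blob_store = {}  # stand-in for external storage such as Azure Blob Storage


def upload_document(content: bytes) -> str:
    """Store the large content externally and return a small reference."""
    ref = f"documents/{uuid.uuid4()}"
    blob_store[ref] = content
    return ref


def process_document(ref: str) -> int:
    """Activity stand-in: resolve the reference and work on the content."""
    content = blob_store[ref]
    return len(content)


# Client side: upload the large payload once, then pass only the reference
# through the orchestration. Only the short reference string ends up in
# the orchestration history, not the document content.
ref = upload_document(b"x" * 1024)  # small stand-in for a ~100 MB document
assert len(ref) < 100               # the orchestration input stays tiny
assert process_document(ref) == 1024
```

With this shape, the orchestration history only ever records the short reference string, so memory usage and database I/O stay flat regardless of document size.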

gauravvgat commented 10 months ago

Thanks @cgillum. We are going ahead with storing the payload in external storage and passing a reference. Also, when and why would a payload get into the orchestration history?

cgillum commented 10 months ago

Payloads get stored in history when they are used as activity inputs or outputs, or as external event payloads. This is so that the values can be replayed to reconstruct the previous state of an orchestration.
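As an illustration of why those payloads must live in history, replay can be modeled as re-running the orchestrator function while serving recorded activity results back to it instead of re-executing the activities. This is a simplified toy model, not the actual Durable Task Framework implementation:

```python
# Simplified model of orchestration replay (not the real Durable Task
# internals): activity inputs and outputs are recorded in a history list,
# and when the orchestrator resumes it is re-run from the top with the
# recorded outputs served back in order. Because history must hold those
# values verbatim, large activity payloads bloat the history directly.
history = []  # recorded (activity_name, input, output) events


def call_activity(name, inp, replay_index, real_fn):
    i = replay_index[0]
    replay_index[0] += 1
    if i < len(history):
        # Replaying: return the recorded output without re-running the
        # activity. The payload has to be in history for this to work.
        _, _, out = history[i]
        return out
    # First execution: run the activity and record its input and output.
    out = real_fn(inp)
    history.append((name, inp, out))
    return out


def orchestrator():
    idx = [0]
    a = call_activity("double", 21, idx, lambda x: x * 2)
    b = call_activity("add_one", a, idx, lambda x: x + 1)
    return b


first = orchestrator()   # executes both activities and records them
second = orchestrator()  # pure replay: results come from history
assert first == second == 43
```

In the real framework the same mechanism applies: if an activity input or output is a 100 MB document, that document is written to the history store and loaded back into memory on every replay, which is exactly why passing references is preferred.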

gauravvgat commented 9 months ago

Thanks, closing this now.