Open samschott opened 1 year ago
Thanks for the detailed request! I can't make any promises on if/when this would be supported, but I'm sending this along to the team.
If there is some consensus within the team that this is worth considering, I'm happy to create a PR. But if this is not an approach that you want to pursue, do let me know and I'll look for other solutions.
We welcome PRs in general, but I can't say offhand if this in particular is something the team would or wouldn't want to support in the SDK. I'll ask them though to see if I can get some guidance on this.
**Why is this feature valuable to you? Does it solve a problem you're having?**

The requests library allows both streaming and chunked uploads (see https://requests.readthedocs.io/en/latest/user/advanced/#streaming-uploads and https://requests.readthedocs.io/en/latest/user/advanced/#chunk-encoded-requests). This has two benefits: the full payload never needs to be held in memory at once, and data of unknown total size (for example from a pipe or another network stream) can still be uploaded.
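In case it helps, here is what the two request-body styles look like in plain requests (the URL is a placeholder; the requests are only prepared, never sent):

```python
import io
import requests

# A file-like object as `data` gives a streaming upload: requests keeps the
# stream as the body and derives Content-Length from the stream's size.
stream_req = requests.Request(
    "POST", "https://example.com/upload", data=io.BytesIO(b"payload")
).prepare()
print(stream_req.headers["Content-Length"])  # "7"

# A generator as `data` gives a chunked upload: the total size is unknown,
# so requests sets Transfer-Encoding: chunked instead of Content-Length.
def chunks():
    yield b"part one, "
    yield b"part two"

chunked_req = requests.Request(
    "POST", "https://example.com/upload", data=chunks()
).prepare()
print(chunked_req.headers["Transfer-Encoding"])  # "chunked"
```

Neither prepared request carries the full body as `bytes`, which is exactly what the SDK's current type check rules out.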
The Dropbox API of course already requires upload sessions (`files/upload_session_start`, `files/upload_session_append` and `files/upload_session_finish`) to upload files larger than 150 MB. However, this approach by itself does not replace chunked or streaming uploads, because every chunk passed to those calls must itself be provided as an in-memory `bytes` object.

**Describe the solution you'd like**

Requests supports streaming uploads by passing a file-like object as the request body, and chunked uploads by passing a generator as the request body. However, the Python SDK explicitly prevents both by requiring the request body to be of type `bytes`:

https://github.com/dropbox/dropbox-sdk-python/blob/9895d705317583cedb9fc11e5aa1f17f6bea303a/dropbox/dropbox_client.py#L533-L539
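To illustrate the memory point: an upload-session loop necessarily materializes each chunk as `bytes` before handing it to the SDK. The `read_in_chunks` helper below is hypothetical (not part of the SDK), and the session calls are shown only as comments:

```python
import io

# Hypothetical helper: split a file-like object into fixed-size chunks, the
# way an upload-session loop would. Each chunk is still a full in-memory
# bytes object, which is why upload sessions alone are not streaming uploads.
def read_in_chunks(fileobj, chunk_size=4 * 1024 * 1024):
    while True:
        chunk = fileobj.read(chunk_size)
        if not chunk:
            return
        yield chunk

# With the real SDK, each chunk would feed the session endpoints, e.g.:
#   session = dbx.files_upload_session_start(first_chunk)
#   dbx.files_upload_session_append_v2(chunk, cursor)
#   dbx.files_upload_session_finish(last_chunk, cursor, commit)
# Demonstration on an in-memory stream with a tiny chunk size:
data = io.BytesIO(b"abcdefghij")
print(list(read_in_chunks(data, chunk_size=4)))  # [b'abcd', b'efgh', b'ij']
```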
It would be good to either drop this limitation entirely, with appropriate warnings in the docstring, or to at least allow chunked uploads (where requests handles the retry / rewind logic) while still disallowing streaming uploads.
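A minimal sketch of what a relaxed check could look like — `validate_request_body` is a hypothetical name, not the SDK's actual code — accepting `bytes`, file-like objects, or generators instead of rejecting everything that is not `bytes`:

```python
import io

# Hypothetical sketch of a relaxed body-type check. Classifies the body so
# the caller can decide which upload modes to permit.
def validate_request_body(body):
    if isinstance(body, bytes):
        return "bytes"
    if isinstance(body, str):
        # Reject str explicitly: it is iterable but not a valid raw body.
        raise TypeError("request body must be binary, not str")
    if hasattr(body, "read"):        # file-like object -> streaming upload
        return "stream"
    if hasattr(body, "__iter__"):    # generator/iterable -> chunked upload
        return "chunked"
    raise TypeError(
        "expected bytes, a file-like object, or an iterable of bytes; "
        f"got {type(body).__name__}"
    )

print(validate_request_body(b"abc"))              # bytes
print(validate_request_body(io.BytesIO(b"abc")))  # stream
print(validate_request_body(iter([b"a", b"b"])))  # chunked
```

Rejecting only the clearly invalid types (rather than everything that isn't `bytes`) would let requests apply its own streaming and chunked handling unchanged.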
**Describe alternatives you've considered**

Not at present.