-
**Issue:** The basic AWS S3 connection interface doesn't support file uploads greater than a certain size. I believe the hard limit is 5GB, but AWS recommends multipart upload for any files larger tha…
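For reference, the S3 limits mentioned here (5 GB per single PUT, up to 10,000 parts per multipart upload, each part at least 5 MiB except the last) can be turned into a small part-size calculation. This is an illustrative sketch, not code from any project in this thread; `part_size_for` is a hypothetical helper name:

```python
# Illustrative S3 limits: 5 GB max for a single PUT; multipart uploads
# allow up to 10,000 parts, each at least 5 MiB (except the last part).
GB = 1024 ** 3
MIB = 1024 ** 2
SINGLE_PUT_LIMIT = 5 * GB
MAX_PARTS = 10_000
MIN_PART_SIZE = 5 * MIB

def part_size_for(total_bytes):
    """Smallest legal part size that fits the object into MAX_PARTS parts,
    or None when a plain single PUT suffices."""
    if total_bytes <= SINGLE_PUT_LIMIT:
        return None
    needed = -(-total_bytes // MAX_PARTS)  # ceiling division
    return max(needed, MIN_PART_SIZE)
```

In practice the SDK's managed transfer picks this for you, but the calculation shows why very large objects force larger parts.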
-
By using B2's large-file upload features, it should be possible to cap B2Fuse's memory footprint at a set maximum. This comes at the cost of disk space, as the entire file must be…
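The bounded-memory idea above can be sketched as a chunked reader (a minimal illustration only, not actual B2Fuse code; the names are hypothetical):

```python
CHUNK_SIZE = 8 * 1024 * 1024  # cap in-memory data at 8 MiB per chunk

def iter_chunks(path, chunk_size=CHUNK_SIZE):
    """Yield the file's contents in fixed-size chunks so peak memory
    stays at chunk_size regardless of the total file size."""
    with open(path, "rb") as f:
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                return
            yield chunk
```

Each chunk could then be handed to B2's large-file part upload, keeping memory bounded while the full file still occupies disk.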
-
### Additional Information
#### Version of s3fs being used (s3fs --version)
```
Amazon Simple Storage Service File System V1.91 (commit:unknown) with OpenSSL
```
#### Version of fuse being us…
-
Package version (if known): v12.0.5
## Describe the bug
When uploading a (larger?) file, the progress isn't shown correctly. It counts 0, 1, 2, … 9, and then shows 0% again instead of 10%. The…
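One way this symptom could arise (purely a guess, not confirmed from the app's source) is the percentage being rendered with only its last digit; computing and printing the full integer value avoids the wrap:

```python
def percent(done, total):
    """Integer upload progress in the range 0..100."""
    return min(100, done * 100 // total)

# Printing the full value shows "10%" rather than wrapping back to "0":
print(f"{percent(1_500_000, 15_000_000)}%")  # prints "10%"
```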
-
Hi,
I am using Uppy AWS S3 for direct upload. Files smaller than 100 MB upload without issue, but larger files immediately fail with:
"Access to XMLHttpRequest at 'https://api.cloudinary.com/v1_1/CLO…
-
Thanks a lot for the great app! We discovered a problem uploading large files via the Nextcloud app:
### Steps to reproduce
1. Open the Nextcloud app (iPhone, iPad)
2. Enable Chunking in settings t…
-
```
Add ability to upload large files in chunks to allow for uploads to resume.
```
Original issue reported on code.google.com by `DJGosn...@gmail.com` on 14 Feb 2011 at 7:01
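A resumable chunked upload as requested could look roughly like this (a sketch only; `send_chunk` is an assumed transport callback, and the state file format is invented for illustration):

```python
import json
import os

def upload_resumable(path, send_chunk, state_path, chunk_size=4 * 1024 * 1024):
    """Upload `path` in chunks, persisting the byte offset after each
    chunk so an interrupted upload resumes where it stopped."""
    offset = 0
    if os.path.exists(state_path):
        with open(state_path) as s:
            offset = json.load(s)["offset"]
    with open(path, "rb") as f:
        f.seek(offset)
        while True:
            data = f.read(chunk_size)
            if not data:
                break
            send_chunk(offset, data)  # assumed to raise on failure
            offset += len(data)
            with open(state_path, "w") as s:
                json.dump({"offset": offset}, s)
    if os.path.exists(state_path):
        os.remove(state_path)  # clean up once the upload completes
```

If the process dies mid-upload, the state file still holds the last confirmed offset, so the next run seeks past the chunks already sent.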
-
```
HTTP Error 500 thrown while requesting PUT https://hf-hub-lfs-us-east-1.s3-accelerate.amazonaws.com/repos/...
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/h…
-
The upload does not stop, and the checksum is OK.
The interface shows an "unknown error", and the logs report the error below multiple times for a single large-file upload:
```
Sabre\DAV\Exception\BadRequest: Expected filesize of 1048576…