[Open] jerryhalstead opened this issue 9 years ago
Running on a device with more memory, I'm able to upload and capture the memory footprint. The zipped file is around 130 MB. After calling putObjectWithFile, memory usage shoots up to roughly 3x the file size.
Stepping through what code I can:
It seems that uploading should use some kind of stream so that the whole NSData doesn't have to be loaded into memory. I found this; no idea if it's useful: http://stackoverflow.com/a/18352296
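For reference, here's what the streaming idea looks like without AFNetworking at all: NSURLSession's uploadTaskWithRequest:fromFile: reads the file from disk rather than buffering an NSData. This is just a minimal sketch of the approach, not what putObjectWithFile does internally; the pre-signed URL and content type are placeholder assumptions.

```objc
#import <Foundation/Foundation.h>

// Minimal sketch: stream a large file to a (hypothetical) pre-signed S3 URL.
// uploadTaskWithRequest:fromFile: reads from disk instead of holding the
// whole payload in memory as an NSData.
void uploadFileStreaming(NSURL *fileURL, NSURL *presignedURL) {
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:presignedURL];
    request.HTTPMethod = @"PUT";
    [request setValue:@"application/zip" forHTTPHeaderField:@"Content-Type"];

    NSURLSessionUploadTask *task =
        [[NSURLSession sharedSession] uploadTaskWithRequest:request
                                                   fromFile:fileURL
                                          completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
            if (error) {
                NSLog(@"Upload failed: %@", error);
            } else {
                NSLog(@"Upload finished: %@", response);
            }
        }];
    [task resume];
}
```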
Use postObjectWithFile:... instead, which uses an HTTP body stream.
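For anyone trying this, the call shape is roughly as follows. I'm writing the signature from memory of the AFAmazonS3Manager header, so treat the parameter list as an assumption and check it against your version; `manager` and both paths are placeholders.

```objc
// Sketch only: the exact postObjectWithFile:... signature is assumed from
// memory of AFAmazonS3Manager's header; verify against your version.
[manager postObjectWithFile:@"/path/to/archive.zip"   // local file, sent via HTTP body stream
            destinationPath:@"/uploads/archive.zip"   // key in the bucket
                 parameters:nil
                   progress:^(NSUInteger bytesWritten, long long totalBytesWritten, long long totalBytesExpectedToWrite) {
                       NSLog(@"%lld / %lld bytes", totalBytesWritten, totalBytesExpectedToWrite);
                   }
                    success:^(id responseObject) {
                        NSLog(@"Upload succeeded");
                    }
                    failure:^(NSError *error) {
                        NSLog(@"Upload failed: %@", error);
                    }];
```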
Hi Matt, sorry, I should have mentioned that I've tried that. It transfers 32 KB, stalls, and fails with Error Domain=NSURLErrorDomain Code=-1001 "The request timed out."
The file size doesn't matter.
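One cheap check before digging deeper: raise the timeout to separate "slow" from "stalled". This assumes an AFNetworking 2.x style manager, where AFHTTPRequestSerializer exposes a timeoutInterval property.

```objc
// Sanity check, assuming an AFNetworking 2.x style manager:
// a longer timeout distinguishes a slow upload from a stalled one.
manager.requestSerializer.timeoutInterval = 300.0; // seconds

// If the transfer still dies at ~32 KB with -1001, the body stream is
// stalling rather than running slowly, and no timeout will fix it.
```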
postObjectWithFile:... seems to stall and fail with error code -1001 for me as well.
I'm having the same problem too.
I'm having the same problem. I'm using only AFNetworking, and I built the POST exactly as in this link: http://stackoverflow.com/questions/20551548/timeout-posting-to-s3-from-ios-using-signed-urls The upload fails with Error Domain=NSURLErrorDomain Code=-1001.
We never noticed a similar problem in our Windows and browser apps; only the Mac client fails.
I've also found a few similar reports on the web and Stack Overflow: multipart uploads fail with a timeout on large files. Unfortunately, I can't find a fix.
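One guess at the root cause, since every failing case here involves a streamed body (this is an assumption, not confirmed in the thread): when a request uses HTTPBodyStream without an explicit Content-Length header, the URL loading system sends the body with chunked transfer encoding, which S3 does not accept, and that matches a transfer that stalls and then times out. A sketch of a signed-URL PUT with the length set explicitly; the URL and file path are placeholders:

```objc
// Guess at a fix: when using a body stream, set Content-Length explicitly
// so the request is not sent with chunked transfer encoding, which S3
// is known to reject. URL and path below are placeholders.
NSString *filePath = @"/path/to/video.mov";
NSDictionary *attrs = [[NSFileManager defaultManager] attributesOfItemAtPath:filePath error:NULL];
unsigned long long fileSize = [attrs fileSize];

NSMutableURLRequest *request =
    [NSMutableURLRequest requestWithURL:[NSURL URLWithString:@"https://example-bucket.s3.amazonaws.com/video.mov?signature=..."]];
request.HTTPMethod = @"PUT";
request.HTTPBodyStream = [NSInputStream inputStreamWithFileAtPath:filePath];
[request setValue:[NSString stringWithFormat:@"%llu", fileSize]
 forHTTPHeaderField:@"Content-Length"];
```

If that is the issue, the same Content-Length fix should apply to a multipart POST as well.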
Added this to our app the other day, and putObjectWithFile works great for sending files to our S3 bucket, at least when the files are under 100 MB. Ours is a video app, so quite often the upload will be larger.
What happens when the files get too large (I was using a 125 MB zip) is that the app starts eating up memory in 128 MB chunks until the OS shuts it down. This happens sometime after the call to putObjectWithFile but before the progress callback ever fires.
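For the memory side specifically, that 128 MB-chunk growth looks consistent with the file being copied into NSData buffers somewhere in the request pipeline. When you do have to hold a large file as an NSData at your own call site, a memory-mapped read keeps it out of the app's dirty footprint. This is a general-purpose sketch, not a change to whatever putObjectWithFile does internally:

```objc
// General sketch: map a large file instead of copying it into RAM.
// This only helps where *you* create the NSData; it doesn't change
// any buffering putObjectWithFile does internally.
NSError *error = nil;
NSData *payload = [NSData dataWithContentsOfFile:@"/path/to/video.zip"
                                         options:NSDataReadingMappedIfSafe
                                           error:&error];
if (!payload) {
    NSLog(@"Could not map file: %@", error);
}
```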