pgaertig / nginx-big-upload

Resumable and reliable file uploads of any size. Nginx extension written in Lua.
https://github.com/pgaertig/nginx-big-upload

in-core solution #1

Closed mikhailov closed 11 years ago

mikhailov commented 11 years ago

nginx-upload-module is a widely known solution for reliable uploads of big files with a resume option, and it's really great to see it still supported, even under a new project name.

But I still don't understand why not just use the in-core `client_body_in_file_only` functionality? It lacks documentation and almost nobody uses it, but we tried it and had some success with it. I can share my experience and configuration here if necessary.
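For reference, a minimal sketch of such a configuration. The directives and the `$request_body_file` variable are real nginx features; the paths, header name, and backend upstream are hypothetical placeholders:

```nginx
# Sketch: spool the request body to disk in-core, then hand the
# temp file path (not the body) to the backend application.
location /upload {
    client_body_in_file_only  on;                     # always write body to a temp file
    client_body_temp_path     /var/nginx/uploads 1;   # where temp files land (path is an example)
    client_body_buffer_size   128K;
    client_max_body_size      2G;

    # Pass the temp file path instead of re-streaming the body.
    proxy_set_header          X-Upload-File $request_body_file;  # header name is hypothetical
    proxy_pass_request_body   off;
    proxy_pass                http://backend;                    # example upstream
}
```

The backend then reads the already-written file directly, so nginx never buffers the upload in memory.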

mikhailov commented 11 years ago

The difference between multi-part and binary data is only 4 extra lines!

```
-----------------------------7dc1f42e3005a8
Content-Disposition: form-data; name="qqfile"; filename="[filename]"
Content-Type: application/octet-stream

[bytes from data stream]
-----------------------------7dc1f42e3005a8--
```
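To illustrate the point, here is a small sketch of wrapping a raw byte stream in that multipart envelope. The field name `qqfile` and the boundary value are taken from the example above; the function name is hypothetical:

```python
def wrap_multipart(data: bytes, filename: str, boundary: str) -> bytes:
    """Wrap raw upload bytes in a multipart/form-data envelope.

    The envelope is just a boundary line, two header lines, a blank
    line, and a closing boundary around the unchanged payload.
    """
    head = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="qqfile"; filename="{filename}"\r\n'
        f"Content-Type: application/octet-stream\r\n"
        f"\r\n"
    ).encode()
    tail = f"\r\n--{boundary}--\r\n".encode()
    return head + data + tail

body = wrap_multipart(b"[bytes from data stream]", "photo.jpg", "7dc1f42e3005a8")
```

The payload itself passes through untouched, which is why the overhead really is just the few envelope lines.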
pgaertig commented 11 years ago

Regarding `client_body_in_file_only on;` I explained it in https://github.com/vkholodkov/nginx-upload-module/issues/41#issuecomment-15683609 . Multi-part is not hard at the protocol level, but it gets harder when you want to resume uploads. Multi-part is low priority because I want to have bulletproof resumable uploads first.

pgaertig commented 11 years ago

@mikahailov thanks for your input. The method you mentioned works well in some scenarios. However I also needed chunked upload and on the fly SHA1 calculation. These should be without post-upload lag which joining chunk parts and separate calculation often causes. Multi-part body request support is on my TODO list, they won't rather support resumability. That will be old-plain-form fallback only.