davidmoreno / onion

C library to create simple HTTP servers and Web Applications.
http://www.coralbits.com/libonion/

multipart POST larger than max_file_size continually retries upload, if multiple GET requests fire while uploading #233

Open jeffvandyke opened 6 years ago

jeffvandyke commented 6 years ago

I'll need to fix this problem pretty soon for a company project, but I might as well file an issue for documentation, and to see if anyone else has helpful knowledge or expertise.

My problem seems to be happening inside request_parser.c. From my app, I'm setting max_file_size to 50 MB and uploading to another computer over ethernet. The upload takes up to 10 seconds to reach 50 MB, after which an internal error is returned to the client saying the file was too big. So far, all's well. It does waste time uploading the first 50 MB of a file that's discarded afterward, but that's no big deal. I'm running the server in O_POOL | O_DETACH_LISTEN | O_NO_SIGTERM mode with 8 maximum threads.
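For reference, here is roughly what the setup looks like, reduced to a sketch (the /upload handler name is made up, and this is simplified from the real app):

```c
#include <onion/onion.h>
#include <onion/url.h>
#include <onion/response.h>
#include <unistd.h>

static onion_connection_status upload(void *_, onion_request *req,
                                      onion_response *res) {
  /* By the time this handler runs, onion has already parsed (or rejected)
   * the multipart body according to max_file_size. */
  onion_response_write0(res, "upload handled\n");
  return OCS_PROCESSED;
}

int main(void) {
  onion *o = onion_new(O_POOL | O_DETACH_LISTEN | O_NO_SIGTERM);
  onion_set_max_threads(o, 8);
  onion_set_max_file_size(o, 50 * 1024 * 1024); /* 50 MB cap */

  onion_url_add(onion_root_url(o), "upload", upload);

  onion_listen(o); /* returns immediately because of O_DETACH_LISTEN */
  pause();         /* the rest of the app would run here */
  onion_free(o);
  return 0;
}
```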

My problem arises if I'm polling with more than a few small GET requests while the file is uploading. When this happens, even though I see ONION_ERROR("Files on this post too big. Aborting.") in the output, a watch -n1 ls -lc /tmp reveals that onion then deletes the ~50 MB tmp file, creates a new one, and starts receiving the upload again from the beginning of the file. As far as I can tell, libonion is signaling the browser that it needs more data, and the browser then tries to upload the whole file again. With continual polling, the upload can be repeated over a dozen times.
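For anyone trying to reproduce this, a client-side harness along these lines should approximate what my app does (the URLs and file path are placeholders; note that curl won't re-send the body on its own the way a browser does, so the repeated-upload behavior may need a real browser to observe):

```c
/* Build with: cc repro.c -lcurl -lpthread */
#include <curl/curl.h>
#include <pthread.h>

static volatile int uploading = 1;

static void *poll_gets(void *_) {
  while (uploading) { /* fire small GETs continually during the upload */
    CURL *c = curl_easy_init();
    curl_easy_setopt(c, CURLOPT_URL, "http://server:8080/status");
    curl_easy_perform(c);
    curl_easy_cleanup(c);
  }
  return NULL;
}

int main(void) {
  curl_global_init(CURL_GLOBAL_DEFAULT);
  pthread_t t;
  pthread_create(&t, NULL, poll_gets, NULL);

  CURL *c = curl_easy_init();
  curl_mime *form = curl_mime_init(c); /* multipart/form-data body */
  curl_mimepart *part = curl_mime_addpart(form);
  curl_mime_name(part, "file");
  curl_mime_filedata(part, "/path/to/60MB.bin"); /* > max_file_size */
  curl_easy_setopt(c, CURLOPT_URL, "http://server:8080/upload");
  curl_easy_setopt(c, CURLOPT_MIMEPOST, form);
  curl_easy_perform(c); /* meanwhile, watch -n1 ls -lc /tmp on the server */

  uploading = 0;
  pthread_join(t, NULL);
  curl_mime_free(form);
  curl_easy_cleanup(c);
  curl_global_cleanup();
  return 0;
}
```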

The function parse_POST_multipart_content_type could be the culprit: after adding some ONION_DEBUG statements, I can see the condition if (res == STRING_NEW_LINE) { ... } trigger a return of OCS_NEED_MORE_DATA. I'm not sure exactly what the call stack is yet.
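If the parser really does keep reporting OCS_NEED_MORE_DATA after the size check has failed, then the fix presumably has to latch the error so the parser never asks the client for more data again. A toy sketch of that pattern (not onion's actual code):

```c
#include <stddef.h>
#include <stdio.h>

typedef enum { NEED_MORE_DATA, INTERNAL_ERROR } status;

typedef struct {
  size_t received, max_file_size;
  int failed; /* latched once the limit is hit */
} parser;

static status feed(parser *p, size_t nbytes) {
  if (p->failed)
    return INTERNAL_ERROR; /* never ask the client for more data */
  p->received += nbytes;
  if (p->received > p->max_file_size) {
    p->failed = 1; /* "Files on this post too big. Aborting." */
    return INTERNAL_ERROR;
  }
  return NEED_MORE_DATA; /* body not complete yet; keep reading */
}

int main(void) {
  parser p = {0, 50u * 1024 * 1024, 0};
  /* 8 chunks of 10 MB against a 50 MB cap: the 6th exceeds the limit,
   * and every feed after that must keep returning INTERNAL_ERROR. */
  for (int i = 0; i < 8; i++)
    printf("chunk %d -> %d\n", i, feed(&p, 10u * 1024 * 1024));
  return 0;
}
```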