If I send POST data to the server and there is no CGI function configured to handle it, it crashes horribly.
To reproduce: send POST data with cURL:
$ curl --progress-bar -X POST -T bigfile.txt "http://192.168.33.173/newdir/test.txt" | tee /dev/null
# 2.0%
404 File not found.
E (6827) httpd: url = NULL
E (6827) httpd: Unexpected data from client. blah blah blah........
... pages of error...
E (6827) httpd: Unexpected data from client. blah blah blah........
... pages of error...
... repeats indefinitely ...
I fixed it by adding a check to cgiNotFound() so it keeps returning HTTPD_CGI_MORE until the POST data has been completely received.
//Used to spit out a 404 error
static CgiStatus ICACHE_FLASH_ATTR cgiNotFound(HttpdConnData *connData) {
	if (connData->isConnectionClosed) return HTTPD_CGI_DONE;
	if (connData->post.received == connData->post.len) {
		httpdStartResponse(connData, 404);
		httpdEndHeaders(connData);
		httpdSend(connData, "404 File not found.", -1);
		return HTTPD_CGI_DONE;
	}
	return HTTPD_CGI_MORE; // make sure to eat up all the POST data the client may still be sending!
}
Also, I prevented that (and other?) error from repeating indefinitely by adding a break; after it.
Philosophically, should the server continue to eat up all of the POST data if it can't do anything with it? The file could be gigabytes, which makes this a potential DoS vector. Is there a way to gracefully refuse the POST?
Stay tuned for pull-request.