Closed · animaonline closed this issue 3 months ago
Buffer.concat([buffer, newChunk])
This creates a new buffer and copies the entire accumulated data for every chunk, which is not good for large files. Either allocate one buffer up front and write each chunk at the correct offset, or append chunks to a file on disk. Is it slow before the last chunk is received, or after? What's your available RAM during this process?
Potential Cause:
The buffer management during large file uploads might be blocking the event loop or causing resource contention.
Yes, but we don't do any buffer management, only you do.
https://github.com/uNetworking/uWebSockets/blob/master/examples/Crc32.cpp This example takes a POST body of any size and returns its CRC32, so you can use it to test with any huge file.
Hi, I'm not sure how CRC calculation will help. As I mentioned in the issue, this only started happening after upgrading to version 20.42; it's a bit weird that it stops all other request handlers.
The Crc32 example is an upload example without any buffering. I did some testing and got inconclusive results. This could be a case to add to CI testing. I will come back to it, but in the meantime you can test with the Crc32 example (build everything with make in the root folder).
This code collects the chunks into a single pre-allocated buffer, with no extra copies:
const getBody = (res, req, maxSize = 5000) => new Promise((resolve, reject) => {
  const contentLength = Number(req.getHeader('content-length'));
  if (!contentLength) return reject({ message: 'missingLength', code: '411' });
  if (contentLength > maxSize) return reject({ message: 'maxSize', code: '413' });
  let buffer, offset = 0;
  res.onAborted(() => {
    res.aborted = true;
    reject({ message: 'aborted' });
  });
  res.onData((arrayBuffer, isLast) => {
    const total = offset + arrayBuffer.byteLength;
    if (isLast && total !== contentLength) return reject({ message: 'sizeMismatch', code: '400' });
    if (total > contentLength) {
      res.close();
      return reject({ message: 'sizeMismatch', code: '400' });
    }
    if (!buffer) {
      // Single-chunk body: copy it out of uWS's memory (the ArrayBuffer is
      // only valid during this callback) and resolve directly.
      if (isLast) return resolve(Buffer.from(new Uint8Array(arrayBuffer)));
      // Pre-allocate the whole body once; chunks are copied into place below.
      buffer = Buffer.allocUnsafe(contentLength);
    }
    Buffer.from(arrayBuffer).copy(buffer, offset);
    if (isLast) return resolve(buffer);
    offset = total;
  });
});
app.post('/upload', async (res, req) => {
  try {
    const body = await getBody(res, req);
    console.log('received bytes:', body.length);
    res.cork(() => res.end('ok'));
  } catch (e) {
    console.log('upload error:', e.message);
    if (!res.aborted) res.cork(() => res.writeStatus(e.code || '500').end());
  }
});
Thanks for the suggestions and the provided code examples. The new approach works well and resolves the issue. Here's a summary of why it worked and how it differs from my original implementation:
The new approach pre-allocates a single buffer based on Content-Length and copies chunks directly into it, avoiding the overhead of repeatedly creating and copying buffers. Validation of contentLength and proper handling of size mismatches ensure robust upload processing.
The issue appears to be related to inefficient buffer handling in my original code. However, it's still unclear why this problem only surfaced after upgrading to version 20.42.0. Any insights on changes in buffer handling in the latest version would be appreciated.
Thanks again for your help!
You can also just use exponentially growing buffers. This is standard behavior in C++ containers: you do a logarithmic number of reallocations instead of a linear one.
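The exponential-growth idea can be sketched as a small helper. The `GrowableBuffer` name and the doubling factor are illustrative; the point is that appending N bytes costs O(log N) reallocations rather than one per chunk:

```javascript
// A buffer that doubles its capacity whenever an append would overflow it,
// mirroring the growth strategy of std::vector and friends.
class GrowableBuffer {
  constructor(initialCapacity = 64) {
    this.buf = Buffer.allocUnsafe(initialCapacity);
    this.length = 0; // bytes actually written
  }
  append(chunk) {
    const needed = this.length + chunk.length;
    if (needed > this.buf.length) {
      // Double until the new capacity fits the pending data.
      let cap = this.buf.length;
      while (cap < needed) cap *= 2;
      const next = Buffer.allocUnsafe(cap);
      this.buf.copy(next, 0, 0, this.length);
      this.buf = next;
    }
    chunk.copy(this.buf, this.length);
    this.length = needed;
  }
  // View of only the bytes written so far (no copy).
  toBuffer() {
    return this.buf.subarray(0, this.length);
  }
}
```

This avoids needing Content-Length up front, at the cost of occasionally holding up to twice the body size in memory during a reallocation.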
uWS does no buffering internally.
Alright, thanks for your help ❤️
Bug Report: Server Unresponsive During Large File Uploads
Version:
uWebSockets.js 20.42.0
Environment:
Issue: When uploading a large file (e.g., 500MB), the server becomes unresponsive, affecting the handling of other requests. This issue started occurring in version 20.42.0.
Code to Reproduce:
Steps to Reproduce:
Expected Behavior: The server should handle the file upload and respond to other requests concurrently.
Observed Behavior: The server stops responding to all other requests while the upload is in progress.
Additional Information: This issue did not occur in previous versions of uWebSockets.js.
Potential Cause: The buffer management during large file uploads might be blocking the event loop or causing resource contention.
curl Command: To reproduce the issue with a large file upload, you can use the following curl command:
curl -X POST -F 'file=@path/to/your/largefile.zip' http://localhost/upload