@dnlup would you mind taking a look and maybe making a test?
How does Node.js handle this case?
It just works. This might be a wasm thing.
Are we talking about something greater than `Number.MAX_SAFE_INTEGER`?
No, I have a case where I'm unable to receive anything larger than ~2^31 bytes. I'm not sure if it's undici at the moment, but I think it's a good idea to have a test where we transfer more than 2^32 bytes.
Could also be a problem with Node 16 http1/http2 server. I'm seeing the issue in a reverse proxy in production.
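For reference, a minimal sketch of such a >2^32-byte transfer test (my own illustration, assuming undici's `request` API and a throwaway local Node server; the sizes and assertion style are arbitrary, not an actual undici test):

```js
// Minimal sketch, not undici's actual test: stream more than 2^32 bytes
// through a local Node server and check that undici reads every byte back.
const { createServer } = require('http')
const { request } = require('undici')

const TOTAL = 5 * 1024 ** 3              // 5 GiB, comfortably above 2^32
const chunk = Buffer.alloc(1024 * 1024)  // served in 1 MiB writes

const server = createServer((req, res) => {
  res.setHeader('content-length', TOTAL)
  let sent = 0
  const pump = () => {
    while (sent < TOTAL) {
      sent += chunk.length
      if (!res.write(chunk)) return res.once('drain', pump)
    }
    res.end()
  }
  pump()
})

server.listen(0, async () => {
  const { body } = await request(`http://localhost:${server.address().port}/`)
  let received = 0
  for await (const data of body) received += data.length
  console.log(received === TOTAL ? 'ok' : `short read at ${received} bytes`)
  server.close()
})
```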
Inspecting llhttp, I see content-length is an unsigned 64-bit int. That should exclude some issues on the parser's native side.
I think the problem is with Node. It happens only if you pass the entire buffer to `res.write`. Undici errors with the message `The other side closed`.
I used Node 16 to test.
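A rough sketch of the shape being described (my own illustration; buffer size and port are arbitrary, and it assumes the running Node version allows allocating a single buffer larger than 2 GiB, as Node 16 does):

```js
// Sketch of the failing case described above: the whole payload handed to
// res.write in one call, rather than streamed in smaller chunks.
const { createServer } = require('http')

const big = Buffer.alloc(3 * 1024 ** 3) // ~3 GiB zero-filled, larger than 2^31 bytes

createServer((req, res) => {
  res.setHeader('content-length', big.length)
  res.write(big) // entire buffer in a single write
  res.end()
}).listen(3000)
```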
Something is weird. undici always prematurely ends the response at 1605405800 bytes without an error, while curl properly reads all 5 GB.
@mcollina @dnlup
This fails: https://github.com/nodejs/undici/pull/805
@mcollina we should probably fix this before release.
@indutny How come llhttp ends the message without an error before outputting content-length bytes?
Going back to undici@3 resolves the issue.
Currently I'm at a loss as to the cause and fix for this.
Isn't wasm memory space limited to ~2 GB? I think you need to split up the data before passing it into llhttp.
Correction: 4 GB.
Looking. Completely unrelated: you should probably use `parseInt(..., 10)` instead of just `parseInt()`. (Yes, JS is weird: `parseInt('0x1f') === 0x1f`.)
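For anyone following along, the difference:

```js
// parseInt without a radix infers one from the string's prefix:
parseInt('0x1f')     // 31  -- the '0x' prefix switches it to base 16
parseInt('0x1f', 10) // 0   -- base-10 parsing stops at the 'x'
parseInt('123', 10)  // 123 -- explicit radix, no surprises
```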
llhttp itself appears to work fine: https://gist.github.com/indutny/818e4428eeadb17bb541a668e653e2a9. Whenever I execute it, I get back exactly the same number of bytes that I passed to execute().
I strongly suspect that it is either an issue with the wasm build or the wrapping around it.
@indutny Thanks for confirming. Much appreciated.
@indutny I ported your test into our WebAssembly build and was able to reproduce the issue: https://gist.github.com/ronag/3e316e78931f8b49f1138460a77a258a
Added a bunch of logging, and it looks like the `uint64_t` subtraction doesn't work:
parser.content_length=0x100028c68 on_body_len=65536 read_total=1605238784
parser.content_length=0x100018c68 on_body_len=65536 read_total=1605304320
parser.content_length=0x100008c68 on_body_len=65536 read_total=1605369856
parser.content_length=0x0 on_body_len=35944 read_total=1605405800
As you might have guessed, 35944 is 0x8c68 in hex. So it subtracts 0x8c68 from 0x100008c68 and gets 0x0.
FWIW, here's the code that I've added to `wasm_on_body` to extract this:
// read the 64-bit content_length field from the parser struct as two 32-bit words
const _cl = new Uint32Array(llhttp.exports.memory.buffer, p + 8 * 4, 2);
// recombine the low and high halves (wasm linear memory is little-endian)
const _remaining = _cl[1] * 0x100000000 + _cl[0];
console.log('parser.content_length=0x%s on_body_len=%d read_total=%d', _remaining.toString(16), len, _total += len);
What's the best way to view the contents of the wasm file?
Sounds like wasm2wat
Maybe https://webassembly.studio?
I know what's going on. Could you find `s_n_llhttp__internal__n_consume_content_length`, replace the `size_t`s with `uint64_t`s right after it, and try recompiling and running the test?
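For context, a sketch of why that `size_t` matters under wasm32, where `size_t` is 32 bits (my own illustration using BigInt and the values from the log above):

```js
// Under wasm32, size_t is 32 bits, so only the low half of the 64-bit
// content_length survives. Reproducing the logged values with BigInt:
const contentLength = 0x100008c68n  // true remaining length
const bodyChunk     = 0x8c68n       // 35944 bytes consumed by this on_body call

const low32 = contentLength & 0xffffffffn             // what a 32-bit size_t keeps
console.log((low32 - bodyChunk).toString(16))         // '0' -> parser believes the body is done
console.log((contentLength - bodyChunk).toString(16)) // '100000000' -> bytes actually remaining
```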
That fixed it!
Alright, working on the real fix. Thanks for trying it.
Fix: https://github.com/nodejs/llparse/pull/44
@ronag could you try `llhttp.c` from the attachment and see if it still works?
FWIW, this is the file that I intend to release once the llparse fix lands. Feel free to ship the wasm version built from this file (after reviewing the diff) until we get it pushed upstream!
All good!
Thank you for the help @indutny !
No worries. Always glad to fix my own mess ups 😂
Created a new release: https://github.com/nodejs/llhttp/releases/tag/release%2Fv6.0.2
Haven't had time to check, but I suspect we might have a problem with body sizes larger than 31 bits. undici will complete without error before reading the entire response body.