nodejs / node

Node.js JavaScript runtime ✨🐢🚀✨
https://nodejs.org

"This is caused by either a bug in Node.js or incorrect usage of Node.js internals. Please open an issue with this stack trace..." #35384

Open Navigatron opened 3 years ago

Navigatron commented 3 years ago

What steps will reproduce the bug?

This occurred while sitting on 2000 concurrent https requests (using a custom agent) across 4 worker threads. ~100 of the requests hit a 2-minute timeout shortly before this happened. In short, this may be difficult to reproduce.
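
In sketch form, the load looks roughly like this (the URL, agent options, and timeout handling here are placeholders, not my original script):

const https = require('https');

// Custom keep-alive agent shared by all requests in this thread.
const agent = new https.Agent({ keepAlive: true, maxSockets: Infinity });

function request(url) {
  return new Promise((resolve, reject) => {
    const req = https.get(url, { agent, timeout: 120000 }, (res) => {
      res.resume(); // drain the body
      res.on('end', resolve);
    });
    // ~100 of the 2000 requests hit this 2-minute timeout before the crash.
    req.on('timeout', () => req.destroy(new Error('timeout')));
    req.on('error', reject);
  });
}

// 2000 concurrent requests; this runs in each of 4 worker threads.
Promise.allSettled(
  Array.from({ length: 2000 }, () => request('https://example.com/'))
).then(() => console.log('batch done'));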

How often does it reproduce? Is there a required condition?

So far, it's only happened once.

Edit: I can reproduce this. 2000 concurrent requests does it, 400 does not.

What is the expected behavior?

No error messages.

What do you see instead?

In text form:

events.js:291
      throw er; // Unhandled 'error' event
      ^
Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at assert (internal/assert.js:14:11)
    at JSStreamSocket.doWrite (internal/js_stream_socket.js:165:5)
    at JSStream.onwrite (internal/js_stream_socket.js:33:57)
    at Socket.ondata (internal/js_stream_socket.js:77:22)
    at Socket.emit (events.js:314:20)
    at addChunk (_stream_readable.js:307:12)
    at readableAddChunk (_stream_readable.js:282:9)
    at Socket.Readable.push (_stream_readable.js:221:10)
    at TCP.onStreamRead (internal/stream_base_commons.js:188:23)
Emitted 'error' event on process instance at:
    at emitUnhandledRejectionOrErr (internal/event_target.js:542:11)
    at MessagePort.[nodejs.internal.kHybridDispatch] (internal/event_target.js:357:9)
    at MessagePort.exports.emitMessage (internal/per_context/messageport.js:18:26) {
  code: 'ERR_INTERNAL_ASSERTION'
}

In picture form: (screenshot of the same error)

Trott commented 3 years ago

@nodejs/streams @joyeecheung

mcollina commented 3 years ago

Unfortunately it's close to impossible to help you with this without a script to reproduce this reliably.

hnviradiya commented 3 years ago

I am also facing this issue on version 12.x (we are using AWS Lambda). We are calling multiple AJAX requests in a loop, and sometimes this error is thrown. I have also added a comment on the similar issue below, which is for 15.x:

https://github.com/nodejs/help/issues/2708

Trott commented 3 years ago

I am also facing this issue on version 12.x (we are using AWS Lambda). We are calling multiple AJAX requests in a loop, and sometimes this error is thrown.

@hnviradiya Are you able to share your Lambda code?

terreng commented 2 years ago

I'm also experiencing this error while downloading large files using https. I tried using node-fetch instead, but I'm seeing the same error. It happens randomly, but I've only seen it when downloading files over a slow and somewhat unstable server connection. v14.16.0

mcollina commented 2 years ago

@terreng can you share your script?

terreng commented 2 years ago

@mcollina

const https = require('https');
const fs = require('fs');
const path = require('path');

const URL = 'https://example.com/file'; // placeholder; the actual URL was shared privately

// Stream the response straight to disk and log when the file is written.
var file = fs.createWriteStream(path.resolve(__dirname, "test"));
https.get(URL, function(response) {
    response.pipe(file);
    file.on("finish", function() {
        file.close();
        console.log("Download complete");
    });
}).on("error", function(err) {
    console.error(err);
});
Error: Parse Error: Expected HTTP/
    at HTTPParser.execute (<anonymous>)
    at TLSSocket.socketOnData (_http_client.js:509:22)
    at TLSSocket.emit (events.js:315:20)
    at TLSSocket.Readable.read (internal/streams/readable.js:519:10)
    at TLSSocket.Socket.read (net.js:625:39)
    at flow (internal/streams/readable.js:992:34)
    at resume_ (internal/streams/readable.js:973:3)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  bytesParsed: 5381,
  code: 'HPE_INVALID_CONSTANT',
  reason: 'Expected HTTP/',
  rawPacket: <Buffer 76 5e f0 68 d0 0c 11 d7 50 8f 67 2e 18 c8 b3 a6 ac c8 75 70 2b e3 65 26 5d e9 b5 1f 6e 62 2b 88 87 15 0e 2f 14 4b 7d 90 33 6a 5a 18 06 bf 74 86 cf 6c ... 16334 more bytes>
}
internal/assert.js:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at assert (internal/assert.js:14:11)
    at TLSSocket.socketOnData (_http_client.js:507:3)
    at TLSSocket.emit (events.js:315:20)
    at TLSSocket.Readable.read (internal/streams/readable.js:519:10)
    at TLSSocket.Socket.read (net.js:625:39)
    at flow (internal/streams/readable.js:992:34)
    at resume_ (internal/streams/readable.js:973:3)
    at processTicksAndRejections (internal/process/task_queues.js:80:21) {
  code: 'ERR_INTERNAL_ASSERTION'
}

v14.16.0 linux arm

mcollina commented 2 years ago

Which URL are you downloading from?

terreng commented 2 years ago

@mcollina I've sent you an email with an example URL.

mcollina commented 2 years ago

@terreng that's a different issue from the one at the top. In your case the HTTP server is sending a response that does not conform to the HTTP spec. I still think there is a bug somewhere, because the second error should not be there. I've opened a new issue: https://github.com/nodejs/node/issues/40242.
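
A quick way to confirm this (a sketch with a placeholder host and path; substitute the actual URL) is to read the raw response bytes off a TLS socket directly and check whether they start with "HTTP/":

const tls = require('tls');

const socket = tls.connect(443, 'example.com', { servername: 'example.com' }, () => {
  socket.write('GET /file HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n');
});
socket.once('data', (chunk) => {
  // Print the first bytes of whatever the server sends back.
  console.log(chunk.subarray(0, 64).toString('latin1'));
  socket.destroy();
});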

terreng commented 2 years ago

@mcollina Got it. I definitely got just an ERR_INTERNAL_ASSERTION with a different stack trace at one point, but I haven't been able to recreate it so far.

solymosi commented 2 years ago

I ran into an assertion with the exact same stack trace the other day:

node:internal/assert:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at new NodeError (node:internal/errors:371:5)
    at assert (node:internal/assert:14:11)
    at JSStreamSocket.doWrite (node:internal/js_stream_socket:164:5)
    at JSStream.onwrite (node:internal/js_stream_socket:33:57)
    at MqttStream.ondata (node:internal/js_stream_socket:77:22)
    at MqttStream.emit (node:events:526:28)
    at MqttStream.emit (node:domain:475:12)
    at addChunk (node:internal/streams/readable:315:12)
    at readableAddChunk (node:internal/streams/readable:289:9)
    at MqttStream.Readable.push (node:internal/streams/readable:228:10) {
  code: 'ERR_INTERNAL_ASSERTION'
}

In my case I have a custom Duplex stream wrapped with a TLSSocket, but the symptoms are the same: under moderate to high load, every once in a while this assertion is hit and my app crashes.
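
The setup looks roughly like this (a simplified sketch; `MqttStream` here stands in for my actual Duplex implementation):

const tls = require('tls');
const { Duplex } = require('stream');

// Minimal stand-in for the real MQTT transport stream.
class MqttStream extends Duplex {
  _read(size) { /* push data received from the transport */ }
  _write(chunk, encoding, callback) { /* hand data to the transport */ callback(); }
}

// Passing a JS-land Duplex as the socket makes Node wrap it in a
// JSStreamSocket, which is the code path the assertion fires on.
const socket = tls.connect({ socket: new MqttStream(), servername: 'broker.example.com' });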

I did some debugging and it seems to me that this is caused by a missing "current empty write" check in the native TLSWrap implementation. Debug output for the TLSSocket in question:

TLSWrap client (6653) DoWrite()
TLSWrap client (6653) Empty write
TLSWrap client (6653) Trying to read cleartext output
TLSWrap client (6653) Read -1 bytes of cleartext output
TLSWrap client (6653) No pending encrypted output, writing to underlying stream
TLSWrap client (6653) Read 33 bytes from underlying stream
TLSWrap client (6653) Trying to write cleartext input
TLSWrap client (6653) Returning from ClearIn(), no pending data
TLSWrap client (6653) Trying to read cleartext output
TLSWrap client (6653) SSLInfoCallback(SSL_CB_HANDSHAKE_START);
TLSWrap client (6653) Read -1 bytes of cleartext output
TLSWrap client (6653) Trying to write encrypted output
TLSWrap client (6653) Writing 1 buffers to the underlying stream
node:internal/assert:14
    throw new ERR_INTERNAL_ASSERTION(message);
    ^

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

The failed assertion is produced as follows: `TLSWrap` tracks a pending empty write in a separate state variable, `current_empty_write_`, but on subsequent write requests it only checks `current_write_`. If another write arrives before the empty write's callback has been called, `TLSWrap` forwards it straight to the underlying `JSStreamSocket`, whose `doWrite` asserts that no other write is currently in flight and throws `ERR_INTERNAL_ASSERTION`.

To reproduce the bug, the following conditions need to be fulfilled: the `TLSSocket` must wrap a JS-land Duplex stream (so that a `JSStreamSocket` sits underneath the `TLSWrap`), an empty write must be pending, and a second write must arrive before the empty write's callback fires.

I tried but haven't been able to come up with a minimal implementation that reliably reproduces the issue.
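
The write pattern that appears to trigger it, using a socket set up as in the sketch above (a sketch only, not a reliable reproduction):

socket.write(Buffer.alloc(0));  // TLSWrap tracks this in current_empty_write_, not current_write_
socket.write(Buffer.from('x')); // only current_write_ is checked, so this write can be
                                // forwarded to JSStreamSocket.doWrite before the empty
                                // write's callback has run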

For now I am using the following workaround in JS-land, in the constructor of my TLSSocket subclass:

// Patch `JSStreamSocket` implementation to work around a bug in the native `TLSWrap` class. `TLSWrap` treats
// empty writes in a special way, tracking pending ones in a separate state variable, `current_empty_write_`:
// - https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L934-L961
// Sadly `TLSWrap` forgets to check this variable upon subsequent write requests, only checking `current_write_`:
// - https://github.com/nodejs/node/blob/v16.14.2/src/crypto/crypto_tls.cc#L574
// As a result, if another write is received before the empty write's callback is called, `TLSWrap` forwards it to
// the underlying `JSStreamSocket` straight away where it then blows up with an `ERR_INTERNAL_ASSERTION`:
// - https://github.com/nodejs/node/blob/v16.14.2/lib/internal/js_stream_socket.js#L164
// We work around this problem by patching `doWrite` to buffer any write requests that arrive while another write
// is still in progress, as well as patching `finishWrite` to execute any buffered writes after the pending empty
// write has finished. This *should* prevent the assertion from being hit.
// TODO: re-test with newer Node versions to see if the bug has been fixed
let kCurrentWriteRequest: symbol | undefined;
const jsStreamSocket = (this as any).ssl._parentWrap;
// Save original `doWrite` and `finishWrite` methods
const doWrite = jsStreamSocket.doWrite;
const finishWrite = jsStreamSocket.finishWrite;
// List of pending write requests
const pending: [any, Buffer[]][] = [];
// Patch `doWrite` to buffer write requests if another one is already pending
jsStreamSocket.doWrite = (req: any, bufs: Buffer[]) => {
  // Fish the private `kCurrentWriteRequest` symbol out of the `JSStreamSocket` and cache it
  kCurrentWriteRequest =
    kCurrentWriteRequest ||
    Object.getOwnPropertySymbols(jsStreamSocket).find(s => s.description === "kCurrentWriteRequest");
  // Do we have a pending write request already?
  if (kCurrentWriteRequest && jsStreamSocket[kCurrentWriteRequest]) {
    // If so, do not do the write now, instead buffer it for later
    pending.push([req, bufs]);
    return 0;
  }
  // If not, we can go ahead with the write as usual
  return doWrite.call(jsStreamSocket, req, bufs);
};
// Patch `finishWrite` to dispatch any buffered write requests now that the current one has finished
jsStreamSocket.finishWrite = (handle: any, errCode: number) => {
  if (handle) {
    // Finish the current write
    finishWrite.call(jsStreamSocket, handle, errCode);
  }
  // Take the next request from the buffer, if any
  const queued = pending.shift();
  if (queued) {
    // Call `doWrite` now that the previous write has finished
    jsStreamSocket.doWrite(...queued);
  }
};

With this in place I haven't been able to hit the assertion even on a ridiculously overloaded server.

I don't feel comfortable enough around C++ code to attempt to fix this myself (nor do I understand the internals of TLSWrap enough for that), but hopefully this input will help someone more knowledgeable develop a fix.

naveenjpr commented 2 months ago

MongoDB Compass has encountered an unexpected error:

Error [ERR_INTERNAL_ASSERTION]: This is caused by either a bug in Node.js or incorrect usage of Node.js internals.
Please open an issue with this stack trace at https://github.com/nodejs/node/issues

    at new NodeError (node:internal/errors:406:5)
    at assert (node:internal/assert:14:11)
    at internalConnectMultiple (node:net:1118:3)
    at Timeout.internalConnectMultipleTimeout (node:net:1687:3)
    at listOnTimeout (node:internal/timers:575:11)
    at process.processTimers (node:internal/timers:514:7)

My Node version is v20.13.1.