vercel / next.js

The React Framework
https://nextjs.org
MIT License

MaxListenersExceededWarning #54899

Closed: filmerjarred closed this issue 1 year ago

filmerjarred commented 1 year ago

Verify canary release

Provide environment information

NOTE: Installing next@canary caused a bunch of `ERESOLVE` errors, so I have not tested with it

 Operating System:
      Platform: linux
      Arch: x64
      Version: #1 SMP Fri Jan 27 02:56:13 UTC 2023
    Binaries:
      Node: 20.5.1
      npm: 9.8.0
      Yarn: 1.22.15
      pnpm: 6.11.0
    Relevant Packages:
      next: 13.4.20-canary.15
      eslint-config-next: 13.4.12
      react: 18.2.0
      react-dom: 18.2.0
      typescript: 5.1.6
    Next.js Config:
      output: N/A

Which area(s) of Next.js are affected? (leave empty if unsure)

No response

Link to the code that reproduces this issue or a replay of the bug

https://github.com/Sage-Future/fatebook/tree/revision-test

To Reproduce

Push to Vercel and load the page.

The affected deployment is here: https://vercel.com/sage-org/forecast-bot/6CiFTtcpywgQBwxYUpPJHTVQwZPz

Describe the Bug

We pushed a new version of our app to production today and started seeing "MaxListenersExceededWarning" warnings.

What we tried:

  1. "Instant rollback" made the warnings stop; however,
  2. forking code identical to the rolled-back commit and deploying it to a new branch still produced the warnings
  3. updating the deploy install command to `npm ci` and clearing the build cache didn't help

I figured out how to get a stack trace for the warnings, and this is what we see:

  (node:8) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
  MaxListenersExceededWarning
  Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
  MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
      at _addListener (node:events:588:17)
      at ClientRequest.prependListener (node:events:620:14)
      at ClientRequest.prependOnceListener (node:events:665:12)
      at mod.request (/var/task/___vc/__launcher.js:190:13)
      at Bridge.handleEvent (/var/task/___vc/__launcher.js:617:31)
      at async Bridge.launcher [as handler] (/var/task/___vc/__launcher.js:474:26)
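
For anyone else trying to track these down: by default Node prints only the one-line warning. One way to surface the full stack trace (not necessarily what was used here) is to run node with `--trace-warnings`, or to register a handler like this sketch:

```javascript
// Surface full stack traces for process warnings such as
// MaxListenersExceededWarning. Roughly equivalent to running
// node with the --trace-warnings flag.
process.on('warning', (warning) => {
  console.error(warning.name);    // e.g. "MaxListenersExceededWarning"
  console.error(warning.message); // the one-line description
  console.error(warning.stack);   // points at the code adding the listener
});
```

The stack in the warning object points at the call site that added the over-limit listener, which is how a trace like the one above can be tied back to a specific file and line.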

My guess is that /var/task/___vc/__launcher.js:190:13 was updated in the last couple of days and is causing the issue. We don't see the warnings after an "instant rollback", presumably because the rollback uses a cached version of __launcher.js.

Two other issues look potentially related: https://github.com/vercel/next.js/issues/53949 and https://github.com/vercel/next.js/issues/54104.

Expected Behavior

Our logs should not be full of memory-leak warnings.

Which browser are you using? (if relevant)

No response

How are you deploying your application? (if relevant)

vercel

DB-Alex commented 1 year ago

We have the same all over our logs in vercel:

(node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
(node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
(node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit
renenielsendk commented 1 year ago

same here:

(node:97237) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 close listeners added to [TLSSocket]. Use emitter.setMaxListeners() to increase limit
[the same warning line repeated 18 times]
san-pblr-gct commented 1 year ago

Same here.

[screenshot]
devjiwonchoi commented 1 year ago

@filmerjarred @Renetnielsen @san-pblr-gct @DB-Alex Y'all need to provide valid repros. Sharing logs or undeployable repos doesn't help us fix this issue.

filmerjarred commented 1 year ago

Seems fixed now, we're not seeing it with new deploys.

In hindsight, this probably should have been a Vercel support request rather than a Next.js issue.

I was hoping someone could eyeball whatever changed in the Vercel code over the last week around /var/task/___vc/__launcher.js:190:13, in case that made it obvious what is adding the listeners.

elie222 commented 1 year ago

Facing the exact same issue as others, so I'm not sure what the supposed fix is.

I can't provide a repo as it's our entire codebase, and I don't have the time to create a smaller reproduction. If I knew what was causing it, I'd fix it.

My best guess is Prisma with Data Proxy, but I honestly have no idea what's causing it.

GainorB commented 1 year ago

Same issue here. Looking to move my deployment to Vercel, and I am seeing this error in the logs while trying to debug next-auth issues as well.

rawestmoreland commented 1 year ago

Seeing the same issue. We've rolled back our production app. We're seeing these Maximum call stack size exceeded errors as well. Also referencing /var/task/___vc/__launcher.js

tpae commented 1 year ago

Seeing this as well

wwei-github commented 1 year ago

How do we solve this problem?

youminkim commented 1 year ago

We see MaxListenersExceededWarning errors from many serverless functions.

It happens only to deployments made after 4:30 pm today (Singapore time). We instant-rollbacked to the previous build and it works fine without errors.

(node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit

elja commented 1 year ago

I do have the same:

(node:8) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners()

domharrington commented 1 year ago

Seeing lots of these errors rolling into our Vercel logs in the past 30 mins or so.

angelpadillar commented 1 year ago

This also began happening ~2 hours ago. All was good yesterday.

(node:8) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit

marcelocch commented 1 year ago

Same here in our account "artisan-labs" my project is "brisas-titicaca" Same logs MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]

devPablo commented 1 year ago

This is happening on our end as well, error message: (node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 16 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit

domharrington commented 1 year ago

I just created a support case on Vercel: https://vercel.com/help I will update in here if I hear anything.

christopherliedtke commented 1 year ago

> We see MaxListenersExceededWarning errors from many serverless functions.
>
> It happens only to deployments made after 4:30 pm today (Singapore time). We instant-rollbacked to the previous build and it works fine without errors.
>
> (node:15) MaxListenersExceededWarning: Possible EventEmitter memory leak detected. 11 response listeners added to [ClientRequest]. Use emitter.setMaxListeners() to increase limit

I do have the same problem and rolling back didn't work for me unfortunately. This issue should be reopened.

maccman commented 1 year ago

We're seeing the same issue.

kitajchuk commented 1 year ago

Also having this issue.

theholla commented 1 year ago

We also ran into this starting with yesterday's deploy

Fruchtix commented 1 year ago

Same Issue over here. Any help is appreciated.

askkaz commented 1 year ago

Same issue here - seeing this on all of our API routes (page router). We tried redeploying the older version of our code on Vercel that was working and it is now exhibiting the same behavior. Suspect something underlying in the Vercel infrastructure. We're on Next 13.4.19.

theZaX commented 1 year ago

Same issue here. Connected to axiom and sentry on vercel.

christopherliedtke commented 1 year ago

> Same issue here - seeing this on all of our API routes (page router). We tried redeploying the older version of our code on Vercel that was working and it is now exhibiting the same behavior. Suspect something underlying in the Vercel infrastructure. We're on Next 13.4.19.

Same for all API routes in pages-router mode as well. "next": "^13.5.2"

holcz commented 1 year ago

Same issue for us as well. Our stack, in case it's relevant:

"next": "~12.1.6",
"next-auth": "~4.20.1",
"next-axiom": "^0.14.0",
"@next-auth/prisma-adapter": "^1.0.3",
"prisma": "4.8.1",

Connected to Axiom. Thanks for looking into this.

danielhochman commented 1 year ago

Same issue here. I have opened a support case with Vercel and would suggest that anyone experiencing the problem do the same, as this is likely an issue with their environment.

javivelasco commented 1 year ago

Hi everyone! This was an issue we detected in our runtime and was already fixed. You can re-deploy to Vercel or re-run your deployment and the problem should be gone. Please reopen in case it keeps happening. Sorry about the inconvenience! 🙏

github-actions[bot] commented 1 year ago

This closed issue has been automatically locked because it had no new activity for 2 weeks. If you are running into a similar issue, please create a new issue with the steps to reproduce. Thank you.