Open · muhammad-zakir opened this issue 10 months ago
I have been seeing this too. I saw that our process manager restarted the node process because of an exit.
Diving deeper into the logs, I found the same error:
node:events:492
throw er; // Unhandled 'error' event
^
Error: EMFILE: too many open files, open '/var/nodeapp/sveltekit-app/build/client/_app/version.json'
Emitted 'error' event on ReadStream instance at:
at emitErrorNT (node:internal/streams/destroy:151:8)
at emitErrorCloseNT (node:internal/streams/destroy:116:3)
at process.processTicksAndRejections (node:internal/process/task_queues:82:21) {
errno: -24,
code: 'EMFILE',
syscall: 'open',
path: '/var/nodeapp/sveltekit-app/build/client/_app/version.json'
}
Nothing else in particular. Nothing I can find to reproduce it.
Using not the latest, but still quite recent versions of SK and node adapter:
"@sveltejs/adapter-node": "^2.0.0",
"@sveltejs/kit": "^2.0.0",
I am on Debian Linux 11 (x86-64), Node v20.10.0. No containers, plain server.
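For anyone wondering why this exits the process instead of just failing one request: the stack trace shows the 'error' event is emitted on a ReadStream with no listener attached, and Node treats an unhandled 'error' event as fatal. A minimal sketch of that failure mode (assumed for illustration, not code from SvelteKit or the adapter):

import { createReadStream } from 'node:fs';

// If open() fails (EMFILE, ENOENT, ...), the stream emits 'error'.
const stream = createReadStream('./build/client/_app/version.json');

// Without a listener like this, the events module re-throws the error
// and the whole Node process exits, which is what the log above shows.
stream.on('error', (err) => {
  console.error('read stream failed:', err.code);
});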
Have you tried running lsof on your server to check whether node has a strangely large number of open files, and whether the count keeps increasing until it hits the limit?
If you don’t mind me asking, what’s your temporary solution for now?
sudo lsof | awk '{print $9}' | sort | uniq -c | sort -nr
5385057 27
384986 /var/nodeapp/sveltekit-app/build/client/_app/version.json
8470 731
8320 1406
6029 10932
3052 43387
2651 0
....
This is very surprising. Why would version.json need to be open 384986 times? So apparently this number grows (quite rapidly) until it breaks.
I have no temporary solution, our process manager (supervisord) restarts the process when it fails (I was not even aware of this problem).
Hahaha, I can relate to the frustration. By any chance, are you also using Sentry for your project?
No, no Sentry involved.
I only have the version.json file being opened this many times (I have no idea what that '27' thing is that has that many file descriptors open, I'm chatGPTing my commands here).
I do have version polling implemented, with a pollInterval that is, presumably, too low (despite being the same value as used in the tutorial):
import adapter from '@sveltejs/adapter-node';

const config = {
  kit: {
    adapter: adapter(),
    version: {
      pollInterval: 5000,
    },
  },
};

export default config;
Still don't know why this would add up file descriptors like that; we probably only have a few hundred visitors each day.
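For what it's worth, my understanding (an assumption on my side, not SvelteKit's actual code) is that version polling boils down to each open tab fetching /_app/version.json every pollInterval milliseconds, roughly like this:

// Rough sketch of what pollInterval: 5000 amounts to in the browser.
// BUILD_VERSION is a placeholder for the version string baked in at build time.
const BUILD_VERSION = 'build-id-at-deploy-time';
const pollInterval = 5000;

setInterval(async () => {
  const res = await fetch('/_app/version.json', { cache: 'no-store' });
  const { version } = await res.json();
  if (version !== BUILD_VERSION) {
    // SvelteKit would then treat the app as stale and do a full reload on the next navigation
  }
}, pollInterval);

At 5 seconds per tab, even a few hundred simultaneous visitors is only a few dozen requests per second, so the request rate alone shouldn't exhaust the limit; it looks more like the descriptors opened for those requests are never released.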
I've been stuck with this for almost 2 weeks now and still have no idea what's causing all these strangely opened files. In my case, they seem to be coming from Sentry's source maps, but even a setup without Sentry, like yours, has a similar issue.
For now I use pm2 to restart the process when it hits the limit, but this is quite annoying to just leave as it is.
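In case it helps anyone doing the same, this is roughly the pm2 setup I mean; the app name and entry path are placeholders for illustration, not taken from this issue:

// ecosystem.config.js
module.exports = {
  apps: [
    {
      name: 'sveltekit-app',      // placeholder name
      script: 'build/index.js',   // adapter-node's default build output entry
      instances: 1,
      autorestart: true,          // bring the process back up after the EMFILE crash
      restart_delay: 1000,        // small delay so restarts don't thrash
    },
  ],
};

Started with pm2 start ecosystem.config.js. It's only a band-aid, of course; the descriptors still leak until the next restart.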
Got a similar error when trying to run the development server: Error: EMFILE: too many open files, watch '.../vite.config.js'
Edit: was somehow able to fix it after restarting several applications. Still no clue why. Probably an issue with my OS.
Describe the bug
Hi there, first things first, sorry if my English isn't proper, my questions aren't clear, or I missed a requirement for raising an issue. I just started using Svelte with SvelteKit for an app in production, but recently we started to get some crashes on the server.
Every time the crashes happen, the log says just what I've put below, maybe with a different source file, but mostly from the same immutable directory. This app is served inside a container that's managed by CapRover. I've tried increasing the number of container instances for the app and increasing the container's open file limit, but this problem still occurs.
There are too many files being opened by node, and the count keeps increasing until it hits the limit, so I personally don't think that continually raising the ulimit will be the proper solution here, for example:
Is there something that I've done wrong here? Maybe my Dockerfile, or how I handled an import, etc.? Is anyone else having a similar issue? By the way, here's the Dockerfile that I'm using:
Thanks a lot beforehand, and sorry for the inconvenience if my questions are incomplete or just blatantly wrong here.
Reproduction
I can't really reproduce it, since usually this problem occurs when we're having a bit of a peak in traffic (around 100-ish visitors).
Logs
System Info
Alpine Linux: 3.19.0
Architecture: arm64
Node: 21.5.0
Containerized inside a node:alpine image, using the latest version of Node and running on an ARM architecture. I'm also using pnpm to build the app, and requiring dotenv as the documentation suggests when running the app.
Severity
annoyance
Additional Information
No response