Closed: bilalq closed this issue 7 years ago.
Perhaps 'node' could have a parameter added that would suppress closed pipe errors on stdout, almost as a cmdlet mode specifier.
Related: #831.
Doesn't seem to be limited to console only:
$ iojs -e "process.stdout.write('1\n');process.stdout.write('2\n')" | head -1
1
events.js:141
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at Object.exports._errnoException (util.js:734:11)
at exports._exceptionWithHostPort (util.js:757:20)
at WriteWrap.afterWrite (net.js:753:14)
A single call works though:
$ iojs -e "process.stdout.write('1\n2\n')" | head -1
1
I'm not sure if this can help, but I've just seen exactly the same error displayed (on the console) from within a Koa application I'm developing:
Error: write EPIPE
at Object.exports._errnoException (util.js:749:11)
at exports._exceptionWithHostPort (util.js:772:20)
at WriteWrap.afterWrite (net.js:753:14)
Unfortunately I have no idea what caused it or how to replicate it – and with so little information shown, I have no means of investigating it... (I'm not doing any piping or working with streams directly myself.)
This is on io.js v1.6.4, Koa v0.19.1. I can give further details of packages I'm using if it might help anyone.
+1 for this issue. Just ran into it as well. Makes it impossible to create command line utilities that compose in the unix style.
The problem is that, in the head example, head is closing the stdout file descriptor while the node program is still trying to write to that descriptor.
We don't currently expose a way for a program to check that the FD is still open (fcntl + GETFD, I think?), and the act of checking before writing would likely be prone to race conditions where EPIPE could slip through again.
User code can sidestep some of the badness by doing something like:

    const EE = require('events') // needed for EE.listenerCount below

    blackholeEPIPE(process.stderr)
    blackholeEPIPE(process.stdout)

    function blackholeEPIPE (stream) {
      stream.on('error', onerror)
      function onerror (err) {
        if (err.code === 'EPIPE') {
          // Once the pipe is broken, swap in no-op implementations so
          // further writes are silently discarded.
          stream._write = noopWrite
          stream._writev = noopWritev
          stream._read = noopRead
          return stream.removeListener('error', onerror)
        }
        // Re-emit any non-EPIPE error if we're the only listener.
        if (EE.listenerCount(stream, 'error') === 1) {
          stream.removeListener('error', onerror)
          stream.emit('error', err)
        }
      }
    }

    function noopWrite (chunk, enc, cb) {
      cb()
    }

    function noopRead () {
      this.push('')
    }

    function noopWritev (chunks, cb) {
      cb()
    }
Out of curiosity, how is this handled in other languages? I've never encountered this type of issue before.
If checking before writing is error-prone, would it be possible to wrap some try/catch logic around the write to handle the error when it happens?
Just discovered EPIPE while uploading 6.4 MB of data to the cloud.
See https://github.com/nodejs/node/pull/9470 ... although I probably need to take a different approach.
Requesting reopen, because this clearly still happens:
$ node --version
v7.10.0
$ node -e 'console.log("1");console.log("2")' | head -n1
1
events.js:163
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at exports._errnoException (util.js:1050:11)
at WriteWrap.afterWrite [as oncomplete] (net.js:814:14)
Or is there an alternative solution that I missed, which this was closed in favour of?
@anko https://github.com/nodejs/node/pull/9744 was deemed semver-major, so your example will work in Node 8 upwards, which is expected to be released in 2 weeks – there’s an RC available here: https://nodejs.org/download/rc/v8.0.0-rc.0/
If you’re using process.stdout.write or process.stderr.write, pass a callback to .write() that checks whether an error has occurred (because Node can’t decide for you what your application wants to do with that error).
@addaleax I see. Thank you! :sparkles:
@addaleax Still doesn't work. I'm running on v8.1.0. Any pointers? @anko Did it work for you by any chance?
@bozzmob Can you share any code that might be relevant?
@bozzmob My earlier example that failed in Node 7 works in Node 8:
$ node --version
v8.7.0
$ node -e 'console.log("1");console.log("2")' | head -n1
1
This error happens to me when streaming video with https://github.com/phoboslab/jsmpeg . It happens after a while as clients are connecting. I was using Node.js 4.7.2; I've now upgraded to 8.8.1 and the error still exists.
events.js:141
throw er; // Unhandled 'error' event
^
Error: write EPIPE
at exports._errnoException (util.js:907:11)
at WriteWrap.afterWrite (net.js:787:14)
@Dacesilian Could you post some code to reproduce that, ideally a minimal example?
Sorry, I can't, because it's not my application and I don't understand it :(
@addaleax @anko I actually changed the code to work synchronously, and it worked then. I might have lost the exact code that was throwing this error, but I can confirm my environment was v8.1.0 and it still happened. I can check and get back with the code.
Same error as @Dacesilian here. @anko I can't post a minimal example since it happened in my production environment; I got an alert because my website's first request was very slow, so I restarted the pm2 process and it started working correctly.
If you are using the jsmpeg library like me, you can add this code to the WebSocket server:

    socket.on('error', function (e) {
      console.log(e)
    })

I have tested it; the server runs OK.
I've also opened a related issue about this in the joyent/node repo (joyent/node#9279). The error message from iojs is slightly different, but just as unhelpful in this situation.
For the sake of convenience, the rest of this post is verbatim what the issue under the node repo had, with the minor change in error message accounted for.
So this issue seems to have been around for quite some time. The problem probably affects all instances where things are piped into closed streams, but a very common example that comes up in practice is piping the output of a node program to something else.
Let's use this as a simple (albeit contrived) example:
When piping to something that won't consume the entirety of the stream, the program crashes in a very unhelpful way:
Categorizing this as an error in the first place is a little strange. While I can appreciate that trying to write to a closed stream is an error in theory, it seems really odd to crash like this when not paging through everything on stdout.
Treating stdout as a special case may be a bit much to ask, but at the very least, there should be some improvements to the error messaging here. The example I gave was just using iojs -e, but even with a proper file, no useful stack trace or messaging is provided. Maybe add something like: Warning: Attempting to pipe to a closed stream (foo.js:10:12)?
I discovered this as I was piping the output of a CLI program I'd written to less. When paging all the way to the bottom, no errors would be logged, but exiting less before paging all the way through would consistently have the EPIPE error written to stderr.
Right now, there are a few workarounds. Users can redirect stderr to /dev/null, or use the epipebomb module.
Neither approach is ideal.
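For concreteness, the stderr-redirect workaround looks something like this (a plain shell command stands in here for the failing node program):

```shell
# The EPIPE stack trace is printed on stderr, so sending stderr to
# /dev/null hides it while stdout still flows through the pipe.
# Note this also hides any legitimate errors -- hence "not ideal".
sh -c 'echo "1"; echo "Error: write EPIPE" >&2' 2>/dev/null | head -n 1
```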