rannn505 / child-shell

Node.js bindings 🔗 for shell
http://rannn505.github.io/child-shell/
MIT License

Cannot call write after a stream was destroyed #74

Closed mszekiel closed 4 years ago

mszekiel commented 5 years ago

I am using node-powershell with several PowerShell script files. One of them installs a VM and is quite large. I call all my scripts through an execute function.

import Shell from 'node-powershell';
import { POWERSHELL_TIMEOUT } from '../constants';
import pTimeout from 'p-timeout';

export function execute(command, callback) {
    const powershell = new Shell({
        executionPolicy: 'Bypass',
        noProfile: true,
        debugMsg: false,
        inputEncoding: 'UTF-8',
        outputEncoding: 'UTF-8',
    });

    let psTimeout;

    if (callback instanceof Function) {
        psTimeout = 999999999;
    } else {
        psTimeout = POWERSHELL_TIMEOUT;
    }

    powershell.addCommand(command);

    if (callback instanceof Function) {
        powershell.streams.stdout.on('data', data => data.trim().split('\n').map(callback));
        powershell.streams.stderr.on('data', data => data.trim().split('\n').map(callback));
    }

    return pTimeout(powershell.invoke(), psTimeout)
        .then(data => {
            return powershell.dispose().then(() => {
                return data;
            });
        }).catch(err => {
            powershell.dispose();
            throw err;
        });
}

I call dispose after every single command, which keeps the app lightweight at a maximum of about 70 MB of RAM. Simple commands are easy to run and work without errors. But when I run my large script, dispose() fails with the following error.

NPS>  Process 85860 started
NPS>  Command invoke started
NPS>  Process 85860 exited with code 0
2019-01-29T12:33:10.078Z NodeError: Cannot call write after a stream was destroyed
    at doWrite (_stream_writable.js:406:19)
    at writeOrBuffer (_stream_writable.js:394:5)
    at Socket.Writable.write (_stream_writable.js:294:11)
    at main.js:125:1421
    at D.t._execute 
    at D._resolveFromExecutor 
    at new D 
    at t.ShellWrite 
    at main.js:117:4691
    at D.t._execute 
    at D._resolveFromExecutor 
From previous event:
    at D.I [as _captureStackTrace] 
    at D._resolveFromExecutor 
    at new D 
    at t.value 
    at main.js:117:4646
    at a 
    at main.js:64391
    at c
    at process._tickCallback (internal/process/next_tick.js:61:11)

main.js:11 Error [ERR_STREAM_DESTROYED]: Cannot call write after a stream was destroyed
    at doWrite (_stream_writable.js:406:19)
    at writeOrBuffer (_stream_writable.js:394:5)
    at Socket.Writable.write (_stream_writable.js:294:11)
    at D.t._execute 
    at D._resolveFromExecutor 
    at new D 
    at t.ShellWrite 
    at main.js:125:19258
    at D.t._execute 
    at D._resolveFromExecutor 
From previous event:
    at D.I [as _captureStackTrace] 
    at D._resolveFromExecutor 
    at new D 
    at t.value 
    at main.js:117:4646
    at a 
    at main.js:11:64246
    at c 
    at process._tickCallback (internal/process/next_tick.js:61:11)
mszekiel commented 5 years ago

Even if I try a hackish fire-and-forget approach, it still throws inside dispose:

setTimeout(() => {
    powershell.dispose();
}, 10000);
BenjaminMichael commented 5 years ago

I wrote a program that used node-powershell and had to perform many asynchronous operations. The only way I could make it as quick as I wanted while staying resource-efficient was to create a single PS object with const powershell = new Shell, never dispose of it, and always listen for PowerShell's output. Because I was doing LDAP lookups and similar things, I had to watch out for results coming back in the wrong order, or even mashed together.


// one powershell to rule them all
powershell.on('output', output => {
    let data;
    try {
        data = JSON.parse(output);
        console.log('PowerShell returning data: ' + output);
        evaluatePSOutput(data);
    } catch (err) {
        // Several results came back mashed together; split on the EOI marker.
        console.log('EOI error');
        const outputChunks = output.split(`EOI`);
        outputChunks.forEach(chunk => {
            const myData = JSON.parse(chunk);
            evaluatePSOutput(myData);
        });
    }
});
mszekiel commented 5 years ago

@BenjaminMichael Okay, but what if I trigger multiple scripts/cmds the whole time the app is running? E.g. I start a long-running script to install a VM but concurrently run other commands in PS.

BenjaminMichael commented 5 years ago

You could wrap your script/cmds inside a scriptblock and use Start-Job to run it asynchronously, but you don't have to do it this way.
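As a sketch of that Start-Job idea (the helper name `asBackgroundJob` and the script name are mine, not part of node-powershell), you can build the command string so the shell returns as soon as the job is queued:

```javascript
// Hypothetical helper: wrap an arbitrary script in Start-Job so that invoke()
// resolves once the job is queued instead of waiting for the script to finish.
function asBackgroundJob(script, jobName) {
  return `Start-Job -Name '${jobName}' -ScriptBlock { ${script} } | Out-Null`;
}

// e.g. powershell.addCommand(asBackgroundJob('Install-MyVm', 'vm-install'));
console.log(asBackgroundJob('Install-MyVm', 'vm-install'));
```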

You can have multiple node-powershell objects running at once and .dispose() of each with a callback. In your case, though, your script is returning exit code 0, which disposes of the shell automatically. Note that node-powershell handles EOI (end of input) differently from the process closing: if the process closes and it exited with code 0, the shell is disposed automatically.

this._proc.on('close', code => {
      this.emit('end', code);
      let exitMsg = `Process ${this._proc.pid} exited with code ${code}\n`;
      if(hasError) {
        // PS process failed to start
        this._print(ERROR_MSG(exitMsg));
        // this._print(ERROR_MSG(Buffer.concat(output.stdout).toString()));
        throw new Error(Buffer.concat(output.stdout).toString().replace(/\0/g, ''));
      } else {
        // dispose
        if(code !== 0) {
          this._print(ERROR_MSG(exitMsg));
          this._reject(ERROR_MSG(`script exit ${code}`));
        } else {
          this._print(OK_MSG(exitMsg));
          this._resolve(`script exit ${code}`);
        }
      }
    });

tl;dr: take the Exit keyword out of your script and it probably wouldn't behave this way.

BenjaminMichael commented 5 years ago

I would add: what if you had to run a hundred tasks? I ran into this with recursive functions and found I really didn't want to open up to 100 powershell instances at once. My approach was to open one powershell object and use better-queue to run commands one at a time; you can still push new tasks into the queue asynchronously with no problem.
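A dependency-free sketch of that queue idea (better-queue swapped for a bare promise chain; `runCommand` stands in for whatever actually invokes the shared shell):

```javascript
// Serialize commands through one shared shell: each enqueued command starts
// only after the previous one has settled, so a single PS instance is never
// asked to run two invocations at once.
function createSerialQueue(runCommand) {
  let tail = Promise.resolve();
  return function enqueue(command) {
    const result = tail.then(() => runCommand(command));
    tail = result.catch(() => {}); // keep the chain alive after failures
    return result;
  };
}
```

Tasks can be pushed from anywhere in the app at any time; they simply line up behind whatever is already running.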

rannn505 commented 4 years ago

PSCluster is coming :)