Open agorf opened 7 years ago
Hi @agorf, thanks for your feedback, I wasn't aware of this. Can you illustrate how you would do it? Please find the current situation in below snippets.
https://github.com/kerberos-io/machinery/blob/master/src/kerberos/machinery/io/IoScript.cpp#L49 https://github.com/kerberos-io/machinery/blob/master/scripts/run.sh
@cedricve Sure. In most programming languages there's a call called "popen" that lets you spawn a command as a subprocess and that returns a file handler. If you write to that file handler, you pass data to the subprocess stdin. If you read from that file handler, you read data from the subprocess stdout. Here's an example in Ruby:
f = IO.popen('tee', 'w') # open subprocess to write to its stdin
f.puts('testing!') # write
f.close # close stdin; tee sees EOF, flushes its output, and exits
This calls the tee command, which reads from stdin and writes everything to its stdout. So it essentially writes 'testing!' to the stdin of tee, and tee outputs 'testing!'.
Sorry I'm not able to provide an example in C++; my familiarity with the language is very limited.
Hi, when I use scripts, my script doesn't receive the first argument with the JSON.
And thanks a lot for Kerberos =)
Hi, thanks for Kerberos.
When choosing to run a script and passing it a JSON payload for each capture, Kerberos sends the payload via command-line arguments, which is non-standard and incompatible with the Unix philosophy of passing data between processes through pipes.
It'd be nice to have Kerberos pass the JSON payload to each script via standard input instead. This should make the whole process more robust (e.g. command-line arguments are space-separated and JSON may contain spaces) and cleaner.