Closed: vicenteg closed this issue 10 years ago.
Please.
That is not a good thing.
On Wed, Oct 29, 2014 at 5:52 AM, Vince Gonzalez notifications@github.com wrote:
[vagrant@node1 you-suck]$ ~/log-synth/synth -count $((RANDOM % 10)) -schema ~/you-suck/test/preso-ratings.schema -format json
{"name":"Nicole","timestamp":"2014-11-03 18:40:12","slide_title":"slide3","rating":"sweet!"}
{"name":"Paula","timestamp":"2014-11-03 18:43:51","slide_title":"slide1","rating":"meh."}
F 1 0.0 0 0.0 0.000
The last line prevents me from using the synthesized JSON directly when redirecting the output to a file, since that final line of output is not parseable as JSON. I think a simple solution would be to print that final line of logging information to stderr instead of stdout, so I can redirect stdout to a file without any filtering.
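To illustrate, here is a minimal sketch of the idea, not log-synth's actual code; the class name and the exact summary format are made up for the example. Records go to stdout while the summary line goes to stderr, so `synth ... > data.json` captures only valid JSON.

```java
// Hypothetical sketch (illustrative names, not log-synth internals):
// emit generated records on stdout and the final summary line on stderr.
public class SummaryToStderr {
    public static void main(String[] args) {
        // Generated records are written to stdout as one JSON object per line.
        System.out.println("{\"name\":\"Nicole\",\"rating\":\"sweet!\"}");

        // The progress/summary line goes to stderr, so redirecting stdout
        // to a file yields clean, line-delimited JSON.
        System.err.printf("F %d %.1f %d %.1f %.3f%n", 1, 0.0, 0, 0.0, 0.000);
    }
}
```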
If that sounds reasonable, I'm happy to try it out in a fork and submit a pull request.
Use case, if the context is helpful: I am playing with Spark Streaming's textFileStream, and I'm using log-synth to periodically generate data in new files that will be batched up and processed by Spark.
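A rough sketch of that use case follows; the directory path, batch interval, and app name are assumptions, not part of the original report. Spark Streaming watches a directory for new files produced by log-synth and treats each line as one JSON record, so today a filter is needed to drop the trailing summary line.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.JavaDStream;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class LogSynthStream {
    public static void main(String[] args) throws InterruptedException {
        SparkConf conf = new SparkConf().setMaster("local[2]").setAppName("log-synth-stream");
        JavaStreamingContext ssc = new JavaStreamingContext(conf, Durations.seconds(10));

        // Each new file dropped into this directory becomes part of the next batch.
        JavaDStream<String> lines = ssc.textFileStream("/data/log-synth-output");

        // Workaround needed today: drop the trailing summary line, which is not
        // valid JSON; with the proposed stderr change this filter goes away.
        JavaDStream<String> jsonLines = lines.filter(line -> line.trim().startsWith("{"));

        jsonLines.print();
        ssc.start();
        ssc.awaitTermination();
    }
}
```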
— Reply to this email directly or view it on GitHub https://github.com/tdunning/log-synth/issues/12.
I think that this is fixed.