Closed: wizzard0 closed this issue 4 years ago
In the 'data' event handler, you can call .end() yourself after N rows have been collected. Any feature we added would do the same thing internally.
Oh, I see: it was the underlying Readable that did not close properly. Calling .end() on csv-parser piped from a regular fs.ReadStream does indeed stop reading. Sorry for bothering then :/
To anybody who might run into a similar issue: tracing events via monkey-patching EventEmitter.prototype.emit, combined with setting highWaterMark to something tiny like 4 bytes, helps a lot.
Feature Proposal
Add a "while" predicate and/or a "limit" option, to get e.g. the first 100 rows, or rows "while they match a predicate".
Feature Use Case
exploring/previewing huge files, searching files, auto-detecting column separators
Of course, we can just skip rows after parsing them, but that wastes CPU and user time :/