gkindel / csv-js

A Comma-Separated Values parser for JavaScript. Standards-based, stand alone, and no regular expressions.
http://gkindel.github.io/csv-js/csv.html

node stream support idea #16

Closed. cygnyx closed this issue 9 years ago.

cygnyx commented 9 years ago

I would like to be able to parse very large CSV files in node.js. I've implemented CSV.stream as an idea of how this might be supported. As a utility I also implemented CSV.json to produce some example output; in practice I would use the object stream output from CSV.stream for further processing. I'm not sure how this might work in a web browser, but I think there are tools, like browserify, that emulate the Node.js-style 'require' function. I've included a command line example in the code which demonstrates a simple case:

$ node -e "c=require('CSV-JS');require('fs').createReadStream('csv.txt').pipe(c.stream()).pipe(c.json()).pipe(process.stdout)"
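
As a rough sketch of the CSV.json idea (not the actual patch; the option names and the one-JSON-object-per-line output format are my own choices), the helper could be a Transform stream that accepts parsed rows in object mode and emits newline-delimited JSON text, ready to be piped to process.stdout:

var stream = require('stream');

// Accepts parsed rows (arrays) on the writable side, in object mode,
// and emits plain text on the readable side: one JSON-encoded row per line.
function json() {
    return new stream.Transform({
        writableObjectMode: true,
        transform: function (row, encoding, callback) {
            callback(null, JSON.stringify(row) + '\n');
        }
    });
}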

This implementation could be improved, but I didn't want to modify a lot of the existing code. The new code parses the text input stream line by line: it feeds a single line at a time to CSV.parser() and pushes the result into the output stream.
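
A minimal sketch of that line-by-line approach, for illustration only: it buffers incoming chunks, splits on newlines, and parses each complete line. It uses the library's public CSV.parse() entry point rather than the internals the patch touches, and the function name csvStream is a placeholder.

var stream = require('stream');
var CSV = require('CSV-JS');

// Buffers incoming text, splits it on newlines, parses each complete
// line, and pushes the resulting rows downstream in object mode.
function csvStream() {
    var remainder = '';
    return new stream.Transform({
        readableObjectMode: true,
        transform: function (chunk, encoding, callback) {
            var self = this;
            var lines = (remainder + chunk.toString()).split('\n');
            remainder = lines.pop();   // keep any trailing partial line
            lines.forEach(function (line) {
                if (line === '') { return; }
                CSV.parse(line).forEach(function (row) { self.push(row); });
            });
            callback();
        },
        flush: function (callback) {
            var self = this;
            if (remainder !== '') {
                CSV.parse(remainder).forEach(function (row) { self.push(row); });
            }
            callback();
        }
    });
}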

gkindel commented 9 years ago

It'll need testing, but I'll pull it in. Thanks for contributing!

gkindel commented 9 years ago

Heads up -- I've cleaned up the lint, and decided to move CSV.json to CSV.stream.json to reduce the namespace footprint.
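
With that rename, the command-line example from above becomes (assuming the json helper is now reached through the stream function, i.e. c.stream.json):

$ node -e "c=require('CSV-JS');require('fs').createReadStream('csv.txt').pipe(c.stream()).pipe(c.stream.json()).pipe(process.stdout)"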

cygnyx commented 9 years ago

Sure. You may not want the CSV.json function in your code at all. I just put it in for testing.