klaemo / csv-stream

Streaming CSV Parser for Node. Small and made entirely out of streams.

Invalid non-string/buffer chunk #10

Closed: mulderp closed this 10 years ago

mulderp commented 10 years ago

I am running a CSV conversion with:

// parser.js
var Transform = require('stream').Transform;

var csv = require('csv-streamify');
var JSONStream = require('JSONStream');

var csvToJson = csv({objectMode: true, delimiter: ';', inputEncoding: 'utf8'});

var parser = new Transform();
parser._transform = function(data, encoding, done) {
  this.push(data);
  done();
};

var jsonToStrings = JSONStream.stringify(false);

// Pipe the streams
process.stdin
  .pipe(csvToJson)
  .pipe(parser)
  .pipe(jsonToStrings)
  .pipe(process.stdout);

But I get:

events.js:72
        throw er; // Unhandled 'error' event
              ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:150:14)
    at Transform.Writable.write (_stream_writable.js:179:12)
    at write (_stream_readable.js:573:24)
    at flow (_stream_readable.js:582:7)
    at CSVStream.pipeOnReadable (_stream_readable.js:614:5)
    at CSVStream.EventEmitter.emit (events.js:92:17)
    at emitReadable_ (_stream_readable.js:408:10)
    at emitReadable (_stream_readable.js:404:5)
    at readableAddChunk (_stream_readable.js:165:9)
    at CSVStream.Readable.push (_stream_readable.js:127:10)
klaemo commented 10 years ago

Hey, the problem is that csv-streamify in objectMode pushes each row as an array, but your Transform isn't in objectMode, so its writable side rejects those chunks as non-string/buffer. You can do one of two things: either

var parser = new Transform({ objectMode: true })

or

var csvToJson = csv({encoding: 'utf8', delimiter: ';'});

var parser = new Transform({ encoding: 'utf8'});

And there's no need to specify inputEncoding for csv-streamify if the input is utf8 :)
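
For reference, here's the first approach applied to your whole pipeline (a minimal, untested sketch; the delimiter is carried over from your snippet):

var Transform = require('stream').Transform;

var csv = require('csv-streamify');
var JSONStream = require('JSONStream');

// objectMode all the way through: csv-streamify emits each row as an
// array, and an objectMode Transform accepts those chunks instead of
// rejecting them as non-string/buffer
var csvToJson = csv({objectMode: true, delimiter: ';'});

var parser = new Transform({objectMode: true});
parser._transform = function (data, encoding, done) {
  this.push(data);
  done();
};

// JSONStream.stringify(false) serializes each row as a JSON line
var jsonToStrings = JSONStream.stringify(false);

process.stdin
  .pipe(csvToJson)
  .pipe(parser)
  .pipe(jsonToStrings)
  .pipe(process.stdout);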

Good luck!

mulderp commented 10 years ago

The second approach works! Thanks!