vismayshah90 closed this issue 8 years ago
Can you upload a Demo.json file that shows the problem, and maybe give information on what kind of system you are using (Node version, OS, etc.)?
Hi, please find attached the sample Demo.json that I'm trying to parse.
OS: Windows 7, Node version: 4.3.2
@addaleax, I was basically trying to read a huge JSON file as a stream and convert it to CSV. While parsing (i.e. converting JSON to CSV) more than 10k records, some of my records go missing, which causes problems. Is there a better way to convert a stream of JSON data into CSV?
ack, I can reproduce that, and it looks a lot like a bug in json2csv-stream… you should probably report it there, and maybe look for other modules that provide the same functionality. I think you could easily put something together on top of https://www.npmjs.com/package/jsonstream.
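For reference, a minimal sketch of that approach. It assumes Demo.json holds a top-level array of flat objects and that the column list is known up front (the `columns` array below is illustrative, not from the original report); the CSV quoting is deliberately naive:

```js
const fs = require('fs');
const JSONStream = require('JSONStream'); // the package linked above

const columns = ['name', 'subject']; // assumed/illustrative column order
const out = fs.createWriteStream('Demo.csv');
out.write(columns.join(',') + '\n'); // header row

fs.createReadStream('Demo.json')
  // '*' emits one object per element of the top-level array
  .pipe(JSONStream.parse('*'))
  .on('data', (record) => {
    // Naive CSV escaping: quote every field, double any inner quotes.
    const row = columns
      .map((c) => {
        const value = record[c] == null ? '' : String(record[c]);
        return '"' + value.replace(/"/g, '""') + '"';
      })
      .join(',');
    out.write(row + '\n');
  })
  .on('end', () => out.end())
  .on('error', (err) => console.error('parse error:', err));
```

(A production version would also respect backpressure from `out.write`, e.g. by wrapping the CSV conversion in a proper Transform stream and piping it to the file.)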
@addaleax, thanks for your response. I'll try the module you suggested.
Hi All,
I'm trying to use the json2csv-stream module to convert JSON objects to CSV, but I've observed that when I increase the size of the JSON input to around 10k records, some records are missing from the CSV output without any error being thrown. To cross-verify, I validated my JSON with JSLint and it is valid.
Sample JSON : [{"name":"xyz","subject":"CS"}, {"name":"abc","subject":"Maths"}, .... 10k records]
Sample code:

```js
const fs = require('fs');
const jsonStream = require('json2csv-stream');

// Transform stream that converts incoming JSON to CSV.
const parser = new jsonStream();

const file = fs.createWriteStream('Demo.csv')
  .on('close', () => { console.log('File Done'); })
  .on('error', (err) => { console.log('File NOT Done', err); });

// Stream the JSON file through the parser into the CSV file.
fs.createReadStream('Demo.json').pipe(parser).pipe(file);
```
Please suggest a solution, or point out the root cause of what I'm doing wrong.