Closed: ankurkaushal closed this issue 8 years ago
How big is your data set? Any way you could provide a sample?
Unfortunately I can't provide a sample, but the datasets are quite large; some have more than 600,000 records. I actually ended up using Node's built-in streams, which work perfectly for my company's use case.
Have you tried https://github.com/zemirco/json2csv-stream?
To be honest, I didn't, since we were a little short on time. I'll definitely try it in the next project.
I'm going to close this, since the streams project is a better fit for huge datasets and there is no way for me to reproduce the issue. Maybe one day we'll merge the two.
@ankurkaushal Hi, I was also facing the same problem over the past two days. I tried many libraries, and at last I found a great one that handles large datasets nicely:
https://www.npmjs.com/package/csvwriter
You should try it. I have exported up to 5 lakh (500,000) JSON objects.
Hope this helps others as well when they come across this issue.
Cheers, thanks.
Thanks for the link.
Same issue here with a 13 GB JSONL file. I'm looking for a command-line tool that supports JSONL.
[EDIT] json2csv written in Go looks like a good alternative.
I have been using json2csv for big data records, and apparently it can't handle large datasets. Here's the error I get:

I am using the latest version, i.e. 3.0.1. Any idea what might be causing this?