Frankusky closed this issue 5 years ago.
The issue you're experiencing is due to pulling all of the data into memory. Most database libraries/ORMs have a streaming feature, which should be used instead. Have a look at the streaming API docs here: https://github.com/zemirco/json2csv#json2csv-async-parser-streaming-api
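For reference, a rough sketch of what that looks like. This assumes json2csv v4's `Transform` stream run in object mode; `getRowStream()` is a hypothetical stand-in for whatever streaming query API your database library offers (e.g. pg-query-stream, mysql2's `query().stream()`, or knex's `.stream()`):

```js
const { createWriteStream } = require('fs');
const { Transform } = require('json2csv');

const tableColumns = ['data1', 'data2', 'data3']; // your real column list

// Same options as the in-memory parse() call, but applied to a stream.
const json2csv = new Transform(
  { delimiter: '\t', header: false, fields: tableColumns },
  { objectMode: true } // rows arrive as plain objects, not JSON text
);

getRowStream()        // hypothetical: a readable stream emitting one row object at a time
  .pipe(json2csv)     // each row is converted to a CSV line as it passes through
  .pipe(createWriteStream('output.tsv'))
  .on('finish', () => console.log('done'))
  .on('error', (err) => console.error(err));
```

This way only one row (plus stream buffers) is held in memory at a time, instead of all 881950 results at once.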
What he said ^
And feel free to ask here with more details of what you are trying to do if you need help.
Thanks for the quick support, guys! It worked perfectly :) Closing
Original report:

- Version of json2csv used: 4.5.2
- Node version/browser vendor and version: v10.13.0
- Command or code used: `json2csv.parse(dbQueryResult, {delimiter: '\t', header: false, fields: tableColumns});`
- Sample dataset: each object has 22 fields; in this case it crashed when the query returned 881950 elements:

```
[
  { data1: 1, data2: 2, data3: 3, data4: 4, data5: 5, data6: 6, ... (till 22) },
  { data1: 1, data2: 2, data3: 3, data4: 4, data5: 5, data6: 6, ... (till 22) },
  ... (till 881950)
]
```
Error output:

```
[8416:000001AA50C92E70]   441325 ms: Mark-sweep 1393.4 (1425.5) -> 1393.1 (1426.0) MB, 1497.7 / 0.0 ms  (average mu = 0.093, current mu = 0.002) allocation failure scavenge might not succeed
[8416:000001AA50C92E70]   442625 ms: Mark-sweep 1393.8 (1426.0) -> 1393.5 (1426.5) MB, 1297.1 / 0.0 ms  (average mu = 0.048, current mu = 0.002) allocation failure scavenge might not succeed

<--- JS stacktrace --->

==== JS stack trace =========================================

Security context: 0x03d44ba931e1
    2: processCell [000003C0E785B589] [{my directory}\node_modules\json2csv\dist\json2csv.cjs.js:~1298] [pc=0000019AD0B3A3F8](this=0x030015b2e0f9 ,row=0x02fd1fdb65f9

FATAL ERROR: Ineffective mark-compacts near heap limit Allocation failed - JavaScript heap out of memory
```