I am new to Kysely and trying to insert approximately 1,000,000 records that originate from a CSV file.
The file parses fine, but once I add the DB insert I run out of memory.
How can this be done in a better way that reduces memory usage?
Parsing
async function doStandardTexts() {
  const fs = require('fs');
  const Papa = require('papaparse');
  const bomstrip = require('bomstrip');

  // Strip the UTF-8 BOM before handing the stream to Papa Parse.
  const csvFile = fs.createReadStream('standard_text_utf8.txt').pipe(new bomstrip());

  let count = 0;
  Papa.parse(csvFile, {
    header: true,
    delimiter: "\t",
    dynamicTyping: true,
    quoteChar: '',
    escapeChar: '',
    step: function (row: any) {
      //console.log("Row:", row.data);
      const standardTextItem = toStandardText(row.data);
      console.log(standardTextItem);
      count += 1;
      // Note: the returned promise is not awaited, so inserts accumulate concurrently.
      insertStandardText(standardTextItem); // <<<<<<<<< this fails with out of memory
    },
    complete: function () {
      console.log("All done!: " + count);
    }
  });
}
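For context on the failure mode: `step` is called synchronously for every row, while `insertStandardText` returns a promise that is never awaited, so roughly a million inserts can end up in flight at once. One common remedy is to buffer rows and await each batch insert before continuing. The sketch below is hypothetical, not part of the question; `BatchInserter` and the `flushFn` callback are illustrative names, not Kysely or Papa Parse APIs:

```typescript
// Hypothetical batching helper: accumulate rows and flush them in batches,
// awaiting each flush so only one batch of inserts is in flight at a time.
type FlushFn<T> = (batch: T[]) => Promise<void>;

class BatchInserter<T> {
  private buffer: T[] = [];

  constructor(private flushFn: FlushFn<T>, private batchSize = 1000) {}

  // Queue a row; flush once the buffer reaches batchSize.
  async add(row: T): Promise<void> {
    this.buffer.push(row);
    if (this.buffer.length >= this.batchSize) {
      await this.flush();
    }
  }

  // Insert whatever is currently buffered and clear the buffer.
  async flush(): Promise<void> {
    if (this.buffer.length === 0) return;
    const batch = this.buffer;
    this.buffer = [];
    await this.flushFn(batch);
  }
}
```

Because Papa Parse's `step` callback is synchronous, you would combine this with the `parser` argument that `step` receives: call `parser.pause()`, `await inserter.add(row)`, then `parser.resume()`, and `await inserter.flush()` in `complete` to write the final partial batch.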
`insertStandardText(standardTextItem);` implementation:

Database config:
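The implementation and config sections above appear truncated. For readers unfamiliar with the setup, a minimal sketch of what they might contain follows; the `standard_text` table name, the `Database` interface, and the PostgreSQL connection details are all assumptions, not taken from the question:

```typescript
import { Kysely, PostgresDialect } from 'kysely';
import { Pool } from 'pg';

// Assumed schema: a single table holding the parsed CSV rows.
interface StandardTextTable {
  id: number;
  text: string;
}

interface Database {
  standard_text: StandardTextTable;
}

// Assumed config: a PostgreSQL pool; connection details are placeholders.
const db = new Kysely<Database>({
  dialect: new PostgresDialect({
    pool: new Pool({ connectionString: process.env.DATABASE_URL }),
  }),
});

// A plausible shape for the insert helper using Kysely's query builder.
async function insertStandardText(item: { id: number; text: string }): Promise<void> {
  await db.insertInto('standard_text').values(item).execute();
}
```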