shlomo666 closed this issue 4 years ago
The CSV Specification does not specifically address trailing commas and doesn't specify that they should be dropped. Empty/missing values between or after commas are treated as an empty column of data.
Correct. Thanks.
Having the same issue.
I worked around it by doing:
delete record[''];
Just be careful to test with if (record[''] !== undefined) rather than if (record['']), since a falsy value such as 0 or an empty string would fail the truthiness check.
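The workaround above can be wrapped in a small helper. This is only a sketch; dropTrailingColumn is a hypothetical name, not part of csv-parser. It uses the '' in row check instead of a truthiness test, so real columns holding falsy values (0, empty string, null) are left alone:

```javascript
// Hypothetical helper: drop the phantom '' column that a trailing comma
// produces in each parsed row.
function dropTrailingColumn(row) {
  if ('' in row) {
    // Destructure the empty-string key away and keep the rest.
    const { '': _discard, ...rest } = row;
    return rest;
  }
  return row;
}
```

You would call it inside the 'data' handler, e.g. results.push(dropTrailingColumn(data)).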
Also, I get different results when running on different operating systems with the same Node version, using "csv-parser@2.3.0".
Example:
//data.csv content
//NAME,AGE,
//Daffy Duck,24,
//Bugs Bunny,22,
const csv = require('csv-parser')
const fs = require('fs')
const results = [];
fs.createReadStream('data.csv')
  .pipe(csv({ mapValues: ({ header, index, value }) => (isNaN(+value) ? value : +value) }))
  .on('data', (data) => results.push(data))
  .on('end', () => {
    console.log(JSON.stringify(results[0], null, 2));
  });
Linux (Red Hat 4.8.5-36) node version v12.16.1 result:
{ "NAME": "Daffy Duck", "AGE": 24, "": 0}
Windows 10 node version v12.16.1 result:
{ "NAME": "Daffy Duck", "AGE": 24}
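One plausible explanation for the platform difference is line endings (LF vs. CRLF) changing what the parser sees after the last comma. A way to make both platforms behave identically is to normalize each line before parsing. This is a hedged sketch, not csv-parser API; stripTrailingComma is a hypothetical helper you would apply in a Transform stream ahead of the parser:

```javascript
// Hypothetical pre-processing step: remove a trailing CR (from CRLF line
// endings) and then a trailing comma, so every platform feeds the parser
// the same line content.
function stripTrailingComma(line) {
  return line.replace(/\r$/, '').replace(/,$/, '');
}
```

Applying this to each line removes the empty trailing column entirely, so neither OS produces the extra "" key.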
Expected Behavior
Files with a "," at the end of every line used to parse just fine in my old version, 1.12.0.
Actual Behavior
The parser creates a map of values where the keys are 0–N (the column numbers) instead of the header names.
How Do We Reproduce?
File:
code:
Result in v2.3.1:
Result in v1.12.0: