Closed: devingfx closed this issue 11 months ago
According to the spec https://csv-spec.org/ (4th point):
"Each record MUST contain the same number of fields"
I don't expect JSON.parse to parse invalid JSON, and the same goes for a CSV parser. In my opinion, the user should preprocess the file to make it valid CSV before trying to parse it.
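For illustration only, a minimal sketch of that kind of preprocessing, assuming tab-separated input with no quoted fields (quoted fields would need a CSV-aware rewrite) and assuming the parse export from the std csv module; the jsr:@std/csv import path is an assumption and may differ between std versions:

```ts
import { parse } from "jsr:@std/csv";

// Pad every short record with empty fields so all records match the
// header's width, then hand the now-valid data to the parser.
function padRecords(text: string, separator = "\t"): string {
  const lines = text.split("\n").filter((line) => line.length > 0);
  const width = lines[0].split(separator).length;
  return lines
    .map((line) => {
      const fields = line.split(separator);
      while (fields.length < width) fields.push("");
      return fields.join(separator);
    })
    .join("\n");
}

const raw = "id\tname\temail\n1\tAlice\talice@example.com\n2\tBob";
const rows = parse(padRecords(raw), { separator: "\t", skipFirstRow: true });
console.log(rows); // missing fields come back as empty strings
```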
I agree with @sigmaSd. Throwing an error when trying to parse an invalid string is the correct and expected behavior.
Yeah, I'm with @sigmaSd and @timreichen
This isn't good or proper behavior when parsing files: if any data is invalid, the package should throw an error to warn the user that the data is invalid. Any cleanup should be done before parsing.
Shouldn't a valid empty field parse to an empty string rather than undefined, anyway?
All fields are always strings. CSV itself does not support type casting to integers, floats, booleans, or anything else. It is not a CSV library’s responsibility to type cast input CSV data.
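For example (a minimal sketch; the jsr:@std/csv import path is an assumption), the std parser hands every field back as a string and leaves any casting to the caller:

```ts
import { parse } from "jsr:@std/csv";

// skipFirstRow without columns uses the first row as the record keys.
const rows = parse("age,active\n42,true", { skipFirstRow: true });
console.log(rows); // [{ age: "42", active: "true" }], all strings
```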
I agree with the others here. The implementation shouldn't be retrofitted to work with incorrect data. Instead, the data needs to be correct. Thank you @sigmaSd, @timreichen and @luk3skyw4lker.
Describe the bug
If the columns option is given to the parser (or the same with skipFirstRow: true) and a line doesn't contain the right number of columns, an error is thrown and the whole process aborts. Sometimes files don't contain all the columns, especially without quotes and with a TSV separator, and it's OK and good practice to reduce the file size...

Steps to Reproduce
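The original reproduction steps aren't preserved here; below is a rough sketch that should exercise the same code path. The jsr:@std/csv import path and the exact error thrown are assumptions and may differ between std versions.

```ts
import { parse } from "jsr:@std/csv";

const input = [
  "1\tAlice\talice@example.com",
  "2\tBob", // this record is missing the email field
].join("\n");

try {
  parse(input, { separator: "\t", columns: ["id", "name", "email"] });
} catch (err) {
  // parse() throws because the second record has fewer fields than columns.
  console.error(err);
}
```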
Expected behavior

Put undefined in the missing columns and don't throw!!
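For concreteness, the result shape the report seems to be asking for (hypothetical; not what the parser currently returns):

```ts
// Hypothetical desired output for the input sketched above: the short
// record keeps its known fields and the missing column becomes undefined
// instead of the parser throwing.
const expected = [
  { id: "1", name: "Alice", email: "alice@example.com" },
  { id: "2", name: "Bob", email: undefined },
];
```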