Closed ssasoglu closed 4 years ago
Hi,
I had the same problem, but adding \ufeff
before the csv data when writing the file fixed it for me.
```javascript
jsonexport({ /* JSON data */ }, { rowDelimiter: ';' }, function (err, csv) {
  fs.writeFile('filename.csv', '\ufeff' + csv, 'utf8', function (err) {
    // ...
  });
});
```
Hope this helps!
@virginielgb Your tactic worked for me. Thanks!! But what is '\ufeff'? Why did you use it? Could you explain or share a link about it?
@rdvnkrtl It is the byte order mark (BOM) at the beginning of the file. See here.
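To make that concrete, here is a quick, purely illustrative way to inspect the BOM bytes in Node.js:

```javascript
// '\ufeff' is the Unicode code point U+FEFF, which at the start of a
// file acts as a byte order mark (BOM).
// In UTF-8 it encodes to the three bytes EF BB BF.
const bom = Buffer.from('\ufeff', 'utf8');
console.log(bom); // <Buffer ef bb bf>
```

Excel (on Windows) uses these three leading bytes as a hint that the file is UTF-8 rather than the system code page.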
Thank you @nerdomancer
This is not a bug in the library. Not everyone uses UTF-8.
Yes, prepending the UTF-8 BOM ('\ufeff' + csv) worked great.
In my case I'm getting a JSON object to node.js from a client -
```javascript
request(options, function (error, response, body) {
  if (error) {
    console.log(error);
  }
  let data = JSON.parse(body);
  jsonexport(data, function (err, csv) {
    if (err) return console.log(err);
    fs.writeFile('test.csv', '\ufeff' + csv, (e) => {
      if (e) return next(e);
    });
  });
});
```
Error with fs, not able to install fs. Can anyone help
> Error with fs, not able to install fs. Can anyone help

First of all, you need to understand what 'fs' is: https://nodejs.org/api/fs.html After that you'll understand that this is not an issue with 'jsonexport'.
Just throwing this out there: when I updated to the latest Node.js 12.2.0, some of my packages had errors due to deprecated require('fs')
functionality. For instance, fs.exists()
produced a hard-to-understand binary issue.
That may all be unrelated, but I did have to go back and update my own packages as of yesterday.
Operating system: Windows 10
If source JSON string fields contain special characters (ñ,ó,à,ô,è) and the exported CSV file is opened in Excel, the characters are replaced with à ¨ © ± Ã
This seems to originate from the encoding. If I change the encoding to UTF-8-BOM with a text editor, then Excel shows those characters correctly.