MichaelA59 opened this issue 3 years ago (Open)
Hi @MichaelA59,
What do you mean that saving as txt works but saving as csv doesn't? Aren't you using fs.writeFile exactly the same way for both formats?
In the above code snippet, I have the path variable. I'm saying that if I modify it so that the extension reads .txt rather than .csv like you see above, the special characters are output in the txt file as they should be (like this: Postică), rather than in the jumbled, broken way they are output in the csv version. I don't know why that makes any difference, though.
That's my point. json2csv produces the correct content, as seen in the TXT, and I expect fs.writeFile to also write that content the same way using the default utf-8 encoding. So the only thing I can think of is that you are opening the documents differently. How are you opening the txt? And the csv?
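For what it's worth, here is a quick way to check that fs.writeFile writes the same bytes regardless of the file extension (a sketch; the sample data and file names are assumptions, not from the original report):

const fs = require('fs');
const { Parser } = require('json2csv');

// The same CSV string is written twice; only the file name differs.
const csv = new Parser({ fields: ['name'] }).parse([{ name: 'Postică' }]);
fs.writeFileSync('out.txt', csv); // default encoding is utf-8
fs.writeFileSync('out.csv', csv);

// Both files contain identical bytes, so any visible difference comes from
// how the application opening them decodes those bytes.
console.log(fs.readFileSync('out.txt').equals(fs.readFileSync('out.csv'))); // true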
@juanjoDiaz I have been facing a similar issue. I am opening the txt using Notepad and the csv using Microsoft Excel. The text file shows the special characters properly, but the csv file opened in Excel shows garbled output.
Any news on this?
I haven't been able to replicate the issue. It sounds like an issue with Excel and the character encoding selected when importing the file.
This solution works for me: https://blog.theodo.com/2017/04/csv-excel-escape-from-the-encoding-hell-in-nodejs/
iconv.encode(csvData, "iso-8859-1")
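A minimal sketch of that workaround, assuming the iconv-lite package (the rows data, field names, and output path are placeholders):

const fs = require('fs');
const iconv = require('iconv-lite');
const { Parser } = require('json2csv');

const rows = [{ name: 'Postică' }];
const csvData = new Parser({ fields: ['name'] }).parse(rows);

// Re-encode the utf-8 string before writing so Excel, which often assumes a
// legacy code page, decodes it as intended. Note that characters outside
// ISO-8859-1 (such as ă) cannot be represented in that encoding and may be
// replaced, so test with your actual data.
fs.writeFile('report.csv', iconv.encode(csvData, 'iso-8859-1'), err => {
  if (err) throw err;
});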
json2csv: 5.0.5
Node: 10.16.0
Running into an issue where special characters are being outputted incorrectly.
Input string: Postică
Outputted as: PosticÄƒ
I need my file to be UTF-8 encoded (which I'm apparently doing correctly), but it also needs to account for special characters. If I export the file as a TXT file, the issue goes away, but that is not a viable solution for me; I need this exported as a CSV file.
Here's the function I built which uses the package. It accepts a JSON payload of data to compile the report.
This is being used as a module, not CLI.
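A minimal sketch of a module function like the one described, assuming the json2csv v5 Parser API; the writeReport name, fields, and output path are placeholders rather than the author's actual code:

const fs = require('fs');
const path = require('path');
const { Parser } = require('json2csv');

// Compiles the JSON payload into CSV and writes it to disk.
function writeReport(data) {
  const parser = new Parser({ fields: Object.keys(data[0] || {}) });
  const csv = parser.parse(data);

  // Changing this extension from .csv to .txt is the modification discussed above.
  const filePath = path.join(__dirname, 'report.csv');

  fs.writeFile(filePath, csv, err => { // writes with the default utf-8 encoding
    if (err) throw err;
  });
  return filePath;
}

module.exports = { writeReport };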