mderazon closed this issue 5 years ago
It seems like the problem is due to values containing the ,
delimiter string and thus confusing the csv parser I use
Closing because deprecated: no time for maintenance
I know you don't have time to maintain it, but as a novice programmer I really need this fixed. Could you point me to where I can simply force the parser to add quotes around all values? I know this will increase the CSV size, but I'll work on optimizing it later. I just need to see if this will be a viable module for me to use, because so far it looks great, except that it cannot tell a , inside a value apart from the delimiter.
@darkenspirit I have started using https://github.com/wdavidw/node-csv-stringify instead of this library and so far it's working great
@mderazon
I looked into this, but it is a bit too complex and outside of my scope, I'm afraid. On top of that, the documentation is extremely fragmented, with limited options for the newest version, and the documentation for the older version looks completely different.
Are you using this from a readstream?
All I want to do is pipe from my readstream into a parser and pipe the results to express (res)
This json2csv is doing it perfectly, except for this comma-delimiter issue shifting cells around. I don't think I need something as complicated as csv.
I am using it to export my database collection to csv and send it as a response to the user in express.
I am assuming you want the result stream as csv,
so you can take any readable stream and pipe it into this module to output it as csv.
It's very simple; you can achieve the same functionality as json2csv-stream
with something like this:
var csv_stringify = require('csv-stringify');
// ...
// ...
// ...
app.get('/export-to-csv', function(req, res, next) {
var query = {}; // some query to run on the database
var results = db.stream(query); // this returns a readable stream with the query results from my db module; your readable stream might come from a different source
var keys = ['id', 'name', 'time']; // setting the fields that I want the csv to contain
var to_csv = csv_stringify({header: true, columns: keys}); // create the stream to write to
res.setHeader('content-type', 'text/csv');
res.setHeader('content-disposition', 'attachment; filename=export.csv');
return results.pipe(to_csv).pipe(res); // pipe the results from the db to the csv and then to express res obj
});
I will experiment with your examples a bit more. I like how there are more options available, as the next step for me is eventually filtering through this stream.
Anyway, back to my original issue with json2csv: I fixed this particular issue.
On line 154: that._line.push(val)
val is the value that gets pushed to the array that is passed into the stream that gets written to csv. I don't know exactly how or why it works, but I changed val here to '"'+val+'"'. This encases every value in double quotes, so that for my needs, where the csv is opened in Excel, extra commas inside values are no longer split into separate columns.
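A slightly safer variant of that quick fix (a hypothetical helper, not part of json2csv's code): once you wrap every value in quotes, the CSV rules also require doubling any quote characters already inside the value, otherwise values that contain a " will break the output.

```javascript
// Hypothetical helper illustrating the quick fix above, but with
// embedded quotes escaped as the CSV format requires.
function quoteField(val) {
  var s = String(val);
  // Double any embedded quotes, then wrap the whole value in quotes.
  return '"' + s.replace(/"/g, '""') + '"';
}

console.log(quoteField('hello, world')); // "hello, world"
console.log(quoteField('say "hi"'));     // "say ""hi"""
```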
Thank you though, I want to play with CSV more when I have the time :) Deadlines and making it work come first; making it more elegant and better coded comes after :P
Are you open to transferring the repo, so I can maintain it?
I'm here to accept PRs. I haven't done anything due to lack of interest from anyone. Once you have submitted a PR or two, we can add you as a collaborator.
+1, same issue here
https://github.com/zemirco/json2csv now has a streams API, please use that module as this one is deprecated/unmaintained.
According to the CSV spec, fields containing the delimiter must be quoted:
http://stackoverflow.com/questions/769621/dealing-with-commas-in-a-csv-file
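For reference, the spec's rule can be sketched like this (illustrative only, not taken from any of the libraries discussed above): quote a field only when it contains the delimiter, a double quote, or a line break, and double any embedded quotes.

```javascript
// Quote a field only when the CSV rules require it: the field contains
// the delimiter, a double quote, or a line break. Embedded quotes are doubled.
function encodeField(val) {
  var s = String(val);
  if (/[",\r\n]/.test(s)) {
    return '"' + s.replace(/"/g, '""') + '"';
  }
  return s;
}

// A value with an embedded comma stays in one column:
console.log(['1', 'Doe, John', '10:30'].map(encodeField).join(','));
// 1,"Doe, John",10:30
```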