Open WillDelish opened 2 months ago
I'm curious if there are any recommended ways to use this library on large data sets?
When building out a .csv, I've seen RAM usage spike to over 2.5 GB with some giant JSON files.
Being able to stream writes to a .csv in chunks would avoid such high spikes, so I'd like to raise this as a feature request.
Thanks for this amazing library! So easy to use.
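To illustrate the kind of streaming I mean, here's a rough sketch using only the Python standard library. It assumes the source data can be expressed as JSON Lines (one object per line) so each record can be parsed and written independently; the `json_lines_to_csv` name and signature are just illustrative, not part of this library:

```python
import csv
import json

def json_lines_to_csv(src_path, dst_path, fieldnames):
    """Stream a JSON Lines file to CSV one record at a time,
    so memory stays flat regardless of input size."""
    with open(src_path, encoding="utf-8") as src, \
         open(dst_path, "w", newline="", encoding="utf-8") as dst:
        writer = csv.DictWriter(dst, fieldnames=fieldnames,
                                extrasaction="ignore")
        writer.writeheader()
        for line in src:
            if line.strip():  # skip blank lines
                writer.writerow(json.loads(line))
```

Because only one record is ever held in memory, peak usage is bounded by the largest single record rather than the whole file.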