Closed: reevr closed this issue 3 years ago
Good pointer, we will surely work on this. You are also free to help out.
When I try to load a CSV with 1 million rows, the heap memory gets exceeded. I know we need to increase `--max-old-space-size` before running the application, but what I see is that Danfo.js consumes more memory than pandas: memory gets exhausted fast, with only a few thousand rows loaded. I hope this gets fixed soon, as Danfo.js is a promising future for Node.js data processing capabilities.
I understand your plight. I'm looking towards streaming operations and parallelization of most of the core operations for danfo. This will help solve this issue.