Open catfish132 opened 2 years ago
My CSV is so large that it runs out of memory. Is there a way to fix this?
Maybe try reading the CSV in chunks, running the labeler on each chunk, then writing the output iteratively.
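Something like the following minimal sketch might work, assuming the CSV can be read with pandas; `label_dataframe`, the file paths, and the chunk size are placeholders you'd replace with the actual labeler call and your own values:

```python
# Minimal sketch: read the CSV in chunks, label each chunk, append results to disk.
import pandas as pd

INPUT_CSV = "reports.csv"      # assumed input path
OUTPUT_CSV = "labeled.csv"     # assumed output path
CHUNK_SIZE = 10_000            # tune to fit available memory

first_chunk = True
for chunk in pd.read_csv(INPUT_CSV, chunksize=CHUNK_SIZE):
    labeled = label_dataframe(chunk)   # placeholder for the actual labeler call
    # Append each labeled chunk so the full result never sits in memory at once.
    labeled.to_csv(
        OUTPUT_CSV,
        mode="w" if first_chunk else "a",
        header=first_chunk,
        index=False,
    )
    first_chunk = False
```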
I faced the same issue when processing 1.5M reports. I created a set of scripts that chunk a large file and process the chunks in parallel automatically. This might be useful for others who hit this, so they don't have to reinvent the wheel:
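For reference, a rough sketch of the general approach (not the scripts above) could look like this; `run_labeler_on_file`, the `chunks/` directory, and the worker count are all assumptions for illustration:

```python
# Rough sketch: run the labeler on pre-split chunk files in parallel.
import glob
from multiprocessing import Pool

def run_labeler_on_file(chunk_path: str) -> str:
    """Process a single chunk file and return the output path (placeholder body)."""
    out_path = chunk_path.replace(".csv", "_labeled.csv")
    # ... call the labeler on chunk_path here, writing its result to out_path ...
    return out_path

if __name__ == "__main__":
    chunk_files = sorted(glob.glob("chunks/*.csv"))  # assumes the CSV was split into chunks/ beforehand
    with Pool(processes=4) as pool:                  # adjust worker count to available CPU/RAM
        outputs = pool.map(run_labeler_on_file, chunk_files)
    print(f"Labeled {len(outputs)} chunks")
```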