**Open** · mihaimyh opened this issue 3 years ago
The readers are stream-based and forward-only, so they are well suited to handling large files. It is up to you how you read and process the data.
If you can provide specifics, I can provide direction. Thanks.
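To illustrate the forward-only behavior, here is a minimal sketch of streaming a CSV record by record with `ChoCSVReader`. The file name `large.csv` and the `Id` column are hypothetical; only one record lives in memory at a time, so memory stays flat regardless of file size:

```csharp
using System;
using ChoETL;

class Program
{
    static void Main()
    {
        // ChoCSVReader reads the file forward-only; each iteration
        // materializes a single record rather than loading the file.
        using (var reader = new ChoCSVReader("large.csv").WithFirstLineHeader())
        {
            foreach (dynamic rec in reader)
            {
                // Process the record and let it go out of scope;
                // nothing else is buffered behind the scenes.
                Console.WriteLine(rec.Id);
            }
        }
    }
}
```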
What is the best way to process a 100GB+ CSV file and transfer it to SQL Server, for example? I know I can use bulk insert, but in that case it is not memory/resource efficient.
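For reference, one common pattern (a sketch under assumptions, not a confirmed answer from the maintainer) is to pair the streaming reader with `SqlBulkCopy` through the library's `AsDataReader()` helper, so rows flow from disk to SQL Server without being fully buffered. The connection string, file path, and `dbo.MyTable` below are placeholders:

```csharp
using System.Data.SqlClient;
using ChoETL;

class Program
{
    static void Main()
    {
        // Placeholder connection string; adjust for your environment.
        string connString = "Server=.;Database=MyDb;Integrated Security=true;";

        using (var csv = new ChoCSVReader("large.csv").WithFirstLineHeader())
        using (var bcp = new SqlBulkCopy(connString))
        {
            bcp.DestinationTableName = "dbo.MyTable";
            bcp.EnableStreaming = true;  // pull rows from the source lazily
            bcp.BatchSize = 10000;       // commit to the server in chunks
            bcp.BulkCopyTimeout = 0;     // disable the timeout for huge loads

            // AsDataReader() exposes the CSV stream as a forward-only
            // IDataReader, so SqlBulkCopy reads one record at a time.
            // Assumes the destination table's columns match the CSV
            // column order, since mapping defaults to ordinal position.
            bcp.WriteToServer(csv.AsDataReader());
        }
    }
}
```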
I have a requirement to process large CSV files (>10GB in size). What is the suggested way to handle this without running out of memory?
I was reading https://www.codeproject.com/Articles/1145337/Cinchoo-ETL-CSV-Reader, but this scenario is not mentioned there.