mcolominas opened 2 years ago
I'd love this feature. Happy to PR if others want?
Is there a reason why you aren't just chunking and upserting/inserting after gathering the results? I prefer upsert because it works like updateOrCreate when the column has a unique constraint in the DB schema.
Like for instance:
$results = FastExcel::import(Storage::disk("public")->path($filename), function ($line) {
    return [
        'sku' => $line["Part Number"],
    ];
});

collect($results)
    ->chunk(10000)
    ->each(function ($chunk) {
        Product::upsert($chunk->toArray(), 'sku');
    });
If the spreadsheet is huge, this causes memory issues pretty quickly. I guess I could just raise the memory limit, but since the file is read line by line, it should be easy to yield each chunk from a generator instead :)
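Roughly something like this sketch of the generator idea. Note this is not FastExcel's API (the package doesn't expose chunked reading yet); it uses plain fgetcsv(), and readInChunks is a hypothetical helper name:

// Read a CSV line by line and yield fixed-size chunks of rows,
// so only one chunk is ever held in memory at a time.
function readInChunks(string $path, int $chunkSize): \Generator
{
    $handle = fopen($path, 'r');
    $header = fgetcsv($handle); // first row assumed to hold the column names
    $chunk = [];

    while (($row = fgetcsv($handle)) !== false) {
        // array_combine assumes every row has the same width as the header
        $chunk[] = array_combine($header, $row);
        if (count($chunk) === $chunkSize) {
            yield $chunk; // hand one chunk to the caller
            $chunk = [];  // release it before reading more rows
        }
    }
    if ($chunk !== []) {
        yield $chunk; // remaining rows
    }
    fclose($handle);
}

foreach (readInChunks($path, 10000) as $chunk) {
    Product::upsert($chunk, 'sku'); // each chunk is upserted, then freed
}

Peak memory then scales with the chunk size rather than the file size, which is the behavior being asked for here.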
My company imports hundreds of CSVs every day. Currently I'm using goodby/csv, but I'd like to switch to fast-excel. Unfortunately, fast-excel always exceeds our RAM budget (over 2GB), whereas with goodby/csv I get by just fine on only 64MB.
I really hope fast-excel could do better with importing.
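For reference, the reason goodby/csv stays flat on memory is that it streams each row to an observer callback instead of collecting everything first. A minimal sketch of that pattern, buffering rows for batched upserts (the batch size and the column-0-is-SKU mapping are assumptions for illustration):

use Goodby\CSV\Import\Standard\Lexer;
use Goodby\CSV\Import\Standard\Interpreter;
use Goodby\CSV\Import\Standard\LexerConfig;

$buffer = [];

$interpreter = new Interpreter();
$interpreter->addObserver(function (array $columns) use (&$buffer) {
    // Column 0 assumed to be the SKU; adjust to your file's layout.
    $buffer[] = ['sku' => $columns[0]];
    if (count($buffer) === 1000) {
        Product::upsert($buffer, 'sku'); // flush a batch, then release it
        $buffer = [];
    }
});

$lexer = new Lexer(new LexerConfig());
$lexer->parse($path, $interpreter);

if ($buffer !== []) {
    Product::upsert($buffer, 'sku'); // flush the final partial batch
}

Nothing beyond the current buffer is retained, which is why memory stays around the 64MB mentioned above regardless of file size.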
As I commented on #162, this package is missing an option to import using chunks.
The idea would be to implement a function that receives 4 parameters: