vitorccs / laravel-csv

PHP Laravel package to export and import CSV files in a memory-optimized way

Large Export Timeouts and Memory #10

Closed mikoop79 closed 1 year ago

mikoop79 commented 1 year ago

This package is a massive speed increase compared to Excel packages, so thanks.

But we are still hitting timeout and memory issues with large exports, e.g. 50,000+ rows via FromQuery.

Do you have a way to append to the CSV file and offload memory consumption with each batch?

thanks in advance

Illuminate\Queue\MaxAttemptsExceededException: Vitorccs\LaravelCsv\Jobs\CreateCsv has been attempted too many times or run too long. The job may have previously timed out. in /var/www/html/v2/laravel/vendor/laravel/framework/src/Illuminate/Queue/Worker.php:648

I know increasing the timeout and memory limits is an option, but we need to keep a cap in place.
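
For reference, the memory-flat pattern being asked about (write each batch to the file, then release it before loading the next) looks roughly like this in plain Laravel/PHP. This is a sketch of the general technique, not this package's API; the table, columns, and output path are placeholders.

use Illuminate\Support\Facades\DB;

// Stream rows into the CSV in fixed-size chunks so memory stays flat.
$handle = fopen(storage_path('app/export.csv'), 'w');
fputcsv($handle, ['id', 'amount']); // header row

DB::table('bills')
    ->select('id', 'amount')
    ->orderBy('id') // chunk() requires a deterministic order
    ->chunk(1000, function ($rows) use ($handle) {
        foreach ($rows as $row) {
            fputcsv($handle, [$row->id, $row->amount]);
        }
        // The $rows collection goes out of scope after each chunk,
        // so only ~1000 rows are held in memory at any time.
    });

fclose($handle);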

vitorccs commented 1 year ago

@mikoop79

In my experience, the default PHP and Laravel queue settings should handle 50k records with no trouble.

Here is a checklist that may help you:

1) Use eager loading. In case your model has relationships (hasOne, hasMany, belongsTo), try to eager-load them; this removes lots of individual queries. Source: https://laravel.com/docs/10.x/eloquent-relationships#eager-loading

Before

public function query() {
  return \App\Models\Bill::query();
}

After

public function query() {
  return \App\Models\Bill::with(['payment_method', 'customer', 'subscription']);
}

2) Make sure the problem is not related to the queue. This problem may also be caused by a low timeout value for the Job or the queue worker. The Job may also be failing (e.g. an exception is being thrown inside your code) and being reattempted too many times. So try running the code in CLI mode (e.g. php artisan tinker) or via HTTP (web/Controller) to rule that out; see the sketch below.
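
For illustration, the worker timeout and attempt limits can be tuned at the queue level. A minimal sketch, assuming the database queue driver; the values are examples only, and retry_after must exceed the worker timeout or a still-running job can be re-dispatched and hit max attempts (the exact error above).

// config/queue.php (excerpt)
return [
    'default' => env('QUEUE_CONNECTION', 'database'),

    'connections' => [
        'database' => [
            'driver' => 'database',
            'table' => 'jobs',
            'queue' => 'default',
            // Must be greater than the worker's --timeout, otherwise a job
            // that is still running can be retried and exceed max attempts.
            'retry_after' => 600,
        ],
    ],
];

// Run the worker with a lower timeout and a bounded number of attempts:
//   php artisan queue:work --timeout=300 --tries=3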

3) Be careful with large fields. Long data fields (e.g. text fields) consume a lot of memory, so you may need to consider a different approach: do not retrieve the field, isolate it into a new table, or reduce the value of the chunk_size parameter.
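
For example, reducing the chunk size lowers how many rows are hydrated per batch, at the cost of more queries. A sketch, assuming the package's published config file exposes the chunk_size parameter mentioned above; the file name and value are illustrative.

// Published package config (file name assumed)
return [
    // Fewer rows per chunk means less memory per batch,
    // at the cost of more, smaller queries.
    'chunk_size' => 200,
];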

4) Rebuild your query starting from the simplest one possible. Get started by selecting only the ID field and, while the query keeps performing well, add fields back one by one until you hit the memory issue:

public function query() {
  return \Illuminate\Support\Facades\DB::table('bills')->select('id', 'amount');
}
vitorccs commented 1 year ago

@mikoop79 closing the issue due to inactivity

mikoop79 commented 1 year ago

thanks @vitorccs