Closed: kstmostofa closed this issue 1 year ago
It seems the job has exceeded the execution timeout of the queue it is running on. Try increasing the timeout in your config files. For reference, importing 30,000 rows takes about half an hour in our project.
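For example (values illustrative, not taken from the comment above): the per-job timeout is set when starting the worker, and with the database driver the retry_after value in config/queue.php should stay higher than that timeout, otherwise the job is released and re-attempted while the first attempt is still running:

php artisan queue:work --timeout=3600

// config/queue.php — keep retry_after above the worker --timeout
'database' => [
    // ...
    'retry_after' => 3700,
],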
This bug report has been automatically closed because it has not had recent activity. If this is still an active bug, please comment to reopen. Thank you for your contributions.
I'm getting the same error. I increased max_execution_time = 120000 in php.ini and updated config/queue.php as below, and I also added public $failOnTimeout = false; to the ProductsImport class, but nothing seems to work. I'm trying this in my local env as well, so I have no idea what the problem is.
'database' => [
    'driver' => 'database',
    'connection' => env('DB_QUEUE_CONNECTION'),
    'table' => env('DB_QUEUE_TABLE', 'jobs'),
    'queue' => env('DB_QUEUE', 'default'),
    'retry_after' => (int) env('DB_QUEUE_RETRY_AFTER', 120000),
    'after_commit' => false,
    'max_attempts' => 5,
],
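One thing worth double-checking here (an observation, not something confirmed in this thread): the number of attempts usually comes from the worker's --tries flag or a $tries property on the job, not from extra keys in config/queue.php, so the worker may also need to be restarted with matching flags, e.g.:

php artisan queue:work database --tries=5 --timeout=600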
Split the work into parts so that ProductsImport only reads the file data in chunks and dispatches a separate job per row:
<?php

namespace App\Imports;

use App\Jobs\ProductsImportJob;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Support\Collection;
use Maatwebsite\Excel\Concerns\ToCollection;
use Maatwebsite\Excel\Concerns\WithChunkReading;
use Maatwebsite\Excel\Concerns\WithHeadingRow;

class ProductsImport implements ToCollection, WithHeadingRow, WithChunkReading, ShouldQueue
{
    public function collection(Collection $collection)
    {
        // Dispatch one small job per row instead of writing to the
        // database here, so no single queued job runs long enough to
        // hit the timeout.
        foreach ($collection as $row) {
            ProductsImportJob::dispatch($row->toArray());
        }
    }

    public function chunkSize(): int
    {
        return 1000;
    }
}
Note: you can adjust the chunk size according to how many columns your data has. The actual database writes (wrapped in a transaction) then happen in ProductsImportJob; a sketch of that job follows below.
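ProductsImportJob isn't shown in the comment above; a minimal sketch of what it could look like (the Product model and its columns are assumptions for illustration):

<?php

namespace App\Jobs;

use App\Models\Product;
use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\DB;

class ProductsImportJob implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private array $row)
    {
    }

    public function handle(): void
    {
        // Wrap the write in a transaction so a failed row rolls back
        // cleanly and can be retried on its own.
        DB::transaction(function () {
            Product::create([
                'name'  => $this->row['name'],  // assumed heading-row keys
                'price' => $this->row['price'],
            ]);
        });
    }
}

The import is then kicked off as usual with Excel::import(new ProductsImport, 'products.xlsx'); since the import implements ShouldQueue, Laravel Excel queues it automatically.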
Is the bug applicable and reproducible in the latest version of the package, and hasn't it been reported before?
What version of Laravel Excel are you using?
3.1.30
What version of Laravel are you using?
10
What version of PHP are you using?
8.1
Describe your issue
Illuminate\Queue\MaxAttemptsExceededException: Maatwebsite\Excel\Jobs\ReadChunk has been attempted too many times. in /Volumes/Backup Data/lara-1m/vendor/laravel/framework/src/Illuminate/Queue/Worker.php:785
How can the issue be reproduced?
Import a file with a large number of rows. With fewer entries everything works fine, but once the count goes above 100K or 200K rows, this error comes up:
Illuminate\Queue\MaxAttemptsExceededException: Maatwebsite\Excel\Jobs\ReadChunk has been attempted too many times. in /Volumes/Backup Data/lara-1m/vendor/laravel/framework/src/Illuminate/Queue/Worker.php:785
What should be the expected behaviour?
The import should complete without the ReadChunk job exceeding its maximum attempts.