Closed — jackkitley closed this issue 11 months ago
For starters, passing the entire collection through the constructor defeats the purpose of using the queue feature: the collection has already been loaded into memory, and the payload of the queue job will be enormous.
It's better to run the query inside the export object.
I would further advise against using the built-in queue feature; it's better to wrap the export in a normal Laravel job and dispatch it on a long-running queue (with a longer timeout).
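A minimal sketch of that approach, assuming Laravel Excel's `FromQuery` concern (the model, relation, and filter names here are hypothetical, not from this issue):

```php
<?php

use App\Models\Stock;
use Illuminate\Database\Eloquent\Builder;
use Maatwebsite\Excel\Concerns\Exportable;
use Maatwebsite\Excel\Concerns\FromQuery;

// Hypothetical export: the query lives inside the export object,
// so only the small filter array travels in the job payload,
// not the whole result collection.
class StockExport implements FromQuery
{
    use Exportable;

    public function __construct(private array $filters = [])
    {
    }

    // Laravel Excel runs this query in chunks itself; nothing is
    // loaded up front when the export is queued.
    public function query(): Builder
    {
        return Stock::query()
            ->with(['warehouse', 'supplier']) // hypothetical relations
            ->when($this->filters['warehouse_id'] ?? null,
                fn (Builder $q, $id) => $q->where('warehouse_id', $id));
    }
}
```

With this shape, the job only serializes the filters; the query is executed on the worker when the export actually runs.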
The ->query() approach worked, thanks. I was able to run the export. It just took a lot of time, but I think that's because it needed to interact with S3 and I'm on my local machine.
I then removed ShouldQueue and put the export process into a job. This is what it looks like:
Controller:
QueueStockExport::dispatch($this->stockRepository, $filters, request()->user())->onQueue('exports');
return $this->backWithSuccessFlash(__('Export started'));
Job:
class QueueStockExport implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    /**
     * Create a new job instance.
     */
    public function __construct(
        private readonly StockRepository $stockRepository,
        private array $filters,
        private User $user,
    ) {
        //
    }

    /**
     * Execute the job.
     */
    public function handle(): void
    {
        $uniqueFile = 'stock' . uniqid() . '.xlsx';

        $export = new StockExport($this->stockRepository);
        $export->setFilters($this->filters);

        $complete = $export->store("exports/$uniqueFile", 's3');

        if ($complete) {
            NotifyUserOfCompleteExport::dispatch($this->user, $uniqueFile);
        }
    }
}
Now I don't get an AppendToSheet job that executes. I presume it's doing the chunking in the background? It's just running my job, QueueStockExport, in my queue container.
Let me know if this is the correct way, please.
Thanks
@patrickbrouwers Would avoiding the default queue and using a Laravel queue still chunk the file? I don't see any chunked files being added to S3 anymore.
I'm running multiple containers, so I would prefer it to be on S3.
It chunks the query to keep memory low. It doesn't chunk the writing to the file, which is a feature that unfortunately doesn't work because of how phpspreadsheet works with re-loading the entire sheet back into memory.
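Conceptually, the query chunking described here behaves like Eloquent's own `chunk()` — a sketch of the idea, not the package's actual internals (model and batch size are hypothetical):

```php
<?php

use App\Models\Stock;
use Illuminate\Support\Collection;

// Conceptual sketch: the query is read in fixed-size batches, so
// only one batch of models is in memory at a time. Every row still
// ends up on the same single sheet, and that sheet is written out
// once at the end — which is why the file write itself cannot be
// chunked the same way.
Stock::query()
    ->orderBy('id')
    ->chunk(1000, function (Collection $stocks) {
        foreach ($stocks as $stock) {
            // append $stock's row to the in-memory sheet here
        }
    });
```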
ok, thank you
This bug report has been automatically closed because it has not had recent activity. If this is still an active bug, please comment to reopen. Thank you for your contributions.
Is the bug applicable and reproducible on the latest version of the package, and hasn't it been reported before?
What version of Laravel Excel are you using?
3.1.48
What version of Laravel are you using?
10.16.1
What version of PHP are you using?
8.2.6
Describe your issue
I create an Eloquent query with 4 relationships. I return the data as a collection and pass it to the Laravel Excel export constructor to be processed. I have switched to queuing the data and chunking the export, but the export seems to fail on receiving the data to process.
Here is the start of my export:
Here is the config:
Here is my Export:
How can the issue be reproduced?
It seems it can be reproduced without a lot of data. I have 256M of memory in php.ini; I don't think I should need more than that.
I'm not sure what else to do. I can't set infinite memory.
My goal would be to export over 500k records, for example...
What should be the expected behaviour?
I shouldn't get memory issues, and the data should be processed and exported fine.