Closed — lrakauskas closed this 7 months ago
I couldn't reproduce the problem, but you are right: storing buffers in a static variable is a bad idea. There was a reason for it originally, but I have now moved them to a protected variable.
I have moved the buffers from static to protected variables. I hope the problem is resolved. Thank you very much for your work and your analysis.
When generating Excel files synchronously, everything works perfectly, but if you generate Excel files within a queue worker, it is very likely that some of the files will turn out to be invalid.
The issue stems from `vendor/avadim/fast-excel-writer/src/FastExcelWriter/Writer.php`: when running in a queue, `__destruct()` is not called straight away after the job is finished; it can be called later on. In practice it usually ends up being called in the middle of a subsequent export, which destroys all buffers — including the ones that are actively in use — because the buffers are stored in a static variable.

Repro:
To illustrate, if we were to add the following logging:
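The original snippet isn't reproduced here; below is a minimal sketch of the kind of logging meant, assuming the buffers are kept in a static array inside `Writer` (the property name `self::$buffers` and the use of `error_log()` are my assumptions, not the library's actual code):

```php
// Sketch only – logging added inside Writer::__destruct() for illustration.
// self::$buffers is an assumed name for the static buffer storage.
public function __destruct()
{
    error_log(sprintf(
        '[%s] Writer::__destruct() called on object #%d, destroying %d buffer(s)',
        date('c'),
        spl_object_id($this),
        count(self::$buffers)
    ));

    // ...original cleanup of all static buffers follows here...
}
```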
If we then run the export twice, we get the following log:
You can see that `__destruct` has been called in the middle of the second export, which destroyed all write buffers, and the second export obviously ended up missing some of its data and closing XML tags.

A quick solution is to destroy only the buffers that belong to the object being destructed. Something along the lines of the sketch below:
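This is not the library's actual code, just a minimal sketch of the idea; the names `self::$buffers`, `$this->tempDir` and the `getFileName()` / `close()` methods on the buffer objects are assumptions used for illustration:

```php
// Sketch of a per-object __destruct(): only buffers whose temp files live
// inside this writer's own temp directory are closed and removed.
public function __destruct()
{
    foreach (self::$buffers as $key => $buffer) {
        if (strpos($buffer->getFileName(), $this->tempDir) === 0) {
            $buffer->close();            // assumed buffer API
            unset(self::$buffers[$key]);
        }
    }
}
```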
Note that for this to work, the temp directory has to be unique for each Excel file being generated. So either, when creating the Excel with `Excel::create()`, you would want to pass the `temp_dir` option like `Excel::create(options: ['temp_dir' => $tempDir]);`, or, in `vendor/avadim/fast-excel-laravel/src/FastExcelLaravel/ExcelWriter.php`'s `create()`, you would want to create a unique subfolder (see the usage sketch below).

In general, I don't think it is good practice to store buffers in static variables, but I will leave that for you to decide.
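As a usage sketch of the `temp_dir` option mentioned above — the `Excel` namespace is assumed from the vendor path, and the way the unique temp directory is created here is purely illustrative:

```php
use avadim\FastExcelWriter\Excel; // namespace assumed from the vendor path

// Give each export its own temp directory so the per-object cleanup
// only ever touches this export's buffers.
$tempDir = sys_get_temp_dir() . '/fast-excel-' . uniqid('', true);
mkdir($tempDir, 0700, true);

$excel = Excel::create(options: ['temp_dir' => $tempDir]);
```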
P.S. Sorry for the limited example; I don't have much time to create a more extensive issue.