JuniorRaja closed this issue 2 months ago
In this code, memory_get_peak_usage() reports the memory used by the whole script, not just by the library. You load a large JSON file and then call json_decode(), and those two steps consume far more memory than FastExcelWriter itself does.
In the code below you can change the $maxRow variable and measure memory consumption. You will see that it stays around 4 MB even for large $maxRow values.
<?php
require_once __DIR__ . '/vendor/autoload.php';
require_once __DIR__ . '/src/autoload.php';
$maxCol = 150;
$maxRow = 100000;
$rowData = array_fill(0, $maxCol, 'text');
$outFileName = __DIR__ . '/' . $maxCol . '-' . (int)($maxRow / 1000) . 'k.xlsx';
$timer = microtime(true);
$excel = \avadim\FastExcelWriter\Excel::create(['Sheet1']);
$sheet = $excel->sheet();
for($rowNum = 1; $rowNum <= $maxRow; $rowNum++) {
$sheet->writeRow($rowData);
}
$excel->save($outFileName);
$time = round(microtime(true) - $timer, 3);
$memory = round(memory_get_peak_usage(true) / (1024 * 1024), 2);
echo '<b>', basename(__FILE__, '.php'), "</b><br>\n<br>\n";
echo 'out filename: ', $outFileName, "<br>\n";
echo 'elapsed time: ', $time, ' sec', "<br>\n";
echo 'memory peak usage: ', $memory, " MB<br>\n";
echo 'total rows: ', $maxRow, "<br>\n";
echo 'speed: ', round($maxRow / $time, 3), " rows/sec<br>\n";
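Since the heavy memory cost comes from decoding the whole JSON file at once, one way to keep the peak flat is to stream the source data row by row instead. This is a minimal sketch, not part of FastExcelWriter's API: it assumes the dataset can be exported as NDJSON (one JSON array per line), and the readRows() helper and data.ndjson filename are hypothetical names introduced for illustration.

```php
<?php
require_once __DIR__ . '/vendor/autoload.php';

// Hypothetical helper: yields one decoded row at a time, so only a
// single row ever sits in memory instead of the whole dataset.
function readRows(string $file): \Generator
{
    $fh = fopen($file, 'rb');
    while (($line = fgets($fh)) !== false) {
        $line = trim($line);
        if ($line !== '') {
            yield json_decode($line, true); // decode one row, not the whole file
        }
    }
    fclose($fh);
}

// Guarded so the sketch still runs where the library is not installed.
if (class_exists(\avadim\FastExcelWriter\Excel::class)) {
    $excel = \avadim\FastExcelWriter\Excel::create(['Sheet1']);
    $sheet = $excel->sheet();
    foreach (readRows(__DIR__ . '/data.ndjson') as $row) {
        $sheet->writeRow($row); // rows are flushed to disk as you go
    }
    $excel->save(__DIR__ . '/big.xlsx');
}
```

With this shape, memory usage is bounded by one row plus the library's own buffers, regardless of how many rows the file contains.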
Hi,
I have been trying to write an .xlsx file from a really large dataset of 350K rows and 150 columns. How would you approach writing such a large file? Please find the dry-run results and the code below.
For 10K records: elapsed time 10.374 sec, memory peak usage 76 MB
For 50K records: elapsed time 56.563 sec, memory peak usage 364 MB
For 75K records: elapsed time 97.748 sec, memory peak usage 546 MB
![image](https://github.com/aVadim483/fast-excel-writer/assets/102710826/249c7e79-6bb7-4ddf-89f1-fcc7f628bc2e)