SpartnerNL / Laravel-Excel

🚀 Supercharged Excel exports and imports in Laravel
https://laravel-excel.com
MIT License

Exporting thousands of rows #466

Closed: tlshaheen closed this issue 9 years ago

tlshaheen commented 9 years ago

I'm using v1.3.7 with Laravel 4.

I'm attempting to export 5,000 - 50,000 rows with 13 columns to an XLS or XLSX file. The data is read from our database.

Once I get over 4k rows, I hit PHP's memory limit. I know I can raise the memory limit, but I'm hoping there are other ways to achieve my goal. 4k seems pretty low, which suggests I may be doing something wrong here.

I see there was some discussion back in 2014 in #131 - has there been any improvement or work on the project since then? Is there something I can do with chunks or streaming?

Right now I'm pulling 400 records out of the DB, opening the Excel file, adding the 400 records, and kicking off a new job (so a new page load and a fresh memory limit) that gets the next 400 records, and so on.

Here's some example code:

$excel = \Excel::load($savedfilepath, function($file) use ($title, $rows, $headers, $startrow) {
    $file->sheet($title, function($sheet) use ($rows, $headers, $startrow) {
        // fromArray($source, $nullValue, $startCell, $strictNullComparison, $headingGeneration)
        $sheet->fromArray($rows, null, 'A' . $startrow, true, false);
        if ($startrow === 1) {
            $sheet->prependRow(array_keys($headers));
            $sheet->setFreeze('A2');
        }
    });
});
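
For completeness, a hedged sketch of how that job chaining might look with Laravel 4's queue API. The class name ExportChunkJob, the records table, and the 400-row batch size are illustrative assumptions, not from the original code:

// Hypothetical Laravel 4 job: writes one 400-row batch, then queues the next.
class ExportChunkJob {
    public function fire($job, $data)
    {
        $startrow = $data['startrow'];

        // Illustrative query; the real source table and ordering will differ.
        $rows = DB::table('records')->orderBy('id')
            ->skip($startrow - 1)->take(400)->get();

        // ... run the \Excel::load() block above, writing $rows at $startrow ...

        if (count($rows) === 400) {
            // A full batch means more rows may remain: queue the next batch.
            \Queue::push('ExportChunkJob', array('startrow' => $startrow + 400));
        }

        $job->delete();
    }
}

Each batch runs in its own worker process, so memory is reclaimed between batches, which is the point of the workaround.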
patrickbrouwers commented 9 years ago

Sorry, no improvements have been made on this front. It's a limitation of the parent package, PHPExcel.

The workaround you built is the same one we use.

SteveEdson commented 8 years ago

If this is still relevant to anyone: what I did was load my data inside the \Excel::load function. In there, I chunked my query and appended the records to the sheet row by row.

$rowPointer = 0;
$query->chunk(100, function($records) use ($sheet, &$rowPointer) {
    foreach ($records as $record) {
        $this->writeRecordToSheet($rowPointer, $record, $sheet);
        $rowPointer++;
    }
});

I've been able to generate files with multiple sheets, each containing 15,000 - 40,000 records.

lukedanielson commented 8 years ago

Hey Steve, could you show me your whole method, from load to save? Trying to do something similar but am having a few issues. Thanks!

tlshaheen commented 8 years ago

@SteveEdson Echoing what @lukedanielson said, could you please show your whole method? It would help out viewers (and me!)

SteveEdson commented 8 years ago

It's important to note that I'm using PHP 7, which seriously helps with memory usage. That said, here's the basic code, which runs inside a Job processed by the queue worker.

// Create a new empty Excel document
$doc = new PHPExcel();

// Get the default active first sheet
$sheet1 = $doc->getActiveSheet();

// Add the user information to the sheet
$this->createUsersSheet($countryId, $sheet1);

// Repeat for as many sheets as necessary

// Return the document
return $doc;
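
The snippet returns $doc without showing how it is written to disk. A minimal sketch of that last step, using PHPExcel's standard writer API (the output path is an assumption):

// Persist the in-memory document as an .xlsx file.
$writer = PHPExcel_IOFactory::createWriter($doc, 'Excel2007');
$writer->save(storage_path('exports/users.xlsx'));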

And the createUsersSheet etc. functions look like this:

function createUsersSheet($countryId, $sheet) {
        $sheet->setTitle('Users');

        // Add headings
        $sheet->getCell('A1')->setValueExplicit('ID');
        $sheet->getCell('B1')->setValue('Name');
        $sheet->getCell('C1')->setValue('Email');
        $sheet->getCell('D1')->setValue('Telephone Number');
        $sheet->getCell('E1')->setValue('SMS Number');

        // Start at row 2
        $rowPointer = 2;

        // Get all users
        $userQuery = Users::where('vendors.country_id', '=', $countryId);

        $totalRecords = $userQuery->count();

        $userQuery->chunk(100, function($users) use($sheet, &$rowPointer, $totalRecords) {
            // Iterate over users
            foreach ($users as $index => $user) {
                $this->report("Writing User to row " . $rowPointer . "/" . $totalRecords);
                $this->writeUserToSheet($rowPointer, $user, $sheet);

                // Move on to the next row
                $rowPointer++;
            }
        });
}

And finally

function writeUserToSheet($index, User $user, \PHPExcel_Worksheet $sheet) {
        $sheet->setCellValue('A' . $index, $user->id);
        $sheet->setCellValue('B' . $index, $user->name);
        $sheet->setCellValue('C' . $index, $user->email);
        $sheet->setCellValue('D' . $index, $user->telephone_number);
        $sheet->setCellValue('E' . $index, $user->sms_number);
        // etc
}
lukedanielson commented 8 years ago

Very helpful, thank you Steve!

lemetice commented 8 years ago

Hi @SteveEdson, I am currently exporting 10,000 records with set_time_limit(0), but it takes very long and most of the time fails without exporting. I'd like to know if you tried your solution on Laravel 5.0 running PHP 5.5.

SteveEdson commented 8 years ago

Unfortunately I only tried it with Laravel 5.2 on both PHP 7.0 and PHP 5.6, sorry.

mohitpawar10 commented 7 years ago

But the system will fail once you try to pull a larger dataset into memory. @SteveEdson, take $userQuery = Users::where('vendors.country_id', '=', $countryId); if I have 1 million rows, the system will fail at that point, unless we increase the memory limit in php.ini.
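
One mitigation worth noting: Eloquent's chunk() already keeps the query side bounded, so the main memory consumer is PHPExcel's in-memory cell model. PHPExcel ships a cell-caching option that trades speed for memory; a minimal sketch (the 32MB threshold is an illustrative value):

// Must be configured before the PHPExcel object is created.
// Cells beyond the memory threshold are swapped out to php://temp.
$cacheMethod   = PHPExcel_CachedObjectStorageFactory::cache_to_phpTemp;
$cacheSettings = array('memoryCacheSize' => '32MB');
PHPExcel_Settings::setCacheStorageMethod($cacheMethod, $cacheSettings);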

ravigopani commented 7 years ago

Hi @SteveEdson, can you please help me with the report() function you used inside the foreach loop? I can't figure out where it comes from. Also, is this code useful for reducing the time of an Excel export with thousands of records?

darwinquintana44 commented 6 years ago

Hello everyone, does anyone know of another package that doesn't have this problem? For example, DomPdf has this same problem, but the snappy package solves it.

fastreading commented 6 years ago

If you only need imports, you can use this package; it's still in development but it works, and it returns an array with all the contents of the file: https://github.com/fastreading/excel