holicovrazvan1 / parsecsv-for-php

Automatically exported from code.google.com/p/parsecsv-for-php
MIT License

Auto function cannot handle files larger than 6 MB. #20

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
Steps to reproduce the problem:

1. $filename = "res/autors.csv"; // Large file with 19 columns and 6,900
records, separated by semicolons.
2. $csv = new parseCSV();
3. $csv->auto($filename);
4. print_r($csv);

Those steps will produce: 

Fatal error: Allowed memory size of 8388608 bytes exhausted (tried to allocate 
780 bytes) in /Applications/xampp/xamppfiles/htdocs/user143/Cactus/parseCSV.php 
on line 454

I understand that I have a memory limit, but I am not going to increase it,
because that would conflict with my policy and also raises security concerns.
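For context, a rough self-contained sketch of why the 8 MB limit is so easy to exhaust (this is illustrative code, not parseCSV internals): the parser holds the raw file contents in memory and then builds a nested array of rows, and PHP arrays carry substantial per-element overhead on top of the string data itself.

```php
<?php
// Illustration only: raw string data plus a nested row array of the
// same content. Peak memory ends up well above the raw byte count
// because each PHP array element has bookkeeping overhead.
$raw = str_repeat("col1;col2;col3\n", 1000);   // stand-in for file data

$rows = array();
foreach (explode("\n", trim($raw)) as $line) {
    $rows[] = explode(';', $line);             // per-row array overhead
}

printf("raw bytes: %d\n", strlen($raw));
printf("peak memory: %d bytes\n", memory_get_peak_usage());
```

With a real 6 MB file and 6,900 rows of 19 fields each, the same effect multiplies the footprint several times over, which is why the 8,388,608-byte limit is hit.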

Expected output:
The parsed CSV data should be printed.

Version: parseCSV v0.4.3 beta
Operating system: Windows 7 64 bit, with Firefox 3.6.15

Additional data:
You could try reproducing it with dummy data of a similar size.

Original issue reported on code.google.com by rheza.satria.ta on 21 Mar 2011 at 11:44

GoogleCodeExporter commented 9 years ago
This also fails on a set delimiter. Is there no solution for this?

Original comment by da.g...@gmail.com on 22 Sep 2012 at 9:02

GoogleCodeExporter commented 9 years ago
Just change the _rfile function to the following:

    /**
     * Read local file
     * @param   string  $file  Local filename
     * @return  string|false   Data from file, or false on failure
     */
    function _rfile ($file = null) {
        if ( is_readable($file) ) {
            if ( !($fh = fopen($file, 'r')) ) return false;
            $data = '';
            // Read in small fixed-size chunks rather than all at once.
            while (!feof($fh)) {
                $data .= fread($fh, 1000);
            }
            fclose($fh);
            return $data;
        }
        return false;
    }

This reads the file into memory in small chunks rather than in a single call,
which avoids the large one-shot allocation that breaks the server on large files.
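If holding the whole file in memory is not acceptable at all, a bounded-memory alternative is to stream rows one at a time with PHP's built-in fgetcsv() instead of buffering everything. A minimal sketch (the count_rows helper and the temp-file demo are illustrative, not part of parseCSV; the semicolon delimiter mirrors the report above):

```php
<?php
// Stream a semicolon-delimited CSV row by row; memory use stays at
// roughly one row regardless of file size.
function count_rows($file, $delimiter = ';') {
    $fh = fopen($file, 'r');
    if ($fh === false) return false;
    $rows = 0;
    while (($row = fgetcsv($fh, 0, $delimiter)) !== false) {
        $rows++;            // process $row here instead of storing it
    }
    fclose($fh);
    return $rows;
}

// Demo with a small temporary file standing in for res/autors.csv.
$tmp = tempnam(sys_get_temp_dir(), 'csv');
file_put_contents($tmp, "a;b;c\n1;2;3\n4;5;6\n");
echo count_rows($tmp), "\n"; // prints 3
unlink($tmp);
```

The trade-off is that you lose parseCSV's auto-detection of the delimiter, so it has to be passed explicitly.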

Original comment by ggirt...@gmail.com on 12 Dec 2012 at 10:12