yourilima / php-excel-reader

Automatically exported from code.google.com/p/php-excel-reader

Memory exhaustion on $streamData variable #46

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
I'm getting a memory error in the getWorkBook function, on line ~238:

$streamData .= substr($rootdata, $pos, SMALL_BLOCK_SIZE);
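
For what it's worth, here's a small standalone illustration of why that line hurts. This is not the library's code, just the same pattern: the whole stream gets appended into one PHP string, so peak memory grows with the size of each .xls file on top of whatever the server itself is using.

<?php
// Standalone illustration of the accumulation pattern (not the reader's code):
// appending SMALL_BLOCK_SIZE chunks builds the entire stream in one string.
define('SMALL_BLOCK_SIZE', 0x40);              // 64 bytes, the usual OLE small-block size

$rootdata   = str_repeat('x', 1024 * 1024);    // stand-in for the root stream (1 MB)
$streamData = '';

for ($pos = 0; $pos < strlen($rootdata); $pos += SMALL_BLOCK_SIZE) {
    $streamData .= substr($rootdata, $pos, SMALL_BLOCK_SIZE);
}

echo 'peak memory: ' . memory_get_peak_usage() . " bytes\n";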

I've already set the max execution time to 0 (unlimited), and the memory limit in my 
.ini file is at 200M. I've also included the error reporting line for E_ALL and 
E_NOTICE. For context: I installed AppServ on a Dell GX620 with a 3.2GHz P4 and 
1 GB of RAM on WinXP, so the AppServ stack and the system aren't the greatest, but 
that's all I had to work with to access the local Excel files. I'm looping through a 
readdir of about 2000 Excel files (160 MB total) in order to extract a few bits of 
info from each. My script gets through about 1700 .xls files and extracts the info I 
need, but it's not making it past the last 200-300. I'm really at a loss, because the 
script works great otherwise. I've already changed the GetInt4d and _GetInt4d 
("magical"?) functions, but I'm on a 32-bit architecture so the results have been the 
same. I've also been unsetting variables left and right.
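
To make that concrete, here's a minimal sketch of the kind of loop I mean. The directory, file-name check, and cell positions are placeholders, and I'm assuming the include is named excel_reader2.php as in the package I downloaded:

<?php
require_once 'excel_reader2.php';         // assumed include name from the download

$dir = 'C:/excel_files';                  // placeholder directory
if ($handle = opendir($dir)) {
    while (false !== ($file = readdir($handle))) {
        if (strtolower(substr($file, -4)) !== '.xls') {
            continue;                     // skip anything that isn't an .xls
        }
        $data  = new Spreadsheet_Excel_Reader($dir . '/' . $file);
        $value = $data->val(2, 3);        // placeholder: row 2, column 3, first sheet
        // ... write $value (and whatever else) to MySQL here ...
        unset($data);                     // drop the parsed workbook before the next file
        if (function_exists('gc_collect_cycles')) {
            gc_collect_cycles();          // PHP 5.3+ only: collect reference cycles
        }
    }
    closedir($handle);
}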

I know there may be some problems due to the crappy machine or the quick AppServ 
AMP install, but everything seems to be working and all the paths look good to me.

Any suggestions on what I can try to get my script to run to completion? All I need 
is the data, so extended cell info is turned off (false); I hardcoded that in the 
function because I think there were problems with the call always being on. But I 
suspect read() still wants to get the additional cell info, which may be pushing the 
data chunks or the memory limit.
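
For reference, rather than hardcoding it in the class, my understanding is the same thing can be done through the second constructor argument, something like:

<?php
require_once 'excel_reader2.php';   // same assumption as above about the include name

// Assumption on my part: the second constructor argument controls whether the
// extended cell info is stored, so passing false should skip it without class edits.
$data = new Spreadsheet_Excel_Reader('some_file.xls', false);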

Thanks for any ideas to try. Otherwise, this is a very nice update to the SourceForge 
version; it works for me at 100% in many other cases, but this one time I'm only 
getting about 90% completion.

Original issue reported on code.google.com by charlier...@gmail.com on 9 Oct 2009 at 9:20

GoogleCodeExporter commented 9 years ago
After tons of other memory workups, I came up with the following. I originally ran 
the SourceForge class, and everyone knows by now that it has two main issues: (1) a 
corrupt example .xls and (2) a wrong include name in reader.php. Any programmer 
should pick those up quickly. I fixed those without reading any forums or help, and 
I had it up and running, dumping tons of Excel data into MySQL. Of course, I was on 
a brand-new, powerful quad-proc system with 3+ GB of RAM running Vista, Apache, PHP, 
and MySQL. So when I had to work on a refurbished piece of crap, I installed AppServ, 
thinking I could reuse the PHP I wrote for the other system to get the local Excel 
files into MySQL. The original OLEread ran into tons of problems. Then I found Mr. 
Kruse's updated class and got better performance right away, plus I could already 
see lots of better code in the class with the extra parameter calls.

However, as stated above, my Excel folder was only getting to about 90% completion 
before a FATAL ERROR on memory allocation. I did everything mentioned above, plus 
changed memory allocation in Windows itself (taking memory away from programs and 
giving it to cache, and vice-versa). I was just about to install memcache (PECL, 
xcache, etc.) today when I noticed my AppServ had two instances of httpd running at 
about 15k memory usage apiece. So, to the point: if you're having memory problems 
with this PHP class, your problem may lie with Apache, which is a huge memory hog.

I downloaded lighttpd for Windows, a small executable (although in Hungarian; I 
couldn't find an English Windows exe), and was up and running a WLMP stack in about 
60 seconds. I made a phpMyAdmin folder in my htdocs folder and made a shortcut to its 
index.php (of course the username is root and there's no password to begin with). I 
copied over my files and ran my script to completion with about 27k less memory 
usage (two httpd instances at 15k apiece = 30k, while lighttpd runs at about 3k-5k). 
Again, excellent PHP class; it's just problematic on systems that can't support 
Apache while accessing tons of .xls files or rows. Hope this helps someone who's 
getting memory usage errors of any kind. Focker out.
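
One more thing that helped me while chasing this: logging PHP's own memory counters after each file, so you can tell whether the growth is inside the script or in the server processes. A rough, generic sketch (nothing specific to this class; the label is a placeholder):

<?php
// Rough sketch: call this after each file inside the readdir loop to see
// whether memory keeps climbing between files or resets as expected.
function log_memory($label)
{
    error_log(sprintf(
        '%s: current=%d bytes, peak=%d bytes',
        $label,
        memory_get_usage(true),        // memory currently allocated to PHP
        memory_get_peak_usage(true)    // highest allocation so far in this request
    ));
}

log_memory('after some_file.xls');     // placeholder label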

Original comment by charlier...@gmail.com on 12 Oct 2009 at 9:27