Thanks for your report.
Does the same problem occur when you use the "browse" tab?
| What is the expected output?
| A full result set
Well, your browser would probably get into trouble if we gave it a 70,000-row
table.
I think we either need to introduce pagination here (which means adjusting or
adding a LIMIT clause), or we should simply check how many rows get returned
and say "no, use a smaller limit".
But besides the browser problem, we can still improve this a lot: we currently
fetch the whole result set into memory ($result = $db->selectArray($query[$i],
"assoc");). This is what makes your page appear blank: PHP runs out of memory
and terminates execution.
If we iterated through the result set instead, we could give you the full
table. But as I said, your browser will likely get into trouble with a table
that huge.
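A minimal sketch of that iterating approach, using PHP's stock SQLite3 class rather than phpLiteAdmin's own wrapper (file and table names are made up):

```php
<?php
// Stream rows one at a time instead of buffering the whole result set;
// memory usage stays bounded regardless of table size.
$db     = new SQLite3('database.sqlite');
$result = $db->query('SELECT * FROM mytable');

echo '<table>';
while ($row = $result->fetchArray(SQLITE3_ASSOC)) {
    echo '<tr>';
    foreach ($row as $value) {
        echo '<td>' . htmlspecialchars((string) $value) . '</td>';
    }
    echo '</tr>';
}
echo '</table>';
```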
IE6 would not display the table until it is fully loaded, which means you
would probably wait for ages until Windows says "Internet Explorer is not
responding". I have not tried it, though.
It should help to start a new <table> every few hundred rows to work around
problems like this.
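Roughly like this (illustrative only, again using the stock SQLite3 class):

```php
<?php
// Close and reopen the <table> every $chunk rows so the browser can render
// each finished chunk immediately instead of waiting for one huge table.
$db     = new SQLite3('database.sqlite'); // illustrative file name
$result = $db->query('SELECT * FROM mytable');
$chunk  = 500;
$i      = 0;

echo '<table>';
while ($row = $result->fetchArray(SQLITE3_ASSOC)) {
    if ($i > 0 && $i % $chunk === 0) {
        echo '</table><table>';
        flush(); // push the completed chunk out to the client
    }
    echo '<tr>';
    foreach ($row as $value) {
        echo '<td>' . htmlspecialchars((string) $value) . '</td>';
    }
    echo '</tr>';
    $i++;
}
echo '</table>';
```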
Original comment by crazy4ch...@gmail.com
on 5 Apr 2013 at 4:05
You are welcome. Great product, by the way.
The same issue occurs when using the "browse" tab; it emerges here if I go for
somewhere around 50,000 rows.
Original comment by maciej.n...@gmail.com
on 8 Apr 2013 at 11:20
I got the same problem here: inserting 88K rows, I didn't get any result; the
site stopped and I ended up with a DB with only half the data inserted. I
finally got the data into the DB using code on a local server. Maybe it's a
PHP timeout problem?
Original comment by nanozer...@gmail.com
on 30 Aug 2013 at 7:47
How did you insert the 88K rows? Using some import (CSV/SQL)? Using the
SQL tab? Using the insert feature (probably not ;) )?
It could be a PHP timeout or an out-of-memory condition.
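Both are standard php.ini limits and easy to check; a quick sketch (the defaults shown are PHP's usual ones, not necessarily this server's):

```php
<?php
// Inspect the two limits that typically cut off large imports.
echo ini_get('max_execution_time'); // commonly 30 (seconds) for web requests
echo ini_get('memory_limit');       // e.g. "128M"

// For a one-off large import, one could raise them temporarily:
set_time_limit(300);             // allow up to 5 minutes
ini_set('memory_limit', '512M'); // assumes the host has the RAM to spare
```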
Original comment by crazy4ch...@gmail.com
on 30 Aug 2013 at 9:05
I guess this issue has the same root cause as issue #78.
We often load full result sets into memory, which won't work when they are
big.
We should switch to an iterator-based approach to avoid these problems.
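One way such an iterator-based approach could look, sketched with a PHP generator around the stock SQLite3 API (selectIterator is a hypothetical name, not an existing phpLiteAdmin method):

```php
<?php
// Yield rows one at a time instead of returning a fully materialized
// array the way selectArray() does.
function selectIterator(SQLite3 $db, string $sql): Generator
{
    $result = $db->query($sql);
    while ($row = $result->fetchArray(SQLITE3_ASSOC)) {
        yield $row; // only one row is held in memory at a time
    }
    $result->finalize();
}

$db = new SQLite3('database.sqlite'); // illustrative file name
foreach (selectIterator($db, 'SELECT * FROM mytable') as $row) {
    // render the row here
}
```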
Original comment by crazy4ch...@gmail.com
on 15 Jan 2014 at 10:09
Original issue reported on code.google.com by
maciej.n...@gmail.com
on 5 Apr 2013 at 2:30