stevukas / phpliteadmin

Automatically exported from code.google.com/p/phpliteadmin

Cannot display more than ~60000 records #209

Open GoogleCodeExporter opened 9 years ago

GoogleCodeExporter commented 9 years ago
What steps will reproduce the problem?
1. Go to any table with more than circa 60000 records,
example: phpliteadmin.php?table=BIGdata&action=table_sql

2. Execute a query whose result set is larger than circa 60000 records,
example: SELECT * FROM "BIGdata" LIMIT 70000

What is the expected output?
A full result set,
or an error saying there are too many records to display,
or some kind of auto-limit showing the number of records that can be displayed.

What do you see instead?
An empty result set (screenshot attached).

What version of the product are you using? On what operating system?
phpLiteAdmin v1.9.4.1, running on Ubuntu 12.04.2 LTS (GNU/Linux
3.5.0-26-generic x86_64)

Which database extension (PDO/SQLiteDatabase/SQLiteDatabase3 - see the
Database structure tab in phpLiteAdmin)?
 SQLite version: 3.7.9
 SQLite extension: PDO
 PHP version: 5.3.10-1ubuntu3.6

Please provide any additional information below.

Original issue reported on code.google.com by maciej.n...@gmail.com on 5 Apr 2013 at 2:30


GoogleCodeExporter commented 9 years ago
Thanks for your report.

Does the same problem occur when you use the "browse" tab?

| What is the expected output?
| A full result set
Well, your browser would probably get into trouble if we gave it a
70,000-line table.
I think we either need to introduce pagination here (which means adjusting or
adding the LIMIT clause), or we should simply check how many rows get
returned and say "no, use a smaller limit".
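
Roughly, the pagination idea could look like this (an untested sketch;
$pdo, the page parameter, and PAGE_SIZE are illustrative names, not
phpLiteAdmin's actual code):

<?php
// Sketch only: rewrite the query with LIMIT ... OFFSET ... so each
// request fetches one page instead of the whole table.
const PAGE_SIZE = 100;

$pdo  = new PDO('sqlite:data.sqlite');
$page = isset($_GET['page']) ? max(0, (int)$_GET['page']) : 0;

$query = 'SELECT * FROM "BIGdata" LIMIT ' . PAGE_SIZE
       . ' OFFSET ' . ($page * PAGE_SIZE);

foreach ($pdo->query($query) as $row) {
    // render $row ...
}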

But besides the browser problem, we can still improve this a lot: we
currently fetch the whole result set into memory ($result =
$db->selectArray($query[$i], "assoc");). This is what makes your page appear
blank: PHP runs out of memory and ends execution.
If we iterated through the result set instead, we could give you the full
table. But as I said, your browser will likely get into trouble with a table
as huge as that.
IE6 would not display the table until it is fully loaded, which means you
would probably wait for ages until Windows says "Internet Explorer is not
responding".
I have not tried it, though.
It should help to start a new <table> every few hundred rows to work around
problems like this.
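
A rough sketch of that combination, streaming rows one at a time and
splitting the output into several tables (the chunk size, $pdo, and the
markup are illustrative, not the real phpLiteAdmin rendering code):

<?php
// Sketch: fetch row by row instead of fetchAll(), and start a fresh
// <table> every $chunk rows so the browser can render incrementally.
$pdo   = new PDO('sqlite:data.sqlite');
$stmt  = $pdo->query('SELECT * FROM "BIGdata"');
$chunk = 500;
$i     = 0;

echo '<table>';
while ($row = $stmt->fetch(PDO::FETCH_ASSOC)) {
    if ($i > 0 && $i % $chunk === 0) {
        echo '</table><table>'; // close the current table, open a new one
        flush();                // push what we have to the browser
    }
    echo '<tr><td>'
       . implode('</td><td>', array_map('htmlspecialchars', $row))
       . '</td></tr>';
    $i++;
}
echo '</table>';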

Original comment by crazy4ch...@gmail.com on 5 Apr 2013 at 4:05

GoogleCodeExporter commented 9 years ago
You are welcome. Great product, by the way.

The same issue occurs when using the "browse" tab; there it emerges if I go
for somewhere around 50,000 records.

Original comment by maciej.n...@gmail.com on 8 Apr 2013 at 11:20

GoogleCodeExporter commented 9 years ago
I got the same problem here: I inserted 88K rows and did not get any result;
the site stopped responding and left the DB with only half the data
inserted. I finally got the data into the DB using code on the local server.
Maybe it's a PHP timeout problem?

Original comment by nanozer...@gmail.com on 30 Aug 2013 at 7:47

GoogleCodeExporter commented 9 years ago
How did you insert the 88K rows? Using an import (csv/sql)? Using the
SQL tab? Using the insert feature (probably not ;) )?
It could be a PHP timeout or an out-of-memory condition.
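
One way to tell the two apart, sketched as a standalone snippet (the log
destination and wording checks are illustrative): register a shutdown
function and look at the last fatal error.

<?php
// Sketch: distinguish a timeout from an out-of-memory abort by logging
// the last fatal error and the peak memory usage on shutdown.
register_shutdown_function(function () {
    $err = error_get_last();
    if ($err !== null && $err['type'] === E_ERROR) {
        // "Maximum execution time ... exceeded" -> timeout
        // "Allowed memory size ... exhausted"   -> out of memory
        error_log($err['message'] . ' (peak memory: '
            . memory_get_peak_usage(true) . ' bytes)');
    }
});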

Original comment by crazy4ch...@gmail.com on 30 Aug 2013 at 9:05

GoogleCodeExporter commented 9 years ago
I guess this issue has the same root cause as issue #78.
We often load full result sets into memory, which won't work with big result
sets.
We should switch to an iterator-based approach to avoid these problems.
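
For illustration, an iterator-based counterpart to the fetch-everything
pattern could look roughly like this (selectArray is the existing helper
quoted above; selectEach is a hypothetical name, not a function that exists
in phpLiteAdmin):

<?php
// Hypothetical iterator-based alternative to selectArray(): instead of
// returning the whole result set as an array, hand the caller a statement
// to fetch from row by row, keeping memory usage constant.
function selectEach(PDO $pdo, $query)
{
    return $pdo->query($query); // a PDOStatement is traversable with foreach
}

// Usage: memory stays flat no matter how many rows the query returns.
$pdo = new PDO('sqlite:data.sqlite');
foreach (selectEach($pdo, 'SELECT * FROM "BIGdata"') as $row) {
    // render $row ...
}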

Original comment by crazy4ch...@gmail.com on 15 Jan 2014 at 10:09