koracle opened this issue 8 years ago
Congrats on issue #100
Also, try reducing the page size to 100 and let me know how you get on.
It seems it doesn't depend on the fetchPage size but on the datastore size.
Some testing with my 48,000-item datastore:

- Pagesize=100: failed at iteration 3 - read 200 items - 43.9 kB in 2 minutes
- Pagesize=50: failed at iteration 2 - read 50 items - 11.3 kB in 1 minute
- Pagesize=10: failed at iteration 1 - read nothing
- Pagesize=20: failed at iteration 3 - read 41 items - 9.0 kB in 2 minutes
This goes extremely slow for entity 'item' - even the local Google App Engine Datastore Viewer takes a long time to show each page of 20 items. Also, the bottom line reads "Results 1 - 20 of 1000", though there are 48,000 items loaded.
Testing with different datastore sizes:
- Datastore with 998 items, Pagesize=20: read 998 items - 217.7 kB in approx. 1 minute
- Datastore with 998 items, Pagesize=500: read 998 items - 217.7 kB in approx. 10 secs
- Datastore with 1,500 items, Pagesize=500: read 1,500 items - 323.3 kB in approx. 10 secs
- Datastore with 5,000 items, Pagesize=500: read 5,000 items - 1.0 MB in approx. 1 min 20 secs
- Datastore with 15,000 items, Pagesize=500: read 15,000 items - 3.1 MB in approx. 17 min
- Datastore with 30,000 items, Pagesize=500: failed at iteration 36 - read 17,701 items - 3.9 MB in 39 minutes
Perhaps the Google App Engine SDK is not prepared to cope? Anyway, thank you for your time. I like your library.
I suspect you are right; it may be the local Datastore implementation.
Have you tried using the remote Gateway?
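For reference, swapping gateways in php-gds is done when constructing the Store. The snippet below is a sketch based on my reading of the php-gds README - the class and constructor names (`\GDS\Store`, `\GDS\Gateway\RESTv1`) and the project id are assumptions to verify against the version you have installed:

```php
<?php
// Sketch only - names per the php-gds README; verify against your
// installed version. Constructing a Store with no explicit gateway
// uses the local ProtoBuf gateway, which is where the failures above
// are happening.

// Default: ProtoBuf gateway against the local dev Datastore
$obj_store = new \GDS\Store('Item');

// Alternative: talk to the live Datastore over a remote gateway,
// bypassing the local SDK Datastore implementation entirely
$obj_gateway = new \GDS\Gateway\RESTv1('your-project-id'); // hypothetical project id
$obj_remote_store = new \GDS\Store('Item', $obj_gateway);
```

If the same export succeeds through the remote gateway, that would point the finger at the local Datastore implementation rather than the library.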
On my local machine (development environment), I'm using $obj_store->fetchPage(500) to try to export all entities of a kind. With a few entities it works nicely, but with many (now about 48,000) it fails with:
Warning: file_get_contents(http://localhost:57325): failed to open stream: HTTP request failed! in /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/RemoteApiProxy.php on line 79
Fatal error: Uncaught exception 'google\net\ProtocolBufferDecodeError' with message 'Not initialized: batch' in /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/proto/ProtocolMessage.php:121
Stack trace:
#0 /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/proto/ProtocolMessage.php(88): google\net\ProtocolMessage->mergeFromString('')
#1 /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/RemoteApiProxy.php(96): google\net\ProtocolMessage->parseFromString('')
#2 /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/ApiProxy.php(40): google\appengine\runtime\RemoteApiProxy->makeSyncCall('datastore_v4', 'RunQuery', Object(google\appengine\datastore\v4\RunQueryRequest), Object(google\appengine\datastore\v4\RunQueryResponse), 60)
#3 /home/victor/ADESK/apps/gesmo2/vendor/tomwalder/php-gds/src/GDS/Gateway/ProtoBuf.php(213): google\appengine\runtime\ApiProxy::makeSyncCall('datastore_v4', 'RunQuery', Object(google\appengine\datastore\v4\RunQueryRequest), Object in /home/victor/ADESK/google_appengine/php/sdk/google/appengine/runtime/proto/ProtocolMessage.php on line 121
My code goes like this:
function gmGDS_itemBackupcsv() {
}
For 7 items it works perfectly; for 48,000 it breaks - the .csv file is created, but it is empty.
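For comparison, here is a minimal sketch of the paging export pattern. The Datastore store is replaced by a plain-array stub (`StubStore` is purely hypothetical) so the loop itself is runnable; in the real function, the inner call would be `$obj_store->fetchPage(500)`, which keeps its own cursor between calls:

```php
<?php
// Sketch of a chunked CSV export. StubStore is a stand-in for the real
// \GDS\Store: it pages through a fixed array the way fetchPage() pages
// through query results, returning an empty array when exhausted.
class StubStore {
    private $rows;
    private $offset = 0;
    public function __construct(array $rows) { $this->rows = $rows; }
    public function fetchPage($size) {
        $page = array_slice($this->rows, $this->offset, $size);
        $this->offset += count($page);
        return $page;
    }
}

function exportToCsv($store, $filename, $page_size = 500) {
    $fp = fopen($filename, 'w');
    $total = 0;
    // Keep fetching pages until an empty page comes back. Writing each
    // page as it arrives keeps memory flat instead of buffering all
    // 48,000 rows before the first write.
    while (count($page = $store->fetchPage($page_size)) > 0) {
        foreach ($page as $row) {
            fputcsv($fp, $row);
            $total++;
        }
        fflush($fp); // earlier rows reach disk even if a later page fails
    }
    fclose($fp);
    return $total;
}

$store = new StubStore([
    ['sku1', 'Widget', 10],
    ['sku2', 'Gadget', 20],
    ['sku3', 'Gizmo', 30],
]);
$count = exportToCsv($store, '/tmp/items.csv', 2); // 2-row pages: 2 + 1 + empty
```

With this shape, an empty output file after a mid-loop failure would suggest buffered rows were lost; flushing per page at least preserves the pages read before the gateway error.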
Thank you very much for your library, it makes life easier. v.