Closed — jacoballred closed this issue 10 years ago
Nice to know that someone's using this :)
I hadn't thought of the memory issue; I had only considered the datastore write quota running out. But now that I think about it, of course memory is going to be a problem if people have a lot of photos: the free instances on App Engine don't have much memory, and I read the entire zip file into memory.
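A server-side alternative would be to stream the zip entries one at a time instead of decompressing the whole archive at once. A minimal sketch, assuming the upload is available as a path or file-like object (the names `import_zip` and `handle_member` are illustrative, not the project's actual API):

```python
import io
import zipfile

def import_zip(zip_source, handle_member):
    # zip_source may be a path or a file-like object; zipfile accepts both.
    with zipfile.ZipFile(zip_source) as zf:
        for info in zf.infolist():
            # zf.open() streams a single member, so only that entry's
            # bytes need to fit in memory at any one time.
            with zf.open(info) as member:
                handle_member(info.filename, member.read())
```

Peak memory is then bounded by the largest single photo rather than the whole archive.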
If I were doing the import/export again, I would probably read and write the zip files on the client side with the HTML5 FileReader API and a zip library, then send the entries to the server one by one. However, since so far you're the only one using this besides me, and you seem to be finished with the import, I'll wait and see if someone else starts using this and runs into problems with large files.
Dropbox export sounds like a really good feature, though, and something I could use myself. I'll look into it soon and see what I can do.
Well, I've added the backup-to-Dropbox feature. You can get it by pulling the latest version of the code and publishing it, then following the instructions at http://einaregilsson.com/mylife-backup-to-dropbox/ to get it up and running. Let me know if it doesn't work :)
Thanks for adding this feature so quickly!
I published the new code yesterday and followed your instructions, but it hit a memory limit when it ran last night. Here is the error message:
```
18:00:23.168 500 0 B 23049ms /backup/dropbox
0.1.0.1 - - [05/Nov/2014:16:00:23 -0800] "GET /backup/dropbox HTTP/1.1" 500 0 - "AppEngine-Google; (+http://code.google.com/appengine)" "jacob-mylife.appspot.com" ms=23049 cpu_ms=16066 cpm_usd=0.000032 queue_name=__cron task_name=c973e868a77533afb77acf9dcf83fb82 exit_code=105 instance=00c61b117c7571ef4f283072d563edbfdc088d app_engine_release=1.9.15
F 18:00:23.164 Exceeded soft private memory limit of 128 MB with 154 MB after servicing 16 requests total
W 18:00:23.164 While handling this request, the process that handled this request was found to be using too much memory and was terminated. This is likely to cause a new process to be used for the next request to your application. If you see this message frequently, you may have a memory leak in your application.
```
This error is for my journal, which only has maybe 8MB of images. Any ideas what the problem could be?
That sounds weird. I have around 6 MB of photos and that didn't cause any problem. Did anything get saved to Dropbox? The backup starts by just getting all your posts, putting them together in one text file, and uploading that, which shouldn't take much memory. So, did the post text get saved, or at least did your app folder get created in Dropbox?
I have two MyLife projects published, and I created a Dropbox app for each.
The Dropbox app folder showed up in my Dropbox as soon as I created the app. Both folders are empty except for a .dropbox file that was created at the same time as the folder.
Both MyLife projects are showing similar memory errors for the Dropbox backup. I tried running the backup manually by going to /backup/dropbox in my browser, and I get the same memory error.
I've also tried re-publishing to make sure there wasn't a glitch during the initial publishing process, but I still get the same memory error.
Can you think of anything else I could try?
OK, I made a couple of changes. I was accidentally fetching all the posts twice, so I fixed that. I also added some more logging. Can you get the latest version, publish, and try again?
You should be able to go to /backup/dropbox and it should at least be able to upload all your posts there before timing out (unless your posts are really long!).
Also, if you get an error, try going to the App Engine page, https://appengine.google.com, choosing your app, and then choosing "Logs" under the Main menu on the left.
You should see the request for /backup/dropbox; if you click on it, you'll see all the log messages that were written. Let me know what they say, and then I can maybe figure out which part of the backup is causing the memory issues.
Thanks for the pull request; I've merged it. So does the backup work for you now? Also with images?
I should probably do the same in the zip export as well.
I think I figured it out. It looks like repeatedly appending strings in Python is memory-intensive because strings are immutable objects, so each append copies the data. I switched the code to use StringIO (which treats the string like a file) and I was able to get the backup to work.
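To illustrate the difference described above (a generic sketch, not the project's actual code): each `out += chunk` builds a new string and copies everything accumulated so far, which is roughly O(n²) copying overall, while a StringIO buffer appends in place like a file.

```python
import io

def build_backup_concat(posts):
    # Immutable strings: each += allocates a new string and copies
    # the whole accumulated text, so memory churn grows with size.
    out = ""
    for post in posts:
        out += post + "\n"
    return out

def build_backup_stringio(posts):
    # StringIO appends into one growing buffer, like writing to a file.
    buf = io.StringIO()
    for post in posts:
        buf.write(post)
        buf.write("\n")
    return buf.getvalue()
```

(CPython can sometimes optimize `+=` on strings in place, but that optimization isn't guaranteed, so the StringIO version is the safer pattern for large buffers.)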
Awesome :) Will close this issue then!
Thanks so much for creating this! It is awesome!
I had a problem importing a large OhLife file. My wife had a lot of photos, so the instance ran out of memory:
I got around this by importing just the text file and some of the photos (so the zip was smaller), then manually attaching the remaining images to posts. I suspect I could also get around it by temporarily changing the frontend instance class to an F2 or higher.
Even with an upgraded frontend instance class, my wife will eventually have too many photos to zip up. Any chance you could add some sort of Dropbox export? Perhaps it could send the photos to Dropbox one by one instead of zipping them, so it wouldn't run into the memory issue.
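The one-by-one idea could be sketched like this (hypothetical names throughout — `fetch_photo` and `upload` are placeholder callbacks, not MyLife's or Dropbox's actual API):

```python
def backup_photos_one_by_one(photo_keys, fetch_photo, upload):
    # Upload each photo individually so peak memory is one photo,
    # not an in-memory zip of the whole collection.
    count = 0
    for key in photo_keys:
        data = fetch_photo(key)               # bytes for a single photo
        upload("/photos/%s.jpg" % key, data)  # nothing accumulates between loops
        count += 1
    return count
```

Memory use then stays flat no matter how many photos there are, at the cost of one upload request per photo.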