Closed chrishol closed 9 years ago
Hey, we've never tested this gem against such a large collection, so we can't help you at this time, but we'll definitely test it in the near future and let you know if anything comes up.
The backup process used here creates the collection dump in memory, so the memory allotted to the process might be the issue.
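To illustrate why building the whole dump in memory can blow past the process's memory limit, here is a minimal sketch of the alternative: streaming each document to disk as it is read, so peak memory stays at roughly one document. All names here are hypothetical; this is not the gem's actual code.

```ruby
require "tempfile"
require "json"

# Stream each document to the output as one JSON line instead of
# accumulating the whole collection in memory first.
def stream_dump(docs, io)
  docs.each { |doc| io.puts(doc.to_json) }
end

# Simulated documents standing in for a real Mongoid query result.
docs = (1..3).map { |i| { "_id" => i, "name" => "doc#{i}" } }

file = Tempfile.new("dump")
stream_dump(docs, file)
file.rewind
lines = file.read.split("\n")
puts lines.length  # => 3
```

With a 500k-record, 125 MB collection like the one reported below, the streaming approach keeps memory use flat instead of proportional to collection size.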
That could well be the issue. I'll take a look - thanks!
Cursor timeouts are a known problem when fetching a large number of documents with a single query in a loop. I would suggest solving this by loading the data in batches.
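The batching idea can be sketched in plain Ruby: instead of holding one long-lived cursor open over the whole collection, fetch fixed-size pages so no single read runs long enough to hit the server-side cursor timeout. The `each_batch` helper below is illustrative only; in Mongoid you would page with something like `Model.where(:_id.gt => last_id).limit(batch_size)` (hypothetical names, adapt to your models).

```ruby
# Yield the ids in fixed-size pages; each page stands in for one
# short-lived query against the collection.
def each_batch(ids, batch_size: 1000)
  ids.each_slice(batch_size) do |page|
    yield page
  end
end

# Simulated collection of 5,000 document ids.
all_ids = (1..5000).to_a
batches = []
each_batch(all_ids, batch_size: 1000) { |page| batches << page.size }
puts batches.inspect  # => [1000, 1000, 1000, 1000, 1000]
```

Because each page is a fresh query, the cursor for any one batch lives only as long as that batch takes to process.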
We now use MongoDB from third-party providers and will probably no longer support this gem, so anyone who wants to take over maintenance is welcome.
Hey there,
First of all, thanks for the gem!
I have two apps that I've been trying to use this with, one on Rails 3/Mongoid 3 and one on Rails 4/Mongoid 4, and I can't get it to run to completion on either.
On Rails 4/Mongoid 4, I consistently get:
On Rails 3/Mongoid 3, I consistently get:
In the Rails 4/Mongoid 4 case, the collection it fails on is always the same one, which is also the largest in the DB (500k records, 125 MB).
This all points to a cursor timeout being the problem I'm hitting. Is there a configuration option or Mongo setting I need to avoid this?
Thanks!