wdarosh opened this issue 9 years ago (status: Open)
I have the same problem. Any solution?
I've figured out a solution, but my Ruby knowledge is very minimal, so I'll need some assistance in getting this patch into the proper form to add to the repo.
First, install the deep-clone gem from the shell:
gem install ruby_deep_clone
Then, comment out line 212 of schema.rb:
obj = BSON.deserialize(BSON.serialize(obj))
and underneath insert the following lines:
require "deep_clone"
obj = DeepClone.clone(original)
I don't think this is the proper way to introduce an external dependency into the project, but as a quick hack it worked for me. It's quite slow on large objects (over 5 minutes to process ~2000 rows containing large PDFs), but it eventually inserts them into the Postgres DB.
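If you want the same deep-copy effect without pulling in the ruby_deep_clone gem at all, Ruby's standard library can do it: round-tripping through Marshal is the classic deep-copy idiom, and it has no BSON-style document size cap. This is only a sketch of the idea, not a tested MoSQL patch; the `deep_copy` helper name is mine, not anything from the codebase.

```ruby
# Deep-copy an object by round-tripping it through Ruby's stdlib
# Marshal serializer -- the same serialize/deserialize trick that
# schema.rb line 212 plays with BSON, but with no 4MB document cap
# and no external gem.
def deep_copy(obj)
  Marshal.load(Marshal.dump(obj))
end

original = { "rows" => [{ "pdf" => "x" * 1024 }] }
copy = deep_copy(original)

# Mutating the copy leaves the original untouched, so the copy is deep:
copy["rows"][0]["pdf"] << "mutated"
puts original["rows"][0]["pdf"].length  # still 1024
```

One caveat: Marshal cannot serialize objects holding IO handles or procs, but plain document hashes like the ones MoSQL moves around are fine.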
Hmm, the issue here is likely that BSON.serialize uses the original default maximum BSON size (4MB). The maximum has since been raised, but increasing it relies on negotiating the new limit with the connection. Replacing BSON.serialize with something like BSON::BSON_CODER.serialize(obj, false, false, 16*1024*1024) will likely also fix your issue, without requiring a new dependency.
I've verified that changing line 212 of schema.rb to:
obj = BSON.deserialize(BSON::BSON_CODER.serialize(obj, false, false, 16*1024*1024))
fixes the issue. I tried to push up a new branch for the fix, but I don't seem to have permission to do so. What's the best way to get this fix into the master branch?
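To illustrate why passing 16*1024*1024 as the maximum fixes this, here is a minimal stdlib-only sketch of a size-capped serializer, mimicking how the old driver's BSON coder enforced a maximum document size. The names (`serialize_with_limit`, `DEFAULT_MAX`, `EXPANDED_MAX`) are illustrative, not the Mongo driver's actual API, and Marshal stands in for BSON encoding.

```ruby
# Sketch of a size-capped serializer: encoding succeeds only if the
# resulting byte size is under the limit, which is exactly the check
# a ~5MB PDF row trips at the old 4MB default.
DEFAULT_MAX  = 4 * 1024 * 1024   # the original 4MB BSON cap
EXPANDED_MAX = 16 * 1024 * 1024  # the later 16MB limit

def serialize_with_limit(obj, max_size = DEFAULT_MAX)
  data = Marshal.dump(obj)  # stand-in for BSON encoding
  if data.bytesize > max_size
    raise "Document too large: #{data.bytesize} > #{max_size}"
  end
  data
end

big_doc = { "pdf" => "x" * (5 * 1024 * 1024) }  # ~5MB payload

begin
  serialize_with_limit(big_doc)              # rejected at the 4MB default
rescue => e
  puts e.message
end

serialize_with_limit(big_doc, EXPANDED_MAX)  # accepted with the raised cap
```

The one-line schema.rb change works the same way: the document was never malformed, it was simply larger than the limit the default coder was willing to emit.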
I have been working with MongoDB 2.4.12, attempting to migrate to PostgreSQL 9.4.X for a system migration. Most of the collections translate, but I am unable to get past this error.
I have tried swapping out the driver; however, I have had no luck with MoSQL detecting and utilizing the new driver.