Closed filmerjarred closed 9 years ago
Thanks for the bug report. Yes, I was able to reproduce it on Linux with node 0.10 and I'm going to run more tests directly with the driver in order to localize the problem.
It seems that this problem is not related to the cassandra adapter but rather to the depth of the recursion. That would also explain why you were having problems with memory profiling. The reason you see this error with cassandra first is that the driver may be keeping more data on the stack, but given enough records any adapter would eventually fail the same way.
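To make the recursion-depth point concrete, here is a minimal, hypothetical sketch (the helper names are invented for illustration and are unrelated to the adapter's actual code). If a driver ever invokes its callback synchronously, each "insert" adds a stack frame that never unwinds, so a deep enough chain blows the stack:

```javascript
// Stand-in for one level of a synchronous callback chain: each level
// models one insert() whose driver calls back without yielding.
function recurseSync(depth) {
  if (depth <= 0) return 0;
  return 1 + recurseSync(depth - 1); // not a tail call; every level stays on the stack
}

// Returns true if recursing to `depth` overflows the call stack.
function overflows(depth) {
  try {
    recurseSync(depth);
    return false;
  } catch (e) {
    return e instanceof RangeError; // "Maximum call stack size exceeded"
  }
}
```

A shallow chain completes fine, while a chain on the order of a million frames throws `RangeError` — which is why reducing the depth (rather than the total number of records) is what matters.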
As a workaround, you could try inserting records in batches: instead of passing an object to the .create() method, pass an array of objects. You will need to experiment with the batch size, though something in the range of a few hundred should work fine. This reduces the recursion depth, and the batch insert should also be faster. Please give it a try. I'm closing the issue for now.
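A rough sketch of that workaround (the `chunk` and `insertInBatches` helpers are hypothetical names, not part of Waterline; only the array form of `Model.create([...])` comes from the suggestion above):

```javascript
// Split a large array of records into fixed-size batches.
function chunk(records, batchSize) {
  var batches = [];
  for (var i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// Insert the batches one after another. Each .create() receives an
// array, so recursion depth is records.length / batchSize instead of
// records.length, and the stack unwinds between batches.
function insertInBatches(Model, records, batchSize, done) {
  var batches = chunk(records, batchSize);
  function next(idx) {
    if (idx >= batches.length) return done();
    Model.create(batches[idx]).exec(function (err) {
      if (err) return done(err);
      next(idx + 1);
    });
  }
  next(0);
}
```

For example, 1,000 records with a batch size of 300 become four `.create()` calls instead of 1,000 chained ones; tune the size per the note above.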
When inserting a large volume of entries like so:
```javascript
function insert() {
  inserted++;
  Model.create({ id: inserted }).exec(insert);
}
insert();
```
The memory the process uses increases slowly until it caps out at around 1.4 GB and the whole thing slows to a crawl. If you stop the loop above at any point, the process just chills with the memory it's gained without giving anything back; this doesn't appear to happen with any other adapter. I've tried doing some memory profiling, but for whatever reason node-inspector on Windows starts running around with its hair on fire when you try to do a recording. The heap snapshots suggested that the records I'm inserting might be getting stored somewhere in the process, but I couldn't find anything in a flyby of the code. Using node 0.12 on Windows with the latest versions of the adapter and Sails.