elastic / elasticsearch-rails

Elasticsearch integrations for ActiveModel/Record and Ruby on Rails
Apache License 2.0

Add batch_size(batch_size) to __find_in_batches (Mongoid) #1036

Open sylvain-8422 opened 2 years ago

sylvain-8422 commented 2 years ago

Add .batch_size(batch_size) to #__find_in_batches (Mongoid).

Fixes #1037 .

Although .each_slice(batch_size) is useful for limiting how many documents are sent to Elasticsearch at a time, it does not limit the batch size of MongoDB's getMore commands.

By default, iterating over a MongoDB collection will first return 101 documents, and then subsequent batches of up to 16 MiB each:

https://www.mongodb.com/docs/manual/tutorial/iterate-a-cursor/#cursor-batches

For example, a MongoDB collection containing documents averaging 1 KiB might return more than 16,000 documents at a time.
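The 16,000-documents figure follows from quick arithmetic; a back-of-the-envelope sketch in Ruby (the sizes are illustrative, not measured):

```ruby
# Rough arithmetic behind the 16,000-document figure.
avg_doc_size_bytes = 1024              # ~1 KiB per document (assumed average)
batch_limit_bytes  = 16 * 1024 * 1024  # MongoDB's 16 MiB getMore batch limit

docs_per_batch = batch_limit_bytes / avg_doc_size_bytes
puts docs_per_batch # => 16384, i.e. more than 16,000 documents per batch
```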

Although Mongoid's documentation claims a default batch size of 1,000 documents, this does not seem to be the case in practice.

Also, Mongoid's .no_timeout is currently broken and has no effect:

https://github.com/mongodb/mongo-ruby-driver/pull/2557

As a result, more than 10 minutes can elapse between two getMore commands, causing the MongoDB cursor to expire.

Adding .batch_size(batch_size) to the query ensures that MongoDB documents are retrieved at the same rate as they are processed and indexed in Elasticsearch, and allows applications affected by the .no_timeout issue to reduce the batch size and avoid cursor timeouts.
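A minimal, self-contained sketch of the shape of the change. The Relation stub below stands in for a real Mongoid criteria (it is not the actual adapter code), so the example runs without Mongoid; the key point is chaining .batch_size before each_slice so the driver's getMore batches match the indexing batches:

```ruby
# Stub standing in for a Mongoid::Criteria; only the methods used
# by the sketch are implemented.
class Relation
  attr_reader :driver_batch_size

  def initialize(docs)
    @docs = docs
    @driver_batch_size = nil
  end

  # In Mongoid, .batch_size(n) caps how many documents each getMore
  # command returns; here we just record the value and return self
  # so calls can be chained.
  def batch_size(n)
    @driver_batch_size = n
    self
  end

  def each_slice(n, &block)
    @docs.each_slice(n, &block)
  end
end

# Hypothetical simplification of __find_in_batches: cap the driver's
# batch size to the same value used for slicing, so documents are
# fetched from MongoDB at the rate they are indexed.
def __find_in_batches(relation, batch_size: 1_000)
  relation.batch_size(batch_size).each_slice(batch_size) do |items|
    yield items
  end
end

rel = Relation.new((1..2_500).to_a)
sizes = []
__find_in_batches(rel, batch_size: 1_000) { |batch| sizes << batch.size }
puts sizes.inspect          # => [1000, 1000, 500]
puts rel.driver_batch_size  # => 1000
```

Without the .batch_size call, the slicing still happens in 1,000-document chunks on the Ruby side, but the driver is free to pull far larger batches from MongoDB in a single getMore.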

sylvain-8422 commented 1 year ago

@shashankjo

Same simple change as before, but I fixed the conflict created by whitespace changes in main.

From ef8985e97575d0ec3ec1028c0cb699a61f8de27e to aa38a1b5aa770cb1fa04487ba7a6d6511283155b.