I offer this up because, with large result sets (for me it happened at 650+ rows), the callback is invoked multiple times until the data finishes streaming. Perhaps the docs could reflect this nuance by adding a line such as:
"For large result sets, the .toArray() callback may be invoked multiple times as data is received from the cursor, which can cause errors."
This line in the description is misleading, since it implies the callback runs exactly once: 'Retrieve all results and pass them as an array to the given callback.'
https://rethinkdb.com/api/javascript/to_array/
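Until the docs or driver address this, one defensive option is to wrap the callback so that only its first invocation is honored. The sketch below is hedged: `once` is a generic helper (not part of the RethinkDB driver), and `simulatedToArray` is a hypothetical stand-in that mimics the multiple-invocation behavior reported above, so the pattern can be shown without a live cursor.

```javascript
// Generic guard that ignores repeated callback invocations.
// Not part of the RethinkDB API; a plain helper function.
function once(fn) {
  let called = false;
  return (...args) => {
    if (called) return; // drop any later invocations
    called = true;
    fn(...args);
  };
}

// Hypothetical stand-in for a cursor whose toArray-style callback
// fires more than once as batches of a large result set arrive.
function simulatedToArray(cb) {
  cb(null, [1, 2, 3]);          // first (partial) batch
  cb(null, [1, 2, 3, 4, 5, 6]); // fires again when all data has arrived
}

let results = null;
simulatedToArray(once((err, rows) => {
  if (err) throw err;
  results = rows; // only the first invocation reaches here
}));
console.log(results.length);
```

With the real driver, the same guard would wrap the function passed to cursor.toArray(...). Note the trade-off: keeping only the first invocation may capture a partial batch, so depending on which invocation carries the complete array, the guard may need to be inverted (e.g., debounce and keep the last invocation instead).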