- Type: Bug
- Resolution: Done
- Priority: Major - P3
- Fix Version/s: None
- Affects Version/s: None
- Component/s: primer
- Labels: None
- Environment: Node
*Location*: https://docs.mongodb.com/getting-started/node/query/
*User-Agent*: Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36
*Referrer*: https://docs.mongodb.com/getting-started/node/insert/
*Screen Resolution*: 1440 x 900
The examples are great, but they all use cursor.each() to iterate over the results. With a huge collection this can flood your application with data, because you can't stop and resume the "stream" of documents.
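For reference, this is roughly the pattern the tutorial pages show (the connection URL and query here are placeholders; the exact details vary per page):

```javascript
// Roughly the each() pattern from the tutorial: the callback fires for every
// document as fast as the driver can deliver them, with no handle to pause
// in between. URL and collection name are placeholders.
var MongoClient = require('mongodb').MongoClient;

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if (err) throw err;
  var cursor = db.collection('restaurants').find({});
  cursor.each(function(err, doc) {
    if (err) throw err;
    if (doc != null) {
      console.dir(doc);   // invoked once per document, back to back
    } else {
      db.close();         // null signals the cursor is exhausted
    }
  });
});
```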
I would love to see an example of how to fetch documents in smaller quantities so they can be processed as you go, e.g. fetching documents in batches of 100 to bulk-insert them into Elasticsearch, while still iterating over all results rather than just limiting the result set to 100 documents.
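Something along these lines is what I have in mind. It is only a rough sketch built on the cursor's stream interface (pause()/resume() on the Readable); the connection URL, the restaurants collection, the batch size of 100, and processBatch() are placeholder assumptions standing in for a real pipeline such as an Elasticsearch bulk insert:

```javascript
var MongoClient = require('mongodb').MongoClient;

// Placeholder: this would be the Elasticsearch bulk insert (or any other
// downstream processing step) in a real pipeline.
function processBatch(docs, done) {
  console.log('processing ' + docs.length + ' documents');
  done();
}

MongoClient.connect('mongodb://localhost:27017/test', function(err, db) {
  if (err) throw err;
  var stream = db.collection('restaurants').find({}).stream();
  var batch = [];

  stream.on('data', function(doc) {
    batch.push(doc);
    if (batch.length === 100) {
      stream.pause();                 // stop the flow while we work
      processBatch(batch, function() {
        batch = [];
        stream.resume();              // pick up where we left off
      });
    }
  });

  stream.on('end', function() {
    // Flush the final partial batch, then clean up.
    if (batch.length > 0) {
      processBatch(batch, function() { db.close(); });
    } else {
      db.close();
    }
  });
});
```

Pausing the stream keeps memory bounded: at most one batch of documents is buffered at a time, while the cursor stays open on the server until all results have been consumed.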