- Type: Improvement
- Resolution: Unresolved
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- Query Optimization
Currently, the sample-from-random-cursor aggregation stage fails after seeing a hard-coded limit of 100 consecutive duplicate documents. There have been cases where making this limit configurable could help mitigate issues with mongosync.
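To illustrate the behavior being described, here is a minimal sketch of the duplicate-detection loop with the consecutive-duplicate limit exposed as a parameter rather than hard-coded. The function name, the error, and the overall structure are illustrative assumptions for this ticket, not the actual server implementation:

```python
def sample_from_random_cursor(cursor_docs, sample_size,
                              max_consecutive_duplicates=100):
    """Sketch: draw `sample_size` distinct documents from a stream of
    documents produced by a random cursor (names are hypothetical).

    The stage tracks how many duplicate _ids it has seen in a row; the
    current behavior aborts once a hard-coded limit (100) is reached.
    Here that limit is a parameter, as the ticket proposes."""
    seen_ids = set()
    results = []
    consecutive_duplicates = 0
    it = iter(cursor_docs)
    while len(results) < sample_size:
        doc = next(it, None)
        if doc is None:
            break  # cursor exhausted before the sample was filled
        if doc["_id"] in seen_ids:
            consecutive_duplicates += 1
            if consecutive_duplicates >= max_consecutive_duplicates:
                raise RuntimeError(
                    "too many consecutive duplicate documents "
                    "from random cursor")
            continue
        # A fresh document resets the consecutive-duplicate counter.
        consecutive_duplicates = 0
        seen_ids.add(doc["_id"])
        results.append(doc)
    return results
```

With a higher limit, a stream that repeats the same document several times in a row (as can happen under heavy duplication) still completes instead of erroring out, which is the mitigation this improvement asks for.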