- Type: Bug
- Resolution: Works as Designed
- Priority: Major - P3
- Affects Version/s: 3.6.2
- Component/s: Performance
- Backwards Compatibility: Fully Compatible
- Operating System: ALL
Opening multiple (10+) change stream cursors causes massive delays (up to several minutes) between database writes and notification arrival. A single change stream (or 2-3 of them) does not produce the same issue.
In a synthetic test, I wrote 100 small documents per second into a database and listened for changes using change streams. I opened 50 change streams and ran the test for 100 seconds. The average delay between a DB write and the arrival of the corresponding change event was 7.1 seconds; the largest delay was 205 seconds (not a typo, over three minutes).
MongoDB version: 3.6.2
Test setup #1: MongoDB Atlas M10 (3-node replica set)
Test setup #2: DigitalOcean Ubuntu box running a single-node MongoDB instance in Docker
I used a Node.js client; CPU and memory usage were minimal.
I tried two ways to set up change streams:
{code:javascript}
let cursor = collection.watch([
  {$match: {"fullDocument.room": roomId}},
]);
cursor.stream().on("data", doc => {...});
{code}
and
{code:javascript}
let cursor = collection.aggregate([
  {$changeStream: {}},
  {$match: {"fullDocument.room": roomId}},
]);
cursor.forEach(doc => {...});
{code}
Both had the same effect.
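For reference, here is one way the reported delays could be measured (a hypothetical sketch, not the original harness): embed a write timestamp in each document (the `writtenAt` field name is an assumption) and compare it with the arrival time inside the change-stream handler. The stats helper below is plain JavaScript and driver-independent.

```javascript
// Hypothetical helper: given write-to-arrival delays in milliseconds,
// compute the average and maximum, matching the figures in this report.
function delayStats(delaysMs) {
  const sum = delaysMs.reduce((a, b) => a + b, 0);
  return {
    avgMs: delaysMs.length ? sum / delaysMs.length : 0,
    maxMs: delaysMs.length ? Math.max(...delaysMs) : 0,
  };
}

// In the change-stream "data" handler (driver wiring omitted), each
// delay would be collected as: Date.now() - doc.fullDocument.writtenAt
const stats = delayStats([7100, 205000, 1200]);
console.log(stats.avgMs, stats.maxMs);
```

With 50 streams open, the same per-event measurement yields the averages quoted above; a lone stream measured this way shows no comparable lag.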
- related to NODE-1305: Document potential performance degradation when using change streams (status: Backlog)