Duplicate-key errors cause excess slowness

Run this:

    // Fresh collection with a unique index.
    db.foo.drop();
    db.foo.createIndex({uniqField: 1}, { unique: true });

    // 10,000 documents whose keys are (almost certainly) all distinct.
    const docs = Array.from(
        {length: 10_000},
        () => ({uniqField: Math.random()}),
    );

    // First pass: every insert succeeds.
    console.time("insert non-dupe");
    db.foo.insertMany(docs);
    console.timeEnd("insert non-dupe");

    // Second pass with the same documents: every insert fails with a
    // duplicate-key error, so nothing is actually written.
    console.time("insert dupe");
    try { db.foo.insertMany(docs, { ordered: false }) } catch (e) {}
    console.timeEnd("insert dupe");
      

… and observe that the “insert dupe” is well over an order of magnitude slower, despite doing no actual writes:

    > mongosh `test-connstring.pl` dup_key_slow.js
    %s: %s insert non-dupe 195.07ms
    %s: %s insert dupe 8.752s
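
For what it's worth, one way to probe this further (a hypothetical follow-up sketch, not part of the original report; the collection name `bar` and the sizes are arbitrary) is to time duplicate-only inserts at a few sizes and see whether the overhead grows with the number of duplicate-key errors:

    // Hypothetical sketch: seed n unique keys, then re-insert the same
    // documents so every one hits a duplicate-key error, and time it.
    for (const n of [1_000, 5_000, 10_000]) {
        db.bar.drop();
        db.bar.createIndex({uniqField: 1}, { unique: true });

        const docs = Array.from({length: n}, (_, i) => ({uniqField: i}));
        db.bar.insertMany(docs);    // seed the unique keys

        console.time(`dupes x${n}`);
        try { db.bar.insertMany(docs, { ordered: false }) } catch (e) {}
        console.timeEnd(`dupes x${n}`);
    }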
      
Duplicate keys in ordered:false inserts seem surprisingly (unreasonably?) slow.
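
A possible client-side workaround (just a sketch under the same setup as the repro above, not a proposed fix) is to query for the keys that already exist and insert only the genuinely new documents, so the server never takes the duplicate-key error path at all:

    // Workaround sketch (hypothetical): filter out keys that are
    // already present, then insert only the remainder.
    const existing = new Set(
        db.foo.find(
            { uniqField: { $in: docs.map(d => d.uniqField) } },
            { _id: 0, uniqField: 1 },
        ).toArray().map(d => d.uniqField),
    );

    const fresh = docs.filter(d => !existing.has(d.uniqField));
    if (fresh.length > 0) {
        db.foo.insertMany(fresh, { ordered: false });
    }

This trades the error path for an extra round trip and a potentially large $in, so it is more a point of comparison than a remedy; the server-side cost per duplicate-key error still seems disproportionate.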

Assignee: Product Performance Team
Reporter: Felipe Gasper