Core Server / SERVER-24231

DuplicateKey exception while replaying the oplog with a unique index

    • Type: Bug
    • Resolution: Duplicate
    • Priority: Major - P3
    • Affects Version/s: None
    • Component/s: None
    • Operating System: ALL
      #!/bin/sh
      pkill mongod
      sleep 5
      mkdir db1
      rm -rf db1/* db1_backup/ dump/
      mongod --dbpath ./db1 --fork --syslog --replSet test
      sleep 2
      
      # initialize the replica set, then build a unique index and run the workload
      mongo --eval 'rs.initiate()' && sleep 2
      mongo --eval 'db.getCollection("test").createIndex({a: 1}, {unique: true})'
      mongo --eval 'db.getCollection("test").insert({a: 1})'
      mongo --eval 'db.getCollection("test").update({a: 1}, {$set: {a: 2}})'
      mongo --eval 'db.getCollection("test").remove({a: 2})'
      mongo --eval 'db.getCollection("test").insert({a: 2})'
      
      # point-in-time restore: dump the oplog and move it to dump/oplog.bson,
      # which is where mongorestore --oplogReplay looks for it
      mongodump -d local -c oplog.rs
      mv dump/local/oplog.rs.bson dump/oplog.bson
      rm -rf dump/local
      
      mongorestore --oplogReplay
      # failed: restore error: error applying oplog: applyOps: E11000 duplicate key error collection: test.test index: a_1 dup key: { : 2.0 }
      
      
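      For reference, the oplog entries that this workload generates (and that the dump above will replay) can be inspected before running mongodump. This is only a quick sketch against the same single-node replica set as the script above; field values such as timestamps will differ per run:

      # list the oplog entries recorded for test.test (insert, update, delete, insert)
      mongo --eval 'db.getSiblingDB("local").oplog.rs.find({ns: "test.test"}).forEach(printjson)'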

      While performing the applyOps command, duplicate key exceptions are ignored on insert operations but not on update operations.

      This can leave the oplog non-idempotent when a unique index is involved, since replaying an update can fail where it should not. For example:

      db.test.createIndex({a: 1}, {unique: true})
      db.test.insert({a: 1})
      db.test.update({a: 1}, {$set: {a: 2, b: 1}})
      db.test.remove({a: 2})
      db.test.insert({a: 2})
      

      If you replay all of these oplog entries after the last insert has already been applied ({a: 2} is already present in the collection), mongod will fail on the update:

      db.test.update({a: 1}, {$set: {a: 2}})
      applyOps: E11000 duplicate key error collection: test.test index: a_1 dup key: { : 2.0 }
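
      The same asymmetry can be seen by feeding oplog-format operations directly to applyOps. The following is only a sketch: the _id values are hypothetical, real oplog entries carry additional fields (ts, h, v, ...), and the exact behaviour depends on the server version:

      # replaying a duplicate insert: per this report, the duplicate key error is ignored
      mongo --eval 'db.adminCommand({applyOps: [{op: "i", ns: "test.test", o: {_id: 10, a: 2}}]})'
      # replaying an update that collides with the unique index: fails with E11000
      mongo --eval 'db.adminCommand({applyOps: [{op: "u", ns: "test.test", o2: {_id: 0}, o: {$set: {a: 2}}}]})'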
      

      This is especially painful when performing a point-in-time restore, because mongorestore stops at the first error it encounters.
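
      When that happens, the dumped oplog can be inspected to find the entry that trips the error; a sketch, reusing the dump/oplog.bson file produced by the script above:

      # print the dumped oplog entries as JSON to locate the failing update
      bsondump dump/oplog.bson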

      Cheers,
      Damien

        1. replay.sh (0.9 kB), Damien Gasparina

            Assignee:
            Unassigned
            Reporter:
            Damien Gasparina
            Votes:
            0
            Watchers:
            9
