Core Server / SERVER-13616

"type 7" (OID) error when acquiring distributed lock for first time



    • Type: Bug
    • Resolution: Done
    • Priority: Minor - P4
    • Fix Version/s: 2.4.12, 2.6.2, 2.7.0
    • Component/s: Sharding
    • Operating System: ALL


      The lock is not taken, but an ugly and confusing assert is logged.

      From test failure:

       m27001| 2014-04-01T21:07:24.400-0400 [conn11] Collection config.locks does not exist. Using EOF runner: query: { _id: "configUpgrade", state: 0 } sort: {} projection: {}
       m27001| 2014-04-01T21:07:24.400-0400 [conn11] update config.locks query: { _id: "configUpgrade", state: 0 } update: { $set: { state: 1, who: "ip-10-146-215-7:27003:1396400844:1804289383:mongosMain:846930886", process: "ip-10-146-215-7:27003:1396400844:1804289383", when: new Date(1396400844233), why: "upgrading config database to new format v5", ts: ObjectId('533b62cccf9c8a51fbe0c364') } } nscanned:0 nscannedObjects:0 nMatched:0 nModified:0 keyUpdates:0 numYields:0 locks(micros) w:218 0ms
       m27001| 2014-04-01T21:07:24.400-0400 [conn13] Database::_addNamespaceToCatalog ns: config.locks
       m27001| 2014-04-01T21:07:24.401-0400 [conn13] insert config.locks ninserted:1 keyUpdates:0 numYields:0 locks(micros) r:37 w:721 0ms

      The first config upgrade lock document was inserted on the first config server (27000) by a mongos other than the one at port 27003, which caused the mongos at port 27003 to try to lock the initial lock document. On the second config server (27001, as seen above), however, the update from the port-27003 mongos arrived before the initial lock document had been inserted, so the update did not apply. Messages crossing between servers like this is normal for distributed locks, but because this was the very first lock document, it ended up with no "ts" field, and the lock tournament became confused and raised an error.
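The race above can be sketched in a few lines. This is a hypothetical simulation using plain dictionaries, not MongoDB's actual implementation: `try_update_lock` and `insert_initial_lock` are illustrative stand-ins for the `update config.locks { _id, state: 0 }` and insert operations visible in the log, showing how opposite arrival orders on the two config servers leave 27001 with a lock document that has no "ts" field.

```python
# Hypothetical sketch of the first-time lock race. Each config server is
# modeled as a dict mapping lock _id -> lock document.

def try_update_lock(server, lock_id, ts):
    """Emulate: update {_id: lock_id, state: 0} with $set {state: 1, ts: ts}.
    Returns True if a document matched (nMatched > 0), False otherwise."""
    doc = server.get(lock_id)
    if doc is not None and doc.get("state") == 0:
        doc.update({"state": 1, "ts": ts})
        return True
    return False

def insert_initial_lock(server, lock_id):
    """Emulate another mongos inserting the initial unlocked lock document."""
    server.setdefault(lock_id, {"_id": lock_id, "state": 0})

# Config server 27000: the insert from the other mongos lands first, so the
# port-27003 mongos's update matches and stamps a "ts" ObjectId.
server_27000 = {}
insert_initial_lock(server_27000, "configUpgrade")
assert try_update_lock(server_27000, "configUpgrade", ts="533b62cc...")

# Config server 27001: the update arrives before the insert, matches nothing
# (nMatched: 0), and the subsequent insert creates a document with no "ts"
# field -- the state the bug report describes.
server_27001 = {}
assert not try_update_lock(server_27001, "configUpgrade", ts="533b62cc...")
insert_initial_lock(server_27001, "configUpgrade")
assert "ts" not in server_27001["configUpgrade"]
```

When the lock code later reads the lock documents back and expects the "ts" field to be an ObjectId (BSON type 7), the missing field on 27001 produces the confusing "type 7" error in the issue title.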

      This behavior has been present for a long time.




            randolph@mongodb.com Randolph Tan
            greg_10gen Greg Studer