Core Server / SERVER-19920

Duplicate Key Error on Upsert with Multiple Processes

    • Type: Bug
    • Resolution: Duplicate
    • Priority: Major - P3
    • Fix Version/s: None
    • Affects Version/s: None
    • Component/s: WiredTiger
    • Labels: None
    • Operating System: ALL
    • Steps To Reproduce:

      1. Run test_mongo_update.py.
      2. Run test_mongo_update.py in a second process.

      A duplicate key error will be raised.


      Hi all,

      I just got a weird error sent through from our application:

      When I update with two processes, it complains of a duplicate key error on a collection with a unique index on it, even though the operation in question is an upsert.

      Test case code (test_mongo_update.py):

      import time

      from pymongo import MongoClient, DESCENDING

      bucket = MongoClient('127.0.0.1', 27017)['test']['foo']
      bucket.drop()

      # Seed the collection with one upsert, then add a unique index on 'timestamp'.
      bucket.update({'timestamp': 0}, {'$addToSet': {'_exists_caps': 'cap15'}},
                    upsert=True, safe=True, w=1, wtimeout=10)
      bucket.create_index([('timestamp', DESCENDING)], unique=True)

      # Upsert a fresh microsecond timestamp forever; with two copies of this
      # loop running, both processes occasionally pick the same 'timestamp'.
      while True:
          timestamp = str(int(1000000 * time.time()))
          bucket.update({'timestamp': timestamp},
                        {'$addToSet': {'_exists_foos': 'fooxxxxx'}},
                        upsert=True, safe=True, w=1, wtimeout=10)
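
      To reproduce from a single entry point, both processes can be driven from
      one script. This is a sketch rather than part of the original report: the
      shared timestamp value and the multiprocessing.Event are my additions,
      used to make both workers upsert the same key at the same instant, which
      makes the collision far more likely than the free-running loop above.

      # repro_race.py -- hypothetical single-command reproduction (not from the
      # report). Both workers upsert one shared key at once; on an affected
      # server one of them surfaces E11000 instead of turning into an update.
      import multiprocessing
      import time

      from pymongo import MongoClient, DESCENDING
      from pymongo.errors import DuplicateKeyError

      def worker(start, timestamp):
          bucket = MongoClient('127.0.0.1', 27017)['test']['foo']
          start.wait()  # release both workers at the same moment
          try:
              bucket.update({'timestamp': timestamp},
                            {'$addToSet': {'_exists_foos': 'fooxxxxx'}},
                            upsert=True)
          except DuplicateKeyError as exc:
              print('race hit: %s' % exc)

      if __name__ == '__main__':
          bucket = MongoClient('127.0.0.1', 27017)['test']['foo']
          bucket.drop()
          bucket.create_index([('timestamp', DESCENDING)], unique=True)

          start = multiprocessing.Event()
          timestamp = str(int(1000000 * time.time()))  # one key shared by both
          procs = [multiprocessing.Process(target=worker, args=(start, timestamp))
                   for _ in range(2)]
          for p in procs:
              p.start()
          start.set()
          for p in procs:
              p.join()

      Even with the synchronized start, a single run may not always lose the
      race, so it can take a few attempts to see the error.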
      

      When I run the script in two processes, PyMongo raises this exception:

      Traceback (most recent call last):
        File "test_mongo_update.py", line 11, in <module>
          bucket.update({'timestamp': timestamp}, {'$addToSet': {'_exists_foos': 'fooxxxxx'}}, upsert=True, safe=True, w=1, wtimeout=10)
        File "build/bdist.linux-x86_64/egg/pymongo/collection.py", line 552, in update
        File "build/bdist.linux-x86_64/egg/pymongo/helpers.py", line 202, in _check_write_command_response
      pymongo.errors.DuplicateKeyError: E11000 duplicate key error collection: test.foo index: timestamp_-1 dup key: { : "1439374020348044" }
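
      A common client-side mitigation (my sketch, assuming the duplicate key
      error here is transient; it is not part of the original report) is to
      retry the upsert when DuplicateKeyError is raised: on the retry, the
      query phase finds the document the other process just inserted, so the
      $addToSet is applied as a plain update instead of an insert.

      # Hypothetical helper, not in the report: retry an upsert that lost the race.
      from pymongo.errors import DuplicateKeyError

      def upsert_with_retry(bucket, spec, doc, retries=3):
          for attempt in range(retries):
              try:
                  return bucket.update(spec, doc, upsert=True)
              except DuplicateKeyError:
                  if attempt == retries - 1:
                      raise  # still colliding after several attempts; give up

      The loop in test_mongo_update.py would then call
      upsert_with_retry(bucket, {'timestamp': timestamp},
      {'$addToSet': {'_exists_foos': 'fooxxxxx'}}) instead of bucket.update.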
      

      Environment:

      1. MongoDB 3.0.5, WiredTiger
      2. single mongod instance
      3. PyMongo 2.8.1
      4. CentOS 6.6

      mongod.conf:

      systemLog:
         destination: file
         logAppend: true
         logRotate: reopen
         path: /opt/lib/log/mongod.log
      
      # Where and how to store data.
      storage:
         dbPath: /opt/lib/mongo
         journal:
           enabled: true
      
         engine: "wiredTiger"
         directoryPerDB: true
      
      # how the process runs
      processManagement:
         fork: true  # fork and run in background
         pidFilePath: /opt/lib/mongo/mongod.pid
      
      # network interfaces
      net:
         port: 27017
         bindIp: 0.0.0.0  # Listen on all interfaces; use 127.0.0.1 to listen locally only.
      
      setParameter:
         enableLocalhostAuthBypass: false

      Any thoughts on what could be going wrong here?
      

      PS:

      I retried the same case with the MMAPv1 storage engine and it worked
      fine. Why?
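
      When comparing engines like this, it is worth confirming which engine the
      running mongod actually uses. A quick check (my sketch, not from the
      report; serverStatus reports a storageEngine section on 3.0+):

      # Hypothetical verification snippet: print the active storage engine.
      from pymongo import MongoClient

      client = MongoClient('127.0.0.1', 27017)
      status = client.admin.command('serverStatus')
      print(status['storageEngine']['name'])  # 'wiredTiger' or 'mmapv1'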

      I've asked this same question on Stack Overflow (http://stackoverflow.com/questions/31962539/duplicate-key-error-on-upsert-with-multi-processesmongo-3-0-4-wiredtiger) and on the mongodb-user mailing list.

      I found something related here:
      https://jira.mongodb.org/browse/SERVER-18213

      but even after that fix, this error still occurs, so it looks like the
      bug is not fixed completely.

      Cheers

        Attachments:
        1. mongod.conf (1.0 kB, memorybox)
        2. test_mongo_update.py (0.5 kB, memorybox)

            Assignee: Unassigned
            Reporter: memorybox
            Votes: 0
            Watchers: 3
