[SERVER-21006] duplicate key error on upsert Created: 19/Oct/15  Updated: 29/Oct/15  Resolved: 20/Oct/15

Status: Closed
Project: Core Server
Component/s: Concurrency
Affects Version/s: 3.0.7
Fix Version/s: None

Type: Bug Priority: Major - P3
Reporter: Kay Agahd Assignee: Unassigned
Resolution: Duplicate Votes: 0
Labels: None
Remaining Estimate: Not Specified
Time Spent: Not Specified
Original Estimate: Not Specified

Attachments: File mongotest-2.13.0-sources.jar     File mongotest-2.13.0.jar     File mongotest-3.0.3-sources.jar     File mongotest-3.0.3.jar    
Issue Links:
Duplicate
duplicates SERVER-14322 Retry on predicate unique index viola... Closed
Operating System: ALL
Steps To Reproduce:

Please see the description above and the attached Java sources.

Participants:

 Description   

We are running a replica set with auth enabled. We may get duplicate key errors on upsert when different threads upsert documents with the same _id.

We wrote a Java program to reproduce the issue (two versions are attached, built against different Java driver versions). The duplicate key error, when it occurs, happens only at the very beginning of the run, so it doesn't matter whether the program inserts thousands or millions of documents.

The error never occurred when all documents to be written already existed in the collection before the program was started.

The error also never occurred when the Java program deleted all documents itself (see parameter -r). We had to remove the documents through the mongo shell before starting the program in order to reproduce the error.
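For illustration, here is a minimal sketch of the kind of upsert each worker thread performs; the real code is in the attached jars, and the host, names and auth handling below are placeholders, assuming the legacy 2.x driver API visible in the stack traces further down.

import com.mongodb.BasicDBObject;
import com.mongodb.DB;
import com.mongodb.DBCollection;
import com.mongodb.MongoClient;

public class UpsertSketch {
    public static void main(String[] args) throws Exception {
        MongoClient mongo = new MongoClient("localhost");   // placeholder host; auth omitted
        DB db = mongo.getDB("OfferMetaDataStore");
        DBCollection col = db.getCollection("offerMetaData");

        // save() issues an update with upsert semantics keyed on _id, so two
        // threads saving a document with the same, not-yet-existing _id race
        // to insert it, and the losing thread can get an E11000 error.
        BasicDBObject doc = new BasicDBObject("_id", "000006ed82e7cb21ff4f2f2b25e00006")
                .append("fingerprint", System.nanoTime());  // varying attribute
        col.save(doc);

        mongo.close();
    }
}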

java -jar mongotest-2.13.0.jar -h sx645 -db OfferMetaDataStore -c offerMetaData -u offermanager -pw ******* -t 40 -n 10000 -d 10
19:28:47.062  INFO - Testing Mongo collection offerMetaData in DB OfferMetaDataStore on sx645:27017
19:28:48.337  INFO - Server info: { "serverUsed" : "sx645:27017" , "version" : "3.0.5" , "gitVersion" : "8bc4ae20708dbb493cb09338d9e7be6698e4a3a3" , "OpenSSLVersion" : "OpenSSL 1.0.1e 11 Feb 2013" , "sysInfo" : "Linux ip-10-47-180-245 3.2.0-4-amd64 #1 SMP Debian 3.2.46-1 x86_64 BOOST_LIB_VERSION=1_49" , "versionArray" : [ 3 , 0 , 5 , 0] , "loaderFlags" : "" , "compilerFlags" : "-Wnon-virtual-dtor -Woverloaded-virtual -std=c++11 -fPIC -fno-strict-aliasing -ggdb -pthread -Wall -Wsign-compare -Wno-unknown-pragmas -Winvalid-pch -pipe -Werror -O3 -Wno-unused-local-typedefs -Wno-unused-function -Wno-deprecated-declarations -Wno-unused-but-set-variable -Wno-missing-braces -fno-builtin-memcmp -std=c99" , "allocator" : "tcmalloc" , "javascriptEngine" : "V8" , "bits" : 64 , "debug" : false , "maxBsonObjectSize" : 16777216 , "ok" : 1.0}
19:28:48.339  INFO - Executing 40 threads, each writing a total of 10000 updates for 10 documents, _ids starting with 000006ed82e7cb21ff4f2f2b25e, with varying attribute 'fingerprint'
19:28:50.971 ERROR - DuplicateKeyException!
com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "sx645:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: OfferMetaDataStore.offerMetaData index: _id_ dup key: { : \"000006ed82e7cb21ff4f2f2b25e00006\" }" , "code" : 11000}
	at com.mongodb.CommandResult.getWriteException(CommandResult.java:88) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.CommandResult.getException(CommandResult.java:79) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.DBCollectionImpl.translateBulkWriteException(DBCollectionImpl.java:316) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.DBCollectionImpl.update(DBCollectionImpl.java:274) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.DBCollection.update(DBCollection.java:214) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.DBCollection.save(DBCollection.java:1058) ~[mongotest-2.13.0.jar:na]
	at com.mongodb.DBCollection.save(DBCollection.java:1014) ~[mongotest-2.13.0.jar:na]
	at upsert.UpsertDupes$$Lambda$42/480438026.apply(Unknown Source) ~[na:na]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_20]
	at java.util.stream.ReferencePipeline$11$1.accept(ReferencePipeline.java:373) ~[na:1.8.0_20]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_20]
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193) ~[na:1.8.0_20]
	at java.util.stream.IntPipeline$4$1.accept(IntPipeline.java:250) ~[na:1.8.0_20]
	at java.util.stream.IntPipeline$10$1.accept(IntPipeline.java:362) ~[na:1.8.0_20]
	at java.util.stream.Streams$RangeIntSpliterator.forEachRemaining(Streams.java:110) ~[na:1.8.0_20]
	at java.util.Spliterator$OfInt.forEachRemaining(Spliterator.java:693) ~[na:1.8.0_20]
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:512) ~[na:1.8.0_20]
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:502) ~[na:1.8.0_20]
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708) ~[na:1.8.0_20]
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234) ~[na:1.8.0_20]
	at java.util.stream.IntPipeline.reduce(IntPipeline.java:456) ~[na:1.8.0_20]
	at java.util.stream.IntPipeline.sum(IntPipeline.java:414) ~[na:1.8.0_20]
	at upsert.UpsertDupes.lambda$null$4(UpsertDupes.java:189) ~[mongotest-2.13.0.jar:na]
	at upsert.UpsertDupes$$Lambda$2/1477657879.run(Unknown Source) [mongotest-2.13.0.jar:na]
	at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511) [na:1.8.0_20]
	at java.util.concurrent.FutureTask.run(FutureTask.java:266) [na:1.8.0_20]
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [na:1.8.0_20]
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [na:1.8.0_20]
	at java.lang.Thread.run(Thread.java:745) [na:1.8.0_20]
19:28:51.375 ERROR - DuplicateKeyException!
com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "sx645:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: OfferMetaDataStore.offerMetaData index: _id_ dup key: { : \"000006ed82e7cb21ff4f2f2b25e00000\" }" , "code" : 11000}
	(stack trace identical to the first exception above)
19:28:51.511 ERROR - DuplicateKeyException!
com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "sx645:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: OfferMetaDataStore.offerMetaData index: _id_ dup key: { : \"000006ed82e7cb21ff4f2f2b25e00006\" }" , "code" : 11000}
	(stack trace identical to the first exception above)
19:28:51.524 ERROR - DuplicateKeyException!
com.mongodb.MongoException$DuplicateKey: { "serverUsed" : "sx645:27017" , "ok" : 1 , "n" : 0 , "updatedExisting" : false , "err" : "E11000 duplicate key error collection: OfferMetaDataStore.offerMetaData index: _id_ dup key: { : \"000006ed82e7cb21ff4f2f2b25e00006\" }" , "code" : 11000}
	(stack trace identical to the first exception above)
[long run of progress dots elided]
19:31:51.561  INFO - Thread 14 wrote 10000 updates
..........................19:31:52.860  INFO - Thread 22 wrote 10000 updates
.....19:31:53.050  INFO - Thread 28 wrote 10000 updates
..19:31:53.186  INFO - Thread 39 wrote 10000 updates
....19:31:53.379  INFO - Thread 24 wrote 10000 updates
..19:31:53.420  INFO - Thread 37 wrote 10000 updates
19:31:53.424  INFO - Thread 36 wrote 10000 updates
.19:31:53.437  INFO - Thread 7 wrote 10000 updates
..19:31:53.478  INFO - Thread 29 wrote 10000 updates
.19:31:53.534  INFO - Thread 5 wrote 10000 updates
...19:31:53.693  INFO - Thread 9 wrote 10000 updates
....19:31:53.716  INFO - Thread 34 wrote 10000 updates
.19:31:53.735  INFO - Thread 15 wrote 10000 updates
...19:31:53.807  INFO - Thread 2 wrote 10000 updates
.19:31:53.836  INFO - Thread 18 wrote 10000 updates
..19:31:53.864  INFO - Thread 27 wrote 10000 updates
.19:31:53.891  INFO - Thread 31 wrote 10000 updates
..19:31:53.987  INFO - Thread 38 wrote 10000 updates
..19:31:54.007  INFO - Thread 23 wrote 10000 updates
.19:31:54.017  INFO - Thread 13 wrote 10000 updates
19:31:54.019  INFO - Thread 1 wrote 10000 updates
.19:31:54.042  INFO - Thread 32 wrote 10000 updates
...19:31:54.131  INFO - Thread 3 wrote 10000 updates
19:31:54.133  INFO - Thread 25 wrote 10000 updates
.19:31:54.158  INFO - Thread 6 wrote 10000 updates
.19:31:54.194  INFO - Thread 16 wrote 10000 updates
.19:31:54.259  INFO - Thread 21 wrote 10000 updates
..19:31:54.290  INFO - Thread 17 wrote 10000 updates
19:31:54.291  INFO - Thread 33 wrote 10000 updates
.19:31:54.293  INFO - Thread 35 wrote 10000 updates
.19:31:54.313  INFO - Thread 12 wrote 10000 updates
.19:31:54.318  INFO - Thread 20 wrote 10000 updates
.19:31:54.321  INFO - Thread 0 wrote 10000 updates
.19:31:54.326  INFO - Thread 8 wrote 10000 updates
.19:31:54.335  INFO - Thread 19 wrote 10000 updates
.19:31:54.377  INFO - Thread 30 wrote 10000 updates
19:31:54.379  INFO - Done (in time), saved docs: 360006, threads terminated with duplicate key exception: 4



 Comments   
Comment by Ramon Fernandez Marina [ 29/Oct/15 ]

Correct, the duplicate key error is sent back to the application, which must decide whether or not to retry (unless or until SERVER-14322 is addressed).

Comment by Kay Agahd [ 20/Oct/15 ]

Ok, I see. I had wrongly understood that the unique constraint would trigger a server-side retry of the update; instead, the purpose of the unique constraint is to throw a DuplicateKeyError, which has to be handled by the client, not by the database (unless SERVER-14322 is resolved).
Thanks Ramon for explaining!

Comment by Ramon Fernandez Marina [ 20/Oct/15 ]

Here's the sequence of events:

  • two updates come in with upsert:true for a document that doesn't exist yet
  • neither update finds the document, so both threads try to insert it
  • if both updates contain the same value for a field that has a unique index, this triggers a duplicate key violation

If there are no unique index constraints, you'll end up with two identical documents (except for _id). If this is not the behavior you want then, as the note in the update documentation indicates, you need to create a unique index on the query field.
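In other words, with a unique index (here the implicit one on _id) the thread that loses the insert race gets the E11000 and must retry on the client side. A minimal sketch of such a retry wrapper, assuming the legacy 2.x Java driver used in the attached test programs (names are hypothetical):

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;

public final class UpsertRetry {
    // Upsert by _id; if a concurrent thread wins the insert race, the first
    // attempt fails with E11000 and the retried update then matches the
    // now-existing document instead of inserting.
    static void upsertWithRetry(DBCollection col, String id, DBObject doc) {
        DBObject query = new BasicDBObject("_id", id);
        for (int attempt = 0; attempt < 3; attempt++) {
            try {
                col.update(query, doc, true /* upsert */, false /* multi */);
                return;
            } catch (MongoException.DuplicateKey e) {
                // another thread inserted the same _id first; fall through and retry
            }
        }
        throw new IllegalStateException("upsert kept failing with duplicate key");
    }
}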

Comment by Kay Agahd [ 20/Oct/15 ]

Ramon, I read the MongoDB docs you suggested. According to them, our upsert should not throw any duplicate key error, because we are querying by _id, which is uniquely indexed by default. Am I missing something? Thank you!

Comment by Kay Agahd [ 20/Oct/15 ]

We didn't know about this upsert restriction. Thanks Ramon!

Comment by Ramon Fernandez Marina [ 20/Oct/15 ]

kay.agahd@idealo.de, if I understand the issue you're describing correctly, this is expected behavior for both update and findAndModify when upsert is true.

SERVER-14322 is open to explore alternative behaviors in future versions of MongoDB, but in the current design the application needs to handle the duplicate key events and retry the operation.
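As the comment notes, this applies to findAndModify with upsert:true as well. A minimal sketch of such a call with the same client-side retry, again assuming the legacy 2.x driver and hypothetical names:

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DBObject;
import com.mongodb.MongoException;

public final class FindAndModifyUpsert {
    // findAndModify with upsert=true is subject to the same insert race,
    // so the caller catches E11000 and retries; on retry the document exists
    // and the update path matches it.
    static DBObject upsert(DBCollection col, String id, long fingerprint) {
        DBObject query  = new BasicDBObject("_id", id);
        DBObject update = new BasicDBObject("$set",
                new BasicDBObject("fingerprint", fingerprint));
        for (int attempt = 0; attempt < 3; attempt++) {
            try {
                return col.findAndModify(
                        query,   // query
                        null,    // fields (all)
                        null,    // sort
                        false,   // remove
                        update,  // update
                        true,    // returnNew
                        true);   // upsert
            } catch (MongoException.DuplicateKey e) {
                // lost the insert race; retry
            }
        }
        throw new IllegalStateException("upsert kept failing with duplicate key");
    }
}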

Regards,
Ramón.
