Type: Bug
Resolution: Duplicate
Priority: Blocker - P1
Affects Version/s: 3.2.7, 3.2.12
Component/s: Index Maintenance
Operating System: ALL
I have a unique index with a partialFilterExpression on a collection, but duplicate data is sometimes inserted.
Index creation
getCollection().createIndex(
        new BasicDBObject(userId, 1),
        new BasicDBObject("name", "uidx-something-user")
                .append("partialFilterExpression",
                        new BasicDBObject(Properties.something, new BasicDBObject("$eq", true)))
                .append("unique", true));
The index from the getIndexes command
{
    "v" : 1,
    "unique" : true,
    "key" : {
        "userId" : 1
    },
    "name" : "uidx-something-user",
    "ns" : "somewhere.something",
    "partialFilterExpression" : {
        "something" : {
            "$eq" : true
        }
    }
}
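For context, the expected behaviour of such a partial unique index is that a second insert matching the filter with the same userId is rejected. Below is a minimal sketch of that expectation, assuming the legacy Java driver API used in the creation snippet above; the connection details and class name are illustrative, only the "somewhere.something" namespace is taken from the index output.

import com.mongodb.BasicDBObject;
import com.mongodb.DBCollection;
import com.mongodb.DuplicateKeyException;
import com.mongodb.MongoClient;

public class PartialUniqueIndexExpectation {
    public static void main(String[] args) {
        // Illustrative connection; "somewhere.something" comes from the index namespace above.
        MongoClient client = new MongoClient("localhost", 27017);
        DBCollection collection = client.getDB("somewhere").getCollection("something");

        // Both documents match the partial filter (something: true) and share the same userId.
        collection.insert(new BasicDBObject("userId", "1068").append("something", true));
        try {
            collection.insert(new BasicDBObject("userId", "1068").append("something", true));
        } catch (DuplicateKeyException e) {
            // With a correctly maintained partial unique index, the second insert should land here.
            System.out.println("Duplicate rejected as expected: " + e.getMessage());
        }
        client.close();
    }
}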
The duplicated documents
{
    "_id" : "08a8506c-bcbc-4ed6-9972-67fd7c37b4bc",
    "userId" : "1068",
    "express" : false,
    "something" : true,
    "items" : [ ],
    "recipient" : {
        "_id" : "efbd8618-c480-4194-964e-f5a821edf695"
    }
}
{
    "_id" : "b6695c6a-f29d-4531-96ac-795f14c72547",
    "userId" : "1068",
    "express" : false,
    "something" : true,
    "items" : [ ],
    "recipient" : {
        "_id" : "4f93fe38-edb2-4cb7-a1b3-c2c51ac8ded1"
    }
}
MongoDB version: 3.2.7; it also seems to happen with 3.2.12.
A side note: when dumping the collection and restoring it, a duplicate key error is thrown.
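That behaviour is consistent with the unique index being rebuilt from scratch over data that already contains duplicates during the restore. A sketch of such a round trip, assuming the database and collection names from the index namespace above and an illustrative output directory:

mongodump --db somewhere --collection something --out dump/
mongorestore --drop --db somewhere --collection something dump/somewhere/something.bson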
Why is it sometimes possible to insert duplicate data, and how can this be avoided?
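As a related point, the duplicates that have already slipped in can be located by grouping the documents that match the partial filter by userId. Below is a sketch using the same legacy driver API; getCollection() is assumed to be the same helper as in the creation snippet above, and the snippet is meant to run in that context.

import java.util.Arrays;
import java.util.List;
import com.mongodb.BasicDBObject;
import com.mongodb.DBObject;

// Group documents matching the partial filter (something: true) by userId
// and keep only the groups with more than one member.
List<DBObject> pipeline = Arrays.asList(
        new BasicDBObject("$match", new BasicDBObject("something", true)),
        new BasicDBObject("$group", new BasicDBObject("_id", "$userId")
                .append("count", new BasicDBObject("$sum", 1))),
        new BasicDBObject("$match", new BasicDBObject("count", new BasicDBObject("$gt", 1))));

for (DBObject duplicate : getCollection().aggregate(pipeline).results()) {
    System.out.println(duplicate);
}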
Duplicates: SERVER-28546 Documents can erroneously be unindexed from a partial index (Closed)