[SERVER-13861] Mongo crashes on duplicate key errors when there are no duplicates Created: 07/May/14 Updated: 10/Dec/14 Resolved: 20/May/14 |
|
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Replication |
| Affects Version/s: | 2.6.0 |
| Fix Version/s: | None |
| Type: | Bug | Priority: | Critical - P2 |
| Reporter: | Moshe Shperling | Assignee: | Thomas Rueckstiess |
| Resolution: | Cannot Reproduce | Votes: | 0 |
| Labels: | None | ||
| Remaining Estimate: | Not Specified | ||
| Time Spent: | Not Specified | ||
| Original Estimate: | Not Specified | ||
| Operating System: | ALL |
| Participants: |
| Description |
|
Hi, recently we have noticed a rather annoying issue. We have a collection called items_full which has a unique index on the field sku. When we run the following update query it crashes with a duplicate key error. When I look for the sku from the error message I get only one result. Any ideas? |
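A minimal sketch of the failure mode described above, for the mongo shell. The collection and field names are taken from the report; the document values and the exact update are assumptions, since the original query was not preserved in the ticket:

```javascript
// Assumed setup: a unique index on sku, as in the report.
db.items_full.ensureIndex({ sku: 1 }, { unique: true })

// Two upserts that target different documents but set the same sku
// value trip the unique constraint on the second write:
db.items_full.update({ _id: 1 }, { $set: { sku: "ABC-1" } }, { upsert: true })
db.items_full.update({ _id: 2 }, { $set: { sku: "ABC-1" } }, { upsert: true })
// The second update fails with an error of the form:
//   E11000 duplicate key error index: test.items_full.$sku_1  dup key: { : "ABC-1" }
```

The puzzle in this ticket is that the error fires even though a query for the offending sku returns only one document.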
| Comments |
| Comment by Moshe Shperling [ 21/May/14 ] |
|
Hi, indeed we cannot reproduce this issue with the indexes, but we have now found another issue that has to do with version 2.6. Where should I report it? Here, or should I open another ticket? Thanks |
| Comment by Thomas Rueckstiess [ 20/May/14 ] |
|
Hi Moshe, Thanks for letting us know. I'll resolve the ticket now as "Cannot Reproduce", but if it happens again, feel free to re-open the ticket and provide additional information on the incident. Regards, |
| Comment by Moshe Shperling [ 12/May/14 ] |
|
Hello, here is an update on this. When we copied the collection (items_full to items_full copy), the indexes were not duplicated; therefore, even though the duplicates are there, we cannot reproduce the issue. We ran the db.upgradeCheck() command and here is the result: We also scheduled index recreation in order to catch the duplicates when they appear. If/when we encounter this issue again, we will update this ticket. |
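The index recreation mentioned above can be done from the mongo shell. This is only a sketch, under the assumption that the unique index carries the default name sku_1 for an index on `{ sku: 1 }`:

```javascript
// Rebuild all indexes on the collection; the build re-inserts every
// document into each index, so any duplicate sku values in the data
// make it fail with an E11000 error naming the duplicate key:
db.items_full.reIndex()

// Or rebuild just the sku index:
db.items_full.dropIndex("sku_1")
db.items_full.ensureIndex({ sku: 1 }, { unique: true })
```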
| Comment by Thomas Rueckstiess [ 08/May/14 ] |
|
Hi Moshe, 1. We'd like to find out if the problem is in the data or the indexes. As an experiment, can you do the following steps:
2. We can see in the log file you pasted that there are "getlasterror" commands. These are deprecated with the new write commands in MongoDB 2.6 and the C# driver 1.9. Did you still run the bulk update with the C# driver 1.8 (or an older PHP driver, as you mentioned)? Can you repeat it with the 1.9 driver? 3. To narrow it down further, can you tell us the exact operations you use to insert/upsert/update the documents? I can see the bulk update call, but are there other operations that insert or modify the data? How did the duplicate skus get generated in the first place? 4. Finally, as we have been unsuccessful in reproducing the problem so far, would you be willing to share the data of this collection with us (securely) so we can reproduce the issue internally? Regards, |
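One way to check point 1 (whether the problem is in the data or the indexes) without relying on the possibly inconsistent index is to group the documents themselves. A sketch for the mongo shell, assuming the field name sku from the report:

```javascript
// Scan the documents and report any sku value held by more than one
// document, regardless of what the sku_1 index believes:
db.items_full.aggregate([
  { $group: { _id: "$sku", count: { $sum: 1 }, ids: { $push: "$_id" } } },
  { $match: { count: { $gt: 1 } } }
])
```

An empty result here with a failing unique-index build would point at a corrupt index rather than duplicate documents.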
| Comment by Moshe Shperling [ 08/May/14 ] |
|
One more thing: we use the PHP driver as well to update this collection. |
| Comment by Moshe Shperling [ 08/May/14 ] |
|
Hi Thomas, we did some research on this and have some more information, but first here is the information you asked for: 1. here is the log:
2. we have upgraded from 2.4.9. Now, here is some more information we have found.
we downgraded back to 2.4.9 today, and there:
The question now is: how did these duplicated skus (around 20) get past the unique index? Thanks for your help; we hope you come up with some idea regarding how the duplicates were created. |
| Comment by Thomas Rueckstiess [ 07/May/14 ] |
|
Hi Moshe, I have tried to reproduce this issue but haven't been able to so far. To diagnose this further, we would need the log file covering the entire update process. Can you share that with us? If it contains sensitive information I can provide a way to share it securely. If this happens repeatedly (and reproducibly), could you increase the log level to 2 before the next bulk update and provide the log file afterwards? Also some additional questions:
Thanks, |
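For reference, the log-level increase Thomas asks for can be done at runtime from the mongo shell (and reset the same way afterwards):

```javascript
// Raise the server's log verbosity before the bulk update:
db.adminCommand({ setParameter: 1, logLevel: 2 })
// ... run the bulk update, then collect the mongod log file ...
// Restore the default verbosity:
db.adminCommand({ setParameter: 1, logLevel: 0 })
```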
| Comment by Moshe Shperling [ 07/May/14 ] |
|
Hi, I deleted sku_1, but the same thing happens with sku_upper_1. Here is the exception I get from the C# driver's WriteConcern object: WriteConcern detected an error 'E11000 duplicate key error index:
When I look in the collection for this item (DSSWI#LUC_W_LP-12550-GY-014-GY) I get only one result. The db in general runs as usual; only updates to this collection fail to complete for all the items (only about 35% succeed, then it crashes). |
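A possible way to see whether that single result comes from the index rather than from the documents themselves is to compare an indexed lookup with a forced collection scan. A sketch using the sku value from the comment above (itcount() iterates the cursor client-side, so the hint is honored):

```javascript
// Lookup via the sku index:
db.items_full.find({ sku: "DSSWI#LUC_W_LP-12550-GY-014-GY" }).itcount()

// Forced collection scan that bypasses the index:
db.items_full.find({ sku: "DSSWI#LUC_W_LP-12550-GY-014-GY" }).hint({ $natural: 1 }).itcount()

// If the two counts differ, the index and the collection data disagree.
```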
| Comment by Michael Grundy [ 07/May/14 ] |
|
I don't see the "sku_1" index that the insert error references. Could you post the output of db.system.indexes.find( {name:"sku_1"}) ? Also, when you say crashes, is mongod also terminating with an exception, or is it just the update failing with the duplicate index error? Thanks! |
| Comment by Moshe Shperling [ 07/May/14 ] |
|
Here you go:
thanks |
| Comment by Michael Grundy [ 07/May/14 ] |
|
Yes, just the indexes from items_full should be fine. |
| Comment by Moshe Shperling [ 07/May/14 ] |
|
There are 191 indexes; that is quite a lot to print. Maybe you need just the indexes referring to the items_full collection? |
| Comment by Michael Grundy [ 07/May/14 ] |
|
Hi Moshe - Could you post the output from db.system.indexes.find() please? Thanks! |