Details
- Bug
- Status: Closed
- Major - P3
- Resolution: Won't Do
Description
Hello!
I've just been investigating some undocumented behavior of unordered bulk execution.
A simple example (Node.js):
// 1. initialize
const bulk = collection.initializeUnorderedBulkOp();

// 2. populate data
bulk.insert(doc1);
bulk.insert(doc2);
bulk.insert(doc3);
...
bulk.insert(doc10000);

// 3. execute
bulk.execute().catch(error => {
  // Ignore duplicate-key errors, because this approach works faster
  // than find().upsert().update().
  // Yes, it's a crutch, but it works :)
  if (error.code === 11000) {
    return;
  }
  throw error;
});
So... Sometimes the bulk object contains more than a single batch (when the input is large). And if any batch contains a document that duplicates a unique key, the following batches are not processed at all.
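For intuition, the driver splits one logical bulk into several batches when server limits (operation count, total BSON size) are exceeded. The toy sketch below illustrates count-based splitting only, with a made-up limit; it is not the driver's actual algorithm, which also splits on payload size:

```javascript
// Toy illustration: split a list of operations into batches of at most
// `limit` operations. Real drivers also split on total BSON size.
function splitIntoBatches(ops, limit) {
  const batches = [];
  for (let i = 0; i < ops.length; i += limit) {
    batches.push(ops.slice(i, i + limit));
  }
  return batches;
}

// 10000 hypothetical operations, hypothetical per-batch limit of 4000:
const ops = Array.from({ length: 10000 }, (_, i) => ({ _id: i }));
const batches = splitIntoBatches(ops, 4000);
console.log(batches.length);    // 3
console.log(batches[0].length); // 4000
```

With unordered execution one would expect every batch to still be sent; the bug reported here is that batches after a failing one are skipped.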
Right now I got three batches. The first of them has 8547 operations. In bulkResult I found 7 errors (duplicates), so after execution the target collection had grown by 8540 documents. I usually use 'getWriteErrors()' to collect problems, but (surprise) the operations from the second and subsequent batches reported no errors, because they were never processed.
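One way to keep the "ignore duplicates" shortcut without swallowing other failures is to inspect the individual write errors on the rejection. The error shape below (a writeErrors array of { code } entries) mirrors what the driver's bulk-write rejection exposes, but the helper name and the mock objects are hypothetical:

```javascript
// Hypothetical helper: returns true only when every write error in a
// BulkWriteError-like object is a duplicate-key error (code 11000),
// so the caller can safely ignore the rejection.
function onlyDuplicateKeyErrors(error) {
  const writeErrors = error.writeErrors || [];
  return writeErrors.length > 0 && writeErrors.every(e => e.code === 11000);
}

// Mock objects standing in for real driver rejections:
const allDups = { writeErrors: [{ code: 11000 }, { code: 11000 }] };
const mixed   = { writeErrors: [{ code: 11000 }, { code: 121 }] };

console.log(onlyDuplicateKeyErrors(allDups)); // true
console.log(onlyDuplicateKeyErrors(mixed));   // false
```

In a real catch handler one could rethrow whenever this helper returns false, instead of keying off a single top-level error.code.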
I tried to use the 'continueOnError' flag, but it didn't have any effect.
Now I use version 4.2.7, but about a month ago I upgraded from 3.x. I don't remember this problem before the upgrade, but after a quick analysis of the resulting collections I suspect this problem may also be present in previous versions...
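In current drivers, the collection-level bulkWrite() with { ordered: false } is the usual way to ask that every operation be attempted regardless of individual failures. The sketch below only builds the operations array, which runs without a server; the commented-out call is the assumed usage, and the docs and collection names are hypothetical:

```javascript
// Build insertOne operations for an unordered bulkWrite.
// Hypothetical sample data: note the duplicated _id of 2.
const docs = [{ _id: 1 }, { _id: 2 }, { _id: 2 }, { _id: 3 }];
const operations = docs.map(doc => ({ insertOne: { document: doc } }));

// With a real collection handle one would then call (requires a server):
// await collection.bulkWrite(operations, { ordered: false })
//   .catch(error => { /* error.writeErrors should list each failed insert */ });

console.log(operations.length);                    // 4
console.log(operations[0].insertOne.document._id); // 1
```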
Scope of changes
Impact to Other Docs
MVP (Work and Date)
Resources (Scope or Design Docs, Invision, etc.)
Attachments
Issue Links
- related to NODE-2619 Unordered bulk write aborts on first encountered error (Closed)