Type: Improvement
Resolution: Won't Fix
Priority: Major - P3
Affects Version/s: 2.2.2
Component/s: Performance
While testing with the C++ driver, I noticed that when all of the inserted keys were duplicates, performance was much slower. That doesn't make sense, since it's not supposed to do any work in that case.
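The original C++ test isn't included in this report; the following is only a minimal sketch of that kind of loop, assuming the legacy C++ driver (mongo::DBClientConnection), its usual header name, and a mongod on localhost:27017. Run twice against the same collection, every insert of the second pass uses an _id that already exists.

// Sketch only (not the original test code): insert 1M documents with
// sequential _id values via the legacy C++ driver. Header name, connection
// string and timing are assumptions; run it twice against the same
// collection to reproduce the duplicate-_id case described above.
#include <ctime>
#include <iostream>
#include "mongo/client/dbclient.h"

int main() {
    mongo::DBClientConnection conn;
    std::string err;
    if (!conn.connect("localhost:27017", err)) {  // assumes a local mongod
        std::cerr << "connect failed: " << err << std::endl;
        return 1;
    }
    std::time_t start = std::time(0);
    for (int count = 0; count < 1000000; ++count) {
        conn.insert("test.foo", BSON("_id" << count));  // duplicate _id on the 2nd run
    }
    std::cout << "elapsed seconds: " << (std::time(0) - start) << std::endl;
    return 0;
}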
I tried the same thing with a very simple loop in the JS shell, and it gives an even worse result:
var start = new Date();
for (var count = 0; count < 1000000; ++count) {
    db.foo.insert({_id: count});
}
print(new Date() - start);
Collection is empty:
antoine@ag410:~/Downloads/mongodb-linux-x86_64-2.2.2$ ./bin/mongo ~/adobe/testwrites.js
MongoDB shell version: 2.2.2
connecting to: test
12527
2nd run without dropping:
antoine@ag410:~/Downloads/mongodb-linux-x86_64-2.2.2$ ./bin/mongo ~/adobe/testwrites.js
MongoDB shell version: 2.2.2
connecting to: test
78153
This is always reproducible.
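For what it's worth, each insert in the second run does hit a duplicate key error on _id, and with the default fire-and-forget writes that error is only visible through getLastError. Again a sketch against the legacy C++ driver, not code from the original test:

// Sketch only: an insert that hits an existing _id does not throw with
// fire-and-forget writes; the duplicate key error is only visible through
// getLastError. After the first run has populated the collection, this is
// expected to print an E11000 duplicate key error.
#include <iostream>
#include "mongo/client/dbclient.h"

int main() {
    mongo::DBClientConnection conn;
    std::string err;
    if (!conn.connect("localhost:27017", err)) {
        std::cerr << "connect failed: " << err << std::endl;
        return 1;
    }
    conn.insert("test.foo", BSON("_id" << 0));      // _id 0 already exists after run 1
    std::cout << conn.getLastError() << std::endl;  // empty string means no error
    return 0;
}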