- Type: Question
- Resolution: Done
- Priority: Major - P3
- Fix Version/s: None
- Affects Version/s: 2.6.2
- Component/s: Performance
- Environment: OSX 10.9.4, MongoDB 2.6.0
I am experiencing a slowdown when inserting documents into a collection with more than 650,000 documents in it. The data used is a Wikipedia content dump. It started fast, as always, and I was getting a decent ~1000 documents/second; however, the performance degraded after every batch of 3000 documents, and now, at 666,000 documents, the speed is less than 10 documents per second. My goal was to insert 1M documents to test full-text search speed, which now seems like it will take forever at this pace.
Is this by design?
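For context, the import loop looks roughly like this (a minimal mongo-shell sketch: the 3000-document batch size is from the measurements above, and the title/body fields match the text index below, but the document contents and variable names are illustrative, not my actual import script):

var batchSize = 3000;
var batch = [];
for (var i = 0; i < 1000000; i++) {
    // Illustrative documents; the real data comes from the Wikipedia dump.
    batch.push({ title: "page " + i, body: "article text..." });
    if (batch.length === batchSize) {
        var t0 = new Date();
        db.content.insert(batch);  // bulk insert of one batch
        var secs = (new Date() - t0) / 1000;
        print((i + 1) + " total docs, ~" + Math.round(batchSize / secs) + " docs/sec for this batch");
        batch = [];
    }
}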
Here are the indexes:
db.content.getIndexes()
[
    {
        "v" : 1,
        "key" : { "_id" : 1 },
        "name" : "_id_",
        "ns" : "mediawiki.content"
    },
    {
        "v" : 1,
        "key" : { "_fts" : "text", "_ftsx" : 1 },
        "name" : "fulltext_index",
        "ns" : "mediawiki.content",
        "weights" : { "body" : 1, "title" : 1 },
        "default_language" : "english",
        "language_override" : "language",
        "textIndexVersion" : 2
    }
]
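For reference, an index matching that definition would have been built with something like the following (a sketch reconstructed from the getIndexes() output; the exact command I used isn't shown here). The weights of 1 and default_language of "english" in the output are the 2.6 defaults, so they need not be specified:

db.content.ensureIndex(
    { title: "text", body: "text" },
    { name: "fulltext_index" }
)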
and the stats:
db.content.stats()
{
    "ns" : "mediawiki.content",
    "count" : 663429,
    "size" : 3283468976,
    "avgObjSize" : 4949,
    "storageSize" : 3918798848,
    "numExtents" : 22,
    "nindexes" : 2,
    "lastExtentSize" : 1021497344,
    "paddingFactor" : 1,
    "systemFlags" : 1,
    "userFlags" : 1,
    "totalIndexSize" : 6679227856,
    "indexSizes" : {
        "_id_" : 21560112,
        "fulltext_index" : 6657667744
    },
    "ok" : 1
}