[SERVER-20392] Sharding an existing small collection results in large number of chunks Created: 14/Sep/15 Updated: 28/Aug/18 Resolved: 09/Aug/17
| Status: | Closed |
| Project: | Core Server |
| Component/s: | Sharding |
| Affects Version/s: | 3.0.6, 3.2.11, 3.4.0 |
| Fix Version/s: | 3.4.9, 3.5.12 |
| Type: | Bug | Priority: | Major - P3 |
| Reporter: | Yves Duhem | Assignee: | Kevin Pulo |
| Resolution: | Done | Votes: | 5 |
| Labels: | None |
| Remaining Estimate: | Not Specified |
| Time Spent: | Not Specified |
| Original Estimate: | Not Specified |
| Attachments: | |
| Issue Links: | |
| Backwards Compatibility: | Minor Change |
| Operating System: | ALL |
| Backport Requested: | v3.4 |
| Steps To Reproduce: | To reproduce: attached is a javascript file with these steps. |
| Sprint: | Sharding 2017-01-02, Sharding 2017-07-31 |
| Participants: | |
| Case: | (copied to CRM) |
| Description |
|
Creating a small collection (roughly under 4 MB) and then sharding it triggers a multi-split that generates a large number of chunks (possibly thousands). The behavior was observed with several variations. It did not occur on 2.6.11, where only a few chunks are created (3 in the reported case of 20007 documents). |
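The attached javascript file is not reproduced in this export; a minimal reproduction along the lines the description suggests might look as follows (database, collection, and shard key names are illustrative, and it must be run through a mongos of a sharded cluster):

```
// Hypothetical reproduction sketch -- names and counts are illustrative.
var testDB = db.getSiblingDB("test");
testDB.small.drop();

// Insert ~20000 small documents (the report mentions 20007 docs, ~4 MB total).
var bulk = testDB.small.initializeUnorderedBulkOp();
for (var i = 0; i < 20007; i++) {
    bulk.insert({_id: i, x: i});
}
bulk.execute();

testDB.small.createIndex({x: 1});
sh.enableSharding("test");
sh.shardCollection("test.small", {x: 1});

// On affected versions (3.0.6, 3.2.11, 3.4.0), this reports far more
// chunks than expected for such a small collection.
print(db.getSiblingDB("config").chunks.count({ns: "test.small"}));
```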
| Comments |
| Comment by Githook User [ 25/Aug/17 ] |
| Author: Kevin Pulo (devkev) <kevin.pulo@mongodb.com> |
| Message: Plus some additional 3.4-specific jstest fixes. (cherry picked from commit ad6a668da49c61a4276749aef7529088dc3524ea) |
| Comment by Githook User [ 09/Aug/17 ] |
| Author: Kevin Pulo (devkev) <kevin.pulo@mongodb.com> |
| Message: |
| Comment by Pedro Rocha Goncalves [ 15/Mar/17 ] |
| I believe I was affected by this as well. This seems like an unusually large number of chunks for such a small collection. The number of chunks was actually close to 1k, but we merged the empty ones. Unfortunately, our script no longer finds any more empty chunks to merge. We're running MongoDB 3.2. |
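Pedro's script is not attached; the empty-chunk cleanup he describes would presumably be built around the `mergeChunks` admin command, roughly as sketched below (namespace and bounds are illustrative, and the chunks to merge must be contiguous and on the same shard):

```
// Hypothetical sketch of merging two adjacent chunks -- run through mongos.
// A real cleanup script would walk config.chunks, identify adjacent chunks
// containing zero documents (e.g. via the dataSize command), and merge them.
db.adminCommand({
    mergeChunks: "test.small",
    bounds: [ { x: MinKey }, { x: 100 } ]  // illustrative contiguous range
});
```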
| Comment by Hoyt Ren [ 16/Mar/16 ] |
| It seems I am hitting the same problem. Incidentally, why is the index so large? The indexed field is just an integer. Let me know if you need more detailed info. Output of db.pop_store.stats(): |