Core Server / SERVER-15674

Auto split of low top chunk does not move it to appropriate shard node

    • Type: Bug
    • Resolution: Done
    • Priority: Major - P3
    • Fix Version/s: 2.8.0-rc3
    • Affects Version/s: 2.7.7
    • Component/s: Sharding
    • Operating System: ALL

      In a 4-node sharded cluster, configure as follows:

      • Pre-split the collection so that it is imbalanced (a large number of chunks on the shards that chunks will be moved from).
      • shard0 - low "top chunk", Tag "NYC", 10 chunks, -infinity -> 0
      • shard1 - middle chunks, Tag "SF", 10 chunks, 0 -> 500
      • shard2 - high "top chunk", Tag "NYC", 2 chunks, 1000 -> infinity
      • shard3 - middle chunks, Tag "SF", 1 chunk, 500 -> 1000

      Insert into the low "top chunk" until it auto splits and creates a new chunk.
      The new low "top chunk" should be created and reside on shard2.
      The previous low "top chunk" should still reside on shard0.
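The setup above could be scripted in the mongo shell roughly as follows. This is a hedged sketch, not a script from the ticket: the database/collection name (`test.foo`), the shard key (`_id`), and the shard names are assumptions, and only the pre-splits for the low "NYC" range are shown.

```javascript
// Hypothetical mongos session reproducing the described layout.
// Names ("test.foo", "_id" shard key, "shard0".."shard3") are assumptions.
sh.enableSharding("test");
sh.shardCollection("test.foo", { _id: 1 });

// Tag the shards per the layout in the description.
sh.addShardTag("shard0", "NYC");
sh.addShardTag("shard2", "NYC");
sh.addShardTag("shard1", "SF");
sh.addShardTag("shard3", "SF");

// Tag ranges: MinKey -> 0 and 1000 -> MaxKey are "NYC", the middle is "SF".
sh.addTagRange("test.foo", { _id: MinKey }, { _id: 0 }, "NYC");
sh.addTagRange("test.foo", { _id: 0 }, { _id: 1000 }, "SF");
sh.addTagRange("test.foo", { _id: 1000 }, { _id: MaxKey }, "NYC");

// Pre-split the low range so shard0 holds 10 chunks (MinKey -> 0).
// Splits for the "SF" middle ranges would be added the same way.
for (var i = -9; i < 0; i++) {
    sh.splitAt("test.foo", { _id: i * 100 });
}
```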


      "Top chunks" are the special chunks with MinKey or MaxKey in their range. A low "top chunk" is the chunk whose range begins at MinKey. When an auto split is required on a top chunk, the auto split should perform the split and then move the new top chunk to another shard if required.
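The expected placement decision can be sketched as a small standalone function (this is an illustration of the behavior the ticket expects, not the server's actual code): among the shards tagged for the new top chunk's range, the chunk should land on the one holding the fewest chunks.

```javascript
// Sketch (not server code): pick the destination shard for a newly split
// top chunk — the least-loaded shard carrying the chunk's tag.
function targetShardForTopChunk(tag, shardTags, chunkCounts) {
    // shardTags: shard name -> array of tags; chunkCounts: shard name -> count.
    const candidates = Object.keys(shardTags)
        .filter((s) => shardTags[s].includes(tag));
    return candidates.reduce(
        (best, s) => (chunkCounts[s] < chunkCounts[best] ? s : best)
    );
}

// Layout from the description: shard0/shard2 are "NYC", shard1/shard3 "SF".
const shardTags = {
    shard0: ["NYC"], shard1: ["SF"],
    shard2: ["NYC"], shard3: ["SF"],
};
const chunkCounts = { shard0: 10, shard1: 10, shard2: 2, shard3: 1 };

// The new low top chunk (MinKey -> split point, tag "NYC") should move to
// shard2, the least-loaded "NYC" shard, not stay on the overloaded shard0.
console.log(targetShardForTopChunk("NYC", shardTags, chunkCounts)); // shard2
```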

            Assignee:
            Randolph Tan (randolph@mongodb.com)
            Reporter:
            Jonathan Abrahams (jonathan.abrahams)
            Votes:
            0
            Watchers:
            3
