Type: Task
Resolution: Done
Priority: Major - P3
Affects Version/s: None
Component/s: Sharding
Sprint: Sharding 2018-11-19, Sharding 2018-12-03
The sharding catalog may see millions of chunks representing unsharded collections; we need to verify that this will not significantly affect the performance or functionality of the sharding catalog. This test may need more resources than others, so we will add a test suite that runs on a bigger AWS instance and is not run in parallel with other tests.
Implementation:
1. Create a new hook CreateManyUnshardedCollections
- Create a new resmoke hook in resmokelib/testing/hooks
- Define a function before_suite() that will create 1,000,000 collections spread across N databases, where N is a parameter set by the suite running this hook. For each collection, directly insert an entry into config.collections (with partitioned: false and a dummy shard key) and into config.chunks (with range: globalMin -> globalMax). This will run once at the start of a suite.
- Define a function after_suite() that will validate that the correct entries exist for the N databases in config.databases (partitioned should still be false and the primary shard should remain unchanged), for the 1,000,000 collections in config.collections (partitioned: false and the dummy shard key), and in config.chunks (the primary shard should remain unchanged, and min and max should be globalMin and globalMax).
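The document-generation logic for step 1 could be sketched as follows. This is a hypothetical helper, not the actual hook: the real hook would subclass the Hook base class in resmokelib/testing/hooks and insert these documents into the config server from before_suite(). The helper name make_catalog_entries, the database/collection naming scheme, and the exact document fields are illustrative; only partitioned: false, the dummy shard key, and the globalMin -> globalMax range come from the ticket text.

```python
# Sketch only: the real hook would subclass the Hook base class in
# buildscripts/resmokelib/testing/hooks and insert these documents into the
# config server's config.collections and config.chunks from before_suite().
# Field and helper names here are illustrative.

def make_catalog_entries(num_collections, num_databases):
    """Yield one (config.collections doc, config.chunks doc) pair per
    unsharded collection, spread round-robin across num_databases."""
    for i in range(num_collections):
        db_name = "manydb{}".format(i % num_databases)
        ns = "{}.coll{}".format(db_name, i)
        coll_doc = {
            "_id": ns,
            "partitioned": False,  # the collection is not actually sharded
            "key": {"_id": 1},     # dummy shard key
            "dropped": False,
        }
        chunk_doc = {
            "ns": ns,
            # A single chunk spanning the whole key space.
            "min": {"_id": {"$minKey": 1}},
            "max": {"_id": {"$maxKey": 1}},
        }
        yield coll_doc, chunk_doc
```

after_suite() would then re-read config.databases, config.collections, and config.chunks and assert that the same shapes are still present and unchanged.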
2. Create a new suite large_sharding_catalog_one_database_jscore_passthrough
- This will be a jscore passthrough suite
- This will use a sharded fixture
- This should include CreateManyUnshardedCollections as a hook and pass N = 1 to create all 1,000,000 collections in the same database
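A suite definition along these lines could express steps 2a-2c. This is a sketch: the file path, field names, hook parameter name, and fixture options are assumptions modeled on existing jscore passthrough suites, not verified against the repository.

```yaml
# buildscripts/resmokeconfig/suites/large_sharding_catalog_one_database_jscore_passthrough.yml
# Sketch only; structure follows existing jscore passthrough suites.
test_kind: js_test
selector:
  roots:
  - jstests/core/*.js
executor:
  hooks:
  - class: CreateManyUnshardedCollections
    n: 1  # all 1,000,000 collections in a single database
  fixture:
    class: ShardedClusterFixture
```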
3. Create a new task in etc/evergreen.yml to define large_sharding_catalog_one_database_jscore_passthrough
4. Define a new build variant in etc/evergreen.yml "~ Large Sharding Catalog Cache Enterprise RHEL 6.2" to run on rhel62-large
- This will only run compile and large_sharding_catalog_one_database_jscore_passthrough
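The Evergreen task and build variant (steps 3 and 4) might look roughly like this in etc/evergreen.yml. The function name "run tests", the variant id, and the key layout are assumptions following the existing patterns of that file.

```yaml
# etc/evergreen.yml (sketch; exact keys follow the file's existing patterns)
tasks:
- name: large_sharding_catalog_one_database_jscore_passthrough
  commands:
  - func: "run tests"
    vars:
      resmoke_args: --suites=large_sharding_catalog_one_database_jscore_passthrough

buildvariants:
- name: enterprise-rhel-62-large-sharding-catalog
  display_name: "~ Large Sharding Catalog Cache Enterprise RHEL 6.2"
  run_on:
  - rhel62-large
  tasks:
  - name: compile
  - name: large_sharding_catalog_one_database_jscore_passthrough
```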
To run the suite:
Locally:
python buildscripts/resmoke.py --suites=large_sharding_catalog_one_database_jscore_passthrough jstests/core/any_core_test.js
On Evergreen:
Submit a patch to the mongodb-mongo-master project and choose "~ Large Sharding Catalog Cache Enterprise RHEL 6.2" from the list of variants.
Future considerations:
- Create other build variants to run this suite on
- Create other suites that will spread the 1,000,000 collections over 100 or 1,000 databases
Related to:
- SERVER-38148 Test performance when reading from mongos with large catalog cache (Closed)