It should run mongoebench with the various JSON config files in the src/third_party/mongo-perf/mongoebench/ directory, which have been vendored into the source tree as part of the changes from
This involves creating a new buildscripts/resmokelib/testing/testcases/mongoebench_test.py test case that executes mongoebench with the appropriate arguments. For example, the value of the --benchmarkMinTimeSecs command line option should be forwarded to mongoebench as its --time command line option.
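As a rough illustration of the option forwarding, here is a minimal sketch of what the new test case might look like. The class name, constructor signature, and helper function are assumptions for illustration only and do not reflect the actual resmoke test case interfaces:

```python
import json
import subprocess


def build_mongoebench_command(executable, config_file, min_time_secs, output_file):
    """Translate resmoke-level options into a mongoebench invocation.

    The resmoke --benchmarkMinTimeSecs value is forwarded to mongoebench
    as its --time command line option.
    """
    return [
        executable,
        config_file,
        "--time", str(min_time_secs),
        "--output", output_file,
    ]


class MongoeBenchTestCase:
    """Hypothetical test case that runs mongoebench once against a single
    vendored JSON config file."""

    def __init__(self, executable, config_file, min_time_secs, output_file):
        self.command = build_mongoebench_command(
            executable, config_file, min_time_secs, output_file)
        self.output_file = output_file

    def run_test(self):
        # Run mongoebench and return the parsed stats file so a results
        # hook can accumulate them.
        subprocess.check_call(self.command)
        with open(self.output_file) as fp:
            return json.load(fp)
```

The point of the separate build_mongoebench_command helper is only to make the option translation explicit; how the real test case locates the mongoebench binary and names its output file is left to the implementation.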
This also involves creating a new hook, similar to the CombineBenchmarkResults hook, that parses the JSON stats file mongoebench writes to the path given by its --output command line option (added in SERVER-36073). The new hook should accumulate the benchmark results of all the test cases run as part of the test suite and serialize them as a JSON file (taking its name from the --perfReportFile command line option) that the json.send Evergreen command can use to display the performance results. The test case should also handle the --benchmarkRepetitions command line option in Python (as there is no equivalent mongoebench option to forward it to) and accumulate the benchmark results of multiple executions.
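To make the accumulation behavior concrete, here is a sketch of the results hook. The class name, method names, and report layout are assumptions; the real report format must match whatever the Evergreen performance plugin expects from json.send:

```python
import json
from collections import defaultdict


class CombineMongoeBenchResults:
    """Hypothetical hook that accumulates per-test mongoebench stats and
    serializes them into the --perfReportFile for Evergreen's json.send
    command."""

    def __init__(self, perf_report_file):
        self.perf_report_file = perf_report_file
        # Maps test name -> list of stats dicts, one per repetition.
        self.results = defaultdict(list)

    def after_test(self, test_name, stats):
        # Called once per execution; because --benchmarkRepetitions is
        # handled in Python, repeated runs of the same test accumulate here.
        self.results[test_name].append(stats)

    def after_suite(self):
        # Serialize all accumulated results into a single JSON report.
        report = {
            "results": [
                {"name": name, "results": repetitions}
                for name, repetitions in self.results.items()
            ]
        }
        with open(self.perf_report_file, "w") as fp:
            json.dump(report, fp)
        return report
```

Keeping one list of stats per test name means the repetition handling and the cross-test accumulation fall out of the same data structure.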
We may find it beneficial to define separate test suites that each run a subset of the test cases, similar to what is done in the performance Evergreen project when these test cases are run with benchrun.py, to avoid having a single Evergreen task run for a long time.