- Type: Task
- Resolution: Won't Do
- Priority: Major - P3
- Affects Version/s: None
- Component/s: None
- Labels: None
TOOLS-1856 supplied a non-formalized benchmarking paradigm within the Tools. The setup used there could clearly be generalized into a benchmark framework shared across the Tools.
In general, a benchmark framework should satisfy the following requirements:
- Parametrize the environment setup, e.g. with a Go function that sets environment variables, starts a database server (if necessary), etc.
- Parametrize the function under benchmark, e.g. with a Go function that specifies an idempotent (i.e. repeatable) action the benchmark runtime can safely run many times
- Programmatically output benchmark data, e.g. using runtime/pprof
- Optional: An Evergreen task to generate a visualization of performance metrics as a static website (see the --http flag of pprof)
We could include such a framework as part of MTC for use across the Tools (and possibly mongomirror).