As evaluk tries to establish an expected response-time baseline for a simple set of performance tests, we have decided that the test suite needs to be reorganized so that we can interpret the results:
- the tests need to be isolated – e.g. by introducing cool-down and warm-up times between tests. Currently the tests directly influence each other, which makes it impossible to attribute performance regressions to particular APIs
- we need a smaller number of vetted test cases (workflows) – many of the current tests exercise raw APIs with artificial parameters. That is too large a surface area and too many conditions to optimize for. We should focus on tests that represent real-life load on the system.
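As a sketch of the isolation idea: each test could run a warm-up phase first, then a measured phase, then pause for a cool-down before the next test starts. The function and constant names below are illustrative, not our actual harness, and the durations are untuned placeholders:

```python
import time
from statistics import median

WARMUP_ITERATIONS = 5   # illustrative value, not tuned
COOLDOWN_SECONDS = 30   # illustrative value, not tuned

def run_isolated(test_fn, iterations=20, cooldown=COOLDOWN_SECONDS):
    """Run one perf test with warm-up and cool-down so tests don't bleed into each other."""
    # Warm-up phase: prime caches/JIT without recording timings.
    for _ in range(WARMUP_ITERATIONS):
        test_fn()
    # Measured phase: record one latency sample per iteration.
    samples = []
    for _ in range(iterations):
        start = time.perf_counter()
        test_fn()
        samples.append(time.perf_counter() - start)
    # Cool-down: let the system settle before the next test begins.
    time.sleep(cooldown)
    return median(samples)
```

Reporting the median rather than the mean keeps a single outlier run from skewing the baseline.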
Tasks to be performed:
- Cross-check existing tests against 2 real-world workflows.
- Examine all large queries from the standpoint of real-world usage.
- Set up a new Jenkins job that includes only verified tests:
- Include only the tests listed below.
- Delete/disable the other tests.
- Add cool-down time between tests to isolate them.
- Set up test parameters to match the number of threads used in our baseline test configuration.
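To keep new results comparable with the baseline, the load generator could take the thread count as an explicit parameter. A minimal sketch, assuming a `ThreadPoolExecutor`-style driver; `BASELINE_THREADS` is a placeholder, not our actual baseline setting:

```python
import time
from concurrent.futures import ThreadPoolExecutor

BASELINE_THREADS = 8  # placeholder: set to the thread count from the baseline run

def run_with_threads(test_fn, requests_per_thread=10, threads=BASELINE_THREADS):
    """Drive test_fn from a fixed number of worker threads; return per-request latencies."""
    def worker():
        latencies = []
        for _ in range(requests_per_thread):
            start = time.perf_counter()
            test_fn()
            latencies.append(time.perf_counter() - start)
        return latencies
    # One worker task per thread, so concurrency matches the configured level.
    with ThreadPoolExecutor(max_workers=threads) as pool:
        futures = [pool.submit(worker) for _ in range(threads)]
        return [lat for fut in futures for lat in fut.result()]
```

Making the thread count a single parameter means the Jenkins job can pin it to the baseline value rather than letting each test choose its own concurrency.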
Tests to be included in the initial job: