tempesta-tech / tempesta

All-in-one solution for high performance web content delivery and advanced protection against DDoS and web attacks
https://tempesta-tech.com/
GNU General Public License v2.0

Test: automated performance testing suite #781

Open krizhanovsky opened 7 years ago

krizhanovsky commented 7 years ago

Scope

Need to develop a performance testing suite for:

All the tests above must compare Tempesta FW against:

- HAProxy
- nginx
- Envoy

The tests must measure:

These tests must run in two environments:

The tests must run periodically on CI in a smoke (short) mode, and also as a full run that includes the comparisons against the other web servers.

The test results should be stored on a server filesystem along with the configuration and system statistics (memory and CPU usage at first). Benchmark results must also be stored as text files together with the command line used to run the benchmark.
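
For illustration only, a minimal Python sketch of such a run-storage step; the `wrk` command line, the `/var/lib/perf-results` path, and the chosen stat fields are assumptions, not anything agreed in this issue:

```python
#!/usr/bin/env python3
"""Hypothetical sketch: store one benchmark run with its command line,
Tempesta FW configuration, and coarse system statistics."""
import json
import pathlib
import shutil
import subprocess
import time

RESULTS_ROOT = pathlib.Path("/var/lib/perf-results")               # assumed path
BENCH_CMD = ["wrk", "-t8", "-c512", "-d30s", "https://tfw.test/"]  # assumed command


def snapshot_system_stats() -> dict:
    """Collect the memory/CPU numbers the issue asks to store."""
    stats = {"loadavg": pathlib.Path("/proc/loadavg").read_text().strip()}
    for line in pathlib.Path("/proc/meminfo").read_text().splitlines():
        key, _, value = line.partition(":")
        if key in ("MemTotal", "MemFree", "MemAvailable"):
            stats[key] = value.strip()
    return stats


def run_and_store(tempesta_conf: pathlib.Path) -> pathlib.Path:
    """Run the benchmark once and store everything in a timestamped dir."""
    run_dir = RESULTS_ROOT / time.strftime("%Y%m%d-%H%M%S")
    run_dir.mkdir(parents=True)
    shutil.copy(tempesta_conf, run_dir / "tempesta_fw.conf")
    (run_dir / "cmdline.txt").write_text(" ".join(BENCH_CMD) + "\n")
    (run_dir / "stats_before.json").write_text(json.dumps(snapshot_system_stats()))
    proc = subprocess.run(BENCH_CMD, capture_output=True, text=True, check=False)
    (run_dir / "benchmark.txt").write_text(proc.stdout)
    (run_dir / "stats_after.json").write_text(json.dumps(snapshot_system_stats()))
    return run_dir
```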

The CI jobs for the smoke performance tests must plot a Grafana graph so that each run can be compared with previous runs and the trend observed.
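
One way such a CI job could feed Grafana is to push a data point to a Graphite-compatible datasource over its plaintext protocol; the host, port, and metric path below are assumptions:

```python
"""Hypothetical sketch: push one smoke-run metric to a Graphite-compatible
datasource that Grafana graphs."""
import socket
import time


def push_metric(name: str, value: float) -> None:
    # Graphite plaintext protocol: "<path> <value> <timestamp>\n".
    line = f"{name} {value} {int(time.time())}\n"
    with socket.create_connection(("metrics.ci.local", 2003), timeout=5) as sock:
        sock.sendall(line.encode())


# E.g. after a smoke run measuring requests per second:
push_metric("tempesta.smoke.rps", 183000.0)
```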

Representing performance measurements

The benchmark runs must be cleaned to avoid deviations in the results. Different projects use 3-25 runs to get clean data and use different approaches to cleaning:

See https://bencher.dev/docs/explanation/thresholds/
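
For illustration, a minimal sketch of one such cleaning approach: repeat the benchmark N times, drop interquartile-range (IQR) outliers, and report the median. The sample values and the 1.5 multiplier are assumptions; the IQR rule is just one of the approaches the page above discusses:

```python
"""Hypothetical sketch: IQR-based outlier rejection over repeated runs."""
import statistics


def clean_runs(samples: list, k: float = 1.5) -> list:
    """Keep only samples within k*IQR of the quartiles."""
    q1, _, q3 = statistics.quantiles(samples, n=4)  # quartile cut points
    lo, hi = q1 - k * (q3 - q1), q3 + k * (q3 - q1)
    return [s for s in samples if lo <= s <= hi]


# Made-up RPS samples: one run hit a cold cache and is far off.
runs = [181200, 183400, 182900, 97300, 183100, 182500, 183800, 182000, 183300]
kept = clean_runs(runs)
print(f"median of {len(kept)}/{len(runs)} runs: {statistics.median(kept):.0f} RPS")
```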

References

The following issues address problems that the test suite should reveal, but which require manual work.

https://github.com/nyrkio/dsi - automated performance regression testing in Python, inherited from MongoDB

https://github.com/bencherdev/bencher - similar project in Rust

ykargin commented 5 months ago

Performance Testing Plan

1. Existing Stress Tests:
    They can be reused for performance testing.
    A configuration with a reasonable number of requests and parameters needs to be written.

2. Grafana for Results Visualization:
    Determine how to calculate the metrics for each test.
    Initially, a single metric per test is sufficient (4 metrics in total).

3. CI (Continuous Integration):
    Set up a dedicated worker (virtual machine) for running performance tests.
    Create a separate pipeline that runs only on that dedicated virtual machine.

4. Reporting Script:
    Develop a script (see the sketch after this list) that will:
        - Report results.
        - Log installed packages.
        - Store all information locally in an archived format.

5. Grafana Charts:
    Draw a separate chart in Grafana for each of the 4 test suites.

6. Running Tests against HAProxy/nginx/Envoy:
    Add runs of these tests against HAProxy, nginx, and Envoy.
    Display the results on the charts of the corresponding test suites.
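
A minimal sketch of what the reporting script from step 4 could look like, assuming a Debian-based worker (`dpkg-query`) and local tar.gz archives; all paths are hypothetical:

```python
"""Hypothetical sketch of step 4: log installed packages and archive a run."""
import pathlib
import subprocess
import tarfile
import time


def archive_run(run_dir: pathlib.Path, archive_root: pathlib.Path) -> pathlib.Path:
    # Log installed packages next to the results (assumes a dpkg-based system).
    pkgs = subprocess.run(["dpkg-query", "-W"], capture_output=True, text=True)
    (run_dir / "packages.txt").write_text(pkgs.stdout)
    # Store all information locally in an archived format, as the plan says.
    archive = archive_root / f"perf-{time.strftime('%Y%m%d-%H%M%S')}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(run_dir, arcname=run_dir.name)
    return archive
```
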
krizhanovsky commented 4 months ago

We agreed on the call that we'll go with adjusting our existing code from https://github.com/tempesta-tech/tempesta-test/ to build the performance regression test suite.
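
For illustration, the kind of regression gate such a suite could end with: compare the current result with the previously stored one and fail the CI job on a relative drop. The JSON history file and the 5% threshold are assumptions, not anything decided in this thread:

```python
"""Hypothetical sketch: fail CI when the measured RPS regresses."""
import json
import pathlib
import sys


def check_regression(results_file: pathlib.Path, current_rps: float,
                     max_drop: float = 0.05) -> None:
    """Exit non-zero if current_rps is more than max_drop below the baseline."""
    history = json.loads(results_file.read_text()) if results_file.exists() else []
    if history and current_rps < history[-1] * (1 - max_drop):
        sys.exit(f"regression: {current_rps:.0f} RPS vs "
                 f"{history[-1]:.0f} RPS baseline")
    history.append(current_rps)
    results_file.write_text(json.dumps(history))
```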