Maybe you can check this one also: https://github.com/tsenart/vegeta
Apache ab and Vegeta are very focused on hammering a single URL. I think we want something closer to the user, where we can define scenarios that represent more of what a real user would do.
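For reference, a typical Vegeta run just drives one endpoint at a fixed rate (a sketch; the target URL is illustrative):
echo "GET http://localhost:8080/api/workitems" | vegeta attack -rate=50 -duration=30s | vegeta report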
OpenShift Cluster load setup:
Can spawn multiple JMeter pods to simulate load, etc.
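A minimal sketch of that with the oc CLI (the JMeter image and app name here are hypothetical placeholders, not an agreed setup):
# deploy a JMeter image as an app, then scale it out to generate load
oc new-app my-registry/jmeter:latest --name=jmeter-load
oc scale dc/jmeter-load --replicas=5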
Summary of planned tests (subject to change):
Test type 1 - Core Throughput
Test type 2 - UI + Core Throughput
Test type 3 - UI Efficiency
Test type 4 - UI Responsiveness
Background - Response Times
Dedicated performance test network
Open Questions
Background on JavaScript Performance Tests
There is probably a typo here. Is the ratio 1:10 or 1:20?
Recommendation from the PerfCake team, based on EAP/Fuse testing, is a 1:20 ratio of test threads to human users (1 thread ~= 10 users)
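To make the difference concrete (a worked example; the 200-user target is illustrative, and the command is the PerfCake invocation from the January 6 update below), the thread count changes by a factor of two depending on which ratio is meant:
# if 1 thread ~= 20 users: 200 users -> -Dthread.count=10
# if 1 thread ~= 10 users: 200 users -> -Dthread.count=20
$PERFCAKE_HOME/bin/perfcake.sh -s http-post-reporting.xml -Dthread.count=10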
Do you plan to run these tests in a production-like environment, i.e. in OpenShift? If yes, how will this environment be provisioned? If no, how do we measure the performance decrease caused by OpenShift itself?
I think that, in order to isolate any performance issues, we will have to test both on a physical network and in a production/OpenShift environment.
Would it make sense to use some cloud provider (BlazeMeter or similar) for perf testing until we prepare a proper lab?
Basic PerfCake scenario to create work items:
<?xml version="1.0" encoding="utf-8"?>
<scenario xmlns="urn:perfcake:scenario:7.0">
   <run type="time" value="10000"/>
   <generator class="DefaultMessageGenerator" threads="${thread.count:10}"/>
   <sender class="HttpSender">
      <target>http://localhost:8080/api/workitems</target>
      <property name="method" value="POST"/>
      <property name="expectedResponseCodes" value="201"/>
   </sender>
   <reporting>
      <reporter class="ResponseTimeStatsReporter">
         <property name="minimumEnabled" value="false"/>
         <property name="maximumEnabled" value="false"/>
         <destination class="ChartDestination">
            <period type="time" value="1000"/>
            <property name="name" value="Response Time"/>
            <property name="group" value="rt"/>
            <property name="yAxis" value="Response Time [ms]"/>
            <property name="attributes" value="Result,Average"/>
         </destination>
         <destination class="ConsoleDestination">
            <period type="time" value="1000"/>
         </destination>
      </reporter>
   </reporting>
   <messages>
      <message content='{"type":"system.bug", "fields":{"system.title":"test this workitem", "system.owner":"tmaeder", "system.state":"open", "system.creator":"ldimaggi"}}'>
         <header name="HttpSender" value="POST" />
         <header name="Content-Type" value="application/json" />
         <header name="Accept" value="application/json" />
         <header name="Authorization" value="Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJmdWxsTmFtZSI6IlRlc3QgRGV2ZWxvcGVyIiwiaW1hZ2VVUkwiOiIiLCJ1dWlkIjoiZmZhOTJlMzQtNWM0ZC00Yjc0LWE3NzEtZDFiNGYzNjE4ZWI0In0.M9WEhgNXksZyfED0McyvQd_z_43BdDaFv3Ptwk20-r0hha_LLLsK0UcmZhoVR_VeVdWbuVfqjwE6GT7n4AaBexBa0w3opxytajEVY0iQaZWW3bjBZdie0UwhFAkw2PxdPF-6r02dh4l7xyRqDsvLtzLWqb6Jw9xuX5gWBLdJcHCB-ovaR4FTNQ3bIs2_wv2URaRMBbhBOlVKLjMQrgD625fo7n47DCpj7aXrp0KbWuRiCHJngCZmS7E0R3O4IkSosp3LFiKQgMp4E4qDPeBLVVNMrtO1dhQOT31gF1YvtAp2W-0yB-w-jRdy9D3OwXmLgSt-IzcRe4pA6nZl9Jw-hg" />
      </message>
   </messages>
</scenario>
Update - January 6, 2017 - Steps to perform:
The test requires a full core/server install:
git clone git@github.com:almighty/almighty-core.git
make deps
make dev
PerfCake must also be installed (Java must be installed first):
Download the PerfCake binary - https://www.perfcake.org/download/
Download this PerfCake config file - https://github.com/ldimaggi/perfcake/blob/master/http-post-reporting.xml
Execute the PerfCake test with this shell command:
$PERFCAKE_HOME/bin/perfcake.sh -s http-post-reporting.xml -Dthread.count=1
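Putting those steps together, a rough end-to-end run could look like this (a sketch; the raw-file URL is derived from the blob link above, and PERFCAKE_HOME must point at the unpacked PerfCake distribution):
# build and start the core server in dev mode
git clone git@github.com:almighty/almighty-core.git
cd almighty-core
make deps
make dev &
# fetch the scenario and run it against the local server
curl -LO https://raw.githubusercontent.com/ldimaggi/perfcake/master/http-post-reporting.xml
$PERFCAKE_HOME/bin/perfcake.sh -s http-post-reporting.xml -Dthread.count=1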
Configure the job to run nightly
Configure the job to report all results (pass or fail) in email to: ldimaggi@redhat.com, naverma@redhat.com, mkleinhe@redhat.com
Where are we in getting a CRUD performance test for the core/platform - and what are the steps to complete this?
Test requirements - in order to create work items, the test requires:
What do we have set up today?
Steps to complete the test
Notes:
In https://github.com/almighty/almighty-jobs, create the job in the repo; it then shows up here: https://ci.centos.org/view/Devtools/
Create a .sh file which you reference in the job definition: https://github.com/almighty/almighty-jobs/blob/master/devtools-ci-index.yaml#L358
That will run on a fresh centos:7 node; from there you can do whatever yum install etc. you want.
The configuration of the test will be easier if we use Docker - we need to find a Docker image for the core server (running in dev mode) and Java; then all we have to do is install PerfCake.
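A rough sketch of that Docker-based setup (the core image name and tag are hypothetical placeholders, not a published image):
# start the (hypothetical) core server image in dev mode, exposing the API port
docker run -d --name core -p 8080:8080 almighty/almighty-core:dev
# then run the PerfCake scenario from the host against it, as in the steps above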
Marking this as closed - the POC test is in place - https://ci.centos.org/job/devtools-perfcake/ - additional performance tests are now under design.
See: https://www.redhat.com/archives/almighty-public/2016-October/msg00027.html
Tools to investigate: