locustio / locust

Write scalable load tests in plain Python 🚗💨
https://locust.cloud
MIT License

Start distributed test with multiple slaves with one command. #721

Closed: debugtalk closed this issue 6 years ago

debugtalk commented 6 years ago

Description of feature request

Currently, when we need to run a distributed test, we have to start the Locust master and the slaves one by one. Suppose our load-test machine has 32 cores: we would need to run the start command 33 times! Also, whenever we adjust our Locust scripts, we have to kill all the Locust slaves and start them again.

Since this scenario is so common, we could add a parameter (such as --cpu-cores) to simplify the job.

Expected behavior

With this argument, we could start Locust with a master and a specified number of slaves (defaulting to the number of CPU cores) in one go.

$ locust -f locustfile.py --cpu-cores 4
[2017-08-26 23:51:47,071] bogon/INFO/locust.main: Starting web monitor at *:8089
[2017-08-26 23:51:47,075] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,078] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,080] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,083] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,084] bogon/INFO/locust.runners: Client 'bogon_656e0af8e968a8533d379dd252422ad3' reported as ready. Currently 1 clients ready to swarm.
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_09f73850252ee4ec739ed77d3c4c6dba' reported as ready. Currently 2 clients ready to swarm.
[2017-08-26 23:51:47,084] bogon/INFO/locust.main: Starting Locust 0.8a2
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_869f7ed671b1a9952b56610f01e2006f' reported as ready. Currently 3 clients ready to swarm.
[2017-08-26 23:51:47,085] bogon/INFO/locust.runners: Client 'bogon_80a804cda36b80fac17b57fd2d5e7cdb' reported as ready. Currently 4 clients ready to swarm.

Actual behavior

To achieve the goal above today, we have to start the Locust master first.

$ locust -f locustfile.py --master

Then, in another terminal, we start the Locust slaves one by one.

$ locust -f locustfile.py --slave &
$ locust -f locustfile.py --slave &
$ locust -f locustfile.py --slave &
$ locust -f locustfile.py --slave &
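
As a stopgap, these steps can be wrapped in a small launcher script. Below is a minimal sketch, not part of Locust itself: it assumes the pre-1.0 --master/--slave flags and a locustfile.py in the working directory, and simply starts one master plus one slave per CPU core.

# launch_locust.py (hypothetical helper, not shipped with Locust)
import multiprocessing
import subprocess

def main(locustfile="locustfile.py", slaves=None):
    # default the slave count to the number of CPU cores
    slaves = slaves or multiprocessing.cpu_count()
    procs = [subprocess.Popen(["locust", "-f", locustfile, "--master"])]
    for _ in range(slaves):
        procs.append(subprocess.Popen(["locust", "-f", locustfile, "--slave"]))
    try:
        for p in procs:
            p.wait()
    except KeyboardInterrupt:
        # one Ctrl-C tears down the master and every slave
        for p in procs:
            p.terminate()

if __name__ == "__main__":
    main()

Killing the wrapper stops the master and all slaves at once, which also removes the pain of restarting everything after editing the locustfile.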

Environment settings (for bug reports)

N/A

Steps to reproduce (for bug reports)

N/A

cgoldberg commented 6 years ago

I'm -1 on this

debugtalk commented 6 years ago

@cgoldberg I have implemented this feature in HttpRunner, as a Locust wrapper.

http://docs.httprunner.top/en/latest/load-test.html

I think this feature would make Locust more convenient to use, and it would not affect any existing functionality. Shall I open a PR for this?

debugtalk commented 6 years ago

OK, I will keep this feature in HttpRunner.

SpencerPinegar commented 6 years ago

@cgoldberg OR @heyman - I am relatively new to Python (2 years' experience) and I was wondering why a feature like this would be too hard to maintain or implement -- I am sure your decisions are based on good reasoning; can you help me understand?

cgoldberg commented 6 years ago

why a feature like this would be too hard to maintain or implement

While it might be hard, my reasoning was based on the fact that many great configuration management tools already exist... you should use one to provision and execute your Locust tests if you need a complex distributed setup.

SpencerPinegar commented 6 years ago

Yeah, it would be cool to have an example of this.
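
Something along these lines is what I have in mind; purely illustrative, with made-up hostnames and plain ssh standing in for a real configuration management tool, and assuming Locust and the locustfile are already installed on every host:

# provision_and_run.py (hypothetical; assumes passwordless ssh access)
import subprocess

MASTER = "loadgen-master.example.com"            # made-up hostname
SLAVES = ["loadgen-1.example.com", "loadgen-2.example.com"]
LOCUSTFILE = "locustfile.py"                     # assumed to exist on every host

def ssh(host, command):
    # start a command on a remote host and return immediately
    return subprocess.Popen(["ssh", host, command])

def main():
    ssh(MASTER, f"nohup locust -f {LOCUSTFILE} --master > master.log 2>&1 &")
    for host in SLAVES:
        ssh(host, f"nohup locust -f {LOCUSTFILE} --slave --master-host={MASTER} "
                  "> slave.log 2>&1 &")

if __name__ == "__main__":
    main()

A real setup would use a proper provisioning tool instead of raw ssh, but the flow is the same: bring up the master, then point each slave at it.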

indrgun commented 5 years ago

OK, I will keep this feature in HttpRunner.

@debugtalk I tried running your "locusts" command, but it simply exits without any output on the stdout console.

emilorol commented 5 years ago

I had a similar need and was able to solve it locally with Minishift. Later I took it to OpenShift to get the most out of the hardware with minimal commands. The only downside is that autoscaling will reset your tests.

MarcSteven commented 4 years ago

The issue is still unclear, and the command does not work, @debugtalk.