uber-node / tcurl

A command line utility to talk to tchannel servers

Benchmark Support - setting concurrency, num-requests, & time-limit. #68

Open · HelloGrayson opened this issue 8 years ago

HelloGrayson commented 8 years ago

There are lots of times when I've found myself wanting to do a quick-and-dirty stress test from tcurl. Let's add Apache Bench-like functionality so that we can easily run concurrent traffic from the command line.

Starting with a --concurrency flag would go a long way toward this goal.

# issue 100k requests at 1k concurrency
tcurl hyperbahn --health --concurrency=1000 --num-requests=100000

# issue requests for 1m at 1k concurrency
tcurl hyperbahn --health --concurrency=1000 --time-limit=60
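
For reference, here is a minimal sketch of the kind of bounded-concurrency loop those flags imply. This is not tcurl code; `sendOne` is a hypothetical stand-in for issuing one tchannel request and resolving when its response arrives.

```js
// Sketch only: run `numRequests` requests with at most `concurrency` in
// flight at any time. `sendOne` is a hypothetical single-request helper.
async function runBench(opts) {
    const {concurrency, numRequests, sendOne} = opts;
    let started = 0;
    let errors = 0;
    const startTime = Date.now();

    // Each worker keeps exactly one request in flight, so the total number
    // of in-flight requests never exceeds `concurrency`.
    async function worker() {
        while (started < numRequests) {
            started++;
            try {
                await sendOne();
            } catch (err) {
                errors++;
            }
        }
    }

    await Promise.all(Array.from({length: concurrency}, worker));

    const elapsedSec = (Date.now() - startTime) / 1000;
    console.log('sent=%d errors=%d req/s=%d',
        numRequests, errors, Math.round(numRequests / elapsedSec));
}

// Example run with a fake request that resolves on the next tick.
runBench({
    concurrency: 1000,
    numRequests: 100000,
    sendOne: () => new Promise((resolve) => setImmediate(resolve))
});
```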
Raynos commented 8 years ago

Please re-use either:

Probably the hyperbahn one. It has a #lotoffeatures

HelloGrayson commented 8 years ago

Golang's sniper also has a nice interface: https://github.com/lubia/sniper

Raynos commented 8 years ago

I would love to have this feature, btw; if someone can bang it out, then mega :+1:. It should be pretty easy if you steal the BatchClient from hyperbahn.

ShanniLi commented 8 years ago

Do we want to also add a time interval between each burst?

Raynos commented 8 years ago

What the hyperbahn BatchClient does is send a batch of N requests every M milliseconds.

I don't know how that maps to "concurrency"; you would probably do something like: if concurrency is 1000, spread it across 100ms intervals and do batches of N=100 requests every M=100ms.

You probably also want a flag for "flushing/blocking" etc.

The BatchClient in hyperbahn does not wait; it will just slam you on an interval and not care about responses or any "maximum requests in flight" bound.
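
In other words, something roughly like this (a sketch of the behavior described, not the actual hyperbahn BatchClient; `sendOne` is again a hypothetical single-request helper):

```js
// Sketch: fire `batchSize` requests every `intervalMs`, without waiting for
// responses and without any cap on requests in flight.
function startBatcher(opts) {
    const {batchSize, intervalMs, totalRequests, sendOne} = opts;
    let sent = 0;
    const timer = setInterval(function onTick() {
        for (let i = 0; i < batchSize && sent < totalRequests; i++) {
            sent++;
            // Fire and forget: errors are swallowed, nothing blocks the next batch.
            sendOne().catch(function ignore() {});
        }
        if (sent >= totalRequests) {
            clearInterval(timer);
        }
    }, intervalMs);
    return timer;
}

// Example: "concurrency 1000" approximated as batches of 100 every 100ms.
startBatcher({
    batchSize: 100,
    intervalMs: 100,
    totalRequests: 1000,
    sendOne: () => Promise.resolve()
});
```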

ShanniLi commented 8 years ago

I think @breerly means: concurrency == N.

The reason I ask about the time interval is that I want some way to send requests at a certain QPS. This will also be useful for testing rate limiting. I have my own private integration test for that now. :)
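
For the rate-limiting use case the simplest dial is a fixed QPS, spacing single requests evenly instead of sending bursts. A rough sketch, again with a hypothetical `sendOne`:

```js
// Sketch: fire-and-forget one request every 1000/qps ms, giving a steady rate
// with no bursts. Note that Node clamps timers to >= 1ms, so rates above
// roughly 1000 req/s would need batching instead.
function sendAtFixedQps(qps, durationMs, sendOne) {
    const timer = setInterval(function tick() {
        sendOne().catch(function ignore() {});
    }, 1000 / qps);
    setTimeout(function stop() {
        clearInterval(timer);
    }, durationMs);
}

// Example: 50 req/s for 10 seconds.
sendAtFixedQps(50, 10000, () => Promise.resolve());
```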

Raynos commented 8 years ago

I think tests where we send all N thousand requests up front are useless.

The dials you want are:

tcurl --requests 100000 --rate 1000 #100k @ 1k req/s
tcurl --time 60 --rate 1000 #60s @ 1k req/s
tcurl --requests 100000 --rate 1000 --concurrency 1000 #100k @ 1k req/s bound concurrency to 1000
tcurl --requests 100000 --rate 1000 --delay 250 #100k @ 1k req/s with 250ms batching delay

Note: default concurrency of {rate} and default delay of 100ms
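
None of these flags exist yet, but as a sketch of how the dials could compose: size each batch from `rate` and `delay`, flush every `delay` ms, and skip sends while `concurrency` requests are already in flight (`sendOne` is hypothetical):

```js
// Sketch: send `requests` total at roughly `rate` req/s, flushing a batch
// every `delay` ms, with at most `concurrency` requests in flight.
function runDials(opts) {
    const {requests, rate, concurrency = rate, delay = 100, sendOne} = opts;
    const perBatch = Math.max(1, Math.round(rate * delay / 1000));
    let sent = 0;
    let inFlight = 0;

    return new Promise(function (resolve) {
        const timer = setInterval(function flush() {
            for (let i = 0; i < perBatch && sent < requests; i++) {
                if (inFlight >= concurrency) {
                    break; // concurrency bound: defer the rest to a later tick
                }
                sent++;
                inFlight++;
                sendOne()
                    .catch(function ignore() {})
                    .then(function done() {
                        inFlight--;
                        if (sent >= requests && inFlight === 0) {
                            clearInterval(timer);
                            resolve();
                        }
                    });
            }
        }, delay);
    });
}

// Example: same shape as `--requests 100000 --rate 1000 --delay 250`,
// scaled down so it finishes quickly.
runDials({
    requests: 4000,
    rate: 1000,
    delay: 250,
    sendOne: () => Promise.resolve()
}).then(() => console.log('done'));
```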

I may be completely wrong here.

ShanniLi commented 8 years ago

@breerly I think rate is what you want. Do we also need concurrency?

Raynos commented 8 years ago

we totally need concurrency :) but maybe not v0.

ShanniLi commented 8 years ago

@Raynos Sounds good, I will start with v0 without concurrency, i.e., set concurrency = rate.

HelloGrayson commented 8 years ago

Sounds good to me guys :)