odnobit closed this issue 1 year ago
P.S. The wrk utility showed ~850 RPS when I was testing my server.
I want to install the tool, can you help me? @
Why are you using v1.5.0 if we are already at v1.49.0?

With your code you're trying to send 10000 requests at exactly the same time. You should instead create, for example, 32 workers and have those send the requests one by one. That way you'll only send 32 at the same time and you shouldn't get as many errors (depending on the limitations of the receiving server). wrk, for example, will use 10 "workers" if you don't specify this on the command line.
Thanks for your answer. Actually I'm using v1.50.0; I made a mistake in the message. What is the best way to run 32 or more workers with fasthttp? I really need an example :)
Something like this:
package main

import (
	"encoding/json"
	"fmt"
	"sync"
	"sync/atomic"
	"time"

	"github.com/valyala/fasthttp"
)

type sendBody struct {
	ID    string `json:"uuid"`
	Value int    `json:"value"`
}

func main() {
	const (
		workers  = 32    // number of concurrent senders
		reqCount = 10000 // total number of requests to send
	)

	var errReqCounter atomic.Int64
	var successReqCounter atomic.Int64
	var wg sync.WaitGroup

	hc := &fasthttp.Client{
		WriteTimeout:        time.Second,
		ReadTimeout:         time.Second,
		MaxIdleConnDuration: time.Hour,
		MaxConnsPerHost:     reqCount, // raise the per-host connection cap (default is 512)
	}

	// Buffered work queue: each value is the index of one request to send.
	work := make(chan int, workers)

	wg.Add(workers)
	for i := 0; i < workers; i++ {
		go func() {
			defer wg.Done()
			for index := range work {
				req := fasthttp.AcquireRequest()
				resp := fasthttp.AcquireResponse()

				req.SetRequestURI("http://localhost:8080/post")
				req.Header.SetMethod(fasthttp.MethodPost)
				req.Header.Set("Content-Type", "application/json")

				body, _ := json.Marshal(sendBody{ID: "10110101101", Value: index})
				req.SetBody(body)

				if err := hc.Do(req, resp); err != nil {
					fmt.Printf("request err: %v\n", err)
					errReqCounter.Add(1)
				} else if resp.StatusCode() == fasthttp.StatusOK {
					successReqCounter.Add(1)
				} else {
					errReqCounter.Add(1)
				}

				// Release at the end of each iteration; a defer here would hold
				// every request/response until the goroutine exits.
				fasthttp.ReleaseRequest(req)
				fasthttp.ReleaseResponse(resp)
			}
		}()
	}

	// Feed all requests to the workers, then let them drain the queue.
	for i := 0; i < reqCount; i++ {
		work <- i
	}
	close(work)
	wg.Wait()

	fmt.Printf("Statistics\nError: %v\nSuccess: %v\n", errReqCounter.Load(), successReqCounter.Load())
}
Thanks for the help :)
What about with proxies?
With proxies you can use this: https://pkg.go.dev/github.com/valyala/fasthttp@v1.50.0/fasthttpproxy and set one of those as the Dial function for fasthttp.Client.
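For illustration, a minimal sketch of wiring one of the fasthttpproxy dialers into the client; the proxy addresses and the URL here are placeholders, not values from this thread:

package main

import (
	"fmt"
	"time"

	"github.com/valyala/fasthttp"
	"github.com/valyala/fasthttp/fasthttpproxy"
)

func main() {
	hc := &fasthttp.Client{
		ReadTimeout:  time.Second,
		WriteTimeout: time.Second,
		// Route all outgoing connections through an HTTP proxy.
		Dial: fasthttpproxy.FasthttpHTTPDialer("localhost:9999"),
		// For a SOCKS5 proxy you could use instead:
		// Dial: fasthttpproxy.FasthttpSocksDialer("socks5://localhost:1080"),
	}

	req := fasthttp.AcquireRequest()
	resp := fasthttp.AcquireResponse()
	defer fasthttp.ReleaseRequest(req)
	defer fasthttp.ReleaseResponse(resp)

	req.SetRequestURI("http://localhost:8080/post")
	if err := hc.Do(req, resp); err != nil {
		fmt.Printf("request err: %v\n", err)
		return
	}
	fmt.Printf("status: %d\n", resp.StatusCode())
}

The worker-pool example above stays unchanged; only the client's Dial function needs to be set for every request to go through the proxy.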
Hello! Package version: v1.5.0
This is a code example (here I'm sending 10k requests):
And I got so many errors like:
Statistics:
Can you please suggest the best way to solve this task? I also tried using a RateLimiter by time (e.g. 250 requests per second) and got the same result.
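For reference, a minimal sketch of what the "rate limited by time" approach mentioned above could look like, assuming the standard golang.org/x/time/rate package and the same placeholder URL; this is not the maintainer's code, just an illustration:

package main

import (
	"context"
	"fmt"
	"time"

	"github.com/valyala/fasthttp"
	"golang.org/x/time/rate"
)

func main() {
	const reqCount = 10000
	// Allow at most 250 requests per second, with a burst of 1.
	limiter := rate.NewLimiter(rate.Limit(250), 1)

	hc := &fasthttp.Client{
		ReadTimeout:  time.Second,
		WriteTimeout: time.Second,
	}

	for i := 0; i < reqCount; i++ {
		// Block until the limiter grants a token.
		if err := limiter.Wait(context.Background()); err != nil {
			fmt.Printf("limiter err: %v\n", err)
			return
		}

		req := fasthttp.AcquireRequest()
		resp := fasthttp.AcquireResponse()
		req.SetRequestURI("http://localhost:8080/post")
		if err := hc.Do(req, resp); err != nil {
			fmt.Printf("request err: %v\n", err)
		}
		fasthttp.ReleaseRequest(req)
		fasthttp.ReleaseResponse(resp)
	}
}

Note that this sends requests sequentially at a capped rate; the worker-pool answer above bounds concurrency instead, which is what resolved the errors in this thread.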