http-rs / surf

Fast and friendly HTTP client framework for async Rust
https://docs.rs/surf
Apache License 2.0

Performance question #176

Closed: imbolc closed this issue 3 years ago

imbolc commented 4 years ago

I just did a simple echo benchmark and found a 9x drop in performance, while I'd expect it to be about 2x. All the CPU cores were busy.

Here's the code I used:

use tide::{Request, Response, StatusCode};

#[async_std::main]
async fn main() -> std::io::Result<()> {
    let mut app = tide::new();
    app.at("/").get(hello);
    app.at("/echo").get(echo);
    app.listen("127.0.0.1:3000").await?;
    Ok(())
}

async fn hello(_: Request<()>) -> tide::Result {
    // Plain handler: returns a static string, used as the upstream for /echo.
    Ok(Response::new(StatusCode::Ok).body_string("hello, world".into()))
}

async fn echo(_: Request<()>) -> tide::Result {
    // Echo handler: makes an outgoing surf request to / on the same server
    // and returns the response body.
    let body = surf::get("http://127.0.0.1:3000/")
        .recv_string()
        .await
        .unwrap();
    Ok(Response::new(StatusCode::Ok).body_string(body))
}

And the results I got:

imbolc@p1:~/0/lab$ wrk http://localhost:3000/
Running 5s test @ http://localhost:3000/
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    78.48us  184.22us   8.16ms   99.11%
    Req/Sec    68.04k     4.85k   89.54k    81.19%
  683341 requests in 5.10s, 84.07MB read
Requests/sec: 133997.23
Transfer/sec:     16.48MB
imbolc@p1:~/0/lab$ wrk http://localhost:3000/echo
Running 5s test @ http://localhost:3000/echo
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   753.08us  535.41us   8.60ms   93.69%
    Req/Sec     7.31k     1.33k    9.07k    86.14%
  73379 requests in 5.10s, 9.03MB read
Requests/sec:  14391.00
Transfer/sec:      1.77MB
Fishrock123 commented 3 years ago

Try using { version = "2.2", default-features = false, features = ["h1-client"] } in your Cargo.toml and then let us know if this is still an issue.

By default, Surf 2 uses libcurl via isahc for its HTTP client implementation.
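For reference, a minimal Cargo.toml sketch of the suggested change (the surrounding dependency section and other crate versions are assumptions; keep whatever tide/async-std entries the project already uses):

[dependencies]
# Disable surf's default curl-client (isahc/libcurl) backend and enable
# the pure-Rust async-h1 based client instead.
surf = { version = "2.2", default-features = false, features = ["h1-client"] }

Disabling the default features drops the curl-client backend, so surf compiles only the h1-client and uses it for outgoing requests.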