tunakasif opened this issue 1 year ago
Thanks for opening an issue! At first glance it looks like this memory "leak" is just the global Isahc client, which does get allocated on the heap and is not cleaned up. This is because the simple `isahc::get` API allocates a global `HttpClient` under the hood.

This is a busy time for me right now and I don't have a ton of time to look at this yet, but it is definitely something worth looking into. If you are feeling adventurous, I would be curious whether any such leaks are reported when only using the `HttpClient` API directly.
@sagebind You were right 🎉. The following snippet first creates an `HttpClient` and then calls `.get()` on the generated client:
```rust
fn main() -> Result<(), isahc::Error> {
    let client = isahc::HttpClient::new()?;
    let response = client.get("https://example.org")?;
    println!("Status: {}", response.status());
    Ok(())
}
```
It does not generate any `possibly lost` leaks, only `still reachable` ones (as provided below), which is expected. So the answer to the question

> are there any such leaks reported when only using the `HttpClient` API?

is no. I/You can close the issue if you think this is satisfactory, or keep it open if you think the global `HttpClient` allocation still needs to be addressed. Thank you 👍🏻
```
==399934== Memcheck, a memory error detector
==399934== Copyright (C) 2002-2022, and GNU GPL'd, by Julian Seward et al.
==399934== Using Valgrind-3.20.0 and LibVEX; rerun with -h for copyright info
==399934== Command: ./target/debug/memory
==399934==
Status: 200 OK
==399934==
==399934== HEAP SUMMARY:
==399934==     in use at exit: 3,659 bytes in 11 blocks
==399934==   total heap usage: 142,797 allocs, 142,786 frees, 6,467,777 bytes allocated
==399934==
==399934== LEAK SUMMARY:
==399934==    definitely lost: 0 bytes in 0 blocks
==399934==    indirectly lost: 0 bytes in 0 blocks
==399934==      possibly lost: 0 bytes in 0 blocks
==399934==    still reachable: 3,659 bytes in 11 blocks
==399934==         suppressed: 0 bytes in 0 blocks
==399934== Rerun with --leak-check=full to see details of leaked memory
==399934==
==399934== For lists of detected and suppressed errors, rerun with: -s
==399934== ERROR SUMMARY: 0 errors from 0 contexts (suppressed: 0 from 0)
```
The simple example provided in `README.md` (also provided below for reference) generates a memory leak. The `cargo-valgrind` output is also provided below. AFAIK, it is common to see `still reachable` leaks, especially in FFI code, but this example results in `possibly lost` leaks. I don't know if this is expected due to `libcurl` utilization, but I wanted to point it out.

Minimal `Cargo.toml` dependencies:

Given minimal `src/main.rs` use example:

The `cargo-valgrind` output indicating memory leaks: