Closed: seanpianka closed this issue 3 months ago
Thanks for the detailed report. Based on what you've described, my first inclination would be to say that your system is stalling under I/O or memory pressure. (Nextest doesn't come with great debugging for this at the moment.)
Could you try the following:
- Running the test suite with nextest but without llvm-cov. This will help isolate llvm-cov as a factor (which might not be its fault -- llvm-cov necessarily requires additional memory and I/O).
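One way to isolate the factor is to time both invocations side by side. This is a sketch, not a command from the thread: it assumes cargo-llvm-cov is installed and uses its `nextest` subcommand, with `time` being the ordinary shell builtin.

```shell
# Baseline: nextest alone, no coverage instrumentation
time cargo nextest run --workspace --no-fail-fast

# Same run under cargo-llvm-cov; instrumentation and profile
# merging add CPU, memory, and I/O on top of the baseline
time cargo llvm-cov nextest --workspace --no-fail-fast
```

Comparing the wall-clock totals of the two runs should make it clear how much of the slowdown is attributable to coverage collection.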
Great intuition. I stripped llvm-cov from the command and the execution time for a majority of tests fell below one second.
cargo nextest run \
--locked \
--config-file ./.config/nextest.toml \
--features test_database_backing_impl \
--bins --examples --tests --all-targets \
--no-fail-fast \
--workspace
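If the host is saturating during coverage runs, one knob worth trying (my suggestion, not something from this thread) is nextest's `--test-threads` flag, which caps how many tests run concurrently and so limits peak memory and I/O:

```shell
# Cap concurrency at 2 tests at a time; under llvm-cov each test
# process also writes its own .profraw file, so fewer concurrent
# tests means less simultaneous I/O
cargo llvm-cov nextest --workspace --no-fail-fast --test-threads 2
```

Lower concurrency trades throughput for a smaller resource footprint, which can be a net win on a runner that is otherwise stalling.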
It appears that generating the code coverage results places considerable CPU, memory, and/or I/O load on our CI runner's host. I'll file an issue there and see what I can do about debugging this (or whether it simply requires throwing more hardware at the problem).
Thanks again!
Lately, I have been experiencing significantly slower test execution times when using nextest + llvm-cov. Below is an example of a single test and its execution time.
Test Example
Execution Time
From one invocation, the above test takes an excessively long time to execute:
Environment
docker:26.1.3-dind-alpine3.19, running within GitLab CI on a self-managed runner (hardware: Intel(R) Core(TM) i7-9700K CPU @ 3.60GHz, 32GB of DDR4 memory)

Steps to Reproduce
Cargo.toml:
Create the following test in src/lib.rs:

Additional Information
[profile.default.junit]
# Output a JUnit report into the given file inside 'store.dir/'.
# If unspecified, JUnit is not written out.
path = "results.xml"

# The name of the top-level "report" element in JUnit report. If aggregating
# reports across different test runs, it may be useful to provide separate names
# for each report.
report-name = "nextest-run"
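With this configuration, the JUnit report is written inside nextest's store directory, which defaults to `target/nextest/<profile>/`. The path below assumes the `default` profile and default store directory (neither is stated in the snippet above):

```shell
# Locate the JUnit report after a run, e.g. to collect it
# as a CI artifact for GitLab's test-report integration
ls target/nextest/default/results.xml
```

In GitLab CI, that file can then be declared under `artifacts:reports:junit` so failures show up in the merge request UI.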