Closed: mitchellmanware closed this pull request 9 months ago.
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 95.76%. Comparing base (37ba1fe) to head (0ff1926).

:exclamation: Current head 0ff1926 differs from pull request most recent head 37f8aea. Consider uploading reports for the commit 37f8aea to get more accurate results.
The `testthat` output shows that all tests are passing, but the workflow fails when uploading the coverage report to Codecov.
@mitchellmanware I'm not sure what to do here. I would suggest first updating your branch with `main`, as I am seeing a warning from GitHub that your branch is both ahead of and behind `main`. Interestingly, the build checks and test-coverage appeared to take 3-4x longer than when they ran for me.
Pulled in the updates so we will see if that makes a difference.
It is weird how the different Ubuntu versions fail the build check seemingly at random. In this commit oldrel-1 was the only one to fail, but later it is the only one to pass. I assume it stems from whatever is causing the testthat upload to fail, but changing the `Sys.sleep()` values in both the `test_download_functions()` function and the `test-download_functions.R` tests causes unpredictable passes and failures. The unit tests run very quickly locally, so I am also unsure about the running time in the GitHub runners. @Spatiotemporal-Exposures-and-Toxicology
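For reference, a minimal sketch (assumed structure and URLs, not the package's actual test) of how a `Sys.sleep()` pause between repeated URL checks might look inside a download test:

```r
library(testthat)
library(httr)

test_that("download URLs respond", {
  # hypothetical URLs for illustration only
  urls <- c("https://www.ncei.noaa.gov/", "https://gmao.gsfc.nasa.gov/")
  for (u in urls) {
    response <- HEAD(u)
    expect_equal(status_code(response), 200L)
    Sys.sleep(2)  # pause between requests so checks are not sent in a burst
  }
})
```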
test-coverage finally passes, but it still takes 52 minutes. The failure for ubuntu-latest (release) is caused by the same problem. There might be system-specific differences in the URL responses, which could be why the different sleep times cause certain checks to pass or fail.
I will keep looking into it, but at least the tests are passing.
Exploring replacing `httr::HEAD()` and `httr::GET()` with `httr2::request() %>% req_perform()`. `httr2` is a replacement for the `httr` package, so I will see if quicker HTTP response times fix the test-coverage and Ubuntu check failures.
```
> microbenchmark(
+   request(urls[1]) %>% req_perform(),
+   httr::HEAD(urls[1]),
+   times = 10,
+   unit = "seconds"
+ )
Unit: seconds
                                expr       min        lq      mean    median         uq        max neval
 request(urls[1]) %>% req_perform() 0.2373555 0.2524261 0.3734844 0.3562834  0.4702028  0.5638149    10
                httr::HEAD(urls[1]) 5.4498386 8.4269201 9.5435405 9.0156858 11.0656759 16.0476204    10
```
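The benchmark above suggests the `httr2` pipeline responds much faster. A hedged sketch of the kind of replacement being explored; the helper name and URL are assumptions, not the package's actual code:

```r
library(httr2)

# check that a URL responds with status 200 using an httr2 HEAD request
check_url_status <- function(url) {
  resp <- request(url) |>
    req_method("HEAD") |>
    req_error(is_error = function(resp) FALSE) |>  # keep 4xx/5xx responses instead of erroring
    req_perform()
  resp_status(resp) == 200L
}

# check_url_status("https://www.ncei.noaa.gov/")  # hypothetical usage
```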
Somehow the `httr2` package resulted in three-hour run times for the R CMD checks. Not sure how this is possible, but I will revert to the previous version with `httr`.
This thread describes issues using `.rds` files in `testthat` tests. I deleted the `sites_NC.rds` file I was using for testing and matched the

```r
ncp <- data.frame(lon = -78.8277, lat = 35.95013)
ncp$site_id <- "3799900018810101"
```

testing locations used in the other covariate calculation unit tests.
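A sketch (assumed, not the package's exact unit test) of using the inline locations instead of reading a saved `.rds` fixture:

```r
library(testthat)

test_that("covariates can be calculated at the NC test location", {
  # same test point used in the other covariate calculation tests
  ncp <- data.frame(lon = -78.8277, lat = 35.95013)
  ncp$site_id <- "3799900018810101"
  # ... pass `ncp` to the covariate calculation under test here
  expect_s3_class(ncp, "data.frame")
  expect_named(ncp, c("lon", "lat", "site_id"))
})
```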
Also renamed the GEOS test data folders to `aqc` and `chm` to limit the number of characters in the file paths. Updated `calc_geos()` so it detects the collection from the `.nc` file paths and the full collection name is not needed.
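A hedged sketch of how a collection might be recovered from a GEOS-CF `.nc` file path by string splitting; the file name pattern and the index selected are assumptions, not necessarily what `calc_geos()` does:

```r
# example GEOS-CF path under the shortened "aqc" test data folder (assumed)
path <- "geos/aqc/GEOS-CF.v01.rpl.aqc_tavg_1hr_g1440x721_v1.20220101_0030z.nc4"

# split the file name on "." and select the collection component
parts <- strsplit(basename(path), ".", fixed = TRUE)[[1]]
collection <- parts[4]
collection
#> [1] "aqc_tavg_1hr_g1440x721_v1"
```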
I am now getting test failures for `process_geos_collection()`, which is a simple string splitting and selection function. I am going to close this PR and create a new branch from the most recent merge from `main`.
- `beethoven` `process.R` and `calculate_covariates.R` files
- `download_noaa_hms_smoke_data` > `download_hms_data` (reflected in `man/` and all tests)
- `strsplit()` in the `extract_urls()` download support function
- `stringr` dependency replaced with `stringi` for `stringi::stri_pad()` in `process_narr()` (see the sketch below)
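For illustration, a small sketch of the kind of zero-padding `stringi::stri_pad()` provides; the exact call inside `process_narr()` is an assumption:

```r
library(stringi)

# pad month numbers to two digits, e.g. for building date strings in file names
months <- stri_pad(1:12, width = 2, side = "left", pad = "0")
months
#> [1] "01" "02" "03" "04" "05" "06" "07" "08" "09" "10" "11" "12"
```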