All workflows use the standard CRAN repository. All Linux workflows also use the ubuntugis-unstable PPA to get (more) recent geospatial libraries.
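As an illustration only (not the repository's actual workflow; step names, action versions and the selection of system packages are assumptions), a Linux job along these lines would add the PPA before installing R packages from CRAN:

```yaml
jobs:
  R-CMD-check:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4

      # Recent GDAL/GEOS/PROJ system libraries from the ubuntugis-unstable PPA
      - name: Add ubuntugis-unstable PPA
        run: |
          sudo add-apt-repository -y ppa:ubuntugis/ubuntugis-unstable
          sudo apt-get update
          sudo apt-get install -y libgdal-dev libgeos-dev libproj-dev libudunits2-dev

      # CRAN (not RSPM) as the package repository: R packages are compiled from
      # source against the PPA's geospatial libraries
      - uses: r-lib/actions/setup-r@v2
        with:
          use-public-rspm: false
```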
Citing from PR #108: for Ubuntu, this meant taking no advantage of the binary packages from RStudio Package Manager (RSPM). The approach has its merits: CRAN is a bit more up to date, and after recompilation the packages can be used with the ubuntugis-unstable PPA, so if unit tests were added to n2khab, geospatial functions would be tested against more recent versions of the geospatial libraries, which is more future-proof. The downside, however, is that all packages are compiled from source, which takes a long time. The compiled packages are cached on GHA, but as soon as one dependency gets updated on CRAN, the cache is no longer used and all dependencies are compiled again.
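For context, that all-or-nothing behaviour is what an exact-match cache key produces. The snippet below is an assumed illustration of the pattern (the file name `.github/depends.Rds` and the key layout may differ from the actual workflow):

```yaml
# Exact-match key derived from the dependency state: any CRAN update changes the
# key, the cache misses, and every dependency is compiled from source again.
- uses: actions/cache@v4
  with:
    path: ${{ env.R_LIBS_USER }}
    key: ubuntu-r-release-${{ hashFiles('.github/depends.Rds') }}
```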
Different approaches may be tried to make this process shorter:
- Don't be too strict in dropping the cache: if only a few CRAN packages need updating, only those need recompilation. How dropping the cache can be triggered in a sensible way still needs investigation (see the cache sketch after this list).
- Return to RSPM (without the PPA) as the package source for the most common trigger (any push), using only R-release there. Preserve CRAN + PPA + R-devel in a separate workflow for more critical events, e.g. PRs to master and pushes (not PRs) on a dev* branch (see the trigger sketch after this list).
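For the first approach, one option is to pair the exact key with a `restore-keys` prefix, so that an outdated cache is still restored on a miss and only the packages that changed since then are recompiled. This is a hedged sketch; the path and key names are assumptions:

```yaml
- uses: actions/cache@v4
  with:
    path: ${{ env.R_LIBS_USER }}
    key: ubuntu-r-release-${{ hashFiles('.github/depends.Rds') }}
    # On an exact miss, fall back to the newest cache sharing this prefix;
    # only packages updated on CRAN since that cache need recompilation.
    restore-keys: |
      ubuntu-r-release-
```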
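For the second approach, the trigger split over two workflows could look roughly like this (workflow file names and branch patterns are assumptions, not the repository's configuration):

```yaml
# check-fast.yml (assumed name): runs on any push; the job would use R-release
# with use-public-rspm: true in r-lib/actions/setup-r and skip the PPA.
on:
  push:
---
# check-full.yml (assumed name): CRAN + ubuntugis-unstable PPA + R-devel,
# reserved for the more critical events.
on:
  pull_request:
    branches: [master]
  push:
    branches: ['dev*']
```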