This PR updates R and all dependencies to their latest versions, and adds some new tensor methods. It also introduces the gsignal library as a dependency, because the new tensor methods need some signal processing algorithms.
On this branch, some old tests fail and the documentation won't build. I'm not sure they were working before at all: I tried a few old combinations of R + devtools + dependencies as specified in renv.lock, but never got things running with the old environment (the lockfile doesn't specify enough to fully recreate whatever developer environment was being used), so I gave up.
I'm keen to just sort out the test failures against the new dependency versions to future-proof things, rather than spend a lot of time tracking down whatever arcane environment was used in the past to coerce the failing tests into passing (if they ever passed). I'm running on Windows with the latest R version, which I reckon is a fairly standard environment for a mixOmics2 user or open-source developer?
Let me know what you think :)
Test failures
The new tensor functions have their own testing scripts, and I'll make very sure they don't produce any new errors and that their warnings are suppressed, so it's easier to focus on the test failures / warnings produced by the old code.
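As a sketch of what I mean (the file, function, and result names here are illustrative, not the exact code in this PR), the new test scripts wrap the noisier calls so that platform-specific warnings don't drown out anything coming from the old suite:

```r
# tests/testthat/test-tensor-methods.R (illustrative names, not the exact file)
test_that("tensor decomposition runs cleanly", {
  X <- array(rnorm(10 * 5 * 3), dim = c(10, 5, 3))
  # suppressWarnings() keeps platform-specific noise (e.g. BiocParallel on
  # Windows) out of the output, so warnings from the old code stay visible
  result <- suppressWarnings(tensor.decompose(X, ncomp = 2))
  expect_length(result$loadings, 3)
})
```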
In the old code and accompanying tests there are 3 failing tests and 4 deprecation warnings (but on the positive side, 258 passing tests between the old and new suites!!)
This PR has not changed any code in the old functions or their accompanying testing scripts. A rough brainstorm of potential causes for the old test failures:
They never worked in the first place on master even before the dependencies were updated
They worked before in some specific environment, but fail since I've updated all the dependency versions
They don't (and never did) work on Windows: all the failing tests seem to involve parallel algorithms, so I wonder whether they pass in a Linux environment...? Maybe they don't need to work for a Windows user anyway...?
The results of the most recent test run (locally on a Windows machine) are shown below. I've suppressed a few finicky warnings my new code was producing to do with BiocParallel algorithms on Windows, so all messages concern old code and tests:
══ Results ══════════════════════════════════════════════════════════════════════════════════════════════════════
Duration: 69.7 s
── Failed tests ─────────────────────────────────────────────────────────────────────────────────────────────────
Error ('test-perf.diablo.R:41:5'): perf.diablo works with and without parallel processing and with auroc
Error in `checkForRemoteErrors(lapply(cl, recvResult))`: 2 nodes produced errors; first error: there is no package called 'mixOmics'
Backtrace:
▆
1. ├─mixOmics::perf(...) at test-perf.diablo.R:41:5
2. └─mixOmics:::perf.sgccda(...) at mixOmics/R/perf.R:254:9
3. └─parallel::clusterEvalQ(cl, library(mixOmics)) at mixOmics/R/perf.diablo.R:640:7
4. └─parallel::clusterCall(cl, eval, substitute(expr), envir = .GlobalEnv)
5. └─parallel:::checkForRemoteErrors(lapply(cl, recvResult))
Error ('test-tune.spls.R:26:5'): tune.spls works in parallel
<bplist_error/bperror/error/condition>
Error: BiocParallel errors
4 remote errors, element index: 1, 4, 7, 10
6 unevaluated and other errors
first remote error:
Error in spls(X = X, Y = Y, keepX = c(choice.keepX, keepX), keepY = c(choice.keepY, : could not find function "spls"
Backtrace:
▆
1. ├─base::suppressWarnings(...) at test-tune.spls.R:26:5
2. │ └─base::withCallingHandlers(...)
3. └─mixOmics::tune.spls(...)
4. ├─BiocParallel::bplapply(X = keep.vals, FUN = iter_keep.vals, BPPARAM = BPPARAM) at mixOmics/R/tune.spls.R:376:7
5. └─BiocParallel::bplapply(X = keep.vals, FUN = iter_keep.vals, BPPARAM = BPPARAM) at BiocParallel/R/AllGenerics.R:2:5
6. └─BiocParallel:::.bpinit(...) at BiocParallel/R/bplapply-methods.R:57:5
Error ('test-tune.splsda.R:18:5'): tune.splsda works
Error in `checkForRemoteErrors(lapply(cl, recvResult))`: 2 nodes produced errors; first error: there is no package called 'mixOmics'
Backtrace:
▆
1. └─mixOmics::tune.splsda(...) at test-tune.splsda.R:18:5
2. └─parallel::clusterEvalQ(cl, library(mixOmics)) at mixOmics/R/tune.splsda.R:333:13
3. └─parallel::clusterCall(cl, eval, substitute(expr), envir = .GlobalEnv)
4. └─parallel:::checkForRemoteErrors(lapply(cl, recvResult))
[ FAIL 3 | WARN 4 | SKIP 0 | PASS 258 ]
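Both parallel::clusterEvalQ failures report that the workers can't find an installed mixOmics, which makes me suspect the fresh PSOCK workers (what makeCluster() spawns on Windows) simply don't see the package in their library paths, e.g. if the suite is run without devtools::install() first. A quick diagnostic I could run (a sketch, not something included in this PR):

```r
# Sketch (not in the PR): check whether fresh PSOCK workers can see the same
# library paths as the master session and load an installed mixOmics, which is
# what clusterEvalQ(cl, library(mixOmics)) inside perf() / tune.splsda() needs.
library(parallel)

cl <- makeCluster(2)

print(.libPaths())                    # library paths in the master session
print(clusterEvalQ(cl, .libPaths()))  # library paths seen by each worker

# Can each worker actually load mixOmics from those paths?
print(clusterEvalQ(cl, requireNamespace("mixOmics", quietly = TRUE)))

stopCluster(cl)
```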
Documentation won't build
Similar thing here - I'm not sure the documentation was buildable before at all, but I'm not too familiar with documentation in R. I have been invoking devtools::document() to export my new functions in the NAMESPACE file, and it throws some informative errors about why it won't build the new docs... maybe it's an easy fix?
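For context, the new functions are documented and exported via roxygen comments rather than by editing NAMESPACE by hand; the blocks look roughly like this (the function name and tags are illustrative, not the exact code in this PR), and devtools::document() then regenerates NAMESPACE and the man/ pages from them:

```r
#' Tensor decomposition (illustrative skeleton, not the exact docs in this PR)
#'
#' Decomposes a three-way numeric array into per-mode loading matrices.
#'
#' @param X A three-dimensional numeric array.
#' @param ncomp Number of components to extract.
#' @return A list of loading matrices, one per mode of \code{X}.
#' @export
tensor.decompose <- function(X, ncomp = 2) {
  # ... implementation ...
}
```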
Test deprecation warnings
I haven't looked too closely here, but I'm hoping they're fixable. I'm conscious that, since I'm now keeping the dependencies up to date, one of these deprecations will eventually bite and turn into a test failure with no warning beforehand.
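One stopgap I could use locally (a sketch, not part of this PR) is to promote warnings to errors when running the suite, so the four deprecations surface as traceable failures before a dependency update silently turns them into real ones:

```r
# Sketch: run the suite with warnings promoted to errors, then restore options
old <- options(warn = 2)
devtools::test()
options(old)
```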
New dependencies (minor)
I added gsignal as a dependency, as well as microbenchmark as a suggested dependency, but I'm very happy to remove the latter if it doesn't belong there! The new tensor methods rely on some signal processing algorithms, and gsignal implements them more cleanly and efficiently than the alternatives I've found so far.
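In DESCRIPTION terms this amounts to roughly the following additions (existing entries omitted here; the actual file is in the diff, and I've assumed Imports rather than Depends for gsignal):

```
Imports:
    gsignal
Suggests:
    microbenchmark
```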
Development plan
I'm adding a few more tensor methods that Saritha is interested in researching, but will try to be very careful not to introduce new test failures!
I'm fairly new to R package development, but I read that good practice is to avoid editing generated files by hand, so I've only been invoking devtools and renv commands like devtools::test(), devtools::test_active_file(...), devtools::document(), devtools::install(), renv::status() and renv::snapshot(), rather than touching the lockfile and NAMESPACE directly.
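Concretely, my edit-build-test loop has looked roughly like this (the ordering is just what I've been doing, not necessarily the recommended workflow):

```r
# Roughly the loop I've been running (not a script from the repo)
devtools::document()   # regenerate NAMESPACE and man/ from roxygen comments
devtools::install()    # install the current source into the user library
devtools::test()       # run the full testthat suite
# devtools::test_active_file("tests/testthat/test-perf.diablo.R")  # or one file

renv::status()         # compare installed packages against renv.lock
renv::snapshot()       # record updated dependency versions in renv.lock
```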
Let me know how things look when you run the tests in your environment(s) with these new dependencies, and whether you have any insight into the finicky things the documentation generator is complaining about or the test failures. Happy to change absolutely anything in my code to make things easier to maintain!