nimble-dev / nimble

The base NIMBLE package for R
http://R-nimble.org
BSD 3-Clause "New" or "Revised" License
160 stars 24 forks

Update to Eigen 3.4.0 #1406

Closed perrydv closed 9 months ago

perrydv commented 10 months ago

We are behind on Eigen versions. This update appears to clean up lots of compiler messages; the Eigen header file "DisableStupidWarnings.h" says it all. Let's see what testing turns up. On my machine, a single test in test-coreR failed. It was due to a seq case where the result went from satisfying identical to being minutely numerically different, satisfying only equal. I have cleaned that up. Other tests so far seem fine, so let's see what the full test suite says.
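The identical-vs.-equal distinction is just floating point at work. Here is a minimal Python sketch of the same phenomenon (R's identical() and tolerance-based equality are the real comparisons in the test; == and math.isclose stand in for them here):

```python
import math

# Two routes to "0.3": a literal vs. an accumulated sum. A library change
# that reorders arithmetic (as an Eigen update can) shifts results this way.
a = 0.3
b = 0.1 + 0.1 + 0.1   # accumulated rounding error

exact = (a == b)                          # bitwise equality, like identical()
close = math.isclose(a, b, rel_tol=1e-9)  # tolerance-based, like all.equal()
print(exact, close)
```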

perrydv commented 10 months ago

The failures in test-ADlaplace look like non-problematic numerical tolerance issues. What do you think @paciorek ?

paciorek commented 10 months ago

Probably. Some of them are big enough that I think it is worth checking the size of the differences from before updating Eigen. I can put that on my list. Let's leave this unmerged for the moment.

paciorek commented 10 months ago

Just a side note for posterity that the warnings on Linux install from source are much better than before, but we do still have a few lingering warnings that look like this:

In file included from ../include/nimble/EigenTypedefs.h:28,
                 from ../include/nimble/nimbleCppAD.h:38,
                 from nimbleCppAD.cpp:1:
In destructor 'virtual pointedToBase::~pointedToBase()',
    inlined from 'void pointedToBase::removeWatcher()' at ../include/nimble/smartPtrs.h:144:14,
    inlined from 'void pointedToBase::removeWatcher()' at ../include/nimble/smartPtrs.h:132:8,
    inlined from 'nimSmartPtr<T>::~nimSmartPtr() [with T = NIMBLE_ADCLASS]' at ../include/nimble/smartPtrs.h:116:29:
../include/nimble/smartPtrs.h:147:29: warning: 'void operator delete(void*, std::size_t)' called on pointer '*this.nimSmartPtr<NIMBLE_ADCLASS>::realPtr' with nonzero offset 56 [-Wfree-nonheap-object]
  147 |   virtual ~pointedToBase() {};
      |                             ^
paciorek commented 10 months ago

I'm not sure I'm happy about the tolerance issues. I'm surprised to see this much difference. I was expecting differences in rather lower-order digits...

Here's the first one. We used to match to 6 sign. digits, now 4.

Nimble 1.0.1:

lme4res$coefficients[,"Estimate"]
[1] 22.97222222
nimres$params$estimates[1]
[1] 22.97220036

Nimble PR:

options(digits=10)
nimres$params$estimates[1]
[1] 22.97385529
lme4res$coefficients[,"Estimate"]
[1] 22.97222222
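To quantify "match to 6 sign. digits, now 4," one can compute agreement from the relative difference. A hypothetical helper, sketched in Python (not part of the nimble test suite):

```python
import math

def sig_digits_agree(x, y):
    """Rough count of significant digits on which x and y agree,
    derived from the relative difference (hypothetical helper)."""
    if x == y:
        return float("inf")
    rel = abs(x - y) / max(abs(x), abs(y))
    return -math.log10(rel)

# The intercept estimate above vs. lme4, before and after the Eigen update:
before = sig_digits_agree(22.97220036, 22.97222222)  # nimble 1.0.1 vs. lme4
after = sig_digits_agree(22.97385529, 22.97222222)   # PR branch vs. lme4
print(round(before, 1), round(after, 1))
```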

Here's the second one.

Nimble 1.0.1:

nimres$params$estimates[c(3,4,2)]
[1] 0.8455725401 1.7706886173 0.5499322589
as.data.frame(VarCorr(lme4_fit))[,"sdcor"]
[1] 0.8455725123 1.7706474293 0.5499321395

Nimble PR:

nimres$params$estimates[c(3,4,2)]
[1] 0.8458135633 1.7696294561 0.5498576631
as.data.frame(VarCorr(lme4_fit))[,"sdcor"]
[1] 0.8455725123 1.7706474293 0.5499321395

Here's the third one. For some of these we now only agree with lme4 to one digit whereas formerly it was 3.

Nimble 1.0.1:

nimres$randomEffects$estimates[25:30]
[1]  2.18568078339 -1.00980973536  1.93668152059 -0.09681245660 -0.01381269905
[6] -3.00180382211
as.vector(t(ranef(lme4_fit)$sample))
[1]  2.18565977680 -1.00983015004  1.93666056172 -0.09683302809 -0.01383328973
[6] -3.00182387067

Nimble PR:

nimres$randomEffects$estimates[25:30]
[1]  2.18411332336 -1.01136534626  1.93511500222 -0.09837143373 -0.01537198826
[6] -3.00335205010
as.vector(t(ranef(lme4_fit)$sample))
[1]  2.18565977680 -1.00983015004  1.93666056172 -0.09683302809 -0.01383328973
[6] -3.00182387067

perrydv commented 10 months ago

I'm guessing the Eigen update results in changes in order of computations deep within the implementation. Maybe some elements are stochastically closer to lme4 and some farther, but only the ones farther result in test failure so that's where our attention is drawn?
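The reordering hypothesis is easy to demonstrate: floating-point addition is not associative, so a library that regroups the same arithmetic can change low-order digits. A minimal Python sketch:

```python
# The same three terms, associated two ways:
a = (1e16 + 1.0) - 1e16   # the 1.0 is absorbed: 1e16 + 1.0 rounds back to 1e16
b = (1e16 - 1e16) + 1.0   # cancellation happens first, so the 1.0 survives

print(a, b)   # mathematically identical sums, different results
```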

paciorek commented 10 months ago

ah, that's a very good point. Let me look at some of the tests that still pass.

paciorek commented 10 months ago

@perrydv I ran some of the other tests and compared up to 12 sign. digits between nimble using current Eigen (i.e., devel) and nimble using Eigen 3.4.0. In those other tests, at least up to 12 digits (I didn't happen to look further) the nimble results are identical to each other. So it seems weird that just for the crossed random effects case, we get such a big difference.

I also see that for the crossed REs case, with Eigen 3.4.0 we report lack of convergence of the inner optimization, while with older Eigen we don't. The lack of convergence 'explains' the poorer results compared to lmer, but of course doesn't really tell us much about what is going on numerically.

I perturbed the data in the crossed RE test by adding rpois(144,1) to the original data. Now with Eigen 3.4.0 we don't have the lack of convergence of inner optimization, and the results with old and new Eigen agree to 5 sign digits and results with new Eigen relative to lmer are no worse. So that is reassuring.

If I perturb the data by adding 1 only to the first element of the data vector, I get (up to 12 digits) exactly the same results with the different Eigen versions. Neither agree within tolerance to lmer for expect_equal(nimres$params$estimates[1], lme4res$coefficients[,"Estimate"], tol=1e-6).
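For reference, the tol=1e-6 check above is a relative-tolerance comparison. In Python terms (math.isclose is analogous in spirit to testthat's expect_equal, not identical to it):

```python
import math

lme4_est = 22.97222222     # lme4 estimate from the first comparison above
nimble_pr = 22.97385529    # nimble estimate under Eigen 3.4.0

# The relative difference is about 7e-5, so a 1e-6 tolerance fails:
rel_diff = abs(nimble_pr - lme4_est) / abs(lme4_est)
passes = math.isclose(nimble_pr, lme4_est, rel_tol=1e-6)
print(rel_diff, passes)
```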

So I guess this reassures me somewhat. We still need to decide what to do about the failing test. Loosening the tolerance by enough to make it pass feels like it makes the test somewhat unuseful. We could take the approach of perturbing the data and loosening the tolerance a bit to get it to pass.

perrydv commented 10 months ago

Thanks for all this investigation. It is weird. It's tempting to think the particular example is randomly unlucky numerically, but who knows. Would another option be to tighten the convergence tolerances used by nimble and lme4?

paciorek commented 10 months ago

Yes, but I am wary of diving into tweaking Laplace this close to release. I could try to take a quick look Monday morning. Is your idea to try to achieve inner optimization convergence by modifying innerOptimControl, in particular increasing the max number of iterations? If not, I'm not sure what you mean by tightening the tolerance, since we are already reporting that inner convergence is not reached.

paciorek commented 9 months ago

I tried the following:

1. Given the lack of inner convergence, I tried setting maxit to 500 (the default is 100). That removes the lack-of-convergence warning, but the result with Eigen 3.4.0 is still worse than with older Eigen.
2. I tried various other values of reltol for the inner optimization. The idea here is that if we tighten the criterion, we may do a better job in the overall (outer) optimization. Results were mixed.
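The two knobs being varied here play standard roles in iterative optimization: maxit caps the number of iterations, and reltol sets the stopping criterion (R's optim documents its criterion as stopping when a step cannot reduce the value by reltol * (abs(val) + reltol)). A toy, stdlib-only Python sketch (hypothetical; not nimble's innerOptimControl machinery) of how a tight maxit produces a "lack of convergence" flag:

```python
def minimize_1d(f, grad, x0, reltol=1e-8, maxit=100, step=0.1):
    """Toy gradient descent: stop when the decrease in f falls below
    reltol * (abs(f) + reltol), or flag non-convergence after maxit steps."""
    x, fx = x0, f(x0)
    for it in range(1, maxit + 1):
        x = x - step * grad(x)
        fx_new = f(x)
        if abs(fx_new - fx) <= reltol * (abs(fx) + reltol):
            return x, it, True          # converged
        fx = fx_new
    return x, maxit, False              # hit maxit: "lack of convergence"

# Quadratic with minimum at 3: a tight iteration cap reports
# non-convergence even though the answer may already be close.
f = lambda x: (x - 3.0) ** 2
g = lambda x: 2.0 * (x - 3.0)
x_loose, _, ok_loose = minimize_1d(f, g, x0=0.0, maxit=5)
x_tight, _, ok_tight = minimize_1d(f, g, x0=0.0, maxit=500)
print(ok_loose, ok_tight)
```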

@perrydv I think given all of this, I'm ok with merging this in, but with slight wariness that use of new Eigen in this particular example seems to give a bit worse behavior.

paciorek commented 9 months ago

@perrydv and I decided to merge in given we need to update eventually regardless.

Also, if I update the beta init value to be consistent with the data (thanks @paul-vdb for pointing me to the question of initial values), we get consistent behavior with no inner optim convergence failures between the different Eigen versions.

Will merge once tests run.