This adds hypothesis tests (again, this is a continuation of #224). They're quite slow to run, but it's not prohibitive.
This tests the following hypotheses:
- Given a model whose initial parameter values are already at the optimum, the minimizers do not find a worse answer
- Given a linear model (specifically a polynomial), we can find the minimum starting from any initial guess
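A minimal sketch of the two properties above, using the `hypothesis` library with a plain least-squares objective and `scipy.optimize.minimize`. This is illustrative only and does not use the project's actual model or minimizer API; the model here is a hypothetical straight line fitted to noise-free data, so the true parameters are the global optimum by construction.

```python
import numpy as np
from scipy.optimize import minimize
from hypothesis import given, settings
from hypothesis import strategies as st

x = np.linspace(-1.0, 1.0, 50)

def chi2(params, y):
    # Sum-of-squares objective for the (hypothetical) model y = a*x + b.
    a, b = params
    return float(np.sum((y - (a * x + b)) ** 2))

@given(st.floats(-10, 10), st.floats(-10, 10))
@settings(max_examples=20, deadline=None)
def test_optimum_is_stable(a_true, b_true):
    # Property 1: starting at the exact optimum, the minimizer
    # must not return a worse objective value.
    y = a_true * x + b_true  # noise-free data: (a_true, b_true) is optimal
    start = chi2((a_true, b_true), y)
    res = minimize(chi2, x0=[a_true, b_true], args=(y,))
    assert res.fun <= start + 1e-9

@given(st.floats(-10, 10), st.floats(-10, 10),
       st.floats(-100, 100), st.floats(-100, 100))
@settings(max_examples=20, deadline=None)
def test_linear_model_from_any_guess(a_true, b_true, a0, b0):
    # Property 2: for a model linear in its parameters the least-squares
    # surface is convex, so any initial guess should reach the optimum.
    y = a_true * x + b_true
    res = minimize(chi2, x0=[a0, b0], args=(y,))
    assert np.allclose(res.x, [a_true, b_true], atol=1e-4)

test_optimum_is_stable()
test_linear_model_from_any_guess()
```

Bounded `st.floats` strategies exclude NaN and infinity by default, which keeps the objective finite without extra filtering.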
Still TODO in future iterations:
- [ ] Constraints
- [ ] Fixed parameters
- [ ] ODE Models
- [ ] Pick random minimizers
- [ ] Other objectives that are not LeastSquares
Constraints are hard, since we'll need to check whether the best answer we have is still the best given the constraints. Fixed parameters have the same issue, unless you fix one (or more) at the optimum.
ODE Models shouldn't be too bad.
I still don't understand loglikelihood well enough to implement a test for it.