Open LisaSchlueter opened 1 month ago
In principle we can use automatic differentiation (ForwardDiff.jl, if it's just a handful of parameters) to enable LBFGS and similar methods. We have to make sure the likelihood is differentiable in principle, but for fitting a function to a histogram that's typically not a problem.
This does not work for multithreaded code.
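A minimal sketch of what this could look like, using Optim.jl's built-in ForwardDiff support via the `autodiff = :forward` keyword (the objective below is a hypothetical stand-in for the actual histogram likelihood, not code from this package):

```julia
using Optim

# Hypothetical stand-in objective for the negative log-likelihood.
# ForwardDiff only requires that the function accepts generic number
# types, which plain Julia code usually does automatically.
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

x0 = zeros(2)

# autodiff = :forward makes Optim construct the gradient with ForwardDiff,
# so a gradient-based optimizer like LBFGS can be used directly.
res = optimize(f, x0, LBFGS(); autodiff = :forward)
```

As noted above, this forward-mode path assumes the objective itself is not multithreaded.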
The default algorithm of the Optim minimizer is `NelderMead()`, which is a gradient-free method. I'd like to explore gradient-based algorithms, such as `GradientDescent()` or `LBFGS()`.

Motivation: when going to a gradient-based algorithm, it would make sense to provide the minimizer with the analytical gradients. @oschulz do you have experience on how to do that in Julia?
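For reference, a sketch of how an analytical gradient can be supplied to Optim.jl: the gradient is passed as an in-place function `g!(G, x)` that fills the storage array `G` (the objective and its gradient here are illustrative, not the actual likelihood):

```julia
using Optim

# Illustrative objective (Rosenbrock-style, hypothetical).
f(x) = (1.0 - x[1])^2 + 100.0 * (x[2] - x[1]^2)^2

# Analytical gradient, written in place as Optim expects: g!(storage, x).
function g!(G, x)
    G[1] = -2.0 * (1.0 - x[1]) - 400.0 * x[1] * (x[2] - x[1]^2)
    G[2] = 200.0 * (x[2] - x[1]^2)
end

# Passing g! alongside f lets LBFGS use the exact gradient
# instead of finite differences or AD.
res = optimize(f, g!, zeros(2), LBFGS())
```

The same `optimize(f, g!, x0, method)` signature works for other gradient-based methods such as `GradientDescent()`.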