JuliaNLSolvers / Optim.jl

Optimization functions for Julia

BHHH for Likelihood Optimization #1055

Open ParadaCarleton opened 8 months ago

ParadaCarleton commented 8 months ago

BHHH is a second-order algorithm that (conceptually) approximates the Hessian of a log-likelihood by the sum of outer products of the per-observation scores (the gradients of the likelihood contributions). This is justified by the information matrix equality in statistics, which states that E[s(θ) s(θ)'] = −E[H(θ)], where s is the score and H the Hessian of the log-likelihood; this makes the outer product of scores a consistent estimator of the (negative) Hessian, one that can usually be computed much more cheaply than the full Hessian. This method is widely used in statistics.
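As a quick numerical sanity check of the equality above (a standalone sketch, not Optim.jl code): for a N(μ, σ²) model with known σ, the per-observation score with respect to μ is (x − μ)/σ² and the per-observation Hessian is the constant −1/σ², so the average squared score evaluated at the true μ should approach −(Hessian):

```python
import random

random.seed(0)
mu, sigma, n = 2.0, 1.5, 200_000
xs = [random.gauss(mu, sigma) for _ in range(n)]

# Per-observation score of the log-likelihood wrt mu: (x - mu) / sigma^2
scores = [(x - mu) / sigma**2 for x in xs]
# Per-observation Hessian wrt mu: -1 / sigma^2 (constant for this model)
hess = -1 / sigma**2

# BHHH-style approximation: average outer product of scores
# (scalar parameter, so the "outer product" is just s^2)
outer_product_avg = sum(s * s for s in scores) / n

print(outer_product_avg)  # close to 1/sigma^2
print(-hess)              # exactly 1/sigma^2
```

The two printed values agree up to sampling noise, which is the information matrix equality at work.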

Is there an implementation of BHHH in Optim.jl, or are there any plans to add it?

pkofod commented 6 months ago

I'm well aware of BHHH, but I'm not sure what you would want beyond the Newton method? Is it because you want Optim to automatically write the outer product of the score using AD?

ParadaCarleton commented 6 months ago

Yep!

pkofod commented 5 months ago

Okay, then I suppose you'd have to either a) have a vector objective type that can then be interpreted according to some aggregation (a sum here, I suppose, since you'd have likelihood contributions in your use case), or b) simply a constructor that constructs a normal objective type?
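Either way, the iteration itself would look roughly like this (a hedged sketch, not Optim.jl's API; the function name `bhhh_step` and the per-observation-scores interface are hypothetical): the objective supplies a matrix of likelihood-contribution gradients, which is summed for the total score and self-outer-producted for the approximate Hessian.

```python
import numpy as np

def bhhh_step(theta, per_obs_scores, step_size=1.0):
    """One BHHH iteration for maximizing a log-likelihood (sketch)."""
    G = per_obs_scores(theta)   # n_obs x n_params matrix of per-observation scores
    g = G.sum(axis=0)           # total score = gradient of the log-likelihood
    B = G.T @ G                 # outer-product (BHHH) Hessian approximation
    # Ascent step: solve B * d = g rather than inverting B explicitly
    return theta + step_size * np.linalg.solve(B, g)

# Toy example: MLE of mu for N(mu, 1) data; per-observation score is x_i - mu.
rng = np.random.default_rng(0)
xs = rng.normal(3.0, 1.0, size=1000)
theta = np.array([0.0])
for _ in range(50):
    theta = bhhh_step(theta, lambda t: (xs - t[0]).reshape(-1, 1))
print(theta)  # converges to the sample mean of xs
```

The key point for the API discussion: the only thing BHHH needs beyond a standard gradient-based method is access to the *un-aggregated* per-observation gradients, which is exactly where a vector objective type (or a constructor that wraps one into a normal objective) would come in.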