sbinet opened this issue 4 years ago
I took a crack at this and ended up with this:
```go
import (
	"gonum.org/v1/gonum/diff/fd"
	"gonum.org/v1/gonum/mat"
)

// FuncSettings describes the methods of derivation for a function.
type FuncSettings struct {
	Grad func(fct func(ps []float64) float64) func(grad []float64, ps []float64)
	Hess func(fct func(ps []float64) float64) func(hess *mat.SymDense, x []float64)
}

// init fills in numerical-differentiation defaults for any field left nil.
func (settings *FuncSettings) init() {
	if settings.Grad == nil {
		settings.Grad = func(fct func(ps []float64) float64) func(grad []float64, ps []float64) {
			return func(grad []float64, ps []float64) {
				fd.Gradient(grad, fct, ps, nil)
			}
		}
	}
	if settings.Hess == nil {
		settings.Hess = func(fct func(ps []float64) float64) func(hess *mat.SymDense, x []float64) {
			return func(hess *mat.SymDense, x []float64) {
				fd.Hessian(hess, fct, x, nil)
			}
		}
	}
}
```
This was the cleanest way I could come up with to expose the cost function outside the constructor of `Func*D`. I then went looking for other implementations of gradient computation and was surprised not to find anything in pure Go other than your gonum/autodiff (I am new to scientific computing in Go). It looks like a good implementation, but it would likely require writing the cost function out to a temporary file, which doesn't seem ideal for reliability.
In conclusion, I don't think anything I wrote is an improvement on what already exists in the codebase, so I am going to pick up a different first issue.
hi @jordanrule and thanks for your interest in Go-HEP.
when I filed that issue I was probably thinking of simply `s/grad/Grad/` (the same for `hess`), and of modifying `Func1D.init` to set `Grad` with `fd.Gradient` if it was nil.

one could then provide a simple example with a well-known `Grad` function.
w.r.t. `autofd`, I guess one could just put the result of the `autofd` command in a dedicated file, adding the exact command used to generate it (and/or add it as a `//go:generate` directive).
right now, the `fit.Func1D` and `fit.FuncND` automatically compute the gradient of the provided function to fit, using `gonum/diff/fd` (a package that applies numerical differentiation to approximate the derivatives).

we should probably consider exposing the `Grad` field, making it optional, so users could provide an implementation, perhaps generated with gonum/autodiff. (we should probably provide such an example.)
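A minimal sketch of that proposal, with hypothetical names (in the real code, gonum's `fd.Gradient` would replace the hand-rolled default used here to keep the example self-contained):

```go
package main

import "fmt"

// Func1D mimics the proposed shape of fit.Func1D: Grad is optional,
// and init falls back to a numerical gradient when it is nil.
// (Names and fields are illustrative, not the actual go-hep API.)
type Func1D struct {
	F    func(x float64, ps []float64) float64
	Grad func(grad []float64, x float64, ps []float64)
}

func (f *Func1D) init() {
	if f.Grad == nil {
		// Default: central finite differences w.r.t. the parameters,
		// standing in here for gonum/diff/fd's fd.Gradient.
		f.Grad = func(grad []float64, x float64, ps []float64) {
			const h = 1e-6
			for i := range ps {
				orig := ps[i]
				ps[i] = orig + h
				fp := f.F(x, ps)
				ps[i] = orig - h
				fm := f.F(x, ps)
				ps[i] = orig
				grad[i] = (fp - fm) / (2 * h)
			}
		}
	}
}

func main() {
	// The user supplies an analytic gradient for f(x; a, b) = a*x + b.
	line := &Func1D{
		F:    func(x float64, ps []float64) float64 { return ps[0]*x + ps[1] },
		Grad: func(grad []float64, x float64, ps []float64) { grad[0], grad[1] = x, 1 },
	}
	line.init() // keeps the user-supplied Grad untouched

	grad := make([]float64, 2)
	line.Grad(grad, 2, []float64{3, 4})
	fmt.Println(grad) // [2 1]
}
```

The point of the design is backward compatibility: existing callers that leave `Grad` nil keep today's numerical behaviour, while callers with an analytic or autodiff-generated gradient can plug it in.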