a-jp closed this issue 2 years ago
Would anyone be able to help with this?
Hey, yes, you can provide the gradient and Hessian for the algorithms that make use of them. Just add the necessary methods to your UDP; it's that simple.
For example, have a look at the code here: https://esa.github.io/pygmo2/tutorials/nlopt_basics.html
In particular at the section "I do not have the gradient".
I know it's confusing, as in that case you do have the gradient, but bear with me. In that example:
```python
def gradient(self, x):
    return pg.estimate_gradient_h(lambda x: self.fitness(x), x)
```
we have added to the UDP a method called gradient that returns a numerical estimate. If you do have the exact gradient (for example from an autodiff package), you can simply return it there!
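To make that concrete, here is a minimal sketch of a UDP that returns an exact, analytical gradient instead of a finite-difference estimate. The class name `SphereUDP` and the objective are illustrative, not from the thread; note that pygmo UDPs are plain Python classes (duck-typed), so the class itself needs no pygmo import and would only be wrapped later, e.g. with `pg.problem(SphereUDP())`.

```python
# Sketch of a UDP (user-defined problem) with an exact gradient.
# pygmo only requires the class to expose these methods, so this
# definition is pygmo-free; wrap it with pg.problem(...) to use it.

class SphereUDP:
    """Minimise f(x) = sum(x_i^2) on the box [-1, 1]^2."""

    def fitness(self, x):
        # Single-objective fitness: a list with one value.
        return [sum(xi * xi for xi in x)]

    def gradient(self, x):
        # Analytical gradient df/dx_i = 2*x_i -- this is exactly
        # where an autodiff result could be returned instead.
        return [2.0 * xi for xi in x]

    def get_bounds(self):
        # (lower bounds, upper bounds)
        return ([-1.0, -1.0], [1.0, 1.0])
```

Because `gradient` is present, gradient-based solvers such as SLSQP (via `pg.nlopt("slsqp")`) will call it instead of requiring numerical differentiation.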
Hi,
When using SciPy's SLSQP directly, I use the autograd Python package to compute derivatives. When using SLSQP through pygmo (or indeed Ipopt through pygmo), is it possible to use autograd to compute the derivatives rather than the numerical differentiation provided? If so, would it be possible to have a small working example showing how to do that?
Many thanks, Andy
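The pattern asked about here can be sketched without autograd itself: the snippet below uses a tiny forward-mode autodiff built on dual numbers purely as a stand-in, to show that a UDP's `gradient` method can return an autodiff result rather than a finite-difference estimate. The `Dual`, `grad`, and `QuadUDP` names are hypothetical; in practice `grad` would be replaced by `autograd.grad` applied to the objective.

```python
# Hypothetical stand-in for autograd: minimal forward-mode autodiff
# via dual numbers, only to illustrate returning an autodiff gradient
# from a UDP's gradient() method.

class Dual:
    def __init__(self, val, eps=0.0):
        self.val, self.eps = val, eps

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.eps + other.eps)
    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.val * other.eps + self.eps * other.val)
    __rmul__ = __mul__

def grad(f, x):
    # One forward pass per coordinate: seed eps=1 on coordinate i.
    out = []
    for i in range(len(x)):
        duals = [Dual(v, 1.0 if j == i else 0.0) for j, v in enumerate(x)]
        out.append(f(duals).eps)
    return out

class QuadUDP:
    """Minimise f(x) = x0^2 + 3*x0*x1 on the box [-5, 5]^2."""

    @staticmethod
    def _obj(x):
        return x[0] * x[0] + 3.0 * x[0] * x[1]

    def fitness(self, x):
        return [self._obj(x)]

    def gradient(self, x):
        # Autodiff gradient -- with autograd installed, this line
        # would become: return autograd.grad(self._obj)(x)
        return grad(self._obj, x)

    def get_bounds(self):
        return ([-5.0, -5.0], [5.0, 5.0])
```

As above, wrapping this class with `pg.problem(QuadUDP())` would let pygmo's gradient-based algorithms use the autodiff derivatives directly.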