-
Nonlinear optimization algorithms that leverage only gradient information (i.e. "first-order methods") can have trouble traversing the cost function as the Hessian becomes ill-conditioned.…
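A minimal sketch of the effect, using plain gradient descent on a quadratic with a deliberately ill-conditioned Hessian (condition number 100). The matrix and step size here are illustrative choices, not from the original post:

```python
import numpy as np

# Gradient descent on f(x) = 0.5 * x^T A x with Hessian A = diag(1, 100).
# The step size is capped by the largest eigenvalue (lr < 2 / lambda_max),
# so the well-conditioned coordinate crawls while the stiff one converges.
A = np.diag([1.0, 100.0])

def grad(x):
    return A @ x

x = np.array([1.0, 1.0])
lr = 1.0 / 100.0  # safe for lambda_max = 100
for _ in range(100):
    x = x - lr * grad(x)

# x[1] is driven to 0 in one step, but x[0] only shrinks by a factor
# of 0.99 per step, so after 100 steps x[0] is still about 0.37.
```

This is exactly why second-order (or preconditioned) methods help: they rescale the step per direction instead of using one global step size.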
-
-
Hello @amontoison @gdalle @tmigot,
First, thank you for your work on this project! I recently updated to v0.8.3 and encountered an issue with my code. I use OrdinaryDiffEq to solve an ODE within th…
-
I'm trying to calculate the minimum of a function using a `NonlinearConstraint`, and I'm using autograd to get all my derivatives. For the input to minimize itself, everything works fine. For `Nonli…
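For reference, a hedged sketch of the `NonlinearConstraint` API with an explicitly supplied constraint Jacobian; the toy problem (minimize x² + y² subject to x + y ≥ 1) and all function names are illustrative, and the analytic derivatives stand in for whatever `autograd.jacobian` would produce:

```python
import numpy as np
from scipy.optimize import minimize, NonlinearConstraint

def objective(x):
    return x[0] ** 2 + x[1] ** 2

def objective_grad(x):
    return 2.0 * x

def con_fun(x):
    return x[0] + x[1]

def con_jac(x):
    # Shape (n_constraints, n_variables); this is where an autodiff
    # Jacobian callable would be plugged in.
    return np.array([[1.0, 1.0]])

nlc = NonlinearConstraint(con_fun, 1.0, np.inf, jac=con_jac)
res = minimize(objective, x0=np.array([2.0, 0.0]), jac=objective_grad,
               method="trust-constr", constraints=[nlc])
# Optimum at x = y = 0.5
```

The key detail is that `NonlinearConstraint` expects `jac` to return a 2-D array of shape `(m, n)` even for a single scalar constraint, which is a common source of shape errors when wiring in autodiff output.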
-
The jac=True option for `optimize.minimize` is very useful when using automatic differentiation utilities because it avoids duplicated computations encountered with a separate jac callable. I am wonde…
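A small sketch of the `jac=True` pattern on the Rosenbrock function: the objective returns a `(value, gradient)` tuple so the shared intermediate `r` is computed once per call rather than separately for the value and the gradient:

```python
import numpy as np
from scipy.optimize import minimize

def rosen_with_grad(x):
    a, b = 1.0, 100.0
    r = x[1] - x[0] ** 2  # shared intermediate, computed once
    f = (a - x[0]) ** 2 + b * r ** 2
    g = np.array([-2.0 * (a - x[0]) - 4.0 * b * r * x[0],
                  2.0 * b * r])
    return f, g

# jac=True tells minimize that the callable returns (f, grad) together.
res = minimize(rosen_with_grad, x0=np.array([-1.2, 1.0]),
               jac=True, method="BFGS")
# Converges to the minimizer [1, 1]
```

With reverse-mode autodiff this is especially natural, since the gradient pass reuses the forward pass's intermediates anyway.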
-
I figured out how to do a relatively fast (on the order of the cost of the gradient-finding step [the backpropagation, i.e. reverse-mode autodiff], as opposed to the naive order of the gr…
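One standard way to get a Hessian-vector product at roughly the cost of one extra gradient evaluation (rather than forming the full Hessian) is a forward difference of gradients, H(x)v ≈ (∇f(x + εv) − ∇f(x)) / ε; autodiff frameworks compute the same product exactly via forward-over-reverse mode. A hedged sketch with an illustrative test function, f(x) = 0.5·Σxᵢ⁴:

```python
import numpy as np

def grad_f(x):
    # Gradient of f(x) = 0.5 * sum(x**4); the exact Hessian is diag(6 * x**2).
    return 2.0 * x ** 3

def hvp(grad, x, v, eps=1e-6):
    # Hessian-vector product from two gradient evaluations.
    return (grad(x + eps * v) - grad(x)) / eps

x = np.array([1.0, 2.0])
v = np.array([1.0, 0.0])
Hv = hvp(grad_f, x, v)
# Exact answer is diag(6 * x**2) @ v = [6, 0]
```

The finite-difference version trades a little accuracy (O(ε) error) for simplicity; the autodiff version has the same asymptotic cost with no truncation error.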
-
Hi JSO team,
Do you have any guidance on which of these solvers is "better" for unconstrained problems, and, if one is, why? I had naively assumed that a solver specifically designed for unconstrai…
-
Hi,
I want to get the second-order gradients of an nn.Module function. I use `make_functional_with_buffers` to wrap it, but I fail to get the second-order gradients and hit a RuntimeError. The code is as…
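A common cause of this RuntimeError is that the first differentiation was not run with `create_graph=True`, so the gradient itself is not differentiable. A minimal sketch with plain `torch.autograd` (a simplified stand-in for the functional wrapping above, not the poster's actual code):

```python
import torch

# Second-order gradients of y = x**3 at x = 2.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 3

# create_graph=True keeps the graph of the gradient computation alive,
# which is what makes the second grad call possible.
(g,) = torch.autograd.grad(y, x, create_graph=True)  # dy/dx  = 3x^2
(h,) = torch.autograd.grad(g, x)                     # d2y/dx2 = 6x
```

The same flag applies when the first-order gradient comes from a functional transform: whatever produces the inner gradient must build a differentiable graph.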
-
```
File "/home/phu/Desktop/gatedtabtransformer/sophia_custom.py", line 46, in step
    hessian_estimate = self.hutchinson(p, grad)
File "/home/phu/Desktop/gatedtabtransformer/sophia_custom.py", line…
```
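For context, a hedged sketch of the Hutchinson-style diagonal estimator that Sophia-type optimizers rely on: the Hessian diagonal is estimated from Hessian-vector products with random Rademacher probes, using E[z ⊙ (Hz)] = diag(H). The explicit matrix here is a stand-in; in practice only the product `H @ z` is available (e.g. via a double-backward pass):

```python
import numpy as np

rng = np.random.default_rng(0)
H = np.diag([1.0, 2.0, 3.0])  # illustrative stand-in Hessian

def hutchinson_diag(hvp, dim, n_samples=100):
    # Average z * (H z) over Rademacher vectors z (entries +/-1).
    est = np.zeros(dim)
    for _ in range(n_samples):
        z = rng.choice([-1.0, 1.0], size=dim)
        est += z * hvp(z)
    return est / n_samples

diag_est = hutchinson_diag(lambda z: H @ z, 3)
# diag_est recovers [1, 2, 3]
```

For a truly diagonal H the estimate is exact because z² = 1 elementwise; with off-diagonal terms the cross contributions average out over samples.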
-
Admittedly, the notion of an optimizer is somewhat fuzzy, and so is the class `Optimizer`.
We should clarify the definitions we are going to use in the library.
**Definitions (to be added in a w…