The following code leads to an error:
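A minimal sketch of the kind of setup that hits this, assuming Optimization.jl with the OptimizationOptimJL wrapper; the objective, bounds, and the choice of NelderMead are just illustrative:

```julia
using Optimization, OptimizationOptimJL

# Derivative-free setup: no gradient is passed and no AD backend is requested.
rosenbrock(u, p) = (p[1] - u[1])^2 + p[2] * (u[2] - u[1]^2)^2

optf = OptimizationFunction(rosenbrock)
prob = OptimizationProblem(optf, zeros(2), [1.0, 100.0];
    lb = [-1.0, -1.0], ub = [2.0, 2.0])

# NelderMead itself is a ZerothOrderOptimizer, but because of the bounds it gets
# wrapped for the box-constrained solve, and the check in OptimJLOptimizationCache
# (quoted below) then errors, asking for derivatives.
sol = solve(prob, NelderMead())
```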
because `OptimJLOptimizationCache` has

```julia
!(opt isa Optim.ZerothOrderOptimizer) && f.grad === nothing &&
    error("Use OptimizationFunction to pass the derivatives or automatically generate them with one of the autodiff backends")
```

but, with constraints, `opt` in this case is an `Optim.Fminbox` wrapping the actual method, so it is not a `ZerothOrderOptimizer` and the error is thrown even though the inner optimizer is derivative-free. I think something like

```julia
function _optim_requires_grad(opt)
    if opt isa Optim.ConstrainedOptimizer
        return true
    elseif opt isa Optim.AbstractConstrainedOptimizer # Optim.ConstrainedOptimizer isn't sufficient - Fminbox isn't one
        return _optim_requires_grad(opt.method)
    else
        return !(opt isa Optim.ZerothOrderOptimizer)
    end
end
```

is needed, e.g.
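Presumably the cache check would then call the helper instead of testing for `ZerothOrderOptimizer` directly; a sketch of what I have in mind (not what is currently in the package):

```julia
_optim_requires_grad(opt) && f.grad === nothing &&
    error("Use OptimizationFunction to pass the derivatives or automatically generate them with one of the autodiff backends")
```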
Haven't tested anything for any of the other packages.
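For reference, this is what the helper would be expected to return for a few representative Optim methods, going by Optim's optimizer type tree (untested, as noted above):

```julia
using Optim

_optim_requires_grad(NelderMead())           # false: ZerothOrderOptimizer
_optim_requires_grad(LBFGS())                # true:  first-order method
_optim_requires_grad(IPNewton())             # true:  Optim.ConstrainedOptimizer
_optim_requires_grad(Fminbox(NelderMead()))  # false: recurses into the inner NelderMead
_optim_requires_grad(Fminbox(LBFGS()))       # true:  recurses into the inner LBFGS
```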