Closed ChrisNabold closed 1 year ago
That is because the gradient storage argument goes first in the gradient function's signature... So
function g!(storage::Vector, x::Vector)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    storage[2] = 200.0 * (x[2] - x[1]^2)
end
Looks like you somehow found some really really old documentation...
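For reference, here is a minimal sketch of the full corrected run, assuming the current Optim.jl API where the in-place gradient takes the storage buffer first:

```julia
using Optim

# Rosenbrock objective from the report
f(x) = 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2

# In-place gradient: the storage buffer is the FIRST argument
function g!(storage, x)
    storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
    storage[2] = 200.0 * (x[2] - x[1]^2)
end

x0 = [-1.2, 1.0]
sol = optimize(f, g!, x0, BFGS())
```

With the arguments in this order, BFGS should converge to roughly `[1.0, 1.0]` instead of producing NaN.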
I tried to run the Rosenbrock example from the docs.
julia> using Optim
julia> f(x) = 100*(x[2] - x[1]*x[1])^2 + (1 - x[1])^2
f (generic function with 1 method)
julia> x0=[-1.2,1];
julia> sol=optimize(f,x0)
 * Status: success

 * Candidate solution
    Final objective value:     4.657541e-09

 * Found with
    Algorithm:     Nelder-Mead

 * Convergence measures
    √(Σ(yᵢ-ȳ)²)/n ≤ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    78
    f(x) calls:    149
julia> sol.minimizer
2-element Vector{Float64}:
 1.0000117499532974
 1.0000167773369513

julia> sol.minimum
4.657541291131408e-9

julia> sol.iterations
78
julia>
julia> function g!(x::Vector, storage::Vector)
           storage[1] = -2.0 * (1.0 - x[1]) - 400.0 * (x[2] - x[1]^2) * x[1]
           storage[2] = 200.0 * (x[2] - x[1]^2)
       end
g! (generic function with 1 method)
julia> sol=optimize(f,g!,x0,BFGS())
 * Status: failure

 * Candidate solution
    Final objective value:     NaN

 * Found with
    Algorithm:     BFGS

 * Convergence measures
    |x - x'|               = NaN ≰ 0.0e+00
    |x - x'|/|x'|          = NaN ≰ 0.0e+00
    |f(x) - f(x')|         = NaN ≰ 0.0e+00
    |f(x) - f(x')|/|f(x')| = NaN ≰ 0.0e+00
    |g(x)|                 = NaN ≰ 1.0e-08

 * Work counters
    Seconds run:   0  (vs limit Inf)
    Iterations:    0
    f(x) calls:    1
    ∇f(x) calls:   1
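If maintaining a hand-written gradient feels error-prone, a sketch of an alternative (assuming a recent Optim.jl with ForwardDiff support) is to let the package build the gradient itself, which sidesteps argument-order mistakes entirely:

```julia
using Optim

# Same Rosenbrock objective as above
f(x) = 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
x0 = [-1.2, 1.0]

# autodiff = :forward asks Optim to derive the gradient via ForwardDiff,
# so no manual g! (and no signature to get wrong) is needed
sol = optimize(f, x0, BFGS(); autodiff = :forward)
```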