SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

[OptimizationMOI] improve performance of Jacobian and Hessian accesses #680

Closed odow closed 6 months ago

odow commented 6 months ago

cc @ccoffrin

Additional context

The problem is that accessing `evaluator.J` and `evaluator.H` in the hot loops goes through https://github.com/SciML/Optimization.jl/blob/d6bea20a466fb0fbb30592c8dc3f0003cad9081d/lib/OptimizationMOI/src/nlp.jl#L23-L26, which is expensive because it calls `fieldnames` on every access.
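As a rough illustration (hypothetical struct and helper names, not the actual `nlp.jl` code): a property access that scans `fieldnames` at runtime pays a reflection cost on every call, whereas hoisting the field into a concretely typed local binding pays it once.

```julia
# Hypothetical sketch of the performance issue, not the PR's actual code.
struct Evaluator
    J::Matrix{Float64}
    H::Matrix{Float64}
    backend::Symbol
end

# A fieldnames-based lookup does runtime reflection on every access:
function slow_get(ev, name::Symbol)
    if name in fieldnames(typeof(ev))  # runtime reflection in the hot path
        return getfield(ev, name)
    end
    error("no field $name")
end

# The cheaper pattern: read the field once, then loop over a
# type-stable local instead of repeating the property access.
function sum_jac(ev::Evaluator)
    J = ev.J  # single access, concretely typed local
    s = 0.0
    for x in J
        s += x
    end
    return s
end
```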

Benchmark

julia> import ForwardDiff

julia> import Ipopt

julia> import Optimization

julia> import OptimizationMOI

julia> import Test

julia> function _run_optimization(N, automatic_differentiation)
           h = 1 / N
           alpha = 350
           x_offset = N + 1
           u_offset = 2(N + 1)
           function objective_fn(x, p)
               return sum(
                   0.5 * h * (x[x_offset+i+1]^2 + x[x_offset+i]^2) +
                   0.5 * alpha * h * (cos(x[i+1]) + cos(x[i])) for i in 1:N
               )
           end
           function constraint_fn(res, x, p)
               for i in 1:N
                   res[i] = x[x_offset+i+1] - x[x_offset+i] - 0.5 * h * (sin(x[i+1]) + sin(x[i]))
               end
               for i in 1:N
                   res[N+i] = x[i+1] - x[i] - 0.5 * h * x[u_offset+i+1] - 0.5 * h * x[u_offset+i]
               end
               return
           end
           prob = Optimization.OptimizationProblem(
               Optimization.OptimizationFunction(
                   objective_fn,
                   automatic_differentiation;
                   cons = constraint_fn,
               ),
               zeros(3 * (N + 1)),
               nothing;
               lb = vcat(fill(-1.0, N+1), fill(-0.05, N+1), fill(-Inf, N+1)),
               ub = vcat(fill(1.0, N+1), fill(0.05, N+1), fill(Inf, N+1)),
               lcons = zeros(2 * N),
               ucons = zeros(2 * N),
           )
           sol = Optimization.solve(prob, Ipopt.Optimizer(); print_level = 0)
           Test.@test ≈(sol.objective, 350.0; atol = 1e-6)
           Test.@test ≈(sol.u, zeros(3 * (N + 1)); atol = 1e-6)
           return
       end
_run_optimization (generic function with 1 method)

julia> run_forward_diff(N) = _run_optimization(N, Optimization.AutoForwardDiff())
run_forward_diff (generic function with 1 method)

julia> @time run_forward_diff(100);

******************************************************************************
This program contains Ipopt, a library for large-scale nonlinear optimization.
 Ipopt is released as open source code under the Eclipse Public License (EPL).
         For more information visit https://github.com/coin-or/Ipopt
******************************************************************************

 13.697718 seconds (9.53 M allocations: 751.142 MiB, 7.05% gc time, 92.11% compilation time: <1% of which was recompilation)

julia> @time run_forward_diff(100);
  0.564060 seconds (2.62 M allocations: 299.192 MiB, 17.04% gc time)

julia> # Revise to go back to old code

julia> @time run_forward_diff(100);
  0.986995 seconds (3.23 M allocations: 319.710 MiB, 4.80% gc time, 14.28% compilation time)

julia> @time run_forward_diff(100);
  0.815753 seconds (3.17 M allocations: 315.911 MiB, 1.76% gc time)

Profile

Before

[profile screenshot]

After

[profile screenshot]
codecov[bot] commented 6 months ago

Codecov Report

Attention: 12 lines in your changes are missing coverage. Please review.

Comparison is base (d6bea20) 3.25% compared to head (54085d2) 8.73%.

| Files | Patch % | Lines |
|---|---|---|
| lib/OptimizationMOI/src/nlp.jl | 0.00% | 12 Missing :warning: |

Additional details and impacted files

```diff
@@            Coverage Diff            @@
##           master     #680      +/-  ##
=========================================
+ Coverage    3.25%    8.73%    +5.47%
=========================================
  Files          23       31        +8
  Lines        1506     2519     +1013
=========================================
+ Hits           49      220      +171
- Misses       1457     2299      +842
```


odow commented 6 months ago

Format-check seems unrelated. Also failing on master: https://github.com/SciML/Optimization.jl/actions/runs/7493571366/job/20399569099

odow commented 6 months ago

This makes little difference to the `Optimization.AutoSparseReverseDiff()` case, because the main cost there is computing the Hessian sparsity.

[profile screenshot]
Vaibhavdixit02 commented 6 months ago

Yeah, the sparsity detection is the main contributor across the board. Even with the MTK implementation that would be the major part. We need a more performant one 😅
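For a rough sense of why detection dominates: finding the Jacobian sparsity pattern means probing the whole constraint function, so it should run once and be cached, not repeat inside every derivative call. A minimal sketch with a naive finite-difference probe (a hypothetical helper; real tooling uses symbolic or tracing-based detection):

```julia
using SparseArrays

# Naive Jacobian sparsity detection by perturbing one input at a time.
# Hypothetical illustration only: O(n) evaluations of f! per detection,
# which is exactly the cost you want to pay once, up front.
function detect_sparsity(f!, y, x; eps = 1e-6)
    n, m = length(x), length(y)
    base = similar(y)
    f!(base, x)
    I_, J_ = Int[], Int[]
    for j in 1:n
        xp = copy(x)
        xp[j] += eps
        yp = similar(y)
        f!(yp, xp)
        for i in 1:m
            if yp[i] != base[i]
                push!(I_, i)
                push!(J_, j)
            end
        end
    end
    return sparse(I_, J_, trues(length(I_)), m, n)
end

# res[1] depends on x[1], x[2]; res[2] depends on x[3] only.
f!(res, x) = (res[1] = x[1] * x[2]; res[2] = x[3]^2; nothing)
pattern = detect_sparsity(f!, zeros(2), ones(3))
```

Caching `pattern` and reusing it across solves is the win; re-detecting inside the hot loop swamps the actual derivative evaluations.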

Vaibhavdixit02 commented 6 months ago

Is there a future where the MOI reverse mode becomes a package? 😄

odow commented 6 months ago

It's just part of MOI, sooo it's already a package?

The main blocker is that you'd need to convert the symbolic form into MOI's expression format. It also operates on a single model with all the constraints, not function-by-function.
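To make the conversion blocker concrete, a hedged sketch of MOI's expression format (based on the documented `MOI.Nonlinear` API; the exact lowering from a symbolic form is the part that does not exist): the model holds the objective and all constraints together as Julia `Expr` trees over `MOI.VariableIndex` values, and the reverse-mode evaluator is built over that single model.

```julia
import MathOptInterface as MOI

# One MOI.Nonlinear.Model carries the objective and every constraint;
# expressions interpolate MOI.VariableIndex values directly.
x = MOI.VariableIndex(1)
model = MOI.Nonlinear.Model()
MOI.Nonlinear.set_objective(model, :($x^2 + sin($x)))
MOI.Nonlinear.add_constraint(model, :(exp($x)), MOI.LessThan(2.0))

# Build an evaluator using MOI's sparse reverse-mode AD.
evaluator = MOI.Nonlinear.Evaluator(model, MOI.Nonlinear.SparseReverseMode(), [x])
MOI.initialize(evaluator, [:Grad])
g = zeros(1)
MOI.eval_objective_gradient(evaluator, g, [2.0])  # d/dx (x^2 + sin x) at x = 2
```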