
MathOptSymbolicAD

This package implements an experimental symbolic automatic differentiation backend for JuMP.

For more details, see Oscar's JuMP-dev 2022 talk.

Installation

Install MathOptSymbolicAD as follows:

import Pkg
Pkg.add("MathOptSymbolicAD")

Use with JuMP

using JuMP
import Ipopt
import MathOptSymbolicAD
model = Model(Ipopt.Optimizer)
@variable(model, x[1:2])
# Minimize the Rosenbrock function, which has its optimum at (1, 1).
@objective(model, Min, (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2)
# Tell JuMP to compute derivatives with the symbolic backend instead of
# its default automatic differentiation backend.
set_attribute(
    model,
    MOI.AutomaticDifferentiationBackend(),
    MathOptSymbolicAD.DefaultBackend(),
)
optimize!(model)
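
The backend changes only how derivatives are computed; the rest of the JuMP workflow is unchanged. For example, the solution can be queried as usual after the solve:

# The Rosenbrock objective above is minimized at x = (1, 1).
println(value.(x))
println(objective_value(model))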

Background

MathOptSymbolicAD is inspired by Hassan Hijazi's work on coin-or/gravity, a high-performance algebraic modeling language in C++.

Hassan made the following observations:

The symbolic differentiation approach of Gravity works well when the problem is large but contains few structurally unique constraints. For example, a model like:

model = Model()
@variable(model, 0 <= x[1:10_000] <= 1)
@constraint(model, [i=1:10_000], sin(x[i]) <= 1)
@objective(model, Max, sum(x))

is ideal: although the Jacobian matrix has 10,000 rows, we can symbolically differentiate sin(x[i]) once to obtain cos(x[i]), and then fill in the Jacobian by evaluating that derivative function at each point, instead of differentiating 10,000 separate expressions.
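
As a minimal sketch of the idea (illustrative only, not the package's actual implementation):

# Differentiate the shared template once: d/dx sin(x) = cos(x).
# `template_derivative` and `x_values` are hypothetical names, not
# MathOptSymbolicAD API.
template_derivative = cos
x_values = rand(10_000)
# One cheap function evaluation per row replaces 10,000 symbolic
# differentiations.
jacobian_diagonal = template_derivative.(x_values)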

The symbolic differentiation approach of Gravity works poorly if the model contains a large number of unique constraints (each of which must be symbolically differentiated), or if the nonlinear functions contain a large number of nonlinear terms (which makes the symbolic derivatives expensive to compute and evaluate).
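
For contrast, here is a hypothetical model (for illustration only) in which every constraint is structurally unique, so no symbolic derivative can be reused:

using JuMP
model = Model()
@variable(model, x[1:1_000])
# Each constraint has a different number of terms and different exponents,
# so each of the 1,000 rows needs its own symbolic derivative.
@constraint(model, [i = 1:1_000], sum(x[j]^i for j in 1:i) <= 1)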

License

This software is provided under a BSD license as part of the Grid Optimization Competition Solvers project, C19076. See LICENSE.md.