DiffOpt.jl

DiffOpt.jl is a package for differentiating convex optimization programs with respect to the program parameters. DiffOpt currently supports linear, quadratic, and conic programs.
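Concretely, for a linear program (one illustrative case among the supported problem classes)

    minimize    cᵀ x
    subject to  A x ≥ b

with optimal solution x*(A, b, c), differentiating the program means computing sensitivities such as ∂x*/∂A, ∂x*/∂b, and ∂x*/∂c, in either forward or reverse mode.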

License

DiffOpt.jl is licensed under the MIT License.

Installation

Install DiffOpt using Pkg.add:

import Pkg
Pkg.add("DiffOpt")

Documentation

The documentation for DiffOpt.jl, available at https://jump.dev/DiffOpt.jl/stable, includes a detailed description of the theory behind the package, along with examples, tutorials, and an API reference.

Use with JuMP

Use DiffOpt with JuMP by following this brief example:

using JuMP, DiffOpt, HiGHS
# Create a model using the wrapper
model = Model(() -> DiffOpt.diff_optimizer(HiGHS.Optimizer))
# Define your model and solve it
@variable(model, x)
@constraint(model, cons, x >= 3)
@objective(model, Min, 2x)
optimize!(model)
# Set the perturbation (seed) of the primal solution; reverse mode propagates
# it back to derivatives with respect to the problem data.
MOI.set(model, DiffOpt.ReverseVariablePrimal(), x, 1.0)
# Differentiate the model
DiffOpt.reverse_differentiate!(model)
# Fetch the gradients
grad_exp = MOI.get(model, DiffOpt.ReverseConstraintFunction(), cons)  # -3 x - 1
constant(grad_exp)        # -1
coefficient(grad_exp, x)  # -3
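
DiffOpt also supports forward mode, which propagates a perturbation of the problem data through to the solution. The following is a minimal sketch reusing the model above: it perturbs the coefficient of x in cons and reads off the resulting change in the optimal x via the DiffOpt.ForwardConstraintFunction and DiffOpt.ForwardVariablePrimal attributes; see the documentation for the exact semantics of the perturbation expression.

# Set the perturbation of the constraint data: here the coefficient of x in
# cons is perturbed by 1 and the constant term is unchanged
MOI.set(model, DiffOpt.ForwardConstraintFunction(), cons, 1.0 * x)
# Differentiate the model in forward mode
DiffOpt.forward_differentiate!(model)
# Directional derivative of the optimal x; matches the -3 coefficient above
MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)  # -3.0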

GSoC 2020

DiffOpt began as a NumFOCUS-sponsored Google Summer of Code (2020) project.