lanl-ansi / MathOptSymbolicAD.jl


Support Nonlinear.NODE_LOGIC and Nonlinear.NODE_COMPARISON #31

Closed: SebKrantz closed this issue 1 week ago

SebKrantz commented 3 months ago

Hi @odow,

I've tried this backend with my optimal transport network models at https://github.com/SebKrantz/OptimalTransportNetworks.jl/tree/main/src/models, and it does not seem to work for any of them. For example, Example 1 builds its model with the following script:

using OptimalTransportNetworks

param = init_parameters(labor_mobility = true, K = 10, gamma = 1, beta = 1, verbose = true,
                        N = 1, kappa_tol = 1e-5, cross_good_congestion = false, nu = 1)

param, graph = create_graph(param, 11, 11, type = "map") # create a map network of 11x11 nodes located in [0,10]x[0,10]

# Customize graph
param[:Zjn] = fill(0.1, param[:J], param[:N]) # set most places to low productivity
Ni = find_node(graph, 6, 6) # Find index of the central node at (6,6)
param[:Zjn][Ni, :] .= 1 # central node more productive

# Optimizer options 
import MathOptInterface as MOI
import MathOptSymbolicAD
param[:optimizer_attr] = Dict(:linear_solver => "ma57") 
param[:model_attr] = Dict(:backend => (MOI.AutomaticDifferentiationBackend(), MathOptSymbolicAD.DefaultBackend()))

# RESOLUTION
@time results = optimal_network(param, graph) # solve the network

# Plot the network with the optimal transportation plan

plot_graph(graph, results[:Ijk], node_sizes = results[:Cj])

Yields:

A JuMP Model
Maximization problem with:
Variables: 1204
Objective function type: VariableRef
`NonlinearExpr`-in-`MathOptInterface.LessThan{Float64}`: 242 constraints
`AffExpr`-in-`MathOptInterface.Interval{Float64}`: 122 constraints
`VariableRef`-in-`MathOptInterface.GreaterThan{Float64}`: 363 constraints
`VariableRef`-in-`MathOptInterface.LessThan{Float64}`: 121 constraints
`VariableRef`-in-`MathOptInterface.Parameter{Float64}`: 420 constraints
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Ipopt
Names registered in the model: Cjn, Lj, Ljn, Pjn, Qin, Yjn, kappa_ex, u

ERROR: MathOptInterface.UnsupportedAttribute{MathOptInterface.AutomaticDifferentiationBackend}: Attribute MathOptInterface.AutomaticDifferentiationBackend() is not supported by the model.
Stacktrace:
 [1] throw_set_error_fallback(model::Ipopt.Optimizer, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend; error_if_supported::MathOptInterface.SetAttributeNotAllowed{MathOptInterface.AutomaticDifferentiationBackend})
   @ MathOptInterface ~/.julia/packages/MathOptInterface/mz9FK/src/attributes.jl:586
 [2] throw_set_error_fallback(model::Ipopt.Optimizer, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/mz9FK/src/attributes.jl:577
 [3] set(model::Ipopt.Optimizer, attr::MathOptInterface.AutomaticDifferentiationBackend, args::MathOptSymbolicAD.DefaultBackend)
   @ MathOptInterface ~/.julia/packages/MathOptInterface/mz9FK/src/attributes.jl:550
 [4] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend)
   @ MathOptInterface.Bridges ~/.julia/packages/MathOptInterface/mz9FK/src/Bridges/bridge_optimizer.jl:955
 [5] set(model::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend)
   @ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/mz9FK/src/Utilities/cachingoptimizer.jl:1059
 [6] set(m::Model, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend)
   @ JuMP ~/.julia/packages/JuMP/as6Ji/src/optimizer_interface.jl:794
 [7] set_attribute(model::Model, attr::MathOptInterface.AutomaticDifferentiationBackend, value::MathOptSymbolicAD.DefaultBackend)
   @ JuMP ~/.julia/packages/JuMP/as6Ji/src/optimizer_interface.jl:1023
 [8] top-level scope
   @ ~/Documents/Julia/OptimalTransportNetworks.jl/examples/example01.jl:59

This does not seem to depend on how the backend attribute is passed to the optimizer; for example, I can run

model, recover_allocation = optimal_network(param, graph, verbose = true, return_model = true)
set_attribute(
    model,
    MOI.AutomaticDifferentiationBackend(),
    MathOptSymbolicAD.DefaultBackend(),
)

which yields the same error. I note that these models work with the MATLAB differentiation engine ADiGator, so I guess the problem lies in parts of the model specification that the symbolic backend does not support, rather than in the model not being symbolically differentiable?

odow commented 3 months ago

What version of Ipopt.jl are you using? You need at least Ipopt v1.6.0.
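
For reference, checking and upgrading the installed version might look like this (a sketch assuming only the standard Pkg API):

import Pkg
Pkg.status("Ipopt")   # show the currently installed Ipopt.jl version
Pkg.update("Ipopt")   # or pin a minimum: Pkg.add(Pkg.PackageSpec(name = "Ipopt", version = "1.6"))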

SebKrantz commented 3 months ago

Thanks @odow, my version of Ipopt was indeed outdated, but even after updating I get:

ERROR: AssertionError: node.type == MOI.Nonlinear.NODE_VALUE
Stacktrace:
  [1] _to_expr(data::MathOptInterface.Nonlinear.Model, expr::MathOptInterface.Nonlinear.Expression, variable_order::Dict{Int64, Int64}, subexpressions::Vector{Any})
    @ MathOptSymbolicAD ~/.julia/packages/MathOptSymbolicAD/A5rTJ/src/nonlinear_oracle.jl:521
  [2] (::MathOptSymbolicAD.var"#16#18"{MathOptInterface.Nonlinear.Model, Vector{Any}, Dict{Int64, Int64}})(c::MathOptInterface.Nonlinear.Constraint)
    @ MathOptSymbolicAD ~/.julia/packages/MathOptSymbolicAD/A5rTJ/src/nonlinear_oracle.jl:549
  [3] iterate
    @ ./generator.jl:47 [inlined]
  [4] collect_to!
    @ ./array.jl:845 [inlined]
  [5] collect_to_with_first!
    @ ./array.jl:823 [inlined]
  [6] collect(itr::Base.Generator{Base.ValueIterator{OrderedCollections.OrderedDict{MathOptInterface.Nonlinear.ConstraintIndex, MathOptInterface.Nonlinear.Constraint}}, MathOptSymbolicAD.var"#16#18"{MathOptInterface.Nonlinear.Model, Vector{Any}, Dict{Int64, Int64}}})
    @ Base ./array.jl:797
  [7] map
    @ ./abstractarray.jl:2961 [inlined]
  [8] MathOptInterface.Nonlinear.Evaluator(model::MathOptInterface.Nonlinear.Model, backend::MathOptSymbolicAD.DefaultBackend, ordered_variables::Vector{MathOptInterface.VariableIndex})
    @ MathOptSymbolicAD ~/.julia/packages/MathOptSymbolicAD/A5rTJ/src/nonlinear_oracle.jl:548
  [9] _setup_model(model::Ipopt.Optimizer)
    @ Ipopt ~/.julia/packages/Ipopt/YDBAD/src/MOI_wrapper.jl:789
 [10] optimize!(model::Ipopt.Optimizer)
    @ Ipopt ~/.julia/packages/Ipopt/YDBAD/src/MOI_wrapper.jl:900
 [11] optimize!
    @ ~/.julia/packages/MathOptInterface/mz9FK/src/Bridges/bridge_optimizer.jl:380 [inlined]
 [12] optimize!
    @ ~/.julia/packages/MathOptInterface/mz9FK/src/MathOptInterface.jl:85 [inlined]
 [13] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
    @ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/mz9FK/src/Utilities/cachingoptimizer.jl:316
 [14] optimize!(model::JuMP.Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ JuMP ~/.julia/packages/JuMP/as6Ji/src/optimizer_interface.jl:457
 [15] optimize!(model::JuMP.Model)
    @ JuMP ~/.julia/packages/JuMP/as6Ji/src/optimizer_interface.jl:409
 [16] optimal_network(param::Dict{Any, Any}, graph::Dict{Symbol, Any}; I0::Nothing, Il::Nothing, Iu::Nothing, verbose::Bool, return_model::Int64)
    @ OptimalTransportNetworks ~/Documents/Julia/OptimalTransportNetworks.jl/src/main/optimal_network.jl:156
 [17] top-level scope
    @ ./timing.jl:262 [inlined]
 [18] top-level scope
    @ ~/Documents/Julia/OptimalTransportNetworks.jl/examples/example01.jl:0

This happens for both the fixed and mobile models, so I presume it has to do with some part of the model specification. Perhaps the parameters?

odow commented 3 months ago

The issue is that we don't support logical or comparison operators:

https://github.com/odow/MathOptSymbolicAD.jl/blob/e5d3eaa1f56346d53a5ef8f895743ab3e89ff38c/src/nonlinear_oracle.jl#L502-L523

https://github.com/jump-dev/MathOptInterface.jl/blob/dfea1558c279988faff4384bcd9ad4c5e47bc4e9/src/Nonlinear/types.jl#L30-L33

I'm not sure why I didn't add this; perhaps Symbolics.jl didn't support differentiating them when I first wrote the package.
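
For context, a small untested sketch of the kind of expression that produces those node types when parsed by MOI.Nonlinear:

import MathOptInterface as MOI

nl = MOI.Nonlinear.Model()
x = MOI.VariableIndex(1)
# `>=` is parsed into a NODE_COMPARISON node and `||` into a NODE_LOGIC node;
# these are the node types that _to_expr currently has no branch for.
MOI.Nonlinear.add_constraint(
    nl,
    :(ifelse($x >= 1 || $x <= -1, $x^2, 2 * $x)),
    MOI.LessThan(4.0),
)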

SebKrantz commented 1 month ago

Hi @odow, is there any chance this support could be added in the near future?

odow commented 1 month ago

I have no immediate plans, but I would happily review a PR. Having a minimal reproducible example would be helpful.
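
A candidate minimal reproducible example, sketched here rather than taken from the thread, and assuming the op_ifelse / op_or / op_greater_than_or_equal_to / op_less_than_or_equal_to helpers from recent JuMP releases:

using JuMP
import Ipopt
import MathOptInterface as MOI
import MathOptSymbolicAD

model = Model(Ipopt.Optimizer)
set_attribute(
    model,
    MOI.AutomaticDifferentiationBackend(),
    MathOptSymbolicAD.DefaultBackend(),
)
@variable(model, 0 <= x <= 2)
# ifelse((x >= 1) || (x <= -1), x^2, 2x): the comparison and logical operators
# should end up as NODE_COMPARISON / NODE_LOGIC nodes in the nonlinear data.
cond = op_or(op_greater_than_or_equal_to(x, 1.0), op_less_than_or_equal_to(x, -1.0))
@constraint(model, op_ifelse(cond, x^2, 2.0 * x) <= 4.0)
@objective(model, Max, x)
optimize!(model)  # expected to hit the NODE_VALUE assertion in nonlinear_oracle.jl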