lanl-ansi / Juniper.jl

A JuMP-based Nonlinear Integer Program Solver
https://lanl-ansi.github.io/Juniper.jl/stable/
MIT License

Using user defined functions #118

Closed · dvignon closed 5 years ago

dvignon commented 5 years ago

So I am using Juniper with Ipopt and Cbc to solve a bilevel problem. The user-defined function for the lower-level problem takes in binary values from the upper-level problem and returns the objective function value. It seems, however, that Juniper does not recognize that user-defined function, even after it has been registered with JuMP. A MWE is posted below:

using JuMP, Juniper, Ipopt, Cbc
solver = JuniperSolver(IpoptSolver();
                       mip_solver=CbcSolver())
m = Model(solver = solver)

type blink
    cost::Float64
    x::Float64
    t::Float64
end

cost = randn(5)
x = randn(5)
t = randn(5)

Blinks = [blink(cost[i], x[i], t[i]) for i = 1:5]
p = [(1,3)]
@variables m begin
    δ[a in Blinks], Bin,(start = (a.cost == 0))
end

#Some constraints
for i in p
    @constraint(m, sum(δ[Blinks[j]] for j in i) <= 1)
end

#Budget constraint
B = 90
@constraint(m,Bud, sum(δ[a]*a.cost for a in Blinks) <= B)

function low_level(Blinks,δ...)
    ind = Int[]
    print(δ)
    for i = 1:length(δ)
        if round(Int, δ[i]) == 1
            push!(ind,i)
        else
            Blinks[i].x = 0.0
            Blinks[i].t = 0.0
        end
    end

    #SolveEquilibrium(Nodes,vcat(Links,Clinks[ind]),ODs)
    #SolveEquilibrium modifies x and t for each member. That problem solves fine
    return sum(a.x * a.t for a in Blinks)
end

#Function to be registered
ll(δ...) = low_level(Blinks,δ...)

JuMP.register(m,:ll,length(Blinks),ll,autodiff=true)

JuMP.setNLobjective(m,:Min,:($(Expr(:call, :ll, (δ[a] for a in Blinks)...))))

status = solve(m)

The following error is returned

KeyError: key :ll not found
getindex at dict.jl:474 [inlined]
expr_to_nodedata(::Expr, ::Array{ReverseDiffSparse.NodeData,1}, ::Array{Float64,1}, ::Int64, ::ReverseDiffSparse.UserOperatorRegistry) at conversion.jl:23
expr_to_nodedata(::Expr, ::ReverseDiffSparse.UserOperatorRegistry) at conversion.jl:8
JuMP.NonlinearExprData(::JuMP.Model, ::Expr) at parsenlp.jl:235
setNLobjective(::JuMP.Model, ::Symbol, ::Expr) at nlp.jl:1414
optimize!(::Juniper.JuniperModel) at model.jl:231
#solvenlp#165(::Bool, ::Function, ::JuMP.Model, ::JuMP.ProblemTraits) at nlp.jl:1271
(::JuMP.#kw##solvenlp)(::Array{Any,1}, ::JuMP.#solvenlp, ::JuMP.Model, ::JuMP.ProblemTraits) at <missing>:0
#solve#116(::Bool, ::Bool, ::Bool, ::Array{Any,1}, ::Function, ::JuMP.Model) at solvers.jl:172
solve(::JuMP.Model) at solvers.jl:150
include_string(::String, ::String) at loading.jl:522
include_string(::String, ::String, ::Int64) at eval.jl:30
include_string(::Module, ::String, ::String, ::Int64, ::Vararg{Int64,N} where N) at eval.jl:34
(::Atom.##102#107{String,Int64,String})() at eval.jl:82
withpath(::Atom.##102#107{String,Int64,String}, ::String) at utils.jl:30
withpath(::Function, ::String) at eval.jl:38
hideprompt(::Atom.##101#106{String,Int64,String}) at repl.jl:67
macro expansion at eval.jl:80 [inlined]
(::Atom.##100#105{Dict{String,Any}})() at task.jl:80

Thanks for your help.

ccoffrin commented 5 years ago

To the best of my knowledge, this is not yet supported. But it is a good point, we can put it up on the feature request list.

dvignon commented 5 years ago

@ccoffrin Sounds good. Thanks. Would you by chance have any insight as to how I might be able to solve the above problem with JuMP?

ccoffrin commented 5 years ago

@odow @mlubin any suggestions?

mlubin commented 5 years ago

It looks like Juniper is trying to add an expression with the user-defined function to another JuMP model without registering it. However, there's no API to access the registry of user-defined functions from a JuMP model, so this is not an easy issue to fix.
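For anyone hitting this, the same failure can be reproduced in isolation with two plain JuMP models (a minimal sketch using the JuMP 0.18 API from this era; `f`, `m1`, `m2` are placeholder names, not from Juniper):

```julia
using JuMP

f(x) = x^2

# Register :f on one model only
m1 = Model()
@variable(m1, x)
JuMP.register(m1, :f, 1, f, autodiff=true)

# A second model never saw the registration, so parsing
# an expression that calls :f fails with the same error
m2 = Model()
@variable(m2, y)
JuMP.setNLobjective(m2, :Min, :(f($(y))))  # KeyError: key :f not found
```

This is effectively what happens internally when Juniper copies the expression graph into its own model without re-registering the function.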

ccoffrin commented 5 years ago

Good to know. Is there any MINLP solver that supports user defined functions? For example, I am wondering what would happen with the old OSI interface to bonmin.

Wikunia commented 5 years ago

@dvignon it's not the nicest way, but maybe it's something you can work with for now. Define the solver as below, after `ll` has been defined, and set it on the model afterwards; the model definition is then simply `m = Model()`:

solver = JuniperSolver(IpoptSolver(print_level=0);
                       mip_solver=CbcSolver(),
                       registered_functions=[Juniper.register(:ll,length(Blinks),ll,autodiff=true)])

setsolver(m, solver) 

this works on the branch: https://github.com/lanl-ansi/Juniper.jl/tree/idea-118
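Putting the pieces together for the original MWE, the workaround looks roughly like this (a sketch assuming the `registered_functions` option from the idea-118 branch; `low_level` and `Blinks` are as defined in the first post):

```julia
using JuMP, Juniper, Ipopt, Cbc

# Define the user function *before* constructing the solver,
# so it can be passed to Juniper via registered_functions
ll(δ...) = low_level(Blinks, δ...)

solver = JuniperSolver(IpoptSolver(print_level=0);
                       mip_solver=CbcSolver(),
                       registered_functions=[Juniper.register(:ll, length(Blinks), ll, autodiff=true)])

m = Model()            # build the model without a solver first
# ... add variables/constraints, JuMP.register(m, :ll, ...),
#     and set the NL objective as in the original MWE ...
setsolver(m, solver)   # attach the solver afterwards
status = solve(m)
```

The key point is that the function is registered twice: once with JuMP (for parsing the expression) and once with Juniper (so its internal model knows about it too).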

mlubin commented 5 years ago

Bonmin is oblivious to expression graphs, so user-defined functions are irrelevant. Does Juniper use expression graphs in a nontrivial way, or does it just copy them from one model to another? If it's the latter, you could look to how Pavito transforms the derivative evaluators instead of manipulating expression graphs.

dvignon commented 5 years ago

@Wikunia Thank you very much!!! @ccoffrin and @mlubin thanks for looking into this.

Wikunia commented 5 years ago

This is implemented for v0.4 as well here: https://github.com/lanl-ansi/Juniper.jl/tree/feature-userfunction Will be tested and then merged.

Wikunia commented 5 years ago

New push on that branch for user defined gradients.

Wikunia commented 5 years ago

Tested by @tasseff. Shall it be merged, @ccoffrin?

ccoffrin commented 5 years ago

Is there a PR I can review?