JuliaSmoothOptimizers / NLPModelsJuMP.jl

Create NLPModels with JuMP

Function tracing (JuMP 1.15.0) is not supported #160

Closed abelsiqueira closed 10 months ago

abelsiqueira commented 11 months ago

A short, not minimal, example:

using JuMP, NLPModelsJuMP

n = 2
jmp = Model()
σ(t) = 1 / (1 + exp(-t))
@variable(jmp, x[1:n])
@variable(jmp, y[1:n])
@objective(jmp, Min,
    sum(σ(x[i] - 1)^2 for i = 1:n) + sum(y[i]^2 for i = 1:n)
)
@constraint(jmp, [i = 1:n, j = (i + 1):n], x[i] + x[j] == y[i] - y[j])

nlp = MathOptNLPModel(jmp)

Raises

MethodError: Cannot `convert` an object of type MathOptInterface.ScalarNonlinearFunction to an object of type Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}

Closest candidates are:
  convert(::Type{T}, !Matched::T) where T
    @ Base Base.jl:64

Stacktrace:
  get(::MathOptInterface.Utilities.ObjectiveContainer{Float64}, ::MathOptInterface.ObjectiveFunction{Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}}) @ objective_container.jl:130
  get @ model.jl:294 [inlined]
  get(::MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}, ::MathOptInterface.ObjectiveFunction{Union{MathOptInterface.VariableIndex, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.ScalarQuadraticFunction{Float64}}}) @ universalfallback.jl:550
  get @ cachingoptimizer.jl:869 [inlined]
  parser_objective_MOI(::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.AbstractOptimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, ::Int64) @ utils.jl:328
  var"#MathOptNLPModel#15"(::Bool, ::String, ::Type{NLPModelsJuMP.MathOptNLPModel}, ::JuMP.Model) @ moi_nlp_model.jl:31
  NLPModelsJuMP.MathOptNLPModel(::JuMP.Model) @ moi_nlp_model.jl:19
  top-level scope @ Local: 12

Release notes of JuMP 1.15.0: https://jump.dev/blog/1.15.0-release/
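For anyone hitting this in the meantime, one possible workaround (a sketch, not an official recommendation) is to avoid function tracing and write the objective with JuMP's legacy nonlinear interface, which NLPModelsJuMP already parses. The `register`/`@NLobjective` calls below are the pre-1.15 legacy API and may emit deprecation warnings on newer JuMP versions:

using JuMP, NLPModelsJuMP

n = 2
jmp = Model()
σ(t) = 1 / (1 + exp(-t))
register(jmp, :σ, 1, σ; autodiff = true)  # make σ usable inside @NL macros
@variable(jmp, x[1:n])
@variable(jmp, y[1:n])
# Legacy nonlinear objective: not traced, so no MOI.ScalarNonlinearFunction is created.
@NLobjective(jmp, Min, sum(σ(x[i] - 1)^2 for i = 1:n) + sum(y[i]^2 for i = 1:n))
@constraint(jmp, [i = 1:n, j = (i + 1):n], x[i] + x[j] == y[i] - y[j])

nlp = MathOptNLPModel(jmp)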

amontoison commented 11 months ago

Thanks @abelsiqueira! We need to add an upper bound for JuMP in the Project.toml and tag a new release.
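For reference, a capped compat entry would look something like the snippet below; the lower bound shown is only a placeholder, the actual value in NLPModelsJuMP's Project.toml may differ:

[compat]
JuMP = "1.9 - 1.14"  # hyphen range: allows [1.9.0, 1.15.0), i.e. excludes JuMP 1.15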

abelsiqueira commented 11 months ago

Thanks for the quick update @amontoison, but that is not enough, unfortunately. We need to retroactively add an upper bound to all older versions, because if you add JuMP first (or force the version update), then when you add NLPModelsJuMP, it will be added at 0.12.1.
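A hypothetical Pkg session illustrating the resolver behaviour described above:

# With JuMP 1.15 already in the environment, the resolver does not error;
# it simply picks an old NLPModelsJuMP release (e.g. 0.12.1) whose Project.toml
# has no upper bound on JuMP, and that combination still fails at runtime.
pkg> add JuMP@1.15
pkg> add NLPModelsJuMP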

odow commented 11 months ago

This isn't a breaking change. This is a new feature that was added in JuMP 1.15. You need to update NLPModelsJuMP to support MOI.ScalarNonlinearFunction. I don't think it needs a compat bound on JuMP.
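A minimal sketch of that direction (not NLPModelsJuMP's actual parser; the helper name `objective_kind` is made up) is to query the objective's function type and dispatch MOI.ScalarNonlinearFunction to a nonlinear code path instead of `convert`-ing it to an affine or quadratic function:

using MathOptInterface
const MOI = MathOptInterface

# Hypothetical helper: inspect the objective function type before converting it.
function objective_kind(model::MOI.ModelLike)
    F = MOI.get(model, MOI.ObjectiveFunctionType())
    if F <: MOI.ScalarNonlinearFunction
        return :nonlinear   # route through a nonlinear evaluator (e.g. MOI.Nonlinear)
    elseif F <: MOI.ScalarQuadraticFunction
        return :quadratic
    elseif F <: MOI.ScalarAffineFunction
        return :affine
    else
        return :variable    # MOI.VariableIndex objective
    end
end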

odow commented 11 months ago

Old code that people wrote should continue to work.