jump-dev / JuMP.jl

Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
http://jump.dev/JuMP.jl/

MethodError: Cannot `convert` an object of type NonlinearExpr to an object of type Float64 #3579

Closed · LebedevRI closed this issue 11 months ago

LebedevRI commented 12 months ago
```julia
using JuMP, Ipopt

a = 3
b = 1

model = Model(Ipopt.Optimizer)

@variable(model, c[1:b])

@expression(model, d[i=1:b,j=1:a], j <= c[i])

@variable(model, e[i=1:b,j=1:a] == d[i,j])

print(model)
```

results in

```
MethodError: Cannot `convert` an object of type NonlinearExpr to an object of type Float64

Closest candidates are:
  convert(::Type{T}, ::Base.TwicePrecision) where T<:Number
   @ Base twiceprecision.jl:273
  convert(::Type{T}, ::AbstractChar) where T<:Number
   @ Base char.jl:185
  convert(::Type{T}, ::CartesianIndex{1}) where T<:Number
   @ Base multidimensional.jl:127
  ...

Stacktrace:
  [1] MathOptInterface.EqualTo{Float64}(value::NonlinearExpr)
    @ MathOptInterface ~/.julia/packages/MathOptInterface/wW7fs/src/sets.jl:223
  [2] _moi_constrain_variable(moi_backend::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, index::MathOptInterface.VariableIndex, info::VariableInfo{Float64, Float64, NonlinearExpr, Float64}, #unused#::Type{Float64})
    @ JuMP ~/.julia/packages/JuMP/D44Aq/src/variables.jl:1768
  [3] _moi_add_variable(moi_backend::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, model::Model, v::ScalarVariable{Float64, Float64, NonlinearExpr, Float64}, name::String)
    @ JuMP ~/.julia/packages/JuMP/D44Aq/src/variables.jl:1737
  [4] add_variable(model::Model, v::ScalarVariable{Float64, Float64, NonlinearExpr, Float64}, name::String)
    @ JuMP ~/.julia/packages/JuMP/D44Aq/src/variables.jl:1726
  [5] (::var"#85#86"{Model})(i::Int64, j::Int64)
    @ Main ~/.julia/packages/JuMP/D44Aq/src/Containers/macro.jl:301
  [6] #84
    @ ~/.julia/packages/JuMP/D44Aq/src/Containers/container.jl:85 [inlined]
  [7] iterate
    @ ./generator.jl:47 [inlined]
  [8] collect(itr::Base.Generator{JuMP.Containers.VectorizedProductIterator{Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}, JuMP.Containers.var"#84#85"{var"#85#86"{Model}}})
    @ Base ./array.jl:782
  [9] map(f::Function, A::JuMP.Containers.VectorizedProductIterator{Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}})
    @ Base ./abstractarray.jl:3291
 [10] container
    @ ~/.julia/packages/JuMP/D44Aq/src/Containers/container.jl:85 [inlined]
 [11] container
    @ ~/.julia/packages/JuMP/D44Aq/src/Containers/container.jl:71 [inlined]
 [12] container(f::Function, indices::JuMP.Containers.VectorizedProductIterator{Tuple{Base.OneTo{Int64}, Base.OneTo{Int64}}}, #unused#::Type{JuMP.Containers.AutoContainerType}, names::Vector{Any})
    @ JuMP.Containers ~/.julia/packages/JuMP/D44Aq/src/Containers/container.jl:75
 [13] macro expansion
    @ ~/.julia/packages/JuMP/D44Aq/src/macros.jl:1213 [inlined]
 [14] top-level scope
    @ In[15]:12
```

It works if the constraint is split out:

```julia
using JuMP, Ipopt

a = 3
b = 1

model = Model(Ipopt.Optimizer)

@variable(model, c[1:b])

@expression(model, d[i=1:b,j=1:a], j <= c[i])

@variable(model, e[i=1:b,j=1:a])

@constraint(model, [i=1:b,j=1:a], e[i,j] == d[i,j])

print(model)
```

which prints:

```
feasibility
Subject to
 e[1,1] - (1.0 <= c[1]) = 0
 e[1,2] - (2.0 <= c[1]) = 0
 e[1,3] - (3.0 <= c[1]) = 0
```

Related: https://github.com/scipopt/SCIP.jl/issues/282

mlubin commented 12 months ago

The error message could be improved, but this is invalid JuMP syntax: a variable bound (as specified in the `@variable` macro) must be a number, not an expression.
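A minimal sketch of the distinction (assuming JuMP and Ipopt are installed; the variable names here are illustrative, not from the issue):

```julia
using JuMP, Ipopt

model = Model(Ipopt.Optimizer)

# OK: the value on the right of == in @variable is a number,
# so JuMP fixes the variable to that constant.
@variable(model, x == 1.5)

@variable(model, y)

# Not OK: the right-hand side is an expression, which JuMP
# cannot convert to a Float64 bound — this raises the
# MethodError shown above.
# @variable(model, z == 2 * y)

# Instead, declare the variable first, then add the equality
# as a separate constraint with @constraint.
@variable(model, z)
@constraint(model, z == 2 * y)
```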

LebedevRI commented 12 months ago

Right. I've been hitting a number of issues, and it really wasn't obvious which ones are user errors. The error diagnostic could really use an improvement then. Thanks!

blegat commented 12 months ago

Yes, improving these errors makes a big difference for users, but it's difficult for us to know which cases don't have nice errors, which is why open issues like this one are very helpful.

LebedevRI commented 12 months ago

> Yes, improving these errors makes a big difference for users but it's difficult for us to know what are the cases that don't have nice errors which is why open issues like this one are very helpful.

Thanks! I've filed a few more cases I previously hit. I think there was at least one more besides these...

LebedevRI commented 11 months ago

@odow thank you!