lanl-ansi / Alpine.jl

A Julia/JuMP-based Global Optimization Solver for Non-convex Programs
https://lanl-ansi.github.io/Alpine.jl/latest/

ERROR: type Symbol has no field head #223

Closed. freemin7 closed this issue 1 year ago.

freemin7 commented 1 year ago
ERROR: type Symbol has no field head
Stacktrace:
  [1] getproperty
    @ ./Base.jl:42 [inlined]
  [2] traverse_expr_linear_to_affine(expr::Symbol, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:351
  [3] traverse_expr_linear_to_affine(expr::Expr, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64) (repeats 3 times)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:369
  [4] traverse_expr_linear_to_affine(expr::Expr)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:327
  [5] expr_linear_to_affine(expr::Expr)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:282
  [6] expr_conversion(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:97
  [7] process_expr
    @ ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:10 [inlined]
  [8] load!(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:110
  [9] optimize!(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:151
 [10] optimize!
    @ ~/.julia/packages/MathOptInterface/Ohzb2/src/Bridges/bridge_optimizer.jl:376 [inlined]
 [11] optimize!
    @ ~/.julia/packages/MathOptInterface/Ohzb2/src/MathOptInterface.jl:87 [inlined]
 [12] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Alpine.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
    @ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/Ohzb2/src/Utilities/cachingoptimizer.jl:316
 [13] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:185
 [14] optimize!(model::Model)
    @ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:163
 [15] top-level scope
    @ REPL[560]:1

Offending code:

nn = 4
np = 5
n0 = 1

Qmax = 500
Pmin = 5
Pmax = 45

Aqp = sparse([1,2,2,3,3,4,4,5,5],[1,1,2,1,3,2,4,3,4],[1.,1.,-1.,-1.,1.,1.,-1.,1.,-1.],5,4)
Aq0 = sparse([1],[1],[-1.],5,1)

Demands = fill(25.,(4))
Elevation = [90.,90.,88.,88.]
WaterHeight = [100.;]

nv = 4

## MINLP

using JuMP, HiGHS, Ipopt, Alpine, SparseArrays, LinearAlgebra, Juniper

# MIP optimizer
mip = optimizer_with_attributes(HiGHS.Optimizer,
                                         MOI.Silent() => true,
                                         "presolve"   => "on")

# NLP optimizer
ipopt = optimizer_with_attributes(Ipopt.Optimizer, 
                                        MOI.Silent() => true, 
                                        "sb" => "yes", 
                                        "max_iter"   => 9999)

# Local MINLP feasibility pump
juniper = optimizer_with_attributes(
        Juniper.Optimizer,
       # MOI.Silent() => true,
        "mip_solver" => mip,
        "nl_solver" => ipopt,
    )

# Global optimizer
alpine = optimizer_with_attributes(Alpine.Optimizer, 
                                         "nlp_solver" => ipopt,
                                         "mip_solver" => mip,
                                         "minlp_solver" => juniper)
mnl = Model(juniper)
@variable(mnl, Pmin <= p[1:nn] <= Pmax)
@variable(mnl, 0 <= q[1:(2*np)] <= Qmax)
@variable(mnl, z[1:(2*np)], Bin)
@variable(mnl, hfq[1:(2*np)])

@constraint(mnl, transpose(Aqp)*(q[1:np] - q[(np+1):(2np)]) - Demands .== 0)

@variable(mnl, b[1:(2*np)])
@constraint(mnl, b[1:np] .== -Aqp*p -Aqp*Demands -Aq0*WaterHeight -hfq[1:np] )
@constraint(mnl, b[(np+1):(2np)] .== +Aqp*p +Aqp*Demands +Aq0*WaterHeight -hfq[(np+1):(2np)])

for i in 1:(2*np)
  @NLconstraint(mnl, 0 <=    q[i]*b[i] )
end

@constraint(mnl, -Aqp*p -Aqp*Demands -Aq0*WaterHeight -hfq[1:np] -100*z[1:np] .<= 0)
@constraint(mnl, +Aqp*p +Aqp*Demands +Aq0*WaterHeight -hfq[(np+1):(2np)] -100*z[(1+np):(2*np)] .<= 0)

@constraint(mnl, z[1:np] + z[(np+1):(2np)] .<= 1)
@constraint(mnl, sum(z) == nv)

for i in 1:(2*np)
  @NLconstraint(mnl, hfq[i] == q[i]^1.852)
end

@objective(mnl,Min,sum(p))

optimize!(mnl)

I hope I got the context right.

(model) pkg> status
      Status `~/privatepath/model/Project.toml`
  [07493b3f] Alpine v0.5.1
  [87dc4568] HiGHS v1.1.4
  [b99e6be6] Hypatia v0.7.0
  [b6b21f68] Ipopt v1.1.0
  [4076af6c] JuMP v1.3.1
  [2ddba703] Juniper v0.9.1
  [2f354839] Pajarito v0.8.0
  [2f01184e] SparseArrays

odow commented 1 year ago

I can't reproduce. I get

julia> optimize!(mnl)
nl_solver   : MathOptInterface.OptimizerWithAttributes(Ipopt.Optimizer, Pair{MathOptInterface.AbstractOptimizerAttribute, Any}[MathOptInterface.Silent() => true, MathOptInterface.RawOptimizerAttribute("sb") => "yes", MathOptInterface.RawOptimizerAttribute("max_iter") => 9999])
mip_solver  : MathOptInterface.OptimizerWithAttributes(HiGHS.Optimizer, Pair{MathOptInterface.AbstractOptimizerAttribute, Any}[MathOptInterface.Silent() => true, MathOptInterface.RawOptimizerAttribute("presolve") => "on"])
log_levels  : [:Options, :Table, :Info]

#Variables: 44
#IntBinVar: 10
Obj Sense: Min

Start values are not feasible.
Status of relaxation: NUMERICAL_ERROR

odow commented 1 year ago

What is

pkg> st -m MathOptInterface
      Status `/private/tmp/Manifest.toml`
  [b8f27783] MathOptInterface v1.8.2

freemin7 commented 1 year ago

Wait, run it with alpine, not juniper. I copied the wrong version.

odow commented 1 year ago

Can reproduce. Will take a look.

odow commented 1 year ago

Just as an FYI for future reports, it helps immensely if you can isolate a minimal reproducible example. In this case:

julia> using JuMP

julia> import Alpine

julia> model = Model(Alpine.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Alpine

julia> @variable(model, x)
x

julia> @NLconstraint(model, x^1.852 <= 1)
x ^ 1.852 - 1.0 ≤ 0

julia> optimize!(model)
ERROR: type Symbol has no field head
Stacktrace:
  [1] getproperty(x::Symbol, f::Symbol)
    @ Base ./Base.jl:33
  [2] traverse_expr_linear_to_affine(expr::Symbol, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:351
  [3] traverse_expr_linear_to_affine(expr::Expr, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64) (repeats 2 times)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:369
  [4] traverse_expr_linear_to_affine(expr::Expr)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:327
  [5] expr_linear_to_affine(expr::Expr)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:282
  [6] expr_conversion(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:97
  [7] process_expr
    @ ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:10 [inlined]
  [8] load!(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:110
  [9] optimize!(m::Alpine.Optimizer)
    @ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:151
 [10] optimize!
    @ ~/.julia/packages/MathOptInterface/Ohzb2/src/Bridges/bridge_optimizer.jl:376 [inlined]
 [11] optimize!
    @ ~/.julia/packages/MathOptInterface/Ohzb2/src/MathOptInterface.jl:87 [inlined]
 [12] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Alpine.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
    @ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/Ohzb2/src/Utilities/cachingoptimizer.jl:316
 [13] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:185
 [14] optimize!(model::Model)
    @ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:163
 [15] top-level scope
    @ REPL[572]:1

odow commented 1 year ago

This might be one for @harshangrjn. Does Alpine support x^y?

The underlying problem is here. The constraint coming in is x^1.852 - 1 <= 0, but is_strucural is false: https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L64-L70

https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L201-L224

And so the constraint gets classified as linear. Then later, when the linear constraint tries to get parsed, it encounters :^ and errors.

expr is :^, so none of these if statements are true; we reach the last one and error: https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L336-L351
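To make the failure mode above concrete, here is a small illustrative sketch (not Alpine's code): JuMP hands the solver constraints as Julia `Expr` trees, and the leaves of such a tree are Symbols or numbers. A traversal that assumes every node is an `Expr` throws exactly the error in the stacktrace when it reaches a Symbol leaf like :^.

```julia
# Illustrative sketch: structure of the Expr tree Alpine traverses.
ex = :(x ^ 1.852 - 1.0)

println(ex.head)                  # every Expr has `head` and `args` fields
println(ex.args[1])               # the outermost call, :-

op = ex.args[2].args[1]           # :^ , a Symbol, i.e. a leaf of the tree
println(op isa Symbol)            # true
println(hasproperty(op, :head))   # false: accessing `op.head` would throw
                                  # "type Symbol has no field head"
```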

harshangrjn commented 1 year ago

@odow non-integral (and non-positive) exponents are not supported by Alpine at this point. But the error it reports certainly needs to be fixed; a more meaningful message saying that fractional exponents aren't supported in Alpine would be useful.

freemin7 commented 1 year ago

I am a bit surprised positive fractional exponents are not handled. f(y) == x^(a/b) should be equivalent to f(y) == z, z^b == x^a, z >= 0, which sounds like something a bridge could handle. I will rewrite the constraint with that in mind. Maybe suggesting such a rewrite in the error message might help, although it can lead to giant exponents. A more interesting approach would be to bound a/b with nicer fractions from above and below, and refine them once only the less sensitive intervals of uncertainty are left.

harshangrjn commented 1 year ago

@freemin7 The idea behind leaving out these exponents is that we didn't want to handle exponents which aren't supported by Gurobi after applying relaxations, at this point. But fractional exponents shouldn't be too hard to include with a simple outer approximation.
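For readers unfamiliar with the term, the "simple outer approximation" mentioned above can be sketched as follows (an illustration, not Alpine's implementation): for p > 1 and x >= 0, x^p is convex, so tangent cuts at sample points give a piecewise-linear underestimator that never overestimates the function.

```julia
# Tangent (gradient) cut for the convex function x^p at sample point x0:
#   x^p >= x0^p + p * x0^(p-1) * (x - x0)
pow_cut(x0, p) = x -> x0^p + p * x0^(p - 1) * (x - x0)

p = 1.852
cuts = [pow_cut(x0, p) for x0 in (0.5, 1.0, 2.0)]
lower(x) = maximum(c(x) for c in cuts)   # piecewise-linear underestimator

println(lower(1.3) <= 1.3^p)             # cuts stay below the true function
```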

harshangrjn commented 1 year ago

@odow The issue is not with is_strucural but with the lack of a check for fractional exponents in the constraints, similar to the one for the objective expression here: https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L55

The generic_linear classification happens because the constraint is convex, although the naming is quite confusing and needs some renaming for sure. Anyway, this should be fixed now: Alpine throws a more meaningful error message.
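The missing check can be sketched as a short tree walk (the function name is hypothetical, not Alpine's API): flag any ^ call whose exponent is a non-integer number, so the solver can raise a clear error instead of misclassifying the constraint.

```julia
# Hypothetical sketch of a fractional-exponent check on a constraint's
# Expr tree. Returns true if any `^` has a non-integer numeric exponent.
function has_fractional_exponent(ex)
    ex isa Expr || return false
    if ex.head == :call && ex.args[1] == :^
        e = ex.args[3]
        e isa Number && !isinteger(e) && return true
    end
    return any(has_fractional_exponent, ex.args)  # recurse into subtrees
end

println(has_fractional_exponent(:(x ^ 1.852 - 1.0)))  # true
println(has_fractional_exponent(:(x ^ 2 + y)))        # false
```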

@freemin7 Closing this for now, as it should be addressed in v0.5.2. Please feel free to re-open if you see any issue.