freemin7 closed this issue 1 year ago
I can't reproduce. I get
julia> optimize!(mnl)
nl_solver : MathOptInterface.OptimizerWithAttributes(Ipopt.Optimizer, Pair{MathOptInterface.AbstractOptimizerAttribute, Any}[MathOptInterface.Silent() => true, MathOptInterface.RawOptimizerAttribute("sb") => "yes", MathOptInterface.RawOptimizerAttribute("max_iter") => 9999])
mip_solver : MathOptInterface.OptimizerWithAttributes(HiGHS.Optimizer, Pair{MathOptInterface.AbstractOptimizerAttribute, Any}[MathOptInterface.Silent() => true, MathOptInterface.RawOptimizerAttribute("presolve") => "on"])
log_levels : [:Options, :Table, :Info]
#Variables: 44
#IntBinVar: 10
Obj Sense: Min
Start values are not feasible.
Status of relaxation: NUMERICAL_ERROR
What is the output of `pkg> st -m MathOptInterface`?
Status `/private/tmp/Manifest.toml`
[b8f27783] MathOptInterface v1.8.2
Wait, run it with Alpine, not Juniper. I copied the wrong version.
Can reproduce. Will take a look.
Just as an FYI for future reports, it helps immensely if you can isolate a minimal reproducible example. In this case:
julia> using JuMP
julia> import Alpine
julia> model = Model(Alpine.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Alpine
julia> @variable(model, x)
x
julia> @NLconstraint(model, x^1.852 <= 1)
x ^ 1.852 - 1.0 ≤ 0
julia> optimize!(model)
ERROR: type Symbol has no field head
Stacktrace:
[1] getproperty(x::Symbol, f::Symbol)
@ Base ./Base.jl:33
[2] traverse_expr_linear_to_affine(expr::Symbol, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:351
[3] traverse_expr_linear_to_affine(expr::Expr, lhscoeffs::Vector{Any}, lhsvars::Vector{Any}, rhs::Float64, bufferVal::Nothing, bufferVar::Nothing, sign::Float64, coef::Float64, level::Int64) (repeats 2 times)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:369
[4] traverse_expr_linear_to_affine(expr::Expr)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:327
[5] expr_linear_to_affine(expr::Expr)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:282
[6] expr_conversion(m::Alpine.Optimizer)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:97
[7] process_expr
@ ~/.julia/packages/Alpine/fkUe3/src/nlexpr.jl:10 [inlined]
[8] load!(m::Alpine.Optimizer)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:110
[9] optimize!(m::Alpine.Optimizer)
@ Alpine ~/.julia/packages/Alpine/fkUe3/src/main_algorithm.jl:151
[10] optimize!
@ ~/.julia/packages/MathOptInterface/Ohzb2/src/Bridges/bridge_optimizer.jl:376 [inlined]
[11] optimize!
@ ~/.julia/packages/MathOptInterface/Ohzb2/src/MathOptInterface.jl:87 [inlined]
[12] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Alpine.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
@ MathOptInterface.Utilities ~/.julia/packages/MathOptInterface/Ohzb2/src/Utilities/cachingoptimizer.jl:316
[13] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Iterators.Pairs{Union{}, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
@ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:185
[14] optimize!(model::Model)
@ JuMP ~/.julia/packages/JuMP/gVq7V/src/optimizer_interface.jl:163
[15] top-level scope
@ REPL[572]:1
This might be one for @harshangrjn. Does Alpine support `x^y`?
The underlying problem is here. The constraint coming in is `x^1.852 - 1 <= 0`, but `is_structural` is false:
https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L64-L70
And so the constraint gets classified as linear. Then later, when the linear constraint is parsed, it encounters `:^` and errors. `expr` is `:^`, so none of these `if` statements are true, and we reach the last one and error:
https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L336-L351
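The failure mode above can be reproduced in a few lines at the REPL. This is a sketch independent of Alpine's internals; only the `.head` access pattern is taken from the traced code:

```julia
# Minimal sketch of why the traversal fails: the linear-expression walker
# assumes every sub-node is an `Expr` and reads `.head`, but the operator
# position of a call expression holds a bare `Symbol` (here `:^`).
expr = :(x^1.852 - 1.0)     # the constraint body JuMP hands to the solver
op = expr.args[2].args[1]   # the inner call's operator: the Symbol :^
@assert op isa Symbol
err = try
    op.head                 # what the walker effectively does on this node
catch e
    e
end
@assert err isa ErrorException  # "type Symbol has no field head"
```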
@odow non-integral (and non-positive) exponents are not supported by Alpine at this point. But the error it reports certainly needs to be fixed, and a more meaningful message saying that fractional exponents aren't supported in Alpine would be useful.
I am a bit surprised positive fractional exponents are not handled. `f(y) == x^(a/b)` should be equivalent to `f(y) == z`, `z^b == x^a`, `z >= 0`, which sounds like something a bridge could handle. I will rewrite the constraint with that in mind. Suggesting such a reformulation in an error message might help, although it can lead to giant exponents. A more interesting approach would be to bound `a/b` with nicer fractions from above and below, and refine them once they become the dominant uncertainty and less sensitive intervals are left.
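The reformulation sketched above can be sanity-checked numerically; `1.852 == 463/250`, which also illustrates the "giant exponents" caveat (the test point for `x` is an arbitrary assumption):

```julia
# Numeric sanity check of the proposed rewrite: for x >= 0 and a rational
# exponent a/b, z = x^(a/b) satisfies z^b == x^a with z >= 0.
a, b = 463, 250          # 1.852 == 463/250 — already a rather large exponent pair
x = 1.7                  # arbitrary nonnegative test point
z = x^(a / b)            # the auxiliary variable the bridge would introduce
@assert z >= 0
@assert isapprox(z^b, x^a; rtol = 1e-8)
```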
@freemin7 The idea behind leaving out these exponents is that we didn't want to handle exponents which, after applying relaxations, aren't supported by Gurobi. But fractional exponents shouldn't be too hard to include with a simple outer approximation.
@odow The issue is not with `is_structural` but with the lack of a check for fractional exponents in the constraints, similar to the one for the objective expression here: https://github.com/lanl-ansi/Alpine.jl/blob/c6da8b92cf34bc1c88c1f488285a90d63fd3df55/src/nlexpr.jl#L55
The `generic_linear` classification is happening since the constraint is convex, although the naming is quite confusing. It needs some renaming for sure. Anyway, this should be fixed now: Alpine throws a more meaningful error message.
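A check of the kind described could look roughly like this (a hypothetical helper, not Alpine's actual code):

```julia
# Hypothetical helper (not Alpine's actual API): recursively walk an
# expression tree and flag any power with a non-integer numeric exponent,
# so classification can raise a clear error instead of crashing later.
function has_fractional_exponent(ex)
    ex isa Expr || return false            # Symbols / literals: nothing to check
    if ex.head == :call && length(ex.args) == 3 && ex.args[1] == :^
        p = ex.args[3]                     # the exponent position of x^p
        p isa Number && !isinteger(p) && return true
    end
    return any(has_fractional_exponent, ex.args)
end

has_fractional_exponent(:(x^1.852 - 1.0))  # true
has_fractional_exponent(:(x^2 + y))        # false
```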
@freemin7 Closing this for now, as this should be addressed in v0.5.2. Please feel free to re-open if you see any issues.
Offending code:
I hope I got the context right.