JuliaSmoothOptimizers / NLPModelsJuMP.jl

Create NLPModels with JuMP

Issue with new JuMP operator for problem HS87 #195

Closed: tmigot closed this issue 1 week ago

tmigot commented 2 weeks ago

I ran into the following issue when updating OptimizationProblems.jl to support more recent versions of JuMP:

using JuMP, NLPModelsJuMP
function hs87(args...; kwargs...)
  nlp = Model()
  x0 = [390, 1000, 419.5, 340.5, 198.175, 0.5]
  lvar = [0, 0, 340, 340, -1000, 0]
  uvar = [400, 1000, 420, 420, 10000, 0.5236]
  @variable(nlp, lvar[i] <= x[i = 1:6] <= uvar[i], start = x0[i])

  a = 131078 // 1000
  b = 148577 // 100000
  ci = 90798 // 100000
  d = cos(147588 // 100000)
  e = sin(147588 // 100000)

  @constraint(nlp, 300 - x[1] - 1 / a * x[3] * x[4] * cos(b - x[6]) + ci / a * d * x[3] == 0)
  @constraint(nlp, -x[2] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * d * x[4]^2 == 0)
  @constraint(nlp, -x[5] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * e * x[4]^2 == 0)
  @constraint(nlp, 200 - 1 / a * x[3] * x[4] * sin(b - x[6]) + ci / a * e * x[3]^2 == 0)

  function f1(t)
    return if 0 <= t <= 300
      30 * t
    elseif 300 <= t <= 400
      31 * t
    else
      eltype(t)(Inf)
    end
  end

  function f2(t)
    return if 0 <= t <= 100
      28 * t
    elseif 100 <= t <= 200
      29 * t
    elseif 200 <= t <= 1000
      30 * t
    else
      eltype(t)(Inf)
    end
  end
  @operator(nlp, op_f1, 1, f1)
  @expression(nlp, op_f1)
  @operator(nlp, op_f2, 1, f2)
  @expression(nlp, op_f2)
  @objective(nlp, Min, op_f1(x[1]) + op_f2(x[2]))

  return nlp
end
nlp_jump = MathOptNLPModel(hs87())

which returns the following error:

ERROR: MathOptInterface.UnsupportedNonlinearOperator: The nonlinear operator `:op_f1` is not supported by the model.
Stacktrace:
  [1] _get_node_type(data::MathOptInterface.Nonlinear.Model, x::MathOptInterface.ScalarNonlinearFunction)
    @ MathOptInterface.Nonlinear .julia\packages\MathOptInterface\jqDoD\src\Nonlinear\parse.jl:76
  [2] _parse_without_recursion_inner(stack::Vector{…}, data::MathOptInterface.Nonlinear.Model, expr::MathOptInterface.Nonlinear.Expression, x::MathOptInterface.ScalarNonlinearFunction, parent::Int64)
    @ MathOptInterface.Nonlinear .julia\packages\MathOptInterface\jqDoD\src\Nonlinear\parse.jl:80
  [3]
    @ MathOptInterface.Nonlinear .julia\packages\MathOptInterface\jqDoD\src\Nonlinear\parse.jl:49
  [4] parse_expression
    @ .julia\packages\MathOptInterface\jqDoD\src\Nonlinear\parse.jl:14 [inlined]
  [5] set_objective(model::MathOptInterface.Nonlinear.Model, obj::MathOptInterface.ScalarNonlinearFunction)
    @ MathOptInterface.Nonlinear .julia\packages\MathOptInterface\jqDoD\src\Nonlinear\model.jl:63
  [6] _nlp_model(model::MathOptInterface.Utilities.CachingOptimizer{…})
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\utils.jl:463
  [7] _nlp_block(model::MathOptInterface.Utilities.CachingOptimizer{…})
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\utils.jl:472
  [8] nlp_model(moimodel::MathOptInterface.Utilities.CachingOptimizer{…}; hessian::Bool, name::String)
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:35
  [9] nlp_model
    @ .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:30 [inlined]
 [10] #MathOptNLPModel#21
    @ .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:27 [inlined]
 [11] MathOptNLPModel(moimodel::MathOptInterface.Utilities.CachingOptimizer{…})
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:26
 [12] MathOptNLPModel(jmodel::Model; kws::@Kwargs{})
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:23
 [13] MathOptNLPModel(jmodel::Model)
    @ NLPModelsJuMP .julia\packages\NLPModelsJuMP\9DyFu\src\moi_nlp_model.jl:21
 [14] top-level scope
    @ REPL[30]:1
Some type information was truncated. Use `show(err)` to see complete types.

Any idea?

amontoison commented 2 weeks ago

`@operator` is the new `@register`, right? @blegat probably knows how to fix this issue.
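For reference, a minimal toy comparison of the legacy `register` call and the new `@operator` macro (illustrative only, not from the issue):

using JuMP

f(t) = t^2 + 1

# Legacy nonlinear interface: register the function, then use the @NL macros.
model_old = Model()
@variable(model_old, x)
register(model_old, :f, 1, f; autodiff = true)
@NLobjective(model_old, Min, f(x))

# New interface: @operator creates a first-class operator usable
# in the regular @objective / @constraint macros.
model_new = Model()
@variable(model_new, y)
@operator(model_new, op_f, 1, f)
@objective(model_new, Min, op_f(y))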

blegat commented 1 week ago

We need to get the list of `UserDefinedFunction` attributes and set them on the `nlp_model`, as is done here: https://github.com/jump-dev/Ipopt.jl/blob/4c156461ef1fda3c9f015520197afda4e8ca3e26/src/MOI_wrapper.jl#L606-L611
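A minimal sketch of that idea for NLPModelsJuMP, which builds its own `MOI.Nonlinear.Model` (the placement inside `_nlp_model` is hypothetical; `moimodel` stands for the MOI backend and `nlp_model` for the `MOI.Nonlinear.Model`):

# Before parsing the objective and constraints, copy every user-defined
# operator from the MOI backend into the MOI.Nonlinear.Model.
for attr in MOI.get(moimodel, MOI.ListOfModelAttributesSet())
    if attr isa MOI.UserDefinedFunction
        args = MOI.get(moimodel, attr)  # (f,), (f, ∇f), or (f, ∇f, ∇²f)
        MOI.Nonlinear.register_operator(nlp_model, attr.name, attr.arity, args...)
    end
end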

tmigot commented 1 week ago

Thanks @blegat! So, to clarify, we should add something like this:

attr = MOI.ListOfSupportedNonlinearOperators()
MOI.Nonlinear.register_operator(nlp_model, MOI.get(model, attr)) 

in the appropriate place. Is that correct?

amontoison commented 1 week ago

https://discourse.julialang.org/t/get-gradient-of-user-defined-nonlinear-operator-in-jump/118989/2?u=amontoison

amontoison commented 1 week ago

I'm wondering whether we have a bug in MOI:

using JuMP, MathOptInterface
function hs87(args...; kwargs...)
  nlp = Model()
  x0 = [390, 1000, 419.5, 340.5, 198.175, 0.5]
  lvar = [0, 0, 340, 340, -1000, 0]
  uvar = [400, 1000, 420, 420, 10000, 0.5236]
  @variable(nlp, lvar[i] <= x[i = 1:6] <= uvar[i], start = x0[i])

  a = 131078 // 1000
  b = 148577 // 100000
  ci = 90798 // 100000
  d = cos(147588 // 100000)
  e = sin(147588 // 100000)

  @constraint(nlp, 300 - x[1] - 1 / a * x[3] * x[4] * cos(b - x[6]) + ci / a * d * x[3] == 0)
  @constraint(nlp, -x[2] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * d * x[4]^2 == 0)
  @constraint(nlp, -x[5] - 1 / a * x[3] * x[4] * cos(b + x[6]) + ci / a * e * x[4]^2 == 0)
  @constraint(nlp, 200 - 1 / a * x[3] * x[4] * sin(b - x[6]) + ci / a * e * x[3]^2 == 0)

  function f1(t)
    return if 0 <= t <= 300
      30 * t
    elseif 300 <= t <= 400
      31 * t
    else
      eltype(t)(Inf)
    end
  end

  function f2(t)
    return if 0 <= t <= 100
      28 * t
    elseif 100 <= t <= 200
      29 * t
    elseif 200 <= t <= 1000
      30 * t
    else
      eltype(t)(Inf)
    end
  end
  @operator(nlp, op_f1, 1, f1)
  @expression(nlp, op_f1)
  @operator(nlp, op_f2, 1, f2)
  @expression(nlp, op_f2)
  @objective(nlp, Min, op_f1(x[1]) + op_f2(x[2]))

  return nlp
end
nlp = hs87()
moi_backend = backend(nlp)

MOI.get(moi_backend, MOI.ListOfSupportedNonlinearOperators())
ERROR: MathOptInterface.GetAttributeNotAllowed{MathOptInterface.ListOfSupportedNonlinearOperators}: Getting attribute MathOptInterface.ListOfSupportedNonlinearOperators() cannot be performed: Cannot query MathOptInterface.ListOfSupportedNonlinearOperators() from `Utilities.CachingOptimizer` because no optimizer is attached (the state is `NO_OPTIMIZER`). You may want to use a `CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing this operation if the `CachingOptimizer` is in `MANUAL` mode.
julia> moi_backend.mode
AUTOMATIC::CachingOptimizerMode = 1

amontoison commented 1 week ago

The suggestion "You may want to use a `CachingOptimizer` in `AUTOMATIC` mode or you may need to call `reset_optimizer` before doing this operation if the `CachingOptimizer` is in `MANUAL` mode." seems wrong in this case: the `CachingOptimizer` is already in `AUTOMATIC` mode.

amontoison commented 1 week ago

@odow, may I ask if I'm doing something wrong in the code snippet above? https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl/issues/195#issuecomment-2330321889

odow commented 1 week ago

We can't get the list of supported operators because you haven't selected a solver.
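A quick untested sketch of that point (assuming Ipopt is installed): attaching a solver first makes the query go through:

using Ipopt
set_optimizer(nlp, Ipopt.Optimizer)
# With an optimizer attached, the attribute query is forwarded to the
# solver and should return a Vector{Symbol} of the operators it supports.
MOI.get(backend(nlp), MOI.ListOfSupportedNonlinearOperators())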

odow commented 1 week ago

We could perhaps return something meaningful for this case where you have a `CachingOptimizer` with no optimizer attached.

amontoison commented 1 week ago

Do you have a trick for us to extract the operators and assume that they are supported? I don't know how much `@operator` differs from `@register`, so I may be completely wrong. I just wanted to do this kind of thing: https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl/pull/197/files#diff-47c27891e951c8cd946b850dc2df31082624afdf57446c21cb6992f5f4b74aa2R454-R459

Benoît implemented an optimizer, but none of the JuMP models in OptimizationProblems.jl has an optimizer attached.
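In the meantime, here is a hedged, untested sketch of extracting the user-defined operators without an attached optimizer, by scanning the model attributes set on the backend:

model = hs87()
b = backend(model)
for attr in MOI.get(b, MOI.ListOfModelAttributesSet())
    if attr isa MOI.UserDefinedFunction
        # The attribute value is the tuple of callables the user supplied:
        # (f,), (f, ∇f), or (f, ∇f, ∇²f).
        funcs = MOI.get(b, attr)
        @show attr.name attr.arity length(funcs)
    end
end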

odow commented 1 week ago

> I just wanted to do this kind of thing: https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl/pull/197/files#diff-47c27891e951c8cd946b850dc2df31082624afdf57446c21cb6992f5f4b74aa2R454-R459

That code is wrong. It returns a `Vector{Symbol}`. I should add an example to the docstring: https://github.com/jump-dev/MathOptInterface.jl/blob/5f5acaaffb3d0703f21d647605c87cf10a2796d0/src/attributes.jl#L2050-L2056

You need to implement support for `MOI.UserDefinedFunction`. Here's Ipopt: https://github.com/jump-dev/Ipopt.jl/blob/4c156461ef1fda3c9f015520197afda4e8ca3e26/src/MOI_wrapper.jl#L600-L620
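The pattern in that Ipopt code is roughly the following (a paraphrase for orientation, not the verbatim source; `Optimizer` is the solver wrapper type and `nlp_model` its internal `MOI.Nonlinear.Model`):

# Declare that the optimizer accepts user-defined operators.
MOI.supports(::Optimizer, ::MOI.UserDefinedFunction) = true

# When an operator is set, register it with the internal nonlinear model.
function MOI.set(model::Optimizer, attr::MOI.UserDefinedFunction, args)
    # `args` is the tuple of callables: (f,), (f, ∇f), or (f, ∇f, ∇²f).
    MOI.Nonlinear.register_operator(model.nlp_model, attr.name, attr.arity, args...)
    return
end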

It might be faster if I just take a look and make a PR :smile:

odow commented 1 week ago

This should be enough to point you in the right direction: https://github.com/JuliaSmoothOptimizers/NLPModelsJuMP.jl/pull/197#issuecomment-2333003371

I didn't test it on anything other than the hs87 example.

amontoison commented 1 week ago

Thanks a lot @odow!!! :smiley: