SciML / Optimization.jl

Mathematical Optimization in Julia. Local, global, gradient-based and derivative-free. Linear, Quadratic, Convex, Mixed-Integer, and Nonlinear Optimization in one simple, fast, and differentiable interface.
https://docs.sciml.ai/Optimization/stable/
MIT License

callback-generated stopping criteria no longer work in OptimizationNLopt #752

Closed donboyd5 closed 1 month ago

donboyd5 commented 1 month ago

Describe the example

Shows that, when using OptimizationNLopt, returning true from callback function triggers an error rather than allowing a graceful stop.

I think this is related to a change in NLopt.jl that now throws an error where it used to emit a warning.

My goal here is to define a complex stopping criterion in a callback function (as well as printing key progress information) and exit gracefully if the complex stopping criterion is met. It used to work, but no longer does.

I'm not an expert in julia or optimization so perhaps there is a better way to do this than via a callback function. I see that it is possible, when using NLopt.jl rather than OptimizationNLopt, to throw an error in the objective function when a complex stopping criterion is met and exit gracefully, but that doesn't seem to work in OptimizationNLopt - at least I haven't figured out how to make it work.
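For what it's worth, here is the kind of direct NLopt.jl usage I was referring to. This is an untested sketch based on my reading of NLopt.jl's `force_stop!`, not something I have verified:

```julia
using NLopt

p = [1.0, 100.0]
opt = Opt(:LN_NELDERMEAD, 2)
lower_bounds!(opt, [-1.0, -1.0])
upper_bounds!(opt, [1.0, 1.0])

function obj(x, grad)
    f = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2
    f < 1 && force_stop!(opt)  # complex stopping criterion met: request a graceful stop
    return f
end

min_objective!(opt, obj)
minf, minx, ret = optimize(opt, zeros(2))  # ret should be :FORCED_STOP when triggered
```

If I understand correctly, `force_stop!` makes `optimize` return with `:FORCED_STOP` rather than throwing, which is the graceful exit I'm after.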

Minimal Reproducible Example 👇


using Optimization
using OptimizationNLopt
using NLopt

rosenbrock(x, p) = (p[1] - x[1])^2 + p[2] * (x[2] - x[1]^2)^2

cbfalse = function(state, loss)
    println(state.iter, " ", state.u, " ", state.objective)
    return false
end

cbstopping = function(state, loss)
    println(state.iter, " ", state.u, " ", state.objective)
    return state.objective < 1
end

x0 = zeros(2)
p = [1.0, 100.0]

f = OptimizationFunction(rosenbrock)
prob = Optimization.OptimizationProblem(f, x0, p, lb = [-1.0, -1.0], ub = [1.0, 1.0])

sol = solve(prob, NLopt.LN_NELDERMEAD(), callback=cbfalse) # successfully prints progress when return is false
sol = solve(prob, NLopt.LN_NELDERMEAD(), callback=cbstopping) # successfully prints progress when return is false but errors when return is true

Error & Stacktrace ⚠️

Not Working Environment (please complete the following information):

Status `~/Documents/julia_projects/nlopt_test_v1.10/Project.toml`
  [76087f3c] NLopt v1.0.2
  [7f7a1694] Optimization v3.25.0
  [4e6fcdb7] OptimizationNLopt v0.2.0
  [44cfe95a] Pkg v1.10.0
Status `~/Documents/julia_projects/nlopt_test_v1.10/Manifest.toml`
  [47edcb42] ADTypes v1.2.1
  [1520ce14] AbstractTrees v0.4.5
  [7d9f7c33] Accessors v0.1.36
  [79e6a3ab] Adapt v4.0.4
  [4fba245c] ArrayInterface v7.10.0
  [38540f10] CommonSolve v0.2.4
  [a33af91c] CompositionsBase v0.1.2
  [88cd18e8] ConsoleProgressMonitor v0.1.2
  [187b0558] ConstructionBase v1.5.5
  [9a962f9c] DataAPI v1.16.0
  [e2d170a0] DataValueInterfaces v1.0.0
  [ffbed154] DocStringExtensions v0.9.3
  [4e289a0a] EnumX v1.0.4
  [e2ba6199] ExprTools v0.1.10
  [069b7b12] FunctionWrappers v1.1.3
  [77dc65aa] FunctionWrappersWrappers v0.1.3
  [46192b85] GPUArraysCore v0.1.6
  [3587e190] InverseFunctions v0.1.14
  [82899510] IteratorInterfaceExtensions v1.0.0
  [692b3bcd] JLLWrappers v1.5.0
  [5be7bae1] LBFGSB v0.4.1
  [1d6d02ad] LeftChildRightSiblingTrees v0.2.0
  [e6f89c97] LoggingExtras v1.0.3
  [1914dd2f] MacroTools v0.5.13
  [76087f3c] NLopt v1.0.2
  [7f7a1694] Optimization v3.25.0
⌅ [bca83a33] OptimizationBase v0.0.7
  [4e6fcdb7] OptimizationNLopt v0.2.0
  [bac558e1] OrderedCollections v1.6.3
  [aea7be01] PrecompileTools v1.2.1
  [21216c6a] Preferences v1.4.3
  [33c8b6b6] ProgressLogging v0.1.4
  [92933f4c] ProgressMeter v1.10.0
  [3cdcf5f2] RecipesBase v1.3.4
  [731186ca] RecursiveArrayTools v3.19.0
  [189a3867] Reexport v1.2.2
  [ae029012] Requires v1.3.0
  [7e49a35a] RuntimeGeneratedFunctions v0.5.13
  [0bca4576] SciMLBase v2.38.0
  [c0aeaf25] SciMLOperators v0.3.8
  [53ae85a6] SciMLStructures v1.2.0
  [efcf1570] Setfield v1.1.1
  [1e83bf80] StaticArraysCore v1.4.2
  [2efcf032] SymbolicIndexingInterface v0.3.21
  [3783bdb8] TableTraits v1.0.1
  [bd369af6] Tables v1.11.1
  [5d786b92] TerminalLoggers v0.1.7
  [81d17ec3] L_BFGS_B_jll v3.0.1+0
  [079eb43e] NLopt_jll v2.7.1+0
  [0dad84c5] ArgTools v1.1.1
  [56f22d72] Artifacts
  [2a0f44e3] Base64
  [ade2ca70] Dates
  [8ba89e20] Distributed
  [f43a241f] Downloads v1.6.0
  [7b1f6079] FileWatching
  [9fa8497b] Future
  [b77e0a4c] InteractiveUtils
  [b27032c2] LibCURL v0.6.4
  [76f85450] LibGit2
  [8f399da3] Libdl
  [37e2e46d] LinearAlgebra
  [56ddb016] Logging
  [d6f4376e] Markdown
  [ca575930] NetworkOptions v1.2.0
  [44cfe95a] Pkg v1.10.0
  [de0858da] Printf
  [3fa0cd96] REPL
  [9a3f8284] Random
  [ea8e919c] SHA v0.7.0
  [9e88b42a] Serialization
  [6462fe0b] Sockets
  [2f01184e] SparseArrays v1.10.0
  [10745b16] Statistics v1.10.0
  [4607b0f0] SuiteSparse
  [fa267f1f] TOML v1.0.3
  [a4e569a6] Tar v1.10.0
  [8dfed614] Test
  [cf7118a7] UUIDs
  [4ec0a83e] Unicode
  [e66e0078] CompilerSupportLibraries_jll v1.1.1+0
  [deac9b47] LibCURL_jll v8.4.0+0
  [e37daf67] LibGit2_jll v1.6.4+0
  [29816b5a] LibSSH2_jll v1.11.0+1
  [c8ffd9c3] MbedTLS_jll v2.28.2+1
  [14a3606d] MozillaCACerts_jll v2023.1.10
  [4536629a] OpenBLAS_jll v0.3.23+4
  [bea87d4a] SuiteSparse_jll v7.2.1+1
  [83775a58] Zlib_jll v1.2.13+1
  [8e850b90] libblastrampoline_jll v5.8.0+1
  [8e850ede] nghttp2_jll v1.52.0+1
  [3f19e933] p7zip_jll v17.4.0+2
Julia Version 1.10.3
Commit 0b4590a5507 (2024-04-30 10:59 UTC)
Build Info:
  Official https://julialang.org/ release
Platform Info:
  OS: Linux (x86_64-linux-gnu)
  CPU: 24 × AMD Ryzen 9 5900X 12-Core Processor
  WORD_SIZE: 64
  LIBM: libopenlibm
  LLVM: libLLVM-15.0.7 (ORCJIT, znver3)
Threads: 1 default, 0 interactive, 1 GC (on 24 virtual cores)
Environment:
  JULIA_EDITOR = code
  JULIA_NUM_THREADS = 

Working Environment (please complete the following information):


[deps]
CSV = "336ed68f-0bac-5ca0-87d4-7b16caf5d00b"
ChainRules = "082447d4-558c-5d27-93f4-14fc19e9eca2"
DataFrames = "a93c6f00-e57d-5684-b7b6-d8193f3e46c0"
Distributions = "31c24e10-a181-5473-b8eb-7969acd0382f"
FiniteDiff = "6a86dc24-6348-571c-b903-95158fe2bd41"
ForwardDiff = "f6369f11-7733-5829-9624-2563aa707210"
Ipopt = "b6b21f68-93f8-5de0-b562-5493be1d77c9"
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
LeastSquaresOptim = "0fc2ff8b-aaa3-5acd-a817-1944a5e08891"
LineSearches = "d3d80556-e9d4-5f37-9878-2ab0fcc64255"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
LocalRegistry = "89398ba2-070a-4b16-a995-9893c55d93cf"
LsqFit = "2fda8390-95c7-5789-9bda-21331edee243"
MINPACK = "4854310b-de5a-5eb6-a2a5-c1dee2bd17f9"
ModelingToolkit = "961ee093-0014-501f-94e3-6117800e7a78"
NLPModels = "a4795742-8479-5a88-8948-cc11e1c8c1a6"
NLPModelsIpopt = "f4238b75-b362-5c4c-b852-0801c9a21d71"
NLSolversBase = "d41bc354-129a-5804-8e4c-c37616107c6c"
NLopt = "76087f3c-5699-56af-9a33-bf431cd00edd"
NLsolve = "2774e3e8-f4cf-5e23-947b-6d7e65073b56"
OhMyREPL = "5fb14364-9ced-5910-84b2-373655c76a03"
OpenBLAS32_jll = "656ef2d0-ae68-5445-9ca0-591084a874a2"
Optim = "429524aa-4258-5aef-a3af-852621145aeb"
Optimisers = "3bd65402-5787-11e9-1adc-39752487f4e2"
Optimization = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
OptimizationMOI = "fd9f6733-72f4-499f-8506-86b2bdd0dea1"
OptimizationNLopt = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
OptimizationOptimJL = "36348300-93cb-4f02-beb5-3c3902f8871e"
OptimizationOptimisers = "42dfb2eb-d2b4-4451-abcd-913932933ac1"
Parameters = "d96e819e-fc66-5662-9728-84c9c7592b0a"
Parquet = "626c502c-15b0-58ad-a749-f091afb673ae"
Printf = "de0858da-6303-5e67-8744-51eddeeeb8d7"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
ReverseDiff = "37e2e3b7-166d-5795-8a7a-e32c996b4267"
Revise = "295af30f-e4ad-537b-8983-00126c2a3abe"
SPGBox = "bf97046b-3e66-4aa0-9aed-26efb7fac769"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
StandaloneIpopt = "2377441b-c98c-4b62-8dd7-ad61dfe8e447"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
Symbolics = "0c5d862f-8b57-4792-8d23-62f2024744c7"
Tables = "bd369af6-aec1-5ad0-b16a-f7cc5008161c"
Tulip = "6dd1b50a-3aae-11e9-10b5-ef983d2400fa"
UUIDs = "cf7118a7-6976-5b1a-9a39-7adc72f591a4"
UpdateJulia = "770da0de-323d-4d28-9202-0e205c1e0aff"
Zygote = "e88e6eb3-aa80-5325-afca-941959d7151f"

The information is too long for this window, so I'm copying selected entries from my manifest file. I hope this helps.

Please note that the example above does not run exactly as written in the working environment, because the calling signature of the callback function has changed in Optimization.jl, but the adjustment is simple.
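If I remember correctly, in the older working environment the callback received the parameter vector directly instead of a state object, so the stopping callback there looked roughly like this (my recollection; please check the docs for your version):

```julia
# Older Optimization.jl (~v3.20) callback signature, as best I recall:
cbstopping_old = function(u, loss)
    println(u, " ", loss)
    return loss < 1
end
```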


julia_version = "1.9.2"

[[deps.Optimization]]
deps = ["ADTypes", "ArrayInterface", "ConsoleProgressMonitor", "DocStringExtensions", "LinearAlgebra", "Logging", "LoggingExtras", "Pkg", "Printf", "ProgressLogging", "Reexport", "Requires", "SciMLBase", "SparseArrays", "SymbolicIndexingInterface", "TerminalLoggers"]
git-tree-sha1 = "d124973a6dacd4252ec9101e0b30e725afd056ac"
uuid = "7f7a1694-90dd-40f0-9382-eb1efda571ba"
version = "3.20.2"

[[deps.OptimizationNLopt]]
deps = ["NLopt", "Optimization", "Reexport"]
git-tree-sha1 = "dc1b76eae7c47ae77560803587911d4d219af531"
uuid = "4e6fcdb7-1186-4e1f-a706-475e75c168bb"
version = "0.1.8"

[[deps.NLopt]]
deps = ["MathOptInterface", "MathProgBase", "NLopt_jll"]
git-tree-sha1 = "5a7e32c569200a8a03c3d55d286254b0321cd262"
uuid = "76087f3c-5699-56af-9a33-bf431cd00edd"
version = "0.6.5"

[[deps.NLopt_jll]]
deps = ["Artifacts", "JLLWrappers", "Libdl", "Pkg"]
git-tree-sha1 = "9b1f15a08f9d00cdb2761dcfa6f453f5d0d6f973"
uuid = "079eb43e-fd8e-5478-9966-2cf3e3edb778"
version = "2.7.1+0"


ChrisRackauckas commented 1 month ago

So is this just an upstream issue in NLopt?

donboyd5 commented 1 month ago

I don't really know. I think it may be possible to exit NLopt gracefully, when it is called from OptimizationNLopt, by using `try`/`catch` to work around the NLopt.jl change, but unfortunately I don't have the expertise to be sure without diving in and learning a lot of new things. I was hoping someone might know off the top of their head.
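A rough, untested sketch of the kind of workaround I have in mind (I'm not sure the intermediate solution can actually be recovered this way):

```julia
# Untested sketch: catch the error that OptimizationNLopt raises when the
# callback returns true, so the script continues instead of aborting.
sol = try
    solve(prob, NLopt.LN_NELDERMEAD(), callback = cbstopping)
catch err
    @warn "Optimization halted by callback" err
    nothing  # the intermediate solution is lost here, which is the problem
end
```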

Vaibhavdixit02 commented 1 month ago

I don't see the stacktrace anywhere. And no, the NLopt change is strictly for the better: it now rethrows whatever the Julia session throws, whereas earlier it didn't show what the error was, making it impossible to debug.

donboyd5 commented 1 month ago

I'm sorry, I'm pretty new to all this. I hope the following is the stacktrace you need. If you need something else, please let me know:

julia> sol = solve(prob, NLopt.LN_NELDERMEAD(), callback=cbstopping) # successfully prints progress when return is false but errors when return is true
0 [0.0, 0.0] 1.0
0 [0.5, 0.0] 6.5
0 [0.0, 0.5] 26.0
0 [0.5, -0.5] 56.5
0 [0.125, 0.25] 6.2587890625
0 [-0.375, 0.25] 3.0869140625
0 [-0.5, 0.0] 8.5
0 [-0.03125, 0.1875] 4.542575836181641
0 [-0.34375, 0.0625] 2.1155128479003906
0 [0.03125, -0.1875] 4.490818023681641
0 [-0.2734375, 0.140625] 2.0553566366434097
0 [0.0703125, 0.078125] 1.3998669534921646
0 [0.34375, -0.0625] 3.6946144104003906
0 [-0.119140625, 0.08984375] 1.8247568146907724
0 [0.189453125, -0.01171875] 0.88366922136629
ERROR: Optimization halted by callback.
Stacktrace:
  [1] error(s::String)
    @ Base ./error.jl:35
  [2] (::OptimizationNLopt.var"#2#4"{OptimizationCache{…}})(θ::Vector{Float64})
    @ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/tUOSK/src/OptimizationNLopt.jl:141
  [3] (::OptimizationNLopt.var"#3#5"{…})(θ::Vector{…}, G::Vector{…})
    @ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/tUOSK/src/OptimizationNLopt.jl:151
  [4] nlopt_callback_wrapper(n::UInt32, x::Ptr{Float64}, grad::Ptr{Float64}, d_::Ptr{Nothing})
    @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:388
  [5] optimize!(o::Opt, x::Vector{Float64})
    @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:627
  [6] optimize(o::Opt, x::Vector{Float64})
    @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:634
  [7] __solve(cache::OptimizationCache{…})
    @ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/tUOSK/src/OptimizationNLopt.jl:177
  [8] solve!
    @ ~/.julia/packages/SciMLBase/SDjaO/src/solve.jl:188 [inlined]
  [9] #solve#625
    @ ~/.julia/packages/SciMLBase/SDjaO/src/solve.jl:96 [inlined]
 [10] top-level scope
    @ ~/Documents/julia_projects/nlopt_test_v1.10/test_submitted.jl:24
 [11] eval
    @ ./boot.jl:385 [inlined]
 [12] include_string(mapexpr::typeof(REPL.softscope), mod::Module, code::String, filename::String)
    @ Base ./loading.jl:2076
 [13] invokelatest(::Any, ::Any, ::Vararg{Any}; kwargs::@Kwargs{})
    @ Base ./essentials.jl:892
 [14] invokelatest(::Any, ::Any, ::Vararg{Any})
    @ Base ./essentials.jl:889
 [15] inlineeval(m::Module, code::String, code_line::Int64, code_column::Int64, file::String; softscope::Bool)
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:271
 [16] (::VSCodeServer.var"#69#74"{…})()
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:181
 [17] withpath(f::VSCodeServer.var"#69#74"{…}, path::String)
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/repl.jl:276
 [18] (::VSCodeServer.var"#68#73"{…})()
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:179
 [19] hideprompt(f::VSCodeServer.var"#68#73"{…})
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/repl.jl:38
 [20] (::VSCodeServer.var"#67#72"{…})()
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:150
 [21] with_logstate(f::Function, logstate::Any)
    @ Base.CoreLogging ./logging.jl:515
 [22] with_logger
    @ ./logging.jl:627 [inlined]
 [23] (::VSCodeServer.var"#66#71"{VSCodeServer.ReplRunCodeRequestParams})()
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:263
 [24] #invokelatest#2
    @ ./essentials.jl:892 [inlined]
 [25] invokelatest(::Any)
    @ Base ./essentials.jl:889
 [26] (::VSCodeServer.var"#64#65")()
    @ VSCodeServer ~/.vscode-server/extensions/julialang.language-julia-1.79.2/scripts/packages/VSCodeServer/src/eval.jl:34
Stacktrace:
 [1] chk(o::Opt, result::Result)
   @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:224
 [2] optimize!(o::Opt, x::Vector{Float64})
   @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:630
 [3] optimize(o::Opt, x::Vector{Float64})
   @ NLopt ~/.julia/packages/NLopt/w0c7n/src/NLopt.jl:634
 [4] __solve(cache::OptimizationCache{…})
   @ OptimizationNLopt ~/.julia/packages/OptimizationNLopt/tUOSK/src/OptimizationNLopt.jl:177
 [5] solve!
   @ ~/.julia/packages/SciMLBase/SDjaO/src/solve.jl:188 [inlined]
 [6] #solve#625
   @ ~/.julia/packages/SciMLBase/SDjaO/src/solve.jl:96 [inlined]
 [7] top-level scope
   @ ~/Documents/julia_projects/nlopt_test_v1.10/test_submitted.jl:24
Some type information was truncated. Use `show(err)` to see complete types.
donboyd5 commented 1 month ago

I hope I was clear that I intentionally return true from my callback when the desired complex stopping criteria are met before the objective function is minimized. Perhaps that is not the smart way to get the optimization to stop and exit gracefully so that I can recover the solution at that point?

donboyd5 commented 1 month ago

Thank you! I seem to be unable to access the updated version of OptimizationNLopt.

I can see (I think) from the Project.toml files that I need Optimization version 3.25.1 and OptimizationNLopt version 0.2.2.

In a clean Julia project environment with nothing but Pkg installed, I executed `add Optimization, OptimizationNLopt`, which installed Optimization version 3.25.1 but only OptimizationNLopt version 0.2.1. That generated a precompilation error, but more importantly, it's not the updated version of OptimizationNLopt.

I looked for a way to install the master branch of OptimizationNLopt but could not figure out how to do it (e.g., `Pkg.add(url="https://github.com/SciML/Optimization.jl/tree/master/lib/OptimizationNLopt", rev="master")` and variants do not work; I did not find a freestanding repo for OptimizationNLopt).
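For anyone else hitting this: I later learned that Pkg has a `subdir` keyword for packages that live inside a larger repository, so something like the following might work (I have not verified it):

```julia
using Pkg
# Untested: install a subpackage from the Optimization.jl monorepo via Pkg's
# `subdir` keyword, rather than pointing `url` at the subdirectory itself.
Pkg.add(url = "https://github.com/SciML/Optimization.jl",
        rev = "master", subdir = "lib/OptimizationNLopt")
```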

I certainly can wait until a new version is released, but I wanted to make sure you are aware that the updated version does not seem to be currently accessible.

Again, thank you.

Vaibhavdixit02 commented 1 month ago

That generated a precompilation error, but more importantly, it's not the updated version of OptimizationNLopt.

What was the error?

donboyd5 commented 1 month ago

Sorry. I didn't provide it because I thought the problem was that I don't know how to install the updated version of OptimizationNLopt (v0.2.2), rather than any underlying problem with OptimizationNLopt. Here's how I produced the error, followed by the error:

I started with a clean project environment (only Pkg installed) and added Optimization and OptimizationNLopt, which gave me versions 3.25.1 and 0.2.1 respectively. As noted above, I think I want OptimizationNLopt version 0.2.2.

[screenshot]

When it went to precompile, it gave a question mark but not an error:

[screenshot]

When I `use` the two packages, I get the following, including the message "Method overwriting is not permitted during Module precompilation." (However, I assume that is because I'm not using version 0.2.2 of OptimizationNLopt.)

[screenshot]

I tried to guess at how to install version 0.2.2 of OptimizationNLopt but clearly I guessed the wrong location:

[screenshot]

I assumed that version 0.2.2 has not yet been released (or something like that; I don't really understand the process) and that I'd just wait until that happened. I don't want to become a pain.

Vaibhavdixit02 commented 1 month ago

Ah, yeah, that's not an error, just a warning, so this should still work. And yes, the release with that fix is also done, so it should be available in a bit.