Closed: JunchengLiamLi closed this issue 2 months ago.
That's very weird. I've never seen this before with Ipopt. Do you have a reproducible example of the code that you are calling?
Are you running out of disk-space? Or running out of RAM?
> That's very weird. I've never seen this before with Ipopt. Do you have a reproducible example of the code that you are calling? Are you running out of disk-space? Or running out of RAM?
Thank you so much for the reply!
I am not sure if the error is reproducible, as I could not reproduce it with a small example of this code on my laptop (another computer). In fact, I have been using Ipopt.jl for similar experiments on my laptop and never experienced any issue. It only started happening recently, when I began running this experiment on a more powerful workstation.
Specs of my laptop: 16 GB of RAM, Windows 10, Julia 1.6.3, JuMP 0.23.2, Ipopt v1.1.0.
Specs of the workstation: 64 GB of RAM, Windows 11 Pro, Julia 1.9, JuMP 1.18.1, Ipopt v1.6.0.
The code where the error occurred looks like this (`n1`, `n2`, `n3` are defined elsewhere):

```julia
function example_inner_function()
    a = Vector{String}()
    b = Vector{Vector{Int64}}(undef, n1)
    # solve the optimization problems
    for i in 1:n2
        # call Gurobi to solve a series of MIP problems and record the results (encapsulated in a function)
        for j in 1:n3
            # call Ipopt to solve a series of NLP problems and record the results (encapsulated in a function)
        end
    end
    # return a DataFrame
end
```
I then use an outer function to call this example inner function multiple times.
Is there any way to check the RAM or disk-space usage just before the error occurs?
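For reference, the skeleton above can be fleshed out into a runnable sketch. Everything below is hypothetical (the variable, objective, and bounds are placeholders I invented, and the Gurobi step is omitted); it only reproduces the pattern of building and solving a fresh JuMP/Ipopt model inside nested loops:

```julia
using JuMP, Ipopt

# Hypothetical stand-in for the original inner function: the Gurobi MIP step
# is omitted, and each inner iteration builds a fresh JuMP model with Ipopt.
function example_inner_function(n2::Int, n3::Int)
    results = Float64[]
    for i in 1:n2
        # ... the Gurobi MIP step would go here ...
        for j in 1:n3
            model = Model(Ipopt.Optimizer)
            set_silent(model)
            @variable(model, x >= 1)
            @objective(model, Min, (x - i)^2 + j)  # placeholder NLP objective
            optimize!(model)
            push!(results, objective_value(model))
        end
    end
    return results
end
```

A fresh `Model` per solve, as above, is the usual pattern when looping over many independent problems.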
> JuMP version 0.23.2
Woah! Please update to any version after 1.0.
> Is there any way to check the RAM or disk-space usage just before the error occurs?
Just have the task manager open and watch it.
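Besides the Task Manager, memory can also be logged from inside Julia just before each solve. This is a minimal sketch using only Base functions (`Sys.free_memory` for free physical RAM, `Base.gc_live_bytes` for bytes held live by the Julia GC); the helper name is made up:

```julia
# Hypothetical helper: print free system RAM and the Julia heap size,
# e.g. call it at the top of each loop iteration to see growth over time.
function log_memory(tag::AbstractString)
    free_gb = Sys.free_memory() / 2^30     # free physical RAM, in GiB
    live_gb = Base.gc_live_bytes() / 2^30  # live bytes on the Julia GC heap, in GiB
    println("[$tag] free RAM: $(round(free_gb, digits = 2)) GiB, " *
            "Julia heap: $(round(live_gb, digits = 2)) GiB")
    return free_gb, live_gb
end
```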
> I could not produce the error with a small example of this code
Can you share the full log from Ipopt? Does it error immediately, or after a few iterations? What about changing the number of `n2` and `n3` iterations?

Try running with a small number of `n2` and `n3`, and profile:

- `@time my_function(n2, n3)`: how long does it take and how much memory is allocated?
- `using ProfileView; @profview my_function(n2, n3)`: does anything stand out in the flamegraph?
- `using PProf; @pprof my_function(n2, n3)`, which makes things like https://github.com/jump-dev/JuMP.jl/issues/3729#issuecomment-2060268817

> Can you share the full log from Ipopt?
Yes, I attached the log file to this message. output_4_Github.txt
> Does it error immediately, or after a few iterations?
The error occurs after a few iterations; that is, Ipopt solved many problems before the error occurred.
> What about changing the number of `n2` and `n3` iterations?
Here are the results from changing the number of `n2` and `n3` iterations on my laptop:
```julia
@time run_experiment(small_instances, "small", [1], [1], true)
#  46.437006 seconds (5.06 M allocations: 357.125 MiB, 0.17% gc time)
@time run_experiment(small_instances, "small", 1:2, 1:2, true)
# 182.814884 seconds (23.46 M allocations: 1.620 GiB, 0.18% gc time)
```
Could you try updating to Ipopt v1.6.2? I wonder if the issue is https://github.com/jump-dev/Ipopt.jl/issues/403
What is the output of `import Pkg; Pkg.pkg"status -m"`?
Seems we also had this a long time ago: https://github.com/jump-dev/Ipopt.jl/issues/77
Sorry for the late reply.
> Could you try updating to Ipopt v1.6.2? I wonder if the issue is https://github.com/jump-dev/Ipopt.jl/issues/403
No, but I made a few changes to the code, and that seems to have solved the issue.
If the batch file can run the program to completion, then the problem is more likely an issue with my PC's memory than an issue in Ipopt.jl. I will update you on this in a few days.
> What is the output of `import Pkg; Pkg.pkg"status -m"`?
I will also update that in a few days.
> Seems we also had this a long time ago: https://github.com/jump-dev/Ipopt.jl/issues/77
My issue looks different: `Pkg.test("Ipopt")` succeeds, and Ipopt.jl did successfully solve many problems before reporting the error.
I then broke the original big for-loop into many smaller ones and used a batch file to run those in a sequence. It is running OK for now.
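The same batching idea can also be sketched in pure Julia rather than a batch file. This is a hypothetical helper (not the original code) that runs a user-supplied chunk solver over index ranges and forces a full garbage collection between chunks, so memory is reclaimed before the next chunk starts:

```julia
# Hypothetical sketch: split iterations 1:n_total into chunks of batch_size,
# call run_batch on each chunk, and force a GC between chunks.
function run_in_batches(run_batch::Function, n_total::Int, batch_size::Int)
    for start in 1:batch_size:n_total
        stop = min(start + batch_size - 1, n_total)
        run_batch(start:stop)  # user-supplied function that solves this chunk
        GC.gc()                # reclaim memory before the next batch
    end
end
```

Running in a fresh Julia process per batch (as a batch file does) is a stronger guarantee than `GC.gc()`, since it also returns memory held by external solver libraries.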
Yes, I think then what is happening is that you are running out of RAM.
If you want some help improving the performance, please post a reproducible example on our community forum at https://jump.dev/forum
Closing because I don't know if there is anything we can do specifically in Ipopt.jl. If you have any other questions, please post on https://jump.dev/forum and we can discuss there.
I use Ipopt in a for-loop to solve a series of optimization problems. After a few successful iterations, it returns the following error. The version of Ipopt.jl is 1.6.0 and JuMP is 1.8.1. The operating system is Windows 10. The PC has 64GB of RAM. Just wondering if anyone can help me with this issue. Many thanks!
```
ERROR: LoadError: ReadOnlyMemoryError()
Stacktrace:
  [1] IpoptSolve(prob::IpoptProblem)
    @ Ipopt C:\Users\Guglielmo Lulli\.julia\packages\Ipopt\oNDpH\src\C_wrapper.jl:442
  [2] optimize!(model::Ipopt.Optimizer)
    @ Ipopt C:\Users\Guglielmo Lulli\.julia\packages\Ipopt\oNDpH\src\MOI_wrapper.jl:951
  [3] optimize!
    @ C:\Users\Guglielmo Lulli\.julia\packages\MathOptInterface\3JqTJ\src\Bridges\bridge_optimizer.jl:380 [inlined]
  [4] optimize!
    @ C:\Users\Guglielmo Lulli\.julia\packages\MathOptInterface\3JqTJ\src\MathOptInterface.jl:85 [inlined]
  [5] optimize!(m::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{Ipopt.Optimizer}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
    @ MathOptInterface.Utilities C:\Users\Guglielmo Lulli\.julia\packages\MathOptInterface\3JqTJ\src\Utilities\cachingoptimizer.jl:316
  [6] optimize!(model::Model; ignore_optimize_hook::Bool, _differentiation_backend::MathOptInterface.Nonlinear.SparseReverseMode, kwargs::Base.Pairs{Symbol, Union{}, Tuple{}, NamedTuple{(), Tuple{}}})
    @ JuMP C:\Users\Guglielmo Lulli\.julia\packages\JuMP\027Gt\src\optimizer_interface.jl:448
  [7] optimize!
    @ C:\Users\Guglielmo Lulli\.julia\packages\JuMP\027Gt\src\optimizer_interface.jl:409 [inlined]
...
 [13] top-level scope
    @ D:\LApprox\main_n.jl:279
 [14] include(fname::String)
    @ Base.MainInclude .\client.jl:478
 [15] top-level scope
    @ REPL[2]:1
```