Open jbcaillau opened 1 year ago
Agreed. Would be interesting to check with some benchmarking.
@PierreMartinon Do you use the function `ipopt` only in `solve.jl`? I can open a PR to easily support HSL solvers.
@amontoison I think we use `ipopt` only here. We can see this here: https://github.com/search?q=repo%3Acontrol-toolbox%2FCTDirect.jl%20ipopt&type=code
@amontoison YES. PR most welcome on this. All the more so as there is currently an internship with @ocots and @frapac that will benefit a lot from being able to easily try and change linear solvers.
Done in https://github.com/control-toolbox/CTDirect.jl/pull/113 @jbcaillau 🙏
@ocots @0Yassine0 @amontoison guys, I need a serious update, here:
- the initial post above goes back to Sep. 2023: is the information still up to date?
Yes, the information is still correct. For the BLAS / LAPACK backends, I recently updated the README of Ipopt to explain how to use other backends (like BLIS), and I recommend a sequential BLAS / LAPACK with MA86/MA97.
- after Support HSL solvers #113 is it really as simple as `using HSL`? what about importing `HSL_jll`? licensing...? not clear to me
HSL_jll.jl has two versions: the official one, available on the website of RAL, and a dummy one that contains a single C routine and is available in the Julia registry.
Because we have the dummy version, HSL_jll.jl is an official Julia package and we can use it as any dependency of a Julia package (like HSL.jl).
Thanks to the routine available in the dummy version, we can determine whether the dummy or the official HSL_jll.jl is installed. The C routine is `LIBHSL_isfunctional` and returns a boolean.
In HSL.jl, I wrote a Julia wrapper to easily check that at the Julia level:
https://github.com/JuliaSmoothOptimizers/HSL.jl/blob/main/src/C/libhsl.jl#L1
It also takes care of loading a BLAS / LAPACK backend for the user if they hadn't loaded one before (like MKL or AppleAccelerate):
https://github.com/JuliaSmoothOptimizers/HSL.jl/blob/main/src/HSL.jl#L16
Conclusion: if the user has installed the official HSL_jll.jl, we automatically use MA57 here; otherwise we fall back to MUMPS. Everything is transparent for the user because we can check which version of HSL_jll.jl is installed.
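As a sketch of this fallback logic (assuming HSL.jl is loaded; `LIBHSL_isfunctional` is the wrapper linked above):

```julia
using HSL  # exports LIBHSL_isfunctional (the wrapper linked above)

# Use MA57 when the official HSL_jll.jl artifact is installed, MUMPS otherwise.
linear_solver = LIBHSL_isfunctional() ? "ma57" : "mumps"
println("Ipopt will use linear_solver = ", linear_solver)
```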
- what about using `spral` instead of `mumps`? (I had noticed Ipopt was compiled with it for >= 1.9)
SPRAL requires two environment variables related to OpenMP. Because CTDirect.jl requires at least Julia 1.9, I don't check the Julia version, but I do ensure that the two environment variables are well defined if the user wants to use it. I think MUMPS is more efficient than SPRAL, so I don't dispatch to it by default when we can't use the HSL solvers, but we need some benchmarks to verify that.
Note that I compiled MUMPS 5.7.2 yesterday with Yggdrasil. I am waiting for a new release of SPRAL and will then recompile Ipopt with the updated linear solvers.
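For reference, a sketch of the two OpenMP settings SPRAL needs (the variable names come from the Ipopt / SPRAL documentation; set them before solving):

```julia
# SPRAL requires these OpenMP environment variables to be defined;
# without them Ipopt may refuse to use linear_solver = "spral".
ENV["OMP_CANCELLATION"] = "TRUE"
ENV["OMP_PROC_BIND"] = "TRUE"
```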
👍🏽 @amontoison now I get it. Actually, you probably already mentioned the dummy routine trick, but I did not remember it, so I was not able to figure out where licensing was coming into play. Having a try (before forgetting it again 🤞🏽)
@ocots @PierreMartinon @0Yassine0 As discussed now with @amontoison 👍🏽: for sparse problems, typically for constraints resulting from discretising $\dot{x} = f(x,u)$ according to something like
`x[i+1] == x[i] + h[i] / 2 * ( f(x[i], u[i]) + f(x[i+1], u[i+1]) )`
one has very sparse Hessians, and @amontoison suggests that `ma27` should perform better than `ma57`. TBT.
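For concreteness, a toy sketch of such trapezoidal defect constraints (all names and the scalar dynamics `f` are illustrative, not from CTDirect.jl):

```julia
# Toy trapezoidal discretisation of ẋ = f(x, u) on N steps (illustrative names).
f(x, u) = -x + u                   # hypothetical scalar dynamics
N = 5
h = fill(0.1, N)                   # step sizes
x = ones(N + 1)                    # some discrete state trajectory
u = zeros(N + 1)                   # some discrete control trajectory

# One defect per step; each couples only (x[i], x[i+1], u[i], u[i+1]),
# hence the very sparse Jacobian / Hessian structure.
defects = [x[i+1] - x[i] - h[i] / 2 * (f(x[i], u[i]) + f(x[i+1], u[i+1])) for i in 1:N]
```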
@0Yassine0 @ocots I installed the HSL lib on a linux server, `dev`-ed `HSL_jll.jl` on the corresponding path, but still get the following error (falling back to MUMPS, then). Can you please share how you did the install end to end?
I didn't use the `HSL` package from @JuliaSmoothOptimizers for the matter. The installation steps are explained here in detail. To use it with `OptimalControl`:
```julia
import HSL_jll
sol = OptimalControl.solve(Model, linear_solver="maXX", hsllib=HSL_jll.libhsl_path)
```
@jbcaillau You have an old version of `HSL_jll.jl` (the first one!!), please download the latest one. We renamed `JULIAHSL` into `LIBHSL`, and in your version you only have the routine `JULIAHSL_isfunctional`. You can directly download the new version of HSL_jll if you go to my order on the HSL website.
@amontoison outdated me 😄 MKL + HSL set 👍🏽
Started a basic benchmarking script. SPRAL does seem slower than MUMPS:

```
julia> include("test/benchmark.jl")
Profile: linear_solver=mumps
Settings: tol=1e-08 grid_size=100 precompile=true
Precompilation step
goddard param
Benchmark step
goddard completed in 1.81 s
param completed in 2.74 s
Total time (s): 4.56

julia> include("test/benchmark.jl")
Profile: linear_solver=spral
Settings: tol=1e-08 grid_size=100 precompile=true
Precompilation step
goddard param
Benchmark step
goddard completed in 9.69 s
param completed in 12.20 s
Total time (s): 21.90
```
@PierreMartinon nice. what about `ma57` and `ma27` (the latter might be better for our problems according to @amontoison)?
@PierreMartinon I suggest to use symbols (`:mumps`, ...) instead of strings (we can also keep the string way for compatibility, not sure where it already appears...)
@jbcaillau What is the issue with MA57? Is it an error in the linear solver?
@amontoison just a slightly degraded / less precise convergence for this particular case, which is not compatible with some further solution analysis.
Did you tune the tolerance of the optimization solver? I don't think that the issue is the linear solver.
only difference between the two runs = linear solver. same params otherwise (tols, etc.)
as we cannot be sure that a user will have HSL installed, it is safer to ensure that the default (= MUMPS) for Ipopt gives the requested precision for post computation. minor point, though.
I am ok with symbols. We can also provide the description.
ok for symbols as well (should we set this as a general rule?). Equality tests may have to be updated, for instance in https://github.com/control-toolbox/CTDirect.jl/blob/a9ae483a57382d44dba0612af914d78275fd6d7a/ext/CTSolveExt.jl
As for the default choice, this is a 1-line change here, so pretty straightforward https://github.com/control-toolbox/CTDirect.jl/blob/a9ae483a57382d44dba0612af914d78275fd6d7a/src/CTDirect.jl#L21
@PierreMartinon @gergaud @ocots Many thanks @amontoison for your input on this. To summarise: `solve` now has an option allowing to change the linear solver used by `Ipopt`. The default is `mumps`, but there are (at least) the other options below: `ma27` and `ma57` from HSL[^1], and `spral` from STFC/RAL:

```julia
using HSL_jll
stats = ipopt(nlp, linear_solver="ma27")
stats = ipopt(nlp, linear_solver="ma57")
```

[^1]: free academic license available on request
Extra
Users on Intel processors (and Julia $\geq$ 1.9, which has libblastrampoline) can change their `OpenBLAS` backend to the Intel one. To display the current backend, do
See also Performance tips
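The backend switch and query mentioned above can be sketched like this (assuming the MKL.jl package is installed; `BLAS.get_config()` is the standard query from LinearAlgebra, Julia ≥ 1.7):

```julia
using LinearAlgebra

# Display the current BLAS / LAPACK backend (OpenBLAS by default).
println(BLAS.get_config())

# Swap in Intel MKL via libblastrampoline (assumes MKL.jl is installed).
using MKL
println(BLAS.get_config())
```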