You could try the following:
using Libdl
Libdl.dlopen("ENTER_PATH_TO_liblapack.so.3", RTLD_GLOBAL)
prior to your code snippet, and do the same for any subsequent errors (i.e., if symbols from other libraries are undefined). I'm not sure if that's the best or recommended way, but I think it might do the trick.
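If you don't know the exact path, something like the following might help locate it (a minimal sketch; Libdl.find_library searches the loader's standard locations and returns an empty string if nothing can be opened):
using Libdl
# Look for a system LAPACK without hard-coding the path; RTLD_GLOBAL makes its
# symbols visible to libraries that are loaded afterwards (e.g. libhsl).
lapack = Libdl.find_library(["liblapack.so.3", "liblapack"])
isempty(lapack) && error("could not find liblapack on the loader path")
Libdl.dlopen(lapack, RTLD_GLOBAL)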
Compiling the HSL libraries and getting everything linked can be finicky. If you have improvements to the instructions, please open a PR.
I will make a PR, but let's see first if this works out :D
@bennerh I had the exact same problem this afternoon when I tried to use the configure script of coinhsl. The issue is how they generate the shared library.
https://discourse.julialang.org/t/incorrect-objective-type-when-using-ma57-with-ipopt-in-jump/90578/8
For information, I am also on Ubuntu.
I just released a new version of HSL.jl (0.3.5) that compiles it automatically, if that helps.
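If you want to try it, a rough sketch of the workflow (this assumes the coinhsl sources are already where HSL.jl's build script expects them and that the standard Pkg.build step applies; see the HSL.jl README for the exact details):
using Pkg
Pkg.add("HSL")     # or Pkg.update("HSL") to pick up 0.3.5
Pkg.build("HSL")   # runs the package's build script, which compiles the HSL sources
using HSL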
You could try the following:
using Libdl; Libdl.dlopen("ENTER_PATH_TO_liblapack.so.3", RTLD_GLOBAL)
prior to your code snippet, and do the same for any subsequent errors (i.e., if symbols from other libraries are undefined). I'm not sure if that's the best or recommended way, but I think it might do the trick.
Thanks a lot, this solved the issue! The following code now runs:
Libdl.dlopen("/usr/lib/x86_64-linux-gnu/lapack/liblapack.so.3", RTLD_GLOBAL)
Libdl.dlopen("/usr/lib/x86_64-linux-gnu/libmetis.so", RTLD_GLOBAL)
model = Model(Ipopt.Optimizer)
set_optimizer_attribute(model, "linear_solver", "ma57")
@variable(model, 0 <= x <= 2)
@variable(model, 0 <= y <= 30)
@objective(model, Max, 5x + 3 * y)
@constraint(model, con, 1x + 5y <= 3)
optimize!(model)
Do any of you see problems with this approach? If not, I could make a PR to include something like this in the instructions.
At least on an HPC cluster, I encountered the same problem despite METIS and LAPACK being on the load path. Just running
dlopen("liblapack.so.3", RTLD_GLOBAL)
did the trick for me.
Considering that METIS and LAPACK are usually on the load path, we could also consider having Ipopt.jl take care of this for users directly.
To be honest, I don't really understand why this works (presumably Ipopt and the linear solvers can now resolve the library symbols, but the real question is why they couldn't before), nor whether the RTLD_GLOBAL flag could cause other unforeseen problems downstream...
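A minimal sketch of what such a preloading step could look like (the helper name and the library list are only placeholders, not anything Ipopt.jl actually provides):
using Libdl

# Hypothetical helper: preload shared libraries with RTLD_GLOBAL so that
# libraries loaded afterwards (e.g. libhsl) can resolve their symbols.
function preload_global(names::Vector{String})
    for name in names
        lib = Libdl.find_library([name])
        if isempty(lib)
            @warn "Could not locate $name; skipping"
        else
            Libdl.dlopen(lib, RTLD_GLOBAL)
        end
    end
end

# Example usage, assuming system LAPACK and METIS are on the loader path:
preload_global(["liblapack.so.3", "libmetis.so"])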
Here's a sneak peek of some improvements that @amontoison has coming up (not released yet):
julia> using JuMP
julia> import Ipopt
julia> import HSL_jll
julia> model = Model(Ipopt.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: Ipopt
julia> set_attribute(model, "hsllib", HSL_jll.libhsl_path)
julia> set_attribute(model, "linear_solver", "ma97")
julia> @variable(model, x)
x
julia> @objective(model, Min, (x - 2)^2)
x² - 4 x + 4
julia> optimize!(model)
This is Ipopt version 3.14.4, running with linear solver ma97.
Number of nonzeros in equality constraint Jacobian...: 0
Number of nonzeros in inequality constraint Jacobian.: 0
Number of nonzeros in Lagrangian Hessian.............: 1
Total number of variables............................: 1
variables with only lower bounds: 0
variables with lower and upper bounds: 0
variables with only upper bounds: 0
Total number of equality constraints.................: 0
Total number of inequality constraints...............: 0
inequality constraints with only lower bounds: 0
inequality constraints with lower and upper bounds: 0
inequality constraints with only upper bounds: 0
iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls
0 4.0000000e+00 0.00e+00 4.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0
1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.00e+00 - 1.00e+00 1.00e+00f 1
Number of Iterations....: 1
(scaled) (unscaled)
Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00
Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00
Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00
Variable bound violation: 0.0000000000000000e+00 0.0000000000000000e+00
Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00
Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00
Number of objective function evaluations = 2
Number of objective gradient evaluations = 2
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 1
Total seconds in IPOPT = 0.000
EXIT: Optimal Solution Found.
julia> set_attribute(model, "linear_solver", "ma57")
julia> optimize!(model)
This is Ipopt version 3.14.4, running with linear solver ma57.
Number of nonzeros in equality constraint Jacobian...: 0
Number of nonzeros in inequality constraint Jacobian.: 0
Number of nonzeros in Lagrangian Hessian.............: 1
Total number of variables............................: 1
variables with only lower bounds: 0
variables with lower and upper bounds: 0
variables with only upper bounds: 0
Total number of equality constraints.................: 0
Total number of inequality constraints...............: 0
inequality constraints with only lower bounds: 0
inequality constraints with lower and upper bounds: 0
inequality constraints with only upper bounds: 0
iter objective inf_pr inf_du lg(mu) ||d|| lg(rg) alpha_du alpha_pr ls
0 4.0000000e+00 0.00e+00 4.00e+00 -1.0 0.00e+00 - 0.00e+00 0.00e+00 0
1 0.0000000e+00 0.00e+00 0.00e+00 -1.0 2.00e+00 - 1.00e+00 1.00e+00f 1
Number of Iterations....: 1
(scaled) (unscaled)
Objective...............: 0.0000000000000000e+00 0.0000000000000000e+00
Dual infeasibility......: 0.0000000000000000e+00 0.0000000000000000e+00
Constraint violation....: 0.0000000000000000e+00 0.0000000000000000e+00
Variable bound violation: 0.0000000000000000e+00 0.0000000000000000e+00
Complementarity.........: 0.0000000000000000e+00 0.0000000000000000e+00
Overall NLP error.......: 0.0000000000000000e+00 0.0000000000000000e+00
Number of objective function evaluations = 2
Number of objective gradient evaluations = 2
Number of equality constraint evaluations = 0
Number of inequality constraint evaluations = 0
Number of equality constraint Jacobian evaluations = 0
Number of inequality constraint Jacobian evaluations = 0
Number of Lagrangian Hessian evaluations = 1
Total seconds in IPOPT = 0.001
EXIT: Optimal Solution Found.
I think this should resolve all of our HSL problems.
Have HSL changed their license? They never allowed binary redistribution except for the newest SPRAL solver, as far as I remember.
Nope, but we can create artifacts without Yggdrasil :wink: You will understand when it is released.
Yeah, the plan is that it'll require a manual (licensed) download of HSL_jll, which you can extract to a directory and then run ] dev /path/to/hsl. From then on, you can run import HSL_jll and we'll load the libraries, etc.
That means that we're downloading compiled binaries, so we never have to compile on a user's machine.
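Spelled out, the planned workflow would look roughly like this (the path is a placeholder and the details may still change before release):
# After downloading the licensed HSL_jll archive from the HSL website
# and extracting it to /path/to/hsl (placeholder):
using Pkg
Pkg.develop(path = "/path/to/hsl")   # equivalent to `] dev /path/to/hsl`

using JuMP, Ipopt
import HSL_jll
model = Model(Ipopt.Optimizer)
set_attribute(model, "hsllib", HSL_jll.libhsl_path)  # as in the sneak peek above
set_attribute(model, "linear_solver", "ma57")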
The downloading of compiled binaries is what I meant by binary redistribution, which I always thought was disallowed by the HSL license. https://licences.stfc.ac.uk/product/coin-hsl
2.1.2 the Licensee may not distribute any of the Software to any third party, or share its use with any third party (regardless of whether such third party is from the same institution), and the Licensee may not sub-license the use of any of the Software;
The downloading of compiled binaries is what I meant by binary redistribution
It'll be an official download from https://licences.stfc.ac.uk.
The downloading of compiled binaries is what I meant by binary redistribution, which I always thought was disallowed by the HSL license. https://licences.stfc.ac.uk/product/coin-hsl
2.1.2 the Licensee may not distribute any of the Software to any third party, or share its use with any third party (regardless of whether such third party is from the same institution), and the Licensee may not sub-license the use of any of the Software;
No, the license just says that you can't distribute the HSL package in any form. But for your own application, on your own computer, you can do what you want with it.
Until now it was only possible to download the source code. The new package will provide the source files and precompiled versions (in a user-friendly form for Julia users), but as before it must only be available to you.
If it's the HSL people hosting the download as an official part of what they license people to obtain directly from them, that works. Anyone other than them putting the files up somewhere else, like on GitHub releases, wouldn't be allowed without some special arrangement that gets around that distribution prohibition.
If it's the HSL people hosting the download as an official part of what they license people to obtain directly from them, that works.
Yes, this is the plan. @amontoison has been working with them directly.
I'm going to close this in favor of #247 for now.
The underlying problem in this issue was an upstream issue with the HSL configure script. We're about to release official binaries from HSL, which will render this issue moot. And with a much simpler installation path, I'll be able to update the documentation in the README and close #247 for good.
@tkelman @bennerh @LukasBarner For information, we released JuliaHSL. :tada:
This is pretty cool!! 🎉 🎉
Hi everyone,
I tried to add the full HSL library to Ipopt on a clean Ubuntu install, following the Ubuntu-specific instructions from here. However, running a basic optimization problem leads to the following error message:
Other linear_solver attributes such as "ma57" lead to the same message.
I tried the installation on both Ubuntu 18.04 and 20.04, using Ipopt.jl version 1.1.0 and JuMP version 1.4.0.
Are there any ideas of what I might have forgotten? Thanks a lot in advance!