iSoron closed this issue 3 years ago.
It's currently GLP_MSG_ERR by default.
https://github.com/jump-dev/GLPK.jl/blob/5259d51e194e72e9ceee5711d5f56ed1f2077d9d/src/MOI_wrapper/MOI_wrapper.jl#L183
We could bump it up, I guess. Any reason? You can always use set_optimizer_attribute(model, "msg_lev", GLPK.GLP_MSG_ON)
(or similar, I forget the exact constant).
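For reference, a minimal sketch of overriding the default via the raw parameter (the constant name GLPK.GLP_MSG_ON is assumed here and may differ between GLPK.jl versions):

using JuMP, GLPK

model = Model(GLPK.Optimizer)
# Raw GLPK parameter controlling log verbosity; GLP_MSG_ON restores the
# solver's iteration log (the wrapper's current default is GLP_MSG_ERR).
set_optimizer_attribute(model, "msg_lev", GLPK.GLP_MSG_ON)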
It currently does not satisfy the MOI spec, which says that output should be on by default.
I strongly dislike this change: https://github.com/odow/SDDP.jl/pull/407/checks?check_run_id=2703078270
GLP_MSG_ON is far too verbose, which is probably why we had GLP_MSG_ERR in the first place.
For algorithms that call the same solver many times, like SDDP, it makes sense to use MOI.Silent, as sketched below.
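For example, using JuMP's set_silent, which sets the MOI.Silent attribute:

using JuMP, GLPK

model = Model(GLPK.Optimizer)
set_silent(model)  # sets MOI.Silent, so repeated solves print nothing
# ... build the subproblem and call optimize!(model) many times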
Yes, I've turned silent on by default now. The problem is still that GLPK is far too talky: it doesn't print a nice summary, it just spams the console with rows of numbers without context.
I'm not convinced that having the output on by default is an improvement:
julia> model = Model(GLPK.Optimizer)
A JuMP Model
Feasibility problem with:
Variables: 0
Model mode: AUTOMATIC
CachingOptimizer state: EMPTY_OPTIMIZER
Solver name: GLPK
julia> @variable(model, x >= 0)
x
julia> @variable(model, 0 <= y <= 3)
y
julia> @objective(model, Min, 12x + 20y)
12 x + 20 y
julia> @constraint(model, c1, 6x + 8y >= 100)
c1 : 6 x + 8 y ≥ 100.0
julia> @constraint(model, c2, 7x + 12y >= 120)
c2 : 7 x + 12 y ≥ 120.0
julia> print(model)
Min 12 x + 20 y
Subject to
c1 : 6 x + 8 y ≥ 100.0
c2 : 7 x + 12 y ≥ 120.0
x ≥ 0.0
y ≥ 0.0
y ≤ 3.0
julia> optimize!(model)
0: obj = 0.000000000e+00 inf = 2.200e+02 (2)
3: obj = 2.120000000e+02 inf = 0.000e+00 (0)
* 4: obj = 2.050000000e+02 inf = 0.000e+00 (0)
Hi everyone, thank you for your work on this solver interface.
According to the MathOptInterface documentation, all optimizers should be verbose by default.
GLPK.Optimizer, however, seems to be completely silent by default. For example, the following script produces no output:
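A minimal example of such a script (assumed here for illustration; the original snippet is not reproduced in this thread):

using JuMP, GLPK

model = Model(GLPK.Optimizer)
@variable(model, x >= 0)
@objective(model, Min, x)
optimize!(model)  # prints nothing with the current GLP_MSG_ERR default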
Replacing GLPK in the snippet above by Cbc produces some output, as expected.
Should this be considered a bug, and would you consider a PR to fix it?