jump-dev / JuMP.jl

Modeling language for Mathematical Optimization (linear, mixed-integer, conic, semidefinite, nonlinear)
http://jump.dev/JuMP.jl/

`value` returns both `0.0` and `-0.0` on `Bin`ary values #3793

Closed LilithHafner closed 2 months ago

LilithHafner commented 2 months ago

The output here should be a BitMatrix or, if you care about type stability (which is quite reasonable), a Matrix{Float64} with entries 0.0 and 1.0. Stray -0.0s aren't "wrong" in the sense that -0.0 == 0.0, but unless the sign bit conveys some meaningful information it should be normalized to 0.0:

julia> using JuMP

julia> import HiGHS

julia> function bin_pack(capacities::AbstractVector{<:Real}, sizes::AbstractVector{<:Real})
           model = Model(HiGHS.Optimizer)

           @variable(model, x[eachindex(sizes), eachindex(capacities)], Bin)
           @objective(model, Max, sum(x))
           @constraint(model, sum(x, dims=2) .<= 1)
           @constraint(model, sizes' * x .<= capacities')

           set_silent(model)
           optimize!(model)

           value.(x)
       end
bin_pack (generic function with 1 method)

julia> bin_pack([4,5], [3,2,2,2])
4×2 Matrix{Float64}:
  0.0   1.0
  1.0  -0.0
  1.0  -0.0
 -0.0   1.0
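For context (not part of the original report): -0.0 and 0.0 compare equal under == but carry different sign bits, and adding 0.0 is a standard way to normalize the sign under IEEE 754 arithmetic. A minimal sketch (normalize_zero is a hypothetical helper name):

```julia
# -0.0 and 0.0 are equal under ==, but their sign bits differ.
@assert -0.0 == 0.0
@assert signbit(-0.0) && !signbit(0.0)

# Adding 0.0 normalizes -0.0 to +0.0 (IEEE 754: -0.0 + 0.0 == +0.0).
normalize_zero(x) = x + 0.0
@assert !signbit(normalize_zero(-0.0))

# Applied elementwise to a solution matrix:
A = [0.0 1.0; -0.0 1.0]
B = normalize_zero.(A)
@assert all(!signbit, B)
```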
odow commented 2 months ago

This is expected behavior that we won't be fixing.

Solvers, even when they have "binary" variables, in fact solve in double precision with a tolerance (the HiGHS default is 1e-6), so anything in [N - tol, N + tol] is considered "integer" when N is an integer.
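The tolerance test described above can be sketched like this (is_int is a hypothetical helper, and 1e-6 mirrors the HiGHS default mentioned here):

```julia
# Hypothetical helper: a value v counts as "integer" if it lies within
# tol of the nearest integer N, i.e. in [N - tol, N + tol].
is_int(v; tol = 1e-6) = abs(v - round(v)) <= tol

@assert is_int(1.0000004)   # within tolerance of 1
@assert is_int(-0.0)        # -0.0 is within tolerance of 0
@assert !is_int(0.5)        # far from any integer
```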

Note that in most cases you can round the solution to recover integrality, but in some cases the rounded solution may violate your constraints by more than the primal feasibility tolerance.
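The rounding step can be sketched as follows (the sol matrix is made-up illustrative data; rounding to Int also removes any -0.0, since the result is integer-typed, but feasibility of the rounded point must still be checked against your constraints):

```julia
# Illustrative solver output with near-integer entries and a -0.0.
sol = [0.0 1.0; 1.0 -0.0; -0.0 0.9999996]

# Round each entry to the nearest Int; -0.0 cannot survive the conversion.
rounded = round.(Int, sol)
@assert rounded == [0 1; 1 0; 0 1]
```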

odow commented 2 months ago

I've opened a PR to clarify the docs: https://github.com/jump-dev/JuMP.jl/pull/3794