jump-dev / Convex.jl

A Julia package for disciplined convex programming
https://jump.dev/Convex.jl/stable/

Various operations don't support broadcast #218

Closed: aferragu closed this issue 2 years ago

aferragu commented 6 years ago

I'm using Julia v0.6.1 with Mosek.jl v0.8.2 and the latest Convex.jl master installed via Pkg.checkout. I'm trying to solve a simple resource allocation problem, which is an LP of the form:

R=[1 0 0;1 1 0;0 1 1];
C=[60;150;100];
set_default_solver(MosekSolver())
x=Variable(3)
p=maximize(sum(x))
p.constraints+=R*x.<=C
p.constraints+=x.>=0
solve!(p)

And I obtain as output:

Problem
  Name                   :                 
  Objective sense        : min             
  Type                   : LO (linear optimization problem)
  Constraints            : 13              
  Cones                  : 0               
  Scalar variables       : 4               
  Matrix variables       : 0               
  Integer variables      : 0               

Optimizer started.
Presolve started.
Linear dependency checker started.
Linear dependency checker terminated.
Eliminator - tries                  : 0                 time                   : 0.00            
Lin. dep.  - tries                  : 1                 time                   : 0.00            
Lin. dep.  - number                 : 0               
Presolve terminated. Time: 0.00    
Optimizer terminated. Time: 0.00    

Interior-point solution summary
  Problem status  : PRIMAL_AND_DUAL_FEASIBLE
  Solution status : OPTIMAL
  Primal.  obj: -1.2000000000e+02   nrm: 1e+02    Viol.  con: 0e+00    var: 0e+00  
  Dual.    obj: -1.2000000000e+02   nrm: 1e+00    Viol.  con: 0e+00    var: 0e+00  

Basic solution summary
  Problem status  : PRIMAL_AND_DUAL_FEASIBLE
  Solution status : OPTIMAL
  Primal.  obj: -1.2000000000e+02   nrm: 1e+02    Viol.  con: 0e+00    var: 0e+00  
  Dual.    obj: -1.2000000000e+02   nrm: 1e+00    Viol.  con: 0e+00    var: 0e+00  

julia> x.value
3×1 Array{Float64,2}:
 60.0
 -0.0
 60.0

Which is clearly non-optimal: the reported objective is 120, while x=[60;0;100] is feasible and gives 160.

The same issue appears with SCS:

----------------------------------------------------------------------------
    SCS v1.2.6 - Splitting Conic Solver
    (c) Brendan O'Donoghue, Stanford University, 2012-2016
----------------------------------------------------------------------------
Lin-sys: sparse-direct, nnz in A = 22
eps = 1.00e-04, alpha = 1.80, max_iters = 20000, normalize = 1, scale = 5.00
Variables n = 4, constraints m = 13
Cones:  primal zero / dual free vars: 1
    linear vars: 12
Setup time: 7.20e-05s
----------------------------------------------------------------------------
 Iter | pri res | dua res | rel gap | pri obj | dua obj | kap/tau | time (s)
----------------------------------------------------------------------------
     0|      inf       inf      -nan      -inf      -inf       inf  3.61e-05 
   100| 7.80e-04  3.66e-03  3.16e-03 -1.20e+02 -1.21e+02  2.07e-14  7.84e-05 
   200| 1.83e-05  3.34e-04  1.80e-04 -1.20e+02 -1.20e+02  0.00e+00  1.18e-04 
   240| 6.17e-06  2.76e-05  2.43e-05 -1.20e+02 -1.20e+02  2.07e-14  1.36e-04 
----------------------------------------------------------------------------
Status: Solved
Timing: Solve time: 1.38e-04s
    Lin-sys: nnz in L factor: 41, avg solve time: 1.47e-07s
    Cones: avg projection time: 3.04e-08s
----------------------------------------------------------------------------
Error metrics:
dist(s, K) = 0.0000e+00, dist(y, K*) = 0.0000e+00, s'y/|s||y| = 2.5273e-18
|Ax + s - b|_2 / (1 + |b|_2) = 6.1731e-06
|A'y + c|_2 / (1 + |c|_2) = 2.7598e-05
|c'x + b'y| / (1 + |c'x| + |b'y|) = 2.4254e-05
----------------------------------------------------------------------------
c'x = -119.9981, -b'y = -120.0039
============================================================================

julia> x.value
3×1 Array{Float64,2}:
 59.9998     
  0.000433576
 59.9978     

I suspected an issue with the broadcasting in p.constraints += R*x .<= C, and indeed, inserting the constraints separately yields the correct answer.
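
For reference, a minimal sketch of what "inserting the constraints separately" can look like, using the same data as the snippet above: one scalar constraint per row of R, so no broadcasting is involved. The SCS solver and the solve!(p, SCS.Optimizer) call are assumptions here, since the solver invocation has changed across Convex.jl releases.

using Convex, SCS

R = [1 0 0; 1 1 0; 0 1 1]
C = [60; 150; 100]

x = Variable(3)
p = maximize(sum(x))
for i in 1:size(R, 1)
    # one scalar constraint per row of R; no broadcast dot anywhere
    p.constraints += R[i, :]' * x <= C[i]
end
p.constraints += x >= 0
solve!(p, SCS.Optimizer)   # assumption: the solve! signature varies by Convex.jl version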

ericphanson commented 5 years ago

With the current release of Convex, I get

ERROR: LoadError: MethodError: no method matching iterate(::Convex.MultiplyAtom)
Closest candidates are:
  iterate(::Core.SimpleVector) at essentials.jl:568
  iterate(::Core.SimpleVector, ::Any) at essentials.jl:568
  iterate(::ExponentialBackOff) at error.jl:199

when running your code, which seems much better than silently giving the wrong answer. Removing the broadcast on the constraints (which then matches the examples in https://convexjl.readthedocs.io/en/stable/types.html#constraints) gives

julia> p.optval
160.0

julia> x.value
3×1 Array{Float64,2}:
 60.0
 90.0
 10.0

Is this right?
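
For concreteness, the non-broadcasted formulation referred to above, written out with the data from the first post. This is a sketch: the solver choice and the solve! signature are assumptions that differ between Convex.jl releases.

using Convex, SCS

R = [1 0 0; 1 1 0; 0 1 1]
C = [60; 150; 100]

x = Variable(3)
p = maximize(sum(x))
p.constraints += R * x <= C   # plain <=, interpreted elementwise by Convex
p.constraints += x >= 0       # likewise, no broadcast dot
solve!(p, SCS.Optimizer)      # assumption: call signature varies by version

With this formulation the optimum reported above is 160 at x = [60, 90, 10].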

aferragu commented 5 years ago

Yeah, that solution is valid. I was a Convex.jl user but have since migrated to JuMP. That day I was quickly solving an LP to check some exam solutions, found it didn't work right, and reported it.

ericphanson commented 5 years ago

Ok, great!

On the one hand, I think using broadcasted notation could make sense for Convex, but on the other, I don’t think there’s really another way to interpret the non-broadcasted constraints besides elementwise. I think it isn’t good if broadcasted notation doesn’t error and gives a different result though. So I’ll submit a pull request to add some tests to make sure that the broadcasted notation errors, and update the docs to explicitly point out that one should not use broadcasted constraints. Then if that goes through maybe we can close this issue.
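
A rough sketch of the kind of regression test being proposed, keyed to the MethodError shown earlier in the thread (hypothetical; the test actually added in the pull request may differ):

using Convex, Test

R = [1 0 0; 1 1 0; 0 1 1]
C = [60; 150; 100]
x = Variable(3)

# The broadcasted matrix constraint from the original report should error
# (as it currently does) rather than silently build a wrong formulation.
@test_throws MethodError R * x .<= C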

ericphanson commented 5 years ago

Hmm, well I realized

  x = Variable(5)
  y = rand(5)
  C = x .== y

makes C a vector of constraints, essentially

[ x[i] == y[i] for i = 1:5 ]

It could be that someone is using this for some purpose, so I'm not sure we want to disallow broadcasting entirely. Maybe the simplest fix is to add a test checking that the formulation from your first post yields an error (with a reference to this issue); then if the broadcasting in R*x .<= C ever starts working again in Convex (even by accident), the failing test will prompt a check of this issue to make sure it's being handled sensibly.
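
For illustration, a small sketch of the vector-of-constraints behaviour described above, together with one way such a vector could be consumed. The sumsquares objective is just a placeholder for the example, and appending element by element mirrors the p.constraints += pattern used earlier in this thread:

using Convex

x = Variable(5)
y = rand(5)

C = x .== y              # essentially [x[i] == y[i] for i = 1:5], per the comment above

p = minimize(sumsquares(x))
for c in C
    p.constraints += c   # append each scalar constraint individually
end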

odow commented 2 years ago

Closing in favor of https://github.com/jump-dev/Convex.jl/issues/479