jump-dev / Convex.jl

A Julia package for disciplined convex programming
https://jump.dev/Convex.jl/stable/

Allow SparseMatrixCSC in Constant #631

Closed ericphanson closed 5 months ago

ericphanson commented 5 months ago

Before:

julia> include("alternating_minimization.jl");
┌ Info: Running with Convex.jl...
└   (m, n, k) = (125, 125, 3)
  3.573375 seconds (482.01 k allocations: 4.005 GiB, 14.71% gc time, 2.11% compilation time)
┌ Info: Running with JuMP...
└   (m, n, k) = (125, 125, 3)
  1.746994 seconds (11.50 M allocations: 965.324 MiB, 9.38% gc time, 16.74% compilation time)
Test Summary: | Pass  Total  Time
Same results  |    2      2  0.0s

# edit to adjust to 500, 500, 5

julia> include("alternating_minimization.jl");
┌ Info: Running with Convex.jl...
└   (m, n, k) = (500, 500, 5)
zsh: killed     julia --project

After:

julia> include("alternating_minimization.jl")
┌ Info: Running with Convex.jl...
└   (m, n, k) = (125, 125, 3)
  0.891868 seconds (481.78 k allocations: 374.011 MiB, 10.03% compilation time)
┌ Info: Running with JuMP...
└   (m, n, k) = (125, 125, 3)
  2.643730 seconds (11.50 M allocations: 965.566 MiB, 35.61% gc time, 12.43% compilation time)
Test Summary: | Pass  Total  Time
Same results  |    2      2  0.0s
Test.DefaultTestSet("Same results", Any[], 2, false, false, true, 1.715032812872713e9, 1.715032812872855e9, false, "/Users/eph/Convex.jl/benchmark/alternating_minimization.jl")

# edit to adjust to 500, 500, 5

julia> include("alternating_minimization.jl")
┌ Info: Running with Convex.jl...
└   (m, n, k) = (500, 500, 5)
┌ Warning: Problem wasn't solved optimally
│   status = SLOW_PROGRESS::TerminationStatusCode = 19
└ @ Convex ~/Convex.jl/src/solution.jl:106
┌ Warning: Problem wasn't solved optimally
│   status = SLOW_PROGRESS::TerminationStatusCode = 19
└ @ Convex ~/Convex.jl/src/solution.jl:106
┌ Warning: Problem wasn't solved optimally
│   status = SLOW_PROGRESS::TerminationStatusCode = 19
└ @ Convex ~/Convex.jl/src/solution.jl:106
 13.208127 seconds (5.25 M allocations: 5.335 GiB, 9.69% gc time, 0.63% compilation time)
┌ Info: Running with JuMP...
└   (m, n, k) = (500, 500, 5)
 25.208499 seconds (207.07 M allocations: 15.508 GiB, 10.42% gc time, 1.34% compilation time)
Same results: Test Failed at /Users/eph/Convex.jl/benchmark/alternating_minimization.jl:115
  Expression: ≈(evaluate(X1), X3, atol = 0.01, rtol = 0.01)
   Evaluated: [3.985502847714369 6.336204901220693 … 3.0561144403421983 1.9811075092251802; 7.103073495040336 10.03035168385331 … 8.490997313059967 6.843293638612469; … ; 9.396602838044087 11.413079225490325 … 7.108408090021403 6.6699615362915345; 5.568163727865276 5.356379196890957 … 5.522929652986813 3.4495627069254464] ≈ [3.780960048278361 5.622524414892031 … 3.1261791636486658 2.313101796428146; 6.399984541890497 8.538191107376267 … 8.11128008838461 6.852751645554122; … ; 8.588782644432861 9.878977650066732 … 7.099949168036594 6.669088447309506; 5.081806995719069 4.4830189978506905 … 5.29058289903206 3.464746864457721] (atol=0.01, rtol=0.01)

Stacktrace:
 [1] macro expansion
   @ ~/.julia/juliaup/julia-1.10.3+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Test/src/Test.jl:672 [inlined]
 [2] macro expansion
   @ ~/Convex.jl/benchmark/alternating_minimization.jl:115 [inlined]
 [3] macro expansion
   @ ~/.julia/juliaup/julia-1.10.3+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Test/src/Test.jl:1577 [inlined]
 [4] top-level scope
   @ ~/Convex.jl/benchmark/alternating_minimization.jl:115
Same results: Test Failed at /Users/eph/Convex.jl/benchmark/alternating_minimization.jl:116
  Expression: ≈(evaluate(Y1), Y3, atol = 0.01, rtol = 0.01)
   Evaluated: [0.07295197285764587 -0.24286897306707095 … -0.5548581939642112 1.4633485414096428; 1.414134868735616 1.9004876767400078 … 1.6676190513350084 0.8649987706473459; … ; -1.4904480795374606 0.11746381884317214 … 0.7436972016699036 -0.04421684309960473; 0.9233135840332146 0.716863190462869 … -0.8394400807700355 -1.5306526191598768] ≈ [0.21439733516058615 -0.2901415368702991 … -0.524234840438445 1.7087464033753652; 1.4924783296899935 1.962305818151101 … 1.770677504251063 0.8545824272365898; … ; -1.6273950064460871 0.15516389028236127 … 0.9407293312551961 0.1103153021415286; 1.1179230855667397 0.9122501086590963 … -0.940312055696772 -1.6435436829197443] (atol=0.01, rtol=0.01)

Stacktrace:
 [1] macro expansion
   @ ~/.julia/juliaup/julia-1.10.3+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Test/src/Test.jl:672 [inlined]
 [2] macro expansion
   @ ~/Convex.jl/benchmark/alternating_minimization.jl:116 [inlined]
 [3] macro expansion
   @ ~/.julia/juliaup/julia-1.10.3+0.aarch64.apple.darwin14/share/julia/stdlib/v1.10/Test/src/Test.jl:1577 [inlined]
 [4] top-level scope
   @ ~/Convex.jl/benchmark/alternating_minimization.jl:115
Test Summary: | Fail  Total  Time
Same results  |    2      2  0.0s
ERROR: Some tests did not pass: 0 passed, 2 failed, 0 errored, 0 broken.

So we go from OOMing to solving it in 13 s, not bad! Sparse constants are very important because we internally use constants to represent vectorized operations: we take an n×m matrix, vectorize it to a length-n*m vector, then put it on the diagonal of an (n*m) × (n*m) matrix. If that latter matrix ever gets densified, that's a huge issue.
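To see the scale involved, here is a standalone sketch (not Convex.jl internals, just the same shape of operation) of vectorizing an n×m matrix and placing it on the diagonal of an (n*m) × (n*m) matrix, comparing the sparse storage against what a dense version would cost:

```julia
using SparseArrays, LinearAlgebra

# Illustration only: the n×m → diag((n*m) × (n*m)) pattern described above.
n, m = 500, 500
A = ones(n, m)
v = vec(A)                # vectorize: length n*m = 250_000

D = spdiagm(0 => v)       # sparse diagonal: stores only the n*m diagonal entries

# A dense (n*m) × (n*m) Float64 matrix would need (n*m)^2 * 8 bytes:
dense_bytes = (n * m)^2 * sizeof(Float64)   # 500 GB, hence the OOM above
```

The sparse diagonal stores 250,000 entries; the densified version would need 500 GB, which matches the `zsh: killed` behaviour in the "Before" run.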

JuMP and Convex don't get the same answers here, but I don't think that's too concerning: it's an iterative algorithm, so it can be sensitive to small changes, and we formulate the problem a little differently.
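A generic alternating-least-squares sketch shows why two formulations can drift apart: each half-step is solved against the previous iterate, so any small difference in one subproblem's solution propagates to every later iterate. (This is not the benchmark's code, which solves convex subproblems via Convex.jl / JuMP; it's a minimal stand-in for the alternating structure.)

```julia
using LinearAlgebra, Random

Random.seed!(42)  # reproducibility

# Alternating least squares for A ≈ X*Y with inner rank k.
function als(A, k; iters = 20)
    m, n = size(A)
    X = randn(m, k)
    Y = zeros(k, n)
    for _ in 1:iters
        Y = X \ A          # fix X: least-squares solve for Y
        X = (Y' \ A')'     # fix Y: least-squares solve for X
    end
    return X, Y
end

A = randn(30, 20)
X, Y = als(A, 3)
```

After the first half-step, `X*Y` is a projection of `A`, so the residual never exceeds `norm(A)`; but the particular `X`, `Y` reached depends on every intermediate solve, which is why small formulation differences can move the iterates outside an `atol=0.01` comparison.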

ericphanson commented 5 months ago

btw, it was my first time profiling end-to-end with a pure-Julia solver; it's pretty neat to run a @profview in VS Code and jump straight into kkt_update! and the like!
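For anyone who hasn't tried it: `@profview` comes from the VS Code Julia extension (ProfileView.jl provides a similar macro); under the hood it's the stdlib sampling profiler, which you can drive directly like this (the `work` function here is just a made-up example to profile):

```julia
using Profile

function work()
    s = 0.0
    for i in 1:10^6
        s += sqrt(i)
    end
    return s
end

work()                       # warm up, so compilation isn't what gets sampled
Profile.clear()
@profile work()              # collect samples
Profile.print(maxdepth = 5)  # text call tree; @profview renders the same data
                             # as a clickable flame graph
```

Clicking a frame in the flame graph jumps to the source line, which is what makes it easy to land in solver internals like kkt_update!.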

codecov[bot] commented 5 months ago

Codecov Report

All modified and coverable lines are covered by tests :white_check_mark:

Project coverage is 97.84%. Comparing base (de4bff2) to head (95b1fa0).

Additional details and impacted files:

```diff
@@            Coverage Diff             @@
##           master     #631      +/-   ##
==========================================
+ Coverage   97.82%   97.84%   +0.02%
==========================================
  Files          88       88
  Lines        5141     5144       +3
==========================================
+ Hits         5029     5033       +4
+ Misses        112      111       -1
```
