Closed blegat closed 2 months ago
I don't particularly like this design. The "Convex.jl" approach would probably be to add a `TupleAtom`?
Attention: Patch coverage is 97.87234%, with 1 line in your changes missing coverage. Please review.
Project coverage is 97.84%. Comparing base (bb17c4c) to head (02c96ab). Report is 3 commits behind head on master.
| Files | Patch % | Lines |
|---|---|---|
| src/expressions.jl | 0.00% | 1 Missing :warning: |
It's the concatenation that should belong to the cone, so we should use a `vcat` atom (which, it seems, gets rewritten with `hcat` and `reshape`s). If `getindex` on a `vcat` atom checked whether the index fits exactly within one of the arguments of `vcat`, and in that case unwrapped the `vcat` atom, it would fix https://github.com/jump-dev/Convex.jl/issues/603
Now that https://github.com/jump-dev/Convex.jl/issues/603 is fixed, I reverted the `Tuple` approach.
I had to refactor a bit in e220a8e to work around this JuliaFormatter bug: https://github.com/domluna/JuliaFormatter.jl/issues/833
Looks good to me, any objection to merging?
This is an attempt to refactor `GeomMeanEpiCone` into `GenericConstraint`. In the long term (meaning definitely post v0.16), we can replace the custom `add_constraint` with a bridge so that the cone can be used in JuMP as well. If we do the same for the two other remaining constraints, we can then get rid of `abstract type Constraint` and rename `GenericConstraint` into `Constraint`.

This design is blocked by https://github.com/jump-dev/Convex.jl/issues/603