odow closed this pull request 3 months ago.
All modified and coverable lines are covered by tests :white_check_mark:
Project coverage is 98.40%. Comparing base (bf662d4) to head (39b99bc).
Okay, this looks like the problem. We never had any tests for this.
You can use JuMP.dual_shape.
This is similar to what we do in
https://github.com/jump-dev/MathOptInterface.jl/blob/a15b67fe47ad20caf316eb1bba0369a46ceb5a34/src/Bridges/Constraint/bridges/square.jl#L401-L411
The rule for duals is to multiply by the adjoint of the linear transformation we applied.
If the matrix was symmetric, we assume that the transformation took the average of the two off-diagonal entries, so the transpose would assign half the dual to each entry. But because the triangular dot product multiplies off-diagonal terms by 2, the adjoint does not divide by 2 and instead assigns the same value to both off-diagonal entries.
Here, because Zeros has the standard dot product, the adjoint and the transpose are the same, so assuming that the transformation takes the average, this PR is the adjoint. Could you add a small comment explaining that we are simply applying the adjoint (assuming we used the average)?
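To make the adjoint rule concrete, here is a minimal sketch (not JuMP's implementation; `avg` and `adj` are illustrative names) for a 2×2 matrix, checking that `adj` is the adjoint of the averaging map under the standard dot product on both sides:

```julia
using LinearAlgebra

# Averaging map: full 2x2 matrix [a b; c d] -> triangle (a, (b + c) / 2, d).
avg(M) = [M[1, 1], (M[1, 2] + M[2, 1]) / 2, M[2, 2]]

# Its adjoint under the standard dot product: each off-diagonal entry of the
# full matrix receives half of the off-diagonal dual.
adj(v) = [v[1] v[2]/2; v[2]/2 v[3]]

# Verify <avg(M), v> == <M, adj(v)> on random inputs.
M, v = rand(2, 2), rand(3)
@assert dot(avg(M), v) ≈ dot(M, adj(v))
```

Under the triangular dot product, which counts the off-diagonal term twice, the same identity only holds if `adj` does not divide by 2, which is the symmetric case described above.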
> You can use JuMP.dual_shape
I don't think so, or at least not for SymmetricMatrixShape and HermitianMatrixShape, because these are a special case of us interpreting Matrix-in-Zeros. If we implemented dual_shape, it would also modify things like Matrix-in-PSD.
Or we need a new matrix shape just for these Matrix-in-Zeros constraints.
> Or we need a new matrix shape just for these Matrix-in-Zeros constraints.
Yes, maybe we need to do that. I'm in favor.
Name suggestions for the primal and dual shapes? SquareSymmetricMatrixShape and SquareSymmetricMatrixDualShape?
Okay, instead of adding more shapes, I've added a needs_adjoint_dual kwarg.
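A rough sketch of the idea behind the kwarg (`reshape_dual_sketch` is a hypothetical helper; the PR's actual signature and placement differ): when converting the dual vector back into a matrix, the adjoint variant splits each off-diagonal dual across the two symmetric entries instead of copying it to both:

```julia
# Hypothetical illustration, not JuMP's code: reshape a vectorized upper
# triangle `duals` into an n-by-n matrix, optionally applying the adjoint.
function reshape_dual_sketch(duals::Vector{Float64}, n::Int; needs_adjoint_dual::Bool = false)
    W = zeros(n, n)
    k = 0
    for j in 1:n, i in 1:j  # column-major upper triangle
        k += 1
        if i == j
            W[i, i] = duals[k]
        else
            # A plain reshape copies the value to both entries; the adjoint
            # of "take the average" assigns half of it to each.
            value = needs_adjoint_dual ? duals[k] / 2 : duals[k]
            W[i, j] = value
            W[j, i] = value
        end
    end
    return W
end
```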
I'm afraid this PR introduced a bug: before, if I tried to constrain A == B for a Symmetric A and a Hermitian B, it would give me an error. Now it silently fails.
```julia
using JuMP
using LinearAlgebra
import Hypatia
import Clarabel
import Ket
import Random

function dual_test(d)
    Random.seed!(1)
    ρ = Ket.random_state(Float64, d^2, 1)
    model = Model()
    @variable(model, σ[1:d^2, 1:d^2] in PSDCone())
    @variable(model, λ)
    noisy_state = Hermitian(ρ + λ * I(d^2))
    # Symmetric == Hermitian equality: this used to error, and now passes
    # silently with an incorrect dual.
    @constraint(model, witness_constraint, σ == noisy_state)
    @objective(model, Min, λ)
    @constraint(model, Ket.partial_transpose(σ, 2, [d, d]) in PSDCone())
    set_optimizer(model, Hypatia.Optimizer)
    # set_optimizer(model, Clarabel.Optimizer)
    optimize!(model)
    W = dual(witness_constraint)
    return W
end
```
Oops, can you open an issue so that we can track this?
> Now it silently fails
No, now it silently succeeds :smile:
Closes #3796
I need to think about whether there is a better option. I don't think we should just fix the default reshape_vector, because we don't need to modify anything for the primal? Just the dual?
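For context, a minimal sketch of the distinction (assuming JuMP's reshape_vector semantics for SymmetricMatrixShape; the dual matrix shown is what the adjoint rule would produce, not current output):

```julia
using JuMP, LinearAlgebra

shape = JuMP.SymmetricMatrixShape(2)
v = [1.0, 2.0, 3.0]  # column-major upper triangle: (1,1), (1,2), (2,2)

# The primal reshape is already correct: entries (1,2) and (2,1) both get 2.0.
A = JuMP.reshape_vector(v, shape)  # Symmetric([1.0 2.0; 2.0 3.0])

# The dual of a Matrix-in-Zeros constraint needs the adjoint instead, which
# splits the off-diagonal dual across both entries: [1.0 1.0; 1.0 3.0].
```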