Closed: emstoudenmire closed this issue 4 years ago.
I made this fix and bumped the version. It seems like the problem is that `array(V)'` does a transpose and a complex conjugation, but we are using the convention that `V` is already conjugated. I think this is just the ITensors.jl convention, but maybe we should bring it in line with Julia's convention.
Anyway, that test passes if you do:
`@test isapprox(norm(array(U)*array(S)*transpose(array(V))-array(A)), 0.0; atol=1e-14)`
Great. Yes, I would argue that for NDTensors, we should use whatever convention for `V` is least surprising to users who think of a Tensor as just like a Julia tensor with a different interface. So it might make sense for `svd` to return a `V` such that `V'` is what's needed to reconstruct `A` when `V` is a matrix. I guess the resulting changes to ITensors.jl would be minimal, correct?
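For comparison, here is how the "least surprising" convention looks elsewhere: NumPy's `svd` avoids the question entirely by returning the adjoint factor `Vh` directly, so reconstruction needs no extra transpose or conjugation. (This is a NumPy sketch for illustration, not the NDTensors API.)

```python
import numpy as np

# NumPy returns U, S, Vh with A == U @ diag(S) @ Vh, where Vh is
# already the conjugate transpose of V.
rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
U, S, Vh = np.linalg.svd(A)

# Reconstruction uses Vh as returned, with no further adjoint:
assert np.allclose(U @ np.diag(S) @ Vh, A)
```

Julia's `LinearAlgebra.svd`, by contrast, exposes `F.V` (with `F.Vt` as the adjoint), so a matrix user writes `F.U * Diagonal(F.S) * F.V'` to reconstruct `A`.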
As you probably know, the reason for the ITensors convention is that you don't have to do a transpose, since the indices themselves handle it, so it might be confusing to still have to apply a `conj` to get the right result.
Passing a complex block-sparse Tensor into the `svd` function can lead to an `InexactError`. Changing lines 153 and 154 of `blocksparse/linearalgebra.jl` to the following appears to fix the issue. However, when trying to write a unit test I'm getting very large differences between `U*S*V'` and the original tensor. Perhaps I'm expecting the wrong output from the SVD?
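A plausible cause of those large differences, sketched here with NumPy as a stand-in since the NDTensors snippet isn't shown: for a complex matrix, reconstructing with a plain transpose of `V` instead of its conjugate transpose drops the conjugation, and the resulting error is on the order of the matrix norm rather than machine precision.

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, S, Vh = np.linalg.svd(A)
V = Vh.conj().T  # Julia-style V, where A == U * diag(S) * V'

# Conjugate transpose recovers A to machine precision:
err_adjoint = np.linalg.norm(U @ np.diag(S) @ V.conj().T - A)

# Plain transpose omits the conjugation and fails badly for complex A:
err_transpose = np.linalg.norm(U @ np.diag(S) @ V.T - A)
```

Under the ITensors convention discussed above, where `V` is returned pre-conjugated, the roles flip: there `transpose(V)` is the combination that reconstructs `A`, which is exactly what the passing test earlier in this thread checks.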