bsc-quantic / Tenet.jl

Composable Tensor Network library in Julia
https://bsc-quantic.github.io/Tenet.jl/
Apache License 2.0

Implement `canonize` for canonical form transformation of `MatrixProductState`s #46

Closed jofrevalles closed 8 months ago

jofrevalles commented 1 year ago

Summary

In this PR we add the function canonize (fix #45), which takes a MatrixProductState{Open} and transforms it into the desired canonical form. The function accepts a center, either an Integer or a UnitRange, which determines the center of orthogonality. It performs successive SVDs so that the tensors to the left and right of the center are left- and right-canonical, respectively. It can also truncate the bond dimension of the tensors outside the center to a given value chi. We also added tests to verify that it works as intended.
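As a rough illustration of the mechanism described above (a hypothetical sketch with plain arrays, not the PR's actual implementation): left-canonicalizing a single rank-3 MPS tensor amounts to grouping the left bond and physical indices, taking an SVD, keeping the isometry U, and pushing the remainder toward the center.

```julia
# Hypothetical sketch: left-canonicalize one MPS tensor T of size
# (left bond Dl, right bond Dr, physical d) via SVD, as is done for
# tensors to the left of the orthogonality center.
using LinearAlgebra

Dl, Dr, d = 4, 6, 2
T = randn(Dl, Dr, d)

# Group (left bond, physical) into rows and the right bond into columns.
M = reshape(permutedims(T, (1, 3, 2)), Dl * d, Dr)
F = svd(M)

# The isometry U becomes the left-canonical tensor; the new right bond
# has size min(Dl*d, Dr).
A = permutedims(reshape(F.U, Dl, d, size(F.U, 2)), (1, 3, 2))

# The remainder S*V' is absorbed into the neighboring tensor toward the center.
R = Diagonal(F.S) * F.Vt

# Left-canonical condition: contracting A with conj(A) over the left bond
# and physical index yields the identity.
Am = reshape(permutedims(A, (1, 3, 2)), Dl * d, :)
G = Am' * Am
@assert G ≈ Matrix(I, size(G)...)
```

Sweeping this step from the boundary toward the center (and its mirror image from the right) is what produces the identity checks shown in the example below.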

Example

julia> using Tenet
julia> ψ = rand(MatrixProductState{Open}, 16, 2, 8)
TensorNetwork{MatrixProductState{Open}}(#tensors=16, #inds=31)
julia> ϕ = canonize(ψ, 8)
TensorNetwork{MatrixProductState{Open}}(#tensors=16, #inds=31)
julia> A = tensors(ϕ, 4); ein"ijk,ilk->jl"(A, conj(A))
16×16 Matrix{Float64}:
  1.0           2.00021e-16   1.55642e-16  -3.6175e-17    7.2592e-17    1.22594e-17   4.15944e-16  …  -1.33193e-16  -2.02137e-16   2.47257e-16  -9.46201e-17   1.3618e-16    6.85258e-17
  2.00021e-16   1.0           1.1956e-16    1.23764e-16   4.2185e-16   -4.7835e-16   -2.90511e-16     -1.60436e-17  -5.19301e-16   1.33808e-16  -1.58435e-16   9.47992e-17  -2.75936e-16
  1.55642e-16   1.1956e-16    1.0          -1.44702e-16  -1.55503e-17  -3.18376e-17   3.74575e-17      3.60339e-17  -1.89068e-16  -2.46219e-16  -7.9332e-17   -3.10887e-16   1.67774e-17
 -3.6175e-17    1.23764e-16  -1.44702e-16   1.0          -5.88042e-17   2.82227e-16  -7.15532e-17     -5.30028e-18  -8.86827e-17  -1.06777e-16  -8.00802e-17   7.91298e-17  -3.02182e-16
  7.2592e-17    4.2185e-16   -1.55503e-17  -5.88042e-17   1.0          -3.98925e-17  -1.44373e-16     -1.80332e-16  -2.1445e-16   -1.43922e-16  -1.88635e-16  -3.0402e-17   -2.28285e-16
  1.22594e-17  -4.7835e-16   -3.18376e-17   2.82227e-16  -3.98925e-17   1.0          -7.3377e-17   …   2.54161e-16   2.54636e-16  -1.84121e-17  -3.17101e-17   4.80834e-17   7.39615e-17
  4.15944e-16  -2.90511e-16   3.74575e-17  -7.15532e-17  -1.44373e-16  -7.3377e-17    1.0             -2.66094e-16  -4.13278e-17  -2.43669e-17   2.0519e-16   -3.93628e-16  -1.44632e-16
 -5.37008e-17  -4.56306e-17  -8.04112e-17  -4.56559e-16  -6.4265e-18   -3.35598e-16   6.23502e-16      2.66407e-17  -1.7158e-16   -1.41715e-18   6.19677e-17   6.93524e-18   2.79394e-16
  8.05436e-17   1.42453e-16   1.71197e-16   2.08164e-16  -5.35533e-17   1.02004e-16   6.79889e-16     -1.44901e-16  -5.18267e-17  -1.0725e-16    2.23824e-17  -3.37965e-16  -4.39888e-16
  1.61461e-16   1.08783e-16  -1.16969e-16   1.56489e-16   4.03115e-16   5.58961e-17   1.18552e-16     -1.39082e-17  -1.99405e-16   5.04873e-16  -9.1763e-17    3.91543e-17   3.13676e-17
 -1.33193e-16  -1.60436e-17   3.60339e-17  -5.30028e-18  -1.80332e-16   2.54161e-16  -2.66094e-16  …   1.0          -1.15568e-17  -1.47709e-16   1.73387e-16   1.80294e-16   1.60105e-16
 -2.02137e-16  -5.19301e-16  -1.89068e-16  -8.86827e-17  -2.1445e-16    2.54636e-16  -4.13278e-17     -1.15568e-17   1.0          -3.229e-16     4.53753e-16  -1.37335e-16  -6.98021e-17
  2.47257e-16   1.33808e-16  -2.46219e-16  -1.06777e-16  -1.43922e-16  -1.84121e-17  -2.43669e-17     -1.47709e-16  -3.229e-16     1.0          -5.98558e-17   7.25681e-17   2.80239e-16
 -9.46201e-17  -1.58435e-16  -7.9332e-17   -8.00802e-17  -1.88635e-16  -3.17101e-17   2.0519e-16       1.73387e-16   4.53753e-16  -5.98558e-17   1.0           1.24428e-16   3.75688e-19
  1.3618e-16    9.47992e-17  -3.10887e-16   7.91298e-17  -3.0402e-17    4.80834e-17  -3.93628e-16      1.80294e-16  -1.37335e-16   7.25681e-17   1.24428e-16   1.0           1.25388e-16
  6.85258e-17  -2.75936e-16   1.67774e-17  -3.02182e-16  -2.28285e-16   7.39615e-17  -1.44632e-16  …   1.60105e-16  -6.98021e-17   2.80239e-16   3.75688e-19   1.25388e-16   1.0
julia> B = tensors(ϕ, 12); ein"ijk,ljk->il"(B, conj(B))
32×32 Matrix{Float64}:
  1.0          -1.01264e-16   3.29597e-17  -2.1684e-17    7.1991e-17   -1.74557e-16   4.16334e-17  …   2.00297e-32   1.30963e-32   0.0           5.20417e-17   4.85723e-17  -4.0766e-17
 -1.01264e-16   1.0          -7.48099e-17  -3.85976e-17   1.64799e-17   1.74475e-16   1.84314e-17      1.75741e-32   1.07852e-32   1.54074e-33  -1.73472e-16   8.67362e-19  -1.82146e-16
  3.29597e-17  -7.48099e-17   1.0          -6.09322e-17  -4.51028e-17  -6.1149e-17   -1.64799e-17      5.10371e-33  -1.77186e-32   0.0           2.68882e-17  -1.04083e-17   1.21431e-17
 -2.1684e-17   -3.85976e-17  -6.09322e-17   1.0           1.73472e-18  -7.08797e-17   2.0383e-17       2.48445e-32   6.16298e-33   8.85928e-33   1.88651e-17   3.46945e-18  -1.06252e-17
  7.1991e-17    1.64799e-17  -4.51028e-17   1.73472e-18   1.0           9.56266e-17  -1.73472e-17      4.62223e-33   0.0          -7.70372e-34   2.77556e-16   3.46945e-18   2.77556e-17
 -1.74557e-16   1.74475e-16  -6.1149e-17   -7.08797e-17   9.56266e-17   1.0           2.33429e-16  …   3.58896e-27  -6.39421e-28   3.2983e-27    2.28767e-17   2.81893e-17  -7.24789e-17
  4.16334e-17   1.84314e-17  -1.64799e-17   2.0383e-17   -1.73472e-17   2.33429e-16   1.0             -9.62965e-34   7.70372e-34  -9.24446e-33  -6.4642e-32    3.46945e-17  -6.07153e-18
  2.25514e-17  -7.02563e-17   7.63278e-17   6.245e-17     1.38778e-17   1.99493e-17  -3.46945e-18     -1.09778e-32   1.00148e-32  -7.70372e-33  -6.59195e-17   6.245e-17     4.16334e-17
  9.71445e-17  -9.45424e-17   1.14492e-16   5.72459e-17   3.46945e-17  -5.63785e-17   1.73472e-17      1.20366e-18  -8.55969e-19  -5.97008e-20   1.04083e-17  -1.38778e-17   1.04083e-16
  ⋮                                                                     ⋮                          ⋱                                                           ⋮            
 -3.15853e-32  -9.05187e-33  -1.20371e-32  -2.32556e-32   5.77779e-33  -5.42224e-27   1.56e-32         2.08167e-17   8.32667e-17  -2.77556e-17  -9.26063e-18   7.692e-18    -1.6449e-17
 -1.50223e-32  -3.87112e-32  -1.71408e-32  -2.07037e-33   3.46667e-33   1.05074e-26  -4.23705e-33     -1.21431e-16   5.55112e-17   9.71445e-17   8.94486e-17   4.24816e-17   1.18325e-16
 -2.50371e-33   6.64446e-33   4.55001e-33  -4.21297e-33  -4.62223e-33  -3.8008e-27   -9.82224e-33  …   2.68882e-17   1.07553e-16  -6.93889e-17  -2.65699e-17   1.55548e-16  -3.69e-17
  2.00297e-32   1.75741e-32   5.10371e-33   2.48445e-32   4.62223e-33   3.58896e-27  -9.62965e-34      1.0          -1.38778e-17  -4.16334e-17   2.47101e-17   4.15695e-17   4.25864e-17
  1.30963e-32   1.07852e-32  -1.77186e-32   6.16298e-33   0.0          -6.39421e-28   7.70372e-34     -1.38778e-17   1.0          -2.22045e-16  -1.75484e-17  -3.31585e-17  -2.74651e-17
  0.0           1.54074e-33   0.0           8.85928e-33  -7.70372e-34   3.2983e-27   -9.24446e-33     -4.16334e-17  -2.22045e-16   1.0           2.38417e-17   1.31197e-17   3.0879e-17
  5.20417e-17  -1.73472e-16   2.68882e-17   1.88651e-17   2.77556e-16   2.28767e-17  -6.4642e-32       2.47101e-17  -1.75484e-17   2.38417e-17   1.0           9.36751e-17   1.7911e-16
  4.85723e-17   8.67362e-19  -1.04083e-17   3.46945e-18   3.46945e-18   2.81893e-17   3.46945e-17  …   4.15695e-17  -3.31585e-17   1.31197e-17   9.36751e-17   1.0           1.80411e-16
 -4.0766e-17   -1.82146e-16   1.21431e-17  -1.06252e-17   2.77556e-17  -7.24789e-17  -6.07153e-18      4.25864e-17  -2.74651e-17   3.0879e-17    1.7911e-16    1.80411e-16   1.0
julia> [size(tensors(ϕ, i)) for i in 1:length(ϕ)]
16-element Vector{Tuple{Int64, Int64, Vararg{Int64}}}:
 (2, 2)
 (2, 4, 2)
 (4, 8, 2)
 (8, 16, 2)
 (16, 32, 2)
 (32, 32, 2)
 (32, 16, 2)
 (16, 8, 2)
 (8, 16, 2)
 (16, 32, 2)
 (32, 32, 2)
 (32, 16, 2)
 (16, 8, 2)
 (8, 4, 2)
 (4, 2, 2)
 (2, 2)

julia> ϕ2 = canonize(ψ, 8; chi = 4)
TensorNetwork{MatrixProductState{Open}}(#tensors=16, #inds=31)
julia> [size(tensors(ϕ2, i)) for i in 1:length(ϕ2)]
16-element Vector{Tuple{Int64, Int64, Vararg{Int64}}}:
 (2, 2)
 (2, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 8, 2)
 (8, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 4, 2)
 (4, 2, 2)
 (2, 2)
julia> A = tensors(ϕ2, 4); ein"ijk,ilk->jl"(A, conj(A))
4×4 Matrix{Float64}:
  1.0          -1.01412e-16  -5.00915e-16   2.35041e-16
 -1.01412e-16   1.0           1.14853e-17  -5.86129e-16
 -5.00915e-16   1.14853e-17   1.0          -3.02073e-16
  2.35041e-16  -5.86129e-16  -3.02073e-16   1.0
julia> B = tensors(ϕ2, 12); ein"ijk,ljk->il"(B, conj(B))
4×4 Matrix{Float64}:
  1.0          8.67362e-19  -6.245e-17    -1.04951e-16
  8.67362e-19  1.0           3.40006e-16   1.9082e-16
 -6.245e-17    3.40006e-16   1.0           2.63678e-16
 -1.04951e-16  1.9082e-16    2.63678e-16   1.0
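The effect of the chi keyword can be sketched in the same spirit (again a hypothetical illustration, not the PR's code): after an SVD across a bond, only the chi largest singular values and their vectors are kept, which is the best rank-chi approximation in Frobenius norm.

```julia
# Hypothetical sketch of bond-dimension truncation with chi = 4:
# keep only the chi largest singular values across a bond.
using LinearAlgebra

chi = 4
M = randn(8, 8)   # matricization of two neighboring tensors across a bond
F = svd(M)        # singular values in F.S are sorted in decreasing order

U  = F.U[:, 1:chi]
S  = F.S[1:chi]
Vt = F.Vt[1:chi, :]

M_trunc = U * Diagonal(S) * Vt   # rank-chi approximation of M
@assert size(M_trunc) == size(M)
@assert rank(M_trunc) <= chi
```

This is why every bond size in ϕ2 above is capped at 4 except those adjacent to the center of orthogonality.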
codecov[bot] commented 1 year ago

Codecov Report

Merging #46 (f7c9a9f) into master (bd41d90) will increase coverage by 1.13%. The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master      #46      +/-   ##
==========================================
+ Coverage   82.78%   83.92%   +1.13%     
==========================================
  Files          12       12              
  Lines         610      653      +43     
==========================================
+ Hits          505      548      +43     
  Misses        105      105              
Impacted Files Coverage Δ
src/Tenet.jl 100.00% <ø> (ø)
src/Quantum/MatrixProductState.jl 94.44% <100.00%> (+2.87%) ↑
jofrevalles commented 1 year ago

I don't understand why I have this error in the test. Why is using OMEinsum not enough to load @ein_str? I am using this locally and I don't have issues.

mofeing commented 1 year ago

> I don't understand why I have this error in the test. Why is using OMEinsum not enough to load @ein_str? I am using this locally and I don't have issues.

Strange error. Maybe OMEinsum is not added to the test project but anyway, you should try to use contract and svd with Tensors.

I'm gonna pull the code to fix it and refactor it a lil.

jofrevalles commented 1 year ago

> Strange error. Maybe OMEinsum is not added to the test project but anyway, you should try to use contract and svd with Tensors.

Can I use contract with certain indices only?

> I'm gonna pull the code to fix it and refactor it a lil.

Okay, however you prefer. I can also do it myself.

mofeing commented 1 year ago

> Can I use contract with certain indices only?

Yes! That's the 3rd argument. Pass a list or tuple of Symbols.

jofrevalles commented 1 year ago

> Can I use contract with certain indices only?
>
> Yes! That's the 3rd argument. Pass a list or tuple of Symbols.

Is it supposed to work like this?

julia> A = tensors(ϕ, 3); ein"ijk,ilk->jl"(A, conj(A))
8×8 Matrix{Float64}:
  1.0          -1.2878e-16    3.2034e-16   -8.75272e-17   9.60378e-17  -1.16679e-16   5.70454e-17   1.98919e-16
 -1.2878e-16    1.0          -1.64091e-16  -1.1692e-16    5.85044e-16  -1.96559e-16   4.28362e-16   5.11654e-17
  3.2034e-16   -1.64091e-16   1.0          -4.59832e-17   5.08813e-17  -3.10939e-16   2.33305e-16  -3.48943e-16
 -8.75272e-17  -1.1692e-16   -4.59832e-17   1.0           2.93677e-18  -8.10736e-18   3.14701e-16  -2.33236e-16
  9.60378e-17   5.85044e-16   5.08813e-17   2.93677e-18   1.0          -3.38649e-16  -2.1951e-16    1.2226e-16
 -1.16679e-16  -1.96559e-16  -3.10939e-16  -8.10736e-18  -3.38649e-16   1.0           2.5346e-16    4.90241e-16
  5.70454e-17   4.28362e-16   2.33305e-16   3.14701e-16  -2.1951e-16    2.5346e-16    1.0           9.16482e-17
  1.98919e-16   5.11654e-17  -3.48943e-16  -2.33236e-16   1.2226e-16    4.90241e-16   9.16482e-17   1.0

julia> A = tensors(ϕ, 3); labels_A = labels(A); Tenet.contract(A, conj(A), (labels_A[1], labels_A[3]))
8-element Tensor{Float64, 1, Vector{Float64}}:
 1.0000000000000009
 0.9999999999999999
 1.0000000000000002
 0.9999999999999997
 0.9999999999999999
 0.9999999999999999
 0.9999999999999993
 0.9999999999999993

I would expect conj(A) to have different labels than A, so that we could contract over two selected indices and get back a Matrix.
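For reference, the result being expected here can be written out with plain arrays (a hypothetical sketch, independent of Tenet's contract API): summing A[i,j,k] * conj(A)[i,l,k] over i and k, as in the ein"ijk,ilk->jl" call above, should produce a j×l Matrix rather than a vector.

```julia
# Hypothetical sketch of the expected contraction with plain arrays:
# contract A and conj(A) over the first and third indices only.
Dl, Dr, d = 4, 6, 2
A = randn(Dl, Dr, d)

# G[j, l] = Σ_{i,k} A[i,j,k] * conj(A)[i,l,k]  -- a Dr×Dr Matrix.
G = [sum(A[i, j, k] * conj(A[i, l, k]) for i in 1:Dl, k in 1:d) for j in 1:Dr, l in 1:Dr]

@assert size(G) == (Dr, Dr)   # a Matrix, not a Vector
@assert G ≈ G'                # Gram matrix, hence symmetric for real A
```

For a left-canonical A (as produced by canonize), this G would be the identity, matching the REPL checks earlier in the thread.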

jofrevalles commented 1 year ago

I updated the code to use contract instead of OMEinsum. The discussion on the labels argument of the contract function for Tensors continues in Tensors/#13.

mofeing commented 8 months ago

Related code moved to Qrochet.jl.