bsc-quantic / Tenet.jl

Composable Tensor Network library in Julia
https://bsc-quantic.github.io/Tenet.jl/
Apache License 2.0

Fix `DiagonalReduction` transformation for reduction over `skip_inds` #62

Closed · jofrevalles closed this 1 year ago

jofrevalles commented 1 year ago

Summary

This PR addresses a previously overlooked scenario in the DiagonalReduction transformation of the transform function. Specifically, it targets instances where the index to be reduced is included in skip_inds, which is set to be the open indices of the network by default. In such cases, the transformation previously failed to correctly reduce the dimension of a Tensor in a TensorNetwork.

To fix this, we have introduced a COPY tensor whenever the diagonal reduction involves an index that appears in skip_inds. To illustrate this, consider a TensorNetwork $TN=T^{1}_{ijk}T^{2}_{jkl}$ in which we want to reduce the diagonal indices $i$ and $j$, where $i$ is in skip_inds since it is an open index. A regular reduction would lead to $\tilde{TN}=T^{1}_{ik} T^{2}_{ikl}$, but then $i$ would no longer be an open index, so $TN \neq \tilde{TN}$. To keep the open-index structure of the two expressions the same, we add a COPY tensor: ${TN}^{\prime}=T^{1}_{i_{1} k} T^{2}_{i_{2} kl} \delta_{i_{1} i_{2} i_{3}}$.
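
For intuition, here is a minimal sketch in plain Julia (no Tenet API; the names T1, T2, T1r, T2r and the explicit loops are purely illustrative) verifying that contracting the reduced tensors through the rank-3 COPY tensor $\delta$ reproduces the original network with the open index intact:

d = 2

# T1 is diagonal over (i, j): nonzero entries only where i == j
T1 = zeros(d, d, d)
for i in 1:d, k in 1:d
    T1[i, i, k] = rand()
end
T2 = rand(d, d, d)  # indices (j, k, l), not diagonal

# Original network: contract over j and k, leaving the open indices (i, l)
TN = [sum(T1[i, j, k] * T2[j, k, l] for j in 1:d, k in 1:d) for i in 1:d, l in 1:d]

# Reduced tensors: T1 keeps only its diagonal slice, T2 is merely relabeled (j -> i2)
T1r = [T1[i, i, k] for i in 1:d, k in 1:d]  # indices (i1, k)
T2r = T2                                    # indices (i2, k, l)

# Rank-3 COPY tensor: 1 on the hyper-diagonal, 0 elsewhere
δ = zeros(d, d, d)
for i in 1:d
    δ[i, i, i] = 1.0
end

# Contract the reduced network; δ reconnects i1 and i2 to the open index i3
TN′ = [sum(T1r[i1, k] * T2r[i2, k, l] * δ[i1, i2, i3] for i1 in 1:d, i2 in 1:d, k in 1:d) for i3 in 1:d, l in 1:d]

@assert TN ≈ TN′  # the COPY tensor preserves the original open-index structure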

Additionally, we have refined the corresponding tests to better accommodate this specific scenario.

Example

Below is an example of a diagonal reduction in which one of the diagonal indices is an open index, so a COPY tensor is introduced in the reduction:

julia> using Tenet

julia> data = [1.0 0.0; 0.0 1.0;;; 1.0 0.0; 0.0 1.0] # The first and second indices are diagonal
2×2×2 Array{Float64, 3}:
[:, :, 1] =
 1.0  0.0
 0.0  1.0

[:, :, 2] =
 1.0  0.0
 0.0  1.0

julia> A = Tensor(data, (:i, :j, :k))
2×2×2 Tensor{Float64, 3, Array{Float64, 3}}: ...

julia> B = Tensor(rand(2, 2, 2), (:j, :k, :l))
2×2×2 Tensor{Float64, 3, Array{Float64, 3}}: ...

julia> tn = TensorNetwork([A, B])
TensorNetwork{Arbitrary}(#tensors=2, #inds=4)

julia> reduced = transform(tn, DiagonalReduction)
TensorNetwork{Arbitrary}(#tensors=3, #inds=6)

julia> [labels(tensor) for tensor in reduced.tensors]
3-element Vector{Tuple{Symbol, Symbol, Vararg{Symbol}}}:
 (:i1, :k)
 (:i2, :k, :l)
 (:i1, :i2, :i3)
codecov[bot] commented 1 year ago

Codecov Report

Merging #62 (e14b403) into master (03d0c8b) will increase coverage by 1.44%. The diff coverage is 100.00%.

@@            Coverage Diff             @@
##           master      #62      +/-   ##
==========================================
+ Coverage   82.76%   84.21%   +1.44%     
==========================================
  Files          12       12              
  Lines         650      703      +53     
==========================================
+ Hits          538      592      +54     
+ Misses        112      111       -1     
Impacted Files           Coverage Δ
src/Transformations.jl   99.09% <100.00%> (+2.54%) ↑
mofeing commented 1 year ago

I don't like the idea that the output index gets renamed. Could you leave the output index name untouched?

For the new indices, it's better to use UUIDs.uuid4().

jofrevalles commented 1 year ago

I don't like the idea that the output index gets renamed. Could you leave the output index name untouched?

For the new indices, it's better to use UUIDs.uuid4().

Okay, now this is fixed:

...

julia> tn = TensorNetwork([A, B])
TensorNetwork{Arbitrary}(#tensors=2, #inds=4)

julia> [labels(tensor) for tensor in tn.tensors]
2-element Vector{Tuple{Symbol, Symbol, Symbol}}:
 (:i, :j, :k)
 (:j, :k, :l)

julia> reduced = transform(tn, DiagonalReduction)
TensorNetwork{Arbitrary}(#tensors=3, #inds=5)

julia> [labels(tensor) for tensor in reduced.tensors]
3-element Vector{Tuple{Symbol, Symbol, Vararg{Symbol}}}:
 (Symbol("18fe49cf-b246-444a-a4ce-60eb7337373d"), :k)
 (Symbol("7a09ade8-5d19-4462-b0dc-e6826b37f44f"), :k, :l)
 (:i, Symbol("18fe49cf-b246-444a-a4ce-60eb7337373d"), Symbol("7a09ade8-5d19-4462-b0dc-e6826b37f44f"))
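
The UUID-style labels in the output above follow the earlier suggestion. As a minimal sketch (UUIDs is part of Julia's standard library; the variable name new_index is purely illustrative), a fresh index label of this kind can be generated with:

using UUIDs

new_index = Symbol(uuid4())  # a random label such as Symbol("18fe49cf-b246-444a-a4ce-60eb7337373d")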