Closed jofrevalles closed 1 year ago
Merging #62 (e14b403) into master (03d0c8b) will increase coverage by 1.44%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #62      +/-   ##
==========================================
+ Coverage   82.76%   84.21%   +1.44%
==========================================
  Files          12       12
  Lines         650      703      +53
==========================================
+ Hits          538      592      +54
+ Misses        112      111       -1
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/Transformations.jl | 99.09% <100.00%> (+2.54%) | :arrow_up: |
I don't like the idea that the output index gets renamed. Could you leave the output index name untouched?
For the new indices, better to use `UUIDs.uuid4()`.
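The suggestion can be sketched in plain Julia (a minimal sketch; `newlabel` is an illustrative name, not part of the package's API):

```julia
using UUIDs

# Mint a fresh index label as a Symbol. uuid4() draws a random 128-bit
# UUID, so collisions with existing index names are effectively impossible,
# unlike renaming schemes based on counters or on the output index itself.
newlabel = Symbol(uuid4())
```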
Okay, now this is fixed:
```julia
...

julia> tn = TensorNetwork([A, B])
TensorNetwork{Arbitrary}(#tensors=2, #inds=4)

julia> [labels(tensor) for tensor in tn.tensors]
2-element Vector{Tuple{Symbol, Symbol, Symbol}}:
 (:i, :j, :k)
 (:j, :k, :l)

julia> reduced = transform(tn, DiagonalReduction)
TensorNetwork{Arbitrary}(#tensors=3, #inds=5)

julia> [labels(tensor) for tensor in reduced.tensors]
3-element Vector{Tuple{Symbol, Symbol, Vararg{Symbol}}}:
 (Symbol("18fe49cf-b246-444a-a4ce-60eb7337373d"), :k)
 (Symbol("7a09ade8-5d19-4462-b0dc-e6826b37f44f"), :k, :l)
 (:i, Symbol("18fe49cf-b246-444a-a4ce-60eb7337373d"), Symbol("7a09ade8-5d19-4462-b0dc-e6826b37f44f"))
```
Summary
This PR addresses a previously overlooked scenario in the `DiagonalReduction` transformation of the `transform` function. Specifically, it targets instances where the index to be reduced is included in `skip_inds`, which defaults to the open indices of the network. In such cases, the transformation previously failed to correctly reduce the dimension of a `Tensor` in a `TensorNetwork`.

To fix this, we introduce a COPY tensor whenever an index is reduced over a `Tensor` with an index in `skip_inds`. To illustrate this, consider a `TensorNetwork` $TN = T^{1}_{ijk} T^{2}_{jkl}$, where we want to reduce the diagonal indices $i, j$ and $i$ is in `skip_inds` since it is an open index. A regular reduction would lead to $\tilde{TN} = T^{1}_{ik} T^{2}_{ikl}$, but as a result the index $i$ would no longer be an open index, and thus $TN \neq \tilde{TN}$. To ensure that the dimensionality of the two expressions is the same, we need to add a COPY tensor: ${TN}^{\prime} = T^{1}_{i_{1} k} T^{2}_{i_{2} kl} \delta_{i_{1} i_{2} i_{3}}$.

Additionally, we have refined the corresponding tests to cover this specific scenario.
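The identity behind the fix can be checked with plain Julia arrays (a minimal sketch under the example's assumptions; `δ`, `T1`, `T2` are illustrative, not Tenet's API):

```julia
# A rank-3 COPY tensor δ is nonzero only where all three indices coincide.
# Contracting two of its legs with the reduced tensors re-exposes the third
# leg as the open index i₃, restoring the network's dimensionality.
d = 2
δ = [Float64(i1 == i2 == i3) for i1 in 1:d, i2 in 1:d, i3 in 1:d]

# Reduced factors from the example: T¹ with indices (i₁, k), T² with (i₂, k, l).
T1 = rand(d, d)
T2 = rand(d, d, d)

# TN′ = T¹_{i₁k} T²_{i₂kl} δ_{i₁i₂i₃}, summed over i₁, i₂, k.
TN′ = [sum(T1[i1, k] * T2[i2, k, l] * δ[i1, i2, i3]
           for i1 in 1:d, i2 in 1:d, k in 1:d)
       for i3 in 1:d, l in 1:d]

# Diagonal-only contraction of the original network, with i kept open.
TN = [sum(T1[i, k] * T2[i, k, l] for k in 1:d) for i in 1:d, l in 1:d]

@assert TN′ ≈ TN
```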
Example
Below is an example illustrating a diagonal reduction wherein one of the indices is an open index, thus introducing a COPY tensor in the reduction: