jofrevalles closed this pull request 1 year ago
Merging #70 (274fc08) into master (2b37db3) will decrease coverage by 0.66%. The diff coverage is 95.83%.
```
@@            Coverage Diff             @@
##           master      #70      +/-   ##
==========================================
- Coverage   88.50%   87.84%   -0.66%
==========================================
  Files           9        9
  Lines         600      625      +25
==========================================
+ Hits          531      549      +18
- Misses         69       76       +7
```

Impacted Files | Coverage Δ | |
---|---|---|
src/Transformations.jl | 97.46% <95.83%> (-0.30%) | :arrow_down: |
Super!

```julia
julia> size.(tensors(reduced))
5-element Vector{Tuple{Int64, Int64, Vararg{Int64}}}:
 (3, 3, 3)
 (3, 3, 3)
 (3, 1)
 (3, 1)
 (1, 1, 3, 3)
```
I think it's best to directly remove the indices whose resulting dimension is 1. What do you think @jofrevalles?
> I think it's best to directly remove the indices whose resulting dimension is 1. What do you think @jofrevalles?

Maybe we could extend `dropdims` for `Tensor`s, right? This would be easier.
Yeah, that seems correct to me.
@jofrevalles I've implemented `dropdims` for `Tensor` and released it in the new Tensors v0.1.9 release. Update the `compat` section to get the `dropdims` method.
@mofeing In the quimb library, they choose whether to apply split-simplification to a tensor based on the condition `max(prod(size(u)), prod(size(v))) < prod(size(tensor))`. Here, `u` and `v` are the tensors obtained from the SVD decomposition of a given tensor. This condition checks whether the larger of the two new tensors (resulting from the split) has fewer elements than the original tensor, so the split only happens when it reduces the total number of tensor elements.
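For illustration, the element-count condition can be checked on plain Julia arrays. This is a minimal sketch, not Tenet's or quimb's actual code; the helper name `splits_profitably` and the tolerance are invented here:

```julia
using LinearAlgebra

# Sketch: decide whether splitting a tensor across a given index
# bipartition (first `nleft` indices vs. the rest) reduces the total
# number of elements, in the spirit of quimb's condition
# max(prod(size(u)), prod(size(v))) < prod(size(tensor)).
function splits_profitably(tensor::AbstractArray, nleft::Int; atol=1e-12)
    dims = size(tensor)
    m = prod(dims[1:nleft])      # rows of the matricized tensor
    n = prod(dims[nleft+1:end])  # columns of the matricized tensor
    S = svdvals(reshape(tensor, m, n))
    r = count(>(atol), S)        # numerical rank after truncation
    # after truncating to rank r, u has m*r elements and v has r*n
    max(m * r, r * n) < prod(dims)
end
```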
In our current implementation, we've been using a slightly different approach: we examine the rank revealed by the singular values, and perform the split only if some of them are zero. I believe this approach is equivalent to quimb's, but it saves us the computational expense of having to truncate the tensors after the SVD.
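The rank-based criterion described above can be sketched with plain Julia arrays. Assumptions are mine (the helper name `rank_deficient` and the tolerance are not from Tenet):

```julia
using LinearAlgebra

# Sketch: matricize the tensor across a bipartition and report whether
# some singular values vanish (up to a tolerance), i.e. the matricized
# tensor is rank-deficient and a split would lower the rank.
function rank_deficient(tensor::AbstractArray, nleft::Int; atol=1e-12)
    dims = size(tensor)
    m = prod(dims[1:nleft])
    n = prod(dims[nleft+1:end])
    S = svdvals(reshape(tensor, m, n))
    count(>(atol), S) < min(m, n)
end
```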
Please let me know your thoughts on this.
Both conditions should be equivalent mathematically, so our implementation should be correct.
Summary
This PR introduces the `SplitSimplification` transformation as part of the `transform!` function for `TensorNetwork`s, addressing issue #18 (resolve #18). The transformation applies the split simplification procedure, as outlined in this paper, to reduce the complexity of the tensor network while ensuring the rank of any tensor does not increase. The main strategy involves performing a singular value decomposition across all possible bipartitions of a tensor's indices and replacing the tensor with a lower-rank approximation when this results in a simplification of the `TensorNetwork`.

Additionally, this PR includes comprehensive tests that verify the correctness and robustness of the newly implemented transformation.
Example
In this example, the `SplitSimplification` transformation is applied to a tensor network that includes a rank-4 tensor that can be simplified, via an SVD, into two vectors and a matrix. The result is a network containing five tensors, each of lower rank and dimensions.
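The low-rank structure exploited here can be reproduced with plain Julia arrays (a sketch for illustration, not Tenet's API): build a rank-4 tensor as an outer product of two vectors and a matrix, then observe that its matricization across one bipartition has rank 1.

```julia
using LinearAlgebra

# Rank-4 tensor T[i,j,k,l] = u[i] * v[j] * M[k,l]: an outer product of
# two vectors and a matrix, so it carries hidden low-rank structure.
u = rand(3)
v = rand(3)
M = rand(3, 3)
T = [u[i] * v[j] * M[k, l] for i in 1:3, j in 1:3, k in 1:3, l in 1:3]

# Matricize across the (i,j) | (k,l) bipartition and inspect the
# singular values: only one is nonzero, so the tensor splits into much
# smaller factors across this cut.
S = svdvals(reshape(T, 9, 9))
count(>(1e-10 * S[1]), S)  # rank of this bipartition is 1
```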