Closed jofrevalles closed 1 year ago
Merging #66 (18c3a0e) into master (05a2bae) will increase coverage by 4.00%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #66      +/-   ##
==========================================
+ Coverage   84.97%   88.98%   +4.00%
==========================================
  Files           9        9
  Lines         599      599
==========================================
+ Hits          509      533      +24
+ Misses         90       66      -24
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/Quantum/Quantum.jl | 78.94% <100.00%> (+3.15%) | :arrow_up: |
This pull request addresses the issue encountered (resolves #64) when computing the `norm` of a Matrix Product Operator (MPO). The problem arises when the function attempts to replace a tensor label in the tensor network with a new label that already exists. This scenario often occurs with MPOs and other tensor structures with multiple interlayer connections, since the `plug` labels might not appear in the same order in the state and its adjoint. The error was not encountered with Matrix Product States (MPS) because they have a single interlayer, which matches their adjoint.

In this PR we correct this error and also introduce tests validating the `norm` function for both MPS and MPO.