Closed · jofrevalles closed this pull request 12 months ago
Merging #72 (f32a805) into master (6f21d05) will increase coverage by 0.23%. The diff coverage is 100.00%.
```
@@            Coverage Diff             @@
##           master      #72      +/-   ##
==========================================
+ Coverage   78.13%   78.37%   +0.23%
==========================================
  Files          10       10
  Lines         709      712       +3
==========================================
+ Hits          554      558       +4
+ Misses        155      154       -1
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/TensorNetwork.jl | 84.24% <100.00%> (+0.90%) | :arrow_up: |
It seems that the new thrown error now shows us problems in the `hcat` function.
I've refactored some code and now it should be ok.
> It seems that the new thrown error now shows us problems in the `hcat` function.
The code is actually ok. The problem was that the `hcat` tests were connecting indices with wrong sizes, so the new check detected a bug!
## Summary
This PR fixes issue #69 (resolves #69). The previous implementation did not raise a `DimensionMismatch` error in the `TensorNetwork` constructor for inconsistent dimensions across tensors sharing the same index label. This issue could lead to incorrect results when working with `TensorNetwork` objects.

In response, we have implemented a dimensionality check in the `TensorNetwork` constructor, ensuring consistency across tensors that share an index label. This is similar to the check that already exists in the `push!` method.

Furthermore, to confirm that both the `TensorNetwork` constructor and the `push!` method are functioning correctly, we have added tests to ensure they throw a `DimensionMismatch` error when encountering inconsistent dimensions.

## Example

With our fix, a `DimensionMismatch` error is now correctly thrown:
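The original reproducer is not shown here; the following is a minimal, self-contained sketch of the kind of check this PR describes. `Tensor` and `TensorNetwork` below are simplified stand-ins defined inside the snippet, not the package's actual types:

```julia
# Minimal stand-in for a labeled tensor: an array plus one index label per dimension.
struct Tensor
    data::Array
    labels::Vector{Symbol}
end

struct TensorNetwork
    tensors::Vector{Tensor}

    # Inner constructor: verify that every index label has the same size
    # across all tensors that use it, as the PR's dimensionality check does.
    function TensorNetwork(tensors::Vector{Tensor})
        sizes = Dict{Symbol,Int}()
        for tensor in tensors
            for (label, dim) in zip(tensor.labels, size(tensor.data))
                if haskey(sizes, label) && sizes[label] != dim
                    throw(DimensionMismatch(
                        "index $label has inconsistent sizes: $(sizes[label]) and $dim"))
                end
                sizes[label] = dim
            end
        end
        new(tensors)
    end
end

# Index :j has size 2 in the first tensor but size 3 in the second.
a = Tensor(rand(2, 2), [:i, :j])
b = Tensor(rand(3, 4), [:j, :k])

try
    TensorNetwork([a, b])
catch err
    println(err isa DimensionMismatch)  # prints "true"
end
```

Networks whose shared indices agree in size construct without error, so the check only rejects genuinely inconsistent inputs.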