Closed: jofrevalles closed this 1 year ago
Merging #31 (c6c1312) into master (5c2dae2) will increase coverage by 0.07%. The diff coverage is 100.00%.
```diff
@@            Coverage Diff             @@
##           master      #31      +/-   ##
==========================================
+ Coverage   82.05%   82.12%   +0.07%
==========================================
  Files           6        6
  Lines         234      235       +1
==========================================
+ Hits          192      193       +1
  Misses         42       42
```

| Impacted Files | Coverage Δ | |
|---|---|---|
| src/Numerics.jl | 87.30% <100.00%> (+0.20%) | :arrow_up: |
The problem is that `EinExprs` does not return only pairwise `Tensor`s to be contracted, and that leads to errors. You can check the issue I mentioned above. I think that this extension of the `contract` function simply fixes those problems.
Yep, but an optimization pass can be written such that `EinExpr(a, b, c, out=labels)` gets transformed to `EinExpr(EinExpr(a, b, out=labels_ab), c, out=labels)`.
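For illustration, a minimal sketch of such a pass, assuming the vector-based `EinExpr` constructor used later in this thread also accepts nested expressions; the `binarize` name is hypothetical and not part of EinExprs:

```julia
# Hypothetical sketch, not the actual EinExprs optimizer: rewrite an n-ary
# contraction into nested pairwise ones by folding the operands left to right.
function binarize(args)
    foldl((acc, arg) -> EinExpr([acc, arg]), args[3:end];
          init = EinExpr([args[1], args[2]]))
end
```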
@jofrevalles Actually, we don't need to write any other optimization pass: this should currently be done by just calling `einexpr(EinExpr([a,b,c], out=labels), optimizer=Exhaustive())`. I mean, just calling `einexpr` on the `EinExpr` that you want to "optimize".
It should, but this doesn't work for unconnected tensors, since `einexpr` does not return them two-by-two:
```julia
julia> tensor_vector = [
           Tensor(rand(2, 2), (:i, :j)),
           Tensor(rand(2, 2), (:k, :l)),
           Tensor(rand(2, 2), (:m, :n))]
3-element Vector{Tensor{Float64, 2, Matrix{Float64}}}:
...

julia> einexpr(Exhaustive, EinExpr(tensor_vector)) |> contract
ERROR: MethodError: no method matching contract(::Tensor{Float64, 2, Matrix{Float64}}, ::Tensor{Float64, 2, Matrix{Float64}}, ::Tensor{Float64, 2, Matrix{Float64}}; dims::Vector{Symbol})
Closest candidates are:
  contract(::Tensor, ::Tensor; dims)
   @ Tensors ~/.julia/packages/Tensors/O9RAx/src/Numerics.jl:36
  contract(::Tensor; dims)
   @ Tensors ~/.julia/packages/Tensors/O9RAx/src/Numerics.jl:49
  contract(::Tensor, ::TensorNetwork; kwargs...)
   @ Tenet ~/git/Tenet.jl/src/TensorNetwork.jl:480
...
```
That's why I think that in this case it would be useful to extend `contract`.
Okay, I'm gonna approve this but keep in mind that this is subject to changes in the future.
This PR enhances the `contract` function for multiple tensors, which resolves issue #16 in EinExprs. The function utilizes `reduce` to perform pairwise `Tensor` contractions. To ensure the robustness of these changes, we added a `testset` that covers the new `contract` functionality, checking its behavior with various tensor inputs.
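For reference, the core idea is a pairwise fold. A minimal sketch, assuming the package's existing two-argument `contract` method; the actual method added in this PR may differ in signature and keyword handling:

```julia
# Sketch only: contract an arbitrary number of tensors by folding the
# existing pairwise `contract` over them, left to right.
contract(tensors::Tensor...) = reduce(contract, tensors)
```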