Closed Neutron3529 closed 3 years ago
You seem to be interested in batched matrix multiplication. TensorOperations.jl is currently restricted to tensor contractions that can be mapped to normal matrix multiplication, corresponding to strict Einstein summation convention, i.e. indices that appear twice in the right hand side are contracted over and cannot appear in the left hand side.
Maybe Tullio.jl can be of help to you.
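For what it's worth, here is a minimal sketch of how a batched matrix multiplication could look in Tullio.jl, assuming the batch of matrices is stacked along the third dimension of a 3-D array (the array names and sizes are just illustrative):

```julia
using Tullio

A = rand(10, 10, 5)  # batch of 5 matrices of size 10×10, stacked along dim 3
B = rand(10, 5, 5)   # batch of 5 matrices of size 10×5

# Batched matrix multiplication: the batch index n appears on both the
# right- and left-hand sides, which strict Einstein convention (and hence
# TensorOperations) forbids, but Tullio allows.
@tullio C[i, k, n] := A[i, j, n] * B[j, k, n]
```

Here `C[:, :, n]` equals `A[:, :, n] * B[:, :, n]` for each `n`.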
Thank you for your reply. Previously, I was using R and ran into some performance issues. I thought a single vectorized call would give better performance, since that is the case in R. But in Julia, maybe an explicit loop is also an acceptable choice.
What I want to do could be:

```julia
a = [rand(10, 10) for i in 1:5]
b = [rand(10, 5) for i in 1:5]
a .* b
```

which is fast enough.
What can I do to fix that error? (The reason I use TensorOperations rather than Einsum is that Einsum is slower than TensorOperations, at least in some demos.)
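As a side note on the broadcast approach above: a sketch of how the same vector-of-matrices multiplication could avoid per-iteration allocation, using `LinearAlgebra.mul!` with preallocated outputs (the sizes mirror the example above; whether this helps in practice depends on the workload):

```julia
using LinearAlgebra

a = [rand(10, 10) for _ in 1:5]
b = [rand(10, 5) for _ in 1:5]

# Preallocate the output matrices once, then write each product in place
# with mul!, avoiding the fresh allocations that `a .* b` performs.
c = [Matrix{Float64}(undef, 10, 5) for _ in 1:5]
for n in 1:5
    mul!(c[n], a[n], b[n])
end
```

Each `c[n]` then equals `a[n] * b[n]`.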