oschuett opened this issue 5 years ago
This would indeed facilitate the use of DBCSR tensors but there are a few things to consider:
1) Not all tensor operations that can be expressed in Einstein notation are supported. This is a restriction of the low-level implementation: every operation must map to matrix-matrix multiplication and/or data redistribution. Thus the following operations are supported:
- Tensor contractions: 2 tensors in, 1 tensor out, each tensor of rank at least 2.
- Inner/outer products are possible but require workarounds (i.e., adding dimensions of size 1).
- Contractions involving more than 2 tensors must be split into consecutive pairwise contraction steps.
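To make the restrictions concrete, here is a minimal NumPy sketch (shapes are made up for illustration; DBCSR's own blocked, sparse tensor API is not shown): a supported two-tensor contraction, the pairwise splitting of a three-tensor contraction, and the size-1-dimension workaround for an inner product.

```python
import numpy as np

# Hypothetical dense stand-ins; DBCSR would use blocked, sparse tensors.
A = np.random.rand(4, 5, 6)   # rank-3 tensor, indices i,j,k
B = np.random.rand(6, 7)      # rank-2 tensor, indices k,l
D = np.random.rand(7, 8)      # rank-2 tensor, indices l,m

# Supported pattern: 2 tensors in, 1 tensor out. "ijk,kl->ijl" maps to a
# matrix-matrix multiplication once A is viewed as a (ij) x k matrix.
C = np.einsum("ijk,kl->ijl", A, B)

# A three-tensor contraction "ijk,kl,lm->ijm" is not one DBCSR call;
# it must be split into consecutive pairwise steps:
tmp = np.einsum("ijk,kl->ijl", A, B)
E = np.einsum("ijl,lm->ijm", tmp, D)

# Inner product via the size-1-dimension workaround: promote the rank-1
# vector to rank 2 so the contraction is a 1x1 matrix product.
x = np.random.rand(6)
x2 = x[:, None]                               # shape (6, 1)
dot = np.einsum("ka,kb->ab", x2, x2)[0, 0]    # equals x . x
```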
2) The notation cannot be fully abstract since, for good performance, the user needs to control the matrix representation of each tensor (at creation). However, this could be expressed in a similar notation: `ijk <-> ki,j`, meaning that a tensor `ijk` is mapped to a matrix whose first index is `ki` and whose second index is `j`.
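A dense NumPy sketch of what the `ijk <-> ki,j` mapping would mean (an illustration of the layout only, not DBCSR's actual implementation): the index order is permuted to `(k, i, j)` and then flattened so that `(k, i)` forms the matrix rows and `j` the columns.

```python
import numpy as np

# Hypothetical sizes for illustration.
i, j, k = 3, 4, 5
T = np.arange(i * j * k).reshape(i, j, k)     # tensor T[i, j, k]

# "ijk <-> ki,j": permute to (k, i, j), then merge (k, i) into the row index.
M = T.transpose(2, 0, 1).reshape(k * i, j)    # matrix with rows (ki), cols j

# The mapping is invertible: unflatten the rows, then permute back.
T_back = M.reshape(k, i, j).transpose(1, 2, 0)
```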
Machine learning libraries have found a nice API for tensor contractions, which is based on the Einstein notation. Examples are PyTorch, TensorFlow, and NumPy. I guess it would be rather straightforward to add this to DBCSR as well.
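For reference, this is what the Einstein-notation API looks like in NumPy (PyTorch's `torch.einsum` and TensorFlow's `tf.einsum` accept the same subscript strings): repeated indices are contracted, and the output indices follow the `->`.

```python
import numpy as np

A = np.random.rand(3, 4)
B = np.random.rand(4, 5)

# "ij,jk->ik": the repeated index j is summed over -> matrix multiplication.
C = np.einsum("ij,jk->ik", A, B)

# "ii->": both indices repeated with empty output -> trace of a square matrix.
S = np.random.rand(3, 3)
t = np.einsum("ii->", S)
```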