I'm filing this issue to track / discuss how to handle multi-dimensional tensors as we go through the packing passes and secret-to. I thought we had an issue about this already, but I couldn't find one.
Background
Right now:

- tensor_ext rotate assumes 1-D tensors only (a minimal sketch of this 1-D usage follows this list).
- align-tensor-sizes assumes 1-D tensors.
- tensor_ext canonicalization patterns assume rotation on 1-D tensors.
- target slot analysis assumes 1-D tensors.
- (because of the above) the SIMD vectorizer / HECO optimization assumes 1-D tensors.
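For concreteness, this is roughly the kind of IR all of these passes expect today; the tensor_ext.rotate assembly syntax here is approximated, so treat it as a sketch rather than copy-paste-able IR:

```mlir
// Current assumption: the rotated operand is a 1-D tensor.
%c1 = arith.constant 1 : index
%rot = tensor_ext.rotate %vec, %c1 : tensor<16xi16>, index
```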
Situation right now
In https://github.com/google/heir/issues/906, @lawrencekhlim is rewriting a linalg.matmul operation into a diagonalized matrix-vector product. Matmul operations expect 2-D inputs, so we are currently restricted to rewriting matmuls of an NxN matrix with an Nx1 vector. The Nx1 vector needs to be rotated, so naturally we hit an error because tensor_ext rotate expects 1-D tensors only.
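Roughly, the situation looks like the sketch below (the types and the rotate syntax are illustrative, not copied from #906): the matmul operands are 2-D, and the diagonal method wants to rotate the Nx1 vector operand, which the current rotate verifier rejects.

```mlir
// NxN matrix times Nx1 vector; linalg.matmul requires 2-D operands.
%prod = linalg.matmul
    ins(%mat, %vec : tensor<4x4xi16>, tensor<4x1xi16>)
    outs(%acc : tensor<4x1xi16>) -> tensor<4x1xi16>

// The diagonal method needs rotations of the vector, but %vec is 2-D
// (tensor<4x1xi16>), so this hits the "1-D tensors only" restriction.
%c1 = arith.constant 1 : index
%rot = tensor_ext.rotate %vec, %c1 : tensor<4x1xi16>, index
```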
What to do
I'm not sure whether:

(1) We should flatten / reshape the inputs so that the rotated vector becomes 1-D, or
(2) We should generalize the rotate op (and the tensor_ext canonicalization patterns) to allow tensors where at most one dimension is non-unit.
Obviously (2) would complicate the implementations a bit, especially determining which dimension is non-unit, but it's slightly more realistic that tensors coming from the frontends would have shape 1xN.
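A rough sketch of what each option might look like on the Nx1 case above. Option (1) uses upstream tensor dialect ops (the expand_shape syntax follows recent upstream MLIR); the relaxed rotate in option (2) is hypothetical, since today's verifier would reject it:

```mlir
// Option (1): flatten the 2-D vector, rotate in 1-D, then restore the shape.
%flat = tensor.collapse_shape %vec [[0, 1]] : tensor<4x1xi16> into tensor<4xi16>
%r1   = tensor_ext.rotate %flat, %c1 : tensor<4xi16>, index
%back = tensor.expand_shape %r1 [[0, 1]] output_shape [4, 1]
    : tensor<4xi16> into tensor<4x1xi16>

// Option (2): relax the rotate op (and canonicalization / target slot analysis)
// to accept a tensor with exactly one non-unit dimension and rotate along it.
%r2 = tensor_ext.rotate %row, %c1 : tensor<1x16xi16>, index
```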
Other than this particular problem, if we have 2-D packing algorithms then we need to be able to operate on multi-dimensional tensors, so this probably isn't the only issue. I know we punted a little on defining rotations for multi-dimensional tensors.