Open · MaheshRavishankar opened 1 month ago
@Max191 is this something that you already had a pattern to fix? I think I remember you saying something like this but didn't connect it in my head.
Yes, this PR fixes it: https://github.com/llvm/llvm-project/pull/94637
> A more canonical representation of this is
I don't know if it is more canonical to have one ordering over the other, but the reshape propagation patterns should not be blocked by cases like this. Ideally, we should also have patterns for the inverse case of `expand_shape` -> `collapse_shape`.
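For illustration, a minimal sketch of that inverse case (the shapes, value names, and reassociation maps here are assumptions, not taken from the thread):

```mlir
// expand_shape feeding a collapse_shape, where the two reassociations
// touch disjoint dimensions.
%e = tensor.expand_shape %src [[0], [1], [2, 3]]
    output_shape [4, 6, 8, 16]
    : tensor<4x6x128xf32> into tensor<4x6x8x16xf32>
%c = tensor.collapse_shape %e [[0, 1], [2], [3]]
    : tensor<4x6x8x16xf32> into tensor<24x8x16xf32>

// A sinking pattern could rewrite this so the collapse comes first:
%c2 = tensor.collapse_shape %src [[0, 1], [2]]
    : tensor<4x6x128xf32> into tensor<24x128xf32>
%e2 = tensor.expand_shape %c2 [[0], [1, 2]]
    output_shape [24, 8, 16]
    : tensor<24x128xf32> into tensor<24x8x16xf32>
```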
I was supposed to refactor this to the Tensor dialect, but the PR got low on my priority list, so I haven't touched it in a while.
Some of the models (like SDXL) from torch seem to contain these types of patterns.
What this is trying to do is go from a shape of `tensor<2x32x10x16384xf32>` to a shape of `tensor<2x320x128x128xf32>`. The fusion heuristic, which essentially relies on propagating `expand_shape`s up and `collapse_shape`s down, gets stuck here, and we miss out on some fusion opportunities. A more canonical representation of this is sketched below; then the propagation should be able to fuse better.
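A minimal MLIR sketch of the two orderings, assuming the reshapes implied by the shapes above (value names and reassociation maps are illustrative, not taken from the original IR):

```mlir
// Stuck form: a collapse_shape feeding an expand_shape.
%collapsed = tensor.collapse_shape %src [[0], [1, 2], [3]]
    : tensor<2x32x10x16384xf32> into tensor<2x320x16384xf32>
%result = tensor.expand_shape %collapsed [[0], [1], [2, 3]]
    output_shape [2, 320, 128, 128]
    : tensor<2x320x16384xf32> into tensor<2x320x128x128xf32>

// More canonical form: expand first, then collapse. Now the expand_shape
// can bubble up toward producers and the collapse_shape can sink toward
// consumers, so the propagation patterns are not blocked.
%expanded = tensor.expand_shape %src [[0], [1], [2], [3, 4]]
    output_shape [2, 32, 10, 128, 128]
    : tensor<2x32x10x16384xf32> into tensor<2x32x10x128x128xf32>
%result2 = tensor.collapse_shape %expanded [[0], [1, 2], [3], [4]]
    : tensor<2x32x10x128x128xf32> into tensor<2x320x128x128xf32>
```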