vinx13 opened this issue 2 years ago
A related issue from AS (auto-scheduler), probably due to the same root cause: https://github.com/apache/tvm/issues/9476
Actually, if I replace the above repro with

```python
mm = relay.nn.batch_matmul(x, x, out_dtype="int32", transpose_b=True)
func = relay.Function([x], mm)
```

I get a segfault from MetaSchedule. Disabling `RewriteLayout` made the error go away.
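For reference, a self-contained version of that snippet might look like the sketch below; the input shape and dtype are placeholders chosen for illustration, not the values from the original repro:

```python
import tvm
from tvm import relay

# Hypothetical input shape/dtype; the original repro's values are not shown here.
x = relay.var("x", shape=(4, 16, 32), dtype="int8")

# With transpose_b=True, (B, M, K) x (B, N, K) -> (B, M, N); using x for both
# operands gives a (4, 16, 16) int32 output.
mm = relay.nn.batch_matmul(x, x, out_dtype="int32", transpose_b=True)
func = relay.Function([x], mm)
mod = tvm.IRModule.from_expr(func)
print(mod)
```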
TOPI batch matmul mistakenly marks the RHS tensor of batch matmul as a layout-free placeholder even when it is a variable. As a result, `RewriteLayout` is applied to it, and the inserted layout transformation can't be constant-folded in Relay. This causes a workload mismatch: the `meta_schedule_layout_transform` op is fused with other operators, producing a new workload that has never been tuned.
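For context, TOPI declares layout-free placeholders through the `layout_free_placeholders` attribute on the compute op. The sketch below is a simplified batch_matmul compute written for illustration, not the actual TOPI definition:

```python
from tvm import te

def batch_matmul_sketch(x, y, out_dtype="int32"):
    # Shapes: x is (B, M, K), y is (B, N, K), i.e. the transpose_b=True layout.
    b, m, k = x.shape
    _, n, _ = y.shape
    red = te.reduce_axis((0, k), name="k")
    return te.compute(
        (b, m, n),
        lambda bi, mi, ni: te.sum(
            x[bi, mi, red].astype(out_dtype) * y[bi, ni, red].astype(out_dtype),
            axis=red,
        ),
        name="T_batch_matmul",
        # This attribute is what lets MetaSchedule's RewriteLayout transform `y`.
        # The bug described above is that it is set even when `y` is a variable
        # rather than a constant, so the resulting meta_schedule_layout_transform
        # cannot be constant-folded away.
        attrs={"layout_free_placeholders": [y]},
    )
```

When the RHS is a constant, the injected layout transform can be folded at compile time; when it is a variable, it survives into the fused workload.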
Expected behavior

Successfully tune and compile the Relay function.
Actual behavior
One workload is missing from the tuning database.
Environment
TVM v0.11.dev (commit 5eab64885ad4)
Steps to reproduce
cc @zxybazh @junrushao