mlc-ai / relax

Apache License 2.0

[DistIR] Distributed tensor sharding propagation #257

Closed jinhongyii closed 1 year ago

jinhongyii commented 1 year ago

This PR implements GSPMD-like sharding spec propagation for Relax DTensor.
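To give a feel for what GSPMD-style propagation does, here is a minimal, hypothetical sketch in plain Python. The names (`ShardingSpec`, `propagate_matmul`, the `"S<i>"`/`"R"` placement strings) are illustrative only and are not the actual Relax DistIR API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class ShardingSpec:
    # One placement per tensor dimension: "S0"/"S1" = sharded along that
    # device-mesh axis, "R" = replicated. (Hypothetical encoding.)
    dims: tuple

def propagate_elementwise(spec: ShardingSpec) -> ShardingSpec:
    # Elementwise ops preserve the input sharding unchanged.
    return spec

def propagate_matmul(lhs: ShardingSpec, rhs: ShardingSpec) -> ShardingSpec:
    # (m, k) x (k, n) -> (m, n): the output inherits the row sharding of
    # lhs and the column sharding of rhs. A sharded contraction dimension
    # would additionally require an all-reduce, omitted in this sketch.
    return ShardingSpec((lhs.dims[0], rhs.dims[1]))

lhs = ShardingSpec(("S0", "R"))   # rows sharded on mesh axis 0
rhs = ShardingSpec(("R", "S1"))   # columns sharded on mesh axis 1
out = propagate_matmul(lhs, rhs)
print(out.dims)  # -> ('S0', 'S1')
```

The real pass propagates such specs across the whole Relax dataflow graph, inferring placements for intermediate tensors from the annotated inputs.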

jinhongyii commented 1 year ago

cc: @tqchen

jinhongyii commented 1 year ago

Things currently unsupported:

  1. propagating sharding specs to constants
  2. more high-level Relax operator support (ideally we should use Relax operators for mixed-dim access patterns, like B[i, j] = f(A[i + j]), and TIR for the rest)
  3. dynamic shapes
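The mixed-dim access pattern in item 2 is what makes propagation hard: a small NumPy sketch (the function `f`, sizes, and device count below are made up for illustration) shows that for B[i, j] = f(A[i + j]), row-sharding B forces each shard to read an overlapping "halo" window of A rather than a disjoint slice, so a simple per-dimension sharding spec cannot describe A's required layout.

```python
import numpy as np

def f(x):
    return x * 2  # placeholder elementwise function

m, n = 4, 3
A = np.arange(m + n - 1)             # row i of B reads the window A[i : i + n]
B = np.empty((m, n), dtype=A.dtype)
for i in range(m):
    for j in range(n):
        B[i, j] = f(A[i + j])

# Row-sharding B across 2 devices: the windows of A overlap, so the
# shards of A are not disjoint per-dimension slices.
need0 = A[0:0 + n + 1]               # rows 0..1 need A[0:4]
need1 = A[2:2 + n + 1]               # rows 2..3 need A[2:6]
print(need0, need1)                  # -> [0 1 2 3] [2 3 4 5]
```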