pytorch/xla: Enabling PyTorch on XLA Devices (e.g. Google TPU)
https://pytorch.org/xla

[fp8] Support `fp8e4m3` in torch_xla #8005

Open · miladm opened this issue 2 weeks ago

miladm commented 2 weeks ago

🚀 Feature

Please enable `fp8e4m3` in torch_xla. This feature is in flight in openxla: https://github.com/openxla/xla/pull/16585/files

PyTorch doesn't support `fp8e4m3` yet; only the `fn`/`fnuz` variants are supported. @amithrm would like to see this dtype as an alternative to `fp8e4m3fn`.
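
For context, here is a minimal sketch of what works today versus what this issue asks for, assuming a recent torch_xla build that already lowers the existing `fn` dtype. The plain `torch.float8_e4m3` name used in the commented-out line is hypothetical and does not exist in PyTorch today.

```python
# Sketch only: illustrates the gap this issue describes, assuming a recent
# torch_xla install where the existing fn dtype is already lowered.
import torch
import torch_xla.core.xla_model as xm

device = xm.xla_device()

# Works today: the finite-only variant, fp8e4m3fn.
x = torch.randn(4, 4).to(torch.float8_e4m3fn).to(device)
print(x.dtype)  # torch.float8_e4m3fn

# Requested: a plain e4m3 dtype as an alternative to fp8e4m3fn.
# `torch.float8_e4m3` is a hypothetical name; it is not defined in PyTorch
# today, which is why this needs work in both PyTorch and the openxla
# lowering linked above.
# y = torch.randn(4, 4).to(torch.float8_e4m3).to(device)
```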

cc @amithrm @JackCaoG

miladm commented 3 days ago

@apivovarov @amithrm From the compiler side, what missing pieces would you like to see materialize for this feature?