pytorch-labs / float8_experimental

This repository contains the experimental PyTorch native float8 training UX

Any quality benchmarks? #318

Closed tsengalb99 closed 3 months ago

tsengalb99 commented 4 months ago

Do you have any benchmarks measuring quality degradation from using fp8 instead of bf16 mixed precision?

vkuzo commented 3 months ago

Hi @tsengalb99, I think this is the best list of accuracy benchmarks for float8: https://github.com/NVIDIA/TransformerEngine?tab=readme-ov-file#fp8-convergence . This repo does the same thing mathematically as TransformerEngine, so those results should be representative.

We also just moved the code to https://github.com/pytorch/ao/tree/main/torchao/float8, so please feel free to continue the discussion there!
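
For anyone who wants to compare fp8 and bf16 on their own model, a minimal sketch of the kind of comparison involved is below. It assumes the `convert_to_float8_training` entry point in `torchao.float8` (the new location mentioned above) and a CUDA GPU with float8 support such as H100; check the torchao README for the current API, as this is an illustrative sketch rather than an official recipe.

```python
import copy

import torch
import torch.nn as nn

# Assumed entry point from the new torchao location; see the torchao
# README for the current API and hardware requirements (e.g. H100).
from torchao.float8 import convert_to_float8_training

# Toy model; real quality benchmarks compare full training convergence
# (loss curves, eval metrics) in bf16 vs. float8, as in the
# TransformerEngine results linked above.
model_bf16 = nn.Sequential(
    nn.Linear(1024, 4096),
    nn.ReLU(),
    nn.Linear(4096, 1024),
).to("cuda", dtype=torch.bfloat16)

# Copy the model and swap its nn.Linear layers for float8 training linears.
model_fp8 = copy.deepcopy(model_bf16)
convert_to_float8_training(model_fp8)

# Run the same batch through both variants as a rough numerical sanity
# check; this is not a substitute for measuring end-to-end convergence.
x = torch.randn(16, 1024, device="cuda", dtype=torch.bfloat16)
with torch.no_grad():
    out_bf16 = model_bf16(x)
    out_fp8 = model_fp8(x)
print("max abs diff:", (out_bf16 - out_fp8).abs().max().item())
```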