pytorch / FBGEMM

FB (Facebook) + GEMM (General Matrix-Matrix Multiplication) - https://code.fb.com/ml-applications/fbgemm/

[Question] Is there FP8 embedding support for training? #2920

Closed · ShijieZZZZ closed this issue 1 month ago

ShijieZZZZ commented 1 month ago

Hello. Do the classes in `split_table_batched_embeddings_ops_training.py` support `SparseType.FP8` for training?

For example, can `SplitTableBatchedEmbeddingBagsCodegen` be constructed with `weights_precision=SparseType.FP8`?
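
For reference, a minimal sketch of the construction being asked about, assuming a CUDA build of `fbgemm_gpu`; the table sizes are illustrative, and module paths may differ between releases:

```python
from fbgemm_gpu.split_embedding_configs import SparseType
from fbgemm_gpu.split_table_batched_embeddings_ops_common import EmbeddingLocation
from fbgemm_gpu.split_table_batched_embeddings_ops_training import (
    ComputeDevice,
    SplitTableBatchedEmbeddingBagsCodegen,
)

# One embedding table: (num_embeddings, embedding_dim, location, compute_device).
emb = SplitTableBatchedEmbeddingBagsCodegen(
    embedding_specs=[
        (100_000, 128, EmbeddingLocation.DEVICE, ComputeDevice.CUDA),
    ],
    # FP16 (or the FP32 default) works for training; the question is whether
    # this argument also accepts SparseType.FP8 for training. Per the
    # maintainer's reply below, it does not (as of this thread).
    weights_precision=SparseType.FP16,
)
```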

sryap commented 1 month ago

Hi @ShijieZZZZ: no, we don't support it at the moment.

ShijieZZZZ commented 1 month ago

Hello @sryap, thanks for your reply. Will there be FP8 embedding support for training in the near future (e.g., in 2024)?

sryap commented 1 month ago

Yes, we plan to add this support within this half.