NVIDIA / cutlass

CUDA Templates for Linear Algebra Subroutines

[QST] Why doesn't _CUTLASS_TYPE_TO_TORCH_TYPE support torch.bfloat16? #1736

Open hxdtest opened 3 weeks ago

hxdtest commented 3 weeks ago

What is your question? In python/cutlass/emit/pytorch.py, why is bfloat16 not supported in _CUTLASS_TYPE_TO_TORCH_TYPE?

_CUTLASS_TYPE_TO_TORCH_TYPE = {
    DataType.f16: "torch::kF16",
    DataType.f32: "torch::kF32",
    DataType.f64: "torch::kF64",
    DataType.s8: "torch::I8",
    DataType.s32: "torch::I32",
}
jackkosaian commented 3 weeks ago

We simply haven't implemented it. We welcome contributions in this space.
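A minimal sketch of what such a contribution might look like (an assumption for illustration, not the implemented fix): extend the mapping with an entry for DataType.bf16, which the CUTLASS Python interface's DataType enum already defines. The string values name C++ dtype constants emitted into generated libtorch extension code; libtorch's constant for bfloat16 is torch::kBFloat16.

_CUTLASS_TYPE_TO_TORCH_TYPE = {
    DataType.f16: "torch::kF16",
    DataType.f32: "torch::kF32",
    DataType.f64: "torch::kF64",
    DataType.s8: "torch::I8",
    DataType.s32: "torch::I32",
    # Hypothetical addition: map CUTLASS bf16 to libtorch's bfloat16 constant.
    DataType.bf16: "torch::kBFloat16",
}

A complete contribution would also need to handle bf16 anywhere else the emitter checks supported dtypes, since this dictionary only controls the dtype string written into the generated C++ source.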