hxdtest opened this issue 3 weeks ago
What is your question? In `python/cutlass/emit/pytorch.py`, is bfloat16 not supported?
From `python/cutlass/emit/pytorch.py`:

```python
_CUTLASS_TYPE_TO_TORCH_TYPE = {
    DataType.f16: "torch::kF16",
    DataType.f32: "torch::kF32",
    DataType.f64: "torch::kF64",
    DataType.s8: "torch::kI8",
    DataType.s32: "torch::kI32",
}
```
We simply haven't implemented it. We welcome contributions in this space.