StrongChris opened 1 year ago
Currently only F32 is supported.
It would be good to support all datatypes, especially uint16, f16, bf16, complex64, and complex128.
uint16 isn't currently supported by PyTorch; I've added a note in support of its inclusion in https://github.com/pytorch/pytorch/issues/58734
More float and (u)int types are now supported. Complex datatypes are not yet supported.
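As a rough sketch of what a serializer has to handle per dtype, the example below uses NumPy (an assumption — the thread itself concerns PyTorch) to show the fixed byte width and raw byte layout of several of the requested types. Note that NumPy covers uint16, f16, and the complex types natively, but has no built-in bf16, which is one reason bf16 support typically needs PyTorch or a helper library.

```python
import numpy as np

# Each dtype a serializer supports maps to a fixed per-element byte width.
# bfloat16 is absent here: NumPy has no native bf16 dtype.
for dtype in (np.uint16, np.float16, np.float32, np.complex64, np.complex128):
    arr = np.arange(4, dtype=dtype)
    # name, bytes per element, and the raw little-endian buffer a file
    # format would store for these four values
    print(np.dtype(dtype).name, arr.itemsize, arr.tobytes().hex())
```

Running this makes the size differences concrete: complex128 elements are 16 bytes each, eight times the width of an f16 element, so the dtype tag in the header is what lets a reader reinterpret the flat byte buffer correctly.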