Open junxnone opened 1 year ago
| Precision | C/C++ type | TensorFlow | PyTorch | Notes |
| --- | --- | --- | --- | --- |
| FP32 | `float` | `tf.float32` | `torch.float32` / `torch.float` | |
| TF32 | - | - | - | TensorFloat-32: an NVIDIA GPU compute mode, not a storage dtype |
| FP16 | - | `tf.float16` | `torch.float16` / `torch.half` | "half precision"; C has no standard `short float` type |
| BF16 | - | `tf.bfloat16` | `torch.bfloat16` | bf16 vs fp16 vs fp32 |
| FP64 | `double` | `tf.float64` | `torch.float64` / `torch.double` | |
| FP80 | `long double` | - | - | x86 extended precision |
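To make the trade-offs above concrete, here is a small sketch using NumPy. It prints the bit width and machine epsilon of the IEEE formats NumPy ships (NumPy has no native bfloat16 or TF32 dtype; those live in the ML frameworks), and it emulates bfloat16 by truncating a float32 to its top 16 bits, which is the key point of BF16: it keeps FP32's 8 exponent bits (same dynamic range) but only 7 mantissa bits (much coarser precision). The helper name `to_bfloat16` is mine, not a library API.

```python
import struct
import numpy as np

# Bit width, machine epsilon, and max value of each IEEE format.
for dt in (np.float16, np.float32, np.float64):
    info = np.finfo(dt)
    print(f"{np.dtype(dt).name}: {info.bits} bits, eps={info.eps}, max={info.max}")

def to_bfloat16(x: float) -> float:
    """Emulate bfloat16 by zeroing the low 16 bits of a float32.

    bfloat16 is the top half of a float32: same sign bit and 8 exponent
    bits, but only 7 mantissa bits instead of 23.
    """
    bits = struct.unpack("<I", struct.pack("<f", np.float32(x)))[0]
    return struct.unpack("<f", struct.pack("<I", bits & 0xFFFF_0000))[0]

# Precision is coarse, but the FP32 dynamic range is preserved:
print(to_bfloat16(3.14159265))   # -> 3.140625
print(to_bfloat16(1e30))         # still representable; FP16 would overflow here
```

Note the last line: 1e30 survives the bfloat16 round-trip, while `np.float16(1e30)` overflows to `inf`, which is exactly why BF16 is preferred over FP16 for training without loss scaling.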