Closed: ShoufaChen closed this issue 2 years ago.
Hello,

Why didn't you use the PyTorch default `torch.cuda.amp` for mixed precision? What's the difference between `torch.cuda.amp` and your `convert_module_to_f16`?

Thanks in advance.
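For context, here is roughly what a helper named `convert_module_to_f16` appears to do: cast the weights of compute-heavy layers to half precision in place. This is a minimal sketch inferred from the function name, not necessarily this repo's exact implementation:

```python
import torch.nn as nn

def convert_module_to_f16(module):
    # Sketch: cast weights/biases of compute-heavy layers to float16.
    # Normalization layers are deliberately left in float32, since
    # they are a common source of fp16 numerical instability.
    if isinstance(module, (nn.Conv1d, nn.Conv2d, nn.Conv3d, nn.Linear)):
        module.weight.data = module.weight.data.half()
        if module.bias is not None:
            module.bias.data = module.bias.data.half()

# Applied recursively over every submodule:
# model.apply(convert_module_to_f16)
```

Unlike `torch.cuda.amp`, this approach stores the weights themselves in fp16 and leaves loss scaling to the training loop, rather than keeping fp32 master weights and autocasting individual ops.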
At the time we worked on this project, `torch.cuda.amp` was not yet in a stable state.
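For comparison, the now-standard `torch.cuda.amp` training step looks like this. A minimal sketch with a hypothetical model, optimizer, and data purely for illustration:

```python
import torch
from torch.cuda.amp import autocast, GradScaler

# Hypothetical model/optimizer/loss for illustration only.
model = torch.nn.Linear(16, 1).cuda()
optimizer = torch.optim.SGD(model.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()
scaler = GradScaler()  # scales the loss to avoid fp16 gradient underflow

x = torch.randn(8, 16, device="cuda")
y = torch.randn(8, 1, device="cuda")

optimizer.zero_grad()
with autocast():  # each op runs in fp16 or fp32 as autocast decides
    loss = loss_fn(model(x), y)
scaler.scale(loss).backward()
scaler.step(optimizer)  # unscales grads; skips the step on inf/NaN
scaler.update()
```

Here the fp32 master weights are kept by the optimizer and only the forward ops run in reduced precision, whereas the manual `convert_module_to_f16` route keeps the weights themselves in fp16.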