Closed Pixie8888 closed 8 months ago
Hi,
I notice the function force_fp32 in softgroup/util/fp16.py. Why do you need to force it to be fp32 when training with amp?
It is implemented for mixed-precision training, but it is currently not used.
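For context, a `force_fp32`-style decorator typically exists because some numerically sensitive ops (losses, normalization, etc.) can overflow or lose precision in fp16 under AMP, so they are forced back to full precision. Below is a minimal sketch of the idea, assuming PyTorch autocast; the names and details here are illustrative, not the repo's exact implementation:

```python
import functools
import torch


def force_fp32(func):
    """Run `func` in full fp32 even when called inside an autocast region.

    Sketch only: casts tensor arguments to float32 and disables autocast,
    so numerically sensitive ops keep full precision under AMP.
    """
    @functools.wraps(func)
    def wrapper(*args, **kwargs):
        # Disable autocast so ops inside run with the dtypes we pass in.
        with torch.autocast("cuda", enabled=False):
            args = tuple(a.float() if torch.is_tensor(a) else a for a in args)
            kwargs = {k: (v.float() if torch.is_tensor(v) else v)
                      for k, v in kwargs.items()}
            return func(*args, **kwargs)
    return wrapper


@force_fp32
def stable_loss(pred, target):
    # Example of an op you might want in fp32 under mixed precision.
    return torch.nn.functional.mse_loss(pred, target)
```

In training code that does not enable AMP, the decorator is effectively a no-op, which matches the comment above that the function is not actually exercised at the moment.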
Thanks