When I run training, Lightning warns about precision=16 and recommends changing it to 16-mixed. After I switched to 16-mixed, two other warnings disappeared as well:

- the warning that some hyperparameters can't be pickled when saving a checkpoint, and
- the warning that the checkpoint was saved before the epoch ended, shown when resuming.

Is there any negative side effect of changing precision=16 to 16-mixed in configs/base/trainers/base.py?
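For context, the change is just swapping the precision value in the trainer config. Below is a minimal sketch; the actual contents of configs/base/trainers/base.py are not shown in this post, so the surrounding keys here are assumptions. In Lightning 2.x, precision=16 is kept for backward compatibility and is interpreted as "16-mixed" (automatic mixed precision) anyway, which is why the warning suggests the rename:

```python
# Hypothetical sketch of configs/base/trainers/base.py -- the real file may differ.
# Lightning 2.x treats precision=16 as "16-mixed" internally, so this rename
# silences the deprecation warning without changing training behavior.
trainer = dict(
    accelerator="gpu",     # assumed setting, not from the original config
    devices=1,             # assumed setting, not from the original config
    # precision=16,        # old value: triggers the deprecation warning
    precision="16-mixed",  # new value: automatic mixed precision (AMP)
)
```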