NVIDIA / apex

A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
BSD 3-Clause "New" or "Revised" License

FusedAdam bias_correction argument incompatible with torch.optim.Adam #577

Open unendin opened 5 years ago

unendin commented 5 years ago

The docs suggest that 'apex.optimizers.FusedAdam may be used as a drop-in replacement for torch.optim.Adam'. However, when I load an optimizer state_dict generated by Adam and then resume training, I get:

```
File "/opt/conda/lib/python3.7/site-packages/apex/optimizers/fused_adam.py", line 105, in step
    bias_correction = 1 if group['bias_correction'] else 0
KeyError: 'bias_correction'
```
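For anyone resuming an Adam checkpoint with FusedAdam, a minimal workaround sketch is shown below. It assumes apex is installed, that the default bias correction (enabled) is what you want, and that `"checkpoint.pt"` is a stand-in for your real checkpoint path. Because `load_state_dict` replaces the param groups with the ones saved by `torch.optim.Adam`, which do not carry `bias_correction`, the idea is to patch the missing key back in before calling `step()`:

```python
import torch
from apex.optimizers import FusedAdam

# A toy model; FusedAdam requires parameters on a CUDA device.
model = torch.nn.Linear(10, 10).cuda()

# Hypothetical checkpoint saved earlier while training with torch.optim.Adam.
checkpoint = torch.load("checkpoint.pt")

optimizer = FusedAdam(model.parameters(), lr=1e-3)
optimizer.load_state_dict(checkpoint["optimizer"])

# torch.optim.Adam param groups do not contain 'bias_correction', so the loaded
# groups lack the key FusedAdam.step() expects. Re-insert it (1 = enabled).
for group in optimizer.param_groups:
    group.setdefault("bias_correction", 1)
```

This is only a sketch of one way around the KeyError, not an official fix; a proper drop-in replacement would presumably handle the missing key inside FusedAdam itself.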

ddoron9 commented 2 years ago

I got the same error while training fairseq. In my case it was a checkpoint issue: I had further pretrained a checkpoint from wav2vec_small.pt and then tried to use it on another machine. On the new machine, pretraining worked fine either from wav2vec_small.pt or from scratch; only the checkpoint I had further pretrained triggered this error.