NVIDIA / apex

A PyTorch Extension: Tools for easy mixed precision and distributed training in PyTorch
BSD 3-Clause "New" or "Revised" License

Add type hints to distributed Adam optimizer #1699

Closed timmoon10 closed 11 months ago

timmoon10 commented 12 months ago

This PR makes some minor stylistic changes to the distributed Adam optimizer:

None of these changes should affect functionality.
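The kind of change described here can be illustrated with a minimal sketch. This is a hypothetical example, not code from apex's distributed Adam optimizer: it shows how adding type hints to an existing function is purely stylistic, since Python annotations are not enforced at runtime and leave behavior unchanged.

```python
from typing import Iterable, List, Optional

# Hypothetical helper (not from apex): a function before this style of change
# might take untyped arguments; after, it carries annotations that document
# the expected types without altering what the function does.
def scale_values(values: Iterable[float], factor: Optional[float] = None) -> List[float]:
    """Scale each value by `factor`; return the values unchanged if factor is None."""
    if factor is None:
        return list(values)
    return [v * factor for v in values]
```

Because annotations are only metadata, `scale_values([1.0, 2.0], 2.0)` behaves identically with or without the hints, which is consistent with the claim that none of the changes affect functionality.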