BaguaSys / bagua

Bagua Speeds up PyTorch
MIT License

do we need to check whether optimizers and model parameters are the same? #76

Closed by todo[bot] 3 years ago

todo[bot] commented 3 years ago

https://github.com/BaguaSys/bagua/blob/96cb6fe72dfcb2d0394e465291a31aff1f3e0142/bagua/torch_api/distributed.py#L182-L187
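
The TODO asks whether Bagua should verify that the parameters registered in the optimizer are the same objects as the model's parameters. A minimal sketch of such a consistency check (my own illustration, not Bagua's actual implementation; the function name `optimizer_covers_model` is hypothetical) could compare parameters by identity, the same way PyTorch optimizers track them:

```python
import torch
from torch import nn


def optimizer_covers_model(optimizer: torch.optim.Optimizer, model: nn.Module) -> bool:
    """Return True iff every trainable model parameter appears in some
    optimizer param group (compared by object identity)."""
    opt_params = {id(p) for group in optimizer.param_groups for p in group["params"]}
    return all(id(p) in opt_params for p in model.parameters() if p.requires_grad)


# Usage: a matching optimizer passes, one missing a parameter fails.
model = nn.Linear(4, 2)
opt_ok = torch.optim.SGD(model.parameters(), lr=0.1)
opt_partial = torch.optim.SGD([model.weight], lr=0.1)  # bias not registered
print(optimizer_covers_model(opt_ok, model))       # True
print(optimizer_covers_model(opt_partial, model))  # False
```

Identity comparison (`id(p)`) rather than value comparison matters here: two tensors with equal values are still distinct parameters, and a distributed wrapper that replaces or re-wraps parameters would silently break the optimizer's references.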


This issue was generated by todo based on a TODO comment in 96cb6fe72dfcb2d0394e465291a31aff1f3e0142 when #24 was merged. cc @BaguaSys.
close-issue-app[bot] commented 3 years ago

This issue was closed because it does not follow our issue template. Please read the template.

todo[bot] commented 3 years ago

This issue has been reopened because the TODO comment still exists in bagua/torch_api/distributed.py, as of 6f62a8468b0e88c59447564fe0e9045d795f49a7.


If this was not intentional, simply remove the TODO comment from your code. You can also set the reopenClosed config option if you don't want issues to be reopened automatically.