For example, this pattern is not guaranteed to work:
import torch.optim
...
torch.optim._multi_tensor.Adam # may fail to resolve _multi_tensor
And this is guaranteed to work:
import torch.optim._multi_tensor
...
torch.optim._multi_tensor.Adam # will always work
A recent change to PyTorch changed module initialization logic in a way that exposed this issue.
But the code has been working for years? That is the nature of undefined behavior: any change in the environment (in this case, the PyTorch code base) can make it fail.
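The same failure mode can be reproduced without PyTorch, using the stdlib package `concurrent` as a stand-in for `torch.optim._multi_tensor` (a sketch; each snippet runs in a fresh interpreter so no earlier import masks the behavior):

```python
import subprocess
import sys

# 'import concurrent' alone does not bind the 'futures' submodule on the
# parent package, so the attribute lookup is not guaranteed to resolve --
# analogous to accessing torch.optim._multi_tensor after 'import torch.optim'.
bad = subprocess.run(
    [sys.executable, "-c", "import concurrent; concurrent.futures"],
    capture_output=True,
    text=True,
)
print(bad.returncode)  # non-zero: AttributeError on 'concurrent.futures'

# Explicitly importing the submodule always binds it on the parent package.
good = subprocess.run(
    [sys.executable, "-c",
     "import concurrent.futures; concurrent.futures.ThreadPoolExecutor"],
    capture_output=True,
    text=True,
)
print(good.returncode)  # zero: the attribute always resolves
```

Whether the lazy pattern happens to work depends on whether some other module already imported the submodule, which is exactly why it can pass for years and then break.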
Summary: Lazy import changes
This comes down to Python import semantics, specifically the initialization of packages/modules; see: https://www.internalfb.com/intern/wiki/Python/Cinder/Onboarding/Tutorial/Lazy_Imports/Troubleshooting/
Differential Revision: D58881291