from torchdrug import data as td
from torch.distributed.optim import DistributedOptimizer
Error:
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    from torch.distributed.optim import DistributedOptimizer
  File "/opt/conda/envs/tankbind_py38/lib/python3.8/site-packages/torch/distributed/optim/__init__.py", line 28, in <module>
    from .zero_redundancy_optimizer import ZeroRedundancyOptimizer
  File "/opt/conda/envs/tankbind_py38/lib/python3.8/site-packages/torch/distributed/optim/zero_redundancy_optimizer.py", line 267, in <module>
    class ZeroRedundancyOptimizer(Optimizer, Joinable):
TypeError: metaclass conflict: the metaclass of a derived class must be a (non-strict) subclass of the metaclasses of all its bases
I have no idea why this happens, but everything works properly if I swap the order of the two import lines.
Hi! This may result from the patch that torchdrug applies to the optimizer. An easy fix is to import this optimizer before importing torchdrug. I will see if I can find a better solution for this.
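For anyone curious about the mechanism: Python requires the metaclass of a class to be a (non-strict) subclass of the metaclasses of all its bases. A sketch of how patching a base class can trigger the exact error above (all class names here are illustrative stand-ins, not torchdrug's or torch's actual code):

```python
import abc

class PlainBase:           # stands in for torch's Optimizer (metaclass: type)
    pass

class AbcBase(abc.ABC):    # stands in for Joinable (metaclass: abc.ABCMeta)
    pass

# Combining both bases is fine, because ABCMeta is a subclass of type:
class Works(PlainBase, AbcBase):
    pass

# But suppose a library patch swaps in a base with an unrelated metaclass:
class OtherMeta(type):
    pass

class PatchedBase(metaclass=OtherMeta):  # stands in for the patched Optimizer
    pass

# Now the same subclass definition fails, since neither OtherMeta nor
# ABCMeta is a subclass of the other:
try:
    class Breaks(PatchedBase, AbcBase):
        pass
    conflict = False
except TypeError as e:
    conflict = True
    print(e)  # prints the "metaclass conflict" message
```

This is why import order matters here: if `DistributedOptimizer` (and with it `ZeroRedundancyOptimizer`) is imported before torchdrug's patch is applied, the class definition sees the original, compatible base.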