Closed: fanq15 closed this issue 4 years ago
@fanq15 This is expected behavior. Please upgrade to the latest PyTorch and set `find_unused_parameters=True` for `DistributedDataParallel`. See the documentation here: https://pytorch.org/docs/stable/_modules/torch/nn/parallel/distributed.html.
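For reference, a minimal sketch of what the wrapper call might look like; the `model` and `local_rank` names are placeholders rather than identifiers from this repository:

```python
import torch
from torch.nn.parallel import DistributedDataParallel

# model and local_rank are assumed to come from the usual launcher /
# torch.distributed.init_process_group setup code.
model = model.cuda(local_rank)
model = DistributedDataParallel(
    model,
    device_ids=[local_rank],
    output_device=local_rank,
    # Allow parameters that receive no gradient in a forward pass
    # (e.g. a defined-but-unused conv layer) instead of raising an error.
    find_unused_parameters=True,
)
```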
Thank you very much!
I have a very weird bug. If I define a convolution layer in the FCOS head but never use it, the code gives me this error. I have never run into this in PyTorch before. I wonder whether it is caused by distributed training, i.e., whether every defined layer must be used in the forward function? I know how to work around it, but since it is so strange, I wanted to ask whether you have encountered this bug. Thank you!
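A minimal sketch of the situation described above (the module and layer names are illustrative, not taken from FCOS): the `unused_conv` below has parameters that never receive gradients, which is what `DistributedDataParallel` complains about during the backward pass unless `find_unused_parameters=True` is set.

```python
import torch.nn as nn

class ToyHead(nn.Module):
    def __init__(self):
        super().__init__()
        self.cls_conv = nn.Conv2d(256, 256, 3, padding=1)
        # Defined but never called in forward(); its parameters get no
        # gradients, which DDP flags by default during backward.
        self.unused_conv = nn.Conv2d(256, 256, 3, padding=1)

    def forward(self, x):
        return self.cls_conv(x)
```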