megvii-research / AnchorDETR

An official implementation of the Anchor DETR.

Your module has parameters that were not used in producing a loss. #6

Closed zongdaoming closed 3 years ago

zongdaoming commented 3 years ago

RuntimeError: Expected to have finished reduction in the prior iteration before starting a new one. This error indicates that your module has parameters that were not used in producing a loss. You can enable unused parameter detection by (1) passing the keyword argument find_unused_parameters=True to torch.nn.parallel.DistributedDataParallel; (2) making sure all forward function outputs participate in calculating loss. If you already have done the above two steps, then the distributed data-parallel module wasn't able to locate the output tensors in the return value of your module's forward function. Please include the loss function and the structure of the return value of forward of your module when reporting this issue (e.g. list, dict, iterable).
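For reference, option (1) from the error message can be sketched as below. This is a minimal, hypothetical example (the `Net` module and its `unused` branch are illustrative, not AnchorDETR code): a parameter that never participates in the forward pass would normally trigger this exact `RuntimeError` under DDP, and passing `find_unused_parameters=True` lets DDP skip it during gradient reduction.

```python
import os
import torch
import torch.nn as nn
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP

# Single-process "gloo" group, just enough to construct a DDP module.
os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
os.environ.setdefault("MASTER_PORT", "29500")
dist.init_process_group("gloo", rank=0, world_size=1)

class Net(nn.Module):
    """Hypothetical module with a parameter unused in forward()."""
    def __init__(self):
        super().__init__()
        self.used = nn.Linear(4, 4)
        self.unused = nn.Linear(4, 4)  # never called -> would trigger the error

    def forward(self, x):
        return self.used(x)  # self.unused produces no gradient

# Option (1): tell DDP to detect and skip parameters unused in the loss.
model = DDP(Net(), find_unused_parameters=True)
loss = model(torch.randn(2, 4)).sum()
loss.backward()  # succeeds despite self.unused having no grad

dist.destroy_process_group()
```

Note that `find_unused_parameters=True` adds overhead (an extra traversal of the autograd graph each iteration), so removing the unused parameters from the model, as done in the fix below, is the cleaner solution.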

tangjiuqi097 commented 3 years ago

@zongdaoming Thanks for the report. You can delete this line and these lines to solve it for now. I will update the code and the trained models in the near future.

zongdaoming commented 3 years ago

Thanks a lot. When I delete this line, your code works. Thanks again for helping me out with this problem. Nice work!

tangjiuqi097 commented 3 years ago

The code and the retrained models have now been updated.