Or you can just change the dtype of the anchors in the add_visibility_to() function in /modeling/rrpn/anchor_generator.py. It works for my PyTorch 1.4 install, and the NaN loss disappears.
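Below is a minimal, self-contained sketch of the kind of dtype change being described. The function name `visibility_mask`, the tensor layout, and the `straddle_thresh` parameter are illustrative assumptions; the real `add_visibility_to()` operates on the repo's BoxList objects, so adapt the idea rather than copying the code.

```python
import torch

# Illustrative sketch only: compute an anchor-visibility mask with explicit dtypes.
def visibility_mask(anchors: torch.Tensor, image_size, straddle_thresh: float = 0.0) -> torch.Tensor:
    """Return a boolean mask of anchors that lie inside the image (within straddle_thresh)."""
    image_width, image_height = image_size
    # Work in float32 so the comparisons and later box arithmetic stay in a
    # floating-point dtype (an integer/low-precision anchor tensor is one suspected NaN source).
    anchors = anchors.float()
    mask = (
        (anchors[:, 0] >= -straddle_thresh)
        & (anchors[:, 1] >= -straddle_thresh)
        & (anchors[:, 2] < image_width + straddle_thresh)
        & (anchors[:, 3] < image_height + straddle_thresh)
    )
    # On PyTorch >= 1.2 the comparison ops already return torch.bool; avoid casting
    # back to torch.uint8, which triggers deprecation warnings when used as an index.
    return mask

# Example usage
anchors = torch.tensor([[-5.0, 10.0, 50.0, 60.0], [0.0, 0.0, 30.0, 40.0]])
print(visibility_mask(anchors, image_size=(100, 100)))  # tensor([False,  True])
```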
@gdjmck Great!
@gdjmck Hi, your method works when not using FPN, but how should the code be changed to avoid this warning when using FPN? I changed some torch.uint8 usages in the related files, but the warning still exists.
Maybe you can do something like the snippet below to raise the warning as an exception, so you can see which line of code triggers it and deal with it.
import warnings
warnings.filterwarnings('error', 'xxx')
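As a concrete illustration of this debugging trick: the message pattern below is an assumption based on the uint8-indexing deprecation warning that PyTorch >= 1.2 prints; substitute whatever warning text your run actually shows.

```python
import warnings
import torch

# Escalate only the matching warning into an exception. The pattern is a regex
# matched against the start of the warning message (assumed wording, adjust as needed).
warnings.filterwarnings("error", message="indexing with dtype torch.uint8")

x = torch.arange(4)
mask = torch.tensor([1, 0, 1, 0], dtype=torch.uint8)  # deprecated mask dtype
try:
    print(x[mask])  # on the PyTorch versions discussed here, this warning is raised as an exception
except UserWarning as err:
    # In the real training run you would let it crash instead, so the traceback
    # points at the exact line in the FPN code path that still builds a uint8 mask.
    print("caught:", err)
```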
@gdjmck ok, thanks!
Solved. Make sure you have torch version 1.0.0.
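A quick way to check which version is installed (the exact wheel to install if you need to downgrade depends on your Python/CUDA setup):

```python
import torch
print(torch.__version__)  # expect "1.0.0" for the setup that worked here
```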