What's the feature?
When using rotated RetinaNet, I want to remove the two additional feature levels (P6 and P7) from the FPN. I made the following modifications to the config file. However, RetinaNet still seems to use 5 feature levels for classification and regression instead of 3, so I get the following error:

File "//mmrotate-main-0_32_old/mmrotate/models/dense_heads/rotated_anchor_head.py", line 472, in loss
    assert len(featmap_sizes) == self.anchor_generator.num_levels
AssertionError

Here len(featmap_sizes) is 3 while self.anchor_generator.num_levels is 5. How should I modify the config file to change self.anchor_generator.num_levels from 5 to 3?
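For context, since my full config is not shown above, here is a minimal sketch of the fields I believe are involved, based on the stock rotated_retinanet_obb_r50_fpn_1x_dota_le90 config (the exact values are assumptions, not my actual file). Since the anchor generator's num_levels is derived from the length of its strides list, I suspect strides has to be trimmed together with num_outs:

```python
# Sketch only: based on the stock rotated_retinanet_obb_r50_fpn_1x_dota_le90
# config, not my actual file; field values are assumptions.
model = dict(
    neck=dict(
        type='FPN',
        in_channels=[256, 512, 1024, 2048],
        out_channels=256,
        start_level=1,
        add_extra_convs=False,  # drop the extra convs that would build P6/P7
        num_outs=3),            # was 5: emit only P3-P5
    bbox_head=dict(
        anchor_generator=dict(
            type='RotatedAnchorGenerator',
            octave_base_scale=4,
            scales_per_octave=3,
            ratios=[1.0, 0.5, 2.0],
            # num_levels is len(strides), so this list has to match
            # num_outs: keep only the P3-P5 strides.
            strides=[8, 16, 32])))  # was [8, 16, 32, 64, 128]
```

If it helps, in MMDetection's AnchorGenerator the num_levels property is simply len(self.strides), which is why the assertion in rotated_anchor_head.py compares it against the number of FPN feature maps.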
Any other context?
No response