open-mmlab / mmrotate

OpenMMLab Rotated Object Detection Toolbox and Benchmark
https://mmrotate.readthedocs.io/en/latest/
Apache License 2.0

How to modify num_levels? #787

Open hh123445 opened 1 year ago

hh123445 commented 1 year ago

What's the feature?

When using rotated RetinaNet, I want to remove the two extra feature levels (P6 and P7) from the FPN. I modified the config file accordingly, but RetinaNet still seems to default to five feature levels for classification and regression instead of three, so I get the following error:

  File "//mmrotate-main-0_32_old/mmrotate/models/dense_heads/rotated_anchor_head.py", line 472, in loss
    assert len(featmap_sizes) == self.anchor_generator.num_levels
AssertionError

Here `len(featmap_sizes)` is 3 while `self.anchor_generator.num_levels` is 5. How should I modify the config file so that `self.anchor_generator.num_levels` becomes 3?
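For context, the FPN change described above would look roughly like this in the config. This is a sketch based on the standard `rotated_retinanet_obb_r50_fpn` config; the exact field values in the original config may differ:

```python
# Sketch of the neck section after dropping P6/P7 (assumed config layout).
neck = dict(
    type='FPN',
    in_channels=[256, 512, 1024, 2048],  # ResNet-50 stage outputs
    out_channels=256,
    start_level=1,             # start from C3, giving P3-P5
    num_outs=3)                # was 5; extra-conv levels P6/P7 removed
```

With `num_outs=3` the neck emits three feature maps, which is why `len(featmap_sizes)` becomes 3 in the head.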

Any other context?

No response

zytx121 commented 1 year ago

Hi @hh123445, you can try changing `anchor_generator.strides` from `[8, 16, 32, 64, 128]` to `[8, 16, 32]`.
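In config terms, the suggested change would look roughly like this. This is a sketch assuming the standard `bbox_head` section of the rotated RetinaNet config; the number of strides determines `anchor_generator.num_levels`, so it must match the neck's `num_outs`:

```python
# Sketch of the bbox_head anchor_generator after the change
# (field names assumed from the standard rotated RetinaNet config).
bbox_head = dict(
    type='RotatedRetinaHead',
    num_classes=15,
    in_channels=256,
    anchor_generator=dict(
        type='RotatedAnchorGenerator',
        octave_base_scale=4,
        scales_per_octave=3,
        ratios=[1.0, 0.5, 2.0],
        strides=[8, 16, 32]))  # was [8, 16, 32, 64, 128]; 3 strides -> num_levels == 3
```

One stride per feature level is required, so trimming the list to three entries makes the assertion `len(featmap_sizes) == self.anchor_generator.num_levels` pass.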