open-mmlab / mmdetection

OpenMMLab Detection Toolbox and Benchmark
https://mmdetection.readthedocs.io
Apache License 2.0

Am confused about feature maps on the FPN #614

Closed: searchliu333 closed this issue 5 years ago

searchliu333 commented 5 years ago

In the config files:

```python
neck=dict(
    type='FPN',
    in_channels=[256, 512, 1024, 2048],
    out_channels=256,
    num_outs=5),
rpn_head=dict(
    type='RPNHead',
    in_channels=256,
    feat_channels=256,
    anchor_scales=[8],
    anchor_ratios=[0.5, 1.0, 2.0],
    anchor_strides=[4, 8, 16, 32, 64],
    target_means=[.0, .0, .0, .0],
    target_stds=[1.0, 1.0, 1.0, 1.0],
    use_sigmoid_cls=True),
bbox_roi_extractor=dict(
    type='SingleRoIExtractor',
    roi_layer=dict(type='RoIAlign', out_size=7, sample_num=2),
    out_channels=256,
    featmap_strides=[4, 8, 16, 32]),
```

Question: in the `FPN` neck, `num_outs=5` indicates that 5 feature maps are produced; in `RPNHead`, `anchor_strides=[4, 8, 16, 32, 64]` indicates that anchors are generated on all 5 feature maps; however, in `SingleRoIExtractor`, `featmap_strides=[4, 8, 16, 32]` suggests that RoIs are extracted from only 4 feature maps. Why not use all 5 feature maps for extracting RoI features? Thank you very much!
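For context on why `num_outs=5` yields one more output than `in_channels` has entries: with `in_channels=[256, 512, 1024, 2048]` the FPN receives 4 backbone stages (strides 4, 8, 16, 32), and when no extra convolutions are configured the additional level is, as far as I understand the default FPN neck, obtained by max-pooling the stride-32 output, which gives the stride-64 map that `anchor_strides` ends with. A minimal sketch with illustrative tensor sizes (not the exact mmdetection code):

```python
import torch
import torch.nn.functional as F

# Pretend these are the 4 FPN outputs after lateral/output convs (all 256 channels),
# corresponding to strides 4, 8, 16, 32 on an 800x800-ish input.
p2 = torch.randn(1, 256, 200, 200)  # stride 4
p3 = torch.randn(1, 256, 100, 100)  # stride 8
p4 = torch.randn(1, 256, 50, 50)    # stride 16
p5 = torch.randn(1, 256, 25, 25)    # stride 32
outs = [p2, p3, p4, p5]

# The 5th level (stride 64) is produced by downsampling the coarsest map;
# only the RPN consumes it (anchor_strides=[4, 8, 16, 32, 64]).
outs.append(F.max_pool2d(outs[-1], 1, stride=2))

print([o.shape[-1] for o in outs])  # [200, 100, 50, 25, 13]
```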

thangvubk commented 5 years ago

RPNHead does not use RoIPooling (or RoIAlign), so it can use an extra level (P6, of stride 64) to capture very large boxes. Beyond that, it is largely a design choice; if you use 5 levels for SingleRoIExtractor, I believe it would still work.
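To make the level assignment concrete: SingleRoIExtractor maps each proposal to one of the 4 pooling levels (strides 4 to 32) based on its scale, following the FPN paper's heuristic. Below is a simplified sketch assuming the `finest_scale=56` default used in mmdetection; the real extractor also carries a batch-index column in `rois`, which is omitted here:

```python
import torch

def map_roi_levels(rois, num_levels, finest_scale=56):
    """Assign each RoI (x1, y1, x2, y2) to a pyramid level in [0, num_levels - 1].

    Small boxes go to the fine level (stride 4), large boxes to the coarse
    level (stride 32); the extra stride-64 map is never used for RoI pooling.
    """
    scale = torch.sqrt(
        (rois[:, 2] - rois[:, 0] + 1) * (rois[:, 3] - rois[:, 1] + 1))
    target_lvls = torch.floor(torch.log2(scale / finest_scale + 1e-6))
    return target_lvls.clamp(min=0, max=num_levels - 1).long()

# Example: a 32x32 box lands on level 0 (stride 4),
# while a 500x500 box lands on level 3 (stride 32).
rois = torch.tensor([[0., 0., 32., 32.], [0., 0., 500., 500.]])
print(map_roi_levels(rois, num_levels=4))  # tensor([0, 3])
```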

searchliu333 commented 5 years ago

This method first trains the RPN on the 5 multi-level feature maps (strides 4, 8, 16, 32, 64), and then extracts RoI features from those multi-level feature maps.
So the RPN is trained on 5 feature maps, but RoI features are extracted from only 4 of them. This seems like a waste of computation. Why?

searchliu333 commented 5 years ago

> RPNHead does not use RoIPooling (or RoIAlign), so it can use an extra level (P6, of stride 64) to capture very large boxes. Beyond that, it is largely a design choice; if you use 5 levels for SingleRoIExtractor, I believe it would still work.

Almost forgot, thank you for your reply!