joe-siyuan-qiao / DetectoRS

DetectoRS: Detecting Objects with Recursive Feature Pyramid and Switchable Atrous Convolution
Apache License 2.0

About ASPP Module #72

Open mhyeonsoo opened 4 years ago

mhyeonsoo commented 4 years ago

Hi,

Thanks for the great code. I am now trying to use it with my own dataset, and since I need TensorFlow as my framework, I am reimplementing the code based on TF 2.1.

It may be a bit awkward to ask here, but I think this is a question about the module mechanism itself.

In the ASPP module class, I saw that the ReLU layer comes after global average pooling. When I ran it, I got a dimension mismatch between the branch for the '(aspp_idx == self.aspp_num - 1)' case and the other branches: the other branches produced outputs shaped something like (None, 128, 128, 64), while the '(aspp_idx == self.aspp_num - 1)' branch produced something like shape=(None, 256).

Is there any reason for using GAP before the ReLU? And if so, how can I fix this error caused by the dimension mismatch?

Thanks a lot!

joe-siyuan-qiao commented 4 years ago

Hi, the ASPP module was proposed in DeepLab, and it includes a global average pooling layer. Please check the TF implementation you have; it seems the number of channels is not set correctly for the GAP layer.
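To illustrate the shape issue, here is a minimal NumPy sketch (not the repo's actual code; the function name and channel counts are made up for illustration). The last ASPP branch applies global average pooling, which collapses (N, H, W, C) down to (N, C); before its output can be combined with the atrous-convolution branches, it has to be broadcast (or upsampled) back to the spatial size, and its channel count has to match the other branches:

```python
import numpy as np

def aspp_branches(x, aspp_num=4):
    """Illustrative shape walk-through of an ASPP-style module (NHWC).

    The first aspp_num - 1 branches stand in for atrous convolutions
    that preserve the spatial size; the last branch is global average
    pooling (GAP).
    """
    n, h, w, c = x.shape
    outputs = []
    for aspp_idx in range(aspp_num):
        if aspp_idx == aspp_num - 1:
            # GAP branch: collapses H and W -> shape (N, C)
            out = x.mean(axis=(1, 2))
            # Without restoring the spatial dims, this cannot be
            # combined with the conv branches. Broadcast back to
            # (N, H, W, C) so every branch has the same shape.
            out = np.broadcast_to(out[:, None, None, :], (n, h, w, c))
        else:
            # Stand-in for an atrous conv that keeps (H, W) and C
            out = x
        outputs.append(out)
    # All branches now share one shape and can be summed or concatenated
    return sum(outputs)

x = np.zeros((2, 128, 128, 64), dtype=np.float32)
y = aspp_branches(x)
print(y.shape)  # (2, 128, 128, 64)
```

If the GAP branch in your TF port skips this broadcast step, or its 1x1 conv outputs a different channel count than the other branches (e.g. 256 vs. 64), you get exactly the (None, 256) vs. (None, 128, 128, 64) mismatch described above.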