Closed bowenc0221 closed 6 years ago
Hi @roytseng-tw,

Thanks for the nice work. I have a question about the FPN code: why is there no ReLU activation in the post-hoc scale-specific 3x3 convolutions, nor in the lateral connections?

First of all, there is no ReLU there in Detectron's FPN implementation either. And this exact question is already answered in the paper:

> There are no nonlinearities in these extra layers, which we have empirically found to have minor impacts.

Thanks!
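To make the discussion concrete, here is a minimal sketch of the top-down pathway in question, assuming PyTorch; the module and attribute names (`FPNTopDown`, `lateral_convs`, `posthoc_convs`) are illustrative and not taken from this repo. Note that both the 1x1 lateral convolutions and the post-hoc 3x3 convolutions are purely linear, matching the paper and Detectron:

```python
# Minimal FPN top-down sketch (illustrative names, not from this repo).
import torch
import torch.nn as nn
import torch.nn.functional as F

class FPNTopDown(nn.Module):
    def __init__(self, in_channels_list, out_channels=256):
        super().__init__()
        # 1x1 lateral connections: purely linear, no ReLU, per the FPN paper.
        self.lateral_convs = nn.ModuleList(
            nn.Conv2d(c, out_channels, kernel_size=1) for c in in_channels_list
        )
        # Post-hoc 3x3 convolutions to reduce the aliasing effect of
        # upsampling: again no activation in between.
        self.posthoc_convs = nn.ModuleList(
            nn.Conv2d(out_channels, out_channels, kernel_size=3, padding=1)
            for _ in in_channels_list
        )

    def forward(self, features):
        # `features` are backbone outputs ordered coarsest-first (e.g. C5..C2).
        laterals = [conv(f) for conv, f in zip(self.lateral_convs, features)]
        outputs = [laterals[0]]
        for lat in laterals[1:]:
            # Upsample the coarser map and merge by element-wise addition.
            top_down = F.interpolate(outputs[-1], size=lat.shape[-2:], mode="nearest")
            outputs.append(lat + top_down)
        # No ReLU anywhere in these extra layers; each merged map is smoothed
        # by its 3x3 conv and then consumed directly by a detection head.
        return [conv(p) for conv, p in zip(self.posthoc_convs, outputs)]
```

Feeding this dummy backbone features (e.g. shapes `[1, 2048, 8, 8]`, `[1, 1024, 16, 16]`, `[1, 512, 32, 32]`) yields one 256-channel map per level, with no nonlinearity applied in the extra layers, which is exactly the behavior the paper reports as having only minor impact.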