CherrPeac closed this issue 1 year ago.
I am so stupid. It can be updated through backpropagation.
Hi @CherrPeac,
Yes, the factor can be updated through backpropagation. Note, however, that although learning it can sometimes improve final performance, updating it can also make training unstable. For this reason we currently freeze it by default (i.e., fix the factor to 1.0) in the code.
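For concreteness, here is a minimal PyTorch sketch of a scale layer with the freezing behavior described above. This is an illustrative assumption, not the repo's actual `ConcatFusionFactor` implementation, which may differ:

```python
import torch
import torch.nn as nn

class ScaleLayer(nn.Module):
    """Multiplies its input by a single scalar factor.

    Hypothetical sketch; the repo's ConcatFusionFactor may be
    implemented differently.
    """
    def __init__(self, init_value: float = 1.0, learnable: bool = False):
        super().__init__()
        # Frozen by default (requires_grad=False), so the factor stays at
        # its initial value of 1.0, matching the behavior described above.
        # Setting learnable=True lets backpropagation update it.
        self.scale = nn.Parameter(torch.tensor(init_value),
                                  requires_grad=learnable)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.scale * x
```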
Hello, while reading your paper, I saw that you wrote in the methodology section: "Suppose that there are two layers C1 and C2 in the bottom-up pathway from shallow to deep, and two layers P1 and P2 in the top-down pathway corresponded. In the original FPN, P1 = C1 + upsample(P2); after we add the scale layer, it turns to be P1 = C1 + α × upsample(P2), where α is a learnable parameter." I guess this refers to the ConcatFusionFactor module, but in that module the factor is 1.0 and does not seem to change. Does it default to 1.0 and get updated elsewhere?
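For reference, a self-contained sketch of the fusion the quoted passage describes, assuming PyTorch and hypothetical feature shapes (the channel and spatial sizes below are illustrative, not taken from the repo):

```python
import torch
import torch.nn.functional as F

# Hypothetical feature shapes: C1 is the shallower, higher-resolution level.
C1 = torch.randn(1, 256, 64, 64)   # lateral feature from the bottom-up pathway
P2 = torch.randn(1, 256, 32, 32)   # deeper level of the top-down pathway

# P1 = C1 + alpha * upsample(P2); alpha is the frozen default of 1.0 here,
# and would be an nn.Parameter if made learnable.
alpha = torch.tensor(1.0)
P1 = C1 + alpha * F.interpolate(P2, scale_factor=2, mode="nearest")
print(P1.shape)  # torch.Size([1, 256, 64, 64])
```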