WindVChen / DRENet

The official implementation of DRENet (Degraded Reconstruction Enhancement Network) for tiny ship detection in remote sensing images
GNU General Public License v3.0

Problems in the ConcatFusionFactor of common.py #15

Closed · CherrPeac closed this issue 1 year ago

CherrPeac commented 1 year ago

Hello, while reading your paper, I noticed the following in the methodology: "Suppose that there are two layers C1 and C2 in the bottom-up pathway from shallow to deep, and two layers P1 and P2 in the top-down pathway corresponded. In the original FPN, P1 = C1 + upsample(P2); after we add the scale layer, it turns to be P1 = C1 + α× upsample(P2), where α is a learnable parameter." I guess this is implemented by the ConcatFusionFactor module, but in that module the factor is initialized to 1.0 and doesn't seem to change. Or does it default to 1.0 and get updated elsewhere?

import torch
import torch.nn as nn


class ConcatFusionFactor(nn.Module):
    # Concatenate a list of tensors along a dimension, scaling the first
    # tensor by a learnable fusion factor (initialized to 1.0)
    def __init__(self, dimension=1):
        super(ConcatFusionFactor, self).__init__()
        self.d = dimension
        # Registered as an nn.Parameter, so it can receive gradients and be
        # updated during backpropagation
        self.factor = torch.nn.Parameter(torch.FloatTensor([1]))

    def forward(self, x):
        x[0] = x[0] * self.factor
        return torch.cat(x, self.d)
CherrPeac commented 1 year ago

I am so stupid. It can be updated through backpropagation.
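
A minimal sketch, assuming the ConcatFusionFactor class quoted above: a single optimizer step is enough to see the factor move away from its initial value of 1.0.

import torch

fuse = ConcatFusionFactor(dimension=1)
opt = torch.optim.SGD(fuse.parameters(), lr=0.1)

a = torch.randn(2, 4)
b = torch.randn(2, 4)
loss = fuse([a, b]).sum()  # gradient of the loss w.r.t. the factor is a.sum()
loss.backward()
opt.step()

print(fuse.factor.item())  # almost surely no longer exactly 1.0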

WindVChen commented 1 year ago

Hi @CherrPeac ,

Yes. The factor can be updated through backpropagation. It may also be worth noting that although updating it can bring some benefit to the final performance, it sometimes makes the training phase unstable. Thus, we currently freeze it by default (i.e., fix the factor at 1.0) in the code.
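
For reference, a minimal sketch of one way to freeze such a parameter (the repository's actual freezing mechanism may differ): disabling its gradient keeps the factor fixed at 1.0 throughout training.

import torch

fuse = ConcatFusionFactor(dimension=1)
fuse.factor.requires_grad_(False)  # freeze: the factor stays fixed at 1.0

a = torch.randn(2, 4, requires_grad=True)
b = torch.randn(2, 4)
fuse([a, b]).sum().backward()
print(fuse.factor.grad)  # None: the frozen factor receives no gradient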