HRNet / Lite-HRNet

This is an official PyTorch implementation of Lite-HRNet: A Lightweight High-Resolution Network.

Fuse layer problem #64

Open · wangbotao opened this issue 2 years ago

wangbotao commented 2 years ago

Here is the current code of the fuse layer:

if self.with_fuse:
    out_fuse = []
    for i in range(len(self.fuse_layers)):
        y = out[0] if i == 0 else self.fuse_layers[i][0](out[0])
        for j in range(self.num_branches):
            if i == j:
                y += out[j]
            else:
                y += self.fuse_layers[i][j](out[j])
        out_fuse.append(self.relu(y))
    out = out_fuse

The problem is that the first branch, out[0], is added twice. It is added the first time when y is initialized:

    y = out[0] if i == 0 else self.fuse_layers[i][0](out[0])

and a second time on the first iteration of the inner loop, when j = 0:

    if i == j:
        y += out[j]
    else:
        y += self.fuse_layers[i][j](out[j])

To fix it, j should start from 1: for j in range(1, self.num_branches). A sketch of the corrected block is shown below.
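
For reference, a minimal sketch of the same block with only that change applied (out, self.with_fuse, self.fuse_layers, self.num_branches and self.relu are the same names as in the snippet above):

    if self.with_fuse:
        out_fuse = []
        for i in range(len(self.fuse_layers)):
            # initialize y from the (possibly transformed) first branch
            y = out[0] if i == 0 else self.fuse_layers[i][0](out[0])
            # start j at 1 so the first branch is not added a second time
            for j in range(1, self.num_branches):
                if i == j:
                    y += out[j]
                else:
                    y += self.fuse_layers[i][j](out[j])
            out_fuse.append(self.relu(y))
        out = out_fuse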

lingfengqiu commented 2 years ago

Right, I found the same problem. For comparison, here is the source code of the fuse layer in the original HRNet:

    x_fuse = []
    for i in range(len(self.fuse_layers)):
        y = x[0] if i == 0 else self.fuse_layers[i][0](x[0])
        # j starts from 1, so x[0] is only counted once
        for j in range(1, self.num_branches):
            if i == j:
                y = y + x[j]
            else:
                y = y + self.fuse_layers[i][j](x[j])
        x_fuse.append(self.relu(y))

And here is the Lite-HRNet version again for comparison; because j starts from 0, out[0] is added twice:

    for i in range(len(self.fuse_layers)):
        y = out[0] if i == 0 else self.fuse_layers[i][0](out[0])
        for j in range(self.num_branches):
            if i == j:
                y += out[j]
            else:
                y += self.fuse_layers[i][j](out[j])
        out_fuse.append(self.relu(y))
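
A small self-contained check makes the difference concrete (the identity fuse layers, dummy tensor values, and the fixed i = 0 below are made up for illustration; they are not from either repository):

    import torch

    num_branches = 2
    out = [torch.ones(1), torch.ones(1) * 10]        # dummy branch outputs
    fuse = [[torch.nn.Identity() for _ in range(num_branches)]
            for _ in range(num_branches)]            # stand-in fuse layers
    i = 0                                            # look at the first output scale only

    # Lite-HRNet style inner loop: j starts at 0, so out[0] is counted twice
    y_bug = out[0]
    for j in range(num_branches):
        y_bug = y_bug + (out[j] if i == j else fuse[i][j](out[j]))

    # HRNet style inner loop: j starts at 1, so out[0] is counted once
    y_ok = out[0]
    for j in range(1, num_branches):
        y_ok = y_ok + (out[j] if i == j else fuse[i][j](out[j]))

    print(y_bug.item())  # 12.0 = 1 (init) + 1 (j=0) + 10 (j=1)
    print(y_ok.item())   # 11.0 = 1 (init) + 10 (j=1)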