songkq closed this issue 4 years ago
Hi, @MinZHANG-WHU. As for the `conv_f` layer of FF-Net, TABLE 1 in the paper lists it as Conv+ReLU. However, the FF-Net implementation doesn't include the ReLU operation. Does it matter?
```python
n.concat_1 = self.concat(n.data_t12, n.fd_1, n.up_2, n.up_3)
n.conv_t = self.conv(n.concat_1, 3, self.ff_channel, stride=1, pad=1,
                     name_w="conv_t_w", name_b="conv_t_b",
                     lr_mult_w=1, lr_mult_b=1, decay_mult=1, bias_term=True)
n.conv_prob = self.conv(n.conv_t, 1, 1, stride=1, pad=0,
                        name_w="conv_prob_w", name_b="conv_prob_b",
                        lr_mult_w=1, lr_mult_b=1, decay_mult=1, bias_term=True)
n.sig = L.Sigmoid(n.conv_prob, in_place=False)
```
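For comparison, if one wanted the network to match TABLE 1 literally, a ReLU could be inserted after `conv_t` using Caffe's NetSpec API. This is only a sketch of that hypothetical variant, assuming the same `self.conv` helper and `L` (`caffe.layers`) as in the snippet above; it is not part of the released implementation:

```python
# Hypothetical variant (not in the released code): add a ReLU after conv_t
# so the fusion block matches the Conv+ReLU entry in TABLE 1.
n.conv_t = self.conv(n.concat_1, 3, self.ff_channel, stride=1, pad=1,
                     name_w="conv_t_w", name_b="conv_t_b",
                     lr_mult_w=1, lr_mult_b=1, decay_mult=1, bias_term=True)
n.relu_t = L.ReLU(n.conv_t, in_place=True)  # extra nonlinearity per TABLE 1
n.conv_prob = self.conv(n.relu_t, 1, 1, stride=1, pad=0,
                        name_w="conv_prob_w", name_b="conv_prob_b",
                        lr_mult_w=1, lr_mult_b=1, decay_mult=1, bias_term=True)
n.sig = L.Sigmoid(n.conv_prob, in_place=False)
```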
Sorry for the confusion. This is a careless mistake in the paper; the FF-Net implementation indeed does not include the ReLU layer.
OK.