Yes, the features of the ImageNet model are intentionally detached from the compute graph so that the ImageNet model is not updated: we want to calculate the distance of the student features to the features of the original, frozen ImageNet model.
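A minimal PyTorch sketch of that behavior (hypothetical stand-in models, not the repository's code): detaching the frozen model's output stops gradients from reaching its weights, while the student still receives them.

```python
import torch
import torch.nn as nn

# Hypothetical stand-ins: a frozen "ImageNet" model and a trainable
# student, each reduced to a single linear layer for brevity.
torch.manual_seed(0)
imnet_model = nn.Linear(8, 8)
student = nn.Linear(8, 8)

x = torch.randn(4, 8)
feat_student = student(x)
feat_imnet = imnet_model(x).detach()  # cut the frozen features out of the graph

# Distance of student features to frozen features (plain MSE here).
feat_loss = torch.mean((feat_student - feat_imnet) ** 2)
feat_loss.backward()

print(student.weight.grad is not None)  # True: the student gets gradients
print(imnet_model.weight.grad is None)  # True: the frozen model gets none
```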
Thanks!
Hello!
In

feat_loss, feat_log = self.calc_feat_dist(img, gt_semantic_seg, src_feat)

the line

feat_imnet = [f.detach() for f in feat_imnet]

separates feat_imnet from the computation graph, and then

feat_dist = self.masked_feat_dist(feat[lay], feat_imnet[lay], fdist_mask)

computes the feature distance. So when feat_loss.backward() propagates gradients, the backbone of self.imnet_model receives no gradients, meaning self.imnet_model is never trained and its parameters are never updated, correct? Thanks!
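For illustration, a condensed PyTorch sketch of the flow described above. It is hypothetical: the names (imnet_model, src_feat, fdist_mask) follow the question, the body is simplified, and it is not the repository's exact implementation.

```python
import torch

def feat_dist_sketch(imnet_model, img, src_feat, fdist_mask, lay=-1):
    # Frozen reference features: no_grad builds no graph, and detach()
    # additionally cuts any remaining ties to the graph (as in the question).
    with torch.no_grad():
        feat_imnet = imnet_model(img)  # assumed to return a list of feature maps
    feat_imnet = [f.detach() for f in feat_imnet]

    # Per-pixel L2 distance over the channel dimension, optionally masked.
    feat_diff = src_feat[lay] - feat_imnet[lay]
    pw_dist = torch.norm(feat_diff, dim=1, p=2)
    if fdist_mask is not None:
        pw_dist = pw_dist[fdist_mask.squeeze(1)]
    feat_loss = torch.mean(pw_dist)

    # feat_loss.backward() will now send gradients only into the producers
    # of src_feat (the student backbone), never into imnet_model.
    return feat_loss
```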