yoctta / multiple-attention

The code of multi-attention deepfake detection

is the computation of aux_loss in code inconsistent with the paper? #10

Open lulin60 opened 2 years ago

lulin60 commented 2 years ago

In the code (MAT.py, line 274):

```python
if not jump_aux:
    aux_loss, feature_matrix_d = self.auxiliary_loss(feature_maps_d, attentionmaps, y)
```

It first computes the attention pooling between `feature_maps_d` and the attention maps, then obtains the feature matrix and the loss. But I think it should use the parameter `feature_maps` (not `feature_maps_d` as in the code), i.e. change it to:

```python
if not jump_aux:
    aux_loss, feature_matrix_d = self.auxiliary_loss(feature_maps, attentionmaps, y)
```

I want to know whether I have misunderstood the code.
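For context, the pooling step under discussion combines a feature tensor with a set of attention maps. Below is a minimal, hedged sketch of that attention-pooling operation, assuming features of shape `(B, C, H, W)` and attention maps of shape `(B, M, H, W)`; the helper name `attention_pooling` and the exact normalization are illustrative assumptions, not the repo's actual `auxiliary_loss` implementation:

```python
import numpy as np

def attention_pooling(feature_maps, attention_maps):
    """Pool feature maps under each attention map (bilinear-style pooling).

    Hypothetical sketch, not the repo's code:
      feature_maps:   (B, C, H, W) array
      attention_maps: (B, M, H, W) array
    Returns a (B, M, C) feature matrix: for each of the M attention maps,
    a spatially weighted average of the C feature channels.
    """
    _, _, H, W = feature_maps.shape
    # Sum over the spatial dims, weighting each location by its attention,
    # then normalize by the spatial size.
    return np.einsum('bchw,bmhw->bmc', feature_maps, attention_maps) / (H * W)

# Example: uniform features and attentions give a uniform feature matrix.
fm = np.ones((2, 8, 4, 4))   # B=2, C=8, H=W=4
am = np.ones((2, 3, 4, 4))   # M=3 attention maps
pooled = attention_pooling(fm, am)
print(pooled.shape)  # (2, 3, 8)
```

The question in this issue is only about which feature tensor (`feature_maps` vs. `feature_maps_d`) is fed into this pooling before the auxiliary loss is computed.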