Closed: ciwei123 closed this issue 3 years ago
I found that the values are the same before and after normalization, so I closed the issue:
```python
min_y = Info['scores'][0, :].min()
loss_a(mapped_score[d], y[d]) == loss_a(mapped_score[d], (y[d] - min_y) / self.scale[d])
loss_m(relative_score[d], y[d]) == loss_m(relative_score[d], (y[d] - min_y) / self.scale[d])
F.l1_loss(aligned_score[d], y[d]) / self.scale[d] == F.l1_loss((aligned_score[d] - min_y) / self.scale[d], (y[d] - min_y) / self.scale[d])
```
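The third identity can be checked numerically. This is a minimal sketch using NumPy's mean absolute error as a stand-in for `F.l1_loss` (default `'mean'` reduction); the array values are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
y = rng.uniform(1.0, 5.0, size=100)        # raw MOS-like targets in [1, 5]
aligned = rng.uniform(1.0, 5.0, size=100)  # model scores on the same scale

min_y = y.min()
scale = y.max() - min_y  # stand-in for self.scale[d]

def l1_loss(a, b):
    # mean absolute error, matching F.l1_loss's default 'mean' reduction
    return np.abs(a - b).mean()

lhs = l1_loss(aligned, y) / scale
rhs = l1_loss((aligned - min_y) / scale, (y - min_y) / scale)
assert np.isclose(lhs, rhs)  # dividing by scale == normalizing both inputs
```

Dividing the loss by the scale and normalizing both inputs are the same operation, because `|a - y| / s == |(a - min)/s - (y - min)/s|` elementwise.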
@ciwei123 The reason is that loss_a and loss_m are range-independent: linearly rescaling y does not change their values.
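For intuition about "range-independent", here is a hedged sketch assuming loss_a behaves like a Pearson-correlation loss (1 − PLCC); the exact definition in the repository may differ, but any loss of this form is unchanged when y is shifted and scaled:

```python
import numpy as np

def plcc_loss(pred, target):
    # 1 - Pearson linear correlation coefficient; a common
    # correlation-style loss (an assumed stand-in for loss_a)
    p = pred - pred.mean()
    t = target - target.mean()
    return 1.0 - (p @ t) / (np.linalg.norm(p) * np.linalg.norm(t))

rng = np.random.default_rng(1)
pred = rng.uniform(0.0, 1.0, size=50)   # mapped_score-like values in [0, 1]
y = rng.uniform(1.0, 5.0, size=50)      # raw targets in [1, 5]

min_y, scale = y.min(), y.max() - y.min()
y_norm = (y - min_y) / scale            # normalized targets in [0, 1]

# centering removes the shift and the norm ratio cancels the scale,
# so the loss is identical before and after normalization
assert np.isclose(plcc_loss(pred, y), plcc_loss(pred, y_norm))
```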
@lidq92 Thanks for sharing. I found that:
```python
if self.loss_type == 'mixed':
    loss = [loss_a(mapped_score[d], y[d])
            + loss_m(relative_score[d], y[d])
            + F.l1_loss(aligned_score[d], y[d]) / self.scale[d]
            for d in range(len(y))]
```
The ranges are: mapped_score[d]: 0--1, relative_score[d]: 0--1, aligned_score[d]: 1--5, y: 1--5. The scales are not uniform; do you need to rescale before computing the loss? I can't understand why the first and second terms are not divided by self.scale[d] while the third term is. Thank you very much!!
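The scale mismatch the question points out is only in the third term: an L1 error on the 1--5 scale is exactly `scale` times the same error on the 0--1 scale, so dividing by self.scale[d] puts it on the same footing as the two correlation-style terms, which are scale-free. A hedged numeric sketch with made-up values:

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.uniform(1.0, 5.0, size=100)           # targets on the 1-5 scale
aligned = y + rng.normal(0.0, 0.4, size=100)  # scores on the same scale

min_y, scale = y.min(), y.max() - y.min()

raw_l1 = np.abs(aligned - y).mean()             # error on the 1-5 scale
norm_l1 = np.abs((aligned - y) / scale).mean()  # error on the 0-1 scale

# the raw term is exactly `scale` times the normalized one, so the
# division brings it into the 0-1 range of the other two loss terms
assert np.isclose(raw_l1 / scale, norm_l1)
```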