realkris closed this issue 3 years ago.
To add: the loss value starts from about 10, drops by roughly 1.00 per item trained, and finally settles around -10.
Yes, it is normal. A positive SI-SDR is good, and we use the negative SI-SDR as the objective to minimize.
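To make the sign convention concrete, here is a rough PyTorch sketch of a negated SI-SDR loss (a simplified illustration, not Asteroid's exact implementation; the function name neg_sisdr is made up for this example):

    import torch

    def neg_sisdr(estimate, target, eps=1e-8):
        """Negated SI-SDR: minimizing this loss maximizes SI-SDR (in dB)."""
        # Remove the mean so the measure is invariant to a DC offset.
        estimate = estimate - estimate.mean(dim=-1, keepdim=True)
        target = target - target.mean(dim=-1, keepdim=True)
        # Project the estimate onto the target to split it into "target" and "noise" parts.
        dot = (estimate * target).sum(dim=-1, keepdim=True)
        s_target = dot * target / (target.pow(2).sum(dim=-1, keepdim=True) + eps)
        e_noise = estimate - s_target
        sisdr = 10 * torch.log10(
            s_target.pow(2).sum(-1) / (e_noise.pow(2).sum(-1) + eps) + eps
        )
        # A good estimate has a large positive SI-SDR, so the loss becomes strongly negative.
        return -sisdr.mean()

So a training loss around -10 simply means the model reaches roughly 10 dB SI-SDR on those batches.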
Then should we save the model with the minimal loss at each epoch, even if val_loss is negative?
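For example, something like this, assuming PyTorch Lightning's ModelCheckpoint callback is used (the settings below are only placeholders):

    from pytorch_lightning import Trainer
    from pytorch_lightning.callbacks import ModelCheckpoint

    # mode="min" keeps the checkpoint with the lowest (most negative) val_loss,
    # i.e. the highest SI-SDR, so negative values need no special handling.
    checkpoint = ModelCheckpoint(
        monitor="val_loss",
        mode="min",
        save_top_k=1,
        filename="best-{epoch}-{val_loss:.2f}",
    )
    trainer = Trainer(callbacks=[checkpoint], max_epochs=200)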
Hi, @mpariente! Hope you are doing well.
I am following the librimix/Convtasnet recipe with the enh_single task. If I understood correctly, the negative loss scores are normal because the negated SI-SDR is used for optimization. But, in contrast with @realkris, I am getting negative val_loss scores as well. Is that normal too, or am I missing something?
Yes, perfectly normal. It's the negated SI-SDR.
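In other words (a back-of-the-envelope illustration; the number is made up), flipping the sign of val_loss gives back the metric itself:

    val_loss = -12.3      # hypothetical value reported during validation
    sisdr_db = -val_loss  # the corresponding SI-SDR of the estimates, in dB
    print(f"val_loss = {val_loss}  ->  SI-SDR = {sisdr_db} dB (higher is better)")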
I'm using the DNS dataset and want to train DCCRN for denoising. I followed the instructions in #278, but there are some differences: I used the SI-SDR loss, the same as in #352:
in train.py:

    loss_func = PITLossWrapper(SingleSrcNegSDR("sisdr"), pit_from='pw_pt')

in model.py:

    class SimpleSystem(System):
        def common_step(self, batch, batch_nb, train):
            mixture, speech, noise = batch
            estimate = self(mixture.unsqueeze(1))
            speech = speech.unsqueeze(1)
            loss = self.loss_func(estimate, speech)
            return loss
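As a sanity check on the call above, here is a minimal, self-contained sketch of the shapes the wrapper expects (the sizes are illustrative, not taken from the DNS recipe): with pit_from='pw_pt', PITLossWrapper takes estimates and targets of shape (batch, n_src, time), which is why both tensors are unsqueezed to add a singleton source dimension.

    import torch
    from asteroid.losses import PITLossWrapper
    from asteroid.losses.sdr import SingleSrcNegSDR

    loss_func = PITLossWrapper(SingleSrcNegSDR("sisdr"), pit_from="pw_pt")

    batch, n_src, time = 4, 1, 16000            # illustrative sizes
    estimate = torch.randn(batch, n_src, time)  # model output after unsqueeze(1)
    speech = torch.randn(batch, n_src, time)    # clean target after unsqueeze(1)

    loss = loss_func(estimate, speech)          # scalar: mean negated SI-SDR over the batch
    print(loss)                                 # uncorrelated random signals give a large positive loss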
But when training, I got negative loss values of around -10:
    Epoch 0: 1%|█ | 309/33600 [06:41<12:00:19, 1.30s/it, loss=-9.94, v_num=2, val_loss=13.50]
Is it normal for the loss to be negative? I thought I must have done something wrong. Could you please tell me how to fix this? Much appreciated!