LiJunnan1992 / DivideMix

Code for paper: DivideMix: Learning with Noisy Labels as Semi-supervised Learning
MIT License
529 stars 83 forks

Plot Figure 2 #39

Open xiqxin1 opened 2 years ago

xiqxin1 commented 2 years ago

Dear author,

Thank you very much for your excellent code. My recent work also tries to distinguish noisy labels from correct ones.

I'm curious where you output the loss values (e.g., Fig. 2(a)) in your code. Is it the value of `L` in the following code? If not, could you tell me how to calculate it?

```python
def warmup(epoch, net, optimizer, dataloader, args):
    # make noise labels in asym and sym ways
    net.train()
    num_iter = (len(dataloader.dataset) // dataloader.batch_size) + 1
    CEloss = nn.CrossEntropyLoss()
    display_loss = []
    for batch_idx, (inputs, labels, path) in enumerate(dataloader):
        inputs, labels = inputs.cuda(), labels.cuda()
        optimizer.zero_grad()
        outputs = net(inputs)
        loss = CEloss(outputs, labels)
        L = loss
        L.backward()
        optimizer.step()
        display_loss.append(L.item())  # store a float, not the graph-attached tensor
```

Thanks again for your help. Looking forward to your reply.

LiJunnan1992 commented 2 years ago

Hi, thanks for your interest. To compute the loss for all training samples with the model in eval mode (Figure 2), you can use the eval_train function.
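For context, the quantity plotted in Fig. 2(a) is one cross-entropy loss value per training sample, computed with the network in eval mode. A minimal sketch of that computation (a hypothetical helper, not the repo's exact eval_train, which also does GMM fitting) could look like this:

```python
# Hypothetical sketch: collect one cross-entropy loss per training sample
# with the model in eval mode, the quantity histogrammed in Fig. 2(a).
import torch
import torch.nn as nn

def per_sample_losses(net, dataloader, device="cpu"):
    """Return a 1-D tensor with one CE loss value per sample in the loader."""
    net.eval()
    criterion = nn.CrossEntropyLoss(reduction="none")  # keep per-sample values
    losses = []
    with torch.no_grad():  # no gradients needed for evaluation
        for inputs, labels in dataloader:
            inputs, labels = inputs.to(device), labels.to(device)
            outputs = net(inputs)
            losses.append(criterion(outputs, labels).cpu())
    return torch.cat(losses)
```

A Fig. 2(a)-style plot would then be a histogram of these values, e.g. `plt.hist(per_sample_losses(net, loader).numpy(), bins=100)`.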

xiqxin1 commented 2 years ago

Hi Junnan,

Thanks a lot for your reply! But I'm still uncertain about it. Did you mean we can plot Fig. 2 with plt.plot(input_loss) or plt.plot(prob, input_loss) in the eval_train function?
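For reference, the two quantities mentioned above can be sketched as follows. This is a simplified stand-in for eval_train (function name `split_clean_noisy` and the normalization details are assumptions), following the paper's recipe: min-max normalize the per-sample losses to [0, 1], fit a two-component Gaussian mixture, and take the posterior of the lower-mean component as the clean probability `prob`:

```python
# Sketch of the GMM step, assuming the paper's recipe: normalize losses,
# fit a 2-component GMM, posterior of the low-mean component = P(clean).
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(losses):
    """Return (normalized losses, per-sample clean probability)."""
    losses = np.asarray(losses, dtype=np.float64)
    # min-max normalize to [0, 1]; eps guards against a constant loss vector
    input_loss = (losses - losses.min()) / (losses.max() - losses.min() + 1e-12)
    input_loss = input_loss.reshape(-1, 1)       # sklearn expects 2-D input
    gmm = GaussianMixture(n_components=2, max_iter=100, reg_covar=5e-4)
    gmm.fit(input_loss)
    # the component with the smaller mean models the clean (low-loss) samples
    prob = gmm.predict_proba(input_loss)[:, gmm.means_.argmin()]
    return input_loss.ravel(), prob
```

Under this reading, Fig. 2(a) would be a histogram of `input_loss` (e.g. `plt.hist(input_loss, bins=100)`), while `prob` is what gets thresholded to divide clean from noisy samples.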

xiqxin1 commented 2 years ago

Hi Junnan,

Could I contact you by email?

It is not easy to edit my questions in this thread, so I've sent my question to your email (junnan.li@salesforce.com). Did you receive it?