gorkemalgan / deep_learning_with_noisy_labels_literature

This repo is a collection of papers and repos on the topic of deep learning with noisy labels / label noise.

Missing References #2

Open · eamid opened this issue 3 years ago

eamid commented 3 years ago

I would like to point out that our work (Amid et al. 2019a) extends the generalized CE (GCE) loss (Zhang and Sabuncu 2018) by introducing two temperatures, t1 and t2, and recovers GCE when t1 = q and t2 = 1. Our more recent work, the bi-tempered loss (Amid et al. 2019b), extends these methods by introducing a proper (unbiased) generalization of the CE loss and is shown to be highly effective at reducing the effect of noisy examples. Please consider adding these two papers to your list.

Google AI blog post: https://ai.googleblog.com/2019/08/bi-tempered-logistic-loss-for-training.html
Code: https://github.com/google/bi-tempered-loss
Demo: https://google.github.io/bi-tempered-loss/
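
For concreteness, here is a minimal NumPy sketch of the tempered logarithm/exponential and the resulting bi-tempered loss, following the formulation in (Amid et al. 2019b). The official TensorFlow implementation is in the repo linked above; the function names, the fixed number of fixed-point iterations, and the epsilon below are illustrative choices, not part of that code.

```python
import numpy as np

def log_t(x, t):
    # Tempered logarithm; reduces to log(x) as t -> 1.
    if t == 1.0:
        return np.log(x)
    return (x ** (1.0 - t) - 1.0) / (1.0 - t)

def exp_t(x, t):
    # Tempered exponential, [1 + (1 - t) x]_+ ^ (1 / (1 - t));
    # reduces to exp(x) as t -> 1.
    if t == 1.0:
        return np.exp(x)
    return np.maximum(0.0, 1.0 + (1.0 - t) * x) ** (1.0 / (1.0 - t))

def tempered_softmax(activations, t, n_iters=20):
    # Fixed-point iteration for the normalization constant lambda such
    # that sum_i exp_t(a_i - lambda) = 1 (Algorithm 1 in Amid et al.
    # 2019b, for t >= 1; n_iters is an illustrative choice).
    mu = np.max(activations)
    a_tilde = activations - mu
    for _ in range(n_iters):
        z = np.sum(exp_t(a_tilde, t))
        a_tilde = z ** (1.0 - t) * (activations - mu)
    z = np.sum(exp_t(a_tilde, t))
    lam = -log_t(1.0 / z, t) + mu
    return exp_t(activations - lam, t)

def bi_tempered_loss(activations, labels, t1, t2):
    # labels is a one-hot vector; t1 <= 1 bounds the loss,
    # t2 >= 1 gives the tempered softmax a heavier tail.
    probs = tempered_softmax(activations, t2)
    eps = 1e-10  # avoids 0 * log(0) = nan in the t1 = 1 limit
    loss = (labels * (log_t(labels + eps, t1) - log_t(probs, t1))
            - (labels ** (2.0 - t1) - probs ** (2.0 - t1)) / (2.0 - t1))
    return np.sum(loss)

activations = np.array([2.0, -1.0, 0.5])  # example logits
labels = np.array([1.0, 0.0, 0.0])        # one-hot label
print(bi_tempered_loss(activations, labels, t1=0.8, t2=1.2))
# Sanity check: t1 = t2 = 1 recovers ordinary softmax cross entropy.
print(bi_tempered_loss(activations, labels, t1=1.0, t2=1.0))
```

At t1 = t2 = 1 everything reduces to the standard log/exp and softmax cross entropy; per the papers, t1 < 1 bounds the loss for badly misclassified examples, while t2 > 1 gives the tempered softmax a heavier-than-exponential tail, which is what dampens the influence of noisy labels.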

(Amid et al. 2019a) Amid et al. "Two-temperature logistic regression based on the Tsallis divergence." In The 22nd International Conference on Artificial Intelligence and Statistics (AISTATS), 2019.

(Amid et al. 2019b) Amid et al. "Robust bi-tempered logistic loss based on Bregman divergences." In Advances in Neural Information Processing Systems (NeurIPS), 2019.

(Zhang and Sabuncu 2018) Zhang and Sabuncu. "Generalized Cross Entropy Loss for Training Deep Neural Networks with Noisy Labels." In Advances in Neural Information Processing Systems (NeurIPS), 2018.

eamid commented 2 years ago

Hi @gorkemalgan

Just following up on this. Thanks!