danieltan07 / learning-to-reweight-examples

PyTorch Implementation of the paper Learning to Reweight Examples for Robust Deep Learning

implementation for the noisy cifar10 experiment #9

Open hansbu opened 5 years ago

hansbu commented 5 years ago

Neat implementation!

I am trying to modify your code to replicate the uniform-flip noisy CIFAR-10 experiment mentioned in the paper, but I could not get it to work with the settings from the paper; I am not sure whether I did something incorrectly or am missing something. Do you happen to have the implementation for that experiment?

Witt-Wang commented 5 years ago

> Neat implementation!
>
> I am trying to modify your code to replicate the uniform-flip noisy CIFAR-10 experiment mentioned in the paper, but I could not get it to work with the settings from the paper; I am not sure whether I did something incorrectly or am missing something. Do you happen to have the implementation for that experiment?

Have you got the same results on CIFAR-10?

guixianjin commented 5 years ago

I have tried the uniform-flip noisy MNIST experiment with the flip probability set to 0.4. The reweighting method really worked: the accuracy was 0.65 without reweighting and 0.9 with reweighting. But maybe that is because MNIST is too simple a dataset.
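
For concreteness, the corruption can be injected with something like the sketch below (a minimal, illustrative example, not code from this repo; it assumes a recent torchvision where MNIST labels live in `.targets`, and "uniform flip" here means replacing a label with a uniformly chosen different class):

```python
# Illustrative sketch: flip each label to a different, uniformly chosen class
# with probability 0.4 (helper name is hypothetical, not from the repo).
import numpy as np
import torch
from torchvision import datasets

def inject_uniform_flip_noise(labels, flip_prob=0.4, num_classes=10, seed=0):
    """With probability flip_prob, replace a label with a uniformly random other class."""
    rng = np.random.RandomState(seed)
    labels = np.array(labels)
    flip_mask = rng.rand(len(labels)) < flip_prob
    for i in np.where(flip_mask)[0]:
        choices = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(choices)
    return labels

train_set = datasets.MNIST(root='./data', train=True, download=True)
noisy = inject_uniform_flip_noise(train_set.targets.numpy(), flip_prob=0.4)
train_set.targets = torch.from_numpy(noisy)
```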

hansbu commented 5 years ago

> I have tried the uniform-flip noisy MNIST experiment with the flip probability set to 0.4. The reweighting method really worked: the accuracy was 0.65 without reweighting and 0.9 with reweighting. But maybe that is because MNIST is too simple a dataset.

@guixianjin: CIFAR-10 has almost the same structure as MNIST; could you try it on CIFAR-10 and let us know the results?

guixianjin commented 5 years ago

Okay, I will try it this week.

guixianjin commented 5 years ago

I have tried it, but not in the same setting as the paper, because I don't have enough computational resources. Maybe it won't help you, sorry! By the way, there is an official implementation by the original authors: https://github.com/uber-research/learning-to-reweight-examples

My experimental settings are as follows: I only use two classes of CIFAR-10, class 9 and class 4, with data sizes of 4,950 for training, 50 for validation, and 5,000 for testing.

I inject uniform-flip noise with probability 0.4 into the training data.
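
A sketch of roughly this setup is below (torchvision CIFAR-10 assumed; the helper name and the random split are illustrative and may not match the exact run, and in the two-class case a uniform flip just swaps the label):

```python
# Illustrative sketch: keep CIFAR-10 classes 4 and 9, hold out a small clean
# validation set, and flip 40% of the remaining training labels.
import numpy as np
from torchvision import datasets

def two_class_subset(split, classes=(4, 9)):
    labels = np.array(split.targets)
    idx = np.where(np.isin(labels, classes))[0]
    data = split.data[idx]
    targets = (labels[idx] == classes[1]).astype(np.int64)  # relabel as 0/1
    return data, targets

train = datasets.CIFAR10(root='./data', train=True, download=True)
test = datasets.CIFAR10(root='./data', train=False, download=True)

x_all, y_all = two_class_subset(train)
x_test, y_test = two_class_subset(test)

# split off 50 clean validation examples; the rest become the noisy train set
rng = np.random.RandomState(0)
perm = rng.permutation(len(y_all))
val_idx, train_idx = perm[:50], perm[50:]
x_val, y_val = x_all[val_idx], y_all[val_idx]
x_train, y_train = x_all[train_idx], y_all[train_idx]

# uniform-flip noise with probability 0.4: swap the label for flipped examples
flip_mask = rng.rand(len(y_train)) < 0.4
y_train[flip_mask] = 1 - y_train[flip_mask]
```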

The results are shown in the attached images.

hansbu commented 5 years ago

Cool. Thank you for the update. I did not notice that the authors had released their code.