Open hansbu opened 5 years ago
Neat implementation!
I am trying to modify your code to replicate the UniformFlip noisy CIFAR-10 experiment from the paper, but I could not get it to work with the settings described there; I'm not sure whether I did something incorrectly or am missing a step. Do you happen to have the implementation for that experiment?
Did you get the same results on CIFAR-10?
I have tried the UniformFlip noisy MNIST experiment with the flip probability set to 0.4. The reweighting method really worked: accuracy was 0.65 without reweighting and 0.9 with it. But that may just be because MNIST is too simple a dataset.
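For context, here is a minimal NumPy sketch of the core idea behind the reweighting method, in the simplified one-inner-step, linear-model case: each training example gets a weight proportional to the (clipped) alignment between its gradient and the gradient of a small clean validation batch. All names, the toy data, and the logistic-regression model are illustrative assumptions, not the repo's actual code.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def per_example_grads(w, X, y):
    # Logistic-loss gradient for each example: (sigmoid(x.w) - y) * x
    return (sigmoid(X @ w) - y)[:, None] * X

# Toy data: a noisy training batch and a small clean validation batch.
d = 5
w = rng.normal(size=d)
X_train = rng.normal(size=(8, d)); y_train = rng.integers(0, 2, 8).astype(float)
X_val = rng.normal(size=(4, d));   y_val = rng.integers(0, 2, 4).astype(float)

g_train = per_example_grads(w, X_train, y_train)    # shape (8, d)
g_val = per_example_grads(w, X_val, y_val).mean(0)  # shape (d,)

# Weight each example by its gradient's dot product with the validation
# gradient, clip negatives to zero, then normalize (uniform fallback if
# every alignment is non-positive).
raw = np.maximum(g_train @ g_val, 0.0)
weights = raw / raw.sum() if raw.sum() > 0 else np.full(len(raw), 1.0 / len(raw))

weighted_grad = weights @ g_train  # gradient actually used for the update
```

Intuitively, examples whose gradients point away from the clean-validation gradient (e.g. mislabeled ones) get weight zero, which is why the method can ignore much of the 0.4-probability label noise.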
@guixianjin: CIFAR-10 has almost the same structure as MNIST; could you try it on CIFAR-10 and let us know the results?
Okay, I will try it this week.
I have tried it, but not with the same settings as the paper, because I don't have enough computational resources, so it may not help you; sorry! By the way, there is an official implementation by the original authors: https://github.com/uber-research/learning-to-reweight-examples
My experimental settings are as follows: I use only two classes of CIFAR-10, class 9 and class 4, with 4,950 training, 50 validation, and 5,000 test examples.
I inject uniform flip noise with probability 0.4 into the training labels.
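A minimal sketch of that noise model, assuming "uniform flip" means each label is replaced with probability 0.4 by a class drawn uniformly from the other classes (the function name and signature are illustrative, not from the repo):

```python
import numpy as np

def uniform_flip(labels, num_classes, flip_prob=0.4, seed=0):
    """Flip each label with probability flip_prob to a uniformly
    chosen different class. Returns a new label array."""
    rng = np.random.default_rng(seed)
    labels = np.asarray(labels).copy()
    flip = rng.random(len(labels)) < flip_prob
    for i in np.where(flip)[0]:
        # Draw uniformly from the other classes (never the true label).
        others = [c for c in range(num_classes) if c != labels[i]]
        labels[i] = rng.choice(others)
    return labels

noisy = uniform_flip(np.zeros(10000, dtype=int), num_classes=2)
```

Note that in the two-class (4 vs 9) setting this reduces to swapping the label to the other class with probability 0.4.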
The result is as follows:
Cool, thank you for the update. I had not noticed that the authors released their code.