thulas / dac-label-noise

Label de-noising for deep learning

Identifying Systematic Label Noise accuracy #7

Open csung7 opened 4 years ago

csung7 commented 4 years ago

Dear Sunil,

Thank you for sharing your great work. I was trying to reproduce your experiment results on Identifying Systematic Label Noise, but with your DAC loss I can't match your reported accuracy. This is what I got using your code and hyper-parameters:

python train_dac.py --datadir ./data --dataset stl10-c --train_y train_y_downshifted_random_monkeys.bin --test_y test_y_downshifted_random_monkeys.bin --nesterov --net_type vggnet -use-gpu --epochs 75 --loss_fn dac_loss --learn_epochs 10 --seed 0

And from the above running I got:

8331 Acc@1: 82.100% Acc@2: 92.247%
| Validation Epoch #75                  Abstained: 398 Loss: 1.9122 Acc@1: 54.88% Acc@2: 57.75%
| Elapsed time : 0:10:41

[Phase 4] : Testing model
* Test results : Acc@1 = 65.20%

Can you help me understand how you got a better number from your DAC loss?

Thank you.

thulas commented 4 years ago

@csung7 Acc@1 seems correct as this is the accuracy of the non-abstaining classifier on the test set. What is Acc@2 that you're seeing on the test set? This would be the accuracy of the DAC and should be > 70%.

csung7 commented 4 years ago

@thulas Thanks for your comments, but I don't see any accuracy difference between the baseline and your DAC.

thulas commented 4 years ago

@csung7

There should be an Acc@2 number for the test data -- can you report what this is? I don't see it in your output above. This will almost certainly be different from Acc@1, simply because Acc@2 only computes accuracy over the non-abstained samples.
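For anyone else reading this thread, here is a minimal sketch of the distinction being discussed. It assumes the DAC's abstention option is an extra output class appended after the real classes (the function name and array layout are assumptions for illustration, not the repo's actual code):

```python
import numpy as np

def dac_accuracies(logits, labels, abstain_class):
    """Illustrative computation of the two accuracies in the log.

    Acc@1: accuracy of the non-abstaining classifier -- argmax over the
           real classes only, scored on ALL samples.
    Acc@2: accuracy of the DAC -- samples where the abstention class wins
           are dropped, and accuracy is computed on the remainder.
    """
    # Acc@1: ignore the abstention logit, score every sample.
    preds_all = np.argmax(logits[:, :abstain_class], axis=1)
    acc1 = float(np.mean(preds_all == labels))

    # Acc@2: keep only samples the DAC did not abstain on.
    full_preds = np.argmax(logits, axis=1)
    kept = full_preds != abstain_class
    abstained = int(np.sum(~kept))
    if abstained == len(labels):
        return acc1, float("nan"), abstained
    acc2 = float(np.mean(full_preds[kept] == labels[kept]))
    return acc1, acc2, abstained
```

Because the abstained samples tend to be the mislabeled ones, Acc@2 computed over the kept samples is typically higher than Acc@1, which is why the two test-set numbers should differ.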