ydkim1293 / NLNL-Negative-Learning-for-Noisy-Labels

NLNL: Negative Learning for Noisy Labels

How to select the samples after NL? #5


monsterlyg commented 4 years ago

After NL, the next step is SelNL, but I have trouble with the condition "py > 1/c". As I understand it, the ideal outcome of NL is that the network outputs a low probability for the complementary label. If we then select the samples whose output probability exceeds 1/c, aren't we selecting exactly the data that NL failed to separate effectively? So what exactly does 'py' mean? In other words, what does the confidence 'py' represent?

Codeczh commented 3 years ago

> After NL, the next step is SelNL, but I have trouble with the condition "py > 1/c". As I understand it, the ideal outcome of NL is that the network outputs a low probability for the complementary label. If we then select the samples whose output probability exceeds 1/c, aren't we selecting exactly the data that NL failed to separate effectively? So what exactly does 'py' mean? In other words, what does the confidence 'py' represent?

I guess y is the given (noisy) label from the training set, not the complementary label.
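
For anyone hitting the same confusion, here is a minimal sketch of how the SelNL filtering step can be read under that interpretation. The names `selnl_filter`, `model`, `images`, `noisy_labels`, and `num_classes` are illustrative placeholders, not the repository's actual API: after the NL phase, a sample is kept only if the network's softmax probability for its *given* (noisy) label exceeds the chance level 1/c.

```python
import torch
import torch.nn.functional as F

@torch.no_grad()
def selnl_filter(model, images, noisy_labels, num_classes):
    """Keep samples whose softmax confidence for the GIVEN (noisy)
    label y exceeds the uniform baseline 1/c (c = num_classes).

    p_y > 1/c means the network already leans toward the given label
    more than chance, so that label is more likely to be clean.
    (Sketch only; the real repo wires this into its training loop.)
    """
    logits = model(images)                        # (N, c) raw scores
    probs = F.softmax(logits, dim=1)              # (N, c) probabilities
    idx = torch.arange(noisy_labels.size(0))
    p_y = probs[idx, noisy_labels]                # per-sample p_y
    keep = p_y > 1.0 / num_classes                # SelNL selection mask
    return keep
```

The key point, matching the answer above: `noisy_labels` is the label that came with the dataset, not the randomly drawn complementary label used in the NL loss. Samples where the network is more confident than chance in the given label are treated as likely clean and kept for the next training stage, rather than being the "unseparated" ones.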