YyzHarry / imbalanced-semi-self

[NeurIPS 2020] Semi-Supervision (Unlabeled Data) & Self-Supervision Improve Class-Imbalanced / Long-Tailed Learning
https://arxiv.org/abs/2006.07529
MIT License

About the Proof of Theorem 1 #11

Closed YangPatrick closed 3 years ago

YangPatrick commented 3 years ago

At the end of the proof, the probability of event E is 1 - P1 - P2 - P3. Why is it not the product of the three probabilities, i.e. (1 - P1)(1 - P2)(1 - P3)?

YyzHarry commented 3 years ago

Hi - here we are using the union bound to derive the lower bound: the probability that at least one bad event occurs is at most P1 + P2 + P3, so the complement has probability at least 1 - P1 - P2 - P3. The union bound does not assume whether the events are independent or not.
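The union bound argument can be checked numerically: P(no bad event) >= 1 - P1 - P2 - P3 even when the events are dependent. A minimal simulation sketch with hypothetical events (not the events from the paper), all driven by the same random draw so they are clearly not independent:

```python
import random

random.seed(0)
trials = 100_000

# Three *dependent* events defined on the same uniform draw x,
# so independence clearly does not hold (E1 and E3 overlap).
e1 = e2 = e3 = none_count = 0
for _ in range(trials):
    x = random.random()
    a, b, c = x < 0.2, x > 0.9, 0.15 < x < 0.25
    e1 += a
    e2 += b
    e3 += c
    none_count += not (a or b or c)

p1, p2, p3 = e1 / trials, e2 / trials, e3 / trials
p_none = none_count / trials

# Union bound: P(none of E1, E2, E3) >= 1 - P1 - P2 - P3,
# with no independence assumption. The product (1-P1)(1-P2)(1-P3)
# would only equal p_none under independence.
assert p_none >= 1 - p1 - p2 - p3
```

The product form (1 - P1)(1 - P2)(1 - P3) requires independence; the union bound trades a tighter estimate for generality, which is why the proof uses it.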

YangPatrick commented 3 years ago

Thanks for your response. I have another question about the theoretical motivation for self-supervision. You assume the majority class is negative and that negative samples have larger variance. Are these two properties actually related, or are they just assumptions? In the visualization, the class with more samples appears to have larger variance, but I think the true distribution is not related to the number of samples.

YyzHarry commented 3 years ago

These are just assumptions for the toy model, and the two properties are not necessarily related.