xuguodong03 / SSKD

[ECCV2020] Knowledge Distillation Meets Self-Supervision

Implementation of data augmentations of self-supervised tasks #2

Closed winycg closed 4 years ago

winycg commented 4 years ago

Hi, I find that four augmentations are used in your manuscript, but the implementation seems to include only rotation. Were the reported results derived from rotation alone, or from all four augmentations?

Thanks!

xuguodong03 commented 4 years ago

Hi there, thanks for the questions.

The results in the paper are derived from all four augmentations. However, we also found that the influences of the four augmentations are not equal: rotation plays the main role, and the effects of the other three are not as crucial. In the released implementation, we only include rotation, and its results are nearly the same as those in the paper (there may be a tiny difference for some teacher-student pairs).
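For readers unfamiliar with the rotation pretext task being discussed, the idea is to create four copies of each input rotated by 0°/90°/180°/270° and use the rotation index as a self-supervised label. A minimal sketch in plain Python (the function names `rotate90` and `four_rotations` are illustrative, not the repository's actual code):

```python
def rotate90(img):
    """Rotate a 2D image (list of rows) 90 degrees clockwise."""
    return [list(row)[::-1] for row in zip(*img)]

def four_rotations(img):
    """Return the four rotated copies (0/90/180/270 degrees clockwise)
    of `img` together with their rotation labels 0-3, as used in the
    rotation self-supervision task. Illustrative sketch only."""
    copies, labels = [], []
    current = [list(row) for row in img]
    for k in range(4):
        copies.append(current)
        labels.append(k)
        current = rotate90(current)
    return copies, labels

img = [[1, 2],
       [3, 4]]
copies, labels = four_rotations(img)
# copies[1] is the 90-degree clockwise rotation: [[3, 1], [4, 2]]
```

In training, each rotated copy is fed through the network and an auxiliary head predicts the rotation label, so the batch effectively grows by a factor of four.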

winycg commented 4 years ago

Thank you for your response, I get it!