dvlab-research / ReviewKD

Distilling Knowledge via Knowledge Review, CVPR 2021
248 stars · 34 forks

Why was the kernel set to 4, 2, 1 in hcl(fstudent, fteacher)? #3

Closed eeric closed 3 years ago

eeric commented 3 years ago

In reviewkd.py.
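
For reference, here is a minimal PyTorch sketch of how an hcl(fstudent, fteacher) with pooling levels (4, 2, 1) could look. The function name, signature, and the (4, 2, 1) values come from this thread; the adaptive average pooling, MSE terms, and halving weights are illustrative assumptions, not necessarily the exact code in reviewkd.py:

```python
import torch
import torch.nn.functional as F

def hcl(fstudent, fteacher):
    """Hierarchical context loss (sketch): compare each student/teacher
    feature pair at full resolution and at pooled sizes (4, 2, 1)."""
    loss_all = 0.0
    for fs, ft in zip(fstudent, fteacher):
        _, _, h, _ = fs.shape
        loss = F.mse_loss(fs, ft)        # full-resolution term
        weight, total = 1.0, 1.0
        for k in (4, 2, 1):              # the levels asked about in this issue
            if k >= h:                   # skip levels not smaller than the map
                continue
            ps = F.adaptive_avg_pool2d(fs, (k, k))
            pt = F.adaptive_avg_pool2d(ft, (k, k))
            weight /= 2.0                # assumption: halve the weight per level
            loss += weight * F.mse_loss(ps, pt)
            total += weight
        loss_all += loss / total
    return loss_all

# Quick check with dummy feature maps:
fs = [torch.randn(2, 64, 8, 8)]
ft = [torch.randn(2, 64, 8, 8)]
print(hcl(fs, ft))
```

Each pooled size summarizes the feature map over a coarser spatial grid, so the (4, 2, 1) choice compares local, regional, and global statistics in turn.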

akuxcw commented 3 years ago

Hi, this is an intuitive setting that turns out to work well. A more careful design might achieve better performance.

eeric commented 3 years ago

Smart idea! For your knowledge distillation method, I thought the loss named hcl, with its (4, 2, 1) kernels, might be the most important contribution, while ABF was a gimmick. That's a joke, of course.