winycg / CLIP-KD

[CVPR-2024] Official implementations of CLIP-KD: An Empirical Study of CLIP Model Distillation

The performance when abandoning task loss #8

Open youngtboy opened 2 weeks ago

youngtboy commented 2 weeks ago

This is a comprehensive experimental study, but I have a question: if we drop the task loss (the CLIP pre-training loss) and train with only a distillation loss (e.g., FD), how does the performance compare to the reported results?
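
To make the question concrete, here is a minimal sketch of the setting I mean, assuming the total objective is a weighted sum of the CLIP contrastive loss and a feature-distillation (FD) term. This is not the repository's actual training code; the function and weight names (`total_loss`, `lambda_task`, `lambda_fd`) are hypothetical:

```python
import torch.nn.functional as F

def total_loss(student_feats, teacher_feats,
               logits_per_image, logits_per_text, labels,
               lambda_task=1.0, lambda_fd=1.0):
    # Task loss: the standard CLIP contrastive (InfoNCE) objective,
    # averaged over the image->text and text->image directions.
    task = (F.cross_entropy(logits_per_image, labels) +
            F.cross_entropy(logits_per_text, labels)) / 2
    # FD loss: mean-squared error between student and teacher embeddings.
    fd = F.mse_loss(student_feats, teacher_feats)
    # Setting lambda_task = 0 is the "distillation only" case I am asking about.
    return lambda_task * task + lambda_fd * fd
```

In other words, with `lambda_task = 0` and only the FD term active, how much does zero-shot performance degrade relative to the numbers in the paper?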