SwinTransformer / Feature-Distillation · MIT License · 244 stars · 11 forks
Issues
#16 Finetuned checkpoint of FD-SwinV2-G · tung-nd · opened · 1 year ago · 0 comments
#15 What is the difference between distilled model and fine-tuned model? · shenyehui · closed · 1 year ago · 0 comments
#14 Was Whitening implemented incorrectly? · jsrdcht · opened · 1 year ago · 1 comment
#13 resnet as student network and CLIP as teacher network · gqingc · opened · 1 year ago · 0 comments
#12 Can you share the logs for the released models? · yxchng · opened · 1 year ago · 0 comments
#11 Which part is teacher in class FD? · xs1997zju · opened · 1 year ago · 0 comments
#10 Distilling features of SwinV2-L to SwinV2-T · yopknopixx · opened · 1 year ago · 1 comment
#9 Hello (哈喽) · ross-Hr · opened · 2 years ago · 0 comments
#8 model release · futureisatyourhand · opened · 2 years ago · 0 comments
#7 About Loss · Kathrine94 · closed · 1 year ago · 0 comments
#6 can distilling recursively improve the performance? · ray-lee-94 · opened · 2 years ago · 2 comments
#5 Where do the properties come from? · jsrdcht · opened · 2 years ago · 0 comments
#4 When will you release the code of Contrastive Learning Rivals Masked Image Modeling in Fine-tuning via Feature Distillation? · geekinglcq · opened · 2 years ago · 2 comments
#3 Didn't get the purpose of the method · NikAleksFed · opened · 2 years ago · 3 comments
#2 Are the Teacher and Student the same pretrained model with the same initialization? · zhangxinyu-xyz · opened · 2 years ago · 2 comments
#1 Code Release · abhigoku10 · opened · 2 years ago · 0 comments