Hi-FT / ERD
Overcoming Catastrophic Forgetting in Incremental Object Detection via Elastic Response Distillation
Apache License 2.0 · 92 stars · 10 forks
Issues (sorted newest first)
#19 How to do multi-step training (by miaomiaojun122, opened 4 months ago, 0 comments)
#18 Can GFLV2 use the ideas of this article? (by 0x0f0f0f0f, opened 11 months ago, 0 comments)
#17 Is knowledge distillation alone also implemented? (by jeantirole, opened 1 year ago, 0 comments)
#16 How to train this model (by yangchengxin, opened 1 year ago, 2 comments)
#15 Where is the L2 distillation loss? (by tungvd345, opened 1 year ago, 0 comments)
#14 Migrate code to MMDetection 3.0 (by beiyan1911, closed 1 year ago, 0 comments)
#13 Does the project work for models other than Faster R-CNN, such as YOLO? (by ishulei, opened 1 year ago, 2 comments)
#12 Data split (by tobocca, opened 1 year ago, 1 comment)
#11 Error reported during the fine-tuning experiment (by shixingy, opened 1 year ago, 0 comments)
#10 Loss is NaN (by jingong, opened 1 year ago, 2 comments)
#9 How to get the checkpoint file "epoch_12.pth" (by qduOliver, closed 1 year ago, 0 comments)
#8 How to get the checkpoint file "epoch_12.pth" (by jingong, opened 1 year ago, 4 comments)
#7 Bunch of issues (by satpalsr, opened 2 years ago, 51 comments)
#6 What is the difference between instances_train2017_sel_last_40_cats.json and instances_train2017.json in COCO? (by OpenAI2022, opened 2 years ago, 0 comments)
#5 Where are fcos_head_tune and fcos_head_incre? (by chenfangchenf, opened 2 years ago, 2 comments)
#4 The code is incompatible with Algorithm 1? (by shuangshuangguo, closed 2 years ago, 3 comments)
#3 Will new-class data be fed into the teacher model? (by enmengyi, opened 2 years ago, 0 comments)
#2 Update README.md (by JacobYuan7, closed 2 years ago, 0 comments)
#1 When could the source code be released? (by lowestbuaaer, opened 2 years ago, 10 comments)