Ji-Hu opened this issue 1 year ago
It is because the experimental results show that the 'teacher' performs better. You can switch to 'student' for inference.
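For reference, a minimal config override sketch for switching inference to the student branch. It assumes the mmdet 3.x semi-supervised detector selects the test-time branch via semi_test_cfg['predict_on'] and that the base config file name below is correct for your install; both are assumptions, so check them against your version.

```python
# Assumed base config path; adjust to the soft-teacher config you actually use.
_base_ = ['./soft_teacher_faster-rcnn_r50-caffe_fpn_180k_semi-0.1-coco.py']

model = dict(
    # Assumption: SemiBaseDetector picks the branch used for prediction from
    # semi_test_cfg['predict_on'] ('teacher' by default in mmdetection).
    semi_test_cfg=dict(predict_on='student')
)
```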
I'm using the soft-teacher configuration with custom data. However, I found that if I change the loop to EpochBasedTrainLoop, the training exceeds the displayed iteration count within one epoch, e.g. Epoch(train) [1][2150/1458], without stopping for evaluation. Could you please provide some explanation or help?
I have the same question: with type='EpochBasedTrainLoop' I get Epoch(train) [1][1000/881].
@Jeffery-MIC
Only IterBasedTrainLoop is supported, since there exist two datasets: labeled and unlabeled.
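For anyone hitting this, here is a sketch of an iteration-based setup. The max_iters / val_interval / milestone values are placeholders in the spirit of the default semi-supervised schedule, not an official configuration.

```python
# Iteration-based training: evaluation fires every val_interval iterations instead
# of at epoch boundaries, which sidesteps the labeled/unlabeled epoch mismatch.
train_cfg = dict(type='IterBasedTrainLoop', max_iters=180000, val_interval=5000)

# LR schedule and checkpointing should also be expressed in iteration units.
param_scheduler = [
    dict(type='MultiStepLR', begin=0, end=180000, by_epoch=False,
         milestones=[120000, 160000])
]
default_hooks = dict(checkpoint=dict(by_epoch=False, interval=10000))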
The problem was solved by following your method, thank you @Czm369
File "e:\lzx\mmdetection\mmdet\models\detectors\soft_teacher.py", line 255, in rcnn_cls_loss_by_pseudo_instances losses['loss_cls'] = losses['loss_cls'] * len( KeyError: 'loss_cls' 你好,我遇到了一个问题,关于损失函数没有loss_cls列表的问题
Have you solved the problem? @lzx101
Solved. I didn't change the actual data; I only modified source_ratio=[1, 2].
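For context, a sketch of where source_ratio lives in the semi-supervised dataloader. The sampler type and batch size below follow the default soft-teacher config as far as I remember; treat them as assumptions and check against your own config.

```python
# source_ratio controls how many labeled vs. unlabeled images go into each batch.
# With batch_size=5 and source_ratio=[1, 4], each batch holds 1 labeled and
# 4 unlabeled images. Changing it to [1, 2] reportedly made the KeyError above
# go away; the exact cause isn't established in this thread.
batch_size = 5
train_dataloader = dict(
    batch_size=batch_size,
    sampler=dict(
        type='GroupMultiSourceSampler',
        batch_size=batch_size,
        source_ratio=[1, 4],
    ),
)
```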
Describe the bug
In the default configuration of the soft-teacher model provided by mmdetection,
I confirmed that this configuration runs inference on the teacher model.
BUT the soft-teacher author's configuration (and the paper) defines inference on the student model; see the author's config, line 253.
Is mmdetection's configuration intended, or is it a typo?
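One way to check what the shipped config actually does; the config path and the semi_test_cfg key are assumptions based on the mmdet 3.x layout, so adjust them to your checkout.

```python
from mmengine.config import Config

# Load the default soft-teacher config shipped with mmdetection (path assumed).
cfg = Config.fromfile(
    'configs/soft_teacher/soft_teacher_faster-rcnn_r50-caffe_fpn_180k_semi-0.1-coco.py')

# Assumption: the detector reads the branch used at test time from semi_test_cfg;
# printing it should show 'teacher' in the default configuration.
print(cfg.model.get('semi_test_cfg'))
```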