arminiusresistance opened this issue 2 months ago
`rtdetrv2_r18vd_120e_coco_rerun_48.1.pth` only contains `ema.state_dict`, while a checkpoint saved during training contains `model.state_dict`, `optimizer.state_dict`, `ema.state_dict`, etc. See the details at https://github.com/lyuwenyu/RT-DETR/blob/main/rtdetrv2_pytorch/src/solver/_solver.py#L102-L116
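
To confirm this on your machine, you can compare the top-level keys of the two files. A minimal sketch; the paths are placeholders, and the exact key set depends on the solver code linked above:

```python
import torch

# Placeholder paths; point these at your actual files.
released = torch.load("rtdetrv2_r18vd_120e_coco_rerun_48.1.pth", map_location="cpu")
epoch_ckpt = torch.load("output/rtdetrv2_r18vd_120e_coco/checkpoint0000.pth", map_location="cpu")

# The release file should list only the EMA entry, while the training
# checkpoint should also list model, optimizer, and related state.
print("released:", sorted(released.keys()))
print("training:", sorted(epoch_ckpt.keys()))
```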
I just started using RT-DETR. The pretrained checkpoint `rtdetrv2_r18vd_120e_coco_rerun_48.1.pth` is around 77 MB, but after running

```
python tools/train.py -c configs/rtdetrv2/rtdetrv2_r18vd_120e_coco.yml --use-amp --seed=0
```

the checkpoint saved after each epoch is around 300 MB. Shouldn't the checkpoints and the pretrained model be the same size?
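
The size gap follows from the extra training state: with an AdamW-style optimizer that keeps two moment buffers per parameter, the model weights (~77 MB), the EMA copy (~77 MB), and the optimizer state (~150 MB) add up to roughly 300 MB. If you only need weights for inference, you can shrink an epoch checkpoint to about the release size by keeping just the EMA entry. A minimal sketch, assuming the `ema` key used in the linked `_solver.py` and a hypothetical output path:

```python
import torch

# Hypothetical paths; adjust to your own output directory.
src = "output/rtdetrv2_r18vd_120e_coco/checkpoint0119.pth"
dst = "rtdetrv2_r18vd_ema_only.pth"

# On newer PyTorch, pass weights_only=False if the default strict
# load rejects the optimizer state stored in the checkpoint.
ckpt = torch.load(src, map_location="cpu")

# Keep only the EMA weights; dropping the model and optimizer state
# removes most of the ~300 MB.
torch.save({"ema": ckpt["ema"]}, dst)
```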