We have added RegNet and DLA34 backbone support to the existing rtdetr_pytorch architecture. Both backbones offer a compelling balance between parameter efficiency and accuracy: rtdetr_dla34 achieves an APval of 49.6 with only 34M parameters, while rtdetr_regnet reaches 51.6 with 38M parameters, outperforming existing variants such as rtdetr_r50vd_m (42M params, APval 51.3). In addition to the accuracy gains, both backbones offer faster inference, making them well suited to real-world applications.
I have detailed the relevant information below.
Model Zoo
rtdetr_regnet: 38M params, APval 51.6
rtdetr_dla34: 34M params, APval 49.6
rtdetr_r50vd_m (existing): 42M params, APval 51.3
Training
RegNet
python tools/train.py -c configs/rtdetr/rtdetr_regnet_6x_coco.yml
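For multi-GPU training, the same config can be launched through torchrun; this is a sketch assuming tools/train.py supports distributed launch the same way the base rtdetr_pytorch configs do, and the GPU count is illustrative.
# Illustrative multi-GPU launch (assumes tools/train.py supports torch distributed launch,
# as in the base rtdetr_pytorch setup); adjust --nproc_per_node to your GPU count.
torchrun --nproc_per_node=4 tools/train.py -c configs/rtdetr/rtdetr_regnet_6x_coco.yml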
DLA34
python tools/train.py -c configs/rtdetr/rtdetr_dla34_6x_coco.yml
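Evaluation of a trained checkpoint can go through the same entry point. The sketch below assumes the base rtdetr_pytorch train.py interface (the -r and --test-only flags), so verify them against your version; the checkpoint path is a placeholder.
# Illustrative evaluation run (flags assume the base rtdetr_pytorch train.py interface;
# path/to/checkpoint.pth is a placeholder for your trained weights).
python tools/train.py -c configs/rtdetr/rtdetr_dla34_6x_coco.yml -r path/to/checkpoint.pth --test-only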