Implementation for: DistilPose: Tokenized Pose Regression with Heatmap Distillation
Suhang Ye*, Yingyi Zhang*, Jie Hu*, Liujuan Cao, Shengchuan Zhang✉, Lei Shen, Jun Wang, Shouhong Ding, Rongrong Ji. In: Conference on Computer Vision and Pattern Recognition (CVPR), 2023.
arXiv preprint: arXiv:2303.02455
(* equal contribution)
In the field of human pose estimation, regression-based methods dominate in terms of speed, while heatmap-based methods are far ahead in terms of performance. How to take advantage of both schemes remains a challenging problem. In this paper, we propose DistilPose, a novel human pose estimation framework that bridges the gap between heatmap-based and regression-based methods.
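To make the core idea concrete, the sketch below shows one generic way a regression student can borrow localization knowledge from a frozen heatmap teacher. This is a minimal illustration of heatmap-to-regression distillation, not DistilPose's actual token-distilling encoder or simulated-heatmap losses; `soft_argmax`, the L1 losses, and the `alpha` weighting are all illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def soft_argmax(heatmaps: torch.Tensor) -> torch.Tensor:
    """Turn teacher heatmaps (B, K, H, W) into normalized (x, y) coordinates."""
    b, k, h, w = heatmaps.shape
    probs = F.softmax(heatmaps.reshape(b, k, -1), dim=-1).reshape(b, k, h, w)
    xs = torch.linspace(0, 1, w, device=heatmaps.device)
    ys = torch.linspace(0, 1, h, device=heatmaps.device)
    x = (probs.sum(dim=2) * xs).sum(dim=-1)  # marginalize over rows, expectation over columns
    y = (probs.sum(dim=3) * ys).sum(dim=-1)  # marginalize over columns, expectation over rows
    return torch.stack([x, y], dim=-1)       # (B, K, 2)

def distillation_loss(student_coords, teacher_heatmaps, gt_coords, alpha=0.5):
    """Blend the ground-truth regression loss with a teacher-guided term (alpha is a guess)."""
    teacher_coords = soft_argmax(teacher_heatmaps).detach()  # no gradient into the teacher
    loss_gt = F.l1_loss(student_coords, gt_coords)
    loss_kd = F.l1_loss(student_coords, teacher_coords)
    return (1 - alpha) * loss_gt + alpha * loss_kd

# toy shapes: batch of 2 images, 17 COCO keypoints, 64x48 teacher heatmaps
student = torch.rand(2, 17, 2, requires_grad=True)
loss = distillation_loss(student, torch.randn(2, 17, 64, 48), torch.rand(2, 17, 2))
loss.backward()
```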
DistilPose depends on PyTorch and MMPose. Please install the following packages:
```shell
conda create -n distilpose python=3.8 pytorch=1.7.0 torchvision -c pytorch -y
conda activate distilpose
pip3 install openmim
mim install mmcv-full==1.3.8
git submodule update --init
cd mmpose
git checkout v0.22.0
pip3 install -e .
cd ..
pip3 install -r requirements.txt
```
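As a quick sanity check of the environment (a suggestion, not part of the official instructions), the pinned versions installed above can be verified from Python:

```python
# verify that the pinned versions resolved correctly
import torch, mmcv, mmpose
print(torch.__version__)   # expect 1.7.0
print(mmcv.__version__)    # expect 1.3.8
print(mmpose.__version__)  # expect 0.22.0
```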
Download the teacher checkpoint and save it to `./teacher_chkpts`. Then train the model on COCO as below:
```shell
./tools/dist_train.sh configs/body/2d_kpt_sview_rgb_img/distilpose/coco/DistilPose_S_coco_256x192.py 8
```
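`dist_train.sh` launches distributed training on 8 GPUs. On a single GPU, the plain MMPose 0.x entry point that the distributed script wraps should also work; treat this as an untested assumption for this repo:

```shell
# single-GPU training via the standard MMPose 0.x launcher
python tools/train.py configs/body/2d_kpt_sview_rgb_img/distilpose/coco/DistilPose_S_coco_256x192.py
```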
To evaluate on COCO, download a checkpoint and run the following command:
```shell
./tools/dist_test.sh configs/body/2d_kpt_sview_rgb_img/distilpose/coco/DistilPose_S_coco_256x192.py \
    ./checkpoints/distilpose_s.pth 8
```
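Beyond COCO evaluation, a trained checkpoint can be tried on a single image with MMPose 0.x's generic top-down inference API. This is a hedged sketch: `demo.jpg`, the whole-image bounding box, and the output path are placeholders, and the calls are the standard MMPose 0.22 ones rather than anything DistilPose-specific.

```python
from mmpose.apis import (inference_top_down_pose_model, init_pose_model,
                         vis_pose_result)

config = 'configs/body/2d_kpt_sview_rgb_img/distilpose/coco/DistilPose_S_coco_256x192.py'
checkpoint = './checkpoints/distilpose_s.pth'
model = init_pose_model(config, checkpoint, device='cuda:0')

img = 'demo.jpg'  # placeholder: any image containing a person
# In practice person boxes come from a detector; here we feed one
# whole-image box in (x, y, w, h) format as a stand-in.
person_results = [{'bbox': [0, 0, 640, 480]}]

pose_results, _ = inference_top_down_pose_model(
    model, img, person_results, format='xywh')
vis_pose_result(model, img, pose_results, out_file='vis_demo.jpg')
```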
| Name | Role | Params (M) | GFLOPs | AP | Download |
|---|---|---|---|---|---|
| DistilPose-S | Student | 5.4 | 2.38 | 71.6 | Code:3733 |
| DistilPose-L | Student | 21.3 | 10.33 | 74.4 | Code:5tni |
| TokenPose-L | Teacher | 69.4 | 17.03 | 75.2 | Code:b8vn |
Please consider citing our paper in your publications if this project helps your research. The BibTeX reference is as follows.
```bibtex
@article{ye2023distilpose,
  title={DistilPose: Tokenized Pose Regression with Heatmap Distillation},
  author={Ye, Suhang and Zhang, Yingyi and Hu, Jie and Cao, Liujuan and Zhang, Shengchuan and Shen, Lei and Wang, Jun and Ding, Shouhong and Ji, Rongrong},
  journal={arXiv preprint arXiv:2303.02455},
  year={2023}
}
```