YizJia / DDPS

Official PyTorch codebase for "Disaggregation Distillation for Person Search" [TMM 2024]

Introduction

This is the official implementation for "Disaggregation Distillation for Person Search" in TMM 2024.

Contributions

[Figure: overall architecture of DDPS]

Performance

| Model | Backbone | CUHK-SYSU mAP | CUHK-SYSU Top-1 | PRW mAP | PRW Top-1 |
|---|---|---|---|---|---|
| T | ResNet-50 | 91.96 | 92.72 | 41.43 | 79.24 |
| S | ResNet-18 | 89.96 | 91.03 | 38.50 | 77.39 |
| S w/ DDPS | ResNet-18 | 92.57 | 93.45 | 41.93 | 79.58 |
| S | ResNet-18 (0.5) | 79.23 | 81.34 | 32.29 | 74.28 |
| S w/ DDPS | ResNet-18 (0.5) | 86.03 | 87.21 | 36.50 | 76.28 |
| S | MobileNetV2 | 89.60 | 91.00 | 39.85 | 79.97 |
| S w/ DDPS | MobileNetV2 | 91.12 | 92.45 | 41.78 | 80.21 |

T = teacher model, S = student model.

Installation

conda create -n ddps python=3.8 -y && conda activate ddps
pip install -r requirements.txt

Quick Start

Let's say $ROOT is the root directory.

  1. Download CUHK-SYSU and PRW datasets, and unzip them to $ROOT/data
    $ROOT/data
    ├── CUHK-SYSU
    └── PRW
  2. Following the link in the above table, download our pretrained model to anywhere you like, e.g., $ROOT/exp_cuhk
  3. Run an inference demo by specifying the paths of the checkpoint and the corresponding configuration file:
     python demo.py --cfg $ROOT/exp_cuhk/config.yaml --ckpt $ROOT/exp_cuhk/epoch_19.pth
     You can check out the result in the demo_imgs directory.
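The dataset layout from step 1 can be sanity-checked before running anything. The helper below is not part of this repository; it only encodes the directory names assumed in the steps above:

```python
import os

def check_data_root(root):
    # Verify the dataset layout expected under $ROOT/data:
    #   $ROOT/data/CUHK-SYSU and $ROOT/data/PRW
    expected = ["CUHK-SYSU", "PRW"]
    data_dir = os.path.join(root, "data")
    missing = [d for d in expected
               if not os.path.isdir(os.path.join(data_dir, d))]
    return missing  # an empty list means the layout looks correct
```

Calling `check_data_root("$ROOT")` (with the real path) returns the names of any missing dataset directories.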

[Figure: demo.jpg]

Execution Instructions

Training

  1. Train the teacher model with ResNet-50 as the backbone:
     bash job_train_base.sh
  2. Train the student model with the teacher and our proposed DDPS method:
     bash job_train_ddps.sh
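DDPS's disaggregated distillation losses are defined in the training scripts above. For orientation only, here is a minimal, generic logit-distillation loss in plain Python (temperature-softened KL divergence in the style of classic knowledge distillation); it is an illustration of the general technique, not the repository's implementation:

```python
import math

def softmax(logits, T=1.0):
    # Temperature-scaled softmax over a list of logits.
    m = max(x / T for x in logits)  # subtract the max for numerical stability
    exps = [math.exp(x / T - m) for x in logits]
    s = sum(exps)
    return [e / s for e in exps]

def kd_loss(student_logits, teacher_logits, T=4.0):
    # KL(teacher || student) on temperature-softened distributions,
    # scaled by T^2 so gradients keep a comparable magnitude across T.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return (T * T) * sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
```

The loss is zero when student and teacher logits agree and grows as their softened distributions diverge.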

Test

Suppose the output directory is $ROOT/exp_cuhk. Test the trained model:

python train_ddps.py --cfg $ROOT/exp_cuhk/config.yaml --eval --ckpt $ROOT/exp_cuhk/epoch_19.pth
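The mAP and Top-1 numbers in the table above follow the standard person-search evaluation protocol. As a generic illustration (not the repository's evaluator), the per-query quantities can be computed as:

```python
def average_precision(ranked_relevance):
    # ranked_relevance: 0/1 relevance flags for the gallery,
    # sorted by descending matching score for one query.
    hits, precisions = 0, []
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions.append(hits / rank)  # precision at this recall point
    return sum(precisions) / hits if hits else 0.0

def top1(ranked_relevance):
    # 1.0 if the top-ranked gallery item matches the query, else 0.0.
    return 1.0 if ranked_relevance and ranked_relevance[0] else 0.0
```

mAP is the mean of `average_precision` over all queries, and Top-1 is the mean of `top1`.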

Acknowledgement

Thanks to SeqNet for providing a solid codebase.