This is the code for Robust Partial Matching for Person Search in the Wild, accepted at CVPR 2020. The Align-to-Part Network (APNet) is proposed to alleviate the misalignment problem caused by the pedestrian detector, facilitating the downstream re-identification task. The code is based on maskrcnn-benchmark.
First install the dependencies of maskrcnn-benchmark, including apex. NOTE: If you meet any problems during the installation, you may find a solution in the issues of the official maskrcnn-benchmark.
Then build APNet:

```bash
git clone https://github.com/zhongyingji/APNet.git
cd APNet
rm -rf build/
python setup.py build develop
```
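If the build succeeds, the package should be importable. A quick sanity check (assuming the package keeps the upstream name maskrcnn_benchmark):

```bash
# Verify that the extensions built and the package is importable.
python -c "import maskrcnn_benchmark; print('maskrcnn_benchmark imported OK')"
```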
Make sure you have downloaded a person search dataset such as PRW-v16.04.20.
Copy all the files under keypoint_pred/ into the root directory of the dataset, e.g., /path_to_prw_dataset/PRW-v16.04.20/:

```bash
cp keypoint_pred/* /path_to_prw_dataset/PRW-v16.04.20/
```
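To double-check the copy, list the dataset root; the keypoint prediction files should now sit alongside the original PRW data (the exact file names depend on what keypoint_pred/ contains):

```bash
# The dataset root should now hold both the original PRW files
# and everything copied from keypoint_pred/.
ls /path_to_prw_dataset/PRW-v16.04.20/
```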
Then symlink the dataset into maskrcnn_benchmark/datasets/ as follows:

```bash
ln -s /path_to_prw_dataset/PRW-v16.04.20/ maskrcnn_benchmark/datasets/PRW-v16.04.20
```
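A quick way to confirm the link resolves correctly (paths as above):

```bash
# The link should list its target, and the dataset should be readable through it.
ls -l maskrcnn_benchmark/datasets/
ls maskrcnn_benchmark/datasets/PRW-v16.04.20/ > /dev/null && echo "symlink OK"
```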
APNet consists of three modules: OIM, RSFE, and BBA. To train the entire network, you can simply run:

```bash
./train.sh
```

which contains the training scripts of the three modules.
NOTE: Both RSFE and BBA are required to be initialized with the trained OIM. For more details, please check train.sh.
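For orientation, here is a minimal sketch of how such a staged script could look; the RSFE/BBA config names and the checkpoint path are hypothetical placeholders (only configs/reid/prw_R_50_C4.yaml, OUTPUT_DIR, and upstream's MODEL.WEIGHT option come from this README or maskrcnn-benchmark), so check train.sh for the actual commands:

```bash
#!/bin/bash
# Hypothetical staged pipeline; see train.sh for the real commands.

# Stage 1: train OIM.
python tools/train_net.py --config-file "configs/reid/prw_R_50_C4.yaml" \
    OUTPUT_DIR "models/prw_oim"

# Stages 2-3: train RSFE and BBA, each initialized from the trained OIM
# checkpoint. The config names and checkpoint file below are placeholders.
python tools/train_net.py --config-file "configs/reid/prw_rsfe.yaml" \
    MODEL.WEIGHT "models/prw_oim/model_final.pth" OUTPUT_DIR "models/prw_rsfe"
python tools/train_net.py --config-file "configs/reid/prw_bba.yaml" \
    MODEL.WEIGHT "models/prw_oim/model_final.pth" OUTPUT_DIR "models/prw_bba"
```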
You can alter the scripts in train.sh in the following aspects:
- We train OIM on 2 GPUs with batch size 4. If you encounter an out-of-memory (OOM) error, reduce the batch size by setting SOLVER.IMS_PER_BATCH to a smaller number (see the sketch after this list).
- If you want to use 1 GPU, replace the command of OIM with the single-GPU training script:

  ```bash
  python tools/train_net.py --config-file "configs/reid/prw_R_50_C4.yaml" SOLVER.IMS_PER_BATCH 2 TEST.IMS_PER_BATCH 8 OUTPUT_DIR "models/prw_oim"
  ```
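As an illustration of the first point, a 2-GPU launch with the batch size halved might look like the following, assuming the standard torch.distributed.launch entry point that maskrcnn-benchmark uses; the exact flags in train.sh may differ:

```bash
# Hypothetical 2-GPU run with SOLVER.IMS_PER_BATCH reduced from 4 to 2 to avoid OOM.
python -m torch.distributed.launch --nproc_per_node=2 tools/train_net.py \
    --config-file "configs/reid/prw_R_50_C4.yaml" \
    SOLVER.IMS_PER_BATCH 2 TEST.IMS_PER_BATCH 8 OUTPUT_DIR "models/prw_oim"
```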
After each module has been trained, you can run exactly the same training script of that module to test its performance.
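For example, once OIM has finished training, re-issuing its command should evaluate the trained model; maskrcnn-benchmark resumes from the last checkpoint recorded in OUTPUT_DIR, so the finished model is picked up automatically (behavior assumed from upstream):

```bash
# Re-running the OIM script after training loads the final checkpoint
# from models/prw_oim and proceeds straight to evaluation.
python tools/train_net.py --config-file "configs/reid/prw_R_50_C4.yaml" SOLVER.IMS_PER_BATCH 2 TEST.IMS_PER_BATCH 8 OUTPUT_DIR "models/prw_oim"
```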
If you find this work or code helpful in your research, please consider citing: