We release code for Light-Head R-CNN.
This is the best practice from my research.
This repo is organized as follows:
```
light_head_rcnn/
|->experiments
|  |->user
|  |  |->your_models
|->lib
|->tools
|->output
```
All experiments use 80k training and 35k validation images, and are tested on minival, a 5k subset of the validation set. Note that test-dev results should be slightly higher than minival.

Model Name | <sub>mAP@all</sub> | <sub>mAP@0.5</sub> | <sub>mAP@0.75</sub> | <sub>mAP@S</sub> | <sub>mAP@M</sub> | <sub>mAP@L</sub> |
---|---|---|---|---|---|---|
R-FCN, ResNet-v1-101 our reproduce baseline | 35.5 | 54.3 | 33.8 | 12.8 | 34.9 | 46.1 |
Light-Head R-CNN ResNet-v1-101 | 38.2 | 60.9 | 41.0 | 20.9 | 42.2 | 52.8 |
Light-Head,ResNet-v1-101 +align pooling | 39.3 | 61.0 | 42.4 | 22.2 | 43.8 | 53.2 |
Light-Head,ResNet-v1-101 +align pooling + nms0.5 | 40.0 | 62.1 | 42.9 | 22.5 | 44.6 | 54.0 |
Experiment paths for the models above:
```
experiments/lizeming/rfcn_reproduce.ori_res101.coco.baseline
experiments/lizeming/light_head_rcnn.ori_res101.coco
experiments/lizeming/light_head_rcnn.ori_res101.coco.ps_roialign
experiments/lizeming/light_head_rcnn.ori_res101.coco.ps_roialign
```
Clone the Light-Head R-CNN repository; we will refer to the directory it is cloned into as `${lighthead_ROOT}`:
```
git clone https://github.com/zengarden/light_head_rcnn
```
Compile the libraries:
```
cd ${lighthead_ROOT}/lib;
bash make.sh
```
Make sure all of the compilation succeeds. If errors arise, the FAQ lists some common compile errors.
```
cd ${lighthead_ROOT};
mkdir output
mkdir data
```
data should be organized as follows:
```
data/
|->imagenet_weights/res101.ckpt
|->MSCOCO
|  |->odformat
|  |->instances_xxx.json
|  |->train2014
|  |->val2014
```
Download res101 basemodel:
```
wget -v http://download.tensorflow.org/models/resnet_v1_101_2016_08_28.tar.gz
tar -xzvf resnet_v1_101_2016_08_28.tar.gz
mv resnet_v1_101.ckpt res101.ckpt
```
We convert instances_xxx.json to odformat (object detection format), where each line is a JSON annotation for one image. Our converted odformat files are shared on Google Drive as odformat.zip.
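As a minimal sketch of consuming this format, each line can be parsed independently as JSON. The file name and field access below are illustrative assumptions, not the repo's actual schema:

```python
import json

# Sketch: read an odformat file, one JSON annotation per line.
# The path and any field names are assumptions for illustration;
# check the converted files for the actual schema.
def load_odformat(path):
    records = []
    with open(path) as f:
        for line in f:
            line = line.strip()
            if line:
                records.append(json.loads(line))
    return records

records = load_odformat("data/MSCOCO/odformat/instances_train.odformat")  # hypothetical file name
print(len(records), "annotated images")
```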
Use `-d` to assign GPU ids for testing (e.g. `-d 0,1,2,3` or `-d 0-3`), and `-s` to visualize the results. We share our experiments' output (logs) folder on Google Drive; download it, place it in `${lighthead_ROOT}`, and then test our released models.
e.g.
```
cd experiments/lizeming/light_head_rcnn.ori_res101.coco.ps_roialign
python3 test.py -d 0-7 -se 26
```
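For reference, here is a hypothetical sketch of how a `-d` argument such as `0-7` or `0,1,2,3` could be expanded into a list of device ids; the repo's own argument parsing may differ:

```python
# Hypothetical helper: expand a "-d" style GPU spec into device ids.
# Not the repo's actual code, just an illustration of the two accepted forms.
def parse_devices(spec):
    if "-" in spec:
        start, end = spec.split("-")
        return list(range(int(start), int(end) + 1))
    return [int(x) for x in spec.split(",")]

assert parse_devices("0-7") == [0, 1, 2, 3, 4, 5, 6, 7]
assert parse_devices("0,1,2,3") == [0, 1, 2, 3]
```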
We provide a commonly used train.py in tools, which can be copied or linked into an experiment folder.
e.g.
```
cd experiments/lizeming/light_head_rcnn.ori_res101.coco.ps_roialign
python3 config.py -tool
cp tools/train.py .
python3 train.py -d 0-7
```
This repo is designed to be fast and simple for research. There is still room for improvement: the anchor_target and proposal_target layers are implemented with tf.py_func, which means they run on the CPU.
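For context, here is a minimal TensorFlow 1.x sketch of tf.py_func: the wrapped NumPy routine executes as plain Python on the CPU, outside the compiled graph, which is why layers built this way cannot run on the GPU. The clip_boxes_np function is a toy example, not the repo's code:

```python
import numpy as np
import tensorflow as tf  # TensorFlow 1.x

# Toy NumPy routine wrapped with tf.py_func; it runs in the Python
# interpreter on the CPU, outside the TensorFlow graph.
def clip_boxes_np(boxes, im_size):
    return np.clip(boxes, 0.0, im_size - 1.0).astype(np.float32)

boxes = tf.placeholder(tf.float32, shape=[None, 4])
clipped = tf.py_func(clip_boxes_np, [boxes, tf.constant(600.0)], tf.float32)

with tf.Session() as sess:
    out = sess.run(clipped, feed_dict={boxes: [[-5.0, 10.0, 700.0, 300.0]]})
    print(out)
```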
This is an implementation of Light-Head R-CNN.
If you find Light-Head R-CNN useful in your research, please consider citing:
```
@article{li2017light,
  title={Light-Head R-CNN: In Defense of Two-Stage Object Detector},
  author={Li, Zeming and Peng, Chao and Yu, Gang and Zhang, Xiangyu and Deng, Yangdong and Sun, Jian},
  journal={arXiv preprint arXiv:1711.07264},
  year={2017}
}
```
First, find where cuda_config.h is located, e.g.:
```
find /usr/local/lib/ | grep cuda_config.h
```
Then export your CPATH, like:
```
export CPATH=$CPATH:/usr/local/lib/python3.5/dist-packages/external/local_config_cuda/cuda/
```