bailvwangzi/repulsion_loss_ssd

Repulsion Loss: Detecting Pedestrians in a Crowd. https://arxiv.org/abs/1711.07752
MIT License

Repulsion Loss implemented with SSD

Forked from PyTorch-SSD, a PyTorch implementation of Single Shot MultiBox Detector (SSD) from the 2016 paper by Wei Liu, Dragomir Anguelov, Dumitru Erhan, Christian Szegedy, Scott Reed, Cheng-Yang Fu, and Alexander C. Berg. The official and original Caffe code can be found here.
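For context, the loss in the paper combines the standard attraction (localization) term with two repulsion terms, L = L_Attr + α·L_RepGT + β·L_RepBox; the RepGT term applies a smoothed -ln penalty to the intersection-over-ground-truth (IoG) between each positive proposal and its most-overlapping non-target ground truth. The sketch below illustrates the RepGT term only; the function names and box format (x1, y1, x2, y2) are illustrative, not this repository's actual API.

```python
import math
import torch

def iog(pred, gt):
    """Intersection over Ground-truth: area(P ∩ G) / area(G).
    pred, gt: [N, 4] boxes in (x1, y1, x2, y2), matched row by row."""
    lt = torch.max(pred[:, :2], gt[:, :2])          # intersection top-left
    rb = torch.min(pred[:, 2:], gt[:, 2:])          # intersection bottom-right
    wh = (rb - lt).clamp(min=0)                     # zero width/height if disjoint
    inter = wh[:, 0] * wh[:, 1]
    gt_area = (gt[:, 2] - gt[:, 0]) * (gt[:, 3] - gt[:, 1])
    return inter / gt_area.clamp(min=1e-6)

def smooth_ln(x, sigma=0.5):
    """Smoothed ln penalty from the paper: -ln(1 - x) for x <= sigma,
    continued linearly above sigma to keep gradients bounded."""
    linear = (x - sigma) / (1 - sigma) - math.log(1 - sigma)
    return torch.where(x <= sigma, -torch.log((1 - x).clamp(min=1e-6)), linear)

def rep_gt_loss(pred_boxes, rep_gt_boxes, sigma=0.5):
    """RepGT term: repel each predicted box from its most-overlapping
    *non-target* ground-truth box (rep_gt_boxes, selected beforehand)."""
    return smooth_ln(iog(pred_boxes, rep_gt_boxes), sigma).mean()
```

In training, this term is added to the usual SSD localization and confidence losses with a weighting factor, so a higher IoG with a neighbouring (non-target) pedestrian is penalised and boxes are pushed apart in crowded scenes.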

Table of Contents

- Installation
- Datasets
- Training SSD
- Evaluation
- Example
- Performance
- Demos
- TODO
- Authors
- References

Installation

Datasets

To make things easy, we provide bash scripts to handle the dataset downloads and setup for you. We also provide simple dataset loaders that subclass torch.utils.data.Dataset, making them fully compatible with the torchvision.datasets API.
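Because the loaders are standard Dataset subclasses, they plug straight into a torch.utils.data.DataLoader. The class and helper names below (VOCDetection, detection_collate, SSDAugmentation) are assumed from the upstream ssd.pytorch layout and may differ slightly in this fork; check the data/ package.

```python
import os
import torch.utils.data as data

# Names assumed from the upstream ssd.pytorch layout; check data/ in this fork.
from data import VOCDetection, detection_collate
from utils.augmentations import SSDAugmentation

voc_root = os.path.expanduser('~/data/VOCdevkit/')      # where the download script puts VOC
dataset = VOCDetection(root=voc_root,
                       image_sets=[('2007', 'trainval')],
                       transform=SSDAugmentation(300))  # 300x300 SSD input with train-time augmentation

loader = data.DataLoader(dataset, batch_size=32, shuffle=True,
                         num_workers=4, collate_fn=detection_collate,
                         pin_memory=True)

images, targets = next(iter(loader))  # images: [32, 3, 300, 300]; targets: list of [n_objs, 5] tensors
```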

COCO

Microsoft COCO: Common Objects in Context

Download COCO 2014
# specify a directory for the dataset to be downloaded into; the default is ~/data/
sh data/scripts/COCO2014.sh

VOC Dataset

PASCAL VOC: Visual Object Classes

Download VOC2007 trainval & test
# specify a directory for the dataset to be downloaded into; the default is ~/data/
sh data/scripts/VOC2007.sh # <directory>
Download VOC2012 trainval
# specify a directory for the dataset to be downloaded into; the default is ~/data/
sh data/scripts/VOC2012.sh # <directory>

Training SSD

First download the fc-reduced VGG-16 base network weights, then start training from the repository root:

mkdir weights
cd weights
wget https://s3.amazonaws.com/amdegroot-models/vgg16_reducedfc.pth
cd ..
python train.py
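Hyper-parameters are read from command-line flags in train.py. The flag names below follow the upstream ssd.pytorch script and are shown only as an illustration; run python train.py --help to see the ones this fork actually accepts.

python train.py --batch_size 32 --lr 1e-3
# resume training from a saved checkpoint (path shown is illustrative)
python train.py --resume weights/ssd300_VOC_10000.pth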

Evaluation

To evaluate a trained network:

python eval.py

You can override the parameters defined in eval.py either on the command line or by editing the file directly.
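For example (the --trained_model flag is assumed from the upstream ssd.pytorch eval.py, and the checkpoint path is illustrative):

python eval.py --trained_model weights/ssd300_VOC_120000.pth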

Example

Side-by-side detection results on crowded scenes: SSD vs. SSD + repulsion loss.

Performance

VOC2007 Test

Method    | mAP    | mAP on Crowd
--------- | ------ | ------------
SSD       | 77.52% | 48.24%
SSD+RepGT | 77.43% | 50.12%

Demos

Use a pre-trained SSD network for detection

Download a pre-trained network

Try the demo notebook

# make sure pip is upgraded
pip3 install --upgrade pip
# install jupyter notebook
pip install jupyter
# Run this inside ssd.pytorch
jupyter notebook

Try the webcam demo

TODO

We have accumulated a to-do list of planned improvements that we hope to complete in the near future.

Authors

References