AbnerHqC / GaitSet

A flexible, effective and fast cross-view gait recognition network

GaitSet


GaitSet is a flexible, effective and fast network for cross-view gait recognition. The paper has been published on IEEE TPAMI.

Flexible

The input of GaitSet is a set of silhouettes.
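Treating the input as a set means frame order and sequence length do not matter: per-frame features are aggregated with a permutation-invariant operation. A minimal sketch of the idea, using element-wise max pooling over the frame dimension (the feature shapes below are illustrative, not the network's actual dimensions):

```python
import torch

def set_pooling(frame_features: torch.Tensor) -> torch.Tensor:
    """Collapse an unordered set of per-frame features into a single
    set-level feature via element-wise maximum over frames.

    frame_features: (n_frames, channels, height, width)
    returns:        (channels, height, width)
    """
    return frame_features.max(dim=0).values

# Sets of different sizes map to the same output shape,
# which is what makes variable-length silhouette sets workable.
short_set = torch.rand(30, 128, 16, 11)
long_set = torch.rand(70, 128, 16, 11)
assert set_pooling(short_set).shape == set_pooling(long_set).shape
```

Because the maximum is taken per element, shuffling the frames leaves the output unchanged.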

Effective

It achieves Rank@1=95.0% on CASIA-B and Rank@1=87.1% on OU-MVLP, excluding identical-view cases.

Fast

With 8 NVIDIA 1080Ti GPUs, it takes only 7 minutes to run an evaluation on OU-MVLP, which contains 133,780 sequences with an average of 70 frames per sequence.

What's new

The code and checkpoint for the OU-MVLP dataset have been released. See OUMVLP for details.

Prerequisites

Getting started

Installation

Note that our code has been tested with PyTorch 0.4.

Dataset & Preparation

Download CASIA-B Dataset

!!! ATTENTION !!! ATTENTION !!! ATTENTION !!!

Before training or testing, please make sure you have prepared the dataset with these two steps:

Furthermore, you can also test our code on the OU-MVLP Dataset. The number of channels and the training batch size are slightly different for this dataset. For more details, please refer to our paper.

Pretreatment

pretreatment.py uses the alignment method in this paper. Preprocess your dataset by running

python pretreatment.py --input_path='root_path_of_raw_dataset' --output_path='root_path_for_output'
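For illustration, here is a rough NumPy-only sketch of the kind of silhouette alignment such a preprocessing step performs (crop to the subject's vertical extent, center on the horizontal centroid, resize). The function name and the exact steps are simplified assumptions, not the repository's pretreatment.py:

```python
import numpy as np

def align_silhouette(img: np.ndarray, out_h: int = 64, out_w: int = 64) -> np.ndarray:
    """Roughly align one binary silhouette: crop to the body's vertical
    extent, center a square-ish window on the horizontal centroid, and
    resize. `img` is a 2-D uint8 array with the subject as non-zero pixels."""
    ys, xs = np.nonzero(img)
    img = img[ys.min():ys.max() + 1, :]        # crop top/bottom to the body
    cx = int(np.nonzero(img)[1].mean())        # horizontal centroid column
    half = img.shape[0] // 2                   # window half-width from height
    # pad left/right so the centered crop never leaves the array
    img = np.pad(img, ((0, 0), (half, half)))
    img = img[:, cx:cx + 2 * half]             # centroid-centered crop
    # nearest-neighbour resize via pure indexing (no OpenCV dependency)
    ri = np.arange(out_h) * img.shape[0] // out_h
    ci = np.arange(out_w) * img.shape[1] // out_w
    return img[np.ix_(ri, ci)]
```

The real script additionally handles batches of frames and discards unusable silhouettes; this sketch only shows the per-frame geometry.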

Configuration

In config.py, you might want to change the following settings:
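For orientation, such settings are typically grouped in a nested Python dict. Every key and value below is a hypothetical illustration only; check the repository's config.py for the real names and defaults:

```python
# Hypothetical illustration of the kind of settings config.py exposes;
# the actual keys and defaults in the repository may differ.
conf = {
    "WORK_PATH": "./work",            # where checkpoints and logs are written
    "CUDA_VISIBLE_DEVICES": "0,1",    # GPUs used for training/evaluation
    "data": {
        "dataset_path": "root_path_for_output",  # output of pretreatment.py
        "resolution": "64",           # silhouette size after pretreatment
    },
    "model": {
        "batch_size": (8, 16),        # (subjects, sequences per subject)
        "frame_num": 30,              # frames sampled per sequence
    },
}
```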

Train

Train a model by

python train.py

Evaluation

Evaluate the trained model by

python test.py

It will output Rank@1 for all three walking conditions. Note that the evaluation is parallelizable; for a faster evaluation, use --batch_size to change the batch size used for testing.
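The cross-view protocol matches probes from each view against galleries from every other view and excludes identical-view pairs, as in the headline numbers above. A small NumPy sketch of how such a Rank@1 score can be computed (this is not the repository's test.py; the function name and the Euclidean metric are assumptions):

```python
import numpy as np

def rank1_cross_view(feats: np.ndarray, labels: np.ndarray, views: np.ndarray) -> float:
    """Rank@1 accuracy over all (probe view, gallery view) pairs with
    probe view != gallery view ("excluding identical-view cases").
    feats: (n, d) float features; labels, views: (n,) int arrays."""
    correct, total = 0, 0
    for pv in np.unique(views):
        for gv in np.unique(views):
            if pv == gv:                      # skip identical-view pairs
                continue
            p, g = views == pv, views == gv
            # Euclidean distance between every probe and gallery feature
            d = np.linalg.norm(feats[p][:, None] - feats[g][None], axis=-1)
            nearest = labels[g][d.argmin(axis=1)]
            correct += int((nearest == labels[p]).sum())
            total += int(p.sum())
    return correct / total
```

In practice the per-view distance computations are independent, which is why the evaluation parallelizes well across GPUs.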

OUMVLP

Due to the large differences between OU-MVLP and CASIA-B, the network settings for OU-MVLP are slightly different.

To Do List

Authors & Contributors

GaitSet is authored by Hanqing Chao, Yiwei He, Junping Zhang and Jianfeng Feng from Fudan University. Junping Zhang is the corresponding author. The code was developed by Hanqing Chao and Yiwei He. Currently, it is maintained by Hanqing Chao and Kun Wang.

Citation

Please cite this paper in your publications if it helps your research:

@ARTICLE{chao2019gaitset,
  author={Chao, Hanqing and Wang, Kun and He, Yiwei and Zhang, Junping and Feng, Jianfeng},
  journal={IEEE Transactions on Pattern Analysis and Machine Intelligence}, 
  title={GaitSet: Cross-view Gait Recognition through Utilizing Gait as a Deep Set}, 
  year={2021},
  pages={1-1},
  doi={10.1109/TPAMI.2021.3057879}}

Link to paper:

License

GaitSet is freely available for non-commercial use and may be redistributed under these conditions. For commercial queries, please contact Junping Zhang.