
Multi-Channel Network: Constructing Efficient GCN Baselines for Skeleton-based Action Recognition

Dependencies
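
As a rough sketch, a PyTorch skeleton-GCN codebase like this one typically needs a recent Python and PyTorch plus a few utilities. The exact package set below is an assumption; prefer the repo's own requirements file if one exists:

```bash
# Package set is an assumption -- check the repo for a requirements.txt.
pip install torch torchvision pyyaml tqdm tensorboardX
```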

Data Preparation

Disk usage warning: after preprocessing, the total sizes of the datasets are around 77 GB, 157 GB, and 63 GB for NTU RGB+D 60, NTU RGB+D 120, and Kinetics 400, respectively. The raw and intermediate files may take up even more space.

Download Datasets

There are 3 datasets to download:

NTU RGB+D 60 and 120

  1. Request dataset here: http://rose1.ntu.edu.sg/Datasets/actionRecognition.asp

  2. Download the skeleton-only datasets:

    • nturgbd_skeletons_s001_to_s017.zip (NTU RGB+D 60)
    • nturgbd_skeletons_s018_to_s032.zip (NTU RGB+D 120, on top of NTU RGB+D 60)
    • Their total size should be 5.8 GB + 4.5 GB.
  3. Download the missing-skeleton lookup files from the authors' GitHub repo:

    • NTU RGB+D 60 Missing Skeletons: wget https://raw.githubusercontent.com/shahroudy/NTURGB-D/master/Matlab/NTU_RGBD_samples_with_missing_skeletons.txt

    • NTU RGB+D 120 Missing Skeletons: wget https://raw.githubusercontent.com/shahroudy/NTURGB-D/master/Matlab/NTU_RGBD120_samples_with_missing_skeletons.txt

    • Remember to remove the first few lines of descriptive text in these files (see the snippet below).
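
One hedged way to strip those header lines is to keep only lines that look like NTU sample IDs; the `S###...` pattern is an assumption based on NTU file naming, so inspect the files before running this:

```bash
# Keep only lines that look like sample IDs, e.g. S001C002P003R002A013.
# The pattern is an assumption -- eyeball the files first.
for f in NTU_RGBD_samples_with_missing_skeletons.txt \
         NTU_RGBD120_samples_with_missing_skeletons.txt; do
  grep -E '^S[0-9]{3}' "$f" > "$f.tmp" && mv "$f.tmp" "$f"
done
```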

NW-UCLA

  1. Download the dataset from here.

  2. Move all_sqe to ./data/NW-UCLA.

Data Preprocessing

Directory Structure

Put downloaded data into the following directory structure:

- data/
  - nturgbd_raw/
    - nturgb+d_skeletons/     # from `nturgbd_skeletons_s001_to_s017.zip`
      ...
    - nturgb+d_skeletons120/  # from `nturgbd_skeletons_s018_to_s032.zip`
      ...
    - NTU_RGBD_samples_with_missing_skeletons.txt
    - NTU_RGBD120_samples_with_missing_skeletons.txt
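
A hypothetical sequence of commands to lay out this structure; the folder names produced by the archives are assumptions, so check them with `unzip -l` first:

```bash
mkdir -p data/nturgbd_raw
# Target folder names are assumptions -- verify the archive contents first.
unzip nturgbd_skeletons_s001_to_s017.zip -d data/nturgbd_raw/nturgb+d_skeletons
unzip nturgbd_skeletons_s018_to_s032.zip -d data/nturgbd_raw/nturgb+d_skeletons120
mv NTU_RGBD_samples_with_missing_skeletons.txt data/nturgbd_raw/
mv NTU_RGBD120_samples_with_missing_skeletons.txt data/nturgbd_raw/
```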

Generating Data
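
If this repo follows the common NTU preprocessing pipeline that the directory layout above suggests, generation looks roughly like the following; the script names and locations are assumptions borrowed from typical NTU pipelines:

```bash
cd ./data/ntu   # or: cd ./data/ntu120
# Script names are assumptions based on common NTU preprocessing pipelines.
python get_raw_skes_data.py        # parse the raw .skeleton files
python get_raw_denoised_data.py    # drop noisy and missing skeletons
python seq_transformation.py       # normalize sequences and write train/val splits
```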

Pretrained Models

Put pretrained models into the following directory structure:

- multi-stream/
  - pretrained-models/
  - main.py
  - ...

Training & Testing

Acknowledgements

This repo is based on

Thanks to the original authors for their work!