This is the official PyTorch implementation of our paper:
Clone this project. NVIDIA GPUs are needed.
git clone https://github.com/ShirleyMaxx/ChimpACT
cd ChimpACT
For the three benchmarks, we use MMTracking, MMPose, and MMAction2, each with its own running environment. We recommend using Anaconda virtual environments. Follow the installation instructions below for each benchmark task.
We follow the installation instructions in MMTracking.
conda create -n chimp_track python=3.8 -y
conda activate chimp_track
pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
# install the latest mmcv
# pip install mmcv-full -f https://download.openmmlab.com/mmcv/dist/{cu_version}/{torch_version}/index.html
pip install mmcv-full -f https://download.openmmlab.com/mmcv/dist/cu111/torch1.8.0/index.html
# install mmdetection
pip install mmdet==2.28.2
# install mmtracking
cd mmtracking
pip install -r requirements/build.txt
pip install -v -e .
cd TrackEval
pip install -e .
cd ..
pip install ipdb termcolor imageio imageio[ffmpeg] communities future tensorboard
cd ..
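After installation, you can optionally confirm that the core packages import correctly inside the `chimp_track` environment. The snippet below is only a minimal sanity check, not part of the benchmark pipeline:

```python
# Minimal sanity check for the chimp_track environment (optional sketch).
import torch
import mmcv
import mmdet
import mmtrack

print("torch:", torch.__version__, "| CUDA available:", torch.cuda.is_available())
print("mmcv-full:", mmcv.__version__)
print("mmdet:", mmdet.__version__)
print("mmtrack:", mmtrack.__version__)
```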
We follow the installation instructions in MMPose.
conda create --name chimp_pose python=3.8 -y
conda activate chimp_pose
pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install -U openmim
mim install mmengine
mim install "mmcv>=2.0.0"
cd mmpose
pip install -r requirements.txt
pip install -v -e .
# "-v" means verbose, or more output
# "-e" means installing a project in editable mode,
# thus any local modifications made to the code will take effect without reinstallation.
pip install tensorboard pycocotools seaborn tqdm ipdb imageio openpyxl
pip uninstall -y Pillow
pip install Pillow==9.5.0
cd ..
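Optionally, you can verify that the `chimp_pose` environment resolved the intended versions (mmcv 2.x and the pinned Pillow). Again, this is only a quick check, not part of the benchmark code:

```python
# Minimal sanity check for the chimp_pose environment (optional sketch).
import mmcv
import mmengine
import mmpose
import PIL

print("mmcv:", mmcv.__version__)        # expected >= 2.0.0
print("mmengine:", mmengine.__version__)
print("mmpose:", mmpose.__version__)
print("Pillow:", PIL.__version__)       # pinned to 9.5.0 above
```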
We follow the installation instructions in MMAction2.
conda create --name chimp_action python=3.8 -y
conda activate chimp_action
pip install torch==1.8.0+cu111 torchvision==0.9.0+cu111 torchaudio==0.8.0 -f https://download.pytorch.org/whl/torch_stable.html
pip install -U openmim
mim install mmengine
mim install mmcv
mim install mmdet
mim install mmpose
cd mmaction2
pip install -v -e .
# "-v" means verbose, or more output
# "-e" means installing a project in editable mode,
# thus any local modifications made to the code will take effect without reinstallation.
pip install tensorboard seaborn tqdm ipdb imageio
pip install imageio[ffmpeg]
cd ..
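Similarly, you can check that the `mim`-installed dependencies are visible from the `chimp_action` environment; this is only an optional sanity check:

```python
# Minimal sanity check for the chimp_action environment (optional sketch).
import mmdet
import mmpose
import mmaction

print("mmdet:", mmdet.__version__)
print("mmpose:", mmpose.__version__)
print("mmaction2:", mmaction.__version__)
```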
Please check that there is more than 20 GB of free storage on your workstation. Download the ChimpACT dataset and unzip it to `data/ChimpACT_release/`. The content in `ChimpACT_release` contains the original dataset:
- `videos_full` includes 163 video clips in `.mp4` format,
- `labels` includes 163 label `.json` files in COCO style, one for each video clip,
- `action_list.txt` contains the action categories.

:star: Remember to make a `data` directory under the `ChimpACT` project first.
:star: Our dataset is distributed under the CC BY-NC 4.0 license.
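If you want a quick look at the raw annotations, each label file is a standard COCO-style JSON and can be inspected directly. The snippet below is only an illustrative sketch; it simply picks the first `.json` file it finds under `data/ChimpACT_release/labels/`:

```python
# Inspect one COCO-style label file from ChimpACT_release (illustrative sketch).
import json
from pathlib import Path

label_file = next(Path("data/ChimpACT_release/labels").glob("*.json"))
with open(label_file) as f:
    coco = json.load(f)

print("file:", label_file.name)
print("images:", len(coco.get("images", [])))
print("annotations:", len(coco.get("annotations", [])))
print("categories:", [c["name"] for c in coco.get("categories", [])])
```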
To run the three benchmarks, please process the data using the script below. Please activate the `chimp_track` environment first.
conda activate chimp_track
(chimp_track)$ sh scripts/process_data.sh
If everything goes well, the data structure should look like this. The `ChimpACT_processed` folder is about 12 GB, and the `ChimpACT_release` folder is about 5 GB.
ChimpACT
|-- data
|   |-- ChimpACT_processed
|   |   |-- annotations
|   |   |   |-- action
|   |   |   |   |-- action_list.txt
|   |   |   |   |-- train_action_excluded_timestamps.csv
|   |   |   |   |-- train_action_gt.pkl
|   |   |   |   |-- train_action.csv
|   |   |   |   |-- val_action_excluded_timestamps.csv
|   |   |   |   |-- val_action_gt.pkl
|   |   |   |   |-- val_action.csv
|   |   |   |   |-- test_action_excluded_timestamps.csv
|   |   |   |   |-- test_action_gt.pkl
|   |   |   |   |-- test_action.csv
|   |   |   |-- train.json
|   |   |   |-- val.json
|   |   |   |-- test.json
|   |   |   |-- reid
|   |   |       |-- imgs
|   |   |       |-- meta
|   |   |-- train
|   |   |   |-- images
|   |   |   |-- videos
|   |   |-- val
|   |   |   |-- images
|   |   |   |-- videos
|   |   |-- test
|   |       |-- images
|   |       |-- videos
|   |-- ChimpACT_release
|       |-- labels
|       |-- videos_full
|       |-- action_list.txt
|-- mmaction2
|-- mmpose
|-- mmtracking
|-- scripts
|-- tools
|-- README.md
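To confirm that `scripts/process_data.sh` produced the expected layout, you can check a few representative paths from the tree above. This is only a convenience sketch, run from the `ChimpACT` project root:

```python
# Check a few representative paths of the processed data layout (optional sketch).
from pathlib import Path

expected = [
    "data/ChimpACT_processed/annotations/train.json",
    "data/ChimpACT_processed/annotations/action/train_action.csv",
    "data/ChimpACT_processed/train/images",
    "data/ChimpACT_processed/test/videos",
    "data/ChimpACT_release/labels",
]
for p in expected:
    status = "OK  " if Path(p).exists() else "MISS"
    print(status, p)
```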
Make soft links in the three codebases. Change `${absolute_path}` to the real ABSOLUTE path to the `ChimpACT` project.
ln -s ${absolute_path}/ChimpACT/data mmtracking/
ln -s ${absolute_path}/ChimpACT/data mmpose/
ln -s ${absolute_path}/ChimpACT/data mmaction2/
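You can verify that each soft link resolves to the shared `data` directory with a short check (an optional sketch, run from the `ChimpACT` project root):

```python
# Verify that the data soft links in the three codebases resolve correctly (optional sketch).
from pathlib import Path

for repo in ["mmtracking", "mmpose", "mmaction2"]:
    link = Path(repo) / "data"
    print(f"{link}: is_symlink={link.is_symlink()}, resolves to {link.resolve()}")
```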
We provide code for data statistics in `tools/cal_vis_stat.py`, and for data visualization in `tools/vis_annot.py`. All the running scripts can be found in `scripts/visualize_data.sh`.
conda activate chimp_track
sh scripts/visualize_data.sh
`cd` to the `mmtracking`/`mmpose`/`mmaction2` folder to conduct the corresponding experiments.
conda activate chimp_track
cd mmtracking
conda activate chimp_pose
cd mmpose
conda activate chimp_action
cd mmaction2
- The running scripts are highly similar. All the `${CONFIG_FILE}` are under `mm*/configs`. To train/evaluate the model,
```shell
# train with a GPU
CUDA_VISIBLE_DEVICES=0 python tools/train.py ${CONFIG_FILE} [ARGS]
# train with multiple GPUs
bash tools/dist_train.sh ${CONFIG_FILE} ${GPU_NUM} [PY_ARGS]
# eval with a GPU
CUDA_VISIBLE_DEVICES=0 python tools/test.py ${CONFIG_FILE} ${CHECKPOINT_FILE} [ARGS]
# eval with multiple GPUs
bash tools/dist_test.sh ${CONFIG_FILE} ${CHECKPOINT_FILE} ${GPU_NUM} [PY_ARGS]
```
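If you want to inspect or tweak a `${CONFIG_FILE}` before launching a run, you can load it programmatically. The snippet below is an illustrative sketch for the MMEngine-based codebases (`mmpose`, `mmaction2`); the `mmtracking` codebase in this repo is MMCV 1.x based and uses `mmcv.Config` instead:

```python
# Load and inspect a config file before training (illustrative sketch).
from mmengine.config import Config

cfg = Config.fromfile(
    "configs/chimp_2d_keypoint/topdown_heatmap/coco/td-hm_cpm_8xb64-210e_coco-256x192.py"
)
print(cfg.pretty_text)  # dump the fully resolved config

# For mmtracking (MMCV 1.x), the equivalent would be:
# from mmcv import Config
# cfg = Config.fromfile("configs/mot/bytetrack/bytetrack_yolox_x_chimp.py")
```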
conda activate chimp_track
cd mmtracking
# train with 4 GPUs
bash tools/dist_train.sh configs/mot/bytetrack/bytetrack_yolox_x_chimp.py 4
# evaluate with 4 GPUs
bash tools/dist_test.sh configs/mot/bytetrack/bytetrack_yolox_x_chimp.py 4 --checkpoint work_dirs/bytetrack_yolox_x_chimp/latest.pth --eval track bbox
conda activate chimp_pose
cd mmpose
# train with 4 GPUs
bash tools/dist_train.sh configs/chimp_2d_keypoint/topdown_heatmap/coco/td-hm_cpm_8xb64-210e_coco-256x192.py 4 --show-dir vis_pose --interval 1000
# eval with 4 GPUs
bash tools/dist_test.sh configs/chimp_2d_keypoint/topdown_heatmap/coco/td-hm_cpm_8xb64-210e_coco-256x192.py work_dirs/td-hm_cpm_8xb64-210e_coco-256x192/epoch_210.pth 4 --dump results_save/td-hm_cpm.pkl
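The `--dump` flag above writes the raw predictions to `results_save/td-hm_cpm.pkl`, which you can load with `pickle` for offline analysis. The exact structure of the entries follows MMPose's dump format, so treat this as a rough sketch:

```python
# Peek at the dumped pose predictions (rough sketch).
import pickle

with open("results_save/td-hm_cpm.pkl", "rb") as f:
    results = pickle.load(f)

print(type(results), len(results))  # typically one entry per evaluated sample
print(results[0])                   # inspect the first prediction record
```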
conda activate chimp_action
cd mmaction2
# train with 4 GPUs
bash tools/dist_train.sh configs/detection/slowfast/slowfast_kinetics400-pretrained-r50_8xb8-8x8x1-20e_chimp-rgb.py 4
# eval with 4 GPUs
bash tools/dist_test.sh configs/detection/slowfast/slowfast_kinetics400-pretrained-r50_8xb8-8x8x1-20e_chimp-rgb.py work_dirs/slowfast_kinetics400-pretrained-r50_8xb8-8x8x1-20e_chimp-rgb/epoch_20.pth 4
@article{ma2023chimpact,
title={ChimpACT: A Longitudinal Dataset for Understanding Chimpanzee Behaviors},
author={Ma, Xiaoxuan and Kaufhold, Stephan and Su, Jiajun and Zhu, Wentao and Terwilliger, Jack and Meza, Andres and Zhu, Yixin and Rossano, Federico and Wang, Yizhou},
journal={Advances in Neural Information Processing Systems},
volume={36},
pages={27501--27531},
year={2023}
}
This repo is built on the excellent work of MMTracking, MMPose, and MMAction2. Thanks to these great projects.