[ Project Page ][ MOYO Dataset ][ Paper ][ Video ][ Register MoYo Account ]
This repository provides code for downloading, preprocessing, visualizing, and running evaluations on the MOYO dataset.
Our dataset provides a challenging new benchmark; it has extreme poses, strong self-occlusion, and significant body-ground and self-contact.
Get a copy of the code:
git clone https://github.com/sha2nkt/moyo.git
General Requirements:
Install the environment:
ENV_NAME=moyo_p39
conda create -n $ENV_NAME python=3.9
conda activate $ENV_NAME
pip install .
conda install -c conda-forge ezc3d
MOYO provides the following data:
- 20220923_20220926_with_hands/images [741G]: Full 2K-resolution images
- 20220923_20220926_with_hands/cameras.zip [1.7M]: Camera parameters for the 8 IOI RGB cameras
- 20220923_20220926_with_hands/mosh.zip [1.3G]: SMPL-X fits with hand markers
- 20220923_20220926_with_hands/mosh_smpl.zip [1.3G]: SMPL fits
- 20220923_20220926_with_hands/pressure.zip [298M]: Pressure mat data
- 20220923_20220926_with_hands/vicon.zip [257M]: Raw marker data from Vicon
- 20221004_with_com/images [635G]: Full 2K-resolution images
- 20221004_with_com/cameras.zip [840K]: Camera parameters for the 8 IOI RGB cameras
- 20221004_with_com/mosh.zip [1.1G]: SMPL-X fits without hand markers
- 20221004_with_com/mosh_smpl.zip [1.1G]: SMPL fits
- 20221004_with_com/pressure.zip [517M]: Pressure mat data
- 20221004_with_com/coms.md [489M]: Center of mass data from the Vicon Plug-in Gait model

Note: The SMPL fits are obtained from the MOYO SMPL-X fits using the SMPLX-to-SMPL conversion script.
⚠️ Register accounts on MOYO, and then use your username and password when prompted.
The following command downloads the full dataset (minus the images) to ./data/ and unzips the downloaded archives (-u flag).
bash ./moyo/bash/download_moyo.sh -o ./data/ -u
If you additionally want to download the images, you can run the following command:
bash ./moyo/bash/download_moyo.sh -o ./data/ -u -i
The following command downloads the full dataset to ./data/
(including images), unzips the downloaded zips and deletes
the zip files to save space. This will take a while but will give you a fully usable dataset.
bash ./moyo/bash/download_moyo.sh -o ./data/ -u -i -d
MOYO provides the following AMASS formats:
⚠️ Register accounts on MOYO, and then use your username and password when prompted.
The following command downloads the full dataset (minus the images) to ./data/ and unzips the downloaded archives (-u flag).
bash ./moyo/bash/download_moyo.sh -o ./data/ -u -a <AMASS_FORMAT>
The following command downloads the full dataset to ./data/
(including images), unzips the downloaded zips and deletes
the zip files to save space. This will take a while but will give you a fully usable dataset.
bash ./moyo/bash/download_moyo.sh -o ./data/ -u -d -a <AMASS_FORMAT>
Replace <AMASS_FORMAT> with the split name you want to download: SMPLH_FEMALE, SMPLH_NEUTRAL, SMPLX_FEMALE, or SMPLX_NEUTRAL.
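Once a split is downloaded, the sequences follow the standard AMASS .npz layout. As a minimal sketch (the key names here are the usual AMASS convention — poses, betas, trans, mocap_framerate — and should be verified against an actual downloaded MOYO file), a loader might look like:

```python
import os
import tempfile

import numpy as np

def load_amass_sequence(npz_path):
    """Load an AMASS-style .npz into plain in-memory arrays.

    Assumes the standard AMASS keys; check them against a real MOYO file.
    """
    with np.load(npz_path) as data:
        return {
            "poses": data["poses"],  # (T, D) per-frame pose parameters
            "betas": data["betas"],  # (B,) body shape coefficients
            "trans": data["trans"],  # (T, 3) root translation per frame
            "fps": float(data["mocap_framerate"]),
        }

# Demo with a synthetic file standing in for a real MOYO sequence.
with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "pose.npz")
    np.savez(path, poses=np.zeros((10, 165)), betas=np.zeros(16),
             trans=np.zeros((10, 3)), mocap_framerate=60.0)
    seq = load_amass_sequence(path)
    print(seq["poses"].shape, seq["fps"])  # (10, 165) 60.0
```

The arrays are copied out of the NpzFile before it is closed, so the returned dict stays valid after the file is gone.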
We include a simple script to project Vicon markers onto the RGB images using the provided camera parameters. A similar approach can be used to project the full mesh.
python scripts/ioi_vicon_frame_sync.py --img_folder ../data/moyo/20220923_20220926_with_hands/images/ --c3d_folder ../data/moyo/20220923_20220926_with_hands/vicon --cam_folder_first ../data/moyo/20220923_20220926_with_hands/cameras/20220923/220923_Afternoon_PROCESSED_CAMERA_PARAMS/cameras_param.json --cam_folder_second ../data/moyo/20220923_20220926_with_hands/cameras/20220926/220926_Morning_PROCESSED_CAMERA_PARAMS/cameras_param.json --output_dir ../data/moyo_images_mocap_projected --frame_offset 1 --split val
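The core of such a projection is a standard pinhole camera model. The sketch below illustrates the math only — it is not the script's actual code, and the key layout inside cameras_param.json is something you should inspect yourself:

```python
import numpy as np

def project_points(points_world, K, R, t):
    """Project Nx3 world-space points to pixel coordinates with a standard
    pinhole model: world -> camera frame -> intrinsics -> perspective divide."""
    cam = points_world @ R.T + t   # rigid transform into the camera frame
    uv = cam @ K.T                 # apply the 3x3 intrinsics matrix
    return uv[:, :2] / uv[:, 2:3]  # divide by depth

# Toy camera: focal length 1000 px, principal point (960, 540), identity pose.
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
pts = np.array([[0.0, 0.0, 2.0],   # on the optical axis -> principal point
                [0.1, 0.0, 2.0]])  # 0.1 m to the right, at 2 m depth
uv = project_points(pts, K, np.eye(3), np.zeros(3))
```

With these toy parameters the first marker lands on the principal point (960, 540) and the second is shifted 50 px to the right; real IOI cameras will additionally have lens distortion that this sketch ignores.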
To visualize the exact alignment of the pressure mat markers with respect to the subject, we provide a Blender file in assets/mat_marker_configuration.blend.
We provide evaluation scripts that evaluate estimated pressure and CoM against ground truth, as reported in our paper.
Since these scripts use all dataset modalities (video, 3D bodies, pressure, and CoM), they are a good starting point for anyone wanting to learn how to use the MOYO data.
python eval/pressure_map_evaluation.py --img_folder ../data/moyo/20220923_20220926_with_hands/images/val/ --pp_folder ../data/moyo/20220923_20220926_with_hands/mosh/val/ --pressure_xml_folder ../data/moyo/20220923_20220926_with_hands/pressure/val/xml --pressure_csv_folder ../data/moyo/20220923_20220926_with_hands/pressure/val/single_csv
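One common way to compare a predicted pressure map against the mat's ground truth is intersection-over-union of the binarized contact regions. This is a sketch of that idea only — the thresholds and normalization used by the released evaluation script may differ:

```python
import numpy as np

def pressure_iou(pred, gt, thresh=0.0):
    """IoU of binarized (contact / no-contact) pressure maps.

    A cell counts as "in contact" when its pressure exceeds `thresh`.
    """
    p, g = pred > thresh, gt > thresh
    union = np.logical_or(p, g).sum()
    return np.logical_and(p, g).sum() / union if union else 1.0

gt = np.zeros((4, 4)); gt[:2, :2] = 1.0      # 4 ground-truth contact cells
pred = np.zeros((4, 4)); pred[:2, :3] = 1.0  # 6 predicted cells, 4 overlapping
iou = pressure_iou(pred, gt)                 # 4 / 6
```

The empty-union case returns 1.0 so that two all-zero maps (no contact anywhere) count as a perfect match rather than a division by zero.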
If you would like to visualize per-frame results, add the --save_outputs flag.
python eval/com_evaluation.py --img_folder ../data/moyo/20221004_with_com/images/val/ --pp_folder ../data/moyo/20221004_with_com/mosh/val/ --nexus_com_c3d_folder ../data/moyo/20221004_with_com/com/val
If you would like to visualize per-frame results, add the --save_outputs flag.
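The underlying CoM comparison reduces to a per-frame Euclidean distance between the body-derived and Vicon-derived centers of mass. A hedged sketch (the script's actual metric and units may differ):

```python
import numpy as np

def com_error_mm(pred_com, gt_com):
    """Mean per-frame Euclidean distance between predicted and ground-truth
    center of mass, in millimetres (inputs assumed to be in metres)."""
    return 1000.0 * np.linalg.norm(pred_com - gt_com, axis=-1).mean()

# Two toy frames: errors of 10 mm and 20 mm.
pred = np.array([[0.0, 0.0, 1.00], [0.00, 0.0, 1.0]])
gt = np.array([[0.0, 0.0, 1.01], [0.02, 0.0, 1.0]])
err = com_error_mm(pred, gt)  # mean of 10 mm and 20 mm = 15 mm
```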
The above implementation is not optimized for speed. We will be releasing a faster version soon.
If you found this code helpful, please consider citing our work:
@inproceedings{tripathi2023ipman,
title = {{3D} Human Pose Estimation via Intuitive Physics},
author = {Tripathi, Shashank and M{\"u}ller, Lea and Huang, Chun-Hao P. and Taheri, Omid
and Black, Michael J. and Tzionas, Dimitrios},
booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern
Recognition (CVPR)},
month = {June},
year = {2023}
}
See LICENSE.
Constructing the MOYO dataset is a huge effort. The authors deeply thank Tsvetelina Alexiadis, Taylor McConnell, Claudia Gallatz, Markus Höschle, Senya Polikovsky, Camilo Mendoza, Yasemin Fincan, Leyre Sanchez and Matvey Safroshkin for data collection, Giorgio Becherini for MoSh++, Joachim Tesch and Nikos Athanasiou for visualizations, Zicong Fan, Vassilis Choutas and all of Perceiving Systems for fruitful discussions. This work was funded by the International Max Planck Research School for Intelligent Systems (IMPRS-IS) and in part by the German Federal Ministry of Education and Research (BMBF), Tübingen AI Center, FKZ: 01IS18039B.
We would also like to extend a special thanks to Giorgio Becherini and Neelay Shah for helping with the release of the AMASS version of the MOYO dataset.
For technical questions, please create an issue. For other questions, please contact ipman@tue.mpg.de.
For commercial licensing, please contact ps-licensing@tue.mpg.de.