This is the official repository for the FreeMan dataset, including example code for loading the data.
🌏 Project Page • 🙋♂️ Request • 📄 Paper • ▶️ YouTube • 🖥️ Code
[2024-02-24] FreeMan has been accepted to CVPR 2024! See you all in Seattle this summer.
[2023-09-07] Project page updated! Details & download methods are presented.
[2023-06-15] Hi! We are almost there! The data is being uploaded to the cloud server. Please sign this Form for the latest updates.
After downloading the dataset, the files are expected to be uncompressed into the following layout. You can also rearrange them for your convenience.
FreeMan
├── 30FPS
│ ├── bbox2d
│ │ ├── yyyymmdd_xxxxxxxx_subjNN.npy
│ │ └── ...
│ ├── keypoints2d
│ │ ├── yyyymmdd_xxxxxxxx_subjNN.npy
│ │ └── ...
│ ├── keypoints3d
│ │ ├── yyyymmdd_xxxxxxxx_subjNN.npy
│ │ └── ...
│ ├── motions
│ │ ├── yyyymmdd_xxxxxxxx_subjNN_view0.npy
│ │ ├── ...
│ │ ├── yyyymmdd_xxxxxxxx_subjNN_view8.npy
│ │ └── ...
│ ├── cameras
│ │ ├── yyyymmdd_xxxxxxxx_subjNN.json
│ │ └── ...
│ └── videos
│ ├── yyyymmdd_xxxxxxxx_subjNN
│ │ ├── cameras.json
│ │ ├── chessboard.pkl
│ │ └── vframes
│ │ ├── c01.mp4
│ │ ├── ...
│ │ └── c08.mp4
│ ├── ...
├── 60FPS
│ ├── ...
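Every per-sequence file in the layout above shares the same `yyyymmdd_xxxxxxxx_subjNN` naming scheme. A minimal sketch of splitting such a session ID into its parts, assuming the date is 8 digits, the take identifier is 8 alphanumeric characters, and the subject index is 2 digits (the group names below are our own, not prescribed by the dataset):

```python
import re

# Assumed pattern from the directory listing: yyyymmdd_xxxxxxxx_subjNN
SESSION_RE = re.compile(r"^(\d{8})_([0-9a-zA-Z]{8})_subj(\d{2})$")

def parse_session_id(session_id: str):
    """Split a FreeMan session ID into (date, take, subject_index)."""
    m = SESSION_RE.match(session_id)
    if m is None:
        raise ValueError(f"Unrecognized session ID: {session_id!r}")
    return m.group(1), m.group(2), int(m.group(3))
```

For example, `parse_session_id("20230101_abcd1234_subj07")` returns `("20230101", "abcd1234", 7)`, which is handy when grouping sequences by recording date or by subject.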
from freeman_loader import FreeMan

# Initialize the dataset (fps is 30 or 60)
freeman = FreeMan(base_dir="YOUR PATH", fps="30 or 60")

# Load video frames for one camera view of a session
video_frames = FreeMan.load_frames(freeman.get_video_path("SESSION_ID", "CAM ID"))

# Load 2D and 3D keypoints for a session
kpts2d = FreeMan.load_keypoints2d(freeman.keypoints2d_dir, "SESSION_ID")
kpts3d = FreeMan.load_keypoints3d(freeman.keypoints3d_dir, "SESSION_ID")
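The per-session camera files pair the 3D keypoints with each camera view. The exact JSON schema is not shown here, so the following is only a generic pinhole-projection sketch with synthetic data; `project_points`, and the intrinsics/extrinsics names `K`, `R`, `t`, are our own illustrative choices, not FreeMan API:

```python
import numpy as np

def project_points(kpts3d, K, R, t):
    """Project Nx3 world-space keypoints to Nx2 pixel coordinates
    with a standard pinhole model: x = K (R X + t)."""
    cam = kpts3d @ R.T + t           # world -> camera coordinates
    uv = cam @ K.T                   # apply intrinsics
    return uv[:, :2] / uv[:, 2:3]    # perspective divide

# Synthetic example: identity extrinsics, simple intrinsics,
# one point 2 m straight ahead of the camera.
K = np.array([[1000.0,    0.0, 960.0],
              [   0.0, 1000.0, 540.0],
              [   0.0,    0.0,   1.0]])
R = np.eye(3)
t = np.zeros(3)
pts = np.array([[0.0, 0.0, 2.0]])
print(project_points(pts, K, R, t))  # a point on the optical axis lands on the principal point
```

With real data you would substitute the intrinsics and extrinsics read from the session's camera JSON for each view.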
If you find FreeMan helpful for your project, please cite our paper:
@article{wang2023freeman,
  title={FreeMan: Towards Benchmarking 3D Human Pose Estimation in the Wild},
  author={Wang, Jiong and Yang, Fengyu and Gou, Wenbo and Li, Bingliang and Yan, Danqi and Zeng, Ailing and Gao, Yijun and Wang, Junle and Zhang, Ruimao},
  journal={arXiv preprint arXiv:2309.05073},
  year={2023}
}
This project and the FreeMan dataset are licensed under CC BY-NC-SA 4.0.
Great appreciation to all the volunteers who participated in FreeMan.
Thanks to the great work of AIST++.