Python version: 3.11

```
cd data_processing
pip install -r requirements.txt
```
In `data_processing/ip_config.py`, set `VR_HOST` to the VR headset's IP address and `LOCAL_HOST` to your computer's IP address.

Install `arcap_release.apk` on the headset using MQDH -> Device Manager. Alternatively, open the `ARCap_unity` project, wait for the Unity Editor to open, then use Build And Run.
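For reference, `ip_config.py` only needs the two host entries described above; a minimal sketch with placeholder addresses (substitute your own):

```python
# data_processing/ip_config.py -- placeholder addresses, use your own network's values
VR_HOST = "192.168.1.100"    # IP address of the VR headset
LOCAL_HOST = "192.168.1.50"  # IP address of the computer running the collection server
```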
In the `data_processing` folder, create the output directory:

```
mkdir data
```

In the `data_processing` folder, start the collection server:

```
python data_collection_server.py  # add --no_camera if the D435 is not connected
```

For left-handed collection, run instead:

```
python data_collection_server.py --handedness left  # add --no_camera if the D435 is not connected
```
All collected data is stored in `data/<yyyy-mm-dd-hh-mm-ss>`; each trajectory has its own folder named `data/<yyyy-mm-dd-hh-mm-ss>/<yyyy-mm-dd-hh-mm-ss>`. Each frame is stored as a `.npz` file.
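Each `.npz` frame can be inspected with NumPy. A minimal sketch of the container format (the key names below are illustrative placeholders, not the actual ARCap schema):

```python
import numpy as np

# Write a dummy frame to illustrate the .npz container (keys are made up).
np.savez("frame_0.npz", wrist_pose=np.eye(4), timestamp=np.array(0.0))

# Load a frame and list the arrays it contains.
frame = np.load("frame_0.npz")
print(sorted(frame.files))        # ['timestamp', 'wrist_pose']
print(frame["wrist_pose"].shape)  # (4, 4)
```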
In the `data_processing` folder, create the output directory:

```
mkdir data_processed
```

In the `data_processing` folder, convert the collected data:

```
python convert_data_with_robot.py --root_path <yyyy-mm-dd-hh-mm-ss> --visualize
```

For gripper data, add `--use_gripper`:

```
python convert_data_with_robot.py --root_path <yyyy-mm-dd-hh-mm-ss> --use_gripper --visualize
```
All processed data is stored in `data_processed/<yyyy-mm-dd-hh-mm-ss>`; each trajectory has its own folder named `data_processed/<yyyy-mm-dd-hh-mm-ss>/demo_i`. Each frame is stored as a subfolder `data_processed/<yyyy-mm-dd-hh-mm-ss>/demo_i/frame_j`, which contains the joint angles of the hand and arm, the wrist poses, and point clouds.
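The processed layout can be traversed with `pathlib`; a short sketch assuming the `demo_i`/`frame_j` naming above (the mock directory tree is created only for illustration):

```python
from pathlib import Path

# Build a tiny mock of the processed layout described above.
root = Path("data_processed/2024-01-01-00-00-00")
for d in range(2):
    for f in range(3):
        (root / f"demo_{d}" / f"frame_{f}").mkdir(parents=True, exist_ok=True)

# Iterate over demos and count the frames in each one, in order.
for demo in sorted(root.glob("demo_*")):
    frames = sorted(demo.glob("frame_*"))
    print(demo.name, len(frames))  # demo_0 3, then demo_1 3
```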
If you find this repository useful, please cite:

```
@article{chen2024arcap,
  title={ARCap: Collecting High-quality Human Demonstrations for Robot Learning with Augmented Reality Feedback},
  author={Chen, Sirui and Wang, Chen and Nguyen, Kaden and Fei-Fei, Li and Liu, C Karen},
  journal={arXiv preprint arXiv:2410.08464},
  year={2024}
}
```