
CMG-Net

Baseline model for "CMG-Net: An End-to-End Contact-based Multi-Finger Dexterous Grasping Network" (ICRA 2023).

[arXiv]

In this repository, we propose an end-to-end deep neural network, CMG-Net, for multi-finger grasping.


Requirements

Installation

This code has been tested with Python 3.7, PyTorch 1.7, and CUDA 10.1.

Create the conda environment:

conda env create -f environment.yml

Compile and install the pointnet2 operators (code adapted from VoteNet):

cd pointnet2
python setup.py install

Install Pyrender with OSMesa by following the instructions here.

Download Models and Data

CMG-Net Model

Download the trained model from here and extract it into the checkpoints/ folder.

Object Model

Download the object models from here and extract them into the dataset/ folder with the following structure:

dataset
|-- urdfs
|  |-- barrett_object
|  |-- setup
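The expected layout can be sanity-checked with a short script. This is an illustrative helper, not part of the repo; the directory names are taken from the structure above.

```python
import os

# Hypothetical sanity check for the dataset layout expected by CMG-Net;
# this helper is illustrative and not part of the repository.
EXPECTED_DIRS = [
    "dataset/urdfs/barrett_object",
    "dataset/urdfs/setup",
]

def missing_dirs(root="."):
    """Return the expected directories that are missing under `root`."""
    return [d for d in EXPECTED_DIRS if not os.path.isdir(os.path.join(root, d))]

if __name__ == "__main__":
    missing = missing_dirs()
    if missing:
        print("Missing directories:", ", ".join(missing))
    else:
        print("Dataset layout OK")
```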

Training Data

Download a mini-dataset from here and extract it into the following folder:

dataset
|-- view_7

Test Data

Download the 50 test scenes from here and extract them into the following folder:

dataset
|-- test

Inference

Given a single-shot view of a cluttered scene, CMG-Net outputs multi-finger hand configurations and grasp poses.

To test CMG-Net on the test scenes and evaluate the Success Rate (SR) in simulation, execute:

python3 test.py --use_normal --checkpoint_path checkpoints/36_vw7_1155_2048.pth
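The SR metric is the fraction of attempted grasps that succeed in simulation. A minimal sketch of how such a metric might be aggregated is shown below; the function name and result format are assumptions for illustration, not the repository's actual evaluation code.

```python
# Illustrative Success Rate (SR) aggregation over simulated grasp attempts.
# The interface is hypothetical; CMG-Net's real evaluation code may differ.
def success_rate(results):
    """results: list of booleans, one per attempted grasp (True = success)."""
    if not results:
        return 0.0
    return sum(results) / len(results)

# e.g. 4 successful grasps out of 5 attempts gives SR = 0.8
print(success_rate([True, True, False, True, True]))
```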

Training

Training CMG-Net

Start training with distributed data parallel:

python -m torch.distributed.launch --nproc_per_node=1 train.py --use_normal
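`torch.distributed.launch` spawns one process per GPU and passes a `--local_rank` argument to each process, so the training script must accept it. The sketch below shows a typical way to parse it alongside the `--use_normal` flag; this is a minimal assumption about how `train.py` handles its arguments, not its actual code.

```python
import argparse

# torch.distributed.launch passes --local_rank to every spawned process.
# Minimal, hypothetical argument handling for a script launched this way.
def parse_args(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument("--local_rank", type=int, default=0,
                        help="rank of this process on the local node")
    parser.add_argument("--use_normal", action="store_true",
                        help="use point normals as an extra input feature")
    # parse_known_args tolerates extra flags the launcher may append
    args, _ = parser.parse_known_args(argv)
    return args

args = parse_args(["--local_rank=1", "--use_normal"])
```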


Citation

Please cite our paper in your publications if it helps your research:

@inproceedings{wei2023cmgnet,
  title={CMG-Net: An End-to-End Contact-Based Multi-Finger Dexterous Grasping Network}, 
  author={Mingze Wei and Yaomin Huang and Zhiyuan Xu and Ning Liu and Zhengping Che and Xinyu Zhang and Chaomin Shen and Feifei Feng and Chun Shan and Jian Tang},
  year={2023},
  booktitle={Proceedings of the 2023 IEEE International Conference on Robotics and Automation (ICRA)},
}

Dependencies

This project uses the following third-party code:

pointnet2 operators, adapted from VoteNet

License

This project is licensed under this license.