
Touch-and-Go

Dataset | Website | Paper





This repository contains the official PyTorch implementation of the applications from our paper Touch and Go: Learning from Human-Collected Vision and Touch.

Touch and Go: Learning from Human-Collected Vision and Touch
Fengyu Yang, Chenyang Ma, Jiacheng Zhang, Jing Zhu, Wenzhen Yuan, Andrew Owens
University of Michigan and Carnegie Mellon University
In NeurIPS 2022 Datasets and Benchmarks Track

Todo

Citation

If you use this code for your research, please cite our paper.

@inproceedings{yang2022touch,
  title={Touch and Go: Learning from Human-Collected Vision and Touch},
  author={Fengyu Yang and Chenyang Ma and Jiacheng Zhang and Jing Zhu and Wenzhen Yuan and Andrew Owens},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022}
}

Acknowledgments

We thank Xiaofeng Guo and Yufan Zhang for their extensive help with the GelSight sensor, and Daniel Geng, Yuexi Du, and Zhaoying Pan for helpful discussions. This work was supported in part by Cisco Systems and the Wang Chu Chien-Wen Research Scholarship.