# 👐OpenHands: Making Sign Language Recognition Accessible
Documentation on how to use the library is available on ReadTheDocs: 👐OpenHands
Install the latest release from PyPI:

    pip install --upgrade OpenHands

Or install the latest development version directly from GitHub:

    pip install git+https://github.com/AI4Bharat/OpenHands
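Once installed, the library is driven by experiment configuration files. The sketch below shows what a config-driven training run could look like; the module paths, class names, and config file used here are assumptions for illustration, not a guaranteed API, so please check the ReadTheDocs documentation for the actual usage.

```python
# Minimal training sketch. NOTE: the module paths, class names, and config
# path below are illustrative assumptions; verify them against the docs.
import omegaconf
from openhands.apis.classification_model import ClassificationModel
from openhands.core.exp_utils import get_trainer

# Load an experiment config (dataset, pose backbone, training hyperparameters).
cfg = omegaconf.OmegaConf.load("examples/configs/include/decoupled_gcn.yaml")
trainer = get_trainer(cfg)

# Build the sign-classification model from the config and train it.
model = ClassificationModel(cfg=cfg, trainer=trainer)
model.init_from_checkpoint_if_available()
model.fit()
```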
This project is released under the Apache 2.0 license.
Please cite the respective datasets if you use them in your research, and check the licensing terms of each dataset you use.
| Dataset | Link |
|---|---|
| AUTSL | Link |
| CSL | Link |
| DEVISIGN | Link |
| GSL | Link |
| INCLUDE | Link |
| LSA64 | Link |
| WLASL | Link |
For datasets that do not provide pose data, poses can be extracted from the videos using this script; a rough illustration of the approach is sketched below.
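The repository's own extraction script should be preferred. The sketch below only illustrates the general idea of per-frame keypoint extraction using MediaPipe Holistic, a common choice for pose-based sign language recognition pipelines; the keypoint selection and output layout here are assumptions, not the script's exact format.

```python
# Illustrative pose extraction with MediaPipe Holistic.
# NOTE: keypoint selection and output shape are assumptions for this sketch.
import cv2
import mediapipe as mp
import numpy as np

mp_holistic = mp.solutions.holistic
# MediaPipe Holistic landmark counts: 33 body points + 21 per hand.
PART_SIZES = (("pose", 33), ("left_hand", 21), ("right_hand", 21))

def extract_pose(video_path):
    """Return an array of shape (num_frames, 75, 3) with (x, y, z) keypoints."""
    frames = []
    cap = cv2.VideoCapture(video_path)
    with mp_holistic.Holistic(static_image_mode=False) as holistic:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            results = holistic.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            parts = (
                (results.pose_landmarks, PART_SIZES[0][1]),
                (results.left_hand_landmarks, PART_SIZES[1][1]),
                (results.right_hand_landmarks, PART_SIZES[2][1]),
            )
            kps = []
            for landmarks, size in parts:
                if landmarks is None:
                    kps.extend([(0.0, 0.0, 0.0)] * size)  # pad missing parts
                else:
                    kps.extend([(lm.x, lm.y, lm.z) for lm in landmarks.landmark])
            frames.append(kps)
    cap.release()
    return np.asarray(frames, dtype=np.float32)
```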
If you find our work useful in your research, please consider citing us:
@misc{2021_openhands_slr_preprint,
  title={OpenHands: Making Sign Language Recognition Accessible with Pose-based Pretrained Models across Languages},
  author={Prem Selvaraj and Gokul NC and Pratyush Kumar and Mitesh Khapra},
  year={2021},
  eprint={2110.05877},
  archivePrefix={arXiv},
  primaryClass={cs.CL}
}
@inproceedings{nc2022addressing,
  title={Addressing Resource Scarcity across Sign Languages with Multilingual Pretraining and Unified-Vocabulary Datasets},
  author={Gokul NC and Manideep Ladi and Sumit Negi and Prem Selvaraj and Pratyush Kumar and Mitesh M Khapra},
  booktitle={Thirty-sixth Conference on Neural Information Processing Systems Datasets and Benchmarks Track},
  year={2022},
  url={https://openreview.net/forum?id=zBBmV-i84Go}
}