Hand Physics Toolkit (HPTK) is a toolkit for implementing hand-driven interactions in a modular and scalable way. It is platform-independent, input-independent, and scale-independent, and it can be combined with MRTK-Quest for UI interactions.
A ready-to-go project is available at HPTK-Sample.
# Main features
- **Data model** to access parts, components or calculated values with very little code
- **Code architecture** based on MVC-like modules, with support for custom modules
- **Platform-independent.** Tested on VR/AR/non-XR applications
- **Input-independent.** Use hand tracking or controllers
- **Puppeteering** for any avatar or body structure
- **Scale-independent.** Valid for any hand size
- **Realistic** configurable **hand physics**
- Define strategies to deal with tracking loss
- Physics-based hover/touch/grab detection
- Tracking noise smoothing
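To illustrate the data-model idea above, here is a minimal sketch of the kind of access such a model typically enables. The type and member names (`ExampleHandModel`, `index`, `pinchStrength`, etc.) are invented for this example and are **not** HPTK's actual API; see the documentation for the real model classes.

```csharp
// Hypothetical sketch (not HPTK's real API): illustrates a hand data model
// that exposes parts and calculated values with very little code.
using UnityEngine;

// Invented type standing in for a per-finger model entry.
public class ExampleFingerModel
{
    public bool isPinching;      // physics-based touch/pinch state
    public float pinchStrength;  // calculated value in the range 0..1
}

// Invented type standing in for a hand model component.
public class ExampleHandModel : MonoBehaviour
{
    public ExampleFingerModel index = new ExampleFingerModel();
}

public class PinchLogger : MonoBehaviour
{
    public ExampleHandModel hand;

    void Update()
    {
        // Parts and calculated values are reachable in one or two lines,
        // which is the point of the data-model feature described above.
        if (hand.index.isPinching)
            Debug.Log($"Pinch strength: {hand.index.pinchStrength:F2}");
    }
}
```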
# Documentation
Main documentation entries:
- [Home](https://jorge-jgnz94.gitbook.io/hptk/)
- [Setup](https://jorge-jgnz94.gitbook.io/hptk/setup)
- [FAQs](https://jorge-jgnz94.gitbook.io/hptk/faqs)
# Supported versions
- Unity 2022
- Unity 2019-2021 (Legacy)
# Supported input
## Hand tracking
- Meta Quest - Android
- Leap Motion - Standalone
## Controllers
- Oculus Touch
- WMR
- Vive
- OpenVR
# Supported render pipelines
- Universal Render Pipeline (URP)
- Standard RP
# Getting started with HPTK (Oculus Quest)
1. Obtain **HPTK**
1. Change **ProjectSettings & BuildSettings**
1. Import the built-in **integration package** (if needed)
1. Drag & drop the **default setup** to your scene
1. **Build and test**
Check the [documentation](https://jorge-jgnz94.gitbook.io/hptk/setup) for a detailed **step-by-step guide**.
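For step 1, one common way to obtain a Unity package is through a Git URL entry in `Packages/manifest.json`. The package name and repository URL below are assumptions for illustration; check the setup documentation for the officially supported install path.

```json
{
  "dependencies": {
    "com.jorgejgnz.hptk": "https://github.com/jorgejgnz/HPTK.git"
  }
}
```

Unity's Package Manager resolves Git URLs on the next editor launch or domain reload; cloning the repository into `Assets/` works as well if the project is not set up for UPM.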
# Author
**Jorge Juan González**
[LinkedIn](https://www.linkedin.com/in/jorgejgnz/) - [Twitter](https://twitter.com/jorgejgnz) - [GitHub](https://github.com/jorgejgnz)
## Acknowledgements
**Oxters Wyzgowski** - [GitHub](https://github.com/oxters168) - [Twitter](https://twitter.com/OxGamesCo)
**Michael Stevenson** - [GitHub](https://github.com/mstevenson)
Andreea Muresan, Jess Mcintosh, and Kasper Hornbæk. 2023. [Using Feedforward to Reveal Interaction Possibilities in Virtual Reality](https://dl.acm.org/doi/full/10.1145/3603623). ACM Trans. Comput.-Hum. Interact. 30, 6, Article 82 (December 2023), 47 pages. https://doi.org/10.1145/3603623
Nasim, K., Kim, Y.J. Physics-based assistive grasping for robust object manipulation in virtual reality. Comput Anim Virtual Worlds. 2018; 29:e1820. [https://doi.org/10.1002/cav.1820](https://doi.org/10.1002/cav.1820)
Linn, Allison. Talking with your hands: How Microsoft researchers are moving beyond keyboard and mouse. The AI Blog, Microsoft, 2016. [https://blogs.microsoft.com/ai/talking-hands-microsoft-researchers-moving-beyond-keyboard-mouse/](https://blogs.microsoft.com/ai/talking-hands-microsoft-researchers-moving-beyond-keyboard-mouse/)
# License
[MIT](./LICENSE.md)