microsoft / MixedReality-WebRTC

MixedReality-WebRTC is a collection of components that help mixed reality app developers integrate audio and video real-time communication into their applications and improve their collaborative experience.
https://microsoft.github.io/MixedReality-WebRTC/
MIT License

Help Required: Getting a custom Hand Tracking system as an articulated hand. #840

Closed HyperLethalVector closed 2 years ago

HyperLethalVector commented 2 years ago

Hi everyone!

For experimentation purposes, I've created a virtual hand within the Unity environment, and have been aiming to feed the virtual hand's joint positions into the MRTK as an articulated hand. I've based this on the current MRTK Leap Motion Input Provider.

So far, I've been able to create an Input Provider and successfully raise/close the input source. I've also verified that the joint poses are updating correctly within the Input Provider, and that each 'joint' reports the correct source to the MRTK.

However, I'm not seeing any cursors coming out of the hands (even though all my pointers are attached to the articulated hand), nor are any of my virtual 'pinch' gestures registering. Only the head gaze seems to be working.

Is there something I'm possibly missing?

The codebase for this provider can be found here:

https://drive.google.com/drive/folders/1a81DBVNaS7W6gwd2PYVMI14nKvczHJNc?usp=sharing

Basically, the AutoHand MRTK Skeleton obtains the transforms from the virtual hands, which are then used by the provider.
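For context, a custom MRTK 2.x hand controller usually has to do more than store joint poses: it must raise the joint data and the interaction mappings (e.g. Select for pinch) to the input system each frame, since pointers and cursors are driven by those events rather than by the poses alone. Below is a minimal sketch of that update path; the class name `VirtualArticulatedHand` and the `UpdateHand` method are illustrative and not from the linked codebase, and abstract members of `BaseHand` vary between MRTK versions, so treat this as a sketch rather than drop-in code:

```csharp
using System.Collections.Generic;
using Microsoft.MixedReality.Toolkit;
using Microsoft.MixedReality.Toolkit.Input;
using Microsoft.MixedReality.Toolkit.Utilities;

// Illustrative controller; the input provider would create one per detected hand.
[MixedRealityController(SupportedControllerType.ArticulatedHand,
    new[] { Handedness.Left, Handedness.Right })]
public class VirtualArticulatedHand : BaseHand
{
    public VirtualArticulatedHand(TrackingState trackingState, Handedness handedness,
        IMixedRealityInputSource inputSource = null,
        MixedRealityInteractionMapping[] interactions = null)
        : base(trackingState, handedness, inputSource, interactions) { }

    private readonly Dictionary<TrackedHandJoint, MixedRealityPose> jointPoses =
        new Dictionary<TrackedHandJoint, MixedRealityPose>();

    public override bool TryGetJoint(TrackedHandJoint joint, out MixedRealityPose pose) =>
        jointPoses.TryGetValue(joint, out pose);

    // Called each frame by the provider with poses from the virtual hand skeleton.
    public void UpdateHand(
        IReadOnlyDictionary<TrackedHandJoint, MixedRealityPose> newPoses, bool isPinching)
    {
        foreach (var kvp in newPoses) { jointPoses[kvp.Key] = kvp.Value; }

        // Storing joint data locally is not enough; it must be raised to the input system.
        CoreServices.InputSystem?.RaiseHandJointsUpdated(
            InputSource, ControllerHandedness, jointPoses);

        // Pointers and pinch are driven by the interaction mappings (e.g. Select).
        // If these are never updated, no cursors or pinch events will appear.
        for (int i = 0; i < Interactions?.Length; i++)
        {
            MixedRealityInteractionMapping mapping = Interactions[i];
            if (mapping.InputType != DeviceInputType.Select) { continue; }

            mapping.BoolData = isPinching;
            if (mapping.Changed)
            {
                if (mapping.BoolData)
                {
                    CoreServices.InputSystem?.RaiseOnInputDown(
                        InputSource, ControllerHandedness, mapping.MixedRealityInputAction);
                }
                else
                {
                    CoreServices.InputSystem?.RaiseOnInputUp(
                        InputSource, ControllerHandedness, mapping.MixedRealityInputAction);
                }
            }
        }
    }
}
```

If only the joint dictionary is updated and the interaction mappings are left untouched, the symptoms would match what's described above: correct poses but no cursors and no pinch.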

Any help would be massively appreciated :(

spacecheeserocks commented 2 years ago

I think you may have posted in the wrong project by mistake. This sounds like an MRTK issue?

HyperLethalVector commented 2 years ago

Oh my gosh, you're quite right, I'll mark the issue as closed and repost in the right place ><