microsoft / MixedReality-UXTools-Unreal

UX tools and components for developing Mixed Reality applications in UE4.
https://microsoft.github.io/MixedReality-UXTools-Unreal/
MIT License
315 stars 87 forks

Unable to Interact with any Controls on Quest #68

Open druidsbane opened 1 year ago

druidsbane commented 1 year ago

Hi,

I've tried the latest plugin and examples on UE 4.27, 5.0 and 5.1 using the Oculus Quest 2 both with AirLink and deployed natively. I am unable to get the hands to interact or show any form of laser pointer. I can get menus and buttons statically added to a level but the hands just have no effect.

Are there additional steps that need to be taken or does the current plugin not really leverage OpenXR hand tracking for anything and this is reserved purely for the HoloLens?

Thanks!

ri5c commented 1 year ago

I honestly do not think they care about Unreal; the focus is really on Unity. I could not find enough documentation to get it working on Unreal, so I switched to Unity, but nothing works right there either.

druidsbane commented 1 year ago

Agreed. Some of this appears to be on the Oculus side as well, though: they aren't using the latest OpenXR setup for hand tracking, and there is no clarity on when, if ever, they will. Reading through the source here, though, it is clear that this could have been made more generic so that we could integrate the pointers as we see fit. It wouldn't be so bad to have a pointer component one could attach directly to a digit, performing all the transforms needed to keep it in the right spot. All the logic for poking and other interactions could have been abstracted into an event handler rather than living directly in the NearPointer and FarPointer.