Closed: f3-lwiedemann closed this issue 3 years ago
Hi, thanks for flagging this. I will look into it and investigate ways to integrate the UX framework more cleanly with XRI.
Hi @f3-lwiedemann, I did some exploration on this and was able to integrate the XR Interaction Toolkit components for placed objects (translation, scale, rotation). There are two ways to handle object placement. You can use the ARPlacementInteractable and hook the UX callbacks into its placement events in the inspector. Alternatively, you can keep using the PlaceObjectOnPlane script, which is what I did; you just need to configure your placed prefab and add a few extra components to the scene, such as the AR Gesture Interactor and XR Interaction Manager.
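For anyone following along, a minimal sketch of a PlaceObjectOnPlane-style script might look like this. This is not the script from the repo, just an illustration of the pattern: it assumes an ARRaycastManager on the same GameObject, a prefab already configured with the XRI AR interactables (selection, translation, scale, rotation), and an AR Gesture Interactor plus XR Interaction Manager elsewhere in the scene.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Hypothetical sketch of a tap-to-place script. Field names mirror the
// demo's conventions but are invented for this example.
[RequireComponent(typeof(ARRaycastManager))]
public class PlaceObjectOnPlane : MonoBehaviour
{
    [SerializeField]
    GameObject m_PlacedPrefab;  // prefab configured with the XRI AR interactables

    static readonly List<ARRaycastHit> s_Hits = new List<ARRaycastHit>();
    ARRaycastManager m_RaycastManager;

    void Awake() => m_RaycastManager = GetComponent<ARRaycastManager>();

    void Update()
    {
        if (Input.touchCount == 0)
            return;

        var touch = Input.GetTouch(0);
        if (touch.phase != TouchPhase.Began)
            return;

        // Raycast against detected planes and place the prefab at the hit pose.
        if (m_RaycastManager.Raycast(touch.position, s_Hits, TrackableType.PlaneWithinPolygon))
        {
            var hitPose = s_Hits[0].pose;
            Instantiate(m_PlacedPrefab, hitPose.position, hitPose.rotation);
        }
    }
}
```

Once the object is instantiated, the AR Gesture Interactor and XR Interaction Manager in the scene handle the manipulation gestures on the prefab's interactable components.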
Here's a video of it working: https://www.youtube.com/watch?v=Q3IVwFGiqRU
Here's a branch with it all configured properly. I don't want to merge this into master right now, as none of the other demos use XRI, but that may change in the future. https://github.com/Unity-Technologies/arfoundation-demos/tree/XRI_UX_Framework
Very cool! Thank you for exploring this 😊
I have a problem integrating the UX part of this repo into my app. The PlaceObjectOnPlane script replaces the ARPlacementInteractable from the XR Interaction Toolkit but doesn't integrate well with the ARSelectionInteractable: when I tap to deselect an object, PlaceObjectOnPlane places a new object instead. Maybe the UX example could be integrated into the XR Interaction Toolkit itself, since it would be a useful feature for almost all AR apps?
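One possible workaround for the conflict described above is to make the placement script aware of the current selection, so a tap while something is selected deselects it rather than spawning a new object. The sketch below is entirely hypothetical: `SelectionTracker` is an invented helper, and you would need to set its flag from whatever select/deselect events your XRI version exposes on the interactable.

```csharp
// Hypothetical workaround sketch (not part of the repo): a global
// "something is selected" flag that PlaceObjectOnPlane checks before
// spawning. The flag must be updated from the selection events of your
// XRI version (e.g. on select entered / select exited).
public static class SelectionTracker
{
    public static int SelectedCount;  // incremented on select, decremented on deselect
    public static bool AnySelected => SelectedCount > 0;
}

// Then, inside PlaceObjectOnPlane's touch handling, before instantiating:
//
//     if (SelectionTracker.AnySelected)
//         return;  // let the tap deselect via XRI instead of placing a new object
```

This keeps PlaceObjectOnPlane in place while letting the ARSelectionInteractable handle deselection taps.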