MixedRealityToolkit / MixedRealityToolkit-Unity

This repository holds the third generation of the Mixed Reality Toolkit for Unity. The latest version of the MRTK can be found here.
BSD 3-Clause "New" or "Revised" License

[MRTK3] When the MetaXR feature is enabled in the OpenXR Feature Group, the Ray is no longer displayed in Quest hand tracking. #113

IssueSyncBot opened this issue 10 months ago (status: Open)

IssueSyncBot commented 10 months ago

Original issue opened by:

@spg666


Describe the bug

When the MetaXR feature is enabled in the OpenXR Feature Group, the Ray is no longer displayed in Quest hand tracking.

To reproduce

Steps to reproduce the behavior:

  1. Use the VR template of Unity 2021.3.25f1.
  2. Install Mixed Reality OpenXR 1.7.2, MRTK Input 3.0.0-pre15, MRTK UX Components 3.0.0-pre15 using MixedRealityFeatureTool.exe.
  3. Install Oculus Integration SDK 53.1.
  4. Enable MetaXR feature in the OpenXR Feature Group.
  5. Build and run on Quest 2.
  6. Notice that the Ray is not displayed during hand tracking.

Expected behavior

Even when the MetaXR feature is enabled, the Ray should still be visible during hand tracking on Quest.

Your setup (please complete the following information)

Target platform (please complete the following information)

Additional context

When the MetaXR feature is deactivated within the OpenXR Feature Group, the Ray properly appears during hand tracking on the Quest. Furthermore, while the hand tracking system allows for successful button interaction, it fails to respond to the ObjectManipulator. The Ray is visible when using Oculus Link to run the application within the Unity Editor.


ISSUE MIGRATION

Issue migrated from: https://github.com/microsoft/MixedRealityToolkit-Unity/issues/11553

IssueSyncBot commented 10 months ago

Original comment by:

@MickGerrit


I was having the same issue. I ran into it while trying to get passthrough working by importing Oculus Integration. With MetaXR enabled, not only does the ray no longer follow my hands, I also could not do any pinch interactions. I was, however, still able to press buttons with the tip of my index finger.

IssueSyncBot commented 10 months ago

Original comment by:

@Phantomxm2021


I was having the same issue. I debugged MRTKRayInteractor's Update method: the aimPose and devicePose values are always zero.

mtucker-virtra commented 9 months ago

Any workaround besides disabling the Meta XR feature group entirely? Any other thoughts/ideas on how to get these important features working?

Ali-Can-Keskin commented 7 months ago

So, there is still no update or workaround for this?

spg666 commented 7 months ago

Not all features of the Meta XR feature group are available, but with Unity OpenXR: Meta, passthrough and some other features work without Oculus Integration. However, I am just a user, so I am not sure whether this workaround is correct.

Ali-Can-Keskin commented 7 months ago

Thank you for this message. I will use this workaround/solution until they officially support it.

ggYANG123 commented 4 months ago

I am facing exactly the same thing. I want to switch between passthrough and no passthrough at runtime, but to get passthrough I had to enable MetaXR, and then the same hand ray problem appeared. When I disable MetaXR, passthrough fails. Is there any update on this issue?

zxzkf1992 commented 2 months ago

I face the same thing on Unity 2022.3.26f1 with Meta XR Core SDK 64.0.0; I am not using Oculus Integration. When I uncheck Meta XR under OpenXR Feature Groups, the ray comes back. Has anyone solved this problem or found another solution?

Update: I am now using the Unity OpenXR: Meta package to enable passthrough, and it works. I just wanted passthrough, so if you have the same requirement, just install this package (com.unity.xr.meta-openxr).
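For reference, the com.unity.xr.meta-openxr package mentioned above can be added through Unity's Package Manager UI or directly in Packages/manifest.json. A minimal sketch of the manifest entry (the version number here is illustrative; use whichever release matches your Editor version):

```json
{
  "dependencies": {
    "com.unity.xr.meta-openxr": "1.0.1"
  }
}
```

After adding it, the Meta OpenXR features appear under the OpenXR feature settings, which is what makes passthrough available without Oculus Integration.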

SashelI commented 4 days ago

Hi, is there still no workaround for this using the Meta SDK?

SashelI commented 17 hours ago

Update @spg666: I managed to get hands and controllers working at far range while keeping the Meta XR SDK, by modifying a few prefabs, scripts, and the interaction profile.

Step 1 - Make sure that PolyfillHandRay is present as a pose source in the MRTKRayInteractor of the "Far Ray" object of each controller in the scene:

*(screenshots)*

Step 2

Note that the polyfill hand pose works for Quest hands, but when Meta XR is enabled the values are not read from the polyfill source but from the "InputAction" source, which is not null but reports all-zero values. We could simply invert the order in the PoseSourceList, but that might cause issues cross-platform and, more importantly, it makes Quest 3 controllers behave incorrectly (the ray would originate from the hand rather than from the tip of the controller). So we modify the FallbackCompositePoseSource script so that InputAction values can be ignored (falling back on the polyfill) for Quest hands, but not for any other controller.

In FallbackCompositePoseSource, add:

```csharp
public bool IsController { get; set; } = false;
```

In FallbackCompositePoseSource, modify TryGetPose:

```csharp
/// <summary>
/// Tries to get a pose from each pose source in order, returning the result of the first pose source
/// which returns a success.
/// </summary>
public bool TryGetPose(out Pose pose)
{
    for (int i = 0; i < poseSourceList.Length; i++)
    {
        IPoseSource currentPoseSource = poseSourceList[i];
        if (currentPoseSource != null && currentPoseSource.TryGetPose(out pose))
        {
#if UNITY_ANDROID && !UNITY_EDITOR
            // We want to use the polyfill source for Quest hands,
            // but InputAction works better with Meta controllers.
            if (currentPoseSource is InputActionPoseSource && !IsController)
            {
                continue;
            }
#endif
            return true;
        }
    }

    pose = Pose.identity;
    return false;
}
```

Step 3

These modifications imply further changes, this time in MRTKRayInteractor:

In MRTKRayInteractor, add:

```csharp
protected override void Start()
{
    base.Start();

    if (controllerDetectedAction == null || controllerDetectedAction.action == null) { return; }
    controllerDetectedAction.action.started += ControllerStarted;
    controllerDetectedAction.action.canceled += ControllerEnded;
    controllerDetectedAction.EnableDirectAction();
}

private void ControllerStarted(InputAction.CallbackContext context)
{
    if (context.control.device is UnityInputSystem.XR.XRController)
    {
        isController = true;
    }
}

private void ControllerEnded(InputAction.CallbackContext context)
{
    if (context.control.device is UnityInputSystem.XR.XRController)
    {
        isController = false;
    }
}
```
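The additions above reference a serialized action (`controllerDetectedAction`) and a flag (`isController`) whose declarations are not shown. A minimal sketch of the assumed supporting members follows; the field names match the snippet, but their exact types and wiring are an assumption (shown as an InputActionReference to match the null checks above; depending on the MRTK/XRI version this may instead be an InputActionProperty). Unsubscribing is also worth adding, assuming the base interactor exposes OnDestroy as virtual:

```csharp
// Assumed supporting members for the MRTKRayInteractor additions above (hypothetical wiring).
[SerializeField]
[Tooltip("Action that fires when a physical controller becomes tracked.")]
private InputActionReference controllerDetectedAction;

// True while the active device is a physical XR controller; read each frame in Update.
private bool isController = false;

// Unsubscribe so the callbacks are not invoked on a destroyed component.
protected override void OnDestroy()
{
    base.OnDestroy();
    if (controllerDetectedAction != null && controllerDetectedAction.action != null)
    {
        controllerDetectedAction.action.started -= ControllerStarted;
        controllerDetectedAction.action.canceled -= ControllerEnded;
    }
}
```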

In MRTKRayInteractor's Update method, add this before the existing code:

```csharp
if (AimPoseSource is FallbackCompositePoseSource poseSource)
{
    poseSource.IsController = isController;

    if (poseSource.TryGetPose(out Pose aimPose))
    {
        transform.SetPositionAndRotation(aimPose.position, aimPose.rotation);

        if (hasSelection)
        {
            float distanceRatio = PoseUtilities.GetDistanceToBody(aimPose) / refDistance;
            attachTransform.localPosition = new Vector3(initialLocalAttach.position.x,
                initialLocalAttach.position.y, initialLocalAttach.position.z * distanceRatio);
        }
    }
}
else
{
    // [current code]
}
```

Now we have hand tracking and ray display, BUT the pinch action is no longer detected.

Step 4 - In the MRTK interaction profile, add the following for each hand:

(the provider comes from the Meta aim package of openxr-meta)

*(screenshots)*

Step 5 - My OpenXR settings are as follows:

*(screenshots)*

There you go!