Closed brandf closed 3 years ago
Due to the way Unity's Eyes struct was originally designed (separate left and right eyes, vs HL2's single combined gaze), HL2 eye data couldn't be piped through the same system.
You can access it through the WMR-specific usages (EyeGazePosition and EyeGazeRotation) on UnityEngine.XR.InputDevices.GetDeviceAtXRNode(XRNode.CenterEye), but I know that's not exactly the cross-platform API you're looking for. This should be improved in future releases.
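A minimal sketch of that query path, for reference (assumes the `UnityEngine.XR.WindowsMR` namespace from the Windows XR Plugin package; untested):

```csharp
using UnityEngine;
using UnityEngine.XR;
using UnityEngine.XR.WindowsMR;

public class WmrEyeGazeSample : MonoBehaviour
{
    void Update()
    {
        InputDevice centerEye = InputDevices.GetDeviceAtXRNode(XRNode.CenterEye);
        if (!centerEye.isValid)
            return;

        // Query the WMR-specific eye gaze usages off the center eye device.
        if (centerEye.TryGetFeatureValue(WindowsMRUsages.EyeGazeAvailable, out bool available) && available
            && centerEye.TryGetFeatureValue(WindowsMRUsages.EyeGazePosition, out Vector3 gazePosition)
            && centerEye.TryGetFeatureValue(WindowsMRUsages.EyeGazeRotation, out Quaternion gazeRotation))
        {
            Debug.Log($"Eye gaze: {gazePosition} / {gazeRotation.eulerAngles}");
        }
    }
}
```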
Do you have any sample code that applies EyeGazeRotation to a Transform? It almost seems like there is a bug in the conversion to Unity's Quaternion.
I can't figure out what coordinate frame it's supposed to be in, and the docs don't specify. I tried assuming it was world space like other positions/rotations, but things seem backwards. If I set a Transform.rotation directly to EyeGazeRotation, it points backwards (e.g. euler: 18.3, 181.2, 0.0).
There also seems to be a bug where eye tracking works once in Holographic Remoting, and then if you reconnect and play, you'll just get identity for EyeGazeRotation/Position even though WindowsMRUsages.EyeGazeAvailable is true.
The only way I've found to fix it is to restart Unity, which makes development on this platform a nightmare.
Ok, after some more investigation and a dozen Unity restarts, I'm pretty sure it's just a buggy/broken API. If not, please provide an example @keveleigh, since it's not at all obvious how you're supposed to use it. Google comes up with nothing, so I don't know if it ever worked.
The closest workaround I've been able to come up with is:
```csharp
if (XRHMDDevice.InputDevice.TryGetFeatureValue(WindowsMRUsages.EyeGazeRotation, out var gazeRotation))
    XRHMDDevice.Gaze.rotation = Quaternion.Euler(0, 180, 0) * Quaternion.Inverse(gazeRotation);
```
This works perfectly as long as you're facing down world Z+, but if you turn right/left then the up/down eye movement doesn't work as expected.
I'll take a look at the implementation. It's possible the data isn't coming through as expected.
I recorded a video to demonstrate the issue. Interpreting it directly as a world space rotation resulted in the gaze pointing behind me, so this is using the 'closest workaround' above.
https://drive.google.com/file/d/1afpE5Izk5NPDYvlY1H5JpW0i0-qNGkhZ/view?usp=sharing
FWIW, I don't get the same bug using SpatialPointerPose.TryGetAtTimestamp. I'm using that for now, but it's not clear how to make it work with Holographic Remoting, so my iteration times are down the tube (I have to build a full player every time).
Haven't quite gotten to the bottom of this one yet. MRTK has some helper methods showing the math for converting from Windows' coordinate system to Unity's, if it turns out that's where the underlying problem is (you won't be able to use them exactly due to the types, but they at least show the math):
It's also looking like a possibility that the data is sent relative to the head instead of relative to the origin (basically, head local space vs world space), if that helps you get it working. I'll continue digging on the Unity side though.
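If the head-local hypothesis is right, composing the gaze with the head pose should bring it into world space. A rough, untested sketch (assumes the standard CommonUsages head pose features are available):

```csharp
// Hypothetical: treat EyeGazePosition/EyeGazeRotation as head-local
// and compose with the head pose to get a world-space gaze.
InputDevice head = InputDevices.GetDeviceAtXRNode(XRNode.Head);
InputDevice centerEye = InputDevices.GetDeviceAtXRNode(XRNode.CenterEye);

if (head.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 headPos)
    && head.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion headRot)
    && centerEye.TryGetFeatureValue(WindowsMRUsages.EyeGazePosition, out Vector3 localGazePos)
    && centerEye.TryGetFeatureValue(WindowsMRUsages.EyeGazeRotation, out Quaternion localGazeRot))
{
    Vector3 worldGazePos = headPos + headRot * localGazePos; // head-local -> world
    Quaternion worldGazeRot = headRot * localGazeRot;
}
```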
I did try applying that at one point, to see if it was really a Numerics Quaternion even though the API returns a Unity Quaternion. It didn't help.
I am trying to interface with the HoloLens eye tracker without MRTK, using XR & ARFoundation from Unity directly. @keveleigh I tried InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.EyeTracking ...) without success, then tried device.TryGetFeatureValue(WindowsMRUsages.EyeGazeAvailable ...), and eventually UnityEngine.XR.InputDevices.GetDeviceAtXRNode(XRNode.CenterEye); as you advised. But the next call, TryGetFeatureValue(WindowsMRUsages.EyeGazeRotation, out myGazeVar), always fails. Have you got an idea why? BTW, the GazeInput capability is checked. Hope you'll be able to help me.
Another point: is it possible to get left & right gaze data independently now?
FYI: I can track hand position/rotation without any problem (finding input devices with InputDeviceCharacteristics.Right/Left and calling TryGetFeatureValue(CommonUsages.devicePosition/deviceRotation ...)). I am using Unity 2020.2.2f1 with the ARFoundation 4.1.3 package and the Windows XR Plugin 4.4.0 package (and spatial mapping is working pretty well too).
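For completeness, the hand-tracking query that works for me looks roughly like this (a sketch of the approach described above, not exact code from my project):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR;

// Find right-hand devices and read their pose via the common usages.
var devices = new List<InputDevice>();
InputDevices.GetDevicesWithCharacteristics(InputDeviceCharacteristics.Right, devices);

foreach (InputDevice device in devices)
{
    if (device.TryGetFeatureValue(CommonUsages.devicePosition, out Vector3 position)
        && device.TryGetFeatureValue(CommonUsages.deviceRotation, out Quaternion rotation))
    {
        Debug.Log($"Right hand pose: {position} / {rotation.eulerAngles}");
    }
}
```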
My coworker discovered a workaround for the gaze rotation being wrong. It wasn't using the Unity coordinate system, so you have to convert it:

```csharp
gazeRotation.Set(-gazeRotation.x, -gazeRotation.y, gazeRotation.z, gazeRotation.w);
gazeRotation = gazeRotation * Quaternion.Euler(0, 180, 0);
```
Note that due to another bug, this only works on Windows XR Plugin 2.5.2 or earlier; the latest version won't even give you broken eye gaze.
see https://github.com/microsoft/MixedRealityToolkit-Unity/issues/9470
@keveleigh can you commit to adding some tests for this stuff to Microsoft's test suite? There are so many bugs here that it's like painfully peeling back an onion. I have to assume you guys don't test it at all.
Windows XR Plugin 2.7.0 has now shipped for Unity 2019.4.24+. The eye gaze WindowsMRUsages should now all work, with the addition of a new tracked usage as well. They're also all in a predictable world coordinate system now, so they should be usable without any additional coordinate-system transforms. Please let me know if you run into any issues!
Describe the bug
I am using the XR SDK for hand tracking on HL2 successfully, but doing the same for eye tracking doesn't work.
I expected eye tracking to work via the XR SDK on HoloLens 2, but was surprised to find that InputDevice.TryGetFeatureValue(CommonUsages.eyesData, out var eyesData) returns false, even with the GazeInput capability.
I realize I can use SpatialPointerPose.Eyes, but I'm trying to use non-proprietary APIs as much as possible, since this is a cross-platform application. Given that HL2 supports eye tracking, it seems like it should be exposed via the XR SDK API (CommonUsages.eyesData).
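For context, the cross-platform path I expected to work looks roughly like this (sketch):

```csharp
using UnityEngine;
using UnityEngine.XR;

InputDevice device = InputDevices.GetDeviceAtXRNode(XRNode.CenterEye);

// On HL2 this returns false, which is the bug being reported.
if (device.TryGetFeatureValue(CommonUsages.eyesData, out Eyes eyes))
{
    if (eyes.TryGetFixationPoint(out Vector3 fixationPoint))
        Debug.Log($"Fixation point: {fixationPoint}");
}
```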
To reproduce
Steps to reproduce the behavior:
Expected behavior
TryGetFeatureValue returns true with the eyes data, so I can use the same eye-tracking code on HL2 as I do on other devices.
Target platform (please complete the following information)