Closed — Paultheslayer21 closed this issue 1 year ago
It is totally possible. I actually worked on this a while ago, but stopped because most of the VR headsets supported by Tobii Pro lost support and I had no way to test the SDK.
Which devices are you planning to utilize with the Tobii Pro SDK?
I have a Vive 1.0 with a Tobii eye tracker added on to it. It was the precursor to the Vive Pro Eye.
This device was discontinued by Tobii Pro and replaced by the HTC Vive Pro Eye, so chances are the current Pro SDK may not like it; however, the Tobii XR SDK should still support it. One slight issue: for "unfiltered" data you need a Tobii Ocumen license, and who knows how much that costs or what hurdles you have to jump through to get it.
All I know you're allowed to access on the XR SDK without Ocumen licensing is:
With the Ocumen licensing:
I'm unsure if Ocumen's features are 'free' with the Pro SDK; then again, I'm not sure the Pro SDK will even work with this HMD. It's just something that would need to be tested and verified, or asked of Tobii.
I'll look into what steps to take for Tobii Ocumen and its features.
Regardless, that would require storing a license in the source, which isn't possible. Also, note that I said the Tobii Pro SDK *might* work with the headset; Tobii Pro just says it's discontinued and replaced by the Vive Pro Eye.
According to this forum post, the Vive Eye Devkit should work with SRanipal: https://developer.tobii.com/community/forums/topic/no-longer-able-to-access-individual-eye-gaze/
AFAIK VRCFT will only enable SRanipal eye tracking if the headset in question is the Vive Pro Eye. Perhaps a secondary check to see if the HMD is the Vive Eye Devkit is all that is needed?
Hey there,
Like @Paultheslayer21, I have the retrofitted original Vive and was interested in getting eye tracking working for VRChat. I tried going through the SRanipal runtime, but unfortunately that did not work; it may be the case that the eye tracker does work with the SRanipal SDK, but I never tested this. In the end, I ended up creating a very hacky custom module with the following SDK: https://vr.tobii.com/sdk/develop/native/stream-engine/
The only disadvantage to this is that you cannot get pupil dilation/constriction data with the general non-business license (similar to what @200Tigersbloxed said), and the actual blinking is a simple boolean with no fine control. You do, however, have access to individual eye data, which you can check in the API reference, e.g.:
tobii_wearable_consumer_eye_t.pupil_position_in_sensor_area_validity
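To illustrate how that validity flag would be used, here is a minimal sketch. The class below is only a Python mock mirroring the C struct's field names, not a real binding; the enum values and the helper function are assumptions for illustration.

```python
from dataclasses import dataclass

# Mock of the C struct tobii_wearable_consumer_eye_t; field names mirror
# the Stream Engine API, but this Python class is purely illustrative.
TOBII_VALIDITY_INVALID = 0
TOBII_VALIDITY_VALID = 1

@dataclass
class WearableConsumerEye:
    pupil_position_in_sensor_area_validity: int
    pupil_position_in_sensor_area_xy: tuple  # normalized (0..1, 0..1)

def usable_pupil_position(eye: WearableConsumerEye):
    """Return the pupil position only when the sample's validity flag is set."""
    if eye.pupil_position_in_sensor_area_validity == TOBII_VALIDITY_VALID:
        return eye.pupil_position_in_sensor_area_xy
    return None
```

The point is simply that each field comes paired with a validity flag, and a module should discard samples whose flag is invalid rather than feed stale coordinates downstream.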
My hacked version of VRCFaceTracking broke due to the new release, but I'll see if I can get the Tobii XR module recreated as a proper module and submitted; it was easy to base it off the PiMax module prior to the release. Ideally it would be nice to use the SRanipal SDK, however, if this headset supports it. Sorry in advance for the terrible job; I have never used C# before, rarely program, and had minimal documentation 😅
Attached in Mega is the super hacky old Tobii XR module you can find in TrackingLibs, the original C demo program and then the demo converted to C# with the included Stream Engine bindings. https://mega.nz/file/YYIzSRSa#WPQ4-cbi_xpeBSNrWACwxxXdjqNp6Vb0tl2bn-hrFBs
As I have access to the headset, I will try to set up the SRanipal SDK to confirm whether this headset supports it.
On further inspection, I am getting the following icon
Which corresponds to
I therefore doubt the claim that SRanipal supports the Tobii Devkit, considering it is the predecessor to the Vive Pro Eye and is consequently discontinued. Accessing it via the Tobii XR SDK appears to be the only still-supported way.
You can set up the SDK for the Devkit via Steps 1 and 2 here: https://vr.tobii.com/sdk/develop/unity/getting-started/tobii-htc-dev-kit/
As previously planned, the best solution is to go ahead and put together a module for Tobii XR supported devices.
Returning to this: I've just finished a VRCFaceTracking module built on top of the Tobii Stream Engine, which should in theory work with the majority of Tobii devices. I've only tested/manually calibrated it as best I could for the HTC Vive Tobii Devkit, which is what I use for now. I'll be working on a separate small calibration application, however.
I'm actually pretty curious what needs calibration in the module? For reference: the gaze rotation from the Vive Pro Eye gets translated from degrees into radians, from the gaze quaternion/vector to VRCFT's gaze, and openness is how open the eye is (I suppose in the case of Tobii it could lerp from 0 to 1, like how the Varjo module did it before variable openness was introduced).
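Those two conversions are simple enough to sketch directly. The function names below are illustrative, not VRCFT's actual API; the clamped 0..1 ramp stands in for the old Varjo-style openness lerp.

```python
import math

def gaze_degrees_to_radians(deg: float) -> float:
    # VRCFT's gaze expects radians; SRanipal-style sources report degrees.
    return deg * math.pi / 180.0

def lerp_openness(t: float) -> float:
    # Linear 0..1 ramp, clamped, like the old Varjo module's openness
    # before variable openness existed: 0 = fully closed, 1 = fully open.
    return max(0.0, min(1.0, t))
```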
Hi there @regzo2, "calibration" is quite a harsh word, in reality. To sum it up: the Stream Engine method I am using to gather the eye coordinates should range from (0, 0) at the top left to (1, 1) at the bottom right. To make this work with conventional VRCFT parameter settings, multiplying by 2 and subtracting 1 works for the x axis, and basically the inverse for the y axis, so the conversion isn't the issue in particular. (See https://tobiitech.github.io/stream-engine-docs/ pupil_position_in_sensor_area_xy; position_guide_xy would return nothing for some reason.)
However, after testing with my Tobii Devkit, I noticed that even when looking as far left as humanly possible, and after recalibrating my Tobii headset several times over, the maximum value for looking left would be something more like 0.3-0.5 on the x axis, not the 0 to 0.1 or so that was expected, and vice versa for the right side. And this is the raw data, not any converted data.
Perhaps this isn't an issue on other Tobii trackers, I am not sure. Nevertheless, to solve this and increase the "sensitivity", I set an arbitrary maximum value constant for each axis based on observations and calculated a percentage so that the max constant maps to the maximum VRCFT value, e.g. 0.5 in Tobii now equals 1 in VRCFT.
A constant max value of 0.5 appears to work well enough for the Tobii HTC Vive Devkits, but I can't guarantee the same for other devkits or other Tobii products. There may be Tobii products that output the raw 0-1 range correctly, in which case a constant of 1 wouldn't change anything aside from the conversion.
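The rescaling described above amounts to dividing by the observed extreme and clamping. A minimal sketch, where 0.5 is the author's observed extreme for the Devkit rather than a documented value, and the function name is illustrative:

```python
def rescale_gaze(value: float, observed_max: float = 0.5) -> float:
    # 'value' is already converted to the -1..1 space, but the tracker only
    # ever reaches about +/- observed_max in practice. Dividing by that
    # observed extreme stretches it to the full -1..1 range; clamp to be
    # safe against samples beyond the assumed maximum.
    scaled = value / observed_max
    return max(-1.0, min(1.0, scaled))
```

With this, a raw extreme of 0.5 maps to a full 1.0 in VRCFT terms, which matches the "0.5 in Tobii now equals 1 in VRCFT" behavior described.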
Perhaps lerp would simplify a lot of this, but I'm still under the impression that if this is consistent across Tobii trackers, a maximum value for each axis/side will need to be calculated regardless. I may be overcomplicating things drastically; correct me if I'm wrong though, I'm a rookie!
I'll try to poke/mess around with the Stream Engine API to see if I can find a method that provides more accurate eye tracking without all of the math hackery, but this is the working solution for now; you can see demonstrations of my implementation on my GitHub page.
Oh you're perfectly fine! I was just curious what you were meaning.
I was looking through your code, and you might be far more interested in subscribing to tobii_wearable_advanced_data_subscribe instead of the consumer data. If you still want to remain on consumer data: instead of using the pupil position (which might not be giving the data you are expecting), you could use the combined normalized gaze from the consumer data and modify it using the convergence to get focal tracking for each eye. Though if you can use the advanced data, it may be simpler to use its individual normalized eye gazes, which is likewise what VRCFT uses directly from the SRanipal SDK.
Thank you for your input. I certainly had a look at tobii_wearable_advanced_data_subscribe, and it would be the preferred approach; unfortunately, it needs the TOBII_FEATURE_GROUP_PROFESSIONAL license according to https://developer.tobii.com/product-integration/stream-engine/ (tobii_licensing.h). Per the Stream Engine docs, the license key is stored on the device itself via the tobii_license_key_store/tobii_license_key_retrieve methods, and the key is read and the appropriate access granted as part of tobii_device_create, which would fail without the license. This means users not on Pro devices would need to get their own license key, which may prove inconvenient for regular users.
I may make switching to the advanced data an option for users in the future, however, if I can attain a professional license with no issues. Also, technically the original issue opener @Paultheslayer21 did request Tobii Pro VR in particular, so leveraging the Pro license/API would be preferred to satisfy this issue.
Ideally, to support all devices with no issues, the consumer data would be preferred. Your consumer data approach seems interesting; apologies, though, I am having a hard time following it. In essence, won't the combined normalized gaze data and convergence be the same thing, since the gaze_direction_combined_normalized_xyz method should also include the distance to the convergence/gaze point in the z axis?
...method should also include the distance to the convergence/gaze point in the z axis?
The gazes don't include anything about convergence on the combined gaze z axis as far as I am aware; usually it represents the rotation on the z axis of the eye (or the roll of the eyes).
Sorry about being a bit vague on the gazes; I was proposing that you could use the convergence data to rebuild the individual left/right gazes from the combined data if you had to use the consumer data. This would retain default combined-eye functionality if a Tobii VR interface cannot provide convergence, but would allow an interface to converge the eyes in VRCFT if it can.
From the consumer data, the combined gaze is the average of each eye's gaze, and the convergence of the eyes (the distance in mm at which the eyes converge) is given. Using these two pieces of data you should be able to reconstruct separate left/right gazes instead of a combined gaze, and using the user's IPD you should be able to extrapolate from this diagram to get individual gazes:
This diagram I made is a pretty simplified version of what I was thinking: essentially, you would use trig to convert the convergence, the IPD, and the combined gaze (in green) into an offset angle that could be added to or subtracted from each eye's gaze respectively (in black). The only issue is that you would have to convert from this angle to whatever notation Tobii is using; I believe Tobii's eye tracking is in normalized cartesian units, so the angle would need converting to cartesian. For the x axis I believe you can just use x = r·cosθ with r = 1 to get the cartesian unit to add/subtract for each eye's gaze.
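A rough numeric sketch of that geometry, with all names illustrative. It assumes convergence distance and IPD in millimeters, a roughly forward-facing combined gaze, and measures each eye's offset angle from the straight-ahead axis (so the horizontal component of a unit gaze vector is the sine of that angle, rather than the cosine as written above, which would be the forward component):

```python
import math

def eye_gaze_offsets(convergence_mm: float, ipd_mm: float):
    """Per-eye horizontal gaze offsets reconstructed from a combined gaze.

    Returns (left_offset, right_offset): normalized horizontal components
    to add to the combined gaze's x. The eyes toe inward toward the
    convergence point, so the left eye rotates right (+x) and the right
    eye rotates left (-x).
    """
    half_ipd = ipd_mm / 2.0
    # Angle between each eye's line of sight and straight ahead.
    theta = math.atan2(half_ipd, convergence_mm)
    # Horizontal component of a unit gaze vector rotated by theta.
    offset = math.sin(theta)
    return offset, -offset
```

As a sanity check on the behavior: the closer the convergence point, the larger the inward offsets, and at large convergence distances the offsets shrink toward zero (the eyes look effectively parallel).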
Going to close this issue to further discussion, as there is an external module that solves it; any further issues with the Tobii XR tracking interface or tracking module should be filed on the respective module repo. Thank you @Spacur for your work on the module! It will be included in the modules list in the VRCFaceTracking README.
I'm just wondering if it would be possible to implement Tobii Pro VR eye tracking. I don't know how difficult it would be, considering it doesn't support SRanipal. It uses the Tobii Pro SDK.
Tobii Pro SDK and Setup: https://www.tobiipro.com/product-listing/tobii-pro-sdk/
Documentation: https://developer.tobiipro.com/
Unity guide: https://vr.tobii.com/sdk/develop/unity/getting-started/tobii-htc-dev-kit/