alvr-org / alvr-visionos

Experimental visionOS client for ALVR - SteamVR on Apple Vision Pro!
MIT License

Eye tracking (for OSC etc) #21

Open shinyquagsire23 opened 9 months ago

shinyquagsire23 commented 9 months ago

Possibly relevant docs:
https://developer.apple.com/documentation/compositorservices/4082136-cp_drawable_get_rasterization_ra
https://developer.apple.com/documentation/metal/mtlrasterizationratemap

The rasterization rate map might leak some basic eye tracking data? Needs investigation, and maybe pestering Apple for an actual permission + API if the above doesn't work.
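
Rough sketch of what that probe could look like (a sketch only; the Swift signatures here are from memory, and the map itself would come from the drawable, e.g. `cp_drawable_get_rasterization_rate_map` / `drawable.rasterizationRateMaps`):

```swift
import Metal

/// Probe sketch: walk the rasterization rate map in screen space and find the
/// region with the highest sampling density. If the system drives foveation
/// from eye tracking (unconfirmed for the app-visible map), the density peak
/// would be a very coarse gaze estimate.
func estimateFoveationCenter(in rateMap: MTLRasterizationRateMap,
                             layer: Int = 0,
                             steps: Int = 32) -> SIMD2<Float> {
    let screen = rateMap.screenSize
    var best = SIMD2<Float>(0.5, 0.5)
    var bestDensity: Float = -1

    for iy in 0..<steps {
        for ix in 0..<steps {
            // Screen-space sample point, leaving a pixel of headroom for the finite difference.
            let sx = Float(ix) / Float(steps) * Float(screen.width - 2)
            let sy = Float(iy) / Float(steps) * Float(screen.height - 2)

            // How far one screen pixel moves in the physical (rendered) texture.
            // A larger step means a higher sampling rate, i.e. closer to the foveation center.
            let p00 = rateMap.mapScreenToPhysicalCoordinates(MTLCoordinate2DMake(sx, sy), layer: layer)
            let p10 = rateMap.mapScreenToPhysicalCoordinates(MTLCoordinate2DMake(sx + 1, sy), layer: layer)
            let p01 = rateMap.mapScreenToPhysicalCoordinates(MTLCoordinate2DMake(sx, sy + 1), layer: layer)
            let density = (p10.x - p00.x) * (p01.y - p00.y)

            if density > bestDensity {
                bestDensity = density
                // Normalize to 0..1 so the estimate is resolution-independent.
                best = SIMD2<Float>(sx / Float(screen.width), sy / Float(screen.height))
            }
        }
    }
    return best
}
```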

xuhao1 commented 8 months ago

@shinyquagsire23 In Accessibility we can enable pointer control with the eyes, and a white pointer shows on screen, just like the iPad mouse pointer but driven by the eyes. Could we obtain the position of this pointer by putting up a virtual screen and capturing the mouse events?

https://developer.apple.com/documentation/realitykit/arview/mousemoved(with:)

shinyquagsire23 commented 8 months ago

I wish lol. None of the accessibility stuff even seems to work in immersive mode: keyboard-based pinch events report 0,0,0 as their gaze ray, and we only have one event handler accessible because there are no windows.
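
(For reference, the one handler in question is a spatial gesture on the immersive content. A rough SwiftUI sketch, assuming `SpatialEventGesture` on a `RealityView`; `selectionRay` is the gaze-derived ray that comes back zeroed for those keyboard pinches, and the setup details should be treated as assumptions:)

```swift
import SwiftUI
import RealityKit

// Sketch of the single event handler available in the immersive space.
// selectionRay is only populated while a pinch is in flight; for the
// accessibility keyboard pinches it reports a zeroed ray.
struct ImmersiveGazeProbe: View {
    var body: some View {
        RealityView { content in
            // Scene setup elided; indirect pinches generally need an
            // input-targetable entity to land on.
        }
        .gesture(
            SpatialEventGesture()
                .onChanged { events in
                    for event in events where event.phase == .active {
                        if let ray = event.selectionRay {
                            print("pinch gaze ray origin: \(ray.origin), direction: \(ray.direction)")
                        }
                    }
                }
        )
    }
}
```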

xuhao1 commented 8 months ago

@shinyquagsire23 I can still see a mouse pointer in immersive mode. I'll test whether it has correct mouse input.

shinyquagsire23 commented 8 months ago

Yes, it's visible, but there's no flat plane to get the movements from. Frankly, I doubt even flat-plane apps get them.

xuhao1 commented 8 months ago

Could we put up a huge cube to capture it?

shinyquagsire23 commented 8 months ago

Technically maybe, but the flip side is that I don't know if those movements can even be sent to panels at all, and Apple doesn't allow programmatically moving windows, so the user would have to position them by hand :/
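
(For completeness, the "huge cube" catcher would look something like the sketch below: an invisible entity with a big collision box plus an InputTargetComponent so indirect events have something to land on. Whether the accessibility pointer's movements ever actually reach it is exactly the doubt above.)

```swift
import RealityKit

/// Sketch of the proposed catcher: no mesh, just a large collision box and an
/// input target, added to the immersive RealityView content. A SpatialEventGesture
/// (optionally .targetedToEntity(catcher)) would then receive whatever events hit it.
func makeGazeCatcher(size: Float = 100) -> Entity {
    let catcher = Entity()
    catcher.components.set(CollisionComponent(shapes: [.generateBox(size: [size, size, size])]))
    catcher.components.set(InputTargetComponent())
    return catcher
}
```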