Open shinyquagsire23 opened 9 months ago
@shinyquagsire23 In accessibility settings, we can enable pointer control with the eyes, and a white pointer shows on screen, just like the iPad mouse pointer but driven by the eyes. Can we obtain the position of this pointer by placing a virtual screen and capturing the mouse events?
https://developer.apple.com/documentation/realitykit/arview/mousemoved(with:)
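For reference, the linked `mouseMoved(with:)` override is the macOS `ARView` path (where `ARView` is an `NSView`); a probe there would look like the sketch below. Whether the visionOS accessibility pointer ever routes through anything analogous is exactly the open question — the class name here is hypothetical.

```swift
import AppKit
import RealityKit

// Hypothetical probe view: on macOS, ARView inherits from NSView, so
// mouseMoved(with:) fires once a tracking area is installed. If the
// accessibility eye pointer were delivered as mouse events, this is
// where they would show up.
class PointerProbeView: ARView {
    override func viewDidMoveToWindow() {
        super.viewDidMoveToWindow()
        // Without a tracking area, mouseMoved(with:) is never called.
        addTrackingArea(NSTrackingArea(
            rect: bounds,
            options: [.mouseMoved, .activeAlways, .inVisibleRect],
            owner: self))
    }

    override func mouseMoved(with event: NSEvent) {
        // Convert from window coordinates into this view's space.
        let p = convert(event.locationInWindow, from: nil)
        print("pointer at \(p)")
        super.mouseMoved(with: event)
    }
}
```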
I wish lol. None of the accessibility stuff even seems to work in immersive mode: keyboard-triggered pinch events report (0, 0, 0) as their gaze ray, and we only have one event handler accessible because there are no windows.
@shinyquagsire23 I can still see a mouse pointer in immersive mode. I'll test whether it delivers correct mouse input.
Yes, it's visible, but there's no flat plane to get the movements from. Frankly, I doubt even flat-plane apps get them.
Can we put a huge cube in the scene to capture it?
Technically maybe, but the flip side is that I don't know whether those movements can even be delivered to entities at all. And Apple doesn't allow programmatically moving windows, so the user would have to position them by hand :/
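For what it's worth, the "huge cube" idea can be set up with real RealityKit/SwiftUI API (an oversized invisible collision box plus an `InputTargetComponent`, watched by a targeted gesture); the untested part is whether the accessibility pointer actually generates these spatial events. A minimal sketch:

```swift
import SwiftUI
import RealityKit

// Sketch: a huge invisible input-target box surrounding the user.
// If the accessibility pointer produces spatial events at all, the
// drag gesture below would receive them with a 3D location.
struct PointerProbe: View {
    var body: some View {
        RealityView { content in
            let cube = Entity()
            // No ModelComponent, so the box is invisible; collision
            // shape + input target is enough to receive input.
            cube.components.set(InputTargetComponent())
            cube.components.set(CollisionComponent(
                shapes: [.generateBox(size: [10, 10, 10])]))
            content.add(cube)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // location3D is in the targeted entity's local space.
                    print("event at \(value.location3D)")
                }
        )
    }
}
```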
Possibly relevant docs:
https://developer.apple.com/documentation/compositorservices/4082136-cp_drawable_get_rasterization_ra
https://developer.apple.com/documentation/metal/mtlrasterizationratemap
The rasterization rate map might leak some basic eye tracking, since foveated rendering concentrates shading density where the user is looking? Needs investigation. And maybe pestering Apple for an actual permission + API if the above doesn't work.
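One way to test the leak hypothesis, sketched below under the assumption that the per-frame `MTLRasterizationRateMap` is foveated: scan screen coordinates and estimate local shading density via finite differences of the screen-to-physical mapping, then take the densest cell as a crude gaze proxy. Obtaining the map from a `cp_drawable` is the Compositor Services side and is omitted here; the function name is hypothetical, but the `MTLRasterizationRateMap` calls are real.

```swift
import Metal

// Crude gaze-proxy estimate: where the rate map packs the most physical
// texels per screen-space step, the shading rate is highest, which for a
// foveated map should be near the gaze point.
func estimateGazePoint(from rateMap: MTLRasterizationRateMap,
                       layer: Int = 0, samples: Int = 64) -> MTLCoordinate2D {
    let screen = rateMap.screenSize
    let stepX = Float(screen.width) / Float(samples)
    let stepY = Float(screen.height) / Float(samples)
    var best = MTLCoordinate2D(x: 0, y: 0)
    var bestDensity: Float = -1
    for iy in 0..<samples {
        for ix in 0..<samples {
            let x = Float(ix) * stepX, y = Float(iy) * stepY
            let p0 = rateMap.physicalCoordinates(
                screenCoordinates: MTLCoordinate2D(x: x, y: y),
                layerIndex: layer)
            let p1 = rateMap.physicalCoordinates(
                screenCoordinates: MTLCoordinate2D(x: x + stepX, y: y + stepY),
                layerIndex: layer)
            // Physical-texel area covered by one screen-space cell.
            let density = abs(p1.x - p0.x) * abs(p1.y - p0.y)
            if density > bestDensity {
                bestDensity = density
                best = MTLCoordinate2D(x: x + stepX / 2, y: y + stepY / 2)
            }
        }
    }
    return best  // screen-space coordinate of the densest cell
}
```

Even if this works, it would presumably be low-resolution and one frame behind, hence the note about asking Apple for a real permission-gated API.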