As mentioned in #207 and #217, a primary interaction mode for AceJump is staring at a location in the editor and jumping to a fixed spot on the screen. The goal of this feature is to use a gaze detection library (such as eye-gaze or deepgaze) to prioritize tags near where the user is staring, or to update the cursor position while the keyboard is inactive. We could use gaze as a discrete control mechanism (e.g. blink to jump), or as a smooth control feature to adjust the caret position within the editor.
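For the "smooth control" variant, raw gaze samples are typically too noisy to drive the caret directly, so some smoothing would be needed first. A minimal sketch using an exponential moving average; the function name and sample format are hypothetical illustrations, not part of AceJump or any gaze library:

```python
# Sketch: exponential-moving-average smoothing of noisy gaze samples
# before mapping them to an editor line/column. All names here are
# hypothetical, not AceJump or deepgaze APIs.

def smooth_gaze(samples, alpha=0.3):
    """Return EMA-smoothed (x, y) gaze points; alpha in (0, 1].

    Higher alpha tracks the eye more responsively; lower alpha
    suppresses jitter at the cost of lag.
    """
    smoothed = []
    sx = sy = None
    for x, y in samples:
        if sx is None:
            # Seed the average with the first sample.
            sx, sy = float(x), float(y)
        else:
            sx = alpha * x + (1 - alpha) * sx
            sy = alpha * y + (1 - alpha) * sy
        smoothed.append((sx, sy))
    return smoothed
```

The smoothed point could then be converted to an editor offset via the IDE's screen-to-logical-position mapping before moving the caret.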
Ideas for implementation:
- Use an activation sequence or a push-to-talk-style key to avoid spuriously triggering Gaze Mode
- Use reinforcement learning while the user is typing to train the model (note: should be opt-in)
- Jump locations can be filtered by gaze direction (e.g. when the user activates Gaze Mode explicitly)
- Gaze direction or head movement can be used to control the scroll bar (scroll mode)
- Use gaze direction to target buttons or menu options in the IDE, then confirm the selection with the keyboard
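The tag-prioritization and gaze-filtering ideas above could share one primitive: rank candidate jump targets by their on-screen distance to the current gaze point, discarding anything outside a fixation radius. A sketch under those assumptions; the function name, coordinate units, and default radius are illustrative, not an existing API:

```python
import math

def prioritize_tags(tag_positions, gaze_point, radius=150.0):
    """Order candidate jump targets by distance to the gaze point.

    tag_positions: iterable of (x, y) screen coordinates of tag anchors.
    gaze_point:    (x, y) estimated fixation point from the gaze library.
    radius:        pixels; targets farther than this are dropped, which
                   implements the "filtered by gaze direction" mode.
    """
    gx, gy = gaze_point
    nearby = [
        (math.hypot(x - gx, y - gy), (x, y))
        for x, y in tag_positions
    ]
    # Keep only targets within the fixation radius, nearest first,
    # so the shortest tags get assigned to the likeliest targets.
    nearby = [(d, pos) for d, pos in nearby if d <= radius]
    nearby.sort(key=lambda entry: entry[0])
    return [pos for _, pos in nearby]
```

In AceJump itself this ordering would feed the tag assigner, so single-character tags land on the targets closest to where the user is looking.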
Libraries:
- eye-gaze
- deepgaze