Currently, the TinyPilot web UI is almost unusable from a touch device (e.g., a tablet). We want to improve this situation, and based on our discussion in the exploratory proof-of-concept branch we want to do this in multiple steps. The first step (this PR) includes:
(a) Resolving the major UI issues that effectively prevent usage from a touch device altogether. The main problems we identified are:
The mobile OS pulls up the native keyboard as soon as the remote screen receives focus.
The mobile OS intercepts and handles touch actions with its own native logic.
(b) Interpreting single taps as single left clicks
(a) is mainly achieved by calling preventDefault() on the touch events. For (b), we introduce an adapter class that translates touch events into synthetic mouse events. The idea is that we don’t clutter the <remote-screen> component (which is already quite complex) with more logic, but instead keep the touch logic separated as cleanly as possible.
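To illustrate the approach, here is a minimal sketch of such an adapter. The class name, method names, and the plain-object return values are illustrative assumptions, not the actual TinyPilot implementation; the real code dispatches synthetic mouse events to the existing mouse handlers.

```javascript
// Hypothetical sketch of a touch-to-mouse adapter. It translates a
// touchstart/touchend pair into mouse-event-like data so the existing
// mouse handling logic can stay unchanged.
class TouchToMouseAdapter {
  fromTouchStart(touchEvent) {
    // preventDefault() stops the mobile OS from applying its native
    // touch behavior (scrolling, text selection, focus handling).
    touchEvent.preventDefault();
    const touch = touchEvent.touches[0];
    return {
      type: "mousedown",
      buttons: 1, // Left button pressed.
      clientX: touch.clientX,
      clientY: touch.clientY,
    };
  }

  fromTouchEnd(touchEvent) {
    touchEvent.preventDefault();
    // On touchend, the finger is no longer in `touches`, so we read
    // the final position from `changedTouches`.
    const touch = touchEvent.changedTouches[0];
    return {
      type: "mouseup",
      buttons: 0, // No buttons pressed: single tap = single left click.
      clientX: touch.clientX,
      clientY: touch.clientY,
    };
  }
}
```

The `<remote-screen>` component would then register `touchstart`/`touchend` listeners, pass the events through the adapter, and feed the results into its existing mouse-event pipeline.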
So as of this PR, you would be able to use the TinyPilot web UI on a touch device, and issue single left clicks on the remote screen.
For testing this PR, it’s probably best to use a real touch device. As an alternative, the browser dev tools offer a touch device emulation (see this Chrome guide for example), but that has some limitations.
If you don’t have a tablet at hand, a code review alone would suffice for me. I’ve tested the changes using an iPad, with a macOS computer as the target machine. It’s also hard to really break anything with this PR, as the current touch experience is basically non-existent.
Resolves https://github.com/tiny-pilot/tinypilot/issues/270.