Some games render their HUD with a slight distortion, sometimes shifting it a bit to match the player's camera movement and simulate a 3D effect.
While we have plans for per-layer transforms for 3D-looking panels, it might also be useful to support distortion that applies at the fragment level. We'd also want to make sure input lines up with the distorted UI so that the mouse doesn't feel offset.
I think this could be implemented satisfactorily outside of yakui, by projecting yakui's 2D domain (e.g. mapping onto a curved mesh) and inverse-projecting mouse input (e.g. raycasting) however the user likes.
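As a rough illustration of that approach, here's a small, self-contained sketch. None of this is yakui API: `distort`, `undistort`, and the radius/center parameters are hypothetical names standing in for a closed-form cylindrical bend used in place of a real curved mesh. The idea is that the same mapping that warps UI positions onto the screen gets inverted to pull mouse coordinates back into flat UI space before hit testing; a real integration would instead render yakui to a texture, draw it on a curved mesh, and raycast the cursor against that mesh.

```rust
#[derive(Clone, Copy, Debug, PartialEq)]
struct Vec2 {
    x: f32,
    y: f32,
}

/// Forward mapping: bend the flat UI plane around a vertical cylinder of
/// `radius` pixels centered on `center`. In a real setup this warp would
/// live in the mesh (or vertex shader) that the UI texture is drawn onto.
fn distort(p: Vec2, center: Vec2, radius: f32) -> Vec2 {
    let t = (p.x - center.x) / radius; // angle along the cylinder
    Vec2 {
        x: center.x + radius * t.sin(),
        // columns near the edges recede, so pinch them toward the vertical center
        y: center.y + (p.y - center.y) * t.cos(),
    }
}

/// Inverse mapping: take a mouse position in distorted screen space and
/// recover the point in flat 2D UI space, so hit testing still lines up.
/// With an actual curved mesh this step would be a raycast instead.
fn undistort(q: Vec2, center: Vec2, radius: f32) -> Vec2 {
    let t = ((q.x - center.x) / radius).clamp(-1.0, 1.0).asin();
    Vec2 {
        x: center.x + radius * t,
        y: center.y + (q.y - center.y) / t.cos(),
    }
}

fn main() {
    let center = Vec2 { x: 640.0, y: 360.0 };
    let radius = 900.0;

    let ui_point = Vec2 { x: 1100.0, y: 100.0 };
    let on_screen = distort(ui_point, center, radius);
    let recovered = undistort(on_screen, center, radius);

    // Feed `recovered` (not the raw mouse position) into the UI's hit testing.
    println!("ui: {ui_point:?} -> screen: {on_screen:?} -> back: {recovered:?}");
}
```

The point of the round trip is that input alignment falls out for free: as long as the inverse of whatever projection you choose is applied to the cursor before it reaches the UI, the mouse never feels offset from the distorted HUD.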