Previously, all UI panels listened to hardcoded SDL keycodes and mouse button presses, and each panel manually implemented the means of clicking a button.
Now, panels register various types of input listeners, and all SDL events are handled by the InputManager. Some of the listeners are for input actions (inspired by Unity's input actions): a panel registers a callback under a string name that refers to an input action definition, which maps to a physical input event (key down, key up, mouse button down, etc.). This should make key rebinding much easier to implement, although that work has not started yet.
There are a couple of odd places where a fullscreen button is used to implement a panel's interactivity. That may not need to change, but it is a little awkward.
Not completely set on the idea of held keys and mouse buttons being broadcast as events; they might become public getters on the InputManager in the future instead.