The user input system ought to be unit-tested, and it is already isolated enough to test directly in Node.js.
Perhaps less important, but it may be good to have the Vive and Oculus Touch bindings use an implementation more like the Windows Mixed Reality bindings, where bindings are split into helper functions for re-use. This is especially useful when the left and right controllers share certain bindings.
The user input system is in need of some refactoring (as of this writing @9223ec5): bindings should be split into button and axes helper functions, like the Windows Mixed Reality bindings: https://github.com/mozilla/hubs/blob/9223ec59daa3ac4ae6d48437587be81dc8ddfd1c/src/systems/userinput/paths.js#L246-L264
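The helper-function idea can be sketched roughly as follows. All names and path strings below are hypothetical (the real layout lives in `src/systems/userinput/paths.js` and may differ); the point is that a `button`/`axes` pair of helpers lets the left and right controllers share one definition instead of duplicating path strings.

```javascript
// Hypothetical helpers, illustrative only.
function button(device, hand, name) {
  const base = `/${device}/${hand}/${name}`;
  return { pressed: `${base}/pressed`, touched: `${base}/touched` };
}

function axes(device, hand, name) {
  const base = `/${device}/${hand}/${name}`;
  return { x: `${base}/x`, y: `${base}/y` };
}

// Both hands reuse the same helpers rather than repeating every path.
const vive = {
  left: {
    trigger: button("vive", "left", "trigger"),
    pad: axes("vive", "left", "pad"),
  },
  right: {
    trigger: button("vive", "right", "trigger"),
    pad: axes("vive", "right", "pad"),
  },
};
```

A shared binding then only needs to be written once and applied to both `vive.left` and `vive.right`.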