Even when custom ranges are set for the axes, the raw (HID) values that are reported still seem to be in the fixed range of -32767 to 32767. This is sometimes problematic for software that works with raw values directly rather than with DirectX-normalized ranges. For example, some games using the default Unity input handler get confused by the negative part of the range. I realize those issues would be better fixed on the game side, but that is not likely to happen. Moreover, this sometimes affects Windows calibration, which also works with raw values.
Could we have an option for the raw values to be reported in the range that is set for the axis? For example, if I set the X axis range to 0–65535, exactly that is what would be reported in raw values.
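For illustration, the requested behavior is just a linear rescale of the fixed raw range onto the user-configured axis range. This is only a sketch of the mapping I have in mind, not the actual driver code; the function name and defaults are my own.

```python
def rescale_axis(value: int,
                 src_min: int = -32767, src_max: int = 32767,
                 dst_min: int = 0, dst_max: int = 65535) -> int:
    """Linearly map a raw HID axis value from [src_min, src_max]
    onto the user-configured range [dst_min, dst_max].

    Illustrative sketch only; the source range shown is the fixed
    signed range described above, the destination range is the
    custom range set for the axis.
    """
    span_src = src_max - src_min
    span_dst = dst_max - dst_min
    return dst_min + (value - src_min) * span_dst // span_src

# Endpoints of the signed range map to the endpoints of the custom range:
# rescale_axis(-32767) -> 0, rescale_axis(32767) -> 65535
```

With this option enabled, raw center (0 on the signed scale) would land at the midpoint of the custom range, and software reading raw values would never see negative numbers for a 0–65535 axis.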