ocornut opened 5 years ago
Here's what Android styli can report (I briefly listed them in that mentioned issue):
- Pressure: 0.0f to 1.0f
- AXIS_DISTANCE: For a stylus, reports the distance of the stylus from the screen. A value of 0.0 indicates direct contact and larger values indicate increasing distance from the surface.
- AXIS_ORIENTATION: For a stylus, the orientation indicates the direction in which the stylus is pointing in relation to the vertical axis of the current orientation of the screen. The range is from -PI radians to PI radians, where 0 is pointing up, -PI/2 radians is pointing left, -PI or PI radians is pointing down, and PI/2 radians is pointing right. See also AXIS_TILT.
- AXIS_TILT: For a stylus, reports the tilt angle of the stylus in radians where 0 radians indicates that the stylus is being held perpendicular to the surface, and PI/2 radians indicates that the stylus is being held flat against the surface.
Personally, I currently use WebAssembly via Emscripten, which does not have a specific Stylus/Pen event, only touch ones. At the moment there is no distinction between fingers and other pointers: there is no direct function to get pressure or tilt... but it is obtainable via the HTML5 Pointer Events API, i.e. via direct JS code inside C/C++ using the EM_JS macro (pressure or tilt; personally I have never used them, yet).
typedef struct EmscriptenTouchPoint
{
long identifier; //An identification number for each touch point.
long screenX; //The touch coordinate relative to the whole screen origin, in pixels.
long screenY;
long clientX; //The touch coordinate relative to the viewport, in pixels.
long clientY;
long pageX; //The touch coordinate relative to the viewport, in pixels, and including any scroll offset.
long pageY;
EM_BOOL isChanged; //Specifies whether the touch point changed during this event.
EM_BOOL onTarget; //Specifies whether this touch point is still above the original target on which it was initially pressed.
long targetX; //These fields give the touch coordinates mapped relative to the coordinate space of the target DOM element receiving the input events (Emscripten-specific extension).
long targetY;
long canvasX; //The touch coordinates mapped to the Emscripten canvas client area, in pixels (Emscripten-specific extension).
long canvasY;
} EmscriptenTouchPoint;
typedef struct EmscriptenTouchEvent {
int numTouches; //The number of valid elements in the touches array.
EM_BOOL ctrlKey; //Specifies which modifiers were active during the touch event.
EM_BOOL shiftKey;
EM_BOOL altKey;
EM_BOOL metaKey;
EmscriptenTouchPoint touches[32]; //An array of currently active touches, one for each finger.
} EmscriptenTouchEvent;
A pointer to an EmscriptenTouchEvent struct is passed to each registered touch event, and four events are available via installable callbacks:
EMSCRIPTEN_RESULT emscripten_set_touchstart_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchend_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchmove_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
EMSCRIPTEN_RESULT emscripten_set_touchcancel_callback(const char *target, void *userData, EM_BOOL useCapture, em_touch_callback_func callback);
And this is the signature of the callback function to pass:
typedef EM_BOOL (*em_touch_callback_func)(int eventType, const EmscriptenTouchEvent *touchEvent, void *userData);
Note: recently there have been some internal changes in Emscripten; an upcoming breaking change affects the HTML5 API DOM element lookup rules:
// Now deprecated -> always return 0,0
long EmscriptenTouchPoint::canvasX;
long EmscriptenTouchPoint::canvasY;
// Use these instead, also for the "#canvas" target
long EmscriptenTouchPoint::targetX;
long EmscriptenTouchPoint::targetY;
Below is an example of how I use touch events: tap, move and pinch (for zoom).
My emsMDeviceClass class manages the touch events; it is declared in emsTouch.h and implemented in emsTouch.cpp. (These files are excluded from VS and/or desktop builds: an Emscripten CMake build is needed to compile them... see the main project page for instructions.) The callbacks are registered in glApp.cpp.
I'm currently adding win/mac/linux pen tablet support to glfw : https://github.com/glfw/glfw/pull/1445
I developed some animation and digital painting tools for an in-house animation studio. In general it comes down to:
Tablet events:
- pen 'cursor': whether the pen cursor changed (e.g. pen or eraser)
- pen data (a queue is important, as the event frequency can be high on Wacom)
A relevant issue for Emscripten (which has touch, but not pen): https://github.com/emscripten-core/emscripten/issues/7278
Also noted: SDL 3 has pen support, see SDL_pen.h.
I would like to eventually add fields to ImGuiIO to be able to distinguish Mouse from Stylus/Pen or Touch inputs. For practical and legacy purposes we may need to keep using the name 'Mouse' for several fields, but we could have a way to select the current input system. Then we can go and tweak some widget behaviors according to the type of input.
(In particular, we should also aim to improve quality of life with touch inputs, see #2334 - this itself is out of the scope of this discussion and will be done separately.)
Because I am not a stylus user at the moment, what I would like to discuss here is clarifying the data we need to add to ImGuiIO to hold all the required information provided by pen APIs. Even if default Dear ImGui widgets don't use any or all of them yet, their availability would facilitate the creation and sharing of custom widgets.
If you have experience with those, could you provide feedback on which information is useful or essential for programming applications?
Anything specific to know in terms of "portability" variations across brands and technologies? (e.g. a Wacom pen may not provide the same info as an iPad Pro pen)
The goal is to design the type/name of the fields that would be added to ImGuiIO, e.g.
Aside from the Pressure value, everything gets more complicated. Apple reports a radius (1 float) + tolerance, while Windows seems to report it as a rectangle (4 values). Does anyone know enough about this to help design a sensible common structure?
I'm just dropping those notes here; I haven't researched this much yet.
API References:
Apple Pencil https://developer.apple.com/documentation/uikit/pencil_interactions/handling_input_from_apple_pencil https://developer.apple.com/documentation/uikit/uitouch
Windows seems to have a POINTER_TOUCH_INFO structure: https://docs.microsoft.com/en-us/windows/win32/api/winuser/ns-winuser-pointer_touch_info
Wacom WinTab API: https://developer-docs.wacom.com/display/DevDocs/Windows+Wintab+Documentation
Namely