liubingyan123 opened 5 years ago
I have Nuklear running on an ARM Cortex-M7 with a touch panel. You can wrap the touch-panel data and simulate mouse behavior. You'll lose mouse-hover features, but that is expected with touch. The key functions are:
NK_API void nk_input_motion(struct nk_context*, int x, int y);
NK_API void nk_input_button(struct nk_context*, enum nk_buttons, int x, int y, int down);
I do some workarounds, like parking the "mouse" at 0,0 to avoid hovering over a recently touched button, etc., but it is straightforward.
More work goes into the rendering back-end when you want to use Nuklear on bare metal.
@diggit can you post an example of your work or at least partial code?
Hi @frink, which part are you interested in?
Here is the part of the code that converts touch data to mouse actions:
static bool released = true;

nk_input_begin(ctx);
if (tPos.active) {
    /* finger down: report position and press the "left button" */
    nk_input_motion(ctx, tPos.x, tPos.y);
    nk_input_button(ctx, NK_BUTTON_LEFT, tPos.x, tPos.y, 1);
    released = false;
} else if (!released) {
    /* finger just lifted: release the button, then park the "mouse"
       at 0,0 so the last-touched widget does not stay hovered */
    nk_input_button(ctx, NK_BUTTON_LEFT, tPos.x, tPos.y, 0);
    nk_input_motion(ctx, 0, 0);
    released = true;
}
nk_input_end(ctx);
Where tPos is a simple structure holding the coordinates and a flag indicating whether this touch slot is active. (The touch controller I use can handle up to 5 simultaneous touches, but all my code works with the first one only, for now.)
I'm on x64, but I'm doing a touchscreen interface for audio. I would like to do full 10-point multitouch...
So if I'm understanding correctly, you wrangle touch events before you start the draw loop. Then you draw from there. Is that right?
One thing I've thought to do is use a touch-to-mouse conversion for X11/xorg, but I haven't found anything that works reliably. Touch is definitely the most infuriating part of modern interfaces. Here is the touch-to-click event map I'm looking to create...
short tap = left click
long tap = right click
double tap = double left click
triple tap = middle click
long tap drag = left click drag
double tap drag = right click drag
triple tap drag = middle click drag
flick up = scroll down
flick down = scroll up
flick left = scroll right
flick right = scroll left
I've opted not to use multitouch gestures because I want to be able to use each point as its own click context. Most often I will be doing multiple click-drags. I'm planning on some sort of immediate-mode GUI in the graphics thread, but to have it preempted by the audio threads. I haven't gotten as far as communication between threads yet.
On my (embedded) HW, events from the touch controller are read every ~20 ms, in response to an interrupt line. Touch data is stored, and that is all the touch-controller code does. Then the main loop spins, waiting for a free frame buffer to render to. Once it gets one, it looks at the latest touch data, calls Nuklear's input handling, then the GUI is defined and rendered.
So I am not sure how my HW-specific code could help.
Nuklear's input API is pretty simple. It takes the absolute mouse/touch position and the state of the buttons (and keyboard input). It's up to the user to convert data from any input device into those API calls. Basically, you have to write a piece of code that classifies touch and movement events into what you described, and then feed the result to Nuklear.
I have no (tweaking) experience with Nuklear under X11.
This explanation is worth 10hrs of hacking! Thanks so much.
As the title says, I want to port Nuklear to an ARM Cortex-M4 microcontroller. Does Nuklear support touch panels?