lbonn / rofi

Rofi: A window switcher, run dialog and dmenu replacement - fork with wayland support

[BUG] Rofi unresponsive to touch input #122

Open Mahgozar opened 8 months ago

Mahgozar commented 8 months ago

Rofi version (rofi -v or git commit in case of build issue)

1.7.5+wayland3-3-g47ae31eb (makepkg)

Configuration

https://gist.github.com/Mahgozar/e2f5afb41231f13dfaf91e3d90d6b264

Theme

https://gist.github.com/Mahgozar/381d21193b224f58bb83ac852fa972c6

Timing report

No response

Launch command

rofi -show drun --show-icons

Steps to reproduce

Launch rofi and try to launch an app with touch input; nothing happens.

Expected behavior

After tapping an app on the touch screen, it is expected to launch.

Actual behavior

Rofi is unresponsive to touch input.

Additional information

No response

Using wayland display server protocol

I've checked if the issue exists in the latest stable release

lVentus commented 5 months ago

Is this solved? I have the same problem.

Dutt-A commented 4 months ago

It is also unresponsive to mouse input for me. Perhaps mouse/trackpad/touch support was never added, since it would require rewriting a few things: you can't fall back on xinput here.

ppenguin commented 4 months ago

For mouse input you can do this (adapt to your theme/config):

rofi -me-select-entry 'MousePrimary' -me-accept-entry '!MousePrimary' -theme ~/.config/rofi/appmenu.rasi -show drun -show-icons

For me this works better with a mouse than the default, i.e. click selects and release accepts. It works the same with a trackpad for me (under Hyprland).

However, I came here because the touch screen also isn't working. I checked with wev and saw that I get wl_touch: down and wl_touch: up events, which are of course distinctly different from mouse button events (a touchpad tap generates the same event as a MousePrimary click). So I guess we need to be able to configure the correct event for the touch screen (i.e. wl_touch) as a keysym in the config. The big question is how?

(Ideally something semantically clear, like rofi -me-select-entry 'TouchDown' -me-accept-entry 'TouchUp'?)

EDIT: I took a quick look at the source and noticed that input events are handled by libnkutils, which doesn't appear to have specific "modern" libinput support; things seem to be handled at the level of key presses and pointer/button events. That could mean that to make the touch screen work, one would additionally need something like https://github.com/libts/tslib, and/or the input handling would have to be modified or overhauled in some other way?
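
For context, here is a minimal sketch (not rofi code, and not a patch) of how a Wayland client receives touch input through the standard wayland-client C API. The file name and probe structure are just for illustration, and a real client would also need a mapped surface for touches to land on. The point is that wl_touch has its own listener with down/up/frame callbacks, entirely separate from the wl_pointer button path, so code that only handles pointer/button events simply never sees a tap:

```c
/* touch-probe.c — hypothetical illustration, not rofi code.
 * Shows how a Wayland client gets touch input via wl_touch, which is a
 * separate listener from wl_pointer (mouse buttons).
 * Build (assumes wayland-client headers): cc touch-probe.c -lwayland-client
 */
#include <stdio.h>
#include <string.h>
#include <wayland-client.h>

/* wl_touch callbacks: a tap arrives as down + up (+ frame). It never
 * produces a wl_pointer button event, so pointer-only code misses it. */
static void touch_down(void *data, struct wl_touch *touch, uint32_t serial,
                       uint32_t time, struct wl_surface *surface, int32_t id,
                       wl_fixed_t x, wl_fixed_t y) {
    printf("wl_touch down: id=%d x=%.1f y=%.1f\n", id,
           wl_fixed_to_double(x), wl_fixed_to_double(y));
}
static void touch_up(void *data, struct wl_touch *touch, uint32_t serial,
                     uint32_t time, int32_t id) {
    printf("wl_touch up: id=%d\n", id);
}
static void touch_motion(void *data, struct wl_touch *touch, uint32_t time,
                         int32_t id, wl_fixed_t x, wl_fixed_t y) {}
static void touch_frame(void *data, struct wl_touch *touch) {}
static void touch_cancel(void *data, struct wl_touch *touch) {}
static void touch_shape(void *data, struct wl_touch *touch, int32_t id,
                        wl_fixed_t major, wl_fixed_t minor) {}
static void touch_orientation(void *data, struct wl_touch *touch, int32_t id,
                              wl_fixed_t orientation) {}

static const struct wl_touch_listener touch_listener = {
    .down = touch_down, .up = touch_up, .motion = touch_motion,
    .frame = touch_frame, .cancel = touch_cancel,
    .shape = touch_shape, .orientation = touch_orientation,
};

/* Touch is a seat capability of its own, next to pointer and keyboard. */
static void seat_capabilities(void *data, struct wl_seat *seat, uint32_t caps) {
    if (caps & WL_SEAT_CAPABILITY_TOUCH) {
        struct wl_touch *touch = wl_seat_get_touch(seat);
        wl_touch_add_listener(touch, &touch_listener, NULL);
    }
}
static void seat_name(void *data, struct wl_seat *seat, const char *name) {}
static const struct wl_seat_listener seat_listener = {
    seat_capabilities, seat_name,
};

static void registry_global(void *data, struct wl_registry *registry,
                            uint32_t name, const char *iface, uint32_t version) {
    if (strcmp(iface, "wl_seat") == 0) {
        struct wl_seat *seat =
            wl_registry_bind(registry, name, &wl_seat_interface, 1);
        wl_seat_add_listener(seat, &seat_listener, NULL);
    }
}
static void registry_global_remove(void *data, struct wl_registry *registry,
                                   uint32_t name) {}
static const struct wl_registry_listener registry_listener = {
    registry_global, registry_global_remove,
};

int main(void) {
    struct wl_display *display = wl_display_connect(NULL);
    if (!display) return 1;
    struct wl_registry *registry = wl_display_get_registry(display);
    wl_registry_add_listener(registry, &registry_listener, NULL);
    /* Note: touch events are only delivered for touches that start on one of
     * this client's own surfaces; rofi's window would be that surface. This
     * probe only shows the listener wiring. */
    while (wl_display_dispatch(display) != -1) {}
    return 0;
}
```

If that reading of the input stack is right, tslib may not be needed at all: under Wayland the compositor already translates libinput touch events into wl_touch, so the missing piece would be wiring a wl_touch listener into the input layer and mapping down/up to select/accept (e.g. via names like the hypothetical TouchDown/TouchUp suggested above).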