Zyell / Python-Touchscreen-RightClick

Implements multitouch gestures for right click using evdev on Linux
MIT License

ui.write and device capabilities issues #2

Open gevasiliou opened 7 years ago

gevasiliou commented 7 years ago

Hello, your script is really nice. I have a convertible Toshiba laptop with an ELAN touchscreen running Debian 8.5 Sid & XFCE, but I had some issues with UInput and the ui.write method.

On my system, the code as written in _initiate_right_click does not work. The problem turned out to be the device capabilities declaration. Instead of this:

    capabilities = {ecodes.EV_ABS: (ecodes.ABS_X, ecodes.ABS_Y),
                    ecodes.EV_KEY: (ecodes.BTN_LEFT, ecodes.BTN_RIGHT)}

I had to use this:

    capabilities = {
        ecodes.EV_KEY: [ecodes.BTN_LEFT, ecodes.BTN_RIGHT],
        ecodes.EV_ABS: [(ecodes.ABS_X, AbsInfo(value=1900, min=0, max=3264, fuzz=0, flat=0, resolution=13)),
                        (ecodes.ABS_Y, AbsInfo(1050, 0, 1856, 0, 0, 13))],
    }

I had to set the AbsInfo min, max, fuzz, flat and resolution values to match my ELAN screen's capabilities.
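One way those numbers could be pulled from the real touchscreen instead of being hard-coded is sketched below (the device path is an assumption; find yours with evdev.list_devices()):

    # A minimal sketch, not the script's actual code: build the virtual device
    # from the real touchscreen's own AbsInfo so min/max/resolution need no hard-coding.
    from evdev import InputDevice, UInput, ecodes

    touch = InputDevice('/dev/input/event5')   # hypothetical path to the touchscreen
    abs_info = dict(touch.capabilities(absinfo=True)[ecodes.EV_ABS])

    capabilities = {
        ecodes.EV_KEY: [ecodes.BTN_LEFT, ecodes.BTN_RIGHT],
        ecodes.EV_ABS: [(ecodes.ABS_X, abs_info[ecodes.ABS_X]),
                        (ecodes.ABS_Y, abs_info[ecodes.ABS_Y])],
    }
    ui = UInput(capabilities, name='touch-right-click')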

PS1: In any case, I forked your script and used pymouse for right-click injection instead of UInput.

PS2: I also had to modify the position_event part of your script. On my screen, when I work with my fingers, the ABS_X and ABS_Y values change slightly even while I keep pressing one point, probably due to the high resolution of the screen. That small change cancels the right-click injection, so I modified it as below to allow +/- 50 pixels of movement without cancelling the right click:

    if self.position[event_code] is None:
        self.position[event_code] = value
    else:
        OldValue = self.position[event_code]
        NewValue = value
        diff = OldValue - NewValue
        if abs(diff) > 50:
            self._moved_event()
After the above modifications, your script works great for me. The touchscreen gesture trap works like a charm.

Zyell commented 7 years ago

Hello! Thanks for the feedback! I'm glad that you were able to get the script working. I have committed some changes to account for the issues you ran into. First, I grab the AbsInfo for ABS_X & ABS_Y as reported by evdev. Including this in the capabilities did not cause any issues on my laptop. However, I don't know for sure that it is fixed for you, as I did not have any issues with the original implementation. If this works, it should generically handle the problem. Please let me know if it works for you.

Thank you for the position variation idea during long press. I had not thought of that before, but after playing with it, the behavior of the right click is certainly more consistent. However, I didn't want to implement a hard-coded value of 50. In python-evdev, this value is actually not pixels but units for measuring touch resolution (this doesn't necessarily match the screen resolution). AbsInfo gives the max and min for each axis, and it also gives a resolution value for the number of units per mm on your touchscreen. With this info, it is possible to set a more relative, adaptable movement threshold.

After looking up the average index finger width for the fun of it and testing my own touch variation, I decided to allow a 10% variation (a 5% radius around the touch point) of the width of the largest average index finger. I use this, together with the resolution reported for each axis, to gauge a movement tolerance. The variation can be adjusted to your desire/need. I thought this might provide a more robust implementation across different hardware with different resolutions. :-) Let me know if you have any issues. Thanks!
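For reference, a minimal sketch of that calculation (the 20 mm finger width is an assumption, since the exact figure isn't stated here; 13 units/mm is the ELAN resolution from this thread):

    FINGER_WIDTH_MM = 20          # assumed upper end of average index finger width
    VARIATION = 0.10              # 10% total variation => 5% radius around the touch point

    def movement_tolerance(resolution_units_per_mm):
        """Allowed drift, in touch units, before a long press counts as movement."""
        return (VARIATION / 2) * FINGER_WIDTH_MM * resolution_units_per_mm

    res_x = res_y = 13            # AbsInfo.resolution (units/mm) for each axis
    var_x = movement_tolerance(res_x)   # 0.05 * 20 mm * 13 units/mm = 13 units
    var_y = movement_tolerance(res_y)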

gevasiliou commented 7 years ago

Hello, thank you for your feedback. I still have to try your modified code and will let you know the results.

In the meantime, I got some more ideas that you might find interesting:

  1. Instead of setting res_x and res_y to a fixed value of 13 (which also happens to suit my ELAN), you could use the real values reported by the actual ELAN device's capabilities, using capabilities(verbose=True, absinfo=True). That way the AbsInfo for the uinput device will work on all screens / in all cases.
  2. I just discovered that for my setup this method also works great:
    capabilities = {
        e.EV_REL: (e.REL_X, e.REL_Y),
        e.EV_KEY: (e.BTN_LEFT, e.BTN_RIGHT),
    }

If you have time, you could check it on your system; a minimal sketch of the idea follows this list.

  3. It would be very interesting to modify the script to inject the right click while we keep pressing the screen. It is quite common for users to expect to see the right-click menu first and only then to lift their fingers. This is something I'm working on in my fork of your script, but I still have a long way (and time) to go.
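A minimal sketch of the EV_REL idea from point 2 (sketch only; whether the desktop accepts a click injected this way is exactly what needs testing):

    from evdev import UInput, ecodes as e

    capabilities = {
        e.EV_REL: (e.REL_X, e.REL_Y),
        e.EV_KEY: (e.BTN_LEFT, e.BTN_RIGHT),
    }

    ui = UInput(capabilities)
    ui.write(e.EV_KEY, e.BTN_RIGHT, 1)   # press the right button at the current pointer position
    ui.write(e.EV_KEY, e.BTN_RIGHT, 0)   # release it
    ui.syn()
    ui.close()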

Regards, George V.

Zyell commented 7 years ago

To answer your suggestions:

For 1), I haven't hard-coded a value of 13 in; that is only the default value set in the event that AbsInfo is incomplete for some reason (I have no idea whether this can occur, so I simply set a default to handle the possibility).

    if human_code == 'ABS_X':
        vals = type_code[1]
        abilities[ecodes.EV_ABS][0] = (ecodes.ABS_X, vals)
        res_x = vals[-1]  # setting to the resolution value reported by AbsInfo
    elif human_code == 'ABS_Y':
        vals = type_code[1]
        abilities[ecodes.EV_ABS][1] = (ecodes.ABS_Y, vals)
        res_y = vals[-1]  # setting to the resolution value reported by AbsInfo

The above code takes care of setting the values reported by evdev. For 2), yes, the REL setting works too. I had strange numbers reported when I was building the script, so I stuck with the absolute frame. For 3), that is a good point. Let me play with that and see what I can do. :-) Thanks for the suggestion!

Zyell commented 7 years ago

Ok, your request 3) has been implemented! I also cleaned up and simplified the base code. Let me know how it functions for you and I will close this issue out. Thanks!

gevasiliou commented 7 years ago

Dear Zyell, your support and coding are great.

I can't wait to get back home to test your final script (I'm at work right now without a touchscreen available).

By reading your modified code, it seems that everything should be working perfectly (and not only on ELAN Touchscreens). I just want to try it and give you some feedback in order to close this issue.

PS: Since Python is great, the capabilities of this script could be expanded as far as anyone wants. Just to share some ideas that popped into my head and that I would like to try myself as a kind of "homework" in the future:

  1. auto-rotating the screen when the convertible is flipped into tablet mode
  2. a tray icon to enable/disable/calibrate the right-click gesture
  3. simple swipe gestures, e.g. for switching workspaces

And many more! This list could be really big.

PS2: Sorry for my chattering; I just got excited about Python's capabilities on Linux, and I was also disappointed by the limited support Linux provides in 2016 for modern touchscreen laptops.

In any case, I will be back with the test results of your final script in order to close this issue!

gevasiliou commented 7 years ago

Hello, these are the test results:

a. On the very first short tap, the script was interrupted by an IOError in the ungrab function. According to device.py, the ungrab function raises an IOError if the device is already released. As a result I had to modify your ungrab code as below to make it work:

            try:
                self.dev.ungrab()
            except (OSError, IOError):  # capture the case where grab was never initiated
                pass
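The same guard can also be packaged as a small helper (a sketch only; the grabbed helper is hypothetical, dev is any evdev InputDevice, and the exception pair mirrors the fix above):

    from contextlib import contextmanager

    @contextmanager
    def grabbed(dev):
        """Grab an evdev InputDevice and always try to release it, ignoring the
        error raised when the grab never happened or was already released."""
        try:
            dev.grab()
        except (OSError, IOError):
            pass
        try:
            yield dev
        finally:
            try:
                dev.ungrab()
            except (OSError, IOError):
                pass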

b. Moved events. I got one right click per 10 long-press attempts, and I got suspicious about possible "false" self.moved events. I added a line print('moved too much') just after self.moved = 1 in _moved_event and retested. Indeed, I was getting a lot of "moved too much" messages in the terminal.

I changed the values of var_x and var_y to 2*res_x and 2*res_y, but I still got a lot of move events (fewer than before). With 4*res_x and 4*res_y I got it right; no more "false" movement events.

c. After applying fixes a and b, the right click was again only rarely injected / rarely appearing. To be more accurate, I had one right click every 10 or more attempts. I added a line print('right click injected') at the end of the initiate-right-click function, just to monitor whether the right-click function was being called correctly. That way I could verify that every long press (that was not moved) printed a "right click injected" message in the terminal, but a right-click menu was almost never popping up.

Conclusion: I really don't know why the right click is not working. Maybe it has something to do with the grab/ungrab.

What I can provide as extra info for troubleshooting is the following:

    from evdev import UInput, AbsInfo, ecodes as e
    import time

    capabilities = {
        e.EV_KEY: [e.BTN_LEFT, e.BTN_RIGHT],
        e.EV_ABS: [
            (e.ABS_X, AbsInfo(value=3100, min=0, max=3264, fuzz=0, flat=0, resolution=13)),
            (e.ABS_Y, AbsInfo(1090, 0, 1856, 0, 0, 13))]
    }
    ui = UInput(capabilities)
    time.sleep(10)
    ui.write(e.EV_ABS, e.ABS_X, 0)
    ui.write(e.EV_ABS, e.ABS_Y, 0)
    ui.write(e.EV_KEY, e.BTN_RIGHT, 1)
    ui.write(e.EV_KEY, e.BTN_RIGHT, 0)
    ui.syn()

PS: By the way, if I use the PyMouse method instead of the evdev/uinput method, your latest script release works like a charm! I can see the right-click menu every single time, and the right click appears while I keep pressing the screen, exactly as requested!!!
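For reference, the PyMouse call that does the equivalent injection is roughly this (a sketch, not necessarily the fork's exact code; it assumes PyMouse's usual button numbering where 2 is the right button):

    from pymouse import PyMouse

    mouse = PyMouse()
    x, y = mouse.position()   # current pointer position, in screen pixels
    mouse.click(x, y, 2)      # button 2 = right click in PyMouse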

Zyell commented 7 years ago

Your tests are very interesting! I have two different touchscreen laptops, different manufacturers and different window managers (unity and gnome-shell). Both of them behave identically, so I have never encountered the issues you have. Also odd is the OSError vs. the IOError. Mine gives an OSError rather than an IOError for the same issue. Regardless, I will include both exceptions.

The grab and ungrab are there in order to ensure the right click menu stays when you lift your finger after a long press. Can you post the output from evtest? I am very interested to compare what you have to what I have from my laptops.

I am perplexed as to why the click does not function with UInput. I would rather figure that out than add an additional dependency and overhead with PyUserInput. I have installed PyUserInput (includes PyMouse) and preliminarily tested it. I need to be sure the ABS_X and ABS_Y as reported by evdev match the input to PyMouse. I want to make sure the denser touch units are matched appropriately to the screen units used by PyMouse since mine are definitely of different densities.
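To illustrate the scaling I have in mind, a sketch (the 1920-pixel width and the helper name are assumptions; absinfo_x is the AbsInfo reported for ABS_X):

    from evdev import AbsInfo

    def touch_to_screen(value, absinfo, screen_px):
        """Scale one axis from touch units (the AbsInfo range) to screen pixels."""
        span = absinfo.max - absinfo.min
        return int((value - absinfo.min) * screen_px / span)

    # e.g. with the ELAN values from this thread and a 1920-pixel-wide screen:
    absinfo_x = AbsInfo(value=0, min=0, max=3264, fuzz=0, flat=0, resolution=13)
    print(touch_to_screen(1632, absinfo_x, 1920))   # 960, the middle of the screen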

Also, what is the variation you are seeing when the moved attribute is set? I'm curious as to the differences there. In the event that I create an indicator package, it would help to know this for the calibration option.

As for auto-rotate abilities, that is beyond the scope of this script. That is a kernel-dependent issue. The drivers have to be correct in the kernel and then the window manager has to access the accelerometer. I have a Lenovo Yoga 2 Pro that auto-rotates. I have ubuntu-gnome installed on it because it natively handles it. The program I used under unity no longer functions with the current kernel. If you want auto-rotate abilities, I would suggest using gnome-shell at this point (it works very well for me). It is also possible that someone else has written something to work under unity, but I do not know at this point. I definitely understand your frustration with touchscreens and auto-rotation under Linux. It has been a pain for a while, but manufacturers are getting much better with driver support in the kernel. If they had been doing this in the first place, development would have been further along by now, but such is the way of things.

In the meantime, if you can share that info, that would be awesome. I will work on PyUserInput (PyMouse) and verify it works on my end. If that option works across hardware better, then that is the approach I will take. I have looked briefly into packaging an indicator and it looks doable and would be a nice addition. :-) However, it may be a couple weeks until I get back to this. I will be out of the country enjoying some vacation time. :-) But I will work on this when I get back! Thanks for your input and your testing!

gevasiliou commented 7 years ago

Hello, thanks for your reply. Just to share some first comments:

About evtest: I will try to send the evtest output, although it could be difficult to catch the uinput virtual device since it is only created when UInput is called and is gone as soon as ui.syn is executed. I will figure something out.

About pymouse: I don't know if it makes any difference, but I have installed just pymouse from pip, not PyUserInput. By the way, PyMouse works with the screen resolution, and the values reported by PyMouse differ from ABS_X and ABS_Y. But this is not a problem; the accuracy of PyMouse is really good. You can find your latest version modified with pymouse here (it works perfectly on my system):

https://github.com/gevasiliou/Python-Touchscreen-RightClick/blob/master/Python_Touchscreen_RightClick.py

PS: Grab and ungrab were absolutely necessary and will also be required even if you use pymouse.

About AutoRotate: For auto-rotate capabilities, there is a nice shell script built by another guy based on iio-sensor-proxy, which also works fine for me (kernel 4.7). I have heard that iio-sensor-proxy has some issues with the 4.8 kernel. You can have a look at this auto-rotate script here: https://github.com/gevasiliou/PythonTests/blob/master/autorot.sh Because the shell script relies on inotify tools (which talk directly to the kernel), it seems to use few resources on my system.

About Tray Icon: If you consider adding a kind of tray icon for enable/disable/calibrate, these working Python tests of GtkStatusIcon are exactly what you need: https://github.com/gevasiliou/PythonTests/blob/master/TrayLeftClickMenu.py https://github.com/gevasiliou/PythonTests/blob/master/TrayAllClicksMenu.py PS: in the case of a tray icon, we need everything to work with left clicks, since desktop environments do not support right clicks yet :-)
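A minimal sketch of the GtkStatusIcon approach (the icon name and the ENABLED flag are assumptions; the linked test scripts above are more complete):

    import gi
    gi.require_version('Gtk', '3.0')
    from gi.repository import Gtk

    ENABLED = True   # hypothetical flag the right-click script could consult

    def on_activate(icon):
        """A left click on the tray icon toggles gesture handling."""
        global ENABLED
        ENABLED = not ENABLED
        icon.set_tooltip_text('Right-click gestures: ' + ('on' if ENABLED else 'off'))

    icon = Gtk.StatusIcon.new_from_icon_name('input-tablet')
    icon.set_tooltip_text('Right-click gestures: on')
    icon.connect('activate', on_activate)
    Gtk.main()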

About Gnome: Just watch out that Gnome behaviour may differ according to the desktop icon settings. In my Gnome tests (I have a partition with Ubuntu 16.04 + Gnome 3.18), right click works out of the box without the need for any script, but only if you have chosen to hide desktop icons (no desktop). If you enable desktop icons in Gnome, Gnome's right click no longer works (in my case at least). PS: I think I have not tested under Unity.

I will send more info soon.

Regards, George V.

Zyell commented 7 years ago

Thanks for the info!

So for PyMouse, it looks like it hasn't been developed independently for a long time. On PyPI it hasn't been updated since 2010, whereas PyUserInput was updated this past August. The PyMouse project was wrapped up into PyUserInput a while back. What is also cool is that PyUserInput contains keyboard control, so you can launch shortcuts from a script.
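As a quick sketch of that keyboard side (Alt+Tab as the example combo; the key attribute names follow PyKeyboard's documented usage):

    from pykeyboard import PyKeyboard

    kb = PyKeyboard()
    kb.press_key(kb.alt_key)   # hold Alt
    kb.tap_key(kb.tab_key)     # tap Tab -> the familiar Alt+Tab window switch
    kb.release_key(kb.alt_key)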

Very nice with the tray icon; it looks like it should be simple enough to integrate. :-) As for Gnome, I had noticed that. Gnome has the best support for touchscreens, but their implementation is definitely incomplete.

I have played around with a few other packages (touchegg, easystroke, and such) for gesture support beyond the standard ones available. But most of these implementations were for trackpads and were extended to touchscreens back when touchscreen events were treated identically to mouse/trackpad events. However, this broke down when Ubuntu started treating them differently (which I honestly think was the right choice and should have been the case from the beginning).

I came across a library, MtDev, which supports touch events across Linux. It is used by a Python GUI library called Kivy. I've used Kivy many times for other applications and it is quite nice; it uses MtDev on the back end to deal with touch events, and it also has an efficiently implemented gesture recognition engine. I played with it last night and was able to run windowless, capture all touch events, and even calculate the speed of gesture creation very quickly with seemingly little overhead.

So, this is just a toy right now. But conceivably it could be used to implement arbitrary gestures and link them to actions. It could also be used to implement simple swipe actions for changing the workspace, like you mentioned before. I've only made a toy so far, but when I get back from vacation I plan to implement some tests for it and see if it can do what I hope it can. Then maybe we could have a working, simple gesture/touch system that fills in all of the holes currently in Linux touchscreen support. :-) Anyway, these are just some thoughts right now. I will pick this back up in a few weeks when I get back from vacation!