pwr-Solaar / Solaar

Linux device manager for Logitech devices
https://pwr-solaar.github.io/Solaar
GNU General Public License v2.0

test suite for Solaar #1097

Closed kloczek closed 7 months ago

kloczek commented 3 years ago

Everything is moving away from `setup.py test` to pytest, so it would be good to prepare Solaar for pytest as well :)

+ /usr/bin/python3 -Bm pytest -ra
=========================================================================== test session starts ============================================================================
platform linux -- Python 3.8.8, pytest-6.2.2, py-1.10.0, pluggy-0.13.1
rootdir: /home/tkloczko/rpmbuild/BUILD/Solaar-1.0.5
plugins: flaky-3.6.1, forked-1.3.0, shutil-1.7.0, virtualenv-1.7.0, asyncio-0.14.0, expect-1.1.0, pyfakefs-4.1.0, cov-2.11.1, mock-3.5.1, httpbin-1.0.0, xdist-2.2.1, flake8-1.0.7, hypothesis-6.3.3, timeout-1.4.2
collected 0 items

========================================================================== no tests ran in 0.08s ===========================================================================
pfps commented 3 years ago

Solaar doesn't have a test suite at all, AFAICT.

If you are interested in setting up a pytest test suite for Solaar, help would be appreciated.

kloczek commented 3 years ago

Ah yes .. you are right, however `setup.py test` exits with exit code 0 :P

rijnhard commented 3 years ago

eeeek. well I've written a test structure similar to what I think will work here.

Can we debate this for a sec? So I would almost not even bother with unit level tests except for particularly sensitive or obviously suited components.

I'd go with an approach similar to how Polly.JS handles HTTP requests: we should capture and stub the communications coming from the devices, play them back to Solaar, and ensure that Solaar responds or does what is expected.

This way we will guard against regressions very heavily. I can probably do the primary structure, but the capturing and replaying (with time sensitivities) of the communications would be pretty tough for my skill set.

Thoughts?

pfps commented 3 years ago

That sounds like the start of a way to go. But there is almost certainly a lot of work that needs to be done here.

There are a few issues that need to be considered.

Solaar uses udev so that will have to be faked as well. I think faking at a high level (device availability) is OK.

Devices come and go so there needs to be a way to provide error returns for the I/O.

Solaar is multi-threaded and reads and writes from several devices so the I/O stubs need to include timing.

Solaar is mostly GUI-based so there needs to be some way of simulating GUI input.

As far as capturing the communications, Solaar has a debug mode that logs all I/O to the devices, including timing, so these logs can be mined to get interactions.

rijnhard commented 3 years ago

I think it should be separated into phases, it will also be easier to find contributors for the different aspects.

For example, we can fake the input by replaying the I/O logs against predefined expected scenarios without interacting with the GUI at all.

GUI tests are their own kettle of fish; I'd keep those completely separate from the I/O device scenario testing and just ensure that they trigger the state changes they are supposed to.

I'm willing to help with the setup of the test framework so that we can do the I/O device scenario testing (mainly because I've done similar in other projects), but someone else will have to add the feedback mechanism for the I/O errors.

GUI tests on the other hand... not my forte.

pfps commented 3 years ago

That sounds reasonable.

What counts as a success for an I/O test, then? Solaar's reaction to information from devices is to update its internal data structures. How will the test harness determine how internal structures have changed?

rijnhard commented 3 years ago

We will need a dictionary of responses for the different devices, basically built from the captured logs/responses.

In the test, Solaar will issue command X; then, based on the dictionary for the device, we reply with response Y. If the command does not match anything in the dictionary, the test fails.

That's at least the MVP in my head.

The upside is that you will quickly pick up regressions due to unintentional changes, but the downside is that if you intentionally change a command you will need to recapture the device logs.

Is this a major problem? Do commands being issued for the same purpose to the same device change often?
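
A rough sketch of the dictionary idea, just to make it concrete (the class, byte strings, and test below are placeholders for illustration, not actual Solaar code or real HID++ traffic):

class FakeDevice:
    # replays canned responses captured from a real device
    def __init__(self, captured: dict[bytes, bytes]):
        self.captured = captured

    def request(self, command: bytes) -> bytes:
        try:
            return self.captured[command]
        except KeyError:
            raise AssertionError(f"unexpected command: {command.hex()}")


def test_replayed_query():
    # dictionary built from previously captured debug logs (placeholder bytes)
    fake = FakeDevice({bytes.fromhex("10ff8100"): bytes.fromhex("10ff810050")})
    assert fake.request(bytes.fromhex("10ff8100")) == bytes.fromhex("10ff810050")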

pfps commented 3 years ago

(Sorry for the delayed reply. I wrote a reply and then neglected to post it.)

The dictionary needs to be a bit sophisticated because there is a random section in the command. This section is at a known place and is echoed in the response so it should be possible to model this.

There also should be a way to inject asynchronous inputs from the devices and delay the response to commands as one of the problems in Solaar occurs when input from devices goes to the wrong consumer.
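
Something along these lines might work for the random section (a hedged sketch; the offset and helper names are placeholders, since the exact framing depends on the HID++ details):

RANDOM_OFFSET = 3  # hypothetical position of the random/echoed byte


def normalize(command: bytes) -> bytes:
    # blank out the random byte so the dictionary lookup ignores it
    return command[:RANDOM_OFFSET] + b"\x00" + command[RANDOM_OFFSET + 1:]


def reply(captured: dict[bytes, bytes], command: bytes) -> bytes:
    # look up the canned response, then echo the random byte back into it
    response = bytearray(captured[normalize(command)])
    response[RANDOM_OFFSET] = command[RANDOM_OFFSET]
    return bytes(response)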

rijnhard commented 3 years ago

I don't see any of that being an issue, it will probably end up being the fun part.

I'll start playing around with designs this coming week.

pfps commented 3 years ago

Excellent. Having this testing ability could have caught #1113

FFY00 commented 3 years ago

See the proposal in #590; that way we can actually test the whole stack. Basically, we emulate the devices using the Linux UHID API. The drawback is that it cannot run in a GitHub action, as it does not give us access to /dev/uhid.

jeblad commented 2 years ago

With a lot of globals and semi globals, the code will be pretty hard to test. It will be more testing of implementation than business logic.

pfps commented 2 years ago

The test suite would be more for regression testing than testing for actual correctness.

MattHag commented 8 months ago

I'd love to support this request and introduce tests for Solaar. Starting with unit tests of small components is the simplest way to get started.

With a lot of globals and semi globals, the code will be pretty hard to test. It will be more testing of implementation than business logic.

Such issues can be identified while writing unit tests and the code improved for better testability. This will be an ongoing project, which should lead to an easier implementation of integration tests in the end.

Suggestion to introduce tests

  1. Add a GitHub action that triggers pytest tests for Linux on pull requests.
  2. Introduce unit tests (and simplify code that's hard to test). This steadily raises coverage and lowers the entry hurdle for new developers, as automated tests catch more and more issues.
  3. Introduce integration tests (hardware mocks etc.). This could ease development with faked hardware.
  4. Extend the GitHub action to test on Windows and macOS.

pfps commented 8 months ago

That sounds reasonable, except that the first step should probably be to set up some unit tests so that pytest has something to do. Feel free to set up a couple of these. Is there a good way of setting up data to run the tests?

A problem is how to simulate hardware. Is there something that can do this for Python, at least to respond to the messages that Solaar sends, even if it doesn't simulate the delays that come from sending actual messages? And then there is the issue that hardware sends spontaneous messages.

MattHag commented 8 months ago

A simple test case is sufficient to set up the CI and test it. I have a change ready.

Unit tests should not require any hardware and should just test a function, a class, or a module on its own. However, I have seen that most code is tightly coupled, which makes testing harder. But that's a good point at which to introduce tests and, at the same time, reduce the coupling of the code.

End-to-end tests, which can be added at a later point, need more thought and will require a hardware fake that supports the most basic commands. With a nice abstraction of the OS-specific commands, this would be easier to achieve.

pfps commented 8 months ago

Solaar has several interfaces to the OS. One is to find devices. The main one is to send and receive HID++ commands. The others are for rules. The first one to simulate would probably be for HID++ commands.

pfps commented 8 months ago

A lot of Solaar code interacts with devices, but most of that uses a very small number of function calls. If these calls can somehow be diverted to accessing data structures, then a lot of Solaar code could easily be tested.
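
For example, something like pytest's monkeypatch could divert those few calls to a dictionary (a sketch only; the module path and function signature below are assumptions for illustration, not verified Solaar APIs):

import pytest

CANNED = {(0x01, 0x0003): b"\x01\x02\x03"}  # placeholder (device number, request id) -> payload


def fake_request(handle, devnumber, request_id, *params, **kwargs):
    # answer from the dictionary instead of talking to real hardware
    return CANNED.get((devnumber, request_id))


@pytest.fixture
def fake_io(monkeypatch):
    from logitech_receiver import base  # assumed module path
    monkeypatch.setattr(base, "request", fake_request)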

MattHag commented 8 months ago

Yeah, this can all be done, but a bit of refactoring towards more easily testable components would be good. It is good practice to test branches with simple and quick unit tests that do not require any I/O. Due to the tight coupling of several components this is hard to achieve. The intermediate step is decoupling as much as possible. This can be achieved with e.g. Protocols, instead of directly accessing a class from another module or even from another package.

First try to get coverage via unit tests and later on cover whole compositions with integration tests. During this process existing and new code should become easier to test. And each small step on the way is a step towards it.

MattHag commented 8 months ago

The Device class could be one of the first that could be covered with unit tests after some refactoring.

Things that currently impede unit tests for the device module:

pfps commented 8 months ago

hidpp10 and hidpp20 do different things. There are two protocols - version 1 and version 2+ - and thus two files that are mostly concerned with getting data from devices (and receivers). There isn't any reason to combine these two files as they shouldn't depend on each other. Right now there is a dependency, as hidpp10 uses two "sets" from hidpp20. These should be moved to common.

I don't see what benefit there is in decoupling device.py from hidpp10 and hidpp20. There are lots of little functions from hidpp10 and hidpp20 that are used in device.py. What should probably be done is to enforce a better line: hidpp10 and hidpp20 interact with devices (and receivers), implement logical commands, and pass back values and structures.

I agree that the import ... as ... should be eliminated, except for the ones that are very heavily used where they form a useful abbreviation.

I also agree that imports in init aren't useful.

MattHag commented 8 months ago

There isn't any reason to combine these two files as they shouldn't depend on each other. Right now there is a dependency, as hidpp10 uses two "sets" from hidpp20. These should be moved to common.

They should not be combined.

I don't see what benefit there is to decouple device.py from hidpp10 and hidpp20. There are lots of little functions from hidpp10 and hidpp20 that are used in device.py.

Yes, and they can still do that afterwards, while being testable as a component. The issue with the current structure is that loading one module secretly loads nearly every module. That makes testing a nightmare. You can't test the lowest-level base class NamedInt without loading 10 other modules. That shouldn't be the case.

MattHag commented 8 months ago

Adding to the decoupling of modules.

Decouple device module from hidpp

By removing the tight coupling of the device class with the hid modules, the class becomes testable without relying on a specific hidpp module implementation. The proposed design is modular and makes it easy to rewrite and exchange modules, and it also makes testing so much easier.

Add protocols for the hidpp functions that are needed in the device module. I know this pattern for classes; I am not sure if there's an equivalent for modules. It's probably best to group the current hidpp module-level functions in a class.

device.py

import typing


class HIDPP10Protocol(typing.Protocol):
    def get_firmware(self):
        ...

    def get_notification_flags(self):
        ...

    # and more


class HIDPP20Protocol(typing.Protocol):
    def get_firmware(self):
        ...

    def LEDEffectsInfo(self):
        ...

    def get_remap_keys(self):
        ...

    # and more


class Device:
    def __init__(self, hidpp10: HIDPP10Protocol, hidpp20: HIDPP20Protocol):
        self.hidpp10 = hidpp10
        self.hidpp20 = hidpp20

With that, device.py does not rely on a specific hidpp module to instantiate. This makes testing of the device module fun, as we can pass a fake hidpp implementation without patching and do not need any hardware-related libraries or I/O access. Our fakes for testing will be simple classes and functions.

Wherever a device is instantiated, it is clearly visible which dependencies it actually has.

hidpp10_instance = hidpp10.HIDPP10()
hidpp20_instance = hidpp20.HIDPP20()
device = Device(hidpp10_instance, hidpp20_instance, ...)

Test device module

And for testing, we can easily replace the hidpp classes with our fakes, which do not need hardware or I/O. _tests/logitech_receiver/testdevice.py

class FakeHIDPP10:
    ...

class FakeHIDPP20:
    ...

hidpp10_instance = FakeHIDPP10()
hidpp20_instance = FakeHIDPP20()
device = Device(hidpp10_instance, hidpp20_instance, ...)
pfps commented 8 months ago

This seems to actually be quite a bit less radical than what I was thinking of, and is probably a good idea for at least the short term.

It is easy to use modules here and keep separate hidpp10 and hidpp20 modules. The modules themselves are already much like the classes; they just need to have all their functionality encapsulated in a class. (Unless Python has a way of treating a module as a class object.)

An issue to consider is just how to design the interface. If hidpp20 is a class then its objects can store the device being manipulated so that most calls don't need to specify the device.
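
Roughly like this, as a sketch (the method names and request ids are only illustrative, not the actual hidpp20 API):

class HIDPP20:
    def __init__(self, device):
        self.device = device  # remembered once, so the calls below don't need it

    def get_firmware(self):
        return self.device.request(0x0003)  # placeholder request id

    def get_battery_status(self):
        return self.device.request(0x1000)  # placeholder request id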

MattHag commented 8 months ago

Exactly, they are well prepared for this improvement. And having everything inside the class sounds like a nice side effect as well.

BTW: There is also functools.partial, which can remove the need to pass a specific argument to a function every time it is called.
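
A tiny illustration of the idea (the function and device names are just placeholders):

from functools import partial


def get_firmware(device, firmware_type=0):
    return (device, firmware_type)  # stand-in body for illustration


my_device = "device-handle"  # placeholder
get_my_firmware = partial(get_firmware, my_device)
assert get_my_firmware() == ("device-handle", 0)                # same as get_firmware(my_device)
assert get_my_firmware(firmware_type=1) == ("device-handle", 1)  # same as get_firmware(my_device, 1)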

MattHag commented 7 months ago

Open To-Dos

pfps commented 7 months ago

This issue is about using pytest. As there is now a partial test suite using pytest this issue can be closed. Issues can be opened about missing parts of the test suite, accompanied by suggestions on how to proceed.