HelloThisIsFlo / Appdaemon-Test-Framework

Clean, human-readable tests for Appdaemon
https://hellothisisflo.github.io/Appdaemon-Test-Framework/
MIT License

Appdaemon Test Framework


Clean, human-readable tests for your Appdaemon automations.

How does it look?
def test_during_night_light_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200) # 200lm == night
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was.turned_on()

def test_click_light_turn_on_for_5_minutes(given_that, living_room, time_travel, assert_that):
    living_room._new_button_click(None, None, None)
    assert_that('light.bathroom').was.turned_on()

    # At T=4min light should not yet have been turned off
    time_travel.fast_forward(4).minutes()
    assert_that('light.bathroom').was_not.turned_off()

    # At T=5min light should have been turned off
    time_travel.fast_forward(1).minutes()
    time_travel.assert_current_time(5).minutes()
    assert_that('light.bathroom').was.turned_off()


5-Minute Quick Start Guide

Initial Setup

  1. Install pytest: pip install pytest
  2. Install the framework: pip install appdaemontestframework
  3. Create a conftest.py file in the root of your project with the following code:
    from appdaemontestframework.pytest_conftest import *

Write your first unit test

Let's test an Appdaemon automation we created that, say, handles automatic lighting in the Living Room: class LivingRoom

  1. Initialize the Automation Under Test with the @automation_fixture decorator:
    @automation_fixture(LivingRoom)
    def living_room():
        pass
  2. Write your first test:
    Our first unit test
    def test_during_night_light_turn_on(given_that, living_room, assert_that):
       given_that.state_of('sensor.living_room_illumination').is_set_to(200) # 200lm == night
       living_room._new_motion(None, None, None)
       assert_that('light.living_room').was.turned_on()
    Note

    The following fixtures are injected by pytest using the conftest.py file and the initialisation fixture created at Step 1:

    • living_room
    • given_that
    • assert_that
    • time_travel

Result

# Important:
# For this example to work, do not forget to copy the `conftest.py` file.

@automation_fixture(LivingRoom)
def living_room():
    pass

def test_during_night_light_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200) # 200lm == night
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was.turned_on()

def test_during_day_light_DOES_NOT_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(1000) # 1000lm == sunlight
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was_not.turned_on()

General Test Flow and Available Helpers

0. Initialize the automation: @automation_fixture

# Command
@automation_fixture(AUTOMATION_CLASS)
def FIXTURE_NAME():
    pass

# Example
@automation_fixture(LivingRoom)
def living_room():
    pass

The automation given to the fixture will be instantiated, have its initialize() method called, and have all mocks cleared of recorded calls before each test.

1. Set the stage to prepare for the test: given_that
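As shown in the examples throughout this README, given_that.state_of(...).is_set_to(...) stubs the state an automation will read. Under the hood this boils down to configuring the patched get_state mock; here is a minimal stdlib sketch of that mechanic (the get_state name and the backing dictionary are illustrative assumptions, not the framework's real internals):

```python
from unittest.mock import MagicMock

# Hypothetical stand-in for the patched hass.Hass.get_state (name is an assumption)
get_state = MagicMock()
states = {}

def is_set_to(entity_id, state):
    """Roughly what `given_that.state_of(entity_id).is_set_to(state)` boils down to:
    make the mocked get_state return the stubbed value for that entity."""
    states[entity_id] = state
    get_state.side_effect = lambda entity, **kwargs: states.get(entity)

is_set_to('sensor.living_room_illumination', 200)  # 200lm == night
assert get_state('sensor.living_room_illumination') == 200
```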

2. Trigger action on your automation

In Appdaemon, automations register their callbacks during the initialize() phase.

To trigger actions on your automation, simply call one of the registered callbacks directly.

Note

It is best practice to have an initial test that checks the callbacks are actually registered as expected during the initialize() phase. See: Bonus - Assert callbacks were registered during initialize()

Example

LivingRoomTest.py
def test_during_night_light_turn_on(given_that, living_room, assert_that):
   ...
   living_room._new_motion(None, None, None)
   ...
With LivingRoom.py
class LivingRoom(hass.Hass):
    def initialize(self):
        ...
        self.listen_event(
                self._new_motion,
                'motion',
                entity_id='binary_sensor.bathroom_motion')
        ...

    def _new_motion(self, event_name, data, kwargs):
        ...  # Handle motion here

3. Assert on your way out: assert_that
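An assertion like was.turned_on() boils down to checking the mock that captured the automation's calls. A minimal stdlib sketch, assuming the patched function is a MagicMock named turn_on (the name is an assumption, not the framework's real internals):

```python
from unittest.mock import MagicMock

turn_on = MagicMock()  # stand-in for the patched hass.Hass.turn_on (assumed name)

# The automation under test would call self.turn_on(...) inside the triggered callback:
turn_on('light.living_room')

# `assert_that('light.living_room').was.turned_on()` roughly amounts to:
turn_on.assert_any_call('light.living_room')

# `assert_that('light.bathroom').was_not.turned_on()` roughly amounts to:
assert ('light.bathroom',) not in [c.args for c in turn_on.call_args_list]
```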

Bonus — Assert callbacks were registered during initialize()

_See related test file for more examples: test_assert_callback_registration.py_

Bonus — Travel in Time: time_travel

This helper simulates going forward in time.

It will run the callbacks registered with the run_in function of Appdaemon:

Simulate time

time_travel.fast_forward(MINUTES).minutes()
time_travel.fast_forward(SECONDS).seconds()

Assert time in test — Only useful for sanity check

time_travel.assert_current_time(MINUTES).minutes()
time_travel.assert_current_time(SECONDS).seconds()

Example

2 services:

* 'some/service': Should be called at T=3min

* 'some_other/service': Should be called at T=5min

time_travel.assert_current_time(0).minutes()

time_travel.fast_forward(3).minutes()
assert_that('some/service').was.called()
assert_that('some_other/service').was_not.called()

time_travel.fast_forward(2).minutes()
assert_that('some_other/service').was.called()
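The mechanic behind fast_forward can be pictured as a tiny scheduler: run_in registers (due time, callback) pairs, and fast-forwarding advances a simulated clock and fires everything that came due. This sketch is illustrative only, not the framework's real implementation:

```python
scheduled = []  # (due_time_in_seconds, callback) pairs
now = 0         # simulated clock

def run_in(callback, delay_s):
    """Stand-in for Appdaemon's run_in: schedule `callback` after `delay_s` seconds."""
    scheduled.append((now + delay_s, callback))

def fast_forward_seconds(seconds):
    """Roughly what time_travel.fast_forward(...).seconds() boils down to:
    advance the clock and fire every callback that came due."""
    global now
    now += seconds
    due = [cb for (t, cb) in scheduled if t <= now]
    scheduled[:] = [(t, cb) for (t, cb) in scheduled if t > now]
    for cb in due:
        cb()

fired = []
run_in(lambda: fired.append('light_off'), 5 * 60)
fast_forward_seconds(4 * 60)
assert fired == []               # at T=4min, nothing is due yet
fast_forward_seconds(60)
assert fired == ['light_off']    # at T=5min, the callback ran
```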


---
## Examples
### Simple
*  [**Pytest**](https://github.com/FlorianKempenich/Appdaemon-Test-Framework/blob/master/doc/pytest_example.py)
*  [**Unittest**](https://github.com/FlorianKempenich/Appdaemon-Test-Framework/blob/master/doc/unittest_example.py)

### Special cases
#### Complex setup fixture
In this scenario, we wish to test what happens **after** the light was turned on by motion and the `device_tracker.person` is set to `away`.
After the first `_new_motion`, the mock functions must be reset to ensure a clean state. For that purpose, it is entirely possible to use `given_that.mock_functions_are_cleared()` inside a particular test:
```python
def test_light_turned_on_by_motion_and_person_away__do_not_turn_on_again(given_that,
                                                                         living_room,
                                                                         assert_that):
  # Given:
  # - Light was turned on by motion
  given_that.state_of('sensor.living_room_illumination').is_set_to(200) # 200lm == night
  living_room._new_motion(None, None, None)
  given_that.mock_functions_are_cleared()
  # - Person is away
  given_that.state_of('device_tracker.person').is_set_to('away')

  # When: New motion
  living_room._new_motion(None, None, None)

  # Then: Light isn't turned on
  assert_that('light.living_room').was_not.turned_on()
```

Alternatively, if multiple scenarios are to be tested with the same fixture (after turned on by motion & person is away), an intermediate fixture can be used:

```python
class AfterLightTurnedOnByMotion:

  @pytest.fixture
  def living_room_after_light_turned_on_by_motion(self, living_room, given_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200) # 200lm == night
    living_room._new_motion(None, None, None)
    given_that.mock_functions_are_cleared()
    return living_room  # return the automation so dependent tests can use it

  def test_person_away__new_motion__do_not_turn_on_again(self,
                                                         given_that,
                                                         living_room_after_light_turned_on_by_motion,
                                                         assert_that):
    given_that.state_of('device_tracker.person').is_set_to('away')
    living_room_after_light_turned_on_by_motion._new_motion(None, None, None)
    assert_that('light.living_room').was_not.turned_on()

  def test_some_other_condition__do_something_else(self,
                                                   given_that,
                                                   living_room_after_light_turned_on_by_motion,
                                                   assert_that):
    ...
```

Complete Project


Under The Hood

This section is entirely optional. For a guide on how to use the framework, see the sections above!

Understand the motivation

Why a test framework dedicated to Appdaemon? The way Appdaemon allows the user to implement automations is based on inheritance, which makes testing not so trivial. This test framework abstracts away all that complexity, allowing for a smooth TDD experience.

Couldn't we just use the MVP pattern with clear interfaces at the boundaries? Yes, we could... but would we? Let's be pragmatic: this is the kind of project we develop for our home, and we're a team of one. While being a huge proponent of clean architecture, I believe using such a complex architecture for such a simple project would only bring more complexity than necessary.

Enjoy the simplicity

Every automation in Appdaemon follows the same model: it inherits from hass.Hass and interacts with Home Assistant through the inherited API.

AppdaemonTestFramework captures all calls to that API, and the helpers use the captured information to implement the common functionality needed in our tests.

Methods from the hass.Hass class are patched globally in the HassMocks class and the mocks can be accessed with the hass_functions property. The test fixture is injected in the helper classes as hass_mocks. After all tests have run, the hass_mocks test fixture will automatically unpatch all the mocks.

Deprecated hass_functions test fixture: Before HassMocks existed, hass_functions was used for interacting with the patched hass methods. The hass_mocks test fixture now contains hass_functions, which should be accessed through it. A hass_functions test fixture that returns hass_mocks.hass_functions will be kept for a while to ease the transition, but will generate a deprecation warning.

  1. hass_functions: dictionary with all patched functions
  2. unpatch_callback: callback to un-patch all patched functions

hass_functions are injected in the helpers when creating their instances. After all tests, unpatch_callback is used to un-patch all patched functions.

Setup and teardown are handled in the pytest_conftest.py file.
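The patch / un-patch cycle described above follows the standard unittest.mock patching pattern. A minimal stdlib sketch of the same life cycle, using an illustrative Hass stub rather than the real appdaemon class:

```python
from unittest.mock import patch, MagicMock

class Hass:  # illustrative stand-in for appdaemon's hass.Hass
    def turn_on(self, entity_id):
        raise RuntimeError("would talk to a real Home Assistant instance")

# Setup: patch the method globally, keep the mock and an un-patch callback
patcher = patch.object(Hass, 'turn_on', MagicMock())
hass_functions = {'turn_on': patcher.start()}
unpatch_callback = patcher.stop

# During tests: calls are captured by the mock instead of reaching Home Assistant
Hass().turn_on('light.living_room')
hass_functions['turn_on'].assert_called_once_with('light.living_room')

# Teardown after all tests: restore the real methods
unpatch_callback()
```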

Appdaemon Test Framework flow


Feature focus

@automation_fixture

To be able to test an Appdaemon automation, the automation needs to be created after the framework has been initialized. This could be done with a regular @pytest.fixture by injecting given_that (or any other helper) before creating the automation: injecting the helper initializes the framework, ensuring the correct functions are patched before the automation is created:

@pytest.fixture
def living_room(given_that):
    # `None` dependencies are mocked by the framework
    living_room = LivingRoom(None, None, None, None, None, None, None, None)
    return living_room

Also, for convenience, we could call the automation's initialize() method to make it available in tests as it would be in production once Appdaemon is started, make sure the mocks are cleared of any calls when the fixture is injected, and set the automation's instance name to the name of its class.

@pytest.fixture
def living_room(given_that):
    living_room = LivingRoom(None, LivingRoom.__name__, None, None, None, None, None, None)
    living_room.initialize()
    given_that.mock_functions_are_cleared()
    return living_room

However, since this code would have to be repeated for every single automation, it created unnecessary clutter. For that reason, the @automation_fixture decorator was introduced. It is simple syntactic sugar and performs the exact same steps as the fixture above in far fewer lines:

@automation_fixture(LivingRoom)
def living_room():
    pass

Advanced Usage

@automation_fixture - Extra features

Deprecation Warnings

As development of this test framework continues, some interfaces and test fixtures need to be deprecated. In general, the following method is used to ease the transition:

  1. Mark deprecated calls with a warning
  2. Provide an expressive message directing you how to change your code
  3. At least one minor release will include the warning
  4. In a future MAJOR release, this warning will change to an error and/or the deprecated code will be removed altogether.

This deprecation strategy allows you to keep using Appdaemon Test Framework as you always have, as long as you don't update to a new MAJOR version.

Silencing deprecation warnings

The deprecation warnings can be a bit overwhelming depending on the current state of the codebase. If you would like to run tests and ignore these warnings, use the following pytest options:

pytest -W ignore::DeprecationWarning

Without pytest

If you do not wish to use pytest, first maybe reconsider: pytest is awesome :) If you're really set on using something else, worry not, it's pretty straightforward too ;)

The conftest.py file simply handles the setup & teardown, and provides the helpers as injectable fixtures. It is pretty easy to replicate the same behavior with your test framework of choice. For instance, with unittest, a base TestCase can replace pytest's conftest.py. See the Unittest Example.

Direct call to mocked functions

/!\ WARNING — EXPERIMENTAL /!\

Want functionality not implemented by the helpers? You can inject hass_mocks directly in your tests and use hass_mocks.hass_functions; the patched functions are MagicMocks. The list of patched functions can be found in the hass_mocks module.
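Since the patched functions are plain MagicMocks, the full unittest.mock API (call_count, call_args, the assert_* helpers) is available on them. An illustrative sketch, where the call_service name is an assumption about hass_functions' contents:

```python
from unittest.mock import MagicMock

# e.g. hass_mocks.hass_functions['call_service'] (key name is an assumption)
call_service = MagicMock()

# The automation under test would make this call during a triggered callback:
call_service('light/turn_on', entity_id='light.living_room', brightness=50)

# Full unittest.mock inspection, for cases the helpers don't cover:
assert call_service.call_count == 1
service_name = call_service.call_args.args[0]
service_kwargs = call_service.call_args.kwargs
assert service_name == 'light/turn_on'
assert service_kwargs['brightness'] == 50
```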


Contributing

PRs simply patching new functions in hass_functions:

  • Will be merged almost immediately
  • Are exempt from the "must have tests" rule, although there are rarely too many tests 😉

Thanks for your contribution 😀 👍

Tests

When adding a new feature, you can TDD it, add unit tests later, or rely only on integration tests. Whichever way you go is totally fine for this project, but new features will need to have at least some sort of tests, even if they're super basic 🙂

Using Pipenv

This project has been developed using Pipenv. It is totally optional, but if you are interested, here is a quick guide to get you started:

  1. Install pipenv -> brew install pipenv (or whichever method works best for you)
  2. cd in the project directory
  3. Run pipenv install --dev
    That will create a virtualenv and automatically install all required dependencies
  4. Run pipenv shell
    That will activate the virtualenv in the current shell
  5. Run pytest to ensure that everything is working as expected
  6. You're good to go and can start contributing 😊

Author Information

Follow me on Twitter: @ThisIsFlorianK Find out more about my work: Florian Kempenich — Personal Website