Clean, human-readable tests for your Appdaemon automations.

```python
given_that.state_of('sensor.temperature').is_set_to('24.9')
assert_that('light.bathroom').was.turned_on()
time_travel.fast_forward(2).minutes()
```
```python
def test_during_night_light_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200)  # 200lm == night
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was.turned_on()

def test_click_light_turn_on_for_5_minutes(given_that, living_room, assert_that, time_travel):
    living_room._new_button_click(None, None, None)
    assert_that('light.bathroom').was.turned_on()

    # At T=4min the light should not yet have been turned off
    time_travel.fast_forward(4).minutes()
    assert_that('light.bathroom').was_not.turned_off()

    # At T=5min the light should have been turned off
    time_travel.fast_forward(1).minutes()
    time_travel.assert_current_time(5).minutes()
    assert_that('light.bathroom').was.turned_off()
```
Install via pip:

```shell
pip install pytest
pip install appdaemontestframework
```

Then create a `conftest.py` file in the root of your project with the following code:

```python
from appdaemontestframework.pytest_conftest import *
```
Let's test an Appdaemon automation we created, which, say, handles automatic lighting in the Living Room: `class LivingRoom`.

First, create the automation fixture with the `@automation_fixture` decorator:

```python
@automation_fixture(LivingRoom)
def living_room():
    pass
```

Then write the test:

```python
def test_during_night_light_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200)  # 200lm == night
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was.turned_on()
```
Note: The following fixtures are injected by pytest using the `conftest.py` file and the initialisation fixture created at Step 1:

* `living_room`
* `given_that`
* `assert_that`
* `time_travel`

Important: For this example to work, do not forget to copy the `conftest.py` file.
```python
@automation_fixture(LivingRoom)
def living_room():
    pass

def test_during_night_light_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(200)  # 200lm == night
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was.turned_on()

def test_during_day_light_DOES_NOT_turn_on(given_that, living_room, assert_that):
    given_that.state_of('sensor.living_room_illumination').is_set_to(1000)  # 1000lm == sunlight
    living_room._new_motion(None, None, None)
    assert_that('light.living_room').was_not.turned_on()
```
#### `@automation_fixture`

```python
# Command
@automation_fixture(AUTOMATION_CLASS)
def FIXTURE_NAME():
    pass

# Example
@automation_fixture(LivingRoom)
def living_room():
    pass
```

The automation given to the fixture will be initialized (its `initialize()` function is called) and made available as a regular `@pytest.fixture`.
#### `given_that`: Simulate args passed via the `apps.yaml` config

See: Appdaemon - Passing arguments to Apps
```python
# Command
given_that.passed_arg(ARG_KEY).is_set_to(ARG_VAL)

# Example
given_that.passed_arg('color').is_set_to('blue')
```

Note: See Pre-initialization Setup if arguments are required in the `initialize()` method.
```python
# Command
given_that.state_of(ENTITY_ID).is_set_to(STATE_TO_SET)
given_that.state_of(ENTITY_ID).is_set_to(STATE_TO_SET, ATTRIBUTES_AS_DICT)
given_that.state_of(ENTITY_ID).is_set_to(STATE_TO_SET,
                                         ATTRIBUTES_AS_DICT,
                                         LAST_UPDATED_AS_DATETIME,
                                         LAST_CHANGED_AS_DATETIME)

# Example
given_that.state_of('media_player.speaker').is_set_to('playing')
given_that.state_of('light.kitchen').is_set_to('on', {'brightness': 50,
                                                      'color_temp': 450})
given_that.state_of('light.kitchen').is_set_to(
    'on',
    last_updated=datetime(2020, 3, 3, 11, 27))
```
```python
# Command
given_that.time_is(TIME_AS_DATETIME)

# Example
given_that.time_is(time(hour=20))
```
```python
# Clear all calls recorded on the mocks
given_that.mock_functions_are_cleared()

# To also clear all mocked state, use the option: 'clear_mock_states'
given_that.mock_functions_are_cleared(clear_mock_states=True)

# To also clear all mocked passed args, use the option: 'clear_mock_passed_args'
given_that.mock_functions_are_cleared(clear_mock_passed_args=True)
```
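Under the hood the recorded calls live on standard `MagicMock` objects, so clearing them is conceptually the same as `MagicMock.reset_mock()`. A minimal stdlib sketch of the idea (illustrative only, not the framework's actual internals):

```python
from unittest.mock import MagicMock

# A stand-in for one of the patched hass functions (illustrative only).
turn_on = MagicMock()
turn_on('light.living_room')
assert turn_on.called          # the call was recorded

# Conceptually what `given_that.mock_functions_are_cleared()` does:
turn_on.reset_mock()
assert not turn_on.called      # recorded calls are gone
```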
The way Automations work in Appdaemon is: callbacks are registered during the `initialize()` phase. To trigger actions on your automation, simply call one of the registered callbacks.

Note: It is best practice to have an initial test checking that the callbacks are actually registered as expected during the `initialize()` phase. See: Bonus - Assert callbacks were registered during `initialize()`
`LivingRoomTest.py`:

```python
def test_during_night_light_turn_on(given_that, living_room, assert_that):
    ...
    living_room._new_motion(None, None, None)
    ...
```

`LivingRoom.py`:

```python
class LivingRoom(hass.Hass):
    def initialize(self):
        ...
        self.listen_event(
            self._new_motion,
            'motion',
            entity_id='binary_sensor.bathroom_motion')
        ...

    def _new_motion(self, event_name, data, kwargs):
        ...  # < Handle motion here >
```
#### `assert_that`

```python
# Available commands
assert_that(ENTITY_ID).was.turned_on(OPTIONAL_KWARGS)
assert_that(ENTITY_ID).was.turned_off(OPTIONAL_KWARGS)
assert_that(ENTITY_ID).was_not.turned_on(OPTIONAL_KWARGS)
assert_that(ENTITY_ID).was_not.turned_off(OPTIONAL_KWARGS)

# Examples
assert_that('light.living_room').was.turned_on()
assert_that('light.living_room').was.turned_on(color_name=SHOWER_COLOR)
assert_that('light.living_room').was_not.turned_off()
assert_that('light.living_room').was_not.turned_off(transition=2)
```
```python
# Available commands
assert_that(SERVICE).was.called_with(OPTIONAL_KWARGS)
assert_that(SERVICE).was_not.called_with(OPTIONAL_KWARGS)

# Examples
assert_that('notify/pushbullet').was.called_with(
    message='Hello :)',
    target='My Phone')
assert_that('media_player/volume_set').was.called_with(
    entity_id='media_player.bathroom_speaker',
    volume_level=0.6)
```
#### Assert callbacks were registered during `initialize()`

```python
# Available commands
assert_that(AUTOMATION) \
    .listens_to.event(EVENT) \
    .with_callback(CALLBACK)
assert_that(AUTOMATION) \
    .listens_to.event(EVENT, OPTIONAL_KWARGS) \
    .with_callback(CALLBACK)

# Examples - Where 'living_room' is an instance of 'LivingRoom' automation
assert_that(living_room) \
    .listens_to.event('click', entity_id='binary_sensor.button') \
    .with_callback(living_room._new_click_button)
```
```python
# Available commands
assert_that(AUTOMATION) \
    .listens_to.state(ENTITY_ID) \
    .with_callback(CALLBACK)
assert_that(AUTOMATION) \
    .listens_to.state(ENTITY_ID, OPTIONAL_KWARGS) \
    .with_callback(CALLBACK)

# Examples - Where 'living_room' is an instance of 'LivingRoom' automation
assert_that(living_room) \
    .listens_to.state('binary_sensor.button', old='on', new='off') \
    .with_callback(living_room._no_more_motion)
```
```python
# Available commands
assert_that(AUTOMATION) \
    .registered.run_daily(TIME_AS_DATETIME) \
    .with_callback(CALLBACK)
assert_that(AUTOMATION) \
    .registered.run_daily(TIME_AS_DATETIME, OPTIONAL_KWARGS) \
    .with_callback(CALLBACK)

# Examples - Where 'living_room' is an instance of 'LivingRoom' automation
assert_that(living_room) \
    .registered.run_daily(time(hour=12), time_as_text='noon') \
    .with_callback(living_room._new_time)
```

NOTE: The above examples can also be used with `run_minutely`.

_See related test file for more examples: test_assert_callback_registration.py_
#### `time_travel`

This helper simulates going forward in time. It will run the callbacks registered with the `run_in` function of Appdaemon, and it automatically resets between each test (with the default config).
```python
# Available commands
time_travel.fast_forward(MINUTES).minutes()
time_travel.fast_forward(SECONDS).seconds()
time_travel.assert_current_time(MINUTES).minutes()
time_travel.assert_current_time(SECONDS).seconds()

# Example
time_travel.assert_current_time(0).minutes()

time_travel.fast_forward(3).minutes()
assert_that('some/service').was.called()
assert_that('some_other/service').was_not.called()

time_travel.fast_forward(2).minutes()
assert_that('some_other/service').was.called()
```
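To make the mechanism concrete, here is a hypothetical, heavily simplified sketch of how a fast-forward helper can work: scheduled callbacks are stored with a due time, and fast-forwarding fires every callback whose due time has passed. All names below are illustrative, not the framework's real internals.

```python
class TimeTravelSketch:
    def __init__(self):
        self.now = 0          # simulated time, in seconds
        self.scheduled = []   # list of (due_time_seconds, callback)

    def run_in(self, callback, delay_seconds):
        # Mimics scheduling a delayed callback (like Appdaemon's `run_in`).
        self.scheduled.append((self.now + delay_seconds, callback))

    def fast_forward_seconds(self, seconds):
        # Advance the simulated clock and fire every callback now due,
        # in chronological order.
        self.now += seconds
        due = sorted((item for item in self.scheduled if item[0] <= self.now),
                     key=lambda item: item[0])
        self.scheduled = [item for item in self.scheduled if item[0] > self.now]
        for _, callback in due:
            callback()

calls = []
tt = TimeTravelSketch()
tt.run_in(lambda: calls.append('turn_off'), 5 * 60)   # turn off in 5 minutes

tt.fast_forward_seconds(4 * 60)
assert calls == []                 # at T=4min the callback has not fired

tt.fast_forward_seconds(1 * 60)
assert calls == ['turn_off']       # at T=5min it has
```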
---
## Examples
### Simple
* [**Pytest**](https://github.com/FlorianKempenich/Appdaemon-Test-Framework/blob/master/doc/pytest_example.py)
* [**Unittest**](https://github.com/FlorianKempenich/Appdaemon-Test-Framework/blob/master/doc/unittest_example.py)
### Special cases
#### Complex setup fixture
In this scenario, we wish to test what happens **after** the light was turned on by motion and the `device_tracker.person` is set to `away`.
After the first `_new_motion`, the mock functions must be reset to ensure a clean state. For that purpose, it is entirely possible to use `given_that.mock_functions_are_cleared()` inside a particular test:
```python
def test_light_turned_on_by_motion_and_person_away__do_not_turn_on_again(given_that,
                                                                         living_room,
                                                                         assert_that):
    # Given:
    # - Light was turned on by motion
    given_that.state_of('sensor.living_room_illumination').is_set_to(200)  # 200lm == night
    living_room._new_motion(None, None, None)
    given_that.mock_functions_are_cleared()

    # - Person is away
    given_that.state_of('device_tracker.person').is_set_to('away')

    # When: New motion
    living_room._new_motion(None, None, None)

    # Then: Light isn't turned on
    assert_that('light.living_room').was_not.turned_on()
```
Alternatively, if multiple scenarios are to be tested with the same fixture (after the light was turned on by motion and the person is away), an intermediate fixture can be used:

```python
class AfterLightTurnedOnByMotion:
    @pytest.fixture
    def living_room_after_light_turned_on_by_motion(self, living_room, given_that):
        given_that.state_of('sensor.living_room_illumination').is_set_to(200)  # 200lm == night
        living_room._new_motion(None, None, None)
        given_that.mock_functions_are_cleared()
        return living_room

    def test_person_away__new_motion__do_not_turn_on_again(self,
                                                           given_that,
                                                           living_room_after_light_turned_on_by_motion,
                                                           assert_that):
        given_that.state_of('device_tracker.person').is_set_to('away')
        living_room_after_light_turned_on_by_motion._new_motion(None, None, None)
        assert_that('light.living_room').was_not.turned_on()

    def test_some_other_condition__do_something_else(self,
                                                     given_that,
                                                     living_room_after_light_turned_on_by_motion,
                                                     assert_that):
        ...
```
This section is entirely optional. For a guide on how to use the framework, see the sections above!

Why a test framework dedicated to Appdaemon? The way Appdaemon allows the user to implement automations is based on inheritance, which makes testing not so trivial. This test framework abstracts away all that complexity, allowing for a smooth TDD experience.

Couldn't we just use the MVP pattern with clear interfaces at the boundaries? Yes, we could... but would we? Let's be pragmatic: this is the kind of project we develop for our home, and we're a team of one. While being a huge proponent of clean architecture, I believe using such a complex architecture for such a simple project would only bring more complexity than necessary.
Every Automation in Appdaemon follows the same model: it inherits from the `hass.Hass` class and calls the Appdaemon API through `self`. AppdaemonTestFramework captures all calls to the API, and the helpers make use of that information to implement common functionality needed in our tests.

Methods from the `hass.Hass` class are patched globally in the `HassMocks` class, and the mocks can be accessed with the `hass_functions` property. The test fixture is injected in the helper classes as `hass_mocks`.

After all tests have run, the `hass_mocks` test fixture will automatically unpatch all the mocks.
Deprecated: the `hass_functions` test fixture

Before `HassMocks` existed, `hass_functions` was used for interacting with the patched hass methods. The `hass_mocks` test fixture now contains `hass_functions` and should be accessed through there. A `hass_functions` test fixture that returns `hass_mocks.hass_functions` will be kept for a while to ease the transition, but will generate a deprecation warning.
* `hass_functions`: dictionary with all patched functions
* `unpatch_callback`: callback to un-patch all patched functions

`hass_functions` are injected in the helpers when creating their instances. After all tests, `unpatch_callback` is used to un-patch all patched functions. Setup and teardown are handled in the `pytest_conftest.py` file.
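The patch, inject, un-patch lifecycle can be illustrated with the standard library alone. This is a sketch of the idea (the names `Hass` and `turn_on` here are stand-ins, not the framework's actual code):

```python
from unittest.mock import MagicMock, patch

class Hass:                      # stand-in for appdaemon's hass.Hass
    def turn_on(self, entity_id):
        raise RuntimeError("would talk to Home Assistant")

# 1. Patch the API function with a MagicMock and keep a handle on it.
patcher = patch.object(Hass, 'turn_on', MagicMock())
mock_turn_on = patcher.start()

# 2. The patched mock records every call made by the automation under test.
Hass().turn_on('light.living_room')
mock_turn_on.assert_called_once_with('light.living_room')

# 3. Un-patch when the tests are done; the real method is restored.
patcher.stop()
```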
Setup:

1. Patch the `hass.Hass` functions and create a `HassMocks` instance for tracking state
2. Inject `hass_mocks` in the helpers: `given_that`, `assert_that`, `time_travel`

Teardown:

1. Un-patch the `hass.Hass` functions

#### `@automation_fixture`
To be able to test an Appdaemon automation, the automation needs to be created after the framework has been initialized. This could be done with a regular `@pytest.fixture` by injecting `given_that` (or any other helper) before creating the automation. This would invoke the helper, which would initialize the framework beforehand, and the correct functions would be patched before the automation is created:
```python
@pytest.fixture
def living_room(given_that):
    # `None` dependencies are mocked by the framework
    living_room = LivingRoom(None, None, None, None, None, None, None, None)
    return living_room
```
Also, for convenience, we could initialize the automation with its `initialize()` method to make it available in tests as it would be in production once Appdaemon is started, making sure the mocks are clear of any calls when the fixture is injected. We could also set the automation instance name to the name of the automation class.
```python
@pytest.fixture
def living_room(given_that):
    living_room = LivingRoom(None, LivingRoom.__name__, None, None, None, None, None, None)
    living_room.initialize()
    given_that.mock_functions_are_cleared()
    return living_room
```
However, since this code would have to be repeated for every single automation, it was creating unnecessary clutter. For that reason, the `@automation_fixture` was introduced. It is simple syntactic sugar and performs the exact same steps as the fixture above in far fewer lines:
```python
@automation_fixture(LivingRoom)
def living_room():
    pass
```
#### `@automation_fixture`: Extra features

Pre-initialization setup: for some automations, the `initialize()` step requires some pre-configuration of the global state. Maybe it requires the time to be set up, or maybe it needs some sensors to have a particular state. Such pre-initialization setup is possible with the `@automation_fixture`. The fixture can be injected with the following 2 arguments:

* `given_that`: for configuring the state
* `hass_mocks`: for more complex setup steps (`hass_functions` was deprecated in favor of `hass_mocks`)
Any code written in the fixture will be executed before initializing the automation. That way your `initialize()` function can safely rely on the Appdaemon framework and call some of its methods; all you have to do is set up the context in the fixture.
Let's imagine an automation, `Terasse`, that turns on the light at night. During the `initialize()` phase, `Terasse` checks the current time to know if it should immediately turn on the light (for instance, if Appdaemon is started during the night).

Without mocking the time before the call to `initialize()`, the test framework will not know which time to return and an error will be raised. To prevent that, we set the time in the fixture; that code will be executed before the call to `initialize()` and no error will be raised.
```python
@automation_fixture(Terasse)
def terasse(given_that):
    given_that.time_is(datetime.time(hour=20))

# With Terasse:
class Terasse(hass.Hass):
    def initialize(self):
        ...
        current_time = self.time()
        if self._is_during_night(current_time):
            self._turn_on_light()
        ...
```
The `@automation_fixture` can actually be used in 4 different ways:

```python
# Single Class:
@automation_fixture(MyAutomation)

# Multiple Classes:
@automation_fixture(MyAutomation, MyOtherAutomation)

# Single Class w/ params:
@automation_fixture((upstairs.Bedroom, {'motion': 'binary_sensor.bedroom_motion'}))

# Multiple Classes w/ params:
@automation_fixture(
    (upstairs.Bedroom, {'motion': 'binary_sensor.bedroom_motion'}),
    (upstairs.Bathroom, {'motion': 'binary_sensor.bathroom_motion'}),
)
```
The alternate versions can be useful for parametrized testing: the injected fixture is then a tuple, `(Initialized_Automation, params)`.
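To demystify the decorator, here is a hypothetical, heavily simplified sketch of what an `@automation_fixture`-style decorator boils down to. All names are illustrative; the real decorator also integrates with pytest, mocks the `hass.Hass` dependencies, and clears recorded mock calls:

```python
def automation_fixture_sketch(automation_class):
    # Decorator factory: takes the automation class, returns a decorator.
    def decorator(fixture_func):
        def wrapper():
            automation = automation_class()
            automation.initialize()    # run the Appdaemon init phase
            return automation
        return wrapper
    return decorator

class LivingRoom:
    def initialize(self):
        self.initialized = True

@automation_fixture_sketch(LivingRoom)
def living_room():
    pass    # the body is unused; the decorator builds the automation

auto = living_room()
assert isinstance(auto, LivingRoom)
assert auto.initialized
```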
As development of this test framework continues, some interfaces and test fixtures need to be deprecated. In general, the following method is used to ease the transition. This deprecation strategy allows you to keep using Appdaemon Test Framework as you always have, as long as you don't update to a new MAJOR version.
Silencing deprecation warnings

The deprecation warnings can be a bit overwhelming depending on the current state of the codebase. If you would like to run tests and ignore these warnings, use the following pytest option:

```shell
pytest -W ignore::DeprecationWarning
```
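The `-W ignore::DeprecationWarning` flag is just the command-line form of Python's standard warning filters; the same effect can be reproduced in code with the stdlib `warnings` module (the `old_fixture` function below is a hypothetical stand-in for a deprecated fixture):

```python
import warnings

def old_fixture():
    # Stand-in for a deprecated fixture emitting a warning.
    warnings.warn("hass_functions is deprecated, use hass_mocks", DeprecationWarning)
    return "value"

with warnings.catch_warnings():
    warnings.simplefilter("ignore", DeprecationWarning)
    result = old_fixture()    # the warning is silently discarded

assert result == "value"
```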
#### Without `pytest`

If you do not wish to use `pytest`, first maybe reconsider: `pytest` is awesome :) If you're really set on using something else, worry not, it's pretty straightforward too ;)

All the `conftest.py` file is doing is handling the setup & teardown, as well as providing the helpers as injectable fixtures. It is pretty easy to replicate the same behavior with your test framework of choice. For instance, with `unittest` a base `TestCase` can replace the pytest `conftest.py`. See the Unittest Example.
/!\ WARNING: EXPERIMENTAL /!\

Want a functionality not implemented by the helpers? You can inject `hass_mocks` directly in your tests and use `hass_mocks.hass_functions`; the patched functions are `MagicMock`s. The list of patched functions can be found in the `hass_mocks` module.
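Since the patched functions are plain `MagicMock`s, the whole standard `unittest.mock` inspection API applies to them. A quick stdlib refresher (the `call_service` name below is just a stand-in for one entry of `hass_functions`):

```python
from unittest.mock import MagicMock

call_service = MagicMock()      # stand-in for one of the patched functions
call_service('notify/pushbullet', message='Hello :)', target='My Phone')

# Any MagicMock assertion or attribute works on the patched functions:
call_service.assert_called_once_with('notify/pushbullet',
                                     message='Hello :)', target='My Phone')
assert call_service.call_args.kwargs['target'] == 'My Phone'
```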
All contributions are welcome!

* All PRs must be accompanied by some tests showcasing the new feature
* For new features, a short discussion will take place to decide whether the feature makes sense and whether this is the best place to implement it

Then the process goes as follows. PRs simply patching new functions in `hass_functions`:

* Will be merged almost immediately
* Are exempt from the "must have tests" rule, although there are rarely too many tests 😉

Thanks for your contribution 😀 👍

When adding a new feature, you can TDD it, add unit tests later, or rely only on integration tests. Whichever way you go is totally fine for this project, but new features will need to have at least some sort of tests, even if they're super basic 🙂
#### Pipenv

This project has been developed using `Pipenv`. It is totally optional, but if you are interested, here is a quick guide to get you started:

* Install `pipenv` -> `brew install pipenv` (or whichever method works best for you)
* `cd` into the project directory
* Run `pipenv install --dev` to create a `virtualenv` and automatically install all required dependencies
* Run `pipenv shell` to activate the `virtualenv` in the current shell
* Run `pytest` to ensure that everything is working as expected

Follow me on Twitter: @ThisIsFlorianK
Find out more about my work: Florian Kempenich — Personal Website