PerMac opened this issue 1 year ago
Pytest doesn't seem to follow the timeout value from testcase.yaml.
I had to modify twister_harness/device/simulator_adapter.py.
@SeppoTakalo Thanks for pointing this out! We already have a fix for this waiting to be accepted: https://github.com/zephyrproject-rtos/zephyr/pull/58491#issuecomment-1573547888. Does this fix what you are referring to?
@PerMac No, that does not fix the timeout.
When the simulation starts, device.flash_and_run() is called without any timeout parameter:
https://github.com/zephyrproject-rtos/zephyr/blob/8c4bfcf324662d0526e0c0ec7c2b36c37be48986/scripts/pylib/pytest-twister-harness/src/twister_harness/fixtures/dut.py#L32
So even if I define my test case like this:
tests:
  sample.net.lwm2m:
    harness: pytest
    timeout: 300
That timeout does not affect how long the simulation can run. I would assume that the testcase timeout parameter should flow into flash_and_run() as well.
Indeed, thanks for pointing this out. I think the timeout from the yaml might not be the best value to pass, since it applies to the whole scenario (one built image), i.e. to the combined time of all related pytest tests. If we have 5 tests, each needing a 100 s timeout, then the timeout in the yaml should be 500 s. However, as you pointed out, it is not yet possible to set a timeout for each individual test. We are thinking of adding timeout parametrization to the dut fixture, so a developer will be able to define the needed timeout at the test level when requesting the dut fixture.
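A rough, purely hypothetical sketch of how such per-test timeout parametrization could look from the test's side, using standard pytest indirect parametrization (this is not the current twister_harness API; the dut fixture shown here and the FakeDevice class are stand-ins for illustration only):

import pytest

class FakeDevice:
    """Stand-in for a DUT object, only for illustration."""
    def __init__(self, timeout):
        self.timeout = timeout

    def close(self):
        pass

@pytest.fixture
def dut(request):
    # Hypothetical: read an optional per-test timeout passed via indirect
    # parametrization, falling back to a default of 60 s.
    timeout = getattr(request, 'param', {}).get('timeout', 60)
    device = FakeDevice(timeout=timeout)
    yield device
    device.close()

# The test requests a 100 s timeout for its dut instance.
@pytest.mark.parametrize('dut', [{'timeout': 100}], indirect=True)
def test_with_custom_timeout(dut):
    assert dut.timeout == 100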
@SeppoTakalo
The flash timeout is currently hardcoded to 60 s. There is no separate timeout for the simulation (I agree that adding one would be valuable).
The timeout from testcase.yaml is used to avoid blocking the CI:
https://github.com/zephyrproject-rtos/zephyr/blob/68589ca0f1d0af789bf1853c6a296d39fe5ebfe1/scripts/pylib/twister/twisterlib/harness.py#L328
This timeout applies per test configuration in testcase.yaml (flash + simulation), so if you have a couple of tests in a pytest file, you should consider extending it. If you run pytest directly (without calling Twister), the timeout from the yaml is not used.
@gchwier In PR https://github.com/zephyrproject-rtos/zephyr/pull/58791 I'm working on test cases that run the LwM2M client against a real server. Many tests defined in the LwM2M specification require waiting for timeouts of more than 30 seconds, so I cannot execute them without significantly modifying the simulation time.
Pytest doesn't seem to follow the timeout value from testcase.yaml.
I had to modify twister_harness/device/simulator_adapter.py.
@SeppoTakalo I created a PR that solves the issue of the timeout from testcase.yaml not being respected: https://github.com/zephyrproject-rtos/zephyr/pull/61224. If you have time, please verify whether it works for you.
When testing the LwM2M client, it would help me a lot if I could share a DUT between test cases. Currently the DUT and Shell fixtures are function scoped. If they could optionally be session scoped, I could run the setup phase once for the DUT and then run multiple test cases in sequence.
Feature request
When testing the LwM2M client, it would help me a lot if I could share a DUT between test cases. Currently the DUT and Shell fixtures are function scoped. If they could optionally be session scoped, I could run the setup phase once for the DUT and then run multiple test cases in sequence.
@SeppoTakalo - thank you for letting us know. We discussed this briefly; it is possible to implement, but it is not a one-line change and we need some time to look into it. We have to rethink how to handle the initialization of logging files regardless of the dut fixture scope.
In the meantime, if you would like to achieve something like what you described, you can create your own dut_session_scope fixture based on the device_object fixture (which has session scope). For example:
import logging
from typing import Generator

import pytest
from twister_harness import DeviceAdapter, Shell

logger = logging.getLogger(__name__)


@pytest.fixture(scope='session')
def dut_session_scope(device_object: DeviceAdapter) -> Generator[DeviceAdapter, None, None]:
    """Return the launched device with the application running."""
    try:
        device_object.launch()
        yield device_object
    finally:  # make sure we always close all running processes
        device_object.close()


@pytest.fixture(scope='session')
def shell_session_scope(dut_session_scope: DeviceAdapter) -> Shell:
    """Return a ready-to-use shell interface."""
    shell = Shell(dut_session_scope, timeout=20.0)
    logger.info('Wait for prompt')
    assert shell.wait_for_prompt()
    return shell
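With those fixtures placed in the suite's conftest.py, tests can then share one launched device for the whole session. A minimal usage sketch (the test names and the shell commands below are only illustrative and assume the application enables the corresponding Zephyr shell commands):

from twister_harness import Shell

def test_setup_phase(shell_session_scope: Shell):
    # First test runs right after the device is launched.
    lines = shell_session_scope.exec_command('kernel version')  # example command
    assert any('Zephyr' in line for line in lines)

def test_next_step_same_device(shell_session_scope: Shell):
    # Reuses the same device instance, so state from the previous test is kept.
    lines = shell_session_scope.exec_command('kernel uptime')  # example command
    assert lines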
@SeppoTakalo please find, test, and review the requested feature: #64356
@PerMac I have another feature request:
I'm planning to mark some test cases using pytest markers, like slow/fast/smoke, etc.
@pytest.mark.slow
def test_LightweightM2M_1_1_int_310(shell: Shell, leshan: Leshan, endpoint: str):
    ...
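(As a side note, custom markers like slow would typically also be registered so that pytest does not warn about unknown marks; a minimal sketch, e.g. in the suite's conftest.py:)

# conftest.py - register the custom markers used for filtering
def pytest_configure(config):
    config.addinivalue_line('markers', 'slow: mark a test as slow to run')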
Then in testcase.yaml I could already use those markers to filter tests:
tests:
  net.lwm2m.interop.smoke:
    harness: pytest
    timeout: 60
    harness_config:
      pytest_dut_scope: module
      pytest_args: ['-k not slow']
  net.lwm2m.interop.slow:
    harness: pytest
    timeout: 600
    slow: true
    harness_config:
      pytest_dut_scope: module
      pytest_args: ['-k slow']
However, that fails: if I allow the slow tests to run, both test configurations start at the same time, and the Zephyr network (TUN/TAP) is only capable of running one instance at a time.
So this needs to be solved, either by:
- Allow me to feed pytest_args from the Twister command line
- Allow me to limit which tests can run in parallel.
@SeppoTakalo it is not easy to limit that. This is rather a request for Twister core functionality, not related to pytest.
For now, the simplest solution / workaround for you is to run Twister with --jobs 1 - it limits the number of jobs used for building / executing tests (but it increases the build time, so you can run Twister separately: first with --build-only on all jobs, and then with --test-only --jobs 1).
- Allow me to feed pytest_args from the Twister command line
I will try to add this next week. We have also thought about this, to make it possible to filter tests from the pytest command line,
e.g. to run only one test:
-T zephyr/tests/net/lib/lwm2m/interop -s tests/net/lib/lwm2m/interop/net.lwm2m.interop --pytest-args='-k verify_LightweightM2M_1_1_int_0'
However, if someone already uses pytest_args in testcase.yaml, it will then be overwritten.
Using markers in pytest scripts is great, but they are 'hidden' from Twister. Maybe it would be better to add some filtering options to testcase.yaml.
For now, you can mark a test configuration in testcase.yaml as slow, select which scenario / module is used, and then run Twister with the --enable-slow flag:
slow: true
harness_config:
  pytest_root:
    - pytest/test_slow_scenarios.py
    - pytest/test_lwm2m.py::test_LightweightM2M_1_1_int_102
    - pytest/test_lwm2m.py::test_LightweightM2M_1_1_int_104
(you can find some examples in the Twister docs)
Integration between twister and pytest: tracking issue.
An integration between pytest and Twister was added with #57117. More details about the integration can be found at https://docs.zephyrproject.org/latest/develop/test/pytest.html. This issue is meant to track the current limitations of the integration and to provide a space to discuss what should be added and/or improved, and how. For bigger tasks, individual issues will be created and linked in the list.
Known limitations:
Requested functionality:
Ideas for pytest usage: