Closed cladmi closed 5 years ago
It may not be broken, but it looks like this is horribly slow.
Any idea @maribu?
With make term it takes 3 whole minutes D:
BOARD=frdm-kw41z make tests-sht1x flash term
2019-08-20 18:07:40,595 - INFO # Connect to serial port /dev/riot/ttyFRDM_KW41Z
Welcome to pyterm!
Type '/exit' to exit.
2019-08-20 18:10:41,707 - INFO # e��.
2019-08-20 18:10:41,708 - INFO # OK (1 tests)
2019-08-20 18:10:45,681 - INFO # Exiting Pyterm
This test verifies that the integer-only math gives the same result as the double-precision math from the data sheet by going through a huge number of possible raw values. If the difference is within a given tolerance for every tested value, the test passes.
This test was written with the native board in mind. Running it on a device with only a soft FPU will fry the CPU :-(
But I guess the conversion math in the sht1x driver will never be touched again anyway, so just dropping the unit test seems fine to me.
Is going through the huge number of raw values really needed, or is that just brute force? The curve must be deterministic, so as long as you match the math function, checking the boundaries and a few values in the middle should be enough.
Also, even if the test is designed for native, note that native handles floats differently than a board would, as it may use the extended float registers: https://stackoverflow.com/a/612640/395687 So a test passing on native does not mean it works on a board.
The driver uses integer-only math, which should yield exactly the same results on every target. Thanks to the 80-bit FPU on x86, the unit test compares the results of the integer-only implementation against the more accurate results of the formulas given in the data sheet. That is all the more reason to only execute this unit test on native.
How about guarding the test with #ifdef CPU_NATIVE ... #endif ?
I had not read the implementation. Thanks for the clarification, so the double math is only in the test.
Maybe not 100% guarding, but reducing the test vectors to a few on embedded definitely :)
I will try something.
The simplest thing would be to set the amount by which the raw temperature and humidity values are incremented at the top of the file, using something like
#ifdef CPU_NATIVE
#define RAW_TEMP_STEP (13)
#else
#define RAW_TEMP_STEP (1300)
#endif
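For illustration, here is a minimal sketch of how such a step macro thins out the sweep on embedded targets (the loop bound and body are placeholders; only the step selection follows the suggestion above):

```c
/* Step width for the raw-value sweep: fine-grained on native,
 * only a handful of vectors on embedded targets. */
#ifdef CPU_NATIVE
#define RAW_TEMP_STEP (13)
#else
#define RAW_TEMP_STEP (1300)
#endif

/* Count how many raw values the sweep visits up to raw_max */
static unsigned count_test_vectors(unsigned raw_max)
{
    unsigned n = 0;
    for (unsigned raw = 0; raw < raw_max; raw += RAW_TEMP_STEP) {
        /* one conversion/comparison per raw value would go here */
        n++;
    }
    return n;
}
```

With a 14-bit raw range this visits 1261 vectors on native versus 13 on an embedded target, which is consistent with the reported drop from minutes to seconds. Picking step widths like 13 or 1300 (rather than powers of two) also keeps the low bits of the sampled raw values varied.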
Yes, that is exactly what I am doing, while testing the speed impact :)
I also reduced the "temperature" steps to save 2 seconds.
It went from the original 3 minutes down to 2 seconds.
I found another issue with this test.
It currently adds FEATURES_REQUIRED += periph_gpio, which disables the chronos board. (chronos currently also fails because of other issues, but those need to be tackled separately.)
The test should either be moved outside of unittests, or the driver should be split so that the conversions live in a module that does not require periph_gpio.
Which would you prefer?
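If the split were chosen, the dependency handling could look roughly like this. This is only a sketch: sht1x_convert is a hypothetical module name, and the actual RIOT dependency files may differ.

```make
# Sketch: only the bus-facing sht1x module needs periph_gpio; a
# hypothetical sht1x_convert module holding the pure conversion math
# could be used by the unit tests without that feature.
ifneq (,$(filter sht1x,$(USEMODULE)))
  USEMODULE += sht1x_convert
  FEATURES_REQUIRED += periph_gpio
endif
```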
Maybe the easiest option is to move the unit tests and integrate them into tests/driver_sht1x. The sht1x is a rather trivial driver and the conversions are not useful for anything other than the sht1x itself. For these reasons, I think it would be hard to justify having two modules for it.
Should I open a PR, or do you want to tackle this?
As you wish, I will not have time today but could do it too.
Description
The test tests-sht1x in tests-unittests makes the unit test freeze on frdm-kw41z.
No debugging has been done so far.
Steps to reproduce the issue
It can be shown that it is only this test by deleting the directory and running the test:
Expected results
Test success
Actual results
Test fails
Versions
Current master, but the unit tests were also failing on previous releases such as 2019.07 and 2019.04.