kb2ma closed this issue 4 years ago.
Ran 02-tests/task-01. Found failure on tests/gnrc_rpl_srh. Posted RIOT-OS/RIOT#12436.
This is my first time running many of these tests, so please bear with my questions.
The instructions for spec 02-tests say to run tests/compile_and_test_for_board. The readme for that test says that it runs only automated tests. For the purpose of 02-tests/01, is it necessary to run any/all manual tests? If so, is there an easy way to determine which tests are expected to be run this way?
For 02-test, tasks 02 and 03: How does one decide on a reasonable subset of tests?
For 01-ci/01, I ran a few of these. However, I am missing some toolchains and the tests look like they will take a significant amount of time to run. Looking at the last release, maybe some Docker image is the way to go. If someone has an environment set up already for this, that would be helpful.
I'll run task 99 on the following BOARDS, might take a day to get the results:
nucleo-f091rc b-l072z-lrwan1 nucleo-l152re nucleo-f103rb nucleo-l432kc nucleo-f746zg frdm-k64f b-l475e-iot01a nrf52840dk pba-d-01-kw2x nucleo-f207zg nrf52dk nucleo-l073rz nucleo-f767zi nucleo-l496zg frdm-kw41z nucleo-f446re nucleo-f303k8
Ran tasks 01-2, 01-3 and 01-4 with success.
I ran task 99 on arduino-mega2560 ek-lm4f120xl frdm-kw41z mulle nucleo-f103rb stm32f3discovery cc2650-launchpad esp32-wroom-32 frdm-k64f msba2 nrf52dk pba-d-01-kw2x sltb001a using `BUILD_IN_DOCKER=1` and the `compile_and_test_for_board.py` script.
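For reference, one such per-board run could be launched roughly as follows. This is a hedged sketch: the positional arguments and result directory are assumptions, so check `dist/tools/compile_and_test_for_board/README.md` in the RIOT tree for the script's real interface.

```shell
# Sketch of a single-board run (argument order and result path assumed,
# verify against the script's README before relying on this).
export BUILD_IN_DOCKER=1
python3 dist/tools/compile_and_test_for_board/compile_and_test_for_board.py \
    . nrf52dk results/nrf52dk
```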
Apart from some local machine handling, I had to disable reset on `esp32-wroom-32`, as it gets stuck when testing, and use `esptool.py` for flashing. Otherwise I am now directly running the RC1.
The local configuration is explained in paper-ci and points to the aggregated `failuresummary.md`.
I will write a readable summary in this comment.
Tests on `arduino-mega2560` and `mulle` failed completely, due to hardware/local setup issues I think. These errors should be ignored; I will see what I can do.
Due to not being able to reset `esp32-wroom-32`, many tests in RIOT fail for this board, as they rely on reset working. I publish the results just for reference; they should not really be considered right now, but they can support changing this in the future.
All tests are currently run, even the `BLACKLISTED` ones, so failures in the following tests are expected when run automatically. I did not run them manually, so consider them untested on these boards.
```
git grep -l 'TEST_ON_CI_BLACKLIST += all' | sed 's|/Makefile||'
examples/suit_update
tests/gnrc_ipv6_ext
tests/gnrc_ipv6_ext_frag
tests/gnrc_rpl_srh
tests/gnrc_sock_dns
tests/gnrc_tcp
tests/lwip
tests/pkg_fatfs_vfs
```
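As a self-contained illustration of how that listing is produced: `git grep -l` prints the paths of tracked Makefiles containing the blacklist marker, and `sed` trims the trailing `/Makefile` to leave the application directory. The repo layout below is a made-up stand-in, not the real RIOT tree.

```shell
# Demo in a throwaway repo (hypothetical app names, not real RIOT apps).
tmp=$(mktemp -d)
cd "$tmp"
git init -q .
mkdir -p tests/gnrc_tcp tests/other_app
printf 'TEST_ON_CI_BLACKLIST += all\n' > tests/gnrc_tcp/Makefile
printf '# no blacklist marker here\n' > tests/other_app/Makefile
git add .
# List blacklisted applications, stripping the /Makefile suffix:
git grep -l 'TEST_ON_CI_BLACKLIST += all' | sed 's|/Makefile||'
# prints: tests/gnrc_tcp
```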
129 tests failed, but most failures are due to the tests not being supported by the RIOT testing method.
No failures (except in blacklisted tests).
I will run the automated tests on `samr21-xpro` and `iotlab-m3` using `BUILD_IN_DOCKER=1`.
@smlng Can you take care of some of the interop tests (at least the raspi ones)?
@MrKevinWeiss sure! As long as you merge it on the timer test 😉
My machine crashed. It's a shitty machine, but I'll have to find a replacement or re-start the tests.
Task #01 - ICMPv6 echo on iotlab-m3 with three hops (static route)
Task #02 - UDP on iotlab-m3 with three hops (static route)
Task #03 - ICMPv6 echo on iotlab-m3 with three hops (RPL route)
Task #04 - UDP on iotlab-m3 with three hops (RPL route)
The automated tests on `iotlab-m3` and `samr21-xpro` failed only for `tests/riotboot` and the ones with `TEST_ON_CI_BLACKLIST` when using `BUILD_IN_DOCKER=1`. I did not run them manually.
@cladmi, so this is 02-tests for those boards? Are the riotboot failures known issues, or do we need to report/investigate?
I opened #12446 to address this, it is a known issue.
Got it, thanks. I updated the issue comment.
@aabadie, thanks for running 01-ci/01. What was the outcome?
@cladmi, I compared the results to those reported for 2019.07. I see a lot of repeat failures, but I also notice new issues:
Are the failures with riotboot addressed by RIOT-OS/RIOT#12446? Did you test ek-lm4f120xl for 2019.07? Are the shell and periph_rtc failures concerning?
I tested `tests/shell/` on the `nrf52dk` and it passes for me. I think the errors may be an infrastructure issue rather than a true test failure; however, I don't have the msba2 or the ek-lm4f120xl.
Can anyone tell why the RC1 RIOT_VERSION is `2020.01-devel-HEAD` when using `BUILD_IN_DOCKER=1`?

With `BUILD_IN_DOCKER=1` on `native`:

```
main(): This is RIOT! (Version: 2020.01-devel-HEAD)
```

With `BOARD=nucleo-l433rc`, `BUILD_IN_DOCKER=0`:

```
main(): This is RIOT! (Version: 2020.01-devel-32-g9bc60)
```
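The `-32-g9bc60` suffix is classic `git describe` output, so the version string presumably comes from `git describe` at build time; a plausible explanation (an assumption, not confirmed here) is that inside the Docker container the repository's tags or history are not visible, forcing a fallback to a fixed `-devel-HEAD` string. A small self-contained demo of the `git describe` behaviour, in a throwaway repo:

```shell
# Demo of git-describe version suffixes (hypothetical temp repo;
# RIOT's actual Makefile logic may differ).
set -e
tmp=$(mktemp -d)
git init -q "$tmp/repo"
cd "$tmp/repo"
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m 'first'
git tag 2019.10-RC1
git describe --tags          # prints: 2019.10-RC1
git -c user.name=demo -c user.email=demo@example.com commit -q --allow-empty -m 'second'
git describe --tags          # prints: 2019.10-RC1-1-g<short-hash>
# If the tag (or the .git metadata) is not visible, e.g. in a container
# that only sees a shallow checkout, describe fails and a build system
# must fall back to a fixed string such as "2020.01-devel-HEAD":
git describe --tags --exact-match 2>/dev/null || echo 'fallback needed'
```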
Actually I don't think I did this right... Maybe I bug @miri64 tomorrow about this...
Fixed, looks good.
Regarding the failure for 02-tests/01, fixes have been applied to the release branch. However, there is still inconsistency in some results for `tests/gnrc_rpl_srh`. We should still expect failures here, particularly for test_forward_uncomp, test_forward_uncomp_not_first_ext_hdr, and test_multicast_addr. See RIOT-OS/RIOT#12442 for details. However, it is possible to run a manual test to verify the fix.
Clarification: the manual test had nothing to do with `gnrc_rpl_srh`. It was required to make sure that the regression fixed in https://github.com/RIOT-OS/RIOT/pull/12442 doesn't regress the bug fixed in https://github.com/RIOT-OS/RIOT/pull/11970 (which introduced the first regression). So no: the manual tests did not verify the fix, but made sure the fix did not break the desired behavior of another fix ;)
Thanks for the clarification, @miri64. I edited my misstatement out of that comment.
@miri64 It seems like 10-07 step 3 does not work anymore. When I try:
```
GNRC_NETIF_NUMOF=2 USEMODULE=socket_zep \
TERMFLAGS="-z [::]:17755 tap0" \
make -C examples/gnrc_networking clean all term
```
I get:
```
/home/kevinweiss/WorkingDirectory/RIOT/examples/gnrc_networking/../../Makefile.include:643: recipe for target 'term' failed
```
0% packet loss
This issue lists the status of all tests for the Release Candidate 1 of the 2019.10 release.
Specs tested: