Closed by github-actions[bot] 4 months ago
Follow-up from https://github.com/osrf/buildfarm-tools/issues/39#issuecomment-2047794181
Debbuilders can't find the python3-distutils package, and it seems it is currently not available for Noble
Reference build:
In iceoryx_posh output:
[100%] Built target ext_cpptoml
error: patch failed: CMakeLists.txt:1
error: CMakeLists.txt: patch does not apply
CMake Warning at cmake/cpptoml/CMakeLists.txt:81 (message):
CMake step [patch] for 'cpptoml-build' failed! Error code: 1! Build of
'cpptoml-build' might fail
A possible culprit: https://github.com/eclipse-iceoryx/iceoryx/pull/2232
Reference build: https://build.osrfoundation.org/job/gz_sim-ci-main-jammy-amd64/63/
Test regressions:
None of the tests have changed in the last 2 years:
This issue was reported in: https://github.com/gazebosim/gz-sim/issues/2371
Reference build: https://build.ros2.org/view/Rci/job/Rci__nightly-release_ubuntu_jammy_amd64/805/
Build timed out after 300 minutes
I've been checking the log time differences between build 804 (passing) and 805 (failing):
Running this query against our database:
SELECT job_name, last_success_time
FROM (
SELECT bs.job_name, MAX(bs.build_datetime) AS last_success_time
FROM build_status bs
INNER JOIN server_status ss ON bs.job_name = ss.job_name
WHERE bs.status = 'SUCCESS'
GROUP BY bs.job_name
) AS last_success_times
ORDER BY last_success_time ASC;
This gets you a sorted list of the last time each job passed.
And this query returns the list of job_names that don't have any SUCCESS build:
SELECT bs.job_name
FROM build_status bs
INNER JOIN server_status ss ON bs.job_name = ss.job_name
WHERE bs.job_name NOT IN (
SELECT job_name
FROM build_status
WHERE status = 'SUCCESS'
)
GROUP BY bs.job_name;
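Both queries can be exercised against a toy database to check their logic. The sketch below uses an in-memory SQLite database; the schema and sample rows are assumptions inferred from the queries above, not the real buildfarm database.

```python
import sqlite3

# Assumed schema, inferred from the two queries in this issue.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE build_status (job_name TEXT, status TEXT, build_datetime TEXT);
CREATE TABLE server_status (job_name TEXT);
INSERT INTO server_status VALUES ('job_a'), ('job_b');
INSERT INTO build_status VALUES
  ('job_a', 'SUCCESS', '2024-01-01'),
  ('job_a', 'FAILURE', '2024-02-01'),
  ('job_b', 'FAILURE', '2024-02-01');
""")

# Query 1: last successful build per job, oldest first.
last_success = conn.execute("""
SELECT job_name, last_success_time
FROM (
  SELECT bs.job_name, MAX(bs.build_datetime) AS last_success_time
  FROM build_status bs
  INNER JOIN server_status ss ON bs.job_name = ss.job_name
  WHERE bs.status = 'SUCCESS'
  GROUP BY bs.job_name
) AS last_success_times
ORDER BY last_success_time ASC;
""").fetchall()
print(last_success)  # [('job_a', '2024-01-01')]

# Query 2: jobs that have never had a SUCCESS build.
never_green = conn.execute("""
SELECT bs.job_name
FROM build_status bs
INNER JOIN server_status ss ON bs.job_name = ss.job_name
WHERE bs.job_name NOT IN (
  SELECT job_name FROM build_status WHERE status = 'SUCCESS'
)
GROUP BY bs.job_name;
""").fetchall()
print(never_green)  # [('job_b',)]
```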
As part of our efforts to increase the greenness metrics of the buildfarms, I would like us to take a closer look and get those jobs back to green.
Note: I skipped Rci jobs because we migrated to noble today, so we expect some change in terms of greenness (and we also lose track of test regressions in the internal database).
Reference build: https://ci.ros2.org/job/nightly_linux_coverage/2344/
It started failing when the job configuration switched to Noble as the Ubuntu distro.
The coverage analysis section was the last log output before the agent got killed:
Processing build/test_tf2/CMakeFiles/test_message_filter.dir/test/test_message_filter.cpp.gcda
geninfo: WARNING: using JSON module "JSON::PP" - which is much slower than some alternatives. Consider installing one of JSON::XS or Cpanel::JSON::XS
geninfo: WARNING: /usr/include/c++/13/bits/atomic_base.h:501: unexecuted block on non-branch line with non-zero hit count. Use "geninfo --rc geninfo_unexecuted_blocks=1 to set count to zero.
geninfo: ERROR: mismatched end line for _ZN31MessageFilter_noTransforms_Test8TestBodyEv at /home/jenkins-agent/workspace/nightly_linux_coverage/ws/src/ros2/geometry2/test_tf2/test/test_message_filter.cpp:79: 79 -> 100
(use "geninfo --ignore-errors mismatch ..." to bypass this error)
Traceback (most recent call last):
File "/home/jenkins-agent/workspace/nightly_linux_coverage/run_ros2_batch.py", line 32, in <module>
sys.exit(main())
^^^^^^
File "/home/jenkins-agent/workspace/nightly_linux_coverage/ros2_batch_job/__main__.py", line 154, in main
return run(args, build_function, blacklisted_package_names=blacklisted_package_names)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/jenkins-agent/workspace/nightly_linux_coverage/ros2_batch_job/__main__.py", line 689, in run
rc = build_function(args, job, colcon_script)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/jenkins-agent/workspace/nightly_linux_coverage/ros2_batch_job/__main__.py", line 386, in build_and_test
process_coverage(args, job)
File "/home/jenkins-agent/workspace/nightly_linux_coverage/ros2_batch_job/__main__.py", line 280, in process_coverage
subprocess.run(cmd, check=True)
File "/usr/lib/python3.12/subprocess.py", line 571, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['lcov', '--capture', '--directory', 'build', '--output', 'build/coverage.info']' returned non-zero exit status 1.
Build step 'Execute shell' marked build as failure
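The geninfo output itself suggests a bypass flag. A minimal sketch of how the lcov command from the traceback could pass it (the cmd list is the one from the log; whether masking the mismatch error is acceptable, versus fixing the gcov/lcov version mismatch on Noble, is a separate question):

```python
import subprocess

# Base lcov command, exactly as seen in the CalledProcessError above.
cmd = ['lcov', '--capture', '--directory', 'build',
       '--output', 'build/coverage.info']

# Hypothetical workaround: append the flag geninfo suggests in its error
# message to downgrade the "mismatched end line" error.
cmd += ['--ignore-errors', 'mismatch']

# subprocess.run(cmd, check=True)  # as called in process_coverage()
```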
Also, we had 3 test regressions:
/usr/bin/colcon test-result --test-result-base "build"
build/ros2topic/pytest.xml: 9 tests, 0 errors, 1 failure, 0 skipped^M
build/rosidl_typesupport_introspection_tests/Testing/20240418-0558/Test.xml: 1 test, 0 errors, 1 failure, 0 skipped
build/rosidl_typesupport_introspection_tests/test_results/rosidl_typesupport_introspection_tests/test_multi_nested_message_introspection.gtest.xml: 6 tests, 0 errors, 1 failure, 0 skipped
Reported in: https://github.com/ros2/ci/issues/770
Reference build: https://ci.ros2.org/view/nightly/job/nightly_linux_release/3048/
Output:
00:13:29.434 Err:23 http://archive.ubuntu.com/ubuntu noble/main amd64 libicu74 amd64 74.2-1ubuntu3
00:13:29.434 Undetermined Error [IP: 185.125.190.36 80]
00:13:30.420 Err:24 http://archive.ubuntu.com/ubuntu noble/main amd64 libxml2 amd64 2.9.14+dfsg-1.3ubuntu3
00:13:30.420 Could not connect to archive.ubuntu.com:80 (185.125.190.36), connection timed out Could not connect to archive.ubuntu.com:80 (185.125.190.39), connection timed out Could not connect to archive.ubuntu.com:80 (91.189.91.83), connection timed out [IP: 185.125.190.36 80]
...
00:13:30.444 E: Failed to fetch http://archive.ubuntu.com/ubuntu/pool/universe/w/wheel/python3-wheel_0.42.0-2_all.deb Unable to connect to archive.ubuntu.com:80: [IP: 185.125.190.36 80]
00:13:30.444 E: Failed to fetch http://archive.ubuntu.com/ubuntu/pool/universe/p/python-pip/python3-pip_24.0%2bdfsg-1ubuntu1_all.deb Unable to connect to archive.ubuntu.com:80: [IP: 185.125.190.36 80]
00:13:30.444 E: Unable to fetch some archives, maybe run apt-get update or try with --fix-missing?
00:13:30.444 The command '/bin/sh -c apt-get update && apt-get install --no-install-recommends -y build-essential ccache cmake pkg-config python3-empy python3-pip python3-setuptools python3-vcstool' returned a non-zero code: 100
Seems like a transient network issue.
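If the failure is transient, one mitigation (an assumption about what might help, not what the job currently does) is to let apt retry failed fetches via the real `Acquire::Retries` option. Sketch only; the package list is the one from the failing Docker step in the log:

```shell
# Retry each failed fetch up to 3 times before giving up.
apt-get update && \
  apt-get install --no-install-recommends -y \
    -o Acquire::Retries=3 \
    build-essential ccache cmake pkg-config \
    python3-empy python3-pip python3-setuptools python3-vcstool
```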
Previous log #39