traversaro opened this issue 4 years ago
CUDA Toolkit ( https://askubuntu.com/questions/1230645/when-is-cuda-gonna-be-released-for-ubuntu-20-04 ) required by on-the-fly recognition or iol (@xEnVrE may have more details)
Looking into some old notes written by @xEnVrE, I noticed that back then another dependency was cuDNN, which at the moment is also not available for Ubuntu 20.04 (@xEnVrE feel free to reply and let us know whether cuDNN is still a dependency for HSP-related software, thanks!).
A quick search for cuDNN in robotology revealed the following software whose docs list a dependency on cuDNN:
Hi @traversaro, sorry for the late response.
I will try to clarify a bit the possible links between IOL and CUDA. The last time we set up an iCub with the superbuild was for the CTS in December 2019. At that time, we thought it was a good idea to show some preliminary results on the integration between IOL and Mask R-CNN (a network for object segmentation), which mandatorily requires CUDA for a decent experience. However, this combo is not available in any official channel of robotology. Indeed, the original IOL will still be able to run using Caffe, which must be compiled anyway (unfortunately I don't remember the exact reason for that, i.e. why we don't use a pre-compiled version of it). As regards `himrep`, which is part of the IOL pipeline, the dependency on CUDA and cuDNN, for `caffeCoder`, is only optional (but of course advised for better performance).
I would also like to mention that lately the software for stereo vision on iCub, see stereo-vision, has been updated by @damianomal in order to, among several features, support CUDA for better performance. However, I am not sure if it also requires cuDNN.
That said, in the future it is very likely that we will have consolidated pipelines, as part of the HSP-related software, involving software for object detection, object segmentation and object pose estimation that uses deep learning tools. Some of this software, while being SoA, is not actively updated and might end up pinned to old versions of the most popular frameworks such as tensorflow or pytorch. For example, using tensorflow 1.x on Ubuntu 20.04 seems to be a bit tricky, see e.g. here.
Also, I would like to point out that the only officially supported version of CUDA for Ubuntu 20.04 is >= 11. However, again, it is very likely that many tools that we are using at the moment are not compatible with it yet. The typical solution to all these problems is to install older versions of both CUDA and cuDNN, even if they are not officially supported on a more recent version of Ubuntu.
I think it is very complicated to find a general answer; it probably varies from package to package (that is why we sometimes end up using virtual environments, each with a different version of CUDA).
I think we should also invite @Arya07, @damianomal and @GiuliaP to the discussion.
Thanks a lot @xEnVrE, that definitely clarifies the situation!
> (that is why we end up using virtual environments, each using different versions of CUDA sometimes).
By virtualenv do you mean Python's virtualenv or something else? As far as I understand (but I am definitely not a CUDA expert!), on Ubuntu CUDA needs to be installed system-wide using the .deb from https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/, i.e. via apt/dpkg, so I am not sure how you can have different CUDA versions in different virtualenvs, but most probably I am missing something.
> With virtualenv you mean python's virtualenv or something else?
Python's virtualenvs.
> As far as I understand (but I am definitely not a CUDA expert!) on Ubuntu CUDA needs to be installed in the system using the .deb from https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/, so it needs to be installed in the system via apt/dpkg, so I am not sure how you can have different CUDA versions in different virtualenv, but most probably I am missing something.
As you said, there is the possibility of using debs. Even in this case, there are some post-installation actions to be performed manually (see here), which basically consist in extending `PATH` and `LD_LIBRARY_PATH` to point to the CUDA installation path.
In any case, there is another way to install CUDA, which consists in downloading specific runfiles, publicly available on NVIDIA's website, that extract the files required for CUDA into a directory specified by the user. That is how you can have many versions of CUDA coexisting in the same system without any kind of issue (even IIT servers 17, 18 and 19 adopt this solution). Of course, environment variables must be set accordingly (and you can have several Python virtual environments using different versions of CUDA thanks to this solution).
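A minimal sketch of what the runfile-based setup looks like (the `/opt/cuda-10.2` path is a hypothetical example; use whatever directory you passed to the runfile):

```shell
# Each CUDA version extracted by a runfile lives in its own directory and is
# selected purely through environment variables; re-exporting them (e.g. from
# a virtualenv's activate script) switches the CUDA version in use.
export CUDA_HOME=/opt/cuda-10.2   # hypothetical runfile extraction directory
export PATH="$CUDA_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$CUDA_HOME/lib64${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

Each virtualenv can export a different `CUDA_HOME`, which is how several CUDA versions coexist on the same machine.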
Thanks @xEnVrE, that definitely clears up the situation, at least for me. Just to understand: in these virtual environments, are you using YARP? How do you compile/install it? Do you have a version of YARP for each virtual env, or do you have it compiled/installed somewhere outside the virtual env and then activate it by properly setting the environment variables? We had some discussion on how to handle the installation of Python bindings compiled by CMake projects in virtual envs (see for example https://github.com/robotology/gym-ignition/issues/244), so understanding if YARP is indeed used in virtual envs would be interesting.
> in these virtual environments, are you using YARP?
Yes, it happens sometimes. I typically compile YARP outside the virtual env and then change the Python path inside the venv to point to the right place.
One thing that I noticed is that if I try to compile the bindings only inside the venv, e.g.
```
cmake -DCREATE_PYTHON=True $PATH_TO_YARP_SRC/bindings/ $BUILD_DIR
```
the build system uses the version of Python installed system-wide and does not consider the version of Python used in the virtual environment. I don't know if this was already known or if this was the intended behavior.
> the build system uses the version of Python installed system-wide and does not consider the version of Python used in the virtual environment. I don't know if this was already known or if this was the intended behavior.
Thanks, this depends on the CMake module used to find Python. With the `FindPython3` module that ships with CMake 3.16 ( https://cmake.org/cmake/help/v3.16/module/FindPython3.html#hints ) you can specify the policy that you prefer w.r.t. that with the `Python3_FIND_VIRTUALENV` variable.
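For example (a sketch, not tested against any specific YARP version; `$PATH_TO_YARP_SRC` and `$BUILD_DIR` are the same placeholders used above, and the venv path is hypothetical):

```shell
# Configure the YARP Python bindings so that FindPython3 only considers the
# interpreter of the currently activated virtualenv (CMake >= 3.15).
source "$HOME/venvs/yarp/bin/activate"   # hypothetical virtualenv location
cmake -S "$PATH_TO_YARP_SRC/bindings" -B "$BUILD_DIR" \
      -DCREATE_PYTHON=True \
      -DPython3_FIND_VIRTUALENV=ONLY
```

With `ONLY`, FindPython3 ignores the system interpreter entirely instead of merely preferring the virtualenv one.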
> iCubOS not running on Ubuntu 20.04
Fixed in https://github.com/icub-tech-iit/documentation/pull/84 .
Hi everyone, I wanted to check how the situation has evolved since I first opened the issue. Out of the original open points:
Given all of this, I think that dropping Ubuntu 18.04 is not viable. However, there are some experimental subprojects (bipedal-locomotion-framework) that have already dropped support for Ubuntu 18.04 due to its quite old dependencies (see https://github.com/robotology/robotology-superbuild/pull/654). At the moment the version of this project is fixed for ProjectTagsStable, but I would prefer to remove this fixed version as soon as possible. Possible solutions are:
The solution that I prefer is 2., because it is the simplest one to support. If for any reason we need some parts of ROBOTOLOGY_ENABLE_DYNAMICS, such as iDynTree or whole-body-estimators (which includes the wholebodydynamics device), on Ubuntu 18.04, we can simply add them to ROBOTOLOGY_ENABLE_CORE as well.
I would also like to add that using conda-forge dependencies it is possible to use the robotology-superbuild in any distro with glibc >= 2.12, so even Ubuntu 12.04. However, this way of installing packages is still not tested enough to rely on it for Ubuntu 18.04 compatibility.
@vtikha @Nicogene at the robot-bazaar level, given that the Docker images also enable ROBOTOLOGY_ENABLE_DYNAMICS, do we still need Ubuntu 18.04 compatibility?
cc @xEnVrE correct me if I got something wrong
Just wanted to add that CUDA 11 is supported on Ubuntu 20.04. I guess older versions, which we still tend to use a lot, will never be officially supported. Nevertheless, lately we started experimenting a bit with Docker and Singularity images, hence our typical scenario might become the following:
* `nvidia/cuda:10.x-devel-ubuntu18.04` (with `ROBOTOLOGY_ENABLE_DYNAMICS=OFF`) - if we need to deploy a machine learning module getting/sending data/images via YARP ports (this is the most typical scenario)

Hence, solution 2. would be doable for our needs too!
Changed the title, as Ubuntu 18.04 will instead be supported for a long time using conda-provided dependencies.
* JetPack SDK stuck to Ubuntu 18.04 ( https://forums.developer.nvidia.com/t/when-will-jetpack-move-to-ubuntu-20-04/142517 )
On this, the last comment on https://forums.developer.nvidia.com/t/when-will-jetpack-move-to-ubuntu-20-04/142517/51 from Nvidia as of 2021/04/20 is:
> Sorry to keep you waiting for the upgrade - as discussed above, updating JetPack to Ubuntu 20.04 also requires an update to Linux kernel 5.x. This is a massive undertaking to apply/verify all of the changes needed to support the Tegra devices. As discussed above with @gtj, we are working to mitigate this migration effort needed for future updates.
In https://github.com/robotology/event-driven/issues/138 the possibility of dropping Ubuntu 18.04 (and Debian Buster) compatibility for ROBOTOLOGY_ENABLE_EVENT_DRIVEN is discussed, as it was already broken, probably because most users are on Ubuntu 20.04.
> In robotology/event-driven#138 the possibility of dropping Ubuntu 18.04 (and Debian Buster) compatibility for ROBOTOLOGY_ENABLE_EVENT_DRIVEN is discussed, as it was already broken, probably because users are using Ubuntu 20.04.
In that case, the issue was fixed by @arrenglover in https://github.com/robotology/event-driven/pull/139, so for the time being ROBOTOLOGY_ENABLE_EVENT_DRIVEN still supports Ubuntu 18.04 with apt dependencies.
So, now the Ubuntu 18.04/apt Stable CI jobs are not currently working. We need to understand if we are ready to drop support for Ubuntu 18.04/apt for ROBOTOLOGY_ENABLE_DYNAMICS as discussed in this issue, or not. The last open points seem to be:
I will take care of OP2, @vtikha @Nicogene do you have any idea on OP1?
Unfortunately, on the DIC side there are still a lot of machines using Ubuntu 18.04. For this reason, I think we need to revise our plans for 2021.05. In particular, I think we should support `ROBOTOLOGY_ENABLE_DYNAMICS` on Ubuntu 18.04/apt in 2021.05. This is done by:

* [x] Temporarily moving `bipedal-locomotion-framework` to `ROBOTOLOGY_ENABLE_DYNAMICS_FULL_DEPS` (which will not be supported on Ubuntu 18.04) until the Ubuntu 18.04 situation is solved. This was discussed with @S-Dafarra (https://github.com/robotology/robotology-superbuild/pull/739).
* [x] Fix YARP_telemetry to work on Ubuntu 18.04 (https://github.com/robotology/yarp-telemetry/pull/134).
This decision, taken for 2021.05 and subsequent releases (dropping 18.04 support for the ROBOTOLOGY_ENABLE_DYNAMICS_FULL_DEPS component), was unfortunately not mentioned in the release notes, so probably not everyone was fully aware of it. fyi @paolo-viceconte @isorrentino @S-Dafarra @GiulioRomualdi
The 18.04 apt CI just failed again due to YARP_telemetry:
```
2021-12-14T15:16:18.1694329Z [ 60%] Building CXX object src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/yarp_plugin_telemetryDeviceDumper.cpp.o
2021-12-14T15:16:19.4857060Z In file included from /home/runner/work/robotology-superbuild/robotology-superbuild/src/YARP_telemetry/src/telemetryDeviceDumper/TelemetryDeviceDumper.h:27:0,
2021-12-14T15:16:19.4860110Z from /home/runner/work/robotology-superbuild/robotology-superbuild/build/src/YARP_telemetry/src/telemetryDeviceDumper/yarp_plugin_telemetryDeviceDumper.cpp:10:
2021-12-14T15:16:19.4863534Z /home/runner/work/robotology-superbuild/robotology-superbuild/src/YARP_telemetry/src/libYARP_telemetry/src/yarp/telemetry/experimental/BufferManager.h: In member function ‘std::__cxx11::string yarp::telemetry::experimental::BufferManager<T>::fileIndex() const’:
2021-12-14T15:16:19.4866536Z /home/runner/work/robotology-superbuild/robotology-superbuild/src/YARP_telemetry/src/libYARP_telemetry/src/yarp/telemetry/experimental/BufferManager.h:497:22: error: ‘put_time’ is not a member of ‘std’
2021-12-14T15:16:19.4867931Z time << std::put_time(&tm, m_bufferConfig.file_indexing.c_str());
2021-12-14T15:16:19.4868489Z ^~~~~~~~
2021-12-14T15:16:19.6804825Z make[5]: *** [src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/yarp_plugin_telemetryDeviceDumper.cpp.o] Error 1
2021-12-14T15:16:19.6807578Z make[4]: *** [src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/all] Error 2
2021-12-14T15:16:19.6821726Z src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/build.make:75: recipe for target 'src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/yarp_plugin_telemetryDeviceDumper.cpp.o' failed
2021-12-14T15:16:19.6824317Z CMakeFiles/Makefile2:214: recipe for target 'src/telemetryDeviceDumper/CMakeFiles/yarp_telemetryDeviceDumper.dir/all' failed
2021-12-14T15:16:19.6825371Z make[3]: *** [all] Error 2
2021-12-14T15:16:19.6826472Z make[2]: *** [src/YARP_telemetry/CMakeFiles/YCMStamp/YARP_telemetry-build] Error 2
2021-12-14T15:16:19.6827328Z make[1]: *** [CMakeFiles/YARP_telemetry.dir/all] Error 2
2021-12-14T15:16:19.6827867Z make: *** [all] Error 2
2021-12-14T15:16:19.6828524Z Makefile:135: recipe for target 'all' failed
2021-12-14T15:16:19.6829896Z CMakeFiles/YARP_telemetry.dir/build.make:85: recipe for target 'src/YARP_telemetry/CMakeFiles/YCMStamp/YARP_telemetry-build' failed
2021-12-14T15:16:19.6831241Z CMakeFiles/Makefile2:2114: recipe for target 'CMakeFiles/YARP_telemetry.dir/all' failed
2021-12-14T15:16:19.6832160Z Makefile:100: recipe for target 'all' failed
```
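For what it's worth, this kind of error usually means `std::put_time` is not visible in the translation unit; it lives in `<iomanip>` and is only provided by libstdc++ from GCC 5 onwards. A quick probe to check the local toolchain (a sketch; I have not verified that this matches the actual fix applied in yarp-telemetry):

```shell
# Compile a minimal program using std::put_time to check whether the
# default compiler/libstdc++ provides it when <iomanip> is included.
cat > /tmp/put_time_probe.cpp <<'EOF'
#include <ctime>
#include <iomanip>
#include <sstream>

int main() {
    std::tm tm{};
    std::ostringstream time;
    time << std::put_time(&tm, "%Y_%m_%d");
    return 0;
}
EOF
if g++ /tmp/put_time_probe.cpp -o /tmp/put_time_probe 2>/dev/null; then
    PUT_TIME_STATUS="available"
else
    PUT_TIME_STATUS="NOT available"
fi
echo "std::put_time: $PUT_TIME_STATUS"
```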
~3.5 years after the 18.04 release and 1.5 years after the 20.04 release, it probably does not make a lot of sense to put effort into maintaining 18.04. If someone wants to run recent software on old distros, they can always use conda or similar package managers. My proposal is to drop 18.04 apt support for 2022.02, at least for the ROBOTOLOGY_ENABLE_DYNAMICS component.
> My proposal is to drop 18.04 apt support for 2022.02, at least for ROBOTOLOGY_ENABLE_DYNAMICS component.
👍🏻
Related announcement done in https://github.com/robotology/community/discussions/568 .
See https://github.blog/changelog/2022-08-09-github-actions-the-ubuntu-18-04-actions-runner-image-is-being-deprecated-and-will-be-removed-by-12-1-22/ : if we want to keep some stealth CI jobs with 18.04, we need to use Docker rather than the `ubuntu-18.04` GitHub Actions images.
> JetPack SDK stuck to Ubuntu 18.04 ( https://forums.developer.nvidia.com/t/when-will-jetpack-move-to-ubuntu-20-04/142517 )
JetPack 5.0.2 is now based on Ubuntu 20.04 : https://developer.nvidia.com/embedded/jetpack .
JetPack 5.0.2 has also been released for the carrier that we currently use in ergoCub and R1: https://connecttech.com/product/rogue-carrier-nvidia-jetson-agx-xavier/#tab-downloads, https://connecttech.com/ftp/Drivers/L4T-Release-Notes/Jetson-AGX-Xavier/AGX-35.1.0.pdf .
We have to wait for the JetPack to be released also for the Jetson on the new iCub head (see ref); then we can safely drop Ubuntu 18.04.
> We have to await the jetpack to be released also for the jetson on the new icub head (see ref) and then we can safely drop Ubuntu 18.04
I may be wrong, but from https://connecttech.com/product/quark-carrier-nvidia-jetson-xavier-nx/ it seems that the JetPack 5 based on Ubuntu 20.04 was released for that board on the 11th of November!
We should first test it with the Basler cameras, because this point in the release notes is quite scary.
cc @pattacini @maggia80
Cool, I just realized that the 18.04 EOL is ~two months away: https://ubuntu.com//blog/18-04-end-of-standard-support . I do not think we have ever reached the point of still supporting an EOLed Ubuntu LTS release. :D
In the next release of iDynTree the IDYNTREE_USES_IRRLICHT option will not work on Ubuntu 18.04 (see https://github.com/robotology/idyntree/pull/1071), so we need a hack like the one in https://github.com/robotology/robotology-superbuild/pull/1406 to continue supporting Ubuntu 18.04.
More problems due to the fact that we did not drop 18.04 support: https://github.com/robotology/robotology-superbuild/pull/1459 .
> We have to await the jetpack to be released also for the jetson on the new icub head (see ref) and then we can safely drop Ubuntu 18.04
Connect Tech should release a patch for the BSP relatively soon to add JetPack 5 support for the Basler camera we are using 🤞🏻
> More problems due to the fact that we did not drop 18.04 support:
We are in the endgame for this, unstable deps are failing with:
```
2023-11-15T16:41:45.1581271Z -- Check size of long double
2023-11-15T16:41:45.1581920Z -- Check size of long double - done
2023-11-15T16:41:45.1583438Z CMake Error in /__w/robotology-superbuild/robotology-superbuild/build/src/YARP/CMakeFiles/CMakeFiles/CMakeTmp/CMakeLists.txt:
2023-11-15T16:41:45.1585982Z Target "cmTC_692c1" requires the language dialect "CXX20" . But the
2023-11-15T16:41:45.1586971Z current compiler "GNU" does not support this, or CMake does not know the
2023-11-15T16:41:45.1587727Z flags to enable it.
2023-11-15T16:41:45.1588027Z
2023-11-15T16:41:45.1588034Z
2023-11-15T16:41:45.1588355Z CMake Error at cmake/YarpSystemCheck.cmake:55 (try_run):
2023-11-15T16:41:45.1589285Z Failed to generate test project build system.
2023-11-15T16:41:45.1589933Z Call Stack (most recent call first):
2023-11-15T16:41:45.1590667Z cmake/YarpSystemCheck.cmake:82 (check_floating_point_is_iec559)
2023-11-15T16:41:45.1591459Z CMakeLists.txt:45 (include)
2023-11-15T16:41:45.1591812Z
2023-11-15T16:41:45.1591820Z
2023-11-15T16:41:45.1592179Z -- Configuring incomplete, errors occurred!
2023-11-15T16:41:45.1593412Z See also "/__w/robotology-superbuild/robotology-superbuild/build/src/YARP/CMakeFiles/CMakeOutput.log".
2023-11-15T16:41:45.2662497Z [81/240] Performing update step for 'ICUB'
```
This is happening intentionally due to https://github.com/robotology/yarp/pull/3039 . @Nicogene do we have some issue/documentation on the boards that are still in use and that are stuck to 18.04 ?
> This is happening intentionally due to robotology/yarp#3039 . @Nicogene do we have some issue/documentation on the boards that are still in use and that are stuck to 18.04 ?
The only references I think are in the upgrade kit doc page: https://icub-tech-iit.github.io/documentation/upgrade_kits/head_4k/support/
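As a side note, the configure failure above boils down to a compiler check: YARP now requires C++20, which the default GCC 7 on Ubuntu 18.04 cannot provide. A quick way to reproduce the check locally (a sketch, independent of YARP's actual `YarpSystemCheck` logic):

```shell
# Try to compile an empty program with -std=c++20; this fails with Ubuntu
# 18.04's default GCC 7, which is why the YARP configure step errors out.
printf 'int main() { return 0; }\n' > /tmp/cxx20_probe.cpp
if g++ -std=c++20 /tmp/cxx20_probe.cpp -o /tmp/cxx20_probe 2>/dev/null; then
    CXX20_STATUS="supported"
else
    CXX20_STATUS="NOT supported"
fi
echo "C++20: $CXX20_STATUS"
```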
Ok, I got a bit of information on Teams from @Nicogene:
For both, there is the problem that Nvidia will probably never provide an update to Ubuntu 22.04. For that, there is ongoing work (but nothing ready yet) to update to the Jetson Orin NX.
So in the short term the biggest problem is on iCub3/iRonCub3, I will align with the team working on that.
I do not know the situation on R1.
> So in the short term the biggest problem is on iCub3/iRonCub3, I will align with the team working on that.
I talked with @gabrielenava. On iRonCub-Mk3, the `icub-cam-head` is not used at all, and similarly the camera eyes are not used. So, at this point I think we can just remove the 18.04 apt CI. The problem is not fully solved, as in the future a head with Jetson Xavier + Basler may need to be produced, but it does not make sense to spend a lot of effort now on problems of robots that do not exist. 18.04 users can either use conda packages or Docker images.
> The problem is not solved as in the future an head with Jetson Xavier + Basler may need to be produced
The 4K head will need to be redesigned from the ground up, most likely, with new cameras and a new computational unit.
> it does not make sense to spend a lot of effort now to deal with problems on robots that do not exist
Totally agree 👍🏻
cc @maggia80 @Nicogene
There are several issues that would be simple to solve if we could drop Ubuntu 18.04 support:
In this issue I think we can track the things that are blocking us from dropping Ubuntu 18.04 support. I am afraid there are a few of those, but better to track them explicitly so that we can be fully aware:
Feel free to suggest more.