introlab / rtabmap

RTAB-Map library and standalone application
https://introlab.github.io/rtabmap

VTK issue when launching rtabmap #1062

Open felron124 opened 1 year ago

felron124 commented 1 year ago

Hello, I am trying to use a Kinect with a Raspberry Pi running ROS Melodic on Debian Buster. When I install rtabmap from source and run it, I get the errors below. I am very new to Linux, so I would appreciate some help fixing this issue in layman's terms. Thanks.

```
pi@raspberrypi:~ $ rtabmap
qt5ct: using qt5ct plugin
qt5ct: custom style sheet is disabled
qt5ct: D-Bus global menu: no
libpng warning: iCCP: known incorrect sRGB profile
libpng warning: iCCP: known incorrect sRGB profile
libpng warning: iCCP: known incorrect sRGB profile
Warning: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLRenderWindow.cxx, line 647
vtkXOpenGLRenderWindow (0x114ae60): VTK is designed to work with OpenGL version 3.2 but it appears it has been given a context that does not support 3.2. VTK will run in a compatibility mode designed to work with earlier versions of OpenGL but some features may not work.

Warning: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLRenderWindow.cxx, line 647
vtkXOpenGLRenderWindow (0x1473768): VTK is designed to work with OpenGL version 3.2 but it appears it has been given a context that does not support 3.2. VTK will run in a compatibility mode designed to work with earlier versions of OpenGL but some features may not work.

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 3535
vtkOpenGLPolyDataMapper (0x14602b8): failed after BuildBufferObjects 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 1794
vtkOpenGLPolyDataMapper (0x14602b8): failed after UpdateShader 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLActor.cxx, line 107
vtkOpenGLActor (0xbc7480): failed after Render 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 1794
vtkOpenGLPolyDataMapper (0x14602b8): failed after UpdateShader 1 OpenGL errors detected
  0 : (1282) Invalid operation

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLActor.cxx, line 107
vtkOpenGLActor (0xbc7480): failed after Render 1 OpenGL errors detected
  0 : (1280) Invalid enum

Program started...
ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 3535
vtkOpenGLPolyDataMapper (0x1739cc0): failed after BuildBufferObjects 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 1794
vtkOpenGLPolyDataMapper (0x1739cc0): failed after UpdateShader 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLActor.cxx, line 107
vtkOpenGLActor (0xbc7480): failed after Render 1 OpenGL errors detected
  0 : (1280) Invalid enum

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLPolyDataMapper.cxx, line 1794
vtkOpenGLPolyDataMapper (0x1739cc0): failed after UpdateShader 1 OpenGL errors detected
  0 : (1282) Invalid operation

ERROR: In /build/vtk7-jXXwY7/vtk7-7.1.1+dfsg1/Rendering/OpenGL2/vtkOpenGLActor.cxx, line 107
vtkOpenGLActor (0xbc7480): failed after Render 1 OpenGL errors detected
  0 : (1280) Invalid enum
```

matlabbe commented 1 year ago

It seems VTK is expecting another OpenGL version (depending on whether VTK was built for mobile but is being used in a desktop environment). I ran into similar issues on a Jetson some years ago, which required rebuilding VTK and Qt from source... which you don't want to do because it takes too long (days to compile everything correctly). Rendering would be very laggy on the RPi anyway; I suggest using ROS and visualizing the result on another computer. I never visualize rtabmap on the RPi, always on a remote computer; the RPi is not meant to render 3D graphics.

felron124 commented 1 year ago

Yes, the plan is to use another computer, but I thought that rtabmap needed to be installed on both the RPi and the computer? I forgot to give full context in my original post, but I am following this tutorial, RGB-D SLAM With Kinect on Raspberry Pi 4 ROS Melodic, which says to install rtabmap on the RPi. Thanks.

damavand1 commented 1 year ago

I have a similar error:

(screenshot attached: WIN_20230614_15_55_34_Pro)

felron124 commented 1 year ago

> It seems VTK is expecting another OpenGL version (depending on whether VTK was built for mobile but is being used in a desktop environment). I ran into similar issues on a Jetson some years ago, which required rebuilding VTK and Qt from source... which you don't want to do because it takes too long (days to compile everything correctly). Rendering would be very laggy on the RPi anyway; I suggest using ROS and visualizing the result on another computer. I never visualize rtabmap on the RPi, always on a remote computer; the RPi is not meant to render 3D graphics.

Could you point me in a direction where I can find out how to visualize rtabmap on a remote computer? Like I said in the original post, I'm very new to all this, so I'm not sure where to look. Also, just to clarify, are you saying it's possible to have the Kinect connected to the RPi and have the rendering done on a remote computer? Thanks.

matlabbe commented 1 year ago

@felron124 With ROS it is possible, see http://official-rtab-map-forum.206.s1.nabble.com/RGB-D-SLAM-example-on-ROS-and-Raspberry-Pi-3-tp1250.html

For the RPi4-related issues with VTK rendering, I'll need to reproduce it on my RPi4 later to be able to help more.

felron124 commented 1 year ago

OK, thank you for the help. Is it possible to have the computer running Windows, or will I need to use Linux?

matlabbe commented 1 year ago

If you just want to try hand-held Kinect mapping, that can be done with the Windows binaries: https://github.com/introlab/rtabmap/releases/tag/0.21.0 (and check this tutorial).

To use more custom inputs, the rtabmap_ros package on Ubuntu is the easiest way.

felron124 commented 11 months ago

@matlabbe Sorry it's been a while, but I'm back on this now. I will use Ubuntu on a remote laptop to process the mapping. I have three questions, if you don't mind:

1. Does it matter what version of Ubuntu I use?
2. The packages installed during this tutorial (RGB-D SLAM With Kinect on Raspberry Pi 4 ROS Melodic): am I right in saying that these also go on the remote laptop?
3. Is ROS Melodic the only thing that goes on the Pi?

I know that 2 and 3 aren't questions about rtabmap, but you're the only person I've found who has an idea about this. Thanks.

Edit: On the main page of the rtabmap GitHub, it says that the ROS Melodic build is disabled. Does this mean that rtabmap won't work with Melodic at all? If so, I'm guessing Ubuntu 20.04 + Noetic is the way to go for the remote PC.

matlabbe commented 11 months ago

ROS Melodic is EOL, so it is not supported anymore, though it may still work on 18.04 + Melodic. However, if you can install 20.04 with Noetic on the RPi4, I strongly suggest doing that. If you have both computers on ROS Noetic, that is the best setup.

felron124 commented 11 months ago

@matlabbe I don't think 20.04 desktop for the RPi is available anymore, but the server version is. Will the server version work fine, since the visual map will be on the laptop, or is the desktop version needed? Also, with the remote laptop visualizing the map, is the Kinect supposed to be connected to the Pi or to the remote laptop? Thanks.

Edit: After some research, I found that a 20.04 desktop image never existed; instead you have to install the server version and then install some packages. I'd still like to know where the Kinect is supposed to be connected.

matlabbe commented 10 months ago

Is the laptop following the RPi4 (e.g., laptop sitting on the robot)? If so, the Kinect could be connected to the laptop if rtabmap is running on it. If rtabmap should run on the RPi4, or if the laptop won't follow the Pi, connect the Kinect to the RPi4.

Ubuntu MATE 20.04 should work on the RPi4, though the link looks dead; only 22.04 is officially available, and ROS Noetic cannot be installed on 22.04. Digging more, the 20.04 image may still be available from here: https://releases.ubuntu-mate.org/20.04/arm64/

felron124 commented 10 months ago

@matlabbe The laptop won't be sitting on the robot; the Pi and the Kinect will be on the robot. So are you saying that the Kinect has to be connected to the device that has rtabmap installed on it? Thanks.

matlabbe commented 10 months ago

No, the Kinect should be connected to the robot, thus to the Pi on the robot. You then have the choice of launching the rtabmap node on the RPi (robot) or streaming the Kinect images over WiFi to the laptop running rtabmap.
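For the streaming option, here is a minimal sketch of the ROS network setup (the IP addresses below are placeholders for this example, adjust them to your own network; see the ROS NetworkSetup wiki page for details):

```bash
# Assumed addresses: RPi = 192.168.1.10 (runs roscore + Kinect driver),
# laptop = 192.168.1.20 (runs rtabmap and the visualization).

# On the RPi, before launching the camera driver:
export ROS_IP=192.168.1.10
export ROS_MASTER_URI=http://192.168.1.10:11311

# On the laptop, before launching rtabmap:
export ROS_IP=192.168.1.20
export ROS_MASTER_URI=http://192.168.1.10:11311
```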

felron124 commented 10 months ago

@matlabbe Ah OK, I want to stream the images to the laptop. Do I have to install freenect on both devices or just on the Pi, and is the link you sent earlier how I execute it when everything is set up? Thanks.

matlabbe commented 10 months ago

Maybe a useful link for you: http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping

You only have to install freenect on the RPi, and rtabmap on the laptop.
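Roughly, assuming ROS Noetic on both machines and that the freenect stack is available as a binary package for your distro (otherwise it has to be built from source):

```bash
# On the RPi (Kinect driver only):
sudo apt install ros-noetic-freenect-launch
roslaunch freenect_launch freenect.launch depth_registration:=true

# On the laptop (rtabmap and visualization):
sudo apt install ros-noetic-rtabmap-ros
```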

felron124 commented 10 months ago

@matlabbe I created the file, but when I run it I get an error saying "Failed to load nodelet [/camera/rgbd_sync] of type [rtabmap_ros/rgbd_sync] even after refreshing the cache: According to the loaded plugin descriptions the class rtabmap_ros/rgbd_sync with base class type nodelet::Nodelet does not exist.". I then edited it like it says on the page, changing it to args="standalone rtabmap_ros/rgbd_sync", but got the same error. I then reverted back to the original code and added the "" on line 2, but got the same error again.

matlabbe commented 10 months ago

You should now use rtabmap_sync/rgbd_sync (see the Migration guide for the latest rtabmap_ros version).
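For reference, the relevant line in the launch file would look roughly like this with the new class name (the topic names and the nodelet manager name below are assumptions based on the freenect defaults, adapt them to your setup):

```xml
<!-- sketch only: load the renamed rtabmap_sync/rgbd_sync nodelet into the camera's nodelet manager -->
<node pkg="nodelet" type="nodelet" name="rgbd_sync"
      args="load rtabmap_sync/rgbd_sync camera_nodelet_manager" output="screen">
  <remap from="rgb/image"       to="/camera/rgb/image_rect_color"/>
  <remap from="depth/image"     to="/camera/depth_registered/image_raw"/>
  <remap from="rgb/camera_info" to="/camera/rgb/camera_info"/>
  <param name="approx_sync" value="true"/>
</node>
```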

felron124 commented 10 months ago

@matlabbe Since I just installed it, I presume the new package names are already installed? Do I just have to replace the old names in the launch file? Thanks.

matlabbe commented 10 months ago

If you are using a custom launch file built on the old interface, yes. If you are using a launch file that is installed with that rtabmap version and it fails, it is a bug (which launch file are you using, if so?), as all launch files should have been updated.

felron124 commented 10 months ago

@matlabbe I'm using the launch file on the site you linked (http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping), so I'll just have to change the names. I wasn't aware that there are other launch files; where can I find a list of them? Thanks.

matlabbe commented 10 months ago

I updated the tutorial: http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping

Other launch files:
https://github.com/introlab/rtabmap_ros/tree/master/rtabmap_demos/launch
https://github.com/introlab/rtabmap_ros/tree/master/rtabmap_examples/launch
https://github.com/introlab/rtabmap_ros/tree/master/rtabmap_launch/launch

felron124 commented 10 months ago

@matlabbe Got it working!!! Thank you. One thing though: I get the following message on the Pi during runtime: "[ WARN] [1693418333.817639794]: The time difference between rgb and depth frames is high (diff=0.010220s, rgb=1693418333.811776s, depth=1693418333.801557s). You may want to set approx_sync_max_interval lower than 0.01s to reject spurious bad synchronizations or use approx_sync=false if streams have all the exact same timestamp". Where is approx_sync_max_interval set, or is there another way to fix this?

Also, on the remote laptop, I get the following: "[WARN]: Could not get transform from odom to camera_link after 0.2000000 seconds (for stamp=1693418333.76238)! Error='Lookup would require extrapolation 134.037218s into the future. Requested time 1693418333.76223826 but the latest data is at the time 1693418199.725055695, when looking up transform from frame [camera_link] to frame [odom]. canTransform returned after 0.201367 timeout was 0.2.". Thanks.

matlabbe commented 10 months ago

For approx_sync_max_interval, it can be set as a parameter of the rgbd_odometry and rtabmap nodes.
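In a launch file, that would look roughly like this (the package/type names below assume the new split rtabmap_ros packages, and 0.005 s is just an example value below the 0.01 s mentioned in the warning):

```xml
<!-- sketch only: tighten the approximate synchronization window on rgbd_odometry -->
<node pkg="rtabmap_odom" type="rgbd_odometry" name="rgbd_odometry" output="screen">
  <param name="approx_sync"              value="true"/>
  <param name="approx_sync_max_interval" value="0.005"/>
</node>
```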

For the 134-second difference between the timestamps, I am not sure where/how the odom is computed, but you may have to synchronize the clocks between your laptop and the RPi4 (using ntpdate or chrony).
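A quick way to do a one-shot sync from the RPi against the laptop, assuming the laptop is at 192.168.1.20 (placeholder address) and is running an NTP server; chrony is the more robust option if you want the clocks to stay in sync continuously:

```bash
# On the RPi: step the clock once from the laptop's NTP server
sudo apt install ntpdate
sudo ntpdate 192.168.1.20
```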

felron124 commented 10 months ago

@matlabbe It works really well now, thank you. One question on the tutorial you linked (http://wiki.ros.org/rtabmap_ros/Tutorials/RemoteMapping): I don't think I'm reading the flowchart correctly, but is there a stack or a way to take the data and create controls for the robot? If not, is there a stack you would recommend for a go-to-goal approach? Thanks.

matlabbe commented 10 months ago

rtabmap is generally only for mapping. If you need to navigate, I'd suggest looking at the ROS navigation stack (e.g., move_base). There is an example of integration between move_base and rtabmap here: http://wiki.ros.org/rtabmap_ros/Tutorials/MappingAndNavigationOnTurtlebot