introlab / rtabmap_ros

RTAB-Map's ROS package.
http://wiki.ros.org/rtabmap_ros
BSD 3-Clause "New" or "Revised" License

Use with Unitree Go 1 robot RGBD cameras #1031

Closed nicolasukd closed 6 months ago

nicolasukd commented 9 months ago

Hi, I'm trying to build a map with the Unitree Go 1 robot. The first tutorial after the installation (http://wiki.ros.org/rtabmap_ros/TutorialsOldInterface/HandHeldMapping) shows how to use rtabmap_ros out-of-the-box with a Kinect-like sensor, but I think the cameras on the Unitree Go 1 are not covered there, or at least they are not OpenNI/OpenNI2 compliant.

I'm asking to confirm my issue (the cameras on my robot won't work out-of-the-box) and to know what I can change to make it work with those cameras.

matlabbe commented 9 months ago

What topics are available after you bring up the robot? If the robot is using a stereo camera, you may have to look at the examples under http://wiki.ros.org/rtabmap_ros/TutorialsOldInterface/StereoHandHeldMapping instead.
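
For reference, listing what the bringup publishes can be done with standard ROS tools, for example:

rostopic list
rostopic list | grep camera    # only the camera-related topics
rostopic info /camera_face/color/image_raw    # message type and publishers (topic name is just an example)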

nicolasukd commented 9 months ago

What topics are available after you bring up the robot? If the robot is using a stereo camera, you may have to look at the examples under http://wiki.ros.org/rtabmap_ros/TutorialsOldInterface/StereoHandHeldMapping instead.

The topics related to the cameras are these:

Screenshot from 2023-09-11 20-18-15

I think those topics follow the RGB-D sensor structure.

matlabbe commented 9 months ago

What are the cam5 topics?

But yes, those topics look like RGB-D data. From what I see, this could work (assuming the depth images are registered to the color camera):

roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start" \
    rgb_topic:=/camera_face/color/image_raw \
    depth_topic:=/camera_face/depth/image_raw \
    camera_info_topic:=/camera_face/color/camera_info \
    frame_id:=base_link \
    approx_sync:=false

Adjust approx_sync (set it to true) if the timestamps are not exactly the same across the topics. frame_id should be the base frame of the robot.
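
To check whether the timestamps actually match (and therefore whether approx_sync:=false is safe), one option is to echo the headers of the topics; the topic names below are the ones assumed in the command above:

rostopic echo -n 1 /camera_face/color/image_raw/header
rostopic echo -n 1 /camera_face/depth/image_raw/header
rostopic echo -n 1 /camera_face/color/camera_info/header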

Note: if all cameras have the same resolution, it is possible to estimate visual odometry using all of them at the same time to increase the FOV. Start with a single camera like above first; then, if you are interested in trying multiple cameras, I could try to make an example, but I will need more info (the full rostopic list, whether the topics of the same camera are synchronized or not...). Ideally, a rosbag would be nice.

nicolasukd commented 9 months ago

What are the cam5 topics?

But yes, those topics look like RGB-D data. From what I see, this could work (assuming the depth images are registered to the color camera):

roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start" \
    rgb_topic:=/camera_face/color/image_raw \
    depth_topic:=/camera_face/depth/image_raw \
    camera_info_topic:=/camera_face/color/camera_info \
    frame_id:=base_link \
    approx_sync:=false

Adjust approx_sync (set it to true) if the timestamps are not exactly the same across the topics. frame_id should be the base frame of the robot.

Note: if all cameras have the same resolution, it is possible to estimate visual odometry using all of them at the same time to increase the FOV. Start with a single camera like above first; then, if you are interested in trying multiple cameras, I could try to make an example, but I will need more info (the full rostopic list, whether the topics of the same camera are synchronized or not...). Ideally, a rosbag would be nice.

Wow, thank you, I'm going to try that.

Here is the entire list of topics:

Screenshot from 2023-09-12 08-07-53

Screenshot from 2023-09-12 08-10-46

If I understood correctly, I have to run my robot simulation like this (the readme at https://github.com/unitreerobotics/unitree_guide explains how I am running it) and then run your roslaunch rtabmap_ros ... command. But if I do that, it doesn't show up right:

Screenshot from 2023-09-12 11-49-49

matlabbe commented 9 months ago

Without any visual features in the environment, visual odometry will likely get lost. It would make more sense to use a digital twin like in this repo: https://github.com/matlabbe/rtabmap_drone_example#rtabmap_drone_example

nicolasukd commented 9 months ago

Without any visual features in the environment, visual odometry will likely get lost. It would make more sense to use a digital twin like in this repo: https://github.com/matlabbe/rtabmap_drone_example#rtabmap_drone_example

I could change the environment; would that work? What do you mean by using a digital twin?

matlabbe commented 9 months ago

You could have a simulated environment with more visual features. A digital twin of the environment is a scan of a real environment that we can import in Gazebo like in the project linked.

nicolasukd commented 9 months ago

Is there a way to use the same world as in that example? Because I've tried and failed.

image

I tried to copy all the world and model files from that example and created the same folders. I also modified the gazeboSim.launch file, changing some of the parameters and the world name to 'apt'.

matlabbe commented 9 months ago

Yeah you could use the same world. What are the errors?

nicolasukd commented 9 months ago

It's like the map doesn't appear.

Screenshot from 2023-09-14 12-46-27

image

The console shows this error when running the roslaunch unitree_guide gazeboSim.launch command

matlabbe commented 9 months ago

I think it is because it cannot find the model. For an easy setup, move the rtabmap_drone_example/models/apt model directory to ~/.gazebo/models so that Gazebo can find it. For apt.world, put it in the worlds directory of unitree_gazebo. Then launch with:

roslaunch unitree_guide gazeboSim.launch wname:=apt

image
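
For reference, the copy steps could look like the following; the source paths are assumptions (both repositories cloned under ~/catkin_ws/src; the exact location of apt.world inside rtabmap_drone_example and of the unitree_gazebo package may differ), so adjust them to your layout:

mkdir -p ~/.gazebo/models
cp -r ~/catkin_ws/src/rtabmap_drone_example/models/apt ~/.gazebo/models/
cp ~/catkin_ws/src/rtabmap_drone_example/worlds/apt.world ~/catkin_ws/src/unitree_ros/unitree_gazebo/worlds/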

matlabbe commented 9 months ago

I created a launch file for this particular demo to see how to setup multi-camera VSLAM on that robot: https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_demos/launch/demo_unitree_quadruped_robot.launch

See video: https://youtu.be/lUkQmWu1PGc

nicolasukd commented 9 months ago

I used your launch file (after running the gazeboSim.launch file) and it shows some errors.

Screenshot from 2023-09-25 10-54-53 Screenshot from 2023-09-25 11-00-33

matlabbe commented 9 months ago

Rtabmap should be built with opengv support.

nicolasukd commented 9 months ago

I think I understood what I have to do, but when I install opengv and do 'cmake ..' it doesn't find opengv.

Screenshot from 2023-09-28 12-26-11

I already installed it inside and outside of the rtabmap folder.

matlabbe commented 9 months ago

I already installed it inside and outside of the rtabmap folder.

This is how I usually install opengv:

git clone https://github.com/laurentkneip/opengv.git && \
    cd opengv && \
    git checkout 91f4b19c73450833a40e463ad3648aae80b3a7f3 && \
    wget https://gist.githubusercontent.com/matlabbe/a412cf7c4627253874f81a00745a7fbb/raw/accc3acf465d1ffd0304a46b17741f62d4d354ef/opengv_disable_march_native.patch && \
    git apply opengv_disable_march_native.patch && \
    mkdir build && \
    cd build && \
    cmake -DCMAKE_BUILD_TYPE=Release .. && \
    make -j$(nproc) && \
    make install && \
    cd && \
    rm -r opengv

nicolasukd commented 9 months ago

It worked, but the map that's generated doesn't seem right. Any idea what I am doing wrong?

Screenshot from 2023-10-02 14-27-46

Screenshot from 2023-10-02 12-59-37

matlabbe commented 9 months ago

Have you done this https://github.com/introlab/rtabmap_ros/blob/68066b7e9a32d87766440a3f0033a0763f146350/rtabmap_demos/launch/demo_unitree_quadruped_robot.launch#L6 ?

nicolasukd commented 9 months ago

Great, that was it. Thank you so much. Do you know where I can visualize the generated scan?

Also, is it normal that it is working at around 2 fps (very slow)? Is there a way I can optimize the mapping?

sumitsarkar1 commented 9 months ago

I built RTAB-Map with OpenGV (not the way you mentioned... the usual git clone of the opengv repo, then mkdir build, cmake, make and make install)... it was built and installed successfully... the RTAB-Map standalone detected it and I built the standalone with opengv... but then when I did catkin_make to build rtabmap_ros I got this error:

usr/bin/ld: /usr/local/lib/librtabmap_core.so.0.21.2: undefined reference to `octomath::Pose6D::Pose6D(octomath::Pose6D const&)'

Earlier, catkin_make was working fine when the standalone RTAB-Map was NOT built with opengv... so is it that the particular commit and patch need to be applied for opengv?

(After 1 hour of struggle!) With the opengv patch and commit I am getting the same error!

matlabbe commented 9 months ago

@nicolasukd The mapping would be at 1 Hz, but visual odometry should be faster (>15-20 Hz on a desktop computer).
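
For reference, the 1 Hz map update is RTAB-Map's default (the Rtabmap/DetectionRate parameter); if needed it can be changed through rtabmap_args where that argument is exposed, as in the earlier rtabmap.launch command, for example:

roslaunch rtabmap_ros rtabmap.launch \
    rtabmap_args:="--delete_db_on_start --Rtabmap/DetectionRate 2"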

@sumitsarkar1 Some resources about that error:

sumitsarkar1 commented 9 months ago

@nicolasukd The mapping would be at 1 Hz, but visual odometry should be faster (>15-20 Hz on a desktop computer).

@sumitsarkar1 Some resources about that error:

Thank you @matlabbe, it's solved.

nicolasukd commented 6 months ago

Hi @matlabbe, thank you for your help in the past. I'm here again because I'm implementing this functionality on the real robot. I'm having trouble making this work, especially because the SDK in charge of controlling the cameras (https://github.com/unitreerobotics/UnitreecameraSDK) doesn't have a function to get the camera info, and your launch file needs this topic. So my question is: is this topic necessary for rtabmap to work? If the answer is yes, do you know a way of creating this topic? I only have a .yaml file that configures the camera.

matlabbe commented 6 months ago

The camera info is required. You may use a script like this: https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_util/scripts/yaml_to_camera_info.py to publish a synced camera_info with the corresponding image topic.
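
For illustration only (this is not the linked script), a minimal rospy sketch of that idea could look like the following; the YAML field names (standard ROS calibration format) and the topic names are assumptions to adapt to your calibration file and camera:

#!/usr/bin/env python
# Republish a CameraInfo with the same stamp/frame_id as each incoming image,
# using intrinsics loaded once from a calibration YAML (field names assumed).
import yaml
import rospy
from sensor_msgs.msg import Image, CameraInfo

def load_camera_info(yaml_path):
    with open(yaml_path) as f:
        calib = yaml.safe_load(f)
    info = CameraInfo()
    info.width = calib['image_width']
    info.height = calib['image_height']
    info.K = calib['camera_matrix']['data']
    info.distortion_model = calib['distortion_model']
    info.D = calib['distortion_coefficients']['data']
    info.R = calib['rectification_matrix']['data']
    info.P = calib['projection_matrix']['data']
    return info

def image_callback(img_msg):
    # Copy the image header so image and camera_info have identical stamps/frame_id.
    camera_info.header = img_msg.header
    info_pub.publish(camera_info)

if __name__ == '__main__':
    rospy.init_node('yaml_camera_info_publisher')
    camera_info = load_camera_info(rospy.get_param('~yaml_path'))
    info_pub = rospy.Publisher('camera_info', CameraInfo, queue_size=10)
    rospy.Subscriber('image_raw', Image, image_callback, queue_size=10)
    rospy.spin()

It could then be run with remappings such as rosrun your_package yaml_camera_info_publisher.py _yaml_path:=/path/to/calib.yaml image_raw:=/camera_face/color/image_raw camera_info:=/camera_face/color/camera_info (package, script and topic names are hypothetical).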

nicolasukd commented 6 months ago

Thanks. Do I have to give it parameters like "rosrun unitree_camera yaml_to_camera_info.py ~yaml_path:='~/Unitree/x.yaml' ~frame_id:='x'"? If so, what is the frame_id parameter?

nicolasukd commented 6 months ago

Thanks. Do I have to give it parameters like "rosrun unitree_camera yaml_to_camera_info.py ~yaml_path:='~/Unitree/x.yaml' ~frame_id:='x'"? If so, what is the frame_id parameter?

Also, I think the yaml structure is different from what's expected by that script.

image

matlabbe commented 6 months ago

The frame_id should be the frame of the camera (the same frame_id that the camera image topic is using).

The yaml format that yaml_to_camera_info.py expects contains these fields: https://github.com/introlab/rtabmap_ros/blob/master/rtabmap_examples/launch/config/euroc_left.yaml

nicolasukd commented 6 months ago

I've noticed that the camera image topic doesn't have any frame_id. How do I set the frame_id? I'm using image_transport to publish the image data. I've also got a question about the minimum number of cameras that rtabmap_ros needs to do the mapping: is one enough, or does it have to be three like in your example?

matlabbe commented 6 months ago

In the sensor_msgs::Image header, you could set a frame_id before publishing it.
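
A minimal sketch of that idea (in Python here; in C++ it is the same msg.header.frame_id field), with a hypothetical frame name and topic:

import rospy
from sensor_msgs.msg import Image

rospy.init_node('camera_publisher_example')
pub = rospy.Publisher('camera_face/color/image_raw', Image, queue_size=1)

msg = Image()  # normally filled from the camera SDK / cv_bridge
msg.header.stamp = rospy.Time.now()
msg.header.frame_id = 'camera_face_optical_frame'  # hypothetical camera frame
pub.publish(msg)

The same frame_id also has to exist in the TF tree (e.g., published by robot_state_publisher or a static_transform_publisher) so that rtabmap can relate the camera to the base frame.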

rtabmap_ros requires at least one RGB-D camera or stereo camera. Monocular camera is not supported.

NicolasJorquera commented 6 months ago

In the end I managed to set the frame_id on the image. Now something strange happened regarding the errors that appeared when running your example (which ran well in the simulation). Before, the error was that the topics "camera/color/raw_image", "camera/depth/raw_image" and "camera/color/camera_info" had not arrived. When I changed the frame_id and left them all the same, that error disappeared, but an error remains that it does not receive the odometry topics and some others from the "rtabmap" node. The strange thing is that these topics are supposed to be generated by rtabmap itself, so I suspect it has to do with the .launch that you made for the simulation. So I created another issue to ask you what that same script would look like for just one camera. I don't know if there are any special considerations, taking into account that it is a real robot and not a simulation (which is what that script was intended for).

matlabbe commented 6 months ago

The strange thing is that these topics are supposed to be generated by rtabmap itself, so I suspect it has to do with the .launch that you made for the simulation.

camera/color/raw_image, camera/depth/raw_image and camera/color/camera_info are input topics; they come from the simulated camera. If the real camera is not using the same topic names, adjust the remaps in rgbd_sync.
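
As a sketch only (take the exact nodelet/package name and argument layout from the demo launch file; the input topic names are assumed to be rgb/image, depth/image and rgb/camera_info, and the /your_camera/... targets are placeholders for the real camera's topics), the remaps could look like:

<node pkg="nodelet" type="nodelet" name="rgbd_sync" args="standalone rtabmap_sync/rgbd_sync">
  <remap from="rgb/image"       to="/your_camera/color/image_raw"/>
  <remap from="depth/image"     to="/your_camera/depth/image_raw"/>
  <remap from="rgb/camera_info" to="/your_camera/color/camera_info"/>
</node>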

nicolasukd commented 6 months ago

image

image

image

image

These are the topics of the robot, the PDF generated by the rosrun tf2_tools view_frames.py command, and the errors that the launch file gives me.

matlabbe commented 6 months ago

It looks like the bringup is not publishing TF frames for the robot links (I would have expected at least a base frame and frames for each joint on the robot, plus the sensors).

Let's continue on this other issue: https://github.com/introlab/rtabmap_ros/issues/1093

yemerge commented 5 months ago

If I run roslaunch unitree_guide gazeboSim.launch wname:=apt and then roslaunch rtabmap_demos demo_unitree_quadruped_robot.launch, an error like in the pictures is displayed. The result of 'cmake ..' in rtabmap/build shows OpenGV 1.0 = yes. I also modified unitree_ros/robots/go1_description/xacro/depthCamera.xacro (464, 400). What's the problem?

Screenshot from 2024-01-08 16-48-15
Screenshot from 2024-01-08 16-48-58
Screenshot from 2024-01-08 16-49-36
Screenshot from 2024-01-08 16-51-04
Screenshot from 2024-01-08 17-20-06

matlabbe commented 5 months ago

Is rtabmap_ros rebuilt against the rtabmap that was built with opengv? Make sure to uninstall all rtabmap binary packages so that the right one is used:

sudo apt remove ros-noetic-rtabmap*

yemerge commented 5 months ago

Thank you @matlabbe! It's solved.