Hi @rahulsharma11 It is possible to build up a point cloud over time until you are satisfied with the amount of detail on the scan. This is demonstrated in Intel's D435i SLAM tutorial for ROS. At the bottom of the guide, the process for saving the cloud as a 2D map in a pcd file and playing back the pcd is described.
https://github.com/IntelRealSense/realsense-ros/wiki/SLAM-with-D435i
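As a rough pointer only (the /rtabmap/cloud_map topic name and the tools below are assumptions on my part rather than the guide's exact steps), a PointCloud2 topic such as RTAB-Map's cloud map can be written out to pcd files with the standard pcl_ros tool and viewed afterwards with pcl_viewer:

rosrun pcl_ros pointcloud_to_pcd input:=/rtabmap/cloud_map
pcl_viewer <name_of_saved_file>.pcd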
Another approach is to record multiple individual ply point cloud files and then "stitch" them together into a single large ply file. This method is useful for creating a 'mega-cloud' with a single camera if you do not have multiple cameras.
The point clouds can be aligned with an affine transform operation.
https://github.com/IntelRealSense/librealsense/issues/7853#issuecomment-733733434
Hi @MartyG-RealSense , Thanks for the points.
I used the ROS-based example and found no display in RViz. The terminal output is -
[ERROR] (2021-04-03 16:24:51.940) Rtabmap.cpp:1147::process() RGB-D SLAM mode is enabled, memory is incremental but no odometry is provided. Image 1097 is ignored!
[ WARN] (2021-04-03 16:24:52.001) OdometryF2M.cpp:557::computeTransform() Registration failed: "Not enough inliers 0/20 (matches=0) between -1 and 1049"
[ INFO] [1617447292.001975490]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.047713s
[ WARN] (2021-04-03 16:24:52.070) OdometryF2M.cpp:557::computeTransform() Registration failed: "Not enough inliers 0/20 (matches=0) between -1 and 1050"
[ INFO] [1617447292.070473352]: Odom: quality=0, std dev=0.000000m|0.000000rad, update time=0.048623s
Are there any settings I am missing?
Advice about the inlier warning is provided by Doronhi, the RealSense ROS wrapper developer, in the link below.
https://github.com/IntelRealSense/realsense-ros/issues/1080
Doronhi advises adjusting the position of the camera to see whether the inlier warning goes away.
Is there any C++ example for stitching multiple ply files, or is it done through an affine transform only? And is that available with Python only? Is there any sample code for the same?
One more thing: I also checked this link, "https://github.com/IntelRealSense/realsense-ros", to get ROS-based output.
I ran it with "roslaunch realsense2_camera rs_camera.launch filters:=pointcloud".
Terminal output is -
process[camera/realsense2_camera_manager-1]: started with pid [90753]
process[camera/realsense2_camera-2]: started with pid [90754]
[ INFO] [1617610928.922958382]: Initializing nodelet with 8 worker threads.
[ INFO] [1617610929.250957027]: RealSense ROS v2.2.23
[ INFO] [1617610929.251017262]: Built with LibRealSense v2.42.0
[ INFO] [1617610929.251042466]: Running with LibRealSense v2.43.0
[ WARN] [1617610929.251096351]: running with a different librealsense version
[ WARN] [1617610929.251127846]: than the one the wrapper was compiled with!
[ INFO] [1617610929.283295530]:
[ INFO] [1617610929.300793379]: Device with serial number 943222070941 was found.
[ INFO] [1617610929.300839289]: Device with physical ID /sys/devices/pci0000:00/0000:00:08.1/0000:04:00.3/usb2/2-2/2-2:1.0/video4linux/video2 was found.
[ INFO] [1617610929.300847203]: Device with name Intel RealSense D435I was found.
[ INFO] [1617610929.301146778]: Device with port number 2-2 was found.
[ INFO] [1617610929.301186999]: Device USB type: 3.2
[ INFO] [1617610929.345608173]: JSON file is not provided
[ INFO] [1617610929.345625744]: ROS Node Namespace: camera
[ INFO] [1617610929.345655906]: Device Name: Intel RealSense D435I
[ INFO] [1617610929.345672425]: Device Serial No: 943222070941
[ INFO] [1617610929.345692290]: Device physical port: /sys/devices/pci0000:00/0000:00:08.1/0000:04:00.3/usb2/2-2/2-2:1.0/video4linux/video2
[ INFO] [1617610929.345711414]: Device FW version: 05.12.12.100
[ INFO] [1617610929.345729806]: Device Product ID: 0x0B3A
[ INFO] [1617610929.345745483]: Enable PointCloud: On
[ INFO] [1617610929.345765659]: Align Depth: Off
[ INFO] [1617610929.345782298]: Sync Mode: On
[ INFO] [1617610929.345944682]: Device Sensors:
[ INFO] [1617610929.361919474]: Stereo Module was found.
05/04 13:52:09,362 WARNING [140024456673024] (backend-v4l2.cpp:1357) Pixel format 36315752-1a66-a242-9065-d01814a likely requires patch for fourcc code RW16!
[ INFO] [1617610929.388166222]: RGB Camera was found.
[ INFO] [1617610929.390001849]: Motion Module was found.
[ INFO] [1617610929.390055132]: (Confidence, 0) sensor isn't supported by current device! -- Skipping...
[ INFO] [1617610929.390072863]: Add Filter: pointcloud
[ INFO] [1617610929.394091746]: num_filters: 1
[ INFO] [1617610929.394129592]: Setting Dynamic reconfig parameters.
[ WARN] [1617610929.440075468]: Param '/camera/rgb_camera/power_line_frequency' has value 3 that is not in the enum { {50Hz: 1} {60Hz: 2} {Disabled: 0} }. Removing this parameter from dynamic reconfigure options.
[ INFO] [1617610929.455989091]: Done Setting Dynamic reconfig parameters.
[ INFO] [1617610929.457171272]: depth stream is enabled - width: 848, height: 480, fps: 30, Format: Z16
[ INFO] [1617610929.458515267]: color stream is enabled - width: 1280, height: 720, fps: 30, Format: RGB8
[ INFO] [1617610929.461555865]: Expected frequency for depth = 30.00000
[ INFO] [1617610929.512069064]: Expected frequency for color = 30.00000
[ INFO] [1617610929.547208595]: insert Depth to Stereo Module
[ INFO] [1617610929.547281222]: insert Color to RGB Camera
[ INFO] [1617610929.616292632]: SELECTED BASE:Depth, 0
[ INFO] [1617610929.631485774]: RealSense Node Is Up!
But after opening RViz, nothing is displayed, and I see "Global Status: Error".
Any ideas? Thanks.
RealSense ROS v2.2.23
Built with LibRealSense v2.42.0
Running with LibRealSense v2.43.0
The messages above indicate that you may have built the ROS wrapper with librealsense 2.42.0 and then upgraded to 2.43.0. You have to rebuild the ROS wrapper after a change of librealsense version, otherwise you receive the warning above.
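As a sketch, assuming the wrapper was built from source in a catkin_make workspace at ~/catkin_ws (adjust the path to your own layout), the rebuild is along these lines:

cd ~/catkin_ws
catkin_make clean
catkin_make -DCATKIN_ENABLE_TESTING=False -DCMAKE_BUILD_TYPE=Release
catkin_make install
source devel/setup.bash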
I could not find any librealsense code examples for stitching ply files. The open-source software CloudCompare (available on Linux via a Snap package if you do not have access to Windows) has the ability to stitch multiple ply files into a single ply file, if you are able to use non-SDK software for the operation.
https://cloudcompare.org/forum/viewtopic.php?t=2543
http://www.cloudcompare.org/doc/wiki/index.php?title=Merge
Linux snap package https://snapcraft.io/cloudcompare
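If a non-SDK library is also acceptable in your own code, below is a minimal, untested C++ sketch of the stitching idea using PCL. The file names and the transform values are placeholders; in practice the transform between two clouds would come from registration (for example ICP) or from known camera positions.

// Minimal PCL sketch: load two ply clouds, move the second one with an
// affine transform, merge them and save the result as a single ply.
#include <pcl/point_types.h>
#include <pcl/io/ply_io.h>
#include <pcl/common/transforms.h>

int main()
{
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_a(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_b(new pcl::PointCloud<pcl::PointXYZRGB>);

    // Placeholder file names - replace with your own recorded ply files.
    pcl::io::loadPLYFile("cloud_a.ply", *cloud_a);
    pcl::io::loadPLYFile("cloud_b.ply", *cloud_b);

    // Example affine transform: shift cloud_b 0.5 m along X and rotate it
    // roughly 10 degrees around Y. These values are purely illustrative.
    Eigen::Affine3f transform = Eigen::Affine3f::Identity();
    transform.translation() << 0.5f, 0.0f, 0.0f;
    transform.rotate(Eigen::AngleAxisf(0.1745f, Eigen::Vector3f::UnitY())); // ~10 degrees in radians

    pcl::PointCloud<pcl::PointXYZRGB>::Ptr cloud_b_aligned(new pcl::PointCloud<pcl::PointXYZRGB>);
    pcl::transformPointCloud(*cloud_b, *cloud_b_aligned, transform);

    // Concatenate the clouds and write the merged ply.
    pcl::PointCloud<pcl::PointXYZRGB> merged = *cloud_a;
    merged += *cloud_b_aligned;
    pcl::io::savePLYFileBinary("merged.ply", merged);

    return 0;
}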
Thanks for the support.
Issue Description
Hi, if I am not wrong, the point cloud generated from rs-pointcloud.cpp (by adding "points.export_to_ply("pointcloud.ply", color)") is per frame, and the generated ply file holds only the last updated frame, so the previous data is lost (a minimal sketch of what I am doing is included below).
I want to keep appending the point cloud data for as long as the app runs, for example to build a point cloud of a large environment while I hold the camera and move around within it.
Note: I have checked the "Start recording" option in the realsense-viewer app. I want a similar thing, but in code and with the raw point cloud data. Is it possible? Thanks.
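For reference, here is a minimal sketch of what I am doing now (adapted from rs-pointcloud.cpp; the frame count is only for illustration). Each export overwrites the same file, which is why only the last frame survives:

#include <librealsense2/rs.hpp>

int main()
{
    // Start streaming with the default configuration.
    rs2::pipeline pipe;
    pipe.start();

    rs2::pointcloud pc;

    for (int i = 0; i < 100; ++i) // grab a fixed number of frames for this example
    {
        rs2::frameset frames = pipe.wait_for_frames();
        rs2::video_frame color = frames.get_color_frame();
        rs2::depth_frame depth = frames.get_depth_frame();

        pc.map_to(color);                         // texture-map the cloud to the color frame
        rs2::points points = pc.calculate(depth); // generate the point cloud for this frame

        // The line I added: it rewrites the same file every frame,
        // so the data from previous frames is lost.
        points.export_to_ply("pointcloud.ply", color);
    }
    return 0;
}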