cartographer-project / cartographer_turtlebot

Provides TurtleBot integration for Cartographer.
Apache License 2.0

Is there a way to write assets from the turtlebot bag file? #74

Closed. ghost069 closed this issue 7 years ago.

ghost069 commented 7 years ago

First of all, thanks for all of your wonderful work and kind help. I was trying to write assets from the turtlebot bag file, in particular a pcd file. I followed the documentation and got the pcd file of a laser bag, and I successfully ran the turtlebot demo. But how can I get assets from the turtlebot bag file? I mean, I tried running the assets writer the same way as for the laser bag. The pbstream I got is very tiny, and the pcd file is merely black in the viewer. I did get some pictures, though. I noticed there is no sensor_msgs/PointCloud2 in cartographer_turtlebot_demo.bag, only depth images. That's the reason, right? The assets writer only reads point cloud data and doesn't convert depth images into point clouds. (Just a naive thought, correct me if I'm wrong.)

Anyway, could you please tell me how I can export the SLAM point cloud data to a pcd or ply file from the turtlebot bag file? Thanks for your time!

SirVer commented 7 years ago

The assets writer only reads point cloud data and doesn't convert depth images into point clouds. (Just a naive thought, correct me if I'm wrong.)

It reads PointCloud2, LaserScan and MultiEchoLaserScan at this point in time. Converting your depth images to PointCloud2 will help.
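For illustration, a minimal sketch of such a conversion, assuming the depth images are 16UC1 in millimeters and that the pinhole intrinsics come from /camera/depth/camera_info (the bag paths, output topic name, and depth encoding are assumptions, not something checked against the demo bag):

```python
#!/usr/bin/env python
# Sketch: write PointCloud2 messages converted from depth images into a new bag.
# Assumes 16UC1 depth in millimeters and pinhole intrinsics from camera_info.
import numpy as np
import rosbag
from cv_bridge import CvBridge
import sensor_msgs.point_cloud2 as pc2

bridge = CvBridge()
fx = fy = cx = cy = None

with rosbag.Bag('cartographer_turtlebot_demo.bag') as inbag, \
     rosbag.Bag('cartographer_turtlebot_demo-points.bag', 'w') as outbag:
    for topic, msg, t in inbag.read_messages():
        outbag.write(topic, msg, t)  # keep every original message
        if topic == '/camera/depth/camera_info':
            fx, fy, cx, cy = msg.K[0], msg.K[4], msg.K[2], msg.K[5]
        elif topic == '/camera/depth/image_raw' and fx is not None:
            depth = bridge.imgmsg_to_cv2(msg, desired_encoding='passthrough')
            z = depth.astype(np.float32) / 1000.0  # 16UC1: mm -> m
            u, v = np.meshgrid(np.arange(depth.shape[1]),
                               np.arange(depth.shape[0]))
            valid = z > 0
            x = (u[valid] - cx) * z[valid] / fx
            y = (v[valid] - cy) * z[valid] / fy
            points = np.column_stack((x, y, z[valid]))
            outbag.write('/camera/depth/points',
                         pc2.create_cloud_xyz32(msg.header, points), t)
```

Getting the depth scaling right (millimeters vs. meters, 16UC1 vs. 32FC1) is the part most likely to go wrong; a wrong scale puts the points far outside any sensible range.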

ghost069 commented 7 years ago

@SirVer Sorry for bothering you again, but I converted my depth images to PointCloud2 and it still didn't work. I used the bag file from the Cartographer ROS for TurtleBots documentation and wrote a Python script to convert the depth images. Here's my code. Running the script produces a bag file with PointCloud2 messages named cartographer_turtlebot_demo-points.bag. Then I ran

roslaunch cartographer_turtlebot demo_depth_camera_3d.launch bag_filename:=${HOME}/Downloads/cartographer_turtlebot_demo-points.bag

I got the pbstream by

rosservice call /finish_trajectory 0

rosservice call /write_state ${HOME}/Downloads/cartographer_turtlebot_demo-points.bag_points.pbstream

In the end, to get the pcd file, I ran

roslaunch cartographer_ros assets_writer_backpack_3d.launch bag_filenames:=${HOME}/Downloads/cartographer_turtlebot_demo-points.bag pose_graph_filename:=${HOME}/Downloads/cartographer_turtlebot_demo-points.bag_points.pbstream

I put the generated pbstream file and pcd file here.

I know you might be busy, but I would be so grateful if you could help. Thanks a lot!

gaschler commented 7 years ago

I ran the conversion tool and your command above and observed in rviz that the SLAM results are worse after this conversion and the switch to 3D. Then I ran our diagnosis tool

cartographer_ros/lib/cartographer_ros/cartographer_rosbag_validate -bag_filename cartographer_turtlebot_demo-points.bag

and found that the converted topic "/camera/depth/points" has 3x the frequency of the original topic "/laser_scan", which is strange. What do you mean by "it doesn't work"? If you are concerned about the SLAM results, I'd recommend double-checking the conversion tool first.

If you are missing point cloud output from the SLAM result, please make sure that you pass the assets writer a launch/lua configuration where the lua file contains the action { action = "write_ply", filename = "points.ply", }. Documentation on the assets writer is here: http://google-cartographer-ros.readthedocs.io/en/latest/assets_writer.html. Most of its features are hidden in https://github.com/googlecartographer/cartographer/tree/master/cartographer/io.
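As a rough example, an assets writer lua configuration containing a write_ply action could look like the following sketch; the tracking frame and range values are placeholders, not values taken from this thread:

```lua
-- Minimal assets writer pipeline sketch: range-filter the points,
-- then write everything into a single PLY file.
options = {
  tracking_frame = "base_link",  -- placeholder; must match your setup
  pipeline = {
    {
      action = "min_max_range_filter",
      min_range = 1.,
      max_range = 60.,
    },
    {
      action = "write_ply",
      filename = "points.ply",
    },
  },
}

return options
```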

ghost069 commented 7 years ago

@gaschler Thanks a lot for your time and reply! The problem is still confusing me, though... The reason "/camera/depth/points" has 3x the frequency of the original topic "/laser_scan" is that '/camera/depth/image_raw' has 3x the frequency; the conversion script I wrote converts every raw image to a point cloud. Here's what I got when I printed some topics and timestamps of the bag. You can see that what I said is true: image_raw and points share the same timestamps, and after every three points/raw images there is a laser_scan.

/camera/depth/image_raw 1474964029947664780
/camera/depth/points 1474964029947664780
/camera/depth/image_raw 1474964029980940632
/camera/depth/points 1474964029980940632
/camera/depth/image_raw 1474964030014303530
/camera/depth/points 1474964030014303530
/laser_scan 1474964030024218872
/camera/depth/image_raw 1474964030047686207
/camera/depth/points 1474964030047686207
/camera/depth/image_raw 1474964030081046165
/camera/depth/points 1474964030081046165
/camera/depth/image_raw 1474964030114383437
/camera/depth/points 1474964030114383437
/laser_scan 1474964030123687458
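For what it's worth, a listing like the one above can be produced with a few lines of rosbag reading; a minimal sketch (the bag path is an assumption):

```python
import rosbag

# Print selected topics with their sensor (header) timestamps, in bag order.
TOPICS = ['/camera/depth/image_raw', '/camera/depth/points', '/laser_scan']
with rosbag.Bag('cartographer_turtlebot_demo-points.bag') as bag:
    for topic, msg, t in bag.read_messages(topics=TOPICS):
        print('%s %d' % (topic, msg.header.stamp.to_nsec()))
```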

I didn't miss the point cloud output, because I know how to write a lua file and I did. Actually I do get a point cloud output, but it's only 12.5 MB, so I think "it didn't work". When I ran cartographer_assets_writer, I also found the screen output strange:

I1025 09:54:15.688046 5019 assets_writer_main.cc:246] Processed 0 of 326.898 bag time seconds...
I1025 10:27:32.008136 5019 counting_points_processor.cc:43] Processed 1038880 and finishing.

The first line is not supposed to end so quickly. When I run the assets writer on a laser scan bag, the processed time changes from 0 to the whole bag time (like "Processed 30.3 of 326.898 bag time", "Processed 50.8 of 326.898 bag time", ..., "Processed 326.898 of 326.898 bag time"). I hope I described it clearly...

Anyway, this is probably not the right way to get the assets of a turtlebot demo. I know that simply converting every raw image to a point cloud is possibly not correct, but I don't know what kind of point cloud data I need to write, even though I tried to read the source code of the assets writer. Actually, I'm still trying... That's why I came to you guys for help.

I just uploaded the generated ply file here. It might help. Thanks for your generous help. It means a lot to me.

gaschler commented 7 years ago

Thanks for clarifying the different frame rates between depth image and laser scan.

About the output of assets_writer, it only logs every 100000 messages, so not logging in this case is normal.

Just to make sure I understand correctly: you want to export the 3D point cloud using the pbstream of the 2D SLAM results and a point cloud converted from depth images with some add_pointcloud_to_bagfile.py.

I believe that the depth is not correctly converted from depth image intensities to PointCloud2 in this case. Therefore the action = "min_max_range_filter" in the lua configuration throws away the incorrect points, and all you see is a few points coming from laser scans. (For me, removing the filter generated a 25 GB pcd file with all points.)

I'd recommend removing the min_max_range_filter action from your assets writer config and then trying to fix add_pointcloud_to_bagfile.py. Or, the camera parameters in /camera/depth/camera_info of https://storage.googleapis.com/cartographer-public-data/bags/turtlebot/cartographer_turtlebot_demo.bag do not exist and are needed for this conversion. I'd be interested which of these is the case.

ghost069 commented 7 years ago

@gaschler Oh my god. I just found out that the pbstream is written from 2D SLAM results, which might explain why the pcd file is so tiny... Nothing is wrong. My conversion file is correct and the camera parameters are not missing. The pcd file is tiny because that's how it's supposed to be. I installed a pcl viewer to visualize the pcd file, and this is what I got: (screenshot, 2017-10-26 10:34:31) Rotated to the side: (screenshot, 2017-10-26 10:48:38)

I'm so sorry for wasting your time... So sorry. You may close the issue now...

SirVer commented 7 years ago

Oh my god. I just found out that the pbstream is written from 2D SLAM results, which might explain why the pcd file is so tiny...

This is not an issue, though. The .pbstream is just a trajectory for the assets writer; using a 2D result means that your trajectory will have roll, pitch and z equal to zero, but otherwise it is fine. You can use it to write 3D assets, as @gaschler pointed out. The problem is that you are filtering out all the camera points with the min_max_range_filter.

ghost069 commented 7 years ago

@SirVer I don't understand why both of you think it's a pbstream file from a 2D result. I got it by calling the rosservice while I was running demo_depth_camera_3d.launch. How could that be a 2D result...

These pictures generated by the assets writer convince me of that, too. (The picture below is cartographer_turtlebot_demo-points.bag_xray_xy_all.png.)

(The picture below is cartographer_turtlebot_demo-points.bag_xray_xz_all.png.)

These are two different viewing angles of the trajectory, right? Exactly like the point cloud output...

If my pbstream file was generated by calling the rosservice while running a 3D launch file and it turns out to be a 2D result, I really don't know what other way there is to generate a pbstream file...

SirVer commented 7 years ago

I don't understand why both of you think it's a pbstream file from a 2D result. I got it by calling the rosservice while I was running demo_depth_camera_3d.launch. How could that be a 2D result...

From your OP we thought you were SLAMming with a laser, which would mean 2D SLAM. And you should probably do that, because the depth camera on the TurtleBot does not give good SLAM results.

But you are right that the side view of the asset writer clearly shows that you are not doing 2D SLAM.

SirVer commented 7 years ago

Closing as requested in https://github.com/googlecartographer/cartographer_turtlebot/issues/74#issuecomment-339535702. I still do not fully understand what the problem was and how it was solved, though.