cartographer-project / cartographer_ros

Provides ROS integration for Cartographer.
Apache License 2.0

Simulated drone gives bad SLAM results #624

Closed monabf closed 6 years ago

monabf commented 6 years ago

Hi,

I have been using Cartographer for a few different things and want to thank you for making this great tool open source. I am trying to simulate a drone mapping a certain cube-like structure, and to send the simulated lidar data to Cartographer for SLAM. I managed to simulate the drone circling this structure, and to run Cartographer with that bag, and it worked fine.

However, now I want the drone to fly in a cross-like pattern over the structure and map it from above. Suddenly, the SLAM doesn't seem to catch on anymore: with odometry data provided it gives a reasonable trajectory but still drifts a lot, and without odometry it does not run at all. I have tried several things: making the drone much slower (which helped a lot), giving it an unrealistically large lidar swath so that it never loses sight of the structure during mapping (which changed nothing), tuning... But the trajectory and point cloud output by Cartographer still aren't good enough. Maybe the SLAM algorithm has more trouble when the plane in which the drone moves is perpendicular to the plane in which the sensor is oriented. Otherwise I'm probably doing something wrong, and I would be very grateful for any kind of help.

I ran rosbag_validate, and here is a link to the required Git repo. The structure the drone is mapping is confidential, so here are two bags in which I replaced it with something similar found on the internet. One bag simulates a lidar with a regular swath, the other one with a very wide swath for debugging purposes.

Thank you for your help!

gaschler commented 6 years ago

In short, Cartographer currently does not support flying robots the way it extrapolates IMU input. Currently, PoseExtrapolator and ImuTracker use IMU acceleration input only to extract gravity direction using a low-pass filter. Velocity is estimated by the difference of the last two scan matches or by odometry input. Yes, we decided against a Kalman filter. (Perhaps this explains why it cannot detect motion perpendicular to the plane in your case.) We found this extrapolation approach best for mobile robots, cars, and sensors that are carried/moved around. To work with 3D laser scans from a flying vehicle, you would need to fork and make large changes to PoseExtrapolator and ImuTracker. 2D scans cannot be used with flying vehicles.
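To make the extrapolation scheme above concrete, here is a small Python sketch (a deliberate simplification of the idea, not Cartographer's actual `ImuTracker`/`PoseExtrapolator` code): gravity direction is tracked with an exponential low-pass filter over accelerometer readings, and velocity comes from differencing the last two scan-matched positions, so sustained linear acceleration is largely invisible to the filter.

```python
import math

class GravityFilter:
    """Low-pass filter that keeps only the slowly varying gravity vector."""
    def __init__(self, time_constant=10.0):
        self.time_constant = time_constant  # seconds; larger = smoother
        self.gravity = [0.0, 0.0, 9.81]     # initial guess: world z-up

    def add_accelerometer(self, dt, accel):
        # Weight of each new sample shrinks as the time constant grows.
        alpha = 1.0 - math.exp(-dt / self.time_constant)
        self.gravity = [(1.0 - alpha) * g + alpha * a
                        for g, a in zip(self.gravity, accel)]

def velocity_from_scan_matches(pos_prev, pos_curr, dt):
    """Constant-velocity estimate from the last two matched positions."""
    return [(c - p) / dt for p, c in zip(pos_prev, pos_curr)]

f = GravityFilter()
# A burst of strong forward acceleration barely moves the estimate --
# which is exactly why translation cannot be recovered from this filter.
for _ in range(100):
    f.add_accelerometer(0.01, [5.0, 0.0, 9.81])
print(f.gravity)  # x component stays far below the true 5 m/s^2 input
```

After one second of constant 5 m/s^2 forward acceleration, the filtered x component is still under 0.5, so the motion is effectively discarded; only the scan-match differencing (or odometry) recovers translation.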

Concerning synthetic data, we have found results gathered with synthetic data to be not very realistic at times. (Some simulations may be more realistic than others.) Choosing the right algorithm and tuning should ideally be based on real measurements.

monabf commented 6 years ago

Thank you for your help. If I may, it should probably be mentioned somewhere that Cartographer can only be used as is for "rolling" robots that more or less only move inside a 2D plane.

If I understand correctly, Cartographer does not support robots with 6 degrees of freedom because pose and motion are estimated without using the full IMU data, which is probably less accurate but more computationally efficient than using a Kalman filter. It would probably help to modify the PoseExtrapolator to add a Kalman filter and use the full IMU data for more accurate state estimation.

However, I would be surprised if that were the main reason why Cartographer cannot work with a drone and map under or over the robot. Correct me if I'm wrong, but it seems to me that 2D Cartographer creates a grid normal to the detected direction of gravity and does scan matching there; 3D Cartographer then probably creates a number of grids stacked on top of each other, all normal to gravity, making a pile of grids approximately as high as the vertical lidar swath. The reason this would not be applicable to a flying vehicle is that there is no "grid-to-grid" matching and therefore no matching in the direction parallel to gravity, so if the robot were to go up and down, this would not be detected and the pile of grids would not be adjusted accordingly.

I'm not sure if I'm being clear, but I suppose there is a "scan matching reason" why Cartographer cannot be used to map under the robot (even if that robot weren't a drone but a sensor hanging on a string from the ceiling, moving back and forth in one plane). If you could explain that reason to me in more detail, and if you have any ideas on how to change the scan matching so that it could build "true" 3D occupancy grids where matching can be done in all directions, that would be very helpful.

Otherwise I agree that 2D scans cannot be used with flying vehicles (we only really use 3D) and that eventually we should use real measurements, but we were doing ok with the simulations as long as the drone moved more or less in a plane and was mapping around itself, comparable to a rolling robot.

gaschler commented 6 years ago
  1. How about adding Kalman filtering? Cartographer did use an unscented Kalman filter in the past; it was removed in https://github.com/googlecartographer/cartographer/pull/378 and replaced by the current PoseExtrapolator approach. Both solutions are computationally efficient, but the current one gives better results in practice. So we are very hesitant to add a Kalman filter now.

  2. Can the 3D pipeline work with 3D scans only under the camera? The 3D pipeline uses HybridGrid, which is not just layers of ProbabilityGrid but a voxel grid in which all three dimensions are treated equally. So the data structure has no limitation here; these are "true 3D occupancy grids". Scan matching in 3D compares accumulated range data against a HybridGrid and then inserts the data into the current and the previous submap. You are right that there is no grid-to-grid matching, but this is unrelated to the question. Yes, Cartographer can very well recognize up-and-down motion. I would actually argue that laser range measurements pointing down will give very precise up-and-down results. As I said, the only limitation is that accelerometer data is not used to predict translation. Also, the angle of the 3D scanner should be wide in both directions, not just 4 stripes close to each other.

  3. Generally, we would be glad to receive an open-source bag file with real data collected by an aerial vehicle, try to tune it, and discuss good config parameters. It is always good to receive example data that is high quality and different from our existing scenarios. I'd expect that for small, shaky quadrotors, IMU measurements would need to be high frequency and in sync with the lidar time stamps.
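A minimal sketch of the idea behind such a voxel grid (the names and the update rule are illustrative assumptions, not Cartographer's actual HybridGrid implementation): a sparse map keyed by integer (x, y, z) indices, so a return well below the sensor updates the grid exactly like a return to the side.

```python
import math
from collections import defaultdict

class SparseVoxelGrid:
    """Toy 3D occupancy grid: all three axes are treated identically."""
    def __init__(self, resolution=0.1):
        self.resolution = resolution
        self.log_odds = defaultdict(float)  # occupancy log-odds per voxel

    def index(self, point):
        # Same discretization for x, y, and z -- no privileged axis.
        return tuple(int(round(c / self.resolution)) for c in point)

    def insert_hit(self, point, hit_log_odds=0.85):
        self.log_odds[self.index(point)] += hit_log_odds

    def probability(self, point):
        o = self.log_odds.get(self.index(point), 0.0)
        return 1.0 / (1.0 + math.exp(-o))

grid = SparseVoxelGrid(resolution=0.05)
grid.insert_hit((1.0, 2.0, -3.0))          # a lidar return below the sensor
print(grid.probability((1.0, 2.0, -3.0)))  # > 0.5: occupied
print(grid.probability((0.0, 0.0, 0.0)))   # 0.5: unknown
```

With this structure a downward-pointing scanner fills voxels beneath the vehicle just as well as a horizontal one fills voxels around it, which is why the limitation lies in the pose prediction, not in the map.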

monabf commented 6 years ago

Thank you very much for your extensive answer! It is hard to know how 3D Cartographer works exactly since, to my knowledge, there is no paper explaining how it went from 2D to 3D.

After taking a closer look at the code, I can see that 2D mapping uses a 2D ProbabilityGrid but 3D uses a HybridGrid and therefore a real 3D occupancy grid, I missed that before. Great, that probably means that the use of IMU data is indeed the only thing stopping us from using Cartographer with a drone! We will implement the Kalman filter to use accelerometer data and see if that helps, we'll come back to you if any new problems come up.

As for the data, if you are interested in it as soon as we start doing real experiments mapping non-confidential structures I can post it here and we can discuss tuning and configuration!

monabf commented 6 years ago

Hi again,

We ran simulations in which the drone is artificially stabilized (IMU data set manually so that the drone doesn't tilt/vibrate/oscillate), and with those simulations Cartographer ran perfectly. This confirms that you were right: the only thing stopping the SLAM from working properly is that the PoseExtrapolator does not take the faster drone dynamics into account, since it doesn't use the full IMU data.

We are now starting to implement a UKF inside the PoseExtrapolator, using the earlier Cartographer commit you mentioned. Our new Cartographer version compiles but does not run: several checks are triggered, and even when they are all commented out the SLAM does not run properly (no trajectory output).

Here is the new Git repo; the bag files are still here (one with a normal swath, one with a wide swath, and one with a wide swath and an artificially stable drone with constant IMU data, which runs with the usual Cartographer version). The output of rosbag_validate on the first bag (normal swath) is still the same.

Could you tell me if you can see why our version is not running? Is there an obvious reason why numerous checks that were not triggered before are triggered after our modifications? Is there something missing or a mistake in what we did?

Thank you in advance for your help!

gaschler commented 6 years ago

Just to make sure: if you fork an old version of cartographer, you also need to use the corresponding version of cartographer_ros; there were some changes related to timestamps, among others. It would be better to use GitHub's fork mechanism (for both repos) so one can see the diff to the old base.

monabf commented 6 years ago

Hi,

We now use the old UKF version of Cartographer in parallel with the current version. We have aerial (drone) and underwater (rover) simulations and run Cartographer with the simulated lidar data. In the simulations the vehicle maps a certain structure, either by spiraling around it or by going over it. Most of the time the UKF version of Cartographer performs much better.

So I was wondering: why exactly did you give up on the UKF and switch to a much more basic state estimation? You said it gave better experimental results; I suppose that's only true for slow dynamics and more or less planar movements, like with the backpack? Did you ever figure out why the simple estimation works better than the UKF in those cases? I would very much like to know in which cases the UKF works better than the current PoseExtrapolator and why, so if you have any ideas or explanations, or can tell me precisely why you chose to give up on the UKF, that would be great!
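To make the tradeoff I'm asking about concrete, here is a toy 1D comparison (my own construction, not code from either Cartographer version): under constant acceleration, extrapolating with the velocity from the last two poses lags the true position by a*dt^2 per step, while a prediction that integrates the measured acceleration, as a UKF prediction step would, tracks it exactly.

```python
def const_velocity_predict(p_prev, p_curr, dt):
    # PoseExtrapolator-style: velocity from the last two poses.
    v = (p_curr - p_prev) / dt
    return p_curr + v * dt

def imu_predict(p_curr, v_curr, accel, dt):
    # UKF-style prediction step: integrate the measured acceleration.
    return p_curr + v_curr * dt + 0.5 * accel * dt * dt

dt, a = 0.1, 8.0              # strong acceleration, e.g. an aggressive drone
p = lambda t: 0.5 * a * t * t  # true position under constant acceleration
t = 1.0
truth = p(t + dt)
cv = const_velocity_predict(p(t - dt), p(t), dt)
imu = imu_predict(p(t), a * t, a, dt)
# The constant-velocity prediction lags by a*dt^2; the IMU one matches.
print(truth - cv, truth - imu)
```

For slow, nearly planar motion a*dt^2 is tiny, which would be consistent with the simple extrapolator winning on backpack and car data while losing on fast drone dynamics.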

Thanks for your help!

myavari commented 6 years ago

Hi @monabf

May I ask how far you got with making the drone version of Cartographer?

We need a SLAM library for drones and are very interested in using Cartographer. I'd appreciate any hints or updates you can share.

thanks

monabf commented 6 years ago

Hi @myavari

My quick fix for using Cartographer on a drone was to go back to an earlier version of the code, where a UKF was used in PoseTracker for estimating the state of the vehicle, instead of the current PoseExtrapolator. I used the versions of Cartographer from approximately June 30th 2017: commit 17a22ed of cartographer and the more or less corresponding 125aee3 of cartographer_ros. It worked rather OK, not as well with 3D motion as with 2D, but still OK. Certain complex scenarios caused trouble and I more or less gave up on it eventually, but if you want rather simple scenarios where the drone looks around itself and doesn't go too fast, you should be fine.

Hope that helps!