derektan95 / sensor-fusion-projects-udacity-nanodegree

This repository contains projects using LiDAR, camera, radar and Kalman filters for sensor fusion. Since each type of sensor has its own strengths and limitations, it is important to investigate how they can complement one another to provide the most reliable estimate of an obstacle's position and velocity.

Question #1

dineshrboson opened this issue 3 years ago

dineshrboson commented 3 years ago

Is it possible to fuse all of this data and make a perfect prediction? How can the radar data, which is in MATLAB, be fused with the remaining files?

derektan95 commented 3 years ago

Hi @dineshrboson, Kalman filters can be used to fuse sensor readings so that an autonomous system can adapt to a wide range of situations. If you are using this for a robotics application, you could take a look at this ROS package here. You can select which data to fuse; for odometry, for instance, you could choose to fuse GPS, IMU and/or wheel-encoder information. You can also select between an Extended and an Unscented Kalman Filter. A rough sketch of this idea is shown below.
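
To make the "select which data to fuse" idea concrete, here is a minimal, hypothetical sketch (not taken from this repository or from the ROS package) of a linear Kalman filter over the state `[x, y, vx, vy]`, where a GPS-like fix updates only the position components and a wheel-encoder-like reading updates only the velocity components. It assumes the Eigen library is available; all noise values are illustrative.

```cpp
#include <Eigen/Dense>
#include <iostream>

using Eigen::MatrixXd;
using Eigen::VectorXd;

// Minimal linear Kalman filter over the state [x, y, vx, vy].
struct KalmanFilter {
  VectorXd x = VectorXd::Zero(4);                // state estimate
  MatrixXd P = MatrixXd::Identity(4, 4) * 1e3;   // state covariance (uncertain start)

  // Constant-velocity prediction over time step dt with process noise Q.
  void Predict(double dt, const MatrixXd& Q) {
    MatrixXd F = MatrixXd::Identity(4, 4);
    F(0, 2) = dt;  // x += vx * dt
    F(1, 3) = dt;  // y += vy * dt
    x = F * x;
    P = F * P * F.transpose() + Q;
  }

  // Generic measurement update: z = H * x + noise with covariance R.
  void Update(const VectorXd& z, const MatrixXd& H, const MatrixXd& R) {
    VectorXd y = z - H * x;                        // innovation
    MatrixXd S = H * P * H.transpose() + R;        // innovation covariance
    MatrixXd K = P * H.transpose() * S.inverse();  // Kalman gain
    x = x + K * y;
    P = (MatrixXd::Identity(4, 4) - K * H) * P;
  }
};

int main() {
  KalmanFilter kf;
  MatrixXd Q = MatrixXd::Identity(4, 4) * 0.1;  // illustrative process noise

  // GPS-like sensor: measures position only -> H selects [x, y].
  MatrixXd H_gps(2, 4);
  H_gps << 1, 0, 0, 0,
           0, 1, 0, 0;
  MatrixXd R_gps = MatrixXd::Identity(2, 2) * 2.0;

  // Wheel-encoder-like sensor: measures velocity only -> H selects [vx, vy].
  MatrixXd H_enc(2, 4);
  H_enc << 0, 0, 1, 0,
           0, 0, 0, 1;
  MatrixXd R_enc = MatrixXd::Identity(2, 2) * 0.5;

  // Fuse one reading from each source around a prediction step.
  kf.Predict(0.1, Q);
  kf.Update((VectorXd(2) << 10.0, 5.0).finished(), H_gps, R_gps);
  kf.Update((VectorXd(2) << 1.0, 0.0).finished(), H_enc, R_enc);

  std::cout << "state: " << kf.x.transpose() << std::endl;
  return 0;
}
```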

Radar data in this case is used to detect an obstacle's position and velocity. You could use a Kalman filter to fuse it with any other sensor source that gives an obstacle's position and velocity (e.g. lidar plus object detection from a camera/CV pipeline). Hope it helps.
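
Reusing the hypothetical `KalmanFilter` sketch above, the obstacle-tracking case only changes the per-sensor measurement model `H` and noise `R`. A lidar detection gives position only, while a radar return natively gives range, bearing and range rate, which is a nonlinear function of the state, so in practice you would linearise it with an EKF or use a UKF; the fragment below sidesteps that by assuming the radar track has already been converted to Cartesian position and velocity.

```cpp
// Continuing from the KalmanFilter sketch above (state is [x, y, vx, vy]).
// Hypothetical measurement models; noise values are illustrative.

// Lidar object detection: measures the obstacle's position only.
MatrixXd H_lidar(2, 4);
H_lidar << 1, 0, 0, 0,
           0, 1, 0, 0;
MatrixXd R_lidar = MatrixXd::Identity(2, 2) * 0.0225;

// Radar track converted to Cartesian: measures position and velocity.
MatrixXd H_radar = MatrixXd::Identity(4, 4);
MatrixXd R_radar = MatrixXd::Identity(4, 4) * 0.09;

// Whenever a measurement arrives, predict to its timestamp and update:
// kf.Predict(dt, Q);
// kf.Update(z_lidar, H_lidar, R_lidar);  // lidar detection
// kf.Update(z_radar, H_radar, R_radar);  // radar track
```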