simondlevy / BreezySLAM

Simple, efficient, open-source package for Simultaneous Localization and Mapping
GNU Lesser General Public License v3.0

using kinect instead of lidar? #20

Open · ghost opened this issue 6 years ago

ghost commented 6 years ago

In your example code you use lidar = MyLidarModel() and then, inside the loop, scan = readLidar().

Any chance of replacing that with calls that read Kinect depth data?
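
For context, the example being referred to is the skeleton in the BreezySLAM README, which looks roughly like this (MyLidarModel and readLidar are placeholders for a sensor model and a scan-reading function that the user supplies):

```python
from breezyslam.algorithms import RMHC_SLAM

lidar = MyLidarModel()              # placeholder: a lidar description, e.g. a breezyslam.sensors.Laser
mapbytes = bytearray(800 * 800)     # occupancy-grid map, 800 x 800 pixels

slam = RMHC_SLAM(lidar, 800, 35)    # 800-pixel map covering 35 meters

while True:
    scan = readLidar()              # placeholder: returns a list of distances in mm, one per scan angle
    slam.update(scan)               # feed the new scan to SLAM
    x, y, theta = slam.getpos()     # current pose estimate (mm, mm, degrees)
    slam.getmap(mapbytes)           # copy the current map into mapbytes
```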

simondlevy commented 6 years ago

That sounds like an excellent new feature, but it would unfortunately involve quite a bit of work. Currently, BreezySLAM is set up to work with a rotating lidar that returns a sequence of distances, each associated with a horizontal scan angle. This is pretty much the opposite of Kinect, which returns a 2D image (vertical, horizontal) of distances based on a fixed-orientation scan.
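
To make that concrete, here is a hedged sketch of the kind of sensor description and scan BreezySLAM works with (the numbers are illustrative, not tuned for any real device):

```python
from breezyslam.sensors import Laser

# A rotating lidar is described by how many readings it takes per revolution
# and the angular span those readings cover.
lidar = Laser(360,    # scan_size: one distance reading per degree of rotation
              10,     # scan_rate_hz: ten revolutions per second
              360,    # detection_angle_degrees: full horizontal sweep
              4000)   # distance_no_detection_mm: value reported when nothing is seen

# A single scan is then a flat list of 360 distances in millimetres, where
# index i corresponds to horizontal scan angle i:
scan = [1200] * 360

# A Kinect depth frame, by contrast, is a 2D image (roughly 480 x 640) of
# distances taken from one fixed orientation, covering only about 57 degrees
# horizontally and 43 degrees vertically.
```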

ghost commented 6 years ago

Hi Simon, thanks for your answer. Assuming I have the depth map, could I generate the distance/angle data from a single row of it and fill the missing angles with "None" distances? Since I have a movement mode where the whole robot rotates 360 degrees, I could …
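
A minimal sketch of that row-to-scan idea, assuming the depth row has already been converted to millimetres (e.g. via libfreenect) and using 0 rather than None as the "no reading" placeholder, since update() appears to expect a plain list of integer distances. The field-of-view constant, the helper name, and the choice of sentinel are assumptions, not anything BreezySLAM or this thread specifies:

```python
SCAN_SIZE       = 360   # one entry per degree, matching a 360-degree lidar model
KINECT_HFOV_DEG = 57    # approximate horizontal field of view of a Kinect v1

def depth_row_to_scan(depth_row_mm, robot_heading_deg=0):
    """Map one horizontal row of a depth image (values in mm) onto a
    360-entry scan, leaving angles the Kinect cannot see set to 0."""
    scan = [0] * SCAN_SIZE
    n = len(depth_row_mm)
    for col, dist in enumerate(depth_row_mm):
        # Column 0 is the left edge of the image; convert the column index
        # to an angular offset from the camera's optical axis.
        offset_deg = (col / (n - 1) - 0.5) * KINECT_HFOV_DEG
        angle = int(round(robot_heading_deg + offset_deg)) % SCAN_SIZE
        if dist > 0:
            scan[angle] = int(dist)
    return scan

# Example with a synthetic 640-pixel row in which everything is 1.5 m away:
scan = depth_row_to_scan([1500] * 640)
```

Since the robot can rotate in place, rows captured at different headings could be merged by passing the current heading as robot_heading_deg, so the 57-degree slices gradually fill in the full 360-degree scan; whether such sparse scans work well with RMHC_SLAM is something that would need testing.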
