homeskies / uw_common

Skills and non-robot dependent code

Collect, store 3D data persistently while running #14

Open nickswalker opened 4 years ago

nickswalker commented 4 years ago

There are two possible "first-whacks" at picking and placing.

One is doing perception from scratch whenever we need it.

The advantage is that the robot is well localized with respect to the point cloud, since it hasn't moved since gathering the data. The disadvantage is that repeating this perception and processing takes multiple seconds each time the robot approaches the table.

The alternative is to keep a coarse octomap running that integrates point clouds continuously, and to rely on the robot staying localized well enough against that map.

In practice, the time constraints of tasks mean it's highly advantageous to avoid re-perceiving. It is an open question how well the octomap performs given our sensors and compute. The octomap will probably be built from RealSense point clouds, so it'll run on the Xavier to avoid transmitting that data to the main computer. We'll need the bringup and launch files to be in place already (homeskies/uw_fetch#7), and the RealSense needs to be well calibrated before we try this (homeskies/uw_fetch#4).
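
For reference, a minimal sketch of what such an aggregator node could look like (ROS 1, C++), not the actual implementation: the topic name, 5 cm resolution, max range, and the fixed sensor origin are all placeholder assumptions; a real node would transform each cloud into the map frame with tf2 and take the origin from TF.

```cpp
// Sketch: continuously fold incoming point clouds into a persistent OcTree.
// Assumes clouds arrive already expressed in the map frame (placeholder).
#include <cmath>

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl_conversions/pcl_conversions.h>
#include <octomap/octomap.h>

class CloudAggregator {
public:
  explicit CloudAggregator(ros::NodeHandle& nh)
      : tree_(0.05)  // 5 cm voxels: coarse enough to keep up on the Xavier
  {
    // Placeholder topic name; point this at the RealSense cloud topic.
    sub_ = nh.subscribe("/camera/depth/points", 1,
                        &CloudAggregator::cloudCallback, this);
  }

private:
  void cloudCallback(const sensor_msgs::PointCloud2ConstPtr& msg) {
    pcl::PointCloud<pcl::PointXYZ> cloud;
    pcl::fromROSMsg(*msg, cloud);

    // Copy finite points into octomap's own point cloud type.
    octomap::Pointcloud scan;
    for (const auto& p : cloud.points) {
      if (std::isfinite(p.x) && std::isfinite(p.y) && std::isfinite(p.z)) {
        scan.push_back(p.x, p.y, p.z);
      }
    }

    // Ray-cast free/occupied updates from the (assumed) sensor origin;
    // the 4 m max range keeps each insertion cheap.
    octomap::point3d origin(0.0f, 0.0f, 0.0f);  // placeholder: take from TF
    tree_.insertPointCloud(scan, origin, 4.0);
    tree_.prune();  // merge identical children to keep the tree compact
  }

  ros::Subscriber sub_;
  octomap::OcTree tree_;
};

int main(int argc, char** argv) {
  ros::init(argc, argv, "cloud_aggregator");
  ros::NodeHandle nh;
  CloudAggregator agg(nh);
  ros::spin();
  return 0;
}
```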

nickswalker commented 4 years ago

Update: I extracted a basic version of Villa's pointcloud aggregator/octomap wrapper. It'll need to be hooked up to a 2D bounding box detector (#5) to be useful. It also contains some of the tabletop-perception-style surface detection and extraction, though that isn't exposed well.
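
For anyone unfamiliar with that tabletop-style extraction, here is a rough sketch of the standard PCL approach (not the extracted Villa code): fit the dominant horizontal plane with RANSAC and keep the points above it as object candidates. The thresholds and the assumption that z is up in the cloud's frame are illustrative.

```cpp
// Sketch: remove the support plane from a cloud, leaving object candidates.
#include <pcl/ModelCoefficients.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/sample_consensus/method_types.h>
#include <pcl/sample_consensus/model_types.h>
#include <pcl/segmentation/sac_segmentation.h>

using Cloud = pcl::PointCloud<pcl::PointXYZ>;

// Returns the points that are not part of the detected support plane.
Cloud::Ptr extractAboveTable(const Cloud::Ptr& cloud) {
  pcl::SACSegmentation<pcl::PointXYZ> seg;
  seg.setModelType(pcl::SACMODEL_PERPENDICULAR_PLANE);
  seg.setMethodType(pcl::SAC_RANSAC);
  seg.setAxis(Eigen::Vector3f(0.0f, 0.0f, 1.0f));  // expect a horizontal plane
  seg.setEpsAngle(0.15);                            // ~8.5 degrees of tilt allowed
  seg.setDistanceThreshold(0.02);                   // 2 cm inlier band
  seg.setInputCloud(cloud);

  pcl::PointIndices::Ptr inliers(new pcl::PointIndices);
  pcl::ModelCoefficients::Ptr coefficients(new pcl::ModelCoefficients);
  seg.segment(*inliers, *coefficients);

  // Drop the plane inliers; what remains are tabletop object candidates.
  pcl::ExtractIndices<pcl::PointXYZ> extract;
  extract.setInputCloud(cloud);
  extract.setIndices(inliers);
  extract.setNegative(true);

  Cloud::Ptr objects(new Cloud);
  extract.filter(*objects);
  return objects;
}
```

Cropping the input to the 2D bounding box from #5 before running this would keep the RANSAC step cheap and avoid fitting planes to the floor.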