
ROS package to handle camera and vision related software for the pong robot

pongrobot_perception 1.0.0

ROS package for processing camera-related inputs for the brobot pong robot. Includes Arduino code for reading the IMU orientation from the MPU6050 and publishing it into ROS. This tag should be run against pongrobot_actuation 1.0.0 for best compatibility.

Dependencies

A Note on Realsense

There are currently problems with the apt-distributed release of realsense on ARM64 when the camera is configured to use both RGB and stereo depth simultaneously. To resolve this, we recommend building and installing librealsense2 and realsense-ros from source, using the ROS wrapper tag realsense-ros 2.2.24 and the corresponding SDK tag librealsense v2.44.0.
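For reference, a source build against those tags could be set up roughly as follows (paths are illustrative, and the elided build steps should follow the upstream instructions for ARM64):

```sh
# Check out the SDK at the matching tag and build per the upstream docs
git clone --branch v2.44.0 https://github.com/IntelRealSense/librealsense.git
# ... cmake / make / sudo make install per librealsense's ARM64 instructions ...

# Check out the ROS wrapper at the matching tag into the catkin workspace
git clone --branch 2.2.24 https://github.com/IntelRealSense/realsense-ros.git ~/catkin_ws/src/realsense-ros
# ... then build the workspace with catkin_make or catkin build ...
```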

Node Descriptions

Topics

Config Options

Node rate config options are available in config/rate_config.yaml and are loaded under the namespace /rate.

Coordinate frame config options are available in config/frame_config.yaml and are loaded under the namespace /frame.

Cup Detector config options are available in config/detector_config.yaml and are loaded under the namespace /detector.

Calibrator config options are also available in config/detector_config.yaml and are loaded under the namespace /detector.

Game Manager config options are available in config/game_config.yaml and are loaded under the namespace /game.
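For illustration, a rate config in this layout might look like the following; the keys shown here are hypothetical, so check config/rate_config.yaml for the real ones:

```yaml
# Hypothetical example of config/rate_config.yaml (loaded under /rate);
# the actual parameter names live in the file itself.
cup_detector_rate: 15   # Hz
calibrator_rate: 5      # Hz
```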

Provided Launchfiles

Launchfiles can be found under /launch and coordinate running the required nodes and rviz configurations. Headless variants of many launchfiles are provided; the only difference is that these do not run rviz.

Utilities

Some useful utilities are provided under /utils; these are used for setting up system services, remote connections, and udev rules.

Udev Rules

Before the package can be run, the appropriate udev rules must be set up. This allows the system to correctly identify USB devices. To set up the rule for the transform node, run sudo cp 11-brobot-tf.rules /etc/udev/rules.d. NOTE: udev rules for the realsense are not included, as they should be installed with the library.
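For reference, a udev rule of this kind typically matches the device by its USB vendor/product ID and assigns a stable symlink. The values below are placeholders, not the actual contents of 11-brobot-tf.rules:

```sh
# Placeholder example of a udev rule (NOT the real 11-brobot-tf.rules):
# match a hypothetical USB-serial adapter and expose it as /dev/brobot-tf
SUBSYSTEM=="tty", ATTRS{idVendor}=="0403", ATTRS{idProduct}=="6001", SYMLINK+="brobot-tf"
```

After copying a rule into /etc/udev/rules.d, it can be applied without a reboot via sudo udevadm control --reload-rules && sudo udevadm trigger.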

Setup Remote Environment

The system is set up so the robot will normally run without rendering the visualization, but it will stream all the data necessary to do so on a remote machine. If ROS is properly configured on another computer on the same network, all the visualization and debugging tools, such as rviz, can connect to the robot. To speed up this process, a shell script is provided to quickly set the necessary environment variables for remote visualization. To enable remote monitoring, run source utils/setupRemoteEnv.sh from the package root.
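Scripts like this conventionally just export the standard ROS networking variables; a hypothetical sketch of what such a script sets (the actual values depend on your network, so see utils/setupRemoteEnv.sh itself):

```sh
# Hypothetical sketch of a remote-env script; replace the placeholders
# with the robot's hostname and this machine's reachable address.
export ROS_MASTER_URI=http://<robot-hostname>:11311   # point at the robot's roscore
export ROS_IP=<this-machine-ip>                       # address peers should use to reach us
```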

Running as a Background Service

The launch files can be installed as a system service to run in the background on startup. This is managed using the robot_upstart package.

The /utils/service/ folder contains several scripts to manage this process:
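robot_upstart provides an install command that the scripts presumably wrap; a sketch of typical usage (the launch file and job names here are guesses, not the package's actual ones):

```sh
# Illustrative robot_upstart usage; the launch file name is hypothetical
rosrun robot_upstart install pongrobot_perception/launch/perception.launch --job pongrobot
# robot_upstart then asks you to reload systemd and start the new job
sudo systemctl daemon-reload && sudo systemctl start pongrobot
```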

Cup Detector Pipeline

Here is a high-level overview of the point cloud processing pipeline used to detect the cups in the cup_detector_node.

  1. Transform to Robot frame: The first step is to pull down the transform from the IMU to make sure the cloud is properly aligned with the ground. This will make it easier to detect the table going forward.
  2. Passthrough Filter: Once the cloud is in the correct reference frame, a passthrough filter is applied to the depth axis to cut out background noise based on the config. Normally this step would also include downsampling, but the realsense doesn't provide a particularly dense cloud in each frame.
  3. RANSAC Plane Detection: Next, the cloud is passed into a PCL segmenter using RANSAC to fit against the model of a plane parallel to the depth axis. Knowing that the table is more or less parallel to the ground in this case is very helpful.
  4. Build the Plane: Once the coefficients for the plane model have been determined, the inliers of the cloud are extracted and the extreme values are used to find the approximate bounds of the table.
  5. Filter Out Objects on the Table: Knowing the approximate horizontal dimensions of the table and the maximum height of the inliers on the table, a CropBox filter can be created to filter out everything from the original cloud except what is on the table.
  6. Cluster the Objects: Once the cloud has been reduced to only the objects on the table, a Euclidean Clustering algorithm is applied to isolate cups or groups of cups.
  7. Calculate Centroid: After all the clusters have been detected, the centroid of each cluster can be calculated and used as the location of the cup or set of cups. While the detector cannot always distinguish between cups that are very close together (usually because of the camera FOV and angle), it will still detect the centroid of the group of cups. This usually (but not always) happens to be the best location to aim at, so this is not a huge problem.
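The steps above can be sketched with a toy, self-contained Python example. The real node uses PCL in C++; all thresholds, helper names, and the simplified RANSAC/clustering logic below are illustrative, not the node's actual API:

```python
# Illustrative sketch of the cup-detector pipeline on a list of (x, y, z)
# tuples; the real node uses PCL in C++ and these names are made up.
import math
import random

def passthrough(cloud, axis, lo, hi):
    """Step 2: keep points whose coordinate on `axis` lies in [lo, hi]."""
    return [p for p in cloud if lo <= p[axis] <= hi]

def ransac_horizontal_plane(cloud, thresh=0.02, iters=100):
    """Step 3: fit z = c (a plane parallel to the ground) with a toy RANSAC."""
    best_c, best_inliers = None, []
    random.seed(0)  # deterministic for the sketch
    for _ in range(iters):
        c = random.choice(cloud)[2]
        inliers = [p for p in cloud if abs(p[2] - c) < thresh]
        if len(inliers) > len(best_inliers):
            best_c, best_inliers = c, inliers
    return best_c, best_inliers

def crop_above_table(cloud, inliers, table_z, margin=0.01):
    """Steps 4-5: the inlier extents give the table bounds; keep only
    points inside those bounds and above the table surface."""
    xs = [p[0] for p in inliers]
    ys = [p[1] for p in inliers]
    return [p for p in cloud
            if min(xs) <= p[0] <= max(xs)
            and min(ys) <= p[1] <= max(ys)
            and p[2] > table_z + margin]

def euclidean_cluster(points, tol=0.05):
    """Step 6: greedy Euclidean clustering (stand-in for PCL's version)."""
    clusters = []
    for p in points:
        for c in clusters:
            if any(math.dist(p, q) < tol for q in c):
                c.append(p)
                break
        else:
            clusters.append([p])
    return clusters

def centroids(clusters):
    """Step 7: the mean of each cluster is the aim point for that cup/group."""
    return [tuple(sum(v) / len(c) for v in zip(*c)) for c in clusters]
```

A flat cloud at z = 0 with a few raised points then yields the table plane, the objects on it, and one centroid per cup group.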

Reference Frames

MPU             ATMega32U4      VL53L0X
VCC  <------->  VCC  <------->  VCC
GND  <------->  GND  <------->  GND
SCL  <------->  3    <------->  SCL
SDA  <------->  2    <------->  SDA
INT  <------->  7               NC