lardemua / SmObEx

SmObEx is a package, developed for my MSc thesis, that autonomously explores a given space.
GNU General Public License v3.0
Topics: fanuc, ros, xtion

SmObEx: Smart Object Exploration

Mechanical Engineering Master's Thesis in the field of Robotics.

The end goal of this dissertation was to develop an autonomous exploration robot capable of choosing the Next Best View, i.e., the view that reveals the largest amount of information about a given volume.

The exploration solution is based on a robotic manipulator, an RGB-D sensor, and ROS. The manipulator provides movement while the sensor evaluates the scene in its Field of View. An OcTree implementation reconstructs the environment, and the portions of the defined exploration volume where no information has been gathered yet are segmented. This segmentation (or clustering) aids the pose sampling operation by ensuring that all generated poses are plausible. Ray casting is then performed, parameterized either by the sensor's resolution or by the characteristics of the unknown scene, to assess the quality of each pose. The pose estimated to reveal the largest amount of unknown space is chosen as the next one to visit, i.e., the Next Best View. The exploration ends when all unknown voxels have been evaluated or when the remaining ones cannot be measured from any reachable pose.

The solution provided is not only adaptable to changes in the environment during the exploration, but also portable to manipulators other than the one used during development.
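The ray-casting scoring described above can be sketched as follows. This is a hypothetical, minimal illustration, not the actual SmObEx implementation: it assumes the map is a dictionary of voxel indices to states ("free", "occupied", "unknown"), and the voxel size, ray step, and function names are all invented for the example.

```python
import numpy as np

VOXEL = 0.1  # assumed voxel edge length, in metres

def cast_ray(origin, direction, voxels, max_range=2.0):
    """Step along a ray; return the first unknown voxel it reaches, if any."""
    direction = direction / np.linalg.norm(direction)
    # Half-voxel steps so no voxel along the ray is skipped.
    for t in np.arange(VOXEL / 2, max_range, VOXEL / 2):
        point = origin + t * direction
        key = tuple(np.floor(point / VOXEL).astype(int))
        state = voxels.get(key, "unmapped")
        if state == "occupied":
            return None   # ray is blocked before reaching unknown space
        if state == "unknown":
            return key    # this pose would measure an unknown voxel
    return None

def score_pose(origin, ray_directions, voxels):
    """Score a candidate pose by how many distinct unknown voxels its rays reach."""
    hits = {cast_ray(origin, d, voxels) for d in ray_directions}
    hits.discard(None)
    return len(hits)
```

The Next Best View is then simply the sampled pose with the highest score; occlusion is handled naturally because occupied voxels terminate rays before they can reach unknown space behind them.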

Released Content

Dissertation title

Smart object exploration by robotic manipulator

Department of Mechanical Engineering (DEM), University of Aveiro (UA)

LAR: Laboratory of Automation and Robotics

2019

Advisor

Miguel Riem de Oliveira GitHub

DEM, UA

Aveiro, Portugal

Co-Advisor

Rafael Arrais GitHub

INESC TEC

Porto, Portugal

Built with

Hardware

Software

Installation Guide

After installing ROS Melodic, you will need to run the dependencies installer, then clone and build this repository.

Robot Simple Usage

Everything is based on this tutorial.

Moving the joint with the interface

roslaunch fanuc_m6ib_support test_m6ib6s.launch

If the graphics aren't rendered correctly, the fix is described in this issue; all you have to do is

export LC_NUMERIC="en_US.UTF-8"

roslaunch fanuc_m6ib_support test_m6ib6s.launch

Moving the robot with the TP

Wire connect the robot to the machine.

Configure the PC's IP address to match the robot's subnet, in this example: 192.168.0.200

On the TP, run rosstate

On the Linux machine, run (using the IP of your robot)

roslaunch fanuc_m6ib_support robot_state_visualize_m6ib6s.launch robot_ip:=192.168.0.230

Moving the robot with MoveIt

Start the ROS TPE program in auto mode.

In the terminal run (using the IP of your robot)

roslaunch fanuc_xtion_moveit_config moveit_planning_execution.launch sim:=false robot_ip:=192.168.0.230

SmObEx Usage

Intrinsic Calibration

The intrinsic calibration process was done following this tutorial.

roslaunch openni2_launch openni2.launch 

rosrun camera_calibration cameracalibrator.py image:=/camera/rgb/image_raw camera:=/camera/rgb --size 8x6 --square 0.105


Extrinsic Calibration Mode

Video: SmObEx - ROS aruco hand2eye extrinsic calibration

For the calibration do the following steps:

  1. place the ArUco marker (on smobex_bringup/launch/bringup.launch put the correct marker id and size)

  2. run rosstate on the TP and then

roslaunch smobex_bringup bringup.launch config:=true calibration:=true


  3. to store the calibration, open another terminal and run
rosrun smobex_calibration store_calibration.py

(thanks to @miguelriemoliveira for the source code).


Note: verify that store_calibration.py has execution permissions.
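Once the hand-eye calibration is stored, the camera pose in the robot base frame follows from composing homogeneous transforms. The sketch below is illustrative only (it is not store_calibration.py): the numeric values and the camera-on-flange mounting are assumed for the example.

```python
import numpy as np

def transform(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a 3-vector."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Assumed example values: flange 0.6 m above the base, camera mounted
# 5 cm along the flange z-axis, no relative rotation.
T_base_ee = transform(np.eye(3), [0.4, 0.0, 0.6])     # flange pose (robot TF)
T_ee_camera = transform(np.eye(3), [0.0, 0.0, 0.05])  # hand-eye result

# Chaining the transforms gives the camera pose in the base frame.
T_base_camera = T_base_ee @ T_ee_camera
```

With this fixed T_ee_camera, every sensor measurement can be expressed in the base frame for any robot configuration, which is what makes the OctoMap reconstruction consistent across viewpoints.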

Define Space Mode

Video: SmObEx - OctoMap mapping of selected volume of the world

roslaunch smobex_bringup bringup.launch config:=true define_vol:=true

When finished, don't forget to click on the grey sphere to save the configuration, and only then run

rosrun smobex_bringup store_volume.py

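Conceptually, the stored volume restricts which voxels the exploration considers. The sketch below assumes the volume is an axis-aligned box defined by two corners; the actual format written by store_volume.py may differ, and the corner values are invented for the example.

```python
import numpy as np

def inside_volume(point, vol_min, vol_max):
    """True if a voxel centre lies within the axis-aligned exploration box."""
    point = np.asarray(point)
    return bool(np.all(point >= vol_min) and np.all(point <= vol_max))

# Assumed example corners of the exploration volume, in metres (base frame).
vol_min = np.array([0.3, -0.4, 0.0])
vol_max = np.array([1.0, 0.4, 0.8])
```

Voxels outside this box are ignored during segmentation and scoring, so the exploration terminates once the unknown space inside the box is resolved.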

Manual Pose Evaluation

Video: SmObEx - Manual mode exploration of a volume

Autonomous Exploration

Video: SmObEx - Fully autonomous exploration

roslaunch smobex_bringup bringup.launch

Pose Evaluated

Green: Free Voxels

Red: Occupied Voxels

Orange: Unknown Voxels

Blue: Voxels expected to be known with this pose

Red Wireframe: Camera's frustum

Gray Lines: Rays used in raycasting

Auxiliary Functionalities

Recording mode

To record the point cloud run

roslaunch smobex_bringup record.launch

(You must change the saving path in the launch file)

If at any moment you want to save the OctoMap, run

rosnode kill /accumulatedpointcloud

rosrun octomap_server octomap_saver -f test.ot

To visualize the point cloud or the OctoMap run, respectively,

pcl_viewer auto_save.pcd

octovis test.ot

Files from the 360-degree mapping of LAR:

LAR_360_octomap

LAR_360_pointCloud

Record bagfile

roslaunch smobex_bringup record_bag.launch name:=file_name

roslaunch smobex_bringup bringup.launch

Play bagfile

roslaunch smobex_bringup play_bag.launch file:=file_name

roslaunch smobex_bringup bringup.launch online:=false

Simulating in Roboguide

Connect both Linux and Windows machines by ethernet cable.

On the Windows machine, set the IPv4 address to 192.168.0.233; on the Linux machine, to 192.168.0.230.

Start Roboguide and run the ROS TPE program.

Then run

roslaunch fanuc_xtion_moveit_config moveit_planning_execution.launch robot_ip:=192.168.0.233 use_bswap:=false sim:=false

and set with rqt_reconfigure

<param name="move_group/trajectory_execution/allowed_execution_duration_scaling" value="4.0" />
<param name="move_group/trajectory_execution/execution_duration_monitoring" value="false" />

Offline (robot) Mode

roslaunch smobex_bringup bringup.launch online:=false


Contributions