Mechanical Engineering Master's Thesis in the field of Robotics.
The end goal of this dissertation was to develop an autonomous exploration robot capable of choosing the Next Best View, i.e., the view that reveals the most information about a given volume.
The exploration solution is based on a robotic manipulator, an RGB-D sensor, and ROS. The manipulator provides movement while the sensor evaluates the scene in its Field of View. Using an OcTree implementation to reconstruct the environment, the portions of the defined exploration volume where no information has been gathered yet are segmented. This segmentation (or clustering) aids the pose sampling operation by ensuring that all generated poses are plausible. Ray casting is performed, based either on the sensor's resolution or on the characteristics of the unknown scene, to assess the quality of each pose. The pose estimated to evaluate the largest amount of unknown space is chosen as the one to visit next, i.e., the Next Best View. The exploration ends when all unknown voxels have been evaluated or when the remaining ones cannot be measured from any reachable pose.
The solution provided is not only adaptable to changes in the environment during the exploration, but also portable to manipulators other than the one used in the development.
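The Next Best View loop described above can be sketched in a few lines. The following is a toy 2-D illustration, not the actual SmObEx/OctoMap implementation: the state labels (`FREE`, `OCCUPIED`, `UNKNOWN`), grid representation, and ray parameters are all assumptions made for the sake of the example.

```python
# Toy Next-Best-View selection on a 2-D occupancy grid.
# Cell states (illustrative, not the OctoMap API):
import math

FREE, OCCUPIED, UNKNOWN = 0, 1, -1

def cast_ray(grid, x, y, angle, max_range=20):
    """Step along a ray, collecting unknown cells seen before an occupied hit."""
    unknown_seen = set()
    for step in range(1, max_range):
        cx = int(round(x + step * math.cos(angle)))
        cy = int(round(y + step * math.sin(angle)))
        if not (0 <= cy < len(grid) and 0 <= cx < len(grid[0])):
            break                      # left the exploration volume
        if grid[cy][cx] == OCCUPIED:
            break                      # ray is blocked; nothing behind is revealed
        if grid[cy][cx] == UNKNOWN:
            unknown_seen.add((cx, cy))
    return unknown_seen

def score_pose(grid, x, y, heading, fov=math.pi / 2, n_rays=15):
    """Pose quality: number of unknown cells hit by rays spread over the FOV."""
    revealed = set()
    for i in range(n_rays):
        a = heading - fov / 2 + i * fov / (n_rays - 1)
        revealed |= cast_ray(grid, x, y, a)
    return len(revealed)

def next_best_view(grid, candidate_poses):
    """Choose the (x, y, heading) pose expected to reveal the most unknown space."""
    return max(candidate_poses, key=lambda p: score_pose(grid, *p))
```

For example, on a grid whose left half is free and right half unknown, a pose facing right scores higher than the same position facing left, so `next_best_view` picks it.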
Smart object exploration by robotic manipulator
Department of Mechanical Engineering (DEM), University of Aveiro (UA)
LAR: Laboratory of Automation and Robotics
2019
Miguel Riem de Oliveira GitHub
DEM, UA
Aveiro, Portugal
Rafael Arrais GitHub
INESC TEC
Porto, Portugal
MoveIt
FANUC Driver (based on)
After installing ROS Melodic, you will need to run the dependencies installer and then clone and build this repository.
Everything is based on this tutorial.
roslaunch fanuc_m6ib_support test_m6ib6s.launch
If the graphics aren't rendered correctly, the solution is described in this issue. All you have to do is run
export LC_NUMERIC="en_US.UTF-8"
roslaunch fanuc_m6ib_support test_m6ib6s.launch
Connect the robot to the machine with an Ethernet cable.
Configure the PC's IP address to match the robot's network; in this example: 192.168.0.200
On the TP, run rosstate
On the Linux machine, run (using the IP of your robot)
roslaunch fanuc_m6ib_support robot_state_visualize_m6ib6s.launch robot_ip:=192.168.0.230
Start the ros TPE program in auto mode.
In the terminal run (using the IP of your robot)
roslaunch fanuc_xtion_moveit_config moveit_planning_execution.launch sim:=false robot_ip:=192.168.0.230
The intrinsic calibration process was done following this tutorial.
roslaunch openni2_launch openni2.launch
rosrun camera_calibration cameracalibrator.py image:=/camera/rgb/image_raw camera:=/camera/rgb --size 8x6 --square 0.105
Video: SmObEx - ROS aruco hand2eye extrinsic calibration
For the calibration do the following steps:
place the ArUco marker (set the correct marker id and size in smobex_bringup/launch/bringup.launch)
run ROSSTATE on the TP and then
roslaunch smobex_bringup bringup.launch config:=true calibration:=true
rosrun smobex_calibration store_calibration.py
(thanks to @miguelriemoliveira for the source code).
Note: verify that store_calibration.py has execution permissions (chmod +x store_calibration.py).
Video: SmObEx - OctoMap mapping of selected volume of the world
roslaunch smobex_bringup bringup.launch config:=true define_vol:=true
At the end, don't forget to click on the grey sphere to save the configuration, and only then run
rosrun smobex_bringup store_volume.py
Video: SmObEx - Manual mode exploration of a volume
Video: SmObEx - Fully autonomous exploration
roslaunch smobex_bringup bringup.launch
Green: Free Voxels
Red: Occupied Voxels
Orange: Unknown Voxels
Blue: Voxels expected to be known with this pose
Red Wireframe: Camera's frustum
Gray Lines: Rays used in ray casting
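Only voxels inside the camera's frustum (the red wireframe above) can be expected to become known from a given pose. A minimal frustum-containment check can be sketched as follows; the FOV and range defaults are typical of an Xtion-class RGB-D sensor, not values measured from the real device, and SmObEx derives the actual frustum from the sensor's parameters.

```python
# Minimal camera-frustum containment test (illustrative sketch only).
import math

def in_frustum(point, h_fov=math.radians(58), v_fov=math.radians(45),
               near=0.5, far=3.5):
    """True if a point (camera frame, z pointing forward) lies in the frustum.

    h_fov/v_fov: horizontal/vertical field of view; near/far: depth range.
    """
    x, y, z = point
    if not (near <= z <= far):
        return False                       # outside the sensor's depth range
    # At depth z, the half-width/half-height of the frustum grow linearly.
    return (abs(x) <= z * math.tan(h_fov / 2) and
            abs(y) <= z * math.tan(v_fov / 2))
```

During pose evaluation, only unknown voxels passing this test need to be ray cast, which keeps the per-pose scoring cheap.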
To record the point cloud run
roslaunch smobex_bringup record.launch
(You must change the saving path in the launch file)
If at any moment you want to save the OctoMap, run
rosnode kill /accumulatedpointcloud
rosrun octomap_server octomap_saver -f test.ot
To visualize the point cloud or the OctoMap run, respectively,
pcl_viewer auto_save.pcd
octovis test.ot
360-degree mapping of LAR files:
roslaunch smobex_bringup record_bag.launch name:=file_name
roslaunch smobex_bringup bringup.launch
roslaunch smobex_bringup play_bag.launch file:=file_name
roslaunch smobex_bringup bringup.launch online:=false
Connect the Linux and Windows machines with an Ethernet cable.
On the Windows machine, set the IPv4 address to 192.168.0.233; on the Linux machine, to 192.168.0.230.
Start Roboguide and run the ROS TPE program.
Then run
roslaunch fanuc_xtion_moveit_config moveit_planning_execution.launch robot_ip:=192.168.0.233 use_bswap:=false sim:=false
and set the following parameters with rqt_reconfigure:
<param name="move_group/trajectory_execution/allowed_execution_duration_scaling" value="4.0" />
<param name="move_group/trajectory_execution/execution_duration_monitoring" value="false" />
roslaunch smobex_bringup bringup.launch online:=false