This repo contains the first pilot prepared by the URJC. It is composed of several elements that are explained below:
This module contains the components that control the robot's mission during the pilot. We have used BICA components to generate the robot's behavior.
A BICA component is a ROS2 lifecycle node that can activate another BICA component simply by declaring it as a dependency. In addition, each BICA component can use behavior trees to implement its behavior, and dependencies can be declared on each behavior tree leaf. In this way, we can build hierarchical behavior trees.
In this pilot, the mission controller uses a behavior tree to sequence the phases of the test. In some stages the robot navigates, and in others it starts a dialogue. As the dialogues are themselves composed of several stages, other BICA components have been used to implement them, activated from a leaf of the mission controller's behavior tree.
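As a rough illustration of this activation pattern, here is a plain-Python sketch (this is not the actual BICA API; the class and component names are invented): a component lists its dependencies, and activating it transitively activates them, which is how a dialogue component can be brought up from a leaf of the mission tree.

```python
# Minimal, non-ROS model of BICA-style activation: a component declares its
# dependencies, and activating it activates them first, recursively.
class Component:
    def __init__(self, name, dependencies=()):
        self.name = name
        self.dependencies = list(dependencies)
        self.active = False

    def activate(self, log):
        # Dependencies are activated before the component itself.
        for dep in self.dependencies:
            if not dep.active:
                dep.activate(log)
        self.active = True
        log.append(self.name)

# A dialogue component activated from the mission controller, mirroring how
# a BICA behavior tree leaf declares its dependency on another component.
speech = Component("speech_driver")
dialogue = Component("dialogue", [speech])
mission = Component("mission_controller", [dialogue])

order = []
mission.activate(order)
print(order)  # dependencies first: ['speech_driver', 'dialogue', 'mission_controller']
```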
The dialogue system can use both the robot's audio and its tablet to communicate with a human. The communication mode is selected with the hri_mode parameter: if hri_mode is "audio", audio is used; if it is "tablet", the robot's tablet is used.
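The selection logic can be sketched as follows (the function name is invented for illustration; the real system reads hri_mode as a ROS2 parameter rather than a plain argument):

```python
# Sketch of the hri_mode selection described above (illustrative only).
def communicate(message, hri_mode):
    if hri_mode == "audio":
        return f"[audio] robot says: {message}"
    elif hri_mode == "tablet":
        return f"[tablet] screen shows: {message}"
    raise ValueError(f"unknown hri_mode: {hri_mode}")

print(communicate("hello", "audio"))   # [audio] robot says: hello
print(communicate("hello", "tablet"))  # [tablet] screen shows: hello
```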
The dialogue system uses a distributed graph, which BICA employs for many tasks. When a "say: XXX" arc is added between the robot and the person, the robot says XXX. If the arc is "ask: YYY", a dialogue takes place until the value of YYY is determined.
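A toy model of these two arc types (the function and data structures are invented for illustration; the real graph is distributed and shared across BICA nodes):

```python
# Toy model of the dialogue-graph arcs: "say: XXX" triggers speech,
# "ask: YYY" drives a dialogue until the asked value is known.
def handle_arc(arc, answers):
    kind, _, content = arc.partition(": ")
    if kind == "say":
        return f"robot says: {content}"
    if kind == "ask":
        # The dialogue continues until the value of the asked item is known;
        # here we just look it up in a dict of (hypothetical) answers.
        return f"robot asks until it learns: {content} -> {answers[content]}"
    raise ValueError(f"unsupported arc: {arc}")

print(handle_arc("say: welcome", {}))
print(handle_arc("ask: name", {"name": "Alice"}))
```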
This module contains a ROS node that simulates the contingencies in the specifications of this pilot. It is also responsible for asking the mode manager to change the system mode. The node presents a simple menu in which a contingency can be activated, and deactivated when it ends. The module also contains the configuration of the pilot's system modes.
The modes are:
The modes change the following parameters in the components (shown in the figure), with some effects:
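The concrete modes and parameters are defined in params/pilot_modes.yaml. As an illustration only (mode, node, and parameter names here are invented, not the pilot's real ones), a system_modes description has roughly this shape:

```yaml
# Hypothetical sketch; see params/pilot_modes.yaml for the real definitions.
pilot_system:
  modes:
    __DEFAULT__:
      laser_driver:
        mode: ACTIVE
    DEGRADED:
      laser_driver:
        mode: INACTIVE
```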
We will use vcstool to get the dependencies and packages. We assume that you have a ROS2 workspace ([ros2_ws]); if you don't have one, create it with:
```
source /opt/ros/foxy/setup.bash
mkdir -p [path-to-your-ros2-ws]/src
```
Fetch, build and install navigation2 stack:
```
sudo apt install ros-foxy-slam-toolbox ros-foxy-gazebo-ros-pkgs
cd [ros2_ws]/src
git clone https://github.com/MROS-RobMoSys-ITP/Pilot-URJC.git
vcs import < Pilot-URJC/dependencies.repos
cd ..
rosdep install -y -r -q --from-paths src --ignore-src --rosdistro foxy
colcon build --symlink-install
```
NOTE: if the compilation fails because some packages could not be found, run again:

```
rosdep install -y -r -q --from-paths src --ignore-src --rosdistro foxy
```

Then source the workspace:

```
source [ros2_ws]/install/setup.bash
```
Turtlebot3 is one of the test platforms. Ignore this section if you are not using it. Fetch, build and install the turtlebot3 packages:
```
cd [ros2_ws]/src
vcs import < Pilot-URJC/turtlebot3.repos
cd ..
rosdep install -y -r -q --from-paths src --ignore-src --rosdistro foxy
colcon build --symlink-install
```
This pilot has been tested on different platforms. Below we show how to run the demo on each one.
In the TIAGO_specifications.pdf of this repository you can find a full description of the TIAGo robot.
Make sure that the navigation, localization and map-server are switched off in the robot before starting the demo.
The shell windows that will launch the ros1_bridge and the ROS1 components need a correct network configuration, setting the ROS_IP and ROS_MASTER_URI environment variables.
We have used the rmw_cyclonedds_cpp and rmw_fastrtps_cpp RMW_IMPLEMENTATION values for the tests.
To launch the demo on the real TIAGo we have to use some bridges, since at this moment the TIAGo drivers are not migrated to ROS2:
```
ros2 run ros1_bridge twist_2_to_1
ros2 run ros1_bridge scan_1_to_2
ros2 run ros1_bridge imu_1_to_2
ros2 run ros1_bridge tf_static_1_to_2
ros2 run ros1_bridge tf_1_to_2
```
The system_modes mode_manager takes the modes description from params/pilot_modes.yaml.

```
ros2 launch pilot_urjc_bringup nav2_tiago_launch.py
```
Launch the turtlebot3 world in the Gazebo simulator:
```
export GAZEBO_MODEL_PATH=$GAZEBO_MODEL_PATH:[ros2_ws]/src/turtlebot3/turtlebot3_simulations/turtlebot3_gazebo/models:[ros2_ws]/src/Pilot-URJC/pilot_urjc_bringup/worlds/models
export TURTLEBOT3_MODEL=${TB3_MODEL}
ros2 launch pilot_urjc_bringup tb3_sim_launch.py
```
After the last command, the Gazebo simulator is running in the background. Don't worry if no window opens.
TB3 navigation launcher
This launcher includes rviz, nav2, amcl, map-server, system-modes, etc.
The system_modes mode_manager takes the modes description from params/pilot_modes.yaml.

```
export TURTLEBOT3_MODEL=${TB3_MODEL}
ros2 launch pilot_urjc_bringup nav2_turtlebot3_launch.py
```
RViz opens, and the navigation system waits for the activation of the laser_driver. This activation is made automatically by the dummy_metacontroller in the Launching Pilot-URJC step. It is not necessary to set an initial robot position with the 2D Pose Estimate tool; when the laser_driver is up, the pose will be set automatically.
To install ROS (Melodic version), the full documentation and the steps to follow during this process are on the following page:
http://wiki.ros.org/melodic/Installation/Ubuntu
Turtlebot ROS1 Gazebo simulator:
Launch the turtlebot2 simulator and its sensors. If you don't have a launcher to do this, you can find an example here
```
roslaunch gb_robots sim_house.launch
```
```
ros2 launch nav2_bringup nav2_tb3_system_modes_sim_launch.py
```
nav2-TIAGo-support: The integration of nav2 and the turtlebot2 through the ros1_bridge needs a tool to fix some issues.
```
rosrun tf_static_resender tf_static_resender
```
Navigation launcher: This launcher includes rviz, nav2, amcl, map-server, system-modes, etc. The system-modes mode-manager takes the modes description from params/pilot_modes.yaml.
```
ros2 launch pilot_urjc_bringup nav2_turtlebot2_launch.py
```
```
sudo apt-get install ros-eloquent-eigen*
```
```
cd [ros2_ws]/src
git clone https://github.com/MROS-RobMoSys-ITP/Pilot-URJC.git
vcs import < Pilot-URJC/dependencies_kobuki.repos
cd ..
colcon build --symlink-install
```

NOTE: if you already did a vcs import of dependencies.repos, comment out the BehaviorTree and navigation2 entries in dependencies_kobuki.repos before running this vcs import.
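For reference, a .repos file is plain YAML, so an entry is disabled by commenting out its lines. A sketch only (the entry names and URLs below are illustrative, not the real contents of dependencies_kobuki.repos):

```yaml
# Hypothetical .repos fragment: the first entry is commented out because it
# was already imported via dependencies.repos.
repositories:
  # BehaviorTree.CPP:
  #   type: git
  #   url: https://github.com/BehaviorTree/BehaviorTree.CPP.git
  #   version: master
  some_kobuki_package:        # illustrative entry name
    type: git
    url: https://example.org/some_kobuki_package.git
    version: foxy
```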
Add a small modification in navigation2/nav2_bringup/bringup/launch/nav2_bringup_launch.py:

```python
kobuki_dir = get_package_share_directory('pilot_kobuki')

declare_map_yaml_cmd = DeclareLaunchArgument(
    'map',
    default_value=os.path.join(kobuki_dir, 'map', 'map_name.yaml'),
    description='Full path to map yaml file to load')
```
```yaml
base_frame_id: "base_link"
```
```
source [ros2_ws]/install/setup.bash
ros2 launch pilot_kobuki kobuki2.launch.py
```
A dummy metacontroller
With this tool, you can simulate different reconfiguration scenarios. For now, the battery contingency is the only one supported. The laser failure is managed by the system_modes package.
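The battery-contingency logic can be sketched as follows (threshold, names, and mode labels are invented for illustration; the real metacontroller asks the system_modes mode manager to change the system mode):

```python
# Toy sketch of a battery contingency check: below a threshold, a degraded
# system mode would be requested from the mode manager.
def check_battery(level, threshold=0.2):
    # level is the battery charge in [0.0, 1.0].
    return "DEGRADED" if level < threshold else "NORMAL"

print(check_battery(0.8))  # NORMAL
print(check_battery(0.1))  # DEGRADED
```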
```
ros2 run metacontroller_pilot metacontroller
```
When the metacontroller comes up, the laser_driver goes to the active mode, and everything starts. RViz shows the local and global costmaps, the laser measurements, and the transform tree.
Finally, we launch the pilot.
Pilot Behavior
```
ros2 launch pilot_behavior pilot_urjc_launch.py
```
The robot patrols the scenario and its battery is draining.
Laser failure sim: With this RViz tool, you can simulate a laser failure and its consequences.