This is Vizzy's "oh so amazing" repository!
This repository contains the tools needed to interact with Vizzy, both in simulation and on the real robot.
The real robot uses two different middlewares for distinct body parts (YARP for the upper body and ROS for the mobile base). In simulation there are two middleware options: one aligned with the real robot (using both YARP and ROS) and one aimed at simulation-only experiments (using exclusively ROS).
Use the display.launch launcher to check the model with the RViz graphical tool (see the example below). Note that for simulation purposes you can ignore all YARP dependencies, as explained in the repository description.
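For example, assuming the launcher is provided by a vizzy_description package (the package name may differ in your checkout), the call would look like:
roslaunch vizzy_description display.launch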
At any time you might need to install some additional dependencies (such as missing ROS packages). Please open an issue if you cannot resolve these or any other dependency problems.
You should have a catkin workspace on your file system to be able to compile the code. If you don't know how to do this, please follow these instructions (a minimal sketch is also shown below).
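As a quick reference, a minimal workspace can be created with the standard catkin_make workflow; the path ~/my_catkin_ws below is only an example:
mkdir -p ~/my_catkin_ws/src
cd ~/my_catkin_ws
catkin_make
echo "source ~/my_catkin_ws/devel/setup.bash" >> ~/.bashrc
source ~/.bashrc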
As soon as you have your catkin workspace configured, open a terminal and run the following commands:
git clone https://github.com/vislab-tecnico-lisboa/vizzy_install
If you are on ROS Kinetic (Ubuntu 16.04), run the scripts in the vizzy_install repository in the following order. You will be asked for the ROS version, which in this case is kinetic, and for the location of your catkin workspace (e.g. /home/user/my_catkin_ws):
cd vizzy_install
./ros_install.sh
source ~/.bashrc
./ros_packages_install.sh
source ~/.bashrc
./install_yarp.sh
source ~/.bashrc
If you are on ROS Melodic (Ubuntu 18.04), run the scripts in the vizzy_install repository in the following order instead. You will again be asked for the ROS version, which in this case is melodic, and for the location of your catkin workspace (e.g. /home/user/my_catkin_ws):
cd vizzy_install
./ros_install.sh
source ~/.bashrc
./ros_packages_install.sh
source ~/.bashrc
./install_yarp_1804.sh
source ~/.bashrc
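Regardless of the ROS version, you can do a quick sanity check that both ROS and YARP ended up on your PATH; these commands only print the distro name and the yarpserver location:
rosversion -d
which yarpserver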
For now let's focus on simulation. Open a terminal:
roslaunch vizzy_launch vizzy_simulation.launch
This configuration starts Vizzy in ROS mode, meaning that all the controllers are emulated using ROS. It works with Gazebo 7.x and ROS Kinetic (the default install procedure of ROS Kinetic) and also with Gazebo 9 and ROS Melodic.
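If you want to confirm that the simulated controllers were spawned, you can inspect the running ROS graph once the simulation is up (the grep filters are just a convenience):
rostopic list | grep -i controller
rosservice list | grep -i controller_manager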
The upper body controllers simulated by the gazebo-yarp-plugins can be started with:
yarpserver
roslaunch vizzy_launch vizzy_simulation.launch use_yarp:=true
Don't forget that you need to have yarpserver running whenever the use_yarp argument is set to true.
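With the simulation running in YARP mode, you can confirm that the gazebo-yarp-plugins opened their ports by querying the YARP name server:
yarp name list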
Feel free to play with the arguments as you like, or to extend the low-level launchers with more functionality.
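To see which arguments a launcher exposes before overriding them, you can ask roslaunch directly:
roslaunch --ros-args vizzy_launch vizzy_simulation.launch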
When using the ROS camera interface, be careful to use the correct matrix for point (back-)projection:
.../image_rect_color: the correct projection matrix is P (with zero distortion parameters)
.../image_raw: the correct matrix is K (together with the distortion parameters)
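Both matrices are published in the corresponding camera_info topic, so you can inspect the actual values at runtime; the topic name below is only an example and depends on the camera namespace:
rostopic echo -n 1 /vizzy/l_camera/camera_info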
To access PulseAudio and all the sound options through the network, you need to make it discoverable. For that, use:
paprefs
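If there is no desktop session available to run paprefs, the same network access can usually be enabled by loading PulseAudio's TCP module by hand; this is only a sketch and the IP ACL should match your setup:
pacmd load-module module-native-protocol-tcp auth-ip-acl=127.0.0.1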
Add the vizzy user to the audio-related groups (audio, pulse, pulse-access):
sudo usermod -aG audio,pulse,pulse-access `whoami`
In order to access the audio configurations via SSH, you need to define the following environment variable on Vizzy:
export PULSE_SERVER=127.0.0.1
Furthermore, without an X11 session the PulseAudio server will not launch automatically, since it normally requires X11. To run PulseAudio on a headless machine, start it in daemon mode:
pulseaudio -D
This should be launched automatically, but if the audio is not working, check whether the PulseAudio server is running. If it is not, execute the previous command and it should work.
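A quick way to check whether the PulseAudio daemon is already running:
pulseaudio --check && echo "pulseaudio is running" || echo "pulseaudio is not running"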
To control the audio volume use
alsamixer
Now you can access audio configurations via ssh -X. Useful commands:
gnome-control-center
pavucontrol
pacmd
For more details see the following reference:
@inproceedings{moreno2016vizzy,
title={Vizzy: A humanoid on wheels for assistive robotics},
author={Moreno, Plinio and Nunes, Ricardo and Figueiredo, Rui and Ferreira, Ricardo and Bernardino, Alexandre and Santos-Victor, Jos{\'e} and Beira, Ricardo and Vargas, Lu{\'\i}s and Arag{\~a}o, Duarte and Arag{\~a}o, Miguel},
booktitle={Robot 2015: Second Iberian Robotics Conference},
pages={17--28},
year={2016},
organization={Springer}
}
If you get a multiple-definition linker error from PCL's PPF registration header while building (for example, when compiling the charging action server), it looks like this:
CMakeFiles/charging_action_server.dir/src/charging_action_server_node.cpp.o: In function pcl::PPFHashMapSearch::setInputFeatureCloud(boost::shared_ptr<pcl::PointCloud<pcl::PPFSignature> const>):
charging_action_server_node.cpp:(.text+0xd0): multiple definition of pcl::PPFHashMapSearch::setInputFeatureCloud(boost::shared_ptr<pcl::PointCloud<pcl::PPFSignature> const>)
CMakeFiles/charging_action_server.dir/src/charging_action_server.cpp.o:charging_action_server.cpp:(.text+0x170): first defined here
...
The solution is to edit your ppf_registration.hpp and declare the functions setInputFeatureCloud and nearestNeighborSearch as inline:
sudo vim /usr/include/pcl-1.7/pcl/registration/impl/ppf_registration.hpp
inline void
pcl::PPFHashMapSearch::setInputFeatureCloud (PointCloud<PPFSignature>::ConstPtr feature_cloud)
{
inline void
pcl::PPFHashMapSearch::nearestNeighborSearch (float &f1, float &f2, float &f3, float &f4,
std::vector<std::pair<size_t, size_t> > &indices)
{
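After saving the change, rebuild the affected catkin workspace so the fix is picked up (the workspace path is just an example):
cd ~/my_catkin_ws
catkin_make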
All kinds of issues and contributions are very welcome. Please get in touch on our issues page whenever you need help!