
Simulate the interaction of a UR5 robot with Lego bricks
https://youtu.be/v45mPw_XEXA
MIT License

UR5 Pick and Place Simulation in ROS/Gazebo


Table of contents

- Description
- Folder structure
- Requirements
- Setup
- Usage
- Contributors

Description

This repository demonstrates UR5 pick-and-place in ROS and Gazebo. The UR5 uses an Xbox Kinect camera to detect eleven types of Lego bricks and publishes the position and orientation of each detected brick.

The goal of the project is to detect and localize the bricks with the vision system and have the UR5 pick each brick and place it at its target position.
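
Once the system is running (see Usage below), the published detections can be inspected with the standard ROS command-line tools. This is only a sketch: the exact topic names are defined by the vision node and are not listed in this README.

rostopic list                     # list the active topics while vision.py is running
rostopic echo <detection-topic>   # placeholder: substitute a topic name printed above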

Folder structure

UR5-Pick-and-Place-Simulation/catkin_ws/
├── levelManager
├── vision
├── motion_planning
├── gazebo_ros_link_attacher
└── robot

Requirements

To run the project you need (as implied by the setup and usage commands below):

- ROS Noetic with Gazebo
- catkin_tools (for the catkin build command)
- Python 3 with pip3
- YoloV5 and its Python requirements (cloned and installed in the Setup section)
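
A minimal install sketch for the system packages, assuming Ubuntu 20.04 with the official ROS Noetic apt repositories already configured (these package names are the standard upstream ones, not taken from this repository):

sudo apt install ros-noetic-desktop-full                       # ROS Noetic + Gazebo
sudo apt install python3-catkin-tools python3-osrf-pycommon    # provides the catkin build command
sudo apt install python3-pip                                   # needed to install the YoloV5 requirements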

Setup

After installing the required libraries, clone this repo:

git clone https://github.com/pietrolechthaler/UR5-Pick-and-Place-Simulation/

Set up the project:

cd UR5-Pick-and-Place-Simulation/catkin_ws
source /opt/ros/noetic/setup.bash
catkin build
source devel/setup.bash
echo "source $PWD/devel/setup.bash" >> $HOME/.bashrc

Clone and install YoloV5:

cd ~
git clone https://github.com/ultralytics/yolov5
cd yolov5
pip3 install -r requirements.txt
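
Optionally, verify that the YoloV5 dependencies installed correctly, in particular PyTorch (a generic check, not specific to this repository):

python3 -c "import torch; print(torch.__version__)"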

Usage

Launch the world

roslaunch levelManager lego_world.launch

Choose the level (from 1 to 4):

rosrun levelManager levelManager.py -l [level]
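
For example, to spawn the bricks for level 2:

rosrun levelManager levelManager.py -l 2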

Start the kinematics process

rosrun motion_planning motion_planning.py

Start the localization process

rosrun vision vision.py -show
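
A typical full session, assuming (this is not stated explicitly in the README) that each command runs in its own terminal with the workspace sourced:

roslaunch levelManager lego_world.launch     # terminal 1: Gazebo with the Lego world
rosrun levelManager levelManager.py -l 1     # terminal 2: spawn the bricks for level 1
rosrun motion_planning motion_planning.py    # terminal 3: kinematics / pick-and-place
rosrun vision vision.py -show                # terminal 4: YoloV5-based brick localization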

Contributors

Name GitHub
Davide Cerpelloni https://github.com/davidecerpelloni
Leonardo Collizzolli https://github.com/leocolliz
Pietro Lechthaler https://github.com/pietrolechthaler
Stefano Rizzi https://github.com/StefanoRizzi