The ROS package that runs on the Unitree A1's onboard NX and exposes all functionality of the Unitree A1 to the ROS network.
Make sure you can access your ROS system, e.g. `source /opt/ros/kinetic/setup.bash`.
Make sure your system has `lcm` installed and, optionally, `realsense-ros` installed.
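If `lcm` is not installed yet, it can usually be built from source with CMake; the following is a minimal sketch assuming the standard upstream build (check which version your SDK requires):

```bash
# Build and install lcm from source (standard CMake build; pin the version your SDK expects)
git clone https://github.com/lcm-proj/lcm.git
cd lcm
mkdir build && cd build
cmake ..
make
sudo make install
```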
Set up your ROS workspace on the robot (A1) where you want to run the neural network.
```bash
mkdir -p unitree_ws/src
cd unitree_ws
catkin_make
```
Download `unitree_legged_sdk` v3.3.1 and extract it under `unitree_ws/src/`.
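For example, the release can be fetched and extracted roughly as follows (the exact archive URL is an assumption based on GitHub's usual release layout; verify it on the `unitree_legged_sdk` releases page):

```bash
cd unitree_ws/src
# Archive URL is an assumption; confirm it against the unitree_legged_sdk releases page
wget https://github.com/unitreerobotics/unitree_legged_sdk/archive/refs/tags/v3.3.1.tar.gz
tar -xzf v3.3.1.tar.gz
```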
Clone this repository to `unitree_ws/src/` by

```bash
cd unitree_ws/src
git clone https://github.com/Tsinghua-MARS-Lab/unitree_ros_real.git
```
Run `catkin_make` under the workspace folder:

```bash
cd ../
catkin_make
```
If an error related to the message files occurs, you can run `catkin_make` again, or temporarily stop the package `unitree_legged_real` from compiling and compile it after the message files are generated; one way to do this is sketched below.
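As a sketch, you can build only the message package first via catkin's package whitelist and then build the rest (the package name `unitree_legged_msgs` is assumed; adjust it to whatever message package is in your `src/`):

```bash
cd unitree_ws
# Build only the message package first so the generated headers exist (package name assumed)
catkin_make -DCATKIN_WHITELIST_PACKAGES="unitree_legged_msgs"
# Clear the whitelist and build everything else, including unitree_legged_real
catkin_make -DCATKIN_WHITELIST_PACKAGES=""
```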
Important NOTE: Keep the emergency stop button in your hand, or power the robot from a source you can cut off, if you are running this package for the first time.
Make sure your ROS system is accessible and `unitree_ws` is sourced (e.g. `source unitree_ws/devel/setup.bash`).
Check the options in `unitree_legged_real/launch/robot.launch` in detail.
Make sure your robot has all motors powered off, or press R2 + B on the A1 remote control to power them off.
Make sure none of the robot's joints are at their mechanical limit (otherwise the safety guard will stop the node).
Launch the ROS node:

If you only want to see the robot states without reacting to any motor command:

```bash
roslaunch unitree_legged_real robot.launch
```

If you want to start the robot node and respond to motor commands:

```bash
roslaunch unitree_legged_real robot.launch dryrun:=false
```
Make sure you have another terminal with ROS access and access to the `unitree_ws` workspace, and that the ROS robot node is running (a quick check is sketched below).
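As a sanity check, you can list the running nodes and the advertised topics (the `a112138` namespace is the default from `robot.launch` and may differ if you changed the launch options):

```bash
# Verify the robot node is up and its topics are advertised
rosnode list
rostopic list | grep a112138   # namespace taken from the default robot.launch options
```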
Check out the robot state topic and the command topic; simply publish and subscribe through these two topics.
If you did not change anything in `robot.launch`, you will see `/a112138/low_state` as the robot proprioception state and `/a112138/legs_cmd` as the topic where the robot receives motor commands (defined in the same order as in the SDK).
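For example, you can inspect both topics from the command line with the standard ROS tools; the message types they report come from this package's definitions:

```bash
# Stream the proprioceptive state published by the robot node
rostopic echo /a112138/low_state
# Check the message type expected on the command topic before publishing to it
rostopic info /a112138/legs_cmd
rosmsg show $(rostopic type /a112138/legs_cmd)
```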
Launch the robot node by

```bash
roslaunch unitree_legged_real robot.launch publish_joint_state:=true publish_imu:=true
```
Launch the odom node by

```bash
roslaunch unitree_legged_real odom.launch
```
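To confirm odometry is being published, you can look for the odometry topic and echo it (the exact topic name is an assumption; check `rostopic list` for the real name under your namespace):

```bash
# Find the odometry topic published by the odom node (name depends on the launch configuration)
rostopic list | grep -i odom
# Then echo it, e.g. (topic name is an assumption):
rostopic echo /a112138/odom
```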
Visualize in another terminal on the same ROS network that has access to a compiled `unitree_ws` workspace and a running GUI (i.e. a laptop connected to the A1's ROS network):

```bash
roslaunch a1_description a1_rviz.launch
```
Publish the aligned RealSense RGB-D camera images and the camera transform (intrinsic parameters). (Usually this means two topics each, `image_raw` and `camera_info`, for RGB and for depth.)
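If you are using `realsense-ros`, a minimal sketch is the stock camera launch file with depth-to-color alignment enabled (the launch file and argument follow the upstream `realsense2_camera` package and may vary between versions):

```bash
# Publish RGB and aligned depth streams together with their camera_info topics
roslaunch realsense2_camera rs_camera.launch align_depth:=true
```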