james2254 closed this issue 7 years ago.
Yes. With ViSP you can implement a visual servoing scheme (let's say pose-based) that brings the arm to the grasping position and publishes the velocities as a cmd_vel.
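To give an idea of that last step, here is a minimal sketch, assuming the driver listens on a cmd_vel topic of type geometry_msgs/Twist (the topic and node names are assumptions); it converts the 6-dof camera velocity returned by vpServo::computeControlLaw() into a Twist message. Keep in mind that this velocity is expressed in the camera frame, while your driver may expect it in the tool or base frame.
#include <ros/ros.h>
#include <geometry_msgs/Twist.h>
#include <visp3/core/vpColVector.h>
// Publish a 6-dof camera velocity v = (vx, vy, vz, wx, wy, wz) as a Twist.
void publishCameraVelocity(ros::Publisher &pub, const vpColVector &v)
{
  geometry_msgs::Twist cmd;
  cmd.linear.x  = v[0];
  cmd.linear.y  = v[1];
  cmd.linear.z  = v[2];
  cmd.angular.x = v[3];
  cmd.angular.y = v[4];
  cmd.angular.z = v[5];
  pub.publish(cmd);
}
int main(int argc, char **argv)
{
  ros::init(argc, argv, "cmd_vel_bridge"); // hypothetical node name
  ros::NodeHandle n;
  ros::Publisher pub = n.advertise<geometry_msgs::Twist>("cmd_vel", 1);
  // ... inside the servo loop, compute v with vpServo::computeControlLaw()
  // and call publishCameraVelocity(pub, v) ...
  return 0;
}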
Closed since this is not an issue
Hi @james2254, have you solved your issue?
I am doing visual servoing with a UR5, but I am stuck on the workflow:
-1- I am now able to run roslaunch visp_auto_tracker tracklive_usb.launch
and I can detect the QR code's position and orientation with
rostopic echo visp_auto_tracker/object_position
-2- With this UR5 driver I can control the real arm from RViz and Gazebo.
But I don't know how to combine the two and do the visual servoing.
You should implement a ROS node that does the visual servoing. This node should subscribe to the pose published by visp_auto_tracker, then compute a position-based visual servoing (PBVS) control law, for which you have to specify a desired pose to reach. Here is a good example for PBVS. The node should then publish the velocity computed by vpServo::computeControlLaw()
to the node that contains the UR5 driver.
Here is a ROS example that shows how to use the pose published by visp_auto_tracker
in order to achieve an image-based visual servoing.
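A hedged sketch of such a node could look like the following; the desired pose cdMo, the gain and the output topic are placeholders to adapt to your setup. It subscribes to /visp_auto_tracker/object_position, builds the PBVS translation and theta-u features from the current and desired object poses in the camera frame, computes the camera velocity with vpServo::computeControlLaw() and publishes it as a Twist. Note that this velocity is expressed in the camera frame and may have to be transformed before being sent to the UR5 driver.
// PBVS sketch: /visp_auto_tracker/object_position -> camera velocity -> cmd_vel.
// Desired pose, gain and topic names are assumptions to be adapted.
#include <ros/ros.h>
#include <geometry_msgs/PoseStamped.h>
#include <geometry_msgs/Twist.h>
#include <visp3/core/vpColVector.h>
#include <visp3/core/vpHomogeneousMatrix.h>
#include <visp3/core/vpQuaternionVector.h>
#include <visp3/core/vpTranslationVector.h>
#include <visp3/visual_features/vpFeatureThetaU.h>
#include <visp3/visual_features/vpFeatureTranslation.h>
#include <visp3/vs/vpServo.h>
vpServo task;
vpFeatureTranslation s_t(vpFeatureTranslation::cdMc); // current translation feature
vpFeatureThetaU s_tu(vpFeatureThetaU::cdRc);          // current theta-u rotation feature
// Placeholder desired pose of the object in the camera frame: 30 cm in front.
vpHomogeneousMatrix cdMo(0, 0, 0.3, 0, 0, 0);
ros::Publisher vel_pub;
void poseCallback(const geometry_msgs::PoseStamped::ConstPtr &msg)
{
  // Current object pose in the camera frame published by visp_auto_tracker.
  vpTranslationVector t(msg->pose.position.x, msg->pose.position.y, msg->pose.position.z);
  vpQuaternionVector q(msg->pose.orientation.x, msg->pose.orientation.y,
                       msg->pose.orientation.z, msg->pose.orientation.w);
  vpHomogeneousMatrix cMo(t, q);
  // Transformation from the current to the desired camera frame.
  vpHomogeneousMatrix cdMc = cdMo * cMo.inverse();
  s_t.buildFrom(cdMc);
  s_tu.buildFrom(cdMc);
  // 6-dof camera velocity (vx, vy, vz, wx, wy, wz).
  vpColVector v = task.computeControlLaw();
  geometry_msgs::Twist cmd;
  cmd.linear.x  = v[0]; cmd.linear.y  = v[1]; cmd.linear.z  = v[2];
  cmd.angular.x = v[3]; cmd.angular.y = v[4]; cmd.angular.z = v[5];
  vel_pub.publish(cmd); // camera-frame velocity; transform it if the UR5
                        // driver expects another frame
}
int main(int argc, char **argv)
{
  ros::init(argc, argv, "pbvs_ur5"); // hypothetical node name
  ros::NodeHandle n;
  task.setServo(vpServo::EYEINHAND_CAMERA);
  task.setInteractionMatrixType(vpServo::CURRENT);
  task.setLambda(0.5);   // gain, to be tuned
  task.addFeature(s_t);  // desired features are zero for PBVS
  task.addFeature(s_tu);
  vel_pub = n.advertise<geometry_msgs::Twist>("cmd_vel", 1);
  ros::Subscriber sub = n.subscribe("/visp_auto_tracker/object_position", 1, poseCallback);
  ros::spin();
  return 0;
}
Since the features are built from cdMc, the desired feature values are zero and a single addFeature() call per feature is enough, as in the ViSP PBVS tutorial.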
@fspindle Thanks for the clarification.
I prepared a ROS node that subscribes to the topic /visp_auto_tracker/object_position,
but I don't know how to use the PBVS example that you suggested.
Would you please help?
Here is the ROS node that I use to subscribe to /visp_auto_tracker/object_position:
#include <vector>
#include "ros/ros.h"
#include "geometry_msgs/PoseStamped.h"

// Poses received from visp_auto_tracker.
std::vector<geometry_msgs::PoseStamped::ConstPtr> poses;

// Callback that prints and stores each pose published on the subscribed topic.
void handle_poses(const geometry_msgs::PoseStamped::ConstPtr& msg)
{
  ROS_INFO_STREAM("Position X: " << msg->pose.position.x);
  ROS_INFO_STREAM("Position Y: " << msg->pose.position.y);
  ROS_INFO_STREAM("Position Z: " << msg->pose.position.z);
  ROS_INFO_STREAM("Orientation X: " << msg->pose.orientation.x);
  ROS_INFO_STREAM("Orientation Y: " << msg->pose.orientation.y);
  ROS_INFO_STREAM("Orientation Z: " << msg->pose.orientation.z);
  ROS_INFO_STREAM("Orientation W: " << msg->pose.orientation.w);
  // Use the msg object here to access the pose elements,
  // like msg->pose.position.x
  poses.push_back(msg);
}

int main(int argc, char **argv)
{
  ros::init(argc, argv, "listenerccp");
  ros::NodeHandle n;
  ros::Subscriber sub = n.subscribe("/visp_auto_tracker/object_position",
                                    1000, handle_poses);
  ros::spin();
  return 0;
}
Hello @Abduoit, did you solve your issue? I'm working on the same setup. Where did you get the visp_auto_tracker node?
Can I do a grasping demo with ViSP on a UR robot?