atenpas / gpd

Detect 6-DOF grasp poses in point clouds
BSD 2-Clause "Simplified" License
606 stars 233 forks

Generate some strange grasps #59

Closed: onepre closed this issue 5 years ago

onepre commented 5 years ago

@atenpas Hi, I ran tutorial1.launch with ROS and a Kinect. I placed some objects on a desktop, and GPD can generate grasps, but sometimes it generates strange grasps (that go directly through the desktop). I am confused by this phenomenon and do not know how to resolve the problem.

[screenshot: gpd3]

nevermore0127 commented 5 years ago

I don't know if this helps, but your qhd cloud should be linked to kinect2_rgb_optical_frame.

onepre commented 5 years ago

@nevermore0127 Thanks for the reply. I have changed the fixed frame to kinect2_rgb_optical_frame, but the previous problem still exists.

atenpas commented 5 years ago

Tutorial 2 offers a possible solution where grasps are not sampled on the table plane.

onepre commented 5 years ago

@atenpas Thanks for the reply. I have tried to create a CloudSamples message based on the point cloud ('/kinect2/qhd/points') and the detection range, and then send it to GPD, but the terminal displays an error and I don't know how to fix it. (I run tutorial1.launch under ROS with a Kinect.)

terminal:

[ERROR] [WallTime: 1543311487.500961] bad callback: <function cloudCallback at 0x7fa2e5e53e60>
Traceback (most recent call last):
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/topics.py", line 750, in _invoke_callback
    cb(msg)
  File "src/gpd/scripts/sampletest.py", line 36, in cloudCallback
    pub.publish(cloudsamples)
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/topics.py", line 882, in publish
    self.impl.publish(data)
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/topics.py", line 1066, in publish
    serialize_message(b, self.seq, message)
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/msg.py", line 152, in serialize_message
    msg.serialize(b)
  File "/home/efun/ur_demo/devel/lib/python2.7/dist-packages/gpd/msg/_CloudSamples.py", line 190, in serialize
    buff.write(_get_struct_q().pack(val1.data))
AttributeError: 'Point' object has no attribute 'data'

code:

import rospy
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2
import numpy as np
from scipy.linalg import lstsq
from gpd.msg import CloudIndexed
from gpd.msg import CloudSamples
from std_msgs.msg import Header, Int64
from geometry_msgs.msg import Point

cloud = [] # global variable to store the point cloud

def cloudCallback(msg):
    print('aaaaaaaaaa')
    global cloud
    if len(cloud) == 0:
        for p in point_cloud2.read_points(msg):
            cloud.append([p[0], p[1], p[2]])

    cloud = np.asarray(cloud)
    cloudsamples = CloudSamples()
    header = Header()
    header.frame_id = "/kinect2_link"
    header.stamp = rospy.Time.now()

    cloudsamples.cloud_sources.cloud = point_cloud2.create_cloud_xyz32(header,cloud.tolist())

    cloudsamples.cloud_sources.view_points.append(Point(0,0,0))
    cloudsamples.cloud_sources.camera_source.append(Point(0,0,0))

    cloudsamples.samples.append(Point(-1,-1,-1))
    cloudsamples.samples.append(Point(1,1,1))
    rospy.sleep(1)
    pub.publish(cloudsamples)
    rospy.sleep(1)

if __name__ == '__main__':
    rospy.init_node('select_grasp')
    pub = rospy.Publisher('cloud_samples', CloudSamples, queue_size=1)
    cloud_sub = rospy.Subscriber('/kinect2/qhd/points',PointCloud2,cloudCallback)
    rospy.spin()

tutorial1.launch

<launch>

  <arg name="device"                     default="0"
        doc="target device to execute inference. [0:CPU]|1:GPU|2:VPU|3:FPGA" />

  <!-- Load hand geometry parameters -->  
  <include file="$(find gpd)/launch/hand_geometry.launch">
    <arg name="node" value="detect_grasps" />
  </include>

  <!-- Load classifier parameters -->
  <!-- <include file="$(find gpd)/launch/caffe/classifier_15channels.launch"> -->
  <include file="$(find gpd)/launch/caffe/classifier_15channels.launch">
    <arg name="node" value="detect_grasps" />
  </include>

    <node name="detect_grasps" pkg="gpd" type="detect_grasps" output="screen">

    <param name="device" type="int" value="$(arg device)" />

    <!-- If sequential importance sampling is used (default: false) -->
    <param name="use_importance_sampling" value="false" />

    <!-- What type of point cloud is used and what ROS topic it comes from -->
    <!--<param name="cloud_type" value="0" />  0: PointCloud2, 1: CloudSized, 2: CloudIndexed, 3: CloudSamples -->
    <!--<param name="cloud_topic" value="/camera/depth_registered/points" />-->
    <param name="cloud_type" value="2" />
    <param name="cloud_topic" value="/cloud_samples" />
    <!-- <param name="cloud_topic" value="/passthrough/output" /> -->

    <!-- (optional) The ROS topic that the samples come from (default: an empty string) -->
    <param name="samples_topic" value="" />

    <!-- Plotting parameters -->
    <param name="plot_normals" value="false" />
    <param name="plot_samples" value="false" />    
    <param name="plot_candidates" value="false" />
    <param name="plot_filtered_grasps" value="false" />
    <param name="plot_valid_grasps" value="false" />
    <param name="plot_clusters" value="false" />
    <param name="plot_selected_grasps" value="false" />
    <param name="rviz_topic" value="grasps_rviz" />

    <!-- Preprocessing of point cloud -->
    <param name="voxelize" value="true"/>
    <!-- modified -->
    <!-- <param name="remove_outliers" value="false"/> -->
    <param name="remove_outliers" value="true"/>

    <!-- <rosparam param="workspace"> [-1, 1, -1, 1, -1, 1] </rosparam> -->
    <rosparam param="workspace"> [-0.2,0,-1,1,-1,1] </rosparam>
    <!-- <rosparam param="workspace"> [0, 0.3, 0, 1, -1, 1] </rosparam>    调试成功  -->
    <rosparam param="camera_position"> [0, 0, 0] </rosparam>

    <!-- General parameters -->
    <!--<param name="num_samples" value="100" />
    <param name="num_threads" value="4" />-->
    <param name="num_samples" value="1000" />
    <param name="num_threads" value="8" />

    <!-- Parameters for local grasp candidate search -->
    <param name="nn_radius" value="0.01" />
    <param name="num_orientations" value="8" /> <!-- Number of orientations to consider -->

    <!-- Filtering of grasp candidates --> 
    <param name="filter_grasps" value="false" /> <!-- on workspace and robot hand aperture -->
    <rosparam param="worksapce_grasps"> [-0.1, 0.2, -0.3, 0, 0.3, 0.65] </rosparam>
    <!-- <rosparam param="worksapce_grasps"> [0.0, 0.10, -0.30, 0.30, -2.00, 0.40] </rosparam>  调试成功 -->
    <!-- <rosparam param="workspace_grasps"> [0.55, 1.0, -0.41, 0.03, -0.29, 1.0] </rosparam> -->
    <param name="filter_half_antipodal" value="false"/> <!-- on half antipodal -->

    <!-- Grasp image creation -->
    <param name="create_image_batches" value="false" /> <!-- creates grasp images in batches (less memory usage) -->
    <!--<param name="remove_plane_before_image_calculation" value="false" />  removes table plane from point cloud to speed up shadow computations -->
    <param name="remove_plane_before_image_calculation" value="false" />
    <!-- Clustering of grasps -->
    <param name="min_inliers" value="1" />
    <!-- <param name="min_inliers" value="1" /> -->

    <!-- Grasp selection -->
    <!--<param name="min_score_diff" value="0" />
    <param name="min_aperture" value="0.029" />
    <param name="max_aperture" value="0.072" />
    <param name="num_selected" value="5" />-->

    <param name="min_score_diff" value="50" />
    <param name="min_aperture" value="0.0" />
    <param name="max_aperture" value="0.085" />
    <param name="num_selected" value="1" />

    </node>

</launch>
atenpas commented 5 years ago

The error is in this line:

cloudsamples.cloud_sources.camera_source.append(Point(0,0,0))

The camera_source field is of type std_msgs/Int64 (see CloudSources.msg). It's the index of the camera that obtained the i-th point in the cloud. If you have only one camera, you can set all of camera_source to zero.

onepre commented 5 years ago

@atenpas Thanks a lot!! I have changed the code to:

cloudsamples.cloud_sources.camera_source.append(Int64(0))

It works now, and GPD can generate some grasps, but I am still confused about CloudSamples. If I set the samples like this:

cloudsamples.samples.append(Point(-1,-1,-1))
cloudsamples.samples.append(Point(1,1,1))

does it mean the GPD detection range is x [-1,1], y [-1,1], z [-1,1]?

atenpas commented 5 years ago

No. If you read the documentation for CloudSamples, you can see that samples is a list of points at which GPD should search for grasps. In your case, you have two points (-1,-1,-1) and (1,1,1) at which GPD will search for grasps.
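
For illustration, here is a minimal sketch of how samples is typically filled: with points on or near the object itself, not with the corners of a search volume. The object_points values below are hypothetical (e.g., taken from a segmented object cloud):

from geometry_msgs.msg import Point
from gpd.msg import CloudSamples

cloudsamples = CloudSamples()
# ... fill cloudsamples.cloud_sources as in the script above ...

# Hypothetical points lying on the object's surface.
object_points = [(0.10, 0.20, 0.90), (0.12, 0.21, 0.91), (0.11, 0.19, 0.92)]

# GPD searches for grasps at exactly these points, so they should lie
# on or near the object, not at the corners of a bounding box.
for x, y, z in object_points:
    cloudsamples.samples.append(Point(x, y, z))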

onepre commented 5 years ago

@atenpas Thanks for your patience. I have filtered the point cloud from the Kinect using PCL (cloud_filter.cpp), then created a CloudIndexed message based on the filtered point cloud, and finally sent it to GPD (grasp.py), but the terminal always outputs

Published cloud with 0 indices

and I cannot see any grasps in RViz.

[screenshots: gpdnow1, gpdnow2]

cloud_filter.cpp

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/filters/passthrough.h>

ros::Publisher pub;
void callback(const sensor_msgs::PointCloud2::ConstPtr& input)
{
    pcl::PCLPointCloud2* cloud = new pcl::PCLPointCloud2;
    pcl::PCLPointCloud2ConstPtr cloudPtr(cloud);
    pcl::PCLPointCloud2* cloud_filtered = new pcl::PCLPointCloud2;
    pcl::PCLPointCloud2Ptr cloudFilteredPtr (cloud_filtered);

    //Convert to PCL data type
    pcl_conversions::toPCL(*input, *cloud);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud = new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtr(xyz_cloud);

    //convert the pcl::PCLPointCloud2 type to pcl::PointCloud<pcl::PointXYZRGB>
    pcl::fromPCLPointCloud2(*cloudPtr,*xyzCloudPtr);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered =   new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered(xyz_cloud_filtered);

    pcl::PassThrough<pcl::PointXYZRGB> pass;    
    pass.setInputCloud(xyzCloudPtr);
    pass.setFilterFieldName("x");
    pass.setFilterLimits(-0.28,-0.15);
    pass.filter(*xyzCloudPtrFiltered);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered2 =  new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered2(xyz_cloud_filtered2);

    pcl::PassThrough<pcl::PointXYZRGB> passy;
    passy.setInputCloud(xyzCloudPtrFiltered);
    passy.setFilterFieldName("y");
    passy.setFilterLimits(-0.38,0.08);
    passy.filter(*xyzCloudPtrFiltered2);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered3 =  new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered3(xyz_cloud_filtered3);

    pcl::PassThrough<pcl::PointXYZRGB> passz;
    passz.setInputCloud(xyzCloudPtrFiltered2);
    passz.setFilterFieldName("z");
    passz.setFilterLimits(1.0 ,1.3);
    passz.filter(*xyzCloudPtrFiltered3);

    sensor_msgs::PointCloud2 output;
    pcl::PCLPointCloud2 outputPCL;

    pcl::toPCLPointCloud2(*xyzCloudPtrFiltered3 ,outputPCL);
    pcl_conversions::fromPCL(outputPCL, output);
    output.header.stamp = ros::Time::now();
    output.header.frame_id = "kinect2_link";

    pub.publish(output);
}

int main(int argc, char *argv[])
{
    ros::init(argc, argv,"grasp_test");
    ros::NodeHandle nh;

    ros::Subscriber sub = nh.subscribe<sensor_msgs::PointCloud2>("/kinect2/qhd/points",1,callback);
    pub = nh.advertise<sensor_msgs::PointCloud2> ("/cloud_filter",1);
    ros::spin();
}

grasp.py

import rospy
import numpy as np
from scipy.linalg import lstsq
from sensor_msgs.msg import PointCloud2
from sensor_msgs import point_cloud2
from gpd.msg import CloudIndexed
from std_msgs.msg import Header, Int64
from geometry_msgs.msg import Point

cloud = [] # global variable to store the point cloud

def cloudCallback(msg):
    global cloud
    if len(cloud) == 0:
        for p in point_cloud2.read_points(msg,skip_nans=True):
            cloud.append([p[0], p[1], p[2]])

    cloud = np.asarray(cloud)
    X = cloud
    print(cloud)
    A = np.c_[X[:,0], X[:,1], np.ones(X.shape[0])]
    C, _, _, _ = lstsq(A, X[:,2])
    a, b, c, d = C[0], C[1], -1., C[2] # coefficients of the form: a*x + b*y + c*z + d = 0.
    dist = ((a*X[:,0] + b*X[:,1] + d) - X[:,2])**2
    err = dist.sum()
    idx = np.where(dist > 0.01)

    msg = CloudIndexed()
    header = Header()
    header.frame_id = "/kinect2_link"
    header.stamp = rospy.Time.now()
    msg.cloud_sources.cloud = point_cloud2.create_cloud_xyz32(header, cloud.tolist())
    msg.cloud_sources.view_points.append(Point(0,0,0))
    # for i in xrange(cloud.shape[0]):
    msg.cloud_sources.camera_source.append(Int64(0))
    for i in idx[0]:
        msg.indices.append(Int64(i))    
    rospy.sleep(1)
    pub.publish(msg)
    rospy.sleep(1)
    print 'Published cloud with', len(msg.indices), 'indices'

rospy.init_node('select_grasp')

# Subscribe to the ROS topic that contains the grasps.
cloud_sub = rospy.Subscriber('/cloud_filter', PointCloud2, cloudCallback)
pub = rospy.Publisher('/cloud_indexed', CloudIndexed, queue_size=1)

# Wait for point cloud to arrive.
while len(cloud) == 0:
    rospy.sleep(0.01)

rospy.spin()

tutorial1.launch

<launch>

  <arg name="device"                     default="0"
        doc="target device to execute inference. [0:CPU]|1:GPU|2:VPU|3:FPGA" />

  <!-- Load hand geometry parameters -->  
  <include file="$(find gpd)/launch/hand_geometry.launch">
    <arg name="node" value="detect_grasps" />
  </include>

  <!-- Load classifier parameters -->
  <!-- <include file="$(find gpd)/launch/caffe/classifier_15channels.launch"> -->
  <include file="$(find gpd)/launch/caffe/classifier_15channels.launch">
    <arg name="node" value="detect_grasps" />
  </include>

  <node name="detect_grasps" pkg="gpd" type="detect_grasps" output="screen">

    <param name="device" type="int" value="$(arg device)" />

    <!-- If sequential importance sampling is used (default: false) -->
    <param name="use_importance_sampling" value="false" />

    <!-- What type of point cloud is used and what ROS topic it comes from -->
    <!--0: PointCloud2, 1: CloudSized, 2: CloudIndexed, 3: CloudSamples -->
    <param name="cloud_type" value="1" />
    <param name="cloud_topic" value="/cloud_indexed" />

    <!-- (optional) The ROS topic that the samples come from (default: an empty string) -->
    <param name="samples_topic" value="" />

    <!-- Plotting parameters -->
    <param name="plot_normals" value="false" />
    <param name="plot_samples" value="false" />    
    <param name="plot_candidates" value="false" />
    <param name="plot_filtered_grasps" value="false" />
    <param name="plot_valid_grasps" value="false" />
    <param name="plot_clusters" value="false" />
    <param name="plot_selected_grasps" value="false" />
    <param name="rviz_topic" value="grasps_rviz" />

    <!-- Preprocessing of point cloud -->
    <param name="voxelize" value="true"/>
    <!-- change -->
    <!-- <param name="remove_outliers" value="false"/> -->
    <param name="remove_outliers" value="true"/>
    <rosparam param="workspace"> [-1,1,-1,1,1.0,1.3] </rosparam>     
    <rosparam param="camera_position"> [0, 0, 0] </rosparam>

    <!-- General parameters -->
    <!--<param name="num_samples" value="100" />
    <param name="num_threads" value="4" />-->
    <param name="num_samples" value="1000" />
    <param name="num_threads" value="8" />

    <!-- Parameters for local grasp candidate search -->
    <param name="nn_radius" value="0.01" />
    <param name="num_orientations" value="8" /> <!-- Number of orientations to consider -->

    <!-- Filtering of grasp candidates --> 
    <param name="filter_grasps" value="false" /> <!-- on workspace and robot hand aperture -->
    <rosparam param="worksapce_grasps"> [-1,1,-0.5,0.0,1.1,1.2] </rosparam>
    <param name="filter_half_antipodal" value="false"/> <!-- on half antipodal -->

    <!-- Grasp image creation -->
    <param name="create_image_batches" value="false" /> <!-- creates grasp images in batches (less memory usage) -->
    <!--<param name="remove_plane_before_image_calculation" value="false" />  removes table plane from point cloud to speed up shadow computations -->
    <param name="remove_plane_before_image_calculation" value="false" />
    <!-- Clustering of grasps -->
    <param name="min_inliers" value="1" />
    <!-- <param name="min_inliers" value="1" /> -->

    <!-- Grasp selection -->
    <!--<param name="min_score_diff" value="0" />
    <param name="min_aperture" value="0.029" />
    <param name="max_aperture" value="0.072" />
    <param name="num_selected" value="5" />-->

    <param name="min_score_diff" value="50" />
    <param name="min_aperture" value="0.0" />
    <param name="max_aperture" value="0.085" />
    <param name="num_selected" value="1" />

  </node>

</launch>
atenpas commented 5 years ago

Does the plane fitting work? I think you just copied the code from Tutorial 2, right? That won't work because it looks like your point cloud is in a different frame than in the tutorial. I would suggest you use a more general plane fitting method, like the one in PCL.

onepre commented 5 years ago

@atenpas The plane fitting doesn't work. I just copied the code from Tutorial 2, but I changed the point cloud frame to "kinect2_link" (I'm using a Kinect v2), and it still doesn't work.

As for plane model segmentation, I have used it to successfully cut out the table plane. But if I cut out the table plane, I get grasps that approach an object from below the table, so I use CloudIndexed together with plane model segmentation. Below is the code. I still get grasps that approach an object from below the table, and I don't know where the problem is.

#include <set>

#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_cloud.h>
#include <pcl/point_types.h>
#include <pcl/ModelCoefficients.h>
#include <pcl/PointIndices.h>
#include <pcl/filters/passthrough.h>
#include <pcl/filters/extract_indices.h>
#include <pcl/segmentation/sac_segmentation.h>
#include <std_msgs/Int64.h>
#include <geometry_msgs/Point.h>
#include <gpd/CloudIndexed.h>

ros::Publisher pub_indexed;

void callback(const sensor_msgs::PointCloud2::ConstPtr& input)
{
    pcl::PCLPointCloud2* cloud = new pcl::PCLPointCloud2;
    pcl::PCLPointCloud2ConstPtr cloudPtr(cloud);
    pcl::PCLPointCloud2* cloud_filtered = new pcl::PCLPointCloud2;
    pcl::PCLPointCloud2Ptr cloudFilteredPtr (cloud_filtered);

    //Convert to PCL data type
    pcl_conversions::toPCL(*input, *cloud);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud = new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtr(xyz_cloud);

    //convert the pcl::PCLPointCloud2 type to pcl::PointCloud<pcl::PointXYZRGB>
    pcl::fromPCLPointCloud2(*cloudPtr,*xyzCloudPtr);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered =   new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered(xyz_cloud_filtered);

    //cloud filter x y z
    pcl::PassThrough<pcl::PointXYZRGB> pass;    
    pass.setInputCloud(xyzCloudPtr);
    pass.setFilterFieldName("x");
    pass.setFilterLimits(-0.3,-0.1);
    pass.filter(*xyzCloudPtrFiltered);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered2 =  new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered2(xyz_cloud_filtered2);

    pcl::PassThrough<pcl::PointXYZRGB> passy;
    passy.setInputCloud(xyzCloudPtrFiltered);
    passy.setFilterFieldName("y");
    passy.setFilterLimits(-0.4,0.1);
    passy.filter(*xyzCloudPtrFiltered2);

    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_filtered3 =  new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrFiltered3(xyz_cloud_filtered3);

    pcl::PassThrough<pcl::PointXYZRGB> passz;
    passz.setInputCloud(xyzCloudPtrFiltered2);
    passz.setFilterFieldName("z");
    passz.setFilterLimits(1.0 ,1.3);
    passz.filter(*xyzCloudPtrFiltered3);

    // create a pcl object to hold the ransac filtered results
    pcl::PointCloud<pcl::PointXYZRGB> *xyz_cloud_ransac_filtered = new pcl::PointCloud<pcl::PointXYZRGB>;
    pcl::PointCloud<pcl::PointXYZRGB>::Ptr xyzCloudPtrRansacFiltered (xyz_cloud_ransac_filtered);

    // perform ransac planar filtration to remove table top
    pcl::ModelCoefficients::Ptr coefficients (new pcl::ModelCoefficients);
    pcl::PointIndices::Ptr inliers (new pcl::PointIndices);
    // Create the segmentation object
    pcl::SACSegmentation<pcl::PointXYZRGB> seg1;
    // Optional
    seg1.setOptimizeCoefficients(true);
    // Mandatory
    seg1.setModelType (pcl::SACMODEL_PLANE);
    seg1.setMethodType (pcl::SAC_RANSAC);
    // seg1.setMaxIterations (100);
    seg1.setDistanceThreshold (0.02);
    seg1.setInputCloud (xyzCloudPtrFiltered3);
    seg1.segment(*inliers, *coefficients);

    // Create the filtering object
    pcl::ExtractIndices<pcl::PointXYZRGB> extract;

    extract.setInputCloud (xyzCloudPtrFiltered3);
    extract.setIndices(inliers);
    extract.setNegative (true);
    extract.filter(*xyzCloudPtrRansacFiltered);

    //convert the pass-through filtered cloud back to a ROS message
    sensor_msgs::PointCloud2 output;
    pcl::PCLPointCloud2 outputPCL;
    pcl::toPCLPointCloud2(*xyzCloudPtrFiltered3, outputPCL);
    pcl_conversions::fromPCL(outputPCL, output);
    output.header.stamp = ros::Time::now();
    output.header.frame_id = "kinect2_link";

    //cloudIndexed: the full (pass-through filtered) cloud plus the indices of the non-plane points
    gpd::CloudIndexed indexed;
    indexed.cloud_sources.cloud = output;
    std_msgs::Int64 camera_source;
    camera_source.data = 0;
    indexed.cloud_sources.camera_source.push_back(camera_source);
    geometry_msgs::Point view_point;
    view_point.x = 0;
    view_point.y = 0;
    view_point.z = 0;
    indexed.cloud_sources.view_points.push_back(view_point);

    // the indices must point into the published cloud, so publish the
    // complement of the RANSAC plane inliers (the non-plane points)
    std::set<int> inlier_set(inliers->indices.begin(), inliers->indices.end());
    for(int i = 0; i < (int)xyzCloudPtrFiltered3->size(); i++)
    {
        if (inlier_set.count(i) == 0)
        {
            std_msgs::Int64 index;
            index.data = i;
            indexed.indices.push_back(index);
        }
    }

    pub_indexed.publish(indexed);

}

int main(int argc, char *argv[])
{
    ros::init(argc, argv,"grasp_test");
    ros::NodeHandle nh;

    ros::Subscriber sub = nh.subscribe<sensor_msgs::PointCloud2>("/kinect2/qhd/points",1,callback);
    pub_indexed = nh.advertise<gpd::CloudIndexed> ("/cloud_indexed",1);
    ros::spin();
}
atenpas commented 5 years ago

Sounds like you're on the right track! GPD generates grasps from all kinds of directions towards an object (for more details, see our paper). Your next step should be to filter out the grasps that approach an object from below the table (you would need to write the code for this).

onepre commented 5 years ago

@atenpas Thanks for your encouragement. I found a parameter called "filter_grasps" in tutorial1.launch. Is it used to filter grasps? Or can you give me some suggestions for filtering out the grasps that approach an object from below the table?

atenpas commented 5 years ago

The filter_grasps parameter might help somewhat. It will filter grasps based on comparing their positions to the parameter workspace_grasps which you would need to set according to your scenario. You want workspace_grasps to be tighter (smaller in volume) than workspace.

For your own filter, you can compare the grasp approach axis to some other axis, e.g., one that is orthogonal to the table plane. You could look at the dot product between the vectors corresponding to these axes to determine which grasps approach objects from below the table.

nevermore0127 commented 5 years ago

@atenpas Hey! I did some work similar to what @onepre did, and also tried point cloud segmentation. But I found that the grasps are not so good. The grasping positions are a little strange, and the grasps generated for the segmented object seem to collide with other things and with the table. Any ideas on how to solve this? Thanks a lot.

atenpas commented 5 years ago

You can filter those that collide with the table.

The grasping positions are a little strange

What does that mean? Do you have a screenshot?

onepre commented 5 years ago

@atenpas Thanks a lot. Now the program can generate some correct grasps.

[screenshot: github_gpd]

But there are still two problems. First, I split the real-time point cloud image, resulting in a very long time to generate grasps. How to reduce the generation time of the grasps? Second, the point cloud generated by the Kinect I use seems to be bad. Do you mind telling me what camera you are using?

atenpas commented 5 years ago

I split the real-time point cloud image

What does that mean?

How to reduce the generation time of the grasps?

This depends mainly on the density of the point cloud and the number of samples. The fewer the samples, the faster.

Do you mind telling me what camera you are using?

We use a structure.io depth camera (no RGB): https://structure.io/structure-sensor
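
As a side note, one common way to reduce the point cloud density before publishing it to GPD is a voxel-grid filter (the launch files above already enable GPD's own voxelize parameter, but thinning the cloud earlier also shrinks the message). Below is a minimal sketch in plain Python; the voxel_downsample name and the 5 mm leaf size are assumptions for illustration, and PCL's VoxelGrid filter does the same thing faster:

def voxel_downsample(points, leaf_size=0.005):
    # Keep roughly one point per (leaf_size)^3 voxel: quantize each point
    # to its voxel coordinates and keep the first point seen per voxel.
    seen = set()
    kept = []
    for p in points:
        key = (int(p[0] // leaf_size), int(p[1] // leaf_size), int(p[2] // leaf_size))
        if key not in seen:
            seen.add(key)
            kept.append(p)
    return kept

# Example: thin out the cloud before building the CloudIndexed message.
# cloud = voxel_downsample(cloud, leaf_size=0.005)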

onepre commented 5 years ago

@atenpas

Sorry, my question was unclear. I meant that I split the point clouds from the Kinect in real time.

A warning appeared today, and I don't know if it will affect the grasps:

[ WARN] [1544510740.880090540]: Marker 'finger/0' contains unnormalized quaternions. This warning will only be output once but may be true for others; enable DEBUG messages for ros.rviz.quaternions to see more details.

[screenshot: gpd_strange]

WwYyFan commented 5 years ago

@onepre I also encountered this warning. But after a while, the correct grasps were regenerated.

kctoayo88 commented 5 years ago

@onepre I referred to this issue and it helped me a lot. Thanks to you and @atenpas; I really appreciate it. But I would like to know what method you use to filter out the grasps that approach an object from below the table. I hope you can give me some tips.

Thanks !

atenpas commented 5 years ago

But I would like to know what method you use to filter out the grasps that approach an object from below the table.

This has already been explained in a comment above:

For your own filter, you can compare the grasp approach axis to some other axis, e.g., one that is orthogonal to the table plane. You could look at the dot product between the vectors corresponding to these axes to determine which grasps approach objects from below the table.

So you would be comparing the grasp approach vector with a vertical, upward-pointing vector. If both vectors point in about the same direction (corresponding to a dot product close to 1), this means that the grasp approaches an object from below, and should be filtered out.
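
A minimal sketch of such a filter, assuming the grasps arrive as a gpd/GraspConfigList and that the table normal is the upward-pointing +z axis of the frame the grasps are expressed in (both are assumptions for your setup, and the 0.7 threshold is arbitrary):

import numpy as np

def filter_grasps_from_below(grasps, table_normal=(0.0, 0.0, 1.0), max_dot=0.7):
    # Drop grasps whose approach direction points upward along the table
    # normal, i.e. grasps that approach the object from below the table.
    normal = np.asarray(table_normal, dtype=float)
    normal = normal / np.linalg.norm(normal)
    kept = []
    for g in grasps:
        approach = np.array([g.approach.x, g.approach.y, g.approach.z])
        approach = approach / np.linalg.norm(approach)
        # Dot product close to 1 => approach nearly parallel to the upward
        # normal => the grasp comes from below; filter it out.
        if np.dot(approach, normal) < max_dot:
            kept.append(g)
    return kept

Each GraspConfig in the list carries approach, binormal, and axis vectors, so the filter only needs the approach field.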

kctoayo88 commented 5 years ago

@atenpas Thanks for your reply; it is helpful information for me. But now I have run into a problem I cannot solve: getting the quaternion of the robot grasp pose.

Here is my code:

#!/usr/bin/env python
import rospy
import sys
import tf
import numpy as np
from gpd.msg import GraspConfigList
from tf.transformations import quaternion_from_matrix
from geometry_msgs.msg import Pose
from geometry_msgs.msg import PoseStamped 

grasps = []

def callback(msg):
    global grasps
    grasps = msg.grasps

# Create a ROS node.
rospy.init_node('get_grasps')
rospy.loginfo('Start to get grasp data...')

# Subscribe to the ROS topic that contains the grasps.
sub = rospy.Subscriber('/detect_grasps/clustered_grasps', GraspConfigList, callback)

# Wait for grasps to arrive.
rospy.sleep(1)

while not rospy.is_shutdown():    
    if len(grasps) > 0:
        top_grasp = grasps[0] # grasps are sorted in descending order by score.
        rospy.loginfo('Received %d grasps.', len(grasps))

        R = (top_grasp.approach, top_grasp.binormal, top_grasp.axis)
        R = np.asarray(R)
        print(R)

        qua_R = tf.transformations.quaternion_from_matrix(R)
        print qua_R

        break   

rospy.spin()

I got these error message:

 cyc@cyc-X550JX:~$ rosrun gpd test_grasp.py 
[INFO] [1558753204.760210]: Start to get grasp data...
[INFO] [1558753209.430311]: Received 1 grasps.
[x: 0.527165424825
y: 0.842580358548
z: 0.110249509106
 x: 0.849633644563
y: -0.524891291709
z: -0.0511057914036
 x: 0.0148082712001
y: 0.120612898469
z: -0.992589161651]
Traceback (most recent call last):
  File "/home/cyc/catkin_ws/src/gpd/src/test_grasp.py", line 74, in <module>
    qua_R = tf.transformations.quaternion_from_matrix(R)
  File "/opt/ros/kinetic/lib/python2.7/dist-packages/tf/transformations.py", line 1206, in quaternion_from_matrix
    M = numpy.array(matrix, dtype=numpy.float64, copy=False)[:4, :4]
TypeError: float() argument must be a string or a number

Do I need to convert the GraspConfig.msg (R) matrix to a special data type? Thanks a lot.

atenpas commented 5 years ago

It looks like quaternion_from_matrix expects a 4x4 matrix.

By the way, your issue is unrelated to GPD. Please use google or stackoverflow to solve such issues.
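
For reference, one way to do the conversion is to embed the three axes in a 4x4 homogeneous matrix, since quaternion_from_matrix rejects a tuple of Vector3 messages. A minimal sketch, assuming approach, binormal, and axis form the columns of the rotation matrix (check GPD's convention before relying on this):

import numpy as np
from tf.transformations import quaternion_from_matrix

def grasp_to_quaternion(grasp):
    # Build a 4x4 homogeneous matrix from the three grasp axes and
    # convert it to a quaternion (x, y, z, w).
    R = np.eye(4)
    # Assumption: approach, binormal, axis are the rotation's columns.
    R[:3, 0] = [grasp.approach.x, grasp.approach.y, grasp.approach.z]
    R[:3, 1] = [grasp.binormal.x, grasp.binormal.y, grasp.binormal.z]
    R[:3, 2] = [grasp.axis.x, grasp.axis.y, grasp.axis.z]
    return quaternion_from_matrix(R)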

kctoayo88 commented 5 years ago

@atenpas Thanks for your comments. I have already figured out the problem.