louxibai opened this issue 5 years ago
You should try out GPD: https://github.com/atenpas/gpd.
GPG uses simple geometric constraints to create a set of grasp candidates. GPD then classifies these candidates as good or bad grasps, where good grasps are mechanically stable (antipodal) and expected to be collision-free. If you want more background on the algorithm, check out our paper.
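To make that concrete, here is a minimal, self-contained sketch of the idea (not the actual GPG/GPD implementation): sample pairs of contact points that fit inside the gripper opening, and keep a pair as a good candidate when it is antipodal, i.e. the line between the contacts lies inside both friction cones. The function names, the friction coefficient mu, the gripper opening, and the toy cuboid data are all illustrative assumptions, not values taken from the library.

import numpy as np

def is_antipodal(p1, n1, p2, n2, mu=0.5):
    """Return True if contacts (p1, n1) and (p2, n2) form an antipodal grasp.

    n1, n2 are unit outward surface normals; mu is the friction coefficient,
    so the friction-cone half-angle is arctan(mu).
    """
    axis = p2 - p1
    axis = axis / np.linalg.norm(axis)
    cos_cone = np.cos(np.arctan(mu))
    # The grasp axis must lie inside the friction cone around the inward
    # normal at each contact (-n1 at p1, -n2 at p2).
    return (-np.dot(axis, n1) >= cos_cone) and (np.dot(axis, n2) >= cos_cone)

def sample_candidates(points, normals, max_opening=0.08, num_pairs=2000, rng=None):
    """Randomly pair surface points that fit inside the gripper opening and
    keep the pairs that pass the antipodal test."""
    if rng is None:
        rng = np.random.default_rng(0)
    good = []
    for _ in range(num_pairs):
        i, j = rng.integers(len(points), size=2)
        if i == j:
            continue
        if np.linalg.norm(points[j] - points[i]) > max_opening:
            continue  # contacts would not fit between the fingers
        if is_antipodal(points[i], normals[i], points[j], normals[j]):
            good.append((i, j))
    return good

if __name__ == "__main__":
    # Toy "cuboid": two opposite side faces 0.05 m apart with outward normals.
    rng = np.random.default_rng(1)
    left = np.column_stack([np.zeros(200), rng.uniform(0.0, 0.05, 200), rng.uniform(0.0, 0.10, 200)])
    right = left + np.array([0.05, 0.0, 0.0])
    points = np.vstack([left, right])
    normals = np.vstack([np.tile([-1.0, 0.0, 0.0], (200, 1)), np.tile([1.0, 0.0, 0.0], (200, 1))])
    print(len(sample_candidates(points, normals)), "antipodal candidate pairs found")

Pairs on the same face fail the friction-cone test, while pairs spanning the two opposite faces pass it, which is exactly the antipodal notion of a "good" grasp described above.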
Hope this helps!
I ran gpg on a partial point cloud of a cuboid collected with a Kinect, but it looks like your algorithm is mostly interested in the blank area below the object. How can I make it generate more desirable candidates?
2019-01-16 16:51:32.023774 Failed to find match for field 'rgba'.
finger_width: 0.01
hand_outer_diameter: 0.12
hand_depth: 0.06
hand_height: 0.02
init_bite: 0.01
voxelize: 1
remove_outliers: 0
workspace: -1.0 1.0 -1.0 1.0 -1.0 1.0
camera_pose: -1 8.74227908e-08 3.55271368e-15 1.41000152e-02 6.18172749e-08 7.07107067e-01 7.07106531e-01 -5.47394216e-01 6.18172180e-08 7.07106531e-01 -7.07107067e-01 7.41109848e-01 0 0 0 1
num_samples: 2
num_threads: 4
nn_radius: 0.01
num_orientations: 1
rotation_axis: 2
plot_grasps: 1
plot_normals: 1
Loaded point cloud with 328 points
3
Processing cloud with: 328 points.
After workspace filtering: 328 points left.
After voxelization: 206 points left.
Subsampled 2 at random uniformly.
Calculating surface normals ...
camera: 0, #indices: 206, #normals: 206
runtime (normals): 0.000823014
Reversing direction of normals that do not point to at least one camera ...
reversed 0 normals
runtime (reverse normals): 3.4601e-06
Drawing 206 normals
Estimating local reference frames ...
Estimated 2 local reference frames in 7.1371e-05 sec.
Finding hand poses ...
Found 2 grasp candidate sets in 9.1517e-05 sec.
====> HAND SEARCH TIME: 0.000218619
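The log already hints at why so few candidates appear near the object: only 2 points were sampled (num_samples: 2), only one hand orientation was tried per sample (num_orientations: 1), and the workspace spans the full -1.0 to 1.0 range in every axis, so nothing restricts sampling to the region around the cuboid. Assuming the parameters printed above correspond one-to-one to key = value entries in the configuration file passed to gpg (an assumption; check your own config), a sketch of the kind of change that should yield more candidates on the object itself looks like this, where the workspace bounds are placeholders to be set around your cuboid rather than measured values:

# assumed key = value format; bounds are placeholders, not measured values
num_samples = 100
num_orientations = 8
workspace = -0.2 0.2 -0.2 0.2 0.0 0.3

Sampling more points and allowing several hand rotations per sample multiplies the number of candidate hands, and tightening the workspace removes the empty region below and around the object from consideration.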