Belidan closed this issue 6 years ago
The workspace filtering is explained in the README:
The first defines the volume of space in which to search for grasps as a cuboid of dimensions [minX, maxX, minY, maxY, minZ, maxZ], centered at the origin.
You should make the volume as close as possible to the workspace that your robot needs. Just for testing, though, you could start from a very large workspace, e.g., [-10,10,-10,10,-10,10].
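For intuition, the workspace filter can be sketched as an axis-aligned crop: a point survives only if each of its coordinates lies inside the corresponding [min, max] pair. This is a hypothetical NumPy sketch of the idea, not the package's actual C++ implementation:

```python
import numpy as np

def workspace_filter(points, ws):
    """Keep only points inside the cuboid ws = [minX, maxX, minY, maxY, minZ, maxZ].

    points: (N, 3) array expressed in the same frame as the workspace.
    (Illustrative sketch only; GPD implements this internally in C++.)
    """
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    mask = ((ws[0] <= x) & (x <= ws[1]) &
            (ws[2] <= y) & (y <= ws[3]) &
            (ws[4] <= z) & (z <= ws[5]))
    return points[mask]
```

With a very large workspace such as [-10,10,-10,10,-10,10], essentially every point of a typical indoor cloud passes the filter.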
I am also trying to use your package. To be perfectly clear, this workspace should be specified in the camera frame (the frame_id of the point cloud), correct?
Yes, it should be specified in the frame_id of the point cloud.
I would suggest transforming the cloud into a different frame, though (e.g., the base frame of your robot), because the workspace is measured from the origin along the axes of the cloud's frame. Depending on the orientation of your camera, those bounds can be difficult to measure.
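In a ROS setup the transform would normally come from tf (e.g., tf2), but frame-agnostically it is just a 4x4 homogeneous matrix applied to every point. A minimal NumPy sketch, with all names hypothetical:

```python
import numpy as np

def transform_points(points, T):
    """Transform (N, 3) points by a 4x4 homogeneous matrix T,
    e.g., camera frame -> robot base frame.
    In ROS, T would be looked up via tf for the cloud's frame_id."""
    homog = np.hstack([points, np.ones((points.shape[0], 1))])
    return (homog @ T.T)[:, :3]
```

After transforming into the base frame, the workspace bounds can be read off directly in robot coordinates (e.g., "the table is 0.5 m to 1.0 m in front of the base").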
Will do thanks :)
Hey atenpas, first of all, great work on your grasp detection; it works pretty nicely. Unfortunately, I ran into a problem running it with the Care-O-bot simulation in Gazebo, which produces a virtual point cloud instead of a real one. After changing the subscribed topic accordingly, the algorithm produces the following output:

    Processing cloud with: 710 points.
    After workspace filtering: 0 points left.
    [pcl::KdTreeFLANN::setInputCloud] Cannot create a KDTree with an empty input cloud!
    Estimating local reference frames ...
    Error: No samples or no indices!
The cloud can be larger (I also tried the raw point cloud from the simulation with around 10k points), but the error still occurs. What exactly does the workspace filtering do, and how can I prevent this error?
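When the filter removes every point ("After workspace filtering: 0 points left"), the workspace cuboid usually does not overlap the cloud at all: wrong frame, wrong units, or bounds that are simply too tight. A quick diagnostic (hypothetical helper, not part of GPD) is to print the cloud's bounding box and check it against the workspace:

```python
import numpy as np

def check_workspace(points, ws):
    """Print the cloud's axis-aligned bounding box and return True if the
    workspace cuboid ws = [minX, maxX, minY, maxY, minZ, maxZ] overlaps it.
    (Hypothetical diagnostic sketch, not part of the GPD package.)"""
    lo, hi = points.min(axis=0), points.max(axis=0)
    print("cloud bounds: x [%.2f, %.2f], y [%.2f, %.2f], z [%.2f, %.2f]"
          % (lo[0], hi[0], lo[1], hi[1], lo[2], hi[2]))
    # Overlap on every axis: workspace min <= cloud max AND cloud min <= workspace max.
    return all(ws[2 * i] <= hi[i] and lo[i] <= ws[2 * i + 1] for i in range(3))
```

If this returns False for your configured workspace, either widen the bounds (e.g., [-10,10,-10,10,-10,10] for testing) or transform the cloud into the frame in which you measured the workspace.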
Kind regards, Dimitrij-M Holm