tianshiz opened this issue 4 years ago
Hey, that's hard to tell without further info. Do you have an image?
I suspect something is not 100% calibrated, which would explain why the virtual view is not fitting the real view better, or maybe you're using the wrong URDF link when configuring the filter?
Hi,
You can see the sliver on the arm here. I'm feeding in the filtered depth image; the sliver appears a bit after the robot stops moving. Debug window here:
Here are my filter parameters:
# used for the URDF realtime filter
fixed_frame: /map
camera_frame: /zed_left_camera_optical_frame_3
camera_offset:
  translation: [0.0, 0.0, 0.0]
  rotation: [0.0, 0.0, 0.0, 1.0]
models:
  - model: "robot_description"
    tf_prefix: ""
    depth_distance_threshold: 0.05
    show_gui: true
    filter_replace_value: 0.0
Maybe it's a TF frequency thing? I had to add a waitForTransform in front of all the lookupTransform calls in the code.
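For reference, here is a minimal sketch of that pattern (not code from this package; the node name is made up and the frame names are just placeholders taken from the config above):

```cpp
// Sketch: wait for the transform to become available before looking it up,
// so a slowly published TF tree doesn't cause lookup/extrapolation errors.
#include <ros/ros.h>
#include <tf/transform_listener.h>

int main(int argc, char** argv)
{
  ros::init(argc, argv, "tf_wait_example");
  ros::NodeHandle nh;

  tf::TransformListener listener;
  tf::StampedTransform transform;

  const std::string target = "map";                              // fixed frame
  const std::string source = "zed_left_camera_optical_frame_3";  // camera frame

  ros::Rate rate(10.0);
  while (nh.ok())
  {
    try
    {
      // Block (up to 0.5 s) until the transform exists, then look it up.
      listener.waitForTransform(target, source, ros::Time(0), ros::Duration(0.5));
      listener.lookupTransform(target, source, ros::Time(0), transform);
    }
    catch (const tf::TransformException& ex)
    {
      ROS_WARN("%s", ex.what());
    }
    rate.sleep();
  }
  return 0;
}
```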
You can deal with this problem by adding another filter, such as PCL's StatisticalOutlierRemoval or RadiusOutlierRemoval, after converting the depth image to a point cloud.
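A minimal sketch of that suggestion, assuming the filtered depth image has already been converted to a sensor_msgs/PointCloud2; the topic names and the MeanK / StddevMulThresh values are placeholders to tune:

```cpp
// Sketch: drop leftover sliver points with PCL's StatisticalOutlierRemoval.
#include <vector>
#include <ros/ros.h>
#include <sensor_msgs/PointCloud2.h>
#include <pcl_conversions/pcl_conversions.h>
#include <pcl/point_types.h>
#include <pcl/filters/filter.h>
#include <pcl/filters/statistical_outlier_removal.h>

ros::Publisher pub;

void cloudCb(const sensor_msgs::PointCloud2ConstPtr& input)
{
  pcl::PointCloud<pcl::PointXYZ>::Ptr cloud(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::PointCloud<pcl::PointXYZ>::Ptr filtered(new pcl::PointCloud<pcl::PointXYZ>);
  pcl::fromROSMsg(*input, *cloud);

  // Remove invalid points before running the statistical filter.
  std::vector<int> indices;
  pcl::removeNaNFromPointCloud(*cloud, *cloud, indices);

  // Discard points whose mean distance to their 50 nearest neighbors is more
  // than 1 standard deviation above the global mean distance.
  pcl::StatisticalOutlierRemoval<pcl::PointXYZ> sor;
  sor.setInputCloud(cloud);
  sor.setMeanK(50);
  sor.setStddevMulThresh(1.0);
  sor.filter(*filtered);

  sensor_msgs::PointCloud2 output;
  pcl::toROSMsg(*filtered, output);
  output.header = input->header;
  pub.publish(output);
}

int main(int argc, char** argv)
{
  ros::init(argc, argv, "sliver_outlier_removal");
  ros::NodeHandle nh;
  pub = nh.advertise<sensor_msgs::PointCloud2>("points_filtered", 1);
  ros::Subscriber sub = nh.subscribe("points", 1, cloudCb);
  ros::spin();
  return 0;
}
```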
Hi all,
Sorry to jump into this conversation with an off-topic question, but I cannot convert the output of this package into a PointCloud message. Can anyone help me with this procedure?
Thanks in advance
Hi @lucarossini-iit.
This place is meant for issues related to the realtime_urdf_filter package. Next time, please ask questions at https://answers.ros.org/questions/.
I have a snippet of a launch file that could be helpful for your problem: https://github.com/kuka-isir/kinects_human_tracking/blob/master/launch/create_pc.launch#L1-L16
More information here: http://wiki.ros.org/depth_image_proc
The outlier-removal approach above only handles the leftover manipulator points in a static environment. A more fundamental fix is to apply OpenCV's morphological dilate operation to the manipulator mask in urdf_filter.cpp, which is what I did.
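A minimal sketch of that idea, not the actual change to urdf_filter.cpp; the mask source, kernel size, and helper names are assumptions:

```cpp
// Sketch: grow the binary robot mask by a few pixels with cv::dilate so the
// self-filter also covers the thin sliver around the rendered model's edges.
#include <opencv2/core.hpp>
#include <opencv2/imgproc.hpp>

cv::Mat inflateRobotMask(const cv::Mat& robot_mask, int kernel_size = 7)
{
  // Elliptical structuring element; a larger kernel inflates the mask more.
  cv::Mat kernel = cv::getStructuringElement(
      cv::MORPH_ELLIPSE, cv::Size(kernel_size, kernel_size));

  cv::Mat inflated;
  cv::dilate(robot_mask, inflated, kernel);
  return inflated;
}

// Usage: pixels where the inflated mask is non-zero get the replace value
// (depth is assumed to be a CV_32FC1 image).
void applyMask(cv::Mat& depth, const cv::Mat& robot_mask, float replace_value = 0.0f)
{
  cv::Mat inflated = inflateRobotMask(robot_mask);
  depth.setTo(cv::Scalar(replace_value), inflated);
}
```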
I'm running on Melodic with my robotic arm. The majority of the arm is filtered, but when the result is converted to a point cloud, a sliver remains. The sliver is a good 10% of the URDF, so it's very noticeable.
Does this mean there is some misalignment in the URDF? Or, alternatively, is there a way to inflate the mask a bit more to make up for it?