IntelRealSense / librealsense

Intel® RealSense™ SDK
https://www.intelrealsense.com/
Apache License 2.0
7.59k stars 4.82k forks

Depth quality gets worse at the edges of objects #3741

Closed TouchDeeper closed 5 years ago

TouchDeeper commented 5 years ago

Required Info
Camera Model D435
Firmware Version 05.11.01.00
Operating System & Version Ubuntu 16
Kernel Version (Linux Only) 4.15.0
Platform PC
SDK Version 2.8.1
Language C++
Segment Robot

Issue Description

Hi, I want to do pose estimation based on a point cloud. My object is a banana. The scene from which I capture the banana's point cloud is shown in the image below: [screenshot 2019-04-12 09-38-03]

The point cloud I get from the scene is shown in the image below. View along the Z axis: [screenshot 2019-04-12 09-38-16]

Side view: [screenshot 2019-04-12 09-39-27]

As you can see from the side view, the point cloud at the edge becomes elongated along the Z axis. What can I do to preserve the point cloud at the edge of the banana?

TouchDeeper commented 5 years ago

I have applied the spatial filter and the temporal filter. The preset I use is ShortRangePreset.json.

RealSenseCustomerSupport commented 5 years ago

Hi TouchDeeper,

I think that you are on the right track. Below are some general suggestions:

1) If you have not already, validate your camera's depth quality with the Depth Quality Tool, and recalibrate if needed.
2) Start from a baseline: either the default preset (it is tuned for the best trade-offs among all parameters) or the ShortRangePreset, with no post-processing enabled. Use 1280x720 for the D415 and 848x480 for the D435. You may also compare a few different presets (especially High Density if you are looking for edge preservation).
3) Keep a good lighting environment and the laser on. Increase laser power if needed.
4) Turn post-processing back on with the spatial/temporal filters, and also try the other post-processing filters.
5) You may also adjust the depth control parameters under "Advanced Controls => Depth Control".

Also refer to this whitepaper about tuning depth quality, if you have not already: https://dev.intelrealsense.com/docs/tuning-depth-cameras-for-best-performance

Again, edge preservation is a tough topic and there are various factors to consider. Hope this helps.

Thanks!

TouchDeeper commented 5 years ago

Thanks! We also had a discussion on the Intel forum, and when I move the object closer to the camera, the depth quality at the edge of the object improves.

tugbakara commented 1 year ago

In link1 and link2, if I set these parameters in the Viewer, will they help on the ROS side? @MartyG-RealSense ?

MartyG-RealSense commented 1 year ago

Hi @tugbakara It is not necessary to use the Viewer to set a camera configuration Visual Preset or the Decimation filter in ROS, as you can do so directly in the ROS wrapper.

https://github.com/IntelRealSense/realsense-ros/issues/1924#issuecomment-858369624 provides information about defining a Decimation filter in the wrapper.

Note, though, that if you use the Decimation filter you apparently cannot set align_depth to true, as described at https://github.com/IntelRealSense/realsense-ros/issues/2269

In regard to the preset, you could try downloading a json preset file from the Visual Presets page and then loading it into the wrapper during launch by defining the json_file_path wrapper parameter, as described at https://github.com/IntelRealSense/realsense-ros/issues/2445#issuecomment-1211568100
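Putting both suggestions together, a launch command along these lines should work for the ROS1 wrapper (a sketch only — the file path is a placeholder, and exact argument names can vary between wrapper versions, so check your version's launch files):

```shell
# Enable post-processing filters and load a downloaded Visual Preset json
# at launch time via the realsense2_camera ROS1 wrapper.
roslaunch realsense2_camera rs_camera.launch \
    filters:=decimation,spatial,temporal \
    json_file_path:=/path/to/ShortRangePreset.json
```

This is a config fragment that needs a ROS installation and a connected camera, so it is not something to run standalone.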

tugbakara commented 1 year ago

Really, thanks! I will try and give feedback! @MartyG-RealSense :) EDIT: Are they applicable to ROS Melodic?