luxonis / depthai-ros

Official ROS Driver for DepthAI Sensors.
MIT License

3D map generated from rtabmap-ros is not accurate enough to visualize the real environment #535

Closed Jenanaputra closed 3 months ago

Jenanaputra commented 4 months ago

Hi! I am currently working on mapping the environment around me using rtabmap in the launch file. The map can be generated, but I found that the odometry is sometimes lost (I know this can be worked around as described here: click here), and the generated map is not accurate enough (it is not clear and appears shaded) to visualize the environment around me.

I have attached the generated map below: Screenshot from 2024-05-27 13-12-49

Screenshot from 2024-05-27 15-17-05

For additional information, I am using an OAK-D PoE to generate the map, with the i_subpixel and i_low_bandwidth parameters set to false and true respectively.
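For reference, here is a minimal sketch of how such parameters might be passed to the driver from a ROS 2 Python launch file; the stereo.* namespace and the package/executable names are assumptions based on common depthai_ros_driver usage, not details confirmed in this thread:

```python
# Hedged sketch: package, executable, and parameter namespace are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='depthai_ros_driver',       # assumed package name
            executable='camera_node',           # assumed executable name
            name='oak',
            parameters=[{
                'stereo.i_subpixel': False,     # values described in this issue
                'stereo.i_low_bandwidth': True,
            }],
        ),
    ])
```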

Does anyone know why this happens? Could it be because the odometry data is not accurate enough?

Serafadam commented 4 months ago

Hi, this might be caused by data delays from using the PoE interface; you can additionally try lowering the FPS/resolution to improve latency.

Jenanaputra commented 3 months ago

Hi @Serafadam, I tried lowering only the FPS parameter to 10. I also tried lowering the resolution, but I got a warning that 1080P is the lowest resolution of the IMX378 sensor used by the OAK-D PoE (click here).

Even though I lowered the FPS and the camera seemed to update the image faster (improved latency), I still got the same result on the generated map. The map cannot visualize the object of interest properly; sometimes it is shaded and overlapping when I change the orientation of the camera relative to the object of interest.

This is just my guess: is it possible that this is due to a lack of IMU data (it may need calibration) and gaps in the odometry data (when it is lost and then regained)? What do you think?

Serafadam commented 3 months ago

Hi, you can get lower resolutions with ISP scaling, more information on that here, and here is the sheet that you can use for output size calculation. Enter 1920x1080 as the image size; when all fields near the output ratio/size are green (for example, with 2/3 you get 1280x720), the ratio will be correct for the depth output as well. Regarding mapping, it might be that the odometry is lagging a bit behind, but AFAIK the IMU is not taken into account for it, only for the initial orientation estimation. Regarding odometry losses, they can happen with fast motions or when the camera has autoexposure turned on.
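As a rough illustration of the arithmetic behind that sheet, the short Python sketch below scales the 1920x1080 ISP output by a num/den pair and flags results that are not integer sizes or whose width is not a multiple of 16; the alignment rule is an assumption, and the official sheet may check additional constraints:

```python
def isp_output_size(num: int, den: int, width: int = 1920, height: int = 1080):
    """Scale the full ISP resolution by num/den and report whether the result
    is an integer size with a width aligned to 16 pixels.
    (The 16-pixel alignment rule is an assumption; the official calculation
    sheet may apply additional constraints.)"""
    w, h = width * num / den, height * num / den
    ok = w.is_integer() and h.is_integer() and int(w) % 16 == 0
    return int(w), int(h), ok


# Example: the 2/3 scale mentioned above gives 1280x720.
print(isp_output_size(2, 3))  # -> (1280, 720, True)
print(isp_output_size(3, 5))  # -> (1152, 648, True)
print(isp_output_size(1, 7))  # 1920/7 is not an integer -> flagged as invalid
```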

Jenanaputra commented 3 months ago

Hi @Serafadam ,

I just want to confirm: if I want to use ISP scaling and have found suitable values for den and num so that all fields on the sheet are green, the parameters that I need to set are rgb.i_isp_den, rgb.i_isp_num, rgb.i_output_isp, rgb.i_set_isp3a_fps, and rgb.i_set_isp_scale, right?

If I want to turn off the auto exposure, which parameter do I need to change? Also, regarding the lagging odometry data, do you have any ideas on how to overcome it?

Jenanaputra commented 3 months ago

Hi @Serafadam,

Do you have any thoughts about the lagging odometry data?

Serafadam commented 3 months ago

Hi, sorry for the delay:

If I want to use ISP scaling and have found suitable values for den and num so that all fields on the sheet are green, the parameters that I need to set are rgb.i_isp_den, rgb.i_isp_num, rgb.i_output_isp, rgb.i_set_isp3a_fps, and rgb.i_set_isp_scale, right?

Just set_isp_scale and isp_den/isp_num should be enough. For exposure: r_set_man_exposure: true
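For concreteness, a hedged launch-file sketch of setting those parameters (the rgb.* namespace, the parameter value types, and the package/executable names are assumptions carried over from the earlier sketch; 2/3 is just the example scale from above):

```python
# Hedged sketch: names, namespaces, and value types are assumptions.
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        Node(
            package='depthai_ros_driver',        # assumed package name
            executable='camera_node',            # assumed executable name
            name='oak',
            parameters=[{
                'rgb.i_set_isp_scale': True,     # enable ISP scaling (assumed bool)
                'rgb.i_isp_num': 2,              # 1920x1080 * 2/3 -> 1280x720
                'rgb.i_isp_den': 3,
                'rgb.r_set_man_exposure': True,  # turn off autoexposure
            }],
        ),
    ])
```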

Regarding lags, are those nodes running in the same component container? DDS can also introduce some lag; you can try different implementations, tune parameters, or use IPC (more information on that can be found in the ROS docs). In the near future we will also provide VIO and RTABMap DAI nodes, which should get rid of those ROS limitations.
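If the nodes are not already composed, one way to enable intra-process communication is to load them into a single component container. Below is a hedged sketch using the standard rclcpp_components container; the depthai_ros_driver::Camera and rtabmap_slam::CoreWrapper plugin names are assumptions, not names confirmed in this thread:

```python
# Hedged sketch: plugin and package names are assumptions.
from launch import LaunchDescription
from launch_ros.actions import ComposableNodeContainer
from launch_ros.descriptions import ComposableNode


def generate_launch_description():
    container = ComposableNodeContainer(
        name='oak_container',
        namespace='',
        package='rclcpp_components',
        executable='component_container',
        composable_node_descriptions=[
            ComposableNode(
                package='depthai_ros_driver',
                plugin='depthai_ros_driver::Camera',   # assumed plugin name
                name='oak',
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
            ComposableNode(
                package='rtabmap_slam',
                plugin='rtabmap_slam::CoreWrapper',    # assumed plugin name
                name='rtabmap',
                extra_arguments=[{'use_intra_process_comms': True}],
            ),
        ],
        output='screen',
    )
    return LaunchDescription([container])
```

Keeping the camera driver and RTAB-Map in one container lets ROS 2 pass large image messages by pointer instead of serializing them through DDS, which is typically where the latency comes from.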

Jenanaputra commented 3 months ago

@Serafadam Ok, noted. Thanks