IntelRealSense / realsense-ros

ROS Wrapper for Intel(R) RealSense(TM) Cameras
http://wiki.ros.org/RealSense
Apache License 2.0

underwater calibration for d435 #1723

Closed robotlearning123 closed 3 years ago

robotlearning123 commented 3 years ago

Hi, we made a waterproof housing for the D435 camera, so it needs to be recalibrated. The default calibration tool from RealSense seems to calibrate only the RGB image, right? But I think we need to calibrate the stereo images to get accurate depth. What is more, we do not use the IR camera because it does not work underwater.

I want to use something like the camera_calibration package to calibrate the stereo camera, but there are no left and right image topics available. Any suggestions for that? Thanks a lot!

MartyG-RealSense commented 3 years ago

Hi @wangcongrobot The Dynamic Calibration tool provides a robust calibration for the depth sensors and the RGB sensor too if the RealSense camera model has one (which the D435 model does, of course).

You can also calibrate the camera's depth using the On-Chip Calibration function. This is accessible from the RealSense Viewer program by going to the 'More' option near the top of the Viewer's options side-panel and selecting the On-Chip Calibration menu option.

The Tare Calibration function from the same menu checks the camera's absolute distance measurement accuracy.

A white-paper document about On-Chip calibration and Tare is at the link below.

https://dev.intelrealsense.com/docs/self-calibration-for-depth-cameras

I would recommend using one of those two options (Dynamic Calibration or On-Chip) for calibrating the camera, since these packages can write the completed calibration to storage inside the camera hardware once you are satisfied with the results.

MartyG-RealSense commented 3 years ago

Hi @wangcongrobot Do you require further assistance with this case, please? Thanks!

robotlearning123 commented 3 years ago

Thanks a lot for helping! The calibration tool is good! I will close this issue.

robotlearning123 commented 3 years ago

@MartyG-RealSense Hi, another question. Can I access the stereo camera images (left and right) if we want to calibrate the camera with a checkerboard ourselves? I didn't find the topics in the default launch file. Thanks a lot!

MartyG-RealSense commented 3 years ago

I would recommend using the SDK's official Dynamic Calibration or On-Chip Calibration tools to calibrate the camera, as they have the ability to write the new calibration to storage inside the camera hardware once you are satisfied with the results.

A popular calibration tool amongst RealSense users who prefer to use a method other than the two official ones above is Kalibr.

https://github.com/ethz-asl/kalibr

If you prefer to develop your own solution then you can enable both the left and right infrared topics (which are disabled by default in the RealSense ROS wrapper) by adding the commands below to the end of your roslaunch instruction:

enable_infra1:=true enable_infra2:=true

robotlearning123 commented 3 years ago

Thanks for your quick reply! After calibrating with the Dynamic Calibration tool, the depth is good. But when I test the results, the wall appears as a curve, not a plane. So I am not sure whether the Dynamic Calibration tool calibrates the stereo image parameters. P.S.: The camera works underwater with a waterproof housing. Any suggestions for this problem?

MartyG-RealSense commented 3 years ago

There was a curved-wall case on this RealSense ROS forum in December 2020.

https://github.com/IntelRealSense/realsense-ros/issues/1534

It is a lengthy discussion but, in short, the problem in that case was corrected by resetting the camera to its factory default configuration.

https://github.com/IntelRealSense/realsense-ros/issues/1534#issuecomment-741953174

MartyG-RealSense commented 3 years ago

Hi @wangcongrobot Do you require further assistance with this case, please? Thanks!

robotlearning123 commented 3 years ago

Hi @MartyG-RealSense Thanks a lot for your help! Can I ask another question? Why can I not launch the infra camera topics, even though enable_infra1:=true enable_infra2:=true were added in the rs_camera.launch file? There are only RGB and depth topics.

MartyG-RealSense commented 3 years ago

Hi @wangcongrobot Instructions written in that form are placed at the end of the roslaunch command, not inside the launch file.

roslaunch realsense2_camera rs_camera.launch enable_infra1:=true enable_infra2:=true

You can define enable_infra1 and enable_infra2 within the launch file if you prefer, though the instructions are written a little differently.
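
For reference, inside rs_camera.launch these settings appear as <arg> declarations whose defaults you can change. A sketch of what that looks like (based on wrapper versions of that era - verify the exact arg names in your copy of the file):

<arg name="enable_infra1" default="true"/>
<arg name="enable_infra2" default="true"/>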

robotlearning123 commented 3 years ago

Hi @MartyG-RealSense, I tried this but there was no infra topic. Here is a screenshot of realsense-viewer on Windows, where I cannot find the infra streams either. I found this post (https://support.intelrealsense.com/hc/en-us/community/posts/360040163593-How-to-get-pure-infrared-image-from-Realsense-D415-), which shows infra streams in the Stereo Module. I have also updated to the latest firmware.

MartyG-RealSense commented 3 years ago

The RealSense Viewer and other programs built on librealsense do not interact with the RealSense ROS wrapper. So setting options in the Viewer will have no effect on the camera's behaviour in ROS. This also means that the camera might stream without problems in the Viewer but not in ROS.

Can you confirm please that you have built the RealSense ROS wrapper?

https://github.com/IntelRealSense/realsense-ros

robotlearning123 commented 3 years ago

I mean that neither the RealSense Viewer on Windows nor the ROS wrapper on Linux can get the infra image. And I can confirm the ROS wrapper is built correctly.

MartyG-RealSense commented 3 years ago

Okay, I understand now. Thank you very much. Could you post an image of the Stereo Module section of your Viewer controls when streaming is off, please, so that I can see the stream selection boxes under Available Streams?

robotlearning123 commented 3 years ago

Thank you a lot! I didn't find the infra streams in the Viewer.

MartyG-RealSense commented 3 years ago

I am sorry, please show me the Stereo Module when it is Off (red icon, not blue) so I can see the Available Streams settings.

robotlearning123 commented 3 years ago

Thank you a lot! Now I see the problem... I must enable the infra streams while the Stereo Module is off. Now the infra image is normal. Next I need to check the ROS wrapper and how to get the infra image topics.

MartyG-RealSense commented 3 years ago

You are very welcome. Yes, you can only select the streams whilst the Stereo Module is off.

robotlearning123 commented 3 years ago

@MartyG-RealSense Thank you! I found out why I could not get the infra images. My system is Ubuntu 18.04 and I rebuilt the kernel as an RT (real-time) kernel. When I change to another system, the following works:

roslaunch realsense2_camera rs_camera.launch enable_infra1:=true enable_infra2:=true

What is more, how can I get the raw stereo images from a ROS topic, not the rectified (rect_raw) ones? We want to use the raw stereo images directly.

MartyG-RealSense commented 3 years ago

In regard to accessing raw RealSense frames in ROS, the advice in the link below by Doronhi the RealSense ROS wrapper developer may be what you require.

https://github.com/IntelRealSense/realsense-ros/issues/787#issuecomment-500383618

doronhi commented 3 years ago

The infrared images in Y8 format received from the device are already rectified as these are the ones used for depth calculation. In order to receive un-rectified infrared images, one should request the Y16 infrared images. While doing this, depth stream cannot be retrieved. This option is not supported by realsense2_camera.

robotlearning123 commented 3 years ago

Thanks for helping! Another solution for the ROS wrapper: I downloaded realsense-ros into my workspace, checked out v2.2.18 and ran catkin_make. Now I can get the infra image topics.

robotlearning123 commented 3 years ago

The infrared images in Y8 format received from the device are already rectified as these are the ones used for depth calculation. In order to receive un-rectified infrared images, one should request the Y16 infrared images. While doing this, depth stream cannot be retrieved. This option is not supported by realsense2_camera.

Thank you! Yes, what I want is to subscribe to the raw infra image topics from the ROS wrapper.

According to this comment, I have tried this code to get the Y16 raw data:

import pyrealsense2 as rs
import numpy as np
import cv2

# Configure the left and right infrared streams in unrectified Y16 format
# (Y16 is exposed at calibration resolutions such as 640x400)
pipeline = rs.pipeline()
config = rs.config()

config.enable_stream(rs.stream.infrared, 1, 640, 400, rs.format.y16, 25)
config.enable_stream(rs.stream.infrared, 2, 640, 400, rs.format.y16, 25)

# Start streaming
pipeline.start(config)

try:
    while True:

        # Wait for a coherent pair of frames: left and right infrared
        frames = pipeline.wait_for_frames()
        ir_frame_left = frames.get_infrared_frame(1)
        ir_frame_right = frames.get_infrared_frame(2)

        # Convert images to numpy arrays
        ir_left_image = np.asanyarray(ir_frame_left.get_data())
        ir_right_image = np.asanyarray(ir_frame_right.get_data())

        # Stack both images horizontally
        images2 = np.hstack((ir_left_image, ir_right_image))
        # Show both images in one window (Y16 data is 16-bit, so imshow
        # rescales it for display and the preview may appear dark)
        cv2.namedWindow('RealSense', cv2.WINDOW_AUTOSIZE)
        cv2.imshow('RealSense', images2)

        key = cv2.waitKey(1)
        # Press esc or 'q' to close the image window
        if key & 0xFF == ord('q') or key == 27:
            cv2.destroyAllWindows()
            break

finally:
    # Stop streaming
    pipeline.stop()

I will close this issue. Thank you for your help! @MartyG-RealSense @doronhi

MartyG-RealSense commented 3 years ago

You are very welcome @wangcongrobot - thanks very much for sharing the details of your solution with the RealSense ROS community!

robotlearning123 commented 3 years ago

Hi @MartyG-RealSense. Can I ask what the meaning of the calibration data (left intrinsics) in the picture below is? It seems like the 3x3 intrinsic matrix, but the exact values are not equal to the real calibration data (fx, fy, ppx, ppy). I downloaded the calibration data, which looks like this:

{
  "baseline": "-49.9496",
  "intrinsic_left.x.x": "0.501421",
  "intrinsic_left.x.y": "0.80142",
  "intrinsic_left.x.z": "0.497364",
  "intrinsic_left.y.x": "0.509169",
  "intrinsic_left.y.y": "-0.0556315",
  "intrinsic_left.y.z": "0.0643663",
  "intrinsic_left.z.x": "-0.00033221",
  "intrinsic_left.z.y": "-0.00143369",
  "intrinsic_left.z.z": "-0.0207925",
  "intrinsic_right.x.x": "0.500943",
  "intrinsic_right.x.y": "0.800996",
  "intrinsic_right.x.z": "0.49935",
  "intrinsic_right.y.x": "0.50034",
  "intrinsic_right.y.y": "-0.0554353",
  "intrinsic_right.y.z": "0.0647207",
  "intrinsic_right.z.x": "0.000149549",
  "intrinsic_right.z.y": "-0.00121234",
  "intrinsic_right.z.z": "-0.0208747",
  "rectified.0.fx": "970.688",
  "rectified.0.fy": "970.688",
  "rectified.0.height": "1080",
  "rectified.0.ppx": "955.654",
  "rectified.0.ppy": "545.571",
  "rectified.0.width": "1920",
  "rectified.1.fx": "647.125",
  "rectified.1.fy": "647.125",
  "rectified.1.height": "720",
  "rectified.1.ppx": "637.102",
  "rectified.1.ppy": "363.714",
  "rectified.1.width": "1280",
  "rectified.10.fx": "582.413",
  "rectified.10.fy": "582.413",
  "rectified.10.height": "0",
  "rectified.10.ppx": "357.392",
  "rectified.10.ppy": "363.343",
  "rectified.10.width": "0",
  "rectified.11.fx": "465.93",
  "rectified.11.fy": "465.93",
  "rectified.11.height": "0",
  "rectified.11.ppx": "285.914",
  "rectified.11.ppy": "290.674",
  "rectified.11.width": "0",
  "rectified.12.fx": "647.125",
  "rectified.12.fy": "647.125",
  "rectified.12.height": "400",
  "rectified.12.ppx": "637.102",
  "rectified.12.ppy": "403.714",
  "rectified.12.width": "640",
  "rectified.13.fx": "4.70255e-37",
  "rectified.13.fy": "0",
  "rectified.13.height": "576",
  "rectified.13.ppx": "0",
  "rectified.13.ppy": "0",
  "rectified.13.width": "576",
  "rectified.14.fx": "0",
  "rectified.14.fy": "0",
  "rectified.14.height": "720",
  "rectified.14.ppx": "0",
  "rectified.14.ppy": "0",
  "rectified.14.width": "720",
  "rectified.15.fx": "0",
  "rectified.15.fy": "0",
  "rectified.15.height": "1152",
  "rectified.15.ppx": "0",
  "rectified.15.ppy": "0",
  "rectified.15.width": "1152",
  "rectified.2.fx": "388.275",
  "rectified.2.fy": "388.275",
  "rectified.2.height": "480",
  "rectified.2.ppx": "318.261",
  "rectified.2.ppy": "242.228",
  "rectified.2.width": "640",
  "rectified.3.fx": "428.721",
  "rectified.3.fy": "428.721",
  "rectified.3.height": "480",
  "rectified.3.ppx": "422.08",
  "rectified.3.ppy": "242.461",
  "rectified.3.width": "848",
  "rectified.4.fx": "323.563",
  "rectified.4.fy": "323.563",
  "rectified.4.height": "360",
  "rectified.4.ppx": "318.551",
  "rectified.4.ppy": "181.857",
  "rectified.4.width": "640",
  "rectified.5.fx": "214.36",
  "rectified.5.fy": "214.36",
  "rectified.5.height": "240",
  "rectified.5.ppx": "211.04",
  "rectified.5.ppy": "121.23",
  "rectified.5.width": "424",
  "rectified.6.fx": "194.138",
  "rectified.6.fy": "194.138",
  "rectified.6.height": "240",
  "rectified.6.ppx": "159.131",
  "rectified.6.ppy": "121.114",
  "rectified.6.width": "320",
  "rectified.7.fx": "242.672",
  "rectified.7.fy": "242.672",
  "rectified.7.height": "270",
  "rectified.7.ppx": "238.913",
  "rectified.7.ppy": "136.393",
  "rectified.7.width": "480",
  "rectified.8.fx": "647.125",
  "rectified.8.fy": "647.125",
  "rectified.8.height": "800",
  "rectified.8.ppx": "637.102",
  "rectified.8.ppy": "403.714",
  "rectified.8.width": "1280",
  "rectified.9.fx": "485.344",
  "rectified.9.fy": "485.344",
  "rectified.9.height": "540",
  "rectified.9.ppx": "477.827",
  "rectified.9.ppy": "272.786",
  "rectified.9.width": "960",
  "world2left_rot.x.x": "0.999991",
  "world2left_rot.x.y": "-0.001011",
  "world2left_rot.x.z": "0.00407789",
  "world2left_rot.y.x": "0.00101346",
  "world2left_rot.y.y": "0.999999",
  "world2left_rot.y.z": "-0.000602294",
  "world2left_rot.z.x": "-0.00407728",
  "world2left_rot.z.y": "0.000606421",
  "world2left_rot.z.z": "0.999991",
  "world2right_rot.x.x": "0.999987",
  "world2right_rot.x.y": "-0.00498434",
  "world2right_rot.x.z": "-0.000791915",
  "world2right_rot.y.x": "0.00498482",
  "world2right_rot.y.y": "0.999987",
  "world2right_rot.y.z": "0.000602387",
  "world2right_rot.z.x": "0.000788903",
  "world2right_rot.z.y": "-0.000606327",
  "world2right_rot.z.z": "1"
}

So the Dynamic Calibration tool does not calibrate the intrinsic parameters? I want to write the whole intrinsic and extrinsic calibration data manually. One way I know of is the CustomRW tool (read/write):

/usr/bin/Intel.Realsense.CustomRW -r
CustomRW for Intel RealSense D400, Version: 2.11.0.0

  Device PID: 0B07
  Device name: Intel RealSense D435
  Serial number: 817412070543
  Firmware version: 05.12.11.00

Calibration parameters from the device:
  resolutionLeftRight: 1280 800

  FocalLengthLeft: 641.818970 641.135986
  PrincipalPointLeft: 636.626038 407.334991
  DistortionLeft: 0.363936 0.282857 -0.007648 0.004442 0.000000

  FocalLengthRight: 641.206970 640.796997
  PrincipalPointRight: 639.168030 400.271973
  DistortionRight: 0.384194 0.273692 -0.008033 0.001912 0.000000

  RotationLeftRight: 0.999980 -0.003970 -0.004872
                     0.003976 0.999991 0.001188
                     0.004867 -0.001208 0.999987
  TranslationLeftRight: -49.949001 -0.248990 -0.039405

  HasRGB: 1

  resolutionRGB: 1920 1080

  FocalLengthColor: 1390.260010 1390.770020
  PrincipalPointColor: 966.973022 551.890015
  DistortionColor: 0.000000 0.000000 0.000000 0.000000 0.000000
  RotationLeftColor: 0.999996 -0.001200 0.002669
                     0.001204 0.999998 -0.001320
                     -0.002667 0.001324 0.999996
  TranslationLeftColor: 14.586600 0.046570 0.292184

I am confused about these. Now I have obtained the stereo calibration data for the infra1/infra2 images using MATLAB. So which one is the correct one that I need to rewrite?

Some useful links for this problem: the official projection documentation.

According to this, I got the intrinsic and extrinsic parameters:

 Intrinsic of "Infrared 1" / 1280x800 / {Y8}
  Width:        1280
  Height:       800
  PPX:          643.958923339844
  PPY:          396.196990966797
  Fx:           1024.92724609375
  Fy:           1024.92724609375
  Distortion:   Brown Conrady
  Coeffs:       0   0   0   0   0  
  FOV (deg):    63.96 x 42.64

 Intrinsic of "Infrared 1" / 640x480 / {Y8}
  Width:        640
  Height:       480
  PPX:          322.375366210938
  PPY:          237.940032958984
  Fx:           614.956359863281
  Fy:           614.956359863281
  Distortion:   Brown Conrady
  Coeffs:       0   0   0   0   0  
  FOV (deg):    54.98 x 42.64

Extrinsic from "Infrared 1"   To      "Infrared 2" :
 Rotation Matrix:
   1   0   0
   0   1   0
   0   0   1

 Translation Vector: -0.0499496385455132  0  0 

Extrinsic from "Infrared 2"   To      "Infrared 1" :
 Rotation Matrix:
   1   0   0
   0   1   0
   0   0   1

 Translation Vector: 0.0499496385455132  0  0  

 Intrinsic of "Infrared 2" / 640x480 / {Y8}
  Width:        640
  Height:       480
  PPX:          322.375366210938
  PPY:          237.940032958984
  Fx:           614.956359863281
  Fy:           614.956359863281
  Distortion:   Brown Conrady
  Coeffs:       0   0   0   0   0  
  FOV (deg):    54.98 x 42.64
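
Incidentally, the same intrinsics and extrinsics that rs-enumerate-devices -c prints can be queried from Python via pyrealsense2, which makes comparing them against an external (e.g. MATLAB) calibration easier. A minimal sketch:

import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.infrared, 1, 640, 480, rs.format.y8, 30)
config.enable_stream(rs.stream.infrared, 2, 640, 480, rs.format.y8, 30)
profile = pipeline.start(config)

ir1 = profile.get_stream(rs.stream.infrared, 1).as_video_stream_profile()
ir2 = profile.get_stream(rs.stream.infrared, 2).as_video_stream_profile()

print(ir1.get_intrinsics())        # width, height, ppx, ppy, fx, fy, model, coeffs
print(ir1.get_extrinsics_to(ir2))  # rotation matrix and translation vector

pipeline.stop()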

MartyG-RealSense commented 3 years ago

I do not have a reference source for the meaning of the values shown in the Viewer's Camera Calibration interface, so I would prefer not to speculate on their meaning and risk providing incorrect information.

I believe that the data in this interface is controlled by the script in the link below:

https://github.com/IntelRealSense/librealsense/blob/master/common/calibration-model.cpp

I would recommend relying on calibration data provided by official SDK calibration tools (Dynamic Calibration or On-Chip) rather than an external software tool, since the official tools are dedicated to RealSense hardware.

The standard version of the Dynamic Calibration tool only calibrates extrinsics, not intrinsics, because it is extrinsics that have the most impact on the image. There is an OEM version of this tool that calibrates both intrinsics and extrinsics and can be obtained by purchasing the $1500 OEM calibration target system (which is intended for engineering departments and manufacturing facilities) from the official RealSense store. For the majority of RealSense users though, calibrating just the extrinsics with the standard version of the tool is fine.

https://store.intelrealsense.com/buy-intel-realsense-d400-cameras-calibration-target.html

The On-Chip calibration system uses both intrinsics and extrinsics and can write a new calibration table. On-Chip calibration functions can also be accessed using C, C++ or Python scripting, as detailed in the appendices of the Self Calibration white paper.

https://dev.intelrealsense.com/docs/self-calibration-for-depth-cameras#section-appendix
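
As a rough illustration of the Python route, a minimal sketch along the lines of the white paper's appendix is below. The 256x144 depth profile, the JSON "speed" option and the method names follow the white paper, but treat the exact signatures as assumptions to verify for your SDK version:

import json
import pyrealsense2 as rs

# On-chip calibration runs on a small depth profile (256x144 @ 90 fps per the white paper)
pipeline = rs.pipeline()
config = rs.config()
config.enable_stream(rs.stream.depth, 256, 144, rs.format.z16, 90)
profile = pipeline.start(config)

cal = rs.auto_calibrated_device(profile.get_device())

# Options are passed as JSON; speed 3 = medium (see the white paper appendix)
cfg = json.dumps({"speed": 3})

# Returns the new calibration table and a health figure (closer to 0 is better)
new_table, health = cal.run_on_chip_calibration(cfg, 15000)
print("health:", health)

# Apply the table in RAM, then burn it to the camera's flash only once satisfied
cal.set_calibration_table(new_table)
cal.write_calibration()

pipeline.stop()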

The Dynamic Calibration tool can output calibration data as an xml file whose data is formatted in human-readable form, but the actual raw data stored in the camera's calibration table is not in human-readable format. My understanding is that calibration data read from the calibration table by the On-Chip tool is in the raw data format and not the human-readable form that the CustomRW tool can provide as an xml file.

robotlearning123 commented 3 years ago

@MartyG-RealSense Thank you for the reply! I have tried the Dynamic Calibration tool, but it is not enough for me. The environment I am working in is underwater, which leads to very large distortion in the raw images. So the Dynamic Calibration tool doesn't work and I need to calibrate by myself. As shown in this, the curve is caused by the distortion of the water (intrinsic parameters), and resetting the hardware doesn't help.

MartyG-RealSense commented 3 years ago

RealSense cameras have been used successfully in underwater applications and for observing through aquarium glass.

For sensing through the glass of an aquarium filled with water, the discussion in the link below is a good reference.

https://github.com/IntelRealSense/librealsense/issues/7966

In regard to depth-sensing underwater, the FishSense project's videos in the YouTube links below are a good proof-of-concept.

https://www.youtube.com/watch?v=ey7oqCiqm6o

https://www.youtube.com/watch?v=-aETBlM_tkc

The curvature that you have resembles an effect known as pincushion distortion, where the image bends inwards instead of bulging outwards (barrel distortion). A RealSense user in the link below provided advice for how they dealt with it when the Dynamic Calibration did not solve it.

https://github.com/IntelRealSense/librealsense/issues/4939#issuecomment-599324579

robotlearning123 commented 3 years ago

@MartyG-RealSense Thanks for sharing! It is useful for underwater applications. I have also tried the D435 and it works well underwater. The big problem is the underwater calibration. I tried the Dynamic Calibration tool, but it does not work. Are there any examples of how to do this?

robotlearning123 commented 3 years ago

Underwater calibration:

https://github.com/IntelRealSense/librealsense/issues/7966

https://github.com/IntelRealSense/realsense-ros/issues/701#issue-425413805

MartyG-RealSense commented 3 years ago

Is part of the problem having access to the camera to calibrate it whilst it is underwater (i.e you cannot recalibrate unless the camera is returned to the surface)?

Or is it that you do not have a chessboard target underwater to calibrate to?

robotlearning123 commented 3 years ago

@MartyG-RealSense

Is part of the problem having access to the camera to calibrate it whilst it is underwater (i.e you cannot recalibrate unless the camera is returned to the surface)?

Or is it that you do not have a chessboard target underwater to calibrate to?

The camera with its housing works underwater, and I also printed the official chessboard to perform the calibration underwater. But after calibration, there is still a very large curve, not a plane.

I can use the stereo infra images to compute the depth/point cloud myself, without the SDK (like this). But I also want to change the intrinsic parameters so that the SDK produces a correct depth/point cloud.

MartyG-RealSense commented 3 years ago

You could try enabling the depth stream and then opening the Viewer's Camera Calibration interface. Once it is open, enable the 'I know what I'm doing' option in the bottom corner. When depth streaming and 'I know what I'm doing' are both enabled, this unlocks the Write Table option. You can double-left-click on numeric values to edit them manually, and click 'Write Table' to write the changes to the camera.

If you make a mistake, you can use the Restore Factory option to restore the default factory-new calibration values.

robotlearning123 commented 3 years ago

I want to try this one. According to this comment, I am not sure about the exact meaning of the left/right intrinsics. They are different from the parameters reported by /usr/bin/Intel.Realsense.CustomRW and rs-enumerate-devices -c.

One test: when I used /usr/bin/Intel.Realsense.CustomRW to rewrite the parameters with large distortion (D) values, there was an obvious difference between the before and after results.

Update: According to here: Intel RealSense D400 Series

robotlearning123 commented 3 years ago

I know the last five parameters of the Left Intrinsics are the distortion coefficients (D/coeffs), but I am not sure about the first four parameters.

MartyG-RealSense commented 3 years ago

I cannot locate any documentation for the intrinsic parameters of the Camera Calibration window, so I will seek advice from Intel. Thanks for your patience.

robotlearning123 commented 3 years ago

It looks like the current official calibration tools do not change the intrinsic parameters.

MartyG-RealSense commented 3 years ago

The Dynamic Calibration tool does not calibrate intrinsics, only extrinsics (unless using the special OEM version of the tool, which calibrates both intrinsics and extrinsics).

The On-Chip tool calibrates intrinsics and extrinsics.

robotlearning123 commented 3 years ago

https://dev.intelrealsense.com/docs/self-calibration-for-depth-cameras

https://github.com/IntelRealSense/librealsense/issues/6685#issuecomment-648673659

I tried all of the official calibration tools, but I cannot get the correct depth/point cloud underwater.

robotlearning123 commented 3 years ago

https://support.intelrealsense.com/hc/en-us/community/posts/360048112173-Technology-data-output-under-water-use

MartyG-RealSense commented 3 years ago

Could you provide more details about your waterproof camera housing please? This will provide clues about whether there is something about the transparent section of the housing's material that is bending the depth image.

robotlearning123 commented 3 years ago

The waterproof camera housing is not complex: it is a flat transparent plate in front of the camera. The whole camera works underwater.

MartyG-RealSense commented 3 years ago

Intel provide guidelines about transparent cover materials in front of the lenses on page 134 of the current edition of the data sheet document for the 400 Series cameras.

https://dev.intelrealsense.com/docs/intel-realsense-d400-series-product-family-datasheet

The Chief Technical Officer for the RealSense Group at Intel also offers advice in the link below.

https://github.com/IntelRealSense/librealsense/issues/2566#issuecomment-431852571

robotlearning123 commented 3 years ago

Thanks for your advice. I have read the instructions. Actually, if only using the waterproof housing and working in air, the result is not bad. But the underwater environment is different. Here are the raw underwater infra images, which have obvious distortions. The On-Chip tool cannot deal with this, right? Even if it can calibrate the intrinsic parameters. So what I want is to rewrite the intrinsic/extrinsic parameters obtained from other calibration methods.

MartyG-RealSense commented 3 years ago

I found an underwater calibration research paper that discusses the same curved effect on a chessboard. It suggested that undistorting the image (on page 10 of the PDF version of the paper) could resolve the problem.

https://www.researchgate.net/publication/273913268_Omnidirectional_Underwater_Camera_Design_and_Calibration

A simpler approach than undistorting the image may be to use the RealSense camera's Y16 infrared format, as - unlike the other stream formats - it is unrectified (no distortion model applied) because it is used for the purposes of camera calibration.
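
If you do go down the undistortion route with an external (e.g. MATLAB) underwater calibration, a minimal OpenCV sketch would look like the one below. The K matrix and Brown-Conrady coefficients here are placeholders, not real values - substitute your own measured results:

import cv2
import numpy as np

# Placeholder intrinsics and distortion coefficients (k1, k2, p1, p2, k3)
K = np.array([[615.0,   0.0, 322.4],
              [  0.0, 615.0, 237.9],
              [  0.0,   0.0,   1.0]])
dist = np.array([0.36, 0.28, -0.007, 0.004, 0.0])

img = cv2.imread("infra1_underwater.png", cv2.IMREAD_GRAYSCALE)
undistorted = cv2.undistort(img, K, dist)
cv2.imwrite("infra1_undistorted.png", undistorted)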

robotlearning123 commented 3 years ago

Thank you for your reference. After a long discussion, I can make a short summary here.

For the infra images, I found that Y8 and Y16 are almost the same in air, with perhaps only very small distortion compared with the underwater images. For now, I have two methods for this problem.

The second method still has some problems, as discussed above.

MartyG-RealSense commented 3 years ago

I will seek advice from Intel about calibrating underwater. Thanks again for your patience.

MartyG-RealSense commented 3 years ago

Hi @wangcongrobot Intel are working on a solution for your problem that will be communicated to you when a complete answer has been developed. Thanks very much for your patience.

agrunnet commented 3 years ago

Hi @wangcongrobot

I want to add some more details of what one can expect when using the D435 under water.

  1. Window: Your design is awesome. The window has to be flat, like you have it. It is important not to use any type of curved window as it will impact the depth quality. Ideally also make sure the camera is parallel to the surface.
  2. Projector: If you are planning on turning on the projector, care must be taken to isolate any back-reflections from scattering from the window into the sensors. This can be done by moving the camera very close to the window and maybe adding a small black gasket. Alternatively, use an external projector. You can check by streaming the left and right images to see if you see any differences or glare when you turn on the projector.
  3. Self-calibration: Once it is installed, feel free to run self-calibration. It will clean up any of the very minor calibration issues that may occur due to mounting or transmission through glass.

Now to address some of the more specific issues you are seeing with depth.

  1. FOV: The field of view under water will change. Due to Snell's law of refraction at an optical interface between different materials (water and air), you will see a bending of the rays. Since the refractive index of water at visible (and 850 nm) wavelengths is close to n = 1.33, you will see that the 90x58 deg FOV of the D435 will shrink to ~64x43 deg (for the widest angles, use the equation thetaOut = arcsin(sin(thetaIn)/1.33); see the sketch after this list).

  2. Depth Z in center: The distance you measure will be smaller than reality by a factor of 1/n. So you may measure 1 m, but in reality the object will be at 1.33 m. An easy intuitive way to remember this is that when you look down at someone standing in a pool, their submerged lower body looks much shorter than it really is.

  3. Cylindrical bending of Z-plane: Due to Snell's law of refraction, you will see a cylindrical bending of the depth plane. Objects at the wider angles will appear closer than objects in the center. So how do you account for this? In principle, you could do a full recalibration underwater, but that may not be feasible. Plus you then have the opposite issue if you want to use the camera in air. A recalibration will essentially change the lens distortion function of each camera and create more of a fisheye distortion (yes, it will reduce FOV, but the angles will now pin-cushion out at larger angles). Instead, one could just correct the depth map in post-processing - basically add an x-axis remapping of the depth. (Please note that the Y-axis is NOT changed.)
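
A minimal numerical sketch of points 1 and 2 (the FOV shrink and the on-axis depth scale), under the same assumption of a flat window with the camera right against it:

import numpy as np

N_WATER = 1.33  # refractive index of water at visible/850 nm wavelengths

def fov_underwater(fov_air_deg, n=N_WATER):
    # Snell's law at a flat air/water interface: thetaOut = arcsin(sin(thetaIn) / n)
    half = np.radians(fov_air_deg / 2.0)
    return np.degrees(2.0 * np.arcsin(np.sin(half) / n))

print(fov_underwater(90.0))  # horizontal: ~64 deg
print(fov_underwater(58.0))  # vertical:   ~43 deg

# Point 2: the camera reads ~1/n of the true on-axis distance
measured_z = 1.0               # metres reported by the camera
true_z = measured_z * N_WATER  # ~1.33 m in reality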

We will share some plots in the next comment.

agrunnet commented 3 years ago

Here is a curve showing the relative change in depth along the x-axis (angle). The x-axis is the real angle of the object point in the water. We also assume for simplicity here that the camera is right up against the window and water. If there is a long propagation in air before hitting water then the equations will change. Note that the center shows a reduction in distance of ~25%, which is 1/n - 1 where n = 1.33.

The error is defined as (measured Z – actual Z) and so is <0 since objects appear closer.

Here is the same plot, but plotted in absolute units as opposed to relative.