ubc-subbots / triton

ROS2 system for the Triton AUV

Add Plugin/Node to Generate an Underwater Image from Non-Underwater Image #33

Closed loganfillo closed 3 years ago

loganfillo commented 3 years ago

We need a way to produce an underwater image from a non-underwater image to simulate an actual underwater camera. This can be done in Gazebo as a plugin, or by using existing plugins and a ROS node. The system should take in synchronized RGB/depth image pairs, run the generative underwater algorithm on each pair, and publish the result to a ROS topic.

Resources

- UWIR Algorithm
- Sensor Plugins

Notes

The depth camera and RGB camera images MUST be synchronous for this to work.

Tasks

loganfillo commented 3 years ago

Seems like writing our own Gazebo plugin would be reinventing the wheel. Using a ROS depth camera plugin, we can get raw and depth images and then match the two up using the header timestamp. There is not always an exactly matching timestamp for each image pair, so we will have to write logic to pair the closest timestamps when an exact correspondence can't be found (in practice, even without an exact correspondence there is usually a pair within 0.05 seconds).
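The closest-timestamp pairing described above can be sketched like this (a hypothetical helper, not code from the repo; timestamps are plain floats in seconds for illustration):

```python
def pair_closest(rgb_stamps, depth_stamps, tolerance=0.05):
    """Pair each RGB timestamp with the closest depth timestamp.

    Pairs further apart than `tolerance` seconds are dropped, matching
    the ~0.05 s gap observed between the two camera streams.
    """
    pairs = []
    for t_rgb in rgb_stamps:
        # Find the depth stamp with the smallest absolute time difference
        t_depth = min(depth_stamps, key=lambda t: abs(t - t_rgb))
        if abs(t_depth - t_rgb) <= tolerance:
            pairs.append((t_rgb, t_depth))
    return pairs
```

For example, `pair_closest([0.0, 1.0], [0.02, 1.2])` keeps the `(0.0, 0.02)` pair but drops the second RGB frame, whose nearest depth frame is 0.2 s away.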

loganfillo commented 3 years ago

It seems there is a package, message_filters, which performs exactly the logic described in the last comment. I have implemented this, so now all we need to do is implement the underwater synthesis algorithm. Check out logan/issue33 for the current progress. To use the underwater camera, run the following launch commands:

ros2 launch triton_gazebo gazebo_launch.py world:=uc_test.world (for a Gazebo world with cubes)
ros2 launch triton_gazebo underwater_camera_launch.py (for the underwater camera)

You can also now view the images in rqt. The topics are, respectively: image, depth image, and underwater image (should show nothing yet):

/triton/drivers/front_camera/gazebo/image_raw
/triton/drivers/front_camera/gazebo/depth/image_raw
/triton/drivers/front_camera/image_raw

You can also modify the gazebo world uc_test.world as you'd like to test the algorithm.
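For reference, here is a minimal sketch of the message_filters approach in a ROS 2 Python node, using the topic names above. The node and callback names are illustrative, and the synthesis step is a pass-through placeholder:

```python
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from message_filters import Subscriber, ApproximateTimeSynchronizer


class UnderwaterCamera(Node):
    def __init__(self):
        super().__init__('underwater_camera')
        rgb_sub = Subscriber(self, Image, '/triton/drivers/front_camera/gazebo/image_raw')
        depth_sub = Subscriber(self, Image, '/triton/drivers/front_camera/gazebo/depth/image_raw')
        # slop=0.05 matches the ~0.05 s pairing tolerance noted earlier
        self.sync = ApproximateTimeSynchronizer([rgb_sub, depth_sub], queue_size=10, slop=0.05)
        self.sync.registerCallback(self.on_pair)
        self.pub = self.create_publisher(Image, '/triton/drivers/front_camera/image_raw', 10)

    def on_pair(self, rgb: Image, depth: Image):
        # Placeholder: run the underwater synthesis here, then publish
        self.pub.publish(rgb)


def main():
    rclpy.init()
    rclpy.spin(UnderwaterCamera())
```

ApproximateTimeSynchronizer only fires the callback when a message from each subscriber falls within the slop window, which is exactly the closest-timestamp pairing we needed.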

loganfillo commented 3 years ago

These are the light parameters used by the paper:

Database for RGB spectral responses of different cameras (at different wavelengths): http://www.gujinwei.org/research/camspec/db.html (see "Database")

Database for illumination irradiance at different wavelengths: https://www.nrel.gov/grid/solar-resource/spectra-am1.5.html (see "spreadsheet")

Table for attenuation coefficients of different water types (at different wavelengths): https://www-sciencedirect-com.ezproxy.library.ubc.ca/science/article/pii/S0422989408709292 (see "Table XX")

The only thing I couldn't find (and that they only briefly mention in equations (7) and (8)) is the "reflectance spectrum of the object surface" (i.e. rho(lambda)). Not sure how important this value is, but it is worth looking into.
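For context, attenuation coefficients like those in the tables above typically feed the standard underwater image formation model, I_c(x) = J_c(x) e^(-beta_c d(x)) + B_c (1 - e^(-beta_c d(x))), where J is the in-air image, d the per-pixel distance, beta_c the per-channel attenuation, and B_c the veiling light. A minimal NumPy sketch (the coefficient values in the docstring are illustrative, not taken from the linked tables):

```python
import numpy as np

def synthesize_underwater(rgb, depth, beta, backscatter):
    """Apply the standard underwater image formation model.

    rgb:         HxWx3 float array in [0, 1] (the in-air image J)
    depth:       HxW float array of distances in metres d(x)
    beta:        per-channel attenuation coefficients in 1/m
                 (illustrative, e.g. red attenuates fastest: [0.3, 0.1, 0.05])
    backscatter: per-channel veiling light B in [0, 1]
    """
    # Transmission t_c(x) = exp(-beta_c * d(x)), broadcast over channels
    transmission = np.exp(-np.asarray(beta) * depth[..., None])
    # Direct signal attenuates with distance; backscatter fills in the rest
    return rgb * transmission + np.asarray(backscatter) * (1.0 - transmission)
```

At depth 0 the output equals the input image, and as depth grows the output converges to the backscatter colour, which is the expected limiting behaviour of the model.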