Coral reefs are among nature’s most complex and colorful living formations. But as any underwater photographer knows, pictures of them taken without artificial lights often come out bland and blue. Even shallow water selectively absorbs and scatters light at different wavelengths, making certain features hard to see and washing out colors—especially reds and yellows. This effect makes it difficult for coral scientists to use computer vision and machine-learning algorithms to identify, count and classify species in underwater images; they have to rely on time-consuming human evaluation instead.
But a new algorithm called Sea-thru, developed by engineer and oceanographer Derya Akkaynak, removes the water's visual distortion from images. The effects could be far-reaching for biologists who need to see true colors beneath the surface. Akkaynak and engineer Tali Treibitz, her postdoctoral adviser at the University of Haifa in Israel, detailed the process in a paper presented in June at the IEEE Conference on Computer Vision and Pattern Recognition.
Sea-thru's image analysis factors in the physics of light absorption and scattering, which differ underwater, where the particles that light interacts with are much larger than those in the atmosphere. The program then reverses the water's distortion pixel by pixel, restoring the lost colors.
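The pixel-by-pixel reversal can be sketched in Python. The sketch below loosely follows the kind of underwater image-formation model the paper builds on, but simplifies it heavily: it assumes a single, constant attenuation coefficient per color channel for both the backscatter and the direct signal (the real method estimates range-dependent coefficients from the data), and the parameter names (`beta`, `veiling_light`) are illustrative, not the paper's.

```python
import numpy as np

def restore_colors(image, depth, veiling_light, beta):
    """Simplified per-pixel reversal of underwater color distortion.

    image:         HxWx3 array, values in [0, 1]
    depth:         HxW range map in meters (camera-to-scene distance)
    veiling_light: per-channel backscatter color at infinity, shape (3,)
    beta:          per-channel attenuation coefficient, shape (3,)
    """
    z = depth[..., None]  # broadcast the range map over color channels
    # Backscatter (the blue "haze") accumulates with distance through the water.
    backscatter = veiling_light * (1.0 - np.exp(-beta * z))
    # Subtract it, then undo the exponential attenuation of the direct signal.
    direct = np.clip(image - backscatter, 0.0, None)
    restored = direct * np.exp(beta * z)
    return np.clip(restored, 0.0, 1.0)
```

Under the simplified forward model it assumes, I(z) = J·e^(−βz) + B∞·(1 − e^(−βz)), this function recovers the true color J exactly when `beta` and `veiling_light` match the water's actual optics; the hard part of the real method is estimating those quantities from the images themselves.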
One caveat is that the process requires distance information to work. Akkaynak takes numerous photographs of the same scene from various angles, which Sea-thru uses to estimate the distance between the camera and…
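The range maps come from photogrammetry across those many overlapping photos; a full structure-from-motion pipeline is beyond a short sketch, but the simplest two-view case shows the underlying geometry: for a rectified stereo pair, per-pixel distance follows directly from disparity. The function below is a hypothetical illustration of that geometry, not the paper's actual pipeline.

```python
import numpy as np

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Per-pixel distance z = f * B / d for a rectified image pair.

    disparity:  HxW array of pixel offsets between the two views
    focal_px:   camera focal length, in pixels
    baseline_m: distance between the two camera positions, in meters
    """
    d = np.asarray(disparity, dtype=float)
    z = np.full(d.shape, np.inf)   # zero disparity means "infinitely far"
    valid = d > 0
    z[valid] = focal_px * baseline_m / d[valid]
    return z
```

Features that shift more between views are closer to the camera; with many views instead of two, the same triangulation principle yields a dense range map for every pixel.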