How do underwater vision systems overcome the problems of light scattering and color distortion in water?
Imagine driving on a foggy day, or looking through blue sunglasses. Underwater vision systems face similar, if not more challenging, problems. Water scatters light (like fog) and absorbs many colors (color distortion, like blue sunglasses). To solve these two issues, we usually employ a combination of hardware and software solutions.
1. Combating 'Underwater Fog' (Light Scattering)
- Hardware: Strategic Lighting. If you hold a flashlight directly in front of your eyes in thick fog, you'll see a white blur because the light reflects off the fog straight back into your eyes. But if you move the flashlight to the side and light the object from an angle, you can see it much more clearly. The same principle applies to underwater robots. We don't place the camera and spotlight right next to each other. Instead, we separate them by some distance, or project special "structured light" from a specific angle. This keeps the light from directly illuminating the "murky water" between the lens and the target, allowing the camera to "see through" suspended particles and capture clearer images of the target (see the small geometry sketch after this list).
- Software: 'One-Click Defogging'. This is similar to the photo editing software on your phone. Many clever image processing algorithms can analyze a blurry image, estimate the "density" of the fog, and then, like solving a math problem, subtract this layer of "fog" from the picture. This technique is called "image defogging" or "image enhancement." Some advanced algorithms go further: a computer studies thousands of pairs of clear and blurry photos, and after enough examples it learns on its own how to make blurry photos clear, with very natural results. (A simplified defogging sketch also follows after this list.)
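To make the lighting idea concrete, here is a minimal Python sketch of the geometry: assuming the camera and the light both point straight ahead and are separated sideways, simple trigonometry gives the distance at which the light cone first enters the camera's field of view. Water closer than that stays mostly unlit in the image, so it adds little backscatter. The function name and example numbers are illustrative assumptions, not the specs of any real vehicle.

```python
import math

def backscatter_free_range(baseline_m, cam_half_fov_deg, light_half_beam_deg):
    """Distance (in meters) at which a side-mounted light's beam first
    enters the camera's field of view, assuming both point straight ahead
    and are separated sideways by `baseline_m`.

    Water closer than this range is not lit inside the camera's view,
    so it contributes very little backscatter ("fog")."""
    tan_sum = (math.tan(math.radians(cam_half_fov_deg))
               + math.tan(math.radians(light_half_beam_deg)))
    return baseline_m / tan_sum

# Example: 0.5 m separation, 35-degree camera half-FOV, 30-degree light half-beam
print(round(backscatter_free_range(0.5, 35.0, 30.0), 2), "m")  # ~0.39 m
```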
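And here is a small sketch of the software side, the "subtract the fog" idea, written as a simplified dark-channel-prior dehazer (one widely used defogging recipe; any given system may use something quite different). It assumes OpenCV and NumPy are available and takes an ordinary 8-bit BGR image.

```python
import cv2
import numpy as np

def dehaze(img_bgr, patch=15, omega=0.9, t_min=0.1):
    """Simplified dark-channel-prior defogging for an 8-bit BGR image."""
    img = img_bgr.astype(np.float32) / 255.0
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (patch, patch))

    # 1. Dark channel: per-pixel minimum over the color channels,
    #    then a local minimum (erosion) over a small patch.
    dark = cv2.erode(img.min(axis=2), kernel)

    # 2. Estimate the "fog" color (atmospheric light) from the pixels
    #    where the dark channel is brightest.
    n = max(1, int(dark.size * 0.001))
    rows, cols = np.unravel_index(np.argsort(dark, axis=None)[-n:], dark.shape)
    A = np.maximum(img[rows, cols].mean(axis=0), 1e-3)

    # 3. Estimate how much fog sits in front of each pixel (transmission map).
    t = 1.0 - omega * cv2.erode((img / A).min(axis=2), kernel)
    t = np.clip(t, t_min, 1.0)

    # 4. "Subtract" the fog by inverting the simple haze model
    #    I = J * t + A * (1 - t)   =>   J = (I - A) / t + A
    J = (img - A) / t[..., None] + A
    return (np.clip(J, 0.0, 1.0) * 255).astype(np.uint8)

# Example: clear_img = dehaze(cv2.imread("murky_frame.jpg"))
```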
2. Addressing the 'Blue Filter' (Color Distortion)
- Hardware: Bringing Your Own 'Sun'. Why does everything underwater appear blue-green? Because water absorbs different colors of light at different rates. It "eats up" red light first, then orange, then yellow, leaving mostly blue and green light. So the deeper you go, the bluer things look (the first sketch after this list puts rough numbers on this). The most direct solution is to bring your own light source: not an ordinary flashlight, but a professional underwater spotlight that emits light containing all colors (full-spectrum). By illuminating objects with sufficiently strong "white light," the colors "eaten" by the water can be restored, allowing objects to regain their original hues.
- Software: 'Smart Color Adjustment'. This is also like photo editing, and is known as "color correction" or "white balance."
- One method is 'compensation': since we know water absorbs red light, algorithms can specifically boost the red and yellow components in captured images while reducing the excess blue and green, thereby restoring the color balance (the second sketch after this list shows a classic version of this idea).
- Another, more intelligent method relies on 'learning': we show an AI (artificial intelligence) model huge numbers of underwater photos alongside what their "correct" colors should look like under normal lighting. After seeing enough examples, the AI learns a skill: given a blue-green tinted underwater photo, it can predict what the photo should look like with the blue "filter" removed, and then restore it for you automatically (the last sketch after this list shows the skeleton of such a training loop).
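To put rough numbers on why red disappears first, here is a tiny sketch of the underlying physics, the Beer-Lambert law. The per-color attenuation coefficients are assumed ballpark values for clear ocean water, not measurements from any specific site.

```python
import math

# Illustrative attenuation coefficients (per meter) for clear ocean water;
# these are assumed round numbers, not measured values.
ATTENUATION = {"red": 0.35, "green": 0.07, "blue": 0.04}

def remaining_light(depth_m):
    """Fraction of each color left after light travels `depth_m` meters
    through water (Beer-Lambert law: I = I0 * exp(-k * d))."""
    return {color: math.exp(-k * depth_m) for color, k in ATTENUATION.items()}

for depth in (1, 5, 10, 20):
    fractions = {c: round(v, 3) for c, v in remaining_light(depth).items()}
    print(f"{depth:>2} m: {fractions}")
```

With these assumed coefficients, only a few percent of the red light survives a 10-meter path, while most of the blue does, which is exactly why a vehicle has to bring its own full-spectrum "sun."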
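Here is a minimal sketch of the 'compensation' idea, using the classic gray-world assumption: scale each color channel so its average matches the overall average, which automatically boosts the starved red channel and damps the excess blue and green. It is one common recipe among many, not the method of any particular system.

```python
import cv2
import numpy as np

def gray_world_balance(img_bgr):
    """Gray-world color compensation: scale each channel so its average
    matches the overall average, which boosts the weakened red channel
    and tones down the dominant blue and green."""
    img = img_bgr.astype(np.float32)
    channel_means = img.reshape(-1, 3).mean(axis=0)          # per-channel means (B, G, R)
    gains = channel_means.mean() / (channel_means + 1e-6)    # dim channels get larger gains
    return np.clip(img * gains, 0, 255).astype(np.uint8)

# Example: balanced = gray_world_balance(cv2.imread("blue_green_photo.jpg"))
```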
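Finally, a skeleton of the 'learning' approach, as a minimal PyTorch sketch: a tiny convolutional network trained on pairs of underwater photos and their correct-color references, using an L1 loss. The network size, the data, and names such as `TinyColorRestorer` and `train_step` are placeholders for illustration; real systems train much larger models on real paired datasets.

```python
import torch
import torch.nn as nn

class TinyColorRestorer(nn.Module):
    """A deliberately tiny convolutional network that maps a blue-green
    underwater image to a color-corrected one; it only shows the shape
    of the idea."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 3, 3, padding=1), nn.Sigmoid(),   # RGB in [0, 1]
        )

    def forward(self, x):
        return self.net(x)

model = TinyColorRestorer()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.L1Loss()

def train_step(underwater_batch, reference_batch):
    """One training step on a batch of (underwater photo, correct-color
    reference) pairs, each shaped (N, 3, H, W) with values in [0, 1]."""
    optimizer.zero_grad()
    predicted = model(underwater_batch)
    loss = loss_fn(predicted, reference_batch)   # how far from the reference colors?
    loss.backward()
    optimizer.step()
    return loss.item()

# Smoke test with random tensors standing in for real paired images:
print(train_step(torch.rand(4, 3, 64, 64), torch.rand(4, 3, 64, 64)))
```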
In summary, modern underwater vision systems are masters of combination. They use cleverly designed hardware (separated lights and cameras, full-spectrum lighting) to capture the best possible "raw material," and then apply powerful software algorithms (defogging plus color correction) to "refine" those images. Only then do you get to see a clear, color-accurate underwater world.