A team from MIT and the Woods Hole Oceanographic Institution (WHOI) has developed an image-analysis tool that cuts through the ocean’s optical effects and generates images of underwater environments that look as if the water had been drained away, revealing an ocean scene’s true colors. The team paired the color-correcting tool with a computational model that converts images of a scene into a three-dimensional underwater “world” that can then be explored virtually.
The researchers have dubbed the new tool SeaSplat, in reference to both its underwater application and a method known as 3D Gaussian splatting (3DGS), which takes images of a scene and stitches them together to generate a complete, three-dimensional representation that can be viewed in detail, from any perspective.
For now, the method requires hefty computing resources in the form of a desktop computer that would be too bulky to carry aboard an underwater robot. Still, SeaSplat could work for tethered operations, where a vehicle tied to a ship can explore and send its images up to the ship’s computer.
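To make the 3DGS idea above a bit more concrete: a scene is represented as many Gaussians that are projected to the image plane and alpha-composited. Below is a minimal, illustrative sketch of the per-splat compositing step for a single 2D Gaussian, using NumPy. The function name and all parameter values are my own for illustration; they are not from SeaSplat or the 3DGS paper, which composite millions of projected 3D Gaussians in depth order.

```python
import numpy as np

def splat_gaussian(image, center, cov, color, opacity):
    """Alpha-composite one 2D Gaussian 'splat' onto an RGB image.

    Illustrative only: real 3DGS projects 3D Gaussians to 2D and
    composites them front-to-back; this shows a single splat.
    """
    h, w, _ = image.shape
    ys, xs = np.mgrid[0:h, 0:w]
    # Per-pixel offset from the splat center (x, y)
    d = np.stack([xs - center[0], ys - center[1]], axis=-1)
    inv = np.linalg.inv(cov)
    # Gaussian falloff exp(-0.5 * d^T Sigma^-1 d), evaluated per pixel
    expo = np.einsum('...i,ij,...j->...', d, inv, d)
    alpha = opacity * np.exp(-0.5 * expo)[..., None]
    # Standard "over" compositing of the splat color onto the image
    return image * (1 - alpha) + np.array(color) * alpha

canvas = np.zeros((64, 64, 3))
cov = np.array([[40.0, 10.0], [10.0, 20.0]])  # anisotropic splat shape
canvas = splat_gaussian(canvas, (32, 32), cov, (0.1, 0.5, 0.8), 0.9)
```

Rendering a full scene would repeat this for every projected Gaussian in depth order; the optimization step in 3DGS fits the centers, covariances, colors, and opacities to the input photos.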
That’s actually a really clever use of Gaussian splats, though I’m not really sure what the practical use of it would be. You can probably create some really cool, interactive renders of shipwrecks and reefs and such, but I’m not immediately seeing the value beyond edutainment content.
Corridor does a really good breakdown of what Gaussian splats are here, for those interested. The explanation ends when the sponsor segment begins, for those who don’t want to watch the whole video.
It seems like it will allow more faithful/accurate scans while keeping the sensor farther away from the subjects, or in cloudy/foggy conditions.
Lotsa shy sea creatures
Yes, but as I understand it, this requires shots taken from different vantage points, which it then recreates as a clear, zoomed-out picture with better depth of view. So unfortunately, nothing to help see shy sea creatures.