4B.6 Applications of advanced underwater robotics and imaging sensors in coral reef research

Friday, 13 November 2009: 11:50 AM
Roy A. Armstrong, NOAA Center for Atmospheric Sciences (NCAS), Lajas, PR

Due to the exponential attenuation of light in the water column, coral reefs and other benthic communities below 20 m depth lie beyond the limit of airborne and satellite optical remote sensing and require in situ platforms such as autonomous underwater vehicles (AUVs) and remotely operated vehicles (ROVs). Since 2002 we have been using the Seabed AUV, designed for high-resolution underwater optical and acoustic imaging, to characterize the upper insular shelf coral reefs of Puerto Rico and the US Virgin Islands. More recently, a Seabotix ROV has supplemented the AUV data by providing real-time video and sampling capability. The basic geomorphology, benthic community structure, and biodiversity of mesophotic (depth range: 30-100+ m) coral communities in the U.S. Caribbean remain largely unknown, including ecologically relevant parameters such as percent coral cover, reef rugosity, incidence of disease, and species richness and diversity. Despite their importance, little information is available on the distribution and condition of mesophotic reefs, largely because they lie beyond the range of safe diving operations. Deeper reefs appear to be healthier than their shallow-water counterparts and are known habitats of commercially important fish species. These benthic assessments could provide the information required for selecting unique areas of high coral cover, biodiversity, and structural complexity for habitat protection and ecosystem-based management. The quantitative, georeferenced AUV surveys and photomosaics could also provide the baseline data required for future change detection in the deeper coral reef zones. Data from Seabed sensors and related imaging technologies are being used for multibeam sonar surveys, 3-D image reconstruction from a single camera, photomosaicking, image-based navigation, and multisensor fusion of acoustic and optical data.
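The opening claim about exponential attenuation can be illustrated with the Beer-Lambert relation, I(z)/I(0) = exp(-Kd · z), where Kd is the diffuse attenuation coefficient. The sketch below is illustrative only; the Kd value of 0.1 m⁻¹ is an assumed figure typical of clear tropical water, not a value from this abstract.

```python
import math

def fraction_of_surface_light(depth_m: float, kd_per_m: float) -> float:
    """Beer-Lambert exponential attenuation: I(z)/I(0) = exp(-Kd * z)."""
    return math.exp(-kd_per_m * depth_m)

# Kd = 0.1 m^-1 is an assumed, representative value for clear tropical
# water, chosen only to illustrate why optical remote sensing loses
# signal rapidly with depth.
KD = 0.1

for depth in (10, 20, 30, 50):
    frac = fraction_of_surface_light(depth, KD)
    print(f"{depth:3d} m: {frac:.1%} of surface light remains")
```

Under this assumed Kd, only a small fraction of surface irradiance reaches 20 m, and even less reaches the 30-100+ m mesophotic zone, which is why in situ platforms such as AUVs and ROVs are needed there.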