
Coregistered Hyperspectral and Stereo Image Seafloor Mapping from an Autonomous Underwater Vehicle


Daniel L. Bongiorno
Defence Science & Technology Group and Australian Centre for Field Robotics, University of Sydney, Sydney, Australia
e-mail: [email protected]

Mitch Bryson
Australian Centre for Field Robotics, University of Sydney, Sydney, Australia
e-mail: [email protected]

Tom C. L. Bridge
ARC Centre of Excellence for Coral Reef Studies, James Cook University, Townsville, Queensland, Australia
e-mail: [email protected]

Donald G. Dansereau
Stanford Computational Imaging Lab, Stanford University, CA, USA
e-mail: [email protected]

Stefan B. Williams
Australian Centre for Field Robotics, University of Sydney, Sydney, Australia
e-mail: [email protected]

Received 13 May 2016; accepted 13 January 2017

We present a new method for in situ high-resolution hyperspectral mapping of the seafloor utilizing a spectrometer colocated and coregistered with a high-resolution color stereo camera system onboard an autonomous underwater vehicle (AUV). Hyperspectral imagery data have been used extensively for mapping and distinguishing marine seafloor habitats and organisms from above-water platforms (such as satellites and aircraft), but at low spatial resolutions and at shallow water depths (<10 m). The use of hyperspectral sensing from in-water platforms (such as AUVs) has the potential to provide valuable habitat data in deeper waters and with high spatial resolution. Challenges faced by in-water hyperspectral imaging include difficulties in correcting for water column effects and the spatial registration of point/line-scan hyperspectral sensor measurements. The methods developed in this paper overcome these issues through coregistration with a high spatial resolution, stereo color camera, and precise modeling and compensation of the water column properties that attenuate hyperspectral signals. We integrated two spectrometers into our SeaBED-class AUV, and one on board a support surface vessel, to measure and estimate the effects of light passing through the water column. Spatial calibration of the spectrometers/stereo cameras and the synchronized acquisition of both sensors allowed for spatial registration of the resulting hyperspectral reflectance profiles. We demonstrate resulting hyperspectral imagery maps with a spatial resolution of 30 cm over large areas of the seafloor that are not adversely affected by above-water conditions (such as cloud cover) that would typically prevent the use of remote-sensing methods. Results are presented from an AUV mapping survey of a coral reef ecosystem over Pakhoi Bank on the Great Barrier Reef, Queensland, Australia, demonstrating the ability to reconstruct hyperspectral reflectance profiles for a diverse range of abiotic and biotic coverage types including sand, corals, seagrass, and algae. Profiles are then used to automatically classify different coverage types with a 10-fold cross-validation accuracy of 91.99% using a linear support vector machine (SVM). © 2017 Wiley Periodicals, Inc.

1. INTRODUCTION

The ocean floor contains a multitude of different habitats that harbor and support marine life and ecosystems such

Direct correspondence to: Daniel L. Bongiorno, e-mail: [email protected]

as coral reefs, kelp forests, and seagrass meadows, which possess immense social and economic value and are expected to suffer increasing environmental pressures in the coming years and decades (Hoegh-Guldberg et al., 2007; Wilkinson, 2008). Conservation and management strategies used to monitor these habitats rely on regularly updated data on their structure, composition, and health,

Journal of Field Robotics 00(0), 1–18 (2017) © 2017 Wiley Periodicals, Inc. View this article online at wileyonlinelibrary.com • DOI: 10.1002/rob.21713


which are acquired and interpreted from a range of sources including remote sensing (satellite and aircraft imaging from above the water (Andrefouet et al., 2002)) and in situ/underwater means (such as visual observations by divers, ship-towed video data, and imagery from underwater robotic platforms such as autonomous underwater vehicles (AUVs) (Williams et al., 2010)). Remote sensing from platforms such as satellites and airborne vehicles provides broad-scale information on seafloor habitats, but at low spatial resolutions (>2.4 m per pixel (Wang, Franklin, Guo, & Cattet, 2010)), and is limited to imaging the ocean floor in water depths no greater than 20 m (Kobryn, Wouters, Beckley, & Heege, 2013). These platforms make up for their lack of spatial resolution by using sophisticated multispectral and hyperspectral imaging systems that provide images with tens to hundreds of precise color channels (Mishra, Narumalani, Rundquist, Lawson, & Perk, 2007), which have been exploited to classify seafloor substrates (Guild et al., 2008; Hochberg & Atkinson, 2003; Idris, Jean, & Zakariya, 2009) and biological communities (Holden & LeDrew, 2001; Volent, Johnsen, & Sigernes, 2007). In contrast, in situ underwater imaging systems such as towed video and AUVs provide data with very high spatial resolutions (subcentimeter per pixel, depending on operating altitude) but over relatively smaller extents (constrained by vehicle range and operating altitude). These platforms typically deploy only monochrome or three-color (red–green–blue) imagery (Campos, Garcia, Alliez, & Yvinec, 2015; Williams et al., 2012), owing to the difficulties of fielding, in the water, the hyperspectral imaging systems used by satellites and aircraft, and of mitigating the effects of water attenuation and scattering on light from a moving platform.

In this paper, we present a novel underwater imaging system and a series of calibration and postprocessing procedures designed to derive hyperspectral imagery maps of the seafloor using an AUV, thus combining the advantages of both in situ and remote-sensing sources of data for monitoring marine habitats. Instead of deploying a large, heavy, and potentially expensive hyperspectral imaging system (i.e., a sensor that contains either a one- or two-dimensional array of sensing elements that measure incoming light at different spectra/colors), our system utilizes a single-point hyperspectral radiometer co-mounted with a three-color stereo camera, relying on the forward motion of the AUV (and a precise series of processing steps to spatially register spot measurements on the seafloor) to build up a hyperspectral image. The resulting sensor combination is lightweight, relatively inexpensive (compared to hyperspectral imagers), and provides higher spectral resolution and better signal-to-noise performance than a hyperspectral imaging sensor. We develop a processing methodology that reconstructs the hyperspectral reflectance (ratio of outgoing to incoming light across many spectra/colors) of the seafloor by accounting for ambient light conditions, light from strobes on board the AUV, and the inherent optical properties (IOPs) of the

surrounding water column, which attenuate light underwater at different wavelengths and must be corrected for. We demonstrate the ability to reconstruct hyperspectral reflectance profiles from a moving AUV platform for a diverse range of abiotic and biotic coverage types including sand, corals, seagrass, and algae, and further develop automatic seafloor classification algorithms using hyperspectral signatures and supervised machine learning methods. Results of the system are presented using data collected by the Sirius AUV over coral reef habitats at Pakhoi Bank on the Great Barrier Reef, Australia.

1.1. Related Work

Imaging the seafloor from above or within the water column presents a number of challenges. The air–water interface acts to refract and reflect away light depending on wind and wave conditions, limiting the spatial resolution and clarity with which measurements can be made from above the water (Mobley, 1994). When imaging is performed from within the water (e.g., using an AUV), wavelength-selective attenuation and scattering of light result in loss of image clarity with distance to the object and a shift in the apparent object color (Mobley, 1994). Various approaches have been used to account for color variations in underwater imagery, including active illumination (Vasilescu, Detweiler, & Rus, 2011), qualitative machine learning (Torres-Mendez & Dudek, 2005), and image formation model-based approaches (Bryson, Johnson-Roberson, Pizarro, & Williams, 2016). Object and organism color has been cited as an important feature for classification, identification, and measurement (Mehta, Ribeiro, Gilner, & van Woesik, 2007); however, limitations in the ability to accurately reconstruct color in previous work have resulted in limited use (Clement, Dunbabin, & Wyeth, 2005; Pizarro, Rigby, Johnson-Roberson, Williams, & Colquhoun, 2008). Instead, most approaches use spatial texture-based features (Clement et al., 2005; Beijbom, Edmunds, Kline, Mitchell, & Kriegman, 2012) as the primary means of organism classification in images. In contrast, organism color (measured at hyperspectral resolutions) has been used extensively in field-based spectroscopy studies, where data from point-based sensors have been used to distinguish coral genera (Roelfsema & Phinn, 2012) and examine coral health (Holden & LeDrew, 2001). Measurements in these studies are made manually by divers by placing the sensor very close (<10 cm) to the target surface, thus avoiding the issue of water column influence, but limiting the extent over which measurements can be made to a handful of point-based samples.

The use of hyperspectral measurements over broader extents made from within the water is limited. Johnsen et al. (2013) discuss an underwater sliding hyperspectral camera test rig; however, their results are limited to a static, nonmoving platform. Hyperspectral radiometers (or spectrometers) have been used on AUVs to measure properties of the water column itself (English & Carder, 2006; Hartmann, de Souza, Timms, Davie, & Ieee, 2009) and also to measure the seafloor reflectivity (English & Carder, 2006), but at limited, single-point samples. Previous studies have not demonstrated the capacity to produce spectral reflectance measurements at a fine spatial resolution and over a broad extent, nor measurements that have been corrected for the influence of the water column. The ability to recover in-water hyperspectral reflectance measurements over marine habitats such as coral reefs has the potential to provide a valuable source of data to complement existing approaches to habitat classification and assessment based on automatic machine learning methods currently deployed on monochrome and three-color images collected by AUVs (Friedman, Steinberg, Pizarro, & Williams, 2011).

1.2. Contribution of Our Work

In this work, we are concerned with the ability to create georeferenced maps of hyperspectral reflectance of the seafloor that can be used on their own (or with additional high-resolution stereo imagery) to automatically identify and classify seafloor organisms and habitat types, using data collected from a moving robotic platform (an AUV). We build upon previous developments in hyperspectral imaging, image-based data fusion, and classification to develop a system of AUV data collection and processing to achieve this aim. The key contributions of this work are to

1. develop a novel system by which measurements from a spectrometer can be georeferenced and mapped to the seafloor from a moving AUV.

2. develop a system of data collection and processing that allows for the recovery of hyperspectral reflectance of the seafloor by modeling and measurement of the effects and interactions of water column attenuation, on-board lighting, ambient lighting, and the air–water interface.

Additionally, our system is demonstrated to effectively distinguish between different seafloor biota through the development of automatic classification algorithms based on hyperspectral reflectance profiles, with results presented over a broad-scale AUV mission. In contrast to previous work in underwater field spectroscopy (Roelfsema & Phinn, 2012; Holden & LeDrew, 2001), we develop a method for measuring tens of thousands of individual reflectance profiles over kilometers of the seafloor through the use of an AUV and by accounting for the effects of the water column on underwater light. In contrast to past work on underwater color correction using three-color images (Vasilescu et al., 2011; Torres-Mendez & Dudek, 2005; Bryson et al., 2016), we employ a light attenuation model-based technique to correct color at hyperspectral resolutions and use these data to effectively classify seafloor substrates and organisms.

2. METHODOLOGY

2.1. Overview of Approach

A flow chart of our data collection and data processing method is shown in Figure 1. The system utilizes multiple, synchronized spectrometer measurements above and below the surface of the water to measure and compensate for the effects of the ambient and artificial lighting sources required to measure the reflectance of light at the seafloor. A model of the air–water interface is derived from measurements taken from the ship-borne inertial measurement unit (IMU), the Global Positioning System (GPS) position of the ship, the time of day, the conductivity and temperature of the water, and wind data for the region. Ambient light passing from the surface of the water to the AUV is measured by an upward-looking spectrometer mounted on the AUV. Data from these two sensors are combined to effectively measure the contribution of ambient light at the seafloor surface (differences in intensity across the visual spectrum) and estimate the IOPs of the water. Reflected light from the seafloor is measured by a downward-looking spectrometer on the AUV and processed in conjunction with this information to estimate the reflectance of the seafloor. Reflectance measurements are spatially georegistered via information derived from the stereo camera system, which also provides relative range information to the seafloor surface, necessary to quantify and compensate for the effects of the water column on the downward-looking spectrometer measurements. The resulting georeferenced spectral profiles are then used to classify seafloor biota using a supervised learning framework based on a small number of training examples provided by a marine biologist from the associated color stereo camera images.
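The supervised classification stage described above (a linear SVM scored with 10-fold cross validation, per the abstract) can be sketched with off-the-shelf tools. The following is a minimal illustration, assuming scikit-learn is available; the synthetic arrays, shapes, and class count stand in for the real expert-labeled reflectance profiles:

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

# Synthetic stand-ins for the labeled data: each row of X is one
# reflectance profile (reflectance per spectral bin), and y holds the
# expert-assigned coverage-type labels. Shapes are hypothetical.
rng = np.random.default_rng(0)
X = rng.random((200, 150))        # 200 profiles x 150 spectral bins
y = rng.integers(0, 4, size=200)  # e.g., sand / coral / seagrass / algae

# Standardize each spectral bin, then fit a linear SVM; score with
# 10-fold cross validation.
clf = make_pipeline(StandardScaler(), LinearSVC(dual=False))
scores = cross_val_score(clf, X, y, cv=10)
```

With real labeled profiles in place of the random arrays, `scores.mean()` gives the cross-validation accuracy reported in the paper.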

2.2. Autonomous Underwater Vehicle

The Australian Centre for Field Robotics, University of Sydney, operates a Woods Hole Oceanographic Institution SeaBED-class AUV (Singh et al., 2004), Sirius (see Figure 2(b)), which carries a suite of on-board sensors including navigation sensors (Doppler velocity log, ultra-short baseline (USBL) acoustic positioning system, GPS receiver, depth and attitude sensors), water sensors (colored dissolved organic matter (CDOM), chlorophyll, backscatter), and seafloor sensors (multibeam sonar and a stereo camera system with light-emitting diode (LED) strobes for artificial lighting) (Williams et al., 2010). The AUV typically travels at 0.5 m/s and aims to maintain an altitude of 2 m from the seabed. Data from the stereo camera and other navigation sensors are postprocessed to produce three-dimensional (3D) topographic and color imagery mosaics using techniques in simultaneous localization and mapping (SLAM), structure-from-motion, and terrain reconstruction [see Williams et al. (2012) for details].


Figure 1. The first stage of the processing is data acquisition from the ship- and AUV-borne sensors (these include upward-facing spectrometers on the AUV and surface vessel, a downward-facing spectrometer, a stereo camera, an IMU on the surface vessel, and conductivity and temperature sensors). The second stage involves the generation of the water, air–water interface, and illumination models that derive the spectral reflectance curves. The classifier is trained on an expert-labeled subset of the data and then generates a seafloor habitat map.


Figure 2. (a) A view of the camera housing window showing the positioning of the spectrometer’s aperture (small hole) with respect to the lenses of the stereo cameras (two larger holes). The downward-facing spectrometer used was an Ocean Optics USB 2000+ with a wavelength range of 340–1032 nm and a full width at half maximum of 7.7 nm. (b) The Sirius AUV underwater.

Figure 3. An example of a depth map derived from the 3D mesh generated from the stereo camera data. These depth maps allow for an accurate measurement of altitude for each spectrometer reading; an accurate altitude reading is critical for correcting the spectra for water attenuation.

2.3. Spectrometer Data Collection and Calibration

For our proposed system, a series of spectrometers were installed aboard the AUV Sirius and on the support surface vessel (a 35-m long research vessel operating in the vicinity of the AUV). Figure 1 shows the positioning of these spectrometers. The two upward-looking spectrometers were used to collect downwelling irradiance at the surface before the light enters the water (from the surface-based support vessel) and as the surface light (sunlight and skylight) passes the vehicle (mounted on the AUV). A downward-facing spectrometer is colocated with the stereo camera and used to collect the spectra of the imaged seabed. Figure 2(a) shows a photograph of the window of the camera housing, showing the positioning of the spectrometer colocated with the stereo cameras. The stereo cameras have been radiometrically characterized using the method from Bongiorno, Bryson, Dansereau, Williams, and Pizarro (2013a).

2.3.1. AUV Downward-Facing Spectrometer

To implement our method of spectral imaging onboard Sirius, we reengineered the stereo camera housing to fit an Ocean Optics USB 2000+ spectrometer (dimensions: 89.1 mm × 63.3 mm × 34.4 mm). The USB 2000+ collects 2,048 spectral bins with a full width at half maximum of 7.7 nm; it is sensitive in the spectral region of 340–1032 nm. The spectrometer’s optical entrance was connected via a 10-cm fiber optic cable to a collimating lens1 colocated with the stereo camera lenses at the window of the camera system [Figure 2(a)]. The collimating lens allowed the field of view (FOV) of the spectrometer to be focused to a sharp-edged spot. Measurement acquisition from the spectrometer was synchronized to the image acquisition by the stereo cameras and the firing of the strobes through a common trigger signal. An autoexposure algorithm was implemented to adjust the spectrometer exposure/integration time to achieve a desired mean signal strength using measured signals over the previous 30-s interval. In practice, only wavelengths in the range 400–800 nm were used in subsequent processing, owing to the almost complete absorption of light at wavelengths outside this range while in the water. A Savitzky–Golay filter (Savitzky & Golay, 1964) was used to reduce noise in the measured spectra. The window of the underwater housing in front of the spectrometer lens was made of acrylic, and laboratory tests prior to AUV deployment confirmed that this material provided consistent transmission across wavelengths of 400–800 nm.

1Ocean Optics 74-VIS: 350–2000 nm wavelength range, f/2 BK-7 glass.

Figure 4. The complete set of parameters obtained for modeling the air–water interface.
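The autoexposure loop can be thought of as a simple proportional controller on integration time, since measured signal scales roughly linearly with integration time. A minimal sketch, assuming illustrative target and limit values (not the paper's actual implementation):

```python
def next_integration_time(t_current_ms, recent_means, target=0.6,
                          t_min_ms=5.0, t_max_ms=1000.0):
    """Scale the spectrometer integration time so that the mean signal
    level (fraction of full scale, averaged over the recent window of
    readings, e.g. the last 30 s) approaches the target.
    Target and limits are hypothetical values."""
    mean_level = sum(recent_means) / len(recent_means)
    if mean_level <= 0.0:
        return t_max_ms  # no measurable signal: open up fully
    t_next = t_current_ms * target / mean_level
    # Clamp to the device's supported integration-time range.
    return min(max(t_next, t_min_ms), t_max_ms)
```

One proportional update per reading converges quickly, while the clamp guards against darkness (unbounded growth) and saturation (vanishing exposure).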

2.3.2. AUV Upward-Looking Spectrometer

To collect the downwelling irradiance through the water column above the AUV, an upward-looking spectrometer was installed on the top of the AUV. This spectrometer samples the sun and sky light penetrating down through the water column from the surface above. The spectrometer used was a low-cost microspectrometer (Ocean Optics STS-200 VIS), chosen for its high resolution and small form factor (40 mm × 42 mm × 24 mm), with a spectral range of 335–820 nm. A cosine corrector2 was installed on the fore-optics for integrating the incoming radiance. The spectrometer operated in a different underwater housing to the downward-facing spectrometer and stereo camera system; all sensor data were logged with a network-synchronized time stamp at a rate of 1 Hz. Synchronization of the data between this spectrometer and the other sensors was achieved via postprocessing.

2Ocean Optics CC-3: Opaline glass, 180° FOV, 350–1000 nm wavelength range.
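The postprocessing synchronization can be as simple as matching each camera-trigger timestamp to the nearest entry in the 1 Hz spectrometer log, rejecting matches whose clock skew is too large. A minimal sketch (the function name and skew threshold are illustrative assumptions):

```python
import bisect

def nearest_sample(timestamps, query, max_skew=0.5):
    """Match a query time (e.g., a camera trigger) to the index of the
    closest spectrometer sample; return None if the gap exceeds
    max_skew seconds. timestamps must be a sorted list of floats."""
    i = bisect.bisect_left(timestamps, query)
    # The closest sample is either just before or just after the query.
    candidates = [j for j in (i - 1, i) if 0 <= j < len(timestamps)]
    best = min(candidates, key=lambda j: abs(timestamps[j] - query))
    return best if abs(timestamps[best] - query) <= max_skew else None
```

With logs at 1 Hz and a 0.5-s threshold, every trigger time within the logged interval finds a match, while gaps in the log are flagged rather than silently bridged.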

2.3.3. Ship-borne Upward-Looking Spectrometer

To measure the irradiance before it enters the water, an upward-looking spectrometer was installed on the support vessel at the surface. This spectrometer is mounted in the zenith position at an unobstructed, high vantage point on the ship. The sensor is identical to the upward-facing spectrometer on the AUV (Ocean Optics STS-200 VIS), with the same fore-optics (cosine corrector), and was mounted inside a waterproof housing with an acrylic window. The spectral data were logged to a ship-based computer over RS-232 serial using a custom Python-based driver. An autogain system was implemented in the device driver similar to the one on the downward-looking spectrometer (see Section 2.3.1). The time on the surface data-logging computer was synchronized with the AUV’s onboard computer preceding each dive. The data acquisition system logged spectra at approximately 1 Hz. Position, roll, pitch, and heading were recorded from an IMU mounted on the support vessel, sampled at 10 Hz, and later used to estimate the fraction of light entering the water from directly above. A Savitzky–Golay filter (Savitzky & Golay, 1964) was used to reduce noise in the measured spectra from both the ship-borne and AUV-mounted upward-looking spectrometers.

2.3.4. Relative Radiometric Calibration of Spectrometers

Each of the spectrometers used was sensitive to slightly different wavelengths of light, owing to differences in the units themselves, the collimating lenses, and the materials in the underwater housings; a relative radiometric calibration procedure was therefore performed to provide a wavelength-dependent scaling factor for each spectrometer such that radiance measurements were consistent among the three units. The scaling factor was determined by imaging a common white calibration panel prior to AUV deployment.
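The white-panel calibration reduces to a per-wavelength-bin ratio. A sketch of the idea, assuming the spectra are already resampled onto a common wavelength grid (the three-bin arrays are hypothetical):

```python
import numpy as np

def relative_scale(panel_reference, panel_measured, eps=1e-9):
    """Per-bin factor that maps one spectrometer's white-panel reading
    onto the reference unit's reading of the same panel; applying it
    to field spectra makes radiance comparable across units."""
    return panel_reference / np.maximum(panel_measured, eps)

# Hypothetical example: the unit reads 20% low, exact, and 25% high
# in three wavelength bins relative to the reference spectrometer.
reference = np.array([1.00, 1.00, 1.00])
measured = np.array([0.80, 1.00, 1.25])
scale = relative_scale(reference, measured)
corrected = measured * scale  # now matches the reference unit
```

In practice the panel would be imaged under a stable lamp, and the ratio computed per spectral bin across the full 400–800 nm working range.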

2.4. Spatial Calibration of Downward-Looking Spectrometer and Stereo Camera System

We used an existing procedure (Bongiorno, Fairley, Bryson, & Williams, 2013b) to spatially coregister the footprint of the downward-looking spectrometer with the stereo camera system, performed prior to AUV operations. The collimating optics used at the front of the downward-looking spectrometer resulted in a focused footprint of the sensor on the target surface; this footprint has an intrinsic spatial distribution of sensitivity within the FOV, which is not necessarily circular and uniform. Additionally, the optical axis of the spectrometer is not aligned with the optical axis of the camera. A two-stage calibration procedure was developed to account for these effects.

The first stage of the calibration involved estimation of the spectrometer FOV and the spatial sensitivity distribution within the FOV of the stereo camera image. The method used a laboratory setup in a dark room in which both sensors were pointed toward a computer monitor that produced a small white square against a black field. The position of the white square was varied across a regular pattern, and the resulting position of the square and the spectrometer-measured intensity were recorded. The resulting intensities were used to produce a map of spectrometer sensitivity as a function of image position.

The second stage of the calibration involved computing a relative translation and rotation between the optical centers of the camera and the spectrometer by repeating the imaging of the monitor at varying distances from the camera. The resulting calibration procedure produced a relationship that described the FOV of the spectrometer footprint within the camera image space as a function of the range to the target.

When operating on the AUV, this calibration relationship was used to project the position of spectrometer readings into the camera space based on information about the range to targets provided by the stereo-imaging system.
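As a toy model of that range-dependent projection: if the spectrometer's optical axis is parallel to the camera's but laterally offset, the footprint center in the image drifts with inverse range (simple parallax). All numbers below (principal point, focal lengths, offsets) are hypothetical stand-ins for calibrated values, and the real procedure also maps the full footprint shape, not just its center:

```python
def footprint_center(range_m, u_inf=640.0, v_inf=512.0,
                     fx=1400.0, fy=1400.0, bx=0.05, by=0.0):
    """Pixel coordinates of the spectrometer footprint center as a
    function of range. With a lateral offset (bx, by) in meters
    between the spectrometer and camera optical axes, the center
    converges toward the pixel (u_inf, v_inf) as range grows."""
    u = u_inf + fx * bx / range_m
    v = v_inf + fy * by / range_m
    return u, v
```

At the AUV's nominal 2-m altitude the offset shifts the footprint by tens of pixels, which is why the stereo-derived range is needed for accurate registration.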

2.5. Water Column Modeling and Reflectance Reconstruction

This section describes our methodology for recovering measurements of hyperspectral reflectance from the seafloor using data acquired by the different spectrometers. Hyperspectral reflectance of surfaces on the seafloor is defined as the ratio of the radiance of light leaving the surface to the incident light arriving at the surface, as a function of light wavelength. To estimate these two radiance values, information is required on the light leaving the surface (measured by the downward-looking spectrometer) and the light available in the scene from both the artificial strobes carried on the AUV and the ambient light available from the sun (measured by the support vessel spectrometer). Information is also required on the wavelength-dependent attenuation of these different light sources as they move through the water column.

2.5.1. Water Column Modeling

The molecules of a body of water absorb light in a wavelength-dependent manner, depending on the distance traveled through the water, via the Lambert–Beer law:

Ed(λ) = Es(λ) e^(−K(λ)d),  (1)

where Ed(λ) is the irradiance at wavelength λ at distance d, Es(λ) is the irradiance at zero distance, and K(λ) is the diffuse downwelling attenuation coefficient for the water (a function of wavelength). Estimates of the diffuse downwelling attenuation coefficient K(λ) can be ascertained by measuring the ratio of irradiance before and after travelling through a known distance of water, by rearranging Eq. (1):

K(λ) = −(1/d) loge(Ed(λ)/Es(λ)).  (2)

Synchronized measurements of hyperspectral radiance from both the surface-based and AUV-mounted upward-looking spectrometers were used to estimate the attenuation coefficient of the water column in which the AUV operated. Measurements from the surface-based spectrometer were processed to account for the motion of the vessel and the effect of the air–water interface (see the next subsection) to estimate the radiance of light entering the water column above the AUV. This value was then combined with the upward-looking spectrometer measurements via Eq. (1) to estimate the attenuation coefficient of the water column. Estimates of the coefficient were made for each measurement to account for variations encountered as the AUV moved from one location to the next (differences in water attenuation properties induced by changes in, e.g., water temperature, salinity, and constituents).
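The attenuation estimate from the two upward-looking spectrometers follows directly from Eqs. (1) and (2). A minimal sketch, with hypothetical values standing in for the corrected surface spectrum and the AUV measurement:

```python
import numpy as np

def attenuation_coefficient(E_surface, E_depth, depth_m):
    """Diffuse downwelling attenuation coefficient K(lambda), per
    Eqs. (1)-(2): E_surface is the irradiance entering the water
    column (surface spectrometer, after interface correction),
    E_depth is the irradiance at the AUV's upward-looking
    spectrometer, depth_m the vertical separation in meters."""
    return np.log(E_surface / E_depth) / depth_m

# Sanity check against the forward model of Eq. (1),
# using a hypothetical K of 0.15 per meter over 10 m of water.
K_true = 0.15
Es = np.array([1.0, 0.9, 0.8])       # three example wavelength bins
Ed = Es * np.exp(-K_true * 10.0)     # propagate 10 m down
K_est = attenuation_coefficient(Es, Ed, 10.0)
```

Estimating K(λ) per measurement, as described above, then amounts to running this ratio for every synchronized spectrometer pair along the dive.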

Journal of Field Robotics DOI 10.1002/rob


8 • Journal of Field Robotics—2017

2.5.2. Compensation of Air–water Interface Effects

The complex nature of the air–water interface results in changes in the intensity of light just above and just below the water's surface, depending on the time of day, wind speed, and other water properties. Light passing through the air–water interface is reduced by various factors, such as reflection and refraction, arising from the angle of the sun, waves, and wind. A series of processing steps was used to derive estimates of the light radiance just under the water surface from measurements above the surface.

To determine the amount of reflection from the interface, the refractive index is derived using the empirical equation for the index of refraction of seawater [Eq. (3)] (Quan & Fry, 1995), which is a function of wavelength (λ in nm), salinity (S), and temperature (T):

n(S, T, λ) = n_0 + (n_1 + n_2 T + n_3 T^2)S + n_4 T^2 + (n_5 + n_6 S + n_7 T)/λ + n_8/λ^2 + n_9/λ^3,   (3)

n_0 = 1.31405,  n_1 = 1.779 × 10^−4,  n_2 = −1.05 × 10^−6,
n_3 = 1.6 × 10^−8,  n_4 = −2.02 × 10^−6,  n_5 = 15.868,
n_6 = 0.01155,  n_7 = −0.00423,  n_8 = −4382,
n_9 = 1.1455 × 10^6.
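For reference, Eq. (3) with the coefficients above can be evaluated directly (a sketch; the function name is ours):

```python
def seawater_refractive_index(S, T, lam):
    """Empirical index of refraction of seawater (Quan & Fry, 1995), Eq. (3).

    S: salinity (PSU), T: temperature (deg C), lam: wavelength (nm).
    """
    n0, n1, n2, n3 = 1.31405, 1.779e-4, -1.05e-6, 1.6e-8
    n4, n5, n6, n7 = -2.02e-6, 15.868, 0.01155, -0.00423
    n8, n9 = -4382.0, 1.1455e6
    return (n0 + (n1 + n2 * T + n3 * T**2) * S + n4 * T**2
            + (n5 + n6 * S + n7 * T) / lam + n8 / lam**2 + n9 / lam**3)

# Typical open-ocean seawater at green wavelengths
n = seawater_refractive_index(35.0, 20.0, 550.0)
```

For S = 35, T = 20 °C, λ = 550 nm this gives a value close to the familiar ~1.34 for seawater.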

From the derived refractive index n_water(S, T, λ), Snell's law [Eq. (4)] was used to determine the exit angle of the light in the water:

θ_w(λ) = arcsin( (n_air / n_water(S, T, λ)) sin(θ_a) ),   (4)

where the index of refraction of air n_air ≈ 1 and the entry angle θ_a is the zenith angle of the sun. The refracted angle at which the light passes through the water after the air–water interface is θ_w(λ). A proportion of the light is reflected at the interface and does not pass into the water. The reflectance r at the surface is a function of the zenith angle of the incident light in air (θ_a) and the exit angle (relative to vertical) in water (θ_w). This relationship is given by Fresnel's equation:

r = (1/2) [ sin^2(θ_a − θ_w)/sin^2(θ_a + θ_w) + tan^2(θ_a − θ_w)/tan^2(θ_a + θ_w) ].   (5)
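Eqs. (4) and (5) together give the fraction of incident sunlight reflected back at the surface. A minimal sketch (function names are ours; the normal-incidence branch uses the standard closed-form limit that Eq. (5) approaches as θ_a → 0):

```python
import math

def refracted_angle(theta_a, n_water, n_air=1.0):
    """Snell's law, Eq. (4): in-water angle for in-air zenith angle theta_a (rad)."""
    return math.asin(n_air / n_water * math.sin(theta_a))

def fresnel_reflectance(theta_a, theta_w, n_water=1.34):
    """Unpolarised Fresnel surface reflectance, Eq. (5)."""
    if theta_a < 1e-9:
        # Eq. (5) is 0/0 at normal incidence; use the closed-form limit
        return ((n_water - 1.0) / (n_water + 1.0)) ** 2
    d, s = theta_a - theta_w, theta_a + theta_w
    return 0.5 * (math.sin(d) ** 2 / math.sin(s) ** 2
                  + math.tan(d) ** 2 / math.tan(s) ** 2)

theta_a = math.radians(30.0)             # solar zenith angle
theta_w = refracted_angle(theta_a, 1.34)
r = fresnel_reflectance(theta_a, theta_w)
```

At a 30° solar zenith angle only about 2% of the light is reflected; the Fresnel loss grows rapidly only at grazing angles.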

There will also be light reflection from wind-derived white caps on the surface of the water, which can be modeled using a white cap percentage (Spillane, Monahan, Bowyer, Doyle, & Stabeno, 1986):

whitecap% = 2.692 × 10^−5 w_s^2.625,   (6)

where wind speed w_s is in meters per second and was obtained from an Australian Institute of Marine Science offshore weather station at Davies Reef (18.83° S, 147.63° E), which was close to our mapping sites. Koepke (1984) found that, owing to their short life, white caps have an effective reflectance of 22% over time. The extinction percentage (ε) of the irradiance passing through the air–water interface is derived in Eq. (7), with the resultant light irradiance just below the surface E_s being defined in Eq. (8).

Figure 5. The layout of the imaging pod and the strobes on our AUV, used for modeling the strobe illumination: ℓ indicates the altitude of the vehicle and imaging pod; p_f and p_r are the path lengths from the front and rear strobes to the scene, respectively; θ_f and θ_r are the illumination angles as a function of altitude and linear offset on the AUV; ψ_f and ψ_r are the fixed tilts of the strobes on the vehicle.

ε = whitecap% × 22% + r. (7)

Finally, a correction was applied for changes in the zenith angle of the measuring spectrometer above the water, owing to the roll and pitch of the surface vessel, which were recorded using an IMU and GPS navigation system time-synchronized to the spectrometer. The resulting irradiance of light just below the surface of the water (E_s(λ)) is computed using Eq. (8):

E_s(λ) = E_ψ(λ) (1 − ε) / cos(θ_η),   (8)

where θ_η is the angle between the sensor's optical axis and the zenith at the time of acquisition, and E_ψ(λ) is the irradiance measurement from the above-water optical sensor. Measurements of E_s(λ) are combined with measurements from the AUV-mounted upward-looking spectrometer (E_d(λ)) and the depth of the AUV (from the on-board depth sensor) to estimate K(λ) using Eq. (2).
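The chain of Eqs. (6)–(8) amounts to a single correction step, sketched below (names are ours; the 22% whitecap reflectance follows Koepke (1984) as cited above):

```python
import math

def subsurface_irradiance(e_psi, wind_speed, r, theta_eta):
    """Irradiance just below the surface, Eqs. (6)-(8).

    e_psi      : above-water spectrometer measurement E_psi(lambda)
    wind_speed : wind speed in m/s
    r          : Fresnel reflectance from Eq. (5)
    theta_eta  : sensor tilt from the zenith (radians)
    """
    whitecap = 2.692e-5 * wind_speed ** 2.625   # Eq. (6), fractional coverage
    eps = whitecap * 0.22 + r                   # Eq. (7), whitecaps reflect ~22%
    return e_psi * (1.0 - eps) / math.cos(theta_eta)  # Eq. (8)

# Unit above-water irradiance, 10 m/s wind, ~2.2% Fresnel loss, 5 deg sensor tilt
e_s = subsurface_irradiance(1.0, 10.0, 0.022, math.radians(5.0))
```

At these sample values the combined surface losses are only a few percent of the incident irradiance.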

2.5.3. Modeling of Illuminating Light Sources

Light arriving at the seafloor, which is then reflected and measured by the AUV, is a result of a combination of the ambient light (from the sun) and the on-board strobes. In this section, we discuss models used to calculate the total irradiance of light arriving at the surface (where the AUV is imaging) from these two sources. Figure 5 illustrates the


Bongiorno et al.: Coregistered Hyperspectral and Stereo Image Seafloor Mapping • 9

physical layout of the front- and rear-mounted strobes on the AUV, the imaging pod, and the path lengths of light to the observed location of the seafloor used in the modeling.

The spectral irradiance of the strobes was measured using both an in-air and an in-water procedure. Spectral measurements of light from the strobes reflected off a white calibration target (Spectralon panel) were made in air in dark surroundings (to isolate light from the strobes), in which losses through the atmosphere were assumed negligible. Additionally, an in-water procedure was used in which a white object was measured by the spectrometer with the strobes both on and off. The strobe spectral power was estimated by subtracting the strobe-off measurement from the strobe-on measurement and accounting for the attenuation of the water, calculated using Eq. (2) with additional upward-looking measurements at the AUV depth and at the surface. The resulting spectra of the two procedures agreed.

The irradiance of light arriving at the seafloor surface from each strobe is a function of both the distance and the relative pointing angle of the strobe, which varied depending on the altitude of the AUV above the seafloor. The spread of light about each strobe's direction vector was estimated using a cosine drop-off relationship such that the total power of light leaving each strobe, in the direction of the imaged section of seafloor (E_f and E_r, front and rear strobes), was

E_f = cos(|θ_f − ψ_f|),   (9)

E_r = cos(|θ_r − ψ_r|),   (10)

where θ_f, θ_r, ψ_f, and ψ_r are the angles illustrated in Figure 5. Light from the strobes is then attenuated through water path lengths p_f and p_r. In addition, there is a contribution from the ambient light (from the sun); these are combined in Eq. (11) to compute the total irradiance of light arriving at the seafloor surface (E_seafloor(λ)):

E_seafloor(λ) = E_f(λ) e^(−K(λ)p_f) + E_r(λ) e^(−K(λ)p_r) + E_d(λ) e^(−K(λ)ℓ),   (11)

where information on the relative position of the imaged section of seafloor, and hence the distances ℓ, p_f, and p_r, was provided by the navigation estimates and 3D terrain surface models computed by the AUV (see Section 2.2).
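Eq. (11) is a per-wavelength sum of three attenuated sources, sketched below (names and sample values are illustrative only):

```python
import numpy as np

def seafloor_irradiance(e_f, e_r, e_d, K, p_f, p_r, alt):
    """Total irradiance at the imaged seafloor patch, Eq. (11): the two strobe
    contributions plus the ambient light, each attenuated over its own path."""
    return (e_f * np.exp(-K * p_f)      # front strobe over path p_f
            + e_r * np.exp(-K * p_r)    # rear strobe over path p_r
            + e_d * np.exp(-K * alt))   # ambient light at AUV depth over altitude

# Two sample wavelength bins: clear (K=0.05 /m) and strongly absorbed (K=0.2 /m)
K = np.array([0.05, 0.2])
e = seafloor_irradiance(1.0, 1.0, 0.5, K, 2.5, 2.8, 2.2)
```

Because the strobe paths and altitude are only a few metres, the attenuation terms differ sharply between weakly and strongly absorbed wavelengths.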

2.5.4. Calculation of Seafloor Reflectance

Final estimates of the seafloor reflectance were obtained as the ratio of the measured light leaving the surface to the estimated light arriving at it:

R(λ) = E_u(λ) / E_seafloor(λ),   (12)

where E_u(λ) is the light leaving the seafloor surface, calculated from the downward-looking spectrometer measurements by accounting for the attenuation along the path up to the imaging pod using

E_u(λ) = (1 / e^(−K(λ)ℓ)) E_σ(λ),   (13)

where E_σ(λ) is the measured irradiance from the downward-looking spectrometer.
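Eqs. (12) and (13) combine into a short reflectance-recovery step (an illustrative sketch, not the authors' implementation):

```python
import numpy as np

def seafloor_reflectance(e_sigma, e_seafloor, K, alt):
    """Seafloor reflectance, Eqs. (12)-(13): undo the upward attenuation over
    the vehicle altitude, then ratio against the modelled incident irradiance."""
    e_up = e_sigma * np.exp(K * alt)   # Eq. (13): E_u = E_sigma / exp(-K * alt)
    return e_up / e_seafloor           # Eq. (12)

# One sample bin: measured 0.2 units at the pod, unit incident irradiance,
# K = 0.1 /m over a 2 m altitude
R = seafloor_reflectance(np.array([0.2]), np.array([1.0]), np.array([0.1]), 2.0)
```

The recovered reflectance is slightly above the raw 0.2 ratio because the upward attenuation over the 2 m altitude is undone first.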

2.6. Georegistration of Reconstructed Reflectance Profiles

Through the motion of the AUV, a series of spectral reflectance measurements along the transect of the AUV survey was produced. The vehicle typically travelled at 0.5 m/s, and reflectance profiles were reconstructed at 1.5 Hz. The centroid location and footprint radius of each measurement were projected onto a mosaic map reconstructed from the stereo imagery, using the estimates of the pose of the AUV and relative terrain positioning from the on-board navigation and mapping systems. These poses were estimated using information from the pressure sensors, compass, tilt sensors, USBL positioning system, and feature matching of the stereo camera using a SLAM algorithm (Williams et al., 2012). The position of the hyperspectral spot is defined by

p^NED_f = p^NED_AUV + C^NED_c p^c_f,   (14)

where p^NED_f is the georeferenced coordinate of the hyperspectral spot in an NED (north-east-down) reference frame, p^NED_AUV is the position of the AUV in the NED frame, C^NED_c is the direction cosine matrix representing the AUV's orientation, and p^c_f is the position of the hyperspectral spot measurement in the stereo camera frame of reference (derived using the techniques outlined in Section 2.4).
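Eq. (14) can be sketched as follows, assuming a ZYX (yaw-pitch-roll) Euler convention for the direction cosine matrix; the convention and function name are our assumptions, and the actual system derives C^NED_c from its navigation filter:

```python
import numpy as np

def georegister_spot(p_auv_ned, rpy, p_spot_cam):
    """Eq. (14): project a camera-frame spot position into the NED frame.

    p_auv_ned  : AUV position in NED (m)
    rpy        : (roll, pitch, yaw) in radians, ZYX convention assumed
    p_spot_cam : spot position in the camera frame (m)
    """
    r, p, y = rpy
    cr, sr = np.cos(r), np.sin(r)
    cp, sp = np.cos(p), np.sin(p)
    cy, sy = np.cos(y), np.sin(y)
    # Body/camera-to-NED direction cosine matrix, C^NED_c
    C = np.array([
        [cp * cy, sr * sp * cy - cr * sy, cr * sp * cy + sr * sy],
        [cp * sy, sr * sp * sy + cr * cy, cr * sp * sy - sr * cy],
        [-sp,     sr * cp,                cr * cp],
    ])
    return np.asarray(p_auv_ned) + C @ np.asarray(p_spot_cam)

# Level vehicle looking straight down: a spot 2.2 m below the camera
# lands 2.2 m deeper in NED
p = georegister_spot([10.0, 5.0, 27.0], (0.0, 0.0, 0.0), [0.0, 0.0, 2.2])
```

With zero roll, pitch, and yaw the rotation is the identity and the spot offset adds directly to the vehicle position.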

2.7. Classification of Seafloor Biota from Reconstructed Reflectance Spectra

A supervised classification scheme was developed that used reconstructed reflectance spectra measured at the seafloor to classify different seafloor organisms and surface types. Pairs of reflectance profiles and associated color images from the stereo camera were extracted and used by a marine biologist to produce a list of the dominant classes of objects present, including seafloor biota and substrate types. A subset of the image data was then labeled by the biologist, and the corresponding reflectance spectra were used as training data for a classification algorithm. Several supervised classification techniques, including support vector machines (SVM), decision trees, K-nearest neighbors (KNN), and ensemble boosting and bagging, were used to evaluate variations in performance owing to algorithm selection. The classification algorithms were implemented using the Statistics and


Machine Learning Toolbox within Matlab (MATLAB, 2014). The raw values of the recovered reflectance within the range of 400–800 nm, at a resolution of 1 nm, were used directly as the feature vector.

To compare the relative performance of classification based on reflectance data to imagery data, classifiers were also trained using stereo image–based texture features and extracted RGB color data. Local binary patterns (LBP) (Ojala, Pietikainen, & Maenpaa, 2002) were extracted from the color stereo imagery corresponding to the disk of seafloor encompassed by the spectrometer's FOV, using the calibration relationships discussed in Section 2.4. Two LBPs were computed on the image spot: one with a radius of 1 pixel and eight surrounding points, and the other with a radius of 3 pixels and eight surrounding points. This combination was chosen based on its success in classifying underwater imagery from an AUV (Bewley et al., 2015). The histograms of both LBPs were used as the feature vector for classification. Patches of image color were extracted from the camera images and corrected for water attenuation using Eq. (13). Four separate classifiers were developed: one based only on reflectance profiles, one based only on image texture, one based only on RGB data, and one based on a combination of reflectance and texture features. For each type of classifier, automatic feature selection was performed while training the classifier to ensure the greatest interclass variability in the feature matrix (Friedman, 2013). When testing and training, the data set was whitened by subtracting the mean and dividing by the variance at the feature level.
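The whitening step described above (subtract the per-feature mean, divide by the per-feature variance) can be sketched with a toy stand-in classifier; a nearest-centroid rule is used here in place of the Matlab toolbox SVM, and all data are synthetic and illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for reflectance feature vectors: 401 bands over 400-800 nm at 1 nm.
# Two synthetic "classes" with different mean spectra (illustrative only).
n_bands = 401
X = np.vstack([
    rng.normal(0.30, 0.02, (50, n_bands)),   # bright, sand-like spectra
    rng.normal(0.08, 0.02, (50, n_bands)),   # darker, coral-like spectra
])
y = np.repeat([0, 1], 50)

# Whitening as described: subtract the mean and divide by the variance per feature
mu, var = X.mean(axis=0), X.var(axis=0)
Xw = (X - mu) / var

# Minimal nearest-centroid classifier as a stand-in for the SVM
centroids = np.vstack([Xw[y == c].mean(axis=0) for c in (0, 1)])
dists = ((Xw[:, None, :] - centroids[None]) ** 2).sum(axis=2)
pred = np.argmin(dists, axis=1)
accuracy = (pred == y).mean()
```

On such well-separated synthetic spectra even this trivial classifier is near-perfect; the interest in the real data lies in the overlapping classes discussed in Section 3.4.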

3. RESULTS

3.1. Experimental Setup

Data collected from an AUV survey over a coral shoal at Pakhoi Bank (19.441° S, 147.883° E) on the Great Barrier Reef, Queensland, Australia (December 17, 2013, 14:02 to 16:40), were used to demonstrate our methodology. The AUV followed a coarse-grid transect with approximately 100 m between parallel grid lines. This was a long mission in which 14,168 stereo pairs of images and corresponding spectrometer readings were taken over a total path length of approximately 4.5 km. The mean depth of operation was 27.1 m, and the mean altitude of the AUV above the seafloor was 2.2 m. This was one of many surveys conducted over small shoals on the Great Barrier Reef approximately 40 km off the coast of Townsville, Australia (Roberts, Moloney, Sweatman, & Bridge, 2015). A wide variety of substrates and biota were present in the study region, including sandy and rocky bottoms, coral rubble, live coral of several different genera, seagrass and other turf, and encrusting species of macro- and microalgae.

3.2. Spectral Reflectance Profiles

Figures 6–10 illustrate corresponding pairs of image data (a) and recovered reflectance spectra (b), where the red circle in the images indicates the FOV of the spectrometer. Figures 6(c) and 6(d) additionally show the measured ambient light irradiance (at the surface and at the depth of the AUV) and the estimated attenuation coefficient K(λ), each as a function of wavelength.

By examining the spectral reflectance, we are able to extract more information about the pigments that may be present within each seafloor class. The coral shown in Figure 7 (Montipora sp.) exhibits a higher green reflectance (475–575 nm) than the more distinctly brown coloration of the coral observed in Figure 6 (Acropora sp. 1). The distinctive blue tips of the coral Acropora sp. 2 (Figure 9) are seen as a small peak in the spectral reflectance around 480 nm. The seagrass measurement observed in Figure 10 exhibits a distinct chlorophyll absorption (lower reflectance) at 675 nm, common for seagrasses and algae. Such differences in absorption and reflectance between biota were important features in the ability to effectively classify biota classes from reflectance signals.

3.3. Mapping Results

Figure 11 shows sections of the reconstructed mosaic imagery maps and associated reflectance spot locations, with annotated labels of the corresponding seafloor classes. These maps illustrate single-transect sections (approximately 2 m wide) from broad-scale transects.

Figure 11(a) shows a section of the survey mission over a sand-to-seagrass transition. This section was not included in the classification training data set and so demonstrates typical test-set classification performance. The major substrates present in the data were classified with high accuracy. Some misclassification of coral [Figure 11(b)] was present, due in part to the number of training samples in the model and the size of the material within the FOV of the spectrometer. This aspect is discussed further in the next section. Figure 11(c), like the previous figure, demonstrates the typical classification performance over a section of coral reef.

3.4. Classification Performance

Table I shows the resulting classification performance (accuracy) for the four different combinations of features employed and the different classification algorithms used, evaluated using 10-fold cross-validation. A linear SVM with a combination of the hyperspectral reflectance and LBP features performed the best, with a classification accuracy of 91.99% ± 2.32%. Use of the hyperspectral reflectance profile as a sole feature consistently outperformed classification based on either image texture or three-channel


Figure 6. Spectral reflectance of Acropora sp. 1 coral.

Figure 7. Spectral reflectance of Montipora sp. coral.

RGB image color (excepting marginal differences when using an SVM with radial basis functions).

Use of image texture (LBP) on its own as a feature vector with the ensemble boosting method resulted in a good classification rate of 82.38%. In comparison, the HS feature descriptor performed better (91.69%) than LBP using a linear SVM. When examining the combined HS + LBP results, the performance increase over HS alone was slight.

Table II shows the confusion matrix for the linear SVM LBP+HS trained classifier (the best performer),


Figure 8. Sand reflectance.

Figure 9. Acropora sp. 2 coral reflectance.

Figure 10. Seagrasses: Halophila spinulosa and Halophila decipiens.


Figure 11. (a) Mosaic of a small section of the classification prediction, showing a transitional section of sand to seagrass. Yellow circles signify a label of sand, and green the label for seagrass. (b) and (c) Mosaics of small sections of the classification prediction over a section of coral reef, showing the transition from sandy substrate to coral and algal substratum.

showing the true class of a data point versus the predicted class. The confusion matrix can explain some reasons for suboptimal classification. Coral rubble is mostly predicted accurately (84.7%); however, it is also predicted to be algae, Acropora cf. grandis, other corals, and Acropora sp. This is likely because coral rubble is made up of broken dead corals mostly covered in algae. Seagrass was sometimes misclassified (2.1%) because images of seagrass also contain sand. Turf algae was misclassified as coral and rubble (28%) because turf algae is often present over most underwater surfaces. Soft coral was misclassified as Acropora sp. (39.3%), most likely because the process is picking up on a pigment common to most corals (Hedley & Mumby, 2002; Hochberg, Apprill, Atkinson, &


Table I. Classification performance (accuracy) for different feature combinations and classification algorithms using 10-fold cross-validation on the hand-labeled data points (RGB: image color data only; LBP: image texture only; HS: hyperspectral reflectance only; HS+LBP: hyperspectral reflectance and image texture). A linear SVM performed best on the combination of hyperspectral reflectance (HS) and LBP data. The table also shows that LBP contributes little to classification performance over using HS alone. Classification results are shown with ±1 standard deviation.

Classifier            RGB              LBP              HS               HS + LBP
Linear SVM            39.13% ± 0.51%   73.30% ± 1.79%   91.69% ± 1.75%   91.99% ± 2.32%
Polynomial SVM        39.13% ± 0.52%   39.13% ± 0.53%   88.41% ± 1.81%   88.48% ± 2.99%
SVM Sigmoid Kernel    39.13% ± 0.46%   39.13% ± 0.41%   71.32% ± 1.67%   71.40% ± 1.77%
SVM Radial Basis Fn   39.13% ± 0.48%   39.66% ± 1.02%   39.13% ± 0.54%   39.13% ± 0.34%
KNN                   84.59% ± 1.47%   79.79% ± 3.67%   86.73% ± 1.91%   86.80% ± 1.13%
Ensemble Boosting     82.61% ± 2.07%   82.38% ± 2.42%   91.00% ± 2.28%   89.24% ± 2.56%
Ensemble Bagging      82.68% ± 1.13%   81.08% ± 3.22%   90.92% ± 1.49%   89.40% ± 1.30%
Decision Tree         84.67% ± 2.20%   78.72% ± 2.20%   88.56% ± 2.36%   88.02% ± 2.08%

Table II. Confusion matrix for the classification model using a linear SVM with HS and LBP data. The model was trained using 10-fold cross-validation on the hand-labeled data set. CCA: crustose coralline algae; Encrusting: encrusting corals. The major classes (seagrass, sand, rubble, turf algae, and Acropora) perform very well; the other, smaller classes do not, believed to be due in part to the small number of samples available to train the classifier. These particular classes were rare in the data set.

                        Actual Class
Predicted Class       CCA  Seagrass  Sand  Coral Rubble  Turf Algae  Acropora  Encrusting  Sponge  Soft Coral  Cyanobacteria
CCA                     6       0       0        0            6          0          2          2        2            0
Seagrass                0     502       3        1            0          1          1          0        0            0
Sand                    0      11     455        1            0          0          0          0        0            0
Coral Rubble            0       0       4       61            4          0          1          1        4            6
Turf Algae              4       0       0        5           15          3          3          0        0            3
Acropora                1       0       0        0            4        130          4          2       11            0
Encrusting              2       0       0        0            1          2          2          1        1            0
Sponge                  0       0       0        1            0          2          0          9        1            0
Soft Coral              0       0       0        1            0          4          0          0        9            0
Cyanobacteria           0       0       0        2            2          0          0          0        0           12

Class size (n)         13     513     462       72           32        142         13         15       28           21
% correctly predicted  46      98      98       85           47         92         15         60       32           57

Bidigare, 2006) and the small number of samples of softcoral.

There will also be confusion due to the relatively large spatial coverage of the spectrometer's FOV, which often covered more than one material. The spectrometer's FOV has a diameter of approximately 30 cm when imaging the seafloor from a height of 2 m. This spatial attribute can also explain the poor performance in some classes: CCA, turf algae, encrusting coral, soft coral, and cyanobacteria performed poorly. A majority of these biota covered only very small regions of the FOV of the spectrometer, resulting in spectral mixing. The material they are mixed with will change, and thus the spectra for these materials will not be consistent, leading to classification errors. To overcome this, either a finer spatial resolution or closer measurements of the substrate are needed. The larger materials such as seagrass, sand, coral rubble, and Acropora all perform well under the 10-fold cross-validation owing to the reduction in spectral mixing. The limited spatial resolution remains a problem for identifying smaller materials in the scene.

3.5. Classified Seafloor Habitat Map

Figure 12 shows a 3D map of georegistered and classified biota data using the best-performing classifier described above. From the spatial distribution of classes observed,


Figure 12. Spectral map from an isometric angle showing the different elevations of the seafloor types. The elevation has been amplified by a factor of 5 to further demonstrate the class distribution as a function of elevation. The key for the class labels is on the right. This data set contains over 14,000 points; owing to the limits of displaying the classified points in this diagram, the full detail of the classification cannot be shown in this figure. More detailed figures of subsets of the map, overlaid on the corresponding image mosaics, are shown in Figure 11.

Acropora sp. coral tended to occur around the higher-elevation areas on top of the coral outcrops, whereas seagrass grew in the lower sandy patches. Coral rubble patches were more prevalent on the exposed eastern side of the reef, at the edge of a large flat sandy expanse; the prevailing storm and cyclone activity most often came from the southeast of the study region. On the western, inshore side, more live Acropora sp. coral was observed.

4. DISCUSSION

4.1. Advantages of Our Approach

The experimental results demonstrate the ability to recover fine-scale hyperspectral reflectance profiles of objects on the seafloor and to use these profiles to accurately classify both biota and substrate types. Our approach allows for the recovery of reflectance data through careful measurement, calibration, and mitigation of light effects in the underwater environment, including effects from the water surface and attenuation, accounting for changes in the AUV operating depth and altitude above the underwater terrain. Being able to map biota, for example in coral reef habitats, at fine resolutions as demonstrated here allows for fine-scale investigations of various indicators of the biological health or productivity of a region and is important in the context of long-term monitoring.

When viewing the seafloor from above the water, light from the sun must pass through the water column, reflect off the substrate (sandy substrate: 25–35% reflectance; coral: 5–10% reflectance), and travel back through the water column. Our technique allowed us to image at greater depth and still receive solar illumination, as a result of being in situ and thus having a shorter optical path length. We also received a stronger reflected signal owing to the artificial illumination provided by the AUV.

The ability to discriminate seafloor material types at such a fine scale has other applications in the analysis of marine environments. There are several uses for the data in above-water applications, such as retrieving ground-truth spectra for larger-scale aerial mapping and collecting the optical properties of a water body for use in correcting remotely sensed data over coastal regions. Aerial mapping algorithms often make strong assumptions about the homogeneity of the water body they are imaging through, whereas our approach continuously estimates and compensates for these parameters during the dive.

4.2. Limitations

The selective absorption of light by water means that data from imagers or radiometers (i.e., spectrometers) are always limited by the operating distance to the seafloor and to wavelength ranges in which losses are minimised. Light


in the near-infrared region of the spectrum (700–900 nm) is commonly used as a strong indicator of both species health and for class differentiation in coral, algae, and grass species. Our spectrometers received minimal signal in this region due to the heavy absorption by the water. A large amount of light flux would be needed to receive a usable signal in this spectral region, which may not be practical when operating from a battery-powered robotic platform.

The second limitation we identified was due in part to the footprint of the spectrometer, which was approximately 30 cm in diameter. This affected the degree of material purity in each reading, which in turn influenced the classification performance: the less pure the sample, the poorer the performance. This effect is known as spectral mixing, and there are methods for unmixing such combinations of material types; however, they are outside the scope of this work.

5. CONCLUSION AND FUTURE WORK

Above-water hyperspectral imaging of the ocean floor (using remote sensing) presents many challenges, including limited spatial resolution, air–water interface artefacts, weather reliance, time-of-day dependence, and depth limitation due to the attenuation of the water column. The ability to reconstruct hyperspectral reflectance information from within the water largely overcomes these limitations. An AUV is an ideal platform for this application: it can travel to depths beyond the euphotic zone, its operations are not heavily dependent on surface weather, and it achieves excellent spatial resolution. Our use of a spectrometer (in contrast to a hyperspectral imager) allowed us to maintain good signal-to-noise ratios while still providing spatial resolutions sufficient for organism classification, at a fraction of the size, power, on-board storage/processing, and cost requirements of an underwater hyperspectral imager.

Future work will examine methods to help resolve sub-footprint abundances of small or rare organisms observed by our system. Processing techniques in the remote-sensing literature known as spectral unmixing (Keshava & Mustard, 2002) allow for resolving the fractional abundances of subpixel constituents (end-members) by relying on a pre-existing spectral library (Bioucas-Dias et al., 2012; Keshava & Mustard, 2002). The high spatial resolution imagery from the stereo cameras on the AUV could also be utilized to assist in resolving the pure end-members for unsupervised creation of a spectral library (Zortea & Plaza, 2009). Through an inversion processing step, the different materials present and their abundances within each footprint could then potentially be resolved.

We used LBP as a feature descriptor for the camera imagery; however, other techniques could be incorporated to add more features to our classifier. The 3D data could be used to assist in coral taxa classification; above-water hyperspectral coral classification methods have been unable to distinguish coral at the species level (Hochberg & Atkinson, 2003). An additional classification feature could be rugosity, which has also been shown to be an ecological indicator of bioproductivity (Friedman, Pizarro, Williams, & Johnson-Roberson, 2012).

Future work could also consider the use of this technique for assessing the health of coral (Holden & LeDrew, 2001) affected by bleaching or invasive algal blooms. Our methods potentially allow for revisiting large areas of reef to examine the small-scale changes that may occur over time.

ACKNOWLEDGMENTS

The authors would like to thank the Australian Institute of Marine Science, the crew of the RV Cape Ferguson, and Christian Lees and Andrew Durrant for their excellent support in collecting the field data, and Roy Hughes and DSTO for their guidance and assistance. The authors acknowledge support from Australia's Integrated Marine Observing System (IMOS) program. IMOS is an initiative of the Australian Government conducted as part of the National Collaborative Research Infrastructure Strategy (NCRIS). The AUV facility is one of the 11 IMOS facilities funded by NCRIS and provides support for the deployment of AUV systems at benthic reference sites around Australia. The authors also acknowledge the support of the Australian Centre for Field Robotics and the University of Sydney.

ADDITIONAL ONLINE RESOURCES

During the AUV operations, a GoPro recorded footage from on board the vehicle; some clips of the region mapped in this paper are shown in the video, which also shows the water conditions, clarity, and light levels. The video may be viewed at http://vimeo.com/84354609.

REFERENCES

Andrefouet, S., Berkelmans, R., Odriozola, L., Done, T., Oliver, J., & Muller-Karger, F. (2002). Choosing the appropriate spatial resolution for monitoring coral bleaching events using remote sensing. Coral Reefs, 21(2), 147–154.

Beijbom, O., Edmunds, P. J., Kline, D. I., Mitchell, B. G., & Kriegman, D. (2012). Automated annotation of coral reef survey images. Paper presented at the IEEE Conference on Computer Vision and Pattern Recognition 2012, Providence, RI (pp. 1170–1177). New York: IEEE.

Bewley, M. S., Nourani-Vatani, N., Rao, D., Douillard, B., Pizarro, O., & Williams, S. B. (2015). Hierarchical classification in AUV imagery. In L. Mejias, P. Corke, & J. Roberts (Eds.), Field and service robotics (pp. 3–16). Berlin: Springer.

Bioucas-Dias, J. M., Plaza, A., Dobigeon, N., Parente, M., Du, Q., Gader, P., & Chanussot, J. (2012). Hyperspectral unmixing overview: Geometrical, statistical, and sparse regression-based approaches. IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, 5(2), 354–379.

Bongiorno, D. L., Bryson, M., Dansereau, D. G., Williams, S. B., & Pizarro, O. (2013a). Spectral characterisation of COTS RGB cameras using a linear variable edge filter. Paper presented at IS&T/SPIE Electronic Imaging, Burlingame, CA. International Society for Optics and Photonics.

Bongiorno, D. L., Fairley, A. J., Bryson, M., & Williams, S. B. (2013b). Automatic spectrometer/RGB camera spatial calibration. Paper presented at Proceedings of the IEEE International Geoscience and Remote Sensing Symposium, Melbourne, Australia. IEEE.

Bryson, M., Johnson-Roberson, M., Pizarro, O., & Williams, S. B. (2016). True color correction of autonomous underwater vehicle imagery. Journal of Field Robotics, 33(6), 853–874.

Campos, R., Garcia, R., Alliez, P., & Yvinec, M. (2015). A surface reconstruction method for in-detail underwater 3D optical mapping. The International Journal of Robotics Research, 34(1), 64–89.

Clement, R., Dunbabin, M., & Wyeth, G. (2005). Toward robust image detection of crown-of-thorns starfish for autonomous population monitoring. Paper presented at Australasian Conference on Robotics and Automation 2005. Australian Robotics and Automation Association, Inc., University of New South Wales, Sydney, Australia.

English, D. C., & Carder, K. L. (2006). Determining bottom reflectance and water optical properties using unmanned underwater vehicles under clear or cloudy skies. Journal of Atmospheric and Oceanic Technology, 23(2), 314–324.

Friedman, A., Pizarro, O., Williams, S. B., & Johnson-Roberson, M. (2012). Multi-scale measures of rugosity, slope and aspect from benthic stereo image reconstructions. PLoS One, 7(12), e50440.

Friedman, A., Steinberg, D., Pizarro, O., & Williams, S. B. (2011). Active learning using a variational Dirichlet process model for pre-clustering and classification of underwater stereo imagery. Paper presented at 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, Los Angeles, CA (pp. 1533–1539).

Friedman, A. L. (2013). Automated interpretation of benthic stereo imagery. Doctoral thesis, University of Sydney, Sydney.

Guild, L., Lobitz, B., Armstrong, R., Gilbes, F., Goodman, J., Detres, Y., . . . & Kerr, J. (2008). NASA airborne AVIRIS and DCS remote sensing of coral reefs. In Proceedings of the Eleventh International Coral Reef Symposium, National Coral Reef Institute, Ft. Lauderdale, Florida, USA (pp. 623–627).

Hartmann, K., de Souza, P., Timms, G., & Davie, A. (2009). Measuring light attenuation with a compact optical emission spectrometer and CTD mounted on a low cost AUV. Paper presented at Oceans 2009 - Europe (Vols. 1 and 2, pp. 540–544). New York: IEEE.

Hedley, J. D., & Mumby, P. J. (2002). Biological and remote sensing perspectives of pigmentation in coral reef organisms. Advances in Marine Biology, 43, 277–317.

Hochberg, E. J., Apprill, A., Atkinson, M., & Bidigare, R. (2006). Bio-optical modeling of photosynthetic pigments in corals. Coral Reefs, 25(1), 99–109.

Hochberg, E. J., & Atkinson, M. J. (2003). Capabilities of remote sensors to classify coral, algae, and sand as pure and mixed spectra. Remote Sensing of Environment, 85(2), 174–189.

Hoegh-Guldberg, O., Mumby, P. J., Hooten, A. J., Steneck, R. S., Greenfield, P., Gomez, E., Harvell, C. D., Sale, P. F., Edwards, A. J., Caldeira, K., Knowlton, N., Eakin, C. M., Iglesias-Prieto, R., Muthiga, N., Bradbury, R. H., Dubi, A., & Hatziolos, M. E. (2007). Coral reefs under rapid climate change and ocean acidification. Science, 318(5857), 1737–1742.

Holden, H., & LeDrew, E. (2001). Hyperspectral discrimination of healthy versus stressed corals using in situ reflectance. Journal of Coastal Research, 17(4), 850–858.

Idris, M., Jean, K., & Zakariya, R. (2009). Hyperspectral discrimination and separability analysis of coral reef communities in Redang Island. Journal of Sustainability Science and Management, 4(2), 36–43.

Johnsen, G., Volent, Z., Dierssen, H., Pettersen, R., Van Ardelan, M., Søreide, F., Fearns, P., Ludvigsen, M., & Moline, M. (2013). Underwater hyperspectral imagery to create biogeochemical maps of seafloor properties. In Subsea optics and imaging (pp. 508–535). Cambridge, UK: Woodhead Publishing.

Keshava, N., & Mustard, J. F. (2002). Spectral unmixing. IEEE Signal Processing Magazine, 19(1).

Kobryn, H. T., Wouters, K., Beckley, L. E., & Heege, T. (2013). Ningaloo Reef: Shallow marine habitats mapped using a hyperspectral sensor. PLoS One, 8(7), e70105.

Koepke, P. (1984). Effective reflectance of oceanic whitecaps. Applied Optics, 23(11), 1816.

MATLAB (2014). Version R2014b. Natick, MA: The MathWorks, Inc.

Mehta, A., Ribeiro, E., Gilner, J., & van Woesik, R. (2007). Coral reef texture classification using support vector machines. Paper presented at International Conference on Computer Vision Theory and Applications, Barcelona, Spain (pp. 302–310).

Mishra, D. R., Narumalani, S., Rundquist, D., Lawson, M., & Perk, R. (2007). Enhancing the detection and classification of coral reef and associated benthic habitats: A hyperspectral remote sensing approach. Journal of Geophysical Research, 112, C08014.

Mobley, C. D. (1994). Light and water: Radiative transfer in natural waters (Vol. 592). New York: Academic Press.

Ojala, T., Pietikainen, M., & Maenpaa, T. (2002). Multiresolution gray-scale and rotation invariant texture classification with local binary patterns. IEEE Transactions on Pattern Analysis and Machine Intelligence, 24(7), 971–987.

Pizarro, O., Rigby, P., Johnson-Roberson, M., Williams, S. B., & Colquhoun, J. (2008). Towards image-based marine habitat classification. In OCEANS 2008, Quebec City, QC, Canada (pp. 1–7). IEEE.


Quan, X., & Fry, E. S. (1995). Empirical equation for the index of refraction of seawater. Applied Optics, 34(18), 3477–3480.

Roberts, T. E., Moloney, J. M., Sweatman, H. P. A., & Bridge, T. C. L. (2015). Benthic community composition on submerged reefs in the central Great Barrier Reef. Coral Reefs, 34(2), 569–580.

Roelfsema, C. M., & Phinn, S. R. (2012). Spectral reflectance library of selected biotic and abiotic coral reef features in Heron Reef. Technical report, Centre for Remote Sensing & Spatial Information Science, School of Geography, Planning & Environmental Management, University of Queensland, Brisbane, Australia.

Savitzky, A., & Golay, M. J. E. (1964). Smoothing and differentiation of data by simplified least squares procedures. Analytical Chemistry, 36(8), 1627–1639.

Singh, H., Armstrong, R., Gilbes, F., Eustice, R., Roman, C., Pizarro, O., & Torres, J. (2004). Imaging coral I: Imaging coral habitats with the SeaBED AUV. Subsurface Sensing Technologies and Applications, 5(1), 25–42.

Spillane, M. C., Monahan, E. C., Bowyer, P. A., Doyle, D. M., & Stabeno, P. J. (1986). Whitecaps and global fluxes. In E. Monahan, & G. Niocaill (Eds.), Oceanic whitecaps, volume 2 of Oceanographic Sciences Library (pp. 209–218). Dordrecht, The Netherlands: Springer.

Torres-Mendez, L. A., & Dudek, G. (2005). A statistical learning-based method for color correction of underwater images. Research on Computer Science, 17(10).

Vasilescu, I., Detweiler, C., & Rus, D. (2011). Color-accurate underwater imaging using perceptual adaptive illumination. Autonomous Robots, 31(2–3), 285.

Volent, Z., Johnsen, G., & Sigernes, F. (2007). Kelp forest mapping by use of airborne hyperspectral imager. Journal of Applied Remote Sensing, 1(1), 011503.

Wang, K., Franklin, S. E., Guo, X., & Cattet, M. (2010). Remote sensing of ecology, biodiversity and conservation: A review from the perspective of remote sensing specialists. Sensors (Basel, Switzerland), 10(11), 9647–9667.

Wilkinson, C. (2008). Status of coral reefs of the world: 2008. Global Coral Reef Monitoring Network and Reef and Rainforest Research Center, Townsville, Australia.

Williams, S., Pizarro, O., Jakuba, M., Johnson, C., Barrett, N., Babcock, R., Kendrick, G., Steinberg, P., Heyward, A., Doherty, P., Mahon, I., Johnson-Roberson, M., Steinberg, D., & Friedman, A. (2012). Monitoring of benthic reference sites: Using an autonomous underwater vehicle. IEEE Robotics and Automation Magazine, 19(1), 73–84.

Williams, S. B., Pizarro, O., Webster, J. M., Beaman, R. J., Mahon, I., Johnson-Roberson, M., & Bridge, T. C. L. (2010). Autonomous underwater vehicle-assisted surveying of drowned reefs on the shelf edge of the Great Barrier Reef, Australia. Journal of Field Robotics, 27(5), 675–697.

Zortea, M., & Plaza, A. (2009). Spatial preprocessing for endmember extraction. IEEE Transactions on Geoscience and Remote Sensing, 47(8), 2679–2693.

Journal of Field Robotics DOI 10.1002/rob