
PAPER

An Overview of Autonomous Underwater Vehicle Research and Testbed at PeRL

AUTHORS
Hunter C. Brown
Ayoung Kim
Ryan M. Eustice
University of Michigan

ABSTRACT

This article provides a general overview of the autonomous underwater vehicle (AUV) research thrusts being pursued within the Perceptual Robotics Laboratory (PeRL) at the University of Michigan. Founded in 2007, PeRL’s research centers on improving AUV autonomy via algorithmic advancements in environmentally based perceptual feedback for real-time mapping, navigation, and control. Our three major research areas are (1) real-time visual simultaneous localization and mapping (SLAM), (2) cooperative multi-vehicle navigation, and (3) perception-driven control. Pursuant to these research objectives, PeRL has developed a new multi-AUV SLAM testbed based upon a modified Ocean-Server Iver2 AUV platform. PeRL upgraded the vehicles with additional navigation and perceptual sensors for underwater SLAM research. In this article, we detail our testbed development, provide an overview of our major research thrusts, and put into context how our modified AUV testbed enables experimental real-world validation of these algorithms.

Keywords: AUVs, SLAM, navigation, mapping, testbed

Introduction

The Perceptual Robotics Laboratory (PeRL) at the University of Michigan (UMich) is actively involved in three major research efforts: real-time vision-based simultaneous localization and mapping (SLAM), heterogeneous multi-vehicle cooperative navigation, and perception-driven control. To test and experimentally validate these algorithms, we have developed a new multi-autonomous underwater vehicle (AUV) testbed based upon a modified Ocean-Server Iver2 commercial AUV platform. This new AUV testbed provides a simple man-portable platform for real-world experimental validation and serves as a dedicated engineering testbed for proof-of-concept algorithmic implementations. In this article, we report on our developments in this area and provide an overview of this new experimental facility (Figure 1).

Overview of Underwater Navigation

One of the major limitations in the field of underwater robotics is the lack of radio-frequency transmission modes. The opacity of water to electromagnetic waves precludes the use of the global positioning system (GPS) as well as high-speed underwater radio communication (Stewart, 1991). Hence, communication and navigation underwater must rely upon other means. Kinsey et al. (2006) provided an overview of the current state-of-the-art in underwater vehicle navigation, which we briefly summarize here.

Conventional Underwater Navigation Systems

Two broad categories of underwater navigation methods exist for localizing vehicles and instruments: absolute positioning and relative dead-reckoning. The traditional long-baseline (LBL) method of underwater positioning estimates absolute position by measuring time-of-flight ranges to fixed beacons (Hunt et al., 1974; Milne, 1983). The precision of this estimate is bounded, and the accuracy is determined by system biases. The range of this solution is limited to a few kilometers in the best acoustic conditions, and the positioning resolution is on the order of 1 m. The slow update rate of LBL is constrained by the acoustic travel times—typically updating every few seconds. In contrast to slow, coarse, but absolute LBL positioning, a Doppler velocity log (DVL) or inertial navigation system (INS) instead estimates the distance traveled to infer position. Dead-reckoning is fast (∼10 Hz) and delivers fine resolution (∼1 cm), but the precision of this relative measurement is unbounded, growing monotonically with time. This makes it difficult to return to a known location or to relate measurements globally to one another.

FIGURE 1

PeRL’s multi-AUV testbed is based upon a modified Ocean-Server Iver2 AUV platform. Shown in the foreground is a modified vehicle displaying a new nose cone designed and fabricated by PeRL. For comparison, a stock vehicle displaying the original nose cone is shown in the background.

Spring 2009 Volume 43, Number 2 33
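The contrast above between bounded LBL error and unbounded dead-reckoning drift can be illustrated with a few lines of code. This is a one-dimensional toy; the 10 Hz rate and 0.01 m/s noise level are illustrative assumptions, not the specification of any particular DVL or INS:

```python
import random

def dead_reckon(velocities, dt=0.1, sigma=0.01):
    """Integrate noisy 1-D body velocities into a position estimate.

    Each sample is corrupted by zero-mean noise of sigma m/s; because
    the samples are integrated, the position error is a random walk
    whose uncertainty grows without bound.
    """
    x = 0.0
    for v in velocities:
        x += (v + random.gauss(0.0, sigma)) * dt
    return x

# One hour of travel at a true 1 m/s, sampled at 10 Hz:
random.seed(0)
estimate = dead_reckon([1.0] * 36000)
error = abs(estimate - 3600.0)  # meters of accumulated drift
```

Repeating the run over longer durations shows the drift growing roughly with the square root of mission time, which is why a vehicle relying on dead-reckoning alone cannot reliably return to a known location.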

Underwater SLAM

Over the past decade, a significant research effort within the terrestrial mobile robotics community has been to develop environmentally based navigation algorithms that eliminate the need for additional infrastructure and bound position error growth to the size of the environment—a key prerequisite for truly autonomous navigation. The goal of this work has been to exploit the perceptual sensing capabilities of robots to correct for accumulated odometric error by localizing the robot with respect to landmarks in the environment (Bailey and Durrant-Whyte, 2006; Durrant-Whyte and Bailey, 2006).

One of the major challenges of the SLAM problem is (1) defining fixed features from raw sensor data and (2) establishing measurement-to-feature correspondence [i.e., the problem of data association (Neira and Tardos, 2001)]. Both of these tasks can be nontrivial—especially in an unstructured underwater environment. In man-made environments, typically composed of plane, line, and corner primitives, point features can be more easily defined; however, complex underwater environments pose a more challenging task for feature extraction and matching.

One approach to curbing the challenges of defining fixed features from raw sensor data is to seed the environment with artificial landmarks that are easily recognizable. For example, one practical application of range-only underwater SLAM has been to localize an AUV in an acoustic-beacon network that has a priori unknown geometry (Newman and Leonard, 2003; Olson et al., 2006). In this scenario, the beacon geometry is learned online by the SLAM algorithm, eliminating the need to conduct an initial survey calibration of the acoustic-beacon network by a surface ship. Moreover, this paradigm can easily be extended to a scenario where the AUV self-deploys the acoustic beacons in situ over the survey site.

More recently, progress has been made in applying SLAM in an a priori unknown underwater environment without the aid of artificial landmarks. In particular, one SLAM methodology that has seen recent success in the benthic realm is to apply a pose-graph scan-matching approach (Fleischer, 2000; Eustice et al., 2006a). Pose-graph SLAM approaches do not require an explicit representation of features and instead use a data-driven approach based upon extracting relative-pose constraints from raw sensor data. The main idea behind this methodology is that registering overlapping perceptual data, for example, optical imagery as reported by Eustice et al. (2006a, 2006b) or sonar bathymetry as reported by Roman and Singh (2005), introduces spatial drift-free edge constraints into the pose-graph. These spatial constraints effectively allow the robot to close the loop when revisiting a previously visited place, thereby resetting any accumulated dead-reckoning error.
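The loop-closure idea can be sketched with a toy one-dimensional pose-graph. The example below is a deliberately simplified stand-in for the 6-DOF estimators cited above: poses are scalars, odometry edges come from dead-reckoning, and a single drift-free loop-closure edge (of the kind produced by registering overlapping imagery) pulls the revisited pose back toward its true location. The gradient-descent solver and all numbers are illustrative assumptions:

```python
def optimize_pose_graph(odometry, loop, iters=2000, lr=0.1):
    """Least-squares optimization of a toy 1-D pose-graph.

    odometry: relative-motion measurements o_k for edges k -> k+1
    loop:     (i, j, d), a loop-closure edge asserting x_j - x_i = d
    Minimizes the summed squared edge residuals by gradient descent,
    anchoring x_0 at the origin.
    """
    n = len(odometry) + 1
    x = [0.0]
    for o in odometry:           # initialize by dead-reckoning
        x.append(x[-1] + o)
    i, j, d = loop
    for _ in range(iters):
        g = [0.0] * n
        for k, o in enumerate(odometry):
            r = x[k + 1] - x[k] - o
            g[k + 1] += 2.0 * r
            g[k] -= 2.0 * r
        r = x[j] - x[i] - d
        g[j] += 2.0 * r
        g[i] -= 2.0 * r
        for k in range(1, n):    # x_0 stays anchored
            x[k] -= lr * g[k]
    return x

# Odometry with a +10% scale drift claims 4.4 m of net motion, but a
# drift-free loop closure says the vehicle returned to its start:
poses = optimize_pose_graph([1.1, 1.1, 1.1, 1.1], (0, 4, 0.0))
```

Here the dead-reckoned endpoint drifts to 4.4 m, while the optimized estimate is pulled back toward the start (to 0.88 m under these equal edge weights); weighting the loop-closure edge more heavily, or adding more such edges, shrinks the residual drift further.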

Overview of PeRL’s Research Thrusts

PeRL’s three major research areas are (1) real-time visual SLAM, (2) cooperative multi-vehicle navigation, and (3) perception-driven control. In this section, we provide an overview of our laboratory’s research thrusts as they pertain to AUV algorithms.

Real-Time Visual SLAM

The first of the three PeRL research domains, real-time vision-based SLAM algorithms, has direct application to areas such as autonomous underwater ship-hull inspection (Eustice, 2008; Kim and Eustice, 2009) and deep-sea archaeological missions (Ballard, 2008; Foley et al., 2009).

Present-day means for ship-hull and port security inspection require either putting divers in the water or piloting a remotely operated vehicle (ROV) over the area of interest—both of these methods are manpower intensive and generally cannot quantitatively guarantee 100% survey coverage. Automating this task, however, is challenging and compounded by the fact that areas around ships in berth are severely confined, cluttered, and complex sensing environments (e.g., acoustically, optically, magnetically). Current tethered robotic inspection systems present issues of snagging, maneuver degradation, and tether management, all of which make maneuvering around the ship at the pier difficult. Moreover, current robotic inspection methods require human-in-the-loop intervention for both sensory interpretation and control (e.g., ROV piloting). Navigation feedback in these scenarios is typically performed using acoustic transponder time-of-flight ranging (Smith and Kronen, 1997). This necessitates the setup and calibration of the acoustic-beacon infrastructure and therefore vitiates our ability to rapidly and repeatedly inspect multiple underwater structures.

Similarly, deep-sea archaeology also requires high-performance navigation (Ballard, 2008; Foley et al., 2009). Optical imagery, bathymetric sonar, and environmental measurements are all products of interest to the archeologist. The data may be collected over multiple missions or even multiple field seasons, and it is the precision of the navigation that makes it possible to transform these measurements into co-registered maps.

To combat the aforementioned navigation limitations (i.e., infrastructure-based and unbounded error growth), PeRL has been developing a camera-based navigation system that uses vehicle-collected imagery of the hull (port/hull inspection) or seafloor (underwater archeology) to extract measurements of vehicle motion. These camera-derived spatial measurements are fused with the onboard dead-reckoned data to produce a bounded-error navigation estimate (Figure 2).

In essence, the AUV builds a digital map of the seafloor or hull by registering overlapping digital-still images (both along-track and cross-track imagery). Images that are successfully registered produce a relative measurement of the vehicle’s attitude (heading, pitch, and roll) and translational (x, y, z) displacement. When fused with the onboard navigation data from a bottom-lock DVL, the result is a navigation system whose error is commensurate with or much better than LBL but which is infrastructure free. The significant advantage of this navigation paradigm is that it is in situ. For archeological surveys, this means that the AUV can be more easily deployed for exploratory surveys to investigate target shipwreck sites—without having to invest significant ship time in deploying a beacon network to obtain precision navigation; and for hull inspection surveys, no additional acoustic-beacon infrastructure is required for precise global localization along the hull. In layman’s terms, these algorithms allow the AUV to navigate much like a human does by visually navigating with respect to the seafloor environment.

FIGURE 2

The foundation of visually augmented navigation (Eustice et al., 2008) is the fusion of “zero-drift” camera measurements with dead-reckoned vehicle navigation data to produce a bounded-error position estimate. These constraints are fused with onboard navigation sensor data in a view-based stochastic map framework; the model is composed of a pose-graph where the nodes correspond to historical robot poses and the edges represent either navigation or camera constraints.

A useful and important by-product of this navigation methodology is that the overlapping registered imagery can be used to construct an optically derived bathymetry map (Figure 3). This map can be used to construct a quantitatively accurate three-dimensional (3-D) photomosaic by back-projecting the imagery over the optical bathymetry map. Figure 3 displays the result of applying this technology to the fourth-century B.C. Chios classical ancient shipwreck site (Foley et al., 2009). In particular, Figure 3a shows the optically derived bathymetry map for a 15 m by 45 m swath centered overtop the wreck site. The optical bathymetry is generated from a 3-D triangulated point cloud derived from pixel correspondences. In Figure 3b, we display a quantitatively accurate 3-D photomosaic obtained by back-projecting the imagery onto the gridded surface. It should be emphasized that this result is fully automatic and metrically quantitative; in other words, measurements of object size and geometric relationships can be derived. Although this technology is still very much in the active research stage, its current and future value for in situ, rapid, quantitative documentation of marine archeological wreck sites or ship-hull inspection cannot be overstated.
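The gridding step behind the optically derived bathymetry can be sketched as a simple binning operation. This is a minimal illustration, not the actual processing pipeline: it assumes the triangulated points are already expressed in a local metric frame, and it simply averages the depth of all points falling in each 5-cm cell (outlier rejection and interpolation are omitted):

```python
from collections import defaultdict

def grid_bathymetry(points, cell=0.05):
    """Bin a 3-D point cloud into a regular bathymetry grid.

    points: iterable of (x, y, z) tuples in meters; cell: grid
    resolution (5 cm, as in the Chios survey). Each occupied cell
    stores the mean depth of the points that fall inside it.
    """
    cells = defaultdict(list)
    for x, y, z in points:
        cells[(int(x // cell), int(y // cell))].append(z)
    return {key: sum(zs) / len(zs) for key, zs in cells.items()}

# Three triangulated points: two fall in the same 5-cm cell and are
# averaged; the third lands in a different cell:
pts = [(0.01, 0.01, 10.0), (0.02, 0.03, 10.2), (0.30, 0.01, 11.0)]
bathy = grid_bathymetry(pts)
```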

Cooperative Navigation

In addition to real-time visual SLAM, PeRL is working toward cooperative multi-vehicle missions for large-area survey. Multi-vehicle cooperative navigation offers promise of efficient exploration by groups of mobile robots working together to pool their mapping capability. Most prior research in the SLAM community has focused on the case of single-agent mapping and exploration. Although these techniques can often be extended to a centralized multi-agent framework (Walter and Leonard, 2004) (provided that there are no communication bandwidth restrictions), the extension of single-agent techniques to a decentralized multi-vehicle SLAM framework is often neither obvious nor appropriate. Much of the previous research in the area of distributed multi-vehicle SLAM has focused primarily on terrestrial (i.e., land and aerial) applications (Williams et al., 2002; Ridley et al., 2002; Rekleitis et al., 2003; Bourgault et al., 2004; Ong et al., 2006; Partan et al., 2006). There, high-bandwidth radio communication is possible; however, underwater communication bandwidth is distinctly limited from that on land (Partan et al., 2006).

FIGURE 3

A by-product of camera-based AUV navigation is the ability to produce an optically derived bathymetry. VAN-derived bathymetric maps from the Chios 2005 shipwreck survey are depicted (Foley et al., 2009). (a) Triangulated 3-D point cloud from registered imagery. The point cloud is gridded to 5 cm to produce an optically derived bathymetry map. (b) A quantitatively correct 3-D mosaic is generated by back-projecting the imagery onto the optical bathymetry map. The gray areas correspond to regions of no image overlap. (c) A zoomed overhead view shows detail of the amphorae pile in the center mound of the 3-D photomosaic.

It requires on the order of 100 times more power to transmit than it does to receive underwater, which makes acoustic transmission and reception asymmetrical for medium access schemes (Partan et al., 2006). Half-duplex time-division multiple-access networks are the norm, with typical acoustic-modem data rates ranging from 5 kbits/s at a range of 2 km (considered a high rate) to as little as 80 bits/s (a low rate). The low acoustic data rates are not simply a limitation of current technology—the theoretical performance limit for underwater acoustic communications is a range-rate product of roughly 40 km·kbps (e.g., a maximum theoretical data rate of 20 kbps at a range of 2 km) (Partan et al., 2006). Therefore, any type of multi-vehicle SLAM framework must adhere to the physical bandwidth limitations of the underwater domain.
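The range-rate trade-off quoted above can be captured in one line. The helper below simply encodes the ~40 km·kbps rule-of-thumb product cited from Partan et al. (2006); the function name and interface are ours:

```python
def max_acoustic_rate_kbps(range_km, product_km_kbps=40.0):
    """Rule-of-thumb ceiling on underwater acoustic data rate.

    Encodes the ~40 km*kbps range-rate product: the achievable rate
    falls off inversely with the acoustic range.
    """
    return product_km_kbps / range_km

# At a 2 km range the bound is 20 kbps, well above the 5 kbit/s that
# practical modems achieve at that range:
rate = max_acoustic_rate_kbps(2.0)
```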

In previous work, Eustice et al. (2006c, 2007) developed a synchronous-clock acoustic-modem-based navigation system capable of supporting multi-vehicle ranging. The system consisted of a Woods Hole Oceanographic Institution (WHOI) Micro-Modem (Freitag et al., 2005a, 2005b) (an underwater acoustic modem developed by WHOI) and a low-power stable clock board. This system can be used as a synchronous-transmission communication/navigation system wherein data packets encode time-of-origin information as well as local ephemeris data (e.g., x, y, z positional data and error metric). This allows for the direct measurement of inter-vehicle one-way travel-time (OWTT) time-of-flight ranging. The advantage of an OWTT ranging methodology is that all passively receiving nodes within listening range are able to decode and measure the inter-vehicle range to the broadcasting node.
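The OWTT measurement itself reduces to a clock difference multiplied by sound speed, sketched below under the idealized assumptions of perfectly synchronized clocks and a constant 1500 m/s sound speed:

```python
SOUND_SPEED = 1500.0  # m/s; nominal value, varies with temperature/salinity/depth

def owtt_range(t_origin, t_arrival, sound_speed=SOUND_SPEED):
    """Range from a one-way travel-time measurement.

    With synchronized clocks, the broadcast packet's encoded time of
    origin lets every passive listener convert its arrival time into a
    range to the broadcaster, so one transmission ranges all listeners.
    """
    return (t_arrival - t_origin) * sound_speed

# A packet transmitted at t = 0.0 s and received 0.5 s later
# corresponds to a 750 m slant range:
r = owtt_range(0.0, 0.5)
```

In practice the sound speed varies along the acoustic path, and any residual clock offset enters directly as a range error: at 1500 m/s, each millisecond of offset corresponds to 1.5 m of range.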

PeRL is currently investigating probabilistic fusion methods for an OWTT cooperative navigation multi-vehicle framework that scales across a distributed network of nodes in a non-fully connected network topology. The proposed acoustic-modem navigation framework will exploit inter-vehicle OWTT ranging to supplement perceptual SLAM localization, thereby reducing the need for state communication. The goal is to distribute state estimation between the vehicles in a coordinated fashion, allowing navigation-impoverished vehicles (e.g., no INS or DVL) to benefit from the positional accuracies of better-equipped vehicles (e.g., those with DVL bottom-lock, or VAN-navigated vehicles near the seafloor).

Figure 4 depicts preliminary results reported by Eustice et al. (2007), demonstrating the OWTT proof of concept. Here, a GPS-equipped surface ship navigationally aided a submerged AUV by broadcasting the ship GPS position to the network while the AUV measured its range to the ship via the OWTTs.

Perception-Driven Control

Another research focus is in the domain of perception-driven control. Algorithms are under development to enable a vehicle to respond to the environment by autonomously selecting alternative search patterns based upon perceived feature distributions in the environment. This improves survey efficiency by limiting the time spent in benign, feature-poor areas and instead spending more bottom time over actual targets. A seafloor survey vehicle, for example, may drive into an area devoid of features during a mission. Instead of continuing to search the featureless space, where there is little return on investment from a visual navigation system, the vehicle would return to a previously known feature-rich area and begin searching in another direction. PeRL is currently working on algorithms to assist in the decision-making process of when to revisit known landmarks versus continuing new exploration.

For example, Figure 5 depicts imagery from a hull inspection survey of a decommissioned aircraft carrier (Kim and Eustice, 2009). At one extreme, we see that some areas of the hull are very texture-rich (heavily biofouled), whereas other areas are nearly featureless (no distinguishing features such as weld seams, rivets, port openings, etc.). In this type of environment, it makes no sense to incorporate imagery from the featureless region into the visual SLAM map, because the image registration algorithm will fail to localize the vehicle for lack of visual features. Instead, by coupling the visual navigation feedback into the trajectory planning and control, the AUV can more intelligently adapt its survey and map-building strategy so as to only return to feature-rich areas of the hull when it accrues too much pose uncertainty. By jointly considering along-hull pose uncertainty, map feature content, and energy expenditure accrued during AUV transit, we can frame the online path planning problem as a multi-objective optimization that seeks an optimal trajectory with respect to the possibly divergent criteria of exploration versus localization.

FIGURE 4

Preliminary OWTT results as reported by Eustice et al. (2007) for a two-node network consisting of an AUV and a surface ship. (left) The raw dead-reckoned AUV trajectory is shown in blue, GPS-derived ship position in cyan, OWTT-fused AUV trajectory in red, and the LBL-measured AUV position in green, which serves as an independent ground truth. (right) Zoomed view of the AUV trajectory. (Color versions of figures are available online at: http://www.ingentaconnect.com/content/mts/mtsj/2009/00000043/00000002.)
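The exploration-versus-localization trade-off can be sketched as a scalarized multi-objective score. Everything below is hypothetical: the field names, weights, and linear scalarization are our illustrative assumptions, not PeRL's planner. It simply shows how predicted pose uncertainty, new coverage, and transit energy can be traded against one another:

```python
def plan_score(path, w_unc=1.0, w_explore=1.0, w_energy=0.2):
    """Scalarized multi-objective score for a candidate survey path.

    path is a dict with hypothetical fields:
      'pose_uncertainty': predicted uncertainty after executing the path
      'new_area':         previously unmapped area the path would cover
      'transit_energy':   energy spent getting there
    Lower is better; the weights trade localization against exploration.
    """
    return (w_unc * path['pose_uncertainty']
            - w_explore * path['new_area']
            + w_energy * path['transit_energy'])

def choose_path(candidates):
    """Pick the candidate path with the best (lowest) score."""
    return min(candidates, key=plan_score)

# With high accumulated uncertainty, revisiting a known feature-rich
# area beats pushing on into featureless space:
explore = {'pose_uncertainty': 5.0, 'new_area': 3.0, 'transit_energy': 1.0}
revisit = {'pose_uncertainty': 0.5, 'new_area': 0.0, 'transit_energy': 2.0}
best = choose_path([explore, revisit])
```

Tuning the weights shifts the balance point: a heavier exploration weight keeps the vehicle surveying new ground longer before it turns back to relocalize.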

Overview of PeRL’s AUV Testbed

To pursue PeRL’s three research thrust areas, two commercial Ocean-Server Iver2 AUV systems were purchased and modified to serve as a real-world testbed platform for SLAM research at UMich. Although several other vehicle platforms currently include stereo-vision systems and DVL sensors, the Iver2 (Figure 6) was selected as a testbed development platform because of its ability to be transported in a personal vehicle and launched by a single user. The vehicles, as shipped, are rated to a depth of 100 m, have a maximum survey speed of approximately 2 m/s (4 knots), and weigh approximately 30 kg, allowing for transport by two people (Anderson and Crowell, 2005).

FIGURE 5

Depiction of the multi-objective constraints at play in perception-driven control.

FIGURE 6

PeRL’s modified Ocean-Server Iver2: external and internal view.

Because the commercial off-the-shelf Iver2 vehicle does not come equipped with camera or DVL sensing, sensor upgrades were required to enable the stock vehicle to perform SLAM and coordinated multi-AUV missions. PeRL upgraded the vehicles with additional navigation and perception sensors (detailed in Figure 6b and Table 1), including 12-bit stereo down-looking Prosilica cameras, a Teledyne RD Instruments (RDI) 600 kHz Explorer DVL, a KVH Industries, Inc. (KVH) single-axis fiber-optic gyroscope (FOG), a Microstrain 3DM-GX1 attitude-heading-reference sensor, a Desert Star SSP-1 digital pressure sensor, and a WHOI Micro-modem for inter-vehicle communication. For illumination, we contracted the custom design and fabrication of an LED array by Farr and Hammar Engineering Services, LLC. To accommodate the additional sensor payload, a new Delrin nose cone was designed and fabricated. Additional 32-bit embedded PC104 CPU hardware was added for data-logging, real-time control, and in situ real-time SLAM algorithm testing and validation. Details of the design modification are discussed herein.

Mechanical/Electrical Design and Integration

The design goals during the integration phase of vehicle development consisted of minimizing the hydrodynamic drag, maintaining neutral buoyancy, and maximizing the sensor payload capacity within the pressure hull. These requirements were achieved

TABLE 1

Integrated sensors on the PeRL vehicles.

Instrument | Variable | Update Rate | Precision | Range | Drift

Iver2 Instruments:
Ocean-Server OS5000 Compass | Attitude | 0.01-40 Hz | 1-3° (Hdg), 2° (Roll/Pitch) | 360° | –
Measurement Specialties Pressure Sensor MSP-340 | Depth | 0.01-40 Hz | <1% of FS† | 0-20.4 atm | –
Imagenex Sidescan Sonar (Dual Freq.) | Sonar image | 330 or 800 kHz | – | 15-120 m | –
USGlobalSat EM-406a GPS | XYZ position | 1 Hz | 5-10 m | – | –

New Instruments:
Prosilica GC1380H(C) Camera (down-looking stereo-pair) | Gray/color image | 0.1-20 fps | 1360 × 1024 pixels, 12-bit depth | – | –
Strobed LED Array‡ | Illumination | 0.1-4 fps | – | 2-4 m altitude | –
Teledyne RDI 600 kHz Explorer DVL | Body velocity | 7 Hz | 1.2-6 cm/s (at 1 m/s) | 0.7-65 m | –
KVH DSP-3000 1-Axis FOG | Yaw rate | 100 Hz | 1-6°/h | ±375°/s | 4°/h/√Hz
Desert-Star SSP-1 300PSIG Digital Pressure Transducer | Depth | 0.0625-4 Hz | 0.2% of FS† | 0-20.4 atm | –
Applied Acoustics USBL | XYZ position | 1-10 Hz | ±0.1 m (Slant Range) | 1000 m | –
OWTT* Nav (Modem + PPS) | Slant range | Varies | 18.75 cm (at 1500 m/s) | Varies | <1.5 m/14 h
• WHOI Micro-modem | Communication | Varies | – | Varies | –
• Seascan SISMTB v4 PPS Clock | Time | 1 Hz | 1 µs | – | <1 ms/14 h
Microstrain 3DM-GX1 AHRS | Attitude | 1-100 Hz | 2° (Hdg), 2° (Roll/Pitch) | 360° | –
Microstrain 3DM-GX1 AHRS | Angular rate | 1-100 Hz | 3.5°/s | ±300°/s | 210°/h/√Hz

†Full scale. ‡In development. *One-way travel time.


through the use of lightweight materials such as acrylonitrile butadiene styrene (ABS), Delrin, and aluminum, and through careful center-of-buoyancy and center-of-mass computations. The entire vehicle was modeled using SolidWorks solid modeling software, and the extensive use of these computer-aided design (CAD) models provided optimal arrangements of internal components prior to actual installation (Figure 7).

The addition of a redesigned SLAM nose cone and sensor payload shifted both the original center of buoyancy and center of gravity. New positions were estimated using the CAD models and refined during ballast tests at the UMich Marine Hydrodynamics Laboratory (MHL). The vehicle is ballasted to achieve approximately 0.11 kg (0.25 lbs) of reserve buoyancy for emergency situations when the vehicle must surface without power. Vehicle trim is set neutral to achieve passive stability and to optimize both diving and surfacing operations.

The interior component arrangement within the main body took into account mass, geometry, heat dissipation, and electrical interference considerations when determining the spatial layout and arrangement of sensors, computing, and electronics in the main tube. Because of the high density of sensors and other devices in the pressure housing, the components with the highest heat radiation, such as computers and DC-DC converters, were placed in direct contact with the aluminum chassis to allow better heat dissipation. Also, sensors that are prone to electrical noise from surrounding electronics were spatially separated in the layout (e.g., the Micro-Electro-Mechanical Systems (MEMS) Microstrain 3DM-GX1 is located in the nose cone tip, the furthest point from the motor and battery pack influence).

Electrically, the vehicle is powered by a 665 Wh Li-ion battery pack made up of seven 95 Wh laptop batteries. This battery pack is managed by an Ocean-Server Intelligent Battery and Power System module. The additional sensors and PC104 computing added by PeRL draw less than 40 W total. This load is in addition to the original 12 W nominal vehicle hotel load and 110 W propulsion load of the stock Iver2, which results in a combined maximum total power draw of approximately 162 W (this assumes full hotel load and the motor at full power). The estimated run time at full load is 4.1 h; however, taking into account a more realistic assessment of propulsion load for photo transect survey conditions, the continuous run time will be closer to 5.2+ h at a 2 knot (i.e., 75 W propulsion) survey speed.
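The endurance figures above follow directly from the pack capacity and the load breakdown; the small helper below reproduces the arithmetic:

```python
PACK_WH = 665.0  # Li-ion pack capacity quoted in the text, in Wh

def run_time_h(hotel_w=12.0, payload_w=40.0, propulsion_w=110.0):
    """Endurance estimate: pack energy divided by total power draw."""
    return PACK_WH / (hotel_w + payload_w + propulsion_w)

full_power = run_time_h()               # 162 W total draw -> ~4.1 h
survey = run_time_h(propulsion_w=75.0)  # 2-knot survey    -> ~5.2 h
```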

SLAM Nose Cone

In order to support the real-time visual SLAM objectives of PeRL, a down-looking stereo-vision system was added to the Iver2 vehicles. A dual-camera arrangement was chosen because it provides greater flexibility for visual SLAM research. For example, using a stereo-rig allows for the measurement of metrical scale information during pairwise image registration and, thereby, can be used to improve odometry estimation by observing velocity scale error in DVL bottom-track measurements. Additionally, the dual-camera arrangement can be used to run one of the cameras in a low-resolution mode suitable for real-time image processing (e.g., VGA or lower), whereas the second camera can be run at its native 1.6-megapixel resolution for offline processing. At depth, illumination will come from an LED array mounted at the aft of the vehicle. The array will provide optimal illumination over a 2-4 m imaging altitude at up to a 4 fps strobe rate with a camera-to-light separation distance of approximately 1.25 m.

To accommodate the two-camera vision system and the DVL transducer, a new nose cone was designed and fabricated. The UMich custom-designed nose cone (Figure 8) was fabricated from Acetron GP (Delrin) because of the material’s high tensile strength, scratch resistance, fatigue endurance, low friction, and low water absorption. Threaded inserts are installed in the nose cone to prevent stripped threads, and stainless fasteners with a polytetrafluoroethene paste (to prevent corrosion issues) are used in all locations.

FIGURE 7

Mechanical layout.


The designed working depth of the nose cone is 100 m (to match the full rating of the Iver2). Calculations were performed according to the ASME Section VIII Boiler and Pressure Vessel Code to verify the wall thickness in each of the nose cone sections. A minimum factor of safety of 2.64 was attained across all sections of the nose cone. Pressure tests, conducted at Woods Hole Oceanographic Institution, demonstrated the structural integrity of the nose cone to 240 m water depth. Three short-duration tests of 12 min each were made to 24.5 atm (240 m salt water equivalent), and one long-duration test of 5 h was made to 24.5 atm.
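The quoted test pressure can be sanity-checked with the hydrostatic relation p = ρgh (a sketch; the assumed 1025 kg/m³ seawater density and the gauge-pressure interpretation are ours):

```python
RHO_SW = 1025.0    # kg/m^3, nominal seawater density (assumption)
G = 9.81           # m/s^2
ATM_PA = 101325.0  # Pa per standard atmosphere

def depth_for_gauge_pressure(p_atm):
    """Salt-water depth equivalent of a gauge pressure given in atm."""
    return p_atm * ATM_PA / (RHO_SW * G)

depth = depth_for_gauge_pressure(24.5)  # ~247 m, consistent with the quoted 240 m equivalent
```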

The Teledyne RDI 600 kHz Explorer DVL is integrated into the nose cone using fasteners to attach the DVL head to threaded inserts in the nose cone material. The limited internal cavity space of the Iver2 nose precludes the use of RDI's recommended clamp attachment method. Instead, self-sealing fasteners are used to eliminate a fluid path through the mounting holes of the DVL to the interior of the nose cone. The associated DVL electronics module is mounted in the main chassis of the vehicle just behind the forward bulkhead. The electronics module and transducer head are connected by RDI's recommended shielded twisted-pair cabling to reduce stray electromagnetic field effects.

Two nose cone plugs were designed for camera integration. These plugs each include a sapphire window and two mounting brackets (Figure 8). The synthetic sapphire window was custom designed and fabricated by the Optarius Company of the U.K. Sapphire was chosen because of its high scratch resistance and tensile strength superior to that of plastic or glass materials. The mounting brackets were designed in CAD and printed in ABS plastic using a Dimension Fused Deposition Modeling (FDM) Elite rapid prototype machine. Static face and edge o-ring seals prevent water ingress through the plug around the sapphire window.

A Desert Star SSP-1 pressure transducer is mounted to an internal face of the nose cone and is exposed to the ambient environment through a 1/8 in. shaft drilled perpendicular to the nose cone wall to reduce the flow-noise influence on the sensor. The Microstrain 3DM-GX1 is integrated into the nose cone tip by mounting the Ocean-Server OS5000 compass on top of the 3DM-GX1 and milling a cavity in the tip to allow additional vertical clearance. All o-rings installed in the nose cone are of Buna-N (acrylonitrile-butadiene) material and are lightly lubricated with Dow Corning #4 prior to installation.

Mission Planning and Control

The stock Iver2 control computer is a 500 MHz AMD Geode CPU running Windows XP Embedded as the operating system. Ocean-Server provides a Graphical User Interface (GUI) mission planning module called VectorMap (Figure 11) and vehicle control software called Underwater Vehicle Control (UVC). The VectorMap mission planning software allows the user to graphically lay out a mission trajectory over a geo-registered image or nautical navigation chart and to specify parameters such as depth, speed, goal radius, timeout, and other attributes for each waypoint. The output of VectorMap is an ASCII waypoint file that the UVC loads and executes within a Proportional-Integral-Derivative (PID) control framework (Leveille, 2007). Additionally, the UVC supports a backseat driver Application Programming Interface (API) for user-level control from a host client. This API supports two control primitives: (1) a

FIGURE 8

(top) Exploded and transparent view of PeRL's redesigned nose cone. (bottom) The fabricated nose cone as installed.

Spring 2009 Volume 43, Number 2 41


high-level control interface where the host specifies the reference heading, speed, and depth set points, and (2) a low-level control interface where the host specifies servo commands for the thruster and control surfaces.
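To make the control layering concrete, here is a generic PID loop of the kind the UVC's waypoint follower implements, driving a toy depth plant toward a set point (an illustrative sketch; the gains, plant model, and class are hypothetical and are not the UVC's actual code):

```python
class PID:
    """Textbook PID controller with a rectangular-rule integral."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def step(self, setpoint, measurement):
        err = setpoint - measurement
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Toy depth loop: integrator plant (depth rate proportional to command).
pid = PID(kp=1.2, ki=0.1, kd=0.05, dt=0.1)
depth = 0.0
for _ in range(2000):             # 200 s of simulated time
    u = pid.step(5.0, depth)      # command a 5 m depth set point
    depth += 0.1 * u * pid.dt     # crude plant model
```

A backseat-driver client would send such set points (heading, speed, depth) over the API rather than running the loop itself; the low-level primitive bypasses this and commands the servos directly.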

The modified PeRL vehicle uses a Digital-Logic ADL945 1.2 GHz Core-Duo PC104 for backseat control and data logging. The ADL945 draws less than 10 W, has Gigabit Ethernet for interfacing with the Prosilica GigE cameras, and runs 32-bit Ubuntu Linux. The host stack records data from the cameras and navigation sensors, performs online state estimation, and directs the real-time control of the Iver2 via the backseat driver API. For ease of software development and modularity, we have adopted a multi-process software paradigm that uses the open-source LCM inter-process communication library developed by the Massachusetts Institute of Technology (MIT) Defense Advanced Research Projects Agency (DARPA) Urban Challenge team (Leonard et al., 2008; LCM, 2009).

Missions and Testing

Current missions and testing conducted by PeRL include testing at the UMich MHL tow tank, automated visual ship-hull inspection (conducted at AUV Fest 2008), field testing at the UMich Biological Station, and archaeological surveys of shipwrecks in the Thunder Bay National Marine Sanctuary.

UMich Marine Hydrodynamics Laboratory

The UMich MHL provides a controlled experimental environment for testing real-time underwater visual SLAM algorithms. The freshwater physical model basin measures 110 m × 6.7 m × 3.0 m (Figure 9) and can achieve prescribed vehicle motions via the electronically controlled tank carriage. Trajectory ground-truth is obtained from the encoder-measured carriage position and will be used to validate real-time visual SLAM pose estimates derived from registering the imagery of the tank floor. For ease of real-time algorithm development, we can attach a wet-mateable wired Ethernet connection to a vehicle bulkhead so that imagery and sensor data can be streamed live to a topside desktop computer. This allows for greater flexibility in real-time software development, visualization, and debugging. The use of the test facility has been beneficial for early development and testing of our Iver2 hardware and real-time underwater visual SLAM algorithms.

42 Marine Technology Society Journal

AUV Fest 2008

The Iver2 serves as a testbed for real-time visual autonomous port and hull inspection algorithm research at UMich. We use the Iver2 as a proxy for visual hull inspection by developing and testing our algorithms to navigate over the seafloor (because the underlying visual navigation algorithm is fundamentally the same in the two scenarios). This allows us to use the Iver2 to validate the accuracy of our visually augmented navigation method, and to test and implement our real-time algorithms on the type of embedded system hardware typically found on AUVs.

Meanwhile, to test our VAN algorithms in a real hull-inspection scenario, PeRL collaborated with MIT and Bluefin Robotics at AUV Fest 2008 to put one of our camera systems on the Hovering Autonomous Underwater Vehicle (HAUV) (Vaganay et al., 2005). In this experiment, we collected imagery of the hull of the USS Saratoga, a decommissioned U.S. aircraft carrier stationed at Newport, Rhode Island (Figure 10a). PeRL packaged and mounted a calibrated Prosilica GC1380HC camera (the same camera system used in the Iver2 SLAM nose cone) and a flash strobe light system on the HAUV hull inspection vehicle. Boustrophedon survey imagery of the hull of the USS Saratoga was collected by the HAUV. The

FIGURE 9

Vehicle testing at MHL tow tank.


HAUV is equipped with a 1200 kHz DVL, a FOG, and a depth sensor, all of which are comparable to the sensor suite integrated into PeRL's Iver2 testbed.

Preliminary results for visual hull-relative navigation are shown in Figure 10 (Kim and Eustice, 2009). Here, we see a pose-graph of the camera constraints generated through pairwise registration of overlapping imagery. These constraints are fused with navigation data in an extended information filter framework to provide a bounded-error precision navigation estimate anywhere along the hull. Each node in the network corresponds to a digital-still image taken along the hull (over 1300 images in all). Note the discrete dropout of imagery along the second leg in the region of 5 m to 20 m along the hull axis. Because of a logging error, we did not record any imagery during this time; however, this gap in the visual data record actually highlights the utility of our hull-referenced visual navigation approach. Because we are able to pairwise register views of the hull taken from different times and locations, the camera-based navigation algorithm is able to "close the loop" and register itself to earlier imagery from the first leg of the survey, thereby resetting any DVL navigation error incurred during the data dropout period. It is precisely this hull-referenced navigation capability that allows the AUV to navigate in situ along the hull without the need for deploying any external aiding (e.g., acoustic-beacon transponders).
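The loop-closure mechanism can be distilled into a tiny 1-D pose-graph example: drifted odometry legs plus one camera loop-closure constraint, fused by linear least squares (a toy sketch with unit weights and made-up numbers; the real system fuses 6-DOF constraints in an extended information filter):

```python
def solve(A, b):
    """Solve the square system A x = b by Gaussian elimination with pivoting."""
    n = len(b)
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        for r in range(c + 1, n):
            f = M[r][c] / M[c][c]
            for k in range(c, n + 1):
                M[r][k] -= f * M[c][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

# Poses x0..x3 along the hull; x0 is pinned at 0, unknowns are x1..x3.
# Odometry (drifted) measures each 1 m leg as 1.1 m; the camera loop
# closure re-registers x3 against the start of the survey: x3 - x0 = 3.0.
measurements = [(0, 1, 1.1), (1, 2, 1.1), (2, 3, 1.1), (0, 3, 3.0)]
n = 3
AtA = [[0.0] * n for _ in range(n)]
Atb = [0.0] * n
for i, j, z in measurements:
    row = [0.0] * n          # Jacobian of (x_j - x_i) w.r.t. x1..x3
    if j > 0:
        row[j - 1] += 1.0
    if i > 0:
        row[i - 1] -= 1.0
    for a in range(n):
        for c in range(n):
            AtA[a][c] += row[a] * row[c]
        Atb[a] += row[a] * z
x1, x2, x3 = solve(AtA, Atb)  # drifted (1.1, 2.2, 3.3) pulled back toward the closure
```

The solution (1.025, 2.05, 3.075) spreads the loop-closure correction smoothly back through the trajectory, which is the same effect that resets the DVL drift in Figure 10.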

UMich Biological Station

Field trials were held on Douglas Lake at the UMich Biological Station (UMBS) in Pellston, Michigan, during July 2008. Four days of on-water testing demonstrated maneuverability, vehicle speed, dead-reckoned navigation, wireless Ethernet communication, sidescan sonar functionality, the digital compass, and manual surface joystick operation modes, as summarized in Table 2.

Launch and recovery were conducted from shore, from a dock, and from a pontoon boat. A full sidescan sonar survey of the southeastern bay at Douglas Lake was run from the UMBS docks (Figure 11). After completion of the mission, the vehicle was manually driven under joystick control from a portable wireless station back to the dock for recovery. This on-shore launch and recovery capability facilitates experimental field testing with the Iver2.

FIGURE 10

Hull inspection results from AUV Fest 2008. (a) Depiction of the AUV Fest 2008 experimental setup. (b) The camera-derived pose constraints are shown as red and green links. Each vertex represents a node in the pose-graph enclosed by its 3σ covariance uncertainty ellipsoid. Because of the change in visual feature richness along the hull, the uncertainty ellipsoid inflates when the vehicle is not able to build enough pose constraints but then deflates once VAN creates camera constraints with previous tracklines. Three small figure insets depict the typical feature richness for different regions of the hull. (c) Triangulated 3-D points are fitted to obtain a smooth surface reconstruction and then texture mapped to create a 3-D photomosaic. The six white dots are targets that were manually placed on the hull and used to verify the utility of the visual hull inspection algorithm.

Thunder Bay National Marine Sanctuary

In August 2008, PeRL collaborated with National Oceanic and Atmospheric Administration (NOAA) Thunder Bay National Marine Sanctuary (TBNMS) researchers to map unexplored areas outside the Sanctuary's current boundaries (Figure 12a). Established in 2000, the TBNMS protects

TABLE 2

Results of testing stock Iver2 at UMBS.

Maneuverability: 1.5 m altitude bottom following; 25 cm depth band control; 4.5 m turning radius

Speed: 0.5-1.5 m/s

DR navigation accuracy (DR = prop counts + compass underwater; GPS when on surface): 10% of distance traveled

802.11g Wi-Fi: 100 m range (on surface)

FIGURE 11

VectorMap mission-planning interface for the Iver2. The depicted tracklines are for a sidescan sonar survey of a portion of Douglas Lake. Launch and recovery of the Iver2 were performed from a dock on shore.


one of the nation's most historically significant collections of shipwrecks. Located in the northeast corner of Michigan's Lower Peninsula, the 448 mi² sanctuary contains 40 known historic shipwrecks. Archival research indicates that over 100 sites await discovery within and just beyond the sanctuary's current boundaries. This fact, coupled with strong public support and the occurrence of dozens of known shipwrecks, provides the rationale for the sanctuary's desire to expand from 448 mi² to 3,662 mi² (an eightfold increase). To date, however, a comprehensive remote sensing survey has not been conducted in the potential expansion area. Moreover, significant portions of the existing sanctuary have not been explored.

PeRL is engaged in a 5-year collaboration effort with TBNMS to use the Sanctuary as a real-world engineering testbed for our AUV algorithms research. TBNMS provides in-kind support of ship time and facilities and in return receives SLAM-derived archaeological data products ranging from 3-D photomosaics of shipwrecks to sidescan sonar maps of the Sanctuary seafloor. In addition, PeRL is engaged in public outreach efforts in collaboration with TBNMS to educate the general public in the use and technology of underwater robotics. In development is an AUV technology display in their state-of-the-art Great Lakes Maritime Heritage Center (a 20,000 ft² building featuring a 100-seat theater, 9,000 ft² of exhibit space, and distance learning capabilities), which will consist of an Iver2 AUV hull, a multimedia kiosk, and maps and data products derived from PeRL's field testing in the Sanctuary.

This past August, as part of a NOAA Ocean Exploration grant, PeRL fielded one of its two Iver2 AUVs to collect sidescan sonar imagery in unmapped regions of the Sanctuary seafloor. Figure 12 shows survey tracklines and sonar imagery collected of a newly found wreck outside of the Sanctuary's boundaries in approximately 50 m of water depth.

FIGURE 12

Sidescan sonar mapping results from the 2008 summer field season in TBNMS. (a) Thunder Bay National Marine Sanctuary facility. (b) A large-area search was conducted first to locate the target wreck indicated in the white box (survey tracklines are shown in green). A second finer-scale survey was then conducted to map the target at higher resolution (tracklines overlaid in gray). (c) Target imagery found using 330 kHz sonar. (d) Detailed target imagery using 800 kHz sonar.



Conclusion

This paper provided an overview of PeRL's AUV algorithms research and testbed development at UMich. To summarize, PeRL's main research thrusts are in the areas of (1) real-time visual SLAM, (2) cooperative multi-vehicle navigation, and (3) perception-driven control. Toward that goal, we reported the modifications involved in preparing two commercial Ocean-Server AUV systems for SLAM research at UMich. PeRL upgraded the vehicles with additional navigation and perceptual sensors, including 12-bit stereo down-looking Prosilica cameras, a Teledyne RDI 600 kHz Explorer DVL for 3-axis bottom-lock velocity measurements, a KVH single-axis FOG for yaw rate, and a WHOI Micro-modem for communication, along with other sensor packages. To accommodate the additional sensor payload, a new Delrin nose cone was designed and fabricated. Additional 32-bit embedded CPU hardware was added for data logging, real-time control, and in situ real-time SLAM algorithm testing and validation. Our field testing and collaboration with the TBNMS will provide a validation of the proposed navigation methodologies in a real-world engineering setting, whereas a new museum exhibit on underwater robotics at their visitor center will disseminate the findings and results of this research to the general public.

Acknowledgments

This work is supported in part through grants from the National Science Foundation (Award #IIS0746455), the Office of Naval Research (Award #N00014-07-1-0791), and a NOAA Ocean Exploration grant (Award #WC133C08SE4089).


References

Anderson, B., Crowell, J. 2005. Workhorse AUV—A cost-sensible new autonomous underwater vehicle for surveys/soundings, search & rescue, and research. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 1228-1233. Washington, D.C.: IEEE.

Bailey, T., Durrant-Whyte, H. 2006. Simultaneous localization and mapping (SLAM): Part II. IEEE Robot. Autom. Mag. 13(3):108-117.

Ballard, R.D. (ed). 2008. Archaeological Oceanography. Princeton, New Jersey: Princeton University Press. 296 pp.

Bourgault, F., Furukawa, T., Durrant-Whyte, H. 2004. Decentralized Bayesian negotiation for cooperative search. In Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 2681-2686. Sendai, Japan: IEEE.

Durrant-Whyte, H., Bailey, T. 2006. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 13(2):99-110.

Eustice, R.M. 2008. Toward real-time visually augmented navigation for autonomous search and inspection of ship hulls and port facilities. In: International Symposium on Technology and the Mine Problem. Monterey, CA: Mine Warfare Association (MINWARA).

Eustice, R.M., Singh, H., Leonard, J.J. 2006a. Exactly sparse delayed-state filters for view-based SLAM. IEEE Trans. Robot. 22(6):1100-1114.

Eustice, R.M., Singh, H., Leonard, J.J., Walter, M.R. 2006b. Visually mapping the RMS Titanic: Conservative covariance estimates for SLAM information filters. Int. J. Rob. Res. 25(12):1223-1242.

Eustice, R.M., Whitcomb, L.L., Singh, H., Grund, M. 2006c. Recent advances in synchronous-clock one-way-travel-time acoustic navigation. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 1-6. Boston, MA: IEEE.

Eustice, R.M., Whitcomb, L.L., Singh, H., Grund, M. 2007. Experimental results in synchronous-clock one-way-travel-time acoustic navigation for autonomous underwater vehicles. In Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 4257-4264. Rome, Italy: IEEE.

Eustice, R.M., Pizarro, O., Singh, H. 2008. Visually augmented navigation for autonomous underwater vehicles. IEEE J. Oceanic Eng. 33(2):103-122.

Fleischer, S. 2000. Bounded-error vision-based navigation of autonomous underwater vehicles. PhD thesis, Stanford University.

Foley, B., DellaPorta, K., Sakellariou, D., Bingham, B., Camilli, R., Eustice, R., Evagelistis, D., Ferrini, V., Hansson, M., Katsaros, K., Kourkoumelis, D., Mallios, A., Micha, P., Mindell, D., Roman, C., Singh, H., Switzer, D., Theodoulou, T. 2009. New methods for underwater archaeology: The 2005 Chios ancient shipwreck survey. J. Hesperia. In press.

Freitag, L., Grund, M., Singh, S., Partan, J., Koski, P., Ball, K. 2005a. The WHOI micro-modem: An acoustic communications and navigation system for multiple platforms. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 1086-1092. Washington, D.C.: IEEE.

Freitag, L., Grund, M., Partan, J., Ball, K., Singh, S., Koski, P. 2005b. Multi-band acoustic modem for the communications and navigation aid AUV. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 1080-1085. Washington, D.C.: IEEE.

Hunt, M., Marquet, W., Moller, D., Peal, K., Smith, W., Spindel, R. 1974. An acoustic navigation system. Woods Hole Oceanographic Institution Technical Report WHOI-74-6.

Kim, A., Eustice, R.M. 2009. Pose-graph visual SLAM with geometric model selection for autonomous underwater ship hull inspection. In: IEEE/RSJ International Conference on Intelligent Robots and Systems. Submitted for publication.

Kinsey, J.C., Eustice, R.M., Whitcomb, L.L. 2006. Underwater vehicle navigation: Recent advances and new challenges. In Proc. of the IFAC Conference on Maneuvering and Control of Marine Craft. Lisbon, Portugal: International Federation of Automatic Control (IFAC).

LCM. 2009. Lightweight Communications and Marshalling. http://code.google.com/p/lcm/ (Accessed April 15, 2009).

Leonard, J., How, J., Teller, S., Berger, M., Campbell, S., Fiore, G., Fletcher, L., Frazzoli, E., Huang, A., Karaman, S., Koch, O., Kuwata, Y., Moore, D., Olson, E., Peters, S., Teo, J., Truax, R., Walter, M., Barrett, D., Epstein, A., Maheloni, K., Moyer, K., Jones, T., Buckley, R., Antone, M., Galejs, R., Krishnamurthy, S., Williams, J. 2008. A perception-driven autonomous urban vehicle. J. Field Robot. 25(10):727-774.

Leveille, E.A. 2007. Analysis, redesign and verification of the Iver2 autonomous underwater vehicle motion controller. Masters thesis, University of Massachusetts Dartmouth.

Milne, P. 1983. Underwater Acoustic Positioning Systems. Houston: Gulf Publishing Company. 284 pp.

Neira, J., Tardos, J. 2001. Data association in stochastic mapping using the joint compatibility test. IEEE Trans. Robot. Autom. 17(6):890-897.

Newman, P.M., Leonard, J.J. 2003. Pure range-only sub-sea SLAM. In Proc. of the IEEE International Conference on Robotics and Automation. pp. 1921-1926. Taipei, Taiwan: IEEE.

Olson, E., Leonard, J.J., Teller, S. 2006. Robust range-only beacon localization. IEEE J. Oceanic Eng. 31(4):949-958.

Ong, L.L., Upcroft, B., Bailey, T., Ridley, M., Sukkarieh, S., Durrant-Whyte, H. 2006. A decentralised particle filtering algorithm for multi-target tracking across multiple flight vehicles. In Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 4539-4544. Beijing, China: IEEE.

Partan, J., Kurose, J., Levine, B.N. 2006. A survey of practical issues in underwater networks. In Proc. of the ACM International Workshop on Underwater Networks. pp. 17-24. New York: ACM Press.

Rekleitis, I., Dudek, G., Milios, E. 2003. Probabilistic cooperative localization and mapping in practice. In Proc. of the IEEE International Conference on Robotics and Automation. pp. 1907-1912. Taipei, Taiwan: IEEE.

Ridley, M., Nettleton, E., Sukkarieh, S., Durrant-Whyte, H. 2002. Tracking in decentralised air-ground sensing networks. In Proc. of the IEEE International Conference on Information Fusion. pp. 616-623. Annapolis, Maryland: IEEE.

Roman, C., Singh, H. 2005. Improved vehicle based multibeam bathymetry using submaps and SLAM. In Proc. of the IEEE/RSJ International Conference on Intelligent Robots and Systems. pp. 2422-2429. Edmonton, Alberta, Canada: IEEE.

Smith, S., Kronen, D. 1997. Experimental results of an inexpensive short baseline acoustic positioning system for AUV navigation. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 714-720. Halifax, Nova Scotia, Canada: IEEE.

Stewart, W. 1991. Remote-sensing issues for intelligent underwater systems. In Proc. of the IEEE Conference on Computer Vision and Pattern Recognition. pp. 230-235. Maui, HI: IEEE.

Vaganay, J., Elkins, M., Willcox, S., Hover, F., Damus, R., Desset, S., Morash, J., Polidoro, V. 2005. Ship hull inspection by hull-relative navigation and control. In Proc. of the IEEE/MTS OCEANS Conference and Exhibition. pp. 761-766. Washington, D.C.: IEEE.

Walter, M.R., Leonard, J.J. 2004. An experimental investigation of cooperative SLAM. In Proc. of the IFAC/EURON Symposium on Intelligent Autonomous Vehicles. Lisbon, Portugal: International Federation of Automatic Control (IFAC).

Williams, S.B., Dissanayake, G., Durrant-Whyte, H. 2002. Towards multi-vehicle simultaneous localisation and mapping. In Proc. of the IEEE International Conference on Robotics and Automation. pp. 2743-2748. Washington, D.C.: IEEE.