An Augmented Reality Architecture for the Creation of Hardware-in-the-Loop & Hybrid Simulation Test Scenarios for Unmanned Underwater Vehicles

Benjamin C. Davis, Pedro Patrón and David M. Lane
Abstract—System evaluation and testing of Unmanned Underwater Vehicles (UUVs) in their destined environment can be tedious, error prone, time consuming and, consequently, expensive. However, pre-real-world testing facilities, such as hardware-in-the-loop (HIL), are not always available, because of the time and expense required to create a specific test environment for the vehicle. The system is therefore not as fault tolerant as it could be, since problems can remain undetected until the real-world testing phase. Debugging and fixing errors in the real-world testing phase is much more time consuming and expensive owing to the harsh underwater environment. This paper introduces a novel framework for the rapid construction of virtual environment testing scenarios for remote platforms with embedded systems, such as AUVs. The framework provides testing facilities across all stages of the reality continuum, offering capabilities for pure simulation, HIL, hybrid simulation (HS) and real-world testing. The framework architecture is both generic and flexible and allows mixing of real and simulated components. It is supported by a distributed communications protocol that provides location transparency of systems, which is key to providing mixed reality testing facilities.

Index Terms—Augmented Reality, Hardware-in-the-loop, Unmanned Underwater Vehicle, Autonomous Underwater Vehicle, Hybrid Simulation.

I. INTRODUCTION

System integration and validation of embedded technologies has always been a challenge, particularly in the case of autonomous underwater vehicles (AUVs). The inaccessibility of AUV platforms, combined with the difficulty and cost of field operations, has been the main obstacle to the maturity and evolution of underwater technologies. Additionally, the analysis of embedded technologies is hampered by data processing and analysis time lags (due to low bandwidth data communications with the underwater platform). This has meant that the developer/operator is unable to react quickly or in real time to stimulus, and is therefore unable to easily debug problems whilst testing in the remote environment. As a consequence, real-world testing is tedious, time consuming and very expensive. Testing has therefore tended to be less thorough where pre-real-world testing facilities are not available.

Manuscript received 9, 8, 2007. This work was supported by the Engineering and Physical Sciences Research Council (EPSRC). Benjamin Davis is with the Ocean Systems Laboratory, Heriot-Watt University, School of EPS, Edinburgh, EH14 4AS (tel: +44 131 451 3506; fax: +44 131 451 4155; email: [email protected]).

This paper addresses the problem of reducing the amount of real-world testing required for a remote platform. This is achieved by concentrating on improving the pre-real-world testing facilities: providing a generic, extensible, modular framework of components which can be arranged in a multitude of different ways to produce virtual environment scenarios for improving remote awareness and developmental testing. The first improvement is to make the data output by the remote platform more intuitive and to provide multiple ways in which it can be visualised by the user, so that the developer can identify problems faster. Previous research provides the basis that the proposed framework builds upon. For example, the Neptune simulator [1] research identifies a useful taxonomy of synthetic environment facilities. However, the Neptune simulator itself is not generic enough to provide the domain-independent testing facilities required. The concepts it identifies need extending with a solution which is applicable to any domain.

Requirements:

Real-world testing of systems in remote environments is often tedious due to inaccessibility and the hazardous nature of those environments (deep sea or outer space). As a result, it is not always feasible because of the high expense. It is therefore beneficial to test the systems in a laboratory first. One method of doing this is pure simulation (PS) [1] of all the platform's systems and the remote environment. The problem with using this method alone is that system integration errors will still occur, because many modules will not have been tested working together. Diagnosing integration errors on the platform during real-world testing is generally more time consuming than doing so on a workstation. In order to test the real working platform easily, and with relatively low expense, a different method known as hardware-in-the-loop (HIL) [2] is used. This requires the simulation of some modules and sensors, and the simulation needs to be transparent to the rest of the platform's systems. It is usually only lower level systems, such as exteroceptive sensors, which


require simulation, since these are the parts of the real system which interact directly with the remote environment. Therefore, some of the real systems need to be switched seamlessly with simulated systems. This places a heavy constraint on the way the platform's systems interact, since most simulated systems will not be running on the remote platform, but instead on another computer in another location. Each system therefore needs to be able to communicate with every other system regardless of location. To provide this location transparency, the systems ideally communicate by message passing over a standard communication network such as Internet Protocol (IP) (see Section IV on communications).

For developmental testing and mission observation purposes, an operator of a remote platform benefits from intuitive visual feedback of the remote platform's operations. To provide an intuitive display, large quantities of data need to be interpreted and displayed so that they are human readable. Decoding of data by the user should be effective and efficient, so an intuitive display is essential. It is paramount that data is displayed in such a way that it is meaningful to anyone, not just trained personnel. It is also of great importance that errors can be seen and reacted to quickly, especially with regard to hardware-in-the-loop testing and real-world operations.

Solution:

This paper proposes a novel framework, the Augmented Reality Framework (ARF), for the rapid construction of virtual environments for mixed reality testing of embedded systems in remote environments. Mixed reality techniques are used to render data visually in order to increase the situational awareness of the operator, and to provide simulation of systems for testing the remote platform. ARF provides facilities for all types of testing, increasing the rate at which errors are detected and hence helping to develop a more robust system in less time.

ARF provides the required functionality by incorporating a set of visualisation tools, a distributed communication protocol, and many simulated modules and sensors such as vehicle dynamics, sonar, etc. Most importantly, it provides high-fidelity simulation of real platforms and sensors which can be used in mixed reality systems, e.g. in hybrid simulation (HS) [1, 3] (both real and virtual sensors working together). ARF uses the OceanSHELL [4] distributed communication protocol to provide the capability for seamless substitution of real for virtual modules, such as sensors and vehicle systems.

The rest of this paper explains the various parts of ARF, its flexible implementation, and examples of current and future usages and their benefits. First, some background on the concepts of Augmented Reality and Mixed Reality testing is given.

II. HARDWARE-IN-THE-LOOP CONCEPTS

Ronald T. Azuma [5] describes Milgram's Reality-Virtuality continuum [6]. His description shows the continuum from reality to virtual reality and all the hybrid stages in between. The hybrid stages between real and virtual are known as augmented reality [7] and augmented virtuality [7] (see Figure 1). The hybrid reality concepts are built upon by the ideas of HIL and Hybrid Simulation. ARF is able to provide functionality across all stages of the continuum, allowing virtually any testing scenario to be realised. For this reason it is referred to as a mixed reality framework. There are currently four different testing scenarios:

1. Pure Simulation (PS) [1] - individual testing of each module before integration onto the intended platform.

2. Hardware-in-the-loop (HIL) [2] - testing of the real integrated platform carried out in a laboratory environment. Exteroceptive sensors (e.g. sonar, video), which interact with the intended environment, may have to be simulated to fool the robot into thinking it is in the real world. This is very useful for integration testing, as the entire system can be tested as a whole, allowing any system integration errors to be detected in advance of real-world trials.

3. Hybrid Simulation (HS) [1, 3] - testing the platform in its intended environment in conjunction with some simulated sensors driven from a virtual world. For example, virtual objects can be added to the real world and the exteroceptive sensor data altered so that the robot thinks that something in the sensor dataset is real. This type of system is used if some higher level modules are not yet reliable enough to be trusted to behave as intended. Fictitious data, augmented with current data, is then used as the input to these systems, so that if a mistake is made it does not damage the platform. An example would be obstacle avoidance system testing on an AUV (see the example in Section VII).

4. Real-world testing - the last stage of testing. When all systems are trusted, the platform is ready for testing in the intended environment. All implementation errors should have been fixed in the previous stages, otherwise this stage is very costly.

The next section gives examples of how each type of testing corresponds to a different type of reality.

III. REALITY TYPES AND TESTING

The different types of testing correspond to different stages of the reality continuum (see Figure 1).

Augmented Reality (AR) and Augmented Virtuality (AV) refer to how reality or virtual reality, respectively, is altered. In the case of Augmented Reality, simulated data is added to the real-world perception of some entity. For example, sonar data on an AUV could be altered so that it contains fictitious objects, i.e. objects which are not present in the real world, but which are present in the virtual world. This can be used to test the higher level systems of an AUV, such as obstacle detection


(see the obstacle detection and avoidance example in Section VII). Generally, a virtual world is used to generate synthetic sensor data which is then mixed with the real-world data. The virtual world has to be kept in precise synchronisation with the real world. This is known as the "registration" problem. The accuracy of registration depends on the accuracy of the position/navigation systems. Registration is a well known problem with underwater vehicles when trying to match different sensor datasets to one another for visualisation. Accurate registration is paramount for displaying the virtual objects in the correct position in the simulated sensor data.
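Because ARF's virtual world is built on Java3D (Section VI), maintaining registration amounts to keeping a vehicle-to-world transform up to date from navigation data and using its inverse to express virtual objects in the vehicle's sensor frame. The minimal sketch below illustrates this with Java3D's standard math classes; the class, method names and pose values are illustrative assumptions, not ARF code.

    import javax.media.j3d.Transform3D;
    import javax.vecmath.Point3d;
    import javax.vecmath.Quat4d;
    import javax.vecmath.Vector3d;

    public class RegistrationExample {

        // Builds the vehicle-to-world transform from a navigation fix.
        static Transform3D vehicleToWorld(Quat4d attitude, Vector3d position) {
            // scale = 1.0: pure rigid-body transform (rotation + translation)
            return new Transform3D(attitude, position, 1.0);
        }

        public static void main(String[] args) {
            // Hypothetical navigation fix: vehicle at (10, 0, -5) m, level attitude.
            Quat4d attitude = new Quat4d(0, 0, 0, 1);         // identity rotation
            Vector3d position = new Vector3d(10.0, 0.0, -5.0);

            // Virtual obstacle placed in the world frame of the virtual environment.
            Point3d obstacleWorld = new Point3d(25.0, 3.0, -5.0);

            // Invert vehicle-to-world to get world-to-vehicle, then transform.
            Transform3D worldToVehicle = vehicleToWorld(attitude, position);
            worldToVehicle.invert();

            Point3d obstacleInSensorFrame = new Point3d(obstacleWorld);
            worldToVehicle.transform(obstacleInSensorFrame);

            // The simulated sonar can now decide whether the obstacle falls
            // inside its field of view and at what range/bearing to render it.
            System.out.println("Obstacle in vehicle frame: " + obstacleInSensorFrame);
        }
    }

If the navigation estimate drifts, the same drift appears directly in where virtual objects land in the simulated sensor data, which is exactly the registration sensitivity described above.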

Augmented Virtuality is the opposite of Augmented Reality: instead of augmenting a person's perspective, the virtual world's perspective is augmented with real data. For example, real data collected by real sensors on an AUV is rendered in real time in the virtual world in order to recreate the real world in virtual reality. This can be used for Online Monitoring (OM) [1] and operator training (TR) [1], as the operator can see how the AUV is situated in the remote environment, increasing situational awareness.

HS is where the platform operates in its intended environment in conjunction with some sensors being simulated in real time by a synchronised virtual environment. As in AR, the virtual environment is kept in synchronisation using position data transmitted from the remote platform. Simulated sensors are attached to the virtual reality version of the remote platform and moved around in synchronisation with the real platform. The simulated sensors collect data from the virtual world and transmit it back to the real systems on the remote platform, which then interpret the data as if it were real. It is important that simulated data is very similar to the real data, so that the higher level systems cannot distinguish between the two.

In summary, the real platform's perception of the real environment is augmented with virtual data; hence HS is inherently Augmented Reality. An example of a real scenario where AR testing procedures are useful is obstacle detection and avoidance in the underwater environment by an AUV (see the example in Section VII).

Hardware-in-the-Loop (HIL) is another type of Mixed Reality testing. The platform is not situated in its intended environment, but is instead fooled into thinking that it is. This is achieved by simulating all required exteroceptive sensors using a virtual environment. Virtual sensor data is then sent to the real platform's systems in order to fool them. Where possible, the outputs of higher level systems which rely on the simulated data are relayed back and displayed in the virtual environment. This can help show the system developer that the robot is interpreting the simulated sensor data correctly. In HIL, all the sensors and systems that interact directly with the environment are simulated. Vehicle navigation systems are a good example, since these use exteroceptive sensors, actuators and motors to determine position. Using simulated sensors means that the developer can specify exactly the data which will be fed into the systems being tested. This is complicated to do reliably in the real environment, as there are too many external factors which cannot be controlled. HIL therefore actually places the robot in Virtual Reality. Sometimes a virtual environment is augmented with the feedback of the platform's data for observation, hence HIL is Augmented Virtuality as well. Thus HIL and HS are both deemed to be Mixed Reality concepts.

ARF provides the ability to execute all testing regimes across the reality continuum. It does this by incorporating the OceanSHELL distributed communication protocol, vehicle dynamics and navigation simulators, sensor simulation, an interactive three-dimensional (3D) virtual world and an information display. All spatially distributed components are easily interconnected using message passing via the communication protocol, or directly by method call using ARF's visual programming interface. The key to ARF's HIL and HS capabilities is the flexibility of the communications protocol. ARF can support any protocol, but only an OceanSHELL implementation currently exists.

IV. FRAMEWORK COMMUNICATIONS PROTOCOL

The obstacle detection and avoidance example (in Section VII) highlights the need for a distributed communication system. ARF requires that modules can be swapped for similar simulated modules without the other systems knowing, being informed, or being programmed to allow it. The underlying communication protocol which provides the flexibility needed by the framework is OceanSHELL [4]. OceanSHELL provides distributed communications allowing modules to run anywhere, i.e. it provides module location transparency. Location transparency makes mixed reality testing straightforward, because modules can run either on the remote platform or somewhere else, such as in a laboratory.

Figure 1: Reality Continuum combined with testing scenarios: real-world tests (Reality), Hybrid Simulation (Augmented Reality - placing virtual objects in the real world), HIL testing (Augmented Virtuality - placing real objects in the virtual world), and synthetic tests (Virtual Reality).

OceanSHELL is a software library implementing a low overhead architecture for organising and communicating between distributed processes. OceanSHELL's low overhead in terms of execution speed, size and complexity makes it eminently suited to embedded applications. An extension to OceanSHELL, called JavaShell, is portable because it runs on Java [8] platforms. JavaShell and OceanSHELL fully interoperate; the only difference is that OceanSHELL uses C structures to specify message definitions, whereas JavaShell uses XML files. OceanSHELL is not only platform independent but also language independent, making it fully portable.
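The OceanSHELL API itself is not reproduced in this paper, but the location transparency it provides can be illustrated with plain Java UDP multicast (OceanSHELL traffic is UDP-based, as noted below). In this minimal, hypothetical sketch, the listening module neither knows nor cares whether the publisher is a real sensor on the vehicle or a simulator in the laboratory; the group address and port are arbitrary illustrative values, and this is not OceanSHELL code.

    import java.net.DatagramPacket;
    import java.net.InetAddress;
    import java.net.MulticastSocket;

    public class LocationTransparentListener {
        public static void main(String[] args) throws Exception {
            InetAddress group = InetAddress.getByName("239.0.0.42"); // illustrative
            MulticastSocket socket = new MulticastSocket(9750);      // illustrative
            socket.joinGroup(group);

            byte[] buffer = new byte[1024];
            while (true) {
                // Any publisher on the network - a real sensor on the AUV or a
                // simulated one on a workstation - delivers to this same socket.
                DatagramPacket packet = new DatagramPacket(buffer, buffer.length);
                socket.receive(packet);
                System.out.printf("message from %s, %d bytes%n",
                        packet.getAddress(), packet.getLength());
            }
        }
    }

Because subscribers bind to a message type rather than to a host, swapping a real module for a simulated one is invisible to every other module, which is the property HIL and HS depend on.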

Figure 2: OceanSHELL provides the backbone for switching between real and simulated (topside) components for use with HS/HIL; a wireless network bridge links an OshManager on each side.

OceanSHELL provides extra functionality for TCP/IP forwarding of UDP/OceanSHELL traffic across large networks. This is a software router which can be used to bridge OceanSHELL networks and filter the traffic. This technology, called OshManager, is currently being researched in the Ocean Systems Laboratory. OshManager also provides the ability to stop and start modules which implement the new OshManager interface. This means that real modules can be switched off when HS or HIL testing needs to be executed, making substitution of real and simulated modules straightforward, which is useful for mixed reality testing. Figure 2 shows how OceanSHELL is used to communicate between the remote environment and the virtual environment.

V. FACILITIES IMPLEMENTATION

The Augmented Reality Framework (ARF) is a configurable and extendible virtual reality framework of tools for creating mixed reality environments. It provides sensor simulation, sensor data interpretation, visualisation and operator interaction with the remote platform. ARF can be extended with sensors and data interpreters specific to the needs of the user and the target environment type. ARF is domain independent and can be tailored to the specific needs of the application. The framework provides modularity and extendibility through mechanisms for loading user-created modules, and it provides a visual programming interface used to link the different components together.

ARF provides programming libraries which allow a developer to create their own programmed components, as well as a large set of ready-made components from which users can build their own tailored virtual environments.

The ARF framework provides a 3D virtual world which components can use to display data, and which virtual sensors can sense. ARF provides many basic components from which to build virtual environments. These components can be configured to work as the user desires. If the required functionality does not exist, the user can program their own component and add it to the ARF component library. For example, a component could be a sensor data listener which listens for certain data messages on a communication protocol (OceanSHELL) and then displays the data "live" in the virtual environment. A component may literally be an interface to a communications protocol like OceanSHELL, to which other components can be connected in order to listen for and transmit data. Thus the number of components will grow, and with it the flexibility of ARF.
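As a concrete illustration of such a component, the following sketch shows what a sensor data listener might look like when written to the JavaBean conventions ARF relies on (Section VI): a public no-argument constructor, get/set accessors for GUI-configurable properties, and bound property events that linked components can react to. The class and its onMessage hook are hypothetical, not part of the ARF library.

    import java.beans.PropertyChangeListener;
    import java.beans.PropertyChangeSupport;
    import java.io.Serializable;

    public class DepthListenerBean implements Serializable {

        private final PropertyChangeSupport changes = new PropertyChangeSupport(this);
        private double depth;                        // last value received, in metres
        private String messageName = "NAVIGATION";   // configurable in the ARF GUI

        public DepthListenerBean() { }               // required by JavaBeans

        public String getMessageName() { return messageName; }
        public void setMessageName(String name) { this.messageName = name; }

        public double getDepth() { return depth; }

        // Called by whatever protocol bean this component is linked to.
        public void onMessage(double newDepth) {
            double old = this.depth;
            this.depth = newDepth;
            // Linked components (e.g. a 3D gauge in the virtual world)
            // receive this event and re-render "live".
            changes.firePropertyChange("depth", old, newDepth);
        }

        public void addPropertyChangeListener(PropertyChangeListener l) {
            changes.addPropertyChangeListener(l);
        }
    }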

ARF also has the ability to link together and configure components. A linked set of components can then be exported as a super-component to the ARF component library for others to use. For example, an AUV super-component could include a 3D model of an AUV, a vehicle dynamics simulator, a sonar, and a control input to the vehicle dynamics (keyboard or joystick). These virtual components can then be substituted for the real AUV systems for use with HIL and HS.

ARF allows complete scenarios to be loaded and saved, so that no work is required to recreate an environment. ARF has components which provide interfaces to OceanSHELL and sensor simulation (sonar and video), and components for interpreting live OceanSHELL traffic and displaying it meaningfully in the virtual world.

In order to provide the HIL and HS capabilities, ARF provides a graphical interface to OshManager which allows the user to decide which types of OceanSHELL messages to forward to and from the robot. This also allows the user to choose which data to communicate between the virtual environment and the remote platform. Figures 4 and 5 show a simple sonar simulation using the ARF virtual environment.

VI. IMPLEMENTATION

In order to provide these facilities, ARF needs to be very flexible and modular. ARF provides a visual programming interface which allows the user to configure and link together objects, i.e. link object outputs to inputs. ARF objects can be chained together to provide any desired functionality. In addition, ARF provides a 3D virtual world to which the user can add objects, and data can be passed between the world and other objects in ARF. ARF also allows the user to import their own objects, either for use in the 3D world or as stand-alone data processing modules.

ARF builds on a programming convention by Sun Microsystems called JavaBeans [9]. This is used to provide dynamic loading of user-made modules into ARF, and to allow the modules to be configured on the fly using the ARF Graphical User Interface (GUI). In addition to supporting the standard programming interface for JavaBeans, ARF introduces a new type of JavaBean object for use in ARF's 3D virtual environment, called a Java3DBean. ARF incorporates Java3D [10] to provide the 3D virtual world, and provides the ability to add user-created Java3D classes to the Java3D world. ARF supplies a collection of special utility objects, which are also provided in the programming API so that the user can extend them and create their own Java3DBeans if need be. Java3DBeans are merely extensions to Java3D objects which adhere to the JavaBean programming conventions. ARF is capable of identifying which Beans are Java3DBeans and therefore knows how to deal with them. The only real difference between Java3DBeans and JavaBeans is that Java3DBeans are added to the 3D virtual world part of ARF, whereas JavaBeans are only added as objects to the ARF BeanBoard (which keeps track of all objects). Java3DBeans can, however, still communicate with any other objects in ARF's BeanBoard.
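The following sketch shows the shape a minimal Java3DBean could take under the conventions just described: a Java3D scene graph node with a no-argument constructor and get/set property accessors, so that a framework like ARF can load it dynamically, expose its properties in the GUI, and attach it to the 3D world. The class and its placeholder geometry are illustrative; ARF's actual base classes are not reproduced here.

    import javax.media.j3d.BranchGroup;
    import javax.media.j3d.Transform3D;
    import javax.media.j3d.TransformGroup;
    import javax.vecmath.Vector3d;
    import com.sun.j3d.utils.geometry.ColorCube;

    public class MarkerBean extends BranchGroup {

        private final TransformGroup tg = new TransformGroup();
        private double x, y, z;          // bean properties, editable in the GUI

        public MarkerBean() {            // no-argument constructor (JavaBeans)
            tg.setCapability(TransformGroup.ALLOW_TRANSFORM_WRITE);
            tg.addChild(new ColorCube(0.1)); // placeholder geometry
            addChild(tg);
        }

        public double getX() { return x; }
        public void setX(double x) { this.x = x; move(); }
        public double getY() { return y; }
        public void setY(double y) { this.y = y; move(); }
        public double getZ() { return z; }
        public void setZ(double z) { this.z = z; move(); }

        private void move() {            // push the property values into Java3D
            Transform3D t = new Transform3D();
            t.setTranslation(new Vector3d(x, y, z));
            tg.setTransform(t);
        }
    }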

Figure 3: SAUC-E AUV competition virtual environment for HIL testing, 2007.

In summary, JavaBeans are a collection of conventions which, if adhered to, allow a Java class to be dynamically loaded and configured using a graphical interface. The configurations of objects can also be loaded and saved, at the click of a mouse button, to a simple human-readable XML file. This removes the need for the programmer to design a special user interface to control and configure each Java object.
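The stock JavaBeans mechanism for exactly this kind of persistence is java.beans.XMLEncoder/XMLDecoder, which walks a bean's get/set properties and writes them to human-readable XML. Whether ARF uses this class or its own XML format is not stated here, so the sketch below (reusing the hypothetical listener bean from Section V) is only indicative.

    import java.beans.XMLDecoder;
    import java.beans.XMLEncoder;
    import java.io.BufferedInputStream;
    import java.io.BufferedOutputStream;
    import java.io.FileInputStream;
    import java.io.FileOutputStream;

    public class ScenarioPersistence {
        public static void save(Object bean, String file) throws Exception {
            XMLEncoder enc = new XMLEncoder(
                    new BufferedOutputStream(new FileOutputStream(file)));
            enc.writeObject(bean);          // walks the bean's get/set properties
            enc.close();
        }

        public static Object load(String file) throws Exception {
            XMLDecoder dec = new XMLDecoder(
                    new BufferedInputStream(new FileInputStream(file)));
            Object bean = dec.readObject(); // reconstructs and reconfigures
            dec.close();
            return bean;
        }

        public static void main(String[] args) throws Exception {
            DepthListenerBean bean = new DepthListenerBean();
            bean.setMessageName("SONAR");
            save(bean, "scenario.xml");     // human-readable XML
            DepthListenerBean restored = (DepthListenerBean) load("scenario.xml");
            System.out.println(restored.getMessageName()); // prints SONAR
        }
    }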

ARF provides many utility JavaBeans and Java3DBeans which the user can use directly, or extend by programming their own specialised objects. These include:

• Geometric shapes for building scenes.
• Mesh file loaders for importing VRML, X3D, DXF and many more 3D file types.
• Input listeners for controlling 3D objects with input devices (keyboard, mouse, joystick).
• Behaviours for making 3D objects do things.
• Camera control for inspecting and following the progress of objects.
• OceanSHELL input/output behaviours for rendering real data and for outputting virtual data from simulated sensors.
• Basic sensors for underwater technologies, such as forward-looking sonar, sidescan sonar, bathymetric sonar, altimeter, inertial measurement unit (IMU) and Doppler velocity log (DVL).
• Vehicle dynamics models for movement simulation (see the sketch after this list).
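To indicate what the dynamics side of such a component might involve, the toy model below integrates first-order surge dynamics with quadratic drag using Euler steps; a bean like this could drive a vehicle's position property on each simulation tick. All coefficients are invented for illustration and are not taken from any ARF model.

    public class SurgeDynamics {
        private double u = 0.0;            // surge velocity (m/s)
        private double x = 0.0;            // position along track (m)
        private final double mass = 120.0; // kg, illustrative
        private final double drag = 35.0;  // quadratic drag coefficient, illustrative

        // Advance the model by dt seconds under the given thrust (N).
        public void step(double thrust, double dt) {
            double accel = (thrust - drag * u * Math.abs(u)) / mass;
            u += accel * dt;               // Euler integration
            x += u * dt;
        }

        public double getVelocity() { return u; }
        public double getPosition() { return x; }

        public static void main(String[] args) {
            SurgeDynamics dyn = new SurgeDynamics();
            for (int i = 0; i < 600; i++) {   // 60 s at 10 Hz
                dyn.step(50.0, 0.1);          // constant 50 N thrust
            }
            // Velocity approaches sqrt(50/35), roughly 1.2 m/s terminal speed.
            System.out.printf("u=%.2f m/s, x=%.1f m%n",
                    dyn.getVelocity(), dyn.getPosition());
        }
    }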

VII. REAL WORLD APPLICATIONS & EXAMPLES

Although the set of applications is innumerable, this section describes some representative examples of applications and topics of research that are already gaining benefits from the capabilities provided by the Augmented Reality Framework.

A. Obstacle detection and avoidance

One of the most common problems for unmanned vehicles is trajectory planning: the need to navigate in unknown environments, trying to reach a goal or target while avoiding obstacles. These environments are expected to be in permanent change. As a consequence, sensors are installed on the vehicle to continuously provide local information about these changes. When object detections or modifications are sensed, the platform is expected to react in real time and continuously adapt its trajectory towards the current mission waypoint.

Figure 4: ARF simulating forward-look sonar of virtual objects.

Figure 5: The resulting image of the simulated forward-look sonar.

Testing these kinds of adaptive algorithms requires driving the vehicle towards man-made structures in order to analyse its response behaviours. This incurs a high collision risk and clearly compromises the vehicle's survivability. A novel approach to this problem uses ARF to remove the collision risk during the development process. Using Hybrid Simulation, the approach uses a set of simulated sensors for rendering synthetic acoustic images from virtually placed


obstacles. The algorithms are then debugged on a real platform performing avoidance manoeuvres over the virtual obstacles in a real environment. Figure 2 shows the required framework components, Figure 4 shows the virtual environment view and Figure 5 shows the resulting simulated sonar of the obstacles. It should be noted that the topside simulated components can be switched on to replace the remote platform's real components, thereby achieving HIL or HS. A detailed description of the evaluation and testing of obstacle avoidance algorithms for AUVs can be found in [11] and [12].

B. Multi-vehicle applications

The main objective of the European project GREX [13] is to create a conceptual framework and middleware systems to coordinate a swarm of diverse, heterogeneous physical objects (underwater vehicles) working in cooperation to achieve a well defined practical goal (e.g. the search for hydrothermal vents) in an optimised manner.

In the context of GREX, algorithms for coordinated control are being developed. As these algorithms need to be tested on different vehicle platforms (and for different scenarios), testing in real life becomes difficult due to the cost of transporting vehicles; furthermore, the efficiency and safety of the different control strategies need to be verified. The ARF virtual environment provides the ideal test bed: simulations can be run externally and fed into the virtual AUVs, so that the suitability of the different control strategies can be observed. The virtual environment serves not only as an observation platform, but can also be used to simulate sensors for finding mines, as used in Sotzing's multi-agent architecture [14]. Figure 6 shows this.

Figure 6: Multiple vehicles running completely synthetically, cooperating and collaborating to complete a mission more efficiently.

C. Autonomous tracking for pipeline inspection

Oil companies are showing increasing interest in AUV technologies for improving large field oil availability and, therefore, production. It is known that Inspection, Repair and Maintenance (IRM) comprises up to 90% of the related field activity, and such inspection is clearly dictated by vessel availability. One analysis of potential cost savings considers using an inspection AUV: the predicted savings over traditional methods for inspecting a pipeline network system are up to 30%.

Planning and control vehicle payloads, such as the AUTOTRACKER payload [15], can provide such capabilities. However, as mentioned, vessel availability and off-shore operation costs make these types of payloads a difficult technology to evaluate. ARF can provide simulated sidescan sonar sensors for rendering synthetically generated pipelines. These capabilities provide a transparent interface for the correct and low-cost debugging of the tracking technologies.

D. Simultaneous localisation and mapping

Simultaneous localisation and mapping (SLAM) is tested using simulated sonar from the ARF virtual environment. Feature detection is then executed on the simulated sonar output. SLAM estimates the position of the AUV based upon the detected feature positions and tank corner positions relative to the vehicle. Figure 3 shows SLAM being used to test the SAUC-E competition entry AUV (Nessie 2).

VIII. CONCLUSIONS AND FURTHER WORK

The Augmented Reality Framework provides a generic architecture for creating virtual environments. The architecture is both flexible and extendible because JavaBeans and XML are at the core of its design. Currently, ARF has many utility classes which allow for the creation of mixed reality testing scenarios for platforms in the underwater domain. Further work is required to provide a richer set of utility components for other domains, but this will grow as a consequence of more exposure and use of ARF.

The basic functionality has already been demonstrated in the real-world applications discussed earlier. Further research is being carried out in the field of more accurate sensor simulation and in extending the component hierarchy.

REFERENCES

[1] P. Ridao, E. Batlle, D. Ribas and M. Carreras, "NEPTUNE: a HIL simulator for multiple UUVs", in Proc. OCEANS '04 MTS/IEEE TECHNO-OCEAN '04, vol. 1, pp. 524-531, 9-12 Nov. 2004.
[2] D. M. Lane, G. J. Falconer, G. Randall and I. Edwards, "Interoperability and synchronisation of distributed hardware-in-the-loop simulation for underwater robot development: issues and experiments", in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA 2001), vol. 1, pp. 909-914, 2001.
[3] S. K. Choi and J. Yuh, "A virtual collaborative world simulator for underwater robots using multidimensional, synthetic environment", in Proc. IEEE Int. Conf. on Robotics and Automation (ICRA 2001), vol. 1, pp. 926-931, 2001.
[4] "OceanSHELL: an embedded library for distributed applications and communications", Ocean Systems Laboratory, Heriot-Watt University.
[5] R. T. Azuma, "A survey of augmented reality", Presence: Teleoperators and Virtual Environments, vol. 6, no. 4, pp. 355-385, August 1997.
[6] P. Milgram, H. Takemura, A. Utsumi and F. Kishino, "Augmented reality: a class of displays on the reality-virtuality continuum", SPIE Proceedings: Telemanipulator and Telepresence Technologies, vol. 2351, pp. 282-292, 1994.
[7] R. Azuma, Y. Baillot, R. Behringer, S. Feiner, S. Julier and B. MacIntyre, "Recent advances in augmented reality", IEEE Computer Graphics and Applications, vol. 21, no. 6, pp. 34-47, Nov.-Dec. 2001.
[8] Java, http://www.java.com.
[9] JavaBeans, Sun Microsystems, http://java.sun.com/products/javabeans.
[10] Java3D, Sun Microsystems, http://java.sun.com/products/java-media/3D.
[11] C. Pêtrès, Y. Pailhas, P. Patrón, Y. Petillot, J. Evans and D. M. Lane, "Path planning for autonomous underwater vehicles", IEEE Transactions on Robotics, April 2007.
[12] P. Patrón, B. Smith, Y. Pailhas, C. Capus and J. Evans, "Strategies and sensor technologies for UUV collision, obstacle avoidance and escape", in Proc. 7th Unmanned Underwater Vehicle Showcase, September 2005.
[13] GREX project, http://www.grex-project.eu/.
[14] C. C. Sotzing, J. Evans and D. M. Lane, "A multi-agent architecture to increase coordination efficiency in multi-AUV operations", in Proc. IEEE Oceans 2007, Aberdeen, July 2007.
[15] P. Patrón, J. Evans, J. Brydon and J. Jamieson, "AUTOTRACKER: autonomous pipeline inspection: sea trials 2005", World Maritime Technology Conference - Advances in Technology for Underwater Vehicles, March 2006.