

Bio-inspired robotics for air traffic weather information management

Vinh Bui, Viet V. Pham, Antony W. Iorio, Jiangjun Tang, Sameer Alam and Hussein A. Abbass

Defence and Security Applications Research Center, University of New South Wales at the Australian Defence Force Academy, Northcott Drive, Canberra, Australia 2600

With the increase in automation to serve the growing needs and challenges of aviation, air traffic controllers (ATCs) are now faced with an information overload from a myriad of sources, both in graphical and textual format. One such source is weather information, which typically comprises wind speed, wind direction, thunderstorms, cloud cover, icing, temperature and pressure at various altitudes. This information requires domain expertise to interpret and communicate to ATCs, who then employ this information to manage air traffic efficiently and safely. Unfortunately, ATCs are not trained meteorologists, so there are significant challenges associated with the correct interpretation and utilization of this information by ATCs. In this paper, we propose a bio-inspired weather robot, which interacts with the air traffic environment and provides targeted weather-related information to ATCs by identifying which airspace sectors they are working on. It uses bio-inspired techniques for processing weather information and path planning in the air traffic environment, and is fully autonomous in the sense that it only interacts with the air traffic environment passively and has an onboard weather information processing system. The weather robot system was evaluated in an experimental environment with live Australian air traffic, where it successfully navigated the environment, processed weather information, identified airspace sectors and delivered weather-related information for the relevant sector using a synthetic voice.

Key words: air traffic management; bio-inspired robotics; image processing; path planning; weather information management.

Address for correspondence: Hussein A. Abbass, Defence and Security Applications Research Center, University of New South Wales at the Australian Defence Force Academy, Northcott Drive, Canberra, Australia 2600. E-mail: [email protected]

Figures 1, 6, 8, 10–14 and 16 appear in colour online: http://tim.sagepub.com

Transactions of the Institute of Measurement and Control 34, 2/3 (2012), pp. 291–317

© 2010 The Institute of Measurement and Control. DOI: 10.1177/0142331210366688


    1. Automation in air traffic management

Present-day air traffic is reaching its operational limits and accommodating air traffic growth is becoming a challenging task for air traffic controllers (ATCs) and air navigation service providers. There are clear needs to increase substantially the existing capacity of airspaces and to minimize disruptions that can be caused by poor weather, the cognitive load on ATCs and inefficient air routes (ICAO, 2002). As a result of these needs, new paradigms are being introduced that increase anticipatory capabilities (Alam, 2008; Erzberger, 2004). These approaches increase automation with the intention to support the human operators in the exploitation of timely and dynamic information on atmospheric hazards, traffic fluctuations and airspace utilization.

In current air traffic management (ATM) systems, an ATC is primarily responsible for ensuring safe separation among the aircraft in his/her sector as efficiently as possible. To achieve this task, the controller uses weather reports, voice communication with pilots and other sector controllers, flight strips (for historic and future flight paths), and a radar display that provides data on the current position, altitude, speed and track of all aircraft in a sector (Nolan, 2004).

In next-generation ATM systems, there will be a significant increase in automation. Several displays will be utilized, where much of the data presented on the displays will be presented in a graphical format (Azuma et al., 2000). Two next-generation ATM systems that integrate such new capabilities are the Single European Sky Air Traffic Research system (SESAR) in Europe (SESAR Consortium, 2007) and the Next Generation Air Transport System (NextGen) in the USA (Joint Planning and Development Office, 2007). SESAR and NextGen combine increased automation with new procedures to achieve safety, capacity, environmental, economic and security benefits.

These prospective aids incorporated into next-generation systems can make it possible for a controller in one sector to anticipate thunderstorms or conflicts down the line, and make appropriate adjustments to a flight route, thereby solving potential problems long before they occur. It is also expected that with increased automation there will be less voice communication, fewer tactical problems needing the controller's attention, a shift from textual to graphical information and an extended time frame for making decisions (Wickens, 1999; Wickens et al., 1998). However, in severe weather conditions, emergency situations or in the case of automation failure, the controller will

be able to take over and manage traffic.

Furthermore, increased automation has the capability both to compensate for a controller's cognitive vulnerabilities, such as the ability to construct complex mental maps of the monitored environment, and to better support and exploit a controller's cognitive strengths. When controllers are provided with accurate and timely information, they can be very effective at solving problems, but if such problem solving requires knowledge from other domains (e.g., meteorology, scheduling), problem solving

    becomes challenging (Broach, 2005). Of course, there are still challenges associated with


the approaches being investigated for next-generation ATM systems. For example, when controllers were asked to review operational requirements for future automation to assess the cognitive skills and abilities needed, they pointed out that translating and interpreting data from a myriad of sources would be extremely challenging (Broach, 2005). In particular, meteorological reports on weather phenomena that can impact flight operations are clearly an important component of such cognitive challenges.

    2. Automated integrated weather system

A regular supply of accurate, timely and reliable meteorological information is necessary for safe day-to-day aircraft operations (Sherry et al., 2001). Currently, in order to interpret meteorological information correctly, very specialized domain knowledge is required, which is burdensome to ATCs because they are already overloaded with responsibilities with respect to management, tracking and scheduling of flights (Evans, 2001). In addition, although multiple and overlapping weather data sources provide redundancy and completeness of information, they also present a number of challenges for ATCs: typically, weather data introduces additional screens for ATCs to monitor and increases the cognitive burden for the controller; weather data can divert the focus of controllers from time-critical air traffic control responsibilities; and weather data is highly domain specific, requiring domain expertise to interpret the data.

In addition to these challenges, controllers have to digest and fuse this information in the context of existing air traffic flow in order to form mental maps over time and space. In turn, this can eventually lead to cognitive overload, which can interfere with the controller's primary job of maintaining safe separation between aircraft.

One of the questions associated with the ATM system weather integration problem is how information can be provided to controllers in a timely and targeted manner while also not overloading them. Such an integrated system should be capable of fusing multiple data sources and other weather reports in time and space, and of generating useful meta-data, such as the likely presence of turbulence or the possibility of storm cells forming from known atmospheric conditions.

Such a system should also deliver the information to the relevant controller in a manner that does not overly burden the ATC cognitively, or distract the ATC from primary air traffic control duties. The information should be targeted appropriately, and communicated in a non-invasive fashion. Although overlaying graphical weather information on a flight control screen may be possible, presenting weather information with flight information on one screen may not be feasible because of an increase in clutter on an already information-overloaded air traffic radar display, which can further affect the cognitive capacity of human controllers. Moreover, all ATC systems are closed systems, in that other software is not permitted to interact with core air traffic software systems because of safety concerns and requirements. Of course, an ideal but impractical solution would be to


have an aviation meteorologist seated next to each and every controller in ATC centres, who can then continuously monitor the weather information and provide expert domain advice to controllers accordingly. Between these infeasible solutions exists an alternative, namely a passive automation system that does not interfere with the core ATC operations, and that can also address the stated objectives and concerns.

    3. Development of a bio-inspired weather robot

In light of the cognitive burden imposed by air traffic control screens and the infeasibility of integrating weather data systems with existing systems, we propose an alternative passive approach involving a collaboration between ATCs and a situated, embodied, autonomous robot that manages weather data and informs controllers of relevant meteorological hazards.

Broadly speaking, an agent system is embodied if it is structurally coupled to its environment (Ziemke, 2003). In other words, embodiment is a result of an interaction of an agent with its environment through co-adaptation of the agent and environment (Teo and Abbass, 2005). Embodiment provides benefits in terms of the dynamics of the system as a whole through this coupling. This subtle coupling of the robot with its environment allows it to communicate weather severity through its physical presence at ATC screens associated with a particular sector where there are high levels of weather disturbances. Furthermore, an independent platform that is not coupled with the closed ATC system avoids the safety and security issues mentioned in Section 2.

This collaborative approach is analogous to the collaboration between a meteorologist and ATCs mentioned earlier. The motivation for embodying an agent in this environment, as opposed to using a software-based solution, is to minimize cognitive distractions and invasive visual messages on critical screens associated with ATC operations.

In the weather robot system, we have developed bio-inspired techniques for path planning and real-time weather data processing. The robot interacts with the air traffic control environment and is capable of navigating in this environment using evolutionary computation applied to path planning. It also uses neural networks for weather data processing. The weather robot also has the capability to process images from its environment, namely the air traffic sectors on ATC screens, and can verbalize weather conditions associated with an ATC sector. The robot assists and affects ATCs with respect to weather phenomena related to the ATC sectors by moving around the control room and inspecting sector screens, identifying sectors and informing controllers verbally of the relevant weather alerts for a particular sector. In future development, the impact of the robot's physical presence will become more important as the robot will adaptively change its path navigation strategy according to weather severity in different sectors.

The weather robot has a number of components, which are highlighted in Figure 1: a user interface for manual control intervention; a decision making module, which consists of


a navigation module for planning paths and obstacle avoidance; an image recognition module for identifying ATC sector screens; a weather information module for gathering and interpreting weather data; a voice command module for issuing verbal instructions to the robot; a text-to-speech module for alerting controllers to relevant weather conditions and the robot's current activity; a communication module; and finally the hardware platform, which has the sensors, communication hardware and CPU. In the following section, we will provide further information about these components.

    4. Weather robot design

We chose to build our robot on a Targo platform (http://alife-robotics.co.jp). The particular computing platform used in this robot is an Intel-based industrial PC running Windows XP. It provides the computing resources necessary to achieve the stated goals of this project, and to implement a variety of state-of-the-art technologies from signal processing, speech recognition and image processing.

    4.1 Hardware platform

An Intel-based industrial PC with a Pentium MMX 1.6 MHz CPU, 1 GB RAM and a 10 GB HDD provides the robot with its computing capabilities. The computer interfaces with external peripherals through a PCI-PAZ 3305 Mini PCI interface and USB ports. The robot has two independent step-motor-driven wheels at the front and a balancing wheel (or castor) in the centre. As a result, it can turn and move freely in any direction with the

Figure 1 Modular design of the weather robot: a user interface and a decision making module sit above the navigation, image recognition, weather information, voice command, text-to-speech and communication modules, which run on the operating system (Windows XP) and the hardware platform (CPU, mobile platform, sensors, positioning, obstacle detector, imaging, WiFi, Bluetooth, Ethernet, audio, interfacing devices, communication devices and peripherals)


resolution of the step motor. To determine locations and to detect obstacles while moving, the robot relies on an optical sensor. It also has a wireless network link via both WiFi and public GSM networks. The robot hardware is illustrated in Figure 2.

    4.2 Navigation module

The navigation module is responsible for tracking the robot's location and navigating it towards a desired destination. Since the operating environment of the robot is dynamic, this module has the capability to correctly determine the robot's location while carrying out adaptive route planning and obstacle avoidance tasks.

    4.3 Image processing module

The weather robot has to be able to recognize air traffic sectors automatically from video images in an ATC centre. In order to achieve this task, an image from the robot camera is processed using the Sobel edge detection technique (Sobel and Feldman, 1968) before being fed into a sector detection algorithm that finds the corresponding sector,

Figure 2 The weather robot hardware, showing the main CPU, optical mouse-based location device and the wireless communication device


based on the principle of template matching. The details of the sector detection algorithm are discussed in Section 5.4.
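As an illustration of the Sobel step named above, the following Python sketch computes an edge map from a greyscale capture using OpenCV; the file name and the binarization threshold are placeholders, not values from the paper.

```python
import cv2
import numpy as np

# Minimal sketch of the Sobel edge-detection step described above.
# 'screen.png' and the threshold of 60 are illustrative placeholders.
img = cv2.imread("screen.png", cv2.IMREAD_GRAYSCALE)

# Horizontal and vertical gradients (3x3 Sobel kernels).
gx = cv2.Sobel(img, cv2.CV_64F, 1, 0, ksize=3)
gy = cv2.Sobel(img, cv2.CV_64F, 0, 1, ksize=3)

# Gradient magnitude, normalized to 0-255 for thresholding.
mag = cv2.magnitude(gx, gy)
mag = cv2.normalize(mag, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)

# Keep only strong edges; the result feeds the sector-detection step.
edges = (mag > 60).astype(np.uint8) * 255
cv2.imwrite("edges.png", edges)
```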

    4.4 Voice command module

The weather robot can communicate with ATC operators using simple voice commands. The voice command module carries out the speech recognition functionality, which translates natural language commands into machine-readable commands. To process voice commands, the Microsoft Speech Recognition Engine (http://www.microsoft.com/speech/speech2007/default.mspx) is used.

    4.5 Text to speech module

In this module, weather messages and meta-data are interpreted and announced by the robot to ATCs in natural language. Currently, we rely on the Microsoft Text to Speech Engine (http://www.microsoft.com/speech/speech2007/default.mspx) to carry out this task.

    4.6 Communication module

This module provides communication functionalities for the robot. It allows the robot to communicate with human operators by means of voice, video and short text messages (SMS). For this work, only voice communication is implemented.

    4.7 Decision making module

This module is responsible for deciding whether the robot should move, listen or speak to human ATCs by co-ordinating information from other modules, such as the robot location, the active air traffic sector and relevant weather information.

    4.8 Weather information system

A weather monitoring system typically consists of sensors (both airborne and ground-based) located in or near the terminal area, as well as local and regional forecast information. The main types of information generated by aviation weather monitoring systems are the SIGMET (Significant Meteorological Hazard Warning), AIRMET (Airmen's Meteorological Warning), TTF (Trend Type Forecast), TAF (Terminal Aerodrome Forecast), ARFOR (Low-level Area Forecast), Area QNH (air pressure), SIGWX (Significant Weather Charts) and grid-point wind, temperature and pressure forecasts (Lester, 2000).


The weather information system can retrieve and process SIGMET, ARFOR, and numerical grid-point wind, temperature and pressure data. This data consists of semi-structured and structured data. The system accesses the BoM (Bureau of Meteorology, Australia) website, downloads the HTML messages containing SIGMETs and parses them. The SIGMET parser extracts structured data from the SIGMET, identifying the region, weather phenomena, flight level and time of the event. This data is then stored in a database for retrieval and future updates.

The region information in a SIGMET is typically reported as the vertices of a polygon in latitude and longitude co-ordinates. Sometimes this data is presented as a mix of waypoint codes and latitude and longitude co-ordinates. ATCs may work on more than one sector in a day, depending on traffic load, and this can be burdensome, as the ATCs have to be familiar with the waypoints in each region.

The robot's onboard weather information system automates the extraction of the waypoint codes, converts this data to latitude and longitude co-ordinates, and can provide the most current SIGMET within a specified time window. Furthermore, the system extracts numerical grid-point wind, temperature and pressure data from different weather sources. In addition to the SIGMET and high-level numerical grid-point wind, temperature and pressure data, the weather and wind processing system also downloads and parses the low-level ARFOR (Area Forecast) data and extracts the wind direction and intensity at various flight levels for each area. Figure 3 illustrates the process flow for wind, atmospheric and weather data processing.
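To illustrate the kind of extraction involved, the sketch below pulls latitude/longitude vertices written in the compact form often used in SIGMET region descriptions (e.g. "S4700 E12800") and converts them to signed decimal degrees. The regular expression and sample message are assumptions for illustration, not the system's actual parser.

```python
import re

# Hypothetical sketch: pull polygon vertices such as "S4700 E12800" out of a
# SIGMET-style message and convert them to signed decimal degrees.
COORD = re.compile(r"([NS])(\d{2})(\d{2})\s+([EW])(\d{3})(\d{2})")

def to_decimal(hemisphere: str, degrees: str, minutes: str) -> float:
    value = int(degrees) + int(minutes) / 60.0
    return -value if hemisphere in ("S", "W") else value

def extract_vertices(message: str):
    vertices = []
    for ns, lat_d, lat_m, ew, lon_d, lon_m in COORD.findall(message):
        vertices.append((to_decimal(ns, lat_d, lat_m),
                         to_decimal(ew, lon_d, lon_m)))
    return vertices

sample = "SEV TURB FCST WI S4700 E12800 - S3200 E13600 - S4500 E14000"
print(extract_vertices(sample))
# [(-47.0, 128.0), (-32.0, 136.0), (-45.0, 140.0)]
```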

    4.9 Weather robot navigation

There are a number of techniques that can be used for mobile robot location and navigation, such as odometry and GPS. Since the weather robot is intended to operate only in an ATC centre, we chose to build our navigation module based on an optical mouse, as shown in Figure 4. The use of an optical mouse for mobile robot navigation has a number of advantages. Firstly, an optical mouse is a very low cost sensor. Secondly, it can accurately measure displacements independently of the kinematics of the robot by using external natural microscopic ground landmarks to obtain effective relative displacement data. Most importantly, this approach is capable of providing accurate location measurements for the problem at hand (Palacin et al., 2006).

The optical mouse also facilitates obstacle detection through sensing a change in location while the drive system is active. If a change in location does not occur when the drive system is active, then an obstacle is detected. Although this mechanism is simple, it does not provide the robot with the capability of avoiding obstacles without making physical contact. Therefore, combining it with a more sophisticated obstacle detection and avoidance mechanism, which relies on ultrasonic sensors, is an approach to solving this problem.
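A minimal sketch of the dead-reckoning and stall-based obstacle test described above, assuming the optical sensor reports per-sample displacements in millimetres; the class name, units and threshold are illustrative, not taken from the robot's implementation.

```python
import math

class MouseOdometry:
    """Dead-reckoning from optical-mouse displacements, with the simple
    stall-based obstacle test described above. Units and the stall
    threshold are illustrative assumptions."""

    def __init__(self, stall_threshold_mm: float = 0.5):
        self.x = 0.0          # estimated position (mm)
        self.y = 0.0
        self.heading = 0.0    # heading supplied by the drive controller (rad)
        self.stall_threshold = stall_threshold_mm

    def update(self, dx_mm: float, dy_mm: float, drive_active: bool) -> bool:
        """Integrate one displacement sample; return True if an obstacle is
        suspected (the wheels are driving but the robot is not moving)."""
        # Rotate the sensor-frame displacement into the room frame.
        self.x += dx_mm * math.cos(self.heading) - dy_mm * math.sin(self.heading)
        self.y += dx_mm * math.sin(self.heading) + dy_mm * math.cos(self.heading)
        moved = math.hypot(dx_mm, dy_mm) > self.stall_threshold
        return drive_active and not moved

odo = MouseOdometry()
print(odo.update(0.0, 0.0, drive_active=True))   # True: driving but no motion
print(odo.update(1.2, 0.0, drive_active=True))   # False: normal progress
```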


    5. Bio-inspired techniques in weather robot

The weather robot uses bio-inspired techniques for wind and atmospheric data processing as well as for navigation in a constrained environment. In the following subsections, the techniques used are described in more detail. To begin, we describe an approach in Section 5.1 using neural networks, which is used by the robot to extract data from weather images. The second component, described in Section 5.2, is responsible for navigating the robot using a bio-inspired dynamic path-planning technique utilizing a genetic algorithm.

    5.1 Neural networks for weather image processing

As shown in Figure 5, wind and atmospheric data in Australia is generated every 6 h by the Bureau of Meteorology in an image format. This data is unavailable in a textual

Figure 3 Process flow diagram for weather and wind data processing: structured SIGMET (HTML) data is parsed by extracting the location and related temperature, pressure and wind speed data based on the semantics of the domain; semi-structured grid-point wind speed, temperature and direction charts (PDF) are converted to images, the background is removed from the grid-point chart and the temperature, wind speed and direction are extracted from the image; ARFOR (HTML) chart data is handled similarly; finally the temperature, pressure, wind speed, front, storm, cloud, hail/ice and other atmospheric phenomena are fused with latitude/longitude co-ordinates and flight level


format, so this image needs to be converted into a text format for further machine processing. Each image (a map of Australian airspace) is divided into grid cells based on latitude and longitude. The size of each cell is 5° × 5°. Each cell has wind and atmospheric information for different altitudes. This information is in the format ddfffTT, where dd are the two digits for wind direction, fff are the three digits for wind speed and TT are the two digits for temperature.
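For illustration, the sketch below decodes a single ddfffTT cell string. The text does not state how the two direction digits or the temperature sign are encoded, so the scaling to tens of degrees and the sign handling are assumptions.

```python
def decode_cell(code: str):
    """Decode a ddfffTT wind/temperature cell string.

    dd  - two digits for wind direction (assumed to be tens of degrees),
    fff - three digits for wind speed,
    TT  - two digits for temperature (sign convention not given in the text).
    """
    if len(code) != 7 or not code.isdigit():
        raise ValueError(f"expected 7 digits, got {code!r}")
    direction = int(code[0:2]) * 10   # e.g. '02' -> 20 degrees (assumed)
    speed = int(code[2:5])            # wind speed
    temperature = int(code[5:7])      # temperature magnitude
    return direction, speed, temperature

print(decode_cell("0213048"))   # (20, 130, 48)
```

Under these assumptions the string "0213048" decodes to a direction of 20 degrees, a speed of 130 and a temperature of 48, which matches the figures in the announcement quoted in Section 7.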

In general, the problem is seen as a pattern recognition problem where the patterns the robot needs to identify are the digits from 0 to 9. The problem of converting this image into a system-readable text file is dealt with in two stages: in the first stage we filter out the background clutter (the map of Australia), and localize and segment the image based on the location of individual digits. In the second stage, an offline-trained neural network is used to identify the digits in wind and atmospheric image files.

5.1.1 Pre-processing of weather images: Before we can train a neural network to identify digits in wind and atmospheric images, several pre-processing steps need to be performed in order to generate the cases used for training. The map of Australia overlaps a number of digits, and the size (width and height) of one digit sometimes varies from one to another. Utilizing several wind and atmospheric image files, the

    Figure 4 Optical mouse-based weather robot navigation hardware


unchanging background pixels associated with the grid and map of Australia are identified. Following this, we acquire the digits from each image by subtracting the background image and drawing another, automatically constructed grid on the image in order to segment the digits. The segmentation grid is generated by iterating over every pixel row and column of the image; the rows and columns that become part of the segmentation grid are those containing white pixels only. If there are several consecutive white rows or columns, we consider them as one. The grid then allows us to segment the individual digits with high precision. This segmented set of digits is then used in the second stage, where a neural network is used to recognize digits in new wind and atmospheric images. The size of the training set used by the neural network is 317, where the number of templates for each digit from 0 to 9 is 29, 16, 25, 52, 19, 31, 50, 9, 30 and 56 respectively. These numbers reflect the variations associated with the font used in the image for the digits.
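A minimal numpy sketch of this segmentation rule, assuming the chart has already been background-subtracted and binarized so that 255 is white; the helper names and the toy example are ours.

```python
import numpy as np

def blank_runs(is_blank: np.ndarray):
    """Collapse consecutive all-white rows/columns into single cut positions,
    mirroring the 'several consecutive white rows count as one' rule above."""
    cuts, in_run = [], False
    for i, blank in enumerate(is_blank):
        if blank and not in_run:
            cuts.append(i)
            in_run = True
        elif not blank:
            in_run = False
    return cuts

def segment_digits(img: np.ndarray):
    """Split a background-subtracted chart (2D array, 255 = white) into
    digit tiles using rows/columns that contain only white pixels."""
    white = img == 255
    row_cuts = blank_runs(white.all(axis=1))
    col_cuts = blank_runs(white.all(axis=0))
    tiles = []
    for r0, r1 in zip(row_cuts, row_cuts[1:]):
        for c0, c1 in zip(col_cuts, col_cuts[1:]):
            tile = img[r0:r1, c0:c1]
            if (tile != 255).any():          # keep only tiles containing ink
                tiles.append(tile)
    return tiles

# Toy example: two 'digits' (black blocks) separated by white gutters.
demo = np.full((12, 17), 255, dtype=np.uint8)
demo[2:10, 2:7] = 0
demo[2:10, 10:15] = 0
print(len(segment_digits(demo)))   # -> 2
```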

5.1.2 Training the neural network: We used a set of neural networks that are trained separately to recognize each digit. Each trained network has one hidden layer with 10 units and one output. The output reports true if an image of a digit is matched, otherwise it reports false. The architecture of the 10 neural networks

Figure 5 Wind and atmospheric image file retrieved by the weather robot


is presented in Figure 6. The neural network parameters are reported in Table 1. As shown in the table, the network type we use is feed-forward back-propagation (Heaton, 2005). In a feed-forward neural network, neurons are only connected forward. Each layer of the neural network contains connections to the next layer (e.g., from the input to the hidden layer), but there are no connections back. Back-propagation is a form of supervised training. When using a supervised training method, the network must be provided with both sample inputs and anticipated outputs. The anticipated outputs are compared against the actual outputs for a given input. Using the anticipated outputs, the back-propagation training algorithm then takes the calculated error and adjusts the weights of the various layers backwards, from the output layer to the input layer.

The transfer function used in the network is the tansig function. This function transfers the weighted sum of the input nodes to the output, as shown in Equation (1):

$$y = f(a) = f\left(\sum_{i=1}^{n} x_i w_i\right) \qquad (1)$$

Table 1 Parameters for the two neural network models

Parameter            Value
Network type         Feed-forward back-propagation
Hidden layer         Transfer function is tansig
Output layer         Transfer function is tansig
Goal                 0
Epochs               100
Momentum constant    0.95
Learning rate        0.01

Figure 6 The architecture of the 10 neural networks with one output: 84 inputs feed a 10-unit hidden layer through the input weights IW[1,1] and bias b[1], which in turn feeds the single output through the layer weights LW[2,1] and bias b[2]


where x_i (i = 1, ..., n) are the elements of an input vector x, x_0 is a threshold input, w_i is the weight from input i to the neuron, and f(a) is the tansig function given in Equation (2):

$$f(a) = \frac{2}{1 + e^{-2a}} - 1 \qquad (2)$$

Goal, epochs and momentum constant (MC) are training parameters. Training stops when the maximum number of epochs is reached or the performance goal is met. The MC defines the amount of momentum. MC is set between 0 (no momentum) and values close to 1 (lots of momentum). An MC of 1 results in a network that is completely insensitive to the local gradient and, therefore, does not learn properly. The learning rate for both input weights and biases is 0.01. This constant is used in error back-propagation learning and other artificial neural network learning algorithms to control the speed of learning.

The training set for each neural network consists of the set of template images for all 10 digits, even though a neural network is trained to recognize a specific digit only. This is necessary so that each network can see negative as well as positive examples in order to distinguish correctly between digits. The 84-bit strings used in training originate from the binary matrix representing the digit image, where the width of the matrix is 7 and the height is 12. In Figure 7, examples of this binary matrix are presented for digit 0.
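The following scikit-learn sketch mirrors the set-up of Table 1 (tanh is the same function as tansig): ten small networks, each trained on all templates with the target set to 1 only for its own digit. The random arrays stand in for the real 317 flattened 7 × 12 templates; this is an illustration, not the authors' code.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Stand-in training data: in the real system X holds the 317 flattened
# 7x12 binary templates (84 bits each) and y holds the digit labels.
X = rng.integers(0, 2, size=(317, 84)).astype(float)
y = rng.integers(0, 10, size=317)

# One network per digit, trained as a true/false detector for that digit,
# with the Table 1 parameters: tanh (~tansig), 10 hidden units, 100 epochs,
# momentum 0.95 and learning rate 0.01.
detectors = []
for digit in range(10):
    net = MLPClassifier(hidden_layer_sizes=(10,), activation="tanh",
                        solver="sgd", learning_rate_init=0.01,
                        momentum=0.95, max_iter=100, random_state=0)
    net.fit(X, (y == digit).astype(int))
    detectors.append(net)
```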

5.1.3 Using neural networks for classifying weather images: Once the neural network is trained, it can then be used to process new grid-point wind and atmospheric data images. When a new image arrives for processing, we perform the same preliminary pre-processing steps that were used before training. After this, the segmented digits can be recognized using the trained neural networks. In order to recognize one new image of a digit, all 10 trained neural networks are used. If only one of the neural networks has the

Figure 7 Five samples of digit 0, each represented as a 7 × 12 binary matrix (84 bits) in which 1 marks a pixel belonging to the digit


output of true during classification, the digit that this neural network is responsible for identifying is deemed the digit corresponding to the new image; otherwise the new image is deemed to be unrecognized. The neural networks are applied to each digit in the image until there are no more digits to classify. The wind, atmosphere and weather information extracted from the image is then integrated to give a complete picture. The graphical representation of this meta-data generated by processing weather and wind information is then visualized as given in Figure 8.
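Continuing the hypothetical sketch above, the acceptance rule (exactly one network reports true, otherwise the digit is unrecognized) can be written as follows, where detectors is the list of per-digit networks from the previous sketch.

```python
def classify_digit(detectors, tile_bits):
    """Return the recognized digit, or None if the tile is unrecognized.
    Follows the rule above: accept only when exactly one of the ten
    per-digit networks reports true for the 84-bit input."""
    votes = [d for d, net in enumerate(detectors)
             if net.predict([tile_bits])[0] == 1]
    return votes[0] if len(votes) == 1 else None
```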

    5.2 Evolutionary computation for path planning

The second bio-inspired component of the weather robot is the path-planning module. Before we introduce our approach to path planning, we will discuss some of the limitations of traditional path-planning approaches.

In traditional path planning used in robotics, terrain is often represented as a grid of cells (Barraquand et al., 1992). These cells include forbidden cells that the robot cannot pass through and allowed cells that the robot can pass through. As a result, the path-planning problem can be reformulated as a problem of finding paths in a graph, where the graph vertices are the cells, and the edges are the connections between neighbouring cells.

Figure 8 Graphical output of meta-data generated by processing weather and wind information


The main issue with this type of approach is that the path found is usually not smooth, as it is a list of neighbouring cells from the origin to the destination. Hence, the robot usually has to change its direction when it goes from one cell to the next. Moreover, when the grid has a high resolution, it is very time consuming to find a path using traditional algorithms. For example, the computational complexity of Dijkstra's algorithm is O(|E| + |V| log |V|), where |E| is the number of edges of the graph and |V| is the number of vertices of the graph.

We try to avoid the above issue by using a continuous terrain representation. In particular, terrain is represented as a continuous space in which obstacles are bounded by polygons represented as lists of points with real co-ordinates. With this representation, we can increase the smoothness of the found paths. Unfortunately, traditional path finding algorithms will not work with this continuous representation, as the number of cells is, for all intents and purposes, almost infinite. Therefore, we employ a genetic algorithm (GA), which is a bio-inspired technique based on Darwin's principle of natural selection, often used for optimization and search problems. GAs work by firstly initializing a random population of candidate solutions, which can be represented by a binary string of 0s and 1s, or a vector of real variables. This population of solutions is then evaluated according to some measure of fitness or utility. Following this, a new offspring population is produced by using genetic operators such as crossover and mutation. The offspring are then evaluated and selected for the next generation based on some fitness criteria. The algorithm continues over a number of generations until a preset maximum number of generations is reached, or a solution with acceptable fitness values is found.

Specifically, the NSGA-II algorithm (Deb et al., 2002) is used in this work to find obstacle-free shortest paths for the robot. Although NSGA-II is designed for multi-objective optimization, in this work we have only utilized a single objective, which is minimizing the path length for the robot from its origin to its destination. We decided to use this algorithm for this single-objective problem because it was already integrated into an existing framework for path planning, which incorporates objectives such as minimizing the risk associated with the robot colliding with obstacles. In addition, NSGA-II is a very robust algorithm and provides advantages for single-objective optimization because of its diversity preserving mechanism. In our work, the NSGA-II procedure is implemented repeatedly until the termination criterion is met. Usually, the NSGA-II procedure is continued for a predefined number of iterations (T_max).

5.2.1 Chromosome representation and evaluation: In our approach, one individual in the GA population is a path consisting of a list of waypoints, which the robot visits on the way from the origin to the destination. In this paper, we use NSGA-II with some modifications in the representation of a candidate path, and some necessary constraint checking associated with the feasibility of a path.


Each 2D point in the space is represented using two values, one along each of the (X, Y) axes. Therefore, if there are n points, then one solution will include 2n real-parameter variables, as a two-tuple for each point in a successive manner, as shown in Equation (3):

$$\underbrace{X_1, Y_1}_{P_1},\ \underbrace{X_2, Y_2}_{P_2},\ \ldots,\ \underbrace{X_n, Y_n}_{P_n} \qquad (3)$$

When computing the entire path from start to end, the start and end points are added to the above list at the beginning and at the end, respectively.
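A small sketch of this encoding with hypothetical helper names: the n intermediate waypoints are flattened into a 2n-variable chromosome, and the fixed start and end points are re-attached when the full path is rebuilt.

```python
from typing import List, Tuple

Point = Tuple[float, float]

def encode(waypoints: List[Point]) -> List[float]:
    """Flatten n intermediate waypoints into the 2n-variable chromosome
    (X1, Y1, X2, Y2, ..., Xn, Yn) of Equation (3)."""
    return [coord for point in waypoints for coord in point]

def decode(chromosome: List[float], start: Point, end: Point) -> List[Point]:
    """Rebuild the full path: start + intermediate waypoints + end."""
    middle = [(chromosome[i], chromosome[i + 1])
              for i in range(0, len(chromosome), 2)]
    return [start] + middle + [end]

genes = encode([(1.0, 2.0), (3.0, 1.5)])
print(decode(genes, start=(0.0, 0.0), end=(5.0, 0.0)))
```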

The robot must not collide with obstacles; therefore every candidate path for the robot in one population is checked to see that it is feasible (a feasible path does not collide with obstacles). We model our weather robot as a bounding box with a width of 50 cm. To check this condition, one bounding box is created between every two successive points, which represents the width of the robot's bounding box. If all bounding boxes corresponding to one path do not intersect with any polygon representing an obstacle, the path satisfies the feasibility constraint. In Figure 9, although the path line connecting points P1, P2, P3, P4 does not intersect with any polygon, the path does not satisfy the feasibility constraint because the bounding boxes of the path intersect with a polygon corresponding to an obstacle. The bounding boxes associated with a path do not intersect with any polygon representing an obstacle if and only if no edge of an obstacle polygon

Figure 9 Bounding boxes for the robot path connecting points P1, P2, P3 and P4


intersects any edge of the path bounding boxes, every point of the obstacle polygons lies outside the path bounding boxes, and every point of the path bounding boxes lies outside the obstacle polygons.
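One way to implement this feasibility test is sketched below using Shapely: every segment of the path is widened into a flat-capped rectangle of the robot's 50 cm width and checked against each obstacle polygon. This is an illustrative sketch under those assumptions, not the authors' implementation.

```python
from shapely.geometry import LineString, Polygon

ROBOT_WIDTH = 0.5  # metres (the 50 cm bounding box mentioned above)

def path_is_feasible(path, obstacles):
    """Return True if no segment's bounding box touches any obstacle.

    path      - list of (x, y) points, start and end included
    obstacles - list of shapely Polygons
    """
    for p, q in zip(path, path[1:]):
        # A flat-capped buffer of half the robot width around the segment
        # gives the oriented bounding box described in the text.
        box = LineString([p, q]).buffer(ROBOT_WIDTH / 2.0, cap_style=2)
        if any(box.intersects(obs) for obs in obstacles):
            return False
    return True

obstacle = Polygon([(2, -0.1), (3, -0.1), (3, 1), (2, 1)])
print(path_is_feasible([(0, 0), (5, 0)], [obstacle]))            # False
print(path_is_feasible([(0, 0), (2.5, 2), (5, 0)], [obstacle]))  # True
```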

In our optimization procedure, if one individual does not satisfy the feasibility constraint, a very large penalty is added to its fitness value in order to remove the possibility of its selection in the following generation; otherwise, the fitness is assigned as the length of the overall path. The value associated with path length should be as small as possible, because we are interested in minimizing the path length. The length of the path is calculated by summing the distances between the consecutive discrete points of the path. If we assume the path consists of a list of points P_0, P_1, ..., P_{n+1}, where P_0 is the origin and P_{n+1} is the destination, the length of the path is given by Equation (4):

$$\sum_{i=0}^{n} \left\| P_i P_{i+1} \right\| \qquad (4)$$
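Putting the pieces together, a hedged sketch of the fitness evaluation: the path length of Equation (4) plus a large constant penalty when the feasibility check fails. path_is_feasible refers to the hypothetical helper sketched above, and the penalty value is an assumption, since the paper only calls it "very large".

```python
import math

PENALTY = 1.0e6  # 'very large' penalty; the actual value is not given

def path_length(path):
    """Equation (4): sum of distances between consecutive points."""
    return sum(math.dist(p, q) for p, q in zip(path, path[1:]))

def fitness(path, obstacles, feasible):
    """Length of the path, heavily penalized if it is infeasible.
    'feasible' is a callable such as the path_is_feasible sketch above."""
    value = path_length(path)
    if not feasible(path, obstacles):
        value += PENALTY
    return value
```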

5.2.2 Genetic operators: For recombination, we use the standard SBX operator (Deb et al., 2007) with crossover probability p_c = 0.9 and a distribution index of η_c = 15. The SBX operator is the simulated binary crossover, whose search power is similar to that of the single-point crossover used in binary-coded GAs. Simulation results on a number of real-valued test problems of varying difficulty and dimensionality suggest that real-coded GAs with the SBX operator are able to perform as well as, or better than, binary-coded GAs with the single-point crossover. SBX is found to be particularly useful in problems like the one at hand. Furthermore, a simulation on a two-variable blocked function shows that the real-coded GA with SBX works as suggested by Goldberg, and in most cases the performance of the real-coded GA with SBX is similar to that of binary GAs with a single-point crossover (Deb and Agrawal, 1995).

Equation (5) details the mutation operator employed by our genetic algorithm (Deb and Goyal, 1996):

$$y_i = x_i + \left(x_i^U - x_i^L\right)\delta_i, \qquad
\delta_i = \begin{cases}
\left(2 r_i\right)^{1/(\eta_m + 1)} - 1 & \text{if } r_i < 0.5 \\
1 - \left[2\left(1 - r_i\right)\right]^{1/(\eta_m + 1)} & \text{otherwise}
\end{cases} \qquad (5)$$

where x_i is the value of the ith parameter selected for mutation; y_i is the result of the mutation; x_i^L and x_i^U are the lower bound and the upper bound of x_i respectively; r_i is a random number in [0, 1]; and η_m is a control parameter (η_m = 20 in our study).
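A direct transcription of Equation (5) in Python, with eta_m = 20 as in the study; the bounds in the usage example are placeholders.

```python
import random

ETA_M = 20.0  # control parameter eta_m used in the study

def polynomial_mutation(x, lower, upper, eta_m=ETA_M, rng=random):
    """Mutate one real-valued gene according to Equation (5)."""
    r = rng.random()
    if r < 0.5:
        delta = (2.0 * r) ** (1.0 / (eta_m + 1.0)) - 1.0
    else:
        delta = 1.0 - (2.0 * (1.0 - r)) ** (1.0 / (eta_m + 1.0))
    return x + (upper - lower) * delta

random.seed(1)
print(polynomial_mutation(2.0, lower=0.0, upper=5.0))
```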

5.2.3 Objective function: The objective function is to find a shortest path for the robot from a given start point to an end point in the environment. Thus the objective


function is to minimize Equation (4), where P_0 is the origin, P_{n+1} is the destination, and P_1 to P_n are the middle points extracted from a chromosome.

    5.3 Air traffic flow management

The weather robot monitors a realistic air traffic control environment, which is the Trajectory Optimization and Prediction for Live Air Traffic (TOP-LAT) system (Figure 10). This system was co-developed by the authors and can provide real-time, full situation awareness of the airspace (control zones, sectors, terminal airspace, special use airspace), which includes air traffic flow, aviation emissions, airspace complexity and safety indicators. TOP-LAT synthesizes and integrates all this information into an interactive graphical user interface, which assists users (airline operation centres, air traffic flow management centres) in making air traffic flow management decisions. TOP-LAT provides users with a clear view of the sectors in each

Figure 10 Air traffic flow management environment (TOP-LAT) showing all the sectors in the Australian airspace


category, and the interface provides options for choosing individual sectors that can be monitored by ATCs.

    5.4 Sector image processing

The weather robot takes a picture of the controller's screen to identify which sector the controller is handling. In order to detect a sector, the colour picture is first transformed into a grey-scale image and the histogram of the grey-scale image is built. The image is then adjusted to display only the interesting curves and points, e.g., the boundary of the sector in white and the image background in black, by limiting the upper and lower grey levels (between 0.98 and 0.92 in our study) of the histogram.

Equations (6)–(8) illustrate the transformation process, in which C_i is the grey level of the ith pixel before the transformation, R_i, G_i, B_i are the red, green and blue portions of the pixel, and W_R, W_G, W_B are the corresponding weights:

$$C_i = R_i \times W_R + G_i \times W_G + B_i \times W_B \qquad (6)$$

$$C_{\min} = \min(C_i), \qquad C_{\max} = \max(C_i) \qquad (7)$$

$$C_i' = \frac{(C_i - C_{\min}) \times 255}{C_{\max} - C_{\min}} \qquad (8)$$

Blob detection and extraction (Lindeberg, 1994) is then applied to remove the noise points in the adjusted image and to extract the blob of a sector. At the stage of blob detection and extraction, a label is applied to the sector.

The highlighted sector is then filled with white colour for pixel distance comparison. Based on the boundary (M × N) of the sector, the image is cropped to the required size (M × N) fitting the sector before it is translated to a 2D matrix (M × N) of Boolean values, where true or false identifies the pixel as white or black respectively. The obtained Boolean matrix is then used to search for the matching sector in a previously stored database of sector boundaries and locations. The various stages in sector image processing are illustrated in Figure 11.
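A condensed sketch of Equations (6)–(8) followed by the Boolean-matrix comparison against a stored template. The luminance weights and the structure of the sector database are assumptions for illustration; the paper does not give W_R, W_G, W_B or the matching score.

```python
import numpy as np

# Illustrative luminance weights; the paper does not give W_R, W_G, W_B.
W_R, W_G, W_B = 0.299, 0.587, 0.114

def to_sector_mask(rgb: np.ndarray) -> np.ndarray:
    """Equations (6)-(8): weighted greyscale, min-max normalization to 0-255,
    then keep only the grey levels between 0.92 and 0.98 of full scale."""
    grey = rgb[..., 0] * W_R + rgb[..., 1] * W_G + rgb[..., 2] * W_B      # (6)
    c_min, c_max = grey.min(), grey.max()                                  # (7)
    norm = (grey - c_min) * 255.0 / (c_max - c_min)                        # (8)
    return (norm >= 0.92 * 255) & (norm <= 0.98 * 255)  # Boolean sector mask

def best_matching_sector(mask: np.ndarray, database: dict) -> str:
    """Pick the stored sector whose Boolean template agrees with the mask on
    the most pixels. 'database' maps sector names to same-sized Boolean
    templates (an assumed layout for the stored sector boundaries)."""
    return max(database, key=lambda name: np.sum(mask == database[name]))
```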

    6. Validation

To evaluate the functionalities of the robot in the air traffic control environment, a number of experiments were conducted. In particular, two main functionalities of the weather robot were tested: (1) navigation in a constrained dynamic environment; and (2) recognizing ATC sectors, interpreting weather messages for that sector and communicating (through a synthetic voice) relevant weather information to the controller managing the sector.


Figure 11 (a) Sector image captured by the robot in the ATC environment; (b) grey image generated by the image processing algorithm; (c) black and white processed image of sector boundaries generated by the image processing algorithm; (d) image after blob extraction; (e) sector filled with white colour by the image processing algorithm; (f) identified sector image produced by the image processing algorithm


    6.1 Navigation in a constrained environment

We have used a live air traffic environment for the Australian airspace from the TOP-LAT system. Four ATC positions were set up in our Air Traffic Research Laboratory. The task of the weather robot is to navigate successfully in this constrained environment, identify each controller's position, and aim at the sector image on the controller's screen. The robot moves continuously in this environment (Figure 12), going from one controller's position to another. Timings were recorded and the distance covered from one position to another was measured. We note that this is a small set-up compared with a full air traffic operation centre, where computational timings and distances covered by the weather robot will be of much greater importance.

In this path finding experiment, we chose to use 10 intermediate points in a path, i.e., n = 10 (see Equation (4)). The number of generations was 100. The population in each generation consisted of 1000 candidate paths. It took approximately 35 s to find a path for the weather robot.

    Figure 12 Weather robot navigating in the ATC environment


Figure 13 shows the path found by the algorithm (path P) and the shortest path (path S). It can be seen that the path found by the algorithm is close to the shortest path. The turning points between the origin and the destination in path P are points extracted from the best chromosome in the final population. The two objects (ATC environment hardware) in the centre (with red boundaries) are obstacles, which the robot needs to navigate around.

We can see that the algorithm converges after about 80 generations, as shown in Figure 14. The standard deviation of the fitness values of individuals in a population decreases quickly; it is around 10 cm after 80 generations.

    6.2 Sector recognition and interpreting weather data

The other task of the robot is to capture the sector image and correctly identify, in a reasonable time, which sector a controller is managing. This is followed by interpreting weather-related messages for the identified sector and delivering them to the relevant controller.

Figure 13 An example robot navigation path in the given ATC environment


As shown in Figure 12, there are four ATC positions working on different sectors. The weather robot is given a start point and an end point. The robot then continuously moves from start to end, stops at each controller position, identifies the sector, makes weather announcements and then moves on to the next controller's position.

The sector information was then combined with the weather information processed by the neural network to formulate the weather message. In our experiment, the accuracy of the image recognition task achieved with the neural network was 100%. The sample final text output of the image recognition module is illustrated in Figure 15, which fully matched the sample input data given in Figure 5.

    6.3 Metrics and measures

The following measures and metrics were recorded in the experiment:

Distance (m) from start to end, and distance travelled by the weather robot.
Time (s) to navigate from start to end.
Time (s) at a controller's position.
Time (s) to identify a sector.
Time (s) to make weather announcements.
Correct identification of the sector.
Correct interpretation of weather data pertaining to the recognized sector.
Delivery of weather announcements, in synthetic voice, to the relevant ATC.

Figure 14 Mean and standard deviation of the fitness values through 100 generations, Seed 1 (x-axis: generations; y-axis: fitness values, ×10^4; curves: mean and STDEV)


Three experiments with four different sectors were conducted. There were 12 sectors to be recognized and 12 different weather data items to be processed by the weather robot and announced at the 12 corresponding controller positions.

    7. Results and discussions

The weather robot successfully navigated in the air traffic environment from its given start position to its end position. The distance from start to end position was 2.52 m. The robot travelled 4.25 m, which included movements at each controller's position. The average total time the robot took from start to end was 54 s, with a standard deviation of ±1 s. However, the robot has five speed settings and can run five times faster, but was limited because of safety concerns. The weather robot correctly identified the controller's position in each instance and performed the correct positioning of the

Figure 15 Processed wind and atmospheric data text generated by the neural networks


camera with respect to the sector. It spent 59 ± 20 s on average at each controller's position. The time at a controller's position included the time to take a picture of the sector, identify the sector, process the weather message pertaining to that sector and vocalize the relevant weather conditions. On average, the weather robot took 25 ± 10 s to identify the sector image, and 30 ± 10 s to make the announcements.

The robot successfully recognized all of the 12 sector images displayed on the ATC screens and correctly identified the weather information pertaining to each sector. A video of the robot in motion during the experiments can be seen at http://www.itee.adfa.edu.au/;alar/tmag/. Figure 16 shows the weather robot at an ATC position processing a sector image.

As an example, when the weather robot successfully recognized the YBBB/YMMM/HUON sector over Tasmanian airspace, it made the following weather announcement (speech converted to text):

Weather Information for YBBB/YMMM/HUON Sector Location: 45S 140E Wind Direction: 20 Degrees At Flight Level: 300 Wind Speed: 130 knots Temperature: 48 degrees centigrade Barometric Pressure: 300 Hectopascal SIGMET Validity Period from 8/19/2009 3:00:00 PM to 8/19/2009 8:30:00 PM Severe turbulence

    Figure 16 Weather robot processing sector image


    Location: Latitude 47 degrees to 32 Degrees and Longitude 128 Degrees to 136 Degrees Movement: East Movement Speed: 25 Knots.

An audio clip of the weather message announced by the weather robot to the controllers for the given sector can be listened to at: http://www.itee.adfa.edu.au/;alar/tmag/.

    8. Conclusions

In this paper, we proposed a weather robot that uses bio-inspired techniques for weather image processing and navigation in a constrained ATM environment. The robot is capable of navigating around ATC positions and finding shortest paths from one location to another while avoiding obstacles. It uses neural networks for weather image data processing, and edge-detection algorithms to identify airspace sectors on an ATC screen. The robot can assist human ATCs with respect to weather phenomena associated with the controller's working sector by announcing the relevant weather information to the controller in natural language. In our experiments, the weather robot successfully navigated in a simulated ATM environment, correctly identified the sectors and made appropriate weather announcements

based on relevant sector weather information.

Although the main intention of our robot system is to demonstrate the practical synthesis and engineering of a robotic system in ATM environments, we are interested in improving the bio-inspired techniques employed in our robot, such that they can function with greater efficiency in the ATC domain. To this end, an analysis of the topology of the search spaces involved in our domain, and of the relationship between these search spaces and our chosen algorithms, would be desirable. We leave this as a matter for future work, as our project continues.

In future work, we will also further explore the effect of co-adaptation and learning within the environment. In addition, high-risk sectors with regular weather phenomena, such as thunderstorms and hailstorms, will be regularly attended to by the weather robot. The navigation strategy of the weather robot within an air traffic control centre will also be improved, such that the robot's path planning will be based on the prioritization of weather reports and any critical weather phenomenon can be delivered to the required controller in the most efficient and timely manner. We will also conduct human factors experiments in which the effect of the weather robot on the cognitive workload of the controller will be investigated, as well as the role that implicit embodiment plays in the communication of weather alerts.


    References

Alam, S. 2008: Evolving complexity towards risk: a massive scenario generation approach for evaluating advanced air traffic management concepts. PhD thesis, School of Information Technology & Electrical Engineering, University of New South Wales.

Azuma, R., Neely, H., Daily, M. and Geiss, R. 2000: Visualization tools for free flight air-traffic management. IEEE Computer Graphics and Applications, 32–6.

Barraquand, J., Langlois, B. and Latombe, J.-C. 1992: Numerical potential field techniques for robot path planning. IEEE Transactions on Systems, Man, and Cybernetics 22, 224–41.

Broach, D. 2005: A singular success: air traffic control specialist selection 1981–1992. In: Kirwan, B., Rodgers, M. and Schafer, D., editors. Human factors impacts in air traffic management, 177–205.

Deb, K. and Agrawal, R.B. 1995: Simulated binary crossover for continuous search space. Complex Systems 9, 115–48.

Deb, K. and Goyal, M. 1996: A combined genetic adaptive search (GeneAS) for engineering design. Computer Science and Informatics 26, 30–45.

Deb, K., Pratap, A., Agarwal, S. and Meyarivan, T. 2002: A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation 6, 182–97.

Deb, K., Sindhya, K. and Okabe, T. 2007: Self-adaptive simulated binary crossover for real-parameter optimization. GECCO '07: Proceedings of the 9th Annual Conference on Genetic and Evolutionary Computation. ACM, 1187–94.

Erzberger, H. 2004: Transforming the NAS: the next generation air traffic control system. NASA Ames Research Center, Moffett Field, CA, Technical Report NASA/TP-2004-212828.

Evans, J. 2001: Tactical weather decision support to complement 'strategic' traffic flow management for convective weather. The Fourth International Air Traffic Management R&D Seminar ATM-2001, Santa Fe, NM. Paper available at http://atm2001.eurocontrol.fr.

Heaton, J.T. 2005: Introduction to neural networks with Java, second edition. Heaton Research.

ICAO. 2002: The ICAO Global Air Navigation Plan for CNS/ATM systems. ICAO, Volume 1, no. 9750.

Joint Planning and Development Office. 2007: Concept of operations for the next generation air transportation system, version 2.0. Washington, D.C.

Lester, P. 2000: Aviation weather. Jeppesen Sanderson.

Lindeberg, T. 1994: Scale-space theory in computer vision. Springer.

Nolan, M. 2004: Fundamentals of air traffic control, fourth edition. Brooks/Cole-Thomson Learning.

Palacin, J., Valganon, I. and Pernia, R. 2006: The optical mouse for indoor mobile robot odometry measurement. Sensors & Actuators A: Physical 126, 141–7.

SESAR Consortium. 2007: The ATM target concept: SESAR definition phase, deliverable 3. Eurocontrol, Brussels, Technical Report DLM-0612-001-02-00.

Sherry, J., Ball, C. and Zobell, S. 2001: Traffic flow management (TFM) weather rerouting. 4th USA/Europe Air Traffic Management R&D Seminar.

Sobel, I. and Feldman, G. 1968: A 3×3 isotropic gradient operator for image processing. Presentation for the Stanford Artificial Intelligence Project.

Teo, J. and Abbass, H. 2005: Multiobjectivity and complexity in embodied cognition. IEEE Transactions on Evolutionary Computation 9, 337–60.

Wickens, C. 1999: Automation in air traffic control: the human performance issues. In: Scerbo, M.W. and Mouloua, M., editors. Automation technology and human performance: current research and trends. Lawrence Erlbaum, NJ, 2–10.

Ziemke, T. 2003: What's that thing called embodiment? Proceedings of the 25th Annual Meeting of the Cognitive Science Society, 1305–10.
