
Page 1: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Ricardo Lopez, Graduate Student Researcher

Prepared for UAS 2010, June 14-18

Page 2: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness


Outline

• Research Background
  – Situation Awareness
  – Autonomy vs. Automation
  – Classes of Unmanned Aircraft Systems
  – Hypothesis

• Level of Autonomy
  – Autonomy Levels for Unmanned Systems (ALFUS) Framework
  – Total ALFUS Score
  – Decomposition of Human Independence Domain
  – ALFUS and Airspace

• Experimental Design
  – External vs. Internal Pilot
  – Levels of Autonomy
  – Navigation Situation Awareness User Interface Baseline
  – Interface Environments

• Results and Conclusions

• Status and Future Work

• Questions

Page 3: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness


Situation Awareness

• Unmanned systems must perceive “the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future.”

[Figure callouts: Where you are; Where you have been; Where you are going; Within your environment; Understand what you perceive; Collision Alert!]

Page 4: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Situation Awareness

From Toward a theory of situation awareness in dynamic systems, by M. R. Endsley, 1995, Human Factors. Copyright © 1995 by the Human Factors and Ergonomics Society.

Page 5: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Autonomy vs. Automation

• Autonomy is an unmanned aircraft system’s “own ability of sensing, perceiving, analyzing, communicating, planning, decision-making, and acting, to achieve its goals as assigned by its human operator(s) through designed” interfaces.

• Automation refers to a pre-programmed effect by the unmanned system in response to a specific stimulus.

[Figure: spectrum from Automatic to Autonomous]

Page 6: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Adaptability

• Adaptability, in terms of levels of autonomy, refers to the ability of the system to “react” to a mission or environmental change without operator input.

[Figure: Non-Adaptable vs. Adaptable routing between Waypoint A and Waypoint B]

Page 7: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness


Classes of Unmanned Aircraft Systems (UAS)

• The functions needed for Situation Awareness are the same for all classes of Unmanned Aircraft Systems (UAS).

• What varies is the fidelity required to meet the system’s situation awareness needs, given the mission and the environment in which it operates.

– Mini/Micro
– Medium Altitude Long Endurance
– High Altitude Long Endurance

Page 8: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Hypothesis

• The Level of Autonomy for any given system is the determining factor for the degree of Human-Machine Interface fidelity needed for proper Situation Awareness in any class of airspace.

What, how, and how much gets displayed to the operator for them to interface with the system.

Page 9: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

What is the Autonomy Levels for Unmanned Systems (ALFUS) Framework?

• The National Institute of Standards and Technology’s ALFUS framework measures unmanned systems by using the Contextual Autonomous Capability model, which is composed of three axes:

– Mission Complexity
– Environmental Complexity
– Human Independence

Page 10: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Total ALFUS Score

• As the Unmanned System falls further to the right of the figure, the Level of Autonomy increases
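For illustration only, the Contextual Autonomous Capability idea can be sketched in code. The class below assumes 0-10 scores on each axis and combines them with an equal-weight average; the real ALFUS framework defines its own detailed metrics and weighting, which are not reproduced here.

```python
from dataclasses import dataclass

@dataclass
class ContextualAutonomousCapability:
    """Illustrative 0-10 scores on the three ALFUS axes (simplified sketch)."""
    mission_complexity: float        # MC axis
    environmental_complexity: float  # EC axis
    human_independence: float        # HI axis

    def total_alfus_score(self) -> float:
        # Assumed equal-weight average; the actual ALFUS metric is more nuanced.
        return (self.mission_complexity
                + self.environmental_complexity
                + self.human_independence) / 3.0

# Example: a system scoring 6, 5, and 7 on MC, EC, and HI totals 6.0.
uas = ContextualAutonomousCapability(6.0, 5.0, 7.0)
print(f"Total ALFUS score: {uas.total_alfus_score():.1f}")
```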

Page 11: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

ALFUS and Airspace

• An ALFUS value can also be assigned to airspace:

[Figure: two airspace scenarios, each showing a route between points A and B through sectors labeled with ALFUS values 4, 6, and 8; one sector along the route is labeled ALFUS 5 in Scenario 1 but ALFUS 9 in Scenario 2.]

A UAV with a certain Total ALFUS Score could be cleared to operate under lower-level ALFUS airspace yet not under a higher-level one.
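A minimal sketch of that clearance rule, assuming (hypothetically) that a planned route reduces to a list of sector ALFUS values and that the vehicle may only enter sectors at or below its own Total ALFUS Score:

```python
def cleared_for_route(total_alfus_score: float, sector_alfus_values: list[float]) -> bool:
    """Hypothetical check: every sector on the route must demand no more
    autonomy than the vehicle's Total ALFUS Score provides."""
    return all(total_alfus_score >= sector for sector in sector_alfus_values)

# Hypothetical routes in the spirit of the two scenarios, for a vehicle scoring 6:
print(cleared_for_route(6, [4, 5, 4]))  # True  (most demanding sector is ALFUS 5)
print(cleared_for_route(6, [4, 9, 4]))  # False (an ALFUS 9 sector blocks clearance)
```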

Page 12: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Introduction to the National Airspace System

• Following the International Civil Aviation Organization’s (ICAO) airspace classification designation and assigning an ALFUS value range (e.g. ALFUS 0-3) to an ICAO airspace class range (e.g. Class E & F), a system with a specific Total ALFUS score will comply with that country’s National Airspace System (NAS).

[Figure: 0-10 scales for Mission Complexity (MC), Environmental Complexity (EC), Human Independence (HI) or Level of Autonomy (LOA), the Total ALFUS Score, and ALFUS Airspace, mapped to NAS airspace classes G, E, D, C, B, and A.]
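The mapping could be captured as a small lookup, sketched below. Only the ALFUS 0-3 to Class E & F example comes from the text; the remaining bands and classes are hypothetical placeholders, not the paper's actual assignment.

```python
# Hypothetical bands of Total ALFUS score mapped to ICAO/NAS airspace classes.
ALFUS_TO_AIRSPACE = [
    ((0, 3), {"E", "F"}),        # example range given in the text
    ((4, 6), {"D", "C"}),        # placeholder band
    ((7, 10), {"B", "A"}),       # placeholder band
]

def permitted_airspace_classes(total_alfus_score: float) -> set[str]:
    """Return the airspace classes whose (hypothetical) band contains the score."""
    for (low, high), classes in ALFUS_TO_AIRSPACE:
        if low <= total_alfus_score <= high:
            return classes
    return set()

print(permitted_airspace_classes(2))  # {'E', 'F'}
```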

Page 13: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Decomposition of Human Independence Domain

• The study will address the interfaces within the Human Independence domain of the ALFUS Framework

• By decomposing the Human Independence domain into Communication, Navigation, and Surveillance, it becomes easier to isolate a baseline list of interfaces.

[Figure: the Human Independence (HI) axis decomposed into Navigation, Communications, and Surveillance sub-scales (0-10), shown alongside the MC, EC, and Total ALFUS scales; additional axes are labeled Human-Computer Interface Fidelity and Level of Autonomy, ranging from Low to High.]

The study will begin by testing the Navigation interfaces
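As a sketch of the decomposition idea (not the study's actual scoring), assume each sub-domain gets its own 0-10 score and that Human Independence is their equal-weight average:

```python
def human_independence_score(navigation: float,
                             communication: float,
                             surveillance: float) -> float:
    """Illustrative only: fold the three sub-domain scores (0-10) into one
    Human Independence score using an assumed equal weighting."""
    return (navigation + communication + surveillance) / 3.0

# The experiment exercises the Navigation sub-domain first, holding the others fixed.
print(human_independence_score(navigation=7.0, communication=5.0, surveillance=6.0))  # 6.0
```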

Page 14: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

External vs. Internal Pilot

• External = Direct Line of Sight (image: www.discoverymagazine.com)

• Internal = Inside Control Station (image: www.airforcetimes.com)

Page 15: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Levels of Autonomy: Low/Manual Control

Low or Manual Control

Air vehicle will respond to all operator commands.

Continuous and direct operator engagement of flight through the use of a primary flight control input device.

Page 16: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Levels of Autonomy: Medium/Semi-Autonomous

Medium/Semi-Autonomous

Operator pre-loads flight plan (waypoints) and/or provides ad-hoc flight plan changes while in flight. Operator monitors and makes corrections based on navigation issues.

System will execute flight plan (as provided by the operator) and will annunciate any navigation (flight parameter) issues for correction by the operator.

Page 17: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Levels of Autonomy: High/Fully Autonomous

High/Fully Autonomous

Operator pre-loads flight plan (waypoints) and/or provides ad-hoc flight plan changes while in flight. Operator monitors air vehicle navigation status and changes

System will execute flight plan (as provided by the operator) and address any navigation (flight parameter) issues autonomously
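The three navigation behaviors described on these slides can be summarized in a short sketch; the enum and function names are illustrative, not taken from the study's software.

```python
from enum import Enum

class AutonomyLevel(Enum):
    MANUAL = "Low/Manual Control"
    SEMI_AUTONOMOUS = "Medium/Semi-Autonomous"
    FULLY_AUTONOMOUS = "High/Fully Autonomous"

def handle_navigation_issue(level: AutonomyLevel, issue: str) -> str:
    """Who resolves a navigation (flight parameter) issue at each level of autonomy."""
    if level is AutonomyLevel.MANUAL:
        return f"Operator detects and corrects '{issue}' through the primary flight control input device."
    if level is AutonomyLevel.SEMI_AUTONOMOUS:
        return f"System annunciates '{issue}'; the operator makes the correction."
    return f"System corrects '{issue}' autonomously; the operator monitors navigation status."

print(handle_navigation_issue(AutonomyLevel.SEMI_AUTONOMOUS, "altitude deviation"))
```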

Page 18: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Navigation Situation Awareness User Interface Baseline

• The Federal Aviation Administration’s (FAA) Federal Aviation Regulation (FAR) Part 91.205 spells out the minimum required set of indicators for Visual Flight Rules (VFR) and Instrument Flight Rules (IFR).

Required by Part 91.205

1. Airspeed Indicator

2. Altimeter

3. Rate of Turn Indicator

4. Slip Skid Indicator

5. Pitch and Bank Indicator (Artificial Horizon)

6. Direction Indicator (Directional Gyro or Equivalent)

7. Time and Timer

Other navigation indicators not included in Part 91.205

8. Forward Detection Device (Forward Looking Camera)

9. Navigation Indicator (Moving Map)

10. Rate of Climb Indicator
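For reference, the baseline could be held in a simple registry like the sketch below; the structure and field name are illustrative, not from the study.

```python
# Illustrative registry of the navigation SA baseline, tagging each indicator
# with whether it comes from the FAR Part 91.205 minimum (per the list above).
NAV_SA_BASELINE = {
    "Airspeed Indicator":                True,
    "Altimeter":                         True,
    "Rate of Turn Indicator":            True,
    "Slip/Skid Indicator":               True,
    "Pitch and Bank Indicator":          True,
    "Direction Indicator":               True,
    "Time and Timer":                    True,
    "Forward Detection Device":          False,
    "Navigation Indicator (Moving Map)": False,
    "Rate of Climb Indicator":           False,
}

regulatory_minimum = [name for name, required in NAV_SA_BASELINE.items() if required]
print(regulatory_minimum)  # the seven Part 91.205 indicators
```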

Page 19: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Interface Environments

“The amount and criticality of human interaction should be inversely proportional to the levels of autonomy”

Different configurations were created using Laminar Research’s X-Plane.

Manual

All Navigation indicators are equivalent to those of a manned aircraft.

Page 20: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Interface Environments

Semi-Autonomous

A combination of indicators and annunciators. The indicators are for critical flight parameters (attitude, heading, airspeed), displayed at all times, while the annunciators show the conditional status of non-critical flight parameters.
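A minimal sketch of that indicator/annunciator split, with hypothetical parameter names and thresholds rather than the study's actual X-Plane configuration:

```python
# Critical parameters stay on continuously displayed indicators; non-critical
# parameters only annunciate when a (hypothetical) threshold is exceeded.
CRITICAL = {"attitude", "heading", "airspeed"}
THRESHOLDS = {"oil_temperature_c": 110.0, "vertical_speed_fpm": 1500.0}  # hypothetical limits

def annunciations(readings: dict[str, float]) -> list[str]:
    """Return annunciator messages for non-critical parameters that are out of limits."""
    alerts = []
    for name, value in readings.items():
        if name in CRITICAL:
            continue  # shown on an indicator at all times, never annunciated
        limit = THRESHOLDS.get(name)
        if limit is not None and value > limit:
            alerts.append(f"CAUTION: {name} = {value} exceeds {limit}")
    return alerts

print(annunciations({"airspeed": 65.0, "oil_temperature_c": 118.0, "vertical_speed_fpm": 500.0}))
# -> ['CAUTION: oil_temperature_c = 118.0 exceeds 110.0']
```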

Page 21: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Artificial Intelligence

• “Whenever one can predict the consequence of an action, he or it is able to plan, thus to express intelligence. The predictive based behavior was used to correct in some ad-hoc way the reactive behavior itself; it could be used in many different ways to enhance the so called intelligence”

Dr. Hartland Cedric, Ecole Francaise Electronique Informatique

Creating a scenario that will “predict the consequence of an action” becomes very difficult, and it is outside the scope of this study.

Page 22: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Interface Environments

Fully-Autonomous (Notional)

Navigation indicators are prioritized annunciators for conditional status (color annunciators following aviation practices for caution, warning, fault, etc).

Page 23: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Results

Situation Awareness Rating Technique (SART)

Values under the HMI environments are very close due to the isolation of Navigation SA without regard to communication and surveillance.
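For context, SART scores are typically computed with the three-dimensional formula SA = Understanding - (Demand on attentional resources - Supply of attentional resources), i.e. SA = U - (D - S), assuming the standard aggregation of the rated dimensions into those three domains.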

Page 24: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Conclusions

• It was expected that the calculated SA values under the two HCI environments would be the same or very close, because the study isolates Navigation SA without regard to communication and surveillance.

• Comparing the manual and semi-autonomous environments, the participants found that the primary flight display (a heads-up-like display) contained the information necessary to gain SA; hence the annunciators, which were meant to light up when a threshold was exceeded, were used very little or not at all.

• A complete scenario including Navigation, Communication, and Surveillance interfaces for SA is required before the results can be used to determine interface design.

Page 25: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

Status and Future Work

• Received further funding to conduct an experiment on Communication Situation Awareness using the same methodology presented in this paper, with the following interface environments:

– Manual: Voice Communications
– Semi-Autonomous: Voice Communications and Data Link Messages
– Fully-Autonomous: Data Link Messages

Page 26: Human-Machine Interfaces for Increased UAS Pilot Situational Awareness

• This study is funded through a research award from Embry-Riddle Aeronautical University.

• Conference Sponsorship:

"Critical Thinking. Solutions Delivered."