Metrics Based Approach for Evaluating Air Traffic Control Automation of the Future
Presented at: ATCA Symposium April 2006 By: Mike Paglione, FAA Simulation and Analysis Group
Date: April 25, 2006
Federal Aviation Administration
Metrics Based Approach for Evaluating Air Traffic Control Automation of the Future

Aerospace Control and Guidance Systems Committee Briefing, Federal Aviation Administration, March 1, 2007
Purpose
• Provide overview of air traffic control automation system metrics definition activity
  – Motivation
  – Process
• Present comparison of Host Computer System (HCS) radar tracks to GPS-derived aircraft positions
Definitions
• Software Testing
  – Process used to help identify the correctness, completeness, security, and quality of developed computer software
  – "Testing is the process of comparing the invisible to the ambiguous, so as to avoid the unthinkable happening to the anonymous." — James Bach (contemporary author and founder of Satisfice, a test training and consulting company)
  – "Testing can show the presence of errors, but never their absence." — Edsger Dijkstra (famous Dutch computer scientist, physicist, and author)
• Two Fundamental Processes
  – Verification: building the product right (e.g., determining that equations are implemented correctly)
  – Validation: building the right product (e.g., solving the right equations)
Why is this important to the FAA?
• En Route Automation Modernization (ERAM)
  – Replaces the En Route Host Computer System (HCS) and its backup
• ERAM provides all of today's functionality and:
  – Capabilities that enable National Airspace System evolution
  – Improved information security and streamlined traffic flow at our international borders
  – Additional flight radar data processing, communications support, and controller display data
  – A fully functional backup system, precluding the need to restrict operations as a result of a primary system failure
  – Improved surveillance processing performance using a greater number/variety of surveillance sources (e.g., ADS-B)
  – Stand-alone testing and training capability
ERAM Test Challenges
• Limited funding
• Installed and operational at 20 sites in 2008-9
• System Requirements
  – 1,298 in FAA System Level Specification
  – 4,156+ in contractor System Segment Specifications
  – 21,906 B-Level "shalls"
• Software: 1.2 million SLOC
• COTS/NDI/Developmental mixture
• Numerous potential impacts, significant changes
  – ATC Safety, ATC Functions, System Performance, RMA, ATC Efficiency
  – Replacement of 1970s legacy software that has evolved to meet today's mission
Metrics Based Approach
• Formation of Cross-Functional Team
  – Members from ERAM Test, Simulation, Human Factors, System Engineering, Air Traffic Controllers, and others
• Charter
  – "To support the developmental and operational testing of ERAM by developing a set of metrics which quantify the effectiveness of key system functions in ERAM"
  – Focus extends beyond requirement-based testing, with a validation emphasis linked directly to services
  – Targeted system functions: Surveillance Data Processing (SDP), Flight Data Processing (FDP), Conflict Probe Tool (CPT), Display System (DS)
Background
• Metrics may be absolute or comparative in nature
  – Comparative metrics will be applied to current air traffic control automation systems (and later to ERAM)
    • Measure the performance of the legacy En Route automation systems in operation today to establish a benchmark
    • Allow direct comparison of similar functionality in ERAM
  – Absolute metrics would be applied to FAA standards
    • Provide quantifiable guidance on a particular function in ERAM
    • Could be used to validate a requirement
• Task phases
  – Metrics Identification
  – Implementation Planning
  – Data Collection/Analysis
Background (cont.)
• Identification Phase
  – A list of approximately 100 metrics was mapped to the Air Traffic services and capabilities found in the Blueprint for NAS Modernization 2002 Update
• Implementation Planning Phase
  – Metrics have been prioritized to generate initial reports on a subset of these metrics
• Data Collection/Analysis Phase
  – Iterative process
Iterative Process
• A series of data collection/analysis reports ("drops") is generated in the targeted system areas
• Generate timely reports to the test group
• Documentation is amended as the process iterates
Example Metrics
• High Priority Metric – false alert rate of Surveillance Data Processing (SDP) Safety Alert Function
  – Direct link to ATC Separation Assurance from NAS Blueprint
  – Affects several controller decisions: aircraft conflict potential, resolution, and monitoring
  – Directly observable by controller and impacts workload
  – Several ERAM requirements – e.g., "ERAM shall ensure that no more than 6 percent of the declared alerts are nuisance alerts…"
  – Lockheed Martin is using it in their TPM/TPI program
• Low Priority Metric – wind direction accuracy for Flight Data Processing (FDP) Aircraft Trajectory
  – Trajectory accuracy is already a high priority metric
  – Potentially affects controller decisions, but only indirectly by increasing trajectory prediction accuracy
  – Not directly observable by controller
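A requirement of the form quoted above reduces to a simple ratio check. The sketch below illustrates the arithmetic only; the function name and the alert counts are hypothetical, not from the ERAM program.

```python
# Illustrative sketch (not FAA code): checking a nuisance-alert-rate
# requirement of the form "no more than 6 percent of declared alerts
# are nuisance alerts". All counts below are hypothetical.

def nuisance_alert_rate(declared_alerts, nuisance_alerts):
    """Fraction of declared alerts judged to be nuisance (false) alerts."""
    if declared_alerts == 0:
        return 0.0
    return nuisance_alerts / declared_alerts

declared = 500   # hypothetical total declared safety alerts in a scenario
nuisance = 24    # hypothetical count of alerts judged nuisance alerts

rate = nuisance_alert_rate(declared, nuisance)
print(f"nuisance alert rate: {rate:.1%}")      # 4.8%
print("meets 6% requirement:", rate <= 0.06)   # True
```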
High Priority Metrics FY05/06
• Surveillance Data Processing (SDP)
  – Positional accuracy of surveillance tracker
  – Conflict prediction accuracy of Safety Alert Functions
• Flight Data Processing (FDP)
  – User Request Evaluation Tool (URET) trajectory accuracy metrics
  – Comparison of route processing (HCS/URET & ERAM)
  – Forecast performance of auto-hand-off initiate function
• Conflict Probe Tool (CPT)
  – URET conflict prediction accuracy metrics for strategic alerts (missed and false alert rates), working closely with development contractor (scenarios, tools, etc.)
High Priority Metrics FY05/06 (cont.)
• Display System (DS)
  – By En Route Automation Group
    • DS Air Traffic Function Mapping to ATC Capabilities
  – By NAS Human Factors Group
    • Usage Characteristics Assessment
      – Tightly controlled environment, not dynamic simulation
      – Focused on most frequent and critical controller commands (e.g., time required to complete a flight plan amendment)
    • Baseline Simulation
      – High-fidelity ATC simulation, dynamic tasks
      – Focused on overall performance, efficiency, safety (e.g., number of aircraft controlled per hour)
Completed Studies
• "Comparison of Host Radar Tracks to Aircraft Positions from the Global Positioning Satellite System," Dr. Hollis F. Ryan, Mike M. Paglione, August 2005, DOT/FAA/CT-TN05/30.*
• "Host Radar Tracking Simulation and Performance Analysis," Mike M. Paglione, W. Clifton Baldwin, Seth Putney, August 2005, DOT/FAA/CT-TN05/31.*
• "Comparison of Converted Route Processing by Existing Versus Future En Route Automation," W. Clifton Baldwin, August 2005, DOT/FAA/CT-TN05/29.*
• "Display System Air Traffic Function Mapping to Air Traffic Control Capabilities," Version 1, Christopher Reilly, Lawrence Rovani, Wayne Young, August 2005.
• "Frequency of Use of Current En Route Air Traffic Control Automation Functions," Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, Ben Willems, September 2005.
• "An Analysis of En Route Air Traffic Control System Usage During Special Situations," Kenneth Allendoerfer, Carolina Zingale, Shantanu Pai, November 2005.
*Available at http://acy.tc.faa.gov/cpat/docs/
Current Activities
• Continue the baseline of system metrics
• Begin comparison of ERAM performance to current system metrics
Immediate Benefits to Initial Tests
• Establish legacy system performance benchmarks
• Determine if ERAM supports air traffic control with at least the same "effectiveness" as the current system
• Provides data-driven scenarios, methods, and tools for comparison of current HCS to ERAM
• Leverages broad array of SMEs to develop metrics and address ERAM testing questions
Longer Term Benefits
• Apply experience to future ERAM releases
• Provide valid baseline, methods, and measurements for future test programs
• Support Next Generation Air Transportation System (www.jpdo.aero) initiatives
  – Contribute to the development of future requirements by defining system capabilities based on measurable performance data
Study 1: Comparison of Host Computer System (HCS) Radar Tracks to Aircraft GPS-Derived Positions
Background
Task: Determine the accuracy of the HCS radar tracker
• Supports the test and evaluation of the FAA's En Route Automation Modernization (ERAM) System
• Provides ERAM tracking performance baseline metric
• Recorded HCS radar track data available from Host Air Traffic Management Data Distribution System
• GPS-derived position data available from the FAA's Reduced Vertical Separation Minimum (RVSM) certification program
• GPS data assumed to be the true aircraft positions
GPS-Derived Data
• RVSM certification flights
• Differential GPS
• Horizontal position (latitude & longitude)
• Aircraft positions identified by date/call-sign/time
• 265 flights, 20 Air Route Traffic Control Centers (ARTCCs), January through February 2005
• Continuous flight segments – level cruise, climbs, descents, turns
HCS Radar Track Data
• Recorded primarily as track positions in the Common Message Set format, archived at the Technical Center
• Extracted "Flight Plan" and "Track" messages from RVSM flights
• Track positions identified by date, call sign, ARTCC, and time tag (UTC)
Methodology
• Point-by-point comparison – HCS track position to GPS position – for same flight at same time
• Accuracy performance metrics in nautical miles:
  – horizontal error – the unsigned horizontal distance between the time-coincident radar track report and the GPS position
  – along track error – the longitudinal orthogonal component (ahead and behind) of the horizontal error
  – cross track error – the lateral orthogonal component (side-to-side) of the horizontal error
• Distances defined in Cartesian coordinate system
• Latitude/longitude converted into Cartesian (stereographic) coordinates
• Stereographic coordinate system unique to each ARTCC
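The error decomposition described on this slide can be sketched in a few lines. This is an illustration, not the study's actual tooling: it assumes positions already converted to stereographic (x, y) coordinates in nautical miles, a unit heading vector derived elsewhere (e.g., from consecutive GPS points), and the sign conventions noted in the comments.

```python
import math

def decompose_error(track_xy, gps_xy, gps_heading_xy):
    """Split the horizontal error between a radar track point and the
    time-coincident GPS position into along track and cross track
    components.

    track_xy, gps_xy: (x, y) positions in nautical miles
    gps_heading_xy: unit vector along the aircraft's direction of flight
                    (assumed derived from consecutive GPS points)
    """
    ex = track_xy[0] - gps_xy[0]
    ey = track_xy[1] - gps_xy[1]
    ux, uy = gps_heading_xy
    along = ex * ux + ey * uy        # signed: + ahead of, - behind GPS position
    cross = -ex * uy + ey * ux       # signed: side-to-side offset
    horizontal = math.hypot(ex, ey)  # unsigned horizontal error
    return horizontal, along, cross

# Hypothetical values: aircraft flying due east (unit heading (1, 0)),
# radar point 0.6 nm behind and 0.1 nm to one side of the GPS position.
h, a, c = decompose_error((9.4, 10.1), (10.0, 10.0), (1.0, 0.0))
```

By construction, horizontal² = along² + cross², which is the relation used on the Comparison Processing slide.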
Reduction of Radar Track Data
• Split flights into ARTCC segments
• Convert latitude/longitude to stereographic coordinates
• Clean up track data
• Discard data not matched to GPS data
• Resample to 10-second interval & synchronize
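The resampling step above can be sketched with linear interpolation. This is a minimal illustration under an assumption the slides do not state: that intermediate positions are interpolated linearly between recorded samples (the study's actual reduction tools are not described here).

```python
import math

def resample(times, xs, interval=10.0):
    """Linearly interpolate samples xs, recorded at `times` (seconds),
    onto a synchronized grid of whole multiples of `interval` seconds.
    Assumes times is strictly increasing."""
    out = []
    i = 0
    t = interval * math.ceil(times[0] / interval)  # first grid point covered
    while t <= times[-1]:
        while times[i + 1] < t:                    # find bracketing segment
            i += 1
        f = (t - times[i]) / (times[i + 1] - times[i])
        out.append((t, xs[i] + f * (xs[i + 1] - xs[i])))
        t += interval
    return out

# Positions recorded at t = 3, 12, 27 s land on the t = 10, 20 s grid;
# here position grows linearly with time, so values stay on that line.
grid = resample([3, 12, 27], [0, 9, 24])
```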
Reduction of GPS Data
• Discard non-contiguous data (15% discarded)
• Identify ARTCC and convert lat/longs to stereographic coordinates
• Reformat to legacy format
• Re-sample to 10 second intervals and synchronize
Comparison Processing
• Radar track point (x1, y1) matched to corresponding GPS point (x2, y2)
• Pairs of points matched by date, call sign, time tag
• Horizontal distance
  = sqrt[(x1 − x2)^2 + (y1 − y2)^2]
  = sqrt[(along track dist.)^2 + (cross track dist.)^2]
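The matching step amounts to a keyed join of the two data sets. The sketch below is hypothetical in its key fields and sample values; it only illustrates pairing time-coincident points and applying the horizontal-distance formula above.

```python
# Sketch of the comparison step: radar and GPS samples keyed by
# (call sign, time tag) and matched point-by-point. In the study the
# key also includes the date; values here are hypothetical.
import math

radar = {("N123", 1000): (5.0, 3.0)}  # (call sign, time) -> (x1, y1) in nm
gps   = {("N123", 1000): (5.3, 2.6)}  # (call sign, time) -> (x2, y2) in nm

horizontal_errors = []
for key, (x1, y1) in radar.items():
    if key in gps:                    # keep only time-coincident pairs
        x2, y2 = gps[key]
        horizontal_errors.append(math.hypot(x1 - x2, y1 - y2))
```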
Descriptive Statistics
[Histograms: probability distributions of horizontal error (0 to 2 nm), cross track error (−1 to 1 nm), and along track error (−2 to 2 nm)]

                           Horizontal Error (nm)   Cross Track Error (nm)   Along Track Error (nm)
Type       Sample Size     Mean     RMS            Mean     RMS             Mean     RMS
Signed     54,170          0.69     0.78           0.00     0.16            −0.67    0.77
Unsigned                                           0.12                     0.67
Radar Horizontal Track – Flight #1
[Plot: radar track in stereographic coordinates; X and Y coordinates in nautical miles]
Falcon Mystere business jet
Springfield – Kansas City – Wichita – Fayetteville radial – St. Louis
Climb – Cruise (FL350 & FL370) – Descend
Radar (Left) & GPS (Right)
[Plots: radar and GPS tracks in stereographic coordinates; X and Y coordinates in nautical miles]
Flight #1 – Turn ("south" heading)
Radar (Right) & GPS (Left)
[Plots: radar and GPS tracks in stereographic coordinates; X and Y coordinates in nautical miles]
Flight #1 – Straight (northeast heading)
Track Errors – Flight #1
[Histograms: probability distributions of cross track error (−0.3 to 0.3 nm) and along track error (−1.5 to 0 nm)]

                           Horizontal Error (nm)   Cross Track Error (nm)   Along Track Error (nm)
Type       Sample Size     Mean     RMS            Mean     RMS             Mean     RMS
Signed     374             0.80     0.89           −0.04    0.12            −0.79    0.88
Unsigned                                           0.10                     0.79
Contact the Author:
Available Publications: http://acy.tc.faa.gov/cpat/docs/index.shtml