
Metrics-Based Approach for Evaluating Air Traffic Control Automation of the Future
Presented at: ATCA Symposium, April 2006
By: Mike Paglione, FAA Simulation and Analysis Group
Date: April 25, 2006
Briefed to: Aerospace Control and Guidance Systems Committee, March 1, 2007
Federal Aviation Administration
Purpose
• Provide overview of air traffic control
automation system metrics definition activity
– Motivation
– Process
• Present comparison of Host Computer System
(HCS) radar tracks to GPS-derived aircraft
positions
Definitions
• Software Testing
– Process used to help identify the correctness, completeness, security
and quality of developed computer software
– "Testing is the process of comparing the invisible to the ambiguous,
so as to avoid the unthinkable happening to the anonymous."James
Bach (Contemporary author & founder of Satisfice, a test training & consulting company)
– “Testing can show the presence of errors, but never their absence.”
Dijkstra (Famous Dutch computer scientist and physicist, author, etc.)
• Two Fundamental Processes
– Verification
• Building the product right (e.g. determining equations are
implemented correctly)
– Validation
• Building the right product (e.g. solving the right equations)
Why is this important to the FAA?
• En Route Automation Modernization (ERAM)
– Replaces the En Route Host Computer System (HCS) and its backup
• ERAM provides all of today’s functionality and:
– Capabilities that enable National Airspace System evolution
– Improved information security and streamlined traffic flow at our
international borders
– Additional flight and radar data processing, communications
support, and controller display data
– A fully functional backup system, precluding the need to restrict
operations as a result of a primary system failure
– Improved surveillance processing performance using a greater
number/variety of surveillance sources (e.g., ADS-B)
– Stand-alone Testing and Training capability
ERAM Test Challenges
• Limited funding
• Installed and operational at 20 sites in 2008-9
• System Requirements
– 1,298 in FAA System Level Specification
– 4,156+ in contractor System Segment Specifications
– 21,906 B-Level “shalls”
• Software: 1.2 million SLOC
• COTS/NDI/Developmental mixture
• Numerous potential impacts, significant changes
– ATC Safety, ATC Functions, System Performance, RMA, ATC
Efficiency
– Replacement of 1970s legacy software that has evolved to
meet today’s mission
Metrics-Based Approach
• Formation of Cross Functional Team
– Members from ERAM Test, Simulation, Human Factors, System
Engineering, Air Traffic Controllers, and others…
• Charter
– “To support the developmental and operational testing of ERAM
by developing a set of metrics which quantify the effectiveness of
key system functions in ERAM”
– Focus extends beyond requirements-based testing, with a validation
emphasis linked directly to services
– Targeted system functions – Surveillance Data Processing (SDP),
Flight Data Processing (FDP), Conflict Probe Tool (CPT),
Display System (DS)
Background
• Metrics may be absolute or comparative in nature
– Comparative metrics will be applied to current air traffic control
automation systems (and later to ERAM)
• Measure the performance of the legacy En Route automation
systems in operation today to establish a benchmark
• Allow direct comparison of similar functionality in ERAM
– Absolute metrics would be applied to FAA standards
• Provide quantifiable guidance on a particular function in ERAM
• Could be used to validate a requirement
• Task phases
– Metrics Identification
– Implementation Planning
– Data Collection/Analysis
Background (cont.)
• Identification Phase – A list of approximately 100
metrics was mapped to the Air Traffic services and
capabilities found in the Blueprint for NAS
Modernization 2002 Update
• Implementation Planning Phase – Metrics have
been prioritized to generate initial reports on a
subset of these metrics
• Data Collection/Analysis Phase – Iterative process
Iterative Process
• A series of data collection/analysis reports (drops) is
generated in the targeted system areas
• Generate timely reports to the test group
• Documentation is amended as process iterates
Example Metrics
• High Priority Metric – false alert rate of Surveillance
Data Processing (SDP) Safety Alert Function
– Direct link to ATC Separation Assurance from NAS Blueprint
– Affects several controller decisions: aircraft conflict potential,
resolution, and monitoring
– Directly observable by controller and impacts workload
– Several ERAM requirements – e.g., "ERAM shall ensure that no
more than 6 percent of the declared alerts are nuisance alerts…"
(rate computation sketched after this list)
– Lockheed Martin is using it in their TPM/TPI program
• Low Priority Metric – wind direction accuracy for
Flight Data Processing (FDP) Aircraft Trajectory
– Trajectory accuracy is already a high priority metric
– Potentially affects controller decisions but only indirectly by
increasing trajectory prediction accuracy
– Not directly observable by controller
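
To make the false alert metric concrete, below is a minimal Python sketch of the nuisance alert rate computation implied by the requirement quoted above; the alert record fields are hypothetical, not taken from ERAM or its specifications.

# Minimal sketch: nuisance (false) alert rate for the SDP safety alert function.
# The alert record schema ("declared", "nuisance") is hypothetical, for illustration only.
def nuisance_alert_rate(alerts):
    """alerts: iterable of dicts such as {"declared": True, "nuisance": False}."""
    declared = [a for a in alerts if a["declared"]]
    if not declared:
        return 0.0
    return sum(a["nuisance"] for a in declared) / len(declared)

# Example: 3 nuisance alerts out of 60 declared alerts -> 0.05 (5 percent),
# which would satisfy the quoted "no more than 6 percent" requirement.
print(nuisance_alert_rate([{"declared": True, "nuisance": i < 3} for i in range(60)]))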
High Priority Metrics FY05/06
• Surveillance Data Processing (SDP)
– Positional accuracy of surveillance tracker
– Conflict prediction accuracy of Safety Alert Functions
• Flight Data Processing (FDP)
– User Request Evaluation Tool (URET) trajectory accuracy metrics
– Comparison of route processing (HCS/URET & ERAM)
– Forecast performance of auto-hand-off initiate function
• Conflict Probe Tool (CPT)
– URET conflict prediction accuracy metrics for strategic alerts
(missed and false alert rates), working closely with development
contractor (scenarios, tools, etc.)
High Priority Metrics FY05/06 (cont.)
• Display System (DS)
– By En Route Automation Group
• DS Air Traffic Function Mapping to ATC Capabilities
– By NAS Human Factors Group
• Usage Characteristics Assessment
– Tightly controlled environment, not dynamic simulation
– Focused on most frequent and critical controller commands
(e.g., time required to complete a flight plan amendment)
• Baseline Simulation
– High-fidelity ATC simulation, dynamic tasks
– Focused on overall performance, efficiency, safety (e.g.,
number of aircraft controlled per hour)
Completed Studies
• “Comparison of Host Radar Tracks to Aircraft Positions from
the Global Positioning Satellite System,” Dr. Hollis F. Ryan, Mike
M. Paglione, August 2005, DOT/FAA/CT-TN05/30.*
• “Host Radar Tracking Simulation and Performance Analysis,”
Mike M. Paglione, W. Clifton Baldwin, Seth Putney, August 2005,
DOT/FAA/CT-TN05/31.*
• “Comparison of Converted Route Processing by Existing Versus
Future En Route Automation,” W. Clifton Baldwin, August 2005,
DOT/FAA/CT-TN05/29.*
• “Display System Air Traffic Function Mapping to Air Traffic
Control Capabilities,” Version 1, Christopher Reilly, Lawrence
Rovani, Wayne Young, August 2005.
• “Frequency of Use of Current En Route Air Traffic Control
Automation Functions,” Kenneth Allendoerfer, Carolina Zingale,
Shantanu Pai, Ben Willems, September 2005.
• “An Analysis of En Route Air Traffic Control System Usage
During Special Situations,” Kenneth Allendoerfer, Carolina
Zingale, Shantanu Pai, November 2005.
*Available at http://acy.tc.faa.gov/cpat/docs/
Current Activities
• Continue the baseline of system metrics
• Begin comparison of ERAM performance to
current system metrics
Immediate Benefits to Initial Tests
• Establish legacy system performance benchmarks
• Determine if ERAM supports air traffic control with
at least the same “effectiveness” as current
system
• Provide data-driven scenarios, methods, and
tools for comparison of the current HCS to ERAM
• Leverage a broad array of SMEs to develop metrics
and address ERAM testing questions
Longer Term Benefits
• Apply experience to future ERAM releases
• Provide valid baseline, methods and
measurements for future test programs
• Support Next Generation Air Transportation
System (www.jpdo.aero) initiatives
– Contribute to the development of future requirements
by defining system capabilities based on measurable
performance data
Study 1: Comparison of Host Computer System (HCS) Radar Tracks to Aircraft GPS-Derived Positions
Background
Task: Determine the accuracy of the HCS radar tracker
• Supports the test and evaluation of the FAA’s En Route
Automation Modernization (ERAM) System
• Provides ERAM tracking performance baseline metric
• Recorded HCS radar track data available from Host Air Traffic
Management Data Distribution System
• GPS-derived position data available from the FAA’s Reduced
Vertical Separation Minimum (RVSM) certification program
• GPS data assumed to be the true aircraft positions
GPS-Derived Data
• RVSM certification flights
• Differential GPS
• Horizontal position (latitude & longitude)
• Aircraft positions identified by date/call-sign/time
• 265 flights, 20 Air Route Traffic Control Centers
(ARTCCs), January thru February 2005
• Continuous flight segments – level cruise, climbs,
descents, turns
HCS Radar Track Data
• Recorded primarily as track positions in the
Common Message Set format, archived at the
Technical Center
• Extracted “Flight Plan” and “Track” messages from
RVSM flights
• Track positions identified by date, call sign, ARTCC,
and time tag (UTC)
Methodology
• Point-by-point comparison – HCS track position to GPS
position – for same flight at same time
• Accuracy performance metrics in nautical miles (see the sketch after this list):
– horizontal error - the unsigned horizontal distance between the time
coincident radar track report and the GPS position
– along track error - the longitudinal orthogonal component (ahead and
behind) of the horizontal error
– cross track error - the lateral orthogonal component (side-to-side) of
the horizontal error
• Distances defined in Cartesian coordinate system
• Latitude/longitude converted into Cartesian (stereographic)
coordinates
• Stereographic coordinate system unique to each ARTCC
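
As an illustration of these definitions, the Python sketch below decomposes one time-coincident radar/GPS position pair into horizontal, along track, and cross track errors. It assumes both positions are already in the same stereographic (x, y) plane in nautical miles, and it estimates the direction of flight from consecutive GPS points, which is an assumption rather than the report's documented convention.

import math

def track_errors(radar_xy, gps_xy, gps_xy_next):
    """Horizontal, along track, and cross track errors (nm) for one matched point pair.

    radar_xy, gps_xy: time-coincident radar and GPS positions (x, y) in nm.
    gps_xy_next: the following GPS position, used only to estimate the flight direction.
    """
    # Error vector from the true (GPS) position to the radar track position.
    ex = radar_xy[0] - gps_xy[0]
    ey = radar_xy[1] - gps_xy[1]

    # Unit vector along the direction of flight, estimated from consecutive GPS points.
    dx = gps_xy_next[0] - gps_xy[0]
    dy = gps_xy_next[1] - gps_xy[1]
    norm = math.hypot(dx, dy)
    ux, uy = dx / norm, dy / norm

    along = ex * ux + ey * uy            # signed: + ahead of / - behind the aircraft
    cross = -ex * uy + ey * ux           # signed: lateral offset (sign convention assumed)
    horizontal = math.hypot(ex, ey)      # unsigned horizontal error
    return horizontal, along, cross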
Reduction of Radar Track Data
• Split flights into ARTCC segments
• Convert latitude/longitude to stereographic
coordinates
• Clean up track data
• Discard data not matched to GPS data
• Resample to 10 second interval & synchronize
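
A rough sketch of the resampling step, assuming simple linear interpolation onto a common 10-second grid; the study's actual resampling and synchronization algorithm may differ.

import numpy as np

def resample_10s(times, xs, ys, interval=10.0):
    """Resample an irregular (time, x, y) series onto a uniform grid so that radar
    and GPS series can later be compared point-by-point at the same clock times.

    times: seconds since a common epoch, strictly increasing; xs, ys: nm (stereographic).
    """
    t0 = np.ceil(times[0] / interval) * interval        # first grid time within the data span
    grid = np.arange(t0, times[-1] + 1e-9, interval)    # 10-second synchronization grid
    return grid, np.interp(grid, times, xs), np.interp(grid, times, ys)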
Reduction of GPS Data
• Discard non-contiguous data (15% discarded)
• Identify ARTCC and convert lat/longs to
stereographic coordinates
• Reformat to legacy format
• Re-sample to 10 second intervals and
synchronize
Comparison Processing
• Radar track point (x1,y1) matched to
corresponding GPS point (x2,y2)
• Pairs of points matched by date, call sign,
time tag
• Horizontal distance
  = sqrt[(x1 − x2)² + (y1 − y2)²]
  = sqrt[(along track dist.)² + (cross track dist.)²]
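
A minimal pandas sketch of this pairing and distance computation, assuming both data sets were already resampled to the common 10-second grid; the column names are illustrative, not the recorded message fields.

import numpy as np
import pandas as pd

def pair_and_measure(radar: pd.DataFrame, gps: pd.DataFrame) -> pd.DataFrame:
    """radar, gps: DataFrames with columns ["date", "callsign", "time", "x", "y"]
    (hypothetical names), positions in nm on the same stereographic plane."""
    # Pair radar and GPS points recorded for the same flight at the same time.
    pairs = radar.merge(gps, on=["date", "callsign", "time"], suffixes=("_radar", "_gps"))
    dx = pairs["x_radar"] - pairs["x_gps"]
    dy = pairs["y_radar"] - pairs["y_gps"]
    pairs["horizontal_error_nm"] = np.sqrt(dx ** 2 + dy ** 2)
    return pairs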
Descriptive Statistics
Point-by-point comparison over all flights – 54,170 matched radar/GPS point pairs

                          Signed Mean   Unsigned Mean    RMS
  Horizontal Error (nm)        –             0.69        0.78
  Cross Track Error (nm)      0.00           0.12        0.16
  Along Track Error (nm)     -0.67           0.67        0.77

[Histograms: probability distributions of horizontal error (0 to 2 nm),
cross track error (-1 to 1 nm), and along track error (-2 to 2 nm)]
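
For reference, a sketch (not the study's code) of how the signed and unsigned summary statistics above can be computed; note the RMS is identical for signed and unsigned errors, since the values are squared either way.

import numpy as np

def summarize(errors):
    """Signed mean, unsigned mean, and RMS (nm) for an array of error values.

    The RMS is the same for signed and unsigned errors because the values are squared.
    """
    e = np.asarray(errors, dtype=float)
    return e.mean(), np.abs(e).mean(), np.sqrt(np.mean(e ** 2))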
Radar Horizontal Track – Flight #1
• Falcon Mystere business jet
• Route: Springfield – Kansas City – Wichita – Fayetteville radial – St. Louis
• Profile: Climb – Cruise (FL350 & FL370) – Descend
[Plot: radar horizontal track, X and Y coordinates in nautical miles]
Radar (Left) & GPS (Right)
(“south” heading)
Y Coordinate in NM
Flight #1 – Turn
X Coordinate in Nautical Miles
Aerospace Control and Guidance Systems Committee Briefing
March 1, 2007
Federal Aviation
Administration
27
Radar (Right) & GPS (Left)
(northeast heading)
Y Coordinate in NM
Flight #1 – Straight
X Coordinate in Nautical Miles
Aerospace Control and Guidance Systems Committee Briefing
March 1, 2007
Federal Aviation
Administration
28
Track Errors – Flight #1
Point-by-point comparison for Flight #1 – 374 matched radar/GPS point pairs

                          Signed Mean   Unsigned Mean    RMS
  Horizontal Error (nm)        –             0.80        0.89
  Cross Track Error (nm)     -0.04           0.10        0.12
  Along Track Error (nm)     -0.79           0.79        0.88

[Histograms: probability distributions of cross track error (-0.3 to 0.3 nm)
and along track error (-1.5 to 0 nm)]
Contact the Author:
[email protected]
609-485-7926
Available Publications:
http://acy.tc.faa.gov/cpat/docs/index.shtml