Intelligent Autonomy for Reducing Operator Workload

ARL
Penn State
Mark Rothgeb
Intelligent Control Systems Department
Autonomous Control and Intelligent Systems Division
April 10, 2007
Overview
• Applied Research Laboratory background
• Autonomous unmanned vehicles (ARL / DoD)
• Issues in automation and levels of autonomy
• Two examples of reducing operator workload via increasing levels of system autonomy
Applied Research Laboratory
Navy UARC Background
• Navy UARCs established in the mid-1940s to continue the university-centered R&D that proved effective during WWII
• We offer a diverse portfolio of systems expertise and technologies applicable to distributed systems
• UARC universities maintain a long-term strategic relationship with the Navy
• Characteristics:
– Can address evolving needs with enabling technologies
– Understanding of operational problems and environment
– Objectivity and independence
– Corporate knowledge and memory
– From concept to prototype (integration and test facilities)
– Freedom from conflict of interest
Core Technologies
• Fluid Dynamics
• Hydro Acoustics
• Computational Mechanics
• Composite Materials
• Information Fusion and Visualization
• Energy and Power Systems
• System Simulation
• Autonomous Control and Intelligent Systems
Characteristics and Size
✓ Systems Engineering Orientation
✓ Basic Research through Demonstration to Full-Scale Implementation
✓ Project Management of Cross-disciplinary, Multi-performer Teams

[Charts: ARL as part of Penn State research; ARL full-time equivalent years]
ARL Locations
• ARL Penn State, State College, Pa. (Applied Research Laboratory Building, Applied Science Building, Garfield Thomas Water Tunnel, ARL CATO Park)
• Keyport Naval Facility, Keyport, Wa.
• Electro-Optics Center, Kittanning, Pa. (Electro-Optics Science & Technology Center)
• Distributed Engineering Center, Penn State Fayette Campus
• Washington Office, Washington, DC
• ARL Hawaii, Pearl Harbor, Hi.
• Navigation Research & Development Center, Warminster, Pa.
PSU/ARL Experimental UGV
• Embedded Health Monitoring
• Autonomous Navigation & Control
• Intelligent Self-Situational Awareness
• COTS OCU Development
• JAUS Development & Testing
PSU Aero/ARL UAV Base Aircraft (TRAINER UAV)

Specifications:
– Wingspan: 80 inches
– Wing area: 1,180 sq. inches
– Length: 64 3/4 inches
– Weight: 6 to 6 1/2 pounds
– Engine: .40-.46 two-stroke or .91 four-stroke
– Radio: 4 channels, 5 servos
PSU/ARL Autonomous Undersea Vehicle
OCEANOGRAPHIC DATA GATHERING

Specifications:
– Diameter: 38 in.
– Length: 27 ft. 10 in.
– Weight: 9,900 lbs.
– Endurance: 300 nautical miles
– Payload: dual side-scan sonars; other oceanographic instruments
– Navigational accuracy: better than 150 meters

OBJECTIVE
• Developed a rapid-prototype AUV for the collection of oceanographic environmental data

VEHICLE FEATURES
• Long-range capabilities (>300 nm @ 4 kts)
• Fully autonomous vehicle operations
• Launch/recovery from TAGS 60 platform
• Sensors: sidescan sonar, Acoustic Doppler Current Profilers (ADCP), Conductivity, Temperature, and Depth (CTD)
• Simple maintenance & turnaround at sea
DoD Autonomous Vehicles
• Predator
• Firescout
• Battlespace Preparation AUV
NASA DART Autonomous Operation
• Even basic automation concepts are not so simple
• Rendezvous and Inspection
• Proximity Operations

…On April 15, more than 450 miles above Earth, an experimental NASA spacecraft called DART (Demonstration of Autonomous Rendezvous Technology) fired its thrusters and closed in on a deactivated U.S. military communications satellite, and then gently bumped into it. (Popular Science, 2005)
Automation Perspectives
• Underwater Vehicles
– Communication issues
– Load 'n Go
– Automation → Manual
• Ground (UGV), Air (UAV), Surface (SUV) Vehicles
– Remote control / tele-operation (fly-by-wire)
– Human in control with bits of automation (waypoints)
– Manual → Automation
• Spacecraft Vehicles
– Ground-control driven
– Back off and "safe" the system (valuable assets)
– Solve the problem on the ground through analysis
Operator Overload → Forcing Automation
• Operator overload comes in different ways
– Increase in the number of tasks for the same number of people
• Can't add crew, but now have more sensors
– Reduced head-count for the same number of tasks
• Littoral Combat Ship (LCS)
– Increased complexity of tasks forces automation
• Surface vehicle on the open ocean vs. a surface vehicle in harbor
– Increase in the amount of data to process
• The need to react quickly also forces automation
– Systems that automatically respond because of timing requirements
– Advisory systems that call the operator to attention
Acceptance of Automation
What is required for acceptance of automation? It's all about gaining trust:
• Don't do something fundamentally wrong (run into the wall)
• Don't do something non-intuitive (go right around the wall versus left)
• Do tell the operator when the autonomy doesn't know what to do
– Ambiguous circumstances
– Able to solve the 95% case but not the 5%
• Do give insight into decision-making
• Do have automation assist the operator, not vice versa
– Does Microsoft Word help you?
– MapQuest fixes, for example (beltway anomaly)
– Employee Reimbursement System (cure worse than the ailment?)
• Do let the operator dynamically alter the level of autonomy
– Full manual → Full autonomy
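The last point, letting the operator slide between full manual and full autonomy at run time, can be sketched as a simple operator-settable mode. This is a hypothetical illustration only; the class and level names are not drawn from any ARL system.

```python
# Hypothetical sketch of an operator-adjustable autonomy setting, spanning
# full manual through full autonomy. All names here are illustrative.
from enum import IntEnum

class AutonomyLevel(IntEnum):
    FULL_MANUAL = 0    # operator performs every action
    ADVISORY = 1       # system recommends, operator acts
    CONSENT = 2        # system acts only after operator approval
    EXCEPTION = 3      # system acts, operator may veto
    FULL_AUTONOMY = 4  # system acts without operator involvement

class Vehicle:
    def __init__(self):
        # Start conservatively: trust is earned, so default to full manual.
        self.level = AutonomyLevel.FULL_MANUAL

    def set_level(self, level):
        """Operator may raise or lower the autonomy level at any time."""
        self.level = AutonomyLevel(level)

    def requires_operator_approval(self):
        # At CONSENT and below, the human stays in the approval loop.
        return self.level <= AutonomyLevel.CONSENT
```

Keeping the level as a single operator-owned setting, rather than scattering mode flags through the system, is one way to make the current degree of autonomy visible and reversible.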
Levels of Autonomy
• Various groups have defined levels of autonomy
– National Institute of Standards and Technology (NIST)
– Future Combat Systems (FCS)
– Air Force Research Laboratory (AFRL)
– Uninhabited Combat Air Vehicle (UCAV)
– ASTM Committee on Unmanned Undersea Vehicle Systems (UUV)
– NASA FLOAAT (Function-specific Level of Autonomy and Automation Tool)
– Sheridan's Levels of Autonomy
NIST Definitions
Autonomous - Operations of an unmanned system (UMS) wherein the UMS receives its mission from the human and accomplishes that mission with or without further human-robot interaction (HRI). The level of HRI, along with other factors such as mission complexity and environmental difficulty, determines the level of autonomy for the UMS. Finer-grained autonomy level designations can also be applied to tasks, which are lower in scope than the mission.

Autonomy - The condition or quality of being self-governing.

[NIST Special Publication 1011 - Autonomy Levels for Unmanned Systems (ALFUS) Framework]
Sheridan's Scale for Degrees of Automation
1. The computer offers no assistance; the human must do it all
2. The computer offers a complete set of action alternatives, and
3. narrows the selection down to a few, or
4. suggests one, and
5. executes that suggestion if the human approves, or
6. allows the human a restricted time to veto before automatic execution, or
7. executes automatically, then necessarily informs the human, or
8. informs him after execution only if he asks, or
9. informs him after execution if it, the computer, decides to
10. The computer decides everything and acts autonomously, ignoring the human

R. Parasuraman, T. B. Sheridan, and C. D. Wickens, "A Model for Types and Levels of Human Interaction with Automation," IEEE Transactions on Systems, Man, and Cybernetics, Part A, vol. 30, pp. 286-297, 2000.
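The scale above can be captured as a simple lookup, paraphrasing the slide text; one useful derived property is the boundary at level 5, below which the computer never acts without explicit human approval. This is a sketch, not part of the Parasuraman/Sheridan/Wickens paper.

```python
# Sheridan's ten degrees of automation as a lookup table, paraphrased from
# the slide. The consent boundary at level 5 is a reading of the scale,
# offered here for illustration.
SHERIDAN_SCALE = {
    1: "offers no assistance; human must do it all",
    2: "offers a complete set of action alternatives",
    3: "narrows the selection down to a few",
    4: "suggests one alternative",
    5: "executes that suggestion if the human approves",
    6: "allows the human a restricted time to veto before executing",
    7: "executes automatically, then necessarily informs the human",
    8: "informs the human after execution only if asked",
    9: "informs the human after execution if it decides to",
    10: "decides everything and acts autonomously, ignoring the human",
}

def human_consent_required(level):
    """Levels 1-5 keep the human in the approval loop; at 6 and above the
    computer can act without explicit consent."""
    return 1 <= level <= 5
```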
Future Combat Systems Levels of Autonomy
1. Remote control / teleoperation
2. Remote control with vehicle state knowledge
3. External preplanned mission
4. Knowledge of local and planned path
5. Hazard avoidance or negotiation
6. Object detection, recognition, avoidance or negotiation
7. Fusion of local sensors and data
8. Cooperative operations
9. Collaborative operations
10. Full autonomy

SOURCE: LTC Warren O'Donell, USA, Office of the Assistant Secretary of the Navy (Acquisition, Logistics, and Technology), "Future Combat Systems Review," April 25, 2003.
Levels of Autonomy as Defined by the Uninhabited Combat Air Vehicle Program
• Level 1 (Manual Operation)
– The human operator directs and controls all mission functions.
– The vehicle still flies autonomously.
• Level 2 (Management by Consent)
– The system automatically recommends actions for selected functions.
– The system prompts the operator at key points for information or decisions.
– Today's autonomous vehicles operate at this level.
• Level 3 (Management by Exception)
– The system automatically executes mission-related functions when response times are too short for operator intervention.
– The operator is alerted to function progress.
– The operator may override or alter parameters and cancel or redirect actions within defined time lines.
– Exceptions are brought to the operator's attention for decisions.
• Level 4 (Fully Autonomous)
– The system automatically executes mission-related functions when response times are too short for operator intervention.
– The operator is alerted to function progress.
NIST Levels of Autonomy
We make a distinction between the terms "degrees of autonomy" and "levels of autonomy." Total autonomy in low-level creatures does not correspond to high levels of autonomy. Examples include the movements of earthworms and bacteria, which are 100% autonomous but considered low.
[NIST Special Publication 1011 - Autonomy Levels for Unmanned Systems (ALFUS) Framework]

Air Force Research Laboratory (AFRL) Levels of Autonomy (Clough)
[Metrics, Schmetrics! How The Heck Do You Determine A UAV's Autonomy Anyway? Bruce T. Clough, Air Force Research Laboratory]

NASA FLOAAT (Function-specific Level of Autonomy and Automation Tool)
[FLOAAT, A Tool for Determining Levels of Autonomy and Automation, Applied to Human-Rated Space Systems, Ryan W. Proud and Jeremy J. Hart]
Automation Approach in the "Real" World
• Don't over-commit on the capability of automation
• Begin by automating the mundane
– Bid and proposal database (Excel…)
– Periscope key-ins
– Surface ship heading recommendations
• Extend by making some mildly intelligent inferences regarding decision-making
– Go the right way around the wall
– Not always simple: the cul-de-sac
• Extend to more complex "intelligent" systems
– Neural nets
– Fuzzy systems
– Rule-based systems
– Other techniques
– Cognition?
• But… what is intelligence?
Intelligent Systems
• AIAA Intelligent Systems Technical Committee (JACIC, Dec. 2004):
"The question of what is an intelligent system (IS) has been the subject of much discussion and debate. Regardless of how one defines intelligence, characteristics of intelligent systems commonly agreed on include:
1) Learning - capability to acquire new behaviors based on past experience;
2) Adaptability - capability to adjust responses to changing environmental or internal conditions;
3) Robustness - consistency and effectiveness of responses across a broad set of circumstances;
4) Information Compression - capability to turn data into information and then into actionable knowledge; and
5) Extrapolation - capability to act reasonably when faced with a set of new (not previously experienced) circumstances."
[courtesy: Lyle Long]
Some System Architectures
• Many options
– NASA: CLARAty
– MIT: MOOS (Framework for Modeling)
– MIT: CSAIL (Robotic Reactive Planning)
– CMU: SOAR (Cognitive Architecture)
– CMU: CORAL (Cooperative Robots)
• Has won RoboCup several times
• RoboCup goal: "By the year 2050, develop a team of fully autonomous humanoid robots that can win against the human world soccer champion team."
– USC: STEAM (Agent Teamwork Model)
– PSU/ARL: PIC (Behavior-based Framework)
– PSU/IST: R-CAST (RPD Model for Agent Teamwork)
– …
INTELLIGENT CONTROL ARCHITECTURE

[Diagram: data inputs from Sensor 1 … Sensor N flow into the intelligent controller (perception and response); the controller exchanges messages with conventional control systems, a human collaborator, and other autonomous controllers.]

Intelligent controller functions:
• Sensor Data Fusion
• Information Integration
• Inferencing and Interpretation
• Situational Awareness
• Operational Assessment
• Dynamic Planning and Replanning
• Plan Execution
• Human-in-the-loop Operations (Collaborates / Commands)
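The data flow in the diagram, sensor inputs through fusion and planning to plan execution, can be sketched as a minimal sense-plan-act loop. All names are illustrative and not drawn from an actual ARL implementation; the fusion step here is just a placeholder average over numeric readings.

```python
# Minimal sense-plan-act skeleton mirroring the slide's data flow.
# Hypothetical sketch: class/parameter names are illustrative only.
class IntelligentController:
    def __init__(self, sensors, planner, executor):
        self.sensors = sensors      # callables returning raw readings
        self.planner = planner      # world estimate -> plan
        self.executor = executor    # plan -> action/result

    def fuse(self, readings):
        # Placeholder sensor data fusion: average the numeric readings.
        return sum(readings) / len(readings)

    def step(self):
        readings = [s() for s in self.sensors]   # data inputs (Sensor 1..N)
        world = self.fuse(readings)              # fusion / perception
        plan = self.planner(world)               # dynamic (re)planning
        return self.executor(plan)               # plan execution
```

In a fielded controller, the messages to conventional control systems, the human collaborator, and other autonomous controllers would hang off this loop; they are omitted here to keep the skeleton readable.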
System for Operator Workload Reduction
• So far we have talked mostly about unmanned systems
• The approach applies across a wide range of operational systems
– Let the operator have ultimate control (allow him to control levels of autonomy)
– Gain his confidence by…
• Helping him make better decisions
• Not misleading him into bad decisions
– Allow him to understand what the system is doing
– Don't make the system more of a burden to operate
• An example of a simple system…
Target Anesthesia/Analgesia Example
• Advisory System
• Human-In-The-Loop
• Information Overload
• Subtle Combinatorial Changes
• Reduce 5-to-1?

An example of a more complex system…
Contact Awareness Example
• Reduces workload when making tactical maneuvering decisions
– Reduces manual integration of information
• Reduces time to make a maneuvering decision
– Improves situational awareness (holistic view)
• Improves the quality of the tactical decision
– Better situation understanding leads to better decisions
– Traceability to "truth" data
• Provides help for the less experienced operator
– Cues the operator to predicted loss of tactical control
– Incorporates SME expertise in automated recommendations, with the ability to interrogate a recommendation
Contact Collision Threat Level

[Diagram: CPA concept shown as concentric range rings around own ship at 5 kyd, 2 kyd, and 500 yd.]

• Collision threats: orange to red, level 0-1
• Violation threats: yellow to orange, level 0-1
• No threat: green
• Speed of Advance (SOA): speed in the line of sight (range rate)
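The CPA (closest point of approach) concept underlying the threat rings is a standard relative-motion calculation: given own-ship and contact positions and velocities, find the time and range at which the contact passes closest. The sketch below is illustrative; the color thresholds are a hypothetical mapping keyed to the slide's 500 yd / 2 kyd / 5 kyd rings, not the actual system's logic.

```python
# Closest point of approach (CPA) for two tracks in a flat 2-D plane.
# Positions in yards, velocities in yards per unit time.
import math

def cpa(own_pos, own_vel, contact_pos, contact_vel):
    """Return (time_to_cpa, cpa_range) for own ship vs. a contact."""
    rx = contact_pos[0] - own_pos[0]          # relative position
    ry = contact_pos[1] - own_pos[1]
    vx = contact_vel[0] - own_vel[0]          # relative velocity
    vy = contact_vel[1] - own_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:
        # No relative motion: the range never changes.
        return 0.0, math.hypot(rx, ry)
    # Minimize |r + v t|; clamp to t >= 0 (a receding contact only opens range).
    t = max(0.0, -(rx * vx + ry * vy) / v2)
    return t, math.hypot(rx + vx * t, ry + vy * t)

def threat_color(cpa_range_yd):
    # Illustrative thresholds keyed to the slide's 500-yd / 2-kyd / 5-kyd rings.
    if cpa_range_yd < 500:
        return "red"       # collision threat
    if cpa_range_yd < 2000:
        return "orange"
    if cpa_range_yd < 5000:
        return "yellow"    # violation threat
    return "green"         # no threat
```

The range rate in the line of sight (the slide's SOA) is the quantity `-(rx*vx + ry*vy) / |r|`; a positive value means the contact is closing.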