A Scorecard for Cyber Resilience: What We Have Observed

Philip A. Scolieri
Bob Vrtis
October 14, 2015
Copyright 2015 Carnegie Mellon University.
This material is based upon work funded and supported by the Department of Homeland Security under Contract No.
FA8721-05-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a
federally funded research and development center sponsored by the United States Department of Defense.
NO WARRANTY. THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE
MATERIAL IS FURNISHED ON AN “AS-IS” BASIS. CARNEGIE MELLON UNIVERSITY MAKES NO
WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT
NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR
RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT
MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK,
OR COPYRIGHT INFRINGEMENT.
This material has been approved for display at 2015 IEEE STC.
[DISTRIBUTION STATEMENT F] Further dissemination only as directed by Department of Homeland Security c/o
[email protected] (2015-08-14) or higher DoD authority.
DM-0002675
Agenda
Cyber Resilience Review (CRR) Background
How CRR Data Is Collected
Overview of CRR Data That Has Been Collected
CRR Data Analysis Key Observations
Maturity Indicator Level Observations
Domain-Specific Observations
What do we mean by resilience?
Operational resilience: the emergent property of an organization that can continue to carry out its mission after a disruption that does not exceed its operational limit. [CERT-RMM]
Where does the disruption come from? Realized risk.
Background
What is the Cyber Resilience Review (CRR)?
A U.S. Department of Homeland Security (DHS) initiative intended to help the nation’s critical infrastructure providers understand their operational resilience and ability to manage cyber risk.
A review of the overall practice, integration, and health of an organization’s cyber security program, as it relates to a specific critical service.
A tool that allows an organization to
• develop an understanding of process-based cyber security capabilities
• develop meaningful indicators of operational resilience
• improve its ability to manage cyber risk to its critical services and related assets.
Overview of the CRR
The CRR is derived from the CERT Resilience Management Model (CERT-RMM).
It is a structured assessment conducted during a one-day, facilitated session.
The CRR is facilitated by multiple navigators (DHS and CERT) who solicit answers to 269 questions.
The CRR results in a summary report that provides suggested options for consideration to the organization.
At the time of this analysis, a total of 229 organizations had been assessed using version 2 of the CRR.
Who Participates?
Criteria for participation in a facilitated CRR:
• Organizations within Critical Infrastructure and Key Resources (CIKR) sectors
• State, Local, Tribal, and Territorial (SLTT) governments within the United States (and its territories)
Participation is voluntary.
Participants must request that DHS conduct a CRR.
Service-Oriented Approach
The CRR has a service focus.
An organization deploys its assets (people, information, technology, and facilities) to support specific operational missions, or services; for example, the wastewater processing service in a water treatment plant.
A service orientation enables the identification of assets important to achieving an organizational or sector mission.
The service is used to scope the CRR.
[Figure: people, information, technology, and facility assets support services; services support the mission and mission success.]
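To make the service-scoping idea concrete, the sketch below shows one way the asset-to-service mapping could be modeled in Python. It is a minimal illustration only; the class names and example data are assumptions, not part of the CRR method or its tooling.

```python
# Illustrative sketch only; class names and example data are assumptions,
# not part of the CRR method or its tooling.
from dataclasses import dataclass, field

ASSET_TYPES = ("people", "information", "technology", "facilities")

@dataclass
class Asset:
    name: str
    asset_type: str  # one of ASSET_TYPES

@dataclass
class CriticalService:
    name: str                     # the service that scopes the CRR
    mission: str                  # the organizational mission it supports
    assets: list = field(default_factory=list)

# Example from the slide: wastewater processing in a water treatment plant.
service = CriticalService(
    name="wastewater processing",
    mission="treat water safely for the community",
    assets=[
        Asset("plant operators", "people"),
        Asset("treatment process specifications", "information"),
        Asset("SCADA control servers", "technology"),
        Asset("treatment plant", "facilities"),
    ],
)
```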
CRR Domains: What We Examine
Based on the CERT-RMM, the ten domains in the CRR represent important areas that contribute to the cyber resilience of an organization. The domains focus on practices an organization should have in place to assure the protection and sustainment of its critical service.
CRR Domains
AM – Asset Management
CM – Controls Management
CCM – Configuration and Change Management
VM – Vulnerability Management
IM – Incident Management
SCM – Service Continuity Management
RM – Risk Management
EDM – External Dependencies Management
TA – Training and Awareness
SA – Situational Awareness
Cyber Resilience Review Architecture
There are three possible responses to all CRR practice questions: “Yes,” “No,” and “Incomplete.”
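As a concrete illustration of this answer model, the following is a minimal Python sketch of a practice question with its three possible responses, and of a domain holding goal practices and MIL practices. The names are illustrative assumptions, not the DHS data capture format.

```python
# Minimal illustrative sketch; not the DHS/CERT data capture format.
from dataclasses import dataclass, field
from enum import Enum


class Answer(Enum):
    YES = "Yes"
    NO = "No"
    INCOMPLETE = "Incomplete"


@dataclass
class Practice:
    question: str                  # the practice question text
    answer: Answer = Answer.NO     # recorded during the facilitated session


@dataclass
class Domain:
    name: str                                           # e.g., "Asset Management"
    goal_practices: list = field(default_factory=list)  # Practice objects
    mil_practices: list = field(default_factory=list)   # 13 per domain in CRR v2
```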
Cyber Resilience Review Numbers
CRR Domain                          | Goals | Goal Practices | MIL Practices
Asset Management                    |   7   |       24       |      13
Controls Management                 |   4   |        7       |      13
Configuration and Change Management |   3   |       15       |      13
Vulnerability Management            |   4   |       12       |      13
Incident Management                 |   5   |       23       |      13
Service Continuity Management       |   4   |       15       |      13
Risk Management                     |   5   |       13       |      13
External Dependencies Management    |   5   |       14       |      13
Training and Awareness              |   2   |        8       |      13
Situational Awareness               |   3   |        8       |      13
Total                               |       |      139       |     130
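As a quick consistency check, the per-domain counts above reconcile with the 269 questions a CRR asks. A few lines of Python make the arithmetic explicit:

```python
# Goal-practice counts from the table above, in row order.
goal_practices = [24, 7, 15, 12, 23, 15, 13, 14, 8, 8]
mil_practices = 13 * 10            # 13 MIL practices in each of 10 domains

assert sum(goal_practices) == 139                  # table total for goal practices
assert mil_practices == 130                        # table total for MIL practices
assert sum(goal_practices) + mil_practices == 269  # questions asked in a CRR
```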
Process Institutionalization in the CRR
Maturity Indicator Levels (MILs) are used in the CRR to measure process institutionalization.
• Level 0 – Incomplete: practices are incomplete
• Level 1 – Performed: practices are performed
• Levels 2 through 5 – Planned, Managed, Measured, and Defined: processes are acculturated, defined, measured, and governed
Higher degrees of institutionalization translate to more stable processes that:
• produce consistent results over time
• are retained during times of stress
Agenda
Cyber Resilience Review (CRR) Background
How CRR Data Is Collected
Overview of CRR Data That Has Been Collected
CRR Data Analysis Key Observations
Maturity Indicator Level Observations
Domain-Specific Observations
Role of the Navigator
The Navigators
• are assigned at the beginning of the navigation process
• work with the organization to ensure the CRR produces high-quality results
• execute the CRR with the organization’s representatives
The CRR Lead Navigator
• is assigned at the beginning of the navigation process and coordinates the CRR
• works with the organization to complete the CRR Preparation Questionnaire prior to the CRR
• facilitates preparation meetings
• works with the organization to review the initial draft report to validate results
• delivers the final CRR report to the organization
CRR Data Capture Form and Report Preparation
Each domain is in its own section.
Each domain is divided into goals.
Each practice question has three options for answers:
• Yes
• Incomplete
• No
Each practice question has embedded guidance.
Following the completion of the CRR, each navigator’s answers are
• compared
• reconciled where they differ
An initial report is then generated.
Agenda
Cyber Resilience Review (CRR) Background
How CRR Data Is Collected
Overview of CRR Data That Has Been Collected
CRR Data Analysis Key Observations
Maturity Indicator Level Observations
Domain-Specific Observations
CRR Version 2 Data Visualizations
[Chart: Maturity Indicator Level per Domain, 229 organizations. Mean and median MIL score per CRR domain (MIL scale 0 to 5; axis shown 0.0 to 1.2), domains sorted by median.]
CRR Version 2 Data Visualizations
[Chart: Percentage of Questions Answered “Yes” per Domain, 229 organizations. Mean and median percentage of “Yes” answers (0 to 100) per CRR domain, sorted by median percentage of “Yes” answers.]
CRR Version 2 Data Visualizations
[Chart: Median Percentage of “Yes,” “No,” and “Incomplete” Answers, 229 organizations. Stacked median percentages (0% to 100%) per CRR domain, sorted by percentage of “Yes” answers.]
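For readers who want to reproduce this kind of roll-up, the sketch below shows one plausible way to compute per-domain mean and median percentages of “Yes” answers. The record layout and the numbers in it are invented for illustration; they are not the CRR dataset.

```python
# Illustrative aggregation only; record layout and values are invented.
from statistics import mean, median

# organization -> domain -> (yes, no, incomplete) answer counts
results = {
    "org-001": {"AM": (20, 12, 5), "RM": (6, 16, 4)},
    "org-002": {"AM": (30, 5, 2),  "RM": (11, 12, 3)},
    "org-003": {"AM": (15, 18, 4), "RM": (3, 20, 3)},
}

def yes_percentages(domain):
    """Per-organization percentage of 'Yes' answers for one domain."""
    pcts = []
    for answers in results.values():
        yes, no, incomplete = answers[domain]
        pcts.append(100.0 * yes / (yes + no + incomplete))
    return pcts

for domain in ("AM", "RM"):
    pcts = yes_percentages(domain)
    print(f"{domain}: mean {mean(pcts):.1f}%, median {median(pcts):.1f}%")
```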
Agenda
Cyber Resilience Review (CRR) Background
How CRR Data Is Collected
Overview of CRR Data That Has Been Collected
CRR Data Analysis Key Observations
Maturity Indicator Level Observations
Domain-Specific Observations
Examining the Relationship Between MIL Practices and Performance
Influence of Planning on Performance
[Charts: percentage of questions answered “Yes” and MIL scores, grouped by response to the planning question.]
Planning is important:
• Less than 41% of organizations indicate they have a documented plan.
• Those having plans, even partial plans, complete more practices than those that do not.
• Those performing planning, even partial planning, tend to have higher MIL scores.
Examining the Relationship Between MIL Practices and Performance
Influence of Review and Measurement on Performance
[Charts: percentage of questions answered “Yes” and MIL scores, grouped by response to the review-and-measurement question.]
Measurement is important:
• Fewer than 38% of organizations measure their performance.
• Those that conduct review and measurement tend to complete more practices than those that do not.
• Those performing review and measurement, even partially, tend to have higher MIL scores.
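The comparison behind both of these observations can be sketched as a simple split-and-average. The field names and figures below are hypothetical stand-ins for the CRR records, not actual assessment data.

```python
# Hypothetical records: each organization's answer to a domain's planning
# MIL question and its share of practices answered "Yes" in that domain.
from statistics import mean

orgs = [
    {"plan_answer": "Yes",        "yes_pct": 74.0},
    {"plan_answer": "Incomplete", "yes_pct": 58.0},
    {"plan_answer": "No",         "yes_pct": 33.0},
    {"plan_answer": "No",         "yes_pct": 27.0},
]

# "Even partial" planning counts Incomplete answers alongside Yes.
with_plan = [o["yes_pct"] for o in orgs if o["plan_answer"] in ("Yes", "Incomplete")]
without_plan = [o["yes_pct"] for o in orgs if o["plan_answer"] == "No"]

print(f"plan (even partial): {mean(with_plan):.1f}% of practices completed")
print(f"no plan:             {mean(without_plan):.1f}% of practices completed")
```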
Agenda
Cyber Resilience Review (CRR) Background
How CRR Data Is Collected
Overview of CRR Data That Has Been Collected
CRR Data Analysis Key Observations
Maturity Indicator Level Observations
Domain-Specific Observations
Asset Management (AM) Summary Results
• Our results show that even though 73% of organizations identify their services and roughly 60% to 83% inventory their assets, approximately 35% to 50% of organizations do not associate inventoried assets with the critical services they support.
Controls Management (CM) Summary Results
• Our results show that only between 45% and 50% of assessed organizations have established control objectives for the assets that support the critical service. This indicates that controls and safeguards may be misaligned with resilience requirements.
• Additionally, only 47% of organizations prioritize controls based on their potential to impact the service. This indicates that a large percentage of organizations may not have a robust understanding of which controls are most effective in protecting the assets that support the service.
Configuration and Change Management (CCM)
Summary Results
• Only 57% of organizations operate a change management process for information assets. If modifications to information assets are not carefully controlled, their availability, integrity, and confidentiality may be compromised.
• Approximately 68% of organizations implement capacity management, configuration management, and configuration baselines.
Configuration and Change Management (CCM)
Summary Results
• An area of particular weakness in the CCM domain is that only 38% of organizations implement methods (whether technical, administrative, or physical inspection) to actively discover changes to their technology assets.
Vulnerability Management (VM) Summary Results
• Our results show that, across the four asset types, 53% to 65% of organizations have not developed a strategy to guide their vulnerability management effort.
• Additionally, 46% to 58% of organizations have not determined a standard set of tools or methods to identify vulnerabilities.
Incident Management (IM) Summary Results
• While 73% of assessed organizations perform event detection, only 55% have a process to declare incidents, and only 36% have developed criteria to guide personnel in declaring incidents.
Service Continuity Management (SCM) Summary
Results
• In general, less than 53% of organizations have fully developed and documented service continuity plans for the assets that support the critical service.
Service Continuity Management (SCM) Summary
Results
• Less than 40% of assessed organizations execute a formal test of their service continuity plans and compare test results to test objectives to identify improvements.
• Testing is often the only opportunity for an organization to determine whether its plans will achieve the plan objectives and meet the sustainment requirements of the service.
Risk Management (RM) Summary Results
• Of the organizations assessed, only 33% have established an operational risk management plan to guide their risk management efforts.
• The risk management plan is the heart of any resilience program, and the absence of this plan greatly hinders an organization’s ability to effectively manage risk.
External Dependencies Management (EDM)
Summary Results
• 78% of the organizations assessed indicated that they identify external dependencies that are critical to the service.
• Only 54% of organizations report that they follow a process for maintaining their catalog of external dependencies, and about half of the organizations indicate that they identify and manage risks from those external dependencies.
• Only 40% of organizations prioritize external dependencies, indicating a weakness in focusing effort on the most important of these dependencies.
Training and Awareness (TA) Summary Results
• Only 33% of organizations conduct training activities.
• A mere 21% evaluate the effectiveness of the training that they provide.
Situational Awareness (SA) Summary Results
• Only 31% of organizations have implemented threat monitoring procedures, despite more than 60% of these organizations having assigned responsibility for monitoring threat information.
• In addition, only 15% of organizations indicated that they have a documented plan for performing situational awareness activities.
Contact Information
Robert A. Vrtis
CISSP
Senior Engineer
Cybersecurity Assurance – CS2
CERT | Software Engineering Institute | Carnegie Mellon University
703.247.1399 (Office)
[email protected]
Philip A. Scolieri
Senior Information & Infrastructure Security Analyst
Cybersecurity Assurance – CS2
CERT | Software Engineering Institute | Carnegie Mellon University
412-268-7067 (Office)
[email protected]
Supporting Material
List of Acronyms Used
AM – Asset Management
CCM – Configuration and Change Management
CIKR – Critical Infrastructure and Key Resources
CM – Controls Management
CMU – Carnegie Mellon University
CRR – Cyber Resilience Review
DHS – Department of Homeland Security
EDM – External Dependencies Management
IM – Incident Management
MIL – Maturity Indicator Level
RM – Risk Management
RMM – Resilience Management Model
SA – Situational Awareness
SCM – Service Continuity Management
SEI – Software Engineering Institute
SLTT – State, Local, Tribal and Territorial
TA – Training and Awareness
VM – Vulnerability Management
Key Definition
Operational resilience: the organization’s ability to adapt to risk that affects core operational capabilities. Operational resilience is an emergent property of effective operational risk management, supported and enabled by activities such as security and business continuity. A subset of enterprise resilience, operational resilience focuses on the organization’s ability to manage operational risk, whereas enterprise resilience encompasses additional risks such as business risk and credit risk.
[CERT-RMM]
Cyber Resilience Review Maturity Indicator Levels (MILs)
• MIL-0 Incomplete – indicates that practices in the domain are not being
performed
• MIL-1 Performed – indicates that practices that support the goals in a
domain are being performed
• MIL-2 Planned – indicates that practices are performed and supported by
planning, policy, stakeholders, and relevant standards and guidelines
• MIL-3 Managed – indicates that practices are performed, planned, and are
supported by governance and adequate resources
• MIL-4 Measured – indicates that practices in a domain are performed,
planned, managed, and supported by measurement, monitoring, and
executive oversight
• MIL-5 Defined – indicates that practices in a domain are performed,
planned, managed, measured, and supported by enterprise standardization
and analysis of lessons learned.
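As a rough illustration of how these cumulative levels could translate into a per-domain score, the sketch below assumes a level is attained only when every practice at that level and at all lower levels is answered “Yes.” This is a simplification; the official CRR scoring rubric may differ in detail.

```python
# Simplified cumulative MIL scoring; the official CRR rubric may differ.
def mil_score(practices_by_level):
    """practices_by_level maps level (1..5) to the list of answers
    ("Yes" / "No" / "Incomplete") for that level's practices."""
    score = 0
    for level in sorted(practices_by_level):
        if all(answer == "Yes" for answer in practices_by_level[level]):
            score = level          # fully institutionalized at this level
        else:
            break                  # levels are cumulative: stop at first gap
    return score

# Example: all MIL-1 practices performed, one MIL-2 practice incomplete.
answers = {1: ["Yes", "Yes", "Yes"], 2: ["Yes", "Incomplete"], 3: ["No"]}
print(mil_score(answers))  # -> 1
```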
Examining the Relationship Between MIL Practices and Performance
Influence of Planning on Performance
Examining the Relationship Between MIL Practices and Performance
Influence of Planning on Performance
[Chart: Influence of Planning on Organizational Performance by Domain: Percentage of “Yes” Answers]
Examining the Relationship Between MIL Practices and Performance
Influence of Planning on Performance
[Chart: Influence of Planning on Organizational Performance per Domain: MIL Scores]
Examining the Relationship Between MIL Practices and Performance
Influence of Review and Measurement on Performance
Examining the Relationship Between MIL Practices and Performance
Influence of Review and Measurement on Performance
[Chart: Influence of Measurement on Organizational Performance per Domain: Percentage of “Yes” Answers]
Examining the Relationship Between MIL Practices and Performance
Influence of Review and Measurement on Performance
[Chart: Influence of Measurement on Organizational Performance per Domain: MIL Scores]