SAVANNAH RIVER SITE PERFORMANCE METRIC MANUAL

WSRC-RP-2002-00252
SRS Performance Metrics Manual, Rev. 0
TABLE OF CONTENTS
OVERVIEW
BACKGROUND AND HISTORY
BENEFITS
READING THE CHARTS
READING THE SUMMARY SHEETS
CALCULATING SUMMARY SHEETS
COLOR VALUES AND DEFINITIONS
DOE/CONTRACTOR INVOLVEMENT
SIX SIGMA AND SRS PERFORMANCE METRICS
MANAGING METRICS
COMMUNICATION
LIST OF CURRENT SRS PERFORMANCE METRICS
DEFINITIONS, GOALS AND CONTACTS FOR SRS PERFORMANCE METRICS
WASHINGTON GOVERNMENT GROUP INITIAL DIRECTIONS FOR PERFORMANCE METRICS
APPENDIX A: SRS PERFORMANCE METRIC PRESENTATION
APPENDIX B: NUCLEAR SAFETY
OVERVIEW
This manual provides the “nuts and bolts” that
the Savannah River Site (SRS) used in
developing the site’s performance metrics.
The Background and History section explains
how the site began the system, including
where concepts were first established. The
Benefits section describes the major benefits
the site feels it has derived from the system.
The next sections, Reading the Charts,
Reading the Summary Sheets, and Calculating
Summary Sheets, were taken from an SRS
newspaper article, where the charts were
explained to site personnel. These sections
give an explanation of how to read the
information provided. To be successful, these metrics require the involvement of Westinghouse Savannah River Company's (WSRC) customer, the Department of Energy (DOE). The DOE/Contractor Involvement section explains how the two have worked together at the Savannah River Site. Six Sigma is now being used throughout major companies in the United States, and WSRC is adopting the tools from Six Sigma to analyze the underlying processes of the performance metrics. The Six Sigma section explains how WSRC is using these tools and what is expected from this effort. The Managing Metrics section describes specific steps used by the WSRC point of contact to manage over 60 key performance indicators. These metrics required a communication strategy to inform site employees about the system, which is described in the Communications section.
To further help the reader in developing a set
of metrics for another DOE site, a list of SRS
metrics, the definitions, goals and SRS subject
matter experts for each metric, and the initial
directions from the Washington Government
Group are provided.
Appendix A is a presentation used by the WSRC Executive Vice President, Dave Amerine, with notes. Nuclear Safety performance metrics definitions used at SRS are explained in Appendix B.
It is hoped that this manual will assist other
sites in the development of performance
metrics. Comments and questions can be
directed to Gail Jernigan, WSRC, Building
703-A Room A-129, Aiken, South Carolina.
She can also be reached by calling 803-725774 or by email at [email protected].
BACKGROUND AND HISTORY
During the Executive Safety Conference in
December 2001, Greg Rudy, Department of
Energy Savannah River Operations Office
(DOE-SR), and Bob Pedde, Westinghouse
Savannah River Company (WSRC) President,
volunteered for the Savannah River Site (SRS)
to become a pilot for the DOE Complex, using
a performance metric system discussed at the
conference.
This system uses key performance indicators
(KPIs) as its basis. The color rollup scheme
was established by the Institute of Nuclear
Power Operations (INPO) after Three Mile
Island. It provides a quick status summary,
which can be modified to suit various
customers. (For the commercial world, this
might include INPO, the Nuclear Regulatory Commission, the Board of Directors, and other stakeholders.) The underlying principle
behind each metric is to use objectivity to
assess SRS performance. This system provides
key information at a glance but allows “drill
down” to identify issues and actions. Instead
of focusing on events, it provides an easy
assessment of trends. It also encourages the
sharing of expertise and knowledge and allows
benchmarking of successes as well as
analyzing problem areas.
While this metric system was discussed with
the DOE at the Safety Conference, for several
years SRS had been using performance
metrics based on the DOE-SR Site Manager’s
Focus Areas of Safety and Security; Technical
Capability; Community, State and Regulatory
Relationships; Cost Effectiveness; and
Corporate Perspective. The process of
defining and measuring performance in these
Focus Areas heightened the awareness of the
SRS mission and objectives outlined in the
SRS Strategic Plan and helped focus
management attention on achieving results.
Using these metrics helped SRS continuously
improve operations by clearly targeting areas
for improvement. The results of analyses were
available in summary format on a quarterly
basis in a newsletter and displayed on posters
around the site. Detailed files and backup
material for the ratings were posted on the
site’s intranet so that all organizations and
employees could have access to the
information.
In addition, WSRC has been using the KPI
color rollup system since January 2001.
Washington Government Group (WGG) (now
Washington Energy and Environment), a
division of Washington Group International,
had initiated the development of performance
metrics, using the same INPO format. Dave
Amerine, WGG Executive Vice President, led
the effort to develop a set of meaningful
performance metrics for all companies within
the division. During the previous year, he had
met with crosscutting subject matter experts to
determine a set of metrics and their
definitions. A standard description and roll-up
algorithms were developed. Because of the
diversity of business units, the group was
challenged to develop a set of consistent and
useful parameters. This diversity of business
units is analogous to the DOE Complex.
After becoming the pilot for the DOE
Complex, SRS staff members used metrics
from both the Focus Area metrics and the
WSRC Corporate metrics to develop the list of
site metrics. They combined the various
metrics into the five Focus Areas. The list of
metrics is being refined as DOE-SR and
WSRC continue to monitor them.
As a further refinement, WSRC is also using
the same format and list of metrics for each
division within the company. Not all metrics
will be applicable to a division, and divisions
may have additional metrics that management
wishes to track to assess the division’s
performance. Facilities will also track their
performance metrics using this same format
and the list of metrics. WSRC anticipates
implementation of facility metrics by the end
of 2002.
BENEFITS
Listed below are some of the benefits of this system.
- The focus is on performance and results through key performance indicators (KPIs). As a result
of the Government Performance and Results Act and other Congressional actions, it is
imperative that the site demonstrates results. This system gives a visual display of the site’s
results, as well as performance.
- Color rollup allows quick “drill down” to source of problem area. By looking at colors on the
summary charts, management can quickly determine the areas of concern. A look at the next level
of detail can provide information as to which particular area needs attention. Additional details
can be found in the individual KPI. The site can also determine if too many resources are being
used with little benefit.
- KPIs as a basis are objective. Eliminating the subjectivity of performance metrics allows
management to focus on results and focus resources on areas of concern.
- The Analysis/Action section of the KPI allows assessment of problem status. A critical part of each KPI,
the Analysis/Action section analyzes successes and failures. If the site is doing well in a particular
area, others can learn from them. If the KPI is trending to Yellow or Red, this section will help
management determine if appropriate actions are being taken to remedy the problems in the area
being measured.
- This system has the ability to accommodate various needs. The use of the summary panels is
flexible to allow different groupings to satisfy needs of DOE-HQ, Defense Nuclear Facility
Safety Board, Congress, and other stakeholders. At the same time, the various KPI do not have to
be changed.
- An analysis of the summary panels can promote the sharing and prioritizing of resources and ideas
across the site. The site can use the Analysis/Action section to promote sharing of ideas across the
DOE Complex to the benefit of all. In addition, if one focus area is doing extremely well,
management can use this system to determine if resources should be reallocated to areas that need
more attention.
- The use of this system can promote consistency in approaches across the site. By using the same
INPO format for all metrics, site personnel can quickly assess the performance of a particular KPI
without having to learn different performance measurement systems.
- This system will facilitate continuous improvement across the site and the DOE Complex. The use of these KPIs gives site personnel the ability to analyze and improve their performance.
READING THE CHARTS
[Example box: “Safety and Security,” with four small quarterly score boxes across the top, each marked “G,” and a larger current-month box marked “G” in the bottom left corner.]
This box represents the Safety and Security box from the Summary Charts. The small boxes across the
top are the scores (color-coded) for the past four quarters. The oldest quarter is on the left, and the most
current quarter is on the right. Others have used this to show the last four months instead of quarters.
However, using quarters allows the reader to see what a particular area has done over the past 12 months.
The letter “G” in the box represents “Green” for those who may not see this printed in color. The larger
box is the current month, which also has a “G” for “Green” in the bottom left corner.
[Summary Chart: the five Focus Areas and their Level 1 indicators, each shown as a row of color-coded quarterly and current-month boxes. Level 1 indicators appearing on the chart include Industrial Safety, Emergency Services and Fire Protection, Radiological Safety, Nuclear Safety, Physical Security, Production, Infrastructure, Engineering, Project Management, Disciplined Operations, Feedback and Improvement, Environmental Compliance Index, Environmental Release Index, Public Perception, Public Participation Program, Employee Relations, Financial Forecasts, Financial Performance, PBI Performance, Nuclear Non-Proliferation, EM Integration, and Oversight.]
Reading the Summary Sheets
This is the Summary Chart for all of the performance metrics. The first column represents the five DOE
Focus Areas: Safety and Security; Technical Capability and Performance; Community, State and
Regulatory Relationships; Cost Effectiveness; and Corporate Perspective.
Reading across provides the Level 1 performance for each focus area. For example, Safety and Security
has the following Level 1 performance indicators: Industrial Safety; Emergency Services and Fire
Protection; Radiological Safety; Nuclear Safety; and Physical Security. For more details, a reader would
go to the next level. (See chart below.) At this level, the individual key performance indicators (KPIs) or
the result of rolling up several KPIs are shown. This allows readers to quickly see where the site is
excelling and where there might be problems. Using both charts and the individual performance metrics,
managers, DOE-HQ, and others can “drill down” to an area of their particular interest.
[Fictitious Example: Performance Indicators Through March 31, 2002. An expanded chart showing the Safety and Security Focus Area, its Level 1 indicators, and the Level 2 KPIs rolled up under each: Industrial Safety and Health (Total Recordable Case Rate; Days Away, Restricted or Transferred Rate; Cost Index); Emergency Services and Fire Protection (Fire Protection Impairment Status; EPHA Annual Review/Revision; Corrective Actions; Emergency Exercises/Drills); Radiological Safety (Reportable Dose Exceedances; Reportable Contamination Events); Nuclear Safety (Nuclear Safety Issue Management Index; Significant Nuclear Safety Incidents Index); and Physical Security (Security Incidents).]
This is the expanded version of the Safety and Security section shown on the previous page. Here the
reader can see that Safety and Security is made up of Industrial Safety; Emergency Services and Fire
Protection; Radiological Safety; Nuclear Safety; and Physical Security. Further study shows that
Industrial Safety is composed of Total Recordable Case Rate; Days Away, Restricted or Transferred Rate; and Cost Index.
SAFETY AND SECURITY
Radiological Safety
Reportable Contamination Events
Through March 31, 2002

[Bar chart: number of contamination events per month, April through March, on a scale of 0 to 10, with a line showing the Goal.]

Data
Month                 Apr    May    Jun    Jul    Aug    Sep    Oct    Nov    Dec    Jan    Feb    Mar
Contamination Events    1      2      3      3      2      4      4      3      4      5      4      6
Goal                    1      1      1      1      1      1      1      1      1      1      1      1
Score                 White  White  Yellow White  White  Yellow Yellow Yellow Yellow Yellow Yellow Red

Definition
This chart reflects the number of personnel contamination events per month.

Goal
Green: ≤1 contamination event
White: >1 and ≤3 contamination events
Yellow: >3 and ≤5 contamination events
Red: >5 contamination events
Analysis / Action
WHITE: Analysis: Over the past year, this metric has been trending upward, with more contamination events each month. While management has been stressing PPE use to avoid additional contamination events, personnel still resist using the new suits, resulting in more events. Action: Staff will survey personnel to discover the reasons for the resistance to the new suits. The consequences of not using the suits will be explained to each employee who uses them, and additional training on the use of PPE will be provided to avoid future contamination events.
Comments
KPI Owner: Jim Stafford (803) 952-9888 SME: Athena Freeman (803) 952-9938
This is a fictitious individual performance metric (or KPI) in the INPO format. Specifically, this shows
the site’s performance on Contamination Events. The upper section provides a graph, where readers can
see how the site’s performance has been over the past 12 months. It includes a line for the goal.
Underneath the graph is the actual data for this particular metric. Underneath that are four boxes. The first
box is the “Definition,” which describes the metric being discussed by describing what is being measured.
The box below that one is the “Goal,” which describes the site goal and the values for the various color-coded scores. To the right of the “Definition” box is the “Analysis/Action” section that describes why the
site is where it is today. It answers questions such as “what are we doing to reach this level of
performance and what are we doing to sustain this level of performance.” If the color is green or white,
this section would describe how the site is able to attain this level of performance. If
the color is yellow or red, this section would describe specific actions the site is taking to improve in this
area. Finally, in the lower right is a comment section with the Key Performance Indicator Owner and the
subject matter expert. This section also includes any other comments about the metric.
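The color bands in the fictitious Goal box above amount to a simple threshold function. Here is a minimal Python sketch; the function name is illustrative and not part of the SRS system:

```python
def score_contamination_events(events: int) -> str:
    """Map a month's personnel contamination events to a color score,
    using the thresholds from the fictitious Goal box above."""
    if events <= 1:
        return "Green"
    if events <= 3:
        return "White"
    if events <= 5:
        return "Yellow"
    return "Red"

print(score_contamination_events(2))  # White
print(score_contamination_events(6))  # Red
```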
CALCULATING SUMMARY SHEETS
To summarize the metrics, each key performance indicator (KPI) is assigned a value based on the
color of the performance for the month and/or quarter as shown below.
Rating                     Color     Points
Excellent                  Green     3 points
Satisfactory/Normal        White     2 points
Tracking to Satisfactory   Yellow    1 point
Unsatisfactory             Red       0 points
To determine the value for a summary, the values for the individual metrics are added. (SRS KPIs are
equally weighted at this time. A more sophisticated differential weighting may be added later.) For
example, if a summary has 4 metrics with color values of green (3 points), white (2 points), white (2
points) and green (3 points), the total for that summary is 10 points (3+2+2+3=10). Using the table
below, since there are 4 metrics and the total is 10 points, the value for the summary is Green.
Values (thresholds by number of applicable KPIs to be evaluated)

Number of KPIs     2     3     4     5     6     7     8
Green             ≥6    ≥8   ≥10   ≥13   ≥15   ≥18   ≥20
White             ≥4    ≥5    ≥7    ≥9   ≥11   ≥12   ≥14
Yellow            ≥2    ≥3    ≥5    ≥7   ≥10   ≥10   ≥12
Red               <2    <3    <5    <7   <10   <10   <12
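The roll-up described above can be sketched in a few lines of Python. This is an illustration only: the point values are those given in the manual, while the cut-offs assume the four-KPI column of the table reads Green at 10 points or more, White at 7 or more, and Yellow at 5 or more, consistent with the worked example (3+2+2+3 = 10 points → Green). The function name is illustrative.

```python
# Point values from the manual's scoring table
POINTS = {"Green": 3, "White": 2, "Yellow": 1, "Red": 0}

def rollup_four_kpis(colors):
    """Roll four equally weighted KPI colors into one summary color,
    using the assumed four-KPI thresholds described above."""
    total = sum(POINTS[c] for c in colors)
    if total >= 10:
        return "Green"
    if total >= 7:
        return "White"
    if total >= 5:
        return "Yellow"
    return "Red"

# The manual's worked example: Green + White + White + Green = 10 points
print(rollup_four_kpis(["Green", "White", "White", "Green"]))  # Green
```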
COLOR VALUES AND DEFINITIONS
G  - Excellence: performance exhibits a significant strength, such as achievement of long-term goals.
W  - Satisfactory/Normal: satisfactory performance. Management should watch the performance to ensure that it improves.
Y  - Less than Satisfactory: performance needs improvement and management attention.
R  - Unsatisfactory: weakness; performance is significantly below the goal, or the annual goal is not expected to be reached.
B  - No Data: this performance measure did not have data to report. Also used for charts that are under development and do not yet have data to report.
NA - Not Applicable.
DOE/CONTRACTOR INVOLVEMENT
Before beginning a performance metric system such as the one discussed in this manual, both DOE and WSRC worked together to develop a list of performance metrics. SRS found that senior management involvement is imperative to ensure that the site is measuring the “right” performance areas. SRS's list of KPIs is an evolving product, with input from management and subject matter experts.

However, most sites probably already have numerous performance metrics. Existing metrics are recommended as a starting point since site personnel are already familiar with these metrics. These are probably the items that management considers most important for tracking.

Once the list of metrics was determined, both DOE and WSRC worked together to determine the definition of each metric. What information should be measured? How should it be measured? Since these were metrics the site was already using, SRS management asked, “Are we measuring the ‘right’ information?”

Next, DOE and WSRC developed the goal and scoring levels for green, white, yellow, and red for each metric. Where should the “bar” be? Consideration was given for existing site resources and the current level of performance.

For individual KPIs, KPI owners and subject matter experts from DOE-SR and WSRC review the various sections of the appropriate KPI. The list of KPIs and groupings are reviewed and determined by senior management.

This manual includes the list currently being used at SRS and a generic list, proposed by Washington Government Group (now Washington Energy and Environment), to be used by the companies within its division. Either or both of these can be used in developing a list of metrics.

The most important section of a KPI is the Analysis/Action section. After WSRC provides this section to DOE-SR, DOE reviews it to determine if WSRC understands the data and the level of performance. This analysis section should explain why the site's performance is where it is. Furthermore, the action section should provide activities and steps WSRC is planning to take to either improve or sustain performance. DOE-SR reviews this section to determine if WSRC is responding appropriately.
SIX SIGMA AND SRS PERFORMANCE METRICS
Overview
Many companies in the United States are now
embracing Six Sigma.
Six Sigma is a
management tool that helps define and
implement needed process improvements.
The improvements are evaluated based on the
sigma level of the process. The sigma level is
a business metric used to indicate the
performance of a process or service to the
customer's specification. WSRC is currently
integrating the use of Six Sigma tools into the
evaluation of the site’s performance metrics.
As such, the Six Sigma tools will be used to
statistically evaluate and improve the site
processes that directly impact the site's
performance metrics. The following discussion provides an overview of Six Sigma and also presents the initial plan, developed at SRS, for integrating Six Sigma with the business metrics.
Six Sigma:
The Six Sigma tools are being applied by all
types of companies worldwide in order to:
− Improve customer satisfaction;
− Improve profit margins and reduce
costs;
− Reduce cycle times; and,
− Gain greater predictability of the
results.
The Six Sigma success strategy obtains results
by reducing the cost of poor quality (COPQ),
using a data-driven approach. COPQ can include lost contracts,
cancelled projects, lost funding, injuries, as
well as reviews, inspections, waste, defects,
etc. The Six Sigma tools are used to identify
the critical drivers/inputs for each process
output. Actions and controls are then put in
place to improve the overall process outputs
(i.e. reduce the number of defects and process
variation). By identifying and controlling
these critical drivers/inputs, the COPQ will be
reduced.
In order to measure the process capability
relative to the customer specification, average
performance of the process and variation in
the process, a sigma level is calculated for
each process. This provides a standard means
to compare processes and also provides a solid
data driven foundation from which
improvements to each process can be
measured. The specific goal of Six Sigma
tools is to optimize each process to a sigma
level that is cost effective for that specific
process.
The bottom line goal is to utilize the Six
Sigma tools to manage each process in an
effective and efficient manner. This management approach has been successfully applied in both the manufacturing and transactional business worlds. As such,
WSRC will be using the Six Sigma tools to
help drive improvements in both their
manufacturing and transactional processes in
order to improve the overall site performance.
WSRC Integration Strategy:
WSRC will be integrating the Six Sigma tools into the management and evaluation of the SRS performance metrics. The primary goals for
this integration are:
− Improve the ability to predict future
site performance;
− Identify the critical actions or
processes associated with each metric;
− Identify potential improvements for the
critical actions or processes; and,
− Implement controls on the critical
actions or processes to ensure
improvements are sustained.
In order to meet these goals, the following
strategy utilizing Six Sigma tools is being
developed and implemented:
1. Prioritize the Level 1 site metrics. This
prioritization will be completed in order to
begin evaluating the key site metrics. The
key site metrics are those metrics that
could be improved, leading to
positive impacts relative to the SRS Goals.
2. Evaluate the Level 2 metrics for each
Level 1 metric. This evaluation will:
a. Determine if the process measured
by the metric is in control;
b. Identify the customer specification limits;
c. Determine the current capability
for the process (determining the
sigma level);
d. Determine if the process is
effective (Is the current defect rate
acceptable? What is the cost
associated with the current defect
rate?);
e. Determine if the process is
efficient (Is the cycle time of the
process acceptable? Is the cost to
conduct the process acceptable?);
and
f. Establish measurable goals for
improvement to each of the Level
2 metrics.
3. Determine critical processes for each
Level 2 metric relative to the improvement
needs.
4. Identify process improvement projects
(yellow belt and black belt efforts) for
critical processes.
5. Establish overall scorecard and track
process improvements.
Summary:
The current SRS site metrics provide a history
of past performance and depict current site
performance. The Six Sigma tools will allow
WSRC to predict future performance based on
this past history and will be used to improve
the overall site performance. This will allow
WSRC to manage in a proactive manner based
on the predicted performance of the metrics.
With the integration of Six Sigma tools into
the overall SRS business, WSRC expects to
improve the overall site performance while
decreasing the overall site costs.
MANAGING METRICS
WSRC found that no one person can be a
subject matter expert on all the various
metrics; however, a point of contact was
needed to collect and manage the various KPIs
and sum them in the summary pages. A person
was named to manage the performance metric
system. This person has the responsibility to
collect the data and individual KPIs, sum them
using the algorithm, develop the summary
charts, and perform quality assurance on each
KPI. This manager issued an Excel
spreadsheet template for each KPI and asked
each KPI owner and subject matter expert to
provide the information required for each KPI.
The individual KPI owner and subject matter
expert worked with their DOE customer in
developing the definitions, goals, and scoring
values.
At the beginning of each month, since SRS
collects the information monthly, this manager
sends out a reminder by electronic mail to
each KPI owner and subject matter expert.
KPIs are due the seventh business day of each
month to the manager who then collects and
reviews them. Quality assurance reviews are
performed to ensure that each KPI is complete, including the Analysis/Action section. The metrics are also reviewed to
ensure that there is consistency within all
metrics. Senior management decided early that
the graphs should be similar in appearance.
For example, the color bars of the information
should be a consistent color. (WSRC chose
blue for the color of the first bar and red-and-white stripes for the second bar.) Care is taken
to ensure that KPI graphs are easily read if
printed in black and white. The goal is
graphed as a line in green. If there is only one
y-axis and the KPI scoring permits, a separate
color bar is established on the far right,
showing the levels for green, white, yellow,
and red so that the reader can quickly establish
the scoring color without reading the text. This
is simply a series of small color boxes stacked
on top of one another.
Once the KPIs are submitted and quality
assurance reviews are complete, the
performance metric manager summarizes the
KPIs by the various focus areas and then the
final overall summary sheet. Another quality
assurance review is performed by another
individual to ensure that the package is
correct. A package of all summary sheets and
KPIs is prepared in Adobe Acrobat, which is
sent to both DOE-SR and WSRC senior
managers and placed on the site’s Intranet.
Summary pages are posted in or around the
DOE-SR and WSRC senior management
offices, conference rooms, and other
administrative areas across the site. A copy is
sent to all KPI owners and subject matter
experts. This is all accomplished by the
fifteenth of each month so that the posting is
available to the site in a timely manner.
COMMUNICATION
For this performance metric system to be truly
effective for the Savannah River Site,
communication with all employees was essential.
Newsletter articles, videos, Intranet, employee
communications and numerous meetings were
used to describe the system and its impacts.
As site subject matter experts were developing
individual key performance indicators (KPIs),
other staff members were strategizing
communication processes. Before the metrics
were released, a short news article was placed
in the site’s newspaper. The next month a
more extensive center page spread was used to
provide the biggest impact to all employees.
(In fact, much of the section on describing the
metric system came from the newspaper.)
In addition, the DOE-SR point of contact, Jeff
Allison, Assistant Manager for Health, Safety,
and Technical Support Division, and the
WSRC Executive Vice President, Dave
Amerine, met with many site organizations, councils, and committees to teach site employees about the new system. Examples include the DOE-SR Senior Staff meeting, WSRC Senior Staff (all WSRC Vice Presidents and General Managers), the WSRC Deputy Managers Council, the WSRC Program Managers Committee, the WSRC performance metric KPI owners and subject matter experts meeting, and an all-managers meeting. Essentially, the same presentation
was used in these meetings, ensuring
consistent communication across the site. (See
Appendix A for a copy of the presentation.)
One of these presentations was videotaped
and placed on the site’s Intranet, allowing all
SRS employees to learn about the new metric
system. Names and phone numbers of points
of contact were published in numerous places
so that staff members could call with any
questions.
When the first set of metrics was issued, the
summary sheets were placed on large poster
boards and displayed around the site. Most
prominently, the posters were placed outside both the DOE-SR Manager's and the WSRC President's offices. Other locations included
various conference rooms and other
administrative areas. These posters have been
updated monthly to reflect the most current set
of summary charts. As people become more
familiar with these charts, specific KPIs will
be placed on these posters and other locations.
To allow for these posters to be updated
monthly, each chart is laminated before being
placed on the poster, using rubber cement.
(Rubber cement is easily removed each month
while not damaging the posters.)
The entire set of metrics is placed on the site’s
Intranet each month. To allow for further
historical perspective, past months’ metrics
stay on the site’s Intranet.
LIST OF CURRENT SRS PERFORMANCE METRICS
To manage and find the key performance indicators, a numbering system, similar to an outline, is
used. Under this system, each DOE Focus Area is numbered, using Roman Numerals. Safety and
Security is I, Technical Capability and Performance is II, Community, State, and Regulatory
Relationships is III, etc. For the next level, Level 1, an alphabetic numbering system is used. Under
Safety and Security, Industrial Safety and Health is A, Emergency Services and Fire Protection is B,
Radiation Safety is C, Nuclear Safety is D, and Physical Security is E. The next level, Level 2, which
for SRS is mainly individual KPIs, a number is assigned. For example, under Industrial Safety and
Health, Total Recordable Case Rate is 1, Days Away, Restricted or Transferred Rate is 3, and Cost
Index is 3.
This numbering system is used by naming each file in Excel and is the page number in the set of
charts. Doing this makes it easier to find a particular KPI, Level 1, or Focus Area either in the set of
printed metrics or on a computer. For example, Total Recordable Case Rate is I-A-1, Days Away,
Restricted or Transferred is I-A-2, and Cost Index is I-A-3.
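The identifier scheme described above can be sketched in a few lines of code. This is purely an illustration of the outline-style numbering (the site's actual charts are Excel files); the function and constant names are invented here.

```python
# Illustrative sketch of the SRS KPI numbering scheme (not site software).
ROMAN_NUMERALS = ["I", "II", "III", "IV", "V"]  # DOE Focus Areas

def kpi_id(focus_area: int, level1: int, level2: int) -> str:
    """Build an identifier such as 'I-A-1' from 1-based outline indices."""
    roman = ROMAN_NUMERALS[focus_area - 1]   # Focus Area: Roman numeral
    letter = chr(ord("A") + level1 - 1)      # Level 1: letter
    return f"{roman}-{letter}-{level2}"      # Level 2: number

print(kpi_id(1, 1, 1))  # I-A-1: Total Recordable Case Rate
print(kpi_id(1, 1, 3))  # I-A-3: Cost Index
```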
I. Safety and Security
   A. Industrial Safety and Health
      1. Total Recordable Case Rate
      2. Days Away, Restricted, or Transferred Rate
      3. Cost Index
   B. Emergency Services and Fire Protection
      1. Fire Protection Impairment Status
      2. Emergency Preparedness Hazards Assessments Annual Review/Revision
      3. Emergency Management Corrective Actions
      4. Emergency Exercises and Drills Conducted Versus Scheduled
   C. Radiation Safety
      1. Events Resulting in Reportable Dose Exceedances
      2. Reportable Contamination Events
   D. Nuclear Safety
      1. Nuclear Safety Issue Management Index
      2. Significant Nuclear Safety Incidents Index
   E. Physical Security
      1. Security Incidents
II. Technical Capability and Performance
   A. Production
      1. Materials Processed Towards Stabilization
      2. Environmental Release Sites Assessments and Completions
      3. Tritium Loading and Finishing Lead Time
      4. Defense Waste Processing Facility FY01-06 Canister Performance
   B. Infrastructure
      1. Preventive Maintenance/Predictive Maintenance
      2. Delinquent Predictive Maintenance
      3. Corrective Maintenance Backlog in Man-Hours
   C. Disciplined Operations
      1. Conduct of Operations
      2. Violation/Inadequate Procedures
      3. Pollution Prevention and Waste Reduction
   D. Engineering
      1. Design Engineering Changes
      2. Engineers Participating in IDEAS
      3. Engineering Productivity
   E. Project Management
      1. Project Management - Schedule
      2. Project Management - Cost
      3. Project Management Health

III. Community, State, and Regulator Relationships
   A. Environmental Release Index
      1. Reportable Environmental Events
      2. Projected Dose to the Public
   B. Environmental Compliance
      1. Enforceable Agreement Milestones - Federal Facility Agreement
      2. Environmental Enforcement Actions (Notice of Violation)
      3. South Carolina Department of Health and Environmental Control and Environmental Protection Agency Index
      4. Environmental Commitments Index
   C. Public Participation Program
      1. Citizens Advisory Board Responsiveness
      2. Public Meeting Effectiveness
      3. Public Involvement Response to Inquiries
   D. Public Perception
      1. Tours/Visitors' Program Effectiveness
      2. Low Country Plan Implementation
      3. Education Outreach
   E. Employee Relations
      1. Number of Open Employee Concerns per 100 Employees
      2. Average Number of Days to Closure of Employee Concerns
      3. Staffing to Target
IV. Cost Effectiveness
   A. Financial Forecasts
      1. Cumulative Budget Cost Performance for DP
      2. Cumulative Budget Cost Performance for EW02
      3. Cumulative Budget Cost Performance for EW04
      4. Comparison of Actual Overtime Cost to Budget
   B. PBI Performance
      1. Performance Based Incentives Completed
      2. Super Stretch Performance Based Incentives Earnings
   C. Financial Performance
      1. WSRC Encumbrance Summary (Operations)
      2. On-Time Payments
      3. On-Time Invoicing
      4. Cost Savings Performance (PACE)
      5. Employees Cost Effectiveness Performance (IDEAS)
   D. Feedback and Improvement
      1. Nonconformance Report Processing Status
      2. Problem Identification Report Status
      3. Self-Assessment Findings
      4. Facility Evaluation Board Grade Status

V. Corporate Perspective
   A. Oversight
      1. Defense Nuclear Facility Safety Board 94-1 and 2000-1 Recommendation Implementation
   B. Nuclear Non-Proliferation
      1. Schedule Performance
      2. Annual Operating Plan Milestone Performance
      3. Cost Performance
   C. EM Integration
      1. Receipts of Rocky Flats Plutonium
      2. Ship to Waste Isolation Pilot Plant
      3. Mound Receipts
DEFINITIONS, GOALS AND CONTACTS FOR SRS PERFORMANCE METRICS
The following section provides a brief definition and goal of each KPI used at SRS. In the Goal section, the color scoring is included. In addition,
SRS subject matter owners and experts and their phone numbers are provided to encourage exchanges of ideas.
I-A-1 Total Recordable Case Rate and I-A-2 Days Away, Restricted, or Transferred Rate
Definition: The Occupational Safety and Health Act of 1970 requires covered employers to prepare and maintain records of occupational injuries and illnesses. WSRC measures safety performance in accordance with these guidelines. The incidence rates are calculated by multiplying the number of injuries, illnesses, or lost workdays by 200,000 and dividing by the total number of work hours.
Goal: TRC Rate Goal = 0.93; DART Rate Goal = 0.30. Green < 2% above goal; White 2-7% above goal; Yellow 7-12% above goal; Red > 12% above goal.
Contacts: KPI Owner: Kevin Smith (803-952-9924); SME: Linda Blackston (803-952-9905)

I-A-3 Cost Index
Definition: The Cost Index metric measures the severity of the injury or illness. It is equal to the approximate dollar loss (direct and indirect) per 100 hours worked of all injuries and illnesses, calculated using weighting factors derived from a study of the direct and indirect dollar costs of injuries.
Goal: Cost Index Goal = 3.00. Green < 2% above goal; White 2-7% above goal; Yellow 7-12% above goal; Red > 12% above goal.
Contacts: KPI Owner: Kevin Smith (803-952-9924); SME: Linda Blackston (803-952-9905)

I-B-1 Fire Protection Systems Impairment Status
Definition: This indicator depicts the percent of unplanned impairments of fire protection suppression and detection systems on site. A system is considered available if it is not impaired. Problems are detected by operations, maintenance, or fire department personnel through observation or routine checks. Fire protection suppression and detection systems are combined; WSRC does not track operable versus inoperable. This indicator will be revised effective with the March data to track both unplanned impairments and impairments that have exceeded their planned duration.
Goal: Maintain unplanned fire protection system impairments at less than 2%. Green < 2% of fire protection systems impaired; White 2-4% impaired; Yellow 4-10% impaired; Red > 10% impaired.
Contacts: KPI Owner: Terri Bolton (803-725-5173); SME: Richard Lewis (803-725-5211)

I-B-2 Emergency Preparedness Hazard Assessment Annual Review/Revision
Definition: This indicator depicts the review and revision status of Emergency Preparedness Hazard Assessments.
Goal: Green = accomplish 84-100% within 12 months and 100% within 15 months; White = accomplish 95-100% within 15 months; Yellow = accomplish 75-94% within 15 months; Red = less than 75% within 15 months.
Contacts: KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-B-3 Emergency Management Corrective Actions
Definition: This indicator depicts the status of corrective actions for Functional Area 13 related deficiencies at the site and facility level.
Goal: Green = 5% or less overdue; White = 10% or less overdue; Yellow = 15% or less overdue; Red > 15% overdue.
Contacts: KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-B-4 Emergency Exercises and Drills Conducted vs. Scheduled
Definition: This indicator depicts the number of exercises and drills scheduled each month versus the number actually conducted.
Goal: Green > 80% of scheduled drills conducted; White = 65-79%; Yellow = 50-64%; Red < 50% of scheduled drills conducted.
Contacts: KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

I-C-1 Events Resulting in Reportable Dose Exceedances
Definition: This chart reflects the number of internal exposure cases from radioactive materials resulting in ≥ 100 mrem committed effective dose equivalent (CEDE)* or external doses greater than the site ACL (or FB-Line ACL for FB-Line employees) in a single event.
*Committed Effective Dose Equivalent (CEDE): the cumulative whole body dose a worker would receive over a 50-year period from internal exposure.
Goal: Green = 0 events resulting in reportable dose exceedances; Yellow = 1; Red > 1.
Contacts: KPI Owner: Jim Stafford (803-952-9888); SME: Athena Freeman (803-952-9938)

I-C-2 Reportable Contamination Events
Definition: This chart reflects the number of personnel contamination events per month.
Goal: Green < 1 contamination event; White = 1-3; Yellow = 3-5; Red > 5 contamination events.
Contacts: KPI Owner: Jim Stafford (803-952-9888); SME: Athena Freeman (803-952-9938)

I-D-1 Nuclear Safety Issue Management Index
Definition: This index measures the number of unresolved safety issues (NIs and PISAs) over the previous three months compared to the last 12-month period. Additionally, the average time and weighted mean time of open reports factor into the measurement formula. Unresolved is defined as those NIs for which a PISA declaration decision has not been made, or PISAs which have not been rolled up into a JCO or any other AB document. (Additional details on Nuclear Safety can be found in Appendix B.)
Goal: A measurement of less than 1 during the quarter. Green < 1; White = 1; Yellow = 2; Red = 3.
Contacts: KPI Owner: George Clare (803-952-7222); SME: Andrew Vincent (803-952-7209)
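The incidence-rate arithmetic and percent-above-goal color bands used for I-A-1 and I-A-2 can be expressed as a short calculation. This is an illustrative sketch, not site software: the function names are invented, and the handling of values falling exactly on a band edge is assumed, since the published thresholds (< 2%, 2-7%, 7-12%, > 12% above goal) leave the boundaries ambiguous.

```python
def incidence_rate(cases: float, total_work_hours: float) -> float:
    """Cases per 200,000 work hours (the normalization used for the
    TRC and DART rates: cases x 200,000 / total work hours)."""
    return cases * 200_000 / total_work_hours

def color_score(rate: float, goal: float) -> str:
    """Color band from percent above goal; boundary handling assumed."""
    pct_above = (rate - goal) / goal * 100
    if pct_above < 2:
        return "Green"
    if pct_above < 7:
        return "White"
    if pct_above < 12:
        return "Yellow"
    return "Red"

# 25 recordable cases over 5,000,000 hours gives a rate of 1.0;
# against the TRC goal of 0.93 that is about 7.5% above goal -> Yellow.
rate = incidence_rate(25, 5_000_000)
print(rate, color_score(rate, 0.93))
```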
I-D-2 Significant Nuclear Safety Incidents Index
Definition: This metric reviews and grades all Unusual Occurrences (UOs) and other events related to Nuclear Safety listed in SIRIM under Facility Condition-01 for a rolling 3 months. UOs relating to the most serious incidents, such as For Cause Shutdowns, OSR/TSR violations, and Criticality Limit Challenges, are graded more severely in the formula. (Additional details on Nuclear Safety can be found in Appendix B.)
Goal: The goal for the site is less than 15. Green < 10; White = 10-19; Yellow = 20-29; Red = 30 or greater.
Contacts: KPI Owner: George Clare (803-952-7222); SME: Andrew Vincent (803-952-7209)

I-E-1 Physical Security
Definition: This indicator depicts trending data for security incidents. WSRC does not separate Security Events from Computer Security Events and Document Security Events; therefore, for this metric all security events are combined.
Goal: Minimize the number of security incidents across the site. Green = 0-8 events; White = 9-13 events; Yellow = 14-18 events; Red = 19 or more events, based on 15,000 WSRC employees.
Contacts: KPI Owner: Chris Baker (803-725-5096); SME: Lynda Blystone (803-557-9254)

II-A-1 Materials Processed Towards Stabilization
Definition: Combined stabilization of
1) items converted to Pu metal from solution and packaged in Bagless Transfer,
2) RFETS Pu metal canned in Bagless Transfer,
3) MK 16/22 assemblies and SFO containers charged,
4) oxide containers processed,
5) residue containers dispositioned,
6) RBOF storage units moved to L-Basin, and
7) FRR/DRR casks unloaded and processed.
Monthly and cumulative scheduled versus actual stabilization activity is shown. The sum total of these units of production is the number needed to meet the WSRC Nuclear Material Management Vision 2006 Roadmap Base Case and the corresponding base fee for the applicable Performance Based Incentive in accordance with the WSRC/DOE Performance Baseline.
Goal: Monthly: Green > 80% of scheduled; White = 65-79% of scheduled; Yellow = 50-64% of scheduled; Red < 50% of scheduled.
Contacts: KPI Owner: John Dickenson (803-952-4604); SME: Harry Pund (803-952-3684)

II-A-2 Environmental Release Sites Assessments and Completions
Definition: This metric measures ER progress in assessing and completing release sites against its commitment to aggressively remediate inactive waste sites. Specific annual goals are reported to Congress, and a plan to accomplish these pre-established numbers is reviewed on a monthly basis. Assessments are complete when the key document for a site is submitted to the final approval authority. This includes submission of a Rev. 0 Record of Decision (ROD), RCRA Permit Closure Plan, or a Site Evaluation Report ending in a No Further Action (NFA). Release sites are complete when the physical cleanup is complete and a final report is submitted for approval. An SE request for NFA is considered cleanup complete. A No Action recommendation in a Rev. 0 ROD is complete upon regulatory approval.
Goal: ER's goal for FY02 is to assess a total of 15 sites and complete at least 13 release sites. A plan is identified to accomplish these goals based on commitments incorporated in ER's actual scope of work. (This schedule for actual accomplishment may vary in conjunction with valid change control.) Scoring: Green = exceeding both goals; White = at least meeting goals; Yellow = behind on one goal; Red = behind on both goals.
Contacts: KPI Owner: Dean Hoffman (2-6837); SME: Bob Pride (2-6479)

II-A-3 Tritium Production: This Level 2 is composed of two KPIs: Reservoir Performance and Tritium Loading and Finishing Lead Time.

II-A-3 Reservoir Performance
Definition: This indicator depicts the completion of Function Tests in support of Reservoir Surveillance Operations (RSO), the Life Storage Program (LSP), and Production Sampling (PS). It represents the PBI goal and the stretch PBI goal related to Function Tests and DP's progress toward reaching these goals.
Goal: The FY01 PBI goal was 101.7 and the stretch PBI goal was 116.95. For FY02, the PBI goal is 117 and the stretch PBI goal is 129. Green = 10% over PBI stretch goal for the month; White = 0% to 9.99% over PBI stretch goal for the month; Yellow = up to 5% below stretch goal for the month; Red = more than 5% below stretch goal for the month.
Contacts: KPI Owner: Cheryl Cabbil (208-1234); SME: Henry King (208-1729)

II-A-3 Tritium Loading and Finishing
Definition: This indicator shows Defense Programs' progress toward meeting monthly DOE shipping requirements.
Goal: Ensure reservoirs are ready to meet shipping requirements. Average Loading Lead Time: Green > 60 days; White > 45 days; Yellow > 30 days; Red < 30 days. Average Finishing Lead Time: Green > 30 days; White > 20 days; Yellow > 15 days; Red < 15 days.
Contacts: KPI Owner: Cheryl Cabbil (208-1234); SME: Henry King (208-1729)

II-B-1 Preventive Maintenance/Predictive Maintenance
Definition: This metric measures the effectiveness of the Preventive Maintenance (PM) and Predictive Maintenance (PdM) programs by indicating what percentage of total maintenance is being performed as PM/PdM. This indicator was revised to reflect man-hours versus item count.
Goal: Perform 50% of total maintenance as PM/PdM. A monthly index scoring is applied to the goal as follows: Green = 50-100%; White = 40-49%; Yellow = 30-39%; Red < 30%.
Contacts: KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)
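The II-B-1 index is a straight percentage with fixed bands. A minimal sketch, with an invented function name and the band edges as published:

```python
def pm_pdm_score(pm_pdm_hours: float, total_maint_hours: float) -> str:
    """Percent of total maintenance man-hours performed as PM/PdM,
    mapped to the published bands (Green 50-100%, White 40-49%,
    Yellow 30-39%, Red < 30%)."""
    pct = pm_pdm_hours / total_maint_hours * 100
    if pct >= 50:
        return "Green"
    if pct >= 40:
        return "White"
    if pct >= 30:
        return "Yellow"
    return "Red"

print(pm_pdm_score(45_000, 100_000))  # 45% of hours as PM/PdM -> White
```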
II-B-2 Delinquent Predictive Maintenance
Definition: This metric measures the total number of delinquent Preventive Maintenance (PM) work orders; these PMs are not field work complete, but have exceeded their scheduled date and any allowable grace period.
Goal: Reduce delinquent PMs relative to a previous average of 215. A monthly index scoring is applied to the 70% goal as follows: Green < 70% of goal; White = 71-110% of goal; Yellow = 111-130% of goal; Red > 131%.
Contacts: KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)

II-B-3 Corrective Maintenance Backlog in Man-Hours
Definition: This metric measures the total Corrective Maintenance (CM) backlog in man-hours. Corrective Maintenance excludes work associated with Preventive Maintenance and Plant Modifications.
Goal: Reduce the CM backlog (in hours) relative to a previous average of 103,871 hours. A monthly index scoring is applied as follows: Green < 75% of average; White = 76-105% of average; Yellow = 106-130% of average; Red > 131% of average.
Contacts: KPI Owner: Chuck Campbell (803-725-2726); SME: Rick Fleming (803-725-1460)

II-C-1 Conduct of Operations
Definition: This indicator is composed of the normalized sitewide monthly total of ORPS/SIRIM reportable occurrences in 16 selected Nature of Occurrence categories, as defined in DOE M 232.1-1A, which are deemed to be of ConOps significance. The value is derived by dividing the number of SIRIM recordables by headcount, which is normalized for 200,000 work hours.
Goal: A sitewide normalized cumulative value per month of 4.5. Green < 4.5; White = 4.5-5.5; Yellow = 5.5-6.0; Red > 6.0.
Contacts: KPI Owner: Steve Johnson (803-952-9886); SME: Joy Price (803-952-9657)

II-C-2 Violation/Inadequate Procedures
Definition: This indicator depicts the normalized sitewide monthly total of ORPS/SIRIM reportable occurrences in the Nature of Occurrence category 01F, Violation/Inadequate Procedures, as defined in DOE M 232.1-1A. The value is derived by dividing the number of SIRIM recordables by headcount, which is normalized for 200,000 work hours.
Goal: A sitewide normalized cumulative value per month of 1.5. Green < 1.5; White = 1.5-2.0; Yellow = 2.0-2.5; Red > 2.5.
Contacts: KPI Owner: Steve Johnson (803-952-9886); SME: Joy Price (803-952-9657)
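The II-C-1 and II-C-2 descriptions normalize SIRIM recordables by headcount for 200,000 work hours. The exact headcount arithmetic is not spelled out, so the sketch below uses the same per-200,000-hour convention as the safety rates; that reading, and the function names, are assumptions for illustration only.

```python
def normalized_occurrences(recordables: int, total_work_hours: float) -> float:
    """Occurrences per 200,000 work hours -- an assumed reading of the
    ConOps headcount normalization described in the manual."""
    return recordables * 200_000 / total_work_hours

def conops_score(value: float) -> str:
    """II-C-1 bands: Green < 4.5, White 4.5-5.5, Yellow 5.5-6.0, Red > 6.0."""
    if value < 4.5:
        return "Green"
    if value <= 5.5:
        return "White"
    if value <= 6.0:
        return "Yellow"
    return "Red"

v = normalized_occurrences(50, 2_500_000)  # 4.0 occurrences per 200,000 hours
print(v, conops_score(v))
```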
II-C-3 Pollution Prevention and Waste Reduction
Definition: This indicator depicts WSRC's progress toward meeting a 20% reduction in newly generated radioactive and hazardous solid waste volumes based on the DOE-SR approved waste forecast. The objective is to implement changes that decrease the amount of waste generated, thereby reducing current and future operating cost and environmental and personnel risks.
Goal: Achieve an annualized disposal volume reduction of 20% (1,222 cubic meters) based on the DOE-SR approved solid waste forecast. WSRC documentation of implemented waste reduction initiatives' impacts determines both waste volume and operating cost reduction performance. Green > 10% above the monthly prorated goal; White = 0-9% above the monthly prorated goal; Yellow = 1-9% below the monthly prorated goal; Red > 10% below the monthly prorated goal.
Contacts: KPI Owner: Luke Reid (803-557-6309); SME: Tim Coffield (803-557-6316)

II-D-1 Design Engineering Changes
Definition: The purpose of this metric is to measure design changes caused by design errors. This is accomplished by tracking the fraction of design changes required because an error was made by engineering in providing requirements or in performing the design. The value charted is the design changes NOT caused by design error. This metric provides SRS and WGI with an indication of the quality of engineering design.
Goal: Show improvement over historical performance. Green = 95% or higher; White = 85-94%; Yellow = 75-84%; Red < 75%.
Contacts: KPI Owner: George Clare (803-952-7222); SME: Monica Jorque (803-952-9246)

II-D-2 Engineers Participating in IDEAS
Definition: The purpose of this metric is to measure participation by engineers in improvements to products and processes through the employee involvement program. This is accomplished by tracking submittals by engineers to the IDEAS program. This metric provides SRS and WGI an indicator of the degree to which engineers are initiating recommendations for product/process improvements.
Goal: The monthly goal (40) is derived from an annual goal of 475 IDEAS submittals (averaging 0.3 submittals per engineer). Green = 90% of goal; White = 80-89% of goal; Yellow = 70-79% of goal; Red < 70% of goal.
Contacts: KPI Owner: George Clare (803-952-7222); SME: Brenda Kelly (803-725-0676)

II-D-3 Engineering Productivity
Definition: The purpose of this metric is to track engineering productivity by combining specific Division measurements, normalizing for a goal of 1.00, and weighting each measure by the ratio of the Division Engineering organization population to the Site Engineering organization population. Each Division metric is linked to deliverables, commitment fulfillment, or equipment availability. As an example, HLW measures commitment fulfillment, whereas P&CT measures computer system availability, USQS&Es, new or revised procedures, and PMTs. Since this is a revised metric, the data will continue to be analyzed with an eye to both improving performance and improving the metric.
Goal: Maintain a normalized ratio for the composite measure and the individual Division measures each month. Green = 1.25 or better; White = 0.85-1.24; Yellow = 0.65-0.84; Red < 0.64.
Contacts: KPI Owner: George Clare (803-952-7222); SME: Andrew Vincent (803-952-7209) or William Tucker (803-952-7182)
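The II-D-3 composite weights each Division measure by its share of the Site Engineering population. A sketch written directly from that description, with hypothetical names and example figures:

```python
def composite_productivity(division_scores, division_populations):
    """Population-weighted average of Division productivity measures,
    each already normalized to a goal of 1.00 (per the II-D-3
    description). Inputs are parallel sequences."""
    site_population = sum(division_populations)
    return sum(score * pop / site_population
               for score, pop in zip(division_scores, division_populations))

# Two hypothetical divisions: 100 engineers scoring 1.2, 300 scoring 0.9.
# Composite = (1.2*100 + 0.9*300) / 400, i.e. about 0.975 (White band).
print(composite_productivity([1.2, 0.9], [100, 300]))
```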
II-E-1, 2, and 3 Project Management Schedule, Cost, and Health
Definition: Applies to all active projects (i.e., Line Item, General Plant, Capital Equipment, and cost funded). The indicators are officially measured quarterly. Prior to the beginning of each quarter within each focus area, attributes and initiatives are developed and weighted relative to customers' expectations jointly by WSRC and DOE-SR for each project. Performance against these attributes and initiatives is scored jointly by WSRC and DOE-SR at the end of the quarter. For the months in between, an internal assessment and score are generated based upon monthly review meetings with DOE-SR. This KPI includes an assessment of the monthly project Forecast At Completion (FAC) as part of the cost metric.
Goal: Achieve a score of 95 or greater for schedule, cost, technical, and overall. Green is "Excellent" (95-100); White is "Very Good" (85-94); Yellow is "Good" (75-84); Red is "Poor" (less than 75).
Contacts: KPI Owner: Steve Morton (803-952-6620); SME: David M. Matos (803-952-7294)

III-A-1 Reportable Environmental Events
Definition: This indicator depicts the number of Reportable Environmental Events (non-compliances as determined by receipt of regulatory notification from SCDHEC and/or EPA) and the number of National Response Center notifications of releases greater than a reportable quantity. The objective of this indicator is to measure continuous improvement in the SRS Environmental Management System. One point is assigned for each release. REEs include events (letters of Non-Compliance) and releases (greater than Reportable Quantity).
Goal: Achieve a 20% reduction in REEs based on WSRC's current FY02 performance compared to the previous five-year performance average. (The FY01 goal was 6.) The FY02 goal was lowered from 6 to 4 or fewer points. Scoring for REE: Green = 4 or fewer; White = 5; Yellow = 6; Red > 6.
Contacts: KPI Owner: Patricia Allen (803-725-1728); SME: Dave Lester (803-725-2904)

III-A-2 Projected Dose to the Public
Definition: Operations at SRS that are permitted for the release of radionuclides operate far below regulatory standards. Consistent with the ALARA program, this indicator measures how effectively operations are performed to maintain permitted releases below site-determined ALARA Guides. This assures the public and site stakeholders that operations are not resulting in doses to the public near regulatory limits.
Goal: Show that operations do not exceed 100% of the ALARA Release Guide for a composite of liquid and air releases. Green < 80%; White = 80-84%; Yellow = 85-89%; Red = 90-100%.
Contacts: KPI Owner: Patricia M. Allen (803-725-1728); SME: Dave Lester (803-725-2904)
III-B-1 Enforceable Agreement Milestones - Federal Facility Agreement
Definition: This metric measures Environmental Restoration Division progress in completing regulatory milestone deliverables. These deliverables are contained in the Federal Facility Agreement and the RCRA permit. Milestones are negotiated annually with EPA and SCDHEC and are modified in accordance with the FFA.
Goal: ER's goal is to meet all milestones on or before the due date. These milestones are identified in Appendix D of the FFA, the RCRA permit, and other official correspondence and documents that set out regulatory expectations. Since a missed milestone can lead to regulatory enforcement actions, there is no acceptable number that can be missed. Scoring: Green = no missed milestones plus significant, high-value negotiated regulatory outcomes; White = no missed milestones; Red = 1 or more missed milestones.
Contacts: KPI Owner: Kim Cauthen (2-6540); SME: Patty Burns (2-6750)

III-B-2 Environmental Enforcement Actions (Notices of Violation)
Definition: This indicator depicts the number of Notices of Violation (NOVs) and fines issued to WSRC by SCDHEC and/or EPA. Total NOVs/fines paid are weighted and have been rescaled to reflect current practices for SCDHEC issuance of NOVs: NOV without fine = 0.5 point; NOV with fine < $10K = 1 point; NOV with fine between $10K and $50K = 3 points; NOV with fine > $50K = 6 points. The scale for scoring has been modified to reflect this weighting scheme.
Goal: 3 or fewer points. Green <= 3 points; White = 4-6 points; Yellow = 7-9 points; Red > 9 points.
Contacts: KPI Owner: Patricia Allen (803-725-1728); SME: Dave Lester (803-725-2904)

III-B-3 South Carolina Department of Health and Environmental Control and Environmental Protection Agency Index
Definition: This metric measures responsiveness to SCDHEC and/or EPA environmental compliance assessments. SCDHEC conducts most of the assessments (CWA NPDES 3560, CAA, RCRA CME, SDWA, UST). EPA participates as its schedule allows and leads assessments for CERCLA, TSCA, and FIFRA. Points are assigned to findings and days delinquent to develop an index: 0 points for < 3 findings, 5 points for 3-4 findings, 7 points for 5-10 findings, and 10 points for > 10 findings; 0 points for 0 days delinquent, 10 points for 1-4 days, 15 points for 5-10 days, and 20 points for > 10 days.
Goal: Show adherence to regulations and responsiveness to SCDHEC and/or EPA environmental compliance assessments, with fewer than three findings per audit or a numeric index total of no more than 5 points. Green < 5 points; White = 6-15 points; Yellow = 16-22 points; Red > 23 points.
Contacts: KPI Owner: Patricia Allen (803-725-1728); SME: David Lester (803-725-2904)
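The III-B-2 NOV weighting can be written directly from the published point values. The treatment of a fine of exactly $10K or $50K is assumed here, since the brackets as published overlap slightly; the function name is illustrative.

```python
def nov_points(fine_dollars=None):
    """Points for one Notice of Violation per the III-B-2 weighting:
    no fine = 0.5, fine < $10K = 1, $10K-$50K = 3, > $50K = 6."""
    if fine_dollars is None:
        return 0.5          # NOV issued without a fine
    if fine_dollars < 10_000:
        return 1.0
    if fine_dollars <= 50_000:
        return 3.0
    return 6.0

# Two NOVs without fines plus one NOV with a $20K fine:
# 0.5 + 0.5 + 3.0 = 4.0 points, which falls in the White band (4-6).
total = nov_points() + nov_points() + nov_points(20_000)
print(total)
```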
III-B-4 Environmental Commitments Index
Definition: This indicator depicts WSRC progress toward meeting 100% of regulatory commitments. The objective of this indicator is to measure performance in meeting federal and state environmental commitments. It is a combined graph that includes both the Overdue Commitments Index and the Open Regulatory Items Index; federal and state regulatory agency commitments are applicable to WSRC.
Goal: Achieve 100% compliance in meeting regulatory commitments on schedule. Ratings for percent on schedule: Green = 95-100%; White = 90-94%; Yellow = 85-89%; Red < 85%. Rating for meeting environmental compliance commitments: Green = 100%; Red < 100%.
Contacts: KPI Owner: Patricia Allen (803-725-1728); SME: Dave Lester (803-725-2904)

III-C-1 Citizens Advisory Board Responsiveness
Definition: This metric depicts the Department of Energy's responsiveness to SRS Citizens Advisory Board recommendations. It measures the number of days to respond to a recommendation.
Goal: DOE-SR responds to CAB recommendations within 10 days of receipt. Green = all responses in less than 10 days; White = any response in 11-20 days; Yellow = any response in 20-30 days; Red = any response more than 30 days from the date of receipt.
Contacts: KPI Owner: Teresa Haas (803-725-4643); SME: Dawn Haygood (803-725-9668)

III-C-2 Public Meeting Effectiveness
Definition: This indicator measures the effectiveness of public meetings based on DOE and stakeholder feedback, meeting preparation/follow-up, and attendance.
Goal: Maintain a score above 75 points. Points are assigned based on feedback forms, previous year averages, and checklist completion. Green = a score of 75 or above; White = 70-74; Yellow = 65-69; Red < 65.
Contacts: KPI Owner: Teresa Haas (803-725-4643); SME: Dawn Haygood (803-725-9668)

III-C-3 Public Involvement Response to Inquiries
Definition: This metric measures how quickly SRS Public Involvement staff respond to public inquiries made via phone, email, and meetings.
Goal: Respond within a 5-day period. Green = average response in less than two days; White = average response in 3 days; Yellow = average response in 4 days; Red = response > 5 days.
Contacts: KPI Owner: Teresa Haas (803-725-4643); SME: Dawn Haygood (803-725-9668)

III-D-1 Tours/Visitors Program Effectiveness
Definition: SRS tours are offered to the public to educate them on site activities and capabilities. A survey is given to each tour group of 5 or more touring SRS that contains the following questions: "Before the tour today, how did you feel about SRS?" and "How do you feel about SRS now that your tour is complete?" The SRS Opinion Survey goal reflects the feelings of tour participants toward the site. It is important that those who participate leave the site with an opinion that is equal to or better than when they came.
Goal: Strive for positive, continuous improvement in the public's perception of SRS. Green: x > 3.75; White: 3.5 < x < 3.75; Yellow: 3.0 < x < 3.5; Red: x < 3.0.
Contacts: KPI Owner: Teresa Haas (803-725-4643); SME: Laurie Posey (803-725-5505)
III-D-2 Low Country Plan Implementation
Definition: This metric is designed to demonstrate WSRC's performance in meeting the milestone commitments identified in the WSRC 2002 Low Country Outreach Strategy and Plan. This tool focuses management attention to ensure that overdue and not-completed milestones receive the required level of attention through completion. The formula is YTD Completed divided by YTD plus Milestones Overdue. The due date is either the original target date identified in the plan or a revised date that has been approved via an informal change control process with the DOE-SR customer.
Goal: Green = 95-100%; White = 85-94%; Yellow = 80-84%; Red < 80%.
Contacts: KPI Owner: Lessie B. Price (803-502-9983); SME: Phillip T. Mottel (803-502-9984)

III-D-3 Education Outreach is composed of the Traveling Science Program and Internships at SRS.

III-D-3 Traveling Science
Definition: This metric indicates the number of requests received from teachers in the CSRA for scientists and engineers at the Savannah River Site to demonstrate scientific, mathematics, or engineering topics in the classroom. The metric also indicates the requests from teachers that were completed during the quarter.
Goal: The goal of the Traveling Science program is for CSRA teachers to have access to Savannah River Site scientists and engineers. EOP strives to complete 100% of the requests from teachers. Green = 100%; White = 70-99%; Yellow = 50-69%; Red < 50%.
Contacts: KPI Owner: Barbara Smoak (803-725-7761); SME: Bonnie Toole (803-725-7473)

III-D-3 Internships at SRS
Definition: Internship programs offer opportunities to the education community to gain experience and knowledge in their field and build a pipeline of future skilled workers to support site missions. This metric measures the number of interns utilizing site resources by participating in the Research Internship Program and the School-to-Work Program.
Goal: Provide at least 100 internship opportunities at SRS through the School-to-Work program and the Research Intern program. Green > 200 participants; White = 130-199; Yellow = 100-130; Red < 100.
Contacts: KPI Owner: Barbara Smoak (803-725-7761); SME: Bonnie Toole (803-725-7473)

III-E-1 Number of Open Employee Concerns per 100 Employees
Definition: This indicator reflects the number of concerns per 100 employees. The information provides a common denominator by which WSRC can identify adverse trends. WSRC can use this data, internally and externally, for benchmarking with other organizations.
Goal: Maintain concerns per 100 employees at a factor of less than 2.00. Green < 2.00; White = 2.00-3.00; Yellow > 3.00 and < …
Contacts: KPI Owner: John Runnels (803-725-4919); SME: Larry Adkinson (803-725-1951)
III-E-2 Average Number of Days to Closure of Employee
Concerns: This indicator reflects the average number of days
required to internally resolve employee issues through the
WSRC Employee Concerns Program. Timely resolution is
directly related to the nature of the concern and depth of
investigation required. Thoroughness and accuracy of the
investigation are often related to customer satisfaction, which is
important to our employees, customers, and Board members and
is also an important factor of risk management. However, the
time factor is but one element of quality.
III-E-3 Staffing to Target: This indicator demonstrates how
closely the current workforce is aligned with projected staffing
targets. Staffing plans and workforce skills analyses are updated
to coincide with the program planning cycle. Together, they
highlight both shortages and excesses in terms of manpower
resources and worker skill sets.
IV-A-1 Cumulative Budget Cost Performance for DP, EW02
and EW04: This indicator shows fiscal year to date operating,
capital equipment, and general plant project expenditures shown
as compared to budget. Line items are excluded. Measurement
will be done for the major B&R structures on the site, EW02,
EW04 and DP.
IV-A-4 Comparison of Actual Overtime Cost to Budget: This
metric compares FYTD monthly overtime actuals as a percent of
overtime budget.
IV-B-1 Performance-Based Incentive Performance: This
indicator reflects contract performance success as indicated by
fee from the WSRC/Division PBI milestones completed. The
percentage shown is the milestones completed versus the
4.00; and Red is > 4.00.
Goal: Resolve issues expeditiously such
that the average days to closure remains
less than 60 days. (This factor may vary
widely and have less validity when small
numbers of concerns are being reported.)
Green is defined as < 60 days; White is
defined as 61-90 days; Yellow is defined
as 91 to 120 days; and Red is defined as
> 121 days.
KPI Owner John Runnels (803725-4919) SME: Larry Adkinson
(803-725-1951)
Goal is to maintain a staffing to target
ratio of 97% or higher. Green stop light
is 98.6-100%; White is 97-98.5%;
Yellow is 95-96.9%, and Red is less than
94.9%.
KPI Owner Ted Myers (803-7255-4691) SME: Willie Bell (803725-4207)
The goal for this metric is for actual
expenditures be close to budgeted
expenditures. Green is where the
performance is > 95% and < 100%,
White is 85-95% or 100-102%, Yellow
is 80-85% and 102-105%, and Red is <
80% and > 105%.
The goal for this metric is for actual
overtime is to be less the than or equal to
102% of budget. Green is where the
actual is equal to or <102% White is
103-106%, Yellow is 107-110%, and
Red is > 110%.
The ranges utilized reflect the higher
expectations of WSRC and are in excess
of those required in the Contract.
Green: x> 95%
KPI Owner: Dan Becker (803725-1124) SME: Susan Crosby
(803-725-7668)
WSRC KPI Owner: Dan Becker
(803-725-1124)
SME Paula Hardin (803-7251767)
KPI Owner: Clay Jones (803-7254409) SME: Phil Croll (803725-3158)
SRS Performance Metrics Manual, Rev. 0
WSRC-RP-2002-00252
29
projected completions. Total fee invoiced (within 5 days of
month end) is shown as the measurement. Adjustments have
been/are made for WSRC/DOE approved baseline changes.
IV-B-2 Super Stretch Performance-Based Incentives
Performance: This indicator reflects contract performance
success as reflected by fee from Super Stretch (SS) PBIs
completed. The percentage shown is the actual cash flow
generated versus the projected cash flow for currently approved
SS PBIs. Adjustments are made when new SS initiatives are
approved or when operating circumstances require cash flow
projection revision.
IV-C-1 WSRC Encumbrance Summary (Operations): To
measure the quarterly Operating Encumbrance Summary
(exclusive of Capital Equipment and GPP) of the Major B&R's:
DP09, EW02,and EW04 against the established target.
IV-C-2 On Time Payments: This indicator reflects actual
monthly invoices submitted as a percentage of approvals from
DOE within 10 days time frame.
IV-C-3 On Time Invoicing: This indicator reflects milestones
completed versus milestones invoiced (within five working
days). The measure is the fee associated with the milestones.
White: 85% < x < 95
Yellow: 75 < x <85
Red: x<75%
The goal is to maximize super stretch
earnings throughout the fiscal year.
Success is measured as follows:
Green: x > 90%
White: 80 < x, 90%
Yellow: 70% < x < 80
Red: x < 70%
The goal of this metric is for the actual
encumbrance be close to the target
encumbrance. Green is where the
performance is >95% and <100%.
White is 85-95% and 100 - 102%.
Yellow is 80-85% and 102 - 105%. Red
is <80% and > 105%.
This indicator reflects actual monthly
invoices submitted as a percentage of
approvals from DOE within 10 days time
frame.
Green: X > 90%
White: 80% < X < 90%
Yellow: 75% < X < 80%
Red: X < 75%
The goal is to maintain an on-time
(within five working days of completion)
invoicing ratio of greater than 90% for
all PBI milestones/ objectives completed.
Success is measured as follows:
Green: X > 95%
White: 85% X 95%
Yellow: 75% < X < 85%
Red: X < 75%
KPI Owner: Clay Jones (803-7254409) SME: Phil Croll (803725-3158)
KPI Owner: John Cantwell 9529236 SME: Carol Posey 9528233
KPI Owner Dan Becker (803725-1124) SME: Anthony Grant
(803-952-8803
KPI Owner: Clay Jones (803725-4409) SME: Philip Croll
(803-725-3158)
IV-C-4 Cost Savings Performance: This indicator depicts the site's progress toward achievement of the PACE/(CRIT) goal. (The CRIT goal and the PACE goal are the same.) A monthly interim goal has been determined to ensure sufficient progress is being made toward the final goal for the year. Monthly goals are a function of either (1) the total projected value of all current PACE initiatives with a projected implementation date of the current month or earlier, or (2) the total projected value needed by the indicated date to ensure that the site's goal is met by the end of the fiscal year. The actual savings to date vs. the goal to date equals the cost savings performance.
Goal: The Site PACE Goal for fiscal year 2002, as of 1/31/02, is $120,103,000. The FY '02 PACE goal may fluctuate according to the additional funds demand created by emerging scope. Green is >95% of monthly goal; White is defined as >85% and <95% of monthly goal; Yellow is defined as >80% and <85% of monthly goal; and Red is defined as <80% of monthly goal.
KPI Owner: Frank Iwuc (803-725-7199) SME: Mike Tarrant (803-725-0679)

IV-C-5 Employees Cost Effectiveness Performance (IDEAS): To show the number of approved IDEAS needed per month to accomplish the goal of 12 approved IDEAS per 100 employees for the year (or 0.12 IDEAS per employee).
Goal: 1500 approved IDEAS for FY02 (based on 12 approved IDEAS per 100 employees). Green is >90% of monthly goal; White is 80-90% of monthly goal; Yellow is 70-79% of monthly goal; and Red is <70% of monthly goal.
KPI Owner: Frank Iwuc (803-725-7199) SME: Brenda Kelly (803-725-0676)

IV-D-1 Nonconformance Report Processing Status: NCRs document item deficiencies as required by 10CFR830, Subpart A, "Quality Assurance Requirements." This indicator depicts the nonconformance report (NCR) population in 3 age categories: <60 days, between 60 and 120 days, and >120 days old. Suspended NCRs are not included in this metric because they cannot be dispositioned "due to a project cancellation, suspension, extended system shutdown, facility closure or abandonment-in-place" (WSRC 1Q, QAP 15-1).
Green: Population of old NCRs is <= 200.
White: Population of old NCRs is > 200, but <= 250.
Yellow: Population of old NCRs is > 250, but <= 300.
Red: Population of old NCRs is greater than 300.
NOTE: Old NCRs are defined as those active NCRs > 120 days old.
KPI Owner: Joe Yanek (803-952-9893) SME: Sun Hwang (803-952-9926)

IV-D-2 Problem Identification Report Status: This performance metric evaluates the company's ability to identify problems using the Problem Identification Report (PIR) program and effect corrective actions in the time established by the responsible manager. Both the estimated corrective action completion dates and the responsible manager(s) are identified on the PIR document. The data will take into account the Significance Categories of the PIRs.
Green: Average PIR corrective actions completed within 5 days of the estimated completion date.
White: Average PIR corrective actions completed between 5 and 10 working days of the estimated completion date.
Yellow: Average PIR corrective actions completed between 10 and 15 working days of the estimated completion date.
Red: Average PIR corrective actions completed after 15 working days of the estimated completion date.
KPI Owner: Joe Yanek (803-952-9893) SME: Sun Hwang (803-952-9926)

IV-D-3 Self-Assessment Program Effectiveness: This indicator depicts the percent of self-assessment findings for the month that are identified and closed (activity) each month. Additionally, the problem recurrence rate is presented as a function of total findings identified:

Recurrence Rate (%) = (Recurring Findings / Total Findings Identified) x 100

Goal: No more than 10% of findings recur within the assessment unit. Green is <10%; White is >10% and <15%; Yellow is >15% and <20%; and Red is >20%.
KPI Owner: Joe Yanek (803-952-9893) SME: Sun Hwang (803-952-9926)

IV-D-4 Facility Evaluation Board Grade Status: This indicator reflects a running average of the most recent grades of all evaluated facilities and programs. Grades are assigned the following values: Excellent - 5, Above Average - 4, Average - 3, Below Average - 2, Significantly Below Average - 1.
Goal: A running average value of 3.0. Green: >3.5; White: 2.5-3.5; Yellow: 2.0-2.5; Red: <2.0.
KPI Owner: Steve Johnson (803-952-9886) SME: Joy Price (803-952-9657)

V-A-1 Defense Nuclear Facility Safety Board 94-1 and 2000-1 Recommendation Implementation: This indicator depicts the year to date and cumulative status of SRS DNFSB Recommendation 94-1/2000-1 commitments. Beginning with FY01, there were 25 remaining SRS commitments. The remaining DNFSB 94-1/2000-1 milestone commitments are scheduled over the next 7 years, with only a few due each year. It is more meaningful to monitor progress for the entire life of the SRS commitments.
Goal: Complete the remaining DNFSB Recommendation 2000-1 milestones by the due dates in the DOE Implementation Plan for the Remediation of Nuclear Materials in the Defense Nuclear Facilities Complex (Revision 1), dated January 19, 2001. Green: all DNFSB 2000-1 commitments are on schedule; White: one commitment is likely to be late; Yellow: 2-3 commitments are likely to be late; Red: more than three commitments are likely to be late.
KPI Owner: John Dickenson (803-952-4604) SME: Harry Pund (803-952-3684)

V-B reflects schedule, cost and Annual Operating Plan Milestone Performance for three SRS Projects: MOX Fuel Fabrication Facility (MFFF), Pit Disassembly and Conversion Facility (PDCF), and Highly Enriched Uranium Facility (HEU).

V-B-1 MFFF and PDCF Schedule Performance: This indicator shows performance against scheduled deliverables for the PDCF and MFFF project support as a percentage of deliverables to WGI, DCS, or NNSA made on schedule.
Goal: Deliver 85-90% of deliverables on schedule. Green denotes 95-100% delivered on schedule; White is 85-94%; Yellow is 80-84%; and Red is <80% delivered on time.
KPI Owner: Jimmy Angelos (803-725-8593) SME: Dick Tansky (803-725-5303)

V-B-1 HEU Schedule Performance: This indicator shows the monthly Schedule Performance Index (SPI) for the HEU Blend Down Program. The cumulative SPI captures the SPI from inception to date for the HEU Blend Down Project.
Goal: The monthly Budgeted Cost of Work Performed (BCWP) should exceed the Budgeted Cost of Work Scheduled (BCWS). Green is where the performance is >1.00; White is .99-.90; Yellow is .89-.80; and Red is <.79.
KPI Owner: Sam Speight (803) 208-2796 SME: Perry Stanley (803) 208-3192

V-B-2 MFFF and PDCF AOP Milestone Performance: This indicator shows AOP milestone on-time completion for the fiscal year to date for the Pu Disposition Program against the milestones scheduled.
Goal: Complete all AOP milestones on time. Green is 100% of milestones met on time in the current month.
KPI Owner: Jimmy Angelos (803-725-8593) SME: Dick Tansky (803-725-5303)

V-B-3 MFFF and PDCF Cost Performance: This indicator shows fiscal year to date OPC and TEC expenditures in support of the MFFF, PDCF and PDP Waste Building projects as compared to budget.
Goal: Actual expenditures should be close to budgeted expenditures. Green is where performance is >95% and <100%; White is 85-95% or 100-102%; Yellow is 80-85% or 102-105%; and Red is <80% or >105%.
KPI Owner: Jimmy Angelos (803-725-8593) SME: Dick Tansky (803-725-5303)

V-B-3 HEU Cost Performance: This indicator shows the monthly Cost Performance Index (CPI) for the HEU Blend Down Program. The cumulative CPI captures the CPI from inception to date for the HEU Blend Down Project.
Goal: The monthly Budgeted Cost of Work Performed (BCWP) should exceed the Actual Cost of Work Performed (ACWP). Green is where the performance is >1.00; White is .99-.90; Yellow is .89-.80; and Red is <.79.
KPI Owner: Sam Speight (803) 208-2796 SME: Perry Stanley (803) 208-3192

V-C-1 Receipts of Rocky Flats Plutonium, V-C-2 Ship to WIPP, and V-C-3 Mound Receipts are currently being developed.
WASHINGTON GOVERNMENT GROUP INITIAL
DIRECTIONS FOR PERFORMANCE METRICS
This section provides the general definitions and goals of KPIs as developed by the Washington Government Group after a year of study by a crosscutting set of subject matter experts. These are offered as general guidance for sites to use in developing their own metrics. Each site may want to “bin” the KPIs to suit its particular needs.
1. INDUSTRIAL SAFETY INDICATORS

A. Total Recordable Case Rate
Suggested Goal: 3% reduction in the average of the previous three years
Green  = >3% reduction = 3 points
White  = >1% reduction = 2 points
Yellow = >0% reduction = 1 point
Red    = increase      = 0 points
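As a minimal Python sketch of the banding above (the function name and inputs are illustrative, not part of the manual; reduction is measured against the average of the previous three years' rates):

```python
def trc_rate_score(current_rate: float, prior_rates: list[float]) -> tuple[str, int]:
    """Map a Total Recordable Case rate to a color and point score.

    Reduction is computed against the average of the prior years'
    rates, per the suggested goal above.
    """
    baseline = sum(prior_rates) / len(prior_rates)
    reduction = (baseline - current_rate) / baseline * 100  # percent reduction
    if reduction > 3:
        return ("Green", 3)
    if reduction > 1:
        return ("White", 2)
    if reduction > 0:
        return ("Yellow", 1)
    return ("Red", 0)

# Example: rate fell from a three-year average of 2.0 to 1.9 (a 5% reduction)
print(trc_rate_score(1.9, [2.1, 2.0, 1.9]))
```

The same shape of calculation, with the thresholds swapped, applies to the Lost Days and Lost/Restricted Work Day Case rates below.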
B. Lost Days (days away) Rate
Suggested Goal: >5% reduction of average of past 3 years' rates
Green  = >5% reduction = 3 points
White  = >3% reduction = 2 points
Yellow = >0% reduction = 1 point
Red    = increase      = 0 points
C. Lost and Restricted Work Day Case Rate
Suggested Goal: 3% reduction in the average of the previous three years
Green  = >3% reduction = 3 points
White  = >1% reduction = 2 points
Yellow = >0% reduction = 1 point
Red    = increase      = 0 points
2. ENVIRONMENTAL MANAGEMENT LEVEL 2 INDICES

A. Reportable Environmental Events
1.) Number of regulatory non-compliances (as determined by receipt of notification, e.g. Notice of Noncompliance from a Regulatory Agency)
2.) Number of National Response Center notifications (CERCLA/EPCRA release > reportable quantity)
Suggested goal: x% lower than the previous four-year performance average to reflect continuous improvement.
Target for KPIs:
Green  >10% below goal (continued performance at no reportable events is rated green) = 3 points
White  0-9% below goal = 2 points
Yellow 1-9% above goal = 1 point
Red    >10% above goal = 0 points
B. Environmental Enforcement Actions
1.) Number of NOVs
2.) Total Fines/Penalties paid (weighted – see below)
NOV w/o fine = 1
NOV w/ fine < $50K = 3
NOV w/ fine > $50K = 6
Suggested goal: x% lower than the previous four-year performance average to reflect continuous improvement.
Target for KPI #2:
Green  >10% below goal (continued performance at no NOVs is rated green)
White  0-9% below goal
Yellow 1-9% above goal
Red    >10% above goal

C. Pollution Prevention/Waste Reduction
1.) Reduction in Volume of Waste Generated
- Hazardous
- Radioactive
- Sanitary (non hazardous, non radioactive)
2.) Sum of disposal costs saved
Suggested goal: x% (each organization is to decide their own goal) reduction in waste
generation from previous year while meeting all applicable regulatory requirements
and/or goals set by EPA, DOE, DOD, etc.
Proposed measure: based on the yearly waste generation forecast from planned activities, achieve percentage reductions for each of the three waste stream types (each having its own goal). Hazardous and radioactive waste reduction efforts should be given more emphasis because of their higher associated costs. Since unit disposal costs may vary depending on location, it is proposed that total cost savings be reported but not factored into the target calculation.
Target for KPI #3:
Green  >10% above goal
White  0-9% above goal
Yellow 1-9% below goal
Red    >10% below goal
3. EMERGENCY SERVICES AND FIRE PROTECTION

Each organization should set a goal (either a number or percent) of fire protection and fire detection systems it will allow in impaired or inoperable condition (with compensatory action) at any time.
Green  = 0.8 x goal
White  = goal
Yellow = 1.1 x goal
Red    = 1.2 x goal
A similar approach is to be taken for drills and exercises scheduled versus performed and
percentage of overdue corrective actions and assessments tolerated.
A. Fire Protection Systems
B. Fire Detection Systems
C. Emergency Services
4. MAINTENANCE INDICATORS

A. PM/PdM as a % of Total Maintenance
This metric measures the effectiveness of Preventive Maintenance (PM) and Predictive Maintenance (PdM) by indicating what percentage of Total Maintenance is being performed as PM/PdM. (Since this is a ratio, the measurement of PM and PdM can be in whatever units the organization determines most useful, i.e. hours, man-hours, events, actions, etc.)
Suggested Goal: 50%
Green  = 56-100%
White  = 40-55%
Yellow = 30-39%
Red    = <30%
B. Maintenance CM Backlog in Man-hours
This measure captures the Total Corrective Maintenance (CM) backlog in man-hours. It excludes work associated with Preventive Maintenance, Predictive Maintenance, and plant modifications.
Suggested Goal: The goal is equal to the average monthly backlog for the previous year. A monthly index scoring is applied to the goal as follows:
Green  = <70% of goal
White  = 71-105% of goal
Yellow = 106-130% of goal
Red    = >131% of goal
C. Maintenance Delinquent PMs
This metric is intended to measure the total number of Delinquent Preventive Maintenance (PM) work orders – those PMs that are not field work complete, but have exceeded their scheduled completion date and any allowable grace period.
Suggested Goal: The goal is equal to the average monthly backlog for the previous year. A monthly index scoring is applied to the goal as follows:
Green  = <70% of goal
White  = 71-105% of goal
Yellow = 106-130% of goal
Red    = >131% of goal
5. MANAGEMENT

A. NCR/PDR/Deficiency Reports Processing Status (age)
Each site would establish its own statistical baseline and then work toward an improvement goal. The coloring would be based on being within the goal or improvement curve.
Green  = <95% of goal
White  = 96-100% of goal
Yellow = 101-110% of goal
Red    = >110% of goal
B. Corrective Action Reports/Items Completed on Time
Green  = >95%
White  = >90%
Yellow = >85%
Red    = <85%
C. QA Surveillance Report Processing Status (age)
Each site would establish its own statistical baseline and then work toward an improvement goal. The coloring would be based on being within the goal or improvement curve.
Green  = <95% of goal
White  = 96-100% of goal
Yellow = 101-110% of goal
Red    = >110% of goal
6. ENGINEERING INDICATORS

A. Engineering changes (as a measure of quality)
• Measure changes to product design/work plans/engineering documents after approval (e.g. approved-for-construction status):
  - As a result of final internal (cross-discipline) review
  - After approval, due to field implementation problems attributed to engineering design
  - As a result of testing problems attributed to engineering/design
• In a construction/production environment, measure required rework.
• Record only changes/rework caused by the engineering unit.
• Total the % of initial products submitted for final internal Engineering review against the total processed from that point without “substantial” revision.
Green  = <10%
White  = <15%
Yellow = <20%
Red    = >20%
B. Requests for Engineering Assistance (REAs), or whatever the analogous vehicle is called
• Measure requests, resolutions, and backlog of REAs or construction site RFIs (Requests for Information) due to engineering shortcomings.
• Engineering involvement in plant start-up or product introduction to manufacturing (measured in man-hours or dollars).
• Number of non-conforming items during project execution (e.g., construction, initial manufacturing and testing).
• % responded to in the original allotted time:
Green  = 96-100%
White  = <95%
Yellow = <90%
Red    = <85%
C. Employee participation in product/process improvements (a measure of how efficiently work is done and improved). Examples include:
• Status of log of process/product improvement suggestions.
• Dollar value of projected savings due to accepted product/project improvements.
• Number of employee meetings/reviews where improvement/development topics are
discussed.
• Level of funding of process improvement projects.
• Measure of success for a “Lessons Learned” Program.
• Number of team interface meetings held (e.g., process mapping meetings,
constructibility review) during the design phase of project.
• Measure volume of value engineering activities in net dollar savings.
Each site should determine its own scoring levels.
7. EMPLOYEES

A. Employee Concerns Program
1.) Percent of concerns resolved by internal management
2.) Number of concerns resolved internally + number of concerns taken to an external source
Green  = 100% internal
White  = >95% internal
Yellow = >90% internal
Red    = <90% internal
3.) Employee Issue Resolution Trend
Green  100% < 30 days old
White  >95% < 30 days old
Yellow >90% < 30 days old
Red    <89% < 30 days old
B. Employee Performance
1.) Conduct of Operations Performance Index
Track & trend Disciplined Operations (DO) events. Goal: maintain performance at 4.5 (+/- 1.0) events/200k hours. Maintain a healthy balance of operations without suppressing reporting.
Green  = <4.5
White  = 4.5-4.8
Yellow = 4.8-5.0
Red    = >5.0
2.) Violation/Inadequate Procedures
Track & trend Violation/Inadequate Procedure DO component events per 200,000 hours. Goal: maintain performance at 1.5 events/200k hours. Maintain a healthy balance of operations without suppressing reporting.
Green  = <1.5
White  = <2.0
Yellow = <2.5
Red    = >2.5
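The 200,000-hour normalization behind both indices above can be sketched as follows (function names are illustrative; band boundaries are read as half-open intervals, which the manual leaves implicit):

```python
def do_event_rate(events: int, hours_worked: float) -> float:
    """Normalize Disciplined Operations events to the 200,000-hour
    basis used by the indices above."""
    return events / hours_worked * 200_000

def violation_color(rate: float) -> str:
    """Color the Violation/Inadequate Procedure rate
    (goal: 1.5 events per 200k hours)."""
    if rate < 1.5:
        return "Green"
    if rate < 2.0:
        return "White"
    if rate < 2.5:
        return "Yellow"
    return "Red"

# 4 events over 500,000 hours -> 1.6 events per 200k hours -> White
rate = do_event_rate(4, 500_000)
print(rate, violation_color(rate))
```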
C. Employees Cost Effectiveness Ideas Performance
Track & trend IDEAS submitted, approved, and implemented per 100 employees. Goal: 15 ideas accepted per 100 employees. Because people are our most important resource, it is imperative to encourage and reward participation in cost-effective continuous improvement.
Green  = >15
White  = 10-15
Yellow = 5-10
Red    = <5
8. SAFEGUARDS AND SECURITY
(Note: Events are defined as violations, infractions, and incidents.)
A. Security Events
B. Computer Events
C. Document Events
Suggested goal**:
Green  = 0 events
White  = 1 event
Yellow = 2 events
Red    = 3 or more events
** Goal exceptions for sites with > 1000 employees and > 10,000 documents
9. FISCAL ACCOUNTABILITY*

A. Gross Profit – (Award Fee less Unallowable Cost)/Available Fee
Green  ≥ 91%
White  80-90%
Yellow 75-79%
Red    < 75%

B. Cash Flow – Date Collected less Date Earned
Green  < 10 days
White  11-20 days
Yellow 21-30 days
Red    > 30 days
C. Revenue Growth – Revenue for the recorded time period in the present year divided by revenue for the same time period in the previous year
Green  > 100%
White  = 100%
Yellow
Red    < 100%
10. HUMAN RESOURCES

A. Staffing to Target
• % Staffing to Budget/Staffing Plan
• % Contract Staffing to Budget/Staffing Plan
KPI Target: % of goal attainment per site goal
Green  0.97-1.00
White  0.90-0.97
Yellow 0.70-0.89
Red    <0.70
B. Critical Skills Staffing
• % Critical Hires to Staffing Plan
KPI Target and scoring level are the same as for Staffing to Target.
C. Workforce Diversity Hiring Index
• % Minority Technical Hires – Technical Hires = direct staff such as engineers, chemists, project controls, etc.
• % Female Technical Hires – Technical Hires = direct staff such as engineers, chemists, project controls, etc.
• Division Minority Goals/hired or placed
• Division Female Goals/hired or placed
Each site should develop a target for female or minority hiring that is consistent with its affirmative action plan (%). Each report will be an actual YTD hire (%).
KPI Target and scoring level are the same as for Staffing to Target.
11. PROJECT MANAGEMENT

A. SPI = Actual Days / Baseline Days
Red    > 1.06
Yellow = 1.059 to 1.0, or .94 with ETC > 1.0
White  = 0.99 to 0.94 with ETC < 1.0
Green  < 0.94

B. CPI = Actual Cost / Baseline Cost
Red    > 1.06
Yellow = 1.059 to 1.0
White  = 0.99 to 0.94
Green  < 0.94
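A minimal sketch of the SPI/CPI banding above. Note these indices are defined as actual over baseline, so lower is better; the reading of the Yellow band (".94 with ETC > 1.0") as "SPI of 0.94 or above when the estimate-to-complete ratio exceeds 1.0" is an interpretation, since the manual states it tersely:

```python
def spi_color(actual_days: float, baseline_days: float, etc_ratio: float = 1.0) -> str:
    """Color the Schedule Performance Index (SPI = actual/baseline days).
    etc_ratio is the estimate-to-complete ratio used in the Yellow band;
    treating it this way is an assumption."""
    spi = actual_days / baseline_days
    if spi > 1.06:
        return "Red"
    if spi >= 1.0 or (spi >= 0.94 and etc_ratio > 1.0):
        return "Yellow"
    if spi >= 0.94:
        return "White"
    return "Green"

def cpi_color(actual_cost: float, baseline_cost: float) -> str:
    """Color the Cost Performance Index (CPI = actual/baseline cost)."""
    cpi = actual_cost / baseline_cost
    if cpi > 1.06:
        return "Red"
    if cpi >= 1.0:
        return "Yellow"
    if cpi >= 0.94:
        return "White"
    return "Green"

print(spi_color(95, 100))   # SPI 0.95 with ETC <= 1.0
print(cpi_color(108, 100))  # CPI 1.08, over budget
```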
C. Health of Project is a subjective assessment by the Project Manager of real and potential obstacles to finishing on schedule or on budget, and of other problems.
Each site should determine its own scoring levels.
12. COST EFFECTIVENESS

A. Milestone Performance – Milestones Achieved/Milestones Committed
Green  > 95%
White  85 to 95%
Yellow 80 to 85%
Red    < 80%
B. Budget Cost Performance – Budget Performance/Total Budget
Green  > 95%
White  85 to 95%
Yellow 80 to 85%
Red    < 80%
C. Cost Savings Performance – Cost Savings/Cost Savings Committed
Green  > 95%
White  85 to 95%
Yellow 80 to 85%
Red    < 80%
D. PBI Performance – PBIs Achieved/PBI Committed
Green  > 95%
White  85 to 95%
Yellow 80 to 85%
Red    < 80%
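The four Cost Effectiveness indicators share one achieved/committed banding; a small sketch follows. The function name is illustrative, and since the manual's bands overlap at their endpoints ("85 to 95%" vs. "> 95%"), the half-open boundary handling here is an assumption:

```python
def cost_effectiveness_color(achieved: float, committed: float) -> str:
    """Band an achieved/committed ratio using the scale shared by
    the Cost Effectiveness indicators above (milestones, budget,
    cost savings, PBIs)."""
    pct = achieved / committed * 100
    if pct > 95:
        return "Green"
    if pct >= 85:
        return "White"
    if pct >= 80:
        return "Yellow"
    return "Red"

# 18 of 20 PBI milestones achieved -> 90% -> White
print(cost_effectiveness_color(18, 20))
```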
13. REGULATORY PERFORMANCE
E. Overdue Commitments Index
1.) State Environmental Agency Overdue Commitments
2.) Other State Agency Overdue Commitments
3.) DOE Overdue Commitments
4.) Stakeholder Groups Overdue Commitments
Green  No overdue commitments
White  One overdue commitment
Yellow Two or three overdue commitments
Red    Four or more overdue commitments
Note: It is recognized that a single overdue commitment could be categorized as yellow or red depending upon the risk, or if it were one of only a few commitments to be met.
F. Open Regulatory Items Index
1.) State Environmental Agency Open Commitments
2.) Other State Agency Open Commitments
3.) DOE Open Commitments
4.) Stakeholder Groups Open Commitments
Green  All commitments on schedule
White  Likely to miss >5% of commitments
Yellow Likely to miss >10% of commitments
Red    Likely to miss >20% of commitments
G. DNFSB Commitment Adherence Index
1.) Status of 94-1 Commitments
2.) Status of other non-94-1 DNFSB commitments
3.) Commitments older than 6 months
4.) Commitments older than 12 months.
Green  Greater than 95% of commitments on schedule
White  Likely to miss >5% of commitments
Yellow Likely to miss >10% of commitments
Red    Likely to miss >20% of commitments
14. COMMUNITY RELATIONS

A. External Relations
1.) Responsiveness to Citizen Advisory Boards/Citizen Advisory Commissions
Green  85-100 percent turnaround to requests/commitments within 30 days
White  70-84 percent turnaround to requests/commitments within 30 days
Yellow 50-69 percent turnaround to requests/commitments within 30 days
Red    49 percent or less turnaround to requests/commitments
NOTE: Excluding legal and customer driven restraints.
2.) Public Meeting Effectiveness
Utilizing survey techniques, measure public meeting effectiveness. Each site will
document through exit surveys each participant's percentage of change from before
to after the meeting or session relating to their impressions.
Green  50-100 percent improvement
White  25-49 percent improvement
Yellow 10-24 percent improvement
Red    9 percent or less
3.) Tour/Visitor's Program Effectiveness
Utilizing survey techniques, measure tour and visitor program effectiveness. Each
site will measure through exit surveys each participant's percentage of change from
before to after the tour or program relating to their awareness.
"Did this tour and/or program increase your awareness of the facility and its
operations?"
Green  50-100 percent improvement
White  25-49 percent improvement
Yellow 10-24 percent improvement
Red    9 percent or less
15. SELF ASSESSMENTS

B. Scheduled vs. Complete
Green  98-100%
White  96-97%
Yellow 94-95%
Red    <93%
C. Status
This metric would include the number of items opened each month, closed each month, items overdue, and the total number open in the system. Goal = no more than 1% delinquent. The percentage may vary depending on the size of the site.
Green  0% delinquent
White  <1% delinquent
Yellow <2% delinquent
Red    <3% delinquent
16. QA ASSESSMENT LEVEL 2 SUPPORTING CATEGORIES

When an assessment is conducted, there is typically a checklist prepared to guide the assessor. Depending upon the type of assessment being performed, the checklist will contain "attributes." The attributes will typically be one of two types: either requirements from our established quality programs and procedures (a compliance assessment) or attributes in the form of performance indicators (an implementation assessment).

A compliance assessment tells if written quality programs address the requirements of the regulatory body/agency having jurisdiction over the project (e.g. NRC, DOE, Army Corps of Engineers, etc.). It also tells if those requirements are passed down through implementing procedures and procurement documents, and if personnel are effectively trained to perform in accordance with those requirements and procedures.
An implementation assessment tells if written quality program and procedures are
being implemented effectively.
Because assessments are performed to a checklist of attributes, the number of
noncompliances can be compared to the number of attributes and expressed as a
percentage.
A. QA Program Compliance KPI
Percentage of the QA Program Compliance attributes assessed versus the QA Program Compliance attributes found compliant.
The QA Program Compliance KPI pertains to assessing whether the QA requirements applicable to the Project and the methods to meet these requirements are adequately delineated in the Project's command media. This KPI should also address personnel training and qualification assessment attributes.
Green  = 95-100%
White  = 90-94%
Yellow = 80-89%
Red    = 79% and below
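Since assessments are performed to a checklist, the compliance percentage is simply attributes found compliant over attributes assessed. A minimal sketch (function name is illustrative; it omits the proficiency-credit adjustment discussed in the notes below, which can push the KPI above 100%):

```python
def qa_kpi_color(assessed: int, compliant: int) -> tuple[float, str]:
    """Compute a QA Program KPI as the percentage of assessed
    attributes found compliant, then band it per the scale above."""
    pct = compliant / assessed * 100
    if pct >= 95:
        color = "Green"
    elif pct >= 90:
        color = "White"
    elif pct >= 80:
        color = "Yellow"
    else:
        color = "Red"
    return (pct, color)

# 46 of 50 attributes compliant -> 92% -> White
print(qa_kpi_color(50, 46))
```

The same calculation and scale apply to the Implementation KPI that follows.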
B. QA Program Implementation KPI
Percentage of the QA Program Implementation attributes assessed versus the QA Program Implementation attributes found compliant.
The QA Program Implementation KPI pertains to assessing whether the work is being performed in accordance with the established Project command media associated with the QA Program Compliance KPI.
Green  = 95-100%
White  = 90-94%
Yellow = 80-89%
Red    = 79% and below
Notes on both the QA Program Compliance and Implementation KPIs
• The Assessment Plan must delineate specific attributes to be assessed. Changes
to the plan during the assessment must be documented in the plan. Thus, the
Assessment Plan provides the number of attributes assessed for both KPIs.
• A deficiency/concern is categorized as either a noncompliance or an observation.
Observations should not be included in calculating the performance. Also,
there should be a method to factor in additional value for attributes that are
identified with proficiency. (Note: if more value is given to an attribute with
proficiency, the KPI could be greater than 100%.)
• More than one noncompliance may be identified against an attribute. Therefore,
there needs to be a method whereby the noncompliances are grouped to the attribute.
C. Corrective Action Implementation KPI
• Percentage of the corrective actions completed by the original forecasted
completion date.
This KPI can be graded to the following:
Green  = Completed ahead of schedule.
White  = Completed on or within 30 days after the scheduled date.
Yellow = Completed within 30 to 60 days after the scheduled date.
Red    = Completed greater than 60 days after the scheduled date.
Consideration needs to be given for those corrective actions that are dependent on
external interfaces outside the control of the Project. The Corrective Action
Document serves as the tool to gather data for this KPI.
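The grading above can be sketched as a comparison of the completion date to the original forecast (a minimal illustration; the function name and dates are mine):

```python
import datetime

def corrective_action_color(scheduled, completed):
    """Grade a corrective action against its forecasted date (sketch).

    Manual bands: ahead of schedule = Green; on time or within 30 days
    late = White; 30-60 days late = Yellow; more than 60 days late = Red.
    """
    days_late = (completed - scheduled).days
    if days_late < 0:
        return "Green"
    if days_late <= 30:
        return "White"
    if days_late <= 60:
        return "Yellow"
    return "Red"

# completed 45 days after the forecasted date
print(corrective_action_color(datetime.date(2002, 3, 1),
                              datetime.date(2002, 4, 15)))  # -> Yellow
```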
D. External Assessments
• Findings/Audit
• Days Open
• Times Delinquent
• Days Delinquent
• Extension Request
Individual Site External Oversight Indicator Scoring Sheet

Attribute            Red    Yellow   White   Green
Findings/Audit       >10    5-10     3-4     <3
  Points             10     7        5       0
Days Open            >45    15-45    10-14   <10
  Points             10     7        5       0
Times Delinquent     >5     3-5      1-2     <1
  Points             10     7        5       0
Days Delinquent      >10    5-10     1-4     0
  Points             20     15       10      0
Extension Requests   >5     3-5      1-2     <1
  Points             10     7        5       0
Index (Totals)
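The scoring sheet can be read as a point lookup per attribute, with the points summed into the index. A minimal sketch, assuming the band edges shown above (function names and sample values are mine, not the manual's):

```python
# Hypothetical point lookups for two of the external oversight
# attributes; lower points are better (Green = 0 points).
def findings_points(findings_per_audit):
    if findings_per_audit > 10:
        return 10          # Red
    if findings_per_audit >= 5:
        return 7           # Yellow
    if findings_per_audit >= 3:
        return 5           # White
    return 0               # Green

def days_open_points(days):
    if days > 45:
        return 10          # Red
    if days >= 15:
        return 7           # Yellow
    if days >= 10:
        return 5           # White
    return 0               # Green

# The index totals the attribute points (illustrative inputs).
index = findings_points(6) + days_open_points(12)
print(index)  # -> 12
```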
APPENDIX A: SRS PERFORMANCE METRIC
PRESENTATION
This presentation has been used onsite to teach various managers and site
employees about the SRS Performance Metric system. Examples provided are
fictitious, but realistic.
Questions can be directed to:
Gail Jernigan
Westinghouse Savannah River Company
Building 703-A, Room A129
Aiken, SC 29853
Phone: 803-725-7774
Fax: 803-725-4023
[email protected]
Slide 1
Savannah River Site Performance Metrics
Slide 2
Potential Solution - Color Rating System
History
• Institute of Nuclear Power Operations (INPO) key performance indicators
  established post Three Mile Island
  − Vital element of industry improvement initiative
  − Leading and following indicators
  − Includes analysis and action section
• Color roll-up developed by utilities
  − Quick status summary (for utility, Nuclear Regulatory Commission, and INPO)
  − Easy assessment of trends
  − Sharing of expertise/knowledge
Public utilities, with the use of this system, have shown that it is possible
to be cost efficient and cost effective while maintaining excellent performance.
SRS Performance Metrics Manual, Rev. 0
WSRC-RP-2002-00252
44
Slide 3
Strategy
• To sharpen the focus on a critical few performance metrics with the ability
  to analyze problem areas and benchmark successes.
• Using the metrics developed for the Washington Government Group (WGG) effort
  and the metrics developed for the Department of Energy-Savannah River (DOE-SR)
  Focus Areas, map Westinghouse Savannah River Company (WSRC) Key Performance
  Indicators (KPIs) and DOE-SR parameters to DOE-SR Focus Areas.
• Use objectivity to assess SRS performance.
Slide 4
Benefits
• Quickly illustrates successes and problems.
• Shifts contract performance to focus on results through KPIs.
• Color rollup allows quick drill down to source of problem areas.
• KPIs as basis are objective.
• Analysis/Action section of KPI allows assessment of problem status.
• Flexibility: ability to accommodate various needs.
• Could allow/promote sharing/prioritizing of resources/ideas across the site.
• Could promote consistency in approaches across the site.
• Will facilitate continuous improvement.
Drivers for moving to this metric system
included:
− Need to quickly look at a performance
metric system to identify problem areas
and successes.
− Communication tool between DOE and
contractor to establish a common
understanding of problems, issues and
successes.
− Replace subjectivity with objectivity in
the spirit of telling WHAT DOE wants
instead of HOW.
To determine the metrics to be used, the
following were used:
− DOE-SR Focus Areas (Safety and
Security; Technical Capability and
Performance; Community, State and
Regulator Relationships; Cost
Effectiveness and Corporate
Perspective)
− Existing DOE-SR/WSRC performance
metrics
− Corporate Washington Government
performance metrics
These were combined, discussed, and
mapped to the DOE-SR Focus Areas.
Further changes have been made since the
system was established.
The analysis/action section is the most
important section of the KPIs. This is where
the KPI owner analyzes the performance by
looking at the data and asking WHY. The
reader should be able to quickly tell if the
KPI owner really understands what is
happening. The action defines what the KPI
owner plans to do to either improve
performance (if yellow or red) or
sustain/improve (if white or green).
This system’s flexibility allows the site to
map the various performance metrics to suit
various customers, including DOE-HQ,
Operations Office, regulators, contractor’s
Board of Directors, the public, and other
interested stakeholders.
SRS Performance Metrics Manual, Rev. 0
WSRC-RP-2002-00252
45
Slide 5
"Drilling Down"
Site Level Metrics KPIs → WSRC Division KPIs → Facilities KPIs
Slide 6
Another way to drill down is for each
division within the company to embrace the
same metrics for the division and then the
facility. While not all metrics will apply to
all divisions, many metrics are applicable
across the site, company, division, and
facility. Using this same system allows
managers to analyze their division’s or
facility’s performance. In addition, using
the same metric system across the site
makes it easier for all employees, staff, and
managers to understand how one action can
affect the site's performance.
Color Values and Definitions
G  = Excellence/Significant Strength: where performance exhibits a significant
     strength, such as industry top quartile performance or achievement of
     longer term goals.
W  = Satisfactory/Normal: satisfactory performance, at/above the industry
     average or the annual goal, with stable or improving trend over several
     periods.
Y  = Tracking to Satisfactory: needs improvement and management attention.
     Performance may be achieving goal, but showing a negative trend over
     several periods that, if continued, will challenge goal achievement.
R  = Unsatisfactory/Significant Weakness: where performance is significantly
     below the goal and not showing an improving trend, or where the annual
     goal achievement is not expected.
B  = No Data: this performance measure did not have data to report for this
     time period; no score is included in the summary.
NA = Not Applicable
Slide 7
Rating/Scoring Sheet

Scores for each L-2 or L-3 KPI metric:
Green = 3, White = 2, Yellow = 1, Red = 0

Rollup scoring is based on the number of applicable KPIs in the grouping:

# of applicable KPIs    2     3     4     5     6     7     8
Green                  >6    >8    >10   >13   >15   >18   >20
White                  >4    >5    >7    >9    >11   >12   >14
Yellow                 >2    >3    >5    >7    >9    >10   >12
Red                    <2    <3    <5    <7    <9    <10   <12
The algorithm was developed conservatively. Basically, it uses an average of
the scores of the applicable metrics. For example, if there are four KPIs to
be evaluated and these four KPIs are Green, White, Green, and White, the sum
would be 10 (3+2+3+2=10). The average would be 2.5, which is rounded to 3,
Green. The simplicity of the system provides the ability to be either more or
less conservative.
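The rounded-average rule in this example can be sketched as follows (a minimal illustration; the names are mine, not the manual's):

```python
# Color rollup: score each applicable KPI (Green=3, White=2,
# Yellow=1, Red=0), average the scores, round, and map back to a color.
SCORES = {"Green": 3, "White": 2, "Yellow": 1, "Red": 0}
COLORS = {3: "Green", 2: "White", 1: "Yellow", 0: "Red"}

def rollup(kpi_colors):
    scores = [SCORES[c] for c in kpi_colors]
    avg = sum(scores) / len(scores)
    # Round half up, matching the 2.5 -> 3 (Green) example in the text;
    # Python's built-in round() would round 2.5 down to 2.
    return COLORS[int(avg + 0.5)]

print(rollup(["Green", "White", "Green", "White"]))  # -> Green
```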
Slide 8
SRS Summary Example
[Summary chart: a matrix of color ratings (G/W/Y/R/B) by Focus Area and Level 1
indicator. Focus Areas shown are Safety and Security; Technical Capability and
Performance; Community, State and Regulatory Relationships; Cost Effectiveness;
and Corporate Perspective. Level 1 indicators include Industrial Safety,
Emergency Services and Fire Protection, Radiation Contamination and Control,
Nuclear Safety, Physical Security, Production, Infrastructure, Waste Inventory
Reduction, Disciplined Operations, Employee Relations, Engineering, Project
Management, Technical Qualifications, AOP Milestones, Environmental Release
Index, Enforceable Environmental Agreement Milestones, Compliance Index, CAB
Responsiveness, Public Participation Program, Public Perception, Financial
Forecasts, PBI Performance, Financial Performance, PACE, Feedback and
Improvement, and Other Key Indicators.]
Large boxes represent current month; each small box represents a quarter of
the year.
Data represented here is fictitious.
Slide 9
Safety and Security Focus Area Example
[Chart: Savannah River Site Performance Indicators through December 31, 2001 —
Focus Area, Level 1, and Level 2 color ratings for Safety and Security.
Level 2 indicators shown include Industrial Safety and Health (TRC Rate, LWC
Rate, Cost Index); Emergency Services and Fire Protection (Fire Protection
Impairment Status, Emergency Management Corrective Actions, Emergency
Exercises and Drills Conducted vs. Scheduled, Emergency Management EPHA Annual
Review/Revision); Radiation Contamination and Control (Reportable
Contamination, Reportable Dose Exceedances); Nuclear Safety (Nuclear Safety
Issue Management Index, Significant Nuclear Safety Incidents Index,
Authorization Basis Document Management Index); and Physical Security
(Security Incidents).]
Large boxes represent current month; each small box represents a quarter of
the year.
Data represented here is fictitious.
Slide 10
Safety KPIs
SAFETY: TRC Rate, LWC Rate, LWD Rate

Industrial Safety Indicators, through October 31, 2001
[Chart: monthly cumulative TRC and LWC Rates (left axis) and LWD Rate (right
axis) versus goals.]

Data (monthly cumulative rates):
Month     Jan    Feb    Mar    Apr    May    Jun    Jul    Aug    Sep    Oct
TRC Rate  0.47   0.51   0.73   0.94   0.97   0.96   0.93   0.95   0.94   0.95
  Score   Green  Green  Green  White  White  White  White  White  White  White
LWC Rate  0.00   0.17   0.22   0.17   0.32   0.30   0.25   0.23   0.31   0.22
  Score   Green  Green  Green  Green  Green  White  Green  Green  White  Green
LWD Rate  0.00   1.19   2.85   3.23   3.00   4.07   3.24   3.24   3.68   4.36
  Score   Green  Green  Green  Green  Green  Green  Green  Green  Green  Green

Definition
The Occupational Safety and Health Act of 1970 requires covered employers to
prepare and maintain records of occupational injuries and illnesses. WSRC
measures safety performance in accordance with these guidelines. The incidence
rates are calculated by multiplying the number of injuries, illnesses or lost
workdays by 200,000 and dividing by the total number of work hours.

Analysis / Action
TRC is White; LWC and LWD are Green. Before the incidents in October, WSRC was
approaching the established goals for 2001. Using the site's Lessons Learned
program, information from these four incidents will be communicated to all
site employees. Since the previous injury rate indicated that the Behavior
Based Safety process was having a positive impact on the site's overall safety
performance, each site department and section will have additional BBS
training during the monthly safety meeting. Management will also continue to
monitor safety during its weekly staff meetings.

Comments
WSRC combined the Total Recordable Case (TRC) Rate, Lost Workday Case (LWC)
Rate and Lost Workdays (LWD) Rate for this metric. Stoplight definition and
goals are more aggressive than Corporate.

Goals (monthly rates are cumulative): TRC Rate = 0.93; LWC Rate = 0.30;
LWD Rate = 12.87
Green  = >5% below the goal
White  = 0-4% below the goal
Yellow = 0-4% above the goal
Red    = >5% above the goal

SME Manager: Kevin Smith (803-952-9924)  SME: Linda Blackston (803-952-9905)
I-A-1,2,3 Safety
Data represented here is fictitious.
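The incidence-rate formula in the definition can be sketched as follows (the case and hour figures are illustrative, not from the manual):

```python
def incidence_rate(cases, work_hours):
    """OSHA-style incidence rate: cases normalized to 200,000 work
    hours (roughly 100 full-time employees working one year)."""
    return cases * 200_000 / work_hours

# e.g. 12 recordable cases over 2.5 million work hours
print(round(incidence_rate(12, 2_500_000), 2))  # -> 0.96
```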
Slide 11
Metric Grouping
• Initial listing of metrics developed by WGG after a year of study by various
  site experts:
  − Safety (Industrial Safety, Environmental Management, and Emergency
    Services and Fire Protection)
  − Operational Excellence (Maintenance, Management, and Engineering)
  − Work Environment (Employee Concerns, Employee Performance, and Safeguards
    and Security)
  − Organizational Effectiveness (Fiscal Accountability, Human Resources,
    Project Management, and Cost Effectiveness)
  − External Relations (Regulatory Performance and Community Relations)
  − Assessments (Self Assessments, Quality Assurance, and External
    Assessments)
Slide 12
Metric Grouping (continued)
• Current set of metrics "married" the WGG set with the DOE-SR set of metrics,
  based on the DOE-SR Manager's Focus Areas:
  − Safety and Security (Industrial Safety, Emergency Services and Fire
    Protection, Radiological Safety, Nuclear Safety, and Physical Security)
  − Technical Capability and Performance (Production, Infrastructure, Waste
    Avoidance, Disciplined Operations, Employee Relations, Engineering, and
    Project Management)
  − Community, State and Regulatory Relationships (Environmental Release
    Index, Enforceable Agreement Milestones, Environmental Compliance Index,
    Citizens Advisory Board Responsiveness, Public Participation Program, and
    Public Perception)
  − Cost Effectiveness (Financial Forecasts, Performance Based Incentives,
    Financial Performance, and Feedback and Improvement)
  − Corporate Perspective (Nuclear Non-Proliferation, Plutonium Disposition,
    Assisting Other Sites to Closure, and EM Integration)
Slide 13
Communication Plan Outline
I. Indoctrination
   A. Letter from WSRC Executive Vice President to WSRC Vice Presidents, WSRC
      Deputy Managers, and Level 2 managers (telling what the metrics are and
      the schedule)
   B. Training Sessions
      1. Deputy Managers
      2. Level 2 Managers
      3. Program Managers
      4. Subject Matter Experts
      5. DOE-SR Personnel and staff
II. Site Communications
   A. SRS On Line
   B. SRS News
   C. DOE-SR Newsletter
Slide 14
Summary
• Using the experience of the Institute of Nuclear Power Operations, World
  Association of Nuclear Operators, and Washington Government Group, the
  proposed set of metrics provides an objective view of site activities that
  shifts focus to results.
• Proposal quickly allows management to identify areas of concern and "drill
  down" to a specific activity.
• Analysis/Action section allows assessment of problem status.
Slide 15
Questions?
• WSRC Point of Contact:
  − Gail Jernigan - 803-725-7774, [email protected]
• DOE Point of Contact:
  − Jeff Allison - 803-952-2-6337, [email protected]
  − Sherry Southern - 803-952-8272, [email protected]
APPENDIX B: NUCLEAR SAFETY
WSRC is evaluating the various components to determine applicability to SRS Nuclear Safety Level
2 metrics. This appendix was created by Westinghouse Safety Management Solutions as general
guidance. WSRC has opted not to include the Authorization Basis Document Management Index.
1. Nuclear Safety Issue Management Index (these include all NIs, PISAs, Discovery USQs, etc.)
• percentage of unresolved nuclear safety issues to total
• average age of unresolved nuclear safety issues
• number of unresolved nuclear safety issues
− greater than 30 days old
− greater than 6 months old
− greater than 1 year old
2. Significant Nuclear Safety Incidents Index
• number of emergency notifications
• number of reportable unusual occurrences related to nuclear safety
• number of inadvertent safety system challenges
• number of Technical Safety Requirement violations (or Operational Safety Requirement or
Technical Specification as applicable)
• number of significant criticality infractions (e.g. both Double Contingency Controls violated)
• number of Criticality Safety Limit violations
• number of “for cause” shutdowns by DOE which are related to Nuclear Safety
NOTE: It is recognized that double, triple or even 5X counting may occur for one event. This is
acceptable since cumulative counting appropriately reflects the significance of the event.
3. Authorization Basis Document Management Index
• Number of overdue requirements/commitments to DOE associated with 10CFR830.
The order of safety significance and associated weighting for the color codes is:
• Significant Nuclear Safety Incidents Index (50%)
• Nuclear Safety Issue Management Index (30%)
• Authorization Basis Document Management Index (20%)
The frequency of measurement is quarterly.
1. Nuclear Safety Issue Management Index (including all NIs, PISAs, Discovery USQs, etc.)
a. Ratio of the number of unresolved nuclear safety issues on the last day of each quarter of a
rolling annual fiscal year to the total cumulative number opened during the rolling annual
fiscal year.
RATIO               A
0.00 < R < 0.25     0
0.25 < R < 0.50     1
0.50 < R < 0.75     2
R > 0.75            3
b. Average age of all unresolved nuclear safety issues on the last day of each quarter of a rolling
annual fiscal year.
AVERAGE AGE                 B
0 month < T < 1 month       0
1 month < T < 3 months      1
3 months < T < 12 months    2
T > 12 months               3
c. Sum of the weighted ratios of unresolved nuclear safety issues greater than 30 days old (U1),
greater than 6 months old (U2), and greater than 1 year old (U3) on the last day of each quarter
of a rolling annual fiscal year to the total cumulative number opened during the rolling annual
fiscal year (U).

SUM OF RATIOS                               X
(U1/U) + 2(U2/U) + 5(U3/U) < 0.5            0
0.5 < (U1/U) + 2(U2/U) + 5(U3/U) < 1.5      1
1.5 < (U1/U) + 2(U2/U) + 5(U3/U) < 2.5      2
(U1/U) + 2(U2/U) + 5(U3/U) > 2.5            3
d. Result for Nuclear Safety Issue Management Index

S = A + (B + X) / 2

S           COLOR
S = 0       Green
1 < S < 2   White
2 < S < 3   Yellow
S > 3       Red
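Putting parts a through d together, a hedged sketch of the index roll-up (the boundary handling at the band edges is my assumption, since the color table above does not address values between the bands):

```python
# Nuclear Safety Issue Management Index sketch: A (unresolved ratio),
# B (average age), and X (weighted aging ratios) are each scored 0-3
# per the tables above, then combined as S = A + (B + X) / 2.
def issue_mgmt_color(a, b, x):
    s = a + (b + x) / 2
    if s == 0:
        return s, "Green"
    if s <= 2:          # table: 1 < S < 2 -> White; edges assumed
        return s, "White"
    if s <= 3:          # table: 2 < S < 3 -> Yellow; edges assumed
        return s, "Yellow"
    return s, "Red"

print(issue_mgmt_color(1, 2, 1))  # S = 2.5 -> Yellow
```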
2. Significant Nuclear Safety Incidents Index
e. Number of emergency notifications during each quarter of the fiscal year (X = 2).
f. Number of reportable Unusual Occurrences related to nuclear safety during each quarter of
the fiscal year.
g. Number of inadvertent safety system challenges during each quarter of the fiscal year.
h. Number of Technical Safety Requirement, Operational Safety Requirement, and Technical
Specification violations during each quarter of the fiscal year (X = 2).
i. Number of significant criticality infractions (e.g., both Double Contingency Controls are
violated) during each quarter of the fiscal year (X = 3).
j. Number of Criticality Safety Limit violations during each quarter of the fiscal year (X = 3).
k. Number of "for cause" shutdowns ordered by DOE during the rolling annual fiscal year due
to nuclear safety concerns, as measured at the end of each quarter of the rolling annual fiscal
year (X = 6).
X = Count Multiplication Factor
NOTE: It is recognized that double, triple or even 10X counting may occur for one event. This is
acceptable since cumulative counting appropriately reflects the significance of the event.
TOTAL COUNT     SCORE   COLOR
C ≤ 2           0       Green
3 ≤ C ≤ 5       1       White
6 ≤ C ≤ 10      2       Yellow
C ≥ 11          3       Red
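The weighted count can be sketched as follows. The incident counts are illustrative, and items f and g are given a factor of 1 since the text states no multiplier for them (an assumption):

```python
# Significant Nuclear Safety Incidents Index sketch: each incident
# type's count is multiplied by its factor X before summing.
# (label, count, X) -- counts are illustrative, not from the manual.
INCIDENTS = [
    ("emergency notifications",              1, 2),
    ("TSR/OSR/Tech Spec violations",         1, 2),
    ("criticality safety limit violations",  0, 3),
    ("DOE for-cause shutdowns",              0, 6),
    ("reportable unusual occurrences",       2, 1),  # X assumed 1
    ("inadvertent safety system challenges", 1, 1),  # X assumed 1
]

C = sum(count * x for _, count, x in INCIDENTS)
print(C)  # -> 7, which falls in the 6-10 band (score 2, Yellow)
```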
3. Authorization Basis Document Management Index
l. Number of overdue requirements/commitments to DOE associated with 10CFR830 on the
last day of each quarter of the fiscal year.
TOTAL COUNT     SCORE   COLOR
N ≤ 2           0       Green
3 ≤ N ≤ 5       1       White
6 ≤ N ≤ 8       2       Yellow
N ≥ 9           3       Red
Each Level 2 metric will be evaluated annually, and the Level 1 roll-up will also occur annually.
The hierarchy of safety significance of issues and their associated weights for the color codes is:
• Nuclear Safety Issue Management Index (0.3)
• Significant Nuclear Safety Incidents Index (0.5)
• Authorization Basis Document Management Index (0.2)
RESULT            SCORE   COLOR
0.0 < R < 1.4     0       Green
1.5 < R < 4.1     1       White
4.2 < R < 7.8     2       Yellow
R > 7.9           3       Red

R = (S × 0.3) + (C × 0.5) + (N × 0.2)