Proceedings of the Annual General Donald R. Keith Memorial Conference
West Point, New York, USA
April 30, 2015
A Regional Conference of the Society for Industrial and Systems Engineering
Department of Defense Assessment System Project (DoDASP)
Samantha Dorminey, Daniel Lasche, Angel Santiago, and Milton Washington
United States Military Academy, Department of Systems Engineering
Corresponding author's Email: [email protected]
Author Note: The authors thank the Office of the Deputy Chief Management Officer (DCMO) for its support of this project with time and resources.
Abstract: The Department of Defense Assessment System Project (DoDASP) is a system that will provide key leaders and department heads within the Department of Defense (DoD) with information about changes in the status of key DoD objectives. DoDASP development consisted of two parallel efforts: the development of the Assessment sub-system (metrics and structure) and the development of the Visualization sub-system (display and presentation). This paper addresses the development of the Visualization sub-system. The key decision for this sub-system was which type of delivery system would work best for these leaders. The Systems Decision Process (SDP) was followed to develop, analyze, and recommend an approach. Having received approval from the client, the Assistant Deputy Chief Management Officer for the DoD, the study is now in the implementation phase, and the web-based application is under development for presentation in the near future. The Assessment sub-system development team is nearing completion of its assessment hierarchy, and the two sub-systems will be combined upon delivery of the visualization product.
Keywords: Department of Defense, Assessment, Visualization, Display, Systems Decision Process, Systems Engineering
1. Introduction
The United States Department of Defense (US DoD) was established on 10 August 1949 and is the largest and oldest of all US government agencies, with over 3.3 million employees (including Active Duty, civilians, National Guard, and Reservists) and over 2 million retirees (Department of Defense, 2013). This makes the US DoD the largest employer in the world (Alexander, 2012). Its budget is nearly half a trillion dollars, which exceeds the GDP of all but about two dozen countries.
The sheer size and scope of the DoD and its operations make management extremely difficult. In a private company, the Chief Executive Officer (in our case, the Secretary of Defense) would develop, or have his or her team develop, an assessment capability to track significant metrics of the organization. These metrics would be used to make business decisions and to provide insights on items of interest to the Board of Directors (in this case, loosely, the US Congress). There are a number of such systems at the disposal of the Secretary of Defense, but what is lacking is a simple "heads up" system to identify potential problems, or areas where the Secretary can exert his or her influence, before a problem arises. Developing such a system is the initial problem statement for the analysis in this paper and for the development of the DoD Assessment System Project (DoDASP).
2. Background
The United States Congress has been wary of the Department of Defense since its establishment. In fact, in 1789 President George Washington had to remind Congress twice to establish the War Department (which it did). In the 1997 National Defense Authorization Act (NDAA), Congress required the DoD to review its strategic priorities and develop a report every four years called the Quadrennial Defense Review (Department of Defense, 2010). In this document, the DoD was required to report on what it proposed to do in the coming years and how it intended to accomplish those mission requirements. It did not, however, identify performance measures for these initiatives. In the 2008 NDAA, Congress required the Department "to align and improve business operations across the Department's Business Functions" (Public Law, 2008). In response to this directive, the DoD developed the Strategic Management Plan, which established "goals, key initiatives, outcomes, performance measures, and guiding principles for the Department's business mission area" (Department of Defense, 2013).
Assessments internal to the Department of Defense include the Chairman's Readiness System (CRS), a bottom-up readiness assessment of military units and capabilities that flows from the units to the Office of the Chairman, Joint Chiefs of Staff (CJCS, 2013). This provides the Department and its key leaders with a current assessment of
the status of military forces. It does not, however, include requirements and capabilities beyond readiness, nor does it give insight into potential future problems.
According to the Office of the Deputy Chief Management Officer (DCMO) at the Office of the Secretary of Defense (OSD), this "early warning" aspect of assessment was missing from the existing systems: the Department lacked a simple system to give the SecDef and the Service leaders indications of potential problems. There are two aspects to the development of such a system: the development of the assessment (what is measured?) and the development of the visualization system (how is the information presented?). This paper addresses the analysis for the development of the latter. The remainder of the paper follows the analysis through the Systems Decision Process to determine the best visualization system for the key stakeholders, and concludes with a summary of the implementation of the decision made by the client, the Assistant Deputy Chief Management Officer.
3. Approach and Analysis
The approach chosen to analyze the visualization system was the Systems Decision Process (SDP), shown in Figure 1 below. The SDP is "a collaborative, iterative and value-based decision process that can be applied in any system life cycle stage" (Parnell et al., 2011). Given that this visualization system analysis lends itself to the analysis of potential alternatives, this approach seemed most appropriate; it has been used widely in government and non-government applications for similar analyses (see Freberg et al., 2013; Dees et al., 2010; Roeckl, 2009; and others). There are four phases of the SDP, and the paper proceeds by describing the analysis in each phase.
Figure 1: The Systems Decision Process (SDP) (Parnell et al., 2011)
3.1 Problem Definition
The primary objective of the Problem Definition phase of the SDP is to fully understand the problem and to establish a means for analyzing the alternatives that will be developed in the second phase. It begins with receiving an initial problem statement from the client, previously identified as: develop a "heads up" system for the SecDef and other key DoD leaders.
In The Elements of User Experience, Jesse James Garrett (2010) introduces five "elements of user experience" layers (shown in Figure 2 below), organized from abstract to concrete. The "visualization system" part of the overall DoDASP concerns the top two layers and part of the middle layer, whereas the "assessment system" part concerns the bottom two layers and part of the middle layer. This helps scope the problem under study in this paper.
Figure 2: Elements of User Experience (Garrett, 2010)
The diagram above helps structure the input-output diagram for the system, shown below in Figure 3. This diagram depicts what enters the user interface, which subsystem levels the user interface addresses, and what the user receives as an output. The interface output is a clear, meaningful report. The diagram shows that there are four main functions the visualization system must perform: receive inputs, process information, present outputs, and provide feedback.

Figure 3: DoDASP Inputs-Outputs Diagram (inputs: assessment methodology, user inputs/controls, and data; subsystems within the system boundary: strategy, scope, structure, skeleton, and surface; outputs: assessment results, visualization results, and communication; a feedback loop returns outputs to the inputs)
Because the system must perform each of these functions, evaluating how well each alternative achieves them forms the basis of the alternative evaluations. These functions therefore become the top level of the Value Hierarchy built for the evaluation, which is used for the remainder of this work. The Value Hierarchy, shown in Figure 4, depicts the main functions of the system, the primary goals within each function, and the corresponding value measures by which achievement of these goals can be assessed.
The value hierarchy's fundamental objective is to "Develop a visualization system that receives and processes assessment data, presents information to key leaders in the DoD, and provides feedback within the system." Its four functions, their goals, and the corresponding value measures (with units) are:

1.0 Receive Information: 1.1 Maximize ease of input, measured by 1.1.1 Time spent inputting information (hours).
2.0 Process Information: 2.1 Maximize accessibility, measured by 2.1.1 Update frequency (days); 2.2 Maximize ability to manipulate, measured by 2.2.1 Information comparison level (stars); 2.4 Maximize organization, measured by 2.4.1 Level of organization (stars).
3.0 Present Outputs: 3.1 Maximize ease of use/intuitiveness, measured by 3.1.1 Time to learn interface (minutes); 3.3 Maximize visual tool effectiveness, measured by 3.3.1 Communication ability (stars).
4.0 Receive Feedback: 4.1 Maximize auditing/troubleshooting effectiveness, measured by 4.1.1 Auditing frequency (hours/biweekly); 4.2 Maximize implementation speed, measured by 4.2.1 Time until implementation (stars); 4.3 Maximize technology modernization, measured by 4.3.1 Level of modernization (stars).

Figure 4: DoDASP Value Hierarchy
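To make the hierarchy's structure concrete, the sketch below encodes the Figure 4 functions, goals, and value measures as a plain Python data structure. The content is taken directly from the hierarchy above; the encoding itself is purely illustrative and not part of the study.

```python
# Illustrative encoding of the Figure 4 value hierarchy.
# Function -> goal -> (value measure, unit); content taken from Figure 4.
VALUE_HIERARCHY = {
    "1.0 Receive Information": {
        "1.1 Maximize ease of input": ("1.1.1 Time spent inputting information", "hours"),
    },
    "2.0 Process Information": {
        "2.1 Maximize accessibility": ("2.1.1 Update frequency", "days"),
        "2.2 Maximize ability to manipulate": ("2.2.1 Information comparison level", "stars"),
        "2.4 Maximize organization": ("2.4.1 Level of organization", "stars"),
    },
    "3.0 Present Outputs": {
        "3.1 Maximize ease of use/intuitiveness": ("3.1.1 Time to learn interface", "minutes"),
        "3.3 Maximize visual tool effectiveness": ("3.3.1 Communication ability", "stars"),
    },
    "4.0 Receive Feedback": {
        "4.1 Maximize auditing/troubleshooting effectiveness": ("4.1.1 Auditing frequency", "hours/biweekly"),
        "4.2 Maximize implementation speed": ("4.2.1 Time until implementation", "stars"),
        "4.3 Maximize technology modernization": ("4.3.1 Level of modernization", "stars"),
    },
}

# Enumerate the value measures that drive the evaluation.
for function, goals in VALUE_HIERARCHY.items():
    for goal, (measure, unit) in goals.items():
        print(f"{function} | {goal} | {measure} ({unit})")
```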
After using these tools to better understand the problem, the client approved the Value Hierarchy and the following Revised Problem Statement (RPS):
Develop a visualization system that receives and processes assessment data, presents information to key leaders in the DoD, and provides feedback within the system.
The research, the approved RPS, and the approved Value Hierarchy are the key products carried into the Solution Design phase.
3.2 Solution Design
The goal of the Solution Design phase is to facilitate the creation of new and innovative solution possibilities via an expanded "solution space", in addition to considering typical solutions already in practice elsewhere. After developing a robust list of potential solutions, each was evaluated for feasibility, as shown in the Feasibility Screening Matrix in Figure 5 below. The resulting feasible alternatives (approved by the client) were: a web page, an app, a PowerPoint presentation, and a paper report. These were carried into the Decision Making phase for further quantitative analysis.
Figure 5: DoDASP Feasibility Screening Matrix
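Mechanically, feasibility screening is a pass/fail filter: an alternative moves forward only if it satisfies every screening criterion. The minimal sketch below illustrates that mechanism; the criteria names and the screened-out candidate are hypothetical, since the actual Figure 5 entries are not reproduced in this text.

```python
# Minimal sketch of feasibility screening as a pass/fail filter.
# Criteria names and the "Custom kiosk" candidate are hypothetical;
# the study's actual screening criteria appear in Figure 5.
candidates = {
    "Web page":     {"deliverable in time": True,  "usable on DoD networks": True},
    "App":          {"deliverable in time": True,  "usable on DoD networks": True},
    "PowerPoint":   {"deliverable in time": True,  "usable on DoD networks": True},
    "Paper report": {"deliverable in time": True,  "usable on DoD networks": True},
    "Custom kiosk": {"deliverable in time": False, "usable on DoD networks": True},
}

# Keep only the candidates that pass every criterion.
feasible = [name for name, checks in candidates.items() if all(checks.values())]
print(feasible)  # the feasible alternatives carried into Decision Making
```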
3.3 Decision Making
In the Decision Making phase, the candidate solutions are evaluated against the Value Hierarchy to determine the strongest recommendation for solving the revised problem. Each candidate solution was scored on the metrics developed in the Problem Definition phase, and the results are reported in the DoDASP Raw Data Matrix shown in Figure 6 below.
Figure 6: DoDASP Raw Data Matrix
Since the value metrics are not all in the same units, their raw scores are not directly comparable. The raw data must be converted to value scores so that all measures share a consistent unit for comparison. Transformation (value) functions were developed during the Problem Definition phase to convert raw scores to value on a scale from 0 to 100; a separate value function is required for each value metric. Figure 7 depicts an example of a value function for the metric "Implementation Expediency", associating the number of weeks to implement (x-axis) with a value score (y-axis).
Figure 7: Value Function Example ("Implementation Expediency": value score from 0 to 100 on the vertical axis versus weeks to implement, 1 to 12, on the horizontal axis)
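A value function like the one in Figure 7 can be implemented as piecewise-linear interpolation over a handful of breakpoints. The sketch below is illustrative only: the breakpoint pairs (weeks 1, 2, 4, 12 mapping to values 100, 80, 40, 0) are assumptions read approximately off the figure's axes, and the study's exact curve may differ.

```python
import numpy as np

# Piecewise-linear value function for "Implementation Expediency":
# converts weeks-to-implement (raw score) into a 0-100 value score.
# Breakpoints are assumptions read approximately from Figure 7.
WEEKS = np.array([1.0, 2.0, 4.0, 12.0])      # raw scores (x-axis)
VALUES = np.array([100.0, 80.0, 40.0, 0.0])  # value scores (y-axis)

def implementation_expediency_value(weeks: float) -> float:
    """Interpolate a 0-100 value score; faster implementation scores higher."""
    return float(np.interp(weeks, WEEKS, VALUES))

print(implementation_expediency_value(3.0))   # 60.0 under these assumed breakpoints
print(implementation_expediency_value(12.0))  # 0.0 (slowest alternative scores no value)
```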
After the raw scores were converted to value scores, the measures were weighted and summed to determine a Total Value Score (TVS) (Parnell et al., 2011). The weights were developed using a Swing Weight Matrix, which is designed to account not only for importance but also for variability. Variability of a measure matters because this approach seeks to differentiate between alternatives: measures with little variability do not contribute to this differentiation and are therefore weighted less than high-variability measures, which contribute greatly to it (Ewing et al., 2006). The swing weight matrix is shown in Figure 8 below, and the resultant global weights by measure are shown in Figure 9.
Figure 8: Swing Weight Matrix
Figure 9: Global Weight Conversions
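Converting the swing weights of Figure 8 into the global weights of Figure 9 is a simple normalization: each measure's swing weight is divided by the sum of all swing weights, so the global weights sum to one. A minimal sketch follows, with hypothetical swing weight values since the Figure 8 numbers are not reproduced in this text.

```python
# Normalize swing weights s_i into global weights w_i = s_i / sum(s).
# The swing weight values below are hypothetical placeholders; the
# study's actual values come from the Figure 8 swing weight matrix.
swing_weights = {
    "Time to learn interface": 100,
    "Communication ability": 85,
    "Update frequency": 60,
    "Time until implementation": 40,
}

total = sum(swing_weights.values())
global_weights = {measure: s / total for measure, s in swing_weights.items()}

print(global_weights)                # e.g. {'Time to learn interface': 0.35, ...}
print(sum(global_weights.values()))  # 1.0 by construction
```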
Using an additive value model, the resultant scores for each candidate solution are listed below in Figure 10. Figure 11 is a bar graph displaying the contribution of each measure to the TVS for each candidate solution; it allows visual comparison of how an alternative might be improved by addressing a measure that contributes little to its overall value.
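The additive value model behind Figures 10 and 11 computes each alternative's TVS as the weighted sum of its value scores, and the per-measure products are exactly the stacked contributions plotted in Figure 11. The sketch below illustrates this with hypothetical weights and value scores; the study's actual numbers appear in Figures 6 through 10.

```python
# Additive value model: TVS(alternative) = sum over measures of w_i * v_i,
# where w_i is the measure's global weight and v_i its 0-100 value score.
# All numbers below are hypothetical placeholders.
global_weights = {"ease of use": 0.40, "update frequency": 0.35, "implementation": 0.25}

value_scores = {
    "Web": {"ease of use": 60, "update frequency": 90, "implementation": 70},
    "App": {"ease of use": 90, "update frequency": 85, "implementation": 40},
}

def total_value_score(scores: dict) -> float:
    """Weighted sum of a candidate's value scores (its TVS)."""
    return sum(global_weights[m] * v for m, v in scores.items())

for alt, scores in value_scores.items():
    # The per-measure products are what Figure 11 stacks in its bar chart.
    contributions = {m: round(global_weights[m] * v, 1) for m, v in scores.items()}
    print(alt, round(total_value_score(scores), 1), contributions)
```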
Figure 10: Total Value Scores
Figure 11: Total Value Scores Graphed
As seen above, the App could be improved by making implementation more expedient, and the Web page could be improved by making it easier to use (more accessible). In Parnell et al. (2011), this is known as Value-Focused Thinking, and applying it here generated a hybrid alternative, a Web-Based App, which realizes the best of each of the other alternatives. The concept is not new; Facebook and USAA Banking, for example, offer both websites and phone applications for their users. This hybrid alternative combined the highest scores of the Web and App alternatives, and its total value score was higher than that of any other alternative, making it our recommended solution to the decision maker.
The client, the Assistant DCMO, approved the development of this approach. It may seem an obvious solution, but as the client pointed out, it is far from obvious for the Office of the Secretary of Defense and the DoD, which rely heavily on PowerPoint briefings and paper reports, especially to Congress.
3.4 Implementation
The study is currently in the Implementation phase, which includes working closely with the Assessment sub-system development team to combine the two sub-systems. The work accomplished in the previous phases contributed greatly to the development of the requirements for the web-based app. The system should be implemented and ready for use in the near future.
4. Conclusion
This study was significant on many levels. First, it was one of the first holistic looks at assessments across the entire Department of Defense; most previous efforts have focused on very specific areas or have been externally directed (e.g., the QDR). There has also been little thought given to how to present the results: other such assessments have all been delivered as paper reports or PowerPoint charts. This work focused on how the most senior leaders in the Department of Defense should receive information for fast action and responsiveness. The DoDASP Visualization system will not only be on the Secretary of Defense's desk soon, but also on his phone, and it will likewise be on the desk and phone of each of the Service
Secretaries and the Service Chiefs. For an organization as large and diverse as the Department of Defense, such a system is a
long time coming.
5. References
Agrawala, M., Li, W., Berthouzoz, F. (2011). "Design Principles for Visual Communication". Communications of the ACM, 54, 60-69.
Alexander, R. (2012, March 29). “Which is the world's biggest employer?” British Broadcasting Company (BBC) Magazine,
accessed April 6, 2015, from http://www.bbc.com/news/magazine-17429786
Apple Inc., iOS Human Interface Guidelines. Apple Inc. Retrieved January 26, 2015, from
https://developer.apple.com/library/ios/documentation/UserExperience/Conceptual/MobileHIG/
CACM Staff. (2014). Visualizations Make Big Data Meaningful. Communications of the ACM, 57, 19-21.
CJCS. (2013). CJCS Guide 4301D: CJCS Guide to the Chairman’s Readiness System. Retrieved April 8, 2015, from
http://www.dtic.mil/cjcs_directives/cdata/unlimit/g3401.pdf
Dees, R., Dabkowski, M., Parnell, G. (2010, March). Decision-Focused Transformation of Additive Value Models to Improve Communication. Decision Analysis, 7, 172-184.
Department of Defense. (2010, January). QDR 101: What You Should Know. Retrieved April 8, 2015, from http://www.defense.gov/qdr/QDR_101_FACT_SHEET_January_2010.pdf
Department of Defense. (2013, July 1). Strategic Management Plan: The Business of Defense FY2014-2015. Retrieved October 10, 2014, from http://dcmo.defense.gov/publications/documents/FY14-15_SMP.pdf
DoD website. Retrieved January 30, 2015, from http://www.defense.gov/osd/
Ewing, P., Tarantino, W., Parnell, G. S. (2006). Use of decision analysis in the Army base realignment and closure (BRAC) 2005 military value analysis. Decision Analysis, 3, 33-49.
Freberg, K., Saling, K., Vidoloff, K., Eosco, G. (2013, September). Using value modeling to evaluate social media messages: The case of Hurricane Irene. Public Relations Review, 39, 185-192.
Garrett, J. J. (2010). The Elements of User Experience: User-Centered Design for the Web and Beyond (2nd ed.). Berkeley, CA: New Riders.
Hearst, M. (1999). Modern Information Retrieval. USA: Addison-Wesley Longman Publishing Co.
Kasunic, M. (2005). Designing an Effective Survey. Retrieved January 26, 2015, from http://www.sei.cmu.edu/reports/05hb004.pdf
Kirk, A. (2012). Data visualization: a successful design process; a structured design approach to equip you with the
knowledge of how to successfully accomplish any data visualization challenge efficiently and effectively, Birmingham,
UK: Packt Publishing.
Library of Congress. (2015). America's Story. Retrieved April 7, 2015, from http://www.americaslibrary.gov/jb/revolut/jb_revolut_army_1.html
Parnell, G., Driscoll, P., Henderson, D. (2011). Decision Making in Systems Engineering and Management (2nd ed.). New Jersey: John Wiley & Sons.
Public Law. (2008, January 28). Section 904 of Public Law 110-181. Retrieved April 8, 2015, from
http://www.dod.gov/dodgc/olc/docs/pl110-181.pdf
Roeckl, M. (2009). Cooperative Situational Awareness in Transportation: Integration of Information and Communications in Intelligent Transportation Systems (Doctoral dissertation, Leopold-Franzens-University of Innsbruck, 2009). DLR electronic library.
VIDI Group UC Davis. Visualization Interface. Retrieved April 2, 2015, from http://vidi.cs.ucdavis.edu/research/visinterfaces