OPSEC Professionals Society
The OPSEC Journal
Sixth Edition – October 2008

Table of Contents

Letter to Our Readers – The OPSEC Journal Committee
President’s Remarks – Samuel V. Crouse, OCP
Is the Term “Operational” or “Operations” Security? – Arion (Pat) Pattakos, OCP, CPP
OPSEC in Military Logistics: What’s the Risk? – Patrick J. Geary, OCP
OPSEC and Risk Mitigation in Action: Tracing the Origin and Viewing the Future – Samuel V. Crouse, OCP
Demystifying OPSEC Assessments: A “How To” Primer – Daryl Haegley, OCP, CCO
A Pretext for War: An Insider Threat and an OPSEC Failure – A Book Review by William M. Feidl, OCP

October, 2008

Dear Readers,

It is our pleasure to introduce the sixth edition of the OPSEC Journal. This represents another example of our efforts to provide worthwhile products and services for OPSEC, counterintelligence and security professionals in the public, private and military sectors. The OPSEC Journal is provided free of charge for members of the Operations Security Professionals Society. The opinions expressed herein are not necessarily endorsed by any company or federal agency and are the sole responsibility of each author. Although some of the concepts expressed in the Journal are not universally accepted, they are nonetheless designed to encourage discussion.

We would like to thank the members of the OPSEC Journal Committee who spent significant time and effort reviewing articles proposed for publication. This issue begins with a short article explaining why OPSEC means “Operations” Security, not “Operational” Security, by Past National President Pat Pattakos, OCP, followed by an article explaining the importance of applying OPSEC in military logistics by Past National President Pat Geary, OCP. Next, current National President Sam Crouse, OCP, examines OPSEC and risk mitigation, and current National Vice President Daryl Haegley, OCP, discusses an interesting method for conducting OPSEC assessments. We conclude this edition of the OPSEC Journal with a book review written by a past member of the National Board of Directors, Bill Feidl, OCP.

We encourage each of you to write articles to be considered for publication in the OPSEC Journal. Articles should be submitted to: The OPSEC PROFESSIONALS SOCIETY at http://www.opsecsociety.org or https://opmis.xsp.org or PO Box 150515, Alexandria, VA 22315-0515 or E-FAX: 503-907-7511. Make sure you include a brief biographical sketch. Printed copies should include a copy on disk.

Thank you for your interest in the OPSEC Journal. We hope you find the following pages both informative and interesting.

The OPSEC Journal Committee

President’s Remarks

October, 2008

Dear Members and Friends,

Let me first thank the members of the Journal Committee for this edition of The OPSEC Journal. Special thanks to Pat Geary, OCP and Past President, and Janice Edwards, OPS Member, who worked to pull all the pieces together for production. It has been some time since the publication of a Society journal, and we are hopeful that we will publish one journal per year in the future.

Second, although we are working on a number of projects in the Society, one of the most important is the cataloguing of past articles and journals in an electronic library. This library will represent the written portion of our “body of knowledge.” As we proceed on this path, you will see there is much we can learn from a review of these gems from the past.
The next step is to expand this body of knowledge by articulating and sharing our new thoughts, ideas, and crystallized wisdom via papers and proceedings. Over the next year, we will finish our second decade as a Society and enter a third decade that will bring more challenges to our profession. We encourage all members and friends to contribute as we meet those challenges and solve tomorrow’s problems as a team. Enjoy this journal, and as you read the articles, write down those thoughts and ideas and consider contributing to the next edition.

Sam Crouse, OCP
13th National President
OPS Board of Directors

Is the Term “Operational” or “Operations” Security?
Arion (Pat) Pattakos, OCP, CPP

All too often we hear the term Operational Security when clearly the reference is to Operations Security. Some of us old-timers give a shudder when we hear the term Operational Security used in the context of Operations Security. Why, you say? Well, because many of us believe they are not the same thing. Every profession has its professional lexicon, and so does Operations Security (aka OPSEC). Furthermore, it is incumbent upon professionals to use standard, agreed-upon terminology to ensure effective communication and comprehension. Interestingly, while OPSEC is defined in the “DOD Dictionary of Military and Associated Terms” (JP 1-02), there is no definition for Operational Security. When you do an internet search using the term Operational Security, it usually reflects usage associated with cyber-security.

OPSEC as defined in the DoD dictionary, and as we professionals know it, deals with protecting critical information: those specific facts about friendly intentions, capabilities, and activities vitally needed by adversaries for them to plan and act effectively so as to guarantee failure or unacceptable consequences for friendly mission accomplishment. The DoD Dictionary then defines OPSEC as the process of identifying critical information and subsequently analyzing friendly actions attendant to military operations and other activities to: a. identify those actions that can be observed by adversary intelligence systems; b. determine indicators that hostile intelligence systems might obtain that could be interpreted or pieced together to derive critical information in time to be useful to adversaries; and c. select and execute measures that eliminate or reduce to an acceptable level the vulnerabilities of friendly actions to adversary exploitation.

Note that OPSEC has a particular point of view; it deals with what can be observed by adversary intelligence systems, those indicators that may be interpreted to reveal our critical information. The five-step OPSEC process is a risk management process directed toward reducing to an acceptable level adversary exploitation of those indicators and hence their determination of our critical information. After all, the Dictionary advises that risk management is the process of identifying, assessing, and controlling risks arising from operational factors and making decisions that balance risk cost with mission benefits. One set of these operational factors is observable actions, and therefore OPSEC; but there are more operational factors of concern, and this is where the idea of Operational Security comes into play. Hopefully, we are in agreement on what we call OPSEC as defined in the Dictionary and other sources.
However, given the use by many of the term Operational Security, we need to offer some definition for Operational Security so that we can keep our OPSEC terminology in our professional context. Here are some ideas.

To paraphrase the DoD Dictionary, an operation is an action (or actions) taken to carry out a mission. Both the word operation and the word mission are used in their broad contextual framework as the things you do to achieve specific objectives, which may range from protecting a facility to eliminating terrorists, even taking hill 101, and more. When conducting any type of operation or activity, it is common sense for the person in charge to protect the operation from adversary interference. Again to paraphrase from the Dictionary, we must take the measures necessary to protect ourselves against acts designed to, or which may, impair our effectiveness in accomplishing a mission. The meaning for Operational Security that flows from these ideas suggests that Operational Security is bigger than OPSEC and that OPSEC is but one component of Operational Security. Operational Security is an umbrella of security processes used to protect all critical assets associated with an operation, such as people, facilities, logistics, information/communication systems, equipment, technology, and so forth. Putting out flankers in a movement to contact is Operational Security, as is controlling access to a facility.

As noted above, OPSEC looks at how an asset (the operation or its components) may be vulnerable to enemy exploitation from what is revealed and thus observable, thereby risking reduced operational effectiveness. Are you posting guards to protect a facility? Are there patterns that you establish when doing so that may be exploitable by an attacker? When do shifts change and how? How do guards communicate? Do the guards sleep on duty? Do they fail to patrol? The guards are a component of Operational Security, while OPSEC is a vulnerability assessment tool that reveals how that security may be defeated.

In conclusion, here is a working definition of Operational Security, paraphrased from the definition for security provided by the Dictionary. Operational Security: 1. Measures taken by an organization, activity, facility or installation to protect itself against all acts designed to, or which may, impair its mission effectiveness. 2. A condition that results from the establishment and maintenance of protective measures that ensure a state of inviolability from hostile acts or influences. Dictionaries might not be fun to read, but sometimes reading them is necessary to promote comprehension and professionalism. We are, after all, the OPSEC Professionals.

______________
Arion “Pat” Pattakos is a founding member, past national president and a member of the Standards Committee of the OPSEC Professionals Society. He is certified as an OPSEC Professional (OCP) and as a Protection Professional (CPP).

OPSEC in Military Logistics: What’s the Risk? [1]
Patrick J. Geary, OCP

The haft of the arrow had been feathered with one of the eagle’s own plumes. Sometimes we supply our enemies with the means of our own destruction.
– Aesop, 6th Century BC [2]

In this example, the 6th century BC philosopher Aesop was commenting on how a hunter succeeded in killing his prey – an eagle. After observing eagles for an extended period, the hunter learned eagles occasionally shed their plumes or feathers. The hunter had also discovered those plumes made an excellent addition to his arsenal of arrows.
When the eagle carried out his normal function of shedding a plume, the hunter picked it up and fashioned it into one of his arrows, then used it to kill the very same eagle. Whether Aesop actually saw this event occur is not known. But his fable illustrates how seemingly innocuous actions can lead to one’s demise.

As we continue our military operations in Afghanistan and Iraq as part of our efforts to protect our homeland in the War on Terrorism, it is important to remember the key role of logistics support. Just as the eagle provided the means of his own destruction, so too could seemingly innocuous activities in logistics provide the means of our own elimination. It is easy to overlook how we appear to others who might be interested in what we are doing. In the words of Vice Admiral Arthur K. Cebrowski, USN (ret.), Past President of the U.S. Naval War College, “Anytime you have something of military value, it becomes a target.” [3]

The question is: How do you determine when logistic actions or operations become an unacceptable risk? The purpose of this article is to discuss actions or operations in a military logistics environment and apply the Operations Security process in an effort to help determine the risk of such actions. The objective is to enlighten the reader on both of these areas so he will be in a better position to positively affect outcomes.

[1] Note: This article originally appeared in Logistics Spectrum, International Society of Logistics Engineers, Vol. 37, Issue 1, Jan.-Mar. 2003. Reprinted with permission of the author.
[2] Charles W. Eliot, ed., The Harvard Classics (New York, New York: P.F. Collier and Son, 1909).
[3] VADM Arthur K. Cebrowski, “Warfare in the Information Age,” Speech, U.S. Naval War College, Newport, RI: 5 October 1999.

Background

Before proceeding further, the key terms in this discussion should be defined. The most important terms are logistics and operations security (OPSEC). Since logistics is a broad, complex and somewhat abstract field, it has many definitions. Henry E. Eccles, in his classic logistics work, Logistics in the National Defense, made his attempt to define it in 1959 when he stated:

Logistics is the provision of the physical means by which power is exercised by organized forces. In military terms, it is the creation and sustained support of combat forces and weapons. Its objective is maximum sustained combat effectiveness. Logistical activities involve the direction and coordination of those technical or functional activities which in summation create or support the military forces. [4]

Today, probably the most accepted definition of logistics comes from Joint Publication 1-02, DOD Dictionary of Military and Associated Terms:

The science of planning and carrying out the movement and maintenance of forces. In its most comprehensive sense, those aspects of military operations which deal with: a. design and development, acquisition, storage, movement, distribution, maintenance, evacuation, and disposition of materiel; b. movement, evacuation, and hospitalization of personnel; c. acquisition or construction, maintenance, operation, and distribution of facilities; and d. acquisition or furnishing of services. [5]

As with logistics, probably the most accepted definition of OPSEC also comes from Joint Publication 1-02:

A process of identifying critical information and subsequently analyzing friendly actions attendant to military operations and other activities to: a. Identify those actions that can be observed by adversary intelligence systems.
b. Determine indicators adversary intelligence systems might obtain that could be interpreted or pieced together to derive critical information in time to be useful to adversaries. c. Select and execute measures that eliminate or reduce to an acceptable level the vulnerabilities of friendly actions to adversary exploitation. [6]

[4] Henry E. Eccles, Logistics in the National Defense (Harrisburg, Pennsylvania: Stackpole Company, 1959; reprint ed., Newport, RI: Naval War College Press, 1997), 22.
[5] Joint Chiefs of Staff, Department of Defense Dictionary of Military and Associated Terms (Joint Pub 1-02) (Washington, D.C.: June 29, 1999), 263.
[6] Ibid., 328.

The Problem – Logistic Parts of the Whole

In the military world, few things are more important to the successful outcome of battle than good logistics and Operations Security (OPSEC). Joint Pub 4.0, Doctrine for Logistic Support of Joint Operations, says, “Logistics… is the foundation of combat power.” [7] Joint Pub 1, Joint Warfare of the US Armed Forces, says “adequate logistic support” and “maintaining the operations security of plans and gaining the fullest possible surprise” are vitally important and essential for maintaining freedom of action. [8]

As quoted above, logistics is the foundation of combat power and as such is a part of any military endeavor. An airman, marine, sailor or soldier cannot participate in a fight unless he can get to the fight and have something to fight with when he arrives. In addition, to sustain the fight he must be continually supplied. Basic human needs such as food, water and shelter must be not only available and in good condition, but they must also arrive at the right location at the right time and in sufficient quantity for the relevant mission. Beyond those basic human needs, in a combat environment, combat forces also need such essentials as fuel, clothing, weaponry, ordnance, protective equipment, communication gear, medical supplies, health services, transportation, construction material, spare parts, plus equipment and facility maintenance. [9] In a peacetime environment at a military research facility, scientists and engineers need certain types of supplies to work on specific types of metals, ceramics, epoxies or other materials.

The vital need for these supplies and services is nothing new for the experienced military logistician. But to ensure these supplies and services arrive at the right location at the right time and in sufficient quantity for the relevant mission, one must take a great many actions to overcome all the inherent hurdles involved. The actions necessary to provide these needs in the logistics chain of supply are easily observed or detected.

[7] Joint Chiefs of Staff, Joint Doctrine for Logistic Support of Joint Operations (Joint Pub 4.0) (Washington, D.C.: January 27, 1995), ix.
[8] Joint Chiefs of Staff, Joint Warfare of the US Armed Forces (Joint Pub 1) (Washington, D.C.: January 10, 1995), III-6.
[9] George C. Thorpe, Pure Logistics: The Science of War Preparation (Kansas City, MO: Franklin Hudson Publishing, 1917; reprint ed., Newport, RI: Naval War College Press, 1997), 10-11.
Major General Julian Thompson perhaps best underscored the importance of this statement in his book, The Lifeblood of War: Logistics in Armed Conflict, when he said the need for security in logistics is important because of “…the need to conceal one’s logistic intentions from the enemy, lest by his interpretations of them he divines the overall plan.” [10] In other words, when going about the duties necessary to provide adequate supplies and material for our military forces or infrastructure, one’s activity can be readily observed or detected by potential adversaries who are trying to discover our intentions and capabilities. When a potential adversary knows your intentions and your ability to carry out those intentions, he is in a much better position to anticipate your next move and to prepare to counter it. The result can be catastrophic for any combat operation.

The problem is that many logistics practitioners and those who benefit from logistics support do not always understand the significance of logistic actions or operations as a part of the overall effort. As a result, final objectives or outcomes are negatively affected when others exploit easily observed logistic indicators of our intentions and capabilities. A few examples follow.

Suppose you are in the Navy and you are trying to move a ship from Port A to Port B without being detected. In some cases, this activity would be classified information and you might want to go to great lengths to protect that information. No matter what else you do to hide the presence of your ship, your adversary could learn both your intentions and mission if you send the mail for the ship to its port of destination. To deduce the sensitive information, an intelligence collector would merely need to combine the mail information with publicly released information about the ship’s capabilities and knowledge of desirable targets near the destination port. The resulting analysis might not be perfect, but it would have a high probability of being close, and this could be enough to remove the element of surprise and defeat your objective.

Suppose you are in the Army, and you are deployed and preparing to move a combat unit into action. Allowing your unit to continue displaying unit identifiers on their uniforms while feeding them an unusually lavish hot meal and supplying them with a week of provisions could indicate a number of things with intelligence value. To most observers, those actions could indicate: 1) the type of training the unit has received (capability), 2) the size of the unit (capability), 3) the approximate time of combat (intention), and possibly 4) the approximate location of the expected combat. Again, an intelligence collector could accurately interpret these seemingly innocuous actions, anticipate your move, remove the element of surprise and defeat your objective.

Similarly, one could also develop plenty of examples involving the Air Force. Most people realize one of the most frequent activities for an Air Force unit in combat is to fly bombing missions. If you give civilian air traffic controllers access to the flight plans for the aircraft flying a bombing mission, your adversary could discover the time, destination and approach vector of your aircraft.

[10] Maj. Gen. Julian Thompson, The Lifeblood of War: Logistics in Armed Conflict (London, UK: Brassey’s, no date avail.), 8.
With this knowledge, your adversary would know exactly when and where the aircraft would arrive and would therefore find it much easier to shoot down the incoming aircraft.

In a research environment, there are many occasions when you might not want potential adversaries to know what kind of military research project you are working on and where it is located. Easily observed material supplied to a military facility loading dock could be an indicator of these sensitive activities. If you combine this observation with the appearance of a test schedule, plus hotel reservations and travel orders for specific types of scientists and/or engineers, chances are your adversary will be able to interpret this information and make a good estimate of what you are working on and when and where you are working on it. In the above example, if your adversary needed more detailed information, he would know exactly where to direct additional intelligence collection assets to learn more valuable information. For instance, a closer examination of the test schedules or the types and amounts of supplies and personnel at your location might indicate the progress made on your project and subsequently reveal how much time remains before you are likely to be able to deploy it. Your adversary would then know how much time he has to develop a countermeasure to your project and would be in a better position to reduce or eliminate your effectiveness when deployed.

All of these operations are examples of logistics activities having the potential to reveal sensitive or classified information. The important thing to remember is that everything we do in the defense community is a part of the greater whole. Many small, individual and seemingly harmless actions are part of everything we do to provide for our country’s defense. Many different people are involved in these actions. They all have importance and, together with other pieces of information, could reveal sensitive intentions, capabilities and activities.

Essential Secrecy and Risk

The examples above point out some of the dangers of compromise from relatively simple logistic actions or activities. Those same simple actions, however, are also absolutely necessary to conduct our business and operations. Consequently, there is the equally important risk of inhibiting those activities to such an extent that we negatively affect progress toward our own objectives. The concept of essential secrecy recognizes both of these fundamental facts: 1) inadequate secrecy allows our potential adversaries to inhibit or degrade our operational effectiveness by identifying critical information and eliminating the element of surprise; and 2) excessive secrecy inhibits our operational effectiveness by interfering with crucial activities such as coordination and logistical support. [11] Our goal then should be to conduct our military logistics activities without allowing either of these to occur.

To ensure the even-handed application of essential secrecy, military logistics planners should ask themselves a series of questions: Should I be concerned about securing the information and material under my purview? Do I need to take any action? How much time is it worth? Should I allocate any of my limited resources? If I need to allocate some of my scarce resources to secure the information and material under my control, how much should I allocate and what measures should I use? In other words, what is the risk of doing too much or too little?
Clearly, the answer will come from an effective application of an analytical process otherwise known as the Operations Security (OPSEC) process. Essential secrecy through the proper application of the OPSEC process will directly contribute to the operational effectiveness of military logistics by inhibiting or preventing potential adversaries from degrading mission effectiveness.

Determining Risk Through the OPSEC Process

The operations security (OPSEC) process is a five-step analytical process designed to answer the questions above. The steps are: 1) Identify the critical information; 2) Analyze the threat; 3) Analyze the vulnerabilities; 4) Assess the risk; and 5) Apply appropriate measures. To be effective, all five steps should be taken in sequence. Except for step 1, each of these steps depends somewhat on the step before it, and any step can negate the need to proceed further. [12] It should also be noted that there are circumstances when two or more of these steps must be applied simultaneously. A full discussion of the OPSEC process is beyond the scope of this article, but a brief summary of the steps involved follows.

[11] Navy Department, OPERATIONS SECURITY, OPNAVINST 3432.1 (Washington: 1995), 3-4.
[12] Ibid., Enclosure (1) 1-6; Joint Chiefs of Staff, Joint Doctrine for Operations Security (Joint Pub 3-54 CH-1) (Washington, D.C.: April 15, 1994), III.2.a.–III.2.e.; Air Force Department, Operations Security, AFI 10-1101 (Washington: 1994), 2.1-2.6. The following discussion on the OPSEC Process is based on these three sources.

Step 1 – Identify the critical information. After establishing your mission and what you are trying to accomplish, the first thing to examine is how your mission impacts others and their missions. Ask yourself what the most important elements of your mission are. Is there anything in your mission that could put lives or valuable resources in danger? You must look at your effort as if you were an intelligence collector for a potential adversary. If a potential adversary learned this information, would it allow him to degrade or eliminate your ability to accomplish your mission? Are you doing anything that could indicate your intentions and capabilities to a potential adversary? The answers to these questions will form the basis for all further actions in the OPSEC process. If you determine there is nothing worth protecting in your mission, there is no need to proceed further in this process.

Step 2 – Analyze the threat. After you have confirmed the presence of critical information, the next step is to make sure your critical information is actually threatened. This might take some liaison work with intelligence, counterintelligence and OPSEC professionals. By working closely with these groups you should be able to assess a) whether someone is actually interested in your mission, and b) whether those interested parties actually have the ability to collect your critical information. The sources of those threats are also important because they can later help determine what measures might be necessary. The most common sources of threat include human intelligence collection, imagery intelligence collection, and signals intelligence collection. All of these methods can be applied when collecting information on logistics, but the most common method is reviewing open source information and observing processes, procedures, behaviors, and administrative habits that could indicate the desired critical information.

Step 3 – Analyze the vulnerabilities. After you have confirmed your critical information is threatened, you must determine whether or not the critical information is actually vulnerable. Here you are looking at what elements of your operation are not adequately protected and could indicate or lead a collector to your critical information. You must determine if your threatened critical information is exposed in a way that a potential adversary could act on in time to degrade or eliminate your effectiveness. If he cannot act in time to do damage, what you are already doing might be adequate protection and you might not need to proceed further in this process. However, critical information is almost always vulnerable at one point or another. Consultation with OPSEC professionals in this step might be especially helpful in discovering relevant vulnerabilities.

Step 4 – Assess the risk. Only after the first three steps are confirmed do you proceed to this step. In this step you are combining all of the information from the previous three steps and determining whether the risk to your program appears acceptable or unacceptable. What is the likelihood that unacceptable damage to your mission or program could result if you do not take further action? You must weigh the answer against the potential for a reduction in operational effectiveness and the cost of the various measures available to reduce or eliminate the risk. With the information you have collected in the previous three steps, can you justify taking no action, or is there an unacceptable risk to your critical information requiring actions or measures to mitigate it?

Step 5 – Apply appropriate measures. If the risk to your critical information is unacceptable, additional measures will be necessary to mitigate the risk. You must weigh the total importance and/or value of your mission against the measures available to mitigate the risk. In some cases, the only thing you need to do is cover something up or wait half an hour until a satellite passes overhead. In other cases, you might need to develop an extensive cover story to disguise what you are doing. Sometimes the measures available are so costly you cannot justify their implementation. Usually, however, there is a significant range of options available. You must select the most appropriate measures considering all of the information previously gathered in the steps above.
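Because any step after step 1 can end the analysis, the process can be pictured as a gated pipeline. The sketch below is a minimal illustration of that flow, not an official tool; all names, thresholds and data in it are hypothetical, and its step 4 arithmetic (likelihood times consequence) simply anticipates the two-factor risk models discussed later in this journal.

```python
# A minimal sketch (not from the article) of the five-step OPSEC process as a
# short-circuiting pipeline: except for step 1, each step depends on the one
# before it, and any step can end the analysis early. All names, thresholds
# and data below are hypothetical.

ACCEPTABLE_RISK = 4  # notional threshold on a 1-9 scale

def assess(operation):
    # Step 1: identify the critical information.
    critical = operation.get("critical_information", [])
    if not critical:
        return "Nothing worth protecting; no need to proceed further."
    # Step 2: analyze the threat (interest AND ability to collect).
    threats = [t for t in operation.get("threats", [])
               if t["interested"] and t["can_collect"]]
    if not threats:
        return "No interested, capable adversary; no need to proceed further."
    # Step 3: analyze the vulnerabilities (exposed, exploitable indicators).
    vulns = [i for i in operation.get("indicators", []) if i["exposed"]]
    if not vulns:
        return "Existing protection appears adequate; no need to proceed further."
    # Step 4: assess the risk (worst-case likelihood x consequence).
    risk = max(t["likelihood"] * v["consequence"] for t in threats for v in vulns)
    if risk <= ACCEPTABLE_RISK:
        return f"Risk {risk} is acceptable; no additional measures justified."
    # Step 5: apply appropriate measures.
    return f"Risk {risk} is unacceptable; select and apply countermeasures."

# Hypothetical data echoing the article's ship-movement example.
print(assess({
    "critical_information": ["ship movement from Port A to Port B"],
    "threats": [{"interested": True, "can_collect": True, "likelihood": 3}],
    "indicators": [{"exposed": True, "consequence": 3}],  # mail sent ahead
}))
```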
Conclusion

Military logisticians today face a multitude of hurdles to overcome: from ensuring their products or services are supplied in sufficient quality and quantity to ensuring those same products or services arrive in time to positively affect the outcome of combat operations. The actions taken to overcome these hurdles can sometimes produce risk to the overall integrity of a military operation. The concept of essential secrecy, realized through the effective application of the OPSEC process, is of vital importance when determining and mitigating risk in military logistics without negatively affecting our own operational effectiveness. With an awareness of the above discussion, it is imperative that those in a position to act implement the OPSEC process and thereby enhance the essential protection of critical information.

______________
Patrick J. Geary is a charter member, past national president and permanent member of the Standards Committee of the OPSEC Professionals Society.
He is certified as an OPSEC Professional (OCP) and currently serves as the Director, Office of Security Programs and Continuity Planning for the Department of the Navy, Naval Sea Systems Command (NAVSEA). NAVSEA is the third largest command in the Navy, with over 53,000 employees and a budget of over $30 billion.

OPSEC and Risk Mitigation in Action: Tracing the Origin and Viewing the Future
Samuel V. Crouse, OCP

“Above all, we must realize that no arsenal, or no weapon in the arsenals of the world, is so formidable as the will and moral courage of free men and women. It is a weapon our adversaries in today's world do not have.”
– Ronald Reagan

I. Introduction

Today’s leaders are faced with unique opportunities in the area of information accessibility. The electronic age, the internet and shared networks supply them with an almost endless supply of data with the additional feature of real-time or near real-time updates. While information accessibility is a quantum leap over what previous generations had available, the analysis side of the equation becomes increasingly complex. Processes that refine and sort information for the decision making task of leaders need to respond to this increased supply of information, or bottlenecks and “analysis paralysis” can and will develop. It is important for leaders and managers to make decisions efficiently and effectively. This is true in the area of risk mitigation, ultimately the subject of this paper.

This paper examines the history and origin of OPSEC (Operations Security) and the basic five-step model. The steps are compared and contrasted with the DoD program protection process in order to learn more about the importance of identifying critical program information, threats and vulnerabilities; these elements are considered in the risk mitigation steps of both processes. More can be learned about risk mitigation, the highly cognitive portion of the task, by discovering the nature of factual decisions. This facet is examined in detail and compared with recent literature on decision making, a vital component of the problem solving associated with risk mitigation. Finally, the need for a common lexicon among decision makers is discussed and outlined in a simple, flexible model that will help leaders foster the environment for integrated team analysis, decisions and risk mitigation.

The paper aims to tie the past to the future and spur critical thinking on how we as a professional society can contribute to “baking in” the tenets of OPSEC in the early stages, along with helping leaders, managers and policy makers formulate better risk management decisions. “Risk Management in Action,” the mantra of OPS, is better served by a deeper understanding and practical look at mitigation strategies and by returning to the analytical roots of OPSEC. It is written in the hope of stretching our thinking beyond the normal boundaries of professional OPSEC writings and venturing into contemporary decision making and problem solving literature to foster a better understanding of the cognitive processes and organizational forces involved in risk mitigation. Before proceeding with the discussion, a few important operational definitions are offered in respect to this paper.

II. Definitions

One of the challenges and joys of our profession is that we work with other disciplines (technical, security, intelligence and operations) that each have their own language.
Language, while absolutely needed to communicate and allow structure, becomes limiting if words and phrases take on differing meanings to members of a team attempting to solve a common problem. The “paradox of structure” (Kirton 2006, 126) must be dealt with: the structure of language is needed to communicate, but that same structure presents its own limits. Bridging the gap between the lexicons of varying disciplines is probably one of the most important tasks we can perform if we intend to work with a multi-disciplined team.

Consequence
Consequence, in respect to this paper, is the effect or outcome on the operation or system if the threat (no matter what the likelihood) actually occurs. In the risk model proposed in this paper, it is the vulnerability rating.

Decision Making
Decision making is the cognitive process that takes place between perception of a problem or condition and the action taken as a result of the decision making.

Impact
Impact is defined as the effect on the operation, program or system.

Likelihood
Likelihood is defined as the probability that an event (threat) will occur.

Matrixed Organization
An organization composed of personnel from two or more work groups, companies or entities working on a project, task or problem. The groups may also be a combination of customer and supplier elements.

OPSEC
Operations Security (OPSEC) is defined as a systematic and analytic process used to deny an adversary information, generally unclassified, concerning intentions and capabilities by identifying, controlling, and protecting indicators associated with planning processes or operations (IOSS, 2007).

OPSEC Process
The five-step analytical model used to assess risk and develop countermeasures.

Risk Mitigation
Risk mitigation refers to efforts taken to reduce either the probability (likelihood) or consequences of a threat.

Vulnerability
Vulnerability as defined in this paper is the susceptibility of the critical information or operation in the presence of the threat.

III. History and Origins

The roots of our profession date back to the Vietnam era. “The impetus for OPSEC was overwhelming evidence that a relatively unsophisticated adversary had foreknowledge of our intentions” (Fisher 1988, 1). We found that we were very good at protecting classified information, but those pesky pieces of unclassified information were literally killers. In today’s information-porous society, the challenge is the same or greater. As our founders first tackled the problem in Vietnam, they found that “physical security assessments showed that classified information, per se was protected adequately” (Fisher 1988, 1). A select group of pioneers under the direction of Pacific Command’s (PACOM) chief of the operations staff command and control division, Col Jim Chance, took up the challenge. “The permanent group, known as the Purple Dragon team, continued to conduct surveys throughout the duration of the Vietnam conflict” (Fisher 1988, 2).

Why the color purple? In the Vietnam War era, our forces considered the enemy they were facing to be represented symbolically by a Red Dragon, the color red referring to the communists and the dragon being a traditional Asian symbol of strength. The color of the uniform worn by the U.S. Air Force supplied the other half of the name. Since “the first OPSEC survey involved an Air Force operation, and was headed up by an Air Force Colonel…ergo, it was the Purple Dragon that was going to go up against the Red Dragon” (Johnston 2006, 6).
Purple Dragon essentially combined the COMSEC survey techniques of interviewing and open-source exploitation with operational research methodology in order to examine operational activities and patterns that might be providing cueing to the enemy forces (Peeples 1993, 1). The Army later leveraged the lessons learned from Purple Dragon. When Army CI missions were transferred to other agencies, the Army started its own OPSEC program in the mid-70s (Howe 1993, 2). “During the 20 year period between the birth of OPSEC and the signing of NSDD-298, OPSEC programs ran hot and cold and the true identity of OPSEC became obscure” (Ferrill 1993a, 2). On January 22, 1988, the President of the United States signed National Security Decision Directive (NSDD) 298, establishing a National Operations Security Program. Contained in the NSDD was a requirement for the Director, National Security Agency to “…establish and maintain an Interagency OPSEC Support Staff (IOSS) whose membership shall be composed of all government agencies” (Ferrill 2006, 3). DoD Directive 5205.2 (1999) essentially reaffirms the policy and charges the services, commands and activities to consider OPSEC in planning and analysis of programs and operations. It is really the first layer of overarching guidance for OPSEC professionals. The various uniformed services and subordinate units have their own directives that supply more guidance on what must be done.

What are the keys to how an OPSEC program is promulgated at the unit level? This is where most of our attention is focused as OPSEC professionals: the essence of it is leadership driving teamwork. If there is an optimum blueprint, cookbook or recipe for OPSEC, we haven’t found it, nor are we sure we would want it. OPSEC has to be thought through and tailored at the unit level. What has to be done, how much has to be analyzed and who should do it? All of these are driving forces in shaping an OPSEC mentality. Consider this from one of our founders in regard to the initial efforts: “we were 17 people from different armed services, different disciplines, different ranks, all working in the same space, all striving together to make sense out of mountains of information; collecting it, studying it, analyzing it, vetting it with the staff, deciding what to do” (Johnston 2006, 8). Does this sound familiar? Are we much different today? We must work across organizational boundaries, companies, units and cultures to get it done. We simply cannot afford to build an “OPSEC empire” and allow sub-optimization to take hold.

An Effective OPSEC Program

Okay, you want a program, but not an empire. You want to be effective but not seen as a zealot. It is best to learn from those who have gone before. Study those who have succeeded and examine the obituaries of those who failed. Both are worthy of careful investigation. The following are words worth considering before you even enter the profession. As I read them once again, I thought they should be posted on the side panel of the OPSEC package (like the warning on a package of cigarettes): “you will be seen as an invader because you are a threat to the status quo. After all you are suggesting a change in their system. Since you will be an outsider, it is natural they may view your efforts as meddling. It is extremely important at this point of time that you do not permit yourself to be drawn into a confrontation” (Ferrill 1993b, 5).

Where do you put the OPSEC function?
In his article, The Nature of OPSEC, George Jelen discusses some options for where the OPSEC function should be placed in an organization. Intelligence, security and operations sections are all weighed, but he concludes that the “problem is that with any of the three choices, there is no way to accrue the advantages of all three placements, and the absence of the other advantages becomes a disadvantage” (1993, 5). The choice is a difficult one, and unfortunately left to leadership, who often sees OPSEC as an additional duty, rarely worthy of appointing a full-time professional. It is an opportunity for us as a professional society to grow the most knowledgeable and influential personnel we can. As we become more effective, the placement issue will take care of itself.

IV. The OPSEC Model

In 1993 one of our leaders stated, “The analytic process we recognize as OPSEC has been around for about 25 years, yet with the exceptions of the publications of the IOSS, there is an outstanding lack of literature on the subject” (Ferrill, 1993a, 1). Well, someone must have heard him, because there has been a considerable amount written on the analytical process just in the last 10 years.

Five Steps in Theory

Exhibit 1 illustrates the analytical model. The five steps are always the same, but are sometimes worded differently. The sub-bullets in Exhibit 1 are compilations from a number of different articles and publications published over the last few years. The model is relatively simple, and that quite frankly is the beauty of it. The shaded box (step 4) will be discussed later, so just file it in the back of your mind that we will come back to it. The basic theory or model can be, and has been, used in many other risk management and mitigation arenas. Think of the safety discipline, the physical security discipline and one of the more recent players on the stage, the operational risk management (ORM) movement. The challenge in any effort is moving from the theoretical to application. It is helpful to view the OPSEC practitioner as a consumer of data in the collection of information, then performing an “operation on the data” in the analysis steps, and finally as a supplier of data to the customer (policy maker or leader). This helps frame the application of OPSEC in an interesting light. For more on this discussion, see Structuring OPSEC, which has some great practical thoughts and ideas (Peeples and Lothrop 1993, 3).

1. Identify Critical Information: observable actions; activities; capabilities; intentions.
2. Analyze the Threat: identify the adversaries; threat to specific information; continuous, dynamic cycle; constant evaluation.
3. Analyze Vulnerabilities: through all phases of the project; survey by multi-disciplined teams; best described by “measurements”; specific to indicators.
4. Assess Risk (shaded in the original figure): cost-benefit analysis; use of measurement tools; list recommended countermeasures, rank ordered by impact.
5. Mitigate Risk: apply countermeasures; eliminate indicators; disrupt adversary efforts; deceptive techniques.

EXHIBIT 1 – OPSEC Analytical Model
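Step 4's sub-bullets (a cost-benefit analysis producing a recommended countermeasure list, rank ordered by impact) can be illustrated with a small worked example. The sketch below is ours, not an IOSS or DoD tool; the countermeasures, ratings and the impact-per-cost ranking rule are hypothetical placeholders for whatever measurement tool an assessment team actually adopts.

```python
# A minimal sketch (not from the article or any IOSS tool) of step 4's
# cost-benefit idea: each candidate countermeasure carries a qualitative
# impact and cost rating, and the recommended list is rank ordered by
# impact per unit cost. All names and ratings are hypothetical.

RATING = {"low": 1, "medium": 2, "high": 3}

countermeasures = [
    # (name, risk-reduction impact, implementation cost)
    ("Vary convoy departure times", "high", "low"),
    ("Encrypt logistics status reports", "high", "medium"),
    ("Remove unit identifiers from uniforms", "medium", "low"),
    ("Develop a full deception cover story", "high", "high"),
]

def benefit_per_cost(item):
    _, impact, cost = item
    return RATING[impact] / RATING[cost]

# The step 4 output: recommended countermeasures, rank ordered.
for name, impact, cost in sorted(countermeasures, key=benefit_per_cost, reverse=True):
    print(f"{name}: impact={impact}, cost={cost}")
```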
Three Domains

Another cognitive construct is to form a mental model that illustrates sources that might contain critical information. Media, events or things are suggested as “sources of information” (Peeples 1993, 3), and this construct may be valuable for thinking critically through scenarios. Recently, in determining a format for a security concept of operations document, I used 1) the physical/mission domain, 2) the personnel domain, and 3) the information (printed and electronic) domain as frameworks for the chapters. I arrived at these three “domains” by examining two other programs and brainstorming all possible scenarios, then taking the outputs and arranging them into an affinity diagram. When it was all boiled down, everything ‘fit’ into one of those categories and I was left with my chapter headings, plus another simple construct that was easy to communicate to leaders and served the customer’s needs.

The genius in simplicity cannot be overstated. There are enough complex and complicated arenas in our life without our making more of them. In planning for OPSEC, we do need to consider many scenarios, but keeping basics in mind helps create touchstones in an internet-connected, distributed and frenetic world. Here is a gem worth remembering: “when we assume the task of protecting ourselves against industrial espionage, we should not become so overwhelmed by electronic wizardry surrounding computers and modern communications that we lose sight of what we are trying to do” (Helms 1993, 1). Keep it simple when you put the OPSEC model into action. Later, when you gain acceptance, you can always make it more intricate for those leaders and policy makers who require more. Get them to the dinner table first, and then you can always add side portions and condiments to the bill of fare.

V. Program Protection Risk Planning

The Department of Defense Acquisition Guide (2004, 8.4.2) directs that if a program contains critical program information (CPI), program protection planning is required. This acquisition requirement is the responsibility of the program manager and is carried out by a Working-level Integrated Product Team (WIPT), required to produce a Program Protection Plan (PPP) prior to Milestone B. That seems to be a lot of new acronyms, but let’s compare the process to our own OPSEC world. Our main focus in PPP is to ensure that the adversary does not have the capability to kill, counter or clone our system. Those three outcomes would also be unacceptable for an operation in the OPSEC sense. Our risk mitigation decisions and countermeasures must ensure we are not killed, that the operation is not countered, and that the enemy cannot copy or clone our methods, tactics and weaponry. Thus, we can learn a lot by studying a close cousin of the OPSEC process. “PPP uses a Risk Management approach to identify, recommend, and implement security countermeasures designed to reduce risk to an acceptable level at an acceptable cost” (Pattakos 2003a, 61).

Nine Steps in PPP

If five steps are good, then surely nine steps must be better. In examining other risk mitigation and planning models, one model uses eight steps in determining what to protect and how to protect it by adding additional steps dealing with implementation and review (Pattakos 1993, 12). Another model actually used eight steps just to determine the critical and sensitive information (Peeples 1993, 1).

1. Prioritize Operational and Design Characteristics
2. Identify and Prioritize Critical Program Information in Relation to Unique Characteristics
3. Identify Locations and Potential Locations of Critical Program Information and Critical Technologies
4. Identify Foreign and Domestic Threats
5. Identify Vulnerabilities to Specific Threats during Acquisition Phases
6. Identify Anti-Tamper Techniques and Measures to Protect Critical Program Information and Critical Technologies
7. Identify Elements Needing Classification
8. Identify Protection Costs Associated through Risk Mitigation Process and Make Decisions (shaded in the original figure)
9. Identify Procedures and Actions to Ensure Countermeasures are in Place

EXHIBIT 2 – Program Protection Planning

Exhibit 2 shows an illustration of the program protection planning process as outlined in the DAG. Okay, one of the tenets was to keep it simple, so take a look at the shaded box (step 8). Does it look familiar? Of course other boxes look familiar as well (2, 4, 5 and 9), but for the focus of our discussion, risk mitigation, the key point is that shaded step 8 of Exhibit 2 equates to step 4 of Exhibit 1. Hopefully, PPP is less of a mystery at this point. Let’s see how we can “make some money” as OPSEC professionals in a PPP world.

Making it Work

The challenge with the PPP process is similar to the challenge we face with the OPSEC process. The PPP is a bit more complex a challenge because in theory you do not have a physical observable, and you are involving technical disciplines as well. However, think about this in the construct of the three domains. There are two other domains that must be protected, and there are enough potential vulnerabilities to keep you busy there as an OPSEC professional. If you are lucky enough to be on the Working-level Integrated Product Team (WIPT) tackling the PPP, the personnel and information domains will have enough unclassified elements to keep you up at night. Also, while the program protection plan does “not mandate [it], developing an OPSEC Plan is recommended” (Pattakos 2003a, 62). As an OPSEC professional, you can bring your experience to the PPP table in the area of risk mitigation by increasing the sharing of vulnerabilities and contributing an analysis that makes sense of those vulnerabilities. Underlying it all is decision making, which we will explore in more detail to provide opportunities for critical thinking. “An assessment of risk involves an estimate of the potential effects of vulnerability on an operation and a cost-benefit analysis about corrective actions. OPSEC planners provide the decision maker with a preliminary risk analysis, the basis of which is a comparison of known vulnerabilities to the adversary’s threat capabilities, opportunities, and intentions” (IOSS 1993, 4). In order to provide this information to the customer, we first have to solve some problems and make some calculated decisions.

VI. Decisions and Risk Mitigation

In review, both models have a very complex cognitive step involved, that of decision making. As suggested in the definitions section, decision making is the cognitive process that takes place between perception of a problem or condition and the action taken as a result. Therefore, in sequential order: perception of a problem; decision making; action/implementation. Decision making and how humans actually make decisions is the subject of thousands of articles in the fields of psychology, brain science and organizational development, to name a few. A trip through some recent and appropriate contemporary literature on the subject, and how it might apply to our situation, is offered here.

Decision Making – the Essence

Our government leaders are expected to work across organizational boundaries with growing frequency and magnitude. At the heart of this expectation is the ever-increasing speed and volume of information capable of being shared between organizations, principally due to dynamic and sophisticated electronic systems.
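The correspondence just described, including the note that PPP steps 2, 4, 5 and 9 also “look familiar,” can be captured in a few lines. The cross-walk below is our reading of the two exhibits, not something published in the DAG or this journal; the article itself asserts only the 8-to-4 pairing, so treat the other pairings as illustrative.

```python
# An illustrative cross-walk (our reading, not from the DAG) between the nine
# PPP steps of Exhibit 2 and the five OPSEC steps of Exhibit 1. The article
# itself only asserts the 8 -> 4 pairing; the rest are plausible analogues.

OPSEC_STEPS = {
    1: "Identify Critical Information",
    2: "Analyze the Threat",
    3: "Analyze Vulnerabilities",
    4: "Assess Risk",
    5: "Mitigate Risk",
}

PPP_TO_OPSEC = {
    1: 1, 2: 1, 3: 1,  # prioritizing and locating CPI ~ identify critical information
    4: 2,              # foreign and domestic threats ~ analyze the threat
    5: 3,              # vulnerabilities during acquisition ~ analyze vulnerabilities
    6: 5, 7: 5,        # anti-tamper and classification measures ~ mitigate risk
    8: 4,              # protection costs and decisions ~ assess risk (the shaded boxes)
    9: 5,              # verifying countermeasures in place ~ mitigate risk
}

for ppp_step, opsec_step in sorted(PPP_TO_OPSEC.items()):
    print(f"PPP step {ppp_step} ~ OPSEC step {opsec_step}: {OPSEC_STEPS[opsec_step]}")
```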
However, our leaders, like all humans, are still equipped with the same cognitive brain functions to deal with this ever-changing, complex flow of information. In order to learn more about risk mitigation it is useful to examine the nature of problem solving and decision making in the government. While the classic hierarchical organization may still be the norm in government agencies, there is a growing model that has relationships and partnering as foundations, whereas the traditional model uses centralized decision making and authority as driving principles (Goldsmith and Eggers 2004, 8). As these complex relationships develop, decision makers often find themselves working across cultures (in our case, the civil service, military and contractor worlds), and as they do so, the problem solving and decision making arena becomes a bit more complex.

There are many factors that affect individual decision making and many more factors that affect leaders’ organizational decision making. The models and tools developed to aid decision making all attempt to prioritize and weight a myriad of factors in varying degrees of complexity. Much of the literature on problem solving centers on process discussions and is concerned with explanations of various problem solving steps or models. Most, if not all, problem solving models are variations of the basic four-step plan, do, check and act cycle known as the Shewhart Cycle, named after Walter A. Shewhart (Walton 1986, 86). Decision making, for the most part, is generally regarded as a step (usually a final step) in most problem solving models.

Most people regard a problem as something negative requiring resolution, sometimes defined as “something we do not like – a pain, a difficulty, an injustice, a misfortune, a shortcoming, a gap, a vexation, a tragedy, a discomfort, a disaster” (Goodsell, 2004, 81). However, a problem can also be defined as a condition that needs to be resolved (mitigation of risk in our case), and the condition can be positive or negative depending on one’s perspective. Kirton defines problem solving very broadly as “the means by which life survives, that is, successfully manages the ever constant change engendered by itself and its environment” (2003, 26). As defined here, problem solving has two components. First, the opportunity for change must be perceived, and second, there must be action taken to make the change. It is in the space between perception of the problem and action that decision making takes place. A review of most problem solving processes, tools or models that facilitate decision making will show that nearly all of them treat decision making as synonymous with implementation or, at a minimum, assume implementation has taken or will take place.

Impacts of decisions made in one part of the organization can have far-reaching consequences (Jackson and Stainsby 2000, 11). Understanding the entire organization, particularly matrixed organizations, becomes a complex, dynamic challenge and complicates problem solving and decision making. Attempts to short-circuit this can be devastating. One of our original Purple Dragons called it “black stovepiping,” describing instances of taking information right to the top without vetting it through the rest of the team or organization. He called it the “antithesis of teamwork” (Johnston 2006, 8). Perceiving problems in inter-related or matrixed organizations becomes complex and requires new methods.
One study of conventional theories built on simple dyadic exchanges indicates this approach may actually be counterproductive when dealing with multiple parties (O’Toole 1997, 47). Another study highlights the fact that unless decision makers have a full understanding of what it means to work across boundaries, they will continue to use traditional policies and management techniques that counter the positive attributes of a multi-disciplined organization (Keast et al. 2004, 364). O’Toole further suggests that it is the leader’s challenge to craft contexts that enable those who must make decisions to have access to needed information in order to make sound choices efficiently (1997, 45). In our situation, this equates to leaders ensuring the right environment with proper resources in place. A study on strategic decision making in government and private cultures uncovered three types of decision making processes (Hickson et al. 1989, 373). While they were named sporadic, fluid, and constricted, all three were largely related to time and frequency, a common factor we face in our profession, as there is usually a sense of urgency associated with protection issues.

More recent research on decision making in organizations yields some interesting insight. Nutt’s (2006) study looked at perception differences of managers in regard to cost decisions. This particular study also accounted for individual cognitive style differences by the use of a psychometric instrument that measures an individual’s preference for assimilating and processing data. Four cognitive styles (analytic, speculative, consultative and networking) were identified as a result of the psychometric measurement. As we work in cross-culture teams to solve OPSEC problems, individual processing schemes are important to consider. The study found that private cultures seem to place great reliance on analyses and little on bargaining (internal peer/subordinate coordination). Conversely, government cultures placed great emphasis on bargaining and little on networking (working with external constituencies). There was also a difference noted in the area of risk assessment. The government cultures saw less risk than the private cultures when considering the same decisions. This was thought to be the result of the government culture not supporting the value of risk analysis as much as the private culture (Nutt 2006, 312). This is also an important consideration as we deal with risk mitigation in a matrixed team environment. Implications from the study show that organizational cultural differences present potential barriers for our matrixed organizations.

Government leaders must be more open to making risk estimates and will have better chances of success when they interact with other cultures in a team context. Another government sector challenge was identified by Ammons, who lists “political factors that influence decision making” as environmental barriers to productivity in the government sector (2004, 140). Again, this is evidence that well-thought-out decisions are sometimes derailed by political considerations in the implementation phase. In another recent study, the researchers found that there was a great need for both government and private organizations to focus on knowledge sharing capabilities both internally and externally.
However, they found that government organizations had a much more difficult task in accomplishing this goal, principally due to lower levels of trust. The discussion of multiple perspectives and the sharing of best practices by enabling the integration of competing viewpoints in decision making would be very useful to government leaders (Kim and Lee 2006, 372). With some theoretical background in place, we move to the practical application in decision making models.

Decision Making Models

There are many different models for decision making, some having as many as nine different steps. In a classic review of different models of organizational decision making, Pfeffer classifies organizational decision models into four categories, which he names rational, bureaucratic, organized anarchy, and political power (1981, 236). While this is helpful in understanding the decision making process as a whole, the distinctions between the four models are somewhat unclear and confusing. The Vroom-Yetton model of decision making, developed in 1973, indicates that while decision makers rely on a number of different attributes (up to a total of eight), a majority of decision makers utilize only two or three of the problem attributes when making their decisions (Hill and Schmitt 1977, 366). The rigidity and complexity of some of these models is limiting. Most of them were developed for intra-organizational use and predate the rise of the matrixed organization. While the advent of computer technology has automated some of the models, this change has frequently given rise to more structure, complexity, and limitations. Again, hearkening back to our legacy, Sam Fisher, one of the original Purple Dragons, reported how an overly complex model or process can choke an OPSEC survey effort. Vulture Probe was probably the first attempt at a weighting-criteria model and seemed fine for dealing with linear strategies. When the model was expanded to include multilinear strategies and contingencies, it became overly complex and "the software disappeared into a black hole where such things are stored and probably remains there, somewhere, impossible to retrieve" (Fisher 1988, 3). Another issue is the number of people involved in the decision making process. This is a particular concern with matrixed organizations. While there is some evidence to suggest that the number of participants involved in decisions reduces the quality of the outcomes, the more important factor is the inclusion of motivated contributions by the parties (Brown 1998, 523). The value of teams in an OPSEC environment analysis has already been highlighted, and in an actual experiment oriented around a complex risk analysis, teams using a common structuring approach were more successful than those using an individual structuring approach (Peeples and Lothrop 1993, 3). Additionally, Fisher indicates that he was reluctant to encourage OPSEC surveys by large teams, as it was counter to the Purple Dragon methodology: "The mass experience approach broadened the OPSEC awareness of the participants, but had an adverse effect of establishing a precedent in which group gropes became a common practice" (1988, 3). The double-edged sword effect was evident then and is still a delicate balance factor. Similarly, another study found that while the number of participants involved in fostering ideas for decision making did generate more new ideas, the boundaries imposed on interaction and feedback limited contributions (Mahler 1987, 341).
In other words, there is a practical limit to the number of decision making participants. Certainly the criticality of the problem will affect the decision making process. Trivial or routine decisions do not merit the same attention or analysis as major policy decisions. While systematic methods for weighing probabilities and risk to determine impact are plentiful, they can become constrictive and complex. As an example, one of the most complex decision weighting models was developed by Dr. Charles Kepner and Dr. Benjamin Tregoe, who conducted research on breakdowns in decision making in the United States Air Force's Strategic Air Command in the 1950s. They discovered that successful decision making by leaders had less to do with rank or position than with the logical process used to gather, organize, and analyze information before taking action (Kepner and Tregoe 1965, 28). The resultant models of weighing risks and probabilities of occurrence were applied to the processes dealing with the management and readiness of the nuclear weapons program. While one can argue that there are few if any areas more critical than this one, most organizational decisions do not require the degree of rigor applied to nuclear weapons. The broader application is helping meet the need of organizations whose future leadership must "be oriented to constant change, handling of uncertainty and welcoming innovation" (Riggio 2004, 253). So how do we take those decisions and place them in a mitigation context?

Getting from Qualitative to Quantitative

All of the models attempt to take qualitative data and translate it to quantitative levels. Some models use two factors and some use three or more. Regardless of the model used, good operating definitions and a common understanding are key. In an excellent article on the prioritization of critical program information, Arion N. Pattakos proposes "one of many possible ways to establish a linguistic scale" for the weighting of information (2003b, 42). Exhibit 3 lays the groundwork and is just one way we can take information from various disciplines in an attempt to arrive at a common team language and decision. You will note that the Likelihood (L) column has the traditional engineering descriptors first, followed by a column that has the standard intelligence reporting language.
EXHIBIT 3 – Likelihood, Consequence, Impact Table

Likelihood (L) of the threat occurring:

1 – Remote (1-20%) | Slight Chance = Almost impossible, only a slight chance, highly doubtful (0-9%)
2 – Unlikely (21-40%) | Improbable = Probably not, unlikely, we believe it will not (10-35%)
3 – Likely (41-60%) | Possible = Chances are slightly better than even, chances about even, chances slightly less than even (36-64%)
4 – Highly Likely (61-80%) | Probable = Likely, we believe, we estimate, chances are good, it is probable that (65-89%)
5 – Near Certainty (81-99%) | Near Certainty = Virtually (almost) certain, we are convinced, highly probable, highly likely (90-99%)

Consequence (C) in light of the threat occurring:

1 – Operation or system has little or no vulnerabilities
2 – Operation or system has minor vulnerabilities
3 – Operation or system has moderate vulnerability
4 – Operation or system has significant vulnerability
5 – Operation or system is completely vulnerable or made ineffective

Impact or Risk Level (entered from Exhibit 4 with L and C):

GREEN – Minimal impact to Effectiveness, Schedule, or Cost
YELLOW – Acceptable impact with some minor degradation to Effectiveness or Schedule and less than 5% Cost impact
YELLOW (borderline) – Acceptable impact with significant reduction in Effectiveness or Schedule and 5-7% Cost impact
RED – Unacceptable impact with significant reduction in Effectiveness, a major slip in Schedule, and 7-19% Cost impact
RED – Major changes required to retain Effectiveness, inability to adhere to Schedule, or greater than 20% Cost impact

While the percentages vary somewhat between the two columns, the task is to come to a consensus on a scale of 1 to 5. The team's next task is to determine the Consequence (C) rating, again on a scale of 1 to 5, depending on the team's decision about how vulnerable the operation or system would be when faced with the threat. Once the two factors are determined, exhibit 4 is used to determine the Impact/Risk level. Enter exhibit 4 with L and C to get an Impact/Risk rating, then return to exhibit 3 and use the Impact or Risk Level entries to determine the general path. If exhibit 4 yields a Y1, this indicates a borderline situation between a yellow and red condition. Nevertheless, the team should have a good idea of what must be mitigated and can then begin discussion on countermeasures (how to reduce the threat or vulnerability). This risk cube (exhibit 4) is taken from NAVAIR (2006). Collaboration is the key to making this work.

Probability of Threat (L)
  5 | G   Y   R   R   R
  4 | G   Y   Y1  R   R
  3 | G   G   Y   Y1  R
  2 | G   G   G   Y   Y
  1 | G   G   G   G   Y
      1   2   3   4   5
      Vulnerability (Consequence, C)

G = Green   Y = Yellow   Y1 = Borderline   R = Red

EXHIBIT 4 – Risk Cube (NAVAIR 2006)

The key is using simple tools like these to facilitate working with a common language and to eliminate, or at least reduce, the lexicon confusion on a multi-disciplined team (security, technical, operations, intelligence, communications, etc.). Are there more complex models? Yes, and they are available for use; however, experience has shown that the simpler the model, the less constrained the ensuing discussion will be. The next most difficult part is determining cost. A very good discussion of cost-benefit analysis and of deciding how much to commit to countermeasures was written by Rosemary N. Hutchins (1993) in regard to the On-Site Inspection Agency, and the lessons there are applicable to our risk mitigation process.
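Because the whole point of exhibits 3 and 4 is a repeatable, common-language lookup, the logic transfers easily into a spreadsheet or a few lines of code. The sketch below is a minimal illustration, not part of the original article; the table and function names are hypothetical, but the cube itself is encoded exactly as printed in exhibit 4:

```python
# Hypothetical sketch, not from the article: the Exhibit 4 risk cube as a
# lookup table. Keys are Likelihood (L) ratings 1-5; each row lists the
# Impact/Risk rating for Consequence (C) 1-5.
# G = Green, Y = Yellow, Y1 = Borderline, R = Red.
RISK_CUBE = {
    5: ["G", "Y", "R", "R", "R"],
    4: ["G", "Y", "Y1", "R", "R"],
    3: ["G", "G", "Y", "Y1", "R"],
    2: ["G", "G", "G", "Y", "Y"],
    1: ["G", "G", "G", "G", "Y"],
}

def impact_level(likelihood, consequence):
    """Return the Impact/Risk rating for team-consensus L and C scores (1-5)."""
    if not (1 <= likelihood <= 5 and 1 <= consequence <= 5):
        raise ValueError("Likelihood and Consequence must each be rated 1 to 5")
    return RISK_CUBE[likelihood][consequence - 1]

# A probable threat (L = 4) against a moderately vulnerable system (C = 3)
# yields the borderline yellow/red condition discussed above.
print(impact_level(4, 3))  # prints "Y1"
```

Keeping the cube as explicit data rather than a formula preserves the team consensus it represents: anyone can read the table, challenge a cell, and see exactly why a given L/C pair came back green, yellow, or red.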
VII. Keep it Simple

"As OPSEC professionals, it can be problematic to effectively convey our message to the people we work with – especially to individuals who are so focused on their 'piece of the puzzle' that they won't take time to look at the big picture" (Yates 2006, 17). Leaders sometimes know intuitively what the general answer might be. This comes with experience and crystallized intelligence. In any case, they need the assistance of their team to determine the best specific course of action. Only by assembling a cross-functional team of professionals can they arrive at the best answer that covers all contingencies using fact-based decision making. Intelligence, threat, security, and technical personnel must work together to provide validation for the leader's actions. At a recent threat symposium, one of the speakers touted the phrase, "a risk assumed by one is shared by all; a vulnerability found by one applies to us all." While these are very true and daunting words, only by working in a team-based approach can we ensure risks and vulnerabilities are communicated across organizational boundaries.

One final construct offered here takes both the OPSEC process and the PPP process and presents them in a model that can be communicated to linear thinkers. Recently, when working with a technical team, I used the following model to explain the risk protection process. I had previously assembled the OPSEC process and the PPP process on a presentation slide in a wheel format, with the OPSEC process at the center and the PPP process around it. While it seemed half the audience "got it," when I flashed up the next slide in a format similar to exhibit 5, the rest of the audience could absorb it. As OPSEC professionals we must be able to reach the other half of the audience, so I share this as one of my last points.

EXHIBIT 5 – ACMR Construct (Accumulation, Calculation, Mitigation, Regulation)

1. Identify the critical information or elements needing protection
2. Identify the specific threats to the critical information
3. Forecast the likelihood of the threat occurring
4. Determine how vulnerable the operation or system is in the presence of the threat (Consequence)
5. Calculate the possibility of loss or compromise to the operation or system
6. Risk assessment / risk evaluation
7. Risk decisions, proposed countermeasures, mitigation
8. Specific plans, procedures, and processes aligned with dates and milestones: site-specific plans, standard operating procedures, security classification guides, other methods for capturing information for re-use, and knowledge management systems

Accumulation. Gathering or accumulating the data regarding the critical information and the threat is the first step of the process. These are the key elements to collect.

Calculation. Once the data is collected, the calculation of risk is performed by using a model similar to exhibits 3 and 4.

Mitigation. This is the really tough part: assessing the risk, discussing it with the team, determining prioritized courses of action, then deciding what to do and getting it approved.

Regulation. The final step, where the decided-upon measures and actions are implemented and documented. Only by doing this step correctly can we ensure what we decided will be carried out.
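For readers who want the construct in an even more literal form, the eight steps of exhibit 5 can be written out under the four ACMR phases as a simple data structure. This is an illustrative sketch only; the step-to-phase grouping shown is one plausible reading of the exhibit (the article does not draw the boundaries explicitly), and all names are hypothetical:

```python
# Illustrative sketch only: one plausible grouping of the Exhibit 5 steps
# under the four ACMR phases described above.
ACMR = {
    "Accumulation": [
        "Identify the critical information or elements needing protection",
        "Identify the specific threats to the critical information",
    ],
    "Calculation": [
        "Forecast the likelihood of the threat occurring",
        "Determine how vulnerable the operation or system is (consequence)",
        "Calculate the possibility of loss or compromise",
    ],
    "Mitigation": [
        "Assess and evaluate the risk",
        "Make risk decisions and propose countermeasures",
    ],
    "Regulation": [
        "Capture plans, procedures, and processes with dates and milestones",
    ],
}

# Print the construct in the linear, top-to-bottom form the slide used.
for phase, steps in ACMR.items():
    print(phase)
    for step in steps:
        print("  -", step)
```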
This simple construct is just another framework for thinking about our OPSEC process, and it helps explain the process to those in the crowd or on the team who may process information according to one of the different cognitive styles outlined earlier. Be creative and get your message across in different ways.

VIII. Conclusion

The time-compressed electronic age and its information cornucopia present our decision makers with a real challenge. As OPSEC professionals we need to understand how to sort, refine, and analyze information for our leaders and policy makers. The crux of our five step process is risk mitigation, and the heart of that is analytical decision making. By understanding the complexities of this vital core, we help leaders and managers make efficient and effective decisions. We have examined the history and origin of OPSEC and the basic five step model. The steps were compared and contrasted with the DoD program protection process to learn more about the importance of identifying critical program information and vulnerabilities. The nature of factual decision making and of working in cross-functional, multi-disciplined team environments is inherent in the risk mitigation step of the OPSEC model as well as the program protection model. As OPSEC professionals, we can contribute a wealth of experience and knowledge to this relatively new area and, in doing so, enhance our own growth and development. Examining contemporary literature on decision making and problem solving allows us to open our minds to new techniques and methods and helps us understand complex organizational interplay. The sharing of information across boundaries is vital to our survival. Crafting strategies that foster private and governmental cooperation is necessary (Van Cleave 1993). The need for a common lexicon among the teams who take on these tasks to assemble options for decision makers cannot be overstated. The simpler the model, the more flexible it is in an environment of integrated team analysis, decisions, and mitigation steps. Hopefully, this paper has spurred some critical thinking, for the old hands and the new, on how we as a professional society can progress and help our leaders, managers, and decision makers make better risk management decisions. An OPSEC or PPP program is only as good as its implementation. One of the most insightful articles on the role of the OPSEC professional for those just getting started was only two pages in length, but carefully written by the 2002 OPSEC Individual Achievement Award winner. In it, the author outlines the hats worn and offers ideas for those starting out, but it is also a good read for the more experienced to re-focus on what is really important (Ostermann 2002, 18). Finally, "efforts to make information protection quickly and easily understood should be made – training is a keystone" (Pattakos 1993, 13). In closing, let us consider some sage words from one of our Purple Dragons that remind us of the real focus of our task: "If you consider yourself risk managers working under the imprimatur of Operations Security, you are required to estimate the threat. If you do not know the enemy well enough, how can you possibly 'look at things through the eyes of the adversary?' How can you examine the capability and intent of an adversary? How will you determine what an adversary can do and whether or not he is motivated to do it?" (Johnston 2006, 12).

______________

Samuel V. Crouse, OCP is a Senior Analyst employed by EMSolutions, Inc.
He is a retired USAF Lt Colonel whose career included highly demanding aviation, training, and change management positions. A command mission pilot with U-2R and B-52G experience, he held director-level staff positions in special operations, education, quality implementation, safety, intelligence, and business operations. Sam's experience includes surveys, multi-layer protection planning, training implementation, and program protection risk mitigation. He is in the final stages of a Ph.D. program at the University of Texas at Dallas. Sam is an FAA airline transport pilot rated in the Boeing 747 and a certificated instrument instructor pilot with over 6,000 total flight hours in military and civilian aircraft.

References

Ammons, David N. 2004. Productivity Barriers in the Public Sector. In Public Productivity Handbook, 2nd ed., edited by Marc Holzer and Seok-Hwan Lee, 139-163. New York: Dekker.

Brown, Mary Maureen, Laurence J. O'Toole Jr., and Jeffrey L. Brudney. 1998. Implementing information technology in government: an empirical assessment of the role of local partnerships. Journal of Public Administration Research and Theory 8(4): 499-525.

Defense Acquisition Guidebook. 2004. Washington, DC: Defense Acquisition University.

DoDD 5205.2. 1999. DoD Operations Security (OPSEC) Program. Washington, DC: ASD(C3I).

Ferrill, Howard M. 1993a. Editor's Comments. The OPSEC Journal. First Edition.

Ferrill, Howard M. 1993b. Walking the Tightrope. The OPSEC Journal. First Edition.

Ferrill, Howard M. 2006. OPS – The Beginning. OPS News. Summer Issue: 40.

Fisher, Robert "Sam". 1988. A Long Look. Unpublished article. Permission to cite work obtained through OPMIS.

Goldsmith, Stephen, and William D. Eggers. 2004. Governing by Network: The New Shape of the Public Sector. Washington, DC: The Brookings Institution.

Goodsell, Charles T. 2004. Looking Closer at the Bureaucrats. In The Case for Bureaucracy: A Public Administration Polemic, 4th ed. Washington, DC: CQ Press.

Helms, Richard M. 1993. Economic Counterintelligence: The Role of the U.S. Government in Protecting U.S. Companies from Industrial Espionage. The OPSEC Journal. First Edition.

Hickson, David J., Richard J. Butler, David Cray, Geoffrey R. Mallory, and David C. Wilson. 1989. Decision and organization – processes of strategic decision making and their explanation. Public Administration 67: 373-390.

Hill, Thomas E., and Neal Schmitt. 1977. Individual Differences in Leadership Decision Making. Organizational Behavior & Human Performance 19(2): 353-367.

Howe, Greg. 1993. Maskirova and OPSEC. The OPSEC Journal. First Edition.

Hutchins, Rosemary N. 1993. You Know He is Coming, But You Don't Have to Bake Him a Cake! The OPSEC Journal. First Edition.

IOSS. 2007. What OPSEC Means to You. http://www.ioss.gov/bulletin.html

IOSS. 1993. National Operations Security Doctrine. National OPSEC Advisory Committee Publication Series. January: 4.

Information Sharing: A National Priority. September 2006. http://horizontalfusion.dtic.mil/order/ (3 September 2006), Defense Technical Information Center.

Jackson, P. M., and L. Stainsby. 2000. Managing Public Sector Networked Organizations. Public Money & Management 20(1): 11.

Jelen, George F. 1993. The Nature of OPSEC. The OPSEC Journal. First Edition.

Johnston, William A. 2006. Birth of the Purple Dragon Redux. Excerpts from a speech at the May 2006 Conference. OPSEC Indicator. Winter 2007: 5-13.

Keast, Robyn, Myrna P. Mandell, Kerry Brown, and Geoffrey Woodcock. 2004.
Network Structures: Working Differently and Changing Expectations. Public Administration Review 64(3): 363-71.

Kepner, Charles H., and Benjamin B. Tregoe. 1965. The Rational Manager: A Systematic Approach to Problem Solving and Decision Making. New York: McGraw-Hill.

Kim, Soonhee, and Hyangsoo Lee. 2006. The Impact of Organizational Context and Information Technology on Employee Knowledge-Sharing Capabilities. Public Administration Review 66(3): 370-85.

Kirton, Michael J. 2003. Adaption-Innovation: In the Context of Diversity and Change. London: Routledge.

Mahler, Julianne G. 1987. Structured decision making in public organizations. Public Administration Review 47: 336-342.

NAVAIR. 2006. Risk Assessment Data Base Model. Patuxent River, Maryland.

NSDD No. 298. 1988. National Security Decision Directive, National Operations Security Program. Washington, DC.

Nutt, Paul C. 2006. Comparing Public and Private Sector Decision-Making Practices. Journal of Public Administration Research and Theory 16(2): 289-318.

Ostermann, Brian. 2002. What Every OPSEC Professional Should Know. OPSEC Indicator. Summer 2002: 18-19.

O'Toole, Laurence J. Jr. 1997. Treating networks seriously: practical and research-based agendas in public administration. Public Administration Review 57: 45-52.

Pattakos, Arion N. 1993. Counter-Competitor Intelligence: Keeping Company Secrets Secret. The OPSEC Journal. First Edition.

Pattakos, Arion N. 2003a. Security Support to Acquisition of Weapon Systems: Vital to Success on the Battlefield. Program Manager. March-April Issue: 60-63.

Pattakos, Arion N. 2003b. Guarding the Crown Jewels: Identifying Critical Program Information. Program Manager. September-December Issue: 40-43.

Peeples, Donald R. 1993. Determining Critical and Sensitive Information. The OPSEC Journal. First Edition.

Peeples, Donald R., and Fred Lothrop. 1993. Structuring OPSEC. The OPSEC Journal. First Edition.

Pfeffer, Jeffrey. 1981. Power in Organizations. Marshfield, MA: Pitman Publishing.

Riggio, R. E., and S. S. Orr, eds. 2004. Improving Leadership in Nonprofit Organizations. San Francisco: Jossey-Bass.

Van Cleave, Michelle. 1993. National Security Policy and the Protection of Strategic Economic Information. The OPSEC Journal. First Edition.

Walton, Mary. 1986. The Deming Management Method. New York: Putnam.

Yates, Lynne. 2002. From the Editor. OPSEC Indicator. Summer 2002: 17.

Demystifying OPSEC Assessments: A "How To" Primer
Daryl Haegley, OCP, CCO

(This article originally appeared in Naval Network Warfare Command, Info Domain, Norfolk, VA, May 2007, 25. Reprinted with permission of the author.)

OPSEC Assessment Purpose: Determine susceptibility to adversary exploitation

Operations Security (OPSEC) is commonly defined as the process of denying adversaries information about friendly capabilities and intentions by identifying, controlling, and protecting indicators associated with planning operations or other activities ("Loose Lips Sink Ships"). Integral to the OPSEC process is the requirement to conduct regular OPSEC Assessments. Department of Defense Directive (DoDD) 5205.02, Operations Security, dated 06 March 2006, defines an OPSEC Assessment as "An evaluative process, usually conducted annually, of an operation, activity, exercise, or support function to determine the likelihood that critical information can be protected from the adversary's intelligence." Additionally, Joint Pub 3-13.3, Operations Security, dated 29 June 2006, describes an OPSEC assessment as "an intensive application of the OPSEC process to an existing operation or activity by a multi-disciplined team of experts.
Assessments are essential for identifying requirements for additional OPSEC measures and for making necessary changes in existing OPSEC measures." Assessments are conducted only after an organization has identified its Critical Information (CI). Critical information is defined as "specific facts about friendly intentions, capabilities, and activities vitally needed by adversaries for them to plan and act effectively so as to guarantee failure or unacceptable consequence for friendly mission accomplishment" (Joint Pub 1-02). CI is often referred to as a subset of Essential Elements of Friendly Information (EEFI). For example, an EEFI would be "When will the special operation commence?" and the corresponding CI would be "Saturday, January 6th, 0600." The identification of CI is important in that it focuses the OPSEC Assessment on evaluating the protection of vital information rather than attempting to protect all classified or sensitive information. The list below serves as a good reference to generate a CI list for your organization:

• UNIT CAPABILITIES OR DEGRADATION
• DETAILS OF PLANS, OPERATIONS, ORDERS, OR PROGRAMS
• REFERENCE OF MISSION-ASSOCIATED INFORMATION, SUCH AS PERSONNEL/EQUIPMENT DEPLOYMENT DATES OR LOCATIONS
• SPECIFIC TAD/TDY DEPLOYMENT DATA, INCLUDING PERSONNEL NUMBERS, DURATION, LOCATION, SYSTEMS, ETC.
• SPECIFIC DETAILS CONCERNING TAD/TDY TRAVEL ITINERARIES AND PURPOSES OF TRAVEL BY KEY PERSONNEL
• ASSOCIATION OF ABBREVIATIONS, ACRONYMS, NICKNAMES, OR CODEWORDS WITH PROJECTS OR LOCATIONS
• NEW, PROJECTED, OR EXPANDED SECURE COMMUNICATIONS CAPABILITIES

OPSEC assessments differ from security evaluations or inspections in that an assessment attempts to reproduce an adversary's view of the operation or activity being assessed, whereas a security inspection seeks to determine whether an organization is in compliance with the appropriate security directives and regulations. Essentially, OPSEC assessments enable an evaluation of current OPSEC measure effectiveness. Although OPSEC Assessment findings are not provided to the assessed unit's higher headquarters, Commanders or OPSEC assessment teams may forward generic lessons-learned to senior officials on a non-attribution basis. Lessons-learned from assessments should be shared with command personnel in order to advance the command's OPSEC posture and mission effectiveness. Further, leaders and decision makers are shown the resources required to adequately protect against adversary exploitation. Findings should be labeled and handled at the appropriate classification level (SECRET or CONFIDENTIAL) depending upon vulnerability results. See your Information Security Manager for guidance. COMFLTFORCOM states in its 042111Z Jun 04 message: "Leaders must pursue every effort to ensure that highest OPSEC measures are followed and OPSEC integrity is maintained. Make OPSEC a priority with daily emphasis from senior command personnel to the newest recruit and observe strict adherence to OPSEC in all transactions and/or communication lines to ensure classified or otherwise sensitive information is not inadvertently disclosed." OPSEC Assessment bottom line: OPSEC is emphasized, security is improved, threat awareness is raised, and the mission success rate is increased.
Of note: "Operations Security" is not the same as "Operational Security." The former focuses on protecting unclassified indicators of critical information from the adversary's perspective, while the latter, although not defined in Joint Pub 1-02, is commonly associated with physical protection measures regarding building or network access concerns.

Recommended Assessment Procedures

The steps listed below provide the basic, logical sequence for conducting an OPSEC Assessment and have been used at many Department of Defense (DoD) shore-based commands, Navy ships, and forward-deployed organizations worldwide with consistent, positive results. It is highly recommended that all the steps be read first to gain insight into the entire assessment process prior to its execution. For example, if communications security (COMSEC) monitoring is going to be part of the assessment, scheduling may take several months. Although no specific or unique training is required to administer and conduct an OPSEC assessment, it is assumed that the organization's OPSEC Officer and working group members have completed basic OPSEC education and understand OPSEC fundamentals. If training is required, OPSEC training sources (formal and CBT) are referenced at the very end of this document. Complete each step in the order listed below:

Steps:

1. Complete the "Rate Your OPSEC" survey below to determine the status of your organization's OPSEC program. Upon completion, proceed to step 2.

Rate Your OPSEC

Instructions: Assess your command's OPSEC posture by answering the following questions. Score 10 for a "Yes" response, 0 for a "No" response, and 1-9 for "Progressing" (depending on the degree to which you feel your command has progressed on that question).

1. Does your command have an OPSEC Officer designated in writing?
2. Has the OPSEC Officer received formal OPSEC training or completed the OPSEC 1301 CBT?
3. Does your command have an OPSEC instruction?
4. Has your command conducted an annual OPSEC assessment?
5. Does your command have an OPSEC working group?
6. Is your command's Critical Information available to all personnel for awareness?
7. Does the command have a shred or paper destruction policy?
8. Does the command provide OPSEC training during command indoctrination?
9. As a minimum, does the command provide yearly OPSEC GMT?
10. Does your command utilize OPSEC awareness products (i.e., posters, signs, etc.)?

Total score: _____

Upon score calculation, determine whether your program is satisfactory or requires improvement. Scores greater than 85 represent OPSEC programs that require only minor adjustments. Scores less than 85 require greater emphasis, and concerns should be addressed immediately.

2. In the event you answered "No" to Rate Your OPSEC survey questions (1), (2), (3), (5), or (6), corrective action needs to be taken prior to conducting an assessment. When the survey answers are "Yes" or found to be satisfactory, proceed to step 3.
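Before moving on, note that the step 1 scoring rule reduces to a few lines of code for commands that want to track the survey over time. A minimal sketch, assuming the scoring scheme above (10 for yes, 0 for no, 1-9 for progressing, with 85 as the threshold); the function name is hypothetical:

```python
# Minimal sketch (hypothetical name, not from the article): scoring the
# "Rate Your OPSEC" survey. Each of the ten answers scores 10 ("Yes"),
# 0 ("No"), or 1-9 ("Progressing").
def rate_your_opsec(answers):
    if len(answers) != 10 or any(not 0 <= a <= 10 for a in answers):
        raise ValueError("expected ten answers, each scored 0-10")
    total = sum(answers)
    if total > 85:  # the article's cutoff for a program needing only minor adjustments
        return f"Score {total}: minor adjustments required"
    return f"Score {total}: greater emphasis required; address concerns immediately"

# Example: OPSEC Officer, training, instruction, and working group in place,
# but no annual assessment yet and awareness efforts still progressing.
print(rate_your_opsec([10, 10, 10, 0, 10, 5, 10, 10, 0, 5]))
```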
3. Assemble your working group to determine an appropriate execution timeline for this assessment. To optimize the effectiveness of an OPSEC program or assessment, a comprehensive understanding of relevant processes, activities, business practices, and applicable critical information is required. This is most easily obtained through a working group with at least one representative from each division, department, directorate, etc.; for example, Operations, Communications, Logistics, Intelligence, Administration, and Public Affairs should each include a participating team member. Another benefit is that the working group will consist of subject matter experts with intimate knowledge of routines, inner workings, and potential vulnerabilities. If involved with Information Operations (IO) missions or planning, including Psychological Operations (PSYOP) and Military Deception (MILDEC) representatives will improve the OPSEC working group's impact on mission success. If not already completed, the working group will generate the Critical Information list. It is recommended the events proceed in the following order, including but not restricted to (details of each are broken out further below):

A. In Brief
B. Threat brief
C. Red Team activities
D. Observations, space walk-throughs and dumpster dives
E. Conduct OPSEC interviews
F. COMSEC Monitoring
G. Web Risk Assessment (WRA)
H. Physical and electronic integrity breach
I. Command program review
J. Assessment wrap-up; Plan of Action & Milestones (POA&M)

Below is a generic timeline depicting the general sequence of events:

Timeline
• Verify CI / EEFI (Date)
• Obtain threat assessment from NCIS (Date) – foreign intelligence service collectors, terrorists, criminals
• Identify vulnerabilities / conduct assessment (Date) – evaluate command emphasis, awareness, and training; conduct interviews; emulate the threat: open source discovery, dumpster dives [test physical security, observe routines, monitor comms & network – not performed]
• Assess risk of vulnerability findings (Date)
• Apply / evaluate countermeasures (recommendations) (Date)

Sequence (from roughly 30 days out through Day 5): Threat Assessment (Day -30); Red Team; Observations / Dumpster Dive; Web Risk Assessment; JCMA / Surveillance; Interviews; Command Breach (Day 1); Command Program Review; Evaluation Complete (Day 5).

Sample five-day assessment daily POA&M:

Monday (Day Month Year)
0900 Team leaders muster
0930-1415 Surveillance of building(s); dumpster dives; working group members walk through assigned spaces with checklist
1430 Team leaders muster for debrief

Tuesday (Day Month Year)
0900 Team leaders muster
0930-1415 Surveillance / intrusion of building(s); conduct interviews / space walk-through
1430 Team leaders muster for debrief

Wednesday (Day Month Year)
0900 Team leaders muster
0930-1415 Intrusion of buildings; dumpster dives; policy review; conduct interviews / space walk-through (cont.)
1430 Team leaders muster for debrief

Thursday (Day Month Year)
0900 Team leaders muster
0930-1415 Intrusion team compiles findings for out-brief; policy review (cont.) / compile findings for out-brief; conduct interviews / space walk-through (cont.)
1430 Team leaders muster for debrief

Friday (Day Month Year)
0900 Team leaders muster
0930-1215 Conduct interviews / space walk-through (cont.) / compile findings for out-brief; dumpster dives / compile findings for out-brief
1300 Final Out-Brief (all WG members)

A. Threat brief

The Commander of NETWARCOM recently commented on the persistence of adversarial intent and capability: "The threat vector is 360 degrees, the enemy is ever vigilant probing and collecting 24/7, and our information is constantly at risk, at work and at home. You must be at GQ round the clock." In order to understand what threats are relevant to your organization, obtain a local threat briefing from the organization's intelligence representative or Service investigative branch agent (i.e., the Navy would contact the Naval Criminal Investigative Service [NCIS]).
The presentation will provide actual adversarial intentions and capabilities that need to be emulated in support of the assessment. This brief should be presented prior to the execution phase of the assessment, as it will raise the level of awareness of all personnel. Without this brief, an assessment may focus on erroneous adversary capabilities and portray irrelevant vulnerabilities.

B. Red Team activities

A group of individuals with proper authorities will replicate adversary capabilities as outlined in the threat brief. By simulating malevolent intent via a wide spectrum of institutional or ad hoc methodologies, potential vulnerabilities are usually uncovered. From network penetration to dumpster dives, and from attempts to gain building access without proper identification to monitoring conversations where personnel congregate locally, the Red Team demonstrates the adversary's view. After weaknesses are identified, specific mitigation strategies are developed to prevent exploitation. Before the assessment begins, Red Team members and activities will be identified and approved via a document (otherwise known as a "Get Out of Jail Free Card") signed by the organization's Commander, OPSEC Officer, and Security Officer.

C. Observations, space walk-throughs and dumpster dives

These functions can be conducted by working group members or the Red Team. Through observations, one can identify potential vulnerabilities via visible indicators, predictable patterns, entrance procedures, poor security practices, etc. Dumpster dives reveal how the organization's policy on discarding documentation, classified and unclassified, is actually practiced. Team members will explore discarded contents in workspace and outside containers for disclosures of the organization's critical information (operation or exercise). Even though an organization may not "own" the dumpster at the end of the pier, it is imperative to identify what an adversary will have access to. Immediately inform the information security officer / manager once classified information is discovered. Policy changes are typically recommended based on assessment observation and dumpster dive findings. Use the following list to conduct a space walk-through, and comment on any poor security practices noticed during the walk-through that are not listed below:

Office/Space checked: _____________ Date checked: __________

_____ CI Cue Card (Yellow Card) posted near phone/computer?
_____ Posters posted
_____ Phone stickers on phones and legible
_____ Shredders available and operable
_____ Burn bags available
_____ Personal information in the open/posted
_____ Unoccupied computers logged on
_____ Computer passwords written in the open
_____ Computer screens facing windows
_____ Safes locked when not in use
_____ Cell phones in spaces

Use the following checklist for trash searches:

Trash / Recycle Receptacles or Dumpster location: ________ Date / time checked: __________

____ Privacy Act information, including but not limited to SSNs, addresses, phone numbers, and family information
____ POD / POW
____ Documents related to command, mission, and critical information
____ Supply requests and / or equipment inventories
____ Discarded / unopened mail, whether personal or command-specific
____ Itineraries / VIP schedules
____ Joint / Navy doctrine, publications, and instructions

D. Conduct OPSEC interviews

OPSEC interviews provide a non-attributable means of acquiring insight into potential vulnerabilities that organizational personnel may be aware of, yet tend not to disclose during the course of everyday activities.
The names of the interviewees are NOT disclosed, to facilitate non-attribution. Questions are developed by the OPSEC working group to gain insight into OPSEC awareness and practices. Often the questions reflect the chief concerns of the Commander. Responses are collated and integrated into the out-brief. It is recommended that working group members pair up and interview organizational personnel, preferably not from the interviewer's own division or department. Interviewers from different areas of the organization tend to make those interviewed more comfortable and able to provide honest answers, not the answers they think the organization wants to hear. Optimally, one person interviews an individual while another records the responses. However, other interview options may be used to attain the required insight into the OPSEC posture; for example, one can interview small groups of similar-ranking personnel, personnel from the same division, etc.

Regarding the number of interviews required: depending upon the organization's size, the number of interviewers, and the time allotted, the working group proposes, and the commander decides on, a "representative sample" percentage. As with any survey / polling data, the smaller the sample size, the less accurate the results. Ten (10) percent is usually too small and one hundred (100) percent is often too difficult. If each working group member interviews seventy (70) percent of each division, then a representative sample is readily achieved. As personnel are the key to protecting an organization's critical information, OPSEC interviews are fundamental to understanding their ability to prevent its exploitation. Metrics from interviews are focus-area indicators. Keep the number of questions to ten or twelve. Ask open-ended questions, but grade them as "yes" or "no." That way, data from hundreds of interviews can be captured simply in spreadsheet form. For example, ask, "Explain what OPSEC is and why it is important." Correct responses are marked "yes" and incorrect responses "no."

[The original article displays a slide charting the yes/no percentages, from a combined total of 126 interviews, for each of the following sample interview questions / focus areas: explain OPSEC and its relevance; provide examples of CI / EEFI; identify the location of the CI / EEFI list; sensitive information not publicly discussed; name the command's most relevant threat; public websites can be exploited; name ways to prevent LN collection; recall the last OPSEC training event; list STU / STE vulnerabilities; handling of UNCLAS / FOUO; command shred policy; command cell phone, PDA, and USB drive policies.]
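Because every response is graded simply "yes" or "no," the collation shown on the slide is as easy to produce in a few lines of code as in a spreadsheet. A minimal sketch with hypothetical names and made-up sample data (not the article's 126-interview data set):

```python
# Minimal sketch (hypothetical names and sample data): collating yes/no
# interview grades into the per-question percentages charted on the slide.
# Each interview is recorded as question -> True ("yes") or False ("no").
def collate(interviews):
    """Percentage of interviewees graded 'yes' for each question."""
    questions = interviews[0].keys()
    return {
        q: 100 * sum(record[q] for record in interviews) / len(interviews)
        for q in questions
    }

interviews = [
    {"Explain OPSEC and its relevance": True, "Provide examples of CI / EEFI": False},
    {"Explain OPSEC and its relevance": True, "Provide examples of CI / EEFI": True},
    {"Explain OPSEC and its relevance": False, "Provide examples of CI / EEFI": False},
]

# Questions with low "yes" percentages become the command's focus areas.
for question, pct in collate(interviews).items():
    print(f"{question}: {pct:.0f}% yes")
```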
E. COMSEC Monitoring

Unfortunately, personnel commonly discuss an organization's critical information via unsecure government communications (phones, email, etc.). Army General McKiernan stated in August 2006 that "even when the user turns it off, a wireless device can be remotely turned on to eavesdrop and retransmit conversations, typically within 20 feet of the device. Because there are no external indications of active use, the user will not know that the device has been turned on." If needed, your organization can request and authorize the Joint COMSEC Monitoring Activity (JCMA) to monitor government communications for references to the organization's critical information (the working group provides target information to JCMA). Prior to communications monitoring, it is imperative that personnel are given notice of the proposed monitoring (obtain Legal Counsel approval). Results are typically compiled daily and provided to a single designated individual (i.e., the OPSEC Officer). Findings identify whether or not personnel divulge critical information via unsecure communications modes and are non-attributable: the offender's name is not identified, only the revealing disclosure content.

F. Web Risk Assessment (WRA)

An Al Qaeda training manual recovered in Afghanistan states, "Using public sources openly and without resorting to illegal means, it is possible to gather at least 80 percent of information about the enemy." Justifiably, information posted on an organization's publicly accessible website must be regularly reviewed to ensure it is free of critical information and of information that provides adversarial advantage. Additionally, website material will be analyzed for accumulations of seemingly unrelated topics that, when aggregated, disclose information useful to adversaries. SECDEF stated in an Information Security/Website Alert in August 2006, "All personnel have the responsibility to ensure that no information that might place our service members in jeopardy or that would be of use to our adversaries is posted to websites that are readily accessible by the public." During an assessment, a working group or Red Team member will review the organization's web pages for Critical Information as well as ensure compliance with DoD regulations and instructions. The Navy Information Operations Command Norfolk maintains a cadre of Web Risk Assessment experts and a website (https://[email protected]/operations/wra/wra.shtml) filled with resources (checklists, references, etc.) promoting effective WRA. Findings must be discussed with the Public Affairs Officer.

G. Physical and electronic integrity breach

If applicable, approved, and pre-coordinated, Red Team personnel will attempt to compromise building integrity through attempts to bypass or circumvent physical and/or electronic security measures. The Red Team should never cause physical damage to any property or person while conducting its duties as a simulated aggressor. It is, however, acceptable to leave a mark (i.e., a Red Team sticker) to illustrate the fact that a vulnerability was identified and that compromise or disclosure was probable. Before the assessment begins, Red Team members and activities will be identified and listed on a limited-distribution document (the "Get Out of Jail Free Card"). The following checklist serves as a good reference:

_____ Badges properly checked at Entrance / Quarterdeck
_____ Badges openly worn outside
_____ CO/XO or VIP arrivals/departures repetitive
_____ Building doors secure during / after hours
_____ Outside exit-only doors secured
_____ Cipher locks easily bypassed
_____ Piggybacking occurs (someone holds the door, others enter without swiping a badge)
_____ Shoulder surfing opportunities exist (ease of observing others' PC screens)

Date of intrusion attempt: __________ Building: __________ Areas observed / Areas breached: _____________

H. Command program review

During this portion of the assessment, a designated team member from the working group should review all applicable documentation and procedures related to the organization's OPSEC program. For example, have the OPSEC Officer and working group members obtained current letters of designation? Has training been conducted and documented? Have instructions and standard operating procedures (SOPs) been updated?
Use the checklist below to gauge the adequacy of your program:

____ OPSEC Officer designated via appointment letter
____ Critical Information List (CIL) developed, relevant, and posted near PCs, phones, copiers, faxes, shredders, etc.
____ Assessment results from the previous year (formal and/or informal)
____ Command OPSEC instruction, policy, or plan on file
____ Personal Electronic Device (PED) policy
____ Shred policy

I. Assessment wrap-up; Plan of Action & Milestones (POA&M)

When all assessment activities are complete and the data is compiled for summarization, it is recommended that a PowerPoint brief be built for the Commander's out-brief. The out-brief should include key findings and recommendations for corrective action with specific remediation milestones and designated action officers. This brief should serve as a POA&M template for the working group to identify and track all deficiencies and prepare them for the six-month follow-up report.

4. Based on the above, present the Commanding Officer with an in-brief prior to the assessment and obtain approval to proceed. Proceed to step 5.

5. Request COMSEC monitoring support if required. Due to the many requests for this limited resource, scheduling must be done months in advance. If this resource is not available, continue with the assessment but note its absence during the out-brief. Proceed to step 6.

6. Contact the command Intelligence department (i.e., N2, G2, S2, J2, etc.) or Service investigative branch (i.e., NCIS, OSI, CID, etc.) for a threat brief / analysis of local threat intent and capabilities. Proceed to step 7.

7. Assign team leads for designated portions of the assessment (i.e., dumpster dive, interviews, observations, etc.). Proceed to step 8.

8. Begin the assessment in accordance with your POA&M. After each assessment activity has been executed, proceed to step 9.

9. Upon completion of the execution phase, when all information has been gathered, it is recommended the working group compile a comprehensive report to present findings to the Commander, along with a short PowerPoint brief reflecting those findings and recommendations for corrective action. DoDD 5205.02 directs, "As an operations activity, OPSEC will be considered during the entire lifecycle of military operations or activities" and "Ensure adequate practices are in place to prevent adversaries from taking advantage of and aggregating publicly available information...and other detectable activities to derive indicators of U.S. intentions, capabilities, operations and activities." Conducting an OPSEC assessment via the steps outlined above fulfills this SECDEF requirement.

Sources for OPSEC training are available at the following websites:

IOSS: http://www.ioss.gov
JIOC: http://josc.jioc.smil.mil (SIPR only)
Army: https://www.1stiocmd.army.mil/io_portal/Public/Pages/Pulic_main.cfm
Navy / USMC: https://www.nioc-norfolk.navy.mil/operations/opsec
Air Force: https://www.afiwcmil.lackland.af.mil.opsec/index.cfm

*** Check out Wikipedia's definition of OPSEC. Search wikiHow's "How to Conduct an OPSEC Assessment." ***

____________

Daryl Haegley is the OPSEC Program Manager for the Department of the Navy, Naval Sea Systems Command (NAVSEA). NAVSEA is the third-largest command in the Navy, with over 53,000 employees and a budget of over $30 billion. He currently serves as the National Vice President of the OPSEC Professionals Society and is a recently retired U.S. Naval Officer.
He is certified as an OPSEC Professional (OCP) and as a Confidentiality Officer (CCO).

A Pretext for War – An Insider Threat and an OPSEC Failure
A Book Review by William M. Feidl, OCP

Protecting sources and methods is one of the key security objectives of the intelligence community. In his most recent book, A Pretext for War, James Bamford, author of The Puzzle Palace and Body of Secrets, presents a cogent, if somewhat slanted, perspective on the Iraq war. The Puzzle Palace revealed the inner workings of the NSA, reportedly the largest, most secretive, and best-financed intelligence organization in the world. Body of Secrets took readers inside what is routinely referred to as the ultrasecret agency, charting its deeds and misdeeds from its founding in 1952 to the end of the twentieth century. In this new book, Bamford takes on the entire intelligence community and presents a compelling picture of an inept community effort, bolstered by misdirected political pressures, to provide evidence of Iraqi WMD. You can read any number of reviews of this book, evenly split on its merits. While I personally found the book entertaining and to some degree informative, the level of detail and insight it provided was disturbing. Of course this was no surprise. In his previous books, Bamford demonstrated an excellent capability to elicit sensitive information from "trusted" insiders. The trusted insider is a major dilemma for the U.S. Government. These individuals are recruited for sensitive positions and provided access to the most sensitive US Government secrets. This same group of individuals represents the majority of U.S. persons arrested for espionage and is also believed to be the primary source for media leaks, which occur far too frequently. One can only conjecture how trusted insiders react when they view their thoughts on the otherwise white pages of Bamford's books and other publications notorious for "leaking" sensitive government secrets. One can speculate that the insider's ego is swelling while his anxiety about possibly being identified is soaring. Why is the media leaker important? The answer is relatively simple: the impacts of espionage and media leaks are similar if not identical. It is baffling that we sentence individuals to life for espionage and appear to overlook serious security compromises by media leakers. Many of us cringe as we read some of the blatant security compromises routinely evidenced in the media, be it a technical publication or a widely respected news outlet. Most frequently, we tend to blame the author, who obviously is culpable, but fail to understand that the real offense is much more complex and involves the author, the publisher, the "source" or insider, as well as the lack of accountability. Release of classified and sensitive information is governed by statutory and regulatory guidance, supplemented by senior leadership direction. In spite of this infrastructure, media leaks continue to plague the government. Frankly, it should come as no great surprise that people like to talk to the media about what they have agreed, under penalty of law, not to talk about. Yet they talk and they talk and they talk. The motivations of these insiders are as varied as the personalities we see around us on a daily basis. Interestingly, the commonly accepted motivators for espionage fall into four basic areas identified using the acronym "MICE."[1]

• M – Money. This is generally referred to as greed and is the single most common motivator for espionage.
• I – Ideology. Support for religious or secular beliefs. Extremely situational, but certainly a major factor in espionage.

• C – Compromise. Generally blackmail based on criminal or sexual misconduct. An extremely viable and useful tool which has had tremendous success in the past and will continue to be useful in the future.

• E – Ego. This aspect encompasses a wide range of psychological contributors focused on esteem, e.g., feelings of superiority, inferiority, depression, rejection, and a number of other factors, resulting in a disgruntled employee.

[1] The concept of MICE was a core concept for the Soviet intelligence services. This is a very generalized presentation of MICE, but a review of espionage cases over the years suggests the concept continues to have considerable merit.

It would appear that few trusted insiders do it for money; most do it for ego. There used to be a fairly successful TV show called "I've Got a Secret," where individuals would confess their most innocent and at times embarrassing foibles for the privilege of being on nationwide television. Today we see this demonstrated in the "funniest videos" genre. In the case of a media leak, the insider gains some degree of notoriety while he or she is virtually guaranteed anonymity, because the author / reporter can hide behind the 1st Amendment. We shouldn't have it any other way, but let us also remember that this right comes at a significant cost. There are any number of cases where the media has compromised extremely sensitive technological developments, intelligence operations, and other classified activities. But let's get back to the collector, frequently referred to as a "reporter." Every foreign intelligence service in the world would love to have a James Bamford, Tom Clancy, etc., on its payroll. They are tremendous assets, able to consistently elicit sensitive information from insiders while protected by the 1st Amendment. Good investigative authors and reporters are a valued media asset, but they also represent a major threat to national security, particularly when the publisher is willing to sacrifice US national security interests for the sake of notoriety. Successful collectors like Bamford are rare. They repeatedly demonstrate instinctive analytical capabilities, good research expertise, exceptional social talents, great memories, outstanding writing skills, and a pack rat mentality. Most reporters start their careers using the reputation of the publication and eventually build their own status, developing their own sources, who invariably are trusted insiders. As the stature of individual collectors increases, their ability to access sources becomes straightforward. It is not uncommon for well-known authors / reporters to make appointments with senior leadership at a moment's notice. Many of these meetings are off the record, and at other times certain information is provided as "background" and published without attribution. But the success of these collectors hinges on access to the "trusted" insider. As frustrating as it may be, the lack of OPSEC awareness and appreciation for the national security infrastructure on the part of trusted insiders, who obviously hold top-level security clearances, is apparent in media leaks. In spite of direction from the most senior levels of government, certain people obviously think it is their divine right to disclose classified US Government information. After all, Bamford is not a Russian or Chinese intelligence officer, and therefore talking to him is fair game.
More importantly, the collector is a U.S. citizen, and thus disclosing sensitive information to him or her is not the same as espionage. There is also the issue of accountability: when was the last time an individual was called to account for a media leak? Are you following this convoluted thought process? The fact is that security is not an issue when the ego says the individual is more important than the government's right to protect its secrets. Individual contributors to the media may think of themselves as patriots, but when it is done and over, they are just as guilty of espionage as Hanssen, Ames, Cavanaugh, etc. Their ability to put idealism ahead of national security is a weak argument which needs to be recognized for what it is: espionage. I recommend Bamford's book, but as you read it, I suggest you consider who provided the background information and what his or her motivation was. You should also ask, "When is the US government going to become serious about OPSEC and the identification of the insider threat?" More importantly, consider the consequences of compromising US national security secrets.

James Bamford, A Pretext for War, Doubleday, 2004. ISBN 0-385-50672-4

__________

William M. Feidl is certified as an OPSEC Professional (OCP). He is a charter member, past member of the National Board of Directors, and permanent member of the Standards Committee of the OPSEC Professionals Society.

The OPSEC PROFESSIONALS SOCIETY
PO Box 150515, Alexandria, VA 22315-0515
http://www.opsecsociety.org or https://opmis.xsp.org