Impacts of information system vulnerabilities on society
by LANCE J. HOFFMAN
The George Washington University
Washington, D.C.
ABSTRACT
After briefly presenting examples of potential vulnerabilities in computer systems on which society relies, the paper introduces the concept of risk analysis and applies it to a simplified model of a nation's financial system. A sampling of specific technical safeguards to ameliorate the risk in this (or any) computer system is then given. The paper concludes with examples of questions to be asked before committing to any new technological system.
INTEGRATED INFORMATION SYSTEMS
As computer applications in many countries have become increasingly sophisticated, the operators of these systems have grown concerned about the unforeseen consequences of total reliance on them and increasingly aware of these systems' vulnerability to failure.
Society's dependence on the uninterrupted operation of
large information systems has been increasing [13]. As these systems have grown larger, more complex, and more centralized,
the potential societal loss from their failure or misuse has also
increased. When society becomes highly dependent on the reliable functioning of a single integrated technological system, or on a small collection of such systems, a "domino-like" collapse of several of the connected units becomes possible and could be disastrous. The failure of the Northeast power grid in 1965, which blacked out much of that section of the United States (including all of New York City), is an example.
Other risks may be in the form of economic losses, such as
the failure of an automated check-clearing system or a national automated securities market. Banks or brokerage houses
could be severely damaged in a matter of minutes, long before
it was discovered that the system had failed. The potential
victims would be the owners of the failed system, individuals
with accounts, correspondent organizations, and (if the failure cascaded through other institutions) all of society.
Still other risks may entail social costs, such as would occur if a centralized criminal history system or an electronic funds transfer (EFT) payment system were misused by a government or a private firm to exert undue control over individuals [2].
Large, centralized systems are not all bad; in addition to
cost and functional advantages, a large nationally networked
information system may provide greater availability than a
single-site system by supplying instant backup to nodes that
fail. However, there may also be a greater risk that the entire
system will collapse if an unlikely or unexpected combination
of events occurs. While the likelihood of this may be very low,
the consequences might be extremely damaging in terms of
physical, economic, and/or social costs.
How can we estimate the potential risks of such catastrophic
events? To do this, we turn to the field of risk analysis: the study of estimating loss from adverse events.

RISK ANALYSIS

Risk analysis often is used to estimate the exposure of system components to various threats and to allocate resources to provide both technical and nontechnical safeguards against those threats [3, 15, 19, 20, 21].
We prefer inexact tree-based risk analysis methods [4, 14, 16] which use linguistic terms such as "high," "medium," and "low." One can use fuzzy set theory [17] or other means to implement these and to allow the estimator to provide a degree of confidence for the estimates; this can then be taken into account when risk is computed. The computed results are then less prone to instill false confidence, because they are modified to reflect estimator confidence in the inputs and given in words rather than in numbers. We also believe that linguistic estimates are more useful for subjective risk analyses which deal with human error or social risk. There are myriad problems in obtaining numeric input data [15]. Nonnumeric input data are easier to obtain and allow the use of structural risk analysis techniques that force the disclosure of assumptions in the input data.
Tree-based risk analysis has recently been made available in computer systems, making sensitivity analysis (asking "what if" questions) much easier and less expensive than with many other systems.
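To make the flavor of such linguistic computation concrete, the short Python sketch below combines a likelihood estimate and a severity estimate, each expressed on a three-point linguistic scale, into a linguistic risk value via an explicit rule table. The scale and the table are illustrative assumptions only, not the specific method of the inexact-analysis papers cited above.

```python
# A minimal sketch, assuming a three-point linguistic scale.  The rule
# table is an illustrative assumption, not the method of [4], [14], or [16].

SCALE = ("low", "medium", "high")

# (likelihood, severity) -> risk.  Roughly: risk is high only when both
# inputs are substantial, and low when either input is low.
RISK_TABLE = {
    ("low", "low"): "low",          ("low", "medium"): "low",
    ("low", "high"): "medium",      ("medium", "low"): "low",
    ("medium", "medium"): "medium", ("medium", "high"): "high",
    ("high", "low"): "medium",      ("high", "medium"): "high",
    ("high", "high"): "high",
}

def risk(likelihood: str, severity: str) -> str:
    """Combine two linguistic estimates into a linguistic risk value."""
    return RISK_TABLE[(likelihood, severity)]

print(risk("medium", "high"))   # -> "high"
```

Because every entry of the table is visible, the assumptions behind the combination rule are disclosed rather than buried in numeric weights; an estimator's stated degree of confidence could be carried along in the same fashion.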
A TREE-STRUCTURED MODEL FOR ONE POTENTIALLY VULNERABLE SYSTEM: A DEPOSITORY FINANCIAL SYSTEM
A simple model which treats separately paper, "old electronic" (wire transfers, etc.), and electronic funds transfer transactions of a depository system is shown in Figure 1. Figure 2
treats that depository system as a subsystem of a financial
system involving a number of financial institutions. Figure 3
treats the world as a system composed of several interrelated
financial systems.
Figure 1-Tree structure of a depository system for funds movement (paper transfers, "old electronic" transfers, and electronic funds transfers (EFT))

Figure 2-Possible tree structure of a financial system

Figure 3-Possible tree structure of the world financial system (financial systems of Nations 1 through N)

Figure 4-Detailed tree structure for simplified model of world financial system. (Broken lines are not part of the tree structure; they indicate significant interdependencies.)

Obviously the model here is simplified; in particular, the
dotted lines in Figure 4 indicate just some of the large flows of information which constitute threats and which must be taken into account in any risk analysis but which are not considered here. In addition, we do not address here (although a larger tree could) non-operational problems, such as a potential limited blockade against the import of spare parts for computers [13]. Nevertheless, we hope our simplified model bears enough resemblance to reality to highlight basic concepts, and that other tree-based models can be suggested for societal institutions so that tree-based risk analysis can be used in assessing the vulnerabilities of these institutions.
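As an illustration of how such a tree might be represented for machine processing, the following Python sketch encodes the skeleton of Figures 1 through 3. The node type, the weights, and the leaf estimates are hypothetical choices made for this example.

```python
# A sketch of the tree model of Figures 1-3; the Node type, weights,
# and leaf estimates are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    weight: str = "medium"           # linguistic importance of this subtree
    likelihood: str | None = None    # leaf input: likelihood of failure
    severity: str | None = None      # leaf input: severity of loss
    children: list["Node"] = field(default_factory=list)

# Figure 1: a depository system for funds movement.
depository = Node("Depository System for Funds Movement", children=[
    Node("Paper Transfers", likelihood="medium", severity="medium"),
    Node("Old Electronic Transfers", likelihood="medium", severity="medium"),
    Node("Electronic Funds Transfers (EFT)", likelihood="medium", severity="low"),
])

# Figures 2 and 3: national financial systems under a world system.
world = Node("World Financial System", children=[
    Node("Financial System, Nation 1", weight="low", children=[depository]),
    Node("Financial System, Nation 2", weight="high",
         likelihood="medium", severity="medium"),
    Node("Financial System, Nation 3", weight="low",
         likelihood="medium", severity="medium"),
])
```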
EXAMPLE RISK ANALYSES OF THE FINANCIAL
SYSTEM
We now consider the financial system model outlined above
and use it to evaluate vulnerability risks for a "low risk"
scenario and a "high risk" scenario. The estimates given are
meant to be illustrative only; each reader can supply his or
her own. We are only attempting to give examples of how risk
analysis can be used to evaluate the potential vulnerabilities of
financial systems; we make no claims about the validity of the
specific data or computations used.
Low Risk Scenario
Figure 5 illustrates the low risk scenario. Here we consider
the relative weights (or importance) of the financial systems of
Nation 1, Nation 2, and Nation 3, respectively, as LOW,
HIGH, and LOW (with respect to the risk associated with the
higher level World Financial System).
At the bottom level of Figure 5, the likelihood of failure and severity of failure for the various subsystems are input data, and risks computed from these are shown in the top half of some boxes in Figure 5. Risks for subsystems can also be estimated; these are shown in italics in the top half of some boxes in Figure 5. Given these data, which are estimated or computed from input data, risks for higher level subsystems can be derived [4, 14]. Derived risks for (the higher level subsystems of) Figure 5 are shown in CAPITAL LETTERS in the top half of some boxes in Figure 5. Details on possible risk computation methods are given in Schmucker [14]; in essence, we consider some of these analogous to weighted sums, but using nonnumeric data.
In this "low risk" scenario, the systems with the highest
associated risks are the financial systems of Nation 1 and
Nation 3.
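In the same vein, the derivation of higher level risks can be sketched as a bottom-up traversal of the tree defined in the previous sketch. The ordinal encoding of the linguistic terms and the rounding rule are assumptions made for illustration; Schmucker [14] describes fuzzy-set treatments that preserve more information.

```python
# Continuing the Node sketch above: derive risks bottom-up, in the
# spirit of "weighted sums, but using nonnumeric data."  The ordinal
# encoding and rounding are illustrative assumptions, not the exact
# computation of [4] or [14].

ORD = {"low": 0, "medium": 1, "high": 2}
TERMS = ["low", "medium", "high"]

def node_risk(node: Node) -> str:
    if not node.children:
        # Leaf: average the ordinal positions of likelihood and severity,
        # rounding halves upward, and map back to a linguistic term.
        return TERMS[int((ORD[node.likelihood] + ORD[node.severity]) / 2 + 0.5)]
    # Interior node: weight-adjusted average of the children's derived risks.
    total = wsum = 0.0
    for child in node.children:
        w = ORD[child.weight] + 1            # low=1, medium=2, high=3
        total += w * ORD[node_risk(child)]
        wsum += w
    return TERMS[int(total / wsum + 0.5)]

print(node_risk(world))                      # -> "medium" for the data above
```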
Higher Risk Scenario
Risks computed using slightly different data (the weights of
the subsystems of the Depository System of Nation 2 have
been juggled, and the severity of a loss due to the EFT subsystem has been changed from "LOW" to "HIGH") are shown in Figure 6. Given this scenario, the system with the highest associated risk is now the financial system of Nation 2, and within it, the greatest contributing subsystem is the EFT subsystem.

Figure 5-Financial system low risk scenario (each subsystem box carries a weight, a likelihood of failure, and a severity of loss; the estimates range from Low to High, with most entries Medium)

Figure 6-Financial system higher risk scenario (same structure as Figure 5, with the weights juggled and the severity of loss for the EFT subsystem raised to High)

INTERCONNECTION OF INFORMATION SYSTEMS AND JOINT VULNERABILITY

Due to the complexities of real-world systems, the subsystems are interdependent (as shown by the dotted lines in Figures 4, 5, and 6). A well-publicized failure in one EFT system may subtly influence vulnerabilities in other EFT systems (for example, by inviting attempts against similar systems). If, in turn, vulnerabilities are successfully exploited in a relatively large number of EFT systems, this may affect the depository system of a country. Our simple model above does not reflect many relationships; for example, it does not show the effects of an increase of vulnerabilities in one country on vulnerabilities in another country.

We've shown in the preceding pages how one can structure a simplified model of the world financial system as a tree and separately evaluate the risks to it and to each nation's financial subsystem. We can also insert our world financial system of Figure 4 into a larger World Information System (Figure 7).

Figure 7-World information system (subsystems: financial, telephone, electronic mail, criminal justice information, air traffic control, and other information subsystems)

After weights and estimates of severity and likelihood are added to Figure 7, we can perform a sensitivity analysis to determine the one or two most critical subsystems. Then we can focus our efforts to protect the overall system by strengthening the most critical subsystems using non-technical and technical safeguard methods.
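A crude form of the sensitivity analysis described above can likewise be sketched in a few lines: perturb one leaf estimate at a time, recompute the derived top-level risk, and flag the subsystems whose perturbation changes it. This continues the hypothetical Node sketches from the preceding sections.

```python
# Continuing the earlier sketches: a crude "what if" sensitivity
# analysis.  Bump each leaf's severity one step worse, recompute the
# top-level risk, and report the leaves whose change moves it.

def leaves(node: Node):
    if not node.children:
        yield node
    for child in node.children:
        yield from leaves(child)

baseline = node_risk(world)
for leaf in leaves(world):
    original = leaf.severity
    if original != "high":
        leaf.severity = TERMS[ORD[original] + 1]   # one step worse
        if node_risk(world) != baseline:
            print("critical subsystem:", leaf.name)
        leaf.severity = original                   # restore the estimate
```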
AMELIORATING SECURITY PROBLEMS IN COMPUTER SYSTEMS
We can fairly easily determine the greatest contributors of risk [5] and then assign safeguards at the appropriate places in a system's tree. Applying appropriate countermeasures (for example, adding additional procedural and administrative controls)
might transform the risk indicators of Figure 6 to those of
Figure 5.
In addition, one can always use countermeasures which are
external to the system under discussion: additional legal safeguards, an improved physical security program to deter
threats, additional training of auditors to detect variances
from the norm, etc. Preventive action can take a number of
forms in computer systems, but the reader should keep in
mind that the majority of computer security problems are not
resolved by technical security measures such as those we are
discussing here. Rather, they are resolved by the more traditional non-technical solutions: physical security, legal and administrative restrictions, and policies and procedures. These
non-technical methods are generally well known to the managerial and audit community [6]. Because technological security measures are less commonly known, we shall here mention a couple of the most useful. Many more are described in much more detail [7, 8, 18].
SOME TECHNOLOGICAL COMPUTER SECURITY
MEASURES
Computer systems can identify and authenticate users by various methods. The use of an identification number or a user's
name together with a computer account number and a password is typical. Computer systems can also control access by
levels. Top managers, for example, can be granted access to
more information than lower level personnel. Computer systems can also differentiate based on category (need to know);
users having only a need for financial data can be prevented
from access to medical data. When attempts to access information in computer systems are denied, a log of information
about the attempt can be recorded for later analysis.
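A minimal sketch of such level-and-category access control, with denied attempts logged, is given below. The user records, level names, and policy are hypothetical, invented for this example.

```python
# A minimal sketch of access control by level and by category ("need to
# know"), with denied attempts logged for later analysis.  All names,
# levels, and categories here are hypothetical.
from datetime import datetime, timezone

LEVELS = {"clerk": 1, "manager": 2, "top_manager": 3}

USERS = {
    "alice": {"level": "top_manager", "categories": {"financial", "medical"}},
    "bob":   {"level": "clerk",       "categories": {"financial"}},
}

audit_log = []   # records of denied access attempts

def can_access(user: str, required_level: str, category: str) -> bool:
    u = USERS[user]
    ok = (LEVELS[u["level"]] >= LEVELS[required_level]
          and category in u["categories"])
    if not ok:
        # Record the denied attempt for later analysis.
        audit_log.append((datetime.now(timezone.utc), user,
                          required_level, category))
    return ok

print(can_access("bob", "manager", "medical"))    # False: level and category both fail
print(can_access("alice", "manager", "medical"))  # True
print(len(audit_log))                             # 1 denied attempt recorded
```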
Another method of protection is the use of cryptography
for storage and transmission of information. Plaintext information is encrypted by a transmitting device. The encrypted
message, or ciphertext, is transmitted across a relatively insecure medium (such as a communication line) to a receiving
device. Only if the human sender and the human receiver
possess the same (secret) digital key is the information intelligible to the receiver (Figure 8). Otherwise, only unintelligible ciphertext is seen.
Figure 8-Cryptographic system (User A's device encrypts with the shared secret key; User B's device decrypts with the same key)
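In present-day terms, the shared-key pattern of Figure 8 might be sketched with the Fernet construction from the third-party Python "cryptography" package; Fernet is a modern authenticated symmetric scheme standing in here for whatever cipher hardware the devices would actually use.

```python
# A sketch of Figure 8's shared-key pattern, using the Fernet scheme
# from the third-party "cryptography" package as a modern stand-in for
# the cipher hardware discussed in the text.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # the secret key both users must hold
sender = Fernet(key)
receiver = Fernet(key)

ciphertext = sender.encrypt(b"transfer 100 to account 42")
# ...the ciphertext crosses the insecure medium...
plaintext = receiver.decrypt(ciphertext)
assert plaintext == b"transfer 100 to account 42"

# A party holding a different key sees only unintelligible ciphertext:
# Fernet(Fernet.generate_key()).decrypt(ciphertext) raises InvalidToken.
```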
One possible cipher is now a U.S. national standard [9], mandated for use by all civilian applications of the U.S. Government which require cryptography and supplied as hardware by several U.S. manufacturers. Another possible cipher technique is public key cryptography [10]. It also has the capability of transmitting signatures in such a way that it is very difficult for someone to send a message and then later disavow it.
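The signature capability mentioned above can likewise be sketched with standard RSA signing from the same third-party package; the key size and padding below are conventional present-day defaults, not choices drawn from this paper.

```python
# A sketch of a public-key signature supporting non-repudiation, using
# RSA from the third-party "cryptography" package; parameters are
# conventional modern defaults.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

message = b"I authorize this payment."
pss = padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                  salt_length=padding.PSS.MAX_LENGTH)
signature = private_key.sign(message, pss, hashes.SHA256())

# Anyone with the public key can check the signature, so the sender
# cannot later disavow the message; verify() raises InvalidSignature
# if the message or signature was altered.
public_key.verify(signature, message, pss, hashes.SHA256())
print("signature verified")
```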
DESIGN PRINCIPLES FOR COMPUTER SECURITY
There are some generally accepted principles to follow in the
secure design of computer systems. Two of the most important
are early critical review and user acceptability.
1. Early critical review

Exposing the frailties of a system to a number of bright people during the planning stages will facilitate corrections before the system is implemented; it is better to have bugs discovered early by invited commentators rather than later on by intruders.

2. User acceptability

The human interface must be simple, natural, and easy to use; if not, users will bypass it, thus rendering the security system ineffective.

SOME WORRISOME QUESTIONS
System designers and policy makers should ask some questions before committing themselves to any new technological systems. The answers will not always be easy to find, but the questions will not go away. The following are examples of such questions:
• How can we balance the risks society may encounter
against the benefits it may receive, under conditions
where failure rates appear to be relatively low, but potential losses may be high?
• How can society retain the option to end its dependence
on a particular technology if it has unanticipated, undesirable effects? How can it avoid becoming "locked in"
to the use of an information system forever?
• How can society provide alternatives to persons choosing
not to use or live under a given system?
ACKNOWLEDGMENTS
Discussions with Fred W. Weingarten of the U.S. Congress,
Office of Technology Assessment (OTA) and Information
Policy, Inc., and with Zalman Shavell of OTA were very
helpful in developing some of these concepts. Hans Peter
Gassman provided some of the initial inspiration for writing
this by inviting an earlier paper on the topic for a workshop
sponsored by the Organization for Economic Cooperation
and Development. Kathy Hoffman ruthlessly pointed out unnecessary jargon. However, any responsibility for conceptual
or other errors is solely that of the author.
REFERENCES
1. Sweden, Ministry of Defence, The Vulnerability of the Computerized
Society-Considerations and Proposals, 1980.
2. U.S. Congress, Office of Technology Assessment, Computer-Based National Information Systems: Technology and Public Policy Issues, September 1981.
3. U.S. National Bureau of Standards, Guidelines for Automatic Data Processing Risk Analysis, FIPS PUB 65, August 1979.
4. Hoffman, L. J., and L. A. Neitzel. "Inexact Analysis of Risk." Computer Security Journal, Vol. 1, No. 1 (Spring 1981), Hudson, Mass.
5. Meadows, C. Identifying the Greatest Contributor to Risk in a Tree Model.
The George Washington University, Department of Electrical Engineering
and Computer Science, Research Report GWU-EECS-81-09, May 1981.
6. Martin, James. Security, Accuracy, and Privacy in Computer Systems. Englewood Cliffs, N.J.: Prentice-Hall, 1973.
7. Hoffman, L. J. Modern Methods for Computer Security and Privacy. Englewood Cliffs, N.J.: Prentice-Hall, 1977.
8. Fernandez, E. B., R. C. Summers, and C. Wood. Database Security and Integrity. Reading, Mass.: Addison-Wesley Publishing Co., 1981.
9. U.S. National Bureau of Standards, Data Encryption Standard, FIPS PUB 46, Washington, D.C., 1977.
10. Hellman, M. E. "An Overview of Public Key Cryptography." IEEE Communications Society Magazine, November 1978, pp. 24-32.
11. Denning, D. E. "Are Statistical Data Bases Secure?" AFIPS Conference Proceedings of the National Computer Conference, Vol. 47, 1978, pp. 525-530.
12. Denning, D. E., and J. Schlörer. "A Fast Procedure for Finding a Tracker in a Statistical Database." ACM Transactions on Database Systems, Vol. 5, No. 1, March 1980, pp. 88-102.
13. Tengelin, U. "The Vulnerability of the Computerised Society." In Proceedings of the High Level Conference on Information, Computer, and Communications Policies for the 80's, Organization for Economic Cooperation and Development, Paris, 1980, pp. 359-377.
14. Schmucker, K. "Fuzzy Sets, Natural Language Computations and Risk Analysis." The George Washington University, Department of Electrical Engineering and Computer Science, Research Report GWU-EECS-81-10, May 1981.
15. Okrent, D. (Ed.). "Risk-Benefit Methodology and Application: Some Papers Presented at the Engineering Foundation Workshop," September 22-26, 1975, Asilomar, Calif. School of Engineering and Applied Science, University of California, Los Angeles, Report UCLA-ENG-7598 (NTIS PB-261920), December 1975.
16. Hoffman, L. J. "Tree-Based Risk Analysis Using Inexact Estimates." Report GWU-IIST-81-14, Department of Electrical Engineering and Computer Science, The George Washington University, Washington, D.C., June 1981.
17. Zadeh, L. A., K. S. Fu, K. Tanaka, and M. Shimura (Eds.). Fuzzy Sets and Their Applications to Cognitive and Decision Processes. New York: Academic Press, 1975.
18. Denning, Dorothy. Cryptography and Data Security. Reading, Mass.: Addison-Wesley, 1982.
19. Okrent, D. "Risk-Benefit Evaluation for Large Technological Systems." Nuclear Safety, Vol. 20, No. 2, March-April 1979, pp. 148-164.
20. Lowrance, W. W. Of Acceptable Risk: Science and the Determination of Safety. Los Altos, Calif.: William Kaufmann, Inc., 1976.
21. Rowe, W. D. An Anatomy of Risk. New York: John Wiley and Sons, 1977.