Klein 2011 - Duke University

Chapter 2: How I Learned to Stop Worrying and
Love Satisficing
Judy L. Klein’s contribution to How Reason Almost Lost Its Mind: The Strange Career of Cold
War Rationality. Do not cite or reproduce without the author’s permission.
The US Air Force looked to linear programming to yield an optimum allocation of
resources in detailed operations such as the Berlin Airlift and in plans for a general war, yet the
mathematical reach for best decisions exceeded the computational grasp.
Frankfurt/Rhein-Main 1948: On his seventieth day on the LeMay Coal and Feed run,
Lieutenant Fred V. McAfee, a self-described flour and dehydrated potatoes man who “was not
above hauling macaroni,” iterated his very regular life:
They briefed me on the basic pattern to follow and I’ve done it so many times
since, I could repeat their lecture. I was assigned a 6,500-foot altitude. I had ten
tons of flour, used 2700 rpm and forty-five inches of manifold until I broke ground
and then I throttled back to 2500 rpm and forty inches. At 500 feet I dropped her
back again, this time to 2300 rpm and thirty-four inches. I climbed on a course of
180 degrees for seven minutes, at an indicated speed of 160 miles an hour, leveled
off at my assigned altitude, where my power settings were reduced still again to
2050 rpm and twenty-nine inches. My indicated air speed was 170 miles an hour.
I’d picked up Darmstadt beacon. Presently it was Aschaffenburg, then Fulda’s range
and presently I was in the corridor [see Figure 1 for the radio beacons that guided
planes to the air corridors]. I let down over Tempelhof at 170 miles an hour, pushed
her in and watched them unload. Then back here. Then Berlin. And so it’s been for
seventy days. A very regular life.1
The regularity of McAfee’s life was a product of an assembly-line style of scientific
management that the US Air Force used in Operation Vittles to get essential supplies to the
blockaded zones of western Berlin. The management methods used for the logistics of the airlift
harked back to the early twentieth century time studies of Frederick Winslow Taylor and motion
1 Lt. Fred McAfee quoted in Paul Fisher, “The Berlin Airlift,” The Beehive, United Aircraft
Corporation 23 (1948): 14-15.
studies of Frank and Lillian Gilbreth. A group of applied mathematicians in the Pentagon was,
however, also using the Berlin airlift to construct a new management science based on the
mathematical programming of military operations. As we will see, that new combination of
mathematical model and algorithmic solution protocol helped put the “science” into both
management science and economic science. While holding out the promise of an optimal
allocation of resources in the absence of any market, the programming protocols also nurtured
Nobel Prize-winning musings on bounded rationality. But first back to Berlin.
The shattered, occupied, and zoned city of Berlin was a major, if not the key, battleground
for the Cold War. There were no hot war battles between the two superpowers in Berlin, but it
was a setting for their brinkmanship, crises, and settlements. In 1944, a year before the end of
World War II, allied powers in the European Advisory Commission determined the zones in
Germany and Berlin that after the defeat of the Nazi government would be occupied and
controlled by the Soviet Union, the United States of America, and Britain. The Yalta conference
in 1945 accorded France sectors carved out from former US and British planned jurisdictions.
Berlin was surrounded by the land intended for Soviet occupation, but the Commission
established the traffic routes by which American, British, and French garrisons in the respective
Berlin sectors could be supplied: a twenty-mile-wide air space for each of three air corridors, and
a railway line from Helmstedt and a highway from Marienborn. After Germany’s surrender on 8
May 1945, the zoned occupation proceeded according to plan.
In the spring of 1948, the American, British, and French occupying forces took steps
toward a more unified and autonomous western Germany, including issuing new currency for
their combined zones in western Germany. The Soviet Union, worried about the potential for the
strengthening of a separate German state, responded with increased restrictions on travel to
Berlin and insisted that the currency of the Soviet zone be the sole currency for Berlin. On the
23rd of June the western allies announced their plan for issuing their new Deutsche Marks in the
French, British, and American-occupied zones of Berlin. That same day, the Soviet authorities
issued the East German Mark (also called the Deutsche Mark, colloquially referred to as the
Ostmark) with the intention of it being the currency for all of Berlin and began a complete
blockade of road, rail, and river traffic to and from the American, British, and French sectors of
Berlin as well as cutting off key sources of electrical power. 2
Britain, France and the US were faced with the choice of withdrawing from Berlin or
remaining at a high cost. The US government decided “to remain in Berlin, to utilize to the
utmost the present propaganda advantage of our position to supply the city by air and, if the
restrictions continued, to protest to the Soviets and keep the Berlin situation before World
attention.”3 On Sunday 26 June 1948 the Berlin airbridge (Luftbrücke) with USAF Operation
Vittles and the British Operation Plane Fare began. The American and British airlifts were
ultimately successful in getting sufficient supplies to the western zones of Berlin for the 2.5
million people who lived there.4 On 12 May 1949 the Soviet Union lifted the road blockade in
exchange for an agreement to a meeting of the respective four foreign ministers. In order to build
up a cushion for possible future stoppages, Operation Vittles continued until the last official
flight on 30 September 1949.

2 The details of the daily diplomatic to and fro of the Berlin crisis are spelled out in a formerly
top secret Department of State research report:
3 The Berlin Crisis. 1948. US Department of State, Foreign Policy Studies Branch, Division of
Historical Policy Research, Research Project No. 171. Washington, DC, 5.
4 The French also had a few planes helping out, but US and British planes carried most of the
material for the French zone. At one point there was a dispute between the American air crews
and the French when the former balked at hauling wine in. The outraged French “sent a
delegation armed with their dietary history through all times. Their chief contention was that
wine was to them equally as important as potatoes of a German, bread to a Russian, or ketchup to
a Texan.” (Fisher, Berlin Airlift, 9).
2.1 Operation Vittles
The ultimate success of Operation Vittles channeled resources to the Air Force and
enhanced the relative status of that newly-autonomous branch of the military as well as the
already high status of General Curtis LeMay, the commander of the USAF in Europe when the
airlift began. In an interview a few months into the airlift LeMay remarked,
To me, nothing points up the need for modern equipment as the airlift does. We’ve
gotten almost no new cargo or transport aircraft since the war ended....Finally we
know this: No matter what occurs in the Berlin situation, we’ll have to keep an
airlift organization in being. Not only in being, but ready….Frankly, I’m pleased
by the way this whole problem was met head-on. It was hard work, and it will
continue to be hard work. And we’ll be profiting by this experience for many
years.5
A few days after LeMay made these comments to a visiting journalist in September 1948
in Wiesbaden, the Air Force appointed him as chief of the Strategic Air Command. LeMay
assumed leadership of the SAC on 19 October 1948 and remained in that key Cold War position
until June 1957. It was not just LeMay, however, who profited from the stunning success of
Operation Vittles. The great expense and complexity of the Berlin airlift capitalized research that
was already underway in the Air Force to optimize program planning for the rational allocation
of resources. With this research, management science in the Air Force and subsequently in US
5 LeMay quoted in Fisher, Berlin Airlift.
industry went from the time and motion studies for improving singular output with given inputs
to the computer-enhanced optimal allocation of alternative inputs to alternative outputs.
The journalist who interviewed LeMay described the General’s approach to the air bridge
problem: “break the operation down into components, select men to head the components that
were experts in their field, then weld the whole show into an assembly line.”6 For most of
Operation Vittles, the commanding general for organizing the assembly line was Major General
William Tunner. From the first day of his command on 29 July 1948, Tunner measured,
evaluated, and dictated the procedures for loading, flying, and unloading of the aircraft.7 The
timing of every flight was recorded and physical models of the layout of airfield facilities
enabled the study of bottlenecks and simulations for improving operations (see Figures 2 and 3).
Under Tunner’s discipline a typical six-hour block (shift) of 70 C-54 planes took off from
Frankfurt Rhein-Main air base at three-minute intervals. The first plane in that block was
scheduled to return four hours and twenty minutes later, not long after the last of its block took
off for Berlin, and in 70 minutes the first plane was reloaded and ready to set off on the next shift
of 70 aircraft.8 This precision ensured that Operation Vittles brought 2.3 million tons of supplies
to West Berlin via 277,500 flights, which at the peak of the airlift were landing in the western
zones of Berlin at a rate of one every sixty-two seconds. Tunner’s scientific management, however, was
limited to the careful analysis and improvement of single-purpose operations with given
personnel and equipment.
6 Fisher, Berlin Airlift, 3.
7 For examples of the detail-oriented management style Tunner brought to the flying and
loading operations see D.M. Giangreco and Robert E. Griffin, Airbridge to Berlin: The Berlin
Crisis of 1948, Its Origins and Aftermath (1st ed. Novato, CA: Presidio, 1988).
8 Fisher, Berlin Airlift, 11.
The operational cost to the Air Force was immense and the Air Force required the
Comptroller’s office to make a daily budget report. Figure 4 shows the daily report for 26
June 1949, the day marking the first anniversary of Operation Vittles. Part I at the top tabulates
the day’s tonnage delivered, the number of aircraft, and the number of flights, and a graph
compares the tonnage delivered that day with previous days that month and the daily average for
all preceding months. An attachment to that report tabulates the US and British efforts over the previous year
(Figure 5). In its first year, Operation Vittles cost the US military $227 million (over $2 billion
in 2010 dollars), not including the costs of vittles or coal, and Air Force operation costs
comprised over 14% of the entire USAF budget.
The job of the mathematicians employed in the USAF comptroller’s office in the
Pentagon was to develop optimizing algorithms that could minimize costs, incorporate
dynamic analysis for personnel training and aircraft production for future needs, and take into
account opportunity costs (for example, bringing in more food today meant forgoing the
opportunity to bring in material for a new runway that would increase the daily amount of food
that could be airlifted in the future). Such were the goals of the Project for the Scientific
Computation of Optimum Programs.
2.2 Project SCOOP
In World War II, Captain Charles (Tex) Thornton created the US Army Air Force
Statistical Control unit to improve budget control and planning by using statistical information
for management decisions and operations control. The Statistical Control group employed,
among others, George Dantzig, Robert McNamara, and Harvard business professor Edmund
Learned. According to Dantzig’s historical account,9 Learned developed what the group called “a
program for programming”: a generic schema that would efficiently connect the scheduling of
many command agencies in any detailed operation (such as that pictured in Figure 6). The
efficiency emerged from the sequential ordering of hierarchical steps in a unidirectional
information flow that was compatible with the hierarchical bureaucratic structure of the USAF.
Despite the care the Statistical Control group took in constructing the scheme, they estimated it
would take seven months to complete the process of programming a major new operation.
Seven months was far too long a time to work out detailed programming for a possible
confrontation with the Soviet Union. Also the “efficiency” of Learned’s schema was limited.
Although the main driving force of the schema was to reduce the time and complexity of
planning for a specified operation, the program for programming an operation could not
minimize the costs of an operation and it could not solve the economic problem of planning
because it did not weigh the opportunity costs of alternative uses of resources.
Under the 1947 plan to separate the Air Force from the Army and constitute it as an
autonomous branch of military service, the office of the Deputy Chief of Staff/Comptroller,
General E. W. Rawlings, took charge of program planning, statistical control, and budgetary
functions. In June 1947, Rawlings formed a group in the Planning Research Division of the
Directorate of Management Analysis at the Pentagon to mechanize program planning for the Air
Force by developing decision-making protocols that could realize the future potential of
electronic digital computers. That initial group included George Dantzig (Chief Mathematician),
9 George B. Dantzig, Concepts and Origins of Linear Programming (Santa Monica, CA: RAND
Corporation, 1957).
Marshall Wood (Chief Economist & Directorate Head), Murray Geisler, and Emil Schell; Saul
Gass, Alex Orden, Walter Jacobs, Leon Goldstein, Lyle Johnson, and Philip Wolfe, among
others, joined the effort soon after. In October 1948 the Air Force accorded the group the official
title of Project SCOOP (Scientific Computation of Optimum Programs). This computation would
initiate a new applied mathematics of optimization – the decision-making process that
determined the way in which alternative inputs and activities could be combined to yield
maximum output (or minimum cost), subject to resource constraints.
There are two important features to highlight here: from the beginning, the design of
optimum programs for the Air Force was under the auspices of the comptroller’s office, and was
thus a part of the budgetary planning and management analysis branch of USAF headquarters at
the Pentagon; and the modeling and solution strategies were designed for electronic digital
computers that would not be available for Air Force programming until the early 1950s. The
context for the first feature was President Truman’s insistence that the military budget had to be
cut; there was considerable post-war pent-up demand for a thriving consumer-based economy,
the electorate would not tolerate increases in military spending for a possible future war, and the
US monopoly on nuclear weapons ostensibly guaranteed national security.10 The Air Force, along
with other military branches, had to do more with less.
10 The US monopoly lasted until the USSR successfully tested their first atomic bomb on 29
August 1949. Michael Gordin, Red Cloud at Dawn: Truman, Stalin, and the End of the Atomic
Monopoly (New York: Farrar, Straus and Giroux, 2009) explores US military and diplomatic
strategy during their short monopoly.
Project SCOOP aimed to combine the science of economizing with effective
computation. In an August 1948 brief to the Air Staff on their proposal for formalizing the
mission of Project SCOOP, Dantzig and Wood laid out their ambitious plans:
1. The primary objective of Project SCOOP is the development of an
advanced design for an integrated and comprehensive system for the planning and
control of all Air Force activities.
2. The recent development of high speed digital electronic computers
presages an extensive application of mathematics to large-scale management of
problems of the quantitative type. Project SCOOP is designed to prepare the Air
Force to take maximum advantage of these developments.11
Dantzig and Wood defined programming as “the construction of a schedule of actions by
means of which an economy, organization, or other complex of activities may move from one
defined state to another, or from a defined state toward some specifically defined objective. Such
a schedule implies, and should explicitly prescribe, the resources and the goods and services
utilized, consumed or produced in the accomplishment of the programmed actions.” 12 The kernel
of Project SCOOP’s plan to mechanize military programming was George Dantzig’s formulation
in June 1947 of a linear programming model, his construction in August 1947 of the simplex
algorithm for solving linear programming problems, and the coding of the simplex algorithm for
digital computers.13 The National Applied Mathematics Laboratories of the National Bureau of
Standards assisted the Pentagon team with the latter task.

11 Reproduced in George Dantzig, “Impact of linear programming on computer development,”
OR/MS Today 15 (1988): 12-17.
12 Wood, Marshall K., and George B. Dantzig, “Programming of Interdependent Activities: I
General Discussion,” Econometrica 17 (1949): 193-199, 193-194. This definition of
programming was consistent with the long-standing military use of the verb to mean a
scheduling of actions for a military operation. Although Dantzig and his colleagues designed
algorithms to solve mathematical programming models with digital computers, they used the
term “coding”, not programming, to describe the writing of machine instructions for the
computer.
Dantzig’s model, which he called a “linear technology,” consisted of a linear objective
function that maximized output (or minimized costs), linear constraints in the form of equations
that specified the technological relations between the set of items being consumed or produced
and the set of activities that constituted the production processes, and linear constraints in the
form of inequalities that specified, for example, maximum available resource limits for items.
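In later textbook notation (not Dantzig’s own symbols, which differed), the “linear technology” just described can be sketched as follows, where x is the vector of activity levels, c the per-unit contributions of each activity to the objective, Ax = b the technological relations between items and activities, and Dx ≤ e the resource limits:

```latex
\max_{x \ge 0} \quad c^{\top} x
\qquad \text{subject to} \qquad
A x = b, \qquad D x \le e .
```

Replacing the maximization with a minimization of c^T x gives the cost-minimizing variant.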
Dantzig’s constructed solution to a linear programming problem was his simplex
algorithm that relied on the geometric property that the objective function will have a maximum
value at a corner (vertex) of the convex feasible region bounded by the linear constraints of the
problem. The simplex algorithm was an iterative method for moving about the geometric form to
find the point where the objective function was at its maximum. As with its Cold War fraternal
twin game theory, which we will encounter in the next chapter, linear programming was based on
the mathematics of convex sets and computationally depended on manipulating matrices.
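The corner-point property the simplex algorithm exploits can be seen on a toy problem. The sketch below is purely illustrative: it uses the modern solver in SciPy rather than anything resembling Project SCOOP’s code, and the numbers are invented. The optimum lands on a vertex of the feasible region, exactly the geometric fact Dantzig relied on:

```python
# Illustrative only: a tiny linear program solved with SciPy.
# Maximize 3x + 2y subject to x + y <= 4, x <= 2, x >= 0, y >= 0.
# linprog minimizes, so the objective coefficients are negated.
from scipy.optimize import linprog

res = linprog(
    c=[-3, -2],              # negated: maximize 3x + 2y
    A_ub=[[1, 1], [1, 0]],   # x + y <= 4 and x <= 2
    b_ub=[4, 2],
    bounds=[(0, None), (0, None)],
    method="highs",
)
print(res.x)    # optimum at the vertex (2, 2) of the feasible polygon
print(-res.fun) # maximum objective value: 10
```

Checking the four vertices of the feasible region by hand, (0, 0), (2, 0), (2, 2), and (0, 4), gives objective values 0, 6, 10, and 8, confirming that the maximum sits at a corner.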
Dantzig stated in several forums that one of the key sources of inspiration for his linear
technology was Wassily Leontief’s input-output model of the US economy that the Bureau of
Labor Statistics had used to plan for demobilization after WWII.14 Dantzig and his colleagues at
Project SCOOP turned the input/output framework into an optimization machine by explicitly
introducing an objective function that minimized cost or maximized gain and by incorporating
activities from different time periods into the model so that the solution was the best combination
for achieving the objective.
A selection of Project SCOOP’s status reports for the DCS/Comptroller during late 1948
and 1949 arranged in a chronological order of status report delivery gives an indication of the
scope and nature of their early work for the Air Force (see Table 1). It includes programming for
detailed operations, such as Operation Vittles, planning for the movement of supplies to and from
military depots in order to minimize transportation and staffing costs of shipping, and planning
on a grand scale for another total war by working out input/output tables for the entire US
economy as well as estimating peacetime personnel training and aircraft production consistent
with war readiness.
Stored-program electronic digital computers were a necessity for getting optimal
solutions when the number of activities was large. As is evident in Table 1, Dantzig and his
colleagues were heavily involved in examining the engineering alternatives for a machine
suitable for solving linear programming problems via his simplex method. Dantzig tested
experimental circuits and in October 1948 began planning with the National Bureau of Standards
for a UNIVAC computer for USAF programming with an expected delivery date of 1951.

13 George Dantzig (1914-2005) received his PhD in mathematics in 1946 from the University of
California, Berkeley, after working as a statistician first at the Bureau of Labor Statistics and
then at the Pentagon. Dantzig’s solution to two unproven theorems in statistics, which he
mistakenly assumed were homework assignments from his professor Jerzy Neyman, formed part
of the story-line in the film Good Will Hunting. Dantzig was a founding member of the Institute
of Management Science, its president in 1966, the first recipient of the Operations Research
Society of America's von Neumann Theory Prize, and a recipient of the US National Medal of
Science.
14 Martin C. Kohli, “Leontief and the U.S. Bureau of Labor Statistics, 1941-1954: Developing a
framework for measurement,” The Age of Economic Measurement, edited by J. L. Klein and
M. S. Morgan (Durham, NC: Duke University Press, 2001): 190–212; Judy Klein, “Reflections
from the age of measurement,” ibid., 128-133; and George B. Dantzig, Linear Programming and
Extensions (Princeton: Princeton University Press, 1963), 16-18, discuss the Bureau of Labor
Statistics use of Leontief’s input/output analysis of inter-industry data and its connection to
Dantzig’s linear programming.
In 1948, the Air Force Comptroller, General Rawlings, awarded the National Bureau of
Standards $400,000 ($3.9 million in 2010 dollars) to design a quickly-constructed interim
computer before the UNIVAC was ready. The NBS’s SEAC (Standards Eastern Automatic
Computer) was based on the design of the EDVAC and was the first fully operational
stored-program computer in the US.15 The SEAC was also the first electronic computer to solve a linear
programming problem, but it was not ready for that purpose until May 1950, and even after that
limited memory and input problems were such that the SEAC could only be used for testing out
new algorithms on small problems rather than for fully fledged programming of major Air Force
operations. The first successful solution of a linear programming problem with the general
simplex method on the SEAC did not occur until January 1952. Until the UNIVAC was finally
ready for Air Force programming in June 1952, Project SCOOP had to make do with IBM
electromechanical card-programmed calculators.
In early 1948, the NBS staff used IBM calculators to compute the optimal solution of
Jerome Cornfield’s and George Stigler’s diet problem using Dantzig’s simplex method.16 The
comparatively small diet problem was the first computational test case for the simplex solution
approach to linear programming problems, but that feat required nine statistical clerks working
the equivalent of 120 staff days to perform the required 17,000 multiplications and divisions
using desk calculators.17 It would take months of testing small models on desk calculators before
the Air Force team was convinced that the simplex method was both efficient and practical, at
least within the promise of digital computational capacity, and that it was not worthwhile to
pursue better algorithms.18 The IBM calculators available in 1948 and 1949, however, were not
up to the task of manipulating the large rectangular matrices required for computing the optimal
programming of Operation Vittles, much less for the larger wartime and peacetime programs for
overall Air Force operations.

15 In stored-program computers based on John von Neumann’s architectural design, the coded
algorithms for manipulating data are stored as a collection of switch settings in the same memory
device as the data they manipulate.
16 In 1941, in an unpublished Bureau of Labor Statistics memorandum, Cornfield formulated a
linear program to find the lowest-cost diet to meet the nutritional needs of a typical soldier, but
he did not have a computationally efficient algorithm for solving for the optimum. Seven years
later, Cornfield, a friend of Dantzig’s, was not able to find his original input data for the problem,
so Dantzig turned to Stigler’s 1945 attempt to find the least-cost daily diet combination of 77
different foods that met the recommended daily nutritional requirements of nine nutrients for a
154-pound man. Stigler stated his cost minimization problem in the form of nine equations in 77
unknowns. Stigler did not have the computational capacity to compute an optimal solution so he
had to make do with a “clever heuristic,” as Dantzig (George B. Dantzig, “The Diet Problem,”
Interfaces 20 (1990): 43-47) described it, for drawing up a shortlist from 510 combinations and
choosing the best among the short-listed diets. When the NBS staff solved Stigler’s diet problem
with Dantzig’s simplex algorithm, calculators, and many staff-hours, they determined that the
optimal diet was $39.69 a year versus Stigler’s estimate of $39.93 (George B. Dantzig, Linear
Programming and Extensions (Princeton: Princeton University Press, 1963), 551).
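The structure of the diet problem is easy to state in miniature. The sketch below uses invented numbers (three foods, two nutrients, hypothetical costs and nutrient contents, nothing from Stigler’s actual 77-food data) to show the form: minimize total cost subject to nutrient floors, solved here with SciPy rather than by hand on desk calculators:

```python
# Toy diet problem (invented data, not Stigler's 77 foods and 9 nutrients):
# choose nonnegative quantities of 3 foods meeting 2 nutrient minimums at least cost.
import numpy as np
from scipy.optimize import linprog

cost = [3.0, 2.0, 4.0]            # price per unit of each food (hypothetical)
nutrition = np.array([[2, 1, 3],  # nutrient 1 content per unit of each food
                      [1, 3, 1]]) # nutrient 2 content per unit of each food
minimum = np.array([10, 8])       # required amount of each nutrient

# "At least" constraints N x >= r become -N x <= -r for linprog.
res = linprog(c=cost, A_ub=-nutrition, b_ub=-minimum, method="highs")
print(res.x)    # optimal food quantities
print(res.fun)  # minimum total cost
```

Stigler’s real problem had the same shape, only with a 9-by-77 nutrition matrix instead of 2-by-3, which is precisely why the hand computation took thousands of multiplications.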
2.3 Programming for Operation Vittles
In December 1948, Marshall Wood and George Dantzig presented a simplified version of
their linear program for Operation Vittles at the winter meeting of the Econometric Society in
Cleveland and with more formal elaboration by Wood and Geisler at the seminal June 1949
conference on Activity Analysis sponsored by the USAF, via the RAND Corporation and the
Cowles Commission.19 That conference included presentations by military planners and
academics, particularly economists, including Paul Samuelson, Robert Dorfman, Kenneth Arrow,
Tjalling Koopmans, and Herbert Simon. Members of the Air Force Project SCOOP team
presented seven of the papers, with Dantzig listed as an author or co-author for five of them. The
SCOOP team claimed that one of the key advantages of the new linear technology they were
presenting was its ability to take opportunity cost into account. The pre-optimization way had
been plagued with an inability to consider alternative courses of action:
So much time and effort is now devoted to working out the operational program
that no attention can be given to the question whether there may not be some
better program that is equally compatible with the given conditions. It is perhaps
too much to suppose that this difference between programs is as much as the
difference between victory and defeat, but it is certainly a significant difference
with respect to the tax dollar and the division of the total national product between
military and civilian uses.
Consideration of the practical advantages to be gained by comparative
programming, and particularly by the selection of “best” programs, leads to a
requirement for a technique for handling all program elements simultaneously and
for introducing the maximization process directly into the computation of
program. Such a technique is now in prospect.20

17 Dantzig, “The Diet Problem”; Mina Rees, “The Mathematical Sciences and World War II,”
The American Mathematical Monthly 87 (1980): 607-621.
18 Dantzig described the long process of exploring the computational efficiency of the simplex
algorithm in an article in 1988 and in an interview with Albers and Reid in 1986. Saul I. Gass,
“Model World: In the Beginning There Was Linear Programming,” Interfaces 20 (1990):
128-132; William Orchard-Hays, “Evolution of Linear Programming Computing Techniques,”
Management Science 4 (1958): 183-190; idem, “History of the Development of LP Solvers,”
Interfaces 20 (1990): 61-73; Alex Orden, “LP from the '40s to the '90s,” Interfaces 23 (1993):
2-12; and Lyle R. Johnson, “Coming to Grips with Univac,” IEEE Annals of the History of
Computing 28 (2006): 32-42 also discuss the early computational history of linear programming.
19 Alfred Cowles, president of an investment firm in Colorado Springs, had a keen
depression-honed interest in the accuracy of stock market forecasts. In 1932, he established the
Cowles Commission for Research in Economics. From the outset the commission had close ties
with the Econometric Society and supported the statistical and mathematical research of
prominent economists. In 1939 the Cowles Commission moved to Chicago. In 1948, Tjalling
Koopmans took over as director and increased the Commission’s emphasis on mathematical
methods for the study of rational (optimal) behavior.
20 Marshall K. Wood and Murray A. Geisler, “Development of Dynamic Models for Program
Planning,” Activity Analysis, edited by T. C. Koopmans (New York: John Wiley & Sons, 1951):
194.
There are three key points that Wood and Geisler addressed in this passage: the promise
of optimization to account for alternative uses and achieve the best outcome; the declaration that
this would enable the military to effectively pursue new operations without demands for higher
taxes or lower civilian production; and the admission that the heralded technique was “in
prospect.” With regard to the latter, the Air Force did not have the computing capacity to make
operational use of the general linear programming model and the simplex algorithm for
Operation Vittles. Wood and Geisler thus presented two sets of input/output coefficients and
respective derived equations to illustrate models for programming the Berlin airlift: the
optimizing rectangular model in prospect and the non-optimizing triangular model that could be
solved on their current punched card electronic accounting equipment. The model they wanted to
use was rectangular because the number of activities (supplying Berlin with food and coal, flying
the aircraft, constructing runways in Berlin, training new crews….) exceeded the number of
items (supplies, runways, aircraft, experienced crews, new crews….). The mathematical goal was
to determine the level for each activity that would maximize the tonnage of products delivered to Berlin
over thirty-six months, subject to technological and resource constraints. That long time period
increased the number of activities and made the input/output matrix even more rectangular.
The Air Force, however, lacked the computing capacity to deal with the large rectangular
matrices in Project SCOOP’s original optimizing Berlin airlift model. For several years to come,
they had to rely on sub-optimizing triangular models that rearranged and decomposed the general
problems into hierarchical steps such that in order to solve for the activities in a time period, the
algorithm had to solve for the levels for each activity in the time period before. The hierarchical
arrangement in their triangular model was similar to the earlier procedure illustrated in Figure 6
and thus it mirrored to some extent the sequential direction of information flows through the Air
Force’s bureaucratic structure. It did, however, rationalize and mechanize that older “program
for programming” and in so doing drastically reduced the time required to program a specified
military operation.
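The computational appeal of the triangular arrangement is that a triangular system needs only sequential substitution, solving each level from the levels already computed, with no iterative search of the kind the simplex method performs. A generic sketch with invented numbers (not SCOOP’s actual matrices or data):

```python
# Forward substitution on a lower-triangular system: each activity level
# depends only on the levels already computed for earlier steps/periods.
import numpy as np

def solve_triangular_system(A, b):
    """Solve A x = b for lower-triangular A by sequential substitution."""
    n = len(b)
    x = np.zeros(n)
    for i in range(n):
        # Subtract the contribution of already-solved levels, then divide.
        x[i] = (b[i] - A[i, :i] @ x[:i]) / A[i, i]
    return x

A = np.array([[2.0, 0.0, 0.0],
              [1.0, 3.0, 0.0],
              [4.0, 1.0, 5.0]])
b = np.array([4.0, 10.0, 23.0])
print(solve_triangular_system(A, b))  # agrees with np.linalg.solve(A, b)
```

Each unknown costs one short dot product, which is why such a hierarchy fit the punched-card equipment of 1948, whereas a general rectangular optimization did not.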
The rectangular model offered the prospect of determining the level of each activity
consistent with achieving the maximum tonnage delivered over the course of
thirty-six months, even though in any single three-month period the optimum for that
period would not necessarily be achieved. Over the course of all thirty-six months it might
well make optimizing sense to engage in the activity of constructing a new runway that
took at least three months, at the expense of some loss of food or coal delivered at the
beginning of that time frame to make room for construction equipment and
materials. If one were forced to confine the analysis to just one three-month segment,
however, it would not. The temporally-constrained hierarchy of the triangular model
could not incorporate dynamic considerations of opportunity costs of alternative
production activities and it could not guarantee an optimum solution to qualitatively
stated objectives such as “maximize tonnage delivered subject to constraints.” The
triangular model, however, could compute the required supporting activities to achieve a
specified tonnage and it used far fewer computational resources than the optimizing linear
programming model:
With this formulation we have been able to solve programming
problems involving 100 activities and 36 time periods in one day by using present
punched card equipment, obtaining answers which are realistic and useful. In the
more general formulation this would be represented by 3,600 equations in 3,000
unknowns.21
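The computational contrast the quotation points to can be sketched in a few lines. In a triangular arrangement each activity level follows by simple arithmetic from levels already determined, so no simultaneous optimizing solution is required; all coefficients below (tons per sortie, fuel per sortie, and so on) are invented purely for illustration.

```python
# Sketch of a triangular (non-optimizing) programming step: given a
# specified tonnage goal, each supporting activity level is computed
# from levels already determined, one hierarchical step at a time.
# All coefficients are invented for illustration.

def triangular_program(tonnage_goal):
    """Derive supporting activity levels sequentially from a tonnage goal."""
    sorties = tonnage_goal / 10.0       # assume 10 tons delivered per sortie
    fuel = 300.0 * sorties              # assume 300 gallons of fuel per sortie
    crews = sorties / 60.0              # assume one crew flies 60 sorties per period
    new_crews = 0.1 * crews             # assume 10% of crews replaced by training
    return {"sorties": sorties, "fuel": fuel,
            "crews": crews, "new_crews": new_crews}

plan = triangular_program(6000.0)       # e.g. a 6,000-ton requirement
```

A rectangular, optimizing model would instead have to choose all activity levels jointly so as to maximize tonnage subject to constraints, which is exactly the simultaneous computation the punched-card equipment could not yet handle.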
It was, however, the promised rationality of optimization that was celebrated at the 1949
conference where Project SCOOP presented its linear technology to the academic world. Tjalling
Koopmans, the conference chair, even went so far as to claim that Dantzig's linear programming
model and simplex algorithm settled an earlier debate as to whether centrally-planned allocation
could lead to rational outcomes. In the 1920s and 1930s, free-market champions such as Ludwig
von Mises and Friedrich Hayek had argued that the challenge of economic calculation prohibited
planned economies from achieving an efficient allocation of resources and in so doing precluded
rationality:
Without economic calculation there can be no economy. Hence, in a socialist state
wherein the pursuit of economic calculation is impossible, there can be – in our
sense of the term – no economy whatsoever. In trivial and secondary matters
rational conduct might still be possible, but in general it would be impossible to
speak of rational production any more. There would be no means of determining
what was rational, and hence it is obvious that production could never be directed
by economic considerations. What this means is clear enough apart from its
effects on the supply of commodities. Rational conduct would be divorced from
the very ground which is its proper domain. Would there, in fact, be any such thing
as rationality and logic in thought itself? Historically, human rationality is a
development of economic life. Could it then obtain when divorced therefrom? 22
21
Wood and Geisler, "Development of Dynamic Models," 206.
22
Ludwig von Mises, “Economic Calculation in the Socialist Commonwealth,” Collectivist
Economic Planning; Critical Studies on the Possibilities of Socialism, edited by F. A. Hayek
[London: Routledge & Kegan Paul, (1920) 1935]: 105.
Koopmans consciously took on this challenge and argued that economic calculation and
rationality in centralized allocation was now possible: “To von Mises’ arguments regarding the
unmanageability of the computation problems of centralized allocation, the authors oppose the
new possibilities opened by modern electronic computing equipment. …Dantzig's model is an
abstract allocation model that does not depend on the concept of a market."23 24
With hindsight, Koopmans’ sweeping claim for the achievement of optimization was
premature. The mathematical programming of Operation Vittles would not be the only endeavor
where limited computational capacity forced operations researchers to make-do rather than
maximize. Herbert Simon, who was working under a research contract with Project SCOOP at
the time of his participation in the Activity Analysis conference, took up this issue and
constructed definitions of rationality that spoke to the reasoning process associated with the
limited computational capacity not only of organizations seeking to plan but also of von Mises
and Hayek’s economic man.25
23
Tjalling C. Koopmans, “Development of Dynamic Models for Program Planning,” Activity
Analysis, edited by T. C. Koopmans (New York: John Wiley & Sons, 1951): 7.
24
Indeed, the dual nature of mathematic programming was such that the process of solving for
the maximum output yielded what operation researchers called “efficiency prices” or “shadow
prices,” which signaled worth in the absence of markets. For example, Stephen Enke, “Some
economic aspects of fissionable material,” Quarterly Journal of Economics 68 (1954): 217-232
at the RAND Corporation demonstrated that with linear programming economists could
contribute “the principles of value determination and the logic of economizing” to determine the
production, consumption, and allocation of the fissionable materials U 235 and PU239 for which
there were no markets or real prices.
25
Herbert Simon (1916-2001) described himself as a mathematical social scientist. As an
undergraduate at the University of Chicago and a graduate student at the University of California
at Berkeley, Simon sought out courses in physics, mathematical economics, and symbolic logic.
Simon's doctoral thesis on administrative decision-making built on his operations research work
for a city government. From 1949 until his death, Simon was a professor at the Carnegie Institute
of Technology/Carnegie Mellon University. Simon received the Nobel Prize in Economics in
1978. Hunter Crowther-Heyck (Herbert A. Simon: The Bounds of Reason in Modern America.
Baltimore: Johns Hopkins University Press, 2005) and Esther-Mirjam Sent ("Herbert A. Simon as a
Cyborg Scientist," Perspectives on Science 8 (2000): 380-406, and "Simplifying Herbert Simon,"
History of Political Economy 37 (2005): 227-232) have documented the broad disciplinary span
of Simon's work, brought to bear first on administrative decision making and then on computer-simulated
problem solving, and, quite literally, through his departmental travels at Carnegie Mellon
through industrial administration, psychology, and political science. Philip Mirowski examined
Simon's research for the Cowles Commission (Philip Mirowski, Machine Dreams: Economics
Becomes a Cyborg Science. Cambridge: Cambridge University Press, 2002, 452-472).
2.4 Project SCOOP and the Carnegie Institute of Technology
The next to last item in the selected Project SCOOP status reports of 1948-49 (Table 1)
describes the funding of outside research that would improve the data and modeling of the large
inter-industry input/output matrix that was an essential part of the Air Force's war and
peacetime-readiness models. In 1949, Project SCOOP and the Bureau of the Budget's Division of Statistical
Standards awarded the Graduate School of Industrial Administration (GSIA) at the Carnegie
Institute of Technology a three-year grant for research on "Intra-Firm Planning and Control."26
The commitment involved researching the upstream data-generating process that fed into the
government input/output models, improving production planning through an applied
mathematics that combined logical, accounting, engineering, and computational frameworks, and
training staff in new analytical methods of planning and optimization. This foundational contract
between the military, the executive branch, and the university was a key path by which linear
programming spread to operations research in private industry and management science became
a profession backed by an analytical graduate business school curriculum and a professional
organization.
26
Simon's 1952 notes on the GSIA research budget indicate that the contract with the Air Force and
the Bureau of the Budget financed the equivalent of 3 man-years of faculty research and 6
man-years of graduate assistant research per year, in addition to overhead costs.
To complement Project SCOOP’s development of war and peacetime-readiness models,
the Carnegie group agreed to direct their intra-firm analysis of production planning to companies
that were either very important to the economy or representative of a key industry. The
economist William Cooper and the mathematician Abraham Charnes worked with data from the
Philadelphia Gulf Oil Refinery as well as with an engineer from the refinery staff, Bob Mellon,
to model linear programming problems for chemical manufacturing processes. Carnegie
colleagues Herbert Simon and Charles Holt used servomechanism models to obtain optimum
decision rules for determining the rate of production that would minimize manufacturing and
inventory costs in a transportation equipment manufacturer. 27
In their first project for Project SCOOP, Cooper and Charnes applied linear programming
to determine the optimum blend of aviation fuel that the refinery should aim for. In the contract
with the Air Force, the Carnegie team had agreed to ultimately make the analysis operational so
that, for example, a non-mathematically trained production manager could routinely manipulate
the algorithm to obtain the optimum blend. On their first go-round, however, Cooper and
Charnes were confronted with the dilemma that “unresolved computations could not be expected
to proceed safely without the services of a trained professional mathematician at each stage… the
27
Servomechanisms were error-actuated devices that acted on information, provided through a
feedback loop, about the difference between the actual value and the desired (or predicted or
forecast) value in order to apply energy to control the system. For example, if a thermometer attached
to a furnace determines that the actual temperature is less than the desired temperature, then an
electrical signal turns the furnace on. The servomechanism approach relied heavily on World War II
work on lead-computing gunsights and predictors and on a common mathematical
language of control mechanisms that Warren Weaver encouraged in the Applied Mathematics
Panel (see LeRoy A. MacColl, Fundamental Theory of Servomechanisms. New York: D. van
Nostrand, 1945; Judy Klein, forthcoming, Protocols of War and the Mathematical Invasion of
Policy Space, 1940-1960; and Peter Galison, "The Ontology of the Enemy: Norbert Wiener and
the Cybernetic Vision," Critical Inquiry 21 (1994): 228-266).
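The error-actuated logic of the footnote's thermostat example can be sketched as follows; the heating and cooling figures are invented.

```python
# Sketch of a servomechanism's feedback loop: the control signal is
# driven by the error between desired and actual values, fed back
# each step. Heating gain and ambient loss rates are invented.

def furnace_signal(desired, actual):
    """Return 1 (furnace on) when the feedback error is positive."""
    error = desired - actual
    return 1 if error > 0 else 0

def simulate(desired=20.0, actual=15.0, steps=10):
    """Each step the furnace adds heat if switched on; the room always leaks some."""
    history = []
    for _ in range(steps):
        on = furnace_signal(desired, actual)
        actual += 1.5 * on - 0.5        # heating gain minus ambient loss
        history.append(actual)
    return history
```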
work could not be routinized to the degree necessary for large-scale continuing application to
practical administrative problems.” 28 To solve that problem, Charnes29 developed a general
means of handling complex mathematical problems that, in combination with Dantzig’s simplex
algorithms, made it possible to completely routinize the computing process. The Carnegie team
was intent on operational effectiveness and Charnes’s computational innovation opened the door
for many industry applications of linear programming. They presented their mathematical
protocol at the 1951 Project SCOOP symposium on linear programming 30 and published their
results in Econometrica.31 The petrochemical industry subsequently became the exemplary user
of linear programming and simplex algorithms, partly because optimization for the capital-intensive
chemical processing did not require the computational capacity that other industries or
the complex Air Force models required.
Charnes and Cooper eventually consulted with over 100 companies and government
agencies to determine the potential for industrial applications of linear programming. Under the
terms of the agreement with Project SCOOP, the Carnegie Institute was committed to training
future workers in government and private industry in linear programming and optimization. After
28
William W. Cooper, Report of Progress: Carnegie Institute of Technology School of Industrial
Administration and the Department of the Air Force, Project SCOOP, Research Project for the
Study of Intra-Firm Behavior, Graduate School of Industrial Administration, Records, 1949-2000 (Pittsburgh: Carnegie Mellon University Archives, 1952): 4.
29
Abraham Charnes, “Blending Aviation Gasolines—A Study in Programming Interdependent
Activities in an Integrated Oil Company”. Econometrica 20 (1952): 135-159.
30
Abraham Charnes, William W. Cooper, and B. Mellon, "Blending Aviation Gasolines—A
Study in Programming Interdependent Activities." Paper read at the Symposium on Linear
Inequalities and Programming, June 14-16, 1951.
31
Abraham Charnes, William W. Cooper, and B. Mellon, "Blending Aviation Gasolines—A
Study in Programming Interdependent Activities in an Integrated Oil Company," Econometrica
20 (1952): 135-159.
training their own GSIA staff and circulating mimeographed lecture notes to colleagues at other
institutions, Charnes and Cooper developed a graduate course for the Carnegie business school;
the textbook based on their lectures on linear programming at the GSIA was the go-to text for
operations researchers for many years.32 Armed with operational optimization tools, the GSIA
team was at the forefront of professionalizing management science. Cooper was the founding
president of The Institute of Management Sciences (TIMS), and in the first few years of that
organization Charnes and Simon served as national officers on the editorial board of the
professional organization’s journal, Management Science.
Project SCOOP’s contract with the GSIA ended in June 1953 but that work for the Air
Force laid the foundation for a larger and longer contract with the Office of Naval Research
(ONR) for “Planning and Control of Industrial Operations” that began in the fall of 1952, and
indeed several projects begun under the Air Force contract were transferred for completion under
the Navy contract.33 Under the ONR contract Holt, the economist Franco Modigliani, the
engineer-turned-business graduate student John Muth, and Simon constructed, tested, and
applied mathematical programming to the Springdale paint manufacturing plant of the Pittsburgh
Plate Glass Company. The aim of the mathematical programming was for production managers
to be able to derive linear decision rules that would enable them to calculate the optimum levels
of production, employment and inventory that would minimize costs. One major computational
hurdle they faced was how to handle uncertain future demand for different types of paint in the
32
William W. Cooper, Abraham Charnes, and A. Henderson, An Introduction to Linear
Programming (New York: Wiley, 1953).
33
During this time, Simon’s research was also funded by a RAND Corporation contract with the
Cowles Commission on the “Theory of Resource Allocation” and an ONR contract with Cowles
for research on “Decision Making under Uncertainty.”
program. The cost-minimization program required the specification of the probability
distribution of future demand. It would be difficult if not impossible to get the data and
computationally incorporate a probability distribution into the mathematical protocol without
making some heroic assumptions about the independence of demand in one time period
compared with another. The Carnegie team’s search for an approximation that would save
computational resources led them to the practice of assuming that the criterion function of the
mathematical program took the form of a quadratic equation. If the objective was to minimize
costs and the cost function could be approximated as a quadratic function of the rate of
production per period, the rate of change of production, and the amount of inventories held at the
end of each period, then the production manager would only have to plug the expected value
of future sales into a linear decision rule. The expected value could be calculated from past sales
using, as they eventually did, an exponentially weighted moving average of monthly values of
past sales to forecast future sales.34 The quadratic approximation eliminated the computational
problems of dealing with uncertainty and enabled the paint factory managers to specify
numerical constants for the linear decision rule with only three labor hours on a desk calculator
and with only five minutes of calculation each month to update the model. 35 36
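The arithmetic behind this saving can be sketched with invented coefficients: minimizing a quadratic cost in production yields, on setting the derivative to zero, a rule that is linear in expected sales and in the previous production rate, and the expectation itself can come from an exponentially weighted moving average (the smoothing weight here is an assumption).

```python
# Sketch of certainty equivalence under a quadratic criterion:
# minimizing a*(p - s)^2 + b*(p - q)^2 over production p, where s is
# expected sales and q is last period's production, gives the linear
# rule p* = (a*s + b*q) / (a + b). Coefficients a and b are invented.

def linear_decision_rule(expected_sales, prev_production, a=1.0, b=0.5):
    """Optimal production from the quadratic cost's first-order condition."""
    return (a * expected_sales + b * prev_production) / (a + b)

def ewma_forecast(past_sales, alpha=0.3):
    """Exponentially weighted moving average of past monthly sales."""
    forecast = past_sales[0]
    for s in past_sales[1:]:
        forecast = alpha * s + (1 - alpha) * forecast
    return forecast
```

Only the expected value of sales enters the rule; no probability distribution of demand has to be carried through the computation, which is the reduction in data and calculation that the Carnegie team exploited.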
34
See Judy Klein, forthcoming, Protocols of War and the Mathematical Invasion of Policy
Space, 1940-1960.
35
Charles C. Holt, Franco Modigliani, and Herbert A. Simon, “A Linear Decision Rule for
Production and Employment Scheduling”. ONR Research Memorandum No. 30. Pittsburgh, PA:
Graduate School, 1955, 2.
36
Simon proved mathematically that when the criterion function of a dynamic programming
problem is reasonably approximated by a sum of quadratic and linear terms, the “planning
problem for the case of uncertainty can be reduced to the problem for the case of certainty,
simply by replacing, in the computation of the optimum first period action, the ‘certain’ future
values of variables by their unconditional expectations.” Herbert A. Simon, “Dynamic
programming under uncertainty with a quadratic criterion function,” Econometrica 24 (1956):
74-81, 74.
In their first Navy research memorandum introducing the quadratic criterion function
Simon and Holt declared, “There should be no impossible gulf between ‘theory’ and ‘practice’:
good theory is theory that works…. A theory of rational decisions is a theory of how to decide,
given certain kinds of information and certain computing capacities."37 38 Simon took the lessons
of making do with existing computational resources when crafting normative modeling strategies
and applied them to the realm of positive, descriptive economics. He argued in many forums that
economists should be incorporating these approximating heuristics into their models of how
economic man makes decisions. In a working paper first circulated at the RAND Corporation in
1953 and eventually published in the Quarterly Journal of Economics in 1955, Simon proposed a
model “for the description of rational choice by organisms of limited computational capacity”. 39
In that RAND paper Simon explored ways of modeling the process of rational choice that took
into consideration that limitation. Simon illustrated the advantage of focusing on process with a
comparison of a utility function for someone who had unlimited computational power (the S-shaped curve in Figure 7) with one for someone whose limits on computational capacity make
ideal optimization too costly (the stepped line). If limits on computational capacity constrain the
ability to evaluate all alternatives (such as all possible future plays in a chess game) then the
37
Herbert A. Simon and Charles C. Holt, The Control of Inventory and Production Rates; A
Survey. ONR Research Memorandum No. 9. Pittsburgh, PA: Graduate School of Industrial
Administration Carnegie Institute of Technology, 1954, 16-17.
38
In another ONR memorandum, Modigliani and Simon took the approximation process beyond
the criterion function by exploring when a heuristic decision rule constituted a good
approximation to the exact linear decision rule (F. Modigliani and Herbert A. Simon, On the
relation between a heuristic and an exact production decision rule. ONR Research Memorandum
No. 17. Pittsburgh, PA, Graduate School of Industrial Administration, Carnegie Institute of
Technology, 1954).
39
Herbert A. Simon, “A behavioral model of rational choice”. RAND P-365, Santa Monica, CA:
RAND Corporation, 1953, 1.
player will seek an approximating mechanism to simplify computations. For example, the payoff
function could be flat over a wide range of alternatives and the utility function would be
characterized by plateaus that map three simplified alternatives such as “clearly won”, “clearly
lost” and “clearly drawn”. In the words of Simon, “The player, instead of seeking for a ‘best’
move, needs only to look for a ‘good’ move.” 40
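A minimal sketch of that stepped evaluation (the payoff numbers and the plateau threshold are invented): alternatives are collapsed onto three plateaus, and the search stops at the first "good" one instead of ranking them all.

```python
# Sketch of satisficing over a stepped (plateau) utility function:
# numeric payoffs are collapsed onto three categories, and the search
# returns the first alternative that is good enough. Payoff values
# and the 0.5 threshold are invented for illustration.

def plateau(payoff):
    """Map a numeric payoff onto one of three plateaus."""
    if payoff > 0.5:
        return "clearly won"
    if payoff < -0.5:
        return "clearly lost"
    return "clearly drawn"

def satisfice(moves, payoff_of):
    """Return the first 'clearly won' move; unlike maximizing,
    this need not evaluate every alternative."""
    for move in moves:
        if plateau(payoff_of(move)) == "clearly won":
            return move
    return None

payoffs = {"a": -1.0, "b": 0.2, "c": 0.9, "d": 1.5}    # invented move payoffs
choice = satisfice(["a", "b", "c", "d"], payoffs.get)  # stops at "c"; never examines "d"
```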
Simon moved freely from the normative to the positive realms and back again. In that
1953 RAND paper he speculated that a survey of possible definitions of rationality would lead to
improved designs of computer equipment and algorithms as well as improved models of how
people make rational choices. Over the next two decades, Simon refined his definitions of
rationality. In his 1957 book Models of Man, Simon introduced the term "bounded
rationality." He argued that consumers and entrepreneurs were "intendedly rational," but they
had to construct simplified models of real situations that were amenable to effective
computation. For Simon the key to simplifying the choice process and reducing computational
demands was “the replacement of the goal of maximizing with the goal of satisficing, of finding
a course of action that was ‘good enough’”.41 42
40
Simon, "A behavioral model," 9.
41
Herbert A. Simon, Models of Man: Social and Rational; Mathematical Essays on Rational
Human Behavior in a Social Setting (New York: Wiley, 1957): 204.
42
In his Nobel Prize lecture in Stockholm on December 8, 1978, Simon reflected on the
conceptual development of bounded rationality by describing the Carnegie team's quadratic cost
approximation to illustrate "how model building in normative economics is shaped by
computational considerations." Herbert A. Simon, "Rational Decision Making in Business
Organizations," The American Economic Review 69 (1979): 493-513, 498.
At a talk at Groningen University in September 1973, and in revisions circulated in 1974
and published in 1976, Simon refined his defining schema of rationality further by making the
distinction between substantive and procedural rationality. Substantive rationality was the
achievement of the best outcome given a goal: the rational consumer achieving maximum utility
or the rational producer achieving maximum profits. In contrast to the economist's emphasis on
the choice outcome that a rational economic man made, the psychologist focused on the process
of how decisions are made. Procedural rationality dealt with reasonable deliberation. 43 Simon
illustrated the difference between the two with examples from linear programming. The solution
to Stigler’s diet problem was substantively rational. Thanks to the linear programming model, the
simplex solution algorithm, and contemporary computing equipment, an optimal least cost
solution meeting the nutritional goals had been achieved. The traveling salesman problem of
finding the city-to-city route that would minimize traveling costs was one of Simon's examples of
procedural rationality. Computable optimal solutions were only possible for trivial set-ups of the
problem. For the more complex traveling salesman problems, operations researchers searched for
43
In many of Simon’s notes and publications, starting with his dissertation in 1943, the
contrasting adjectives of “substantial” and “procedural” appear in the same paragraph. Over the
decades Simon paired the following nouns with the two adjectives: “matters”, “conformity”,
“decision premises”, “flexibility”, “problems”, and “alternatives”. In his 1964 entry on
"Rationality" in the Dictionary of the Social Sciences, Simon contrasted two types of rationality,
the economist's "attribute of an action selected by a choice process" and the psychologist's
"processes of choice that employ the intellectual faculty" (Herbert A. Simon, "Rationality," in J.
Gould and W. L. Kolb (eds.), A Dictionary of the Social Sciences. Glencoe, IL: The Free Press,
1964, 573-574, 574). It was apparently not until 1973, however, that Simon coined the phrases
"substantial rationality" and "procedural rationality". Once he did so he used the currency
liberally in several key publications (see for example Herbert A. Simon, "From Substantive to
Procedural Rationality," Method and Appraisal in Economics, edited by S. J. Latsis: New York,
1976; "On How to Decide What to Do," The Bell Journal of Economics 9 (1978): 494-507;
"Rationality as Process and as Product of Thought," The American Economic Review 68 (1978):
1-16; and "Rational Decision Making in Business Organizations," The American Economic
Review 69 (1979): 493-513).
computationally efficient algorithms that would achieve good, but not necessarily optimal,
solutions.
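Nearest-neighbor tour construction is one standard illustration of such a computationally cheap, good-but-not-guaranteed-optimal procedure (the source does not name particular algorithms, and the city coordinates below are invented):

```python
import math

# Sketch of a procedurally rational heuristic for the traveling
# salesman problem: from each city, simply go to the nearest
# unvisited one. Cheap to compute; the resulting tour is good but
# not necessarily optimal.

def nearest_neighbor_tour(cities, start=0):
    """Return a tour (list of city indices) built greedily from `start`."""
    unvisited = set(range(len(cities))) - {start}
    tour = [start]
    while unvisited:
        last = cities[tour[-1]]
        nxt = min(unvisited, key=lambda i: math.dist(last, cities[i]))
        tour.append(nxt)
        unvisited.remove(nxt)
    return tour

cities = [(0, 0), (0, 1), (5, 0), (1, 1)]   # invented coordinates
tour = nearest_neighbor_tour(cities)
```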
In another illustration of procedural rationality, Simon offered the programming model
with the quadratic criterion function that his Carnegie team had used for their military-funded
operations research on production planning:
this did not mean that we thought real-world costs functions were
quadratic; it meant that we thought that many cost functions could be reasonably
approximated by a quadratic, and that the deviations from the actual function
would not lead to seriously non-optimal decisions. … Not only did the quadratic
function provide good computational efficiency, but it also greatly reduced the
data requirements. …All that was expected of the solution was that the optimal
decision in the world of the model be a good decision in the real world. 44
In Simon’s scheme, forcing the objective function minimizing costs to take on a quadratic
form was one deviation from classical optimization: “simplification of the model to make
computation of an 'optimum' feasible." Another route, taken by Project SCOOP in making
do with non-optimizing triangular models for Operation Vittles, speaks to a second type of
deviation that Simon specified: "searching for satisfactory, rather than optimal choices." Simon
considered both adaptations to the limitations on computational capacity as examples of
satisficing behavior, rather than optimization, the aim being to construct “practicable
computation procedures for making reasonable choices.” 45
44
Herbert A. Simon, “From substantive to procedural rationality,” Method and appraisal in
economics, edited by S. J. Latsis (New York, 1976): 139.
45
Idem, From substantive to procedural rationality, 140.
The thrust of Simon’s argument in many of his essays contrasting substantive and
procedural rationality addressed positive or descriptive economics: because consumers and
producers are organisms with limited computational capacity, economists should learn from
psychologists, as well as from their own discipline’s experience with normative operations
research, and focus more on the process of how decisions are made. 46 As is evident in a
symposium on economics and operations research in the May 1958 issue of The Review of
Economics and Statistics47, Simon was not alone in drawing this conclusion. Several of the
authors, including William Cooper, Charles Hitch, Thomas Schelling, and Daniel Ellsberg, spoke
not only to how economists could contribute to improvements in operations research, but also to
how their own operations research experience with approximation and good, alternative, non-optimizing
rules should be incorporated into microeconomic theory. It was Simon, however, who
provided the clarity with the naming of "bounded" and "procedural" rationality.
2.5 Programming after Project SCOOP
In the fall of 1953 the Air Force, acknowledging an end to the early development stage of
mathematical programming and a commitment to a new stage of widespread implementation,
46
Simon expressed this lesson from operations research quite clearly in an essay on "Economic
Rationality": "Normative economics has shown that exact solutions to the larger optimization
problems of the real world are simply not within reach or sight. In the face of this complexity the
real-world business firm turns to procedures that find good enough answers to questions
whose best answers are unknowable. Thus, normative microeconomics, by showing real-world
optimization to be impossible, demonstrates that economic man is in fact a satisficer, a person
who accepts 'good enough' alternatives, not because he prefers less to more but because he has no
choice." (Herbert A. Simon, "Economic Rationality: Adaptive Artifice," Herbert A. Simon
Collection. Pittsburgh, PA: Carnegie Mellon University, 1980, 5).
47
The Review of Economics and Statistics 40 (1958).
disbanded Project SCOOP. Mathematical programming, however, continued to thrive both at the
Comptroller’s office at the Pentagon and at the RAND Corporation, to where George Dantzig
had migrated. Even after the arrival of the UNIVAC, the Air Force had to rely heavily on the
triangular model for programming operations with thousands of activities. Former SCOOP
members, including Jacobs and Gass, worked at the Pentagon to design models and code
algorithms that could be solved with existing computers and be operationally friendly to those
having to formulate myriad Air Force routines. In 1956 they replaced the triangular model with
the Trim model (also square and non-optimizing) that they had designed “as a production
system” that disciplined and mechanized the data input process from various Air Force
departments48. The Trim model was used to construct budgets and to determine for any specific
war plan the monthly requirements for bombs, ammunition, fuel, personnel, etc. In cases of
relatively small military operations, linear programming on the UNIVAC with rectangular
optimizing models was possible in the mid-1950s.49
So far in this chapter we have neglected the other superpower engaged in the Cold War. If
necessity is the mother of invention, why didn’t Soviet requirements for planning the entire
national economy spur an early development of linear programming there? There was a major,
albeit neglected, development in the linear technology for optimum resource allocation in 1939.
In his capacity as a consultant to a plywood enterprise in the late 1930s, the mathematics
48
Walter Jacobs, “Air Force Progress in Logistics Planning,” Management Science 3 (1957):
213-224.
49
Gass described a typical aircraft deployment linear programming problem that he worked on
in the mid-1950s, “Given the initial availability of a combat-type aircraft and the additional
monthly availabilities in the succeeding months, divide these availabilities between combat and
training so as to maximize, in some sense, the combat activity.” Saul I. Gass, “Model World: In
the Beginning There Was Linear Programming,” Interfaces 20 (1990): 128-132, 131.
professor Leonid Kantorovich was confronted with the economic problem of allocating raw
materials in order to maximize equipment production subject to constraints. He formulated a
linear programming model and suggested an iterative solution process similar to, but not
identical to, the simplex method that held out the prospect for calculating “resolving multipliers”
(the efficiency or shadow prices of Dantzig’s model). In May 1939, Kantorovich made two
presentations of his new mathematical approach to the Institute of Mathematics and Mechanics
of the Leningrad State University and to the Leningrad Institute for Engineers of Industrial
Construction. That same year the Leningrad University Press printed his booklet on The
Mathematical Method of Production Planning and Organization. A lack of computational
capacity in the early 1940s, a distrust of a mathematical approach to planning the allocation of
resources, and the preoccupation with the war against Germany led to the neglect of Kantorovich's
contribution to scientific management in the USSR. In the late 1950s, planners and academics in
the USSR began to acknowledge the usefulness of Kantorovich’s protocol and its excellent fit
with the growing Soviet interest in cybernetics. 50 In 1975, the Swedish Nobel Committee
awarded Kantorovich and Koopmans the Sveriges Riksbank Prize in Economic Sciences for their
contributions to the theory of optimum allocation of resources. To the consternation and anger of
many, including Koopmans, Dantzig was not included in the honor. Ironically most of the Nobel
laureates for the prize in economic science from its inception in 1969 up to 1980 had published
work on linear programming.
50
Management Science published an updated, expanded English translation of Kantorovich's
1939 booklet in July 1960 with an introductory note by Koopmans [Tjalling C. Koopmans, "A
Note about Kantorovich's Paper, ‘Mathematical Methods of Organizing and Planning
Production’,” Management Science 6 (1960): 363-365, 364] who described the study as “an early
classic in the science of management under any economic system.”
In June 1952, Dantzig left Project SCOOP to continue his developmental work for the Air
Force at the RAND Corporation. There Dantzig worked with William Orchard-Hays to improve
the computational efficiency of the simplex algorithm, to adapt it to new computers, and to
develop commercial-grade software for solving linear programs. Looking back on that work with
Dantzig, Orchard-Hays 51 described their occupation: “An algorithm designer is an engineer who
works with abstract concepts rather than physical materials. The designer’s goals are efficiency
and that the algorithm works; it should give correct results reliably for a class of problems.” The
algorithm designers at RAND were constructing what Simon would later call procedural
rationality, “the rationality of a person for whom computation is the scarce resource.” 52 53
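What such an algorithm designer built can be suggested by a toy dense-tableau simplex in the spirit of Dantzig’s method, written here as a classroom sketch in Python (nothing like the production LP codes Orchard-Hays engineered, which had to squeeze efficiency out of scarce memory and arithmetic):

```python
# A toy dense-tableau simplex sketch of Dantzig's method. Assumes b >= 0 and a
# bounded problem; no anti-cycling safeguards -- a classroom illustration only.
# Solves:  maximize c.x  subject to  A x <= b,  x >= 0.

def simplex(c, A, b):
    m, n = len(A), len(c)
    # Tableau: m constraint rows [A | I | b], then the objective row [-c | 0 | 0].
    T = [A[i][:] + [1.0 if j == i else 0.0 for j in range(m)] + [b[i]]
         for i in range(m)]
    T.append([-cj for cj in c] + [0.0] * (m + 1))
    basis = list(range(n, n + m))       # slack variables form the initial basis
    while True:
        # Entering column: most negative reduced cost (Dantzig's rule).
        col = min(range(n + m), key=lambda j: T[m][j])
        if T[m][col] >= -1e-9:
            break                       # no improving direction: optimal
        # Leaving row: minimum-ratio test over rows with a positive pivot entry.
        row = min((i for i in range(m) if T[i][col] > 1e-9),
                  key=lambda i: T[i][-1] / T[i][col])
        piv = T[row][col]               # pivot: make column `col` basic in `row`
        T[row] = [v / piv for v in T[row]]
        for i in range(m + 1):
            if i != row:
                f = T[i][col]
                T[i] = [a - f * r for a, r in zip(T[i], T[row])]
        basis[row] = col
    x = [0.0] * n
    for i, bi in enumerate(basis):
        if bi < n:
            x[bi] = T[i][-1]
    return x, T[m][-1]                  # optimal point and objective value

# A small test problem: maximize 3x + 5y with x <= 4, 2y <= 12, 3x + 2y <= 18.
x, z = simplex([3.0, 5.0], [[1.0, 0.0], [0.0, 2.0], [3.0, 2.0]],
               [4.0, 12.0, 18.0])
print(x, z)                             # approximately [2.0, 6.0] and 36.0
```

The designer’s twin goals that Orchard-Hays named are both visible here: correctness (the pivot rules guarantee an optimal basic solution for this class of problems) and efficiency (each iteration is a single Gaussian pivot, the operation his RAND codes were built to perform fast).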
2.6 Conclusion
At the 1958 symposium on Economics and Operations Research, the economist Daniel
Ellsberg described the conditions in the Cold War military sphere that made “problems of choice
and allocation almost unbearably pressing.” 54
The budget famine, the sudden military challenge, the unprecedented variety of
alternative weapon systems with their long lead-times, rapid obsolescence, high
cost, and excruciatingly technical claims: these basic pressures on the Secretary of
Defense are dramatized by the inter-service rivalry with the public for funds and
resources and with each other for allocations and control of weapons. 55
51 William Orchard-Hays, “History of the Development of LP Solvers,” Interfaces 20 (1990): 61-73, 62.
52 Herbert A. Simon, “On How to Decide What to Do,” The Bell Journal of Economics 9 (1978): 494-507, 496.
53 Acknowledging the efficiency and widespread successful application of the simplex algorithm, the journal Computing in Science & Engineering named it one of the ten algorithms with the greatest influence in the twentieth century (see John C. Nash, “The (Dantzig) Simplex Method for Linear Programming,” Computing in Science & Engineering 2 (2000): 29-31).
54 Ellsberg is best known for his 1971 leak of the Pentagon Papers, top-secret decision-making documents on US involvement in Vietnam. Before that he had served as a nuclear strategist at the RAND Corporation and under Secretary of Defense Robert McNamara.
That description applied equally well to the earlier decade in which the blockade of and
airlift for the western sectors of Berlin increased the urgency of attempts at the Pentagon
headquarters of the Air Force Comptroller to compute optimum programs for resource allocation
in military operations. In a reflective essay on management science since the end of World War
II, Dantzig claimed that “True optimization is the revolutionary contribution of modern research
to decision processes.” A few sentences later he acknowledged, “Computation of truly optimum
programs was of course beyond the original capabilities of SCOOP.” 56 He also acknowledged
that much of what had been done so far in management science was “a vast tooling-up, a
preparation for new ways to do old tasks.” With the linear programming model and its explicit
optimizing objective function and the simplex solution algorithm, Dantzig and his Air Force
colleagues at Project SCOOP were a major part of that tooling-up. In 1962 Brockway McMillan,
the Assistant Secretary of the Air Force for Research and Development, contended that
operations research, and linear programming in particular, had been instrumental in highlighting
the unique way in which mathematics could be applied to any field: with a mathematical model
and a new concept of a solution to a mathematical problem – an algorithm for a computer. The
essential link to the computer was accompanied by the standardization of model formulation and
algorithmic design, time studies of computational procedures, and deskilling goals applied to the
data input process as well as the rule-based decision output process. The military client insisted
55 Daniel Ellsberg, “A Final Comment,” The Review of Economics and Statistics 40 (1958): 227-229, 229.
56 George B. Dantzig, “Management Science in the World of Today and Tomorrow,” Management Science 13 (1976): C107-C111, C109.
upon numerical solutions to the allocation problems: how many tons of food versus how many
tons of construction materials for a new runway do we fly into Berlin this week? Limits on the
capacity of computers to calculate optimal numerical solutions meant that operations researchers
often had to be consciously aware of how to manage the scarce computational resource for their
own production process and be satisfied with less-than-optimal, good-enough solutions to the
allocation problems they were modeling. The lesson from operations research was that “rules of
substantive rationality that are not backed by executable algorithms are a worthless currency.” 57
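The gap between the optimal and the good-enough can be made concrete with an invented miniature of the airlift allocation question: tons of food versus tons of runway material, with all figures illustrative rather than historical. An exhaustive search over the whole planning grid finds the true optimum at full computational cost; a cheap rule of thumb stops at a feasible plan a satisficer could accept:

```python
# An invented miniature of the airlift allocation problem (figures illustrative,
# not historical): exhaustive optimum versus a cheap "good-enough" heuristic.

VALUE = {"food": 5.0, "runway": 4.0}   # notional utility per ton
LABOR = {"food": 3.0, "runway": 1.0}   # unloading hours per ton
TON_CAP, LABOR_CAP = 100, 240          # weekly tonnage and labor limits
LOT = 10                               # plan in 10-ton lots

def feasible(f, r):
    return (f + r <= TON_CAP and
            LABOR["food"] * f + LABOR["runway"] * r <= LABOR_CAP)

def value(f, r):
    return VALUE["food"] * f + VALUE["runway"] * r

# Exhaustive search over the grid: the optimal plan, at full computational cost.
best = max(
    ((f, r) for f in range(0, TON_CAP + 1, LOT)
            for r in range(0, TON_CAP + 1, LOT) if feasible(f, r)),
    key=lambda p: value(*p),
)

# Rule of thumb: load the highest-value cargo until a limit binds, then fill
# with the next cargo -- a plan a satisficer might accept without searching.
f = 0
while feasible(f + LOT, 0):
    f += LOT
r = 0
while feasible(f, r + LOT):
    r += LOT
good_enough = (f, r)

print(value(*best), value(*good_enough))   # the optimum beats the heuristic
```

The heuristic exhausts unloading labor on food alone and leaves tonnage capacity idle; the exhaustive search finds the mixed plan that uses both limits fully. The satisficer trades that shortfall for a plan computable in a handful of steps.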
The allocation problem that the Project on the Scientific Computation of Optimum
Programs hoped to address was intimately tied to the game problem that theorists at the RAND
Corporation would focus on. Both grew up in the same USAF extended family, and applied
mathematicians were quick to prove that they shared common genetic material. In his first meeting
with George Dantzig on October 3, 1947, John von Neumann speculated that game theory and
Dantzig’s linear programming were analogues of each other. Two papers 58 presented at the
seminal 1949 conference on Activity Analysis addressed the mathematical equivalence of the
program problem of resource allocation and the game problem of strategy. They proved that
every two-person zero-sum game could be turned into and solved as a linear programming
problem. As we will see in the next chapter, outside of the two-person zero-sum game, optimal
solutions to games could be as elusive as they were for computationally strapped mathematical
programming problems.
57 Herbert A. Simon, “Economic Rationality: Adaptive Artifice,” Herbert A. Simon Collection (Pittsburgh, PA: Carnegie Mellon University, 1980): 11.
58 David Gale, Harold W. Kuhn, and Albert W. Tucker, “Linear Programming and the Theory of Games,” in Activity Analysis, edited by T. C. Koopmans (New York: John Wiley & Sons, 1951), and George B. Dantzig, “A Proof of the Equivalence of the Programming Problem and the Game Problem,” ibid.
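The equivalence proved at the Activity Analysis conference can be shown in modern dress: the standard reduction of a two-person zero-sum game (here, matching pennies) to a linear program, solved with SciPy’s `linprog`. The row player maximizes the game value v subject to earning at least v against every pure strategy the column player might choose:

```python
# The game-to-LP reduction: the row player's optimal mixed strategy and the
# game value for matching pennies, computed as a linear program with SciPy.
from scipy.optimize import linprog

A = [[1, -1],
     [-1, 1]]                      # payoff to the row player (matching pennies)
m, n = len(A), len(A[0])

# Variables: the row player's mixed strategy x[0..m-1], plus the game value v.
# Maximize v, i.e. minimize -v.
c = [0.0] * m + [-1.0]
# One constraint per opposing column j:  v - sum_i A[i][j] * x[i] <= 0.
A_ub = [[-A[i][j] for i in range(m)] + [1.0] for j in range(n)]
b_ub = [0.0] * n
A_eq = [[1.0] * m + [0.0]]         # the mixed strategy sums to one
b_eq = [1.0]
bounds = [(0, None)] * m + [(None, None)]   # v may be negative

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq,
              bounds=bounds, method="highs")
strategy, game_value = res.x[:m], res.x[m]
print(strategy, game_value)        # mix both pennies equally; the game is fair
```

For matching pennies the LP forces the equal mix (1/2, 1/2) and a game value of zero, the minimax solution von Neumann’s theory predicts; the same construction works for any two-person zero-sum payoff matrix.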
Table 1: Project SCOOP Status Reports to USAF, 1948-1949
Source: US National Archives, RG 341, Entry 108, Box 736.
Figure 2.1: Plan for the Berlin airlift highlighting the British and American bases that the planes
took off from, the radio beacons in the western zones, the boundary line with the Russian zone of
eastern Germany, the three air corridors through that zone, and the Gatow and Tempelhof airports
in the British and American sectors of Berlin, respectively. Flying at a designated altitude
enabled planes to land every three minutes at Tempelhof.
Source: US National Archives, RG 342-G, Box 25.
Figure 2.2: Unloading planes at Tempelhof, 28 October 1948.
Source: National Archives, RG 306-PS, Box 85.
Figure 2.3: The plotting board of the airlift terminal at the British base at Fassberg, Germany,
“helps loading officers to study new methods of operations before putting them to work on a
large scale. The models, which may be moved to test loading plans, include U.S. Air force C-54s,
British trucks, and German trains and track sidings. Although an RAF base, all Berlin-bound
cargo is flown from Fassberg in U.S. Air Force planes.”
Source: US National Archives, RG 342- G, Box 25.
Figure 2.4: Example of USAF Comptroller’s daily report on cost of Operation Vittles, 25 June 1949.
Source: US National Archives, RG 341, Entry 105.
Figure 2.5: USAF Comptroller’s First Anniversary report for Operation Vittles, 25 June 1949.
Source: US National Archives, RG 341, Entry 105.
Figure 2.6: Schematic diagram of major steps in Air Force wartime program scheduling before
Project SCOOP.
Source: Wood and Geisler, 1951, Figure 1.
Figure 2.7: Simon’s image of an approximating mechanism for making do with limited
computational capacity.
Source: Simon 1953, 6.