Decision Making and Apollo 13

Jennifer LaViola
April 5, 2012
HROB101: Management of Contemporary Organizations
Dr. Margaret Anne Cleek
California State University Sacramento
NASA’s Finest Hour: Decision Making and Apollo 13
Forty-two years ago next week, NASA launched the Apollo 13 spacecraft on a mission to the moon.
Three astronauts were on board: Jim Lovell, Fred Haise, and Jack Swigert. On the way to the moon,
approximately 56 hours into the mission, an explosion occurred. We now know that an oxygen tank blew
up, venting nearly all the life-and-power-sustaining element into space. Jim Lovell’s distress call has
since passed into the common vernacular: “Houston, we have a problem”.
Led primarily by Flight Director Gene Kranz, the team at Mission Control worked feverishly around the
clock to save the crew. Despite the Control room presence of expert aeronautical engineers and
astronauts with spaceflight experience, Kranz's "White Team" was confronted by several unforeseen, life-threatening complications. Only outstanding decision making by Mission Control saved Lovell, Haise, and Swigert and created what the Apollo 13 Review Board called "a successful failure." (1)
Decision Making Model
NASA’s organizational structure was formalized, specialized and complex. As chapter 15 of the textbook
points out, NASA had a particular type of structural configuration called adhocracy, in which selected
areas operated in decentralized form (2). This structural form allowed teams of highly specialized experts
to function effectually and to collaborate effectively with other equally specialized teams. By
decentralizing certain decision making with an adhocracy, this large organization mastered complex,
highly technical projects unique to America’s ambitious space program.
The agency aspired to the rational decision making model. It developed procedures for situational forecasting, for calculating the probability of success of each alternative, and for applying a consistent
method for selecting the best alternative. NASA created flight manuals and checklists, and “the rules for
manned spaceflight were explicit”. (3) The organization attempted to remove emotion from even
emergencies; for example, Lovell reported to Mission Control that the oxygen, in gas form, was visibly
venting out into space. Everyone from the astronauts to the ground crew knew that this was a potentially
fatal disaster, but Houston tersely responded, “Roger that, we copy your venting”. (4)
In addition, the agency attempted to identify every conceivable mission variable using spaceflight simulators, and it observed Russia's space experiences as closely as possible. Yet even while stressing the nearly limitless range of potential mission variables, the organization never let unknown alternatives paralyze its decision making. In fact, NASA met President Kennedy's 1961 challenge to place a man on the
moon before 1970.
Although the agency reached for pure rational decision making, in reality a bounded rationality model
existed. Indeed, space exploration by its very nature was fraught with unknowns and uncertainty. It
would have been impossible to conceive all outcomes because of the limited experience with manned
spaceflight. Furthermore, it would have been impossible to remove all emotion. Although feelings were
effectively disguised in Apollo 13’s space-ground communications, flight controllers later revealed initial
fear reactions, as if they were “tumbling into an abyss. Then the training kicked in”. (5)
NASA made extensive use of heuristics; in fact, the glitches thrown at astronauts who were training in the
simulators were meant in part to cultivate a rule-of-thumb sense. This highly developed intuition enabled
them to navigate complex decisions rapidly. Kranz later said that his intuition played a central role in
decision making during the crisis:
“I got a gut feeling, and that’s all I got. I had to make decisions… with nothing
more than this gut feeling. To this day, I can’t explain why I felt so strongly
about (certain decisions).” (6)
He reflected that when lives were on the line, there was no time for testing and improvements; the
decision merely had to be good enough. Kranz was describing, in effect, the bounded rationality model's characteristic of satisficing: selecting the first alternative that is good enough, because the cost of optimizing (here, human lives) was too great (7).
Participative Versus Autocratic Decision Making
Since the organization used selective decentralization, a fair amount of participative decision making
existed. For example, while training in the simulator for Apollo 13, Astronaut Ken Mattingly tried to
overcome a system failure. Although Mattingly succeeded in “saving” the crew in this simulated disaster,
he was not satisfied with the timing and, acting on his own authority, declared the exercise a failure. Rather than finishing their workday, Mattingly and his team returned to the simulator to repeat the exercise.
Underpinning a certain amount of participative decision making were the tightly cohesive work groups
together with the team-oriented work design of an adhocracy. These cohesive teams developed a synergy
that was a vital factor in saving Apollo 13. In a 2010 interview Kranz noted, “Whatever happened, the
team in Mission Control was going to be better than the sum of its parts.” (8)
NASA blended participative decision making with autocratic decision making. A striking example of the
autocratic form occurred when Mattingly, who was originally scheduled to go to the moon with Lovell
and Haise, was exposed to German measles. Since Mattingly had never had the disease, NASA Deputy
Director Chris Kraft scrubbed him from Apollo 13. Although Lovell argued in Mattingly’s favor, he was
told, “Mattingly is off the crew. You can step down too if you don’t like it.” (9) Kraft later said, “It was
my job to…ensure that every decision--whether it involved hardware, software, or operational
procedures—was made in their (the astronauts’) favor.” (10)
Despite Kraft’s good intentions, the astronauts’ perception of the decision was poor. In the film “Apollo
13”, director Ron Howard did a good job depicting the crew’s reaction to Mattingly’s removal by filming
tight lips and rolling eyes during pre-launch training with the new team member, Swigert, as well as
flaring tempers later aboard a drifting, crippled spacecraft. Lovell cites the factors that influenced his
internal attribution about Kraft’s decision, including anger that “you (Kraft) want to break up my team,
now, three days before launch? When we can predict each other’s moves and read the tone of each
other’s voices?” (11)
Creativity’s Influence on Decision Making
The supportive relationships, team orientation, and certain instances of participative decision making
encouraged another key component in bringing the astronauts home: creativity. Post-explosion events
were almost wholly unforeseen, and the string of potentially lethal problems required ingenious decision
making under intense pressure. Chapter 10 of the textbook defines four stages of the creative process that influence decision making: preparation, incubation, illumination, and verification (12).
The four stages were prominent in deciding the best method to reduce deadly carbon dioxide levels
detected in the spacecraft. The problem was that the only available replacement for the round CO2 filter
in the lunar module was the square CO2 filter in the command module. The environmental engineers
prepared by gathering duplicates of all the items on the ship; in this way they built their knowledge base
of what the astronauts had to work with. Time constraints abbreviated the incubation phase; a workable
alternative emerged in the illumination stage. The ground crew verified as much as they could, but final
verification came from the astronauts’ report that the procedure worked.
Group Decision Making Techniques
NASA effectively utilized group decision making, particularly the quality teams defined in the textbook.
(13) At NASA, the entire Mission Control was essentially a quality team. Flight controllers were experts
who had responsibility and control over specific areas, monitoring conditions constantly and in real time.
For example, there was a flight surgeon officer, an instrumentation and communication officer, a
guidance officer, and many more. Their control was evident in the minutes before Apollo 13 was
launched. Kranz asked each desk for "go/no go for launch." (14) Any flight controller could declare "no go," and a single "no go" would have halted the launch.
Additionally, NASA extensively employed brainstorming. For instance, Kranz recalled ordering every
engineer to report to Mission Control to work the problem. The engineers were to talk to everyone
involved with every system, right down to “the guy on the assembly line who built the thing.” (15)
Furthermore, the organization frequently utilized dialectical inquiry, most notably in determining the best
approach to get the astronauts home. Two alternatives were available: turn around and come directly
back, or slingshot around the moon using the moon’s gravity field (called a free return trajectory). Kranz
encouraged open and frank debate of the pros and cons for each method; ultimately, Kranz decided on the
free return trajectory (16).
The Decision Making Process
In making the critical free return trajectory decision, Kranz followed the decision making process in our
textbook (17):
Recognize the problem and the need for a decision:
An explosion has caused nearly all of the oxygen to leak from the spacecraft, endangering
the astronauts’ lives.
Identify the objective of the decision:
Bring the crew home alive.
Gather and evaluate data and diagnose the situation:
Kranz asks the flight controllers, "What do we have on the spacecraft that's good?" (18)
List and evaluate the alternatives:
Brainstorming and dialectical inquiry reveal the pros and cons about direct abort versus
free return trajectory.
Select the best course of action:
Kranz decides to use free return trajectory.
Implement the decision:
Mission Control directs the astronauts in positioning the spacecraft to slingshot around the moon.
Gather feedback:
Flight controllers recalculate Apollo 13’s revised course, expected splashdown, power use, and
other effects of the redefined mission.
Follow up:
Mission Control constantly follows the ship’s path. About six hours before splashdown,
flight controllers determine that the spacecraft's re-entry angle is too shallow; on its current course,
Apollo 13 will skip off Earth's atmosphere back into space.
Thus, the decision making process begins anew at step one. As Kranz asserts, “I’m not through making
decisions until my crew is back on the ground.” (19)
Consequences of Organizational Decision Making
Unfortunately, NASA’s decision making skills plummeted in the years following Apollo 13’s successful
failure. As chapter 10 of the textbook points out, poor decision making was blamed in the Challenger
explosion in 1986. Criticism of NASA decision making was significantly sharper following the loss of
Columbia during re-entry in 2003 (20). The deaths of the fourteen astronauts aboard these two ships were a
consequence of poor decision making.
Few humans have overcome more obstacles to get home alive than Lovell, Haise, and Swigert, and they
survived in a ship that had less computing power than today’s cell phones. Good organizational decision
making techniques such as following the decision making process, blending autocratic and participative
decision making, and fostering group decision making methods such as quality teams, brainstorming, and
dialectical inquiry led to the successful failure of Apollo 13 and what Gene Kranz called “one of NASA’s
finest hours.” (21)
References
1. National Aeronautics and Space Administration. (1970). Apollo 13, The Seventh Mission [Technical
Report]. Publication SP-4029. Washington, D.C.: U.S. Government Printing Office. Retrieved from
http://history.nasa.gov/SP-4029/Apollo_13a_Summary.htm.
2. Nelson, D. and Quick, J. (2011). ORGB. Mason, OH: Southwestern Cengage Learning. p. 244.
3. Kluger, J. and Lovell, J. (1994). Lost Moon: The Perilous Voyage of Apollo 13. New York:
Houghton Mifflin. p. 122.
4. National Aeronautics and Space Administration. p. 3.
5. Kranz, E. (2010). Failure Is Not An Option: The Apollo 13 Story [Lecture]. Retrieved from
http://www.youtube.com/watch?v=WyZenck9WQg&feature=relmfu.
6. Kranz. The Apollo 13 Story [Lecture].
7. Nelson and Quick. p. 155.
8. Kranz, E. (2010). Interview. Retrieved from
msnbc.msn.com/id/36471007/ns/dateline_nbc_newsmakers/t/Apollo-real-story.
9. Kluger and Lovell. p. 82.
10. Kraft, C. and Schefter, J. (2001). Flight: My Life in Mission Control. New York: Penguin
Group. p. 352.
11. Grazer, B. (Producer) and Howard, R. (Director). (1995). Apollo 13 [Motion Picture].
United States: Universal Pictures.
12. Nelson and Quick. p. 159.
13. Nelson and Quick. p. 167.
14. Apollo 13 [Motion Picture].
15. Kranz. The Apollo 13 Story [Lecture].
16. Kluger and Lovell. p. 144.
17. Nelson and Quick. p. 154.
18. Apollo 13 [Motion Picture].
19. Kranz. The Apollo 13 Story [Lecture].
20. Nelson and Quick. p. 163.
21. Kranz, E. (2000). Failure Is Not An Option: Mission Control From Mercury to Apollo 13 and
Beyond. New York: Simon and Schuster. p. 425.