loyal opposition
Editor: Robert L. Glass ■ Computing Trends ■ [email protected]

IT Failure Rates—70% or 10–15%?
Robert L. Glass

… in which I question the unquestionable Standish Chaos findings
This is a closed-book quiz. Quick, now—what percentage of software projects fail?
Let me hazard a guess as to your answer. Odds are, you said something like
“70 percent.” Am I right?
Now, I want to analyze why you said whatever you said. A friend of
mine, Nicholas Zvegintzov
(the Chief Guru of software
maintenance), likes to say that,
if you track down the sources
of the things we “know” in
our field, you’ll find that what
we thought was a plethora of
sources for that knowledge
generally boils down to only a
precious few.
That raises the question “Where do we get
our information on software failure rates?”
I’ve seen software failure trumpeted from so
many academic research papers that I had to
quit counting (they tend to see a “software
crisis” and then say that the research work
they’re describing will help eliminate it). However, if you closely examine the citations they
use to support the claims of crisis, over and
over again the citations boil down to one primary source, the Standish Chaos Reports.
And, if you take matters one step further, the
papers generally cite the 1994 report.
I want to question the unquestionable status of that Standish report. That’s because, you
see, my own observations lead me to believe
that something is terribly wrong with those
Standish findings.
Success depends on your viewpoint
First, there’s the matter of the 1994 report.
If you go back a decade and look at what it
actually said, it was this—31 percent of software projects were cancelled, 53 percent were
“challenged” (had difficulty meeting goals),
and 16 percent were successful. Most people
tend to add that 31 and 53, concluding that an
astonishing 84 percent of software projects are
deeply troubled. It’s interesting to question
whether those two numbers should be added,
but at the same time, the finding that only 16
percent of software projects were successful is
very damning to the field.
Those academic researchers who quote
those original 1994 figures have ignored the
fact that Standish has updated the Chaos Report periodically. The 2000 report, for example, saw a drop in cancelled projects to 23 percent, a small drop in challenged projects to 49
percent, and a corresponding increase in successful projects to 28 percent. Still not figures
to be proud of, of course. But an improvement, nevertheless.
Then why do academics (and, to be honest,
popular-press periodicals) continue to quote
those 1994 figures?
The only reason I can come up with
is lazy research. It’s easier to copy the
figure that they’ve seen so often in previous publications than to seek out the
newer figures. (In addition, Standish
charges something like $5,000 for the
reports, a figure that many researchers
are undoubtedly unwilling to pay.)
Pardon me for looking the gift horse
of Standish improvements in the
mouth, but I think there’s something
bogus about all those Standish findings.
My own view of the software field, bolstered by the fact that most of my experiences as both a software developer
and a software user have been largely
successful, is that software projects succeed far more often than they fail. As
I’ve said many times before, I see this as
the Computing Age, an era that simply
wouldn’t be possible if we didn’t have
astoundingly successful software to
make all those computers do the wonderful things they do.
So what could be wrong with the
Standish findings? I’ve gone to their
Web site and reviewed the questions
they ask, which seem perfectly objective. I don’t know whom they send
their questionnaires to, but it’s hard to
imagine that they’ve accidentally and
consistently sent them only to software
losers, organizations that can’t program their way out of a paper bag. But
there are at least these possibilities:
■ “Failure” and “success” are tricky words in the software business. How do you categorize a project that is functionally brilliant but misses its cost or schedule targets by 10 percent? Literalists would call it a failure, realists a success.
■ People tend to focus on organizations that fail. For example, an early US General Accounting Office study of projects that failed, which came up with an astounding failure rate, turned out to have examined projects already known to be in trouble when the study began.
When I put together the special issue of IEEE Software on the State of
Software Engineering Practice back in
November 2003, I invited Standish to
include a report on their findings as
part of that issue. It bothered me that
they declined—and declined and declined, as I repeated my invitation several times—and eventually I had to
give up on presenting their viewpoint.
A viewpoint, as you can imagine, that I
thought would be pretty darn relevant
to a discussion of the state of software’s practice.
Other dissenting voices
Time has passed since then, and
I’ve continued to mull over this dilemma. I even put an appeal in one issue of Software asking for input on
how Standish does its work. I received
only one response.
Magne Jørgensen of the Simula Research Lab in Norway and his coauthor Kjetil Johan Moløkken-Østvold
shared my doubts regarding the Standish studies’ accuracy. [An unrelated
article by Jørgensen, “Practical Guidelines for Expert-Judgment-Based Software Effort Estimation,” appears in
this issue.—Ed.] So, Magne tells me,
they studied one particular aspect of
the Standish findings: that (at least in
the early Standish reports) software
projects suffered from an average 189
percent cost overrun.
Looking at other research studies of
software failures, they discovered something interesting. Whereas Standish reported those 189 percent overruns, three
other studies reported a consistent 33 to
34 percent cost overrun. Clearly, something was at best inconsistent between
what Standish was doing and what
the other three studies had learned.
(To check out the Jørgensen findings, go to www.simula.no/publication_one.php?publication_id=711. Jørgensen is
trying to get those findings published
as we speak, and there might well be a
much more comprehensive source by
now for continuing this discussion.)
Others are beating the drum that
something might be fishy in the Standish data world. For example, in a special issue on the “Great Myths of IT,”
InfoWorld (16 Aug. 2004) included as
its Myth 5 “Most IT Projects Fail.”
They cited, then openly doubted, the
Standish findings in their discussion of
that particular myth.
So what should you believe?
Meanwhile, whatever the truth of
the matter, most people seem to have
seen and tend to believe the Standish
findings. A particularly interesting and
indicative case of this came from a discussion in the 5 Nov. 2003 Cutter IT
E-Mail Advisor. That article reports
that a speaker asked an IT audience
how many of them agreed with the
data that showed an “over 70% failure
rate of IT projects.” Everyone agreed.
Then, the speaker asked them what the
failure rate was in their own companies. “85% of the audience reported
5–10% failure rates for their companies.” The article concluded with the
speaker’s tongue-in-cheek comment, “I
was lucky to have an audience from
such high-performing companies,” implying that the audience wasn’t entirely
truthful about its own failure rates.
Perhaps. But isn’t it also possible
that the 70 percent failure rate that
everyone accepted, probably because
they had seen it published so often in
analyses that included the Standish
findings, is really the figure that should
be doubted?
We haven’t seen the last of this issue.
When and if the Jørgensen report is
published, the issue will arise again big time, as well it should. Whatever the
true failure rate of IT projects, our field
should spare no effort in seeking the
truth of the matter. In many ways, the
future of the software field is at stake.
I repeat my invitation to Standish. Dear Standish folks—if you would like to rebut, explain, or supplement the views I’ve presented, I’d be happy to publish your response in a future Loyal Opposition column.
Robert L. Glass is editor emeritus of Elsevier’s Journal of Systems and Software, the publisher and editor of the Software Practitioner newsletter, and someone whose “head is in the theory of software engineering, but whose heart is in its practice.” Contact him at [email protected]; he’ll be pleased to hear from you.