Erosion of industry leadership in the face of seemingly innocuous
technological changes: a study of the video game console industry
Allan Afuah
University of Michigan Business School
University of Michigan Business School, 701 Tappan
Ann Arbor, Michigan 48109-1234, USA
Tel.: (734) 763 3740; Fax: (734) 936 0282
[email protected]
Rosa Grimaldi
University of Bologna
CIEG, Via Saragozza, 8, 40124, Bologna, Italy
Tel.: 0039 (0) 51 2093953; Fax: 0039 (0) 51 2093949
[email protected]
Abstract
We present a framework for exploring why industry incumbents
lose their leadership positions to attackers in the face of seemingly
innocuous technological changes. We use the framework to explain
why in the video game industry, market leaders have been toppled
each time a new generation of microprocessors has been
introduced. Our framework has implications for research in
dynamic capabilities.
SPRU Conference,
Falmer, Brighton, UK, 13-15 November, 2003
INTRODUCTION
Although the impact of technological change on the competitive advantage of incumbents
has intrigued scholars of technological change since at least Schumpeter (1934), most of the
leading research in the area has focused on radical or competence-destroying technological
changes (e.g., Abernathy and Clark, 1985; Tushman and Anderson, 1986; Anderson and
Tushman, 1990; Tripsas, 1997). One finding of this research stream is that incumbents are
relatively safe in the face of non-radical technological changes (Hill and Rothaermel, 2003).
However, it is not unusual to see incumbents lose their leadership positions in the face of
what appear to be relatively harmless technological changes—non-radical technological
changes that build on existing core scientific and engineering concepts. This leads to a rather
interesting question: Why is it that seemingly innocuous technological changes sometimes
result in the loss of competitive advantage by industry leaders? Henderson and Clark
(1990) were the first to explicitly tackle this question. Their paper, and the others that
followed, took an information processing perspective while assuming that firms have the
resources that they need or can easily acquire them. That is, they argued that incumbents
who lost their leadership positions in the face of apparently harmless technological changes
did so because they did not understand the architectural changes—they failed to search,
find, import and process the information that they needed to successfully reconfigure their
systems (Henderson and Clark, 1990; Henderson and Cockburn, 1994; Afuah and Bahram,
1995).
An underlying assumption in this information processing perspective is that these
firms have the resources that they need and if they do not, they can easily find and acquire
them whenever they want to. However, the resource-based view of the firm argues that
valuable resources can sometimes be scarce, difficult to duplicate or acquire (e.g., Peteraf,
1993; Helfat and Peteraf, 2003). Thus we can expect that incumbents’ information processing problems can be exacerbated by a lack of resources. Even when incumbents have no information processing problems, they may still fail to exploit a new technology successfully, simply because the resources they need are scarce or too costly to acquire.
We build on architectural innovation theory and draw on the resource-based view of the firm and related research streams (e.g., Henderson and Cockburn, 1994; Brusoni, Prencipe and Pavitt, 2001; Granstrand, Patel and Pavitt, 1997) to further explore why seemingly innocuous technological changes can result in the overthrow of incumbent leaders. We argue that architectural technological changes that require reconfiguration of systems and trigger the need for scarce resources offer attackers an opportunity to topple
incumbent leaders. We apply our theory to the video game industry to explain why the
market leader has been toppled by an attacker each time a new generation of
microprocessors has been introduced. We find that attackers pushed out incumbent leaders because changes in microprocessor generations triggered changes in the video game system such that building an optimal system required not only new knowledge of system reconfiguration (changes in linkages and components) but also new scarce resources that the leaders did not have and the attackers did.
Handicapped by their routines and other commitments to the established design,
leaders experienced difficulties in building the new and scarce resources needed to profit
from the change and were displaced by attackers.
BACKGROUND LITERATURE
Architectural innovation
Henderson and Clark (1990) were the first to explore the question of why seemingly
innocuous technological changes can erode the competitive advantage of leading
incumbents. They argued that since a product is made up of components and linkages
between them, developing and producing a product must require knowledge of the core
concepts that underpin the components, and knowledge of the linkages between the
components or architectural knowledge. Since organizations are boundedly rational, the
departments within such organizations develop routines to interact with each other and
with external organizations to build and market products (Allen, 1984; Clark and Fujimoto,
1991; Nelson and Winter, 1982; Henderson and Cockburn, 1994). A radical technological
change overturns the core concepts that underpin components and linkages between them,
thereby rendering the knowledge accumulated by the organization’s departments obsolete
and increasing the chances that incumbents will lose their competitive advantages (Hill and
Rothaermel, 2003; Leonard-Barton, 1992; Tushman and Anderson, 1986). For example, an
electric car would be a radical technological change to makers of the internal combustion
engine automobile since the knowledge that underpins its components (electric motor,
electronic driver control, electrical flow, and battery) rests on electrical engineering rather
than mechanical engineering.
In an architectural change, knowledge of linkages between components is changed
while the core concepts that underpin components are not overturned. For example, the
change from mainframe computers to personal computers was an architectural change since
the core concepts that underpin the components of a computer (a central processing unit,
semiconductor memory, input/output and software) did not change but the linkages
between the components changed. For an incumbent, effectively developing a new product
in the face of an architectural innovation requires reconfiguring the established system in
ways that may be different from what the incumbent is used to (Henderson and Clark,
1990). This means that the incumbent's departments may have to interact with each other
or with external organizations in ways that are different from what their established
routines and procedures were designed for (Henderson and Cockburn, 1994).
In particular, the following two factors can pose problems for an incumbent that
faces an architectural innovation. First, since the core concepts that underpin components
have not changed, incumbents may not see the need to change and are therefore not likely
to make an effort to look for subtle architectural changes (Cyert and March, 1963;
Henderson and Clark, 1990). They may be so blinded by their existing dominant managerial
logic that they miss the key signals needed to recognize the potential of the change (Bettis and
Prahalad, 1995; Tripsas and Gavetti, 2000). For example, Ken Olsen, former CEO of
minicomputer maker, Digital Equipment Corporation, is often quoted as asking back when
PCs first surfaced, "Why would anyone want a home computer?" If and when incumbents
see the need to change, they may not know exactly what it is that must be changed, a
problem akin to the causal ambiguity of the resource-based view of the firm.
Effectively, because firms are boundedly rational, incumbents pursue the same routines and
procedures that they have used to exploit the existing technology. It is business as usual for
the incumbent, even though architectural change may require otherwise. For example, since
some of a firm’s knowledge of linkages between components is rendered obsolete, the firm
is likely to lose some of its absorptive capacity (Cohen and Levinthal, 1990; Zahra and
George, 2002). However, because the firm does not realize that it needs new related
knowledge, it is going to be less effective. Second, routines, especially those that have been
used successfully in the past, can be difficult to change. Anyone who has tried to break an
old habit knows how difficult this can be. Also, even if an organization can successfully
unlearn old ways of doing things and wants to move on to newer routines, prior
commitments (Ghemawat, 1991) or governance inseparability (Argyres and Liebeskind,
1999) can also prevent the organization from pursuing new routines.
Thus, in the face of an architectural innovation, some of what a firm knows (e.g.,
the core concepts that underpin components) is useful in exploiting the new technology but
some of what it knows (some of the architectural knowledge) is not useful; in fact, it may
handicap the firm's efforts to exploit the new technology. In organizations where
departments have come to mirror product components and developed routines, procedures
and problem-solving strategies accordingly, these departments may see the change as
innocuous and not pay enough attention to the new interactions that they need to exploit
the architectural change. It would be business as usual at such organizations as they
continue to use the same routines, procedures and problem-solving strategies that they
developed and used to attain their existing leadership positions. Attackers who do not see it
as business as usual and have no legacy of established routines are likely to exploit the
architectural change and displace leading incumbents (Leonard-Barton, 1992; Nelson and
Winter, 1982). Effectively the architectural innovation framework takes an information
processing perspective and therefore assumes that firms which perform well in the face of a
technological change are firms which are able to search, find and process the relevant
information well in the problem-solving that takes place during innovation (Henderson and
Clark, 1990; Henderson and Cockburn, 1994).
Role of resources
An underlying assumption in the information processing view is that incumbents and
attackers can acquire the resources that they need if they want to and do not already have
them. However, the resource-based view of the firm suggests that some of the important
resources which a firm needs to exploit a technological change can be scarce, difficult to imitate and costly to acquire (e.g., Wernerfelt, 1984; Peteraf, 1993). This view is echoed
by Teece (1986) who argued that in the face of a new technology that is easy to imitate,
firms with tightly held but important complementary assets, not the inventors, stand to
profit from the technological change. In the face of an architectural change, a firm needs
resources for the systems reconfiguration that must take place (reconfiguration assets). It
also needs resources for interacting with the market for the resulting product and other relevant external parties (commercialization assets) (Abernathy and Clark, 1985). The type
of resources that a firm needs are a function of the technology that underpins its activities
and of the industry in which it competes. For example, in industries with technologies that
exhibit network externalities, a firm’s installed base, availability of complementary goods
and ability to learn are important resources (Schilling, 1998, 2002).
Thus, in exploring why leading incumbents lose their leadership to
attackers in the face of seemingly innocuous technological changes, it is also important to
incorporate the role of the new resources that a firm may need.
FRAMEWORK AND THEORY
Consider an incumbent that has an established design of a product with components A, B,
C, D and E as well as an established base of complementors and customers. Performing the
activities that allow the firm to design the product and position it so as to appropriate the
value from it requires both knowledge of the core concepts that underpin each component
and knowledge of how to configure the system (link the components) (Henderson and
Clark, 1990; Iansiti, 1993). It also requires knowledge of linkages with complementors and
customers as well as non-knowledge resources. For example, designing and building
computers requires not only knowledge of the core concepts that underpin the
microprocessor, main memory, input/output devices, software and secondary memory, but
also knowledge of the interaction between these components. It also requires resources
such as skilled engineers, relationships with suppliers, software developers, equipment and
knowledge of the markets it serves. Since the design in question is an established one, the
firm's different departments have developed the routines, skills and other capabilities that
are rooted in the existing configuration of the system (Nelson and Winter, 1982). Now,
suppose there is a change in one of the components, say A, that triggers changes in the
linkages between components B, C, D and E, as well as in the linkages between the firm
and its complementors and customers. The extent to which these changes impact an
incumbent's competitive advantage is a function of (1) the amount of systems
reconfiguration required as a result of the changes in A, and (2) the extent to which the
changes triggered by A require scarce and difficult-to-imitate resources (that the incumbent
does not have) so as to offer superior customer value and appropriate the value. We have
shown these variables in Figure 1 with “the amount of systems reconfiguration required by
the change” as the vertical axis and “the degree to which systems reconfiguration requires scarce and difficult-to-imitate resources that incumbents do not have” as the horizontal axis.
Systems reconfiguration
The amount of systems reconfiguration required to design an optimal system, following the
changes triggered by a component, depends on the breadth and depth of the changes so
generated in the system. Breadth here is defined as the number of linkages that experience
changes triggered by the component in question. For example, if A causes changes in B, C,
D and E while B causes changes only in A and C, the changes caused by A are said to have
more breadth than those caused by B. Depth is defined as the extent to which the changes
triggered in linkages force changes in components. For example, a change in one component
that leaves all components relatively intact has less depth than one which forces some of
the components to change as well.
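To make these two definitions concrete, the following minimal sketch (in Python, purely illustrative and not part of the original analysis) represents the linkages impacted by a change and the components it forces to change, and computes breadth and depth. The linkage pattern for components A and B follows the example above; the forced component changes are hypothetical placeholders.

```python
# Illustrative sketch of "breadth" and "depth" as defined above.
# The linkages impacted by A and B follow the example in the text;
# the forced component changes are hypothetical placeholders.

impacted_linkages = {
    "A": {("A", "B"), ("A", "C"), ("A", "D"), ("A", "E")},  # A touches B, C, D and E
    "B": {("A", "B"), ("B", "C")},                          # B touches only A and C
}

forced_component_changes = {
    "A": {"C", "E"},  # hypothetical: A's change forces C and E to be redesigned
    "B": set(),       # hypothetical: B's change leaves the other components intact
}

def breadth(component: str) -> int:
    """Number of linkages whose design is altered by a change in `component`."""
    return len(impacted_linkages.get(component, set()))

def depth(component: str) -> int:
    """Number of other components forced to change by a change in `component`."""
    return len(forced_component_changes.get(component, set()))

for c in ("A", "B"):
    print(f"Change in {c}: breadth = {breadth(c)}, depth = {depth(c)}")
# A change in A has greater breadth (4 vs. 2) and, under these placeholder data,
# greater depth (2 vs. 0) than a change in B.
```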
High breadth means more linkages are impacted by the changes triggered by a
component. The more linkages that are impacted, the greater the chances that an incumbent
will miss an important subtle change in a linkage or not know what it is that has to be
changed about the linkage to get things right during reconfiguration. The more linkages, the
greater the chances that an incumbent's routines may have to be changed for more effective
interaction during reconfiguration. For example, if replacing a plane's engine with a new one
only impacts the linkages between the engine and the avionics (electronics)—and not the
linkages between the engine and the fuselage—the designer of the plane need worry only
about how the engine group would interact with the electronics group. It need not worry
about its routines for dealing with the fuselage group. The more linkages that are impacted,
the more an incumbent may also need new resources that it does not already have. Getting
all these things right or acquiring the necessary scarce resources takes time. Thus we can
expect that the higher the breadth, the longer it will take an incumbent to exploit the change. This increases the chances that an attacker with the key scarce resources will succeed in eroding an incumbent’s competitive advantage.
High depth means that the changes triggered by a component have a large impact on
each of the other components of the system. If the impact is high enough, the component
may need to be changed considerably or completely replaced. For example, the change from
propeller engine to jet engine required considerable changes to be made to the fuselage of
airplanes to get the reconfiguration of the airplane right. If changes have to be made to other
components, an incumbent must first figure out exactly what the changes should be. Doing
so may not always be easy since the incumbent may be handicapped by its established
routines (Leonard-Barton, 1992; Afuah, 2000). Once an incumbent determines what needs
to be changed in these other components, it also has to deal with the process of making the
changes. At the extreme, these other components may have to be replaced with new ones.
Doing so may mean that an incumbent has to modify its relationships with its suppliers or
find new suppliers. Either way, the firm may need new routines and resources that are
specific to these other components, both of which can be scarce, difficult to imitate, and time-consuming to build.
Resources
Reconfiguring a system requires resources. Let us call such resources reconfiguration
resources. These include equipment, people, intellectual property, and relationships with
suppliers, customers and complementors. In many cases, incumbent leaders already have
these reconfiguration resources (Mitchell, 1989; Tripsas, 1997; Rothaermel, 2001). In other
cases, incumbents do not have some of the critical new resources that are needed for
optimal reconfiguration of the system. If neither incumbents nor attackers have the new reconfiguration resources and the level of reconfiguration required is high, attackers still have an advantage in securing and using these resources, for the following reasons.
First, incumbents may be handicapped by their existing routines, information filters, and
problem-solving strategies in identifying whether they need new resources or not and
which resources they need (Bettis and Prahalad, 1995; Henderson and Clark, 1990). For
example, prior commitments (e.g., contracts) made in exploiting the existing technology
may prevent incumbents from making new investments or new important contracts
(Argyres and Liebeskind, 1999). They are also likely to be handicapped in acquiring the
resources and using them during the reconfiguration process (Leonard-Barton, 1992; Hamel
and Prahalad, 1994). Attackers do not have such handicaps. Second, attackers may already
have these new resources. Thus, where high depth and high breadth changes trigger the need
for scarce resources that incumbents do not have, the chances that attackers will erode
incumbent advantages are even better.
In addition to reconfiguration resources, firms usually need commercialization
resources such as brand name reputation, distribution channels, installed base and
complementary products to position themselves vis-à-vis their competitors and
appropriate the value from reconfiguration (Afuah, 2003). If the reconfigured product
requires new ‘commercialization’ resources that incumbents do not have, the same
arguments detailed above for why attackers have an advantage over incumbents in building
and exploiting reconfiguration resources hold for commercialization resources. Additionally,
since it takes incumbents longer than attackers to reconfigure and launch a system,
attackers have a chance to build commercialization resources and take an early lead. If the
technology exhibits network externalities, for example, attackers can parlay their initial lead
into an advantage that can last until the next major technological change (Schilling, 1998,
2002; Arthur, 1989).
Since the performance of a system depends on its components and the linkages
between them, the reconfiguration that usually follows high depth and breadth changes is
likely to result in a system whose performance is different from the original system. The
key word here is ‘different’ since reconfiguration can result in a system whose performance
falls above as well as below the performance requirements of existing customers. For
example, the PC’s performance fell below that of mainframe and minicomputers. Where the
reconfigured system’s performance is very different, the new product may open up a new
market which might require different commercialization resources (distribution channels,
etc.).
Leadership changes
In summary, the systems reconfiguration that often follows a technological innovation also
requires resources in addition to the body of knowledge that underpins the innovation.
Therefore, whether industry leaders maintain their leadership positions or lose them to attackers in the face of a technological innovation is a function of both the amount of
systems reconfiguration that is needed and the resources that are needed to reconfigure the
system and appropriate the value from it. Figure 1 captures the likely outcomes of the
competition between industry leaders and attackers. In Quadrant I, incumbents not only may have problems with reconfiguration, they also lack the scarce and difficult-to-imitate resources. Attackers are therefore more likely than current leaders to be the first to
successfully reconfigure systems and build relevant commercialization assets in the face of
high-depth and high-breadth architectural changes that require scarce resources that
incumbents do not have. Therefore, we can expect attackers to be most likely, on average,
to take over the leadership positions from incumbents in the face of such technological
changes, especially in industries where first-mover advantages or lockout are possible
(Arthur, 1989; Schilling, 1998, 2002).
In Quadrant II, the amount of systems reconfiguration required is low (as a result of
the low-depth and low-breadth changes) and the resources required for optimal
reconfiguration and appropriation of value are scarce and incumbents do not have them.
Thus, those attackers who happen to have these resources are more likely to successfully
reconfigure the system and put themselves in a position to perform better than incumbents
in exploiting the technological change. Quadrant III shows the case where the systems
reconfiguration required is low and no new scarce resources are needed that incumbents
do not already have. (That is, if any resources are needed, incumbents already have them.)
Clearly, incumbents are likely to maintain any competitive advantages that they had prior
to the technological change. In Quadrant IV, the amount of systems reconfiguration
required is high but the reconfiguration does not require scarce resources that incumbents
do not already have. (If any scarce resources are needed, incumbents already have them.) In
such a case, one of two things is likely to happen. Attackers are more likely to successfully
reconfigure the system and put themselves in a position to perform better than incumbents
in exploiting the technological change, unless incumbents have the important resources. If
incumbents have the relevant resources, they are more likely to reinforce their existing
competitive advantages.
(Insert Figure 1 here)
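The predictions summarized in Figure 1 can be restated as a simple lookup. The sketch below (in Python, offered only as an illustration) maps the two dimensions of the figure to the quadrant outcomes described above; it introduces no assumptions beyond the figure itself.

```python
# Sketch restating Figure 1: (systems reconfiguration required,
# need for scarce resources that incumbents lack) -> likely outcome.
OUTCOMES = {
    ("high", "high"): "Quadrant I: attackers with the resources are most likely to win",
    ("low", "high"): "Quadrant II: attackers with the scarce resources are likely to win",
    ("low", "low"): "Quadrant III: incumbents are likely to keep their lead",
    ("high", "low"): "Quadrant IV: attackers are more likely to win unless incumbents "
                     "hold the important resources",
}

def predicted_outcome(reconfiguration: str, scarce_resources_needed: str) -> str:
    """Return the framework's prediction for one combination of the two dimensions."""
    return OUTCOMES[(reconfiguration, scarce_resources_needed)]

# Example: a high-reconfiguration change that also demands scarce resources the
# incumbent lacks falls in Quadrant I.
print(predicted_outcome("high", "high"))
```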
RESEARCH METHOD
We constructed a technical history of the video game console industry and built a database
using data from different websites on the Internet, personal interviews with industry
officials, and company publications. The history and data covered the years 1977 to 2003.
Our dataset includes entries on 42 companies, game console technical attributes (CPU characteristics, clock speed, RAM, ROM, resolution), and the cumulative number of games and of game developers for each system.
VIDEO GAME CONSOLE INDUSTRY
Historical leadership changes in the Video Game industry
We chose the video game industry to explore our theoretical framework for two reasons.
First, the introduction of each generation of microprocessors has resulted in the overthrow
of the incumbent leader in the industry. Following the introduction of each generation of
microprocessors since 1977, a different firm has used the more advanced processor to
develop a new video game console to topple the incumbent industry leader. Nintendo took
over the leadership position from Atari with the introduction of the 8-bit microprocessor;
Sega took over the leadership position from Nintendo with the introduction of the 16-bit
microprocessor, only to see its own leadership position eroded by Sony with the
introduction of the 32-bit microprocessor. Second, the introduction of a new generation of
microprocessors is not a radical technological innovation to video game console makers; it
does not change their methods and materials in novel ways.
History
The home video game industry traces its origins to the early 1970s, when the first
dedicated video game machines appeared, first for the arcade and subsequently for the
home market. The very first home video game system was introduced in 1972 by a
company called Magnavox (a U.S. subsidiary of Philips). Many other firms entered the
market so that by 1973, there were more than 25 companies in the market. Because of the
limitations of the technology, the game was “hardwired” into the console so that each
console could only play one game. Effectively, the game console (hardware) and games
(software) were one and the same thing. The functionality of these very first systems was
rudimentary—they supported no sound and had no counter—so players had to keep their own scores. (Some later versions of this first generation of video games added sound and
counters.)
The video game console
The major components of a video game console are: the CPU (central processing unit),
main memory or RAM (random access memory), Input/Output (I/O) devices, and software
(games played on the console). Usually, the parts of a game needed are brought in from
secondary storage such as a cartridge or compact disc (CD). The CPU is usually made up
of a microprocessor and a co-processor. The RAM provides temporary storage for a game
when it is being played. I/O devices are the components that enable the interaction between
the console and external environment. Building a video game console requires knowledge of
these components and knowledge of how to configure them to produce a video game
system. It also requires the resources to acquire the components, configure the system and
make money from it. These include relationships with video game makers and customers.
Reconfiguration introduced by new generations of microprocessors
The introduction of each generation of microprocessors has typically resulted in two major improvements that have had a major impact on the quality of video games. First, the
instruction and data widths, a measure of the size of the individual instructions and data that the CPU can process per cycle, have increased from one generation to the next—from 4-bit to 8-bit to 16-bit to 32-bit to 64-bit to 128-bit. The higher the
number of bits, the larger the amount of addressable memory that the microprocessor can
work with.[1] And the larger the addressable memory that a processor can handle, the larger
the datasets that the processor can manipulate. As illustrated in Table 1, moving from an 8-bit processor to a 32-bit processor increases the addressable memory from 256 bytes to 4,294,967,296 bytes, a huge improvement. Being able to manipulate larger datasets
facilitates the development of games that are more life-like. After all, digital images are a
manipulation of ones and zeros—of data. Second, the clock speed, popularly quoted in MHz, usually increases drastically with the introduction of each generation of
microprocessors (see Table 1).[2] This increase in clock speed means an increase in the rate
at which data can be manipulated. The increase in clock speed, in and of itself, generates
enough improvements in data manipulation to make games more life-like from one
generation of microprocessors to the other. When combined with larger word lengths, the
resulting images are even more life-like. Effectively, the combined effect of both factors on
the ability to create games of more life-like quality is dramatic, giving game developers a
chance to develop games that bring more life to the screen.
(Insert TABLE 1 Here)
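The arithmetic behind Table 1 is simply 2^n addressable bytes for an n-bit address width. The short Python sketch below reproduces that calculation for the generations discussed in the text; it is offered only as a numerical check and assumes nothing beyond the bit widths listed in the table.

```python
# Numerical check of the addressable-memory arithmetic behind Table 1.
GENERATIONS = [4, 8, 16, 32, 64, 128]  # address/instruction width in bits

def addressable_bytes(bits: int) -> int:
    """Number of directly addressable bytes for a given address width."""
    return 2 ** bits

for bits in GENERATIONS:
    print(f"{bits:>3}-bit: {addressable_bytes(bits):,} bytes")

# The move from 8-bit to 32-bit alone multiplies the addressable space by
# 2**24, i.e. roughly 16.8 million times.
print("8-bit to 32-bit factor:", addressable_bytes(32) // addressable_bytes(8))
```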
Impact on other components of the video game console system
The dramatic improvements in word length and clock speed that each generation of microprocessors introduced triggered changes in the linkages between the microprocessor
and the other components of the video game console. In particular, a lot more bits of
information could now flow between the microprocessor and the memory or input/output
units. This in turn, required faster memory and input/output units for optimal system
performance. In the transition from discrete transistor game consoles to 8-bit consoles,
memory cartridges were available to serve as secondary memory and as part of the input/output unit
(see below). In the transition from 16-bit consoles to 32-bit consoles, the CD was also
available for use as secondary memory (see below). Effectively, the memory cartridge was
a brand new component for 8-bit consoles while the CD was a brand new component for 32-bit consoles.
[Footnote 1: There are computer tricks for addressing more memory than a processor's bit width would suggest, but they have high costs, including speed.]
(The 16-bit console used cartridges but it was no longer a brand new
component since it had been introduced a generation earlier with the 8-bit console.)
Therefore, the amount of reconfiguration that was required for 8-bit and 32-bit consoles was greater than that for the 16-bit generation. Thus, according to Henderson and Clark
(1990), we can expect incumbents to experience more architectural innovation problems
with the 8-bit and 32-bit than with the 16-bit, thereby opening up more chances for
attackers in the former than the latter.
ATTACKERS, NEW RESOURCES AND INCUMBENT RECONFIGURATION
HANDICAPS
Attackers
Nintendo and the 8-bit era. 8-bit processors, made from integrated circuits, introduced
two major changes in video game consoles. First, having 8-bit data and instruction widths
meant game console makers could now address up to 2^8, or 256, bytes of memory, compared to essentially none in the pre-8-bit era. Effectively, programmability had been introduced into video
games. That is, instead of hardwiring a game into the console so that only one game could
be played on a console, many games could be loaded into the programmable memory of the
console, giving the console the flexibility of playing many games, each one at a time. Each
game had to reside somewhere—in secondary memory—and get loaded into the console
only when a player wanted to play. This secondary memory turned out to be the memory
cartridge. Games could be stored in cartridges and loaded into the game console through an
input/output slot.
[Footnote 2: This increase in clock speed is itself a result of the many architectural improvements made to the processor.]
Separation of hardware from software also meant that independent game
developers could now develop games for each console.
The second major change was in the speed of the processor. Because 8-bit
processors were made from integrated circuits, they were much faster than their discrete
logic predecessors and could also be used to perform more tasks since many more
transistors had been etched into a smaller amount of space. With the flexibility from a larger
addressable memory and faster processor, game developers could develop games that were
more realistic than those that ran on the discrete logic systems of the previous generation.
Game developers could also take advantage of the possibilities of producing more realistic
games and more importantly, of the fact that more than one game could be played on the
same game console.
Consumers, many of whom were arcade game players, now had the choice of many
games per console that they could play at home. The number of console units sold grew from approximately 4 million in the pre-8-bit era to more than 10 million with the 8-bit generation (Game Investor, 2003). The primary targets were still kids (Harrington, 1988). Nintendo, a new entrant, had
several things (resources) in its favor. First, as the largest toy company in Japan, it had
developed distribution channels to children—the primary target for the 8-bit consoles.
From an earlier relationship with Disney, Nintendo had also established some distribution
channels to the 8-15 year olds in the US (Yoder, 1986). Second, in 1981, Nintendo had had
a video game hit called Donkey Kong in the arcade market. The characters for this game had
become very popular among children. Nintendo’s first home game, ‘Super Mario Bros.’ (bundled with the 8-bit system), was designed by the designers of ‘Donkey Kong’ and built on the already popular arcade game. ‘Super Mario Bros.’ turned out to be even more
popular than Mickey Mouse among US children at the time and it was licensed for use on
everything from food to linen. The games, relationships with developers from the arcade
era and distribution channels to children served Nintendo well.
Sega and the 16-bit era. Using 16-bit processors meant a lot more addressable memory
and faster processing speed than 8-bit. Game developers could develop even more realistic
games than they could with 8-bit processors. The main target for video game consoles was
still children. No new secondary memory was available at the time; cartridges could still be
used. Thus, as pointed out earlier, the amount of systems reconfiguration needed with 16-bit consoles was not as high as with the 8-bit generation, where the cartridge had been a brand new
component. Sega entered the market in 1986 with an 8-bit machine. In 1989 it launched a
16-bit machine (the Sega Genesis, also known as the Mega Drive). When it introduced the 16-bit machine, Sega made sure
that it was compatible with its 8-bit machine. This backward compatibility meant that
owners of Sega’s 8-bit games could play them on its new 16-bit machines. This made many
games immediately available for the new system; its competitors’ machines were not backward compatible. At
that time, Sega was also the world’s largest arcade video-game maker.
Sony and 32-bit era. As with other generations, the introduction of the 32-bit
microprocessor offered even more addressable memory. Now, 4 Gigabytes of memory
could be addressed. The chip was also faster. The CD, a faster device with 20 times the
memory capacity of cartridges, had evolved to a point where it could replace the cartridge
as the secondary memory in consoles (Coughlan, 2001). Sony, a new entrant, had several
things (resources) in its favor. Sony’s previous experience with the CD came in handy when
the company decided to enter the video game console business. Sony had teamed up back
in 1982 with Philips to lay the foundation for CDs and Sony used CDs in many of its
consumer products. For even more sophisticated games, Sony used memory cards which it
had also developed in earlier ventures. The memory cards worked in conjunction with CDs.
The much more sophisticated games that could be played with 32-bit machines and CDs
opened up a new market for adults. The ability to incorporate dialogue and film clips into
games was new and very successful among adults. Hit movies could be translated into hit
games. Sony, with its history of successful consumer products, already had distribution
channels to adults. In fact, Sony’s PlayStation was initially sold successfully in music stores and electronics stores, and only marginally in video game stores.
Software as key resource
Competition in the video game industry, especially in the later generations, was largely on
software (Pereira, 1990). As Table 2 shows, each attacker had more software titles than the
leading incumbent that it displaced. In all three cases, there is a significant difference
between the average number of game titles developed for the attacker’s console and those
developed for the incumbent leader’s console. In the 8-bit era, Nintendo (the attacker) had
an average of 27.9 game titles a year compared to incumbent Atari’s 7 game titles a year. In
the 16-bit era, Nintendo, now the leading incumbent, had an average of 33 game titles while
Sega had 45. Sega was also an incumbent, having offered 8-bit consoles earlier. As stated
earlier, Sega’s 16-bit machines were backward compatible and therefore games developed for Sega’s 8-bit system could also be played on the new 16-bit console. In the 32-bit era,
Sony, the attacker, had an average of 107.2 game titles compared to Sega’s 22.5. These data
suggest that the winner had more game titles.
(Insert Table 2 here)
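Table 2 rests on two-sample t-tests of yearly game-title counts. A minimal sketch of how such a comparison can be computed is shown below; the yearly counts are hypothetical placeholders rather than the paper's data, and the scipy library is assumed to be available.

```python
# Hypothetical sketch of the Table 2 comparison: a two-sample t-test on the yearly
# numbers of game titles for an incumbent's console versus an attacker's console.
# The yearly counts below are invented placeholders, not the paper's data.
from scipy import stats

incumbent_titles_per_year = [5, 6, 7, 8, 9, 7]
attacker_titles_per_year = [15, 22, 30, 35, 28, 38]

t_stat, p_value = stats.ttest_ind(attacker_titles_per_year, incumbent_titles_per_year)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
# A large positive t-statistic with a small p-value would indicate that the attacker's
# console attracted significantly more titles per year, as Table 2 reports for all
# three generational transitions.
```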
Incumbents and reconfiguration handicaps
As we have argued above, each new generation of microprocessors triggered changes in the
linkages between the processor and other components of the video game console as well as
in the other components themselves. In two of those generations, a brand new component
was introduced—the memory cartridge in the 8-bit generation and the CD in the 32-bit
generation. According to Henderson and Clark (1990), we can expect incumbents in these
two markets to experience more reconfiguration problems than those in the 16-bit era
where no brand new component was required. This is partly because each of these new components is actually a system in its own right, and designing it into a new video game console requires linkages with the new system that are new to the incumbent and its established routines for searching, filtering and assimilating knowledge. Linkages between
the new component and console are also more likely to require different problem-solving
routines than incumbents’ existing ones. Thus, according to the theory which we detailed
earlier, we can expect new entrant attackers to be more likely to displace incumbents in the
8-bit and 32-bit eras whereas in the 16-bit era, we can expect an incumbent with the new
resources to win. As shown in Figure 2, that was indeed the case. Nintendo, a new entrant
attacker, displaced the incumbent Atari in the 8-bit era. Sega, an incumbent, displaced Nintendo,
another incumbent, in the 16-bit era, and Sony, a new entrant attacker, displaced the incumbent Sega in the 32-bit era.
(Insert Figure 2 here)
SUMMARY, DISCUSSIONS AND CONCLUSIONS
We have offered a framework for why seemingly innocuous technological changes can erode
the competitive advantage of leading firms. We have argued that what appears to be a minor
change in a “component” can trigger changes not only in the linkages between components
but also in other components. During the reconfiguration that follows, these changes in
linkages and components can be deep and widespread enough to cause architectural
innovation problems (and more) for incumbents. More importantly, the changes can trigger
the need for valuable, scarce and difficult-to-imitate resources that incumbents do not have.
If attackers have the scarce and difficult-to-imitate resources and leading incumbents do not
have them, incumbents are faced with both the reconfiguration problems associated with
architectural innovation (Henderson and Clark, 1990) and the need to acquire scarce
difficult-to-imitate resources and integrate them into their existing systems and routines.
Attackers therefore can replace incumbent leaders in the face of a seemingly innocuous
technological change.
We applied our framework to the video game industry where each introduction of a
new generation of microprocessors has resulted in a change in leadership in the video game
console industry. We argued that a new microprocessor triggers changes in other
components of the video game console and the linkages between them. In all the cases, a
new microprocessor opened up the possibilities for more lifelike games and new markets,
establishing the need for new games and, sometimes, new distribution channels to the new
markets opened up by the newer games. In two of the three transitions that we studied, the
introduction of a new processor resulted in the introduction of a new component into the
video game system. In the discrete transistor to 8-bit transition, the memory cartridge was
introduced and in the 16-bit to 32-bit transition, the compact disc (CD) memory was
introduced. We argued that in these cases, the amount of reconfiguration required was more
than that required during the transition from 8-bit to 16-bit. Thus the winning attackers
were likely to be new entrants in the first case and incumbents in the second case. Indeed,
Nintendo, the winner in the 8-bit market, and Sony, the winner in the 32-bit market, were
both new entrant attackers while Sega, the winner in the 16-bit era, was an incumbent
attacker.
The past research that has explored the impact of technological change on
incumbent leaders has focused on the role that incumbents’ existing resources or dominant
logic play in the face of a technological change. It is true that in the face of certain types of
technological changes, an incumbent’s existing resources and dominant managerial logic can
become a handicap in exploiting the new technology. But it is also true that new
technologies often require new resources. Thus our paper adds to and extends this body of
research by exploring the role of the new resources that firms need to successfully exploit
new technologies.
Brusoni, Prencipe and Pavitt (2001) have argued that firms need to know more than what they make requires. Our research suggests that, on the one hand,
such excess knowledge and associated physical assets can help a firm in the face of certain
technological changes since such excess knowledge and associated physical assets can serve
as the new assets needed or as the absorptive capacity to acquire them. On the other hand,
the excess knowledge and associated assets can also become a handicap. Future research
could explore when excess knowledge can hurt or help.
Dynamic capabilities have been described as capabilities that allow a firm to quickly
build the capabilities that it needs to exploit a technological change. Our research points out only that incumbent leaders that do not have the new resources required are likely to lose their leadership positions. Future research could explore why some firms
had the scarce resources and others did not. This would add to the little that we know
about dynamic capabilities.
In the theory part of the paper, we hinted that “component” can refer to a physical component of a system as well as to the different departments, functions, and
groups within a firm or the different organizations in a network. However, in exploring the
video game console industry, we concentrated on the physical component aspects of the
theory. We assumed, as had Henderson and Clark (1990), that incumbent departments had
come to mirror the components of their system while the interactions between departments
had come to mirror the linkages between components. Future research could explore
“components” where they mean functional or other organizational departments.
REFERENCES
Abernathy, W.J. and Clark, K.B. (1985), ‘Mapping the winds of creative destruction’,
Research Policy, 14, pp: 3-22
Afuah, A. (2003), ‘Redefining firm boundaries in the face of the Internet: are firms really
shrinking?’, Academy of Management Review, 28 (1), pp: 34-45
Afuah, A. (2000), ‘Do your co-opetitors’ capabilities matter in the face of a technological change?’, Strategic Management Journal, 21 (3), pp: 378-404
Afuah, A. and Bahram, N. (1995), ‘The hypercube of innovation’, Research Policy, 24 (1), pp: 51-66
Allen, T. (1984), ‘Managing the flow of technology’, MIT press, Cambridge
Anderson, P. and Tushman, M. L. (1990), ‘Technological discontinuities and dominant
designs: a cyclical model of technological change’, Administrative Science Quarterly,
35 (4), pp: 604-633
Argyres, N. S., and J. P. Liebeskind (1999). 'Contractual commitments, bargaining power,
and governance inseparability: Incorporating history into transaction cost theory',
Academy of Management Review, 24 (1), pp. 49-63.
Arthur, W. B. (1989), ‘Competing technologies, increasing returns, and lock-in by historical events’, Economic Journal, 99, pp: 116-131
Bettis, R.A. and Prahalad, C.K. (1995), ‘The dominant logic: retrospective and extension’,
Strategic Management Journal, 16 (1), pp: 5-20
Brusoni, S., Prencipe, A. and Pavitt, K. (2001), ‘Knowledge specialization, organizational
coupling, and the boundaries of the firm: why do firms know more than they
make?’, Administrative Science Quarterly, 46, pp: 597-621
Cohen, W.M and Levinthal, D.A. (1990), ‘Absorptive capacity: a new perspective on
learning and innovation’, Administrative Science Quarterly, 35 (1), pp: 128-153
Coughlan, P.J. (2001), ‘Competitive Dynamics in Home Video Games (E): the Rise of
3DO and 32-bit gaming’, Harvard Business School, Case # 9-701-095
Clark, K.B. and Fujimoto, T. (1991), ‘Product development performance: strategy,
organizations, and management in the world auto industry’, Harvard Business
School Press, Boston, MA
Cyert, R. M., and J. G. March. (1963), A Behavioral Theory of the Firm. Englewood Cliffs,
NJ: Prentice-Hall.
Game Investor (2003), www.gamesinvestor.com. Accessed in 2003.
Ghemawat, P. (1991), ‘Commitment: The Dynamics of Strategy’, New York: Free Press.
Granstrand, O., Patel, P. and Pavitt, K. (1997), ‘Multi-technology corporations: why they
have distributed rather than distinctive core competencies’, California Management
Review, 39 (4), pp: 8-25
Harrington, R. (1988). ‘Now, play’s the thing; toy time: it’s the toast of Christmas ‘88’,
The Washington Post, Nov. New York.
Helfat, C.E. and Peteraf, M.A. (2003), ‘The dynamic resource-based view: capability
lifecycles’, Strategic Management Journal, 24 (10).
Henderson, R. M. and Cockburn, I. (1994), ‘Measuring competence? Exploring firm effects
in pharmaceutical research’, Strategic Management Journal, 15, pp: 63-84
Henderson, R.M. and Clark, K.B. (1990), ‘Architectural innovation: the reconfiguration of
existing product technologies and the failure of established firms’, Administrative
Science Quarterly, 35, pp: 9-30
Hill, C. W. L. and F. T. Rothaermel, (2003), ‘The performance of incumbent firms in the
face of radical technological innovation’, Academy of Management Review, 28 (2),
pp: 257-274
Iansiti, M. (1993), ‘Real-World R&D. Jumping the product generation gap’, Harvard
Business Review May-June, pp: 138-47.
Leonard-Barton, D. (1992), ‘Core capabilities and core rigidities in new product
development’, Strategic Management Journal, 13, pp: 3-25
Mitchell, W. (1989). 'Whether and when? Probability and timing of incumbents' entry into
emerging industrial subfields’, Administrative Science Quarterly, 34, pp: 208-230.
Nelson, R and Winter, S. (1982), ‘An evolutionary theory of economic change’, Harvard
University Press, Cambridge (MA)
Pereira, J. (1990), ‘For video games, now it’s a battle of bits’, Wall Street Journal, Jan.,
New York
Peteraf, M.A. (1993), ‘The cornerstones of competitive advantage: a resource based view’,
Strategic Management Journal, 14, pp: 179-191
Rothaermel, F.T. (2001), ‘Incumbents’ advantage through exploiting complementary assets
via interfirm cooperation’, Strategic Management Journal, 22, pp: 687-699
Schumpeter, J.A. (1934), ‘The theory of economic development’, Boston, Harvard
University Press
Schilling, M. A. (1998), ‘Technological lockout: An integrative model of the economic and strategic factors driving technology success and failure’, Academy of Management Review, 23, pp: 267-284
Schilling, M. A. (2002), ‘Technology success and failure in winner-take-all markets: The impact of learning orientation, timing, and network externalities’, Academy of Management Journal, 45, pp: 387-398
Teece, D. (1986), ‘Profiting from technological innovation: implications for integration,
collaboration, licensing and public policy’, Research Policy, 15, pp: 285-306
Tripsas, M. (1997), ‘Unraveling the process of creative destruction: complementary assets
and incumbent survival in the typesetter industry’, Strategic Management Journal,
18, pp: 119-142
Tripsas, M. and Gavetti, G. (2000), ‘Capabilities, cognition, and inertia: evidence from
digital imaging’, Strategic Management Journal, 21, pp: 1147-1161
Tushman, M.L. and Anderson, P. (1986), ‘Technological discontinuities and organizational
environments’, Administrative Science Quarterly, 31, pp: 439-465
Venkatraman, N. and Lee, C.-H. (2003), ‘System-based competition and indirect network effects: Empirical test of the US video game industry, 1976-2002’, Academy of Management Proceedings, 2003 (Seattle).
Zahra, S.A. and George, G. (2002), ‘Absorptive capacity: a review, reconceptualization,
and extension’, The Academy of Management Review, 27 (2), pp: 185-203
Yoder, S.K. (1986), ‘Nintendo Co. Isn’t just playing games; it takes on Japan’s computer
giants’, The Wall Street Journal, Jun., New York.
TABLE 1
Improvements in microprocessor speed and addressable memory

Microprocessor   MHz range     Addressable memory
4-bit            --            2^4 = 16 bytes
8-bit            1.02-4.00     2^8 = 256 bytes
16-bit           2.8-16        2^16 = 65,536 bytes = 64 Kbytes
32-bit           12.5-200      2^32 = 4,294,967,296 bytes = 4 Gbytes
64-bit           26.29-162     2^64 = 18,446,744,073,709,551,616 bytes (about 16 billion Gbytes,
                               roughly 4 billion times the 32-bit addressable space)
128-bit          104-733       2^128 (about 3.4 x 10^38 bytes)
TABLE 2
A comparison of the number of games for each console

                        8-bit             16-bit            32-bit
Leading incumbent       Atari             Nintendo          Sega
  Mean                  7                 33                22.5
  SD                    2.77              22.17             11.38
Attacker                Nintendo          Sega              Sony
  Mean                  27.9              45                107.2
  SD                    20.8              19.7              68.28
Stat. significance      t-value: 3.283    t-value: 1.840    t-value: --
                        p < 0.01          p < 0.05          p < 0.005
FIGURE 1
Likely outcomes of the competition between industry leaders and attackers.
Vertical axis: system reconfiguration required by the change in component (Low to High).
Horizontal axis: the degree to which systems reconfiguration requires scarce and difficult-to-imitate resources that incumbents do not have (Low to High).
Quadrant I (high reconfiguration, scarce resources needed): attackers with the resources most likely to win.
Quadrant II (low reconfiguration, scarce resources needed): attackers with the scarce resources likely to win.
Quadrant III (low reconfiguration, no new scarce resources needed): incumbents win.
Quadrant IV (high reconfiguration, no new scarce resources needed): attackers more likely to win unless incumbents have the important resources.
FIGURE 2
Leadership changes in the video game console industry mapped onto the framework of Figure 1 (axes as in Figure 1: system reconfiguration required by the change in component, vertical; the degree to which systems reconfiguration requires scarce and difficult-to-imitate resources that incumbents do not have, horizontal).
Nintendo takes over from Atari in 8-bit; Sony takes over from Sega in 32-bit; Sega takes over from Nintendo with the introduction of 16-bit.