Proceedings of the 40th Hawaii International Conference on System Sciences - 2007
Simulating Indirect Network Effects in the Video Game Market
Jochen Strube, Sven Schade, Patrick Schmidt, Peter Buxmann
Technische Universität Darmstadt
Chair of Information Systems
Hochschulstr. 1
64289 Darmstadt, Germany
{strube|schade|buxmann}@is.tu-darmstadt.de
Abstract
Since the 1970s, seven generations of consoles have
competed in the video games market. Today video
game industry sales even surpass the box-office results
of the movie industry. The history of video games
provides numerous examples of former successful
console vendors failing to establish technologically
superior successors of their consoles. This is mostly
due to dependencies of console manufacturers on
game publishers.
Therefore, this paper examines strategies of console
vendors against the background of indirect network
effects on the video game market. We apply a
simulation approach based on the agent-based
computational economics paradigm to simulate an
artificial video game market consisting of console
vendors, game publishers and customers. This enables
the examination of various competition scenarios. As
an example application of our model we evaluate
pricing strategies of console manufacturers, showing
that penetration pricing is a possible way to gain
ground in the market or even to increase revenues.
1. Introduction
In 1972, Magnavox released the Odyssey, the first
video game console that supported interchangeable
games. This was the starting point for the development
of today’s multi-billion dollar video game market, in
which console manufacturers strive for market share.
To reach this goal the console manufacturers compete
for customers and game publishers alike. Only
consoles with many supported games will attract
consumers and games will only be published for
consoles with a large customer base. Currently the
main vendors Microsoft, Nintendo and Sony battle for
dominance of their consoles in the video game market
and the outcome of this battle is yet unclear.
The video game market is an interesting subject for
economic research for several reasons. First, the
combined hardware-software-system of consoles and
games is well-suited for the examination of indirect
network effects. Games can only provide utility if they
are used in combination with the related consoles,
which in turn have no stand-alone utility. As games for
one console system, e.g. Playstation 2, usually cannot
be played on another console system, e.g. XBox, this
market is well suited for the examination of system
competition. Second, large empirical data on lifecycles
of games and consoles is available. Thus, the relevant
market can be delimited precisely and simulation
models can be developed close to reality. With
empirical data and appropriate adjustments of the
models it is possible to “replay” the console history or
predict future market outcomes. Third, the
mechanisms of this market are not yet well
understood. Even successful console vendors sometimes
fail to establish a new console generation. Sega for
instance was fairly successful with their Genesis
console, because it entered the market earlier than
other consoles of its generation. The same strategy
however failed for the Sega Saturn and Dreamcast.
Thus, it is important, especially for console vendors, to
know under which conditions which market strategy is
promising.
Against this background, we developed a
simulation model to evaluate the effects of console
manufacturers’ strategies on the video game market.
The goal of this model is to provide a simplified
representation of the real video game market in order
to broaden the understanding of indirect network
effects in this industry. We apply an agent-based
computational economics approach. This enables us to
model the video game market bottom-up: by modeling
the decision process of the market actors we can
observe the behavior of the entire market under given
circumstances. Our model is based on the concept of
indirect network effects.
0-7695-2755-8/07 $20.00 © 2007
1530-1605/07 $20.00 © 2007 IEEE
The paper is structured as follows: The next section
gives a description of network effects and introduces
the video game market. We also explain which typical
strategies console manufacturers apply to benefit from
network effects. Section 3 presents a model based on
an agent-based computational economics approach to
simulate a video game market consisting of console
vendors, game publishers and customers. Section 4
presents some simulation results on the strategy of
penetration pricing and the role of game licensing fees.
The paper finishes with an outlook on further research.
2. Network Effects in the Video Game
Market
2.1. Network Effects
The network effect theory has its main roots in
pioneering papers by Katz/Shapiro [16], [17] and
Farrell/Saloner [13], [14], although there are earlier
contributions on positive feedback effects or demand
externalities, e.g. [22] or [24].
According to Katz/Shapiro [16] and Economides
[12], network effects exist when “the utility that
a user derives from consumption of the good increases
with other agents consuming the good”. Concerning
the sources of these effects, they distinguish between
direct and indirect network effects.
Direct network effects refer to situations with direct
physical effects, mostly enabling the exchange of
information in communication networks. A prominent
example is a telecommunications network where
telephones or fax-machines are network effect goods.
A new actor might enter the network if his utility of
communication with some existing network
participants exceeds the costs for the telephone or fax
machine. Also, his entry generates utility for all actors
as they can communicate with him. Some further
examples for direct network effect goods are office
software, EDI standards or communication standards
in general.
Indirect network effects originate from
interdependencies between the consumption of a basic
good and the consumption of complementary goods
and services. They occur when a higher diffusion of a
good results in a broader variety of additional goods or
services. These could be consulting services available
for an ERP software, a broad variety of software for a
specific operating system or the availability of
programmers for a programming language. Other
typical examples for indirect network effects are
combined hardware-software systems, where the
hardware component does not have any stand-alone
utility and gains a value from facilitating available
software. This applies to video recorder systems and
video cassette formats, CD players and available CDs,
PCs and operating systems and application software as
well as to video consoles and available game titles.
In this paper we focus on the examination of the
video game market, which is characterized by multiple
incompatible console systems, short product cycles and
an apparent presence of indirect network effects. A
comprehensive contribution of Clements and Ohashi
[10] analyses hardware and software adoption based
on empirical data for the US video game market
between 1994 and 2002. In contrast to their approach,
this paper examines indirect network effects using a
simulation approach on an artificial video game market
to analyze the effects of different vendor strategies.
For a general survey of network effect theory, a
comprehensive state of the art, and introductions to
recent research, the interested reader is referred to
[11], [15], [18], [21] and [28].
2.2. The Video Game Market
The first generation of consoles included the
Magnavox Odyssey and Atari Pong. These devices
were followed by six generations of consoles. Every
generation consisted of different console types, which
were similar in performance. Figure 1 shows the
history of the major consoles in the U.S. market,
beginning with the second generation.
Figure 1. Console generations in the
U.S. market
As one can see, there have been numerous consoles
from different vendors over the past 30 years. Some
technically superior consoles like Jaguar or Dreamcast
could not gain ground in the market while others
dominated for several years although being technically
inferior. Therefore, the variety of video games seems
to be a more important success factor for consoles than
the technical capabilities.
Nowadays the video game industry has grown into a
major branch of the entertainment industry and even
surpassed the movie industry in sales in 2005.
Worldwide entertainment software sales reached
$18 billion in 2005 [3]. The worldwide interactive
entertainment market (consisting of console video
games, PC games, online games and dedicated portable
systems) amounted to about $28.5 billion in 2005 [5]. Other
sources estimate $35.3 billion in 2005 for the entire
game industry [4], forecasting more than $50 billion in
2007. The console sector made up about half of these
revenues in 2005, with $3.89 billion for hardware and
$13.05 billion for software [4], [6].
The actors in the market are depicted in figure 2.
Figure 2. Actors in the video game market
The major actors in this market are the console
vendors, game publishers and customers. Console
vendors, like Sony, Microsoft or Nintendo, develop
and produce consoles and sell them to the customers.
Customers buy video games from game publishers,
e.g. Electronic Arts, Activision or LucasArts as well as
complementary products, like special input devices or
game magazines, from third party vendors. The game
publishers are responsible for manufacturing and
marketing of the games. They obtain licenses from
console vendors as a legal basis for selling games
for a specific console [2]. As a license fee, a
share of the game price is passed from the publisher to the
console maker. Thus, the console vendors usually
participate in the revenues of the sold games. The
game publishers can employ internal development
studios or hire external game developers, e.g. Pyro
Studios, Rockstar North or Westwood Studios.
Given that consoles do not provide a stand-alone
value or utility, it is obviously difficult for console
vendors to place new console systems in the market.
On the one hand, customers will only buy a console
that offers a large game variety. On the other hand, the
game supply results from the activities of the game
publishers, who will more likely develop games for
widespread consoles, as these promise higher game
sales. Hence, they might not want to develop games for
a console type which has, or is expected to have, too
few customers. Thus, the next section presents some
strategies for console vendors to overcome this start-up
problem.
2.3. Strategies for console vendors
To gain market share, console manufacturers apply a
variety of strategies, which mostly aim at establishing
network effects. In the history of video games,
numerous examples for these strategies can be found.
Perhaps the most dominant strategy is penetration
pricing. This pricing technique sets a relatively low
initial entry price in order to gain market share. A
higher market share leads to greater game sales and
thus makes it more attractive for game publishers to
release new games for that console. This increases the
attractiveness of the console and thus the indirect
network effects. Most console vendors speculate that
the initial losses will later on be compensated by the
game licensing fees [1].
Another important aspect is the point of market
entry. If a manufacturer is the first to offer a console of
a new generation, he can use this time to increase the
installed base of the console. A greater installed base
will lead to a greater game variety and therefore to
higher network effects. Sony’s Playstation, for
example, was released in December 1994 in Japan and
in September 1995 in the rest of the world. This helped
Sony gain an advantage over Nintendo’s N64, which
was released a year later.
However, an early release date does not guarantee
market success: Sega’s Dreamcast console was
released more than a year earlier than Sony’s
Playstation 2. Though the Dreamcast console was
initially successful, an early press release by Sony led
consumers to postpone their buying decision.
Therefore, Sega’s Dreamcast could not establish a
sufficient installed base.
The predecessor of Dreamcast, the Sega Saturn,
faced a different problem. The design of its
technological architecture, with two CPUs and six
other processors, made it difficult to develop games.
This problem was aggravated by the lack of useful
software libraries and development tools. This made
game development expensive and led to fewer games
being released for this platform. Thus, a way to
strengthen the network effect is to support game
development. This was also an advantage for
Microsoft's Xbox, since the Xbox uses standards
similar to those of common PCs: the operating system is
based on Windows 2000, game development relies
on DirectX, and the CPU is manufactured by Intel.
Hence, it is easier to port a game originally developed
for PCs to the Xbox and vice versa.
A smart move of Sony was to secure exclusive
rights to Squaresoft's Final Fantasy VII. Final Fantasy
is a series of very popular – especially in Japan – role-playing
games, which hitherto had been released for
Nintendo consoles only. Exclusive game titles increase
the indirect network effects of a console and are a very
common strategy. Nintendo's Mario games are also
typical examples.
Furthermore, the Mario Series pinpoints a different
way to boost network effects: Nintendo develops a lot
of games itself (first-party game development), not
only to gain profits, but also to reduce the start-up
problem every console faces.
The start-up problem can also be reduced by
implementing downward compatibility. Sony's
Playstation 2 was compatible with the original
Playstation, and the Playstation 3 will most likely be
compatible with the Playstation 2. Most of the handheld
consoles from the Nintendo Game Boy line are also
downward compatible with their predecessors.
Another success factor of the Playstation 2 was that
it allowed for DVD playback. Since DVD players were
not very common in 2000, when the console was
released, this was a further incentive for consumers to
buy the system. In other words, the Playstation 2
enabled the use of other complementary goods and
thus benefited from additional indirect network effects.
Furthermore, there are other ways to create utility,
e.g. most vendors deliver a game with their console.
Such a product bundle already provides the user with
the utility of one game.
Recently direct network effects have also gained
importance for video game systems. An increasing
number of players participate in massive multiplayer
online games (MMOG). A MMOG is a game played
on the internet, which is capable of supporting
hundreds or thousands of players simultaneously.
Launched in 2002, Microsoft’s subscription-based
online gaming service Xbox Live was the first unified
platform offered by one of the current console
manufacturers. Meanwhile the remaining players in the
console market have announced similar services for
their next-generation consoles.
Table 1. Strategies of console vendors
included in our simulation model

Strategy | Included in model
Penetration pricing | Yes
Different game qualities | Yes
Game variety | Yes
Different complexity of game development | Includable with minor changes
First-party game development | Yes
Exclusive game titles | Yes
Point of market entry | Yes
Downward compatibility | Includable with minor changes
Audio/video media playable | Includable with minor changes
Direct network effects | No
Table 1 gives an overview of which of the
mentioned strategies can be examined with our model
and which can be included with little effort. We focus
on factors such as pricing, performance of the
consoles, installed base, game variety and game
quality. The details of the simulation model will be
described in the next section.
3. Simulating Video Game Markets
3.1. Methodology
Weitzel, Wendt and König [27] suggest the
application of agent-based computational economics
(ACE) as a paradigm for research on network effects.
ACE is the computational study of economies modeled
as evolving systems of autonomous interacting agents
[26]. Thus, the behavior of a system at the macro-level
is determined by the actions of the involved actors –
the agents – on the micro-level. The advantage of this
approach is that we can observe how a change in an
agent's behavior affects the total system. In our case
we observe the impact of different console
manufacturers' strategies on the video game market.
As Axelrod states, agent-based modeling in social
sciences does not necessarily aim to provide an
accurate representation of a particular empirical
application [7]. Instead, the goal of agent-based
modeling is to broaden the understanding of
fundamental processes in a system. Thus, following
Axelrod’s advises on the design of simulation models,
we try to reduce the video game market to a number of
important actors. These are the console vendors, game
publishers and the customers. Furthermore, we make a
few simplifying assumptions about the market, listed
in the next section.
3.2. Simulation model
The following assumptions form the basis of our
simulation model:
1. Consoles are completely incompatible
2. Consoles provide no direct network effects
3. Consoles provide no stand-alone utility
4. Indirect network effects result only from the
variety of games offered for each console
5. Consumers may own multiple consoles at once
6. Each consumer is given a random budget in each
period, which he spends completely
7. The number of game titles remains constant for all
periods
8. The number of consumers remains constant for all
periods
While assumptions 1-5 are consistent with the
described properties of the video game market,
assumption 6 is not yet confirmed by empirical data.
Assumptions 7 and 8 cannot be observed in reality but
facilitate the modeling of agent behavior. It is difficult
to predict how changes in these assumptions would
affect the simulation results. Different assumptions 6-8
would definitely increase the complexity of the model,
but in turn may also reduce the significance of the
simulation results.
In the following we first give a general overview of
the actions of the agents in our model. The order of
these actions is also depicted in figure 3. Afterwards
we will describe the decision process of each agent in
detail.
Figure 3. Order of agents’ actions in the
simulation model
In each period, the console vendors set the price for
their consoles. Next, the game publishers remove a
constant number of games from the market, based on
the numbers sold in the previous period. Furthermore,
they publish new games; according to our model
assumptions, the total number of games in the market is
kept constant. The consumers then evaluate the
different combinations of games and/or consoles they
could buy in the current period. Restricted by a certain
budget, the consumer buys the combination of
consoles and games that maximizes his utility.
Console vendors
The goal of our model is to evaluate market
strategies for console vendors. The pricing of consoles
plays an important role in these strategies. Therefore,
we want to evaluate the outcome of different pricing
strategies such as penetration pricing or skimming.
Thus, the prices p_c for each console c in each period
are not determined as a reaction to the market but are
simulation parameters.
Game publishers
In our model the game publishers are represented
by a single agent, which reacts to the consumers'
buying behavior. This agent faces the decision for
which console to publish games. The price p_gc of
game g on console c is randomly drawn from a normal
distribution. The mean μ of this distribution represents
the typical price of a game for a particular console.
Beginning with the second period, the game
publisher replaces a constant number o of the games in
the market with new games in each period. The decision
which games to replace depends on the sales of each
game in the last period: first the worst-selling game is
taken off the market, next the game with the second-lowest
sales, et cetera, until o is reached. If more than
one game has equal sales, a game is chosen
randomly.
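The replacement rule can be sketched as follows; the function and argument names are our own illustration, not from the paper:

```python
import random

def replace_worst_games(sales, o, seed=0):
    """Pick the o games to take off the market: worst sellers first,
    with random tie-breaking as in the model. `sales` maps game -> units sold."""
    rng = random.Random(seed)
    ids = list(sales)
    rng.shuffle(ids)                   # random order so ties are broken randomly
    ids.sort(key=lambda g: sales[g])   # stable sort: ascending by units sold
    return ids[:o]

# "b" and "d" tie for the lowest sales, so both are removed before "c".
removed = replace_worst_games({"a": 5, "b": 1, "c": 3, "d": 1}, o=2)
```

Because Python's sort is stable, shuffling before sorting yields exactly the random tie-breaking the model prescribes.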
New games are published based on the expectations
of future sales. For each past period and each console
the average sales per period of the respective games
are computed. We then apply first-order exponential
smoothing [19] to predict the average expected game
sales for a console in the next period. Next, the ratio
β_c between the expected game sales for console c
and the expected sales for all consoles is calculated.
This ratio is multiplied by the total number of new
games. The agent hence publishes o·β_c games for
console c.
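A minimal sketch of this allocation step; the function names, the smoothing constant alpha and the rounding rule are our own assumptions, since the paper does not report them:

```python
def smooth(history, alpha=0.3):
    """First-order exponential smoothing of per-period average game sales."""
    forecast = history[0]
    for observed in history[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

def allocate_new_games(sales_history, o):
    """Split the o new games across consoles in proportion to the
    forecast share beta_c = forecast_c / sum of all forecasts."""
    forecasts = {c: smooth(h) for c, h in sales_history.items()}
    total = sum(forecasts.values())
    return {c: round(o * f / total) for c, f in forecasts.items()}

# Console 2 sold nothing in period 1 and 100 units in period 2, so its
# smoothed forecast (and hence its share of new games) lags console 1's.
alloc = allocate_new_games({"console1": [100, 100], "console2": [0, 100]}, o=35)
```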
Consumers
In each period the consumers can buy games and
consoles. Therefore, at the beginning of each period
each consumer i receives a random budget B_i following
a Gaussian distribution, which is forfeited at the end of the
period.
Next, the consumers identify their alternatives for
console purchases. If they already own a console, they
can either buy a new console or spend their complete
budget on games for their present consoles. If they do not
own a console yet, they decide which console to
buy. The consumers can also buy more than one
console per period. The set of alternatives of owning
and buying consoles for each consumer i is A_i.
The goal of each customer is to find the alternative
a_i with the highest utility. Utility is only generated by
games. Let u_gci be the utility of game g on console c
for consumer i in the current period; it
consists of three factors:

u_gci = q_g · l_c · r_gi
The parameter q_g denotes the quality of the game.
This attribute represents a number of factors:
gameplay, the user interface, the storyline, the
longevity of motivation to play, et cetera. The
performance factor of console c is represented by l_c.
A technologically advanced console will provide better
graphics and sound, and thus the consumer will value a
game higher on that console. The third factor r_gi
represents the preference of consumer i for a game.
Some users prefer strategy games, others like role-playing
games or love first-person action games.
Hence, the same game provides different utility to
different users. The parameters l_c and q_g are
randomly drawn from a normal distribution for each
console c and game g respectively. The preference
factors r_gi of each customer i are randomly drawn
from a normal distribution for all games g.
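The utility calculation can be illustrated directly; the distribution parameters follow table 2 (μ = 0.5, σ = 0.15), while the seed and the helper name are our own:

```python
import random

def game_utility(q_g, l_c, r_gi):
    """u_gci = q_g * l_c * r_gi (game quality x console performance x preference)."""
    return q_g * l_c * r_gi

rng = random.Random(7)
q = rng.gauss(0.5, 0.15)          # quality q_g of game g
r = rng.gauss(0.5, 0.15)          # preference r_gi of consumer i for game g
u_old = game_utility(q, 1.0, r)   # on the established console (l_c = 1.0)
u_new = game_utility(q, 1.3, r)   # on the advanced console (l_c = 1.3)
# The same game is valued exactly 30% higher on the faster console.
```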
Given the utilities for the games, the consumer tries
to maximize his utility by buying available games g for
the consoles c (g ∈ G_c). For every alternative a,
C_a = C_o ∪ C_b denotes the set of consoles a consumer
can buy games for, consisting of the subset of already
owned consoles (C_o) and the subset of consoles he
buys (C_b) in the current period. Thus, the objective
function of each consumer i is to maximize the utility
U_ai of an alternative a_i:

Max U_ai = Σ_{c ∈ C_a} Σ_{g ∈ G_c} u_gci · x_gci

s.t. Σ_{c ∈ C_a} Σ_{g ∈ G_c} p_gc · x_gci ≤ B_i − Σ_{c ∈ C_b} p_c

x_gci ∈ {0, 1}
The decision variable x_gci is 1 if consumer i buys
game g for console c, and 0 otherwise. Finally,
consumer i selects the alternative with the highest
utility U_ai.
The overall value of a console does not depend on
the utility of all games available for that console, but
only on the utility of the games the user could buy by
exhausting his budget.
The maximization of the users' utility can be
interpreted as a binary knapsack problem: a
combinatorial maximization problem of choosing, from
essentials of different value, as many as possible that
fit into one bag (of maximum weight) you are going to
carry on a trip [20].
In our case the maximum weight equals the
remaining budget B_i − Σ_{c ∈ C_b} p_c, the weight of each
item, i.e. a game, is the game's price p_gc, and the value
of an item is its utility u_gci.
Since the knapsack problem is NP-complete, we
apply a heuristic approach to solve the model. We use
a greedy algorithm based on an algorithm proposed by
Martello and Toth [20]: For each item we compute the
value/weight (utility/price) ratio and sort the items
according to this ratio in descending order. Starting
from the first element, the items are inserted into the
knapsack as long as the remaining budget permits.
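A minimal version of this greedy heuristic, assuming games are given as (utility, price) pairs (our own representation):

```python
def greedy_select(games, budget):
    """Greedy knapsack heuristic in the spirit of Martello/Toth [20]:
    sort by utility/price ratio, then buy while the budget allows.
    Returns the total utility and the indices of the chosen games."""
    order = sorted(range(len(games)),
                   key=lambda i: games[i][0] / games[i][1],
                   reverse=True)
    total, chosen = 0.0, []
    for i in order:
        utility, price = games[i]
        if price <= budget:     # insert the item if it still fits
            budget -= price
            total += utility
            chosen.append(i)
    return total, chosen

# Budget 100 at a uniform game price of 40: only the two
# highest-ratio games fit; the third is skipped.
total, chosen = greedy_select([(2.0, 40), (1.5, 40), (1.0, 40)], budget=100)
```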
3.3. Validation
We used a combination of techniques for the
validation of our model. First we conducted a simple
performance evaluation. This is the process of
determining whether the computational model
generates the stylized results or expected behavior of
the underlying processes [8]. Stylized results are
general empirical regularities that have been repeatedly
observed within real data.
The following stylized results were tested with our
model:
• If there is already a console present in the market
which has a high installed base, an otherwise
identical console (the same performance offered
at the same price) cannot capture a significant
market share.
• If two consoles have the same installed base, are
offered at the same price and one console is
technologically advanced, the advanced console
will dominate the market.
• Identical consoles offered at the same price lead to
an identical number of sold consoles and game
variety.
• If two identical consoles are offered at the same
price, but the games for the second console are
less expensive, the second console will dominate
the market.
Some of the parameters for our simulation model
could be derived from empirical data, such as the
typical consumer’s budget, the prices of games and
consoles. To determine the other necessary parameters
we combined the simple performance evaluation with a
parameter variability/sensitivity analysis [25]. This
means that we changed the values of the input
parameters to determine the effect upon the model’s
behavior. We tested whether the stylized facts could
still be reproduced after a parameter was changed.
Thus, we could identify a range of sufficiently accurate
values for each parameter.
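The parameter sweep can be sketched generically; `run_model` and `stylized_facts_hold` stand in for hooks into the full simulation and are purely illustrative:

```python
def sweep(values, run_model, stylized_facts_hold):
    """Parameter variability / sensitivity analysis: re-run the model for
    each candidate parameter value and keep those for which the stylized
    facts are still reproduced."""
    return [v for v in values if stylized_facts_hold(run_model(v))]

# Toy stand-ins: the "model" echoes the budget parameter, and the
# stylized facts are assumed to hold for budgets between 300 and 700.
valid = sweep([100, 300, 500, 700, 900],
              run_model=lambda budget: budget,
              stylized_facts_hold=lambda result: 300 <= result <= 700)
```

The surviving values form the "range of sufficiently accurate values" mentioned above.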
4. Simulation results
The goal of our simulation model is to provide a
tool, which can be used to evaluate the influence of
selected strategies on the video game market. As an
example let us take a look at one of the strategies
described in section 2.3., namely penetration pricing.
Penetration Pricing

We assume a market with two consoles. The first
console is already established in the market and is sold
at a price of 100. A second vendor tries to enter the
market. The second console is technologically
advanced compared to the first, represented by a
higher performance factor. Note that a doubling of
the technical capabilities does not lead to a doubling
of the performance factor: while a game will profit
from better graphics and sound, it will not be twice as
good as on the older console.
The new console faces a start-up problem: Since the
established console is already owned by a number of
consumers, it possesses an installed base. Furthermore,
there is a greater variety of games available for the older
console. This leads to network effects which counter
the technological lead of the newer console.

Table 2. Parameters of the simulation scenario

Number of consoles | 2
Installed base for console 1 / console 2 | 500 / 0
Performance factor for console 1 / console 2 | 1.0 / 1.3
Price for console 1 | 100
Number of consumers | 1000
Consumer's budget | 500
Total number of games in market | 100
Initial game availability for console 1 / console 2 | 80 / 20
New games per period | 35
Price per game | 40
Quality factor (μ; σ) | 0.5; 0.15
Preference factor (μ; σ) | 0.5; 0.15

Given the simulation parameters summarized in
table 2, we evaluate which price will lead to a market
success for the manufacturer of the new console. But
how can market success be measured? A successful
console will be bought by many consumers. Thus, one
key figure is console sales. Moreover, more games will be
published for a console with a high market share.
Therefore, we will use the game variety –
the number of available games for a console – as a
second measure.
The time horizon for this scenario is ten periods.
For each examined price we conducted 100 simulation
runs. The following figure shows the average results
for the total sales of the second console over all
periods.
Figure 4. Total units sold of console 2
(average of 100 simulation runs)
In the price range of 0 to 100 the console will
always capture the whole market in terms of installed
base. At higher prices, not all consumers will buy the
console. At prices above 200 a significant drop in
console sales can be observed. This is because the
critical mass is reached in fewer simulation runs: only a
few customers buy the console at first, which leads to
low game sales. Therefore, the game publishers release
only a small number of new games for the console.
This inhibits the growth of network effects, making the
console less attractive, et cetera.
However, a large number of sold consoles does not
necessarily lead to high game sales. In this scenario,
fifty percent of the consumers in the first period
already own an older console, but may buy fewer
games for that console in the following periods. An
indicator for game sales is the game variety. Hence,
figure 5 shows the average number of available games
for the second console in the tenth period.
As one can see, the game variety correlates with the
installed base. This is reasonable, because the game
variety is one of the main influence factors on the
indirect network effects. In contrast to the installed
base, the availability of games is reduced by higher
prices even in the low price range. This is because a
more expensive console will take longer to capture
market shares.
Another aspect, shown by this simulation scenario,
is that the technologically superior standard does not
prevail under all conditions. Of course the market
success is not only influenced by pricing but also by
other factors, e.g. initial game availability, game
prices, game quality etc.
The role of licensing fees
As mentioned in section 2.3, it is a common strategy for
console vendors to sell the console at a loss and instead
rely on licensing fees to generate revenue. Therefore, we
examine whether the licensing fees have an influence
on pricing.
We assume that the vendor of the new console
receives a twenty percent share of the games' retail
price¹. Figure 6 shows the combined revenues
achieved by licensing fees and console sales in all
periods. Again the values represent the average of 100
simulation runs.
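The combined revenue is simply console revenue plus the vendor's license share of game revenue. A minimal sketch, using the assumed twenty percent share and purely illustrative sales figures (not simulation output):

```python
def total_revenue(console_price, consoles_sold, games_sold,
                  game_price=50.0, license_share=0.20):
    """Vendor revenue: console sales plus the license fee, here assumed
    to be 20% of the game retail price (cf. footnote 1)."""
    return console_price * consoles_sold + license_share * game_price * games_sold

# Illustrative figures (assumptions, not results): a cheaper console that
# sells more units and drives more game sales can beat a more expensive
# console on total revenue.
cheap = total_revenue(80, consoles_sold=900, games_sold=5400)   # 72000 + 54000
dear = total_revenue(180, consoles_sold=500, games_sold=2000)   # 90000 + 20000
```

With these numbers the cheap console earns 126,000 against 110,000, illustrating why license fees shift the revenue-optimal console price downward.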
Figure 5. Game availability for console 2 in the
tenth period (average of 100 simulation runs)

Figure 6. Total revenue for console 2
(average of 100 simulation runs)

¹ While this share may vary for different console vendors, The
Economist [2] states that $10 of the common $50 game price is
passed to the console vendor as license fees.

If the console manufacturer only depends on
console sales, the optimal price in regard to revenue
maximization would be 180. However, at this price the
diffusion of the console proceeds only at a slow pace.
A lower price results in higher console sales in the
early periods, which also strengthens game sales in
those periods. Thus, a lower price, namely 80, is
optimal with regard to total revenue.
Hence, one could argue that licensing fees foster
technological development in the video game market:
since licensing fees create an incentive for lower
console prices, and lower prices lead to faster adoption,
new console generations are expected to quickly gain
market share and replace older consoles. Thus, the
console life cycle is shortened.
5. Conclusion and Further Research
There are many real-life examples of markets with
indirect network effects, which are subject to scientific
research. In contrast to existing literature in this field
[9][23], we use a bottom-up approach to model the
video game market, enabling simulations which can be
used to evaluate market strategies for console
manufacturers. Based on these simulations, various
scenarios can be analyzed from the console vendor's
or game publisher's perspective.
As we have shown, pricing plays an important role
for the market success of video game consoles. A
penetration pricing strategy seems promising,
especially in the light of licensing fees. However, our
results do not include the role of costs, which would
most probably lead to a different optimal price.
A transfer of our model to other markets with
indirect network effects is possible if these markets
exhibit the same key properties as the video game
market. This applies to markets of combined hardware-software
systems with incompatibilities, where the
software components work with only one type of the
hardware components, e.g. HD-DVD as well as Blu-ray
players and their respective discs. The video game
market is special compared to other markets with
indirect network effects insofar as the game publishers
pay fees to console vendors. From an economic point
of view this is relevant for console vendors as they can
generate revenues not only with console sales but also
with game sales, enabling a penetration pricing
strategy.
There is still need for further optimization,
calibration and validation of the model. In order to
achieve a higher validity, which would allow for the
prediction of actual market development, we need to
gather empirical data.
Furthermore, the ACE approach supports easy
changes to the simulation model to incorporate other
relevant aspects of the video game market. Thus, a
next step could be to examine different licensing
strategies between console vendors and game
publishers or the impact of different support levels for
game developers.
The upcoming trend of enabling massively
multiplayer online games (MMOGs) on consoles
introduces direct network effects into a market which
hitherto was characterized by the influence of
indirect network effects only. Since empirical data for
the video game market is available, it is a challenge
for scientific research to estimate the impact of direct
and indirect network effects on such markets.
6. References
[1] Altizer, R., 2006: Sony predicts 887 million dollar
loss on game division, blames PS3, About.com,
http://playstation.about.com/b/a/257687.htm, accessed
May 29, 2006.
[2] Anonymous, 2002: Console wars, The Economist,
http://www.economist.com/business/displayStory.cfm?
story_id=1189352, accessed June 10, 2006.
[3] Anonymous, 2005: DFC Intelligence Releases New
Market Forecasts For Video Game Industry, DFC
Intelligence, http://www.dfcint.com/news/prjune2005.html,
accessed May 29, 2006.
[4] Anonymous, 2005: Informa Predicts $58.4 Billion
Game Industry In 2007, Gamasutra,
http://www.gamasutra.com/phpbin/news_index.php?story=6942,
accessed May 29, 2006.
[5] Anonymous, 2005: Interactive Entertainment
Industry to Rival Size of Global Music Business, DFC
Intelligence, http://www.dfcint.com/news/prnov92005.html,
accessed May 29, 2006.
[6] Anonymous, 2005: The NPD Group reports on
Retail Sales of U.S. Video Games Industry for first
half 2005, NPD Group,
http://www.npdfunworld.com/funServlet?nextpage=pr_body.html&content_id=2173,
accessed May 29, 2006.
[7] Axelrod, R., 2003: "Advancing the Art of
Simulation in the Social Sciences", Japanese Journal
for Management Information System, 12(2), 16-22.
[8] Carley, K., 1996: Validating Computational
Models, Carnegie Mellon University.
[9] Clements, M. T., 2004: "Direct and indirect
network effects: are they equivalent?" International
journal of industrial organization, 22(5), 633-645.
[10] Clements, M. T., Ohashi, H., 2004: "Indirect
Network Effects and the Product Cycle: Video Games
in the U.S., 1994-2002", NET Institute Working Paper
#04-01.
[11] David, P. A., Greenstein, S., 1990: "The
Economics of Compatibility Standards: An
Introduction to Recent Research", Economics of
Innovation and New Technology, 1, 3-41.
[12] Economides, N., 1996: "The Economics of
Networks", International journal of industrial
organization, 14(6), 673-699.
[13] Farrell, J., Saloner, G., 1985: "Standardization,
Compatibility, and Innovation", Rand Journal of
Economics, 16(1), 70-83.
[14] Farrell, J., Saloner, G., 1986: "Installed base and
compatibility: innovation, product preannouncements,
and predation", The American Economic Review,
76(5), 940-955.
[15] Gandal, N. S., 2002: "Compatibility,
standardization, and network effects: some policy
implications", Oxford Review of Economic Policy,
18(1), 80-91.
[16] Katz, M. L., Shapiro, C., 1985: "Network
Externalities, Competition, and Compatibility",
American Economic Review, 75(3), 424-440.
[17] Katz, M. L., Shapiro, C., 1986: "Technology
adoption in the presence of network externalities",
Journal of Political Economy, 94(4), 822-841.
[18] Koski, H., Kretschmer, T., 2004: "Survey on
Competing in Network Industries: Firm Strategies,
Market Outcomes, and Policy Implications", Journal
of industry, competition and trade, 4(1), 5-31.
[19] Makridakis, S. G., Wheelwright, S. C., Hyndman,
R. J., 1998: Forecasting: methods and applications,
Wiley, New York.
[20] Martello, S., Toth, P., 1990: Knapsack problems:
algorithms and computer implementations, J. Wiley &
Sons, Chichester; New York.
[21] Matutes, C., Regibeau, P., 1996: "A selective
review of the economics of standardization : entry
deterrence, technological progress and international
competition", European Journal of Political Economy,
12, 183-209.
[22] Oren, S. S., Smith, S. A., 1981: "Critical mass and
tariff structure in electronic communications markets",
Bell Journal of Economics, 12(2), 467-487.
[23] Park, S., 2005: "Integration between hardware and
software producers in the presence of indirect network
externalities", Homo Oeconomicus, 22(1), 47-70.
[24] Rohlfs, J., 1974: "A theory of interdependent
demand for a communications service", Bell Journal of
Economics, 5(1), 16-37.
[25] Sargent, R., 2003: Verification and Validation of
Simulation Models, Winter Simulation Conference,
New Orleans 2003, pp. 37-48.
[26] Tesfatsion, L., Judd, K. L., 2006: Agent-based
computational economics, Elsevier, Amsterdam.
[27] Weitzel, T., Wendt, O., König, W., 2003:
Towards an Interdisciplinary Theory of Networks,
11th European Conference on Information Systems
(ECIS), Naples 2003.
[28] Weitzel, T., Wendt, O., von Westarp, F., 2000:
Reconsidering Network Effect Theory, 8th European
Conference on Information Systems (ECIS), Wien
2000, pp. 484-491.