Trading speed competition: Can the arms race go too far?∗

Dion Bongaerts†, Lingtian Kong‡ and Mark Van Achter§

March 15, 2016
Abstract
We analyze the likelihood of arms race behavior in markets with liquidity provision by HFTs. Liquidity providers (makers) and liquidity consumers (takers) make costly investments in monitoring speed. Competition among makers and among takers induces arms race behavior. However, trade success probabilities increase in monitoring speed, giving rise to complementarity between makers and takers. This counters arms race effects. Whether arms race effects materialize crucially depends on how marginal gains from trade depend on transaction speed. With the common (often implicit) assumption of constant gains from trade, complementarity effects dominate and market participants tend to under-invest in technology. However, with gains from trade that decline in transaction speed, arms race behavior is much more likely. A portfolio optimization exercise with endogenous and costly rebalancing frequency shows that gains from trade are likely to decline in transaction speed.
∗ We would like to thank Hans Degryse and Joost Driessen for helpful comments and suggestions.
† Rotterdam School of Management, Erasmus University, Department of Finance, Burgemeester Oudlaan 50, P.O. Box 1738, 3000 DR Rotterdam, The Netherlands. E-mail: [email protected].
‡ Rotterdam School of Management, Erasmus University, Department of Finance, Burgemeester Oudlaan 50, P.O. Box 1738, 3000 DR Rotterdam, The Netherlands. E-mail: [email protected].
§ Rotterdam School of Management, Erasmus University, Department of Finance, Burgemeester Oudlaan 50, P.O. Box 1738, 3000 DR Rotterdam, The Netherlands. E-mail: [email protected].
1 Introduction
The landscape of financial markets has been transformed during the last decade by a group of very fast traders: high frequency traders (HFTs), who trade with computer algorithms at a millisecond pace. According to the SEC,1 as of 2010 HFTs generated at least 50% of the volume in the US equity market, and an even larger share in terms of order traffic. A Reuters newswire report put this market share at 70% as of 2009, pointing to an ongoing trend.2 Faced with such radical changes, regulators in the US3 and the European Union4 have called for an evaluation of the welfare consequences of HFTs, so that appropriate regulation can be designed.
The academic findings on the welfare effects of HFTs are divided. Proponents, like Burton Malkiel, argue that "competition among high frequency traders serves to tighten bid-ask spreads, reducing transaction costs for all market participants".5 Opponents, including Paul Krugman, are concerned that HFTs undermine markets and use up resources that could have been put to better use.6 Empirically, there is evidence in favor of both sides. For example, using data on 26 NASDAQ-labeled HFT firms, Brogaard, Hendershott and Riordan (2014) conclude that HFTs contribute to price discovery, and Carrion (2013) finds that HFTs provide liquidity when it is scarce. On the other hand, HFT technology is very expensive (see Biais, Foucault and Moinas (2014) for a discussion). For example, Spread Networks laid a new fiber optic cable connecting Wall Street with Chicago, which cost $300 million and only serves to shave 1.4 milliseconds off the latency of the existing route. Moreover, some of the brightest minds in this world work on the design of super-fast trading algorithms for Wall Street. The opportunity cost of this human capital for society is likely to be high.
1 SEC, 2010. Concept release on equity market structure. Release No. 34–61358; File No. S7–02–10; Page 45.
2 http://www.reuters.com/article/2009/09/16/idUS113924+16-Sep-2009+BW20090916
3 http://blogs.wsj.com/riskandcompliance/2014/10/30/the-morning-risk-report-future-of-highfrequency-trading-regulation-is-murky/
4 http://www.bloomberg.com/news/2014-04-13/high-frequency-traders-set-for-curbs-as-eu-reins-inflash-boys.html
5 See "High frequency trading is a natural part of trading evolution", Financial Times, 14 December 2010.
6 "Rewarding bad actors", New York Times, 2 August 2009.
This paper aims to reconcile the two arguments described above: is the value added by HFTs enough to justify these (opportunity) costs? In particular, we distill the two counterbalancing effects HFTs have on the economy, which shape the debate. On the one hand, only the first trader to react to a trading opportunity profits from his investments. Biais, Declerck and Moinas (2015), among others, provide empirical evidence that faster traders in general obtain higher profits in current markets. This negative externality, which we label the "substitution effect", induces an arms race towards lower latency among market participants. In line with Krugman's arguments, this race could potentially distort the allocation of resources away from first best. On the other hand, increases in liquidity supplying activity also enlarge the opportunities for liquidity demanders to successfully transact and could stimulate market participation. This positive "complementarity effect" incorporates and even goes beyond the competition argument put forward by Malkiel.
For our analysis we set up a stochastic monitoring model featuring two types of agents: market makers and takers. Each agent competes with his own kind in a winner-take-all fashion: the fastest trader is the only one that profits from a trading opportunity.7 In our model, there is a limit order book for a security which the makers sell and the takers buy. A faster maker has a better chance to be the first to detect an empty book and post a sell order; the first taker to respond buys it, leaving the book empty again for the next maker to fill. We then identify both the substitution and the complementarity effects, and show that they at least partially offset each other. In a next step, we show that the welfare effect of HFTs strongly depends on the marginal social value of additional transactions taking place. Using a portfolio optimization model, we show that the faster trading takes place, the lower the marginal value per trade is.
To simplify a systematic study of the above mentioned relationship, we first set up the general model in which the marginal gains from trade can be any function of the trading frequency of the trader involved. We can then study any particular functional relationship between these two variables by assuming specific functional forms. As a benchmark, we first make the assumption that the gains from trade are linear in the average trading frequency. In other words, as the trading speed increases, the marginal benefit it generates to the traders involved stays constant. This assumption of constant marginal gains from trade is often (implicitly) made in the current literature (for example, Foucault, Kadan and Kandel (2013), Biais et al. (2014) and Hoffmann (2014)). We show that within this setting the complementarity effect outweighs the substitution effect for large parameter ranges. As a result, from a welfare perspective, HFTs often under- rather than over-invest in speed and value-destroying arms races do not take place. For the parameter ranges that do generate arms-race phenomena, we see them with the takers rather than with the makers. These findings go against the proclaimed over-monitoring of HFTs in the arms race literature (Biais et al. (2014); Hoffmann (2014)).

7 Here we do not consider the batched order limit order books proposed by Yueshen (2014) and Budish, Cramton and Shim (2013), a practice which does not currently exist. Their type of market organization may help in reducing welfare loss due to the substitution effect.
Next, we analyze the sensitivity of the results to the constant marginal gains from trade assumption. More specifically, we impose on the general model that the marginal gains from trade are strictly decreasing and differentiable in trading frequency, so that the aggregate gains from trade are concave in trading frequency. We demonstrate that under this assumption, over-monitoring and therefore arms-race phenomena occur for larger parameter ranges. Intuitively, the complementarity component decreases in importance while wasteful competition remains.
In the final part of our analysis, we demonstrate in a portfolio allocation and consumption setting that aggregate gains from trade are indeed likely to be concave in trading frequency. To this end, we integrate a mechanism generating the need for trading into
our existing model. More specifically, we let risk averse liquidity takers with log utility
re-balance their portfolio whenever they get to trade in the market. There are gains from
trade because investors may want to rebalance their portfolios following price innovations.
The expected value of doing so increases as the average interval between two trades grows.
Intuitively, volatility and therefore portfolio dislocations and rebalancing needs are larger
over longer horizons. As a result the marginal value of an extra trading opportunity is
positive, but declines as the interval in between trades shrinks. Hence, individual and
therefore aggregate gains from trade are concave in trading frequency.
Our paper joins the burgeoning literature on the effects of high frequency traders on market quality. The impacts of HFTs on various aspects of market quality have been investigated theoretically. Foucault et al. (2013) model the trading strategy of an informed trader who is able to react faster than others to news. They find that, as a result, order flow volatility increases for these faster traders. Regarding price efficiency, Martinez and Rosu (2013) study the behavior of high frequency traders who are able to react to news instantaneously, so that prices reflect the news immediately. Instead of the latency of the traders, some other studies focus on the speed of the exchanges, such as Pagnotta and Philippon (2012) and Menkveld and Zoican (2014).
Among the various aspects of market quality, our paper focuses on the liquidity provision effect of HFTs. In addition, we study it in an environment without information asymmetry. Previous papers on financial expertise and information asymmetry (Glode, Green and Lowery (2012)) and on the externality of being picked off (Hoffmann (2014), Cartea and Penalva (2012), Foucault et al. (2013) and Biais et al. (2014)) focus on the adverse selection cost when evaluating the welfare consequences of HFTs. We show that hardware and human capital (opportunity) costs alone can induce a wasteful arms race, even in the absence of information asymmetry and pick-off risk. We also abstract from inventory risk, which results from the volatility of the security and is studied in Aït-Sahalia and Saglam (2013).
Furthermore, our paper seeks to derive endogenously the marginal benefits from trade (termed "gains from trade" here) based on realistic assumptions about trading motivations. Most papers investigating the cost of HFT competition (Foucault et al. (2013), Biais et al. (2014), Budish et al. (2013), and Cartea and Penalva (2012)) take fixed gains from trade as given and assume risk neutrality of investors. In particular, Biais et al. (2014) find that institutions do invest in fast trading technologies to keep up with each other, and that over-investment in fast trading can occur in equilibrium. However, these papers do not differentiate market makers and market takers, and only allow for a binary choice between high and low latency. In a recent study, Hoffmann (2014) models the limit order book and studies the externalities among traders arising from order revisions and cancellations. In contrast, the loss of social welfare in our paper stems from a different economic source: the gain per trade becomes ever smaller as the trading speed increases, yet the costs grow ever higher.
The rest of this paper is organized as follows. In Section 2 we lay out the framework
of our models. In Section 3 we derive and compare the first best and the equilibrium
allocation for different specifications of the gains from trade. In Section 4 we use a
portfolio allocation model to show that the gains from trade are likely to be concave in
trading frequency. Finally, in Section 5 we conclude and point to the limitations of this
model.
2 Setup
The basic setup of our model is based on the one developed by Foucault et al. (2013).
We consider a market with two types of participants: market makers and market takers.
Each maker i ∈ {1, 2, . . . , M } monitors the market at discrete points in time and arrives
according to a Poisson process with intensity parameter µi ≥ 0. Similarly, each taker
j ∈ {1, 2, . . . , N} arrives with intensity τj ≥ 0. The total numbers of makers and takers, M and N, are exogenous.8
The market we consider operates as a limit order book (LOB) for one security. The
LOB can be either empty or filled with a limit order of unit size. When the book is
empty (E), the first maker to arrive can post a limit order, thereby changing the LOB
status to filled (F). While the LOB is filled, no other makers can post limit orders until the
existing quote is hit by a taker (i.e., the filled book becomes empty again). Similarly, once the first taker has seized the trading opportunity in an F book, no other takers can trade anymore until the book becomes F again. The market operates according to the following
timeline: before the trading begins, each maker i chooses an intensity µi to maximize his expected trading profit Πm(µi).9 Similarly, each taker j chooses τj to maximize his expected trading profit Πt(τj). Once the trading begins, each player monitors the market following a Poisson process with the intensity previously chosen. For the moment, we assume that the trading phase of the model repeats itself an infinite number of times.

8 This setup has a winner-takes-all feature from an ex-post perspective (the one conducting the trade is the only one benefitting). From an ex-ante perspective, the fastest trader is not guaranteed to always execute. We choose this setup based on the notion of order processing uncertainty: the fast trader never knows what is going to happen after he submits the order and before it reaches the server of the exchange. A similar argument can be found in Yueshen (2014).
9 In this sense the agents in our models are the prop traders in Biais et al. (2015).
In our model, monitoring speed does not come for free. In particular, we assume the monitoring costs for both trader types to be quadratic in the monitoring frequency chosen. These costs reflect the required investments in IT infrastructure and human capital. The marginal cost of technology is increasing in monitoring intensity due to its quadratic nature. This assumption is based on the observation that as latency approaches zero, the cost of further improvement becomes higher and higher. For example, to improve latency from a second to a tenth of a second, one would "only" need to automate the trading using algorithms. To get to the millisecond level, however, co-location and super-fast communication lines are required, which are significantly more costly. Monitoring costs can differ between makers and takers. This difference represents heterogeneity among market participants, for example in IT expertise, that potentially provides some of them with high monitoring frequency at a lower cost than others. As a result, the cost per unit of clock/calendar time for maker i equals Cm(µi) = βµi²/2, while for taker j it equals Ct(τj) = γτj²/2.
By the properties of the Poisson distribution, the aggregate monitoring process of all makers also follows a Poisson process, with intensity µ̄ = Σ_{i=1}^{M} µi. Similarly, the aggregate monitoring intensity of all takers equals τ̄ = Σ_{j=1}^{N} τj. Consequently, the expected interval between a transaction and replenishment of the book equals Dm = 1/µ̄. Analogously, the expected interval between the posting of a limit order and a transaction is given by Dt = 1/τ̄. Thus on average, the duration between two trades is D = Dm + Dt, and the average trading frequency equals R = (Dm + Dt)^(−1) = µ̄τ̄/(µ̄ + τ̄). Similarly, one can show that for a given liquidity taker j with monitoring intensity τj, the expected trading frequency is given by τjµ̄/(µ̄ + τ̄).
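As an illustration of this trade-rate formula, the short simulation below (a hedged sketch with arbitrary illustrative intensities, not part of the formal analysis) draws the alternating exponential waiting times of the empty and filled book states and checks that the realized trade rate approaches µ̄τ̄/(µ̄ + τ̄).

import numpy as np

def simulated_trade_rate(mu_bar, tau_bar, n_cycles=200_000, seed=0):
    """Simulate the empty -> filled -> empty cycle of the limit order book.

    The book waits Exp(mu_bar) for the next maker arrival (replenishment)
    and then Exp(tau_bar) for the next taker arrival (a trade)."""
    rng = np.random.default_rng(seed)
    maker_waits = rng.exponential(1.0 / mu_bar, n_cycles)   # D_m draws
    taker_waits = rng.exponential(1.0 / tau_bar, n_cycles)  # D_t draws
    total_time = maker_waits.sum() + taker_waits.sum()
    return n_cycles / total_time  # trades per unit of time

mu_bar, tau_bar = 3.0, 7.0                         # illustrative aggregate intensities
print(simulated_trade_rate(mu_bar, tau_bar))       # close to 2.1
print(mu_bar * tau_bar / (mu_bar + tau_bar))       # R = 2.1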
Finally, when a transaction takes place, the gains from trading are split between makers and takers according to the fractions πm and πt = 1 − πm, respectively, where πm ∈ (0, 1). The expected gains from trade for a trade originated by liquidity taker j are given by Γ(τjµ̄/(µ̄ + τ̄)).10 Relatedly, the aggregate "gains from trade" (GFT for short) is the expected trading gain per time unit and hence given by Γ(τjµ̄/(µ̄ + τ̄)) · τjµ̄/(µ̄ + τ̄). Thus, concavity (convexity) of the GFT is equivalent to the Γ(·) function being uniformly (strictly) decreasing (increasing) in expected trading speed on its domain. A major contribution of this paper is that it allows the expected gain per trade Γ(·) to depend upon how fast the trade is happening.
3 Equilibrium Analysis
In this section we first set up the general model in which the marginal gains from trade
can be any function of the trading frequency of the trader involved. Then by restricting
the parametric specifications, we can easily study any particular functional relationship
between these two variables. As a benchmark, we first make the assumption that the
gains from trade are linear in the average trading frequency.
3.1 The Most General Case
In this section, we derive the equilibrium monitoring intensities of makers and takers.
Moreover, we will assess whether these equilibrium monitoring intensities exceed or fall
short of first best monitoring intensities. In order to do so, let us first derive the first
best outcome by solving a social planner’s problem.
Definition 1 (the social planner's problem in the case of a general form of gains from trade)
A social planner chooses {µi}_{i=1,2,...,M} and {τj}_{j=1,2,...,N} to maximize the total social welfare:

Σ_{j=1}^{N} Γ(τjµ̄/(µ̄ + τ̄)) · τjµ̄/(µ̄ + τ̄) − Σ_{i=1}^{M} βµi²/2 − Σ_{j=1}^{N} γτj²/2,

such that µi ≥ 0, ∀i ≤ M; τj ≥ 0, ∀j ≤ N.

10 We assume that the gains from trade arise from the portfolio selection and consumption needs of the takers. This is supported by the reality that in financial markets, the market takers are usually the parties with the intention to hold the security over a horizon exceeding a day, while the makers mainly serve as short term intermediaries. We will provide the additional setup regarding this problem in later sections.
Without an explicit functional form for Γ(·) with respect to τjµ̄/(µ̄ + τ̄), a closed form solution for the first best allocation is impossible.11 But we do obtain the following interesting result:

Lemma 2 (the ratio of the first best monitoring intensities of the maker side and the taker side)
The first best ratio of the aggregate monitoring intensities of the maker side and the taker side, r̂ ≡ µ̄/τ̄ evaluated at the first best allocation, is not affected by the functional form of Γ(·):

r̂ = (Mγ/(Nβ))^(1/3)   (1)
Proof. See Appendix.
The intuition behind this result is as follows. The expected trade execution speed of the maker side influences the marginal gain per trade Γ(·) of the entire economy only through the overall trading frequency. The social planner can therefore take the trade execution speed of the taker side, and thus the gain per trade, as given and adjust the maker side's frequency independently, and vice versa. The first order optimality conditions then pin down a fixed ratio between the optimal monitoring frequencies of the two sides.
Armed with this result, computing the optimal monitoring intensities in the social planner's problem (SPP) becomes easier. In conjunction with the functional form of the objective function, the social planner only needs to solve a univariate problem instead of a bivariate one. An important feature of this setup contributes to this result: the monitoring intensities τ̄ and µ̄ enter the objective function of the SPP only in the form of µ̄τ̄/(µ̄ + τ̄). As such, this result will prove to be an essential intermediary step in solving the linear benchmark in the next sub-section.
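As a quick sanity check on Lemma 2, the sketch below (an illustrative numerical verification, not part of the formal proof) maximizes the symmetric social welfare for one arbitrary decreasing specification of Γ(·) and compares the resulting ratio of aggregate intensities with (Mγ/(Nβ))^(1/3); all parameter values are illustrative only.

import numpy as np
from scipy.optimize import minimize

M, N = 2, 5                # numbers of makers and takers (illustrative)
beta, gamma = 0.5, 0.5
Gamma0, k = 1.0, 0.5       # an arbitrary decreasing gain per trade: Gamma(x) = Gamma0 - k*x

def neg_welfare(params):
    mu, tau = params                        # symmetric individual intensities
    mu_bar, tau_bar = M * mu, N * tau       # aggregate intensities
    x = tau * mu_bar / (mu_bar + tau_bar)   # expected trading frequency of one taker
    gft = N * (Gamma0 - k * x) * x          # aggregate gains from trade per unit time
    cost = M * beta * mu**2 / 2 + N * gamma * tau**2 / 2
    return -(gft - cost)

res = minimize(neg_welfare, x0=[0.5, 0.5], bounds=[(1e-6, None)] * 2)
mu_hat, tau_hat = res.x
print((M * mu_hat) / (N * tau_hat))          # ratio of aggregate intensities
print((M * gamma / (N * beta)) ** (1 / 3))   # Lemma 2 prediction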
We then define the equilibrium of this general economy, with the aim of benchmarking its allocation efficiency against the first best.

11 When we restrict this function to be linear, as in the conventional setup of the current literature, a closed form solution may be obtained, as is shown in the next subsection.
Definition 3 (the equilibrium of the economy with a general functional form of Γ(·))
The equilibrium of this economy is defined by the intensity choices {µi}_{i=1,2,...,M} and {τj}_{j=1,2,...,N}, such that:

1. Given all the takers' choices {τj}_{j=1,2,...,N}, as well as all other makers' choices {µn}_{n=1,2,...,i−1,i+1,...,M}, each maker i maximizes his profit after cost:
Σ_{j=1}^{N} Γ(τjµ̄/(µ̄ + τ̄)) · πm · τjµ̄/(µ̄ + τ̄) · (µi/µ̄) − βµi²/2,

2. Given all the makers' choices {µi}_{i=1,2,...,M}, as well as all other takers' choices {τn}_{n=1,2,...,j−1,j+1,...,N}, each taker j maximizes his profit after cost:
Γ(τjµ̄/(µ̄ + τ̄)) · πt · τjµ̄/(µ̄ + τ̄) − γτj²/2,

where µi ≥ 0 and τj ≥ 0.
To compare the equilibrium allocation with the first best, we explicitly consider the curvature of the functional form of Γ(·). Doing so may be crucial for determining whether a wasteful arms race occurs. The current literature mostly assumes that Γ(·) is a constant, irrespective of the trading speed. But intuition suggests otherwise. If the gain per trade Γ(·) did not vanish as the frequency of trading goes to infinity, the GFT would go to infinity. This tension leads us to study the sensitivity of the arms race result to the shape of the GFT function. But first, we set up the benchmark in which Γ(·) is not a function of the trading speed, that is, the linear GFT, which is the conventional assumption.
3.2 Benchmark Case: Linear Aggregate Gains from Trade
In this case, each transaction that takes place generates the same amount of social welfare. Hence, we have that Γ(·) = Γ0, and the aggregate GFT is linear in the expected trading frequency of the taker involved. The linearity of the GFT is a standard assumption in the current literature (for example, in Foucault et al. (2013) and Biais et al. (2014)).
The definition of the social planner's problem (SPP) within this economy is simply the previous definition with the restriction that Γ(·) is a constant. For this economy, however,
a closed form solution for the allocation and welfare aggregated over all participants is
available:
Proposition 1 (the SPP allocation of the economy with linear GFT)
The first best monitoring intensities are symmetric and given by:

τ̂ = (Γ0/γ) · r̂²/(r̂ + 1)²;   µ̂ = (Γ0/β) · 1/(r̂ + 1)²,   where r̂ = (Mγ/(Nβ))^(1/3)   (2)

The resulting total welfare equals:

Π̂ = Γ0 Mµ̂·Nτ̂/(Mµ̂ + Nτ̂) − (1/2)βMµ̂² − (1/2)γNτ̂²   (3)
Proof. See Appendix.
The economic intuition behind the solution above is as follows. First, the optimal maker intensity µ̂ increases in the GFT per trade, Γ0, because a higher Γ0 justifies higher monitoring investments to make trades happen more frequently. Second, µ̂ decreases in the marginal monitoring cost for makers, β, due to the increasing marginal cost of monitoring intensity. Third, it is also intuitive that µ̂ decreases in the number of makers, M. Since it is the aggregate intensity that the social planner cares about and individual costs are quadratic in monitoring intensity, the more makers there are, the less frequently each maker should monitor. In addition to the within-type effects outlined above, cross-type effects can also be observed. First, the optimal maker intensity µ̂ decreases in the marginal cost of the takers, γ. This happens because maker intensity and taker intensity are complementary. After all, high monitoring by takers is only useful if the book is likely to be F, and high monitoring intensity by makers is only worthwhile if the book is likely to be E. Hence, makers and takers impose positive externalities on each other. Second, µ̂ increases in the number of takers due to the same type of complementarity. Third, the complementarity effect can also be seen from the first term of the formula for the aggregate welfare: Γ0 Mµ̂Nτ̂/(Mµ̂ + Nτ̂). Unilateral increases in maker intensity µ̂ will increase the total welfare not only through a factor involving M, but also through one involving N, the number of takers. Due to the symmetry of the results, all six interpretations above apply to the taker intensities too.
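The closed form in Proposition 1 is easy to evaluate directly. The snippet below (a small sketch with arbitrary illustrative parameter values) computes the first best intensities and welfare from equations (2) and (3).

def first_best(M, N, beta, gamma, Gamma0):
    """First best intensities and welfare for the linear-GFT case, eqs. (2)-(3)."""
    r_hat = (M * gamma / (N * beta)) ** (1 / 3)
    mu_hat = (Gamma0 / beta) / (r_hat + 1) ** 2
    tau_hat = (Gamma0 / gamma) * r_hat ** 2 / (r_hat + 1) ** 2
    mu_bar, tau_bar = M * mu_hat, N * tau_hat
    welfare = (Gamma0 * mu_bar * tau_bar / (mu_bar + tau_bar)
               - beta * M * mu_hat ** 2 / 2 - gamma * N * tau_hat ** 2 / 2)
    return mu_hat, tau_hat, welfare

print(first_best(M=2, N=5, beta=0.5, gamma=0.5, Gamma0=1.0))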
In reality, however, the first best outcome typically does not materialize. Therefore, we now proceed by defining and solving for the equilibrium of this economy and comparing it with the first best outlined above. The definition is quite simple: we restrict Γ(·) in Definition 3 to equal a constant Γ0.
While solving in closed form is not possible, following Foucault et al. (2013) we can
obtain an implicit solution for the base case equilibrium.
Proposition 2 (the equilibrium allocation of the economy with linear GFT)
The equilibrium monitoring intensities for makers and takers are given by:

µ* = [M + (M − 1)r*] / [M(1 + r*)²] · Γ0πm/β,   (4)
τ* = [r*((1 + r*)N − 1)] / [N(1 + r*)²] · Γ0πt/γ,   (5)

respectively, where r* (interpreted as r* ≡ Mµ*/(Nτ*)) is the positive real solution to:

N r³ + (N − 1)r² − (M − 1)zr − Mz = 0   (6)

with z ≡ γπm/(βπt).

The resulting total welfare is given by:

Π* = Γ0 Mµ*Nτ*/(Mµ* + Nτ*) − βM(µ*)²/2 − γN(τ*)²/2   (7)
Proof. See Appendix.
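Although r* is only defined implicitly, equation (6) is a cubic and easy to solve numerically. The sketch below (illustrative parameter values, using the Γ0 = 1 normalization adopted in the figures) finds the positive real root and evaluates equations (4), (5) and (7).

import numpy as np

def equilibrium(M, N, beta, gamma, pi_m, Gamma0=1.0):
    """Equilibrium intensities and welfare for the linear-GFT case, eqs. (4)-(7)."""
    pi_t = 1.0 - pi_m
    z = gamma * pi_m / (beta * pi_t)
    roots = np.roots([N, N - 1, -(M - 1) * z, -M * z])                   # eq. (6)
    r = [x.real for x in roots if abs(x.imag) < 1e-10 and x.real > 0][0]
    mu = (M + (M - 1) * r) / (M * (1 + r) ** 2) * Gamma0 * pi_m / beta   # eq. (4)
    tau = r * ((1 + r) * N - 1) / (N * (1 + r) ** 2) * Gamma0 * pi_t / gamma  # eq. (5)
    mu_bar, tau_bar = M * mu, N * tau
    welfare = (Gamma0 * mu_bar * tau_bar / (mu_bar + tau_bar)
               - beta * M * mu ** 2 / 2 - gamma * N * tau ** 2 / 2)      # eq. (7)
    return mu, tau, welfare

print(equilibrium(M=2, N=5, beta=0.5, gamma=0.5, pi_m=0.5))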
An easy comparison between the equilibrium and SPP intensities is not possible because r* is implicitly defined. However, we can observe some interesting regularities. First, we can compare the first order conditions with respect to µi of both the social planner outcome (SPP) and the unrestrained equilibrium outcome (EQ), respectively:12

Γ0 τ̄²/(µ̄ + τ̄)² = βµi   (FOC-SPP)   (8)
πm (τ̄² + µ−iτ̄)/(µ̄ + τ̄)² = βµi   (FOC-EQ)   (9)

where µ−i ≡ Σ_{j≤M, j≠i} µj. This way, we can see whether the complementarity effect or
the substitution effect works differently in the EQ than in the SPP. First, observe that
the LHS of the FOC-SPP has a multiplier Γ0 , yet in the FOC-EQ it is πm < 1. This is a
demonstration that in equilibrium, the individual maker fails to endogenize the positive
externality his monitoring imposes on the takers. In other words, he fails to take into
account the complementarity effect he has on his counter-parties. As a result, makers are
inclined to under-monitor. Secondly, the numerators of the LHSs of the two FOCs differ
by a term µ−i τ̄ , which increases the EQ intensities relative to the SPP intensities. This
over-monitoring in EQ happens because each maker i suffers from a negative externality
(substitution effect) from all other makers. As a result, he tends to over-invest in speed
to keep up and does not endogenize the negative effect on the other makers.
The two effects described above are (partially) offsetting. Which of the two dominates depends on parameters. Due to the complexity of the model, closed form solutions are not available and we rely on numerical analysis. A result which can be shown more generally, however, is that the larger the cost coefficients β and γ, the closer monitoring intensities are to first best. This can be seen by comparing the equilibrium intensity, for example µ* = (M + (M − 1)r*)πm/(M(1 + r*)²β), with that of the SPP, µ̂ = Γ0/(β(r̂ + 1)²): the difference is inversely related to β and γ.
In order to get an idea under which parameter ranges over- and under-monitoring take place, we visualize the maker and taker intensities relative to first best in Figure 1. To enable 3D plotting, we reduce the number of free parameters by restricting our model parameters as follows: Γ0 = 1 and β = γ. Then we make plots of the over-monitoring of either type as a function of the profit split ratio πm and the cost coefficient β, for several combinations of (M, N). We consistently set the number of takers larger than or equal to the number of makers because, in reality, there are usually more liquidity demanders than suppliers in any particular market (think of any one stock). Even if these parameterizations do not hold in some markets, due to the symmetry of the way we model the makers and takers, symmetrical conclusions can be drawn if the maker/taker ratio is reversed.13 This is because the marginal costs of technology of both groups are assumed to be equal in the plots. This assumption can easily be relaxed to consider specific markets.

12 For the derivation of these FOCs, please refer to the appendix.
In the top half of the figure, we plot the following combinations of (M, N): (2, 2); (2, 5); (2, 10); (2, 20). For those combinations, M is fixed at a small number and N varies. In the bottom half of the figure, we maintain the ratio N/M = 5, but vary the aggregate market size by considering the combinations (10, 50); (50, 250); (200, 1000); (1000, 5000).
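The surfaces in Figure 1 can be reproduced point by point from the closed forms above. The sketch below (an illustrative reconstruction under the same Γ0 = 1, β = γ restriction; grid resolution and parameter ranges are arbitrary choices) tabulates the difference between equilibrium and first best maker intensities, µ* − µ̂, for one (M, N) combination.

import numpy as np

def over_monitoring_maker(M, N, beta, pi_m, Gamma0=1.0):
    """mu* - mu_hat for the linear-GFT case with gamma = beta (cf. Figure 1)."""
    gamma, pi_t = beta, 1.0 - pi_m
    # equilibrium: positive root of eq. (6), then eq. (4)
    z = gamma * pi_m / (beta * pi_t)
    roots = np.roots([N, N - 1, -(M - 1) * z, -M * z])
    r_eq = [x.real for x in roots if abs(x.imag) < 1e-10 and x.real > 0][0]
    mu_eq = (M + (M - 1) * r_eq) / (M * (1 + r_eq) ** 2) * Gamma0 * pi_m / beta
    # first best: eq. (2)
    r_fb = (M * gamma / (N * beta)) ** (1 / 3)
    mu_fb = (Gamma0 / beta) / (r_fb + 1) ** 2
    return mu_eq - mu_fb

# one surface slice: over-monitoring by makers for (M, N) = (2, 20)
for pi_m in (0.3, 0.5, 0.7):
    for beta in (0.5, 1.0, 2.0):
        print(pi_m, beta, round(over_monitoring_maker(2, 20, beta, pi_m), 4))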
Some preliminary features of the plots are consistent with our intuition. First, as argued above, the smaller the cost coefficient (β, γ), the more over-monitoring and under-monitoring is generated. Second, over-monitoring is more likely to happen when there are more makers and/or takers competing. In this case there are more competitors, and hence more sources of negative externalities.
Our main observation from the top half of this figure is that for relatively small values of M and N, neither side over-monitors severely. The top four graphs in Figure 1, which correspond to the cases with N ≤ 5, show no over-monitoring at all. An arms race only begins gradually from N = 10 onward (bottom-left quarter of Figure 1). Even with (M, N) = (2, 20), over-monitoring is very limited. To the extent that realistic markets feature such small numbers of competitors, actual over-monitoring, especially by makers, should be even smaller than our results would suggest.
When we look at the bottom half of the figure, we notice that as the market size grows, the competition/substitution effect starts to dominate the complementarity effect more and more. Yet, especially for the makers, we see large parameter ranges where there is under- rather than over-monitoring. Only for relatively high values of πm do we see over-monitoring among makers. This also intuitively makes sense, as a higher value for πm gives more value to being the first one to execute a trade.

13 Note that this reversal is only possible for the linear case.
These results tell us that when the GFT is linear, the complementarity effect dominates the substitution effect for sizable parameter ranges. In other words, the higher monitoring frequency of either side benefits the other side by making liquidity available to it, which more than compensates for any arms race behavior. If an arms race is going on, it is on the taker rather than on the maker side, due to the relatively higher presence of takers in the market.
Some additional observations are also interesting and may have policy implications. Over-monitoring by takers is more prevalent in the (M, N) combinations that we deem realistic (N = 10). This is in contrast with the mainstream view on the arms race, namely that it is more likely to occur among makers. In addition, it stands out from the figure that over-monitoring by takers only occurs when πm is close to 0.5, that is, when the two sides have similar bargaining power. An intuitive explanation is that when πm is too small, makers are not rewarded enough to generate the positive externality that would motivate sufficient monitoring by the takers, let alone over-monitoring. On the other hand, when πm is too large, not enough of the gains go to the takers, so they are not motivated enough either. This observation suggests that the relative market power of liquidity suppliers and demanders, in addition to their speed and their sheer numbers, can also be a relevant factor to take into account when designing regulatory measures to ensure efficient monitoring.
The above results yield interesting insights. However, they crucially depend on the assumption of constant marginal GFT. Therefore, in the next subsection, we relax this assumption and fall back on our earlier intuition that the faster the trading speed gets, the less benefit there is left to be gained from each trade; in other words, the aggregate gain from trade is concave in the expected trading speed. We then show that if the aggregate GFT is concave in trading frequency, over-monitoring by either side of the market is much more likely.
3.3 Concave Gains from Trade
While in the current literature the marginal gains from trade are usually assumed to be constant in the aggregate expected trading speed, this may not be true in practice. After all, if trading is motivated by portfolio rebalancing or hedging demands, the need for trading is likely to be lower shortly after the previous trade, as market prices are unlikely to have moved much. In other words, expected marginal gains from trade are likely to be declining in trading frequency, and expected aggregate gains from trade are likely to be concave in trading frequency. It is therefore interesting to know whether the results on over-investment obtained under the traditional linearity assumption still hold under this more realistic assumption. In this section we therefore relax the linearity of the expected aggregate GFT. Instead, we impose on the model that the expected marginal gains from trade are strictly decreasing and differentiable in trading frequency.
We now analyze how the relationship between equilibrium and first best differs from its counterpart in the previous section. When Γ(τjµ̄/(µ̄ + τ̄)) is decreasing in τjµ̄/(µ̄ + τ̄), the marginal gain from trading declines in trading frequency. As a result, the complementarity effect declines as τjµ̄/(µ̄ + τ̄) increases, and over-monitoring is more likely to take place.
To illustrate the effect of concavity of the GFT in τjµ̄/(µ̄ + τ̄), we repeat the previous plots for one of the simplest specifications of Γ(·) that yields a concave GFT: Γ(τjµ̄/(µ̄ + τ̄)) = −k·τjµ̄/(µ̄ + τ̄) + Γ0. Furthermore, we let Γ0 = 1 to make this setting comparable to the previous plots. From a conceptual point of view, this linear form for the expected marginal gains from trade is interesting to analyze, as it is the form implied by a second-order Taylor approximation around 0 of any monotonically increasing and differentiable aggregate GFT function that crosses the origin. Unfortunately, even a function as simple as a quadratic polynomial leads to a loss of tractability, as the FOC involves solving for the root of a fifth order polynomial, a problem that has been proven to have no general closed form solution (Abel (1881)). Therefore, we can only analyze this setting numerically.
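As an indication of what "analyzing numerically" involves, the sketch below (a simplified illustration, not the exact routine behind the figures) searches for a symmetric equilibrium under Γ(x) = Γ0 − kx by iterating best responses of a representative maker and taker; the parameter values and the iteration scheme are arbitrary choices.

from scipy.optimize import minimize_scalar

M, N = 2, 10
beta = gamma = 0.5
pi_m, Gamma0, k = 0.5, 1.0, 1.0
pi_t = 1.0 - pi_m

def gains(x):                      # marginal gain per trade, Gamma(x) = Gamma0 - k*x
    return Gamma0 - k * x

def maker_profit(mu_i, mu_other, tau):
    mu_bar, tau_bar = mu_i + (M - 1) * mu_other, N * tau
    x = tau * mu_bar / (mu_bar + tau_bar)              # per-taker trading frequency
    return N * gains(x) * pi_m * x * mu_i / mu_bar - beta * mu_i ** 2 / 2

def taker_profit(tau_j, tau_other, mu):
    mu_bar, tau_bar = M * mu, tau_j + (N - 1) * tau_other
    x = tau_j * mu_bar / (mu_bar + tau_bar)
    return gains(x) * pi_t * x - gamma * tau_j ** 2 / 2

mu, tau = 0.5, 0.5                                     # initial guess
for _ in range(200):                                   # best-response iteration
    mu = minimize_scalar(lambda m: -maker_profit(m, mu, tau),
                         bounds=(1e-9, 10), method='bounded').x
    tau = minimize_scalar(lambda t: -taker_profit(t, tau, mu),
                          bounds=(1e-9, 10), method='bounded').x
print(mu, tau)   # candidate symmetric equilibrium intensities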
The plots can be found in Figure 2, where we consider the case k = 1. Compared with Figure 1, we observe that the tendency for makers to over-monitor is higher, holding constant the number of makers and takers. In other words, it is more likely for HFTs to over-monitor when the marginal GFT decreases in the trading frequency. Moreover, for small values of M, the makers only over-monitor in this concave setup, and hardly at all in the linear setup, within the parameter range we tested. Hence, with decreasing marginal GFT, we do not only see arms race behavior among HFTs over larger parameter ranges, but also among makers as a group.
Our results are robust to different choices of the slope parameter. Allowing the coefficient on the linear term, k, to take the values 0.2 (Figure 3) and 5 (Figure 4), we obtain qualitatively the same result. Moreover, as the concavity increases, the over-monitoring generally becomes more severe, demonstrating that the nature of the marginal gains from trade plays an important role in determining the occurrence of the arms race. This can be seen by a cross-sectional comparison of the corresponding sub-figures of Figures 1, 3, 2 and 4.14 As the concavity increases, the parameter ranges over which over-monitoring occurs seem to expand. When M and N are large, for example starting from M = 50, N = 250, the domain of (β, πm) over which over-monitoring occurs is strictly increasing in the concavity. Though such large numbers of makers or takers may not be realistic in actual markets, this result demonstrates that when the marginal effect of one particular maker (taker) on the entire maker side (taker side) is negligible, the concavity has a monotonic effect on the over-monitoring.
While the assumption of concave GFT yields results that conform more to the "Krugman view" on HFTs, one could wonder whether such a shape is likely to arise. In the next section, we extend our model by introducing investors with portfolio rebalancing needs and show that the concavity assumption is actually supported by asset pricing theory. As such, we provide micro foundations for an expected aggregate gains from trade function that is concave in transaction speed.

14 In this particular order, the concavity increases.
4 Deriving the Shape of the Gains from Trade Function

4.1 An Extended Setup
In our extended framework, portfolio optimization is one of the most important motivations for trading. We build our analysis on the Merton (1969) portfolio selection problem, in which risk-averse takers trade because they need to rebalance their portfolios in response to market price changes. The makers in our model are risk neutral but serve to provide liquidity for these trades and are rewarded with a share of the gains from trade by the takers. The new setup extends the one in Section 3.1 by integrating the mechanism that generates the need for trading.
We now assume that, in an economy with an infinite time horizon, there are two assets available for trade. There is a risk free asset with continuously compounded rate r:

dX0(t) = rX0(t)dt,   (10)

and a risky asset whose price follows a geometric Brownian motion:

dX1(t) = αX1(t)dt + σX1(t)dB(t),   (11)

where B(t) is a Brownian motion. Only the risky asset is traded in the book that was introduced earlier.
We allow continuous quantities of the limit orders in the book to be traded against, after which the remainder of the limit order is withdrawn by assumption. This way, the book accommodates the desire to continuously rebalance as a result of price changes. By assumption, if the LOB is filled, there is always enough volume available to accommodate any trading demand of a single trader. Whenever a maker arrives at an empty LOB, he can fill it to "full", meaning that he puts in a limit buy order as well as a limit sell order, both of infinite size. The taker has an opportunity to trade when he arrives at the LOB and sees that it is full (i.e., it has been filled by a maker and not yet been hit by another taker). Whenever he sees this opportunity, the taker will trade almost surely in equilibrium, since the price of the risky asset is continuously changing.
We assume that there is a cost for each transaction, paid by the taker to the maker. This fee equals a fixed proportion πm of the expected gains per trade Γ(·). This assumption provides the basis for the split of the GFT into portions πm and πt in the previous sections. We further assume that whenever execution happens, the rest of the depth (i.e., the remaining limit orders) is automatically canceled. This simplifying assumption implies that the maker responsible for these orders does not have to monitor again to cancel them.
The taker is allowed to consume continuously. Similar to Merton (1969), we assume that each taker maximizes the expected discounted utility of his consumption stream, E[∫₀^∞ e^(−ρt) U(C(t)) dt].
4.2 Endogenizing Gains from Trade
We can now continue by endogenizing the marginal gain per trade Γ(·) and analyzing how it depends on the expected trading frequency τjµ̄/(µ̄ + τ̄) of the involved taker, say, j. To do so, we abstract from the multitude of agents of the previous model and focus on one of the takers, since the relationship between his GFT and his trading frequency depends only on the portfolio selection problem he privately faces, rather than on the interaction with other agents. For this representative taker, let W(t) be the total wealth at time t, C(t) the consumption per unit of time at time t, and W0(t) and W1(t) the amounts of wealth invested in each security. Let the initial portfolio be given by W0(0) = W0 and W1(0) = W1.
To generate a need for portfolio rebalancing, we assume the takers to be risk averse with CRRA utility with relative risk aversion δ.
Our problem differs from that in Merton (1969) partly in that there is a transaction cost that the taker must pay to the maker per trade. However, it is equivalent for the market takers to first solve their wealth problem without transaction costs and then pay the total transaction cost as a fraction of the total GFT.
Proposition 3 (equivalence of the problems whenever transaction costs are paid)
The following two economies lead to the same equilibrium monitoring intensities, trading decisions, consumption decisions, and thus the same gains from trade. Economy 1: the economy described in this section; Economy 2: the economy described in this section, except that the expected sum of the transaction costs is paid before trading starts, but after the monitoring intensities of all agents have been chosen.
Proof. See appendix.
This result follows from the assumption that the transaction cost is proportional
to the GFT, such that the actual objective function for the taker is simply a linear
transformation of the objective of the equivalent transaction-cost free problem.
In light of this, the inter-temporal law of motion of the capital can be written as:

dW0(t) = W0(t) dX0(t)/X0(t) − C(t)dt + Q(t)dP(t),   (12)
dW1(t) = W1(t) dX1(t)/X1(t) − Q(t)dP(t),   (13)

where Q(t) is the process of the transaction quantity of the risky asset, and P(t) the Poisson process associated with trading, with frequency parameter λ. The last terms represent the discontinuous changes of the portfolio weights caused by the discrete trading times. The frequency of this Poisson process is the link between this model and the previous ones.
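To make the law of motion (12)-(13) concrete, the sketch below (an illustrative simulation with arbitrary parameters; the rebalancing rule is a stand-in, not the optimal policy derived later) evolves the two wealth accounts between Poisson trading times and applies a discrete transfer Q at each trade.

import numpy as np

rng = np.random.default_rng(1)
r, alpha, sigma = 0.02, 0.07, 0.2    # asset dynamics (illustrative)
lam, T, dt = 50.0, 1.0, 1e-4         # trading intensity, horizon, time step
w_target = 0.6                       # stand-in rebalancing target for the risky share

W0, W1 = 40.0, 60.0                  # initial wealth in the riskless and risky asset
t = 0.0
while t < T:
    dB = rng.normal(0.0, np.sqrt(dt))
    W0 += W0 * r * dt                        # W0 * dX0/X0 (consumption omitted here)
    W1 += W1 * (alpha * dt + sigma * dB)     # W1 * dX1/X1
    if rng.random() < lam * dt:              # a Poisson trading opportunity arrives
        Q = W1 - w_target * (W0 + W1)        # transfer from the stock to the riskless asset
        W0 += Q                              # jump terms Q dP in (12)-(13)
        W1 -= Q
    t += dt

print(W0, W1, W1 / (W0 + W1))   # risky share hovers near the target between trades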
The optimization problem, with the initial portfolio as the initial condition, is:

J[W0, W1] ≡ max_{C(s), W1(s)} E₀[ ∫₀^∞ e^(−ρs) U[C(s)] ds | W0(0) = W0, W1(0) = W1 ]   (14)
The above formula can be restated in dynamic programming form in order to apply the Bellman Principle of Optimality:

J[W0(t0), W1(t0)] = max_{C(s), W1(s)} E_{t0}[ ∫_{t0}^{t} e^(−ρs) U[C(s)] ds + J[W0(t), W1(t)] | W0(t0), W1(t0) ]   (15)
Then, by Taylor's theorem and the mean value theorem for integrals, there exists t̄ ∈ [t0, t] such that the above equation can be restated as:

J[W0(t0), W1(t0)] = max_{C(t), W1(t)} E_{t0}[ e^(−ρt̄) U[C(t̄)] l + J
  + (∂J/∂W0(t0)) (W0(t) − W0(t0))
  + (∂J/∂W1(t0)) (W1(t) − W1(t0))
  + (1/2) (∂²J/∂W1(t0)²) (W1(t) − W1(t0))²
  + λ[ J(W0(t0) + Q(t0), W1(t0) − Q(t0)) − J(W0(t0), W1(t0)) ] + o(l) ]   (16)

where l ≡ t − t0 is the latency at time t0, and J[W0(t0), W1(t0)] is abbreviated as J. The last term is due to the possibility of a jump in the portfolio processes.
Now apply the E_{t0} operator to each term, eliminate J[W0(t0), W1(t0)] = E_{t0}[J[W0(t0), W1(t0)]] from both sides, evaluate E_{t0}[W0(t) − W0(t0)], E_{t0}[W1(t) − W1(t0)] and E_{t0}[(W1(t) − W1(t0))²], and finally take the limit as l → 0, to obtain:
0 = max_{C(t), W1(t)} { U[C(t)] − ρJ[W0(t), W1(t)] + (∂J/∂W0(t)) (rW0(t) − C(t)) + (∂J/∂W1(t)) αW1(t)
  + (1/2) σ² W1²(t) (∂²J/∂W1²(t))
  + λ[ J(W0(t) + Q(t), W1(t) − Q(t)) − J(W0(t), W1(t)) ] }   (17)
A closed-form solution is not currently available for this problem (see Pham (2009)), but we are able to show the asymptotic concavity of the GFT in the trading speed, as l → 0, using the technique of asymptotic expansion of the solution as in Rogers and Zane (2002).
Proposition 4 (Equilibrium GFT without transaction cost) For each taker, his value function J[W0, W1 | λ = τjµ̄/(µ̄ + τ̄)] is a concave function of τjµ̄/(µ̄ + τ̄) up to second order terms in the inverse of the trading frequency λ, that is, whenever the trading frequency λ is high enough that (1/λ)² = ((µ̄ + τ̄)/(τjµ̄))² is negligible.15
Proof. See appendix.
Our reasoning in the proof is based on an asymptotic expansion. The very high speed of HFTs warrants our assumption that τjµ̄/(µ̄ + τ̄) is very large in the current trading environment.
The concavity comes from the nature of the geometric Brownian price changes. Due to the continuity of the sample paths of this type of process, the price change over an infinitesimal interval dt is of higher order than dt itself, and so, heuristically, is the deviation of the portfolio that arises when a trade does not occur. This translates into marginal gains from trade that shrink faster than the trading interval itself, that is, into concavity.
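The asymptotic expansion used in the proof can be evaluated directly. The sketch below (an illustrative check with arbitrary parameters, restricted to δ < 1 so that U > 0 and the expansion coefficient is negative as claimed) computes the Merton constant, the Merton proportion and the first-order coefficient b1 from equation (31), and verifies numerically that J(x) ≈ a^(−δ)U(W0 + W1) + b1/x is concave in the trading frequency x.

import numpy as np

rho, r, alpha, sigma, delta = 0.05, 0.02, 0.07, 0.2, 0.5   # illustrative, delta < 1
W0, W1 = 40.0, 60.0

def U(w):                                   # CRRA utility
    return w ** (1 - delta) / (1 - delta)

a = (rho + (delta - 1) * (r + (alpha - r) ** 2 / (2 * delta * sigma ** 2))) / delta
w_merton = (alpha - r) / (sigma ** 2 * delta)               # Merton proportion
b1 = -0.5 * sigma ** 4 * a ** (-1 - delta) * U(W0 + W1) * w_merton ** 2 * (1 - w_merton) ** 2

def J(x):                                   # value function to first order in latency 1/x
    return a ** (-delta) * U(W0 + W1) + b1 / x

x = np.linspace(50, 500, 1000)              # high trading frequencies
second_diff = np.diff(J(x), 2)              # discrete second derivative
print(b1 < 0, np.all(second_diff < 0))      # expect: True True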
5 Conclusion
In this paper, we have explored whether competition on speed among stock market participants is likely to trigger arms races, leading to socially wasteful investments. We highlight two economic channels that influence this outcome in opposing and partially offsetting ways. Competition among makers and among takers may indeed trigger an arms race in the classic sense. However, the complementarity between the two sides, namely the increased success rate of trading, may offset this effect if the gains from trade are large enough. Therefore, the likelihood of arms races depends on how gains from trade depend on transaction frequency. We show that gains from trade are likely to be concave in transaction frequency. Under this condition, arms races are more likely than under the standard paradigm in the literature, which features gains from trade that are linear in transaction frequency.

15 While keeping the arguments W0, W1 fixed.
While providing important insights, the model does make some concessions to reality. One of the most often heard concerns is that the model does not allow for the dual role of participants in modern limit order markets. While this is a valid concern, it may not be as grave as one would think. After all, there is a group of market participants that is likely to show a net demand for liquidity, and there is a group that on net will be providing liquidity. This is what in the end generates the welfare gains. After all, trades among makers, which currently are very common, are zero sum within the group of makers (one maker could have been the only intermediary rather than a whole chain). Of course, the simplifying assumption also massively simplifies our problem, leading to better tractability.
Another comment to make on the results of the model is that competition in speed may have positive externalities, in the sense that it boosts technological progress and knowledge. Other industries may benefit from this progress in unanticipated ways. As an example, the internet was developed for internal and largely military purposes, but in an unanticipated way massively improved productivity and living standards across the globe. Unfortunately, incorporating such effects is notoriously difficult due to their unanticipated and uncertain nature. The model could be adjusted for such effects in reduced form, but conclusions would depend crucially on parameters and probability distributions that are very hard to calibrate.
References

Abel, N. H.: 1881, Oeuvres Completes, Grondahl and Son. Editors: L. Sylow and S. Lie.

Aït-Sahalia, Y. and Saglam, M.: 2013, High frequency traders: Taking advantage of speed. NBER Working Paper.

Biais, B., Declerck, F. and Moinas, S.: 2015, Who supplies liquidity, how and when? Working paper.

Biais, B., Foucault, T. and Moinas, S.: 2014, Equilibrium fast trading. Working paper.

Brogaard, J., Hendershott, T. and Riordan, R.: 2014, High-frequency trading and price discovery, Review of Financial Studies 27(8), 2267–2306.

Budish, E., Cramton, P. and Shim, J.: 2013, The high-frequency trading arms race: Frequent batch auctions as a market design response. Working paper.

Carrion, A.: 2013, Very fast money: High-frequency trading on the Nasdaq, Journal of Financial Markets 16(4), 680–711.

Cartea, A. and Penalva, J.: 2012, Where is the value in high-frequency trading?, Quarterly Journal of Finance 2(3), 1250014–1–46.

Foucault, T., Kadan, O. and Kandel, E.: 2013, Liquidity cycles and make/take fees in electronic markets, The Journal of Finance 68(1), 299–341.

Glode, V., Green, R. C. and Lowery, R.: 2012, Financial expertise as an arms race, The Journal of Finance 67(5), 1723–1759.

Hoffmann, P.: 2014, A dynamic limit order market with fast and slow traders, Journal of Financial Economics 113(1), 156–169.

Martinez, V. and Rosu, I.: 2013, High frequency traders, news and volatility. Working paper.

Menkveld, A. and Zoican, M.: 2014, Need for speed? Exchange latency and liquidity. Working paper.

Merton, R.: 1969, Lifetime portfolio selection under uncertainty: The continuous-time case, Review of Economics and Statistics 51(3), 247–257.

Pagnotta, E. and Philippon, T.: 2012, Competing on speed. Working paper.

Pham, H.: 2009, Continuous-time Stochastic Control and Optimization with Financial Applications, Springer Science and Business Media.

Rogers, L. C. G. and Zane, O.: 2002, A simple model of liquidity effects, Advances in Finance and Stochastics, pp. 161–176.

Yueshen, B. Z.: 2014, Queuing uncertainty. Working paper.
Appendices

A Figures
Figure 1: The relationship between over-monitoring and the numbers of makers and takers. We restrict the values of the parameters such that β = γ. We plot the over-monitoring, that is, the equilibrium intensity minus the first best intensity, of either side (columns 1 and 3 for the maker intensity and columns 2 and 4 for the taker intensity), as a function of the profit split ratio πm/πt and the cost coefficient β = γ. The flat surface is the 0 plane; everything above 0 thus represents over-monitoring, while everything below 0 does not. Each one-eighth of the figure corresponds to one of the combinations of (M, N).

Figure 2: The relationship between over-monitoring and the numbers of makers and takers, when Γ(R) = −R + Γ0 (more concave). Compared with Figure 1, the over-monitoring appears more severe.

Figure 3: The relationship between over-monitoring and the numbers of makers and takers, when Γ(R) = −0.2R + Γ0.

Figure 4: The relationship between over-monitoring and the numbers of makers and takers, when Γ(R) = −5R + Γ0.
B Proofs
Proof of Lemma 2: The objective of the SPP is to maximize:

Σ_{j=1}^{N} Γ(τjµ̄/(µ̄ + τ̄)) · τjµ̄/(µ̄ + τ̄) − Σ_{i=1}^{M} βµi²/2 − Σ_{j=1}^{N} γτj²/2,  s.t. µi ≥ 0, ∀i ≤ M; τj ≥ 0, ∀j ≤ N.

Due to the symmetry of the problem, the optimal monitoring intensities of all makers should be equal, and so should those of all takers. Thus, similar to Foucault et al. (2013), the FOCs can simply be written as:

FOC–µi: Γ'(µ̄τ̄/(µ̄ + τ̄)) · (µ̄τ̄/(µ̄ + τ̄)) · τ̄²/(µ̄ + τ̄)² + Γ(µ̄τ̄/(µ̄ + τ̄)) · τ̄²/(µ̄ + τ̄)² = β µ̄/M   (18)
FOC–τj: Γ'(µ̄τ̄/(µ̄ + τ̄)) · (µ̄τ̄/(µ̄ + τ̄)) · µ̄²/(µ̄ + τ̄)² + Γ(µ̄τ̄/(µ̄ + τ̄)) · µ̄²/(µ̄ + τ̄)² = γ τ̄/N   (19)

Dividing (18) by (19) and defining r̂ as the first-best ratio µ̄/τ̄, we get 1/r̂² = (β/γ) r̂ (N/M), thus:

r̂ = (Mγ/(Nβ))^(1/3)   (20)
Proof of Proposition 1: With constant gains per trade Γ0, the objective of the SPP can be written as the sum of the agents' gross trading payoffs net of monitoring costs:

Σ_{i=1}^{M} [Γ0 πm µiτ̄/(µ̄ + τ̄) − βµi²/2] + Σ_{j=1}^{N} [Γ0 πt µ̄τj/(µ̄ + τ̄) − γτj²/2] = Γ0 µ̄τ̄/(µ̄ + τ̄) − Σ_{i=1}^{M} βµi²/2 − Σ_{j=1}^{N} γτj²/2

FOC–µi: Γ0 τ̄²/(µ̄ + τ̄)² = βµi   (21)
FOC–τj: Γ0 µ̄²/(µ̄ + τ̄)² = γτj   (22)

Observing (21) and (22), we conclude that the solutions are symmetric. Dividing (21) by (22) and defining r̂ as the first-best ratio µ̄/τ̄, we get 1/r̂² = (β/γ) r̂ (N/M), thus:

r̂ = (Mγ/(Nβ))^(1/3)   (23)

Plugging (23) back into (21) and (22), we get the final result:

µ̂ = (Γ0/β) τ̄²/(µ̄ + τ̄)² = (Γ0/β) · 1/(r̂ + 1)²   (24)
τ̂ = (Γ0/γ) µ̄²/(µ̄ + τ̄)² = (Γ0/γ) · r̂²/(r̂ + 1)²   (25)

Thus the SPP total welfare is:

Π̂ = Γ0 Mµ̂·Nτ̂/(Mµ̂ + Nτ̂) − (1/2)βMµ̂² − (1/2)γNτ̂²   (26)
Proof of Proposition 2: The objective of maker i is to maximize Γ0 πm µiτ̄/(µ̄ + τ̄) − βµi²/2.

FOC–µi: Γ0 πm (τ̄² + µ−iτ̄)/(µ̄ + τ̄)² = βµi   (27)

Similar to Foucault et al. (2013), equations A6–A11, all equilibria (EQ) are symmetric, and the EQ allocations are:

EQ monitoring intensity for each maker: µ* = [M + (M − 1)r*] / [M(1 + r*)²] · Γ0πm/β;
EQ monitoring intensity for each taker: τ* = [r*((1 + r*)N − 1)] / [N(1 + r*)²] · Γ0πt/γ;

where r* (interpreted as r* ≡ Mµ*/(Nτ*)) is the positive real solution to:

N r³ + (N − 1)r² − (M − 1)zr − Mz = 0   (28)

with z ≡ γπm/(βπt).

The EQ "total welfare", defined as the sum of all agents' profits, is:

Π* = Γ0 Mµ*·Nτ*/(Mµ* + Nτ*) − (1/2)βM(µ*)² − (1/2)γN(τ*)²   (29)
Proof of Proposition 3: Recall that the decisions of taker j happen in two stages. Stage 1: he commits to a monitoring intensity τj. Stage 2: given τj, he implements a trading decision at every point in time at which trading becomes available, as well as a continuous time consumption decision.
First, consider the problem in the second stage. At every trading opportunity, the taker chooses the trading amount to maximize GFT ≡ J[W0, W1] − J′[W0, W1], where J′[W0, W1] is the expected value if he does not trade this time. Note also that the transaction cost he pays if he does trade is proportional to this maximand: cost ≡ c·GFT ≡ c{J[W0, W1] − J′[W0, W1]}. Maximizing J[W0, W1] − J′[W0, W1] and maximizing (1 − c){J[W0, W1] − J′[W0, W1]} yield the same maximizer; that is, with or without the transaction costs, each taker will choose the same trading policy. Thus, in either economy in the proposition, the trading policies and the consumption policies are the same. What is more, as a result, the functional form of the value function in terms of the expected trading frequency is also the same. Thus, looking at the problem taker j faces in stage 1, in either economy in the proposition he would choose the same monitoring intensity as well.
Proof of Proposition 4: A closed-form solution is not currently available for this problem (Pham (2009)), so we use an asymptotic expansion of the solution, as in Rogers and Zane (2002). The value function of our portfolio choice and consumption problem, J[W0, W1], can be expressed as the value function of the Merton problem (Merton (1969)), adjusted by a series in the (small) latency:

J[W0, W1] = a^(−δ) U(W0 + W1) + Σ_{n=1}^{∞} (bn/n!) lⁿ   (30)

Here a^(−δ) U(W0 + W1) is the value function of the Merton problem, where a ≡ (1/δ)[ρ + (δ − 1)(r + (α − r)²/(2δσ²))] is a coefficient composed entirely of exogenous parameters. In addition, l ≡ (µ̄ + τ̄)/(τjµ̄) is the latency of the trades for taker j, where the subscript is suppressed on the left hand side. The bn, n ≥ 1, are the coefficients that we determine in the expansion.

Following Rogers and Zane (2002), we can solve for:

b1 = −(1/2) σ⁴ a^(−1−δ) U(W0 + W1) w²(1 − w)²   (31)

where w ≡ (α − r)/(σ²δ) is the Merton proportion, which is composed entirely of exogenous parameters of this problem. Thus we always have b1 < 0.

Assuming that the latency l is very small, which is very much true in the context of HFTs, the terms in b2, b3, . . . vanish, leaving the value function influenced only by the first order of the latency:

J[W0, W1] ≈ a^(−δ) U(W0 + W1) + b1 l = a^(−δ) U(W0 + W1) + b1 (µ̄ + τ̄)/(τjµ̄)   (32)

Then it can be seen by taking derivatives that the value function is asymptotically concave in the trading speed:

∂²J[W0, W1] / ∂(τjµ̄/(µ̄ + τ̄))² < 0,  for τjµ̄/(µ̄ + τ̄) big enough   (33)
C Notation Summary

  Symbol    Support          Description

Parameters
  M         [1, ∞)           the total number of makers
  N         [1, ∞)           the total number of takers
  Γ0        [0, ∞)           marginal gain per trade in the linear GFT setting
  πm        [0, 1]           share of the gains from trade captured by the maker
  πt        [0, 1]           share of the gains from trade captured by the taker
  β         [0, ∞)           technological cost parameter for each maker
  γ         [0, ∞)           technological cost parameter for each taker
  r         [0, ∞)           the risk free rate
  α, σ      [0, ∞)           the parameters of the geometric Brownian motion
  λ         [0, ∞)           the parameter of the Poisson process Pj(t)

States of nature
  X0(t)     [0, ∞)           the price trajectory of the riskless asset
  X1(t)     [0, ∞)           the price trajectory of the stock
  B(t)      (−∞, ∞)          the Brownian motion driving the stock price X1(t)
  Pj(t)     [0, ∞)           trader-specific Poisson jump process recording transactions

Indices
  i         {1, . . . , M}   maker i
  j         {1, . . . , N}   taker j
  t         [0, ∞)           time

Decision variables
  µi        [0, ∞)           the intensity parameter of maker i
  τj        [0, ∞)           the intensity parameter of taker j
  W0(t)     [0, ∞)           the wealth invested in the riskless asset at time t
  W1(t)     [0, ∞)           the wealth invested in the stock at time t
  Q(t)      (−∞, ∞)          the amount transferred from the stock to the riskless asset at time t (when trading is possible)
  C(t)      [0, ∞)           the amount of consumption at time t