Jeffcoat, C.E. (1973). "Some related queueing models with dependent service and inter-arrival times."

This research was supported by the Office of Naval Research, Contract No. ONR-N00014-67-A-0321-0002.
SOME RELATED QUEUEING MODELS WITH DEPENDENT SERVICE AND
INTER-ARRIVAL TIMES

Colin E. Jeffcoat

Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 876
June, 1973
TABLE OF CONTENTS

CHAPTER                                                            PAGE

ACKNOWLEDGMENTS                                                     iii
NOTATION                                                             iv
1. INTRODUCTION                                                       1
   1.1  Introduction to Queueing Theory                               1
   1.2  Correlated Queues                                             2
   1.3  Definition of the Basic Models                                4
   1.4  Summary of Known Results                                      8
   1.5  Outline of the Dissertation                                   8
2. THE STATIONARY DISTRIBUTION OF QUEUEING TIMES FOR THE
   FORWARD MODELS                                                    10
   2.1  Introduction                                                 10
   2.2  Wiener-Hopf Factorization                                    10
   2.3  More Practical Results                                       22
3. STATIONARY BEHAVIOUR OF THE BACKWARD MODELS                       33
   3.1  Introduction                                                 33
   3.2  Total Waiting Times for the General Backward Model           33
   3.3  The Backward Model with Exponential Arrivals                 39
4. BUSY PERIODS FOR THE BACKWARD MODELS                              45
   4.1  Introduction                                                 45
   4.2  The Backward Model with Exponential Arrivals                 46
   4.3  Mean Length of a Busy Period                                 71
   4.4  The Backward Transition Model with λ_0 = ∞                   84
APPENDIX                                                             88
BIBLIOGRAPHY                                                         93
COLIN E. JEFFCOAT. Some Related Queueing Models with Dependent Service and Inter-arrival Times. (Under the direction of WALTER L. SMITH).

ABSTRACT

In this dissertation we discuss some related queueing models which allow for a type of dependence between the service and inter-arrival times. We assume that there are a finite number of customer types and that the service time of a customer depends on either the preceding inter-arrival time (backward models) or the following inter-arrival time (forward models) through the customer type. Particular cases of the general models are the transition models, for which arrivals are generated by transitions in a Markov chain in continuous time. Possible applications for these models are production lines, doctors' appointments and storage situations.

We analyse the stationary behaviour of the forward and backward models. The case of the forward model has been examined before, and quite general results are possible. We consider how these results relate to the forward transition model. A complete analysis of the stationary behaviour for backward models is achieved only in the particular case of exponential arrivals, where we show how to compute the Laplace transform of the stationary distribution of total waiting times. We also analyse the busy period distribution for this last model, and show how to compute the busy period transform.
ACKNOWLEDGMENTS

I wish to thank Dr. W. L. Smith, my thesis adviser, for his help and encouragement during the preparation of this work. I would like to thank the members of my committee, Dr. W. Hoeffding, Dr. R. J. Cannon, Dr. K. J. C. Smith and Dr. E. J. Wegman, for their cooperation, and Dr. S. Cambanis for his assistance at a crucial stage. Mrs. Pamela Smith did an excellent job of the typing, for which I am grateful. I would like to thank my wife, Mary, for her constant support. In helping me through my stay at the University of North Carolina I would like to thank the Department of Statistics, the taxpayers of North Carolina and most of all my friends.
NOTATION

arg(s)        argument of s
a.s.          almost surely, i.e. with probability 1
def           definition
deg           degree (of a polynomial)
d.f.          distribution function
iid           independent and identically distributed
Im(s)         imaginary part of s
I_n           the n×n unit matrix
ℝ             the set of real numbers
Re(s)         the real part of s

We abuse the usual set notation by writing {s : s has property p} as {s has property p}.

F°(s), F*(s) denote the Laplace and Laplace-Stieltjes transforms of F (see Appendix).

All integrals of the form ∫_0^∞ f(x) F(dx) are understood to include the lower limit.

f(x) = o(g(x)) as x → 0 or x → ∞ means f(x)/g(x) → 0 as x → 0 or x → ∞.

f(x) = O(g(x)) as x → 0 or x → ∞ means f(x)/g(x) is bounded as x → 0 or x → ∞.

□ denotes the end of a proof or result.
CHAPTER 1

INTRODUCTION
1.1 Introduction to Queueing Theory
A queue is created when customers arrive at a service facility
and wait for service in the event that it is not immediately available.
The probabilistic description of a queueing model is specified
by
a)
the distribution of arrival times,
b)
the number of servers,
c)
the queue discipline, i.e. the order of service,
d)
the distribution of service times.
Throughout this dissertation we consider only single-server
queues in which the customers are served in order of arrival (first
come, first served).
If two or more customers arrive simultaneously,
we assume that there exists a procedure for deciding precedence.
Observation of the queue is assumed to begin at time 0. Let U_0 be the time of the first arrival, and U_n the time between the nth and (n+1)th arrivals (n = 1,2,...). V_n will denote the service time of the nth customer (n = 1,2,...), and V_0 the sum of the service times of any customers already in the system at time 0.
The mathematically interesting aspects of a queue are the
number of customers in the system at one time, the amount of time
a customer has to wait for service and the duration of busy periods,
i.e. the time intervals during which the server is continuously
busy.
We define the queueing time of a customer to be the time
between his arrival and commencement of service.
Total waiting
time is defined to be the time from arrival to completion of
service.
Analysis of the transient (time-dependent) behaviour of a queue
is usually quite complicated.
A common procedure is to show that
a unique equilibrium distribution exists and assume in applications
that the system has reached equilibrium.
We use the terms stationary distribution and equilibrium distribution interchangeably to mean a limiting distribution in some sense. For example, if {Q_n} is a sequence of random variables with F_n the d.f. of Q_n, then the stationary distribution of {Q_n} is lim_{n→∞} F_n, provided this limit exists and is a proper d.f.
1.2 Correlated Queues

Most investigations of the single-server queue assume that the inter-arrival times {U_n} and service times {V_n} are independent sequences of iid random variables. We wish to consider queueing models for which this independence assumption does not hold; following Conolly [2] we call these correlated queues.
Several authors have considered various types of dependence
between service and inter-arrival times.
D. R. Cox and W. L. Smith analysed a model which had {U_n} and {V_n} related through the queue size at arrival epochs [5; § 2.4]. C. M. Harris discussed in [7] a similar case where service time is dependent on the queue size at the moment when service begins.
M. F. Neuts investigated in [16] a model where arrivals and service times were related through an extraneous phase process: during phase i arrivals are Poisson with parameter λ_i and services initiated during this phase have d.f. H_i.
R. M. Loynes generalized some results of D. V. Lindley concerning equilibrium distributions to the case where {(U_n, V_n)} is a strictly stationary metrically transitive sequence [12]. In a subsequent paper [13], he showed how the stationary distribution of queueing time may be found when the queue has a certain structure.
We are interested in two related subclasses of correlated queues,
which may be characterized as follows:
(I)  queues with backward dependence: U_n and V_{n+1} are dependent, n = 0,1,2,..., all other pairs U_n, V_m being independent;

(II) queues with forward dependence: U_n and V_n are dependent, n = 0,1,2,..., all other pairs U_n, V_m being independent.
With backward dependence, the service time of a customer is related
to the arrival interval preceding his arrival.
With forward dependence the service time is related to the following arrival interval.
B. W. Conolly in [2] and [3] (with N. Hadidi) analysed a particular backward model with V_{n+1} directly proportional to U_n.
The motivation given for studying this type of model was that such queues may be made self-regulating: according to the value of U_n the server may adjust V_{n+1} in order to stabilize the queue, i.e. to reduce waiting times for a fixed traffic intensity.
F. I. John discussed a general method applicable to queues with forward dependence. As possible applications he mentions a production line and appointments at a doctor's office. In both cases the time between arrivals "should depend strongly on the estimated service times" [8].
Conolly claims in [2] that forward models cannot be interpreted
usefully as self-regulating systems.
However, such an interpretation
can certainly be made if service times are ascertainable immediately
on arrival (as in storage applications), or if an estimate of
service time is available then, as in John's examples.
Backward dependence does seem more natural, however, since it
is quite plausible that the service time of a customer might influence the time required for arrival; this is especially so when the
queue is interpreted as an infinite store (or dam) as in [20] , [14],
for then service times correspond to size of deliveries.
1.3 Definition of the Basic Models
We now define the queueing models that we deal with in this
dissertation.
First we define the general backward and forward
models, then the particular cases of the transition models.
The so-called general models are not the most general models as described in the previous section. Instead they are the special cases of (I) and (II) where there are a finite number of customer types, and {U_n} and {V_n} are related only through the customer types. This does restrict the range of application considerably, but some such simplification appears necessary if usable results are to be obtained (cf. § 1 of [13]).
We define the general backward model as follows: there are K customer types represented by 1,2,...,K. Let Z_n be the type of the nth customer. The (unconditional) probability that an arriving customer be of type j is π_j, where π_j > 0 for each j and Σ_j π_j = 1. Then we suppose that the sequences {U_n : n ≥ 1} and {V_n : n ≥ 1} satisfy

    Pr{U_n ≤ x | {Z_m},{V_m}} = A_{Z_{n+1}}(x)   (n = 1,2,...),
    Pr{V_n ≤ x | {Z_m},{U_m}} = B_{Z_n}(x)       (n = 1,2,...).

Clearly U_n and V_{n+1} are related through Z_{n+1}, but conditionally independent given Z_{n+1}.

The definition of the general forward model is the same except that U_n is related to Z_n, i.e. Pr{U_n ≤ x | {Z_m},{V_m}} = A_{Z_n}(x) (n = 1,2,...). Thus for the general forward model, U_n and V_n are related through Z_n, but conditionally independent given Z_n.
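The dependence structure of the general backward model can be illustrated by a short simulation. The sketch below is a hypothetical two-type example (the probabilities and exponential rates are illustrative, not taken from the text): U_n is drawn from A indexed by the following customer's type Z_{n+1}, and V_n from B indexed by Z_n, so that U_n and V_{n+1} are dependent only through the shared type.

```python
import random

random.seed(1)

# Hypothetical two-type example: pi[j] is the probability that a
# customer is of type j; A_j and B_j are taken exponential with the
# illustrative rates below.
pi = [0.6, 0.4]
arr_rate = [2.0, 0.5]   # rate of A_j (inter-arrival d.f. for type j)
svc_rate = [3.0, 1.0]   # rate of B_j (service d.f. for type j)

def sample_backward(n):
    """Backward model: U_n has d.f. A_{Z_{n+1}} and V_n has d.f. B_{Z_n},
    so U_n and V_{n+1} are dependent only through the type Z_{n+1}."""
    Z = [random.choices([0, 1], weights=pi)[0] for _ in range(n + 1)]
    U = [random.expovariate(arr_rate[Z[k + 1]]) for k in range(n)]
    V = [random.expovariate(svc_rate[Z[k]]) for k in range(n + 1)]
    return U, V

U, V = sample_backward(10000)
# Marginally each U_n is a two-point mixture of exponentials with mean
# pi_0/2.0 + pi_1/0.5 = 1.1; the sample mean should be close to this.
print(sum(U) / len(U))
```

Marginally the inter-arrival times are a finite mixture of the A_j, as in the text; the dependence appears only between U_n and V_{n+1}.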
The transition models are special cases of these general models,
with inter-arrival distributions belonging to a subclass of those
distributions with rational Laplace transforms.
In particular, the
arrivals correspond to certain transitions in a finite Markov chain
in continuous time.
We suppose that {Z_t, t ≥ 0} is a homogeneous Markov chain with state space {0,1,...,K} and infinitesimal transition probabilities

    Pr{Z_{t+h} = j | Z_t = i} = (1 - λ_i h)δ_{ij} + λ_i h p_{ij} + o(h)

for t,h > 0, where λ_0, λ_1, ..., λ_K are positive real numbers and P = [p_{ij}] is an irreducible Markov transition matrix. Realizations of (Z_t) have the following form: the process stays in state i for a time period which is exponentially distributed with parameter λ_i, then jumps to state j according to the probabilities {p_{ij}}, j ∈ {0,1,...,K} (see [9; pp. 226-9]). Note that a "jump" is not necessarily a change of state, since transitions i → i may be possible. We call (Z_t) the guiding chain.
We define the backward transition model as follows. There are K+1 customer types represented by 0,1,...,K. Arrivals are generated by transitions into state 0 of the guiding chain; in particular, a transition j → 0 is identified with the arrival of a type j customer, with service time d.f. B_j.

The forward transition model is defined similarly, except that arrivals correspond to transitions out of state 0: a transition 0 → j is identified with the arrival of a type j customer.
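The guiding-chain construction can be sketched numerically. The following is a minimal simulation of the backward transition model with a hypothetical 3-state chain (the holding rates and jump matrix are illustrative assumptions): the chain is run forward, and each jump j → 0 is recorded as the arrival of a type-j customer.

```python
import random

random.seed(2)

# Hypothetical guiding chain on {0, 1, 2}: exponential holding rates
# lam[i] and an irreducible jump matrix P = [p_ij] with p_00 = 0.
lam = [5.0, 1.0, 2.0]
P = [[0.0, 0.5, 0.5],
     [1.0, 0.0, 0.0],
     [0.8, 0.2, 0.0]]

def arrivals(t_max):
    """Simulate (Z_t) and record each transition j -> 0 as the arrival
    of a type-j customer, together with the arrival epoch."""
    t, state = 0.0, 1
    out = []
    while t < t_max:
        t += random.expovariate(lam[state])      # exponential holding time
        nxt = random.choices([0, 1, 2], weights=P[state])[0]
        if nxt == 0:
            out.append((t, state))               # arrival of type `state`
        state = nxt
    return out

arr = arrivals(1000.0)
inter = [b[0] - a[0] for a, b in zip(arr, arr[1:])]
print(len(arr), sum(inter) / len(inter))
```

Each inter-arrival interval here is a sum of a variable number of exponential phases, which is exactly the mixture-of-Erlangs structure noted below for the transition models.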
The inter-arrival distributions for the transition models are
(possibly infinite) mixtures of Erlang distributions, since each
arrival interval is the sum of a variable number of exponential
phases.
Actually, it is possible to show (as in Lemma 2.12) that each A_j has a rational Laplace transform for these models. As noted by Loynes in [14], it is possible to construct a large class of inter-arrival distributions by varying K, {λ_i} and P (see also [4]).
An annoying feature of the transition models is that every arrival interval begins or ends with an exponentially distributed phase with parameter λ_0. The resulting difficulties in interpretation may be minimised by taking λ_0 very large; it would be convenient to be able to set λ_0 = ∞. Instantaneous states may lead to theoretical difficulties; however, it is possible to redefine the transition models so as to incorporate the effect of having λ_0 = ∞ provided p_00 = 0. This is done for the backward model in Section 4.4.
With λ_0 = ∞, we may represent the backward model with exponential arrivals (as treated in Chapters 3 and 4) as a transition model with K = 1 and a suitable 2×2 transition matrix P. More complicated inter-arrival distributions can be constructed by choosing submatrices of P appropriately. For example, Erlangian and hyperexponential inter-arrival distributions are generated by 0-1 submatrices of the appropriate form: a chain of phases traversed in series for the Erlangian case, and parallel phases each returning to state 0 for the hyperexponential case.
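The Erlangian case of the phase construction is easy to check numerically: passing through k exponential phases in series with a common rate yields an Erlang(k) inter-arrival time. A small sketch (the number of phases and the rate are hypothetical):

```python
import random

random.seed(5)

# Hypothetical cyclic submatrix of k = 3 phases, each exponential with
# rate lam: the inter-arrival time is the sum of the k phase lengths,
# i.e. an Erlang(3, lam) random variable.
lam, k, n = 2.0, 3, 50000
samples = [sum(random.expovariate(lam) for _ in range(k)) for _ in range(n)]
mean = sum(samples) / n
print(mean)   # the analytic Erlang(3, 2) mean is k/lam = 1.5
```

A hyperexponential inter-arrival distribution would instead be obtained by choosing one of several parallel single-phase branches at random, giving a mixture of exponentials.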
1.4 Summary of Known Results

The general backward model does not seem to have been investigated before.
Results concerning the stationary distribution of queueing
time are implied for the general forward model by results in [13].
W. L. Smith analysed the transient behaviour of the forward
transition model in some unpublished work (c. 1958 - see [21]).
He
obtained an explicit expression for the Laplace transform of the
busy period distribution.
R. M. Loynes considered in [14] what is essentially a more
general transition model, including both the forward and backward
transition models as special cases.
His analysis of the transient
behaviour did not lead to any explicit solutions.
1.5 Outline of the Dissertation

The main objective of this dissertation is to obtain some useful results for the backward models. We have not been able to derive explicit solutions for the busy period transform for the backward transition model as in the case of the forward transition model.
In Chapter 2 we apply the methods of W. L. Smith [20] to the general forward model to obtain results on the stationary distribution of queueing time. Actually, the main results of this chapter are contained in [13]; they are derived independently here and included for completeness and comparison with the results of Chapter 3. We also show how these results apply to the forward transition model.
In Chapter 3 we consider the stationary behaviour of the general backward model. Instead of proceeding as in Chapter 2 we must consider here the total waiting times. To solve the resulting integral equation we are forced to restrict ourselves to the case of exponential arrivals.
In Chapter 4 we consider busy periods for the backward models. In the case of exponential arrivals we generalize a procedure used in [5] to obtain the busy period transform. This procedure may be partially generalized to the backward transition model with λ_0 = ∞.
CHAPTER 2

THE STATIONARY DISTRIBUTION OF QUEUEING TIME FOR THE FORWARD MODELS
2.1 Introduction

The stationary distribution of queueing time was investigated for the GI/G/1 queue by D. V. Lindley in [11] and W. L. Smith in [20]. In this chapter we apply Smith's methods to the general forward model.
Section 2.2 is essentially a complete derivation of Smith's main
results with only a few minor changes.
In Section 2.3 we obtain some more practical results (following
§§ 3.4, 3.5, 4.2 and 4.3 in [20]) by making further assumptions on
the service time or inter-arrival time distributions.
We develop the
latter case in more detail and show how it applies to the forward
transition model.
2.2 Wiener-Hopf Factorization
In this section we consider the general forward model with K customer types. We describe this model as follows: at each arrival epoch the probability that the arriving customer be of type j is π_j, independently of any previous events (j = 1,2,...,K). We assume π_j > 0 for all j. The service time d.f. of a type j customer is B_j, and the time to the next arrival after a type j customer has d.f. A_j.

Let Z_n denote the type of the nth customer, V_n his service time, and U_n the time between the nth and (n+1)th arrivals. Then for n ≥ 1, we have Pr{Z_n = j} = π_j, Pr{V_n ≤ x | Z_n = j} = B_j(x), and Pr{U_n ≤ x | Z_n = j} = A_j(x). V_1, V_2, ... are iid random variables, as are U_1, U_2, ...; we leave unspecified the distributions of U_0, the time to the first arrival, and V_0, the residual service time at 0. U_n and V_n are conditionally independent given Z_n for n ≥ 1.
Let X_n = V_n - U_n (n = 0,1,2,...). Then X_1, X_2, ... are iid random variables, which enables us to apply the methods of Lindley and Smith.
We assume that A_j and B_j have finite means

    a_j = ∫_0^∞ x A_j(dx),    b_j = ∫_0^∞ x B_j(dx).

Let a = Σ_j π_j a_j and b = Σ_j π_j b_j. Set

    H(x) = Σ_{j=1}^K π_j H_j(x),   where   H_j(x) = ∫_0^∞ B_j(x+y) A_j(dy) = Pr{X_n ≤ x | Z_n = j}   (n = 1,2,...).

Then H is the d.f. of X_n (n = 1,2,...).
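When A_j and B_j are exponential the integral defining H_j can be evaluated in closed form, which gives a convenient check of the definition. The sketch below (with hypothetical rates α, β) compares the closed form of H_j(x) = ∫_0^∞ B_j(x+y) A_j(dy) against a direct numerical quadrature of the integral.

```python
import math

def H_exp(x, alpha, beta):
    """Closed form of H_j(x) = ∫_0^∞ B_j(x+y) A_j(dy) when A_j and B_j
    are exponential with rates alpha and beta (so X = V - U)."""
    if x >= 0:
        return 1.0 - math.exp(-beta * x) * alpha / (alpha + beta)
    return math.exp(alpha * x) * beta / (alpha + beta)

def H_quad(x, alpha, beta, n=100000, ymax=40.0):
    """Midpoint-rule evaluation of the defining integral."""
    h = ymax / n
    total = 0.0
    for k in range(n):
        y = (k + 0.5) * h
        B = 1.0 - math.exp(-beta * (x + y)) if x + y > 0 else 0.0
        total += B * alpha * math.exp(-alpha * y) * h
    return total

alpha, beta = 1.0, 2.0
for x in (-1.0, 0.0, 1.5):
    print(x, H_exp(x, alpha, beta), H_quad(x, alpha, beta))
```

Note that H is a proper d.f. on the whole real line: it is positive for x < 0 since X_n = V_n - U_n can be negative.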
Let Q_n be the queueing time of the nth customer, i.e. the time from his arrival to commencement of service (n = 1,2,...). Take Q_0 = 0. Then we have the familiar relation (see [11])

    Q_{n+1} = max(Q_n + X_n, 0)   (n = 0,1,2,...).                (2.1)

Note that Q_{n+1} is determined by {(U_k,V_k) : 0 ≤ k ≤ n}, so clearly Q_n is independent of X_n, and {Q_n} is a Markov process. For n = 0,1,2,... let F_n(x) = Pr{Q_n ≤ x}. Then for x ≥ 0
    F_{n+1}(x) = ∫_0^∞ Pr{Q_{n+1} ≤ x | Q_n = u} F_n(du) = ∫_0^∞ Pr{X_n ≤ x-u} F_n(du)

by (2.1), since Q_n is independent of X_n. Thus

    F_{n+1}(x) = ∫_0^∞ H(x-u) F_n(du),   for x ≥ 0, n = 0,1,2,...                (2.2)
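Relation (2.1) makes {Q_n} easy to simulate directly. The sketch below takes the special case of a single customer type with exponential A and B (an M/M/1-type example with hypothetical rates, chosen so that the first-moment condition below holds) and iterates the recursion to estimate the stationary behaviour empirically.

```python
import random

random.seed(3)

# Hypothetical single-type case: A exponential with rate alpha,
# B exponential with rate beta, so a = 1/alpha = 1 > b = 1/beta = 0.5.
alpha, beta = 1.0, 2.0

def simulate(n):
    q = 0.0                  # Q_0 = 0
    qs = []
    for _ in range(n):
        x = random.expovariate(beta) - random.expovariate(alpha)  # X_n = V_n - U_n
        q = max(q + x, 0.0)  # relation (2.1)
        qs.append(q)
    return qs

qs = simulate(200000)
burn = qs[50000:]            # discard an initial transient
p_empty = sum(1 for q in burn if q == 0.0) / len(burn)
# For this M/M/1 case the stationary probability that an arriving
# customer finds the server free is 1 - alpha/beta = 0.5.
print(p_empty)
```

The long-run fraction of customers with Q_n = 0 approaches the stationary probability of finding the server free, consistent with the equilibrium theory developed below.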
We show that under the usual first moment condition {Q_n} has a stationary distribution, i.e. lim_{n→∞} F_n(x) exists and gives a proper distribution function. Assume

    Σ_{j=1}^K π_j(b_j - a_j) < 0,   i.e.   EX_n < 0.                (A1)

LEMMA 2.1: Under assumption (A1), {Q_n} has a stationary distribution.

PROOF: See §§ 3,4 of [11]. Lindley's proof applies to our model [see also the note on p. 278 of [11]]. □

Let F be the (unique) stationary distribution function, i.e. F(x) = lim_{n→∞} F_n(x). Then F satisfies
    F(x) = ∫_0^∞ H(x-u) F(du)   (x ≥ 0)                (2.3)

from (2.2) and the dominated convergence theorem. (2.3) corresponds to Equation (4) in [20].
We now follow the Wiener-Hopf procedure as outlined by Smith. First we define

    F_1(x) = ∫_0^∞ H(x-u) F(du)   for x < 0,
    F_1(x) = 0                    for x ≥ 0.

Then for any real x we have

    F(x) + F_1(x) = ∫_0^∞ H(x-u) F(du).                (2.4)
If we take Laplace-Stieltjes transforms of both sides of (2.4), then we obtain

    F*(s) + F_1*(s) = H*(s) F*(s).                (2.5)

We know that both sides of (2.5) are finite at least for s on the imaginary axis; we can extend this range by making the following
assumption:

(A2) There exists μ' > 0 such that for each j ∈ {1,2,...,K}

    ∫_0^∞ e^{μ'x} A_j(dx) < ∞   and   ∫_0^∞ e^{μ'x} B_j(dx) < ∞.

Assumption (A2) implies that B_j*(s) is analytic in {Re(s) > -μ'} and A_j*(-s) is analytic in {Re(s) < μ'} for each j. It follows that

    H*(s) = Σ_{j=1}^K π_j A_j*(-s) B_j*(s)

is analytic in the strip S_μ = {s : |Re(s)| ≤ μ} for some μ > 0. Fix μ with this property.

LEMMA 2.2: Under assumption (A2), F_1*(s) converges for Re(s) ≤ μ.
PROOF: Let A(x) = Σ_{j=1}^K π_j A_j(x), i.e. A is the unconditional d.f. of U_n. Assumption (A2) implies that

    1 - A(x) = o(e^{-μx})   as x → +∞

(Theorem 2.2b in [23]). Now for x > 0

    F_1(-x) = ∫_0^∞ H(-x-u) F(du) ≤ H(-x) ≤ 1 - A(x - 0).

Therefore F_1(-x) = o(e^{-μx}) as x → +∞, so by Theorem 2.1 in [23], F_1*(s) converges for Re(s) ≤ μ. □

From assumption (A2) we see that (2.5) is valid at least for all values of s in the strip {0 ≤ Re(s) ≤ μ}.

Our next step is to factorize 1 - H*(s). We need a further assumption:

(A3) There exists δ > 0 such that H*(s) = O(|s|^{-δ}) as |s| → ∞ in S_μ.

REMARK: Assumptions (A2) and (A3) are satisfied in most practical applications of queueing theory, e.g. if A_1,...,A_K, B_1,...,B_K are exponential, hyperexponential or Erlangian distribution functions. (A3) certainly implies that H cannot be a lattice d.f., for if H puts all probability at points a + md, m = 0, ±1, ±2, ..., then |H*(2nπi/d)| = 1 for n = 1,2,..., which contradicts (A3).

Throughout the rest of this section we will assume (A1), (A2) and (A3) implicitly.
LEMMA 2.3: We can choose μ > 0 so that 1 - H*(s) has only one zero in S_μ.

PROOF: H*(s) = 1 + (a-b)s + o(|s|) as s → 0, so 1 - H*(s) has a single zero at s = 0. There are no other zeros on the imaginary axis, since |H*(iα)| = 1 for some α ∈ ℝ - {0} would make H a lattice d.f., and this is ruled out by assumption (A3). It also follows from (A3) that there exists M > 0 such that 1 - H*(s) has no zeros in S_μ ∩ {s : |Im(s)| > M}. Inside the bounded rectangle S_μ ∩ {s : |Im(s)| ≤ M}, 1 - H*(s) is analytic and therefore can have only a finite number of zeros. Since each of these zeros must lie off the imaginary axis, we can choose μ so that the zero at s = 0 is the only one contained in S_μ. □

We now consider the function g(s) defined as follows: fix A real with A > μ, and for s ∈ S_μ - {0} set

    g(s) = (s-A){1 - H*(s)}/s.

Clearly g(s) is analytic in S_μ - {0}. It is easy to show that g(s) has a power series expansion around s = 0, with g(0) = A(a-b) > 0, so that g(s) is analytic everywhere in S_μ.
LEMMA 2.4: g(s) is bounded in S_μ, and also bounded away from zero.

PROOF: By (A3), |1 - H*(s)| ≤ 1 + M|s|^{-δ} for |s| large, and (s-A)/s is bounded in S_μ ∩ {s : |s| ≥ M}; thus g is bounded there, and indeed |g(s)| → 1 as |s| → ∞ in S_μ. Also g(s) is continuous, and hence bounded, on the closed, bounded set S_μ ∩ {s : |s| ≤ M}. Thus g is bounded in S_μ. Since g has no zeros in S_μ and |g(s)| → 1 as |s| → ∞, it follows by the continuity of g that g is bounded away from zero in S_μ. □

We next show that arg{g(s)} may be defined continuously in S_μ. Consider first points on the imaginary axis: suppose s = iτ with τ ∈ ℝ - {0}. In the proof of Lemma 2.3 we showed that |H*(iτ)| < 1, which implies Re{1 - H*(iτ)} > 0. Since also Re{(iτ-A)/(iτ)} = 1, both arg{1 - H*(iτ)} and arg{(iτ-A)/(iτ)} lie in (-π/2, π/2) and vary continuously. Since g(0) = A(a-b) > 0, it follows that we can define arg{g(s)} continuously on the imaginary axis so that |arg{g(s)}| < π.

For |s| large, we have from the proof of Lemma 2.4 that g(s) → 1 as |s| → ∞ in S_μ, so that arg{g(s)} → 0 as |s| → ∞ in S_μ. Thus there exists M > 0 such that |arg{g(s)}| < π for all s ∈ S_μ with |s| > M.
The set S_μ ∩ {z : |z| ≤ M} is compact, and arg{g(s)} is a continuous function of s there. Since |arg{g(s)}| < π on the closed subset {z : Re(z) = 0, |z| ≤ M}, it follows by Lemma A5 that there exists ε > 0, independent of μ, such that |arg{g(s)}| < π for all s ∈ S_μ ∩ {z : |z| ≤ M} provided μ ≤ ε. Replacing μ by min(μ,ε) if necessary, we may therefore suppose that |arg{g(s)}| < π for all s in S_μ. Define

    G(s) = log g(s) = log|g(s)| + i arg{g(s)}   for s ∈ S_μ.

LEMMA 2.6: There exists M ≥ 0 such that

    |G(s)| ≤ M/(1 + |s|^δ)   for s ∈ S_μ.
PROOF: Since g(s) = 1 + O(|s|^{-δ}) as |s| → ∞ in S_μ, there exist T > 0 and M_1 ≥ 0 such that |G(s)| ≤ M_1|s|^{-δ} for |s| > T, s ∈ S_μ. Since |G(s)| ≤ |log|g(s)|| + |arg{g(s)}| and g is bounded and bounded away from zero, there exists M_2 ≥ 0 such that |G(s)| ≤ M_2 for |s| ≤ T, s ∈ S_μ. If we choose M = max[M_1(1 + T^{-δ}), M_2(1 + T^δ)], the result follows. □

Consider the rectangular contour A_T (T > 0) obtained by truncating S_μ, i.e. A_T is the rectangle with vertices ±μ ± iT.
Suppose z is any point inside S_μ. Then z will be inside A_T for T sufficiently large; since G(s) is analytic on and inside A_T, we have by Cauchy's integral formula

    G(z) = (1/2πi) ∮_{A_T} G(s)/(s-z) ds.

Now from the proof of Lemma 2.6,

    |∫_{-μ+iT}^{μ+iT} G(s)/(s-z) ds| ≤ 2μ sup{|G(s)/(s-z)| : Im(s) = T} → 0   as T → ∞,

and similarly ∫_{-μ-iT}^{μ-iT} G(s)/(s-z) ds → 0 as T → ∞.
Define

    I_1(z) = ∫_{μ-i∞}^{μ+i∞} G(s)/(s-z) ds,    I_2(z) = ∫_{-μ-i∞}^{-μ+i∞} G(s)/(s-z) ds.

LEMMA 2.7: I_1(z) (resp. I_2(z)) is bounded and analytic for Re(z) ≤ μ - ε (resp. Re(z) ≥ -μ + ε) for any ε > 0.
PROOF: Suppose z = x + iy where x ≤ μ - ε. Then

    I_1(z) = ∫_{-∞}^{∞} G(μ+it)/{μ-x+i(t-y)} i dt,   with   |G(μ+it)| ≤ M/(1+|t|^δ).

Choose p,q > 1 so that pδ > 1 and p^{-1} + q^{-1} = 1. Then the integrand is the product of an L_p function and an L_q function, so Hölder's inequality gives a finite bound. Since this last bound is finite and independent of z, it follows that I_1(z) is bounded for Re(z) ≤ μ - ε.

To show that I_1(z) is analytic in {Re(z) ≤ μ - ε}, consider

    {I_1(z+h) - I_1(z)}/h = ∫_{μ-i∞}^{μ+i∞} G(s)/{(s-z)(s-z-h)} ds
                          = ∫_{-∞}^{∞} G(μ+it)/[{μ-x+i(t-y)}{μ-x-h+i(t-y)}] i dt.

If |h| ≤ ε/2 then the integrand is bounded by an integrable function independent of h. Therefore by the dominated convergence theorem

    I_1'(z) = ∫_{μ-i∞}^{μ+i∞} G(s)/(s-z)^2 ds   (< ∞).

Thus I_1(z) is analytic in {Re(z) ≤ μ - ε}. The proofs of the corresponding results for I_2(z) are similar. □
We may now prove the main result of this section.

THEOREM 2.1: Under assumptions (A1), (A2), (A3), there exist μ > 0 and A > μ such that

    1 - H*(s) = s ψ^+(s) / {(s-A) ψ^-(s)}   for s in {|Re(s)| < μ},                (2.6)

where ψ^+(s) (resp. ψ^-(s)) is analytic, bounded and zero-free in {Re(s) ≥ -μ + ε} for any ε > 0 (resp. {Re(s) ≤ μ - ε}). The Laplace-Stieltjes transform of the stationary queueing time distribution is

    F*(s) = ψ^+(0)/ψ^+(s)   for Re(s) ≥ 0.                (2.7)

PROOF: We know that for s ∈ {|Re(s)| < μ}

    G(s) = (1/2πi){I_1(s) - I_2(s)},

so if we set ψ^+(s) = exp{-(1/2πi) I_2(s)} and ψ^-(s) = exp{-(1/2πi) I_1(s)}, then Equation (2.6) follows. The properties of ψ^+(s) and ψ^-(s) follow from Lemma 2.7.

From (2.5) and (2.6) we have

    s ψ^+(s) F*(s) = -(s-A) ψ^-(s) F_1*(s).                (2.8)

Now F_1*(s) converges for Re(s) ≤ μ by Lemma 2.2, and F*(s) is analytic and bounded for Re(s) > 0. Clearly s ψ^+(s) F*(s) is analytic and O(|s|) as |s| → ∞ in {Re(s) > 0}, and (s-A) ψ^-(s) F_1*(s) is analytic and O(|s|) as |s| → ∞ in {Re(s) ≤ μ/2}. Thus the two sides of (2.8) represent a function which is analytic in the finite complex plane and O(|s|) as |s| → ∞. It follows from a variant of Liouville's Theorem [22; 2.52] that this function must be linear, i.e. there exist complex numbers α, β such that

    s ψ^+(s) F*(s) = α + βs   for Re(s) ≥ 0.

Putting s = 0 shows that α = 0, and by continuity β = ψ^+(0). Therefore (2.7) follows. □
COROLLARY: The cumulant-generating function of queueing time is

    log F*(s) = (s/2πi) ∫_{-μ-i∞}^{-μ+i∞} (1/{z(z-s)}) log[{(z-A)/z}{1 - H*(z)}] dz   for Re(s) ≥ 0.

PROOF:

    log F*(s) = (1/2πi){I_2(s) - I_2(0)} = (1/2πi) ∫_{-μ-i∞}^{-μ+i∞} s G(z)/{z(z-s)} dz. □
2.3 More Practical Results

We may obtain more practical results by making further assumptions on either the service time or the inter-arrival time distributions. We prove one result with conditions on the service times on the basis of some results in [20]. Then we develop in more detail the case of conditions on the inter-arrival distributions; we show that the required conditions are always satisfied for the forward transition model. Throughout this section we retain assumptions (A1), (A2), (A3).

Let T be the class of distribution functions with rational Laplace-Stieltjes transform. We show that the stationary distribution of queueing time is in T if the service time distribution functions are in T and no customer has zero service time a.s. In addition, we are led to a formula for F*(s) which may be inverted easily in any particular example.
THEOREM 2.2: If

    (i)  B_j ∈ T for each j,
    (ii) B_j(0) < 1 for each j,

then F ∈ T; in particular we have

    F*(s) = Π_{j=1}^M {(1 - s/c_j)/(1 - s/d_j)}   for s in {Re(s) ≥ 0},                (2.9)

where c_1,...,c_M are the poles of H*(s) in {Re(s) < 0} and d_1,...,d_M are the zeros of 1 - H*(s) in {Re(s) < 0}.

PROOF: Condition (i) implies that H*(s) is meromorphic in {Re(s) < 0}, i.e. has no singularities there other than a finite number of poles c_1,...,c_M. We have B_j*(s) = P_j(s)/Q_j(s), where P_j(s) and Q_j(s) are polynomials in s such that deg{P_j} ≤ deg{Q_j}. If deg{P_j} < deg{Q_j}, then B_j*(s) → 0 as |s| → ∞ in any direction. If deg{P_j} = deg{Q_j}, then B_j*(s) tends to the ratio of the leading coefficient of P_j to that of Q_j as |s| → ∞. In either case, condition (ii) implies that |B_j*(s)| < 1 for |s| sufficiently large in {Re(s) < 0}. Since |A_j*(-s)| ≤ 1 for Re(s) < 0, it follows that |H*(s)| < 1 for |s| sufficiently large in {Re(s) < 0}. The theorem then follows by Theorems 1 and 3 in [20]. □
COROLLARY: The stationary probability of an arriving customer finding the server free is Π_{j=1}^M d_j/c_j.

PROOF: F(0) = lim_{s→∞} F*(s)   (s real). □
APPLICATIONS

Suppose we know A_j, B_j (j = 1,2,...,K). Then we can find the poles of H*(s) by inspection. To apply formula (2.9) we need only solve

    Σ_{j=1}^K π_j A_j*(-s) B_j*(s) = 1

in the left half-plane for d_1,...,d_M. We can then find F(x) by expanding F*(s) in partial fractions and inverting termwise.
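As a worked special case (with hypothetical rates, one customer type, and exponential A and B with rates α and β), H*(s) = αβ/{(α-s)(β+s)}; its pole in the left half-plane is c_1 = -β, and solving 1 - H*(s) = 1 there gives d_1 = α - β. Formula (2.9) then inverts to F(x) = 1 - (α/β)e^{d_1 x} for x ≥ 0. The sketch below finds d_1 numerically by bisection and checks the inverted F against a Monte Carlo estimate.

```python
import math, random

random.seed(4)
alpha, beta = 1.0, 2.0               # hypothetical rates; a = 1 > b = 0.5

def f(s):
    """f(s) = 1 - H*(s) with H*(s) = alpha*beta/((alpha - s)(beta + s))."""
    return 1.0 - alpha * beta / ((alpha - s) * (beta + s))

# Bisection for the zero d_1 of 1 - H*(s) in the interval (-beta, 0).
lo, hi = -beta + 1e-6, -1e-6
for _ in range(80):
    mid = (lo + hi) / 2.0
    if f(lo) * f(mid) <= 0.0:
        hi = mid
    else:
        lo = mid
d1 = (lo + hi) / 2.0                 # analytically d_1 = alpha - beta = -1
c1 = -beta                           # the pole of H*(s) in {Re(s) < 0}

# (2.9) gives F*(s) = (1 - s/c1)/(1 - s/d1); inverting termwise yields
# F(x) = 1 - (alpha/beta) e^{d1 x} for x >= 0.
rho = alpha / beta
F = lambda x: 1.0 - rho * math.exp(d1 * x)

# Monte Carlo check via the recursion Q_{n+1} = max(Q_n + X_n, 0).
q, hits, n, x0 = 0.0, 0, 200000, 1.0
for _ in range(n):
    q = max(q + random.expovariate(beta) - random.expovariate(alpha), 0.0)
    hits += q <= x0
print(d1, F(x0), hits / n)
```

The empirical fraction of customers with queueing time at most x0 should be close to F(x0), illustrating the partial-fraction inversion.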
To obtain the corresponding result with conditions on the inter-arrival distributions we need to express F*(s) in terms of ψ^-(s) and consider the behaviour of H*(s) in the right half-plane (see § 4.3 in [20]).

To prove an analogue of Smith's Theorem 1 we need a family of simple closed contours in the right half-plane on which the inequality |H*(s)| < 1 is valid. We saw in the proof of Lemma 2.3 that |H*(iα)| < 1 for α ∈ ℝ - {0}. In our next result we show that |H*(s)| < 1 on a small semicircular arc to the left of s = 0.
LEMMA 2.9: There exists ε > 0 such that |H*(s)| < 1 for all s in {|s| = ε, Re(s) < 0}.

PROOF: H*(0) = 1 and H*'(0) = a - b > 0, so there exists ε in (0,μ) such that 0 < H*(x) < 1 for all x (real) in [-ε, 0). Suppose Re(s) = σ ∈ [-ε, 0). Then

    |H*(s)| ≤ Σ_{j=1}^K π_j A_j*(-σ) B_j*(σ) = H*(σ) < 1.

Fix ε in (0,μ) with this property. □

For any R > ε, we define C_R to be the simple closed contour formed by joining the semi-circular arc of Lemma 2.9 with the arc {|z| = R, Re(z) > 0} along the imaginary axis, i.e.

    C_R = {|z| = ε, Re(z) < 0} ∪ {ε ≤ |z| ≤ R, Re(z) = 0} ∪ {|z| = R, Re(z) > 0}.
We have the following analogue of Smith's Lemma 2:

LEMMA 2.10: If, for some R > ε,

    (i)  H*(s) is meromorphic inside C_R,
    (ii) |H*(s)| < 1 on the arc {|s| = R, Re(s) > 0},

then 1 - H*(s) has the same number of zeros inside C_R as H*(s) has poles.

PROOF: The proof is very similar to that of Smith's Lemma 2. H*(s) has a finite number of poles inside C_R and no other singularities in or on C_R. Let these poles be s_1,...,s_N. Define

    P(s) = Π_{j=1}^N (s - s_j),    R(s) = P(s) H*(s).

We suppose here that H*(s) is expressed in lowest terms as R(s)/P(s), i.e. P(s) and R(s) have no common factors; since the only singularities of H*(s) are poles, this means that P(s) and R(s) are analytic inside and on C_R. By Lemma 2.8 and the remarks preceding it, condition (ii) implies that |H*(s)| < 1 for all s ∈ C_R. Therefore, since P(s) ≠ 0 on C_R, we have |R(s)| < |P(s)| for all s ∈ C_R. By Rouché's Theorem, P(s){1 - H*(s)} = P(s) - R(s) has exactly N zeros inside C_R, as P(s) has. No zero of P(s){1 - H*(s)} can be a zero of P(s), since P(s) = 0 ⟹ R(s) ≠ 0. Thus 1 - H*(s) has N zeros inside C_R. □

N.B. We do not require s_1,...,s_N to be distinct.
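The zero-pole count of Lemma 2.10 can be checked numerically via the argument principle: the winding number of 1 - H*(s) around C_R equals its number of zeros minus poles inside, which the lemma asserts is 0. The sketch below does this for the one-type exponential case H*(s) = αβ/{(α-s)(β+s)} (hypothetical rates), whose single pole inside C_R is s = α and whose single zero inside is s = 0.

```python
import cmath, math

alpha, beta = 1.0, 2.0                                          # hypothetical rates
w = lambda s: 1.0 - alpha * beta / ((alpha - s) * (beta + s))   # 1 - H*(s)

eps, R, n = 0.5, 10.0, 4000
pts = []
# Traverse C_R counterclockwise: small left detour around 0, down the
# imaginary axis, big right arc through R, back down to the detour.
pts += [eps * cmath.exp(1j * (math.pi / 2 + math.pi * k / n)) for k in range(n + 1)]
pts += [1j * (-eps - (R - eps) * k / n) for k in range(n + 1)]
pts += [R * cmath.exp(1j * (-math.pi / 2 + math.pi * k / n)) for k in range(n + 1)]
pts += [1j * (R - (R - eps) * k / n) for k in range(n + 1)]

total = 0.0
for a, b in zip(pts, pts[1:]):
    total += cmath.phase(w(b) / w(a))      # accumulated change of arg
winding = round(total / (2.0 * math.pi))   # zeros minus poles inside C_R
print(winding)
```

A winding number of 0 is consistent with the lemma: the N zeros of 1 - H*(s) inside C_R exactly balance the N poles of H*(s).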
In Theorem 2.1 we showed that Equation (2.6), viz.

    1 − H*(s) = sΦ+(s) / {(s−λ)Φ−(s)},

is valid for all s such that −η < Re(s) < η. In the following lemma we extend the domain of validity of this equation.
LEMMA 2.11: If H*(s) is meromorphic in {Re(s) > 0}, then

    (s−λ)Φ−(s)[1 − H*(s)] = sΦ+(s)                                    (2.10)

throughout {Re(s) > −η} except at isolated points.

PROOF: From (2.6) we know that (2.10) is valid for {−η < Re(s) < η}. Now Φ+(s) is analytic in {Re(s) > −η} and (s−λ)Φ−(s) is analytic in {Re(s) < η}, so

    1 − H*(s) = sΦ+(s) / {(s−λ)Φ−(s)}

has an analytic continuation defined by the right-hand side wherever that expression is analytic. If H*(s) is meromorphic in {Re(s) > 0}, then 1 − H*(s) may be continued analytically throughout {Re(s) > −η} except at a finite number of poles s_1,...,s_N. It follows that

    h(s) def= sΦ+(s) − (s−λ)Φ−(s)[1 − H*(s)]

is analytic in {Re(s) > −η} except at s_1,...,s_N or at the zeros of (s−λ)Φ−(s), which are isolated. Since h(s) ≡ 0 in {−η < Re(s) < η}, it follows that (2.10) is valid throughout {Re(s) > −η} except at these points. □
COROLLARY: The zeros of (s−λ)Φ−(s) in {Re(s) > 0} are the poles of H*(s) in {Re(s) > 0}; the poles of (s−λ)Φ−(s) in {Re(s) > 0} are the zeros of 1 − H*(s) in {Re(s) > 0}. □
We may now prove the analogue of Theorem 2.2 with conditions on the inter-arrival distributions.

THEOREM 2.3: If

(i) A_j*(s) is rational for each j, and
(ii) A_j(0) < 1 for each j,

and s_1,...,s_N are the poles of H*(s) in {Re(s) > 0}, then 1 − H*(s) has N−1 zeros in {Re(s) > 0}, r_1,...,r_{N−1} say, and

    F*(s) = (a−b)s Π_{j=1}^{N−1} (s/r_j − 1) / [ {1 − H*(s)} Π_{j=1}^{N} (s/s_j − 1) ]   for all s in {Re(s) ≥ 0}     (2.11)

(the right-hand side of (2.11) being defined by continuity when terms in the denominator vanish).
PROOF: Condition (i) implies that H*(s) is meromorphic in {Re(s) > 0}. Since H*(s) is analytic in {−η < Re(s) < η}, we see that H*(s) has a finite number of poles inside any C_R. We may choose ε < η and R sufficiently large that C_R contains all the poles of H*(s) in {Re(s) > 0}.

As in the proof of Theorem 2.2, condition (ii) implies that |H*(s)| < 1 for all s on the arc {|s| = R, Re(s) > 0} for R sufficiently large. Since the only zero of sΦ+(s) inside C_R with Re(s) ≤ 0 is a single zero at s = 0, it follows by Lemma 2.10 that 1 − H*(s) has N−1 zeros in {Re(s) > 0}. Denote these by r_1,...,r_{N−1}.

From (2.10) it follows that

    Φ+(0) = lim_{s→0+} (s−λ)Φ−(s){1 − H*(s)}/s = λ(a−b)Φ−(0),

and therefore for all but isolated points in {Re(s) > 0}

    F*(s) = λ(a−b)Φ−(0)s / [ {1 − H*(s)}(s−λ)Φ−(s) ].                  (2.12)

From the corollary to Lemma 2.11, the function ψ(s) defined for s ∉ {s_j} ∪ {r_j} by

    ψ(s) = (s−λ)Φ−(s) Π_{j=1}^{N−1} (s−r_j) / Π_{j=1}^{N} (s−s_j),

and by continuity elsewhere, is zero-free and entire in the finite plane. We show that ψ(s) is bounded and therefore constant.

In {Re(s) ≤ 0}, (s−λ) Π_{j=1}^{N−1} (s−r_j) / Π_{j=1}^{N} (s−s_j) is bounded, since Re(s_j) > 0 and Re(r_j) > 0 for each j. Also Φ−(s) is bounded in {Re(s) ≤ η/2}, so ψ(s) is bounded in the left half-plane.

Choose R so that λ, s_1,...,s_N, r_1,...,r_{N−1} all lie inside C_R. Now (s−λ) Π_{j=1}^{N−1} (s−r_j) / Π_{j=1}^{N} (s−s_j) is bounded in {|s| ≥ R, Re(s) > 0}, and Φ−(s) is bounded there also, since by (2.10)

    Φ−(s) = sΦ+(s) / [ (s−λ){1 − H*(s)} ]

and s/(s−λ), Φ+(s) and {1 − H*(s)}^{−1} are all bounded there. Thus ψ(s) is bounded in {|s| ≥ R, Re(s) > 0}, i.e. outside C_R in the right half-plane. ψ(s) is bounded inside and on C_R since it is analytic there. Thus ψ(s) is bounded everywhere, and therefore ψ(s) is constant by Liouville's Theorem.

Suppose ψ(s) = P. Then P ≠ 0, and we have

    (s−λ)Φ−(s) = P Π_{j=1}^{N} (s−s_j) / Π_{j=1}^{N−1} (s−r_j).

(2.11) now follows from equation (2.12). □
COROLLARY: The stationary probability of an arriving customer finding the server free is

    (a−b) Π_{j=1}^{N} s_j / Π_{j=1}^{N−1} r_j. □
Application to the forward transition model: Consider the forward transition model with customer types 0,1,...,K, transition matrix [p_jk] and parameters λ_0, λ_1,...,λ_K. We recall that for this model a type j arrival corresponds to a transition 0 → j in the guiding chain (Z_t); the time spent by (Z_t) in state j between transitions is exponentially distributed with parameter λ_j (assume 0 < λ_j < ∞), j = 0,1,...,K. We shall demonstrate that the conditions of Theorem 2.3 are always satisfied for this model.
Condition (ii) is clearly satisfied, since each type j arrival interval begins with an exponentially distributed period in state j of the guiding chain. [We may allow λ_0 to be +∞ here if p_00 = 0.] In the following result we show that condition (i) is always satisfied.

LEMMA 2.12: The inter-arrival distributions for the forward transition model have rational Laplace transforms.
PROOF: Let A_j be the d.f. of a type j arrival interval as usual (j = 0,1,...,K). The length of a type j arrival interval is the time when the guiding chain first leaves state 0, given Z(0) = j. Clearly

    A_0*(s) = λ_0 / (λ_0 + s).

For j = 1,...,K we have the relation

    A_j*(s) = {λ_j / (λ_j + s)} Σ_{k=0}^{K} p_jk A_k*(s),

since a period in state j is followed by a transition to some state k. We may write this as

    {(λ_j + s)/λ_j} A_j*(s) − Σ_{k=1}^{K} p_jk A_k*(s) = p_j0 A_0*(s)   (j = 1,2,...,K).     (2.13)

Thus we have K linear equations in A_1*(s),...,A_K*(s). The determinant of this system of equations is a polynomial in s which is zero only at isolated points. By Cramer's rule, A_j*(s) is seen to be a ratio of polynomials in s. □

NOTE: The proof of Lemma 2.12 goes through when λ_0 = +∞.
We conclude this chapter with a numerical example.
Example: Consider the forward transition model with K = 2, λ_0 = 4, λ_1 = λ_2 = 3 and

              | 1/4  1/2  1/4 |
    [p_jk] =  |  1    0    0  |
              | 1/3  1/3  1/3 |

We suppose that service times are constant, with b_0 = 1/2, b_1 = 1/3, b_2 = 1/4. The B_j*(s) = e^{−s b_j}, so that Theorem 2.2 does not apply. From (2.13) we have

    A_0*(s) = 4/(4+s),   A_1*(s) = 3·4/{(3+s)(4+s)},   A_2*(s) = 4(6+s)/{(2+s)(3+s)(4+s)}.

Hence a = Σ p_0j a_j = 7/12 and b = Σ p_0j b_j = 17/48. Now b − a = −11/48 < 0, so the stationarity condition is satisfied. Conditions (i) and (ii) of Theorem 2.3 may be verified with

    H*(s) = Σ_j p_0j A_j*(−s) B_j*(s).

H*(s) has 3 poles in {Re(s) > 0}, at s = 2, 3, 4, and 1 − H*(s) has 2 zeros in {Re(s) > 0}, which are computed by Newton's method to be 2.76, 4.60. We may now evaluate F*(s) by (2.11).
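The zero-finding step of the example can be reproduced numerically. The sketch below assumes the parameter values recovered above (λ_0 = 4, λ_1 = λ_2 = 3, b_0 = 1/2, b_1 = 1/3, b_2 = 1/4 — treat them as assumptions, since the original page is damaged); it brackets each zero of 1 − H*(s) between consecutive poles and refines by bisection (Newton's method, as in the text, would do equally well):

```python
import math

# p_0j weights, and constant service times b_j (assumed example values)
p0 = [0.25, 0.5, 0.25]
b  = [0.5, 1.0 / 3.0, 0.25]

def A_neg(j, s):
    # A_j*(-s) for the example's forward transition model
    if j == 0:
        return 4.0 / (4.0 - s)
    if j == 1:
        return 12.0 / ((3.0 - s) * (4.0 - s))
    return 4.0 * (6.0 - s) / ((2.0 - s) * (3.0 - s) * (4.0 - s))

def H(s):
    # H*(s) = sum_j p_0j A_j*(-s) B_j*(s), with B_j*(s) = exp(-s b_j)
    return sum(p0[j] * A_neg(j, s) * math.exp(-s * b[j]) for j in range(3))

def bisect(f, lo, hi, tol=1e-12):
    flo = f(lo)
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) * flo > 0:
            lo, flo = mid, f(mid)
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

f = lambda s: 1.0 - H(s)
root1 = bisect(f, 2.1, 2.9)   # zero between the poles at 2 and 3
root2 = bisect(f, 4.1, 4.9)   # zero beyond the pole at 4
```

With these assumed parameters the two zeros land at approximately 2.76 and 4.60, matching the values quoted in the example.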
CHAPTER 3

STATIONARY BEHAVIOUR OF THE BACKWARD MODELS

3.1 Introduction
The queueing time process {Q_n} is not Markovian for the backward model, so we cannot expect to obtain a simple integral equation like (2.3) for the stationary queueing time distribution.
However,
the total waiting time process is Markovian; in Section 3.2 we show
that this process has a stationary distribution
G under the usual
first moment condition, and derive an integral equation for
G.
In Section 3.3 we solve this equation in the case of exponential
arrivals.
As a by-product we see how to calculate some stationary
probabilities that will be useful in dealing with busy periods in
Chapter 4.
3.2 Total waiting times for the general backward model

In this section we consider the general backward model, so-called because at each arrival epoch the customer type (and therefore service time) is related to the preceding arrival interval. Specifically, we assume that there are K customer types: at each arrival epoch the type of the next customer is selected from {1,2,...,K} according to the probabilities π_1,π_2,...,π_K, independently of any previous events. The service time of a type j customer is sampled from a distribution with d.f. B_j, and the length of the arrival interval preceding a type j customer has d.f. A_j (j = 1,2,...,K).
Let Z_n be the type of the nth customer, V_n his service time, and U_{n−1} the time between the (n−1)th and nth arrivals (n = 1,2,...), U_0 being the time from 0 to the first arrival. Let V_0 be the residual service time at time 0; we leave unspecified the distribution of V_0. We may summarize our assumptions as follows: for n = 1,2,...,

    Pr{Z_n = j} = π_j,   Pr{V_n ≤ x | Z_n = j} = B_j(x),   Pr{U_{n−1} ≤ x | Z_n = j} = A_j(x).

Thus V_1,V_2,... are iid random variables, and U_0,U_1,... are iid also. V_n and U_{n−1} are dependent through Z_n, but conditionally independent given Z_n. We suppose that the mean service and inter-arrival times are finite, i.e.

    a_j = ∫_0^∞ u A_j(du) < ∞,   b_j = ∫_0^∞ u B_j(du) < ∞.

Let

    a = Σ_{j=1}^{K} π_j a_j,   b = Σ_{j=1}^{K} π_j b_j,

so that E{U_n} = a for n = 0,1,2,..., and E{V_n} = b for n = 1,2,.... Let X_n = V_n − U_{n−1} (n = 1,2,...). Then X_1,X_2,... are iid random variables, with E{X_n} = b − a.
The queueing time process {Q_n} (as defined in Chapter 2) is not Markovian for the backward model. This is evident from a consideration of the relation (c.f. (2.1))

    Q_n = max(Q_{n−1} + V_{n−1} − U_{n−1}, 0).                          (3.1)

Since Q_n is determined by Q_{n−1}, V_{n−1} and U_{n−1}, we see that knowledge of both Q_{n−1} and Q_n yields information about U_{n−1} (and therefore about V_n) that is not contained in knowledge of Q_n alone. As a result the conditional distribution of Q_{n+1} given Q_{n−1}, Q_n will be different from the distribution of Q_{n+1} conditional on Q_n alone.

However, a related process is Markovian. Let W_n be the total waiting time of the nth customer, i.e. the time from his arrival to completion of service. Clearly

    W_n = Q_n + V_n.

Then, as remarked in [2; p. 1013], {W_n} is a Markov process. This is clear from the recursive relation

    W_{n+1} = max(W_n − U_n + V_{n+1}, V_{n+1})   (n = 0,1,2,...),       (3.2)

which follows from (3.1). (Take W_0 = V_0.) Clearly W_{n+1} is determined by {V_0,U_0,V_1,U_1,...,U_n,V_{n+1}} alone, and the Markovian property follows. We note that W_{n+1} is independent of U_{n+1} and Z_{n+1}.

Let G_n be the d.f. of W_n (n = 0,1,2,...). In the following result we derive a recursive relation for {G_n}.

LEMMA 3.1: For n = 0,1,2,... we have

    G_{n+1}(x) = Σ_{j=1}^{K} π_j ∫_0^x ∫_0^∞ G_n(x−u+z) A_j(dz) B_j(du).   (3.3)
36
·-e-
PROOF:
K
Pr{Wn +1 s; x} = I
j=l
.-
PrHtTn+ I
1T.
J
I zn+l
S;
x
x
I zn+l
= j}
and
x
Prfwn+ 1 s; x I zn+l =
j}
=
JPr{\i1n+
1
s;
= j, Vn+l = u} Bj (du) .
0
If Vn+l s; x,
then from relation (3.2) we have
wn+l
s;
x <=> W - U + VI
n
n
n+
S;
x •
Hence it follows that
00
=
JGn(x - u +
z) Aj(dz)
o
since
Wn
and Un
are independent.
We may show that
LEMMA 3.2:
all
Therefore (3.3) follows.
0
{W } has a stationary distribution if EX I < O.
n
If EX l = b - a
< 0,
then
G(x)
= lim G (x)
.def n-+oo n
exists for
x and is a proper d.f.
PROOF:
Applying relation (3.2) repeatedly, we have
= Pr{Vns;x,Vn- I+Xns;x, ••. ,V l +X 2+..• +Xn
s;x,V o
+Xl+ ..•
+X S;X}
n
•
37
dependent, we may replace
by
U
• (j =
n-J
o~
Gn (x)
~
Hn (x)
Now
0
~
lim
n+co
by
I
exists:
n
s Hn (x) - Gn (x)
for
= O.
So
Hn+ lex)
and
H (x)
independent of Vo
X
n+ l-J'
= 1,2, ... );
j (j
follows that
V I . (j = 1,2, ... ,n) and U. I
n+ -J
Jwithout changing the above probability.
X.J
+ X
f.
by
J
1,2, ... ,n)
This means replacing
+ •
V.
S
Hn (x)
+ V
n
Let
Then we have
define
S Pr{S
j=I,2, ... ,n.
G(x)
>
0
(n
x}
= 1,2, ..• ) .
= lim H (x).
It
n+co n
and since
Sn
is
we have
I
0:>
Pr{Sn + Vo
>
x}
=
Pr{Sn
>
x-y} Go(dy) .
o
Pr{S
By the strong law of large numbers,
for all x,y,
Pr{Sn + V
o
equals
>
n
>
x-y}
+
0
as
n
+
0:>
so by the dominated convergence theorem,
x} + 0
as
n +
00.
Therefore,
lim Gn(x)
n+co
exists and
G(x).
We have
G(x)
=0
non-decreasing function.
for all
x
<
0,
and
G(x)
Therefore, to show that
is clearly a
G(x)
is a proper
lim G(x) = 1. This follows by the same
x+co
methods as used in 94, (ii) of [11], since Sn + Vn + +_00 a.s. by
l
d.f.
we need only show that
the strong law of large numbers and
G(x)
•
= Pr{sup(S
n~O n
+ V 1)
n+
~ x}
.
o
We assume throughout the rest of this chapter that the stationarity condition b − a < 0 is satisfied. It follows from (3.3) that G satisfies the integral equation

    G(x) = Σ_{j=1}^{K} π_j ∫_0^x ∫_0^∞ G(x−u+z) A_j(dz) B_j(du)   (x ≥ 0).   (3.4)

This equation can be written in Wiener-Hopf form as

    G(x) = ∫_0^∞ H(x−u) G(du),

but the kernel function H does not bear such a simple relation to {A_j, B_j} as does the kernel for the forward model. Therefore, the methods of Chapter 2 are not appropriate.

Let K_j(x) be defined for each j as

    K_j(x) = ∫_0^∞ G(x+z) A_j(dz)   (x ≥ 0);   K_j(x) = 0   (x < 0).

Then (3.4) may be written as

    G(x) = Σ_{j=1}^{K} π_j ∫_0^x K_j(x−u) B_j(du)   (x ≥ 0).

If we take Laplace transforms, then by Lemma A2 we have

    G°(s) = Σ_{j=1}^{K} π_j K_j°(s) B_j*(s)                             (3.5)

wherever both sides are defined. To proceed further with the solution of (3.5) we need to express K_j°(s) in terms of G°(s) and A_j*(s). We have been able to do this only in the case of exponential arrivals, i.e. when A_j(x) = 1 − e^{−λ_j x} (j = 1,2,...,K). We examine this case in the following section.
3.3 The backward model with exponential arrivals

In this section we continue the investigation of the stationary distribution of total waiting time for the particular case of the backward model with exponential arrivals. In addition to the assumptions of Section 3.2, we suppose that for each j, A_j(x) is of the form

    A_j(x) = 1 − e^{−λ_j x}   (x ≥ 0).

Clearly λ_j = 1/a_j, since the mean of an exponential distribution with parameter λ is 1/λ. This restriction leads us to a simple expression for K_j°(s) in terms of G°(s).
LEMMA 3.3: For all s in {Re(s) ≥ 0} such that s ≠ λ_j,

    K_j°(s) = {λ_j/(λ_j − s)} {G°(s) − G°(λ_j)}.                        (3.6)

PROOF: We have

    K_j°(s) = λ_j ∫_0^∞ ∫_0^∞ e^{−sy−λ_j z} G(y+z) dz dy.

Put y + z = u, z = t. Then we have

    K_j°(s) = λ_j ∫_0^∞ ∫_0^u e^{−s(u−t)−λ_j t} G(u) dt du
            = λ_j ∫_0^∞ e^{−su} G(u) { ∫_0^u e^{−(λ_j−s)t} dt } du
            = {λ_j/(λ_j−s)} ∫_0^∞ e^{−su} {1 − e^{−(λ_j−s)u}} G(u) du
            = {λ_j/(λ_j−s)} {G°(s) − G°(λ_j)}. □
From (3.5) and (3.6) we have

    G°(s) = Σ_{j=1}^{K} π_j {λ_j/(λ_j−s)} {G°(s) − G°(λ_j)} B_j*(s).     (3.7)

Since G*(s) = s G°(s), we have the following result:

THEOREM 3.1: If Σ_{j=1}^{K} π_j(b_j − 1/λ_j) < 0, then the stationary distribution of total waiting time for the backward model with exponential arrivals has Laplace transform

    G*(s) = s Σ_{j=1}^{K} {π_j/(λ_j−s)} G*(λ_j) B_j*(s) / [ Σ_{j=1}^{K} {π_j λ_j/(λ_j−s)} B_j*(s) − 1 ].   (3.8)
To make use of (3.8) to calculate G*(s), we need some way of determining the quantities G*(λ_j). We proceed as follows. Suppose λ_1,...,λ_K are distinct. (If not, then we could amalgamate the appropriate customer types, as is done in Chapter 4, without affecting G*(s).) Consider the function φ(s) defined by

    φ(s) = Σ_{j=1}^{K} π_j λ_j B_j*(s) / (λ_j − s).

In Lemma 4.13 we show that the equation φ(s) = 1 has exactly K−1 (real) positive roots μ_2,...,μ_K. Since G*(s) is analytic throughout {Re(s) > 0}, it follows that

    Σ_{j=1}^{K} {π_j/(λ_j−s)} G*(λ_j) B_j*(s) = 0   for s = μ_2, μ_3,...,μ_K.

Let τ_j = π_j G*(λ_j). Then we have K − 1 linear equations in τ_1,τ_2,...,τ_K. We may derive one more equation by considering (3.8) as s → 0+. Since G*(0) = 1, we have

    1 = lim_{s→0+} s Σ_j τ_j B_j*(s)/(λ_j−s) / [ Σ_j π_j λ_j B_j*(s)/(λ_j−s) − 1 ]
      = (Σ_j τ_j/λ_j) / [ Σ_j π_j lim_{s→0+} { (1 − s/λ_j)^{−1} B_j*(s) − 1 }/s ]
      = (Σ_j τ_j/λ_j) / (a − b).

Thus we have K linear equations in τ_1,...,τ_K, viz.

    Σ_{j=1}^{K} τ_j / λ_j = a − b,                                      (3.9)

    Σ_{j=1}^{K} τ_j B_j*(μ_k) / (λ_j − μ_k) = 0   (k = 2,3,...,K).      (3.10)

To summarize our procedure: we solve φ(s) = 1 for μ_2,...,μ_K as in Section 4.3, then calculate τ_1,...,τ_K from (3.9), (3.10). Hence we can evaluate G*(s) as
    G*(s) = s Σ_j τ_j B_j*(s)/(λ_j−s) / [ Σ_j π_j λ_j B_j*(s)/(λ_j−s) − 1 ]   for s ∉ {λ_j}, Re(s) ≥ 0.

We may interpret τ_j in terms of stationary probabilities.

LEMMA 3.4: τ_j = π_j G*(λ_j) is the stationary probability that an arriving customer be of type j and have zero queueing time.

PROOF:

    G*(λ_j) = λ_j G°(λ_j) = ∫_0^∞ λ_j e^{−λ_j x} G(x) dx = ∫_0^∞ G(x) A_j(dx).

A customer has zero queueing time iff his total waiting time is equal to his service time. From (3.2) this occurs iff the total waiting time W of the previous customer is no greater than the intervening arrival time U. Since W_n is independent of U_n for any n, it follows that the stationary probability of a type j customer having zero queueing time is

    Pr{W − U ≤ 0} = ∫_0^∞ G(x) A_j(dx) = G*(λ_j).

Since π_j is the probability that an arriving customer be of type j, the result follows. □
COROLLARY: The stationary probability that the server be free is Σ_{j=1}^{K} τ_j. □

Let γ_j = τ_j / Σ_k τ_k. Then γ_j is clearly the stationary probability that a customer with zero queueing time be of type j. As the arrival of a customer with zero queueing time is equivalent to the beginning of a busy period (see Chapter 4), γ_j may be regarded as the stationary probability that the initial customer of a busy period be of type j. We may show that the stationary probability γ_j that a busy period start with a type j customer is in general different from the probability π_j that any arriving customer be of type j.

LEMMA 3.5: It is not possible that γ_j = π_j for two distinct values of j.
PROOF: Since τ_j = π_j G*(λ_j) and γ_j = τ_j / Σ_k τ_k, we have

    γ_j = π_j ⟹ G*(λ_j) = Σ_k τ_k.

Therefore, if γ_1 = π_1 and γ_2 = π_2, where λ_1 < λ_2, we have G*(λ_1) = G*(λ_2). But

    G*(λ_1) − G*(λ_2) = ∫_0^∞ {e^{−λ_1 x} − e^{−λ_2 x}} G(dx),

and exp{−λ_1 x} ≥ exp{−λ_2 x}, with equality only at x = 0. Therefore G*(λ_1) = G*(λ_2) iff G puts all probability at 0, i.e. G(0) = 1. This is impossible with exponential arrivals, so the result follows. □
NOTE: For the general forward model the stationary probability that a customer with zero queueing time be of type j is π_j, i.e. the same as the unconditional probability that a customer be of type j. That is because Q_n is independent of Z_n for this model. It can be shown that the stationary probability that the server be free is simply 1 − b/a, and the mean length of a general busy period is b/(1 − b/a), for the general forward model (c.f. Section 4.3).

From (3.9) we have the following expression for the stationary probability that the server be free:

    Σ_j τ_j = (a − b) / Σ_j (γ_j / λ_j).                               (3.11)
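The procedure of this section can be carried out numerically. A sketch for an illustrative two-type model (an assumption, not from the text: K = 2, π_j = 1/2, λ = (1, 2), exponential service with mean 1/4, so B_j*(s) = 4/(4+s)); for K = 2 there is a single root μ_2 of φ(s) = 1 in (λ_1, λ_2), and (3.9)–(3.10) reduce to two linear equations:

```python
import math

pi  = [0.5, 0.5]
lam = [1.0, 2.0]
Bst = lambda s: 4.0 / (4.0 + s)          # common service transform (mean 1/4)
a = sum(p / l for p, l in zip(pi, lam))  # = 0.75
b = 0.25                                 # so b - a < 0

phi = lambda s: sum(p * l * Bst(s) / (l - s) for p, l in zip(pi, lam))

# The single root of phi(s) = 1 in (lam_1, lam_2), by bisection:
# phi -> -inf as s -> lam_1+ and +inf as s -> lam_2-.
lo, hi = lam[0] + 1e-9, lam[1] - 1e-9
for _ in range(200):
    mid = 0.5 * (lo + hi)
    if (phi(mid) - 1.0) * (phi(lo) - 1.0) > 0:
        lo = mid
    else:
        hi = mid
mu2 = 0.5 * (lo + hi)

# (3.10): tau1 B*(mu2)/(lam1 - mu2) + tau2 B*(mu2)/(lam2 - mu2) = 0,
# so tau2 = r * tau1; then (3.9): tau1/lam1 + tau2/lam2 = a - b.
r = -(lam[1] - mu2) / (lam[0] - mu2)
tau1 = (a - b) / (1.0 / lam[0] + r / lam[1])
tau2 = r * tau1
p_free = tau1 + tau2                     # stationary P(server free)
```

For these parameters φ(s) = 1 reduces to s² + s − 4 = 0 on (1,2), so μ_2 = (√17 − 1)/2 ≈ 1.5616 and Στ_j ≈ 0.640, which a direct simulation of (3.2) reproduces.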
CHAPTER 4

BUSY PERIODS FOR THE BACKWARD MODELS

4.1 Introduction

A busy period begins when a customer arrives to find the server free, and ends when the server becomes free again. In this chapter we work in terms of busy periods generated by a fixed initial service time; in particular, we denote by Y(w) the length of a busy period generated by an initial service time w. If φ(w,s) is the Laplace transform E{e^{−sY(w)}} and Y_j denotes the length of a busy period begun by a type j customer, then

    E{e^{−sY_j}} = ∫_0^∞ φ(w,s) B_j(dw).
In Section 4.2 we consider busy periods for the backward model with exponential arrivals, generalizing a procedure applied by Cox and Smith in [5; §5.6] to the random arrivals case. We show that the busy period transform φ(w,s) is a mixture of exponential terms, i.e.

    φ(w,s) = Σ_{j=1}^{K} H_j(s) e^{−wμ_j(s)},

and derive equations enabling us to solve for H_j(s), μ_j(s) for all j. In Section 4.3 we show how to compute the means m_j = EY_j, m(w) = EY(w). In Section 4.4 we extend some of the results of Section 4.2 to the more general case of the backward transition model with λ_0 = ∞.

Similar results could be obtained for φ(w,s) for the corresponding forward models, but the simple expression for E{e^{−sY_j}} above is no longer valid, since the initial service time and the time to the second arrival are now dependent. Therefore, the procedures of this chapter are of little value for the forward models.
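For K = 1 with exponential service (the classical M/M/1 random arrivals case) the busy period is easy to simulate directly from the virtual waiting time, and the mean can be checked against the standard value b/(1 − b/a). A sketch — the rates below are illustrative assumptions (λ = 1/a = 0.5, service rate 1, so the mean busy period is 2):

```python
import random

random.seed(7)
arrival_rate, service_rate = 0.5, 1.0    # rho = 1/2
b = 1.0 / service_rate
a = 1.0 / arrival_rate

def busy_period():
    # Follow the virtual waiting time from one initial service time
    w = random.expovariate(service_rate)
    length = 0.0
    while True:
        gap = random.expovariate(arrival_rate)
        if gap >= w:
            return length + w            # workload drains to zero first
        w += random.expovariate(service_rate) - gap
        length += gap

n = 50_000
mean_busy = sum(busy_period() for _ in range(n)) / n
expected = b / (1.0 - b / a)             # = 2.0 here
```

The same workload bookkeeping, with the type-dependent (U_n, V_{n+1}) sampling of Section 3.2, simulates Y(w) for the general backward model.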
4.2 The backward model with exponential arrivals

In this section we consider the general backward model with K customer types, general service times and exponential inter-arrival times. At each arrival epoch the type of the next customer is selected from {1,2,...,K} according to the (non-zero) probabilities π_1,π_2,...,π_K and independently of any previous events. If the next customer is chosen to be of type j, then the time to his arrival is sampled from an exponential distribution with parameter λ_j, and his service time has d.f. B_j. If λ_j = λ_k for some j ≠ k, then we could amalgamate the two customer types without affecting the busy period distribution by taking (π_j B_j + π_k B_k)/(π_j + π_k) as the new service time d.f. Therefore, we may assume that λ_1,λ_2,...,λ_K are distinct; suppose 0 < λ_1 < λ_2 < ⋯ < λ_K. We assume also that the mean service times

    b_j def= ∫_0^∞ u B_j(du)

are finite.
47
.-.
To consider the busy periods for this model, we suppose that at
time
0 a customer with service time
server free.
Let
Wt(w)
be the virtual waiting time at time
under this initial assumption.
i.e.
Yew)
w has arrived to find the
Define
Yew)
= inf
{t : Wt(w)
t
~
= O}
0
,
is the length of the busy period induced by the arrival
of a customer with service time
a hitting time for the process
yew)
w.
(Wt,t~O.),
is measurable since it is
[15; p. 72], and
Yew)
is finite a.s. provided the stationarity condition holds, i.e.
K
I
j=l
since then
Wt
1T.(h. - l/A.) < 0 ,
J J
J
reaches
0 eventually with probability 1 (see Chapter
3) .
At any instant t > 0, there will be a customer in the process of arriving, of the type selected at the previous arrival epoch. We define Z_t to be this customer type and make (Z_t) right-continuous, so that if t is an arrival epoch, then Z_t is the type of the next customer to arrive. Then (Z_t, t ≥ 0) is a stationary Markov process, with Pr{Z_t = j} = π_j. We will refer to Z_t as the arrival state at time t.

Consider realizations of the process (W_t(w), t ≥ 0). (In the rest of this paragraph we suppress w.) (W_t) has discrete increments at each arrival epoch equal to the service time of the arriving customer. Elsewhere (W_t) decreases continuously at a unit rate until it reaches zero, where it remains until the next arrival epoch. (W_t) is not a Markov process, since {W_s : 0 ≤ s ≤ t} contains information about the time of the last arrival before t that is not given by knowledge of W_t alone, and this information affects the distribution of the time to the next arrival after t, except when the (overall) inter-arrival distribution is exponential. [The unconditional inter-arrival distribution has d.f. Σ_{j=1}^{K} π_j (1 − e^{−λ_j x}).]
Suppose (W_s,Z_s) is given for 0 ≤ s ≤ t. The distribution of inter-arrival time conditional on the arrival type is exponential, so the time to the next arrival after t is now independent of the time since the last arrival before t, as Z_t is given. Therefore, the future behaviour of (W_u,Z_u) for u > t depends on {(W_s,Z_s) : 0 ≤ s ≤ t} only through (W_t,Z_t). This shows that ((W_t,Z_t), t ≥ 0) is a Markov process. It is clear from the discussion at the beginning of this paragraph that the behaviour of (W_u,Z_u) for u > t given (W_t,Z_t) is independent of t, i.e. ((W_t,Z_t)) is homogeneous [1].

Define φ(w,s) = E{e^{−sY(w)}} for Re(s) ≥ 0, w ≥ 0; i.e. φ(w,s) is the Laplace-Stieltjes transform of the d.f. of Y(w). In the classical random arrivals case, which is what our present model reduces to when K = 1, it is possible to show that

    φ(w_1 + w_2, s) = φ(w_1,s) φ(w_2,s)   for any w_1, w_2 > 0,

and consequently that φ(w,s) is necessarily of the form φ(w,s) = e^{−wα(s)} [5; p. 145]. In generalizing this procedure to our present model we must split φ(w,s) into components corresponding to the conditional transforms of the busy period distribution given the initial and final arrival states.
49
Let
;
Then
K
cf>(W,5)
= I
K
I
i=l j=l
cp •• (w,s)
1J
In the following lemma we derive a relation satisfied by the components
<Pij(w,S)
and corresponding to the relation
for the random arrivals case.
LE~MA
For any w ,w > 0 and
l 2
4.1
i,j
€
{1,2, ..• ,K},
Re(s)
~
0 ,
(4.1)
PROOF: We may split the busy period generated by an initial service time w_1 + w_2 into two parts, the first part being the time required to expend amount w_1 of the virtual waiting time, the second being the time required to expend the remaining amount w_2. Let

    Y_1 = inf {t : W_t(w_1 + w_2) = w_2},   Y_2 = Y(w_1 + w_2) − Y_1.

Let E_ij be the event {Z_0 = i, Z_{Y_1+Y_2} = j}, and E_ijℓ the event {Z_0 = i, Z_{Y_1} = ℓ, Z_{Y_1+Y_2} = j}. Then E_ij = ∪_{ℓ=1}^{K} E_ijℓ, and so we have

    φ_ij(w_1 + w_2, s) = E{e^{−s(Y_1+Y_2)} | E_ij} Pr(E_ij)
                       = ∫_{E_ij} e^{−s(Y_1+Y_2)} dP
                       = Σ_{ℓ=1}^{K} ∫_{E_ijℓ} e^{−s(Y_1+Y_2)} dP
                       = Σ_{ℓ=1}^{K} E{e^{−s(Y_1+Y_2)} | E_ijℓ} Pr(E_ijℓ)

(where P is the probability measure of the underlying probability space).

If E_ijℓ has occurred, then W_{Y_1} = w_2 and Z_{Y_1} = ℓ. Because of the homogeneity of the (W_t,Z_t) process, the conditional distribution of Y_2 given Y_1, E_ijℓ is the same as the distribution of the length of the busy period induced by an initial service time w_2, conditional on the initial arrival state being ℓ and the final arrival state j, i.e.

    Pr{Y_2 ≤ y_2 | Y_1 = y, E_ijℓ} = Pr{Y(w_2) ≤ y_2 | Z_0 = ℓ, Z_{Y(w_2)} = j}.

(We note that the conditional probability can be defined for almost all y [17; p. 260].) Since Pr{Y_2 ≤ y_2 | Y_1 = y, E_ijℓ} is therefore independent of y, it follows that Y_1 and Y_2 are conditionally independent given E_ijℓ; for if G_ijℓ is the conditional d.f. of Y_1 given E_ijℓ, then we have

    Pr{Y_1 ≤ y_1, Y_2 ≤ y_2 | E_ijℓ} = ∫_0^{y_1} Pr{Y_2 ≤ y_2 | Y_1 = y, E_ijℓ} G_ijℓ(dy)
                                     = Pr{Y_2 ≤ y_2 | E_ijℓ} ∫_0^{y_1} G_ijℓ(dy).

Because of this conditional independence of Y_1, Y_2 we have

    E{e^{−s(Y_1+Y_2)} | E_ijℓ} = E{e^{−sY_1} | E_ijℓ} E{e^{−sY_2} | E_ijℓ}.

We defined Y_1 as inf {t : W_t(w_1+w_2) = w_2}; for every realization of the process (W_t(w_1+w_2)) there is a realization of (W_t(w_1)) such that W_t(w_1+w_2) = W_t(w_1) + w_2 for all t with 0 ≤ t ≤ inf {u : W_u(w_1) = 0}. Thus the conditional distribution of Y_1 given E_ijℓ is the same as the conditional distribution of Y(w_1) given Z_0 = i, Z_{Y(w_1)} = ℓ. Hence

    E{e^{−sY_1} | E_ijℓ} = E{e^{−sY(w_1)} | Z_0 = i, Z_{Y(w_1)} = ℓ}.

From our discussion of the conditional independence of Y_1 and Y_2 given E_ijℓ, it follows that the conditional distribution of Y_2 given E_ijℓ is the same as the conditional distribution of Y(w_2) given Z_0 = ℓ, Z_{Y(w_2)} = j. Hence

    E{e^{−sY_2} | E_ijℓ} = E{e^{−sY(w_2)} | Z_0 = ℓ, Z_{Y(w_2)} = j}.

Now, since (Z_t) is stationary, we have

    Pr(E_ijℓ) = Pr{Z_0 = i, Z_{Y_1} = ℓ, Z_{Y_1+Y_2} = j}
              = Pr{Z_0 = i, Z_{Y_1} = ℓ} Pr{Z_{Y_1+Y_2} = j | Z_0 = i, Z_{Y_1} = ℓ}
              = Pr{Z_0 = i, Z_{Y(w_1)} = ℓ} Pr{Z_{Y(w_2)} = j | Z_0 = ℓ}.

Therefore, we have shown that

    φ_ij(w_1+w_2,s) = Σ_{ℓ=1}^{K} E{e^{−sY(w_1)} | Z_0 = i, Z_{Y(w_1)} = ℓ} Pr{Z_0 = i, Z_{Y(w_1)} = ℓ}
                      × E{e^{−sY(w_2)} | Z_0 = ℓ, Z_{Y(w_2)} = j} Pr{Z_{Y(w_2)} = j | Z_0 = ℓ},

which gives us Equation (4.1), since Pr{Z_0 = ℓ} = π_ℓ. □

Let Φ(w,s) be the K × K matrix with (i,j)-element φ_ij(w,s).
Then (4.1) is equivalent to the matrix relation

    Φ(w_1 + w_2, s) = Φ(w_1,s) Λ^{−1} Φ(w_2,s),                        (4.2)

where Λ = diag(π_1,π_2,...,π_K). From (4.2) we may derive a partial differential (matrix) equation for Φ(w,s). Let

    φ_{·j}(w,s) = Σ_{i=1}^{K} φ_ij(w,s),   ω_ij(s) = ∫_0^∞ φ_{·j}(u,s) B_i(du),
i.e. ω_ij(s) is a multiple of the Laplace transform of the length of the busy period initiated by a type i customer, conditional on the busy period ending in arrival state j. Let Ψ(s) be the K × K matrix with (i,j)-element

    ψ_ij(s) = (λ_i + s)δ_ij − λ_i ω_ij(s),

δ_ij being the Kronecker delta (δ_ij = 1 if i = j, 0 otherwise).

THEOREM 4.1: Φ(w,s) satisfies the matrix differential equation

    ∂Φ(w,s)/∂w = −Φ(w,s) Ψ(s)   for Re(s) ≥ 0, w > 0.                  (4.3)

PROOF: Suppose w, h > 0. Then from (4.2) we have

    h^{−1}{Φ(w+h,s) − Φ(w,s)} = Φ(w,s) h^{−1}{Λ^{−1}Φ(h,s) − I_K}.

We consider the behaviour of Φ(h,s) as h → 0+. Let N(t) be the number of arrivals in (0,t]. Since the inter-arrival distributions are exponential, we have

    Pr{N(h) = 0 | Z_0 = i} = 1 − λ_i h + o(h),   Pr{N(h) ≥ 1 | Z_0 = i} = λ_i h + o(h),   Pr{N(h) > 1 | Z_0 = i} = o(h),

as h → 0+. If N(h) = 0, then Y(h) = h and Z_{Y(h)} = Z_0; if N(h) ≥ 1, then at time h the virtual waiting time is equal to the service time of the newly-arrived customer, who is necessarily of type Z_0 = i. Therefore, we have

    φ_ij(h,s) = π_i (1 − λ_i h) e^{−sh} δ_ij + π_i λ_i h e^{−sh} ∫_0^∞ φ_{·j}(u,s) B_i(du) + o(h)
              = π_i {1 − (λ_i + s)h} δ_ij + π_i λ_i h ω_ij(s) + o(h).

Hence

    lim_{h→0+} h^{−1} {π_i^{−1} φ_ij(h,s) − δ_ij} = −(λ_i + s)δ_ij + λ_i ω_ij(s) = −ψ_ij(s),

so it follows that Φ(w,s) is differentiable with respect to w and satisfies (4.3). □
In solving Equation (4.3) we take as boundary condition the equation Φ(0,s) = Λ, since φ_ij(0,s) = π_i δ_ij. For fixed s, write x_j(w) for the transpose of the jth row of Φ(w,s); then x_j satisfies the differential equation

    dx_j/dw = −Ψ(s)' x_j,

with boundary condition x_j(0) = π_j e_j, where e_j is the jth column of I. The unique solution of this equation satisfying the boundary condition is x_j = e^{−wΨ'}(π_j e_j) [10; p. 190]. It follows that the unique solution of (4.3) satisfying Φ(0,s) = Λ is

    Φ(w,s) = Λ e^{−wΨ(s)}.

LEMMA 4.2: If A is an n × n matrix with n distinct eigenvalues μ_1,...,μ_n, then

    e^{tA} = Σ_{j=1}^{n} e^{tμ_j} ℓ_j(A),

where ℓ_j is a fundamental polynomial of Lagrange type with degree n − 1 such that ℓ_j(μ_i) = δ_ij.

PROOF: A is simple [10; p. 61], so the lemma follows from results in [10; p. 169]. □
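Lemma 4.2 is easy to check numerically: for a matrix with distinct eigenvalues, the interpolation polynomials ℓ_j(A) = Π_{k≠j} (A − μ_k I)/(μ_j − μ_k) reassemble e^{tA}. A sketch — the matrix below is an arbitrary illustrative choice with three distinct real eigenvalues:

```python
import numpy as np

A = np.array([[1.0, 2.0, 0.0],
              [0.5, -1.0, 1.0],
              [0.0, 1.0, 3.0]])
t = 0.7

mu, V = np.linalg.eig(A)                 # eigenvalues assumed distinct
n = len(mu)
I = np.eye(n)

# e^{tA} via the Lagrange interpolation formula of Lemma 4.2
expm_lagrange = np.zeros((n, n), dtype=complex)
for j in range(n):
    Lj = I.astype(complex)               # build l_j(A) as a matrix product
    for k in range(n):
        if k != j:
            Lj = Lj @ (A - mu[k] * I) / (mu[j] - mu[k])
    expm_lagrange += np.exp(t * mu[j]) * Lj

# Reference: e^{tA} via the eigendecomposition A = V diag(mu) V^{-1}
expm_eig = V @ np.diag(np.exp(t * mu)) @ np.linalg.inv(V)

err = np.max(np.abs(expm_lagrange - expm_eig))
```

This is the computation behind writing Φ(w,s) = Λ e^{−wΨ(s)} as a mixture of exponentials below.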
Suppose Ψ(s) has K distinct eigenvalues μ_1(s),...,μ_K(s) at some point s in {Re(s) ≥ 0}. Then by Lemma 4.2

    e^{−wΨ(s)} = Σ_{j=1}^{K} e^{−wμ_j(s)} ℓ_j(Ψ(s),s),

where ℓ_j(u,s) is a polynomial in u of degree K − 1 such that ℓ_j(μ_ℓ(s),s) = δ_jℓ. Specifically, we have

    ℓ_j(u,s) = Π_{k≠j} {u − μ_k(s)} / {μ_j(s) − μ_k(s)}.

It follows then that φ(w,s), which is the sum of the elements of the matrix Φ(w,s), can be written as
57
....
K
l
</l (W,s) =
-Wll. (S)
e
3
H. (5) ,
3
j=l
where Hj(S)
Let S
= {s
is the sum of the elements of the matrix
~
: Re(s)
0 and
~(s)
has
K
Atj(~(s),s).
distinct eigenvalues}.
Then we have proved
llIEOREM 4.2
For all
5
S
€
and all w
~
0
(4.4)
Let S+
= {s
: Re(s)
O}nS.
>
valid throughout S+
We shall prove that Equation (4.4) is
except at isolated points.
First we note some
properties of 1l.{s),H.(s).
J
LEMMA 4.3:
so that
(i)
].I. (5)
3
We may number the eigenvalues of
is a continuous function of s
3
(ii)
is analytic throughout
J
Since each w.. (s)
ls
13
the elements of
~(s)
The eigenvalues of
1J
(s)
~(s)
is analytic there.
•
s
in {Re(s) > O}
eigenvalues
in
{Re(s)
~
O}.
are continuous functions of the elements of
[10; p. 237].
tRees)
>
a},
each
It follows from the theory of
analytic functions that each ].Ij(s)
point
O}.
S+.
are continuous functions of s
1J
~ ..
~
5 varies
a mUltiple of a Laplace transform,
[10; p. 225], So (i) is clear.
Since w. . (5) is analytic throughout
~(s)
in {Re(5)
as
is analytic throughout S+.
llj (5)
(iii) H. (5)
PROOf:
~(s)
such that
is analytic at every
~(s)
has non-repeated
This proves (ii); (iii) follows directly
58
from (ii) and the definition of H.(s).
J
not be defined at points where
may
J
has repeated eigenvalues.
~(s)
We show now that points where
We note that H.(s)
0
has repeated eigenvalues
~(s)
correspond to the zeros of an analytic function, and are therefore
I
isolated.
It is clear that Ψ(s) has repeated eigenvalues iff the discriminant of the equation |μI_K − Ψ(s)| = 0 is zero, i.e. iff

    D(s) def= Π_{i<j} {μ_i(s) − μ_j(s)}² = 0.

LEMMA 4.4: D(s) is analytic throughout {Re(s) > 0}.

PROOF: From (ii) of Lemma 4.3, D(s) is analytic throughout S+. Since repeated eigenvalues may fail to be analytic, we need some results from [6; pp. 233-7], to the effect that symmetric polynomials in multiple roots are analytic, in order to show that D(s) is analytic everywhere in {Re(s) > 0}.

We summarize the results we need from [6]. Let F(x,y) be an analytic function of x and y in a neighbourhood of a point (α,β) such that the equation F(α,y) = 0 has y = β as a root of multiplicity n. Then there exist positive real numbers r, ρ such that for all x with |x − α| ≤ r there are exactly n roots y_1(x),...,y_n(x) of F(x,y) = 0 in {y : |y − β| ≤ ρ}. Every expression of the form

    Σ_{j=1}^{n} y_j(x)^k   (k = 1,2,...)

is analytic in x throughout {x : |x − α| ≤ r}.

Applying these results to our situation, let

    F(s,y) def= |yI_K − Ψ(s)|.

If μ_k(s_0) is a repeated eigenvalue of Ψ(s_0), then the eigenvalues μ_j(s_0) equal to μ_k(s_0) are multiple roots of the equation F(s_0,y) = 0. Since F(s,y) is analytic in y and s in a neighbourhood of (s_0, μ_k(s_0)) for any s_0 in {Re(s) > 0}, it follows that the sum of the ℓth powers of these eigenvalues is analytic in some neighbourhood of s_0 for any ℓ = 1,2,.... It follows that

    Σ_{j=1}^{K} μ_j(s)^ℓ

is analytic in {Re(s) > 0} for any ℓ = 1,2,.... As noted in [6; p. 235], this is sufficient to prove that all symmetric polynomials in μ_1(s),...,μ_K(s) are analytic in {Re(s) > 0}; in particular, D(s) is analytic throughout {Re(s) > 0}. □

It will follow that D(s) has isolated zeros if D(s) is not identically zero. In the following lemma we show that Ψ(s) has distinct eigenvalues for all s with Re(s) sufficiently large, so that D(s) cannot be identically zero.
LEMMA 4.5: If B_j(0) = 0 for each j, then Ψ(s) has K distinct
eigenvalues when Re(s) is sufficiently large.

PROOF: We restrict our proof to the case where s is real; the proof in
the general case is similar. We show first that ω_{ij}(s) = o(1) as
s → ∞. We have

    |ω_{ij}(s)| = |∫_0^∞ φ_j(u,s) B_i(du)|
                ≤ ∫_0^∞ |φ_j(u,s)| B_i(du)
                ≤ ∫_0^∞ e^{-su} B_i(du) ,

since Y(u) ≥ u. It follows from the assumption B_i(0) = 0 for each i
that

    lim_{s→∞} ω_{ij}(s) = 0 ,   i.e.   ω_{ij}(s) = o(1) .
By Geršgorin's Theorem [10; p. 226], each eigenvalue of Ψ(s) lies in at
least one of the discs

    D_j(s) = { z : |z - (λ_j + s) + λ_j ω_{jj}(s)| ≤ Σ_{k≠j} λ_j |ω_{jk}(s)| }

(and in exactly one of these discs if D_1,...,D_K are disjoint). Now
ω_{ij}(s) = o(1) as s → ∞ implies that D_1(s),...,D_K(s) will be
disjoint for s sufficiently large. It follows that μ_1(s),...,μ_K(s)
can be numbered so that

    μ_j(s) = λ_j + s - λ_j ω_{jj}(s) + o(1) = λ_j + s + o(1)

as s → ∞, so they must be distinct for s sufficiently large.

Thus we have proved

THEOREM 4.3: If B_i(0) = 0 for each i, then equation (4.4) is valid
throughout {Re(s) > 0} except possibly at isolated points.
REMARK: H_j(s) and μ_j(s) have been defined only in terms of the matrix
Ψ(s), which is usually unknown. In order to solve for H_j(s), μ_j(s) we
need to derive equations satisfied by H_j(s), μ_j(s). We shall obtain
such equations by substituting

    φ(w,s) = Σ_{j=1}^K H_j(s) e^{-w μ_j(s)}

into an integral equation for φ(w,s). First we prove some properties of
H_j(s), μ_j(s):
LEMMA 4.6:
(i)   Σ_{j=1}^K H_j(s) = 1 for all s ∈ S.
(ii)  If B_i(0) = 0 for each i, then lim_{Re(s)→∞} H_j(s) = π_j for
      each j.
(iii) Re{μ_j(s)} ≥ 0 for s in {Re(s) ≥ 0}.

PROOF: (i) is clear since φ(0,s) = 1 for all s.

To prove (ii) we restrict our proof to the case where s is real. From
the proof of Lemma 4.5 we know that μ_k(s) = λ_k + s + o(1) and
ω_{ij}(s) = o(1) as s → ∞, so the (m,n)-element of the matrix
Ψ(s) - μ_k(s)I_K is

    ψ_{mn}(s) - μ_k(s)δ_{mn} = (λ_m - λ_k)δ_{mn} + o(1) .

Since all off-diagonal elements of Ψ(s) are o(1), it follows that the
(m,n)-element of Π_{k≠j}{Ψ(s) - μ_k(s)I_K} is

    Π_{k≠j}(λ_m - λ_k)δ_{mn} + o(1) = δ_{jm}δ_{mn} Π_{k≠j}(λ_j - λ_k) + o(1) ,

and H_j(s) is obtained by summing the elements of this matrix, weighted
by the initial probabilities π_m and normalized by
Π_{k≠j}{μ_j(s) - μ_k(s)} = Π_{k≠j}(λ_j - λ_k) + o(1). Hence
H_j(s) = π_j + o(1).

For (iii), suppose μ(s) is an eigenvalue of Ψ(s). By Geršgorin's
Theorem there exists j ∈ {1,2,...,K} such that

    |μ(s) - (λ_j + s) + λ_j ω_{jj}(s)| ≤ Σ_{k≠j} λ_j |ω_{jk}(s)| ,

and hence

    Re{μ(s)} ≥ Re{λ_j + s - λ_j ω_{jj}(s)} - Σ_{k≠j} λ_j |ω_{jk}(s)| .

Now

    |ω_{jk}(s)| ≤ ∫_0^∞ |φ_k(u,s)| B_j(du)
                ≤ ∫_0^∞ Pr{Z_{Y(u)} = k} B_j(du) = q_{jk} ,

say; q_{jk} is the probability that a busy period initiated by a type j
customer ends during arrival state k, so q_{jk} ≥ 0 and Σ_k q_{jk} = 1.
Thus

    Re{μ(s)} ≥ Re(s) + λ_j - λ_j q_{jj} - λ_j Σ_{k≠j} q_{jk} = Re(s) .

REMARKS: 1) Re(s) ≥ 0 ⇒ Re{μ(s)} ≥ 0, and Re(s) > 0 ⇒ Re{μ(s)} > 0.
For each j, the zeros of H_j(s) must be isolated in S+, since Lemma
4.6 (ii) implies that H_j(s) is not identically zero, and we already
know that H_j(s) is analytic throughout S+ (Lemma 4.3).

NOTE: This presumes that B_i(0) = 0 for all i; we assume henceforth
that this is true.
2) Part (ii) of Lemma 4.6 also implies that

    lim_{Re(s)→∞} e^{sw} φ(w,s) = Σ_{j=1}^K π_j e^{-λ_j w} ,

a result that we can obtain from another direction as follows. Let
ζ(w) = Y(w) - w, i.e. ζ(w) is the excess length over the initial
service time of a busy period induced by an initial service time w.
Then E{e^{-sζ(w)}} = e^{sw} φ(w,s), so that

    Pr{ζ(w) = 0} = lim_{Re(s)→∞} e^{sw} φ(w,s) .

But

    Pr{ζ(w) = 0} = Pr{no arrivals in (0,w]} = Σ_{j=1}^K π_j e^{-λ_j w} .
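The identity Pr{ζ(w) = 0} = Σ_j π_j e^{-λ_j w} is easy to check by
simulation: draw the arrival state, then one inter-arrival time, and
count the runs with no arrival before w. A minimal stdlib-only Python
sketch (the parameters π = (1/4, 1/2, 1/4), λ = (1, 2, 3) anticipate
the example of Section 4.3):

```python
import math, random

def p_no_arrival(w, pi, lam, n=200_000, seed=1):
    """Monte Carlo estimate of Pr{no arrival in (0, w]}: draw the
    arrival state j with probabilities pi, then an exp(lam[j])
    inter-arrival time, and count the runs that outlast w."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        j = rng.choices(range(len(pi)), weights=pi)[0]
        if rng.expovariate(lam[j]) > w:
            hits += 1
    return hits / n

pi, lam = (0.25, 0.5, 0.25), (1.0, 2.0, 3.0)
exact = sum(p * math.exp(-l * 1.0) for p, l in zip(pi, lam))  # w = 1
est = p_no_arrival(1.0, pi, lam)
```

With 200,000 replications the estimate agrees with the closed form to
about three decimal places.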
We now derive an integral equation for φ(w,s):

LEMMA 4.7: φ(w,s) satisfies the integral equation

    φ(w,s) = Σ_{j=1}^K π_j { e^{-(s+λ_j)w}
             + λ_j ∫_0^w e^{-(λ_j+s)z} ∫_0^∞ φ(w-z+v,s) B_j(dv) dz }   (4.5)

for w ≥ 0, Re(s) ≥ 0.

PROOF: The derivation of this equation follows a standard pattern for
integral equations in queueing theory, so we merely sketch the proof.
We have

    φ(w,s) = Σ_{j=1}^K π_j E{e^{-sY(w)} | Z_0 = j} .

Given Z_0 = j, either no arrival occurs in (0,w], with probability
e^{-λ_j w}, in which case Y(w) = w, or the first arrival in (0,w] takes
place at time z with probability λ_j e^{-λ_j z} dz, increasing the
virtual waiting time at z to w - z + V, where V has d.f. B_j, so that
in this case

    E{e^{-sY(w)} | Z_0 = j, first arrival at z}
        = e^{-sz} ∫_0^∞ φ(w-z+v,s) B_j(dv) .
To derive equations for H_j(s), μ_j(s) we now substitute the expression
for φ(w,s) from equation (4.4) into equation (4.5). Then for each
s ∈ S we have

    Σ_{i=1}^K H_i(s) e^{-wμ_i(s)}
      = Σ_{j=1}^K π_j { e^{-(λ_j+s)w}
        + λ_j ∫_0^w e^{-(λ_j+s)z} ∫_0^∞ Σ_{i=1}^K H_i(s) e^{-(w-z+v)μ_i(s)} B_j(dv) dz } .

Now

    λ_j ∫_0^w e^{-(λ_j+s)z} ∫_0^∞ Σ_{i=1}^K H_i(s) e^{-(w-z+v)μ_i(s)} B_j(dv) dz
      = λ_j Σ_{i=1}^K H_i(s) e^{-wμ_i(s)} ∫_0^∞ e^{-vμ_i(s)} B_j(dv)
            ∫_0^w e^{-(λ_j+s-μ_i(s))z} dz
      = λ_j Σ_{i=1}^K H_i(s) B_j*{μ_i(s)} e^{-wμ_i(s)}
            [1 - e^{-(λ_j+s-μ_i(s))w}] / [λ_j + s - μ_i(s)] .

Therefore for each s ∈ S we have

    Σ_{i=1}^K H_i(s) e^{-wμ_i(s)}
      = Σ_{j=1}^K π_j { e^{-(λ_j+s)w}
        + λ_j Σ_{i=1}^K H_i(s) B_j*{μ_i(s)}
              [e^{-wμ_i(s)} - e^{-(λ_j+s)w}] / [λ_j + s - μ_i(s)] } .   (4.6)

By Lemma A4 we may equate the coefficients of e^{-wμ_j(s)} and
e^{-(s+λ_j)w} to zero, since for s ∈ S μ_1(s),...,μ_K(s) are distinct,
λ_1,...,λ_K are distinct, and μ_i(s) can equal λ_j + s for some j only
at isolated points in S+.
Equating the coefficient of e^{-(s+λ_j)w} to zero in equation (4.6), we
obtain for s ∈ S, j = 1,2,...,K

    λ_j Σ_{i=1}^K H_i(s) B_j*{μ_i(s)} / (λ_j + s - μ_i(s)) = 1 .   (4.7)

The coefficient of e^{-wμ_i(s)} in equation (4.6) is

    H_i(s) { Σ_{j=1}^K π_j λ_j B_j*{μ_i(s)} / (λ_j + s - μ_i(s)) - 1 } .

Since each μ_i(s) is continuous throughout {Re(s) ≥ 0} and H_i(s) has
isolated zeros in S+, for this coefficient to be zero we must have for
every s ∈ S

    Σ_{j=1}^K π_j λ_j B_j*{μ_i(s)} / (λ_j + s - μ_i(s)) = 1
        for i = 1,2,...,K .   (4.8)

We may extend the domain of validity of equation (4.8) to the entire
right half-plane {Re(s) ≥ 0} in two steps: every s in {Re(s) > 0} - S+
can be surrounded by a deleted neighbourhood in S+, so that (4.8) holds
in {Re(s) > 0} by the continuity of μ_i(s); since μ_i(s) is continuous
to the right at the imaginary axis, it follows that (4.8) holds for s
on the imaginary axis as well.

To summarize, we have shown that

    λ_j Σ_{i=1}^K H_i(s) B_j*{μ_i(s)} / (λ_j + s - μ_i(s)) = 1
        for s ∈ S, j = 1,2,...,K ,   (4.7)

    Σ_{j=1}^K π_j λ_j B_j*{μ_i(s)} / (λ_j + s - μ_i(s)) = 1
        for Re(s) ≥ 0, i = 1,2,...,K .   (4.8)
Define

    Q(z,s) = Σ_{j=1}^K π_j λ_j B_j*(z) / (λ_j + s - z) ,

defined for Re(z) ≥ 0, Re(s) ≥ 0.

LEMMA 4.8: For any fixed positive s, the equation Q(z,s) = 1 has
exactly K roots in {Re(z) > 0}.

PROOF: Once again we apply Rouché's Theorem, our proof following the
pattern of the proof of Lemma 2.10. Consider the simple closed contour
Γ(ε,R), defined for 0 < ε < R by

    Γ(ε,R) = {z : Re(z) = ε, |z| ≤ R} ∪ {z : |z| = R, Re(z) > ε} .

Γ(ε,R) is almost a semicircle in the right half-plane. [Diagram: the
contour Γ(ε,R).]

Since Q(z,s) is analytic in {Re(z) > 0} except for a finite number of
singularities at z = λ_j + s, j = 1,2,...,K, Q(z,s) is analytic on
Γ(ε,R) for all ε sufficiently small and all R sufficiently large. We
show next that we may choose ε, R so that |Q(z,s)| < 1 for all
z ∈ Γ(ε,R). For z on the imaginary axis, z = iθ say,

    |Q(z,s)| ≤ Σ_{j=1}^K π_j λ_j |B_j*(iθ)| / |λ_j + s - iθ|
             ≤ Σ_{j=1}^K π_j λ_j / (λ_j + σ)
             < Σ_{j=1}^K π_j = 1 ,

where s = σ + iτ, since σ > 0. It follows by Lemma A3 that for ε
sufficiently small |Q(z,s)| < 1 for all z in {Re(z) = ε, |z| ≤ R}.
Since Q(z,s) → 0 as |z| → ∞ with Re(z) ≥ ε, for R sufficiently large
|Q(z,s)| < 1 on the arc {|z| = R, Re(z) > ε} as well. Thus we may
choose ε, R so that |Q(z,s)| < 1 for all z ∈ Γ(ε,R).
Let

    P(z) = Π_{j=1}^K (λ_j + s - z) ,
    M(z) = P(z)Q(z,s) = Σ_{j=1}^K π_j λ_j B_j*(z) Π_{k≠j} (λ_k + s - z) ,

so P(z) and M(z) are analytic inside and on Γ(ε,R), and
|M(z)| = |P(z)||Q(z,s)| < |P(z)| for all z ∈ Γ(ε,R). Therefore, by
Rouché's Theorem, P(z) - M(z) = P(z){1 - Q(z,s)} has the same number of
zeros inside Γ(ε,R) as does P(z), i.e. P(z) - M(z) has exactly K zeros
in {Re(z) > 0}. Each zero of P(z) - M(z) must be a root of Q(z,s) = 1:
if P(z_0) = P(z_0) - M(z_0) = 0, then M(z_0) = 0 and z_0 = λ_k + s for
some k ∈ {1,2,...,K}. But this is impossible, since π_k, λ_k > 0,
λ_1,...,λ_K are distinct and B_k*(u) cannot be zero for u positive
(Lemma A1). Thus we have shown that the equation Q(z,s) = 1 has exactly
K roots in {Re(z) > 0}.

LEMMA 4.9: For each fixed positive s, the K roots of the equation
Q(z,s) = 1 in {Re(z) > 0} lie on the positive real axis: one in the
interval (s, s + λ_1) and one in each of the intervals
(s + λ_i, s + λ_{i+1}) for i = 1,2,...,K-1.

PROOF: Q(z,s) is real and continuous for z on the positive real axis
except at the points z = λ_j + s. Since Q(λ_i + s + 0, s) = -∞ and
Q(λ_{i+1} + s - 0, s) = +∞, it follows that there is at least one value
of z in the interval (s + λ_i, s + λ_{i+1}) with Q(z,s) = 1, for
i = 1,2,...,K-1. Since Q(z,s) < 1 for 0 < z ≤ s (for such z,
B_j*(z) < 1 and λ_j + s - z ≥ λ_j) and Q(s + λ_1 - 0, s) = +∞, there is
at least one root of Q(z,s) = 1 in the interval (s, s + λ_1). Thus
there is at least one root in each of these K intervals; by Lemma 4.8
there are exactly K roots in {Re(z) > 0}, so there is exactly one root
in each interval.

COROLLARY 1: Ψ(s) has K distinct eigenvalues for all positive s.

PROOF: The eigenvalues of Ψ(s) are roots of the equation Q(z,s) = 1 in
the region {Re(z) > 0}. By Lemma 4.8 there are exactly K roots in this
region, so the eigenvalues must be the K distinct positive roots of
Lemma 4.9.
COROLLARY 2: φ(w,s) = Σ_{j=1}^K H_j(s) e^{-wμ_j(s)} for all positive s.

We can now, in principle, calculate φ(w,s) for s in {Re(s) > 0}: for
positive values of s we solve (4.8) for μ_1(s),...,μ_K(s) and then
(4.7) for H_1(s),...,H_K(s). For any s in {Re(s) ≥ 0, Im(s) ≠ 0} we
need to check whether the K solutions of Q(z,s) = 1 are distinct: if
they are, then we can solve (4.7) for H_1(s),...,H_K(s) as before; if
not, then we conclude that φ(w,s) = lim_{n→∞} φ(w,s_n), where {s_n} is
a sequence in S such that s_n → s.
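For fixed positive s, Lemma 4.9 reduces the computation of
μ_1(s),...,μ_K(s) to K one-dimensional searches on bracketing
intervals. A minimal stdlib-only Python sketch; the parameters π, λ and
the exponential service transforms B_j*(z) = α_j/(α_j + z) are those of
the example in Section 4.3, and s = 1 is an arbitrary illustrative
choice:

```python
PI  = (0.25, 0.5, 0.25)   # pi_j
LAM = (1.0, 2.0, 3.0)     # lambda_j (distinct, increasing)
ALF = (2.0, 6.0, 1.0)     # alpha_j: B_j*(z) = alpha_j/(alpha_j + z)

def Q(z, s):
    """Q(z,s) = sum_j pi_j lam_j B_j*(z)/(lam_j + s - z)."""
    return sum(p * l * (a / (a + z)) / (l + s - z)
               for p, l, a in zip(PI, LAM, ALF))

def bisect(f, lo, hi, iters=200):
    """Locate the sign change of f in (lo, hi) by bisection."""
    flo = f(lo)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (f(mid) < 0) == (flo < 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return 0.5 * (lo + hi)

def mu_roots(s, eps=1e-9):
    """One root of Q(z,s) = 1 in (s, s+lam_1) and one in each
    (s+lam_i, s+lam_{i+1}), as guaranteed by Lemma 4.9."""
    ends = [s] + [s + l for l in LAM]
    return [bisect(lambda z: Q(z, s) - 1.0, ends[i] + eps, ends[i+1] - eps)
            for i in range(len(LAM))]

roots = mu_roots(1.0)
```

Each bracket has Q - 1 negative at its left end and +∞ at its right
end, so plain bisection is safe; Newton's method could be substituted
once a bracket is known.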
4.3  The Mean Length of a Busy Period

Let m(w) = E{Y(w)}, i.e. m(w) is the mean length of a busy period
induced by an initial service time w. We shall refer to a busy period
initiated by a type j customer as a type j busy period. The mean length
of a type j busy period is

    m_j = E{Y_j} = ∫_0^∞ m(w) B_j(dw) .

In this section we show how m(w) (and consequently m_1,...,m_K) may be
computed for the model of Section 4.2. In Chapter 3 we showed how to
calculate the probabilities γ_1,...,γ_K, where γ_j is the stationary
probability that a busy period be of type j. If γ_1,...,γ_K are known,
then we can evaluate such quantities as the overall mean length of a
busy period,

    m = Σ_{j=1}^K γ_j m_j ,

the mean length of an idle period, and the stationary probability that
the server is free. As usual, we let b = Σ_j π_j b_j, a = Σ_j π_j λ_j^{-1}
be the overall mean service and inter-arrival times.

Procedure A: We know that

    m(w) = lim_{s→0+} [1 - φ(w,s)]/s ,

so a possible procedure for computing m(w) is to evaluate
[1 - φ(w,s_n)]/s_n for a sequence {s_n} of positive real numbers such
that s_n → 0+, and estimate m(w) as the apparent limit.
Procedure A requires the solution of equations (4.7) and (4.8) at each
point s_n, n = 1,2,...,N, where N is the number of values required to
approximate m(w). A less time-consuming and more exact procedure is
available if μ_j(s) and H_j(s) are differentiable to the right at s = 0
for each j, for then we have

    m(w) = - Σ_{j=1}^K {H_j'(0) - w μ_j'(0) H_j(0)} e^{-wμ_j(0)} .   (4.9)

For H_1(s),...,H_K(s) to be defined at s = 0, we need Ψ(0) to have K
distinct eigenvalues. We prove this fact in the following lemma.
LEMMA 4.10: Ψ(0) has K distinct eigenvalues.

PROOF: The (i,j)-element of Ψ(0) is ψ_{ij}(0) = λ_i(δ_{ij} - q_{ij}),
where q_{ij} (= ω_{ij}(0)) is the probability that a type i busy period
ends during arrival state j (see the proof of Lemma 4.6). Since
Σ_j q_{ij} = 1 for each i, the row sums of Ψ(0) are all zero. Therefore
Ψ(0) is singular and has 0 as one eigenvalue: write μ_1(0) = 0.

From Lemma 4.9 we may number the eigenvalues of Ψ(s) so that, for each
i ≥ 2, lim_{s→0+} μ_i(s) = μ_i(0) is positive and

    λ_{i-1} ≤ μ_i(0) ≤ λ_i    (i = 2,3,...,K) .

But we know that each μ_i(0) (i = 2,3,...,K) is a solution of the
equation

    Σ_{j=1}^K π_j λ_j B_j*(z) / (λ_j - z) = 1 ,

so it is impossible that μ_i(0) = λ_{i-1} or μ_i(0) = λ_i (these are
poles of the left-hand side). Thus

    λ_{i-1} < μ_i(0) < λ_i    (i = 2,3,...,K) ,

so that μ_1(0),...,μ_K(0) are distinct.

COROLLARY: H_1(0) = 1, H_j(0) = 0 for j = 2,3,...,K.

PROOF: Σ_{j=1}^K H_j(0) e^{-wμ_j(0)} = φ(w,0) = 1 for all w, and since
μ_1(0) = 0 we can write this as

    {H_1(0) - 1} e^{-wμ_1(0)} + Σ_{j=2}^K H_j(0) e^{-wμ_j(0)} = 0 .

The result now follows from Lemma A4, since μ_1(0),...,μ_K(0) are
distinct.

For H_1(s),...,H_K(s) to be differentiable to the right at s = 0, it is
sufficient that μ_1(s),...,μ_K(s) be differentiable to the right at
s = 0, this being clear from the definition of H_j(s). We shall prove
that each μ_j(s) is differentiable to the right at s = 0, but first we
need the following result:
LEMMA 4.11: The mean length of a type j busy period is finite, i.e.
m_j < ∞ for each j.

PROOF: Let U_n be the time between the nth and (n+1)th arrivals, and
V_n the service time of the nth customer to arrive, assuming that a
type j busy period began at time 0. Then U_0,U_1,U_2,... are iid random
variables with E{U_n} = a, and V_1,V_2,... are iid random variables
with E{V_n} = b. Let X_n = V_n - U_{n-1} (n = 1,2,...). Then
X_1,X_2,... are iid random variables with E{X_n} = b - a < 0.

Let M be the number of customers served during the busy period, and let
N = inf{k : V_0 + X_1 + ... + X_k < 0}. (N is finite a.s. since
E{X_n} < 0.) If the busy period is still in progress when the nth
customer arrives, then his total waiting time is V_0 + X_1 + ... + X_n
(see the proof of Lemma 3.2). Thus 0 < M ≤ N. We will show that
E{N} < ∞, so that E{M} < ∞.

Let S_k = V_0 + X_1 + ... + X_k (k = 1,2,...). Now
0 > S_N = (S_{N-1} + V_N) - U_{N-1} and S_{N-1} ≥ 0, so
S_{N-1} + V_N ≥ 0. We may show that S_N is distributed as a mixture of
exponential random variables, since it is the overshoot below zero of a
mixture of exponential random variables. Specifically, we recall that
q_{jk} is the probability that a type j busy period end during arrival
state k. Suppose the busy period ends during arrival state k. Then
U_{N-1} has an exponential distribution with parameter λ_k, and, by the
lack of memory of the exponential distribution,

    Pr{S_N < -x} = Σ_{k=1}^K q_{jk} e^{-λ_k x}    for any x > 0 .

But this implies that

    E{S_N} = - Σ_{k=1}^K q_{jk} λ_k^{-1} > -∞ .

Next we use a variant of Wald's equation to show that E{N} < ∞. We
apply Lemma A5 with Z_0 = V_0, Z_k = (U_{k-1}, V_k) and X_k = f(Z_k)
for k = 1,2,..., F_k being the σ-field generated by
V_0,U_0,V_1,U_1,...,U_{k-1},V_k; N is a stopping time with respect to
{F_k}. We have

    E{S_N} = E{V_0} + E{X_1} E{N} = b_j + (b - a) E{N} .

It follows that E{N} < ∞, and consequently E{M} < ∞.

Since U_0,U_1,U_2,... are iid and U_{k-1} is a component of
Z_k = (U_{k-1}, V_k), we could define M as
inf{k : (V_0-U_0) + (V_1-U_1) + ... + (V_{k-1}-U_{k-1}) < 0}, so that
[M ≤ k] ∈ F_k. Again we apply Lemma A5: since the busy period ends
before the (M+1)th arrival,

    m_j ≤ E{ Σ_{k=0}^M U_k } = a (E{M} + 1) < ∞ .
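The two ingredients of this proof, the memoryless overshoot below zero
and the Wald-type identity E{S_N} = E{V_0} + E{X_1}E{N}, can be checked
by simulating the random walk S_k in a single-type case. A minimal
stdlib-only Python sketch; V_n ~ exp(2) and U_n ~ exp(1) are
illustrative choices, so b = 1/2, a = 1, the overshoot -S_N is exp(1)
with mean 1, and Wald then gives E{N} = 3:

```python
import random

def one_walk(rng, rate_v=2.0, rate_u=1.0):
    """Run S_k = V0 + X1 + ... + Xk with Xk = Vk - U_{k-1} until the
    first k = N with S_N < 0; return (N, S_N)."""
    s = rng.expovariate(rate_v)          # V0
    n = 0
    while True:
        n += 1
        s += rng.expovariate(rate_v) - rng.expovariate(rate_u)  # X_n
        if s < 0:
            return n, s

rng = random.Random(7)
samples = [one_walk(rng) for _ in range(100_000)]
mean_n  = sum(n for n, _ in samples) / len(samples)
mean_sn = sum(s for _, s in samples) / len(samples)
# Wald: E{S_N} = E{V0} + (b - a) E{N} = 0.5 - 0.5 E{N};
# memoryless overshoot: E{S_N} = -1, hence E{N} = 3.
```

The two sample means reproduce both identities to within Monte Carlo
error.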
COROLLARY: For all j, k, ω_{jk}(s) is differentiable to the right at
s = 0.

PROOF: Since ω_{jk}(s) is a multiple of a Laplace transform of a
non-negative random variable,

    lim_{s→0+} [ω_{jk}(s) - ω_{jk}(0)]/s = ω_{jk}'(0)

is well-defined, either finite or infinite. Now Σ_{k=1}^K ω_{jk}(s) is
the Laplace transform of the length of a type j busy period, so

    - Σ_{k=1}^K ω_{jk}'(0) = E{Y_j} = m_j .

It follows from the Lemma that ω_{jk}'(0) is finite for all j, k.
We can now prove

LEMMA 4.12: Each μ_j(s) is differentiable to the right at s = 0.

PROOF: Suppose first that μ(s) is an eigenvalue of Ψ(s) such that
μ(0) = 0. Then μ(s) is defined implicitly by the equation

    Σ_{j=0}^K C_j(s){μ(s)}^j = 0    (Re(s) ≥ 0) ,

where C_j(s) is a polynomial in the elements of Ψ(s). Now C_0(0) = 0
and C_1(0) ≠ 0, since 0 is a non-repeated eigenvalue of Ψ(0).
Recalling that ψ_{ij}(s) = (λ_i + s)δ_{ij} - λ_i ω_{ij}(s), we see by
the above corollary that each C_j(s) is differentiable to the right at
s = 0. Now for all s > 0

    Σ_{j=0}^K { [C_j(s) - C_j(0)]/s · (μ(s))^j
              + C_j(0) (μ(s))^j / s } = 0 .

Since C_j(s) is differentiable to the right at s = 0, μ(s) is
continuous to the right at s = 0, μ(0) = 0 and C_0(0) = 0, it follows
that

    lim_{s→0+} Σ_{j=0}^K [C_j(s) - C_j(0)]/s · (μ(s))^j

exists, and

    Σ_{j=1}^K C_j(0)(μ(s))^j / s = μ(s)/s · Σ_{j=1}^K C_j(0)(μ(s))^{j-1} .

But lim_{s→0+} Σ_{j=1}^K C_j(0)(μ(s))^{j-1} = C_1(0) ≠ 0, so
lim_{s→0+} μ(s)/s exists, i.e. μ(s) is differentiable to the right at
s = 0 (since μ(0) = 0).

Suppose now that μ(s) is an eigenvalue of Ψ(s) such that μ(0) = c ≠ 0.
Let Ψ_c(s) = Ψ(s) - cI and η(s) = μ(s) - c. Then η(s) is an eigenvalue
of Ψ_c(s) with η(0) = 0, and, since Ψ(0) has K distinct eigenvalues, 0
is a non-repeated eigenvalue of Ψ_c(0). We may therefore proceed as
above to show that η(s) (and consequently μ(s)) is differentiable to
the right at s = 0.
Thus, we have shown by Lemmas 4.10 and 4.12 that (4.9) holds, and
by the Corollary to Lemma 4.10, we have
    m(w) = w μ_1'(0) - Σ_{j=1}^K H_j'(0) e^{-wμ_j(0)} .   (4.10)

We need to consider how to compute the quantities μ_j(0), μ_1'(0),
H_j'(0). We know already that μ_1(0) = 0; the non-zero eigenvalues of
Ψ(0) are positive roots of the equation Q(z,0) = 1. Write Q(z) for
Q(z,0). If this equation had more than K-1 positive roots, then we
would need to consider the limiting behaviour of the roots of the
equation Q(z,s) = 1 as s → 0+ in order to decide which positive roots
of Q(z) = 1 were the eigenvalues of Ψ(0). Our next result shows that
this contingency does not arise.
LEMMA 4.13: The equation Q(z) = 1 has exactly K-1 positive roots.

PROOF: From the proof of Lemma 4.10 we know that the equation Q(z) = 1
has at least one root in each of the intervals (λ_i, λ_{i+1}),
i = 1,2,...,K-1. We show that once Q(z) reaches 1 in any of these
intervals it is strictly increasing throughout the remainder of the
interval.

Let

    g_j(z) = λ_j B_j*(z) / (λ_j - z) .

Then g_j(z) is the (bilateral) Laplace transform of a probability
distribution, with region of convergence {z : 0 < Re(z) < λ_j},
although g_j(z) is actually defined for all z in {Re(z) > 0} such that
z ≠ λ_j. In particular, g_j(z) = E{e^{-z(V-U)}}, where V and U are
independent random variables such that V has d.f. B_j and U is
exponentially distributed with parameter λ_j. On the non-negative real
axis g_j(x) has the following properties:

    (P1) g_j(0) = 1, g_j(x) > 0 for all x ∈ (0, λ_j),
         g_j(λ_j - 0) = +∞;
    (P2) g_j'(0) = 1/λ_j - b_j;
    (P3) g_j''(x) ≥ 0 for all x ∈ (0, λ_j);
    (P4) g_j(x) < 0 and g_j'(x) > 0 for all x > λ_j.

(P1), (P2) and (P4) are clear since

    g_j'(x) = λ_j B_j*'(x)/(λ_j - x) + λ_j B_j*(x)/(λ_j - x)^2 .

(P3) follows from the equation

    g_j''(x) = ∫ u^2 e^{-ux} G_j(du) ,

where G_j is the d.f. of V - U [23; p. 240], since this integral is
finite and positive for all x ∈ (0, λ_j) (g_j(z) is analytic for
0 < Re(z) < λ_j).

We now consider the behaviour of Q(x) = Σ_j π_j g_j(x) for x ≥ 0.
Q(0) = 1, Q'(0) = Σ_j π_j(1/λ_j - b_j) = a - b > 0, and Q(x) is convex
for all x ∈ (0, λ_1) by (P3) (recall that λ_1 < ... < λ_K). Therefore
Q(x) > 1 for all x ∈ (0, λ_1). From (P4) it is clear that Q(x) < 0 for
all x > λ_K, so that the equation Q(z) = 1 has no roots in the
intervals (0, λ_1) or (λ_K, ∞).

For the interval (λ_i, λ_{i+1}), i ∈ {1,2,...,K-1}, we define

    f_1(x) = Σ_{j≤i} π_j g_j(x) ,   f_2(x) = Σ_{j>i} π_j g_j(x) ,

for x ≥ 0, x ≠ λ_1,...,λ_K. By (P4), f_1(x) is negative and strictly
increasing for x > λ_i. Now 0 < f_2(x) < ∞ for 0 < x < λ_{i+1}, and
f_2(λ_{i+1} - 0) = +∞; therefore we can define
x_i = inf{x : x ≥ λ_i and f_2(x) ≥ 1}. The equation Q(z) = 1 can have
no roots in (λ_i, x_i), since Q(x) = f_1(x) + f_2(x) < 1 for all x (if
any) in this interval. By (P3), f_2(x) is convex in (0, λ_{i+1});
therefore, since f_2(0) < 1 and f_2(x_i) ≥ 1, f_2(x) is strictly
increasing in the interval (x_i, λ_{i+1}). It follows that Q(x) is
strictly increasing for x ∈ (x_i, λ_{i+1}). Since Q(λ_i + 0) = -∞ and
Q(λ_{i+1} - 0) = +∞, it follows that Q(z) = 1 has exactly one root in
(λ_i, λ_{i+1}) for each i ∈ {1,2,...,K-1}. Thus there are exactly K-1
positive roots.

Thus μ_2(0),...,μ_K(0) are unambiguously determined if K-1 positive
roots are found for the equation Q(z) = 1. A simple iterative
technique, such as Newton's method, is appropriate here.

We may obtain μ_1'(0) by differentiating equation (4.8) at s = 0 and
setting i = 1. We obtain

    μ_1'(0) = a/(a - b) .

We may obtain H_1'(0),...,H_K'(0) as follows. If we differentiate
equation (4.7) with respect to s and set s = 0, then we obtain these K
linear equations:

    Σ_{k=1}^K H_k'(0) B_j*{μ_k(0)} / (λ_j - μ_k(0))
        = [1 + μ_1'(0)(λ_j b_j - 1)] / λ_j^2 ,   j = 1,...,K .   (4.11)
To summarize, we have the following procedure for computing m(w):

Procedure B:
1) Solve the equation Q(z) = 1 for μ_2(0),...,μ_K(0).
2) Compute μ_1'(0) = a/(a - b).
3) Solve equation (4.11) for H_1'(0),...,H_K'(0).

Then

    m(w) = w μ_1'(0) - Σ_{j=1}^K H_j'(0) e^{-wμ_j(0)} ,

and hence we can calculate for each i

    m_i = ∫_0^∞ m(w) B_i(dw)
        = μ_1'(0) b_i - Σ_{j=1}^K H_j'(0) B_i*{μ_j(0)} .   (4.12)
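Procedure B is mechanical enough to sketch end-to-end for the
exponential-service case. The following is a minimal stdlib-only Python
sketch, not the author's program: it assumes B_j*(z) = α_j/(α_j + z),
uses bisection in place of Newton's method, solves the system (4.11) as
reconstructed above by Gaussian elimination, and takes its parameters
from the example below.

```python
import math

PI, LAM, ALF = (0.25, 0.5, 0.25), (1.0, 2.0, 3.0), (2.0, 6.0, 1.0)
B_MEAN = [1.0 / a for a in ALF]                 # b_j for B_j exponential(alpha_j)
a = sum(p / l for p, l in zip(PI, LAM))         # mean inter-arrival time
b = sum(p * m for p, m in zip(PI, B_MEAN))      # mean service time

Bstar = lambda j, z: ALF[j] / (ALF[j] + z)      # B_j*(z)

def Q(z):                                       # Q(z) = Q(z, 0)
    return sum(PI[j] * LAM[j] * Bstar(j, z) / (LAM[j] - z) for j in range(3))

def bisect(f, lo, hi, iters=200):
    flo = f(lo)
    for _ in range(iters):
        mid = 0.5 * (lo + hi)
        if (f(mid) < 0) == (flo < 0):
            lo, flo = mid, f(mid)
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Step 1: mu_1(0) = 0; one root of Q(z) = 1 in each (lam_i, lam_{i+1})
# by Lemma 4.13.
mu = [0.0] + [bisect(lambda z: Q(z) - 1.0, LAM[i] + 1e-9, LAM[i+1] - 1e-9)
              for i in range(2)]
# Step 2:
mu1p = a / (a - b)                              # mu_1'(0) = a/(a - b)
# Step 3: solve (4.11) by Gaussian elimination with partial pivoting.
A = [[Bstar(j, mu[k]) / (LAM[j] - mu[k]) for k in range(3)] for j in range(3)]
rhs = [(1.0 + mu1p * (LAM[j] * B_MEAN[j] - 1.0)) / LAM[j]**2 for j in range(3)]
for col in range(3):
    piv = max(range(col, 3), key=lambda r: abs(A[r][col]))
    A[col], A[piv] = A[piv], A[col]
    rhs[col], rhs[piv] = rhs[piv], rhs[col]
    for r in range(col + 1, 3):
        f = A[r][col] / A[col][col]
        A[r] = [x - f * y for x, y in zip(A[r], A[col])]
        rhs[r] -= f * rhs[col]
h = [0.0, 0.0, 0.0]                             # H_j'(0)
for r in (2, 1, 0):
    h[r] = (rhs[r] - sum(A[r][c] * h[c] for c in range(r + 1, 3))) / A[r][r]

def m(w):                                       # (4.10)
    return w * mu1p - sum(h[j] * math.exp(-mu[j] * w) for j in range(3))
```

Two internal checks are worth running: Σ_j H_j'(0) = 0 (differentiate
Lemma 4.6 (i)), equivalently m(0+) = 0, and m(w)/w → μ_1'(0) as
w → ∞.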
If we know the stationary probabilities γ_1,...,γ_K, where γ_j is the
probability that a busy period be of type j, then we can calculate the
overall mean length of a busy period m = Σ_j γ_j m_j.

Between busy periods are time intervals during which the server is free
(or the queue empty); we call these idle periods. The length of an idle
period is the time to the next arrival after the end of a busy period.
If the busy period ended during arrival state j, then the length of the
idle period has an exponential distribution with parameter λ_j. The
stationary probability that a busy period end during arrival state j is
γ_j, since the next busy period must necessarily be of type j.
Therefore the mean length of an idle period is

    Σ_{j=1}^K γ_j λ_j^{-1} = a* ,

say. By an argument similar to that in [5; p. 37], we could derive the
intuitively obvious result that the stationary probability that the
server be free is a*/(m + a*).

REMARK: When K = 1, the present model reduces to the case of random
arrivals (or the M/G/1 queue). A well-known result for the case of
random arrivals is that the stationary probability of the server being
free is 1 - ρ, where ρ = b/a [5; §2.7]. This result is not true for
the present model when K > 1: by (3.11) we have

    Pr{server is free} = Σ_{j=1}^K τ_j = (a - b)/a* .

Thus Pr{server is free} = 1 - b/a iff a = a*, and by Lemma 3.5 this is
not true in general.
EXAMPLE: We solve for m(w), m_1,...,m_K in the case of exponential
service time d.f.s B_j(x) = 1 - e^{-α_j x}. We take K = 3, with

    π_1 = 1/4, π_2 = 1/2, π_3 = 1/4;   λ_1 = 1, λ_2 = 2, λ_3 = 3;
    α_1 = 2, α_2 = 6, α_3 = 1 .

Carrying out Procedure B, we find that

    μ_2(0) = 1.34(3),   μ_3(0) = 2.89(4) .

The mean overall service and inter-arrival times are b = 11/24,
a = 7/12, so μ_1'(0) = 14/3. Solving (4.11) we find that

    H_1'(0) = -0.73,   H_2'(0) = 0.22,   H_3'(0) = 0.51 .

Thus

    m(w) = 14w/3 + 0.73 - 0.22 e^{-1.34w} - 0.51 e^{-2.89w}

and m_1 = 2.73, m_2 = 0.98, m_3 = 5.17.

Solving equations (3.9), (3.10) we obtain the stationary probability
that the server be free as Σ_j τ_j = 0.19, and the stationary
probabilities γ_j are

    γ_1 = 0.36,   γ_2 = 0.47,   γ_3 = 0.18

(c.f. π_1 = 0.25, π_2 = 0.5, π_3 = 0.25). The mean length of a general
busy period is

    m = (0.36)(2.73) + (0.47)(0.98) + (0.18)(5.17) = 2.37 .
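The figures above can be cross-checked by simulating busy periods
directly, reading the dynamics off the integral equation (4.5): start
with workload w; repeatedly draw an arrival state j with probabilities
π_j and an exp(λ_j) inter-arrival time; if it falls short of the
remaining workload, add an exp(α_j) service (the backward dependence)
and continue. A minimal stdlib-only Python sketch:

```python
import random

PI, LAM, ALF = (0.25, 0.5, 0.25), (1.0, 2.0, 3.0), (2.0, 6.0, 1.0)

def busy_period(w, rng):
    """Length Y(w) of a busy period induced by initial work w."""
    total = w
    while True:
        j = rng.choices((0, 1, 2), weights=PI)[0]   # arrival state
        u = rng.expovariate(LAM[j])                 # inter-arrival time
        if u >= w:                                  # server idles first
            return total
        v = rng.expovariate(ALF[j])                 # service of type-j customer
        w = w - u + v                               # remaining workload
        total += v

rng = random.Random(11)
n = 60_000
est = sum(busy_period(1.0, rng) for _ in range(n)) / n
```

With these parameters the sample mean of Y(1) should agree with m(1)
computed from (4.10) to within a few percent.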
4.4  The backward transition model with λ_0 = ∞

The method of Section 4.2 may be applied to the (more general) backward
transition model with λ_0 = ∞. We obtain analogues of Theorems 4.1 and
4.2. However, the integral equation corresponding to (4.5) involves
components of the busy period transform Φ(w,s) rather than φ(w,s)
directly. As a result we are led to much more complicated equations.

We recall that arrivals for this model correspond to transitions into
state 0 of an homogeneous Markov chain (Z_t) with state space
{0,1,...,K}. The time spent in state i between transitions has an
exponential distribution with parameter λ_i; at a transition the next
state is chosen from {0,1,...,K} according to the probabilities
{p_ij}. We suppose p_00 = 0 (0 is an instantaneous state since
λ_0 = ∞). We may describe this model in terms of infinitesimal
probabilities as follows. Given Z_t = i ≠ 0, τ > 0:

    Pr{type i arrival in (t, t+τ)} = λ_i p_i0 τ + o(τ)
    Pr{transition to state j ≠ 0 in (t, t+τ)}
        = λ_i (p_ij + p_i0 p_0j) τ + o(τ)
    Pr{no transition in (t, t+τ)} = 1 - λ_i τ + o(τ) .

At each arrival epoch the next state of (Z_t) is selected from
{1,2,...,K} according to the probabilities {p_0j : j = 1,2,...,K}.
Thus we see that this model reduces to the backward model with K
customer types and exponential arrivals if p_i0 = 1 for each i.
As in Section 4.2 we suppose that a customer with service time w has
arrived at time 0 to find the server free. Let (W_t), Y(w) and Φ(w,s)
be defined as before; (Z_t) is now the guiding chain described above.
We take (Z_t) to be right-continuous, and define φ_ij(w,s) as before.
Once again ((W_t, Z_t), t ≥ 0) is an homogeneous Markov process, so we
may prove analogues of (4.1) and (4.2), with π_j replaced by p_0j. Let
ω_ij(s) be as before, and Ψ(s) the K × K matrix with (i,j)-element

    ψ_ij(s) = (λ_i + s)δ_ij - λ_i p_ij - λ_i p_i0 ω_ij(s) .

We may show, as in the proof of Theorem 4.1, that

    φ_ij(h,s) = p_0i {(1 - (λ_i + s)h)δ_ij + λ_i p_ij h
                + λ_i p_i0 ω_ij(s) h} + o(h)

as h → 0+. Therefore we have the following analogue of Theorem 4.1:

PROPOSITION: Φ(w,s) satisfies the matrix differential equation

    ∂/∂w Φ(w,s) = - Φ(w,s) Ψ(s)

for all w > 0 and all s in {Re(s) ≥ 0}.

We then have the following analogue of Theorem 4.2:

PROPOSITION: For all w ≥ 0 and all s in {Re(s) ≥ 0} such that Ψ(s) has
K distinct eigenvalues, Φ(w,s) is of the form

    Φ(w,s) = Σ_{j=1}^K H_j(s) e^{-wμ_j(s)} .
Beyond this point the analogies with Section 4.2 begin to break down.
For example, the proof of Lemma 4.5 does not work for the present
model. More importantly, when we attempt to solve for H_j(s), μ_j(s) we
find that the integral equation corresponding to (4.5) is

    φ_j·(w,s) = p_0j { e^{-(s+λ_j)w}
      + λ_j p_j0 ∫_0^w e^{-(λ_j+s)z} ∫_0^∞ Σ_{i=1}^K φ_i·(w-z+v,s) B_j(dv) dz
      + λ_j Σ_{k=1}^K p_jk ∫_0^w e^{-(λ_j+s)z} φ_k·(w-z,s)/p_0k dz } ,

where φ_j·(w,s) = Σ_{k=1}^K φ_jk(w,s) is the jth row-sum of the matrix
Φ(w,s). Since this involves the components φ_j·(w,s) of Φ(w,s), we
cannot obtain equations for H_j(s), μ_j(s) as before. Now, as in the
derivation of Theorem 4.2, we see that, if Ψ(s) has K distinct
eigenvalues,

    φ_j·(w,s) = Σ_{k=1}^K H_jk(s) e^{-wμ_k(s)} ,

where H_jk(s) is the jth row-sum of the matrix R_k(Ψ(s),s). We may
obtain equations for μ_j(s), H_jk(s) by substituting this expression
into the integral equation above. However, the equations obtained are
much more complicated than (4.7), (4.8), and do not seem amenable to
further treatment.
A P P E N D I X

If f(x) is a real or complex function of a real variable, then we
define the Laplace transform of f to be

    f°(s) = ∫_{-∞}^∞ e^{-sx} f(x) dx

wherever this integral converges for s complex. If F is a
non-decreasing function of bounded variation on the real line, then we
define the Laplace-Stieltjes transform of F to be

    F*(s) = ∫_{-∞}^∞ e^{-sx} F(dx) .

If F(x) = 0 for all x < 0, then

    F*(s) = ∫_{0-}^∞ e^{-sx} F(dx) = s f°(s)

(from integration by parts). If F is the d.f. of a non-negative random
variable, then F*(s) converges for all s with Re(s) ≥ 0. Also F*(s) is
analytic in {Re(s) > 0} and continuous to the right at the imaginary
axis. We have

LEMMA A1: If F is the d.f. of a non-negative random variable, then
F*(s) cannot vanish on the positive real axis.

PROOF: Suppose F*(σ_0) = 0 for some σ_0 > 0. This implies F*(s) = 0
for all s with Re(s) > σ_0, since

    |F*(s)| ≤ ∫_0^∞ |e^{-sx}| F(dx) ≤ ∫_0^∞ e^{-σ_0 x} F(dx) = 0 .

But this is impossible, since F*(s) has isolated zeros throughout its
region of analyticity (F*(0) = 1 implies F*(s) is not identically
zero).
LEMMA A2: If H(x) = K(x) = 0 for all x < 0, where H is a
non-decreasing function of bounded variation, and

    G(x) = ∫_0^x K(x-u) H(du)   (x ≥ 0),   G(x) = 0   (x < 0) ,

then G°(s) = K°(s) H*(s). If K is also a non-decreasing function of
bounded variation, then G*(s) = K*(s) H*(s).

PROOF: If we put x - u = v, u = z, then we have

    G°(s) = ∫_0^∞ e^{-sx} ∫_0^x K(x-u) H(du) dx
          = ∫_0^∞ ∫_0^∞ e^{-s(v+z)} K(v) dv H(dz)
          = K°(s) H*(s) .

The second part follows since G is of bounded variation if H and K are
[23; §11], and G*(s) = s G°(s).

LEMMA A3: Suppose f is a continuous complex-valued function on a
compact set E of complex numbers such that |f(z)| < c for all z ∈ F, a
closed subset of E. For z ∈ E, let ρ(z,F) = inf{|z-y| : y ∈ F}. Then
there exists δ > 0 such that ρ(z,F) < δ ⇒ |f(z)| < c.

PROOF: f is continuous on F and therefore attains its supremum in F.
Thus there exists ε > 0 such that |f(x)| ≤ c - ε for all x ∈ F. By
Theorem F in [19], f is uniformly continuous on E, so there exists
δ > 0 such that for any x, y ∈ E,

    |x - y| < δ  ⇒  |f(x) - f(y)| < ε .

Suppose ρ(z,F) < δ. Then |z - y| < δ for some y ∈ F, and hence

    |f(z)| ≤ |f(z) - f(y)| + |f(y)| < ε + c - ε = c .
LEMMA A4: If y_1, y_2,...,y_n are distinct complex numbers with

    Σ_{i=1}^n c_i e^{t y_i} = 0   for t = 1,2,...,n ,

then c_i = 0 for all i.

PROOF: If c_j ≠ 0 for some j, then

    | e^{y_1}    e^{y_2}   ...  e^{y_n}  |
    | e^{2y_1}   e^{2y_2}  ...  e^{2y_n} |
    |    .          .               .    |  =  0 .
    | e^{ny_1}   e^{ny_2}  ...  e^{ny_n} |

Therefore some non-zero linear combination of the rows of this
determinant is zero, i.e. there exist d_1,...,d_n, not all zero, such
that Σ_{t=1}^n d_t e^{t y_i} = 0 for each i = 1,2,...,n. Hence e^{y_i}
is a solution of the equation

    Σ_{t=1}^n d_t x^t = 0

for each i = 1,2,...,n. But this equation has at most n distinct
solutions, one of them being x = 0, so y_1,...,y_n cannot be distinct.

LEMMA A5: Suppose Z_0, Z_1, Z_2,... are independent random vectors
with Z_1, Z_2,... identically distributed. Let F_k be the σ-field
generated by Z_0, Z_1,...,Z_k (k ≥ 1), and suppose that N is a stopping
time with respect to {F_n}, i.e. N is a random variable taking positive
integral values such that [N ≤ k] ∈ F_k for each k ≥ 1. Suppose {X_n}
is a sequence of random variables such that X_k = f(Z_k) for some
function f. Let S_n = X_1 + ... + X_n (n ≥ 1). Then

    E{S_N} = E{X_1} E{N}

whenever the right-hand side is well-defined (possibly infinite).

PROOF: This result may be proved using the method of [18], since
Z_1,...,Z_n of remark (e) in [18] may be random vectors and the
presence of Z_0 does not affect the argument.
B I B L I O G R A P H Y

[1]  K. L. Chung (1968). A Course in Probability Theory. Harcourt,
     Brace and World.

[2]  B. W. Conolly (1968). The waiting time process for a certain
     correlated queue. Operations Research, 16, 1006-1015.

[3]  B. W. Conolly and N. Hadidi (1969). A correlated queue. Journal
     of Applied Probability, 6, 122-136.

[4]  D. R. Cox (1955). A use of complex probabilities in the theory of
     stochastic processes. Proceedings of the Cambridge Philosophical
     Society, 51, 313-319.

[5]  D. R. Cox and W. L. Smith (1961). Queues. Methuen.

[6]  E. J. B. Goursat (1916). Mathematical Analysis, 2, Part I
     (translated by E. R. Hedrick and O. Dunkel). Ginn and Co.

[7]  C. M. Harris (1967). Queues with state-dependent stochastic
     service rates. Operations Research, 15, 117-130.

[8]  F. I. John (1963). Single server queues with dependent service
     and inter-arrival times. Journal of the Society for Industrial
     and Applied Mathematics, 11, 526-533.

[9]  S. Karlin (1966). A First Course in Stochastic Processes.
     Academic Press.

[10] P. Lancaster (1969). Theory of Matrices. Academic Press.

[11] D. V. Lindley (1952). The theory of queues with a single server.
     Proceedings of the Cambridge Philosophical Society, 48, 277-289.

[12] R. M. Loynes (1962). The stability of a queue with
     non-independent inter-arrival and service times. Proceedings of
     the Cambridge Philosophical Society, 58, 497-520.

[13] R. M. Loynes (1962). Stationary waiting-time distributions for
     single-server queues. Annals of Mathematical Statistics, 33,
     1323-1339.

[14] R. M. Loynes (1962). A continuous-time treatment of certain
     queues and infinite dams. Journal of the Australian Mathematical
     Society, 2, 484-497.

[15] P.-A. Meyer (1966). Probability and Potentials. Blaisdell.

[16] M. F. Neuts (1971). A queue subject to extraneous phase changes.
     Advances in Applied Probability, 3, 78-119.

[17] A. Rényi (1970). Probability Theory. North-Holland.

[18] H. Robbins and E. Samuel (1966). An extension of a lemma of Wald.
     Journal of Applied Probability, 3, 272-273.

[19] G. F. Simmons (1963). Introduction to Topology and Modern
     Analysis. McGraw-Hill.

[20] W. L. Smith (1953). On the distribution of queueing times.
     Proceedings of the Cambridge Philosophical Society, 49, 449-461.

[21] W. L. Smith (1972). Lecture notes for Statistics 280 (University
     of North Carolina, unpublished).

[22] E. C. Titchmarsh (1932). The Theory of Functions. Oxford.

[23] D. V. Widder (1963). The Laplace Transform. Princeton.