Time Spent below a Random Threshold by a Poisson Driven Sequence of Observations
Author(s): S. N. U. A. Kirmani and Jacek Wesołowski
Source: Journal of Applied Probability, Vol. 40, No. 3 (Sep., 2003), pp. 807-814
Published by: Applied Probability Trust
Stable URL: http://www.jstor.org/stable/3215955
J. Appl. Prob. 40, 807-814 (2003)
Printed in Israel
© Applied Probability Trust 2003
TIME SPENT BELOW A RANDOM THRESHOLD BY
A POISSON DRIVEN SEQUENCE OF OBSERVATIONS
S. N. U. A. KIRMANI,* University of Northern Iowa
JACEK WESOŁOWSKI,** Politechnika Warszawska
Abstract
The mean and the variance of the time S(t) spent by a system below a random threshold until t are obtained when the system level is modelled by the current value of a sequence of independent and identically distributed random variables appearing at the epochs of a nonhomogeneous Poisson process. In the case of the homogeneous Poisson process, the asymptotic distribution of S(t)/t as t → ∞ is derived.

Keywords: Poisson driven sequence of observations; random threshold; exceedance statistics; limit theorems; nonhomogeneous Poisson process; order statistics

AMS 2000 Subject Classification: Primary 60G55; 60G70; 60K10
Secondary 60K37; 62E15; 62E20
1. Introduction
If shocks occurring in time affect the level of an economic, financial, environmental, biological or engineering system, then the proportion of time spent by the system below a threshold is frequently of interest. If $Y_i$ denotes the system level during the period between the $(i-1)$th and $i$th shocks and if $N(t)$ counts the number of shocks during $[0, t]$ for $t > 0$, then the process
\[
Z_t = \sum_{n=1}^{\infty} Y_n 1(N(t) = n - 1), \qquad t > 0,
\]
keeps track of the level of the system. Given any $t > 0$, we are interested in the proportion of time during $[0, t]$ when the process $Z = (Z_t)_{t>0}$ is below the system's threshold. However, there are situations in which there is no practical way to identify the system's threshold with certainty. Therefore, the threshold will be considered a random variable $X$ and interest will center on $S(t)/t$, where $S(t)$ denotes the total time during $[0, t]$ that the process $Z$ falls below $X$. Assuming that the process $N = (N(t))_{t>0}$ is a nonhomogeneous Poisson process with $N$, $Y = (Y_i)_{i \ge 1}$, and $X$ independent, we obtain the mean and variance of $S(t)/t$ in Section 2. The choice of the nonhomogeneous Poisson process to model the time epochs of shocks was first proposed by Esary et al. [4]. We also prove, in Section 3, that if the shock process $N$ is homogeneous Poisson, then $S(t)/t$ converges in distribution to $G(X)$, where $G$ denotes the common distribution function of the $Y_i$.

A related but easier scheme of exceedance was proposed by Wesolowski and Ahsanullah [8]. They investigated the exact and asymptotic distributions of three statistics connected with exceeding an independent random threshold in a sequence of independent and identically distributed (i.i.d.) observations. In particular, they considered a discrete analogue of our $S(t)$. Some additional distributional properties related to the exceedance scheme of [8] have recently been studied by Bairamov and Eryilmaz [1], Bairamov and Kotz [2] and Eryilmaz [3].
Received 22 November 2000; revision received 4 April 2003.
* Postal address: Department of Mathematics, University of Northern Iowa, Cedar Falls, IA 50614-0506, USA.
** Postal address: Wydział Matematyki i Nauk Informacyjnych, Politechnika Warszawska, Warszawa 00-661, Poland.
Email address: [email protected]
2. Mean and variance
Assuming that $Y = (Y_i)_{i=1,2,\ldots}$ is a sequence of i.i.d. observations with common distribution function $G$, one of the statistics considered in [8] was the number $S^*(n)$ of observations in a sample of size $n$ falling below the random level $X$, where $X$ is independent of $Y$. It was proved there that the conditional distribution of $S^*(n)$ given $X$ is binomial with parameters $n$ and $G(X)$. Consequently,
\[
E(S^*(n)) = n E(G(X))
\]
and
\[
\operatorname{var}(S^*(n)) = n E(G(X)\bar G(X)) + n^2 \operatorname{var}(G(X)),
\]
where $\bar G = 1 - G$.
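As a quick numerical illustration (ours, not from [8]), the following Python sketch simulates $S^*(n)$ and compares its empirical mean and variance with the two formulae above. The distributional choices are assumptions made here for concreteness: the $Y_i$ and $X$ are Exp(1), so $G(X) = 1 - e^{-X}$ is Uniform(0, 1).

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_S_star(n, n_rep=200_000):
    """S*(n): number of the n observations falling below the threshold X,
    replicated n_rep times. Y_i ~ Exp(1), X ~ Exp(1) (our assumption)."""
    X = rng.exponential(1.0, size=n_rep)           # random thresholds
    Y = rng.exponential(1.0, size=(n_rep, n))      # observations
    return (Y < X[:, None]).sum(axis=1)

n = 10
S = simulate_S_star(n)

# With Exp(1) choices, G(X) is Uniform(0, 1), hence
# E G(X) = 1/2, var G(X) = 1/12 and E[G(X)(1 - G(X))] = 1/6.
mean_theory = n * 0.5                        # n E(G(X))
var_theory = n * (1 / 6) + n**2 * (1 / 12)   # n E(G Gbar) + n^2 var(G)

print(S.mean(), mean_theory)
print(S.var(), var_theory)
```

Note the $n^2$ term: the shared threshold $X$ makes the indicators positively dependent, so the variance grows quadratically in $n$ rather than linearly.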
Here, we study similar characteristics in the case when observations arrive at the epochs of a nonhomogeneous Poisson process $N = (N(t))_{t \ge 0}$ which is independent of $(Y, X)$. The main object of our interest is the process $Z = (Z(t))_{t \ge 0}$ which keeps track of the current $Y_i$ as follows:
\[
Z(t) = \sum_{n=1}^{\infty} Y_n 1(N(t) = n - 1), \qquad t > 0.
\]
Denote by $S(t)$ the time spent by the process $Z$ below the level $X$ up to the time $t$, and by $W_i$, $i = 0, 1, \ldots$, the interarrival times of the process $N$, that is, $W_i = V_{i+1} - V_i$, where $V_i = \inf\{t > 0 : N(t) = i\}$ is the $i$th epoch of $N$, $i = 0, 1, 2, \ldots$. Then $S(t)$ has the form
\[
S(t) = \Bigl( \sum_{i=0}^{N(t)-1} W_i 1(Y_{i+1} < X) + \Bigl( t - \sum_{i=0}^{N(t)-1} W_i \Bigr) 1(Y_{N(t)+1} < X) \Bigr) 1(N(t) \ge 1) + t\, 1(Y_{N(t)+1} < X)\, 1(N(t) = 0)
\]
\[
= \Bigl( \sum_{i=0}^{N(t)-1} W_i \bigl[ 1(Y_{i+1} < X) - 1(Y_{N(t)+1} < X) \bigr] \Bigr) 1(N(t) \ge 1) + t\, 1(Y_{N(t)+1} < X). \tag{1}
\]
Though we are unable to derive the exact distribution of $S(t)$, the first two moments are computable and similar in form to the corresponding results in [8].
Proposition 1. In the model defined above, for any $t > 0$,
\[
E(S(t)) = t E(G(X)), \tag{2}
\]
\[
\operatorname{var}(S(t)) = \chi(t) E(G(X)\bar G(X)) + t^2 \operatorname{var}(G(X)), \tag{3}
\]
where
\[
\chi(t) = 2 \iint_{0<x<y<t} P(N(y) = N(x)) \, dx \, dy \le t^2.
\]
Before proving the above proposition, we present a result on order statistics which will be used later on in the proof.
Lemma 1. Let $X_{1:n}, \ldots, X_{n:n}$ be order statistics from an i.i.d. sample with distribution function $F$ having support $[0, a]$. Then
\[
E\Bigl( X_{1:n}^2 + \sum_{i=2}^{n} (X_{i:n} - X_{i-1:n})^2 + (a - X_{n:n})^2 \Bigr) = 2 \iint_{0<x<y<a} (F(x) + \bar F(y))^n \, dx \, dy. \tag{4}
\]
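Lemma 1 can be sanity-checked numerically. The sketch below (our construction, not from the paper) compares a Monte Carlo estimate of the left-hand side of (4) with a midpoint-rule evaluation of the right-hand side for $F$ the Uniform(0, 1) distribution, for which direct integration reduces both sides to $2/(n+2)$.

```python
import numpy as np

rng = np.random.default_rng(2)

def lhs_mc(n, n_rep=200_000):
    """Monte Carlo estimate of E[X_{1:n}^2 + sum of squared spacings
    + (1 - X_{n:n})^2] for an i.i.d. Uniform(0, 1) sample of size n."""
    X = np.sort(rng.uniform(0.0, 1.0, size=(n_rep, n)), axis=1)
    padded = np.hstack([np.zeros((n_rep, 1)), X, np.ones((n_rep, 1))])
    return (np.diff(padded, axis=1) ** 2).sum(axis=1).mean()

def rhs_quad(n, m=400):
    """Midpoint-rule value of 2 * iint_{0<x<y<1} (F(x) + Fbar(y))^n dx dy
    for F(u) = u, i.e. integrand (x + 1 - y)^n on the triangle x < y."""
    g = (np.arange(m) + 0.5) / m
    x, y = np.meshgrid(g, g, indexing="ij")
    return 2 * np.where(x < y, (x + 1 - y) ** n, 0.0).sum() / m**2

# For Uniform(0, 1) both sides of (4) equal 2/(n + 2).
for n in (1, 2, 5):
    print(n, lhs_mc(n), rhs_quad(n), 2 / (n + 2))
```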
Proof. Observe first that, for any square integrable random variable $X$ with distribution function $F$ having support $[0, a]$,
\[
E(X^2) = 2 \int_0^a y \bar F(y) \, dy = 2 \iint_{0<x<y<a} \bar F(y) \, dx \, dy.
\]
Consequently,
\[
E((a - X)^2) = 2 \iint_{0<x<y<a} F(a - y) \, dx \, dy = 2 \iint_{0<x<y<a} F(x) \, dx \, dy.
\]
We proceed by induction with respect to $n$. For $n = 1$, the result has just been proved. Denote the left-hand side of (4) by $L_n$. Then
\[
L_n = E\Bigl( X_{1:n}^2 + \sum_{i=2}^{n} (X_{i:n} - X_{i-1:n})^2 + (a - X_{n:n})^2 \Bigr).
\]
It is known (see, for instance, [6, Chapter 4]) that the conditional distribution of $(X_{1:n}, \ldots, X_{n-1:n})$ given $X_{n:n} = x$ is the same as the joint distribution of order statistics from an i.i.d. sample of size $n - 1$ based on the distribution function
\[
G^x(u) = \begin{cases} F(u)/F(x) & \text{for } u < x, \\ 1 & \text{for } u \ge x. \end{cases}
\]
Then, by the induction assumption, it follows that
\[
L_n = E\Bigl( 2 \iint_{0<x<y<X_{n:n}} [G^{X_{n:n}}(x) + \bar G^{X_{n:n}}(y)]^{n-1} \, dx \, dy + (a - X_{n:n})^2 \Bigr)
\]
\[
= 2 \iint_{0<x<y<a} \Bigl( \int_y^a [G^u(x) + \bar G^u(y)]^{n-1} \, dF_{n:n}(u) + F_{n:n}(x) \Bigr) dx \, dy
\]
\[
= 2 \iint_{0<x<y<a} \Bigl( n \int_{F(y)}^{1} [t + F(x) - F(y)]^{n-1} \, dt + F^n(x) \Bigr) dx \, dy,
\]
which immediately implies the result.
Proof of Proposition 1. By the independence properties, we have
\[
E(S(t)) = \sum_{n=1}^{\infty} \sum_{i=0}^{n-1} E\bigl( W_i [1(Y_{i+1} < X) - 1(Y_{n+1} < X)] \bigm| N(t) = n \bigr) P(N(t) = n) + t \sum_{n=0}^{\infty} E(1(Y_{n+1} < X) \mid N(t) = n) P(N(t) = n)
\]
\[
= \sum_{n=1}^{\infty} \sum_{i=0}^{n-1} E(W_i \mid N(t) = n) \bigl[ E(1(Y_{i+1} < X)) - E(1(Y_{n+1} < X)) \bigr] P(N(t) = n) + t E(G(X)).
\]
The formula (2) follows since $E(1(Y_{i+1} < X)) - E(1(Y_{n+1} < X)) = E(G(X)) - E(G(X)) = 0$.

In order to find the variance of $S(t)$, we first compute its second conditional moment given $N(t) = n \ge 1$:
\[
E(S^2(t) \mid N(t) = n) = \sum_{i=0}^{n-1} E(W_i^2 \mid N(t) = n) E((I_{i+1} - I_{n+1})^2) + \sum_{i \ne j} E(W_i W_j \mid N(t) = n) E((I_{i+1} - I_{n+1})(I_{j+1} - I_{n+1}))
\]
\[
+ 2t \sum_{i=0}^{n-1} E(W_i \mid N(t) = n) E(I_{n+1}(I_{i+1} - I_{n+1})) + t^2 E(I_{n+1}),
\]
where $I_j = 1(Y_j < X)$ for $j = 1, 2, \ldots$. Observe that
\[
E((I_{i+1} - I_{n+1})^2) = 2 E(G(X)\bar G(X)), \qquad 0 \le i < n,
\]
\[
E((I_{i+1} - I_{n+1})(I_{j+1} - I_{n+1})) = E(G(X)\bar G(X)), \qquad 0 \le i, j < n, \; i \ne j,
\]
\[
E(I_{n+1}(I_{i+1} - I_{n+1})) = -E(G(X)\bar G(X)), \qquad 0 \le i < n.
\]
Consequently,
\[
E(S^2(t) \mid N(t) = n) = E(G(X)\bar G(X)) E\Bigl( (t - V_n)^2 + \sum_{i=0}^{n-1} W_i^2 \Bigm| N(t) = n \Bigr) + t^2 E(G^2(X))
\]
for $n \ge 1$. Recalling that $V_0 = 0$ almost surely, we find that the above formula also holds for $n = 0$ since, by direct computation, $E(S^2(t) \mid N(t) = 0) = t^2 E(G(X))$, which agrees with $t^2 [E(G(X)\bar G(X)) + E(G^2(X))] = t^2 E(G(X))$.

Hence, for any $t > 0$,
\[
\operatorname{var}(S(t)) = E(G(X)\bar G(X)) E\Bigl( (t - V_{N(t)})^2 + \sum_{i=0}^{N(t)-1} W_i^2 \Bigr) + t^2 \operatorname{var}(G(X))
\]
\[
= E(G(X)\bar G(X)) E\Bigl( V_1^2 + \sum_{i=2}^{N(t)} (V_i - V_{i-1})^2 + (t - V_{N(t)})^2 \Bigr) + t^2 \operatorname{var}(G(X)).
\]
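Proposition 1 can be checked by simulation in the homogeneous case. In this sketch (our illustration, not from the paper; $Y_i$ and $X$ are taken Exp(1), and $N$ is a rate-1 Poisson process), $\chi(t)$ is evaluated by numerical integration and the empirical mean and variance of $S(t)$ are compared with (2) and (3).

```python
import numpy as np

rng = np.random.default_rng(3)

def sample_S(t, lam=1.0):
    """One realization of S(t); Y_i ~ Exp(1), X ~ Exp(1), rate-lam
    homogeneous Poisson arrivals (our assumptions)."""
    X = rng.exponential(1.0)
    arrivals = []
    v = rng.exponential(1.0 / lam)
    while v < t:
        arrivals.append(v)
        v += rng.exponential(1.0 / lam)
    edges = np.array([0.0] + arrivals + [t])
    Y = rng.exponential(1.0, size=len(edges) - 1)
    return np.diff(edges)[Y < X].sum()

def chi(t, lam=1.0, m=500):
    """chi(t) = 2 iint_{0<x<y<t} exp(-lam (y - x)) dx dy, midpoint rule."""
    g = (np.arange(m) + 0.5) * t / m
    x, y = np.meshgrid(g, g, indexing="ij")
    return 2 * np.where(x < y, np.exp(-lam * (y - x)), 0.0).sum() * (t / m) ** 2

t = 5.0
S = np.array([sample_S(t) for _ in range(40_000)])
# Exp(1) choices give E G = 1/2, var G = 1/12, E(G Gbar) = 1/6.
var_theory = chi(t) * (1 / 6) + t**2 * (1 / 12)
print(S.mean(), t * 0.5)     # formula (2)
print(S.var(), var_theory)   # formula (3)
```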
Now, to get (3) it suffices to compute $\chi(t)$. In order to do that, we recall (see, for instance, [7, Theorem 12.2.1]) that the conditional distribution of the random vector $(V_1, \ldots, V_{N(t)})$ given $N(t) = n$ is equal to the joint distribution of the order statistics of a random sample of size $n$ from the distribution
\[
H(x) = \begin{cases} 0, & x < 0, \\ \Lambda(x)/\Lambda(t), & 0 \le x < t, \\ 1, & t \le x, \end{cases}
\]
where $\Lambda$ is the mean value function of the Poisson process. Now, applying Lemma 1 and changing the order of integration, we obtain
\[
E\Bigl( V_1^2 + \sum_{i=2}^{N(t)} (V_i - V_{i-1})^2 + (t - V_{N(t)})^2 \Bigr) = 2 \iint_{0<x<y<t} E\bigl( [H(x) + \bar H(y)]^{N(t)} \bigr) \, dx \, dy
\]
\[
= 2 \iint_{0<x<y<t} \exp\bigl( \Lambda(t) [H(x) + \bar H(y) - 1] \bigr) \, dx \, dy = 2 \iint_{0<x<y<t} \exp\bigl( -[\Lambda(y) - \Lambda(x)] \bigr) \, dx \, dy,
\]
which is $\chi(t)$, since $P(N(y) = N(x)) = \exp(-[\Lambda(y) - \Lambda(x)])$ for $x < y$.
Remark 1. Observe that $E(S(t))$ does not depend on the parameters of the driving process $N$. On the other hand, $\operatorname{var}(S(t))$ depends on the mean value function $\Lambda$ of the Poisson process and increases in $t$ at most as a quadratic function.
Remark 2. If $N$ is a homogeneous Poisson process with mean value function $\Lambda(t) = \lambda t$, where $\lambda$ is a positive constant, then
\[
\chi(t) = 2 \int_0^t \int_0^{t-y} e^{-\lambda w} \, dw \, dy = \frac{2}{\lambda^2} \bigl( e^{-\lambda t} - 1 + \lambda t \bigr).
\]
In case $N$ is a nonhomogeneous Poisson process with $\Lambda(t) = \log(1 + t)$,
\[
\chi(t) = 2 \int_0^t \int_0^{t-y} \frac{1 + y}{1 + y + w} \, dw \, dy = t + \frac{t^2}{2} - \log(1 + t).
\]
Note that $\chi(t)/t^2 \to 0$ as $t \to \infty$ when $N$ is homogeneous, but not when $\Lambda(t) = \log(1 + t)$.
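Both closed forms in Remark 2 are easy to confirm by direct numerical integration of $\chi(t) = 2 \iint_{0<x<y<t} \exp(-[\Lambda(y) - \Lambda(x)]) \, dx \, dy$; in the sketch below (our check, not from the paper) the values of $t$ and $\lambda$ are arbitrary choices.

```python
import numpy as np
from math import exp, log

def chi_numeric(t, Lambda, m=2000):
    """chi(t) = 2 iint_{0<x<y<t} exp(-(Lambda(y) - Lambda(x))) dx dy,
    evaluated by the midpoint rule on an m x m grid."""
    g = (np.arange(m) + 0.5) * t / m
    x, y = np.meshgrid(g, g, indexing="ij")
    vals = np.where(x < y, np.exp(-(Lambda(y) - Lambda(x))), 0.0)
    return 2 * vals.sum() * (t / m) ** 2

t, lam = 3.0, 2.0

# Homogeneous case, Lambda(u) = lam * u.
homog = chi_numeric(t, lambda u: lam * u)
homog_closed = (2 / lam**2) * (exp(-lam * t) - 1 + lam * t)

# Nonhomogeneous case, Lambda(u) = log(1 + u).
nonhomog = chi_numeric(t, lambda u: np.log(1 + u))
nonhomog_closed = t + t**2 / 2 - log(1 + t)

print(homog, homog_closed)
print(nonhomog, nonhomog_closed)
```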
3. Asymptotic distribution
It was shown in [8] that $S^*(n)/n$ converges in distribution to $G(X)$ as $n \to \infty$. Here, the formulae for $E(S(t))$ and $\operatorname{var}(S(t))$ suggest that a similar result cannot be true unless $\chi(t)/t^2$ converges to 0 as $t \to \infty$. At the same time, it is reasonable to conjecture that $S(t)/t \xrightarrow{d} G(X)$ if $\chi(t)/t^2 \to 0$. We are not able to answer this question in full generality; however, in the case of the homogeneous Poisson process, the answer is affirmative.

Proposition 2. If $N$ is a homogeneous Poisson process, then
\[
\frac{S(t)}{t} \xrightarrow{d} G(X) \quad \text{as } t \to \infty.
\]
Proof. Let $I_j = 1(Y_j < X)$ for $j = 1, 2, \ldots$ and rewrite (1) as $S(t) = S_1(t) + S_2(t) + S_3(t)$, where
\[
S_1(t) = \Bigl( \sum_{j=0}^{N(t)-1} W_j I_{j+1} \Bigr) 1(N(t) \ge 1), \qquad
S_2(t) = (t - V_{N(t)}) I_{N(t)+1} 1(N(t) \ge 1)
\]
and
\[
S_3(t) = t I_{N(t)+1} 1(N(t) = 0).
\]
Clearly, $S_2(t) \le t - V_{N(t)}$. But the distribution of $t - V_{N(t)}$ is known; see, for instance, [5]. Thus, for any $\varepsilon > 0$ and $t$ sufficiently large,
\[
P\Bigl( \frac{t - V_{N(t)}}{t} > \varepsilon \Bigr) = e^{-\lambda t \varepsilon} \to 0 \quad \text{as } t \to \infty.
\]
Consequently, $S_2(t)/t \xrightarrow{P} 0$. Further, for any $\varepsilon > 0$,
\[
P\Bigl( \frac{S_3(t)}{t} > \varepsilon \Bigr) = P(I_{N(t)+1} 1(N(t) = 0) > \varepsilon) \le P(N(t) = 0) = e^{-\lambda t} \to 0 \quad \text{as } t \to \infty;
\]
hence $S_3(t)/t \xrightarrow{P} 0$.

Observe now that $1(N(t) \ge 1) \to 1$ as $t \to \infty$. Consequently, to prove that $S(t)/t \xrightarrow{d} G(X)$, it suffices to prove that
\[
\frac{1}{t} \sum_{j=0}^{N(t)-1} W_j I_{j+1} \xrightarrow{d} G(X).
\]
This will be done in two steps.

Denoting by $\lambda$ the intensity of the Poisson process, we will first prove that
\[
\frac{1}{t} \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \xrightarrow{d} G(X),
\]
where $\lfloor \cdot \rfloor$ is the integer-part function. Note that the sequence $(W_j I_{j+1})_{j=0,1,\ldots}$ is conditionally i.i.d. given $X$ and $E(W_j I_{j+1} \mid X) = G(X)/\lambda$. Then, by the law of large numbers, for any real $z$,
\[
E \exp\Bigl( \frac{\mathrm{i} z}{t} \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \Bigr)
= E\Bigl( E\Bigl( \exp\Bigl( \frac{\mathrm{i} z}{t} \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \Bigr) \Bigm| X \Bigr) \Bigr) \to E\bigl( e^{\mathrm{i} z G(X)} \bigr) \quad \text{as } t \to \infty.
\]
Secondly, we will show that, as $t \to \infty$,
\[
\frac{1}{t} \Bigl( \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} - \sum_{j=0}^{N(t)-1} W_j I_{j+1} \Bigr) \xrightarrow{P} 0.
\]
To this end, observe that
\[
\frac{N(t)}{\lfloor \lambda t \rfloor} = \frac{N(t)}{\lambda t} \cdot \frac{\lambda t}{\lfloor \lambda t \rfloor} \xrightarrow{P} 1 \quad \text{as } t \to \infty.
\]
Let
\[
A_\varepsilon = \{ (1 - \varepsilon) \lfloor \lambda t \rfloor \le N(t) \le (1 + \varepsilon) \lfloor \lambda t \rfloor \} \quad \text{for any } \varepsilon > 0
\]
and
\[
\sigma^2 = \operatorname{var}(W_0 I_1) = \frac{2 E(G(X)) - (E(G(X)))^2}{\lambda^2}.
\]
Then, for sufficiently large $t$, $P(A_\varepsilon) > 1 - \varepsilon$. Thus, for any $\varepsilon_1 > 0$,
\[
P\Bigl( \frac{1}{t} \Bigl| \sum_{j=0}^{N(t)-1} W_j I_{j+1} - \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \Bigr| > \varepsilon_1 \Bigr)
\le P\Bigl( \Bigl\{ \Bigl| \sum_{j=0}^{N(t)-1} W_j I_{j+1} - \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \Bigr| > t \varepsilon_1 \Bigr\} \cap A_\varepsilon \Bigr) + \varepsilon.
\]
But
\[
P\Bigl( \Bigl\{ \Bigl| \sum_{j=0}^{N(t)-1} W_j I_{j+1} - \sum_{j=0}^{\lfloor \lambda t \rfloor - 1} W_j I_{j+1} \Bigr| > t \varepsilon_1 \Bigr\} \cap A_\varepsilon \Bigr)
= P\Bigl( \Bigl\{ \sum_{j=\min\{N(t), \lfloor \lambda t \rfloor\}}^{\max\{N(t), \lfloor \lambda t \rfloor\} - 1} W_j I_{j+1} > t \varepsilon_1 \Bigr\} \cap A_\varepsilon \Bigr)
\]
\[
\le P\Bigl( \sum_{j=\lfloor (1 - \varepsilon) \lfloor \lambda t \rfloor \rfloor}^{\lfloor (1 + \varepsilon) \lfloor \lambda t \rfloor \rfloor - 1} W_j I_{j+1} > t \varepsilon_1 \Bigr)
\le \frac{2 \varepsilon \lambda t \sigma^2}{\varepsilon_1^2 t^2} \to 0 \quad \text{as } t \to \infty,
\]
the last bound following from Chebyshev's inequality (applied after centring, for $\varepsilon$ sufficiently small so that the mean of the sum is negligible with respect to $t \varepsilon_1$).
This completes the proof of the theorem.
Remark 3. Observe that, in the case of a nonrandom threshold, the convergence in Proposition 2 holds in probability.
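Proposition 2 can be seen at work in a small simulation (ours, not from the paper). Here $Y_i$ and $X$ are Exp(1), so $G(X)$ is Uniform(0, 1), and, given $N(t) = n$, the arrival epochs of the homogeneous Poisson process are generated as sorted uniforms on $[0, t]$ (the conditioning property recalled in Section 2). For large $t$ the empirical quantiles of $S(t)/t$ should then be close to those of the uniform distribution.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_S_over_t(t, lam=1.0):
    """One draw of S(t)/t. Given N(t) = n, homogeneous Poisson epochs are
    distributed as n sorted uniforms on [0, t]; Y_i ~ Exp(1) and
    X ~ Exp(1) are our choices."""
    X = rng.exponential(1.0)
    n = rng.poisson(lam * t)                              # N(t)
    edges = np.concatenate([[0.0], np.sort(rng.uniform(0.0, t, size=n)), [t]])
    Y = rng.exponential(1.0, size=n + 1)                  # level per segment
    return np.diff(edges)[Y < X].sum() / t

t = 200.0
samples = np.sort([sample_S_over_t(t) for _ in range(5_000)])
# G(X) ~ Uniform(0, 1) here, so the quartiles should be near 0.25/0.5/0.75.
for q in (0.25, 0.5, 0.75):
    print(q, samples[int(q * len(samples))])
```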
Acknowledgement
The authors are indebted to the referee for helpful remarks, particularly one which led to
Lemma 1.
References
[1] Bairamov, I. G. and Eryilmaz, S. N. (2000). Distributional properties of statistics based on minimal spacing and record exceedance statistics. J. Statist. Planning Infer. 90, 21-33.
[2] Bairamov, I. G. and Kotz, S. (2001). On distributions of exceedances associated with order statistics and record values for arbitrary distributions. Statist. Papers 42, 171-185.
[3] Eryilmaz, S. (2003). Random threshold models based on multivariate observations. J. Statist. Planning Infer. 113, 557-568.
[4] Esary, J. D., Marshall, A. W. and Proschan, F. (1973). Shock models and wear processes. Ann. Prob. 1, 627-649.
[5] Kirmani, S. N. U. A. and Gupta, R. C. (1989). On repair age and residual repair life in the minimal repair process. Prob. Eng. Inf. Sci. 3, 381-391.
[6] Nevzorov, V. B. (2001). Records: Mathematical Theory (Translations Math. Monogr. 194). American Mathematical Society, Providence, RI.
[7] Rolski, T., Schmidli, H., Schmidt, V. and Teugels, J. (1999). Stochastic Processes for Insurance and Finance. John Wiley, Chichester.
[8] Wesolowski, J. and Ahsanullah, M. (1998). Distributional properties of exceedance statistics. Ann. Inst. Statist. Math. 50, 543-565.