No. 1794
8/1989
A PROBABILISTIC APPROACH
TO THE TAILS OF INFINITELY DIVISIBLE LAWS
SÁNDOR CSÖRGŐ¹ and DAVID M. MASON²
1. Introduction. Consider an arbitrary non-degenerate infinitely divisible real random variable X. Usually this is described by the Lévy formula for its characteristic function: for any t in the real line ℝ we have

C(t) = E exp(iXt)
     = exp{ iθt − σ²t²/2 + ∫_{−∞}^{0} (e^{ixt} − 1 − ixt/(1+x²)) dL(x) + ∫_{0}^{∞} (e^{ixt} − 1 − ixt/(1+x²)) dR(x) },

where θ ∈ ℝ and σ ≥ 0 are uniquely determined constants and L and R are uniquely determined left-continuous and right-continuous functions (the so-called Lévy measures) on (−∞, 0) and (0, ∞), respectively; that is, L(·) and R(·) are non-decreasing functions such that L(−∞) = 0 = R(∞) and

(1.1)  ∫_{−ε}^{0} x² dL(x) + ∫_{0}^{ε} x² dR(x) < ∞  for any ε > 0.
Many authors have investigated the tail behavior of the distribution of X (see [1], [3], [7–11], [13–21], for instance). Of necessity, the methods normally used have a Fourier-analytic character and the results are usually formulated in terms of conditions on the Lévy measures, conditions that do not always allow an immediate probabilistic insight.

¹ Partially supported by the Hungarian National Foundation for Scientific Research, Grants 1808/86 and 457/88.
² Partially supported by the Alexander von Humboldt Foundation and NSF Grant #DMS-8803209.
As an integral part of a probabilistic approach to the asymptotic distribution of sums of independent, identically distributed random variables and of the corresponding lightly trimmed sums, a representation of X is given in Theorem 3 of [5]; cf. also the end of Section 2 of [6] in the present volume. Introduce the non-decreasing, non-positive, right-continuous inverse functions

φ₁(s) = inf{x < 0 : L(x) > s},  φ₂(s) = inf{x < 0 : −R(−x) > s},  0 < s < ∞,

and consider two standard (intensity one) left-continuous Poisson processes N₁(s) and N₂(s), s ≥ 0, and a standard normal random variable Z such that N₁(·), Z, and N₂(·) are independent. Since (1.1) implies that

(1.2)  ∫_{ε}^{∞} φ₁²(s) ds + ∫_{ε}^{∞} φ₂²(s) ds < ∞  for any ε > 0,
the random variables

(1.3)  V_j = ∫_{Y_j}^{∞} (N_j(s) − s) dφ_j(s) − ∫_{1}^{Y_j} s dφ_j(s),

where Y_j is the first jump-point of N_j(·), are well defined for j = 1, 2 (the first integrals exist as improper Riemann integrals with probability one), and the constants

θ_j = −φ_j(1) + ∫_{0}^{1} [φ_j(s)/(1 + φ_j²(s))] ds − ∫_{1}^{∞} [φ_j³(s)/(1 + φ_j²(s))] ds,  j = 1, 2,

are finite, and we have the distributional equality

(1.4)  X =_d V_{1,2}(σ, θ) := −V₁ + σZ + V₂ + θ − θ₁ + θ₂,

that is, E exp(iV_{1,2}(σ,θ)t) = C(t), t ∈ ℝ.
The aim of the present paper is to introduce a direct probabilistic approach to the investigation of the tails of X based on the representation in (1.4) and on the use of elementary calculations and inequalities for the Poisson distribution. We derive several earlier results by this approach, sometimes in a polished form, under natural new versions of the conditions, including Sato's [15] bounds. The fact that Sato's theorem can be obtained by this approach is particularly interesting to us. This is because we used it in the proof of Corollary 1 in [5] when proving the necessity half of the normal convergence criterion in the framework of our general probabilistic theory of convergence, also joint with Erich Haeusler. With the present proof of Sato's theorem our entire theory becomes independent of the existing literature and uses Fourier analysis only to ensure the uniqueness of φ₁, φ₂, σ, and θ on the right side of (1.4). The results are stated in Section 2; all the proofs are in Section 3.
We close the present introduction by pointing out that it is very easy to see directly that the right side of (1.4) is infinitely divisible. Indeed, for any integer n ≥ 1, let Z(k,n), 1 ≤ k ≤ n, be Normal(0, 1/n) random variables and N_j^{(k)}, 1 ≤ k ≤ n, j = 1, 2, be standard Poisson processes such that Z(1,n), ..., Z(n,n), N₁^{(1)}, ..., N₁^{(n)}, N₂^{(1)}, ..., N₂^{(n)} are independent, and form the corresponding variables V_{1,2}^{(k,n)}(σ), built from N₁^{(k)}, N₂^{(k)}, and Z(k,n) with φ_j(ns) in place of φ_j(s). Then the random variables

V_{1,2}^{(1,n)}(σ), ..., V_{1,2}^{(n,n)}(σ)

are independent and identically distributed and, interchanging integration and summation and substituting nu = s, we see that

Σ_{k=1}^{n} V_{1,2}^{(k,n)}(σ) =_d −V₁ + σZ + V₂.
It would be interesting to prove the converse statement (that if X is infinitely divisible then with uniquely determined φ₁, φ₂, σ, and θ we necessarily have (1.4)) without a recourse to the Lévy or Lévy–Khinchin formula for the characteristic function. This problem seems to be difficult.
2. Results. The first set of results is formulated for the extreme pieces V_j in (1.4), defined in (1.3). It is worthwhile to separate these because the corresponding results for the general X will follow from these special ones in an extremely elementary fashion.

Introduce the non-negative quantities

A_j := −φ_j(0+) = lim_{s↓0} (−φ_j(s)),  j = 1, 2,

and let θ denote any finite constant. Unless otherwise stated, we do not necessarily assume that A_j < ∞. If A_j = ∞, then 1/A_j is interpreted as zero. In Theorem 1 below, j can be 1 or 2, and we only consider the non-degenerate case when φ_j ≢ 0. This is equivalent to assuming that A_j > 0.
THEOREM 1. (i) For every a > 1/A_j there exists an x₀ = x₀(a, φ_j, θ) > 0 such that for all x ≥ x₀,

P{V_j + θ > x} ≥ exp(−a x log x).

(ii) If A_j < ∞, then for every 0 < a < 1/A_j there exists an x₀ = x₀(a, φ_j, θ) > 0 such that for all x ≥ x₀,

P{V_j + θ ≥ x} ≤ exp(−a x log x).

(iii) For every τ > 0 there exists an x₀ = x₀(τ, φ_j, θ) > 0 such that for all x ≥ x₀,

P{V_j + θ ≤ −x} ≤ exp(−τ x²).

(iv) The essential infimum of V_j + θ is −∫_{1}^{∞} s dφ_j(s) + θ.

(v) The random variable V_j + θ is bounded from below (with probability one) if and only if ∫_{1}^{∞} s dφ_j(s) < ∞, which in turn is equivalent to −∫_{1}^{∞} φ_j(s) ds < ∞.
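Parts (i) and (ii) can be illustrated on the simplest concrete case, the Poisson distribution: all its jumps have size 1, so the constant playing the role of A_j equals 1, and exp(−a x log x) should eventually lie below the tail for a > 1 and above it for a < 1. A minimal numeric sketch in the log-domain (the checkpoint x = 200 is an ad-hoc choice, merely large enough for both bounds to have kicked in):

```python
import math

def log_poisson_tail(lam, x, terms=500):
    # log P{N(lam) >= x} for integer x, via log-sum-exp over the series
    logs = [k * math.log(lam) - lam - math.lgamma(k + 1)
            for k in range(x, x + terms)]
    m = max(logs)
    return m + math.log(sum(math.exp(v - m) for v in logs))

x = 200     # ad-hoc checkpoint; here A = 1, so 1/A = 1
lp = log_poisson_tail(1.0, x)
# lower bound with a = 1.2 > 1/A, upper bound with a = 0.8 < 1/A
assert -1.2 * x * math.log(x) <= lp <= -0.8 * x * math.log(x)
```

For moderate x (say x = 10) the upper bound with a = 0.8 still fails, which is why both parts of the theorem only promise a threshold x₀.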
Now we consider the general infinitely divisible random variable X = X(L, R, σ, θ) with characteristic function C(·) or, what is the same, X = X(φ₁, φ₂, σ, θ) on the left side of (1.4). We exclude the trivial case when X degenerates at θ, which happens if and only if A₁ = A₂ = σ = 0 (cf. Theorem 4 in [5]). So at least one of A₁, A₂, and σ is assumed to be positive. As usual, Φ will denote the standard normal distribution function.
THEOREM 2. (i) If 0 < A₂ ≤ ∞, then for every a > 1/A₂ there exists an x₀ > 0 such that for all x ≥ x₀,

(2.1)  P{X > x} ≥ exp(−a x log x).

If 0 < A₂ < ∞, then for every 0 < a < 1/A₂ there exists an x₀ > 0 such that for all x ≥ x₀,

(2.2)  P{X ≥ x} ≤ exp(−a x log x).

If A₂ = 0, A₁ > 0, and σ > 0, then for every 0 < ε < 1 there exists an x₀ > 0 such that for all x ≥ x₀,

(2.3)  P{X ≥ x} ≤ 1 − Φ((1 − ε)x/σ).
(ii) If 0 < A₁ ≤ ∞, then for every β > 1/A₁ there exists an x₀ > 0 such that for all x ≥ x₀,

(2.4)  P{X < −x} ≥ exp(−β x log x).

If 0 < A₁ < ∞, then for every 0 < β < 1/A₁ there exists an x₀ > 0 such that for all x ≥ x₀,

(2.5)  P{X ≤ −x} ≤ exp(−β x log x).

If A₁ = 0, A₂ > 0, and σ > 0, then for every 0 < ε < 1 there exists an x₀ > 0 such that for all x ≥ x₀,

(2.6)  P{X ≤ −x} ≤ 1 − Φ((1 − ε)x/σ).
(iii) The random variable X is almost surely bounded from below if and only if A₁ = 0, σ = 0, and ∫_{1}^{∞} s dφ₂(s) < ∞ or, equivalently, −∫_{1}^{∞} φ₂(s) ds < ∞, in which case

−∞ < ess inf X = θ + ∫_{0}^{∞} [φ₂(s)/(1 + φ₂²(s))] ds = θ − ∫_{0}^{∞} [x/(1 + x²)] dR(x).
(iv) The random variable X is almost surely bounded from above if and only if A₂ = 0, σ = 0, and ∫_{1}^{∞} s dφ₁(s) < ∞ or, equivalently, −∫_{1}^{∞} φ₁(s) ds < ∞, in which case

∞ > ess sup X = θ − ∫_{0}^{∞} [φ₁(s)/(1 + φ₁²(s))] ds = θ − ∫_{−∞}^{0} [x/(1 + x²)] dL(x).

(v) The random variable X is almost surely unbounded.
Results of the type of the first two statements of parts (i) and (ii) of Theorem 2 have been proved by Kruglov [11], Ruegg [13,14], Horn [10], Steutel [16] and Elliott and Erdős [7]. Related is the work of Wolfe [18]. The essentially final result has been achieved by Sato [15], who formulates it in terms of the support of the combined Lévy measure L(−x) − R(x), x > 0. Sato's result in fact covers the multidimensional case, and hence the univariate special case of his statement is by nature two-sided. However, the univariate special case of his proof allowed Bingham, Goldie, and Teugels [2; page 342] to formulate the more precise one-sided results. Our results in (2.1), (2.2) and (2.4), (2.5) of parts (i) and (ii) of Theorem 2 above are equivalent forms of these one-sided variants of Sato's theorem. An equivalent form of Sato's two-sided univariate result has been proved later but independently by Esseen [9].

We are not aware of a precise formulation of the complementary results in part (iii) of Theorem 1 and in (2.3) and (2.6) of Theorem 2 in the literature.

Parts (iii) and (iv) of Theorem 2 are equivalent forms of results achieved jointly by the two papers of Baxter and Shapiro [1] and Tucker [17]. Independently of Tucker they have also been obtained by Esseen [8]. These two parts trivially imply part (v) of Theorem 2, which was obtained first by Chatterjee and Pakshirajan [3] by an independent Fourier-analytic proof.

There are, of course, other works that deal with the tail behavior of X(L, R, σ, θ). For example, Yakymiv [19] has recently proved rather precise results for a special class of infinitely divisible variables, extending greatly the main theorem of Zolotarev [20]. It would be interesting to see if these problems can also be approached by our more direct probabilistic methods.
3. Proofs. For the sake of simplicity in the notation, we drop all the subscripts j in the proof of Theorem 1. Thus V_j, φ_j, N_j, Y_j, and A_j become V, φ, N, Y, and A in this proof.
The main trick in the proof of (i) and (iii) is to split V in (1.3) at the level x², where x > 1: write (cf. (1.3))

(3.1)  V = ∫_{Y}^{∞} (N(s) − s) dφ(s) − ∫_{1}^{Y} s dφ(s) = V(x) + Ṽ(x),

where

(3.2)  V(x) = ∫_{0}^{x²} N(s) dφ(s) − ∫_{1}^{x²} s dφ(s)

and

(3.3)  Ṽ(x) = ∫_{x²}^{∞} (N(s) − s) dφ(s).

Note that for x > 1,

∫_{1}^{x²} s dφ(s) = x²φ(x²) − φ(1) − ∫_{1}^{x²} φ(s) ds.

Clearly, (1.2) implies that x²φ(x²) = o(x) as x → ∞, and by an elementary argument based on the Cauchy–Schwarz inequality we also have

−∫_{1}^{x²} φ(s) ds = o(x)  as  x → ∞.

Hence

(3.4)  ∫_{1}^{x²} s dφ(s) = o(x)  as  x → ∞.
The following inequality for the Poisson distribution, stated as Inequality 2 in [12], is an easy consequence of Stirling's formula.

LEMMA 1. For every λ > 0 there exists a constant 0 < K(λ) < ∞ such that for all y ≥ 1,

P{N(λ) ≥ y} ≥ K(λ) y^{−1/2} exp{−y(log y − 1 − log λ)}.
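This lower bound is easy to probe numerically: P{N(λ) ≥ y} is at least the single term at ⌈y⌉, and Stirling's formula turns that term into the stated form. A quick sketch for λ = 1 (the constant K = 0.1 and the range of y are ad-hoc choices, not an optimal K(1)):

```python
import math

def poisson_tail(lam, y, terms=400):
    # P{N(lam) >= y} for integer y, summing the series directly
    return sum(math.exp(k * math.log(lam) - lam - math.lgamma(k + 1))
               for k in range(y, y + terms))

K = 0.1   # ad-hoc constant for lam = 1; the lemma only asserts some K(lam) exists
for y in range(1, 61):
    bound = K * y ** -0.5 * math.exp(-y * (math.log(y) - 1.0))
    assert poisson_tail(1.0, y) >= bound
```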
LEMMA 2. For all x ≥ 1 and all t ∈ ℝ,

E exp(tṼ(x)) ≤ exp( (t²/2) f(x) e^{|t|γ(x)} ),

where γ(x) := −φ(x²) and f(x) := ∫_{x²}^{∞} φ²(s) ds.

Proof. The argument used for the computation of the corresponding characteristic function in the proof of Theorem 3 in [5] gives

log E exp(tṼ(x)) = ∫_{x²}^{∞} { exp(−tφ(s)) − 1 + tφ(s) } ds ≤ (t²/2) e^{|t|γ(x)} ∫_{x²}^{∞} φ²(s) ds = (t²/2) f(x) e^{|t|γ(x)},

where we used the inequality |exp(−v) − 1 + v| ≤ (v²/2) exp(|v|), v ∈ ℝ, together with the fact that |φ(s)| ≤ γ(x) for s > x². ∎
The next two lemmas together show that the right tail of V is determined by the term V(x) in (3.1).

LEMMA 3. For all a > 1/A and all sufficiently small p > 1,

lim_{x→∞} exp(a x log x) P{V(x) > px} = ∞.

Proof. Using (3.4), we see that for any 0 < ε < 1 and all x large enough,

P{V(x) > px} ≥ P{ ∫_{0}^{x²} N(s) dφ(s) > (1+ε)px }
           ≥ P{ N(ε)(φ(x²) − φ(ε)) > (1+ε)px }
           = P{ N(ε) > βx }
           ≥ K(ε)(βx)^{−1/2} exp{−βx(log x + log(β/ε) − 1)}

by Lemma 1 in the last step, where ε > 0 and p > 1 are chosen so small and x so large that

1/A < β = β(x) = (1+ε)p/(φ(x²) − φ(ε)) < a.

This inequality clearly implies the lemma. ∎
LEMMA 4. For all a > 0 and p > 0,

lim_{x→∞} exp(ax²) P{|Ṽ(x)| ≥ px} = 0.

Proof. If f(x) = 0, then Ṽ(x) = 0 and there is nothing to prove, so assume f(x) > 0. For any t > 0, the Markov inequality and Lemma 2 give that

P{±Ṽ(x) ≥ px} ≤ exp( −ptx + (t²/2) f(x) e^{tγ(x)} ).

Upon choosing t = t(x) = min{ p e^{−p} x / f(x), p/γ(x) }, we have e^{tγ(x)} ≤ e^{p}. If t(x) = p e^{−p} x / f(x), the exponent above equals

−(p² e^{−p}/2) x²/f(x),

while if t(x) = p/γ(x) ≤ p e^{−p} x / f(x), then (t²/2) f(x) e^{p} ≤ ptx/2, so that the exponent is at most −ptx/2 = −(p²/2) x/γ(x). Since (1.2) implies that f(x) → 0 and xγ(x) → 0 as x → ∞, in both cases the exponent is less than −2ax² for all x large enough. The two resulting limit relations prove the lemma. ∎
Proof of Theorem 1. (i) Choose a > 1/A and let ε > 0 be small enough so that Lemma 3 holds for p = 1 + ε. Then by (3.1),

P{V > x} ≥ P{V(x) > (1+ε)x, |Ṽ(x)| < εx} ≥ P{V(x) > (1+ε)x} − P{|Ṽ(x)| ≥ εx}.

Now Lemmas 3 and 4 imply that

lim_{x→∞} exp(a x log x) P{V > x} = ∞,

from which the statement follows. ∎

(iii) For any 0 < ε < 1 and x > 0, by (3.1) we have

P{V ≤ −x} ≤ P{V(x) ≤ −(1 − ε)x} + P{−Ṽ(x) ≥ εx}.

Since by (3.2) and (3.4),

V(x) = ∫_{0}^{x²} N(s) dφ(s) − o(x) ≥ −o(x)  as  x → ∞,

we notice that the first probability in this bound becomes zero for all x large enough. Hence the statement follows from Lemma 4. ∎
(ii) The proof of this statement is more analytical. It goes as the proof of Lemma 1 of Sato [15] and hence uses arguments from the proof of Theorem 1 of Zolotarev [21]. These calculations performed with the moment generating function actually go back to Cramér [4]. Our presentation is self-contained.

First of all note that the condition A < ∞ implies that

(3.5)  ∫_{0}^{ε} s dφ(s) < ∞  for all ε > 0.

Introduce the random variable

W = ∫_{0}^{∞} (N(s) − s) dφ(s)

and note that by (1.3),

(3.6)  W = V − ∫_{0}^{1} s dφ(s).

Again, arguing as in the proof of Theorem 3 in [5], we find that

(3.7)  E exp(tW) = exp(ξ(t)),  t ∈ ℝ,

where

ξ(t) = ∫_{0}^{∞} { exp(−tφ(s)) − 1 + tφ(s) } ds.

Since |exp(−v) − 1 + v| ≤ v² exp(|v|)/2 for all v ∈ ℝ, we see that

|ξ(t)| ≤ ( (1/2) ∫_{0}^{∞} φ²(s) ds ) t² e^{A|t|},  t ∈ ℝ.

By (1.2) this says that, due to the condition A < ∞, E exp(tW) is finite for all t ∈ ℝ.
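The elementary inequality just used, |exp(−v) − 1 + v| ≤ v² exp(|v|)/2, follows from the Taylor series of e^{−v}; a brute-force numeric check over a sign-symmetric grid (the grid itself is an arbitrary choice):

```python
import math

# check |exp(-v) - 1 + v| <= v^2 * exp(|v|) / 2 on a grid
for i in range(-4000, 4001):
    v = i / 100.0                     # v ranges over [-40, 40]
    lhs = abs(math.exp(-v) - 1.0 + v)
    rhs = 0.5 * v * v * math.exp(abs(v))
    assert lhs <= rhs, v
```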
Differentiating ξ, we obtain

ξ'(t) = −∫_{0}^{∞} φ(s){exp(−tφ(s)) − 1} ds,  t ∈ ℝ,

and differentiating once more,

ξ''(t) = ∫_{0}^{∞} φ²(s) exp(−tφ(s)) ds > 0,  t ∈ ℝ,

from which we see that ξ' is strictly increasing and continuous. Furthermore, ξ(0) = 0 = ξ'(0),

ξ'(t) ↓ μ := ∫_{0}^{∞} φ(s) ds  as  t ↓ −∞,  where  −∞ ≤ μ < 0,

and ξ'(t) ↑ ∞ as t ↑ ∞. For any μ < x < ∞, introduce the inverse to ξ':

(3.8)  ξ'(η(x)) = x.

The function η is well defined on (μ, ∞) since ξ' is strictly increasing and continuous. Furthermore, by the inverse function theorem we have

(3.9)  ξ''(η(x)) η'(x) = 1,  μ < x < ∞,

and we know from the above that η(x) > 0 if and only if x > 0.

Now by (3.7) and (3.8), for any x > 0,

P{W ≥ x} = P{exp(η(x)W) ≥ exp(η(x)x)} ≤ exp{ ξ(η(x)) − η(x) ξ'(η(x)) }.
Observe that

ξ(η(x)) − η(x) ξ'(η(x)) = ∫_{0}^{η(x)} ξ'(s) ds − η(x) ξ'(η(x))
                       = −∫_{0}^{η(x)} s ξ''(s) ds
                       = −∫_{0}^{x} η(t) ξ''(η(t)) η'(t) dt
                       = −∫_{0}^{x} η(t) dt

by (3.9). Thus for all x > 0,

(3.10)  P{W ≥ x} ≤ exp( −∫_{0}^{x} η(t) dt ).
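The computation leading to (3.10) is a Legendre-type identity: ξ(η(x)) − η(x)ξ'(η(x)) = −∫₀ˣ η(t) dt. It can be sanity-checked on a hypothetical smooth convex stand-in for ξ, say ξ(t) = eᵗ − 1 − t (so ξ'(t) = eᵗ − 1 and η(x) = log(1+x)); this is not the ξ of the proof, merely an example with the same properties ξ(0) = ξ'(0) = 0 and ξ'' > 0:

```python
import math

def xi(t):        # hypothetical cumulant stand-in: xi(t) = e^t - 1 - t
    return math.exp(t) - 1.0 - t

def eta(x):       # inverse of xi'(t) = e^t - 1, defined for x > -1
    return math.log(1.0 + x)

def integral_eta(x, n=100000):
    # trapezoidal approximation of \int_0^x eta(t) dt
    h = x / n
    s = 0.5 * (eta(0.0) + eta(x))
    for k in range(1, n):
        s += eta(k * h)
    return s * h

for x in [0.5, 1.0, 3.0, 10.0]:
    lhs = xi(eta(x)) - eta(x) * x     # xi(eta(x)) - eta(x) * xi'(eta(x))
    rhs = -integral_eta(x)            # -\int_0^x eta(t) dt
    assert abs(lhs - rhs) < 1e-6, (x, lhs, rhs)
```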
Since exp(v) − 1 ≤ v exp(v) for all v > 0,

ξ'(t) ≤ ( ∫_{0}^{∞} φ²(s) ds ) t exp(At) =: B t exp(At),  t > 0,

from which it follows by (3.8) that t ≤ B η(t) exp(Aη(t)), t > 0. But η(t) ↑ ∞ as t ↑ ∞, and thus the last inequality implies that for any 0 < α < 1/A there exists an x₀ ≥ e such that t ≤ exp(η(t)/α) for all t ≥ x₀. This says that

α log t ≤ η(t)  for all  t ≥ x₀.

Substituting this bound into (3.10), we obtain for all x ≥ x₀ that

P{W ≥ x} ≤ exp{ −∫_{0}^{x₀} η(t) dt − α ∫_{x₀}^{x} log t dt } ≤ exp{ −α ∫_{x₀}^{x} log t dt } ≤ exp(−a x log x)

for all x large enough, provided the given 0 < a < 1/A and α are chosen with a < α < 1/A, since ∫_{x₀}^{x} log t dt = x log x − x − x₀ log x₀ + x₀. By (3.6) this leads immediately to the statement. ∎
(iv) Starting out from (1.3) again, we now use a different decomposition of V: for any x > 1 we have V = U₁(x) + U₂(x), where

U₁(x) = ∫_{0}^{x} N(s) dφ(s) − φ(x)N(x) + φ(x)x − ∫_{1}^{x} s dφ(s)

and

U₂(x) = ∫_{x}^{∞} (N(s) − s) dφ(s) + φ(x)(N(x) − x) = ∫_{x}^{∞} {(N(s) − N(x)) − (s − x)} dφ(s).

Since (1.2) implies that

E U₂(x) = 0  and  Var U₂(x) = ∫_{x}^{∞} φ²(s) ds → 0

as x → ∞, we have

(3.11)  U₂(x) → 0 in probability as x → ∞.

On the other hand, since N has independent increments, U₁(x) and U₂(x) are independent for each x > 1. Therefore, for any ε > 0,

P{ V < −∫_{1}^{x} s dφ(s) + φ(x)x + ε } ≥ P{ U₁(x) = −∫_{1}^{x} s dφ(s) + φ(x)x, |U₂(x)| < ε }
                                      = P{N(x) = 0} P{|U₂(x)| < ε}
                                      = exp(−x) P{|U₂(x)| < ε} > 0

for all x large enough.

If ∫_{1}^{∞} s dφ(s) = ∞, then this inequality obviously implies that V is unbounded from below. If ∫_{1}^{∞} s dφ(s) < ∞, then

(3.12)  φ(x)x → 0  as  x → ∞,

and hence the above inequality implies that for any δ > 0,

P{ V < −∫_{1}^{∞} s dφ(s) + δ } > 0.

At the same time, it follows from (3.11), (3.12), and the law of large numbers that

P{ V ≥ −∫_{1}^{∞} s dφ(s) } = P{ ∫_{0}^{∞} N(s) dφ(s) ≥ 0 } = 1,

and hence the statement. ∎

(v) This was in fact proved above, and formally follows from (iv). ∎
Proof of Theorem 2. (i) Let a > 1/A₂ be given, fix γ ∈ (0,1), and choose ε > 0 so small that a > a'(1+ε) for some a' > 1/A₂. Then, applying (i) of Theorem 1 to V₂, for all x > 0 large enough we have by (1.4) that

P{X > x} ≥ P{−V₁ + σZ + θ − θ₁ + θ₂ > −εx} P{V₂ > (1+ε)x} ≥ (1−γ) exp{−a'(1+ε)x log((1+ε)x)}.

Multiplying this by exp(a x log x), we see that the resulting lower bound goes to infinity as x → ∞, and hence (2.1) follows.

Next, suppose 0 < A₂ < ∞, let 0 < a < 1/A₂ be given, and choose 0 < ε < 1 so small that a < a''(1−ε) for some a'' < 1/A₂. Then, applying (ii) and (iii) of Theorem 1 to V₂ and V₁, respectively, for any τ > 0 and for all x > 0 large enough,

P{X ≥ x} ≤ P{V₂ + θ − θ₁ + θ₂ ≥ (1−ε)x} + P{−V₁ ≥ εx/2} + P{σZ ≥ εx/2}
        ≤ exp{−a''(1−ε)x log((1−ε)x)} + exp(−τx²) + (1 − Φ(εx/(2σ))).

Multiplying this by exp(a x log x), the resulting upper bound goes to zero as x → ∞, and hence (2.2) follows.
Finally, assume that A₂ = 0 and A₁, σ > 0, and fix 0 < ε < 1. Let 0 < ε' < ε and choose τ > (2σ²)⁻¹. Then, applying (iii) of Theorem 1 to V₁, for all x large enough,

P{X ≥ x} = P{−V₁ + σZ + θ − θ₁ ≥ x} ≤ P{−V₁ + θ − θ₁ ≥ ε'x} + P{σZ ≥ (1−ε')x}
         ≤ exp(−τx²) + 1 − Φ((1−ε')x/σ).

Multiplying this inequality by (1 − Φ((1−ε)x/σ))⁻¹ and using standard upper and lower estimates for 1 − Φ(·), we see that the resulting ratio converges to zero as x → ∞. This proves (2.3). ∎
(ii) This is completely analogous to (i). ∎

(iii) Sufficiency follows from part (v) of Theorem 1. Conversely, suppose that X is bounded from below. Then (2.4) implies that A₁ = 0. Obviously there exists a y > 0 such that P{V₂ < y} > 0. Thus, for any x > 0,

P{X < −x} = P{σZ + V₂ + θ + θ₂ < −x} ≥ P{σZ + θ + θ₂ < −x − y} P{V₂ < y} > 0,

unless σ = 0. Hence σ = 0, X is distributed as V₂ + θ + θ₂, and by (v) and (iv) of Theorem 1, we have ∫_{1}^{∞} s dφ₂(s) < ∞ and

ess inf X = −∫_{1}^{∞} s dφ₂(s) + θ + θ₂ = θ + ∫_{0}^{∞} [φ₂(s)/(1 + φ₂²(s))] ds. ∎

(iv) This is completely analogous to (iii). ∎
REFERENCES

[1] BAXTER, G. and SHAPIRO, J. M. (1960). On bounded infinitely divisible random variables. Sankhyā 22, 253-260.
[2] BINGHAM, N. H., GOLDIE, C. M. and TEUGELS, J. L. (1987). Regular Variation. Cambridge Univ. Press, Cambridge.
[3] CHATTERJEE, S. D. and PAKSHIRAJAN, R. P. (1956). On the unboundedness of infinitely divisible laws. Sankhyā 17, 349-350.
[4] CRAMÉR, H. (1938). Sur un nouveau théorème-limite de la théorie des probabilités. Actualités Sci. et Indust. No. 736, Paris.
[5] CSÖRGŐ, S., HAEUSLER, E. and MASON, D. M. (1988). A probabilistic approach to the asymptotic distribution of sums of independent, identically distributed random variables. Adv. in Appl. Math. 9, 259-333.
[6] CSÖRGŐ, S., HAEUSLER, E. and MASON, D. M. (1989). In the present volume.
[7] ELLIOTT, P. D. T. A. and ERDŐS, P. (1979). The tails of infinitely divisible laws and a problem in number theory. J. Number Theory 11, 542-551.
[8] ESSEEN, C.-G. (1965). On infinitely divisible one-sided distributions. Math. Scand. 17, 65-76.
[9] ESSEEN, C.-G. (1981). On the tails of a class of infinitely divisible distributions. In: Contributions to Probability. A Collection of Papers Dedicated to Eugene Lukacs (J. Gani and V. K. Rohatgi, eds.), pp. 115-122. Academic Press, New York.
[10] HORN, R. A. (1972). On necessary and sufficient conditions for an infinitely divisible distribution to be normal or degenerate. Z. Wahrsch. Verw. Gebiete 21, 179-187.
[11] KRUGLOV, V. M. (1970). A note on infinitely divisible distributions. Theory Probab. Appl. 15, 319-324.
[12] MASON, D. M. and SHORACK, G. R. (1988). Non-normality of a class of random variables. Technical Report No. 125, Department of Statistics, Univ. of Washington.
[13] RUEGG, A. F. (1970). A characterization of certain infinitely divisible laws. Ann. Math. Statist. 41, 1354-1356.
[14] RUEGG, A. F. (1971). A necessary condition on the infinite divisibility of probability distributions. Ann. Math. Statist. 42, 1681-1685.
[15] SATO, K. (1973). A note on infinitely divisible distributions and their Lévy measures. Sci. Rep. Tokyo Kyoiku Daigaku Sect. A 12, 101-109.
[16] STEUTEL, F. W. (1974). On the tails of infinitely divisible distributions. Z. Wahrsch. Verw. Gebiete 28, 273-276.
[17] TUCKER, H. G. (1961). Best one-sided bounds for infinitely divisible random variables. Sankhyā 23, 387-396.
[18] WOLFE, S. J. (1971). On moments of infinitely divisible distribution functions. Ann. Math. Statist. 42, 2036-2043.
[19] YAKYMIV, A. L. (1987). The asymptotic behavior of a class of infinitely divisible distributions. Theory Probab. Appl. 32, 628-639.
[20] ZOLOTAREV, V. M. (1961). On the asymptotic behavior of a class of infinitely divisible distribution laws. Theory Probab. Appl. 6, 304-307.
[21] ZOLOTAREV, V. M. (1965). Asymptotic behavior of the distributions of processes with independent increments. Theory Probab. Appl. 10, 28-44.