Houston Journal of Mathematics
© 2050 University of Houston
Volume 76, No. 1, 2050
ON THE MAXIMUM OF RANDOM VARIABLES ON
PRODUCT SPACES
JOSCHA PROCHNO AND STIENE RIEMER
Communicated by William B. Johnson
Abstract. Let ξ_i, i = 1, ..., n, and η_j, j = 1, ..., m, be iid p-stable respectively q-stable random variables, 1 < p < q ≤ 2. We prove estimates for E_{Ω1} E_{Ω2} max_{i,j} |a_ij ξ_i(ω1) η_j(ω2)| in terms of the ℓ_p^m(ℓ_q^n)-norm of (a_ij)_{i,j} if q < 2, and in terms of the ℓ_p^m(ℓ_{M_ξ}^n)-norm in the case q = 2, where M_ξ depends on the Gaussians. Furthermore, we show that a sequence ξ_i, i = 1, ..., n, of iid log-γ(1, p), p > 1, distributed random variables generates the ℓ_p-norm. As far as we know, the generating distribution for ℓ_p-norms with p ≥ 2 has not been known up to now.
1. Introduction and Notation
A convex function M : [0, ∞) → [0, ∞) with M(t) > 0 for t > 0 and M(0) = 0
is called an Orlicz function. For an Orlicz function M we define the Orlicz norm
‖·‖_M on R^n by

‖x‖_M = inf{ t > 0 : Σ_{i=1}^n M(|x_i|/t) ≤ 1 },

and the Orlicz space ℓ_M^n to be R^n equipped with the norm ‖·‖_M. Moreover, an
Orlicz norm ‖·‖_M is uniquely determined on the interval [0, s_0], where M(s_0) = 1.
For a detailed and thorough introduction to Orlicz spaces see for example [5].
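As a concrete illustration (ours, not part of the paper): since t ↦ Σ_i M(|x_i|/t) is decreasing in t, the infimum defining ‖·‖_M can be computed by bisection, and for M(t) = t^p the Orlicz norm is exactly the ℓ_p-norm. A minimal Python sketch; the function name orlicz_norm is our own:

```python
def orlicz_norm(x, M, tol=1e-10):
    """Compute ||x||_M = inf{ t > 0 : sum_i M(|x_i|/t) <= 1 } by bisection.

    The constraint sum is decreasing in t, so the feasible set is an
    interval [||x||_M, infinity) and bisection applies.
    """
    lo, hi = tol, max(abs(v) for v in x) * len(x) + tol  # hi is always feasible

    def feasible(t):
        return sum(M(abs(v) / t) for v in x) <= 1.0

    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if feasible(mid):
            hi = mid
        else:
            lo = mid
    return hi

# For M(t) = t**p the Orlicz norm coincides with the l_p norm:
x = [3.0, 4.0]
print(orlicz_norm(x, lambda t: t * t))  # ~5.0, the Euclidean norm of (3, 4)
```

Swapping in M(t) = t gives the ℓ_1-norm, illustrating how the Orlicz scale interpolates between the classical ℓ_p-norms.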
Throughout this paper we give order estimates, and this will be denoted by
∼, since we are not focused on the exact values of the absolute constants. If the
absolute constants depend on a certain parameter p, we denote this by ∼_p. By
c, C, ... we denote positive absolute constants and we write c_p, C_p if they depend
on some parameter p. The value of the constants may change from line to line.

2000 Mathematics Subject Classification. 46B09; 60E15.
Key words and phrases. Expectation, p-stable random variables, gaussian random variables,
Orlicz norms, product spaces.
We recall that a real valued random variable ξ on a probability space (Ω, A, P) is
called normalized symmetric p-stable (in short: p-stable) for 0 < p ≤ 2, if it has
the characteristic function

φ_ξ(x) = e^{−|x|^p},

where the characteristic function φ_ξ : R → C of ξ is defined by φ_ξ(t) = E e^{itξ}. In
the case 1 < p < 2 we have finite first moments, but no finite variance, because of
the heavy tails (for p = 2 the variance is finite). In fact, if 1 < p < 2, the p-stable
random variable ξ satisfies for all t > 0
(1) P(|ξ| ≥ t) ≤ C_p t^{−p}.
For more information see for example [6].
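For readers who wish to experiment, symmetric p-stable variates with characteristic function e^{−|x|^p} can be simulated with the Chambers–Mallows–Stuck method. The following Python sketch (our own, not from the paper) generates samples and exhibits the heavy tail behind (1):

```python
import math
import random

def sym_stable(alpha, rng):
    """One symmetric alpha-stable variate with characteristic function
    exp(-|x|**alpha), via the Chambers-Mallows-Stuck method (symmetric case)."""
    u = (rng.random() - 0.5) * math.pi   # uniform on (-pi/2, pi/2)
    w = rng.expovariate(1.0)             # standard exponential
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

rng = random.Random(0)
p = 1.5
xs = [sym_stable(p, rng) for _ in range(200_000)]

# P(|xi| >= t) decays like t**(-p): the empirical tail at t = 10 is small
# but far larger than any Gaussian tail would allow
tail_10 = sum(abs(x) >= 10.0 for x in xs) / len(xs)
print(tail_10)
```

For α = 2 the same sampler produces Gaussians of variance 2, matching the characteristic function e^{−x²}.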
Now, let ξi , i = 1, ..., n be independent copies of a random variable ξ on a
probability space (Ω1 , A1 , P1 ), whose first moment is finite. Furthermore, let ai ,
i = 1, ..., n be real numbers. In [3] and [4], the following theorem was shown:
Theorem 1.1. Let

(2) M_ξ(s) = ∫_0^s ( (1/t) P_1(|ξ| ≥ 1/t) + ∫_{1/t}^∞ P_1(|ξ| ≥ u) du ) dt.

Then, for all (a_i)_{i=1}^n ∈ R^n,

E max_{1≤i≤n} |a_i ξ_i| ∼ ‖(a_i)_{i=1}^n‖_{M_ξ}.
In the following let also η_j, j = 1, ..., m be independent copies of a random
variable η on a probability space (Ω2, A2, P2), whose first moment is finite, and let
a_ij, i = 1, ..., n, j = 1, ..., m be real numbers.
It is a natural question whether we can give estimates for

(3) E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)|.
n,m
Since the random variables (ξi ηj )i,j=1 are no longer independent on the product
space (Ω1 ×Ω2 , A1 ⊗A2 , P1 ⊗P2 ) the previous result, Theorem 1.1, is not applicable
in this case.
We give precise estimates up to absolute constants for a certain class of random
variables, namely p- and q-stable, p, q ∈ (1, 2), p < q, and standard Gaussians.
This shows in addition that we can treat dependent random variables with a
certain structure of dependence and give precise estimates, which has not been
feasible before. Considering p-stable random variables seems to be natural
in this case, since they generate the ℓ_p-norm, which means the Orlicz function
resulting from Theorem 1.1 equals s ↦ s^p for p ∈ (1, 2) (see [4]). One could expect
that standard Gaussians generate the ℓ_2-norm. However, as shown for example
in [4], they do not, but we can treat this case as well. All these estimates can be
found in the second section. For applications of those probabilistic inequalities in
connection with Orlicz norms we refer the reader to [1], [2], [3] and [4], to mention
a few.
Furthermore, in this context the question arose which random variables generate the ℓ_2-norm, since standard Gaussians obviously do not. We provide the
solution, together with the solution of the generation of ℓ_p-norms for all p > 1, in
the third section. Additionally, we give order estimates for (3) for these generating
distributions.
2. Estimates for E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)|
In the following let p, q ∈ (1, 2) with p < q. We analyze the expression
E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| under two different assumptions. Let η always be a
p-stable random variable.
In the first case let ξ be a q-stable random variable; we prove the following:

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ∼_{p,q} ‖( ‖(a_ij)_{i=1}^n‖_q )_{j=1}^m‖_p.

In the second case let ξ be a standard gaussian random variable; we prove under
this assumption

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ∼_p ‖( ‖(a_ij)_{i=1}^n‖_{M_ξ} )_{j=1}^m‖_p,
where ‖·‖_{M_ξ} denotes the Orlicz norm given by the Orlicz function

(4) M_ξ(s) = 0 if s = 0, M_ξ(s) = e^{−3/(2s^2)} if s ∈ (0, 1), and M_ξ(s) = e^{−3/2}(3s − 2) if s ≥ 1.
The idea behind the proofs of these two results is to use the triangle inequality and Jensen's
inequality to obtain a lower and an upper bound. Afterwards, using Theorem
1.1, we show that the resulting expressions are equal up to constants depending
only on p and q. Furthermore, we show that we can express this resulting object
in terms of the corresponding product norm. This also allows us, in these cases,
to express a result due to S. Kwapień and C. Schütt ([8], Example 1.6) in terms
of random variables and in a form which is very easy to apply.
Applying the results from [2], combined with the first steps of the proof of
Theorem 2.1, one obtains a lower and an upper bound for
E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)|, where η is p-stable and ξ is a standard gaussian,
of the same form, but with an additional factor ln(n + 1) in the upper bound. Since
there is a logarithmic gap between the two bounds, this obviously does not give the
correct order. With our method we give the correct order up to absolute constants
in an easily applicable form.
Theorem 2.1. Let p, q ∈ (1, 2) with p < q. Additionally, let ξ_i, i = 1, ..., n
be independent copies of a q-stable random variable ξ on (Ω1, A1, P1) and let η_j,
j = 1, ..., m be independent copies of a p-stable random variable η on (Ω2, A2, P2).
Then, for all (a_ij)_{i,j} ∈ R^{n×m},

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ∼_{p,q} ‖( ‖(a_ij)_{i=1}^n‖_q )_{j=1}^m‖_p.
Proof. Let a_j, j = 1, ..., m be real numbers. In [4] it was shown that

(5) E_{Ω2} max_{1≤j≤m} |a_j η_j(ω2)| ∼_p ‖(a_j)_{j=1}^m‖_p.

Applying this, we get

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ∼_p E_{Ω1} ‖( max_{1≤i≤n} |a_ij ξ_i(ω1)| )_{j=1}^m‖_p.

Using the triangle inequality and (5) for the q-stable ξ_i, i = 1, ..., n, we get

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ≥ c_p ‖( E_{Ω1} max_{1≤i≤n} |a_ij ξ_i(ω1)| )_{j=1}^m‖_p
∼_{p,q} ‖( ‖(a_ij)_{i=1}^n‖_q )_{j=1}^m‖_p.
For the upper bound we apply Jensen's inequality and obtain

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ≤ C_p ‖( ( E_{Ω1} max_{1≤i≤n} |a_ij ξ_i(ω1)|^p )^{1/p} )_{j=1}^m‖_p.
By Theorem 1.1 we get

E_{Ω1} max_{1≤i≤n} |a_ij|^p |ξ_i(ω1)|^p ∼ ‖( |a_ij|^p )_{i=1}^n‖_{M_{ξ^p}},

where

(6) M_{ξ^p}(s) = ∫_0^s ( (1/t) P_1(|ξ|^p ≥ 1/t) + ∫_{1/t}^∞ P_1(|ξ|^p ≥ u) du ) dt.
To prove that the upper and lower bound are equal up to constants, we have to
show that M_{ξ^p}(s^p) ≤ C_{p,q} s^q. This will yield

‖( |a_ij|^p )_{i=1}^n‖_{M_{ξ^p}}^{1/p} = ‖(a_ij)_{i=1}^n‖_{M_{ξ^p}(t^p)} ≤ C_{p,q} ‖(a_ij)_{i=1}^n‖_q,

since the function s ↦ s^q generates the ℓ_q-norm. To do so, we use (1), i.e., for
all t > 0,

(7) P(|ξ| ≥ t) ≤ C_q t^{−q}.
Combining (6) and (7) we get, since q > p,

M_{ξ^p}(s) = ∫_0^s ( (1/t) P_1(|ξ|^p ≥ 1/t) + ∫_{1/t}^∞ P_1(|ξ|^p ≥ u) du ) dt
= ∫_0^s ( (1/t) P_1(|ξ| ≥ (1/t)^{1/p}) + ∫_{1/t}^∞ P_1(|ξ| ≥ u^{1/p}) du ) dt
≤ C_q ∫_0^s ( (1/t)(1/t)^{−q/p} + ∫_{1/t}^∞ u^{−q/p} du ) dt
= C_q ∫_0^s ( t^{−1+q/p} + (q/p − 1)^{−1} t^{−1+q/p} ) dt
∼_{p,q} ∫_0^s t^{−1+q/p} dt
∼_{p,q} s^{q/p},

which yields the desired result, since then M_{ξ^p}(s^p) ≤ C_{p,q} s^q. □
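The two-sided estimate of Theorem 2.1 can be probed by simulation. The sketch below is our own construction (the matrix, sample sizes and the Chambers–Mallows–Stuck sampler are not from the paper): it compares a Monte Carlo estimate of the double expectation with the mixed norm ‖(‖(a_ij)_{i=1}^n‖_q)_{j=1}^m‖_p; by the theorem the ratio is bounded above and below by constants depending only on p and q.

```python
import math
import random

def sym_stable(alpha, rng):
    # Chambers-Mallows-Stuck sampler, symmetric case,
    # characteristic function exp(-|x|**alpha)
    u = (rng.random() - 0.5) * math.pi
    w = rng.expovariate(1.0)
    return (math.sin(alpha * u) / math.cos(u) ** (1.0 / alpha)
            * (math.cos((1.0 - alpha) * u) / w) ** ((1.0 - alpha) / alpha))

rng = random.Random(1)
p, q, n, m = 1.4, 1.8, 5, 5
a = [[rng.random() + 0.5 for _ in range(m)] for _ in range(n)]  # a[i][j] > 0

# Monte Carlo estimate of E_{Omega1} E_{Omega2} max_{i,j} |a_ij xi_i eta_j|
trials = 20_000
acc = 0.0
for _ in range(trials):
    xi = [sym_stable(q, rng) for _ in range(n)]   # q-stable, index i
    eta = [sym_stable(p, rng) for _ in range(m)]  # p-stable, index j
    acc += max(abs(a[i][j] * xi[i] * eta[j]) for i in range(n) for j in range(m))
lhs = acc / trials

# mixed norm: l_p over j of the l_q norms of the columns (a_ij)_{i=1}^n
inner = [sum(abs(a[i][j]) ** q for i in range(n)) ** (1.0 / q) for j in range(m)]
rhs = sum(b ** p for b in inner) ** (1.0 / p)
print(lhs / rhs)  # a moderate constant depending only on p and q
```

Because the variables have infinite variance, the Monte Carlo estimate converges slowly; the check is qualitative, not a substitute for the proof.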
Theorem 2.2. Let p ∈ (1, 2), let η_j, j = 1, ..., m be independent copies of a
p-stable random variable η on (Ω1, A1, P1) and let ξ_i, i = 1, ..., n be independent
copies of a standard gaussian random variable ξ on (Ω2, A2, P2). Then, for all
(a_ij)_{i,j} ∈ R^{n×m},

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω1) ξ_i(ω2)| ∼_p ‖( ‖(a_ij)_{i=1}^n‖_{M_ξ} )_{j=1}^m‖_p,

where M_ξ is given by (4).
Proof. Applying Theorem 1.1, we get for all (z_i)_{i=1}^n ∈ R^n

E max_{1≤i≤n} |z_i ξ_i| ∼ ‖(z_i)_{i=1}^n‖_{M_ξ},

where, as shown in [4], the following holds:

M_ξ(s) = ∫_0^s ( (1/t) P_1(|ξ| ≥ 1/t) + ∫_{1/t}^∞ P_1(|ξ| ≥ u) du ) dt
∼ 0 if s = 0, ∼ e^{−3/(2s^2)} if s ∈ (0, 1), and ∼ e^{−3/2}(3s − 2) if s ≥ 1.
In accordance with the ideas from the proof of Theorem 2.1, we get

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω1) ξ_i(ω2)| ≥ c_p ‖( ‖(a_ij)_{i=1}^n‖_{M_ξ} )_{j=1}^m‖_p

and

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω1) ξ_i(ω2)| ≤ C_p ‖( ‖( |a_ij|^p )_{i=1}^n‖_{M_{ξ^p}}^{1/p} )_{j=1}^m‖_p.

As in the previous proof it remains to show that

‖( |a_ij|^p )_{i=1}^n‖_{M_{ξ^p}}^{1/p} ∼_p ‖(a_ij)_{i=1}^n‖_{M_ξ}.

This holds, since it follows from Example 16 in [4] that M_{ξ^p}(s^p) ∼_p M_ξ(s). □
3. Generation of ℓ_p-norms
Since standard gaussian random variables do not generate the ℓ_2-norm, the
question arises what distribution does. We prove that log-γ_{1,p}, p > 1, distributed
random variables generate the ℓ_p-norm, and therefore, log-γ_{1,2} distributed random variables the ℓ_2-norm.
We recall that the density of a log-γ_{q,p} distributed random variable ξ with
parameters q, p > 0 is given by

f_ξ(x) = (p^q/Γ(q)) x^{−p−1} (ln(x))^{q−1} for x ≥ 1, and f_ξ(x) = 0 for x < 1.

Notice that for a log-γ_{1,p} distributed ξ and all y ≥ 1 we have the following tail behavior:

(8) P(|ξ| ≥ y) = ∫_y^∞ f_ξ(x) dx = ∫_y^∞ p x^{−p−1} dx = [ −x^{−p} ]_y^∞ = y^{−p},

and

(9) P(|ξ| ≥ y) = ∫_y^∞ f_ξ(x) dx = ∫_1^∞ p x^{−p−1} dx = [ −x^{−p} ]_1^∞ = 1,

if y < 1.
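Since (8) says the log-γ_{1,p} tail is exactly y^{−p} on [1, ∞), such variables are straightforward to simulate by inverse transform sampling. A short Python check of (8), our own sketch and not from the paper:

```python
import random

def log_gamma_1p(p, rng):
    """log-gamma(1, p) variate: density p * x**(-p-1) on [1, inf).

    Inverse transform: P(X >= y) = y**(-p) for y >= 1, so
    X = (1 - U)**(-1/p) with U uniform on [0, 1).
    """
    return (1.0 - rng.random()) ** (-1.0 / p)

rng = random.Random(7)
p = 2.0
xs = [log_gamma_1p(p, rng) for _ in range(100_000)]

# tail check against (8): P(|xi| >= y) = y**(-p) for y >= 1
frac = sum(x >= 2.0 for x in xs) / len(xs)
print(frac)  # close to 2**(-2) = 0.25
```

Equivalently, ln X is exponential with rate p, which is where the name log-gamma comes from in the case q = 1.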
We prove the following theorem:
Theorem 3.1. Let p > 1 and ξ_1, ..., ξ_n be iid copies of a log-γ_{1,p} distributed
random variable ξ. Then, for all x ∈ R^n,

E max_{1≤i≤n} |x_i ξ_i| ∼_p ‖x‖_p.
Proof. By Theorem 1.1, we have

E max_{1≤i≤n} |x_i ξ_i| ∼ ‖x‖_{M_ξ},

where

(10) M_ξ(s) = ∫_0^s ( (1/t) P(|ξ| ≥ 1/t) + ∫_{1/t}^∞ P(|ξ| ≥ u) du ) dt.
Case 1: Let s ≤ 1. Since the limits of integration are 0 and s, we have 1/t ≥ 1.
Now (8) yields

(11) ∫_0^s (1/t) P(|ξ| ≥ 1/t) dt = ∫_0^s (1/t) t^p dt = s^p/p,

and moreover, since p > 1,

(12) ∫_{1/t}^∞ P(|ξ| ≥ u) du = ∫_{1/t}^∞ u^{−p} du = (1/(p−1)) t^{p−1}.

Hence

∫_0^s (1/(p−1)) t^{p−1} dt = (1/(p(p−1))) s^p.

Using the representation (10), we obtain

M_ξ(s) = (1/p) s^p + (1/(p(p−1))) s^p = (1/(p−1)) s^p.
Case 2: Let s > 1. We first calculate

∫_0^s (1/t) P(|ξ| ≥ 1/t) dt.

We have

∫_0^s (1/t) P(|ξ| ≥ 1/t) dt = ∫_0^1 (1/t) P(|ξ| ≥ 1/t) dt + ∫_1^s (1/t) P(|ξ| ≥ 1/t) dt =: (I) + (II).

In (I), we have 1/t ≥ 1. Therefore, equation (11) with s = 1 yields (I) = 1/p. In
(II), we have 1/t ≤ 1 and hence, (9) gives

(II) = ∫_1^s (1/t) dt = ln(s).

Therefore

∫_0^s (1/t) P(|ξ| ≥ 1/t) dt = ln(s) + 1/p.
We now calculate

∫_0^s ∫_{1/t}^∞ P(|ξ| ≥ u) du dt.

Again, we have

∫_0^s ∫_{1/t}^∞ P(|ξ| ≥ u) du dt = ∫_0^1 ∫_{1/t}^∞ P(|ξ| ≥ u) du dt + ∫_1^s ∫_{1/t}^∞ P(|ξ| ≥ u) du dt =: (III) + (IV).

In part (III), we have 1/t ≥ 1 again. Using (12), we get

(III) = ∫_0^1 ∫_{1/t}^∞ P(|ξ| ≥ u) du dt = ∫_0^1 (1/(p−1)) t^{p−1} dt = 1/(p(p−1)).
In part (IV), we have 1/t ≤ 1, so we get

∫_{1/t}^∞ P(|ξ| ≥ u) du = ∫_{1/t}^1 P(|ξ| ≥ u) du + ∫_1^∞ P(|ξ| ≥ u) du =: (IV.1) + (IV.2).

Since in (IV.1) we have u ≤ 1, we obtain

(IV.1) = ∫_{1/t}^1 ∫_u^∞ f_ξ(x) dx du = ∫_{1/t}^1 ∫_1^∞ f_ξ(x) dx du = ∫_{1/t}^1 1 du = 1 − 1/t,

and equation (12) for t = 1 yields (IV.2) = 1/(p−1). So

(IV) = ∫_1^s ∫_{1/t}^∞ P(|ξ| ≥ u) du dt = ∫_1^s ( 1 − 1/t + 1/(p−1) ) dt = (p/(p−1))(s − 1) − ln(s).
Altogether, we have

M_ξ(s) = ln(s) + 1/p + 1/(p(p−1)) + (p/(p−1))(s − 1) − ln(s),

i.e., for s > 1 we have M_ξ(s) = (p/(p−1)) s − 1.
Thus, we obtain

E max_{1≤i≤n} |x_i ξ_i| ∼ ‖x‖_{M_ξ},

where

M_ξ(s) = (1/(p−1)) s^p if s ≤ 1, and M_ξ(s) = (p/(p−1)) s − 1 if s > 1.

Since the corresponding Orlicz norm ‖·‖_{M_ξ} is uniquely determined on the interval
[0, 2], and because for all s ∈ [0, 2] and all p > 1

s^p/(p 2^p) ≤ M_ξ(s) ≤ (2/(p−1)) s^p,

we obtain

E max_{1≤i≤n} |x_i ξ_i| ∼_p ‖x‖_p. □
In particular, this shows that log-γ_{1,2} distributed random variables generate the
ℓ_2-norm. This is interesting, since one could assume that standard Gaussians
generate the ℓ_2-norm. In fact, their generated norm is far from that.
Naturally, the question now arises whether we can prove Theorem 2.1 and Theorem 2.2
also in the case that the inner respectively the outer norm is the ℓ_2-norm, i.e., in the
case that the random variables ξ_i, i = 1, ..., n, respectively η_j, j = 1, ..., m, are
independent log-γ_{1,2} distributed. We can do so, as we show in the following.
Theorem 3.2. Let p ∈ (1, 2), let ξ_i, i = 1, ..., n be independent copies of a
log-γ_{1,2} distributed random variable ξ on (Ω1, A1, P1) and let η_j, j = 1, ..., m be
independent copies of a p-stable random variable η on (Ω2, A2, P2). Then, for all
(a_ij)_{i,j} ∈ R^{n×m},

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω2) ξ_i(ω1)| ∼_p ‖( ‖(a_ij)_{i=1}^n‖_2 )_{j=1}^m‖_p.
Proof. We follow the proof of Theorem 2.1. Therefore, we have

E_{Ω1} E_{Ω2} max_{i,j} |a_ij ξ_i(ω1) η_j(ω2)| ∼_p E_{Ω1} ‖( max_{1≤i≤n} |a_ij ξ_i(ω1)| )_{j=1}^m‖_p.

As before, we have to show that M_{ξ^p}(s) ∼_p M_ξ(s^{1/p}) = s^{2/p}. We calculate M_{ξ^p}(s)
and start with s ≤ 1. By (8), for all y ≥ 1,

P(|ξ| ≥ y) = ∫_y^∞ f_ξ(x) dx = y^{−2}.

We obtain

M_{ξ^p}(s) = ∫_0^s ( (1/t) P(|ξ| ≥ t^{−1/p}) + ∫_{1/t}^∞ P(|ξ| ≥ u^{1/p}) du ) dt
= ∫_0^s ( t^{2/p−1} + [ −u^{−2/p+1}/(2/p − 1) ]_{1/t}^∞ ) dt
= (2/(2−p)) ∫_0^s t^{2/p−1} dt
= (p/(2−p)) s^{2/p}.

So for all s ≤ 1 we have M_{ξ^p}(s) = (p/(2−p)) s^{2/p} = (p/(2−p)) M_ξ(s^{1/p}). Since p/(2−p) > 1, the
case 0 ≤ s ≤ 1 suffices, because M_{ξ^p}(1) > 1. Thus, the Orlicz norm ‖·‖_{M_{ξ^p}} is
uniquely determined on this interval. □
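The closed form M_{ξ^p}(s) = (p/(2−p)) s^{2/p} obtained above is easy to cross-check numerically: integrate the probabilistic integrand of (6) over [0, s] for a log-γ_{1,2} variable and compare. A midpoint-rule sketch in Python (our own; the inner tail integral ∫_{1/t}^∞ u^{−2/p} du is inserted in its elementary closed form):

```python
def tail(y):
    """P(|xi| >= y) for a log-gamma(1, 2) variable: y**(-2) for y >= 1, else 1,
    as in (8) and (9)."""
    return min(1.0, y ** (-2.0))

def m_xi_p(s, p, steps=20_000):
    """Midpoint-rule evaluation of the Orlicz function (6) for s <= 1,
    with xi log-gamma(1, 2), so P(|xi|**p >= u) = tail(u**(1/p))."""
    h = s / steps
    total = 0.0
    for k in range(steps):
        t = (k + 0.5) * h
        first = tail(t ** (-1.0 / p)) / t
        # elementary closed form of the inner integral, valid for t <= 1:
        # int_{1/t}^inf u**(-2/p) du = (p/(2-p)) * t**(2/p - 1)
        inner = (p / (2.0 - p)) * t ** (2.0 / p - 1.0)
        total += (first + inner) * h
    return total

p, s = 1.5, 0.5
closed = (p / (2.0 - p)) * s ** (2.0 / p)
print(m_xi_p(s, p), closed)  # the two values agree closely
```

The agreement confirms the computation carried out in the proof above for this particular choice of p and s.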
Theorem 3.3. Let η_j, j = 1, ..., m be independent copies of a log-γ_{1,2} distributed
random variable η on (Ω1, A1, P1) and let ξ_i, i = 1, ..., n be independent copies of a
standard gaussian random variable ξ on (Ω2, A2, P2). Then, for all (a_ij)_{i,j} ∈ R^{n×m},

E_{Ω1} E_{Ω2} max_{i,j} |a_ij η_j(ω1) ξ_i(ω2)| ∼ ‖( ‖(a_ij)_{i=1}^n‖_{M_ξ} )_{j=1}^m‖_2,

where M_ξ is given by (4).
Acknowledgments. We thank the anonymous referee for helpful comments
that improved the presentation of the results, especially the statement of Theorem 3.1.
References
[1] D. Alonso-Gutiérrez, J. Prochno, Estimating support functions of random polytopes via Orlicz norms, preprint, 2012.
[2] Y. Gordon, A. E. Litvak, C. Schütt, E. Werner, On the minimum of several random variables, Proc. Amer. Math. Soc., 134(12):3665–3675 (electronic), 2006.
[3] Y. Gordon, A. Litvak, C. Schütt, E. Werner, Uniform estimates for order statistics and Orlicz functions, Positivity, 16(1):1–28, 2012.
[4] Y. Gordon, A. Litvak, C. Schütt, E. Werner, Orlicz norms of sequences of random variables, Ann. Probab., 30(4):1833–1853, 2002.
[5] M. A. Krasnoselski, Y. B. Rutickii, Convex Functions and Orlicz Spaces, P. Noordhoff Ltd., Groningen, 1961.
[6] V. Milman, G. Schechtman, Asymptotic Theory of Finite Dimensional Normed Spaces, Lecture Notes in Math. 1200, Springer-Verlag, 1986.
[7] S. Kwapień, C. Schütt, Some combinatorial and probabilistic inequalities and their application to Banach space theory, Studia Math., 82:91–106, 1985.
[8] S. Kwapień, C. Schütt, Some combinatorial and probabilistic inequalities and their application to Banach space theory II, Studia Math., 95(2):141–154, 1989.
Received October 19, 2011
Revised version received April 3, 2012
(Joscha Prochno) Mathematisches Seminar, Christian-Albrechts-Universität zu Kiel,
Ludewig-Meyn-Str. 4, 24098 Kiel, Germany
E-mail address: [email protected]
(Stiene Riemer) Mathematisches Seminar, Christian-Albrechts-Universität zu Kiel,
Ludewig-Meyn-Str. 4, 24098 Kiel, Germany
E-mail address: [email protected]