Distributed MST for Constant Diameter Graphs

Lecture 2
Dr Zvi Lotker
Conclusion



The power assumption: $E(r) = C \cdot r^\alpha$, where $2 \le \alpha \le 4$.
The space assumption: the dimension $d$ satisfies $2 \le d \le 3$.
These two assumptions don't fit well together.
Two that are almost the same.
Birthday Paradox

The probability that no two of $m$ students have the same birthday is

$$P = \binom{365}{m}\frac{m!}{365^m} = \frac{365!}{(365-m)!\,365^m}$$

Note that if $m > 365$ this is 0.

We can also compute this by considering one person at a time: with $n$ possible birthdays,

$$P = \prod_{j=1}^{m-1}\left(1-\frac{j}{n}\right)$$

If we take $k$ to be small we get that $1-\frac{k}{n}\approx e^{-k/n}$, so

$$\prod_{j=1}^{m-1}\left(1-\frac{j}{n}\right)\approx\prod_{j=1}^{m-1}e^{-j/n}=\exp\left(-\sum_{j=1}^{m-1}\frac{j}{n}\right)=e^{-m(m-1)/2n}\approx e^{-m^2/2n}$$
Birthday Paradox

Hence the value of $m$ at which the probability that all people have distinct birthdays is ½ satisfies $e^{-m^2/2n} = \tfrac12$, i.e. approximately:

$$m^2 = 2n\ln 2, \qquad m = \sqrt{2n\ln 2}$$

In the case $n = 365$, $m \approx 22.5$.
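To see these numbers concretely, here is a short Python sketch (my own addition, not from the lecture) comparing the exact product with the $e^{-m^2/2n}$ approximation:

```python
import math

def birthday_exact(m, n=365):
    """Exact probability that m people all have distinct birthdays."""
    p = 1.0
    for j in range(1, m):
        p *= 1 - j / n
    return p

def birthday_approx(m, n=365):
    """The lecture's approximation e^(-m^2 / 2n)."""
    return math.exp(-m * m / (2 * n))

for m in (10, 22, 23, 30):
    print(m, round(birthday_exact(m), 3), round(birthday_approx(m), 3))

# The exact probability first drops below 1/2 at m = 23,
# matching m = sqrt(2 * n * ln 2) ~ 22.5.
```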
Birthday Paradox

Let $E_i$ be the event that the $i$-th person's birthday does not match any of the birthdays of the first $i-1$ people. Then

$$P[E_1\cap E_2\cap\dots\cap E_k]=\prod_{i=1}^{k}\left(1-\frac{i-1}{n}\right)\ge 1-\sum_{i=1}^{k}\frac{i-1}{n}=1-\frac{k(k-1)}{2n}$$

If $k \le \sqrt{n}$, the collision term $\frac{k(k-1)}{2n}$ is less than 1/2. So if there are $\sqrt{n}$ people, then with probability at least ½ all will have distinct birthdays.
Birthday Paradox

Now assume that the first $\sqrt{n}$ people all have distinct birthdays. Each person after that has probability at least $\sqrt{n}/n = 1/\sqrt{n}$ of sharing a birthday with someone.
Hence the probability that the next $\sqrt{n}$ people all have distinct birthdays is at most

$$\left(1-\frac{1}{\sqrt{n}}\right)^{\sqrt{n}}\le\frac{1}{e}<0.5$$
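A quick Monte Carlo sketch (my own illustration, not from the slides) confirms that the collision probability crosses ½ around $k = \Theta(\sqrt{n})$:

```python
import random

def collision_prob(k, n=365, trials=20000):
    """Estimate the probability that among k people at least two share a birthday."""
    hits = 0
    for _ in range(trials):
        days = [random.randrange(n) for _ in range(k)]
        if len(set(days)) < k:
            hits += 1
    return hits / trials

for k in (15, 19, 23, 27):  # sqrt(365) is about 19.1
    print(k, collision_prob(k))
```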
Balls into bins

Suppose we sequentially throw $m$ balls into $n$ bins.
We consider the case $n = m$.
What is the maximum number of balls in any bin?

Theorem: The probability that the max-load bin has more than $3\ln n/\ln\ln n$ balls is less than $1/n$ for large $n$.
Balls into bins

Theorem: The probability that the max-load bin has more than $3\ln n/\ln\ln n$ balls is less than $1/n$ for large $n$.

Proof: The probability that bin 1 has at least $M$ balls is at most $\binom{n}{M}\left(\frac{1}{n}\right)^M$.
This follows from the union bound over the $\binom{n}{M}$ choices of $M$ balls.
Now we use the inequality

$$\binom{n}{M}\left(\frac{1}{n}\right)^M \le \frac{1}{M!} \le \left(\frac{e}{M}\right)^M$$

The second inequality follows from

$$\frac{k^k}{k!} \le \sum_{i\ge 0}\frac{k^i}{i!} = e^k$$

Applying the union bound over all $n$ bins, for $M \ge 3\ln n/\ln\ln n$:

$$n\left(\frac{e}{M}\right)^M \le n\left(\frac{e\ln\ln n}{3\ln n}\right)^{3\ln n/\ln\ln n} \le n\left(\frac{\ln\ln n}{\ln n}\right)^{3\ln n/\ln\ln n} = n\,\exp\!\left(\frac{3\ln n}{\ln\ln n}\bigl(\ln\ln\ln n-\ln\ln n\bigr)\right) \le \frac{1}{n}$$

for $n$ sufficiently large.
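A small simulation sketch (my own addition) of the $n = m$ case, checking the max load against the $3\ln n/\ln\ln n$ bound:

```python
import math
import random
from collections import Counter

def max_load(n):
    """Throw n balls into n bins uniformly at random; return the fullest bin's count."""
    counts = Counter(random.randrange(n) for _ in range(n))
    return max(counts.values())

n = 100000
bound = 3 * math.log(n) / math.log(math.log(n))
print("max load:", max_load(n), " bound:", round(bound, 2))
```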
Motivation

For random networks:
Cheap devices.
Throw the sensors from an airplane.
Usually we use random positioning when we run a simulation.
…
Randomness is a strong assumption.
Random distribution

We assume that $d = 2$.
Uniform random points: $p_i \sim U[0,1]^d$.
Good: we know the number of points.
Bad: the counts in disjoint regions are not i.i.d.
[Figure: n uniform random points in the unit square]
Random distribution

We assume that $d = 2$.
How do we construct Poisson[λ]?
Take $\lambda n^d$ random points $p_i \sim U[0,n]^d$.
Take the points that fell in $[0,1]^d$: their number is Binomial$(\lambda n^d, n^{-d})$, which converges to Poisson[λ] as $n \to \infty$.
[Figure: points in $[0,10]^2$; the count inside the unit square is approximately Poisson[1]]
Random distribution

We assume that $d = 2$.
Poisson[1]:
Take $n \sim$ Poisson[1].
Take $p_i \sim U[0,1]^d$ for $i = 1,\dots,n$.
The number of points in $[0,1]^2$ is not constant.
[Figure: a sample of the Poisson process in the unit square]
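A minimal sampling sketch (my own illustration, assuming NumPy) of this two-step construction:

```python
import numpy as np

rng = np.random.default_rng()

def poisson_process(lam=1.0, d=2):
    """Sample a Poisson point process of intensity lam on [0,1]^d:
    first draw the random count, then place that many uniform points."""
    n = rng.poisson(lam)
    return rng.uniform(0.0, 1.0, size=(n, d))

pts = poisson_process(lam=10.0)
print(len(pts), "points in this draw; the count varies between draws")
```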
Using the Poisson assumption

Assume a Poisson point process with intensity $\lambda$.
Let $A \subseteq [0,1]^2$ be an area.
Denote by $\mu(A)$ the random variable that counts the number of points in $A$. Then

$$P[\mu(A) = k] = \frac{e^{-\lambda|A|}(\lambda|A|)^k}{k!}$$

The advantage of the Poisson assumption is that the random variables $\mu(A)$, $\mu(B)$ are independent if $A \cap B = \emptyset$.
Poisson distribution

$$P[\mu(A) = k] = \frac{e^{-\lambda|A|}(\lambda|A|)^k}{k!}$$

Some facts on the Poisson[λ] distribution:
Parameters: $\lambda > 0$
Support: $k \in \{0, 1, 2, \dots\}$
Probability mass function (pmf): $e^{-\lambda}\lambda^k/k!$
Cumulative distribution function (cdf): $e^{-\lambda}\sum_{i=0}^{\lfloor k\rfloor}\lambda^i/i!$
Mean: $\lambda$
Median: $\approx \lfloor \lambda + 1/3 - 0.02/\lambda \rfloor$
Mode: $\lfloor \lambda \rfloor$
Variance: $\lambda$
Skewness: $\lambda^{-1/2}$
Excess kurtosis: $\lambda^{-1}$
Entropy: $\lambda(1-\ln\lambda)+e^{-\lambda}\sum_{k\ge 0}\frac{\lambda^k\ln(k!)}{k!}$
Moment-generating function (mgf): $\exp(\lambda(e^t-1))$
Characteristic function: $\exp(\lambda(e^{it}-1))$
Poisson and the binomial distribution

$$P[\mu(A) = k] = \frac{e^{-\lambda|A|}(\lambda|A|)^k}{k!}$$

The binomial distribution is $P[X = k] = \binom{n}{k}p^k(1-p)^{n-k}$.
Usually, when $p$ is fixed, we use the normal distribution as an approximation.
But when $p \sim O(1/n)$ we use the Poisson distribution.
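A quick numerical sketch (my own addition, assuming SciPy) of the $p \sim O(1/n)$ regime, comparing Binomial$(n, \lambda/n)$ with Poisson$(\lambda)$:

```python
from scipy.stats import binom, poisson

lam, n = 3.0, 1000  # p = lam/n is small, so the Poisson approximation should be tight
for k in range(7):
    print(k, round(binom.pmf(k, n, lam / n), 4), round(poisson.pmf(k, lam), 4))
```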
Tail of the Poisson[λ]

Let $X$ be a Poisson random variable with parameter $\lambda$.
If $t > \lambda$ then $P[X \ge t] \le e^{-\lambda}(e\lambda)^t/t^t$.
If $t < \lambda$ then $P[X \le t] \le e^{-\lambda}(e\lambda)^t/t^t$.

Proof (upper tail): for any $a > 0$ and $t > \lambda$,

$$P[X \ge t] = P[\exp(aX) \ge \exp(at)] \le \frac{E[\exp(aX)]}{\exp(at)}$$

by Markov's inequality. Now $E[\exp(aX)]$ is the moment generating function:

$$E[\exp(aX)] = \exp(\lambda(e^a - 1))$$

Choosing $a = \ln(t/\lambda)$ gives

$$P[X \ge t] \le e^{t - \lambda - t\ln(t/\lambda)} = \frac{e^{-\lambda}(e\lambda)^t}{t^t}$$
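A sanity-check sketch (my own addition, assuming SciPy) comparing this bound with the actual tail:

```python
import math
from scipy.stats import poisson

lam = 10.0
for t in (15, 20, 25):
    bound = math.exp(-lam) * (math.e * lam) ** t / t ** t
    actual = poisson.sf(t - 1, lam)  # P[X >= t]
    print(t, f"actual={actual:.2e}", f"bound={bound:.2e}")
```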
Poisson and the uniform distribution

Thm: If $Y_i \sim$ Pois[$\lambda_i$] follow Poisson distributions with parameters $\lambda_i$ and the $Y_i$ are independent, then $Y = \sum Y_i \sim$ Pois[$\sum \lambda_i$].

Thm: Let $Y_i \sim$ Pois[$\lambda$] be independent; conditioned on $\sum Y_i = k$, the vector $(Y_1, Y_2, \dots, Y_n)$ is distributed as a uniform placement of $k$ balls into $n$ bins, i.e.

$$P\left[Y_1 = k_1, \dots, Y_n = k_n \,\middle|\, \sum_{i=1}^n Y_i = k\right] = \binom{k}{k_1, k_2, \dots, k_n}\frac{1}{n^k}$$
Poisson and the uniform distribution

Suppose that $m$ balls are thrown into $n$ bins independently and uniformly.
Let $X_i^{(m)}$ be the number of balls in the $i$-th bin.
Let $Y_i^{(m)}$ be independent Poisson variables with mean $m/n$.

Theorem: Let $f(x_1, x_2, \dots, x_n)$ be a non-negative function. Then

$$E\bigl[f(X_1^{(m)}, X_2^{(m)}, \dots, X_n^{(m)})\bigr] \le e\sqrt{m}\;E\bigl[f(Y_1^{(m)}, Y_2^{(m)}, \dots, Y_n^{(m)})\bigr]$$

Proof:

$$E\bigl[f(Y_1^{(m)}, \dots, Y_n^{(m)})\bigr] = \sum_{k \ge 0} E\left[f(Y_1^{(m)}, \dots, Y_n^{(m)}) \,\middle|\, \sum_{i=1}^n Y_i^{(m)} = k\right] P\left[\sum_{i=1}^n Y_i^{(m)} = k\right]$$

$$\ge E\left[f(Y_1^{(m)}, \dots, Y_n^{(m)}) \,\middle|\, \sum_{i=1}^n Y_i^{(m)} = m\right] P\left[\sum_{i=1}^n Y_i^{(m)} = m\right]$$

$$= E\bigl[f(X_1^{(m)}, \dots, X_n^{(m)})\bigr]\, P\left[\sum_{i=1}^n Y_i^{(m)} = m\right] = E\bigl[f(X_1^{(m)}, \dots, X_n^{(m)})\bigr]\, \frac{m^m e^{-m}}{m!} \ge E\bigl[f(X_1^{(m)}, \dots, X_n^{(m)})\bigr]\, \frac{1}{e\sqrt{m}}$$

The conditional step uses the previous theorem: conditioned on the total being $m$, the Poisson vector has exactly the balls-into-bins distribution. The last step uses Stirling's bound $m! \le e\sqrt{m}\,(m/e)^m$.
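As an illustration (my own sketch, assuming NumPy), one can compare the exact balls-into-bins max load with its Poisson proxy; for a non-negative $f$ such as the max load, the theorem bounds how far apart the two expectations can be:

```python
import numpy as np

rng = np.random.default_rng(1)
n = m = 10000
trials = 200

# Exact model: throw m balls into n bins; f = max load.
exact = [np.bincount(rng.integers(0, n, size=m), minlength=n).max()
         for _ in range(trials)]
# Poisson model: independent Poisson(m/n) loads in each bin.
pois = [rng.poisson(m / n, size=n).max() for _ in range(trials)]

print("E[max load], exact model:  ", np.mean(exact))
print("E[max load], Poisson model:", np.mean(pois))
```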