
Harmonic Things
John D Barrow
Geometric series
S(n) = a + ar + ar² + ar³ +...+ arⁿ⁻¹, -1 <r< 1
rS(n) = 0 + ar + ar² + ar³+...+ arⁿ⁻¹ + arⁿ
(1 - r)S(n) = a - arⁿ
S(n) = a(1 - rⁿ)/(1 - r)
S(n  ) = a/(1 - r)
So if a = ½ and r = ½
S(n  ) = 1
1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ........ = 1
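A quick numerical check (a Python sketch; the variable names are illustrative) that the partial sums a(1 - rⁿ)/(1 - r) settle down to a/(1 - r), and to 1 when a = r = ½:

```python
# Partial sums of a + ar + ar^2 + ... for a = r = 1/2
a, r = 0.5, 0.5
running = 0.0
for n in range(1, 21):
    running += a * r ** (n - 1)          # add the n-th term a*r^(n-1)
    closed = a * (1 - r ** n) / (1 - r)  # closed form S(n)
    print(n, running, closed)            # both approach a/(1 - r) = 1
```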
Behold…..
1/2 + 1/4 + 1/8 + 1/16 + 1/32 +........ = 1
The value of your investments can plummet as well as go down
VAT in the eternal future
17.5% = 10% + 5% + 2.5%
Next step
18.75% = 10% + 5% + 2.5% + 1.25%
Or
15% = 10% + 5%
And ultimately…?
10% × (1 + 1/2 + 1/4 + 1/8 + 1/16 + 1/32 + ....) = 20%
ie 10% × 1/(1 – ½)
The Harmonic Series
H = 1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 +.......
Has an infinite sum
Note that the sum of
1/1ᵖ + 1/2ᵖ + 1/3ᵖ + ..... is also infinite if p ≤ 1
But is finite if p > 1
Recall that ∫₁^∞ 1/x dx = ∞ but
∫₁^∞ 1/xᵖ dx = 1/(p − 1) for p > 1
H has an Infinite Sum
H = 1 + 1/2 + 1/3 + 1/4 + 1/5 + 1/6 + 1/7 + 1/8 + 1/9 + 1/10 + 1/11 + .......
H = 1 + (1/2 + 1/3) + (1/4 + 1/5 + 1/6 + 1/7) + (1/8 + 1/9 + 1/10 + 1/11 +…..1/15) +..
> 1/2 + (1/4 + 1/4) + (1/8 + 1/8 + 1/8 + 1/8) + (1/16 + 1/16 + ... + 1/16) + ...
H > 1/2 + 1/2 + 1/2 + 1/2 + ……. → ∞
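The grouping can be checked numerically. A Python sketch (assuming nothing beyond the grouping above): each of the m groups (1), (1/2 + 1/3), (1/4 + ... + 1/7), ... exceeds 1/2, so H(2ᵐ − 1) > m/2 and the partial sums pass any bound.

```python
def H(n):
    # direct harmonic sum 1 + 1/2 + ... + 1/n
    return sum(1.0 / k for k in range(1, n + 1))

for m in range(1, 16):
    n = 2 ** m - 1
    print(f"m={m:2d}  H({n}) = {H(n):.3f}  >  m/2 = {m / 2:.1f}")
```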
“Divergent series are the invention of the devil,
and it is a shame to base on them any demonstration whatsoever”
Niels Abel
But H goes to infinity very slowly
H(n) = 1 + 1/2 + 1/3 + 1/4 + 1/5 + …. + 1/n
Then H(1) = 1, H(2) = 1.5, H(3) = 1.833, H(4) = 2.083,
H(10) = 2.93, H(100) = 5.19
The sum grows very slowly as the number of terms
increases:
H(256) = 6.124 but H(1000) = 7.49
H(1,000,000) = 14.39.
When n gets large H(n) only increases as fast as the
natural logarithm of n and approximately
H(n) ≈ 0.577 + logₑ n.
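A short check (Python sketch) of the values quoted above and of the approximation H(n) ≈ 0.577 + logₑ n, with 0.577... the Euler–Mascheroni constant:

```python
import math

def H(n):
    return sum(1.0 / k for k in range(1, n + 1))

gamma = 0.5772156649          # Euler-Mascheroni constant
for n in (1, 2, 3, 4, 10, 100, 256, 1000, 1_000_000):
    print(f"H({n}) = {H(n):.3f}    0.577 + ln(n) = {gamma + math.log(n):.3f}")
```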
Rainfall Records
In year 1 the rainfall must be a record. So the number of record years is
1
In year 2, if the rainfall is independent of year 1, there is a chance of 1/2
of beating the record year 1 rainfall and a chance of 1/2 of not beating it.
So the expected number of record years in the first 2 years is
1 + 1/2
In year 3 there are just two ways in which the 6 possible rankings
(123, 132, 321, 213, 312, 231)
of the rainfall in years 1, 2 and 3 could produce a record in year 3 (ie a 1
in 3 chance). So the expected number of record years after 3 years is
1 + 1/2 + 1/3
If you keep on going, applying the same reasoning to each new year,
you will find that after n independent years the expected number of
record years is the sum :
1 + 1/2 + 1/3 + 1/4 + ... + 1/n = H(n)
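The same result can be seen by simulation. A Monte Carlo sketch (Python; the rainfall in each year is assumed independent and identically distributed):

```python
import random

def record_years(values):
    """Count the running maxima ('record years') in a sequence."""
    best, records = float("-inf"), 0
    for v in values:
        if v > best:
            best, records = v, records + 1
    return records

n, trials = 100, 20000
average = sum(record_years([random.random() for _ in range(n)])
              for _ in range(trials)) / trials
H_n = sum(1.0 / k for k in range(1, n + 1))
print(f"simulated average number of records: {average:.2f}   H({n}) = {H_n:.2f}")
```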
Random Records Are Rare
Suppose that we were to apply our formula to
the rainfall records for some place in the UK
from 1748 to 2004 - a period of 256 years.
Then we predict that we should find only H(256)
= 6.124, or about 6 record years. We would
have to wait for more than a thousand years to
have a good chance of finding even 8 record
years.
“I always thought that record would stand until
it was broken”
Yogi Berra
Central England Temperature Record
H(100) = 5.19
Bunched Traffic
In single-lane traffic, with no overtaking, a slow car will be followed
by a bunch of cars wanting to overtake and go faster. If N cars set
out, how many bunches will form? That is the same as asking how
many record low speeds will be observed, and we know the
answer:
H(N) = 1 + 1/2 + 1/3 +…+ 1/N
Eg… H(1000) = 7.49
The bunches are successively slower, so they will be more widely
spaced. This explains why cars near the exit of a long tunnel tend
to travel faster and in smaller, more widely separated, bunches
than cars near the entrance of the tunnel.
Testing To Destruction
Strength of rth component is Bᵣ
Test the 1st to destruction so we know B₁
Stress the 2nd component to B₁. If OK, B₂ > B₁. If
it breaks, note B₂
Test the 3rd to the min of B₁ and B₂. If it breaks,
note B₃; otherwise move to the 4th
component.
Expected number of broken
components is
H(n) = 1 + 1/2 + 1/3 +….+ 1/n
So with 1000 components you only
break about H(1000) = 7.5 of them to
discover the minimum breaking stress
Variance ≈ H(n) − π²/6
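A simulation sketch of the testing procedure (Python; component strengths assumed independent and identically distributed). The components broken are exactly the running minima of the strength sequence, so the mean should be near H(n) and the variance near H(n) − π²/6:

```python
import math
import random

def broken(strengths):
    # a component breaks only if it is weaker than every one broken so far
    weakest, count = float("inf"), 0
    for s in strengths:
        if s < weakest:
            weakest, count = s, count + 1
    return count

n, trials = 1000, 5000
counts = [broken([random.random() for _ in range(n)]) for _ in range(trials)]
mean = sum(counts) / trials
var = sum((c - mean) ** 2 for c in counts) / trials
H_n = sum(1.0 / k for k in range(1, n + 1))
print(f"mean broken: {mean:.2f}    H(n) = {H_n:.2f}")
print(f"variance:    {var:.2f}    H(n) - pi^2/6 = {H_n - math.pi ** 2 / 6:.2f}")
```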
Collecting Sets
How many cards should you expect to buy in order to collect the set of 50 ?
If All the Cards Exist in Equal Numbers!
1st card: I always need the first card.
2nd card: There is a 49/50 chance that I haven’t already got it.
3rd card: There is a 48/50 chance
and so on. …
After you have got 40 different cards there will be a 10/50 chance that the next
one will be one you haven’t got.
On average you will have to buy another 50/10, or 5 more cards, to get
one that you haven’t already got.
Therefore the total number of cards you will need to buy on average to get the
whole set of 50 will be the sum of 50 terms:
50/50 + 50/49 + 50/48+….+50/3 + 50/2 + 50/1
Each successive term tells you how many extra cards you need to buy to get the
1st, 2nd , 3rd , and so on, missing members of the set of 50 cards. The sum is
50  (1 + 1/2 + 1/3 + ….+ 1/50) = 50 H(50)  225
Collecting Sets of N Cards
On the average we will have to buy a total of
N/N + N/(N−1) + N/(N−2) + ….+ N/2 + N/1 cards
This is
N(1 + ½ + 1/3 +…..+ 1/N) = N H(N) ≈ N × [0.58 + ln(N)]
It’s much harder to complete the second half of the collection than the
first half. The number of cards that you need to buy in order to collect N/2
cards for half a set is only
N/N + N/(N−1) + N/(N−2) + ….+ N/(½N + 1)
≈ N × [ln(N) + 0.58 – ln(N/2) – 0.58] = N ln(2) ≈ 0.7N
I need on average to buy just 35 cards to get half my set of 50.
The standard deviation is about 1.3N, so there is
roughly a 66% chance that the total needed lies within 1.3N of the average.
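The partial sums behind these estimates can be evaluated exactly. A Python sketch for N = 50, comparing the half-set and full-set costs with N ln 2 ≈ 0.7N and N H(N), plus the asymptotic standard deviation πN/√6 ≈ 1.3N quoted above:

```python
import math

N = 50
full_set = sum(N / k for k in range(1, N + 1))           # N/N + ... + N/1
half_set = sum(N / k for k in range(N // 2 + 1, N + 1))  # N/N + ... + N/(N/2 + 1)
print(f"half set: {half_set:.1f} cards   (N ln 2 = {N * math.log(2):.1f})")
print(f"full set: {full_set:.1f} cards   (N H(N))")
print(f"std dev:  {math.pi * N / math.sqrt(6):.1f} cards  (about 1.3 N)")
```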
Swopping
Suppose you have F friends and you all pool cards in
order to build up F+1 sets so that you have one each.
How many cards would you need to do this? When the
number of cards N is large, and you share cards, on
average you need
N × [ln(N) + F ln(lnN) +0.58]
But if you had each collected a set without swopping
you would have needed about (F+1)N[ln(N) + 0.58]
cards to complete F+1 separate sets.
For N = 50 the number of card purchases saved would
be 156F. Even with F = 1 this is a considerable
economy.
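A quick check of that figure (Python sketch), plugging N = 50 into the two approximate formulas on this slide:

```python
import math

N = 50

def shared(F):
    # pooled collecting: N * [ln N + F ln(ln N) + 0.58]
    return N * (math.log(N) + F * math.log(math.log(N)) + 0.58)

def separate(F):
    # F + 1 sets collected independently: (F + 1) * N * [ln N + 0.58]
    return (F + 1) * N * (math.log(N) + 0.58)

for F in (1, 2, 3):
    print(f"F = {F}: about {separate(F) - shared(F):.0f} cards saved (156 F = {156 * F})")
```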
“The Secretary Problem”
N job applicants when N is large
Interview all of them ?? Takes too much time!
Pick one at random (1/N chance of the best) !!
Is there a ‘Goldilocks’ method between these
extremes that gives the best chance of getting
the top candidate quickly?
An Optimal Strategy
See the first C of the N candidates
Keep a note of who is the best
candidate seen so far
Then hire the next candidate you see who
is better than all of them
How should you pick the number C ?
The Simple Case of Three Candidates
• Imagine we have three candidates 1, 2 and 3, where 3 is actually better
than 2, who is better than 1; the six possible orders that we could see
them in are
123
132
213
231
312
321
• If we always take the FIRST candidate we see then we pick the best
one (number 3) in only two of the six interview patterns
• So we would pick the best person with a probability of 2/6, or 1/3.
• If we always let the first candidate go and picked the next one we saw
who had a higher rating, we get the best candidate in the second (132),
third (213), and the fourth cases (231) only, so the chance of getting the
best candidate is now 3/6, or 1/2.
• If we let the first two candidates go and picked the third one we see
with a higher rating we get the best candidate only in the first (123) and
third (213) cases.
• The chance of getting the best one is again only 1/3.
• With 3 candidates the strategy of letting one go and picking the next
with a better rating gives the best chance of getting the best candidate.
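The three-candidate case is easy to verify by brute force. A Python sketch enumerating the six orders and applying the rule "skip the first C, then take the next improvement (or the last candidate if none appears)":

```python
from itertools import permutations

def choice(order, C):
    """Skip the first C candidates, then pick the first later candidate better
    than everything seen so far; settle for the last one otherwise."""
    threshold = max(order[:C]) if C else float("-inf")
    for candidate in order[C:]:
        if candidate > threshold:
            return candidate
    return order[-1]

for C in (0, 1, 2):
    wins = sum(choice(order, C) == 3 for order in permutations((1, 2, 3)))
    print(f"skip {C}: best candidate chosen in {wins} of 6 orders")
```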
The Situation with N Candidates
Note that if the best candidate is in the (r+1)st position and we are
skipping the first r candidates then we will choose the best candidate for
sure, but this situation will only occur with a chance 1/N.
If the best candidate is in the (r+2)nd position we will pick them with a
chance (1/N) × r/(r+1). Carrying on for the higher positions we see that the
overall probability of success is just the sum of all these quantities,
which is
P(N, r) = 1/N [1 + r/(r+1) + r/(r+2) + r/(r+3) + r/(r+4) + ….+ r/(N-1)]
P(N, r)  1/N[1 + r ln[(N-1)/r].
This last quantity, which the series converges towards as N gets large
has its maximum value when the logarithm ln(N-1)/r = e, so e = (N-1)/r 
N/r when N is large.
Hence the maximum value of P(N, r) occurring there is
P  r/N  ln(N/r)  1/e  0.37.
In life we tend to stop searching too soon !
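The exact sum P(N, r) can also be evaluated directly. A Python sketch for N = 100 (the function name is illustrative) locates the maximum near r = N/e with value close to 1/e:

```python
import math

def P(N, r):
    # P(N, r) = (1/N) [1 + r/(r+1) + r/(r+2) + ... + r/(N-1)], valid for r >= 1
    return (1 + sum(r / k for k in range(r + 1, N))) / N

N = 100
best_r = max(range(1, N), key=lambda r: P(N, r))
print(f"best r = {best_r}   (N/e = {N / math.e:.1f})")
print(f"P(N, best r) = {P(N, best_r):.3f}   (1/e = {1 / math.e:.3f})")
```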
The Magic Recipe
• Reject the first C = N/e ≈ 0.37N applicants, then pick the
first one that is better than all of those rejected
• We will find the best one with probability 1/e ≈ 0.37
• Consider the case where we have 100 candidates. The
optimal strategy is to see 37 of them and then pick the
next one that we see who is better than any of them and
then see no one else. This will result in us picking the
best candidate for the job with a probability of about
37% -- quite good compared with the 1% chance if we
had picked a candidate at random
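A Monte Carlo sketch of the recipe (Python; the candidate qualities are assumed to arrive in a uniformly random order), hiring after the first 37 of 100 have been rejected:

```python
import random

def hire(order, C):
    # reject the first C, then hire the next candidate better than all of them,
    # or the final candidate if no one better appears
    threshold = max(order[:C])
    for candidate in order[C:]:
        if candidate > threshold:
            return candidate
    return order[-1]

N, C, trials = 100, 37, 20000
ranks = list(range(N))
wins = 0
for _ in range(trials):
    random.shuffle(ranks)
    wins += hire(ranks, C) == N - 1      # N - 1 is the top-ranked candidate
print(f"best candidate hired in {wins / trials:.1%} of trials (theory: about 37%)")
```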
The Exploration Problem
Lots of jeeps and fuel
How do you go as far as you like using minimum
fuel?
1 jeep goes 1 unit
2 jeeps: both use 1/3 of a tank, then move 1/3 of a tank from jeep 2 to jeep 1;
jeep 2 goes back and jeep 1 goes
1 + 1/3 units
3 jeeps: stop after using 1/5 of a tank. Put 1/5 from jeep 3 into
each of jeeps 1 and 2, which go on as before, with jeep 2 coming
back empty to rejoin jeep 3. They return home, but jeep 1
goes
1 + 1/3 + 1/5 units
4 jeeps allow jeep 1 to go
1+1/3+1/5+1/7 units
N Jeeps and You Can Go Forever
Using N jeeps you can organise
refuelling so that jeep 1 goes a
distance
1 + 1/3 + 1/5 + 1/7 +…+ 1/(2N-1)
By making N large this grows as
½ ln(N)
and gets as large as you like……..
An unlimited supply chain!
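A closing sketch (Python) of the jeep ranges: the distance 1 + 1/3 + ... + 1/(2N − 1) for a few fleet sizes, its ½ ln N growth, and the fleet needed to reach a chosen distance:

```python
import math

def jeep_range(N):
    # distance reached by jeep 1 when N jeeps share fuel as described
    return sum(1.0 / (2 * k - 1) for k in range(1, N + 1))

for N in (1, 2, 3, 4, 10, 100, 1000):
    approx = 0.5 * math.log(N) + math.log(2) + 0.5772 / 2
    print(f"{N:5d} jeeps -> range {jeep_range(N):.3f}   (0.5 ln N + const = {approx:.3f})")

# how many jeeps to reach a target distance?
target, total, N = 3.0, 0.0, 0
while total < target:
    N += 1
    total += 1.0 / (2 * N - 1)
print(f"{N} jeeps are enough to reach a distance of {target}")
```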