
Statistical Physics
The Second Law
Time’s Arrow
• Most macroscopic processes are “irreversible” in everyday life.
 – Glass breaks but does not reform.
 – Coffee cools to room temperature but does not spontaneously heat up.
Probability
• Thermodynamic processes happen in one
direction rather than the other because of
probabilities.
• Systems move toward the most probable
macrostate.
Two State Systems
Counting States
Coin Flips
• All 8 possible “microstates” of a set of three coins.
• H = heads
• T = tails

 Coin 1  Coin 2  Coin 3
   H       H       H
   H       H       T
   H       T       H
   H       T       T
   T       H       H
   T       H       T
   T       T       H
   T       T       T
Coin Flips
• “Macrostates”
 – 3 Heads (1 microstate)
 – 2 Heads (3 microstates)
 – 1 Head (3 microstates)
 – 0 Heads (1 microstate)
• The number of microstates corresponding to a given macrostate is called the multiplicity, Ω.
Coin Flips
• Assuming the coins are fair, all 8 microstates are equally probable.
• Thus the probability of any given macrostate is
 P(macrostate) = Ω/8
 (e.g. P(2 Heads) = 3/8)
100 coins
• The number of microstates for 100 coins is 2^100.
• The number of macrostates is 101.
 – 0 heads, 1 head, 2 heads, …, 100 heads.
• Multiplicities:
 – Ω(0) = 1
 – Ω(1) = 100
100 coins
• To find Ω(2):
 – 100 choices for the 1st coin
 – 99 choices for the 2nd coin
 – Any pair can occur in either order, so divide by 2: Ω(2) = 100·99/2 = 4950.
• To find Ω(3):
 – 100 choices for the 1st coin
 – 99 choices for the 2nd coin
 – 98 choices for the 3rd coin
 – But any triplet has 3 choices for the 1st flip, then 2 choices for the 2nd flip, so divide by 3·2 = 6: Ω(3) = 100·99·98/6 = 161 700.
Combinatorics
• You can probably see the pattern:
 Ω(n) = [100 · 99 · 98 ⋯ (100 − n + 1)] / n!
• or
 Ω(n) = 100! / [n! (100 − n)!]
• For N coins the multiplicity of the macrostate with n heads is
 Ω(N, n) = N! / [n! (N − n)!] = “N choose n”
Combinatorics
• In general the number of ways of arranging N indistinguishable particles among n states is
 Ω(n) = N! / [n! (N − n)!]
• The number of ways of arranging N distinguishable particles among n states is
 Ω(n) = N! / (N − n)!
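The counting rules above can be checked in a few lines of Python; `math.comb` implements exactly N!/(n!(N−n)!):

```python
from math import comb

# Multiplicity of the macrostate with n heads out of N coins:
# Omega(N, n) = N! / (n! (N - n)!)
print(comb(100, 2))   # 100*99/2 = 4950
print(comb(100, 3))   # 100*99*98/(3*2) = 161700

# Sanity check: summing over all macrostates recovers all 2^N microstates.
total = sum(comb(100, n) for n in range(101))
print(total == 2**100)  # True
```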
Statistics to Physics
Microstates, Macrostates and Entropy
The Boltzmann Postulate
• An isolated system in equilibrium is equally likely to be in any of its accessible microstates.
 – Accessible → physically allowed and reachable via some process.
Gas in a box
• A box contains N = 4 identical particles.
• Let us describe the macrostate in terms of the number of particles on the right side of the box, N_R.
• The microstates for each macrostate can be computed using the formulas or by inspection.
Gas in a Box
• The multiplicity of each macrostate is
 Ω(N_R) = (N choose N_R) = N! / [N_R! (N − N_R)!]
• Ω(0) = 1
• Ω(1) = 4
• Ω(2) = 6
• Ω(3) = 4 and Ω(4) = 1 by symmetry
• Ω(All) = 2⁴ = 16 microstates in total
• As each microstate has the same probability (according to the Boltzmann Postulate), the probability of a given macrostate is proportional to its multiplicity.
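The N = 4 multiplicities can also be verified by brute-force enumeration of all 2⁴ microstates; a minimal sketch:

```python
from itertools import product
from collections import Counter

# Each microstate assigns every particle to the left (L) or right (R) side.
microstates = list(product("LR", repeat=4))
print(len(microstates))  # 16

# Macrostate = number of particles on the right side, N_R.
counts = Counter(state.count("R") for state in microstates)
print(sorted(counts.items()))  # [(0, 1), (1, 4), (2, 6), (3, 4), (4, 1)]
```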
The 2nd Law - Again
• The equilibrium state is the most probable macrostate.
• The most probable macrostate is the state with the most
microstates - the largest multiplicity.
• Boltzmann realized that the entropy is maximum at
equilibrium so that there must be a connection between
multiplicity and entropy.
• Boltzmann defined entropy as
 S ≡ k_B ln Ω
The 2nd Law - Again
• The results of our analysis of multiplicity:
Any large system in equilibrium will be found in
the macrostate with the greatest multiplicity
or
Multiplicity tends to increase.
• becomes, via S ≡ k_B ln Ω, the Second Law of Thermodynamics:
Entropy tends to increase.
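To see S = k_B ln Ω at work, here is a sketch computing the dimensionless entropy S/k_B for the 100-coin macrostates; the 50-heads macrostate has the largest entropy, as expected:

```python
from math import comb, log

def entropy(N, n):
    """S / k_B = ln(Omega) for the macrostate with n heads out of N coins."""
    return log(comb(N, n))

# Entropy is largest for the most probable macrostate (n = N/2).
print(entropy(100, 50) > entropy(100, 30) > entropy(100, 10))  # True
print(round(entropy(100, 50), 2))
```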
Temperature and Entropy
Objects in Thermal Contact
• Consider two systems A and B that can exchange thermal energy (heat) back and forth.
• Suppose that initially heat is flowing slowly from A to B.
• As the two systems exchange energy they also change their entropy.
 Solid A → heat → Solid B
Entropy change
• The total entropy change while an infinitesimal amount of heat (dE) flows is
 dS_Total = dS_A + dS_B
• The energy exchange causes the entropy change, so we can write
 dS_Total = (∂S/∂E)_A dE_A + (∂S/∂E)_B dE_B
• But the systems exchange energy, so dE_A = −dE_B:
 dS_Total = [(∂S/∂E)_A − (∂S/∂E)_B] dE_A
Temperature and Entropy
 dS_Total = [(∂S/∂E)_A − (∂S/∂E)_B] dE_A
• Now if the two systems are far from equilibrium, the entropy change of the whole system must be a non-zero positive value as it moves toward maximum entropy.
• If the system is at equilibrium the entropy change must be zero.
• Thus the two systems are in equilibrium when
 (∂S/∂E)_A = (∂S/∂E)_B
Temperature and Entropy
 (∂S/∂E)_A = (∂S/∂E)_B
• But temperature is the quantity we define as being the same when two objects are in equilibrium.
• Therefore we can define
 (∂S/∂E) ≡ 1/T
• where the inverse ensures that heat flows from the higher-temperature object to the lower-temperature object.
Temperature and Entropy
• More formally we define the temperature by
 1/T ≡ (∂S/∂U)_{N,V}
 for a system with a fixed number of particles and a fixed volume.
• The letter U is used for energy to emphasize that it is the internal energy of the system.
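As a numerical illustration of 1/T = (∂S/∂U)_{N,V}, the sketch below uses an Einstein solid (N oscillators sharing q energy quanta of size ε, with Ω = C(q+N−1, q)). That model is an assumption introduced only for this example, not part of the slides above; the point is that the derivative of S with respect to U yields a temperature that rises with energy:

```python
from math import comb, log

def S(N, q):
    """Entropy in units of k_B: S/k_B = ln(Omega), Omega = C(q + N - 1, q).
    (Einstein-solid multiplicity, assumed for illustration.)"""
    return log(comb(q + N - 1, q))

def T(N, q, eps=1.0):
    """Temperature in units of eps/k_B from 1/T = dS/dU, with U = q * eps.
    The derivative is taken as a centered finite difference in q."""
    dS_dq = (S(N, q + 1) - S(N, q - 1)) / 2.0
    return eps / dS_dq

N = 50
temps = [T(N, q) for q in (10, 20, 40, 80)]
print(all(t2 > t1 > 0 for t1, t2 in zip(temps, temps[1:])))  # True: T rises with U
```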
Statistical Physics
Partition Function and the Boltzmann
Factor
Isolated vs. Constant Temperature Systems
• Our previous work has been with isolated systems
but what about systems at constant temperature?
• What is the probability of finding a system in a
particular energy state at a given temperature?
 System A exchanges energy with a temperature reservoir B.
One big system and one small system
• If we can cast the problem in terms of an isolated system
we can use the work we have already developed.
• We can treat the combined system as an isolated system if
system A is much smaller than system B so that the
temperature of B is unaffected by the exchange of energy
with A.
• Let’s choose system A to be a single atom.
Probability ratios
• Let’s look at the ratio of probabilities for two of the atom’s microstates: s1 and s2.
• The energies and probabilities of these states are written as E(s1), E(s2) and P(s1), P(s2).
Probability ratios
• For an isolated system each accessible state is equally likely, but the atom is not an isolated system.
• The atom + reservoir is an isolated system, and the reservoir has a much larger number of microstates than the atom, so
 P(s2)/P(s1) = Ω_R(s2)/Ω_R(s1)
• where Ω_R(s) is the multiplicity of the reservoir when the atom is in state s.
Probability ratios
• Using the relationship S ≡ k ln Ω, i.e. Ω = e^{S/k},
• we can write
 P(s2)/P(s1) = e^{[S_R(s2) − S_R(s1)]/k}
Entropy and Energy
• We previously proved the thermodynamic identity
 dS = (1/T) dU + (P/T) dV − (μ/T) dN
• so for dN = 0 and dV = 0
 dS = dU/T
Energy and Entropy
• The difference in entropies of the reservoir corresponding to the two atomic states is
 S_R(s2) − S_R(s1) = [E_R(s2) − E_R(s1)]/T
• The difference in energy of the reservoir is just minus the difference between the two atomic states:
 E_R(s2) − E_R(s1) = −[E(s2) − E(s1)]
Probabilities
• Returning to our ratio of probabilities,
 P(s2)/P(s1) = e^{[S_R(s2) − S_R(s1)]/k}
• becomes
 P(s2)/P(s1) = e^{−[E(s2) − E(s1)]/kT}
The Boltzmann Factor
• This means that the probability of finding a system in energy state s at temperature T is proportional to the Boltzmann factor:
 P(s) ∝ e^{−E(s)/kT}
• Now this just gives a proportionality; to get the total probability we have to normalize.
The partition function
• The normalized probability is
 P(s) = e^{−E(s)/kT} / Z
• where Z is called the partition function and is given by
 Z = Σ_s e^{−E(s)/kT}
The Most Important Equation
• The most important equation in classical statistical mechanics is
 Z = Σ_s e^{−E(s)/kT}
• Note: the sum is over all states of the system, not all energies.
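For a concrete example, take a hypothetical two-level system with energies 0 and ε (the level spacing is an assumption for illustration); the partition function and normalized probabilities follow directly:

```python
from math import exp

kT = 1.0               # work in units where epsilon/kT is the only parameter
energies = [0.0, 1.0]  # hypothetical two-level system: E = 0 and E = epsilon

Z = sum(exp(-E / kT) for E in energies)     # Z = sum over states
P = [exp(-E / kT) / Z for E in energies]    # normalized probabilities

print(abs(sum(P) - 1.0) < 1e-12)            # True: probabilities sum to 1
print(abs(P[1] / P[0] - exp(-1.0)) < 1e-12) # True: ratio is the Boltzmann factor
```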
Using the Boltzmann Factor
Averages
• If we have a system of N particles at temperature T, then the number of particles in state s, N(s), is given by
 N(s) = N P(s) = (N/Z) e^{−E(s)/kT}
Averages
• We can use this to find the average properties of a system:
 ⟨X⟩ = Σ_s X(s) P(s)
Averages
• The average energy is
 ⟨E⟩ = Σ_s E(s) P(s)
• The internal energy of the system is
 U = N ⟨E⟩
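The averaging rule can be sketched for a hypothetical three-level system (level energies and N chosen arbitrarily for illustration), checking that summing E(s) N(s) over states reproduces U = N⟨E⟩:

```python
from math import exp

kT = 2.0
energies = [0.0, 1.0, 3.0]   # hypothetical level energies
N = 1000                     # number of particles

Z = sum(exp(-E / kT) for E in energies)
P = [exp(-E / kT) / Z for E in energies]

avg_E = sum(E * p for E, p in zip(energies, P))   # <E> = sum_s E(s) P(s)
U = N * avg_E                                     # internal energy U = N <E>
Ns = [N * p for p in P]                           # occupation numbers N(s)

print(abs(sum(Ns) - N) < 1e-9)                              # True
print(abs(sum(E * n for E, n in zip(energies, Ns)) - U) < 1e-9)  # True
```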
Example: Height and density
• The ratio of the number of particles at two different heights in a gas at uniform temperature is
 N(z2)/N(z1) = e^{−[E(z2) − E(z1)]/kT}
• but the difference in energy at different heights is just mg∆z, so
 N(z2)/N(z1) = e^{−mg∆z/kT}
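Plugging in numbers makes the scale concrete; here a nitrogen molecule at room temperature over a 1 km height difference (the specific values are assumptions for illustration):

```python
from math import exp

k = 1.381e-23      # Boltzmann constant, J/K
m = 28 * 1.66e-27  # mass of an N2 molecule, kg (assumed for illustration)
g = 9.8            # gravitational acceleration, m/s^2
T = 300.0          # temperature, K
dz = 1000.0        # height difference, m

ratio = exp(-m * g * dz / (k * T))   # N(z2)/N(z1) = exp(-mg dz / kT)
print(round(ratio, 2))               # roughly 0.9: density falls ~10% per km
```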
Energy distributions
Energy vs. States
• It is common to express the probability in terms of energies rather than states:
 P(s) → P(E)
• If the energies are not degenerate no change in the expression is required, but if there is more than one state with the same energy the expression changes.
Energy vs. States
• In terms of states the normalized probability is
 P(s) = e^{−E(s)/kT} / Z
• In terms of energies it is
 P(E) = g(E) e^{−E/kT} / Σ_E g(E) e^{−E/kT}
Degeneracy
• g(E) is the degeneracy of energy level E: the number of states with the energy E.
• The partition function can also be rewritten in terms of the energy:
 Z = Σ_E g(E) e^{−E/kT}
Sum to Integral
• In many systems the spacing of energy levels is much smaller than the typical particle energy. (Certainly true in classical systems.)
• In this case the sums over energy can be replaced by integrals:
 Σ_E → ∫ dE
Sum to Integral
• The degeneracy g(E) needs to be replaced by a continuous quantity called the density of states, D(E).
• D(E) dE = the number of states with energies between E and E + dE.
• The probability density is then
 P(E) dE = D(E) e^{−E/kT} dE / ∫₀^∞ D(E) e^{−E/kT} dE
• where P(E) dE is the probability that the system energy is between E and E + dE.
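As a numerical sketch, take D(E) ∝ √E (the free-particle density of states in three dimensions, an assumption not derived in these slides) and evaluate the normalization and average-energy integrals with a simple Riemann sum; the mean energy comes out to (3/2)kT:

```python
from math import sqrt, exp

kT = 1.0                    # work in units of kT
E_max, n = 50.0, 200_000    # truncate the integral well beyond kT
dE = E_max / n

num = den = 0.0
for i in range(1, n + 1):
    E = i * dE
    w = sqrt(E) * exp(-E / kT)   # D(E) * Boltzmann factor, with D(E) ~ sqrt(E)
    den += w * dE                # normalization integral
    num += E * w * dE            # numerator of <E>

print(round(num / den, 3))       # approximately 1.5, i.e. <E> = (3/2) kT
```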
Quantum Statistics
• For classical systems, and for systems where kT >> E_F (for fermions), the Boltzmann distribution is valid, but for very low temperatures we need to replace the Boltzmann distribution with the appropriate quantum distribution:
 P(E) dE = D(E) f(E) dE / ∫₀^∞ D(E) f(E) dE
• where
 f(E) = 1/(e^{(E − E_F)/kT} + 1)  for fermions
 f(E) = 1/(e^{(E − μ)/kT} − 1)  for bosons
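The two occupancy functions are easy to explore numerically; a sketch with energies in units of kT (the values of E_F and μ are chosen arbitrarily for illustration):

```python
from math import exp

def fermi(E, E_F, kT):
    """Fermi-Dirac occupancy: 1 / (exp((E - E_F)/kT) + 1)."""
    return 1.0 / (exp((E - E_F) / kT) + 1.0)

def bose(E, mu, kT):
    """Bose-Einstein occupancy: 1 / (exp((E - mu)/kT) - 1), valid for E > mu."""
    return 1.0 / (exp((E - mu) / kT) - 1.0)

# At E = E_F the Fermi-Dirac occupancy is exactly 1/2 at any temperature.
print(fermi(1.0, 1.0, 0.1))  # 0.5

# Far above E_F (or mu) both reduce to the Boltzmann factor e^{-(E - E_F)/kT}.
print(abs(fermi(10.0, 1.0, 1.0) - exp(-9.0)) < 1e-7)  # True
```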
Heat capacity
Definition of Heat Capacity
• The heat capacity of a system is defined as
 C = dU/dT
• Experimentally the heat capacity is the amount of heat added to the system when the temperature is raised by one unit of temperature.
Degrees of Freedom
• The temperature dependence of the internal energy of a system can be determined by using the equipartition theorem.
• Monatomic gas:
 U = (3/2) NkT → C = (3/2) Nk
• Diatomic gas:
 U = (7/2) NkT → C = (7/2) Nk
• Elemental solid:
 U = (6/2) NkT = 3NkT → C = 3Nk
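The equipartition results above can be checked by differentiating U(T) numerically; C = dU/dT comes out constant and equal to (f/2)Nk, with f = 3, 7, 6 for the three cases (a sketch using one mole of particles):

```python
k = 1.381e-23   # Boltzmann constant, J/K
N = 6.022e23    # one mole of particles

def U(T, f):
    """Equipartition internal energy: U = (f/2) N k T for f quadratic degrees of freedom."""
    return 0.5 * f * N * k * T

def C(T, f, dT=0.01):
    """Heat capacity C = dU/dT via a centered finite difference."""
    return (U(T + dT, f) - U(T - dT, f)) / (2 * dT)

for name, f in [("monatomic gas", 3), ("diatomic gas", 7), ("elemental solid", 6)]:
    print(name, round(C(300.0, f) / (N * k), 2))  # C/Nk = f/2: 1.5, 3.5, 3.0
```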
Quantum Effects
• Diatomic molecules
Quantum Effects
• Solids