LECTURE: “Spontaneity, Entropy and Free Energy” (Chapter 10)

• Spontaneity, Disorder & Probability
• Statistical Definition of Entropy
• Thermodynamic Definition of Entropy
• Expansion and Compression of an Ideal Gas
Spontaneous Processes and Entropy (Section 10.1)
• Spontaneous processes proceed without outside intervention, i.e., naturally.
• Most processes in nature have a natural direction.
• But what determines that direction?
• Most exothermic reactions are spontaneous.
• But some endothermic processes are also spontaneous.
  Ex.: ice melting at room temperature.
• The driving force for a spontaneous process is an increase in the randomness/disorder (entropy) of the Universe.
• Entropy: a thermodynamic function that measures randomness or disorder.
Spontaneity, Entropy & Free Energy - II
• The First Law of Thermodynamics tells us about energy flow:
  ΔE = q + w
  (A small worked example follows this slide.)
• We can perform "energy bookkeeping" to find out:
  i) which processes are allowed,
  ii) how much energy is involved,
  iii) in what form the energy appears,
  iv) where the energy goes.
• But... "energy bookkeeping" does not tell us why changes occur; neither do kinetics nor equilibria.
• The Second Law of Thermodynamics examines why changes occur and why certain processes occur spontaneously while others do not.

Concept of Entropy
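As a minimal sketch of the "energy bookkeeping" idea above (not part of the original slide), here is the First Law, ΔE = q + w, applied to made-up numbers; the sign convention and the values of q and w are illustrative assumptions.

```python
# Minimal sketch of First-Law "energy bookkeeping": delta_E = q + w.
# Sign convention assumed here: q > 0 means heat flows INTO the system,
# w > 0 means work is done ON the system. The numbers are illustrative only.

def delta_E(q: float, w: float) -> float:
    """Change in internal energy of the system, in joules."""
    return q + w

if __name__ == "__main__":
    q = 500.0    # J of heat absorbed by the system (hypothetical value)
    w = -200.0   # J: the system does 200 J of work on the surroundings (hypothetical)
    print(f"dE = q + w = {q} + ({w}) = {delta_E(q, w)} J")   # -> 300.0 J
```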
Spontaneity, Entropy & Free Energy – Examples of Spontaneous Processes
[Figure: examples of spontaneous processes, each proceeding toward increasing disorder]
Spontaneous Processes and Entropy (III)
• Entropy: a measure of randomness or disorder.
• The entropy of an entire (isolated) system can never decrease; if it changes at all, it must increase.
• Applying this deduction universally leads to the idea that all matter is becoming more disordered, i.e.,
  Order → Disorder
• As the inventor of the concept, Rudolf Clausius, put it (in 1865), "the entropy of the Universe tends to a maximum."
Statistical Definition of Entropy (Boltzmann)
• Entropy describes the number of possible arrangements (or positions) that are available to a system.
• Thus, entropy relates to the probability of finding the system in a particular configuration, or "microstate":
  S = kB ln Ω
• Boltzmann's constant: kB = R / NA ≈ 1.38 × 10⁻²³ J/K
• Ω = number of microstates corresponding to a given state of the system.
• Nature spontaneously proceeds towards states that have the highest probabilities of existing.
  (A short numerical sketch follows this slide.)
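A minimal numerical sketch of Boltzmann's formula S = kB ln Ω, using the SI values of R and NA quoted above; the Ω values chosen for the loop are purely illustrative.

```python
import math

R = 8.314          # gas constant, J/(mol K)
N_A = 6.022e23     # Avogadro's number, 1/mol
k_B = R / N_A      # Boltzmann's constant, ~1.38e-23 J/K

def boltzmann_entropy(omega: float) -> float:
    """S = kB * ln(omega), in J/K, for omega microstates."""
    return k_B * math.log(omega)

# One microstate gives S = 0; doubling omega adds kB * ln(2) to S.
for omega in (1, 2, 4, 1e6):
    print(f"Omega = {omega:>9}:  S = {boltzmann_entropy(omega):.3e} J/K")
```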
Statistical Definition of Entropy (II)
• Suppose we have two identical flasks connected by a narrow neck.
• If we place four molecules into this system, what is the probability that all four will be present in the left-hand flask?
  ½ × ½ × ½ × ½ = 1/16 = 0.0625
Statistical Definition of Entropy (III)
[Table: distributions of the four molecules between the two flasks]
  Arrangement (left : right)   Microstates Ω   Probability
  4 : 0                        1               1/16
  3 : 1                        4               4/16
  2 : 2                        6               6/16
  (Entropy increases down the table, with Ω.)
• Entropy is related to the number of possible microstates.
• The more microstates (i.e., the greater the probability), the greater the entropy.
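A hedged sketch tying the table to S = kB ln Ω: it counts how many of the 16 microstates give each left/right split and the corresponding (tiny) entropies. The split labels (4:0, 3:1, 2:2) are inferred from the four-molecule example on the previous slide.

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, J/K

# Microstates with n_left of the 4 molecules in the left flask: C(4, n_left).
for n_left in (4, 3, 2):
    omega = math.comb(4, n_left)         # 1, 4, 6
    prob = omega / 2**4                  # 1/16, 4/16, 6/16
    S = k_B * math.log(omega)            # S = kB * ln(omega)
    print(f"{n_left}:{4 - n_left} split   Omega = {omega}   P = {prob:.4f}   S = {S:.2e} J/K")
```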
Thermodynamic Definition of Entropy
• Natural progression is linked to an increase in randomness or disorder.
• Heat is the transfer of energy that results in an increase in the random motion of particles.
  Entropy ∝ Heat?
[Figure: a generator does work on a gas; the work is transformed into heat (q), which raises the random motion (entropy) of the gas molecules.]
  Thus, ΔS ∝ q   ... (1)
Thermodynamic Definition of Entropy (II)
• Transfer of heat occurs spontaneously from a hot body to a cold body.
[Figure: heat transfers from the hot solution to the cold solution; spontaneous processes are associated with an increase in entropy.]
• The entropy increase is larger if the heat flows into a colder body. Thus, ΔS ∝ 1/T   ... (2)
• Combining (1) and (2):
  ΔS ∝ q/T
  (A short numerical sketch follows this slide.)
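A minimal numerical sketch linking ΔS ≈ q/T back to the ice-melting example from the first slide. It assumes the molar heat of fusion of water, about 6.01 kJ/mol at 273 K, and uses the heat of a reversible, constant-temperature transfer.

```python
# Entropy change for melting one mole of ice at its melting point,
# using dS = q_rev / T (reversible heat transfer at constant temperature).
q_fusion = 6.01e3     # J/mol, molar enthalpy of fusion of water (approximate)
T_melt   = 273.15     # K

delta_S = q_fusion / T_melt
print(f"dS(fusion) ~ {delta_S:.1f} J/(mol K)")   # ~ 22 J/(mol K)
```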
A spontaneous process always occurs
A) Quickly
B) Slowly
C) without outside intervention
D) Exothermically
E) Endothermically