Lecture 26 Cauchy’s Theorem
Last time, we introduced the notion of the differential of a function of two variables f(x, y), namely

df = (∂f/∂x) dx + (∂f/∂y) dy.
We said that a complex-valued function of a complex variable is analytic if

df = f′(z) dz,

i.e. the differential of f is a complex multiple of dz.
Last time, we asked the following question. Given an analytic function f(z), when can we find an analytic antiderivative, a function F(z) so that

dF = f(z) dz?
In thinking about how to answer this, we can first ask the question: given a differential form

ω = u(x, y) dx + v(x, y) dy,

when can we find F(x, y) so that

dF = ω?

[We’ll assume that u(x, y) and v(x, y) are continuous and everywhere differentiable.]
Thus we are trying to solve the equations

∂F/∂x = u,
∂F/∂y = v,

since, of course,

dF = (∂F/∂x) dx + (∂F/∂y) dy.
Now these are two equations for only one unknown function, so we should not expect this always to be possible.
Fact Let F be a function of two variables whose first partial derivatives have continuous first partial derivatives. Then

∂/∂y (∂F/∂x) = ∂/∂x (∂F/∂y).
We refer to the quantity on both sides of this equation as ∂²F/∂x∂y, the mixed second partial derivative of F.
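As a quick numerical sanity check of the Fact (the test function F and the step sizes below are illustrative choices, not from the lecture), we can evaluate the symmetric difference quotient with the two step sizes shrunk at different rates and compare both against the exact mixed partial:

```python
import math

def F(x, y):
    # an arbitrary smooth test function (illustrative choice)
    return x**3 * y**2 + math.sin(x * y)

def mixed(f, x, y, h, k):
    # the symmetric difference quotient for the mixed partial:
    # [F(x+h, y+k) - F(x+h, y) - F(x, y+k) + F(x, y)] / (h k)
    return (f(x + h, y + k) - f(x + h, y) - f(x, y + k) + f(x, y)) / (h * k)

# shrinking h much faster than k (or vice versa) mimics taking the
# two limits in either order; both approach the same value
a = mixed(F, 1.0, 2.0, 1e-4, 1e-6)
b = mixed(F, 1.0, 2.0, 1e-6, 1e-4)

# exact value of d2F/dxdy at (1, 2): 6x^2 y + cos(xy) - xy sin(xy)
exact = 12.0 + math.cos(2.0) - 2.0 * math.sin(2.0)
assert abs(a - exact) < 1e-2 and abs(b - exact) < 1e-2
```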
How would we go about proving the fact? We note that the left hand side of the equality is given by

∂/∂y (∂F/∂x) = lim_{k→0} lim_{h→0} [F(x+h, y+k) − F(x+h, y) − F(x, y+k) + F(x, y)] / (hk),

while the right hand side is given by

∂/∂x (∂F/∂y) = lim_{h→0} lim_{k→0} [F(x+h, y+k) − F(x+h, y) − F(x, y+k) + F(x, y)] / (hk).
The only difference between the two is the order in which the limits are taken. A careful ε-δ argument shows they are the same. Thus if we have ∂F/∂x = u and ∂F/∂y = v, then we must have

∂u/∂y = ∂v/∂x.
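This necessary condition can be tested numerically. Here is a minimal sketch with two sample forms (both examples are my own illustrations, not from the lecture): y dx + x dy satisfies the condition (it is dF for F = xy), while −y dx + x dy does not.

```python
# finite-difference checks of the condition du/dy = dv/dx
def du_dy(u, x, y, h=1e-6):
    # central difference in y
    return (u(x, y + h) - u(x, y - h)) / (2 * h)

def dv_dx(v, x, y, h=1e-6):
    # central difference in x
    return (v(x + h, y) - v(x - h, y)) / (2 * h)

# omega1 = y dx + x dy passes: F(x, y) = x*y works as an antiderivative
u1, v1 = (lambda x, y: y), (lambda x, y: x)
assert abs(du_dy(u1, 1.0, 2.0) - dv_dx(v1, 1.0, 2.0)) < 1e-6

# omega2 = -y dx + x dy fails the condition (-1 vs +1): no F exists
u2, v2 = (lambda x, y: -y), (lambda x, y: x)
assert abs(du_dy(u2, 1.0, 2.0) - dv_dx(v2, 1.0, 2.0)) > 1.9
```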
We would like to see that the necessary condition is also sufficient. By the single variable fundamental theorem of calculus, we can solve ∂F/∂x = u by

F(x, y) = ∫_{x_0}^{x} u(s, y) ds + C_1(y),

and similarly we can solve ∂F/∂y = v by

F(x, y) = ∫_{y_0}^{y} v(x, t) dt + C_2(x).
All this works out if

G(x, y) = ∫_{x_0}^{x} u(s, y) ds − ∫_{y_0}^{y} v(x, t) dt

is the difference between a function of x and a function of y. To verify this, we just need to check that the second mixed partial vanishes. We calculate

∂G/∂x = u(x, y) − ∫_{y_0}^{y} (∂v/∂x)(x, t) dt
       = u(x, y) − ∫_{y_0}^{y} (∂u/∂y)(x, t) dt
       = u(x, y_0).
Taking the derivative in y, we calculate

∂²G/∂x∂y = 0,

as desired. Thus everything works. We can find the antiderivative F as long as u and v satisfy the consistency condition

∂u/∂y = ∂v/∂x,

and as long as all derivatives are defined and continuous on the rectangle with opposite corners (x_0, y_0) and (x, y).
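The construction can be carried out numerically. The sketch below picks a sample pair u = 2xy, v = x² (which satisfies ∂u/∂y = 2x = ∂v/∂x, with F = x²y), base point (0, 0), and a simple trapezoid-rule integrator; all of these choices are illustrative assumptions.

```python
def integrate(f, a, b, n=1000):
    # composite trapezoid rule for a one-variable integral
    h = (b - a) / n
    s = 0.5 * (f(a) + f(b)) + sum(f(a + i * h) for i in range(1, n))
    return s * h

u = lambda x, y: 2 * x * y   # dF/dx
v = lambda x, y: x * x       # dF/dy; note du/dy = 2x = dv/dx

def F(x, y, x0=0.0, y0=0.0):
    # build F by integrating u along the line y = y0, then v up to (x, y)
    return integrate(lambda s: u(s, y0), x0, x) + integrate(lambda t: v(x, t), y0, y)

# F should agree with x**2 * y (up to an additive constant)
assert abs(F(2.0, 3.0) - 12.0) < 1e-4
```

One can also verify directly, with finite differences, that ∂F/∂x ≈ u and ∂F/∂y ≈ v at sample points.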
So how does this work for f(z) dz when

f(x + iy) = u(x, y) + iv(x, y)

is analytic? We calculate

f(z) dz = (u + iv)(dx + i dy) = (u + iv) dx + (−v + iu) dy.
We check the consistency condition

∂u/∂y + i ∂v/∂y = −∂v/∂x + i ∂u/∂x.
These are the same by the Cauchy-Riemann equations and so everything works. We can find F(z) so that

dF = f(z) dz.

By definition, F(z) is analytic and an antiderivative of f.
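A compact way to see the Cauchy-Riemann condition numerically: for an analytic f, the difference quotient has the same limit whether the increment approaches 0 along the real axis or the imaginary axis. A minimal sketch, using the sample function f(z) = z³ (an illustrative choice):

```python
# both one-sided limits of the difference quotient should approximate f'(z0)
f = lambda z: z * z * z       # sample analytic function (illustrative)
z0 = 1.0 + 2.0j
h = 1e-6

along_x = (f(z0 + h) - f(z0 - h)) / (2 * h)            # real increment
along_y = (f(z0 + 1j * h) - f(z0 - 1j * h)) / (2j * h) # imaginary increment

# both approximate f'(z0) = 3*z0**2 = -9 + 12i
assert abs(along_x - along_y) < 1e-6
assert abs(along_x - (-9 + 12j)) < 1e-4
```

A non-analytic function such as f(z) = conjugate(z) would fail this test: the two directional quotients give 1 and −1.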
Now how (and on what) can we integrate an expression like dF ?
We integrate on a curve, a complex valued function α on a real interval [a, b]. We
view α geometrically as a curve between α(a) and α(b).
We define

∫_α dF = ∫_a^b f(α(t)) α′(t) dt,

where the multiplication, of course, is complex multiplication. We observe, since dF = f(z) dz, that the integral becomes

∫_a^b (d/dt) F(α(t)) dt = F(α(b)) − F(α(a)).
This is rather amazing. For analytic functions, integration on a curve is just like one-variable integration of a real function. You don’t care about the path of the curve. The integral is the difference of the values of the antiderivative at the endpoints.
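Path independence is easy to test numerically. A minimal sketch, taking the sample function f(z) = z² and two paths from 0 to 1 + i (a straight segment and a parabolic arc; both paths and the Riemann-sum integrator are illustrative choices):

```python
def contour_integral(f, alpha, dalpha, a=0.0, b=1.0, n=2000):
    # midpoint Riemann sum approximating the integral of f(alpha(t)) alpha'(t) dt
    h = (b - a) / n
    total = 0j
    for i in range(n):
        t = a + (i + 0.5) * h
        total += f(alpha(t)) * dalpha(t) * h
    return total

f = lambda z: z * z
end = 1.0 + 1.0j

# path 1: straight segment from 0 to 1+i
p1, dp1 = (lambda t: t * end), (lambda t: end)
# path 2: parabolic arc t + i t^2 with the same endpoints
p2, dp2 = (lambda t: t + 1j * t * t), (lambda t: 1 + 2j * t)

# antiderivative F(z) = z**3 / 3, so both integrals should equal F(1+i) - F(0)
expected = end**3 / 3
assert abs(contour_integral(f, p1, dp1) - expected) < 1e-4
assert abs(contour_integral(f, p2, dp2) - expected) < 1e-4
```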
Usually, this is stated in terms of closed curves, those where α(a) = α(b). We arrive at

Cauchy’s theorem Let f be analytic with continuous derivative on a rectangle R. Let α be a closed curve lying in the rectangle R. Then

∫_α f(z) dz = 0.
The catch here is that the function must be analytic on a rectangle containing the curve for our argument to work. What happens when that isn’t the case? We consider a central example.
Suppose we want to integrate dz/z. Does 1/z have an antiderivative? It should be log z. Recall that

log z = log |z| + iθ,

where

θ = arcsin( y / √(x² + y²) ).
The problem is that θ doesn’t have a continuous value on any rectangle containing the origin. (Alternatively, the problem is that 1/z is not analytic at 0.) So let’s integrate dz/z on the closed curve

α(t) = e^{2πit},

defined on [0, 1]. We get
∫_α dz/z = ∫_0^1 (2πi e^{2πit} / e^{2πit}) dt = ∫_0^1 2πi dt = 2πi.
Why? It is because θ changes by 2π as we go around the circle. The integral measures
how many times the curve winds around the singularity at 0. Next time, we will use this
idea to discover one more nice property of analytic functions.
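The winding-number interpretation can be checked numerically: traversing the unit circle m times, i.e. α(t) = e^{2πimt} on [0, 1], should give 2πim. A minimal sketch (the Riemann-sum approximation is an illustrative choice):

```python
import cmath, math

def winding_integral(m, n=4000):
    # midpoint Riemann sum of f(alpha(t)) * alpha'(t) on [0, 1]
    # with f(z) = 1/z and alpha(t) = exp(2*pi*i*m*t)
    total = 0j
    h = 1.0 / n
    for i in range(n):
        t = (i + 0.5) * h
        a = cmath.exp(2j * math.pi * m * t)
        da = 2j * math.pi * m * a      # alpha'(t)
        total += (1 / a) * da * h
    return total

# winding once gives 2*pi*i; winding twice gives 4*pi*i
assert abs(winding_integral(1) - 2j * math.pi) < 1e-9
assert abs(winding_integral(2) - 4j * math.pi) < 1e-9
```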