14.3: Partial Derivatives, 14.4: The Chain Rule, and 14.5: Directional Derivatives and Gradient Vectors
TA: Sam Fleischer
November 10

Section 14.3: Partial Derivatives

Definition: Partial Derivatives with respect to x and y
The partial derivative of f(x, y) with respect to x at the point (x_0, y_0) is
\[
f_x = \frac{\partial f}{\partial x}\bigg|_{(x_0, y_0)} = \frac{d}{dx} f(x, y_0)\bigg|_{x = x_0} = \lim_{h \to 0} \frac{f(x_0 + h, y_0) - f(x_0, y_0)}{h}.
\]
The partial derivative of f(x, y) with respect to y at the point (x_0, y_0) is
\[
f_y = \frac{\partial f}{\partial y}\bigg|_{(x_0, y_0)} = \frac{d}{dy} f(x_0, y)\bigg|_{y = y_0} = \lim_{h \to 0} \frac{f(x_0, y_0 + h) - f(x_0, y_0)}{h}.
\]

Example: Taking Partial Derivatives
\[
f(x, y) = y \sin(xy)
\]
To find f_x (the partial derivative of f with respect to x), treat y as a constant:
\[
f_x = y \cos(xy) \cdot y = y^2 \cos(xy).
\]
To find f_y (the partial derivative of f with respect to y), treat x as a constant:
\[
f_y = (1)\sin(xy) + y \cos(xy) \cdot x = \sin(xy) + xy \cos(xy).
\]

Example: Implicit Differentiation
Find ∂z/∂x if the equation
\[
yz - \ln z = x + y
\]
defines z as a function of the two independent variables x and y and the partial derivative exists.
\[
\frac{\partial}{\partial x}(yz - \ln z) = \frac{\partial}{\partial x}(x + y)
\]
\[
y\frac{\partial z}{\partial x} - \frac{1}{z}\frac{\partial z}{\partial x} = 1 + 0
\]
\[
\left(y - \frac{1}{z}\right)\frac{\partial z}{\partial x} = 1
\]
\[
\frac{\partial z}{\partial x} = \frac{z}{yz - 1}
\]

Definition: Second-Order Partial Derivatives
When we differentiate a function f(x, y) twice, we produce its second-order derivatives. Notation:
\[
\frac{\partial^2 f}{\partial x^2} \text{ or } f_{xx}, \qquad \frac{\partial^2 f}{\partial y^2} \text{ or } f_{yy},
\]
\[
\frac{\partial^2 f}{\partial x\,\partial y} \text{ or } f_{yx}, \qquad \text{and} \qquad \frac{\partial^2 f}{\partial y\,\partial x} \text{ or } f_{xy}.
\]
The defining equations are
\[
\frac{\partial^2 f}{\partial x^2} = \frac{\partial}{\partial x}\left(\frac{\partial f}{\partial x}\right), \qquad \frac{\partial^2 f}{\partial x\,\partial y} = \frac{\partial}{\partial x}\left(\frac{\partial f}{\partial y}\right).
\]
Note that f_yx = ∂²f/(∂x ∂y) means you first take the derivative with respect to y, and then take the derivative with respect to x. In general, f_xy ≠ f_yx.

Theorem 2: Mixed Derivative Theorem
If f(x, y) and its partial derivatives f_x, f_y, f_xy, and f_yx are defined throughout an open region containing a point (a, b) and are all continuous at (a, b), then
\[
f_{xy}(a, b) = f_{yx}(a, b).
\]

Example: Partial Derivatives of Higher Order
Find f_yxyz if f(x, y, z) = 1 - 2xy²z + x²y.
\[
f(x, y, z) = 1 - 2xy^2 z + x^2 y
\]
\[
f_y = -4xyz + x^2
\]
\[
f_{yx} = -4yz + 2x
\]
\[
f_{yxy} = -4z
\]
\[
f_{yxyz} = -4
\]

Definition: Differentiability
A function z = f(x, y) is differentiable at (x_0, y_0) if f_x(x_0, y_0) and f_y(x_0, y_0) exist and Δz satisfies an equation of the form
\[
\Delta z = f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y + \varepsilon_1 \Delta x + \varepsilon_2 \Delta y
\]
in which each of ε_1, ε_2 → 0 as both Δx, Δy → 0. We call f differentiable if it is differentiable at every point in its domain, and say that its graph is a smooth surface.

Theorem 3: The Increment Theorem for Functions of Two Variables
Suppose that the first partial derivatives of f(x, y) are defined throughout an open region R containing the point (x_0, y_0) and that f_x and f_y are continuous at (x_0, y_0). Then the change
\[
\Delta z = f(x_0 + \Delta x, y_0 + \Delta y) - f(x_0, y_0)
\]
in the value of f that results from moving from (x_0, y_0) to another point (x_0 + Δx, y_0 + Δy) in R satisfies an equation of the form
\[
\Delta z = f_x(x_0, y_0)\,\Delta x + f_y(x_0, y_0)\,\Delta y + \varepsilon_1 \Delta x + \varepsilon_2 \Delta y
\]
in which each of ε_1, ε_2 → 0 as both Δx, Δy → 0.

Corollary of Theorem 3
If the partial derivatives f_x and f_y of a function f(x, y) are continuous throughout an open region R, then f is differentiable at every point of R.

Theorem 4: Differentiability Implies Continuity
If a function f(x, y) is differentiable at (x_0, y_0), then f is continuous at (x_0, y_0).
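The computations in the examples above are easy to double-check with a computer algebra system. The short Python/SymPy sketch below is an illustrative verification, not part of the original handout; it recomputes f_x and f_y for f(x, y) = y sin(xy), the higher-order derivative f_yxyz, and ∂z/∂x for the implicit example (using the formula ∂z/∂x = −F_x/F_z that is stated in Section 14.4 below).

```python
# A quick SymPy verification of the Section 14.3 examples (an
# illustrative sketch, not part of the original handout).
import sympy as sp

x, y, z = sp.symbols('x y z')

# Example: f(x, y) = y*sin(x*y)
f = y * sp.sin(x * y)
print(sp.diff(f, x))              # y**2*cos(x*y)           -> matches f_x
print(sp.expand(sp.diff(f, y)))   # x*y*cos(x*y) + sin(x*y) -> matches f_y

# Example: higher-order partials of g(x, y, z) = 1 - 2*x*y**2*z + x**2*y
g = 1 - 2*x*y**2*z + x**2*y
print(sp.diff(g, y, x, y, z))     # -4, i.e. g_yxyz (differentiate w.r.t. y, then x, then y, then z)

# Example: implicit differentiation of y*z - ln(z) = x + y for z = z(x, y),
# using dz/dx = -F_x/F_z (the three-variable formula from Section 14.4)
F = y*z - sp.log(z) - (x + y)
print(sp.simplify(-sp.diff(F, x) / sp.diff(F, z)))   # z/(y*z - 1)
```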
Section 14.4: The Chain Rule

Theorem 5: Chain Rule for Functions of One Independent Variable and Two Intermediate Variables
If w = f(x, y) is differentiable and if x = x(t), y = y(t) are differentiable functions of t, then the composite w = f(x(t), y(t)) is a differentiable function of t and
\[
\frac{dw}{dt} = f_x(x(t), y(t)) \cdot x'(t) + f_y(x(t), y(t)) \cdot y'(t),
\]
or
\[
\frac{dw}{dt} = \frac{\partial f}{\partial x}\frac{dx}{dt} + \frac{\partial f}{\partial y}\frac{dy}{dt}.
\]

Theorem 6: Chain Rule for Functions of One Independent Variable and Three Intermediate Variables
If w = f(x, y, z) is differentiable and x, y, and z are differentiable functions of t, then w is a differentiable function of t and
\[
\frac{dw}{dt} = \frac{\partial w}{\partial x}\frac{dx}{dt} + \frac{\partial w}{\partial y}\frac{dy}{dt} + \frac{\partial w}{\partial z}\frac{dz}{dt}.
\]

Example
Find dw/dt if
\[
w = xy + z, \qquad x = \cos t, \qquad y = \sin t, \qquad z = t.
\]
Use Theorem 6:
\[
\frac{dw}{dt} = \frac{\partial w}{\partial x}\frac{dx}{dt} + \frac{\partial w}{\partial y}\frac{dy}{dt} + \frac{\partial w}{\partial z}\frac{dz}{dt}
= (y)(-\sin t) + (x)(\cos t) + (1)(1)
\]
\[
= (\sin t)(-\sin t) + (\cos t)(\cos t) + 1 = -\sin^2 t + \cos^2 t + 1 = \cos 2t + 1.
\]
What is the derivative at t = 0?
\[
\frac{dw}{dt}\bigg|_{t=0} = 1 + \cos 0 = 2.
\]

Theorem 7: Chain Rule for Two Independent Variables and Three Intermediate Variables
Suppose that w = f(x, y, z), x = g(r, s), y = h(r, s), and z = k(r, s). If all four functions are differentiable, then w has partial derivatives with respect to r and s given by the formulas
\[
\frac{\partial w}{\partial r} = \frac{\partial w}{\partial x}\frac{\partial x}{\partial r} + \frac{\partial w}{\partial y}\frac{\partial y}{\partial r} + \frac{\partial w}{\partial z}\frac{\partial z}{\partial r},
\]
\[
\frac{\partial w}{\partial s} = \frac{\partial w}{\partial x}\frac{\partial x}{\partial s} + \frac{\partial w}{\partial y}\frac{\partial y}{\partial s} + \frac{\partial w}{\partial z}\frac{\partial z}{\partial s}.
\]

Theorem 8: A Formula for Implicit Differentiation
Suppose that F(x, y) is differentiable and that the equation F(x, y) = 0 defines y as a differentiable function of x. Then at any point where F_y ≠ 0,
\[
\frac{dy}{dx} = -\frac{F_x}{F_y}.
\]

Example
Find dy/dx if y² − x² − sin xy = 0.
\[
\frac{dy}{dx} = -\frac{F_x}{F_y} = -\frac{-2x - y\cos xy}{2y - x\cos xy} = \frac{2x + y\cos xy}{2y - x\cos xy}.
\]

Expansion of Theorem 8 to Three Variables
Suppose F(x, y, z) = 0 and z = f(x, y). Assuming F and f are differentiable functions, then
\[
\frac{\partial z}{\partial x} = -\frac{F_x}{F_z} \qquad \text{and} \qquad \frac{\partial z}{\partial y} = -\frac{F_y}{F_z}.
\]

Expansion of the Chain Rule to Functions of n Variables
In general, suppose w = f(x_1, x_2, ..., x_n) is a differentiable function of the intermediate variables x_1, x_2, ..., x_n, where n is a positive integer (n ∈ ℕ). Also suppose each x_i is a differentiable function of the independent variables t_1, t_2, ..., t_m, with m ∈ ℕ. In equation form,
\[
x_1 = g_1(t_1, t_2, \ldots, t_m),
\]
\[
x_2 = g_2(t_1, t_2, \ldots, t_m),
\]
\[
\vdots
\]
\[
x_n = g_n(t_1, t_2, \ldots, t_m).
\]
Then w is a differentiable function of each of the independent variables t_1, t_2, ..., t_m, and the partial derivatives of w with respect to each t_j are
\[
\frac{\partial w}{\partial t_1} = \frac{\partial w}{\partial x_1}\frac{\partial x_1}{\partial t_1} + \frac{\partial w}{\partial x_2}\frac{\partial x_2}{\partial t_1} + \cdots + \frac{\partial w}{\partial x_n}\frac{\partial x_n}{\partial t_1} = \sum_{i=1}^{n} \frac{\partial w}{\partial x_i}\frac{\partial x_i}{\partial t_1},
\]
\[
\frac{\partial w}{\partial t_2} = \frac{\partial w}{\partial x_1}\frac{\partial x_1}{\partial t_2} + \frac{\partial w}{\partial x_2}\frac{\partial x_2}{\partial t_2} + \cdots + \frac{\partial w}{\partial x_n}\frac{\partial x_n}{\partial t_2} = \sum_{i=1}^{n} \frac{\partial w}{\partial x_i}\frac{\partial x_i}{\partial t_2},
\]
\[
\vdots
\]
\[
\frac{\partial w}{\partial t_m} = \frac{\partial w}{\partial x_1}\frac{\partial x_1}{\partial t_m} + \frac{\partial w}{\partial x_2}\frac{\partial x_2}{\partial t_m} + \cdots + \frac{\partial w}{\partial x_n}\frac{\partial x_n}{\partial t_m} = \sum_{i=1}^{n} \frac{\partial w}{\partial x_i}\frac{\partial x_i}{\partial t_m}.
\]
More compactly,
\[
\frac{\partial w}{\partial t_j} = \sum_{i=1}^{n} \frac{\partial w}{\partial x_i}\frac{\partial x_i}{\partial t_j} \qquad \text{for } j = 1, 2, \ldots, m.
\]
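As a quick sanity check of the chain-rule example above (w = xy + z along x = cos t, y = sin t, z = t), the SymPy sketch below (illustrative only, not part of the original handout) computes dw/dt both by Theorem 6 and by direct substitution, and evaluates the result at t = 0.

```python
# A quick SymPy check of the chain-rule example above (illustrative
# sketch, not part of the original handout).
import sympy as sp

t, x, y, z = sp.symbols('t x y z')

w = x*y + z                              # w in the intermediate variables
xt, yt, zt = sp.cos(t), sp.sin(t), t     # the intermediate variables as functions of t
sub = {x: xt, y: yt, z: zt}

# Theorem 6: dw/dt = w_x dx/dt + w_y dy/dt + w_z dz/dt
dw_chain = (sp.diff(w, x)*sp.diff(xt, t)
            + sp.diff(w, y)*sp.diff(yt, t)
            + sp.diff(w, z)*sp.diff(zt, t)).subs(sub)

# Direct substitution: w(t) = cos(t)*sin(t) + t, then differentiate
dw_direct = sp.diff(w.subs(sub), t)

print(sp.simplify(dw_chain - dw_direct))   # 0  (the two computations agree)
print(sp.simplify(dw_chain))               # cos(2*t) + 1
print(dw_chain.subs(t, 0))                 # 2  (the derivative at t = 0)
```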
Section 14.5: Directional Derivatives and Gradient Vectors

Definition: Directional Derivative
The derivative of f at P_0(x_0, y_0) in the direction of the unit vector u = u_1 i + u_2 j is the number
\[
\left(\frac{df}{ds}\right)_{\mathbf{u}, P_0} = \lim_{s \to 0} \frac{f(x_0 + s u_1, y_0 + s u_2) - f(x_0, y_0)}{s},
\]
provided the limit exists. The directional derivative is also denoted
\[
\left(\frac{df}{ds}\right)_{\mathbf{u}, P_0} = (D_{\mathbf{u}} f)_{P_0}
\]
and is read "the derivative of f at P_0 in the direction of u."

Definition: Gradient Vector
The gradient vector (gradient) of f(x, y) at a point P is the vector
\[
\nabla f = \frac{\partial f}{\partial x}\mathbf{i} + \frac{\partial f}{\partial y}\mathbf{j}.
\]

Theorem 9: The Directional Derivative is a Dot Product
If f(x, y) is differentiable in an open region containing P_0(x_0, y_0), then
\[
(D_{\mathbf{u}} f)_{P_0} = (\nabla f)_{P_0} \cdot \mathbf{u}.
\]
In words, the derivative of f at P_0 in the direction of u is the dot product of the gradient ∇f at P_0 and u. In brief,
\[
D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u}.
\]

Example
Find the derivative of f(x, y) = x e^y + cos(xy) at the point (2, 0) in the direction of v = 3i − 4j.
First, find the unit direction vector:
\[
\mathbf{u} = \frac{\mathbf{v}}{\lVert \mathbf{v} \rVert} = \frac{\mathbf{v}}{5} = \frac{3}{5}\mathbf{i} - \frac{4}{5}\mathbf{j}.
\]
Then we need to find the partial derivatives of f at (2, 0) because together, they make up the gradient ∇f:
\[
f_x = e^y - y\sin(xy), \qquad f_x(2, 0) = e^0 - 0\sin(2 \cdot 0) = 1 - 0 = 1,
\]
\[
f_y = x e^y - x\sin(xy), \qquad f_y(2, 0) = 2e^0 - 2\sin(2 \cdot 0) = 2 - 0 = 2.
\]
Plug these values into the definition of the gradient:
\[
\nabla f\big|_{(2,0)} = f_x(2, 0)\mathbf{i} + f_y(2, 0)\mathbf{j} = \mathbf{i} + 2\mathbf{j}.
\]
Then the directional derivative of f at (2, 0) in the direction of u is
\[
(D_{\mathbf{u}} f)_{(2,0)} = \nabla f\big|_{(2,0)} \cdot \mathbf{u} = (\mathbf{i} + 2\mathbf{j}) \cdot \left(\frac{3}{5}\mathbf{i} - \frac{4}{5}\mathbf{j}\right) = \frac{3}{5} - 2 \cdot \frac{4}{5} = -1.
\]

Properties of the Directional Derivative
Since D_u f = ∇f · u = ‖∇f‖ cos θ, where θ is the angle between u and ∇f:
1. The function f increases most rapidly when cos θ = 1, i.e. when θ = 0, i.e. when u is the direction of ∇f. That is, at each point P in its domain, f increases most rapidly in the direction of the gradient vector ∇f at P. The derivative in this direction is
\[
D_{\mathbf{u}} f = \lVert \nabla f \rVert \cos 0 = \lVert \nabla f \rVert.
\]
2. Similarly, f decreases most rapidly in the direction of −∇f. The derivative in this direction is
\[
D_{\mathbf{u}} f = \lVert \nabla f \rVert \cos \pi = -\lVert \nabla f \rVert.
\]
3. Any direction u orthogonal to a gradient ∇f ≠ 0 is a direction of zero change in f, because θ = π/2 and
\[
D_{\mathbf{u}} f = \lVert \nabla f \rVert \cos\frac{\pi}{2} = \lVert \nabla f \rVert \cdot 0 = 0.
\]

Example
Let f(x, y) = x²/2 + y²/2, and consider the point (1, 1).
The function increases most rapidly in the direction of ∇f:
\[
\nabla f = x\mathbf{i} + y\mathbf{j} \implies (\nabla f)_{(1,1)} = \mathbf{i} + \mathbf{j}.
\]
The unit vector in the direction of (∇f)_{(1,1)} is
\[
\mathbf{u} = \frac{\mathbf{i} + \mathbf{j}}{\sqrt{2}} = \frac{1}{\sqrt{2}}\mathbf{i} + \frac{1}{\sqrt{2}}\mathbf{j}.
\]
The function decreases most rapidly in the direction of −(∇f)_{(1,1)}:
\[
-\frac{1}{\sqrt{2}}\mathbf{i} - \frac{1}{\sqrt{2}}\mathbf{j}.
\]
The directions of zero change at (1, 1) are the directions orthogonal to ∇f:
\[
\mathbf{n} = -\frac{1}{\sqrt{2}}\mathbf{i} + \frac{1}{\sqrt{2}}\mathbf{j} \qquad \text{and} \qquad -\mathbf{n} = \frac{1}{\sqrt{2}}\mathbf{i} - \frac{1}{\sqrt{2}}\mathbf{j}.
\]

Important Concept
At every point (x_0, y_0) in the domain of a differentiable function f(x, y), the gradient of f is normal to the level curve through (x_0, y_0).

Tangent Line to a Level Curve
\[
f_x(x_0, y_0)(x - x_0) + f_y(x_0, y_0)(y - y_0) = 0.
\]
Notice this is the same as point-slope form from elementary algebra,
\[
y - y_0 = m(x - x_0), \qquad \text{where } m = \frac{dy}{dx} = -\frac{f_x}{f_y}
\]
by Theorem 8.

Algebra Rules for Gradients
1. Sum Rule: ∇(f + g) = ∇f + ∇g
2. Difference Rule: ∇(f − g) = ∇f − ∇g
3. Constant Multiple Rule: ∇(kf) = k∇f
4. Product Rule: ∇(fg) = f∇g + g∇f
5. Quotient Rule: ∇(f/g) = (g∇f − f∇g)/g²

Gradients of Functions of n Variables
For a differentiable function f(x_1, x_2, ..., x_n) and a unit vector u = ⟨u_1, u_2, ..., u_n⟩ in space, we have
\[
\nabla f = \left\langle \frac{\partial f}{\partial x_1}, \frac{\partial f}{\partial x_2}, \ldots, \frac{\partial f}{\partial x_n} \right\rangle
\]
and
\[
D_{\mathbf{u}} f = \nabla f \cdot \mathbf{u} = \frac{\partial f}{\partial x_1}u_1 + \frac{\partial f}{\partial x_2}u_2 + \cdots + \frac{\partial f}{\partial x_n}u_n = \sum_{i=1}^{n} \frac{\partial f}{\partial x_i}u_i.
\]

The Derivative Along a Path
Let r(t) = x(t)i + y(t)j + z(t)k be a smooth path C and w = f(r(t)) a scalar function along C. Then
\[
\frac{dw}{dt} = \frac{\partial w}{\partial x}\frac{dx}{dt} + \frac{\partial w}{\partial y}\frac{dy}{dt} + \frac{\partial w}{\partial z}\frac{dz}{dt},
\]
or, in vector notation,
\[
\frac{d}{dt} f(\mathbf{r}(t)) = \nabla f(\mathbf{r}(t)) \cdot \mathbf{r}'(t).
\]
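To close, here is a small SymPy check of the directional-derivative example from this section (f(x, y) = x e^y + cos(xy) at (2, 0) in the direction of v = 3i − 4j). It is an illustrative sketch, not part of the original handout.

```python
# An illustrative SymPy check of the directional-derivative example
# above (a verification sketch, not part of the original handout).
import sympy as sp

x, y = sp.symbols('x y')
f = x*sp.exp(y) + sp.cos(x*y)

# Gradient of f, evaluated at the point P0 = (2, 0)
grad_f = sp.Matrix([sp.diff(f, x), sp.diff(f, y)])
grad_at_P0 = grad_f.subs({x: 2, y: 0})     # should give (1, 2)

# Unit vector in the direction of v = 3i - 4j
v = sp.Matrix([3, -4])
u = v / v.norm()                           # (3/5, -4/5)

# Theorem 9: (D_u f)(P0) = grad f(P0) . u
print(grad_at_P0.T)       # Matrix([[1, 2]])
print(grad_at_P0.dot(u))  # -1
```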