# Complete Proof


**What's the proof for the chain rule in the partial form df = fx dx + fy dy + fz dz?**

*Let's say f is a scalar function of three variables (x,y,z) and fx, fy and fz are the partial derivatives. I would like to see the proof for df = fx dx + fy dy + fz dz.*
the formula

df = fx dx + fy dy + fz dz

is the definition of the total differential of f. there's nothing to prove; it's a definition.
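as a numeric illustration of what this definition says (a python sketch, not part of the original answer, using the example function f(x,y,z) = 2x + xy^2 + 3z that appears later in this answer): for a small displacement (dx, dy, dz), the change in f agrees with fx dx + fy dy + fz dz to first order.

```python
# the total differential as a first-order approximation:
# f(x+dx, y+dy, z+dz) - f(x, y, z) ≈ fx*dx + fy*dy + fz*dz for small steps.
def f(x, y, z):
    return 2*x + x*y**2 + 3*z

x, y, z = 1.0, 2.0, 3.0
fx, fy, fz = 2 + y**2, 2*x*y, 3          # partial derivatives computed by hand
dx, dy, dz = 1e-4, -2e-4, 3e-4           # arbitrary small displacements

actual = f(x+dx, y+dy, z+dz) - f(x, y, z)
df = fx*dx + fy*dy + fz*dz               # the total differential
print(abs(actual - df) < 1e-6)  # prints True: the error is second order
```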

this definition is motivated by the chain rule for higher dimensions:

df/dt = (fx)(dx/dt) + (fy)(dy/dt) + (fz)(dz/dt)

actually, this represents somewhat of an abuse of notation, because what is usually meant by this is that f is a real-valued function of 3 variables, each of which is parametrized by t. so we actually have a composition:

(fop)(t), where p(t) = (x(t),y(t),z(t)).

so it really shouldn't be df/dt, but rather, d/dt(fop).

now p'(t) = (x'(t),y'(t),z'(t)) = (dx/dt,dy/dt,dz/dt). while the derivative of f is

∇f = (fx,fy,fz), and in this context, the chain rule is:

(fop)'(t) = ∇f(p(t))•p'(t), where we have a dot product (actually, this is a special case of a matrix product).
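this form of the chain rule is easy to check numerically. the sketch below (not part of the proof) compares d/dt of f(p(t)) against the dot product ∇f(p(t))•p'(t), using central finite differences and the worked example of f and p that appears later in this answer:

```python
# numeric check of (fop)'(t) = ∇f(p(t)) • p'(t) via central differences.
def f(x, y, z):
    return 2*x + x*y**2 + 3*z

def p(t):
    return (t, t**2, t**3)

def grad_f(x, y, z, h=1e-6):
    # approximate each partial derivative with a central difference
    return ((f(x+h, y, z) - f(x-h, y, z)) / (2*h),
            (f(x, y+h, z) - f(x, y-h, z)) / (2*h),
            (f(x, y, z+h) - f(x, y, z-h)) / (2*h))

def p_prime(t, h=1e-6):
    a, b = p(t+h), p(t-h)
    return tuple((ai - bi) / (2*h) for ai, bi in zip(a, b))

h = 1e-6
t = 1.3                                                      # arbitrary sample point
lhs = (f(*p(t+h)) - f(*p(t-h))) / (2*h)                      # d/dt of f(p(t)) directly
rhs = sum(g*v for g, v in zip(grad_f(*p(t)), p_prime(t)))    # dot-product form
print(abs(lhs - rhs) < 1e-4)  # prints True
```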

the proof of this isn't terribly hard, but requires a lot of definitions to be made regarding derivatives of functions of more than one variable, and space prevents me from giving a complete proof here.

a "semi-intuitive proof" is the following: f:R^3-->R can be approximated (at a given point) by a linear function L, where L(x,y,z) = ax + by + cz + d. and the parametrization p can also be approximated by some line: p(t) = (x0 + tu, y0 + tv, z0 + tw).

now (Lop)(t) = a(x0 + tu) + b(y0 + tv) + c(z0 + tw) + d

= (au + bv + cw)t + ax0 + by0 + cz0 + d

computing (Lop)'(t) directly, we have:

(Lop)'(t) = au + bv + cw

while ∇L is the constant vector (a,b,c), and p'(t) = (u,v,w), so

(Lop)'(t) = (a,b,c)•(u,v,w).
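this linear case can be checked directly in code. in the sketch below all the constants a, b, c, d, x0, y0, z0, u, v, w are arbitrary sample values, not from the text:

```python
# direct check of the linear case: L(x,y,z) = a*x + b*y + c*z + d and
# p(t) = (x0 + t*u, y0 + t*v, z0 + t*w), so (Lop)'(t) = a*u + b*v + c*w.
a, b, c, d = 2.0, -1.0, 0.5, 3.0
x0, y0, z0 = 1.0, 2.0, -1.0
u, v, w = 0.25, 4.0, -2.0

def L(x, y, z):
    return a*x + b*y + c*z + d

def p(t):
    return (x0 + t*u, y0 + t*v, z0 + t*w)

h = 1e-6
t = 0.7
deriv = (L(*p(t+h)) - L(*p(t-h))) / (2*h)   # derivative of the composition
dot = a*u + b*v + c*w                       # ∇L • p'(t) = (a,b,c)•(u,v,w)
print(abs(deriv - dot) < 1e-8)  # prints True: exact up to rounding
```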

now the derivative of a function g:R^n-->R^m at a point x can be defined as the unique linear transformation λ such that:

lim h-->0 |g(x+h) - g(x) - λ(h)|/|h| = 0

in other words, g(x) + λ(h) is the "best linear approximation to g(x+h)" for small h. so if λ is the derivative of g, and μ is the derivative of another function q, then it makes sense for the derivative of the composition goq to be the composition λμ, which is a matrix product; and when g = L and q = p, this is certainly the case (we just showed that for L and p above).

it is sometimes easier to see this is true with an example:

let f(x,y,z) = 2x + xy^2 + 3z

let p(t) = (x,y,z) = (t, t^2, t^3)

∇f(x,y,z) = (2 + y^2, 2xy, 3) and p'(t) = (1, 2t, 3t^2)

so ∇f(p(t)) = (2 + t^4, 2t^3, 3), and ∇f(p(t))•p'(t) = ∇f(p(t))•(1, 2t, 3t^2)

= (2 + t^4)(1) + (2t^3)(2t) + 3(3t^2) = 2 + 9t^2 + 5t^4

on the other hand, f(p(t)) = 2t + t^5 + 3t^3, which has the derivative:

(fop)'(t) = 2 + 9t^2 + 5t^4.
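as a final check (an addition of mine, not part of the original answer), this agreement can be confirmed numerically at a few sample points:

```python
# confirm that the chain-rule answer 2 + 9t^2 + 5t^4 matches a finite
# difference of the expanded composition f(p(t)) = 2t + t^5 + 3t^3.
def comp(t):
    return 2*t + t**5 + 3*t**3     # f(p(t)) expanded

def claimed(t):
    return 2 + 9*t**2 + 5*t**4     # chain-rule derivative from the text

h = 1e-6
for t in (-1.0, 0.0, 0.5, 2.0):    # arbitrary sample points
    fd = (comp(t+h) - comp(t-h)) / (2*h)
    assert abs(fd - claimed(t)) < 1e-4
print("chain-rule derivative confirmed at sample points")
```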
