On November 22, Joe Riel posted an implicit differentiation problem that caught my attention. It took the manipulations typically learned in an Advanced Calculus course one step further, but the devices learned in such a course could readily be applied. Joe's solution was expressed in terms of exterior derivatives and exterior products, so he used the liesymm and DifferentialGeometry packages to obtain solutions.

Here's the problem: Given a constraint, F(x, y) = 0, and functions G(x, y) and H(x, y), find dG/dH.

Here's a solution that a student in an Advanced Calculus course could be expected to fathom.

Define u = G(x, y) and v = H(x, y) and assume that the inverse function theorem allows us to write x = x(u, v) and y = y(u, v). The constraint equation then becomes F(x(u, v), y(u, v)) ≡ f(u, v) = 0. From the derivative sought, infer that u = u(v) can be obtained from f(u, v) = 0. Hence, from f(u(v), v) ≡ 0 we obtain

f[u]*du/dv+f[v] = 0 

or

du/dv = -f[v]/f[u] 

where the derivative du/dv is the required dG/dH. To obtain f[v] and f[u], return to the constraint

 

F(x(u, v), y(u, v)) = 0 

 

and apply the chain rule, obtaining

 

f[v] = F[x]*x[v]+F[y]*y[v] 

and

f[u] = F[x]*x[u]+F[y]*y[u] 

 
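The relation du/dv = -f[v]/f[u] is just the implicit function theorem at work, and it is easy to spot-check outside Maple. Here is a small Python sketch using a hypothetical constraint f(u, v) = u^2 + v^2 - 2 (not from the post; chosen only because the branch u(v) is explicit), comparing the formula against a finite-difference derivative:

```python
# Numeric sanity check of du/dv = -f[v]/f[u] for the hypothetical
# constraint f(u, v) = u^2 + v^2 - 2, whose branch u(v) = sqrt(2 - v^2)
# solves f = 0 explicitly.
import math

def f_u(u, v):
    return 2.0 * u          # partial of f with respect to u

def f_v(u, v):
    return 2.0 * v          # partial of f with respect to v

def u_of_v(v):
    return math.sqrt(2.0 - v * v)   # explicit solution of f(u, v) = 0

v0 = 0.6
u0 = u_of_v(v0)

h = 1e-6                    # central-difference step
du_dv_numeric = (u_of_v(v0 + h) - u_of_v(v0 - h)) / (2.0 * h)
du_dv_formula = -f_v(u0, v0) / f_u(u0, v0)

print(du_dv_numeric, du_dv_formula)  # the two values should closely agree
```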

To obtain x[u], y[u], x[v], y[v], write out at length the inverse functions 

x = x(u(x, y), v(x, y))

and

y = y(u(x, y), v(x, y)) 

 

and apply the chain rule, differentiating the first equation with respect to x and y. Solve the resulting set of simultaneous equations with Cramer's rule, and recognize the determinant in the denominators as the Jacobian.

 

1 = x[u]*u[x]+x[v]*v[x]

0 = x[u]*u[y]+x[v]*v[y]

 implies

   [ u[x]  v[x] ] [ x[u] ]   [ 1 ]
   [ u[y]  v[y] ] [ x[v] ] = [ 0 ]

 implies

x[u] = det([ 1, v[x] ; 0, v[y] ]) / det([ u[x], v[x] ; u[y], v[y] ]) = v[y]/J

and

x[v] = det([ u[x], 1 ; u[y], 0 ]) / det([ u[x], v[x] ; u[y], v[y] ]) = -u[y]/J

where J = u[x]*v[y]-u[y]*v[x] is the Jacobian ∂(u, v)/∂(x, y).
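These Cramer's-rule expressions are easy to verify numerically. A quick Python check, with arbitrary (hypothetical) numeric values standing in for the partials of u and v:

```python
# Check that x[u] = v[y]/J and x[v] = -u[y]/J satisfy the system
#   1 = x[u]*u[x] + x[v]*v[x]
#   0 = x[u]*u[y] + x[v]*v[y]
# for arbitrary (hypothetical) numeric values of the partials.
u_x, u_y = 2.0, 1.0
v_x, v_y = -1.0, 3.0

J = u_x * v_y - u_y * v_x   # the Jacobian d(u,v)/d(x,y)
x_u = v_y / J               # from Cramer's rule
x_v = -u_y / J

print(x_u * u_x + x_v * v_x)   # should be 1
print(x_u * u_y + x_v * v_y)   # should be 0
```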

 

Repeat these calculations with the second equation:

 

0 = y[u]*u[x]+y[v]*v[x]
 

1 = y[u]*u[y]+y[v]*v[y]

 implies

   [ u[x]  v[x] ] [ y[u] ]   [ 0 ]
   [ u[y]  v[y] ] [ y[v] ] = [ 1 ]

 implies

y[u] = det([ 0, v[x] ; 1, v[y] ]) / det([ u[x], v[x] ; u[y], v[y] ]) = -v[x]/J

and

y[v] = det([ u[x], 0 ; u[y], 1 ]) / det([ u[x], v[x] ; u[y], v[y] ]) = u[x]/J

 

 

Put all this together to get

 

du/dv = -f[v]/f[u]
      = -(F[x]*x[v]+F[y]*y[v])/(F[x]*x[u]+F[y]*y[u])
      = -(F[x]*(-u[y]/J)+F[y]*(u[x]/J))/(F[x]*(v[y]/J)+F[y]*(-v[x]/J))
      = (F[x]*u[y]-F[y]*u[x])/(F[x]*v[y]-F[y]*v[x])
      = (F[x]*G[y]-F[y]*G[x])/(F[x]*H[y]-F[y]*H[x])

since u = G(x, y) and v = H(x, y).
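The final formula can also be tested numerically outside Maple. Here is a Python sketch with hypothetical functions F = x^2 + y^2 - 5, G = x*y, H = x + y (not from the post; chosen only because the constraint is easy to parametrize), comparing the closed form against a finite-difference derivative taken along the constraint:

```python
# Spot-check of dG/dH = (F[x]*G[y]-F[y]*G[x])/(F[x]*H[y]-F[y]*H[x])
# for the hypothetical choices F = x^2 + y^2 - 5, G = x*y, H = x + y.
import math

r = math.sqrt(5.0)
x = lambda t: r * math.cos(t)   # parametrization of the constraint F = 0
y = lambda t: r * math.sin(t)
G = lambda t: x(t) * y(t)
H = lambda t: x(t) + y(t)

t0 = math.atan2(2.0, 1.0)       # the point (x, y) = (1, 2) on the constraint
h = 1e-6

# dG/dH along the constraint equals (dG/dt)/(dH/dt), by finite differences.
dG = (G(t0 + h) - G(t0 - h)) / (2.0 * h)
dH = (H(t0 + h) - H(t0 - h)) / (2.0 * h)
dG_dH_numeric = dG / dH

# The closed-form expression evaluated at (x, y) = (1, 2):
Fx, Fy = 2.0 * 1.0, 2.0 * 2.0   # partials of F = x^2 + y^2 - 5
Gx, Gy = 2.0, 1.0               # partials of G = x*y
Hx, Hy = 1.0, 1.0               # partials of H = x + y
dG_dH_formula = (Fx * Gy - Fy * Gx) / (Fx * Hy - Fy * Hx)

print(dG_dH_numeric, dG_dH_formula)  # both should be close to 3
```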

 

which is what Joe Riel obtained. Of course, prior to working out the details that Joe merely alluded to, I worked a simple example to clarify the dependencies between the variables. Hence, take F, G, H as linear functions,

 

F := A*x+B*y:  G := a*x+b*y:  H := c*x+d*y:

 

so that x(u, v) and y(u, v) are

 

solve({u = G, v = H}, {x, y})

{x = -(-u*d+b*v)/(a*d-c*b), y = (a*v-c*u)/(a*d-c*b)}

(1)

 

The constraint then becomes 

eval(F, {x = -(-u*d+b*v)/(a*d-c*b), y = (a*v-c*u)/(a*d-c*b)})

-A*(-u*d+b*v)/(a*d-c*b)+B*(a*v-c*u)/(a*d-c*b)

(2)

 

Set this equal to zero and solve explicitly for u = u(v), obtaining

 

solve(-A*(-u*d+b*v)/(a*d-c*b)+B*(a*v-c*u)/(a*d-c*b), u)

v*(A*b-B*a)/(A*d-B*c)

(3)

 

Differentiate to obtain du/dv, that is, dG/dH:

 

diff(v*(A*b-B*a)/(A*d-B*c), v)

(A*b-B*a)/(A*d-B*c)

(4)

Now, apply the formula du/dv = (F[x]*G[y]-F[y]*G[x])/(F[x]*H[y]-F[y]*H[x]).

 

((diff(F, x))*(diff(G, y))-(diff(F, y))*(diff(G, x)))/((diff(F, x))*(diff(H, y))-(diff(F, y))*(diff(H, x))) = (A*b-B*a)/(A*d-B*c)

 

The results agree.
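For readers without Maple, the same linear example can be rechecked in a few lines of Python (the numeric coefficients below are arbitrary stand-ins for A, B, a, b, c, d):

```python
# Independent numeric check of the linear example, with hypothetical
# coefficients for F = A*x + B*y, G = a*x + b*y, H = c*x + d*y.
A, B = 2.0, 3.0
a, b = 1.0, 4.0
c, d = 5.0, 2.0

# Points on the constraint A*x + B*y = 0 are (x, y) = (B*t, -A*t).
def G(t): return a * (B * t) + b * (-A * t)
def H(t): return c * (B * t) + d * (-A * t)

# G and H are linear in t, so dG/dH is the ratio of their slopes.
dG_dH = (G(1.0) - G(0.0)) / (H(1.0) - H(0.0))

# Compare with the closed form from the worksheet, (A*b - B*a)/(A*d - B*c).
print(dG_dH, (A * b - B * a) / (A * d - B * c))
```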

 

 

Download implicit_diff.mw
