I need to calculate the Hessian matrix of a scalar in spherical coordinates. To do so, I tried to determine the gradient of the gradient. Hence, I want a gradient of a vector field. My question is: Can the formula for the Nabla operator simply be applied to each component of the vector field, or is it more tricky? I googled hard, but only found the formula for scalars...
-
The gradient of a "scalar" is obtained from the partial derivatives of a differentiable function, so it doesn't make sense to take the gradient of a vector field. There is, however, another operation called the divergence, which can be defined, loosely speaking, as the dot product of the "nabla operator" with the vector field. That is a very imprecise definition, but that's how you'll find it in some places. – 園田海未 May 10 '16 at 15:08
-
In Cartesian coordinates, you can take the gradient of each component of a vector field and assemble the results into a matrix. I am wondering whether this also works in spherical coordinates. – ChristianR May 10 '16 at 16:41
2 Answers
I think I figured it out. This is my approach for polar coordinates; it should work likewise for spherical ones. For a scalar function $f$, the gradient in polar coordinates $r$ and $\varphi$ is
$$\mathrm{grad}(f)=\dfrac{\partial f}{\partial r}\underline{e}_r+ \dfrac{1}{r}\dfrac{\partial f}{\partial\varphi}\underline{e}_\varphi,$$
where $\underline{e}_i$ are the unit basis vectors. Substitute $f$ by its own gradient
$$\mathrm{grad}(\mathrm{grad}(f))=\dfrac{\partial}{\partial r}\left(\dfrac{\partial f}{\partial r}\underline{e}_r+\dfrac{1}{r}\dfrac{\partial f}{\partial\varphi}\underline{e}_\varphi\right)\otimes\underline{e}_r+ \dfrac{1}{r}\dfrac{\partial}{\partial\varphi}\left(\dfrac{\partial f}{\partial r}\underline{e}_r+\dfrac{1}{r}\dfrac{\partial f}{\partial\varphi}\underline{e}_\varphi\right)\otimes\underline{e}_\varphi.$$
With
$$\dfrac{\partial\underline{e}_r}{\partial r}=0, \quad \dfrac{\partial\underline{e}_\varphi}{\partial r}=0, \quad \dfrac{\partial\underline{e}_r}{\partial\varphi}=\underline{e}_\varphi, \quad \dfrac{\partial\underline{e}_\varphi}{\partial\varphi}=-\underline{e}_r,$$
one gets
$$\mathrm{grad}(\mathrm{grad}(f))= \dfrac{\partial^2f}{\partial r^2}\underline{e}_r\otimes\underline{e}_r+ \dfrac{\partial}{\partial r}\left(\dfrac{1}{r}\dfrac{\partial f}{\partial\varphi}\right)\underline{e}_\varphi\otimes\underline{e}_r+$$ $$ \dfrac{1}{r}\dfrac{\partial^2f}{\partial r\partial\varphi}\underline{e}_r\otimes\underline{e}_\varphi+ \dfrac{1}{r}\dfrac{\partial f}{\partial r}\underline{e}_\varphi\otimes\underline{e}_\varphi+ \dfrac{1}{r^2}\dfrac{\partial^2f}{\partial\varphi^2}\underline{e}_\varphi\otimes\underline{e}_\varphi- \dfrac{1}{r^2}\dfrac{\partial f}{\partial\varphi}\underline{e}_r\otimes\underline{e}_\varphi,$$
or
$$\mathrm{grad}(\mathrm{grad}(f))=\begin{bmatrix} \dfrac{\partial^2f}{\partial r^2}&\dfrac{1}{r}\dfrac{\partial^2f}{\partial r\partial\varphi}-\dfrac{1}{r^2}\dfrac{\partial f}{\partial\varphi}\\ \dfrac{\partial}{\partial r}\left(\dfrac{1}{r}\dfrac{\partial f}{\partial\varphi}\right)&\dfrac{1}{r}\dfrac{\partial f}{\partial r}+\dfrac{1}{r^2}\dfrac{\partial^2f}{\partial\varphi^2} \end{bmatrix}$$ Does this make sense?
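As a sanity check, the matrix above can be verified symbolically: its trace should reproduce the polar Laplacian, and it should be symmetric, as a Hessian must be. A minimal sketch with sympy, using an arbitrary test function of my own choosing ($f = r^3\sin\varphi$):

```python
# Symbolic check of the polar Hessian derived above, using sympy.
import sympy as sp

r, phi = sp.symbols('r phi', positive=True)
f = r**3 * sp.sin(phi)  # hypothetical test function; any smooth f works

# The Hessian in the orthonormal polar basis, entry by entry as in the answer
H = sp.Matrix([
    [sp.diff(f, r, 2),
     sp.diff(f, r, phi)/r - sp.diff(f, phi)/r**2],
    [sp.diff(sp.diff(f, phi)/r, r),
     sp.diff(f, r)/r + sp.diff(f, phi, 2)/r**2],
])

# Its trace must equal the polar Laplacian of f
laplacian = sp.diff(f, r, 2) + sp.diff(f, r)/r + sp.diff(f, phi, 2)/r**2
assert sp.simplify(H.trace() - laplacian) == 0

# And the matrix must be symmetric
assert sp.simplify(H[0, 1] - H[1, 0]) == 0
```

Both checks pass for this test function, which is consistent with the derivation (the off-diagonal entries $\frac{1}{r}\frac{\partial^2 f}{\partial r\partial\varphi}-\frac{1}{r^2}\frac{\partial f}{\partial\varphi}$ and $\frac{\partial}{\partial r}\left(\frac{1}{r}\frac{\partial f}{\partial\varphi}\right)$ agree by the product rule).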
-
-
You can't use the gradient operator on a vector, but you can use the divergence operator. It's represented as $\nabla \cdot\nabla f$ or $\mathrm{div}(\mathrm{grad}\, f)$. As far as I can see, most of your operations require this understanding, and I'm not really sure why you're using tensor products in your calculations. – GodotMisogi May 11 '16 at 06:08
-
The Laplacian is exactly what I don't want. The gradient of a gradient is a common operation in continuum mechanics, but so far I have only seen it in Cartesian coordinates. The Laplacian is the trace of the matrix I am looking for. – ChristianR May 11 '16 at 06:14
-
I see what you mean. I think I know a tensor approach to this. The derivative of a vector with respect to a coordinate in tensor analysis is $\frac{\partial \vec V}{\partial x^{\beta}} = \frac{\partial V^{\alpha}}{\partial x^{\beta}}\vec{e}_{\alpha} + V^{\alpha}\frac{\partial \vec{e}_{\alpha}}{\partial x^{\beta}}$, so you can generalize this for the rest of the components and second derivatives. – GodotMisogi May 11 '16 at 06:28
The Laplacian is the trace of the Hessian, i.e. an inner product of the gradient operator with itself; the Hessian is the outer product. So if nabla is a "column vector" of differential operators,
$$\nabla = \left[\begin{array}{c}\frac{\partial }{\partial x}\\\frac{\partial}{\partial y}\end{array}\right],$$
we have
$${\bf L} = \nabla^T\nabla = \left[\begin{array}{cc}\frac{\partial}{\partial x} & \frac{\partial }{\partial y}\end{array}\right]\left[\begin{array}{c}\frac{\partial }{\partial x}\\\frac{\partial}{\partial y}\end{array}\right] = \frac{\partial^2}{\partial x^2}+\frac{\partial^2}{\partial y^2},$$
$${\bf H} = \nabla \nabla^T = \left[\begin{array}{c}\frac{\partial }{\partial x}\\\frac{\partial}{\partial y}\end{array}\right]\left[\begin{array}{cc}\frac{\partial}{\partial x} & \frac{\partial }{\partial y}\end{array}\right] = \left[\begin{array}{cc}\frac{\partial^2}{\partial x^2}& \frac{\partial^2 }{\partial x\partial y}\\\frac{\partial^2 }{\partial y\partial x}& \frac{\partial^2 }{\partial y^2}\end{array}\right].$$
The Hessian $\bf H$ contains the terms of $\bf L$ on its diagonal, so ${\bf L} = \mathrm{trace}({\bf H})$.
Then the coordinate system determines the chain rule you will have to consider when carrying out the calculations. But that is on another level of abstraction.
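The trace identity above is easy to confirm symbolically. A minimal sketch with sympy, in Cartesian coordinates with a test function I picked for illustration:

```python
# Check that the Laplacian equals the trace of the Hessian (Cartesian case).
import sympy as sp

x, y = sp.symbols('x y')
f = sp.exp(x) * sp.cos(y) + x**2 * y  # hypothetical test function

# H = nabla nabla^T f  (outer product of derivatives)
H = sp.hessian(f, (x, y))

# L = nabla^T nabla f  (inner product: the Laplacian)
L = sp.diff(f, x, 2) + sp.diff(f, y, 2)

assert sp.simplify(H.trace() - L) == 0
```

Note this only checks the Cartesian identity; in polar or spherical coordinates the same relation ${\bf L}=\mathrm{trace}({\bf H})$ holds, but the entries of $\bf H$ pick up the extra basis-derivative terms worked out in the other answer.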