
For a differentiable function $f:\mathbb{R} \rightarrow \mathbb{R}$, Newton's method consists of iterating $$x_{n+1} = x_{n} - \frac{f(x_n)}{f'(x_n)},$$ where $x_0 \in \mathbb{R}$ is some initial guess and each $x_i \in \mathbb{R}$, to find a point $x^*$ with $f(x^*) \approx 0$.
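For concreteness, here is a minimal sketch of the scalar iteration above (the function name and the example root $\sqrt{2}$ are my own illustration):

```python
def newton_scalar(f, fprime, x0, tol=1e-10, max_iter=50):
    """Scalar Newton iteration: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# Find sqrt(2) as a root of f(x) = x^2 - 2, starting from x0 = 1.
root = newton_scalar(lambda x: x**2 - 2, lambda x: 2 * x, 1.0)
```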

For a differentiable function $g: \mathbb{R}^n \rightarrow \mathbb{R}^n$, Newton's method consists of iterating $$x_{n+1} = x_n - J_g(x_n)^{-1} g(x_n),$$ where $J_g(x_n)$ is the $n \times n$ Jacobian matrix of $g$ evaluated at $x_n$ and each $x_i \in \mathbb{R}^n$, to find a point $x^*$ with $g(x^*) \approx 0$.
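A sketch of the vector version, solving the linear system $J_g(x_n)\,\Delta x = g(x_n)$ rather than forming the inverse explicitly (the example system and names are mine, for illustration):

```python
import numpy as np

def newton_vector(g, jacobian, x0, tol=1e-10, max_iter=50):
    """Vector Newton: solve J_g(x_n) dx = g(x_n), then set x_{n+1} = x_n - dx."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(jacobian(x), g(x))  # avoids computing J^{-1}
        x = x - dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# Example system: x^2 + y^2 = 1 and x = y, with solution (1/sqrt(2), 1/sqrt(2)).
g = lambda v: np.array([v[0]**2 + v[1]**2 - 1, v[0] - v[1]])
J = lambda v: np.array([[2 * v[0], 2 * v[1]], [1.0, -1.0]])
sol = newton_vector(g, J, [1.0, 0.5])
```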

Now, what about a function $h: \mathbb{R}^k \rightarrow \mathbb{R}$? Is there a generalization of Newton's method for finding a zero of either $h$ itself or of its gradient $\nabla h$?
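To make the stationary-point case concrete: one natural candidate (an assumption on my part, not an established answer) is to apply the vector formula above to $\nabla h : \mathbb{R}^k \rightarrow \mathbb{R}^k$, whose Jacobian is the Hessian of $h$. A sketch, with a toy quadratic $h(x, y) = (x-1)^2 + 2(y+3)^2$:

```python
import numpy as np

def newton_stationary(grad, hess, x0, tol=1e-10, max_iter=50):
    """Apply the vector Newton iteration to grad h; the Jacobian of
    grad h is the Hessian of h. Names here are illustrative only."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        dx = np.linalg.solve(hess(x), grad(x))
        x = x - dx
        if np.linalg.norm(dx) < tol:
            break
    return x

# h(x, y) = (x - 1)^2 + 2(y + 3)^2 has its unique stationary point at (1, -3).
grad = lambda v: np.array([2 * (v[0] - 1), 4 * (v[1] + 3)])
hess = lambda v: np.array([[2.0, 0.0], [0.0, 4.0]])
xstar = newton_stationary(grad, hess, [0.0, 0.0])
```

For this quadratic the Hessian is constant, so the iteration lands on the stationary point in a single step.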
