# Calculus/Newton's Method


Newton's Method (also called the Newton-Raphson method) is a recursive algorithm for approximating the root of a differentiable function. We know simple formulas for finding the roots of linear and quadratic equations, and there are also more complicated formulas for cubic and quartic equations. At one time it was hoped that such formulas would be found for equations of quintic and higher degree, but Niels Henrik Abel later showed that no general formula in radicals exists for degree five and above. The Newton-Raphson method approximates the roots of polynomial equations of any degree. In fact the method works for any equation, polynomial or not, as long as the function is differentiable in a desired interval.

Newton's Method: Let $f(x)$ be a differentiable function. Select a point $x_{0}$ as a first approximation to the root, as close to the function's root as you can manage. To approximate the root, recursively calculate using: $x_{n+1}=x_{n}-{\frac {f(x_{n})}{f'(x_{n})}}$. As you iterate, the $x_{n+1}$'s often become increasingly better approximations of the function's root.

In order to explain Newton's method, imagine that $x_{0}$ is already very close to a root of $f(x)$. We know that if we only look at points very close to $x_{0}$, then $f(x)$ looks like its tangent line there. If $x_{0}$ is already close to the place where $f(x)$ is 0, and near $x_{0}$ the function looks like its tangent line, then we hope the zero of the tangent line at $x_{0}$ is a better approximation than $x_{0}$ itself.

The equation for the tangent line to $f(x)$ at $x_{0}$ is given by

$y=f'(x_{0})\cdot (x-x_{0})+f(x_{0})$ Now we set $y=0$ and solve for $x$ .

$0=f'(x_{0})\cdot (x-x_{0})+f(x_{0})$

$-f(x_{0})=f'(x_{0})\cdot (x-x_{0})$

${\frac {-f(x_{0})}{f'(x_{0})}}=x-x_{0}$

$x={\frac {-f(x_{0})}{f'(x_{0})}}+x_{0}$

This value of $x$ should be a better guess for the value of $x$ where $f(x)=0$. We call this value $x_{1}$, and with a little algebra we have

$x_{1}=x_{0}-{\frac {f(x_{0})}{f'(x_{0})}}$ If our intuition was correct and $x_{1}$ is in fact a better approximation for the root of $f(x)$, then our logic should apply equally well at $x_{1}$. We can look to the place where the tangent line at $x_{1}$ is zero. We call this value $x_{2}$; following the algebra above, we arrive at the formula

$x_{2}=x_{1}-{\frac {f(x_{1})}{f'(x_{1})}}$ And we can continue in this way as long as we wish. At each step, if the current approximation is $x_{n}$, the new approximation will be $x_{n+1}=x_{n}-{\frac {f(x_{n})}{f'(x_{n})}}$.
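The iteration above is straightforward to carry out on a computer. Here is a minimal sketch in Python (the function name `newton`, the tolerance, and the iteration cap are our own choices, not part of the text); it repeats the update $x_{n+1}=x_{n}-f(x_{n})/f'(x_{n})$ until the step size is tiny:

```python
def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Approximate a root of f by Newton's method, starting from x0.

    Stops when the Newton step is smaller than tol, or after max_iter steps.
    """
    x = x0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)   # the Newton step f(x_n)/f'(x_n)
        x = x - step               # x_{n+1} = x_n - f(x_n)/f'(x_n)
        if abs(step) < tol:
            break
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2)
root = newton(lambda x: x**2 - 2, lambda x: 2*x, 1.0)
```

Note that the sketch assumes $f'(x_{n})\neq 0$ at every iterate; if the derivative vanishes, the division fails and a different starting point is needed.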

## Examples

Find the root of the function $f(x)=x^{2}$.

Figure 1: A few iterations of Newton's method applied to $y=x^{2}$ starting with $x_{0}=4$. The blue curve is $f(x)$. The other solid lines are the tangents at the various iteration points.

${\begin{aligned}x_{0}&=4\\x_{1}&=x_{0}-{\frac {f(x_{0})}{f'(x_{0})}}=4-{\frac {16}{8}}=2\\x_{2}&=x_{1}-{\frac {f(x_{1})}{f'(x_{1})}}=2-{\frac {4}{4}}=1\\x_{3}&=x_{2}-{\frac {f(x_{2})}{f'(x_{2})}}=1-{\frac {1}{2}}={\frac {1}{2}}\\x_{4}&=x_{3}-{\frac {f(x_{3})}{f'(x_{3})}}={\frac {1}{2}}-{\frac {\frac {1}{4}}{1}}={\frac {1}{4}}\\x_{5}&=x_{4}-{\frac {f(x_{4})}{f'(x_{4})}}={\frac {1}{4}}-{\frac {\frac {1}{16}}{\frac {1}{2}}}={\frac {1}{8}}\\x_{6}&=x_{5}-{\frac {f(x_{5})}{f'(x_{5})}}={\frac {1}{8}}-{\frac {\frac {1}{64}}{\frac {1}{4}}}={\frac {1}{16}}\\x_{7}&=x_{6}-{\frac {f(x_{6})}{f'(x_{6})}}={\frac {1}{16}}-{\frac {\frac {1}{256}}{\frac {1}{8}}}={\frac {1}{32}}\end{aligned}}$
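The table of iterates above can be reproduced mechanically. A short sketch (the helper name `newton_iterates` is our own) applies the update $x_{n+1}=x_{n}-f(x_{n})/f'(x_{n})$ with $f(x)=x^{2}$, where each step simplifies to $x_{n+1}=x_{n}-x_{n}^{2}/(2x_{n})=x_{n}/2$:

```python
def newton_iterates(f, f_prime, x0, n):
    """Return the list of Newton iterates [x0, x1, ..., xn]."""
    xs = [x0]
    for _ in range(n):
        x = xs[-1]
        xs.append(x - f(x) / f_prime(x))  # x_{k+1} = x_k - f(x_k)/f'(x_k)
    return xs

# f(x) = x^2, f'(x) = 2x, starting at x0 = 4: each iterate is half the last
iterates = newton_iterates(lambda x: x**2, lambda x: 2*x, 4.0, 7)
# iterates == [4.0, 2.0, 1.0, 0.5, 0.25, 0.125, 0.0625, 0.03125]
```

These values match $x_{0}$ through $x_{7}$ computed by hand above.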

As you can see, $x_{n}$ gradually approaches 0 (which we know is the root of $f(x)$). By continuing to iterate, one can approximate the root with arbitrary accuracy. (The iterates here only halve at each step; convergence is unusually slow because $x=0$ is a double root, where $f'(x)=0$ as well.)

Answer: $f(x)=x^{2}$  has a root at $x=0$ .