In numerical analysis, Newton’s method (also known as the Newton–Raphson method), named after Isaac Newton and Joseph Raphson, is a method for finding successively better approximations to the roots (or zeroes) of a real-valued function.

The idea of the method is as follows: one starts with an initial guess which is reasonably close to the true root, then the function is approximated by its tangent line (which can be computed using the tools of calculus), and one computes the x-intercept of this tangent line (which is easily done with elementary algebra). This x-intercept will typically be a better approximation to the function’s root than the original guess, and the method can be iterated.

Suppose f : (a, b) → ℝ is a differentiable function defined on the interval (a, b) with values in the real numbers ℝ. The formula for converging on the root can be easily derived. Suppose we have some current approximation x_n. Then we can derive the formula for a better approximation, x_{n+1}, as follows. The equation of the tangent line to the curve y = f(x) at the point x = x_n is

    y = f′(x_n)(x − x_n) + f(x_n),

where f′ denotes the derivative of the function f.

The x-intercept of this line (the value of x such that y = 0) is then used as the next approximation to the root, x_{n+1}. In other words, setting y to zero and x to x_{n+1} gives

    0 = f′(x_n)(x_{n+1} − x_n) + f(x_n).


Solving for x_{n+1} gives

    x_{n+1} = x_n − f(x_n) / f′(x_n).
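As a minimal sketch of this iteration (not part of the original article; the function names, tolerance, and iteration cap below are illustrative choices):

```python
def newton(f, f_prime, x0, tol=1e-12, max_iter=50):
    """Apply Newton's method: x_{n+1} = x_n - f(x_n) / f'(x_n)."""
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:  # close enough to a zero of f
            return x
        x = x - fx / f_prime(x)  # x-intercept of the tangent line at x
    return x

# Example: find the root of f(x) = x^2 - 2, i.e. sqrt(2), starting from x0 = 1.
root = newton(lambda x: x * x - 2, lambda x: 2 * x, 1.0)
```

With this starting point the iterate settles on √2 ≈ 1.4142135623… after only a handful of steps.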

We start the process with some arbitrary initial value x_0. (The closer to the zero, the better. But, in the absence of any intuition about where the zero might lie, a “guess and check” method can narrow the possibilities to a reasonably small interval by appealing to the intermediate value theorem.)
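One way to carry out such a “guess and check” is to scan an interval for a sign change; by the intermediate value theorem, a continuous function with endpoints of opposite sign has a zero in between. A sketch (the helper name and step count are hypothetical):

```python
def bracket_root(f, a, b, steps=20):
    """Scan [a, b] in equal steps for a subinterval where f changes sign.

    By the intermediate value theorem, a continuous f has a zero in any
    subinterval whose endpoint values have opposite signs.
    """
    h = (b - a) / steps
    x = a
    for _ in range(steps):
        if f(x) * f(x + h) <= 0:  # sign change (or an exact zero) detected
            return (x, x + h)
        x += h
    return None  # no sign change found at this resolution

# f(x) = x^2 - 2 changes sign somewhere in [0, 2], giving a small
# interval from which to pick the initial guess x_0.
interval = bracket_root(lambda x: x * x - 2, 0.0, 2.0)
```

Either endpoint of the returned subinterval then serves as a reasonable x_0 for the Newton iteration.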

The method will usually converge, provided this initial guess is close enough to the unknown zero and f′(x_0) ≠ 0. Furthermore, for a zero of multiplicity 1, the convergence is at least quadratic (see rate of convergence) in a neighbourhood of the zero, which intuitively means that the number of correct digits roughly doubles in every step.
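The quadratic convergence can be observed numerically. The snippet below (an illustrative sketch, not from the original article) tracks the error at each step for f(x) = x² − 2, whose zero √2 has multiplicity 1; each error is roughly proportional to the square of the previous one:

```python
from math import sqrt

# Newton iteration for f(x) = x^2 - 2, recording the error |x_n - sqrt(2)|.
# Quadratic convergence: each error is on the order of the previous error
# squared, so the number of correct digits roughly doubles per step.
x = 1.0
errors = []
for _ in range(4):
    x = x - (x * x - 2) / (2 * x)
    errors.append(abs(x - sqrt(2)))
```

Starting from x_0 = 1, the errors fall from about 8.6 × 10⁻² to 2.5 × 10⁻³, then 2.1 × 10⁻⁶, then 1.6 × 10⁻¹², consistent with the doubling of correct digits.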
