20 changes: 18 additions & 2 deletions 02_newton.md
@@ -12,11 +12,27 @@ Iterative techniques for solving $f(x) = 0$ for $x$.

*Bisection*: start with an interval $[a, b]$ bracketing the root.
Evaluate the midpoint. Replace one end, maintaining a root bracket.
Linear convergence. Slow but **robust**. The error after $n$ steps is bounded
by $|x_n - x^*| < (b - a)/2^n$.
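
A minimal sketch of bisection in Python (the function name `bisect`, the test function, and the tolerance are illustrative, not from the notes):

```python
import math

def bisect(f, a, b, tol=1e-10):
    """Halve a root bracket [a, b] until it is shorter than 2 * tol."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    while (b - a) / 2 > tol:
        m = (a + b) / 2
        fm = f(m)
        if fa * fm <= 0:          # root lies in [a, m]
            b, fb = m, fm
        else:                     # root lies in [m, b]
            a, fa = m, fm
    return (a + b) / 2

# Example: the root of cos(x) - x is approximately 0.739085
print(bisect(lambda x: math.cos(x) - x, 0.0, 1.0))
```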

*Fixed-point iteration*: A number $p$ is a **fixed point** of a function
$g$ if $g(p) = p$. Given a root-finding problem $f(p) = 0$, we can define a
function $g$ with a fixed point at $p$ in a number of ways, for example as
$g(x) = x - f(x)$ or as $g(x) = x + 3f(x)$. Conversely, if the function $g$
has a fixed point at $p$, then the function defined by $f(x) = x - g(x)$
has a zero at $p$.
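
A sketch of the corresponding iteration $p_{n+1} = g(p_n)$, assuming a stopping tolerance and iteration cap that the notes do not specify:

```python
import math

def fixed_point(g, p0, tol=1e-10, max_iter=100):
    """Iterate p_{n+1} = g(p_n) until successive iterates agree to within tol."""
    p = p0
    for _ in range(max_iter):
        p_next = g(p)
        if abs(p_next - p) < tol:
            return p_next
        p = p_next
    raise RuntimeError("fixed-point iteration did not converge")

# Example: g(x) = cos(x), i.e. g(x) = x + f(x) for f(x) = cos(x) - x,
# so the fixed point of g is the root of f (about 0.739085).
print(fixed_point(math.cos, 1.0))
```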

*Newton's Method*: $x_{k+1} = x_k - f(x_k) / f'(x_k)$. Faster:
quadratic convergence (the number of correct decimal places roughly doubles
each iteration). We can stop the iterations when $|p_n - p_{n-1}| < \varepsilon$
or $|f(p_n)| < \varepsilon$ for a chosen tolerance $\varepsilon$.
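
A sketch of Newton's update with that stopping test, assuming a user-supplied derivative and an illustrative tolerance:

```python
import math

def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        x_next = x - f(x) / fprime(x)
        # Stop when |p_n - p_{n-1}| < tol or |f(p_n)| < tol.
        if abs(x_next - x) < tol or abs(f(x_next)) < tol:
            return x_next
        x = x_next
    raise RuntimeError("Newton's method did not converge")

# Example: root of cos(x) - x, with f'(x) = -sin(x) - 1
print(newton(lambda x: math.cos(x) - x, lambda x: -math.sin(x) - 1, 1.0))
```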

*Secant method*: Newton's method is an extremely powerful technique, but it
has a major weakness: the need to know the value of the derivative of $f$ at
each approximation. Frequently, $f'(x)$ requires far more arithmetic operations
to evaluate than $f(x)$. The **secant method** avoids this by replacing the
derivative with a difference quotient of the two most recent iterates:

$$ p_{n} = p_{n-1} - \frac{f(p_{n-1})(p_{n-1} - p_{n-2})}{f(p_{n-1}) - f(p_{n-2})} $$
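
A sketch of that update, keeping only the two most recent iterates; the starting points and tolerance below are placeholders:

```python
import math

def secant(f, p0, p1, tol=1e-12, max_iter=50):
    """Secant method: Newton's update with f' replaced by a difference quotient."""
    f0, f1 = f(p0), f(p1)
    for _ in range(max_iter):
        p2 = p1 - f1 * (p1 - p0) / (f1 - f0)
        if abs(p2 - p1) < tol:
            return p2
        p0, f0 = p1, f1
        p1, f1 = p2, f(p2)
    raise RuntimeError("secant method did not converge")

# Example: root of cos(x) - x from p0 = 0, p1 = 1 (no derivative needed)
print(secant(lambda x: math.cos(x) - x, 0.0, 1.0))
```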

Downsides of Newton's Method: need derivative info, and additional
smoothness. Convergence usually not guaranteed unless "sufficiently