From 84cd049ef434cef97eaf905ba1ee7bbba7142f3b Mon Sep 17 00:00:00 2001
From: zhamel
Date: Sun, 27 Sep 2015 09:36:26 -0700
Subject: [PATCH] ZoeH-update02_Newton.md

Added info about Secant's method
---
 02_newton.md | 14 ++++++++++----
 1 file changed, 10 insertions(+), 4 deletions(-)

diff --git a/02_newton.md b/02_newton.md
index 4b01352..64bbfb6 100644
--- a/02_newton.md
+++ b/02_newton.md
@@ -10,18 +10,24 @@ Rooting Finding
 
 Iterative techniques for solving $f(x) = 0$ for $x$.
 
-*Bisection*: start with an interval $[a, b]$ bracketing the root.
+*Bisection*: **Linear convergence**. Start with an interval $[a, b]$ bracketing the root.
 Evaluate the midpoint. Replace one end, maintaining a root bracket.
-Linear convergence. Slow but **robust**.
+Slow but **robust**.
 
 *Newton's Method*: $x_{k+1} = x_k - f(x_k) / f'(x_k)$. Faster,
-quadratic convergence (number of correct decimals places doubles each
+**quadratic convergence** (number of correct decimal places doubles each
 iteration).
 
 Downsides of Newton's Method: need derivative info, and additional
 smoothness. Convergence usually not guaranteed unless "sufficiently
-close": not **robust**.
+close": **not robust**.
+*Secant Method*: $x_{k+1} = x_k - f(x_k)(x_{k-1} - x_k) / (f(x_{k-1}) - f(x_k))$. **Superlinear convergence**:
+faster than bisection but slower than Newton's method. Useful when the derivative of the function is more complicated
+than the function itself, hard to compute, or simply doesn't exist. Need two initial values.
+
+Downsides of the Secant Method: as with Newton's, convergence not guaranteed if the initial
+values are "too far": **not robust**.
 
 Systems
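
For reference, a minimal Python sketch of the three iterations described in the patched note (bisection, Newton, secant). It is not part of the patch itself; the function names, tolerances, and iteration caps are illustrative assumptions, not taken from `02_newton.md`.

```python
# Minimal sketches of the three root-finding iterations discussed above.
# Tolerances, iteration caps, and function names are illustrative choices.

def bisection(f, a, b, tol=1e-12, max_iter=200):
    """Linear convergence: halve a bracketing interval [a, b] each step."""
    fa, fb = f(a), f(b)
    if fa * fb > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    for _ in range(max_iter):
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0 or 0.5 * (b - a) < tol:
            return m
        # Keep the half-interval that still brackets the root.
        if fa * fm < 0:
            b, fb = m, fm
        else:
            a, fa = m, fm
    return 0.5 * (a + b)


def newton(f, fprime, x0, tol=1e-12, max_iter=50):
    """Quadratic convergence near the root, but needs f' and a good x0."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton's method did not converge")


def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Superlinear convergence; replaces f' with a two-point slope estimate."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        # x_{k+1} = x_k - f(x_k) (x_{k-1} - x_k) / (f(x_{k-1}) - f(x_k))
        x2 = x1 - f1 * (x0 - x1) / (f0 - f1)
        if abs(x2 - x1) < tol:
            return x2
        x0, f0, x1, f1 = x1, f1, x2, f(x2)
    raise RuntimeError("secant method did not converge")


if __name__ == "__main__":
    # Example: the positive root of x^2 - 2 is sqrt(2) ~ 1.41421356.
    f = lambda x: x**2 - 2
    print(bisection(f, 0.0, 2.0))
    print(newton(f, lambda x: 2 * x, x0=1.0))
    print(secant(f, x0=0.0, x1=2.0))
```

Note that bisection needs a sign-changing bracket, Newton needs the derivative and one starting point, and the secant method needs two starting points but no derivative, matching the trade-offs described in the note.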