ECE 204: Numerical Methods
David Lau
Estimated study time: 4 minutes
Sources and References
- Equivalent UW courses — CS 370 (Numerical Computation), AMATH 242 / CS 371 (Introduction to Computational Mathematics)
- Primary textbook — Sauer, Timothy. Numerical Analysis, 3rd ed. (recommended)
- Supplementary references — Heath, Scientific Computing: An Introductory Survey; Trefethen & Bau, Numerical Linear Algebra for deeper linear-algebra background
Equivalent UW Courses
ECE 204 covers the same core syllabus as CS 370 and AMATH 242 / CS 371: floating-point and error analysis, root finding, linear systems, interpolation, numerical integration, and IVPs for ODEs. CS 370 tends to be the broadest and most CS-flavoured version, AMATH 242 is taken by Math / CM students and treats the underlying analysis more carefully, and CS 371 (cross-listed with AMATH 242) is the honours variant with more proofs. A student who has taken any one of the three would find ECE 204 largely familiar in content, with the main differences being rigour, programming environment, and pace.
What This Course Adds Beyond the Equivalents
- Engineering framing. Problems are motivated by circuit simulation, signal processing, and ODE models from physics / control rather than by generic mathematical examples. Algorithm selection is discussed in terms of what a practicing engineer needs (robustness, step-size control, software reliability).
- Optimization primer. A short introduction to numerical optimization is included at the end, a topic that CS 370 treats only lightly and AMATH 242 leaves to follow-up courses.
- Omissions. Relative to CS 370 / AMATH 242, the course skips detailed error-analysis proofs, IEEE 754 subtleties beyond working knowledge, sparse-matrix techniques, and the QR / SVD family of decompositions (which CS 370 covers for least squares). Discretization of PDEs is mentioned at the learning-outcome level only.
Topic Summary
Number Systems and Error Propagation
Floating-point representation, machine epsilon, roundoff vs truncation error, and forward / backward error. Conditioning of a problem vs stability of an algorithm; catastrophic cancellation in near-equal subtraction.
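A minimal Python sketch (not from the course materials) can make both ideas concrete: measuring machine epsilon by halving, and seeing catastrophic cancellation in the classic quadratic-root example, where an algebraic rewrite restores accuracy. The function names here are illustrative, not from the course.

```python
import math

# Machine epsilon for binary64: the smallest eps with 1 + eps != 1,
# found by repeated halving.
eps = 1.0
while 1.0 + eps / 2 != 1.0:
    eps /= 2

# Catastrophic cancellation: the small root of x^2 - 2bx + c = 0
# is b - sqrt(b^2 - c), a difference of two nearly equal numbers
# when b >> c.
def small_root_naive(b, c):
    return b - math.sqrt(b * b - c)

def small_root_stable(b, c):
    # Rationalize: (b - sqrt(b^2-c))(b + sqrt(b^2-c)) = c,
    # so the small root is c / (b + sqrt(b^2-c)) -- no cancellation.
    return c / (b + math.sqrt(b * b - c))

b, c = 1e8, 1.0
# True small root is about c/(2b) = 5e-9; the naive formula loses
# every significant digit at this scale, the stable one does not.
naive = small_root_naive(b, c)
stable = small_root_stable(b, c)
```

With these inputs, `b*b - c` rounds back to `b*b` in double precision, so the naive formula returns exactly zero while the rewritten one is accurate to full precision.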
Linear Systems
Gaussian elimination with partial pivoting, LU factorisation, and operation counts. Norms, condition number \( \kappa(A) = \|A\|\,\|A^{-1}\| \), and how it bounds the sensitivity of \( Ax = b \) to perturbations.
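As a sketch of the elimination step (an illustrative implementation, not code from the course), the following solves a small dense system with partial pivoting and back substitution:

```python
def ge_solve(A, b):
    """Solve Ax = b by Gaussian elimination with partial pivoting.
    A is a list of row lists; inputs are copied, not modified."""
    n = len(A)
    M = [row[:] + [bi] for row, bi in zip(A, b)]  # augmented matrix
    for k in range(n):
        # Partial pivoting: bring the largest |entry| in column k
        # (rows k..n-1) to the pivot position.
        p = max(range(k, n), key=lambda i: abs(M[i][k]))
        M[k], M[p] = M[p], M[k]
        for i in range(k + 1, n):
            m = M[i][k] / M[k][k]           # elimination multiplier
            for j in range(k, n + 1):
                M[i][j] -= m * M[k][j]
    # Back substitution on the upper-triangular system.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        s = M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))
        x[i] = s / M[i][i]
    return x

A = [[2.0, 1.0, -1.0], [-3.0, -1.0, 2.0], [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
x = ge_solve(A, b)   # exact solution is [2, 3, -1]
```

The multipliers `m` are exactly the entries of L in the LU factorisation, so the same loop structure yields LU with row pivoting; the work is the familiar \( \tfrac{2}{3}n^3 + O(n^2) \) flops.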
Roots of Nonlinear Equations
Bisection, fixed-point iteration, Newton’s method, and the secant method. Convergence orders (linear, quadratic, superlinear) and failure modes (multiple roots, poor initial guesses).
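A compact sketch of Newton's method (illustrative code, not the course's own) shows the iteration and a step-size stopping test:

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k)/f'(x_k).
    Stops when the last step is smaller than tol."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            return x
    raise RuntimeError("Newton iteration did not converge")

# sqrt(2) as the root of f(x) = x^2 - 2. Quadratic convergence
# roughly doubles the number of correct digits each iteration,
# so a handful of steps reaches machine precision from x0 = 1.
root = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0)
```

Bisection, by contrast, gains only one binary digit per step (linear convergence) but never fails on a sign-changing bracket, which is the robustness trade-off the course emphasises.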
Systems of Nonlinear Equations
Newton’s method in \( \mathbb R^n \) using the Jacobian, with a short discussion of when to use quasi-Newton updates. Stopping criteria based on residual and step size.
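The vector iteration solves the linear system \( J(x_k)\,s_k = -F(x_k) \) at each step. A hand-rolled 2D sketch (illustrative, with the 2x2 Jacobian solve done by Cramer's rule rather than a general linear solver):

```python
def newton2(F, J, x0, tol=1e-12, max_iter=50):
    """Newton's method in R^2: solve J(x) s = -F(x), update x += s.
    Stops when the step is small; a residual test works similarly."""
    x, y = x0
    for _ in range(max_iter):
        f1, f2 = F(x, y)
        (a, b), (c, d) = J(x, y)
        det = a * d - b * c
        # Cramer's rule for the 2x2 system J s = -F.
        sx = (-f1 * d + b * f2) / det
        sy = (-a * f2 + c * f1) / det
        x, y = x + sx, y + sy
        if abs(sx) + abs(sy) < tol:
            return x, y
    raise RuntimeError("no convergence")

# Intersect the unit circle with the line y = x:
# F(x, y) = (x^2 + y^2 - 1, x - y), root at (sqrt(2)/2, sqrt(2)/2).
F = lambda x, y: (x * x + y * y - 1.0, x - y)
J = lambda x, y: ((2.0 * x, 2.0 * y), (1.0, -1.0))
root = newton2(F, J, (1.0, 0.5))
```

Quasi-Newton updates (e.g. Broyden's method) replace the exact Jacobian with a cheap rank-one update when forming or factoring J at every step is too expensive.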
Regression and Least Squares
Overdetermined systems \( Ax \approx b \); the normal equations
\[ A^\top A\,x = A^\top b \] and when they become ill-conditioned. Linear and polynomial regression, with a note on why high-degree polynomial fits are a bad idea.
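For a straight-line fit, the normal equations reduce to a 2x2 system that can be written out by hand. A small sketch (illustrative helper name, not from the course):

```python
def fit_line(xs, ys):
    """Least-squares line y = c0 + c1*x via the normal equations
    A^T A c = A^T y, where A has columns [1, x].
    For this model, A^T A = [[n, sum x], [sum x, sum x^2]]."""
    n = len(xs)
    sx = sum(xs)
    sxx = sum(x * x for x in xs)
    sy = sum(ys)
    sxy = sum(x * y for x, y in zip(xs, ys))
    det = n * sxx - sx * sx          # det(A^T A)
    c0 = (sxx * sy - sx * sxy) / det
    c1 = (n * sxy - sx * sy) / det
    return c0, c1

# Points lying exactly on y = 1 + 2x are recovered exactly.
c0, c1 = fit_line([0.0, 1.0, 2.0, 3.0], [1.0, 3.0, 5.0, 7.0])
```

The ill-conditioning warning comes from \( \kappa(A^\top A) = \kappa(A)^2 \): squaring the condition number is what makes QR-based least squares (covered in CS 370 but omitted here) preferable for sensitive problems.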
Interpolation
Lagrange and Newton divided-difference forms. Runge’s phenomenon on equispaced nodes motivates Chebyshev points. Piecewise and cubic-spline interpolation for smooth reconstruction from samples.
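A direct evaluation of the Lagrange form (a sketch for clarity; the Newton divided-difference form is cheaper when adding nodes incrementally):

```python
def lagrange_eval(xs, ys, t):
    """Evaluate the Lagrange interpolating polynomial through the
    points (xs[i], ys[i]) at t, using the basis-polynomial form
    p(t) = sum_i y_i * prod_{j != i} (t - x_j) / (x_i - x_j)."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        L = 1.0
        for j, xj in enumerate(xs):
            if j != i:
                L *= (t - xj) / (xi - xj)
        total += yi * L
    return total

# Three samples of y = x^2: the unique degree-2 interpolant is x^2
# itself, so evaluation between the nodes is exact.
xs, ys = [0.0, 1.0, 2.0], [0.0, 1.0, 4.0]
value = lagrange_eval(xs, ys, 1.5)   # should be 1.5^2 = 2.25
```

Runge's phenomenon appears when this construction is pushed to high degree on equispaced nodes; clustering the nodes at Chebyshev points, or switching to piecewise cubic splines, avoids the oscillation.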
Numerical Differentiation
Forward, backward, and central finite differences; truncation-error order via Taylor series. The roundoff / truncation trade-off that makes very small \( h \) worse, not better.
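The trade-off is easy to demonstrate numerically. A sketch (illustrative, not course code) comparing a moderate step to a far-too-small one for the central difference:

```python
import math

def central_diff(f, x, h):
    """Central difference (f(x+h) - f(x-h)) / (2h): truncation
    error is O(h^2), but roundoff contributes roughly eps/h."""
    return (f(x + h) - f(x - h)) / (2.0 * h)

# Differentiate exp at 0, where the true derivative is 1.
# At h = 1e-5 the h^2 truncation term dominates and the error is
# tiny; at h = 1e-13 the subtraction of nearly equal values makes
# the eps/h roundoff term dominate, and the error is far larger.
err_moderate = abs(central_diff(math.exp, 0.0, 1e-5) - 1.0)
err_tiny = abs(central_diff(math.exp, 0.0, 1e-13) - 1.0)
```

Balancing the two error terms \( h^2 \) and \( \varepsilon/h \) gives the rule-of-thumb optimal step \( h \sim \varepsilon^{1/3} \) for central differences.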
Numerical Integration
Newton-Cotes rules (trapezoid, Simpson), composite rules, error formulas in terms of derivatives. Brief mention of adaptive quadrature and Gaussian quadrature.
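A composite Simpson sketch (illustrative implementation) showing the standard 1-4-2-4-...-4-1 weight pattern:

```python
import math

def composite_simpson(f, a, b, n):
    """Composite Simpson's rule with n (even) subintervals.
    Error is O(h^4) for smooth integrands, h = (b - a)/n."""
    if n % 2:
        raise ValueError("n must be even")
    h = (b - a) / n
    s = f(a) + f(b)
    for i in range(1, n):
        # Interior weights alternate 4 (odd index), 2 (even index).
        s += (4 if i % 2 else 2) * f(a + i * h)
    return s * h / 3.0

# Integral of sin over [0, pi] is exactly 2; with n = 20 the
# O(h^4) error is already around 1e-5.
approx = composite_simpson(math.sin, 0.0, math.pi, 20)
```

Adaptive quadrature wraps a rule like this in an error estimator and subdivides only where the integrand is badly behaved; Gaussian quadrature instead chooses both nodes and weights to integrate polynomials of degree 2n-1 exactly.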
ODE Initial-Value Problems
Explicit Euler, midpoint / RK2, and classical RK4. Local vs global truncation error, absolute stability, stiffness, and the motivation for implicit methods. Adaptive step-size control via embedded error estimates.
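The classical RK4 step can be sketched in a few lines (illustrative code, fixed step size; real solvers add the embedded error estimate and step-size controller mentioned above):

```python
import math

def rk4(f, t0, y0, t_end, n):
    """Classical fourth-order Runge-Kutta for y' = f(t, y) with
    n fixed steps; global error is O(h^4)."""
    h = (t_end - t0) / n
    t, y = t0, y0
    for _ in range(n):
        k1 = f(t, y)
        k2 = f(t + h / 2, y + h / 2 * k1)
        k3 = f(t + h / 2, y + h / 2 * k2)
        k4 = f(t + h, y + h * k3)
        y += h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)
        t += h
    return y

# Test problem y' = -y, y(0) = 1, exact solution e^{-t}.
# With 20 steps to t = 1 the answer matches exp(-1) to ~1e-8.
y1 = rk4(lambda t, y: -y, 0.0, 1.0, 1.0, 20)
```

The same test equation \( y' = \lambda y \) is what defines the region of absolute stability: explicit methods like RK4 force tiny steps when \( \lambda \) is large and negative (stiffness), which is the motivation for implicit methods.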
Optimization
Golden-section search in 1D; gradient descent and Newton’s method for unconstrained problems in \( \mathbb R^n \). Line-search vs trust-region strategies are introduced conceptually.
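Golden-section search is short enough to sketch in full (illustrative implementation): it needs only function values, no derivatives, and shrinks the bracket by the constant factor \( 1/\varphi \approx 0.618 \) per iteration.

```python
def golden_section(f, a, b, tol=1e-8):
    """Golden-section search for the minimum of a unimodal f on
    [a, b]. Each iteration keeps the sub-bracket containing the
    minimum; linear convergence with ratio 1/phi ~ 0.618."""
    invphi = (5 ** 0.5 - 1) / 2          # 1/phi
    c = b - invphi * (b - a)             # interior points, c < d
    d = a + invphi * (b - a)
    while b - a > tol:
        if f(c) < f(d):
            b, d = d, c                  # minimum lies in [a, d]
            c = b - invphi * (b - a)
        else:
            a, c = c, d                  # minimum lies in [c, b]
            d = a + invphi * (b - a)
    return (a + b) / 2

# Minimum of (x - 2)^2 on [0, 5] is at x = 2.
xmin = golden_section(lambda x: (x - 2.0) ** 2, 0.0, 5.0)
```

It plays the same role in 1D minimization that bisection plays in root finding: slow but unconditionally reliable on a valid bracket, in contrast to the fast-but-fragile Newton iteration.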