MATH 137: Calculus 1 for Honours Mathematics

Estimated study time: 47 minutes


This quick reference covers MATH 137 (Fall 2017, University of Waterloo), based on the course notes by Barbara A. Forrest and Brian E. Forrest. The material progresses from sequences through limits, derivatives, the Mean Value Theorem, and culminates in Taylor polynomials.

Chapter 1: Sequences and Convergence

1.1 Absolute Values

The absolute value provides the fundamental notion of distance on the real number line and underpins all of the limit definitions in this course.

\[ |x| = \begin{cases} x & \text{if } x \ge 0 \\ -x & \text{if } x < 0. \end{cases} \]

Geometrically, \(|x|\) is the distance from \(x\) to \(0\), and \(|a - b|\) is the distance between \(a\) and \(b\).

1.1.1 Inequalities Involving Absolute Values

Theorem (Triangle Inequality). For all \(x, y, z \in \mathbb{R}\),

\[ |x - y| \le |x - z| + |z - y|. \]

This says the distance from \(x\) to \(y\) never exceeds the sum of the distances from \(x\) to \(z\) and from \(z\) to \(y\).

Taking \(z = 0\) and replacing \(y\) by \(-y\) yields the familiar form

\[ |x + y| \le |x| + |y|. \]

The key inequality \(|x - a| < \delta\) describes the open interval \((a - \delta, a + \delta)\), while \(0 < |x - a| < \delta\) describes \((a-\delta, a+\delta) \setminus \{a\}\). These sets appear throughout the \(\varepsilon\)-\(\delta\) framework.

1.2 Sequences and Their Limits

1.2.1 Introduction to Sequences

A sequence is an infinite ordered list of real numbers \(\{a_1, a_2, a_3, \ldots\}\), also written \(\{a_n\}_{n=1}^\infty\) or simply \(\{a_n\}\). Sequences can be given by explicit formulas (e.g., \(a_n = 1/n\)), by a list from which a pattern is deduced, or by recursion.

1.2.2 Recursively Defined Sequences

A recursively defined sequence specifies each term in terms of previous terms. For instance, \(a_1 = 1\) and \(a_{n+1} = \frac{1}{1 + a_n}\) defines a sequence whose terms converge to \(\frac{-1+\sqrt{5}}{2}\), the reciprocal of the golden ratio. Heron’s algorithm for square roots uses the recursion \(a_{n+1} = \frac{1}{2}\bigl(a_n + \frac{\alpha}{a_n}\bigr)\) to approximate \(\sqrt{\alpha}\).
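Both recursions are easy to run. The sketch below is our own illustration (the helper `iterate` is not from the notes); it simply applies each rule a fixed number of times.

```python
def iterate(f, a1, n):
    """Return the n-th term of the sequence a_{k+1} = f(a_k), starting at a1."""
    a = a1
    for _ in range(n - 1):
        a = f(a)
    return a

# a_1 = 1, a_{n+1} = 1/(1 + a_n): terms approach (-1 + sqrt(5))/2 ≈ 0.618
phi_recip = iterate(lambda a: 1.0 / (1.0 + a), 1.0, 50)

# Heron's algorithm for sqrt(alpha) with alpha = 2, starting from a_1 = 1
alpha = 2.0
heron = iterate(lambda a: 0.5 * (a + alpha / a), 1.0, 8)

print(phi_recip, heron)
```

Heron’s recursion gains accurate digits far faster than the first example, foreshadowing Newton’s Method in Chapter 3 (of which it is a special case).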

1.2.3 Subsequences and Tails

Definition (Subsequence). Let \(\{a_n\}\) be a sequence. Let \(n_1 < n_2 < n_3 < \cdots\) be natural numbers. The sequence \(\{a_{n_k}\} = \{a_{n_1}, a_{n_2}, a_{n_3}, \ldots\}\) is called a subsequence of \(\{a_n\}\).

Definition (Tail of a Sequence). Given a sequence \(\{a_n\}\) and \(k \in \mathbb{N}\), the subsequence \(\{a_k, a_{k+1}, a_{k+2}, \ldots\}\) is called the tail of \(\{a_n\}\) with cutoff \(k\).

1.2.4 Limits of Sequences

Definition (Limit of a Sequence, Formal I). We say that \(L\) is the limit of the sequence \(\{a_n\}\) as \(n \to \infty\) if for every \(\varepsilon > 0\) there exists \(N \in \mathbb{N}\) such that if \(n \ge N\), then \(|a_n - L| < \varepsilon\). We write \(\lim_{n\to\infty} a_n = L\).

Definition (Limit of a Sequence, Formal II). \(L = \lim_{n\to\infty} a_n\) if for every \(\varepsilon > 0\), the interval \((L - \varepsilon, L + \varepsilon)\) contains a tail of \(\{a_n\}\).

Theorem 3 (Equivalent Characterizations of Convergence). The following are equivalent:

  1. \(\lim_{n\to\infty} a_n = L\).
  2. Every interval \((L-\varepsilon, L+\varepsilon)\) contains a tail of \(\{a_n\}\).
  3. Every interval \((L-\varepsilon, L+\varepsilon)\) contains all but finitely many terms of \(\{a_n\}\).
  4. Every open interval \((a,b)\) containing \(L\) contains a tail of \(\{a_n\}\).
  5. Every open interval \((a,b)\) containing \(L\) contains all but finitely many terms of \(\{a_n\}\).

Theorem 4 (Uniqueness of Limits for Sequences). If a sequence \(\{a_n\}\) has a limit \(L\), then \(L\) is unique.

Proposition 5. Let \(\{a_n\}\) be a sequence with \(a_n \ge 0\) for each \(n \in \mathbb{N}\). If \(L = \lim_{n\to\infty} a_n\), then \(L \ge 0\).

1.2.5 Divergence to \(\pm\infty\)

Definition (Divergence to \(+\infty\)). We say \(\{a_n\}\) diverges to \(\infty\) if for every \(M > 0\) there exists \(N \in \mathbb{N}\) so that if \(n \ge N\), then \(a_n > M\). We write \(\lim_{n\to\infty} a_n = \infty\).

Definition (Divergence to \(-\infty\)). We say \(\{a_n\}\) diverges to \(-\infty\) if for every \(M < 0\) there exists \(N \in \mathbb{N}\) so that if \(n \ge N\), then \(a_n < M\). We write \(\lim_{n\to\infty} a_n = -\infty\).

Theorem 6. (i) If \(\alpha > 0\), then \(\lim_{n\to\infty} n^\alpha = \infty\). (ii) If \(\alpha < 0\), then \(\lim_{n\to\infty} n^\alpha = 0\).

1.2.6 Arithmetic for Limits of Sequences

Theorem 7 (Arithmetic Rules for Limits of Sequences). Let \(\lim_{n\to\infty} a_n = L\) and \(\lim_{n\to\infty} b_n = M\). Then:
(i) If \(a_n = c\) for every \(n\), then \(L = c\).
(ii) \(\lim_{n\to\infty} c\,a_n = cL\).
(iii) \(\lim_{n\to\infty}(a_n + b_n) = L + M\).
(iv) \(\lim_{n\to\infty} a_n b_n = LM\).
(v) \(\lim_{n\to\infty} \frac{a_n}{b_n} = \frac{L}{M}\) if \(M \ne 0\).
(vi) If \(a_n \ge 0\) for all \(n\) and \(\alpha > 0\), then \(\lim_{n\to\infty} a_n^\alpha = L^\alpha\).
(vii) For any \(k \in \mathbb{N}\), \(\lim_{n\to\infty} a_{n+k} = L\).

Theorem 8. Assume \(\lim_{n\to\infty} b_n = 0\) and \(\lim_{n\to\infty} \frac{a_n}{b_n}\) exists. Then \(\lim_{n\to\infty} a_n = 0\).

1.3 Squeeze Theorem

Theorem 9 (Squeeze Theorem for Sequences). Assume \(a_n \le b_n \le c_n\) and \(\lim_{n\to\infty} a_n = L = \lim_{n\to\infty} c_n\). Then \(\{b_n\}\) converges and \(\lim_{n\to\infty} b_n = L\).

For example, since \(-1/n \le \sin(n)/n \le 1/n\) and both bounds converge to \(0\), the Squeeze Theorem gives \(\lim_{n\to\infty} \sin(n)/n = 0\).

Figure: the Squeeze Theorem applied to \(\sin(n)/n\).
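A numeric spot-check of the squeeze (a plain-Python sketch of ours, not from the notes):

```python
import math

# sin(n)/n is squeezed between -1/n and 1/n for every n >= 1
terms = [(n, math.sin(n) / n) for n in range(1, 10001)]
squeezed = all(-1.0 / n <= a <= 1.0 / n for n, a in terms)
tail_max = max(abs(a) for n, a in terms if n >= 1000)
print(squeezed, tail_max)  # tail terms are at most 1/1000 in size
```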

1.4 Monotone Convergence Theorem

Definition (Upper and Lower Bounds). Let \(S \subset \mathbb{R}\). We say \(\alpha\) is an upper bound of \(S\) if \(x \le \alpha\) for every \(x \in S\). We say \(\beta\) is a lower bound if \(\beta \le x\) for every \(x \in S\). \(S\) is bounded if it is bounded both above and below.

Definition (Least Upper Bound). \(\alpha = \text{lub}(S)\) if \(\alpha\) is an upper bound of \(S\) and is the smallest such upper bound. Also called the supremum, \(\sup(S)\).

Definition (Greatest Lower Bound). \(\beta = \text{glb}(S)\) if \(\beta\) is a lower bound of \(S\) and is the largest such lower bound. Also called the infimum, \(\inf(S)\).

Axiom 10 (Least Upper Bound Property). Let \(S \subset \mathbb{R}\) be nonempty and bounded above. Then \(S\) has a least upper bound.

Theorem 11 (Monotone Convergence Theorem). Let \(\{a_n\}\) be an increasing sequence.
1. If \(\{a_n\}\) is bounded above, then \(\{a_n\}\) converges to \(L = \text{lub}(\{a_n\})\).
2. If \(\{a_n\}\) is not bounded above, then \(\{a_n\}\) diverges to \(\infty\).
In particular, \(\{a_n\}\) converges if and only if it is bounded above. A similar statement holds for decreasing sequences.

Figure: an increasing sequence bounded above, converging to its least upper bound.
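As an illustration (our own example, not from the notes), the recursion \(a_1 = \sqrt{2}\), \(a_{n+1} = \sqrt{2 + a_n}\) is increasing and bounded above by \(2\), so the Monotone Convergence Theorem guarantees a limit; a short sketch:

```python
import math

a = math.sqrt(2.0)
seq = [a]
for _ in range(20):
    a = math.sqrt(2.0 + a)
    seq.append(a)

increasing = all(x < y for x, y in zip(seq, seq[1:]))
bounded_above = all(x < 2.0 for x in seq)
print(increasing, bounded_above, seq[-1])  # the limit is lub = 2
```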

1.5 Introduction to Series

Definition (Series). Given a sequence \(\{a_n\}\), the formal sum \(\sum_{n=1}^\infty a_n\) is called a series. The \(k\)-th partial sum is \(S_k = \sum_{n=1}^k a_n\). The series converges if \(\{S_k\}\) converges; in that case \(\sum_{n=1}^\infty a_n = \lim_{k\to\infty} S_k\).

1.5.1 Geometric Series

Definition (Geometric Series). A geometric series is \(\sum_{n=0}^\infty r^n = 1 + r + r^2 + \cdots\). The number \(r\) is the ratio.

Theorem 12 (Geometric Series Test). A geometric series converges if and only if \(|r| < 1\), in which case

\[ \sum_{n=0}^\infty r^n = \frac{1}{1 - r}. \]

1.5.2 Divergence Test

Theorem 13 (Divergence Test). If \(\sum_{n=1}^\infty a_n\) converges, then \(\lim_{n\to\infty} a_n = 0\). Equivalently, if \(\lim_{n\to\infty} a_n \ne 0\) or does not exist, then \(\sum_{n=1}^\infty a_n\) diverges.

The converse is false: the Harmonic Series \(\sum 1/n\) diverges even though \(1/n \to 0\).
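Partial sums make the contrast concrete; a sketch of ours comparing a convergent geometric series with the divergent Harmonic Series:

```python
# geometric series with ratio r = 1/2: partial sums approach 1/(1 - 1/2) = 2
geometric = sum(0.5 ** n for n in range(60))

# harmonic partial sums H_N for N = 10, 100, ..., 100000 keep growing (roughly like ln N)
harmonic = [sum(1.0 / n for n in range(1, 10 ** k + 1)) for k in range(1, 6)]
print(geometric, harmonic)
```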


Chapter 2: Limits and Continuity

2.1 Introduction to Limits for Functions

Definition (Limit of a Function at \(x = a\)). Let \(f\) be a function and \(a \in \mathbb{R}\). We say \(\lim_{x \to a} f(x) = L\) if for every \(\varepsilon > 0\) there exists \(\delta > 0\) such that if \(0 < |x - a| < \delta\), then \(|f(x) - L| < \varepsilon\).

Figure: the \(\varepsilon\)-\(\delta\) definition of a limit: if \(0 < |x - a| < \delta\), then \(|f(x) - L| < \varepsilon\).

2.2 Sequential Characterization of Limits

Theorem 1 (Sequential Characterization of Limits). Let \(f\) be defined on an open interval containing \(a\), except possibly at \(a\). Then \(\lim_{x\to a} f(x) = L\) if and only if for every sequence \(\{x_n\}\) with \(x_n \ne a\) and \(x_n \to a\), we have \(\lim_{n\to\infty} f(x_n) = L\).

Theorem 2 (Uniqueness of Limits for Functions). If \(\lim_{x\to a} f(x) = L\) and \(\lim_{x\to a} f(x) = M\), then \(L = M\).

2.3 Arithmetic Rules for Limits of Functions

Theorem 3 (Arithmetic Rules for Limits of Functions). Let \(\lim_{x\to a} f(x) = L\) and \(\lim_{x\to a} g(x) = M\). Then:
(i) If \(f(x) = c\) for all \(x\), then \(\lim_{x\to a} f(x) = c\).
(ii) \(\lim_{x\to a} cf(x) = cL\).
(iii) \(\lim_{x\to a}[f(x)+g(x)] = L+M\).
(iv) \(\lim_{x\to a} f(x)g(x) = LM\).
(v) \(\lim_{x\to a} \frac{f(x)}{g(x)} = \frac{L}{M}\) if \(M \ne 0\).

Theorem 4. If \(\lim_{x\to a} \frac{f(x)}{g(x)}\) exists and \(\lim_{x\to a} g(x) = 0\), then \(\lim_{x\to a} f(x) = 0\).

Theorem 5 (Limits of Polynomials). If \(p(x) = \alpha_0 + \alpha_1 x + \cdots + \alpha_n x^n\) is any polynomial, then \(\lim_{x\to a} p(x) = p(a)\).

2.4 One-sided Limits

Definition (Limit from the Right). \(\lim_{x\to a^+} f(x) = L\) if for every \(\varepsilon > 0\) there exists \(\delta > 0\) such that if \(0 < x - a < \delta\), then \(|f(x) - L| < \varepsilon\).

Definition (Limit from the Left). \(\lim_{x\to a^-} f(x) = L\) if for every \(\varepsilon > 0\) there exists \(\delta > 0\) such that if \(0 < a - x < \delta\), then \(|f(x) - L| < \varepsilon\).

Theorem 6 (One-sided vs Two-sided Limits). \(\lim_{x\to a} f(x) = L\) exists if and only if both one-sided limits exist and \(\lim_{x\to a^-} f(x) = L = \lim_{x\to a^+} f(x)\).

2.5 The Squeeze Theorem

Theorem 7 (Squeeze Theorem for Functions). Assume \(g(x) \le f(x) \le h(x)\) on an open interval containing \(a\) (except possibly at \(a\)), and \(\lim_{x\to a} g(x) = L = \lim_{x\to a} h(x)\). Then \(\lim_{x\to a} f(x) = L\).

2.6 The Fundamental Trigonometric Limit

Theorem 8 (The Fundamental Trigonometric Limit).

\[ \lim_{\theta \to 0} \frac{\sin(\theta)}{\theta} = 1. \]

This is proved by comparing the areas of a triangle, a circular sector, and a larger triangle on the unit circle, then applying the Squeeze Theorem.

2.7 Limits at Infinity and Asymptotes

Definition (Limits at Infinity). \(\lim_{x\to\infty} f(x) = L\) means for every \(\varepsilon > 0\) there exists \(N\) such that if \(x > N\), then \(|f(x) - L| < \varepsilon\). Similarly for \(\lim_{x\to -\infty} f(x) = L\).

Definition (Horizontal Asymptote). If \(\lim_{x\to\infty} f(x) = L\) or \(\lim_{x\to -\infty} f(x) = L\), then \(y = L\) is a horizontal asymptote of \(f\).

Definition (Infinite Limits at \(\infty\)). \(\lim_{x\to\infty} f(x) = \infty\) means for every \(M > 0\) there exists \(N > 0\) such that if \(x > N\), then \(f(x) > M\).

Theorem 9 (Squeeze Theorem at \(\pm\infty\)). If \(g(x) \le f(x) \le h(x)\) for all \(x \ge N\) and \(\lim_{x\to\infty} g(x) = L = \lim_{x\to\infty} h(x)\), then \(\lim_{x\to\infty} f(x) = L\). Analogously for \(x \to -\infty\).

2.7.2 Fundamental Log Limit

Theorem 10 (The Fundamental Log Limit).

\[ \lim_{x\to\infty} \frac{\ln(x)}{x} = 0. \]

More generally, for any \(p > 0\), \(\lim_{x\to\infty} \frac{\ln(x)}{x^p} = 0\) and \(\lim_{x\to\infty} \frac{x^p}{e^x} = 0\). Logarithms grow slower than any positive power; exponentials grow faster than any polynomial.
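These growth rates are easy to observe numerically (a sketch of ours):

```python
import math

# ln(x)/x along x = 10, 100, ..., 10^6: the ratios tend to 0
log_ratios = [math.log(10.0 ** k) / 10.0 ** k for k in range(1, 7)]

# x^5 / e^x: even a high power eventually loses to the exponential
exp_ratios = [x ** 5 / math.exp(x) for x in (10.0, 50.0, 100.0)]
print(log_ratios[-1], exp_ratios[-1])
```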

2.7.3 Vertical Asymptotes and Infinite Limits

Definition (Right-Hand Infinite Limits). \(\lim_{x\to a^+} f(x) = \infty\) means for every \(M > 0\) there exists \(\delta > 0\) such that if \(a < x < a + \delta\), then \(f(x) > M\). Analogously for \(-\infty\) and for limits from the left.

Definition (Infinite Limits). \(\lim_{x\to a} f(x) = \infty\) if both \(\lim_{x\to a^-} f(x) = \infty\) and \(\lim_{x\to a^+} f(x) = \infty\).

Definition (Vertical Asymptote). If any of \(\lim_{x\to a^\pm} f(x) = \pm\infty\) holds, then \(x = a\) is a vertical asymptote for \(f\).

2.8 Continuity

Definition (Continuity I). A function \(f\) is continuous at \(x = a\) if (i) \(\lim_{x\to a} f(x)\) exists, and (ii) \(\lim_{x\to a} f(x) = f(a)\).

Definition (Continuity II). \(f\) is continuous at \(x = a\) if for every \(\varepsilon > 0\) there exists \(\delta > 0\) such that \(|x - a| < \delta\) implies \(|f(x) - f(a)| < \varepsilon\).

Theorem 11 (Sequential Characterization of Continuity). \(f\) is continuous at \(x = a\) if and only if whenever \(\{x_n\}\) is a sequence with \(\lim_{n\to\infty} x_n = a\), we have \(\lim_{n\to\infty} f(x_n) = f(a)\).

2.8.1 Types of Discontinuities

A removable discontinuity occurs when \(\lim_{x\to a} f(x)\) exists but does not equal \(f(a)\) (or \(f(a)\) is undefined). An essential discontinuity occurs when the limit does not exist: this includes jump discontinuities (both one-sided limits exist but differ), infinite discontinuities (function blows up), and oscillatory discontinuities.

2.8.2–2.8.4 Continuity of Standard Functions

Theorem 12. Polynomials are continuous everywhere. The functions \(\sin(x)\), \(\cos(x)\), \(e^x\) are continuous on \(\mathbb{R}\), and \(\ln(x)\) is continuous on \((0,\infty)\).

Theorem 13 (Arithmetic Rules for Continuous Functions). If \(f\) and \(g\) are continuous at \(a\), then so are \(f+g\), \(fg\), \(cf\), and \(f/g\) (when \(g(a)\ne 0\)).

Theorem 14 (Composition of Continuous Functions). If \(f\) is continuous at \(a\) and \(g\) is continuous at \(f(a)\), then \(g \circ f\) is continuous at \(a\).

Definition (Continuity on an Interval). \(f\) is continuous on an open interval \((a,b)\) if it is continuous at every point in \((a,b)\). \(f\) is continuous on \([a,b]\) if it is continuous on \((a,b)\), right-continuous at \(a\), and left-continuous at \(b\).

Theorem 15 (Continuous Image of a Closed Interval). If \(f\) is continuous on \([a,b]\), then the range of \(f\) on \([a,b]\) is also a closed interval.

2.9 Intermediate Value Theorem

Theorem 16 (Intermediate Value Theorem). If \(f\) is continuous on \([a,b]\) and \(k\) is any value between \(f(a)\) and \(f(b)\), then there exists \(c \in (a,b)\) such that \(f(c) = k\).

The IVT provides the theoretical basis for the bisection method: if \(f(a)\) and \(f(b)\) have opposite signs and \(f\) is continuous, then \(f\) has a root in \((a,b)\).
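The bisection method itself is only a few lines; the sketch below is our own illustration (the function name and tolerance are our choices):

```python
def bisection(f, a, b, tol=1e-10):
    """Find a root of f in [a, b], assuming f is continuous and f(a), f(b) have opposite signs."""
    fa, fb = f(a), f(b)
    assert fa * fb < 0, "f(a) and f(b) must have opposite signs"
    while b - a > tol:
        m = 0.5 * (a + b)
        fm = f(m)
        if fm == 0.0:
            return m
        if fa * fm < 0:       # the sign change, hence the root, lies in [a, m]
            b, fb = m, fm
        else:                 # otherwise it lies in [m, b]
            a, fa = m, fm
    return 0.5 * (a + b)

# sqrt(2) as the root of x^2 - 2 on [1, 2]
root = bisection(lambda x: x * x - 2.0, 1.0, 2.0)
print(root)
```

Each iteration halves the interval, so the error shrinks by a factor of 2 per step, in contrast with Newton's Method below.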

2.10 Extreme Value Theorem

Definition (Global Maximum and Minimum). \(c\) is a global maximum for \(f\) on \(I\) if \(f(x) \le f(c)\) for all \(x \in I\). Similarly for global minimum.

Theorem 17 (Extreme Value Theorem). If \(f\) is continuous on a closed interval \([a,b]\), then \(f\) attains both a global maximum and a global minimum on \([a,b]\).


Chapter 3: Derivatives

3.1 Instantaneous Velocity

If \(s(t)\) is the position of an object at time \(t\), then the instantaneous velocity at \(t_0\) is defined as \(v(t_0) = \lim_{h\to 0} \frac{s(t_0+h)-s(t_0)}{h}\), provided this limit exists. This motivates the general definition of derivative.

3.2 Definition of the Derivative

Definition (The Derivative). We say that \(f\) is differentiable at \(x = a\) if

\[ f'(a) = \lim_{h\to 0} \frac{f(a+h) - f(a)}{h} \]

exists. Equivalently, \(f'(a) = \lim_{t\to a} \frac{f(t)-f(a)}{t-a}\).

3.2.1 The Tangent Line

Definition (Tangent Line). If \(f\) is differentiable at \(x = a\), the tangent line to the graph of \(f\) at \(x = a\) is \(y = f(a) + f'(a)(x - a)\).

3.2.2 Differentiability versus Continuity

Theorem 1 (Differentiability Implies Continuity). If \(f\) is differentiable at \(t = a\), then \(f\) is continuous at \(t = a\).

The converse is false: \(f(x) = |x|\) is continuous at \(0\) but not differentiable there.

3.3 The Derivative Function

Definition (Derivative Function). \(f\) is differentiable on an interval \(I\) if \(f'(a)\) exists for every \(a \in I\). The derivative function is \(f'(t) = \lim_{h\to 0}\frac{f(t+h)-f(t)}{h}\).

Definition (Higher Derivatives). The second derivative is \(f'' = (f')'\), also written \(f^{(2)}\) or \(\frac{d^2 f}{dx^2}\). In general, \(f^{(n+1)} = \frac{d}{dx}(f^{(n)})\).

3.4 Derivatives of Elementary Functions

Theorem 2 (Derivative of \(\sin(x)\)). If \(f(x) = \sin(x)\), then \(f'(x) = \cos(x)\).

Theorem 3 (Derivative of \(\cos(x)\)). If \(f(x) = \cos(x)\), then \(f'(x) = -\sin(x)\).

Theorem 4 (Derivative of \(e^x\)). If \(f(x) = e^x\), then \(f'(x) = e^x\).

3.5 Tangent Lines and Linear Approximation

Definition (Linear Approximation). The linear approximation (or linearization) of \(f\) at \(x = a\) is

\[ L_a^f(x) = f(a) + f'(a)(x - a). \]

When \(f\) is clear from context, we write simply \(L_a\).

Definition (Error in Linear Approximation). The error is \(|f(x) - L_a(x)|\).

Theorem (Error Bound for Linear Approximation). If \(|f''(x)| \le M\) for all \(x\) in an interval \(I\) containing \(a\), then

\[ |f(x) - L_a(x)| \le \frac{M}{2}(x - a)^2 \]

for each \(x \in I\).
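For \(f(x) = \sin(x)\) at \(a = 0\), the linearization is \(L_0(x) = x\) and \(|f''| \le 1\), so the quadratic error bound with \(M = 1\) can be verified numerically (a sketch of ours):

```python
import math

# error of the linear approximation sin(x) ≈ x, against the bound (1/2)x^2 (M = 1)
xs = [k / 100.0 for k in range(-100, 101)]
within_bound = all(abs(math.sin(x) - x) <= 0.5 * x * x + 1e-15 for x in xs)
print(within_bound)
```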

3.6 Newton’s Method

Newton’s Method generates a recursive sequence \(x_{n+1} = x_n - \frac{f(x_n)}{f'(x_n)}\) to approximate a root of \(f(x) = 0\). Each step uses the tangent line approximation. In favorable conditions, the number of correct decimal places roughly doubles with each iteration, making it far more efficient than the bisection method.
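Watching the error shrink makes the doubling of accuracy visible; a sketch of ours for \(\sqrt{2}\) via \(f(x) = x^2 - 2\):

```python
def newton(f, fprime, x0, steps):
    """Iterate x_{n+1} = x_n - f(x_n)/f'(x_n), returning all approximations."""
    xs = [x0]
    for _ in range(steps):
        x = xs[-1]
        xs.append(x - f(x) / fprime(x))
    return xs

# approximate sqrt(2) as the root of f(x) = x^2 - 2, starting from x_0 = 1
approx = newton(lambda x: x * x - 2.0, lambda x: 2.0 * x, 1.0, 5)
errors = [abs(x - 2 ** 0.5) for x in approx]
print(errors)  # each error is roughly the square of the previous one
```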

3.7 Arithmetic Rules of Differentiation

Theorem 6 (Arithmetic Rules for Differentiation). If \(f\) and \(g\) are differentiable at \(x = a\):
1) Constant Multiple: \((cf)'(a) = cf'(a)\).
2) Sum Rule: \((f+g)'(a) = f'(a) + g'(a)\).
3) Product Rule: \((fg)'(a) = f'(a)g(a) + f(a)g'(a)\).
4) Reciprocal Rule: \((1/g)'(a) = -g'(a)/[g(a)]^2\) if \(g(a) \ne 0\).
5) Quotient Rule: \((f/g)'(a) = \frac{f'(a)g(a) - f(a)g'(a)}{[g(a)]^2}\) if \(g(a) \ne 0\).

Theorem 7 (Power Rule). If \(\alpha \in \mathbb{R}\), \(\alpha \ne 0\), and \(f(x) = x^\alpha\), then \(f'(x) = \alpha x^{\alpha - 1}\) wherever \(x^{\alpha-1}\) is defined.

3.8 The Chain Rule

Theorem (The Chain Rule). If \(f\) is differentiable at \(x = a\) and \(g\) is differentiable at \(f(a)\), then \(h = g \circ f\) is differentiable at \(x = a\) and

\[ h'(a) = g'(f(a)) \cdot f'(a). \]

In Leibniz notation: \(\frac{dz}{dx} = \frac{dz}{dy}\cdot\frac{dy}{dx}\).

3.9 Derivatives of Other Trigonometric Functions

\[ \frac{d}{dx}\tan(x) = \sec^2(x), \quad \frac{d}{dx}\cot(x) = -\csc^2(x), \]

\[ \frac{d}{dx}\sec(x) = \tan(x)\sec(x), \quad \frac{d}{dx}\csc(x) = -\cot(x)\csc(x). \]

3.10 Derivatives of Inverse Functions

Theorem (Derivative of an Inverse Function). Let \(f\) be invertible with inverse \(g\), and let \(b = f(a)\). If \(f\) is differentiable at \(a\) with \(f'(a) \ne 0\) and \(g\) is continuous at \(b\), then \(g\) is differentiable at \(b\) and

\[ g'(b) = \frac{1}{f'(a)} = \frac{1}{f'(g(b))}. \]

A key consequence: \(\frac{d}{dx}\ln(x) = \frac{1}{x}\).

3.11 Derivatives of Inverse Trigonometric Functions

\[ \frac{d}{dx}\arcsin(x) = \frac{1}{\sqrt{1-x^2}}, \quad \frac{d}{dx}\arccos(x) = \frac{-1}{\sqrt{1-x^2}}, \quad \frac{d}{dx}\arctan(x) = \frac{1}{1+x^2}. \]

3.12 Implicit Differentiation

When a relation \(F(x,y) = 0\) implicitly defines \(y\) as a differentiable function of \(x\), we differentiate both sides with respect to \(x\) (using the Chain Rule on \(y\) terms) and solve for \(\frac{dy}{dx}\). Logarithmic differentiation handles functions of the form \(y = g(x)^{f(x)}\) by taking \(\ln\) of both sides first.

3.13 Local Extrema

Definition (Local Maxima and Minima). \(c\) is a local maximum for \(f\) if there exists an open interval \((a,b)\) containing \(c\) with \(f(x) \le f(c)\) for all \(x \in (a,b)\). Similarly for local minimum.

Theorem 10 (Local Extrema Theorem). If \(c\) is a local maximum or local minimum for \(f\) and \(f'(c)\) exists, then \(f'(c) = 0\).

Definition (Critical Point). A point \(c\) in the domain of \(f\) is a critical point if \(f'(c) = 0\) or \(f'(c)\) does not exist.


Chapter 4: The Mean Value Theorem

4.1 The Mean Value Theorem

Theorem 1 (The Mean Value Theorem). If \(f\) is continuous on \([a,b]\) and differentiable on \((a,b)\), then there exists \(c \in (a,b)\) such that

\[ f'(c) = \frac{f(b) - f(a)}{b - a}. \]

Theorem 2 (Rolle’s Theorem). If \(f\) is continuous on \([a,b]\), differentiable on \((a,b)\), and \(f(a) = 0 = f(b)\), then there exists \(c \in (a,b)\) with \(f'(c) = 0\).

4.2 Applications of the Mean Value Theorem

4.2.1 Antiderivatives

Definition (Antiderivative). Given a function \(f\), an antiderivative is a function \(F\) such that \(F'(x) = f(x)\).

Theorem 3 (Constant Function Theorem). If \(f'(x) = 0\) for all \(x \in I\), then \(f\) is constant on \(I\).

Theorem 4 (Antiderivative Theorem). If \(f'(x) = g'(x)\) for all \(x \in I\), then there exists a constant \(\alpha\) such that \(f(x) = g(x) + \alpha\) for every \(x \in I\).

For \(\alpha \ne -1\),

\[ \int x^\alpha\,dx = \frac{x^{\alpha+1}}{\alpha+1} + C. \]

4.2.2 Increasing Function Theorem

Definition (Increasing and Decreasing Functions). \(f\) is increasing on \(I\) if \(x_1 < x_2\) implies \(f(x_1) < f(x_2)\). \(f\) is decreasing if \(x_1 < x_2\) implies \(f(x_1) > f(x_2)\).

Theorem 6 (Increasing/Decreasing Function Theorem).
(i) If \(f'(x) > 0\) on \(I\), then \(f\) is increasing on \(I\).
(ii) If \(f'(x) \ge 0\) on \(I\), then \(f\) is non-decreasing on \(I\).
(iii) If \(f'(x) < 0\) on \(I\), then \(f\) is decreasing on \(I\).
(iv) If \(f'(x) \le 0\) on \(I\), then \(f\) is non-increasing on \(I\).

4.2.3 Functions with Bounded Derivatives

Theorem 7 (Bounded Derivative Theorem). If \(f\) is continuous on \([a,b]\), differentiable on \((a,b)\), and \(m \le f'(x) \le M\) for all \(x \in (a,b)\), then

\[ f(a) + m(x-a) \le f(x) \le f(a) + M(x-a) \]

for all \(x \in [a,b]\).

4.2.4 Comparing Functions Using Their Derivatives

Theorem 8. Assume \(f\) and \(g\) are continuous at \(x=a\), differentiable for \(x \ne a\), and \(f(a) = g(a)\).
(i) If \(f'(x) \le g'(x)\) for all \(x > a\), then \(f(x) \le g(x)\) for all \(x > a\).
(ii) If \(f'(x) \le g'(x)\) for all \(x < a\), then \(f(x) \ge g(x)\) for all \(x < a\).

These comparison techniques yield, among other things, the exponential limit

\[ e^\alpha = \lim_{n\to\infty}\left(1 + \frac{\alpha}{n}\right)^n. \]
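The exponential limit is easy to check numerically (our own sketch, with \(\alpha = 1\)):

```python
import math

alpha = 1.0
# (1 + alpha/n)^n for growing n climbs toward e^alpha
approximations = [(1.0 + alpha / n) ** n for n in (10, 100, 10000, 1000000)]
print(approximations, math.exp(alpha))
```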

4.2.5–4.2.6 Concavity

Definition (Concavity). The graph of \(f\) is concave upwards on \(I\) if for every pair \(a, b \in I\), the secant line joining \((a,f(a))\) and \((b,f(b))\) lies above the graph. Concave downwards means the secant lies below.

Theorem 10 (Second Derivative Test for Concavity).
(i) If \(f''(x) > 0\) on \(I\), then \(f\) is concave upwards on \(I\).
(ii) If \(f''(x) < 0\) on \(I\), then \(f\) is concave downwards on \(I\).

Definition (Inflection Point). \(c\) is an inflection point for \(f\) if \(f\) is continuous at \(c\) and the concavity of \(f\) changes at \(c\).

Theorem 11 (Test for Inflection Points). If \(f''\) is continuous at \(c\) and \((c,f(c))\) is an inflection point, then \(f''(c) = 0\).

4.2.7 Classifying Critical Points

Theorem 12 (First Derivative Test). Let \(c\) be a critical point of \(f\) with \(f\) continuous at \(c\).
(i) If \(f'(x) < 0\) for \(x < c\) and \(f'(x) > 0\) for \(x > c\) (near \(c\)), then \(f\) has a local minimum at \(c\).
(ii) If \(f'(x) > 0\) for \(x < c\) and \(f'(x) < 0\) for \(x > c\), then \(f\) has a local maximum at \(c\).

Theorem 13 (Second Derivative Test). If \(f'(c) = 0\) and \(f''\) is continuous at \(c\):
(i) If \(f''(c) < 0\), then \(f\) has a local maximum at \(c\).
(ii) If \(f''(c) > 0\), then \(f\) has a local minimum at \(c\).

4.3 L’Hopital’s Rule

Theorem 14 (L'Hopital's Rule). Suppose \(f\) and \(g\) are differentiable near \(a\) (except possibly at \(a\)) with \(g'(x) \ne 0\) near \(a\), and suppose \(\lim_{x\to a} f(x) = 0 = \lim_{x\to a} g(x)\) (or both limits are infinite). Then

\[ \lim_{x\to a}\frac{f(x)}{g(x)} = \lim_{x\to a}\frac{f'(x)}{g'(x)} \]

provided the latter limit exists (or is \(\pm\infty\)). The rule also holds for one-sided limits and limits at \(\pm\infty\).

Other indeterminate forms (\(0\cdot\infty\), \(\infty - \infty\), \(1^\infty\), \(\infty^0\), \(0^0\)) can often be rewritten to apply L’Hopital’s Rule by algebraic manipulation or taking logarithms.


Chapter 5: Taylor Polynomials and Taylor’s Theorem

5.1 Introduction to Taylor Polynomials

Taylor polynomials encode the value of a function and its first \(n\) derivatives at a single point, providing increasingly accurate polynomial approximations.

Definition (Taylor Polynomial). The \(n\)-th degree Taylor polynomial of \(f\) centered at \(x = a\) is

\[ T_{n,a}(x) = \sum_{k=0}^{n} \frac{f^{(k)}(a)}{k!}(x-a)^k. \]

Key examples centered at \(a = 0\):

  • \(e^x = 1 + x + \frac{x^2}{2} + \frac{x^3}{6} + \cdots + \frac{x^n}{n!} + \cdots\)
  • \(\sin(x) = x - \frac{x^3}{6} + \frac{x^5}{120} - \cdots\)
  • \(\cos(x) = 1 - \frac{x^2}{2} + \frac{x^4}{24} - \cdots\)

Figure: Taylor polynomials \(T_1, T_2, T_3, T_5\) for \(e^x\).
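The improving accuracy is easy to see numerically; a sketch of ours evaluating Taylor polynomials of \(e^x\) at \(x = 1\):

```python
import math

def taylor_exp(x, n):
    """n-th degree Taylor polynomial of e^x centered at 0, evaluated at x."""
    return sum(x ** k / math.factorial(k) for k in range(n + 1))

x = 1.0
errors = [abs(math.exp(x) - taylor_exp(x, n)) for n in (1, 2, 3, 5, 10)]
print(errors)  # the error shrinks rapidly as the degree grows
```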

5.2 Taylor’s Theorem and Errors in Approximations

Definition (Taylor Remainder). \(R_{n,a}(x) = f(x) - T_{n,a}(x)\). The error is \(|R_{n,a}(x)|\).

Theorem (Taylor's Theorem). If \(f^{(n+1)}\) exists on an interval \(I\) containing \(a\), then for each \(x \in I\) there exists \(c\) between \(a\) and \(x\) such that

\[ R_{n,a}(x) = f(x) - T_{n,a}(x) = \frac{f^{(n+1)}(c)}{(n+1)!}(x-a)^{n+1}. \]

When \(n = 0\), this reduces to the Mean Value Theorem. When \(n = 1\), it gives the error bound for linear approximation.

In particular, if \(|f^{(k+1)}(t)| \le M\) for all \(t \in [-1,1]\), then

\[ |f(x) - T_{k,0}(x)| \le \frac{M}{(k+1)!}|x|^{k+1} \]

for each \(x \in [-1,1]\).
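For \(f = \sin\), every derivative is bounded by \(1\), so Taylor's Theorem gives \(|\sin(x) - T_{n,0}(x)| \le \frac{|x|^{n+1}}{(n+1)!}\); a numeric check of ours with \(n = 5\):

```python
import math

def taylor_sin(x, n):
    """Taylor polynomial of sin(x) centered at 0, keeping terms of degree <= n (n odd here)."""
    return sum((-1) ** k * x ** (2 * k + 1) / math.factorial(2 * k + 1)
               for k in range((n + 1) // 2))

n = 5
xs = [k / 50.0 for k in range(-50, 51)]
# every derivative of sin is bounded by 1, so the remainder obeys |x|^(n+1)/(n+1)!
within = all(
    abs(math.sin(x) - taylor_sin(x, n)) <= abs(x) ** (n + 1) / math.factorial(n + 1) + 1e-15
    for x in xs
)
print(within)
```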

5.3 Big-O

Definition (Big-O Notation). We write \(f(x) = O(g(x))\) as \(x \to a\) if there exist \(\varepsilon > 0\) and \(M > 0\) such that \(|f(x)| \le M|g(x)|\) for all \(x \in (a-\varepsilon, a+\varepsilon)\) except possibly at \(x = a\).

Definition (Extended Big-O). \(f(x) = g(x) + O(h(x))\) as \(x \to a\) means \(f(x) - g(x) = O(h(x))\) as \(x \to a\).

If \(f\) is \((n+1)\)-times differentiable with \(f^{(n+1)}\) continuous near \(0\), then

\[ f(x) = T_{n,0}(x) + O(x^{n+1}) \quad \text{as } x \to 0. \]

Theorem 4 (Arithmetic of Big-O). Assume \(f(x) = O(x^n)\) and \(g(x) = O(x^m)\) as \(x \to 0\). Then:
1) \(c \cdot O(x^n) = O(x^n)\) for any constant \(c\).
2) \(O(x^n) + O(x^m) = O(x^k)\), where \(k = \min\{n,m\}\).
3) \(O(x^n) \cdot O(x^m) = O(x^{n+m})\).
4) If \(j \le n\), then \(f(x) = O(x^j)\).
5) If \(j \le n\), then \(\frac{1}{x^j}O(x^n) = O(x^{n-j})\).
6) \(f(u^k) = O(u^{kn})\) (substitution).

5.3.1 Calculating Taylor Polynomials

Theorem 5 (Characterization of Taylor Polynomials). Assume \(r > 0\), \(f\) is \((n+1)\)-times differentiable on \([-r,r]\) with \(f^{(n+1)}\) continuous. If \(p\) is a polynomial of degree \(n\) or less with \(f(x) = p(x) + O(x^{n+1})\), then \(p(x) = T_{n,0}(x)\).

This theorem is enormously useful: it lets us find Taylor polynomials of complicated compositions and products by combining known Taylor expansions with Big-O arithmetic, completely avoiding the direct computation of higher derivatives.
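A sketch of this technique for \(e^x \sin(x)\): multiply the truncated expansions of \(e^x\) and \(\sin(x)\) and discard all terms of degree above \(n\), which is exactly the Big-O product rule in action (the helper `mul_trunc` is our own, not from the notes).

```python
import math

def mul_trunc(p, q, n):
    """Multiply polynomials p, q (coefficient lists, index = power), dropping O(x^{n+1}) terms."""
    out = [0.0] * (n + 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            if i + j <= n:
                out[i + j] += a * b
    return out

n = 5
exp_c = [1.0 / math.factorial(k) for k in range(n + 1)]           # e^x:  1, 1, 1/2, 1/6, ...
sin_c = [0.0 if k % 2 == 0 else (-1.0) ** ((k - 1) // 2) / math.factorial(k)
         for k in range(n + 1)]                                    # sin x: 0, 1, 0, -1/6, ...
prod_c = mul_trunc(exp_c, sin_c, n)
print(prod_c)  # Taylor coefficients of e^x * sin(x) up to degree 5
```

The result matches the known expansion \(e^x\sin(x) = x + x^2 + \frac{x^3}{3} - \frac{x^5}{30} + \cdots\), with no higher derivatives computed.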
