
Section 4.1 Limits

With the inner product given by (2.1.2), the space \(\ell^2\) of square summable sequences acts much like \(\C^n\text{.}\) We can extend this analogy further. In Euclidean space, a sequence that looks convergent is convergent. (This shows up in theorems like "Cauchy sequences are convergent", and falls into a larger category we might refer to, in a silly but accurate manner, as "duck theorems": if it quacks like a duck, then....) Much of our intuition from real analysis is based on this property. Spaces in which sequences can converge to points outside the space are very difficult to do analysis on! (Consider trying to do calculus on the rational numbers, for example.)
On the other hand, recall the space \(\ell_0\) we defined in Definition 3.2.2. In Example 3.2.1, we constructed a sequence of vectors in \(\ell_0\) that converged in \(\ell^2\) but not in \(\ell_0\text{.}\) That is, if we restricted our attention to the inner product space \(\ell_0\text{,}\) we could not perform any analysis using limits, because we might leave the space.
As our ultimate interest is in examining function spaces, we can also consider an example in \(C[0,1]\text{.}\) A natural norm on \(C[0,1]\) is induced by the inner product (2.1.1), so that
\begin{equation*} \norm{f} = \left(\int_0^1 \abs{f(x)}^2 \, dx\right)^{1/2}. \end{equation*}
With respect to the norm above, consider the sequence
\begin{equation*} f_k(x) = \min ((2x)^k, 1). \end{equation*}
It isn’t difficult to show that the elements of this sequence of functions grow arbitrarily close to one another as \(k \to \infty\) with respect to \(\norm{\cdot}\text{.}\) It should also be clear from sketching the graphs of a few of the \(f_k\) that this sequence of continuous functions is approaching a discontinuous function in the limit. (What is it?) That is, \(C[0,1]\) is evidently not complete with respect to \(\norm{\cdot}\text{.}\)
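To see that the \(f_k\) do indeed grow close to one another, note that for \(l > k\) the functions \(f_k\) and \(f_l\) agree on \([1/2, 1]\text{,}\) while on \([0, 1/2]\) we have \(0 \leq (2x)^l \leq (2x)^k \leq 1\text{,}\) so
\begin{equation*} \norm{f_k - f_l}^2 = \int_0^{1/2} \left( (2x)^k - (2x)^l \right)^2 \, dx \leq \int_0^{1/2} (2x)^{2k} \, dx = \frac{1}{2(2k+1)}, \end{equation*}
which tends to \(0\) as \(k \to \infty\text{.}\)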
We would very much like to work in spaces where limits make sense. So we’ll introduce some machinery that will guarantee that this is the case.

Definition 4.1.1.

Let \((M, d)\) be a metric space with distance function \(d\text{.}\) A sequence \((x^k)\) in \(M\) is called a Cauchy sequence if for every \(\eps > 0\text{,}\) there exists an integer \(K\) so that \(k, l > K\) implies that \(d(x^k, x^l) \lt \eps\text{.}\)
If every Cauchy sequence in \(M\) converges to a limit in \(M\text{,}\) then \(M\) is called a complete metric space.
Some metric spaces come complete. The prototypical space is \(\R\text{,}\) where the result that Cauchy sequences are convergent is a standard test question in real analysis. (Can you prove it?) Completeness of \(\R\) implies completeness of \(\C\text{.}\) (Do it.) More complicated spaces take more work. We’ve already seen examples of spaces that are not complete.
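For the claim about \(\C\text{,}\) one route mirrors the coordinate-wise argument we use for \(\ell^2\) below: writing \(z^k = a^k + i b^k\) with \(a^k, b^k \in \R\text{,}\) the estimates \(\abs{a^k - a^l} \leq \abs{z^k - z^l}\) and \(\abs{b^k - b^l} \leq \abs{z^k - z^l}\) show that \((a^k)\) and \((b^k)\) are Cauchy in \(\R\text{,}\) hence convergent, and then
\begin{equation*} \abs{z^k - (a + ib)} \leq \abs{a^k - a} + \abs{b^k - b} \to 0, \end{equation*}
where \(a\) and \(b\) denote the respective limits.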

Theorem 4.1.2.

The space \(\ell^2\) is a complete metric space with respect to the metric induced by its norm.

Proof.

We need to show that any Cauchy sequence in \(\ell^2\) is convergent to another \(\ell^2\) sequence. First, we propose a candidate for the limit of a Cauchy sequence of vectors in \(\ell^2\text{.}\) To do so, we’ll need to take advantage of the fact that vectors in \(\ell^2\) are themselves sequences.
Consider the array we can construct by arranging the vectors \(x^k\) as rows -
\begin{equation*} \begin{array}{cccccc} x^1 \amp = \amp (x^1_1, \amp x^1_2, \amp x^1_3, \amp \ldots)\\ x^2 \amp = \amp (x^2_1, \amp x^2_2, \amp x^2_3, \amp \ldots)\\ x^3 \amp = \amp (x^3_1, \amp x^3_2, \amp x^3_3, \amp \ldots)\\ \vdots \amp \vdots \amp \vdots \amp \vdots \amp \vdots \amp \\ \hline (x^k) \amp \amp (x^k_1) \amp (x^k_2) \amp (x^k_3) \amp \end{array} \end{equation*}
We are going to argue that the column sequences in the array above are convergent. Consider the \(j\)-th column sequence \((x^k_j)\text{.}\) Choose \(\eps > 0.\) Because \(x^k\) is a Cauchy sequence of vectors, there exists some \(K\) so that \(k, l > K\) implies that \(\norm{x^k - x^l} \lt \eps\text{.}\) But since
\begin{equation*} \norm{x^k - x^l} = \left(\sum \abs{x_i^k - x_i^l}^2\right)^{1/2} \lt \eps \end{equation*}
and each term in the sum is nonnegative, we get \(\abs{x^k_j - x^l_j} \lt \eps\) when \(k, l > K\text{.}\) This shows that \((x^k_j)\) is a Cauchy sequence in \(\C\text{,}\) which is a complete metric space, and so \((x^k_j) \to y_j\) for some limit \(y_j \in \C\) as \(k \to \infty\text{.}\) Let \(y = (y_j)\) be the sequence of column limits. This is our candidate limit in \(\ell^2\) for the sequence of vectors \(x^k\text{.}\)
Let us show that \(y \in \ell^2\text{.}\) To do so, we will show that \(x^k - y\) is in \(\ell^2\) for some \(k\) and use the vector space structure of \(\ell^2\text{.}\) Let \(\eps > 0\) be given. Since \(x^k\) is Cauchy, there exists \(K\) so that \(\norm{x^k - x^l} \lt \eps\) for all \(k, l > K\text{.}\) Noting again that the terms are nonnegative, it is clear that for any \(N \in \mathbb{N}\) and all \(k, l > K\text{,}\)
\begin{equation*} \sum_{i=1}^N \abs{x_i^k - x_i^l}^2 \leq \sum_{i=1}^{\infty} \abs{x_i^k - x_i^l}^2 \leq \eps^2. \end{equation*}
We previously showed that the sequences \((x_i^l)\) converge to \(y_i\) as \(l \to \infty\text{,}\) so taking a limit as \(l \to \infty\) on the left-hand side of the inequality gives
\begin{equation*} \sum_{i=1}^N \abs{x_i^k - y_i}^2 \leq \eps^2. \end{equation*}
This statement holds for all \(N \in \mathbb{N}\text{,}\) and so letting \(N\) tend to \(\infty\) gives us
\begin{equation*} \sum_{i=1}^\infty \abs{x_i^k - y_i}^2 \leq \eps^2, \end{equation*}
which is to say that \(\norm{x^k - y} \leq \eps\text{.}\) Having shown that \(x^k - y\) is in \(\ell^2\text{,}\) we note that \(y = x^k - (x^k - y) \in \ell^2\text{,}\) which concludes this step of the proof.
The final step is to argue that \(x^k \to y\text{.}\) In the previous step, we showed that for a given \(\eps\text{,}\) there exists \(K\) so that \(k > K\) implies that \(\norm{x^k - y} \leq \eps\text{.}\) Thus, \(x^k\) converges to the \(\ell^2\) sequence \(y\text{.}\) Since every Cauchy sequence in \(\ell^2\) converges to a limit in \(\ell^2\text{,}\) we conclude that \(\ell^2\) is a complete metric space.

Checkpoint 4.1.3.

Show that \(\ell^\infty\text{,}\) the space of bounded sequences of complex numbers with the supremum norm, is complete.
We are now (finally) ready to define Hilbert spaces. The distinction between Hilbert spaces and inner product spaces is important to keep in mind.

Definition 4.1.4.

A Hilbert space is an inner product space which is a complete metric space with respect to the metric induced by its inner product.

Definition 4.1.5.

A Banach space is a normed space which is complete with respect to the metric induced by its norm.
As inner products induce norms, every Hilbert space is a Banach space. The converse is not true: examples abound of complete normed spaces in which the parallelogram rule fails to hold (and thus no inner product can induce the norm). In fact, our results so far show that \(\ell^\infty\) is an example of a Banach space that is not a Hilbert space. Banach spaces are an area of wide interest in mathematics, but the lack of an inner product makes their structure much more delicate to parse. The central spaces of functional analysis are Hilbert spaces, where the inner product gives useful geometry not available in the more general setting of Banach spaces.
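For instance, the parallelogram rule asserts that \(\norm{x+y}^2 + \norm{x-y}^2 = 2\norm{x}^2 + 2\norm{y}^2\) whenever the norm is induced by an inner product. Taking \(x = (1, 0, 0, \ldots)\) and \(y = (0, 1, 0, 0, \ldots)\) in \(\ell^\infty\) with the supremum norm gives
\begin{equation*} \norm{x+y}^2 + \norm{x-y}^2 = 1 + 1 = 2, \qquad \text{while} \qquad 2\norm{x}^2 + 2\norm{y}^2 = 4, \end{equation*}
so no inner product on \(\ell^\infty\) can induce the supremum norm.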
We’ll continue by looking at perhaps the single most important Hilbert space of functions.