For any \(y \in W\text{,}\) the map
\begin{equation*}
x \mapsto \ip{Ax}{y}_W
\end{equation*}
is a linear functional on
\(V\) (as \(A\) is linear and the inner product is linear in its first argument), and it is continuous, since
\(|\ip{Ax}{y}_W| \leq \norm{Ax}\norm{y} \leq \norm{A}\norm{y}\norm{x}\)
for all \(x \in V\) by the Cauchy–Schwarz inequality and the boundedness of \(A\text{.}\)
Theorem 7.2.1 therefore gives the existence of a unique vector
\(z \in V\) so that
\begin{equation*}
\ip{Ax}{y}_W = \ip{x}{z}_V
\end{equation*}
for all \(x \in V\text{.}\) Then we define the map \(A\ad: W \to V\) by
\begin{equation*}
A\ad y = z.
\end{equation*}
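For instance, as an illustrative special case not assumed elsewhere in this proof, suppose \(V = \C^n\) and \(W = \C^m\) carry their standard inner products and \(A\) acts by an \(m \times n\) matrix \((a_{ij})\text{.}\) Then for all \(x \in \C^n\) and \(y \in \C^m\text{,}\)
\begin{equation*}
\ip{Ax}{y}_W = \sum_{i=1}^{m}\sum_{j=1}^{n} a_{ij} x_j \cc{y_i} = \sum_{j=1}^{n} x_j \cc{\left( \sum_{i=1}^{m} \cc{a_{ij}}\, y_i \right)} = \ip{x}{A\ad y}_V,
\end{equation*}
so the vector \(z = A\ad y\) produced by this construction is the conjugate transpose of the matrix of \(A\) applied to \(y\text{.}\)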
It remains to show that \(A\ad\) is a bounded linear map from \(W\) to \(V\text{,}\) and that it is unique.
To see that \(A\ad\) is linear, choose \(\alpha, \beta \in \C\) and \(u, v \in W\text{.}\) Then for all \(x \in V\text{,}\) we have
\begin{align*}
\ip{x} {A\ad(\alpha u + \beta v)} \amp = \ip{Ax}{\alpha u + \beta v}\\
\amp= \cc\alpha \ip{Ax}{u} + \cc\beta\ip{Ax}{v}\\
\amp = \cc\alpha \ip{x}{A\ad u} + \cc\beta\ip{x}{A\ad v}\\
\amp= \ip{x}{ \alpha A\ad u + \beta A\ad v}.
\end{align*}
Since this holds for all \(x \in V\text{,}\) we must have
\begin{equation*}
A\ad(\alpha u + \beta v) = \alpha A\ad u + \beta A\ad v.
\end{equation*}
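Here we have used the fact that two vectors of \(V\) with the same inner product against every \(x \in V\) must coincide: if \(\ip{x}{w_1} = \ip{x}{w_2}\) for all \(x \in V\text{,}\) then taking \(x = w_1 - w_2\) gives
\begin{equation*}
\norm{w_1 - w_2}^2 = \ip{w_1 - w_2}{w_1} - \ip{w_1 - w_2}{w_2} = 0,
\end{equation*}
and so \(w_1 = w_2\text{.}\)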
To see that \(A\ad\) is bounded, fix \(y \in W\text{.}\) Applying the defining relation with \(x = A\ad y\text{,}\) then the Cauchy–Schwarz inequality, and finally the boundedness of \(A\text{,}\) we get
\begin{align*}
\norm{A\ad y}^2 \amp= \ip{A\ad y}{A\ad y} \\
\amp=\ip{A A\ad y}{y}\\
\amp\leq \norm{A A\ad y}\norm{y}\\
\amp\leq \norm{A} \norm{A\ad y}\norm{y}.
\end{align*}
If \(A\ad y \neq 0\text{,}\) we may divide through by \(\norm{A\ad y}\) to get
\begin{equation*}
\norm{A\ad y} \leq \norm{A} \norm{y},
\end{equation*}
and this inequality holds trivially when \(A\ad y = 0\text{.}\) Since \(y \in W\) was arbitrary, \(\norm{A\ad} \leq \norm{A}\text{,}\) and so \(A\ad\) is bounded.
Uniqueness follows from the observation that if \(A\) had two adjoints \(A_1\ad\) and \(A_2\ad\text{,}\) then for all \(x \in V\) and all \(y \in W\text{,}\) we’d have
\begin{equation*}
\ip{x}{A_1\ad y} = \ip{Ax}{y} = \ip{x}{A_2\ad y},
\end{equation*}
and so, as before, \(A_1\ad y = A_2\ad y\) for every \(y \in W\text{;}\) that is, \(A_1\ad = A_2\ad\text{.}\)