A complex (or real) vector space is called a pre-Hilbert space if to each pair $(x,y)$ of elements in the space there is assigned a complex (or real) number, called the inner product of $x$ and $y$ and denoted by $\langle x,y\rangle$, subject to the following conditions:
 (i) For each fixed $y$, the functional $x\mapsto \langle x,y\rangle$ is linear.
 (ii) $\langle x,y\rangle ={\overline {\langle y,x\rangle }}$;
 (iii) $\langle x,x\rangle >0$ for every nonzero $x$.
The inner product is not linear but antilinear in its second variable: i.e., if $g(y)=\langle x,y\rangle$, then $g(\alpha y)={\bar {\alpha }}g(y)$ for scalars $\alpha$. We define $\|x\|=\langle x,x\rangle ^{1/2}$, and this becomes a norm. Indeed, it is clear that $\|\alpha x\|=|\alpha |\,\|x\|$, and (iii) ensures that $\|x\|=0$ implies $x=0$. Finally, the triangle inequality follows from the next lemma.
3.1 Lemma (Schwarz's inequality) $|\langle x,y\rangle |\leq \|x\|\,\|y\|$, where equality holds if and only if $x$ and $y$ are linearly dependent.
Assuming the lemma for a moment, it follows that
 $\|x+y\|^{2}=\|x\|^{2}+2\operatorname {Re} \langle x,y\rangle +\|y\|^{2}\leq \|x\|^{2}+2|\langle x,y\rangle |+\|y\|^{2}\leq (\|x\|+\|y\|)^{2}$,
since $\operatorname {Re} (\alpha )\leq |\alpha |$ for any complex number $\alpha$.
Proof of Lemma: First suppose $\|x\|=1$. If $\alpha ={\overline {\langle x,y\rangle }}$, it then follows that
 $0\leq \|\alpha x-y\|^{2}=|\alpha |^{2}-2\operatorname {Re} (\alpha \langle x,y\rangle )+\|y\|^{2}=-|\alpha |^{2}+\|y\|^{2}$,
which gives $|\langle x,y\rangle |\leq \|y\|=\|x\|\,\|y\|$; moreover, the left-hand side is $0$ if and only if $y=\alpha x$. Since we may suppose $x\neq 0$ (the case $x=0$ being trivial), the general case follows by applying this to $x/\|x\|$. $\square$
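As a quick numerical aside (not part of the development), the inequality, its equality case, and the triangle inequality can be checked in $\mathbf {C} ^{5}$ with NumPy. Note that NumPy's `vdot` conjugates its *first* argument, while our inner product is antilinear in the *second*, so $\langle x,y\rangle$ corresponds to `np.vdot(y, x)`; the vectors below are arbitrary random examples.

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(x, y):
    # <x, y>: linear in x, antilinear in y (np.vdot conjugates its first argument)
    return np.vdot(y, x)

x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# Schwarz's inequality: |<x,y>| <= ||x|| ||y||
assert abs(inner(x, y)) <= np.linalg.norm(x) * np.linalg.norm(y) + 1e-12

# Equality holds when x and y are linearly dependent (x = lam * y)
lam = 2.0 - 3.0j
assert np.isclose(abs(inner(lam * y, y)),
                  np.linalg.norm(lam * y) * np.linalg.norm(y))

# The triangle inequality derived above
assert np.linalg.norm(x + y) <= np.linalg.norm(x) + np.linalg.norm(y) + 1e-12
```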
3.2 Theorem A normed linear space is a pre-Hilbert space if and only if $\|x-y\|^{2}=2\|x\|^{2}+2\|y\|^{2}-\|x+y\|^{2}$ for all $x,y$ (the parallelogram law).
Proof: The direct part is clear. To show the converse, we define
 $\langle x,y\rangle =4^{-1}(\|x+y\|^{2}-\|x-y\|^{2}+i\|x+iy\|^{2}-i\|x-iy\|^{2})$.
It is then immediate that $\langle x,y\rangle ={\overline {\langle y,x\rangle }}$, $\langle -x,y\rangle =-\langle x,y\rangle$ and $\langle ix,y\rangle =i\langle x,y\rangle$. Moreover, by the parallelogram law,
 $\|x_{1}+x_{2}+y\|^{2}-\|x_{1}+x_{2}-y\|^{2}=2\|x_{1}+y\|^{2}-2\|x_{1}-y\|^{2}-\|x_{1}-x_{2}+y\|^{2}+\|x_{1}-x_{2}-y\|^{2}$,
and, averaging this with the same identity with $x_{1}$ and $x_{2}$ interchanged,
 $\|x_{1}+x_{2}+y\|^{2}-\|x_{1}+x_{2}-y\|^{2}=\sum _{j=1}^{2}\left(\|x_{j}+y\|^{2}-\|x_{j}-y\|^{2}\right)$.
Applying this with $y$ and with $iy$, we have: $\langle x_{1}+x_{2},y\rangle =\langle x_{1},y\rangle +\langle x_{2},y\rangle$. If $\alpha$ is a real scalar and $\alpha _{j}$ is a sequence of rational numbers converging to $\alpha$, then by continuity and the above, we get: $\langle \alpha x,y\rangle =\lim _{j\to \infty }\langle \alpha _{j}x,y\rangle =\alpha \langle x,y\rangle .\square$
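As a numerical aside, the polarization formula used in the proof really does recover the inner product from the norm alone, and the parallelogram law holds; this is a NumPy sanity check on random example vectors, not part of the proof.

```python
import numpy as np

rng = np.random.default_rng(1)
n = np.linalg.norm

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
y = rng.standard_normal(4) + 1j * rng.standard_normal(4)

# The document's inner product: linear in x, antilinear in y
ip = np.vdot(y, x)

# Polarization: <x,y> recovered from norms alone
polar = (n(x + y)**2 - n(x - y)**2
         + 1j * n(x + 1j*y)**2 - 1j * n(x - 1j*y)**2) / 4
assert np.isclose(polar, ip)

# The parallelogram law of Theorem 3.2
assert np.isclose(n(x - y)**2, 2*n(x)**2 + 2*n(y)**2 - n(x + y)**2)
```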
3.3 Lemma Let ${\mathfrak {H}}$ be a pre-Hilbert space. Then $x_{j}\to x$ in norm if and only if $\|x_{j}\|\to \|x\|$ and $\langle x_{j}-x,y\rangle \to 0$ for every $y\in {\mathfrak {H}}$ as $j\to \infty$.
Proof: The direct part holds since:
 $|\|x_{j}\|-\|x\||+|\langle x_{j}-x,y\rangle |\leq \|x_{j}-x\|(1+\|y\|)\to 0$ as $j\to \infty$.
Conversely, taking $y=x$ in the hypothesis, we have:
 $\|x_{j}-x\|^{2}=\|x_{j}\|^{2}-2\operatorname {Re} \langle x_{j},x\rangle +\|x\|^{2}\to \|x\|^{2}-2\|x\|^{2}+\|x\|^{2}=0$ as $j\to \infty$
$\square$
3.4 Lemma Let $D$ be a nonempty convex closed subset of a Hilbert space. Then $D$ admits a unique element $z$ such that
 $\|z\|=\inf\{\|x\|\,;\,x\in D\}$.
Proof: Denote the right-hand side by $\delta$. Since $D$ is nonempty, $\delta <\infty$. For each $n=1,2,...$, there is some $x_{n}\in D$ such that $0\leq \|x_{n}\|-\delta \leq n^{-1}$. That is, $\delta =\lim _{n\to \infty }\|x_{n}\|$. Since $D$ is convex,
 ${x_{n}+x_{m} \over 2}\in D$ and so $\delta \leq {1 \over 2}\|x_{n}+x_{m}\|$.
It follows:
 $\|x_{n}-x_{m}\|^{2}=2\|x_{n}\|^{2}+2\|x_{m}\|^{2}-\|x_{n}+x_{m}\|^{2}\leq 2\|x_{n}\|^{2}+2\|x_{m}\|^{2}-4\delta ^{2}\to 2\delta ^{2}+2\delta ^{2}-4\delta ^{2}=0$ as $n,m\to \infty$.
This is to say, $x_{n}$ is Cauchy. Since $D$ is a closed subset of a complete metric space, and hence itself complete, there is a limit $z\in D$; by continuity of the norm, $\|z\|=\delta$. For uniqueness, suppose $w\in D$ also satisfies $\|w\|=\delta$. Then
 $\|z-w\|^{2}=2\|z\|^{2}+2\|w\|^{2}-\|z+w\|^{2}$
where the right-hand side is $\leq 0$ for the same reason as before: $(z+w)/2\in D$ gives $\|z+w\|\geq 2\delta$. Hence $w=z$. $\square$
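The lemma can be illustrated numerically when $D$ is an affine subspace $\{x\,;\,Ax=b\}$, a nonempty closed convex set: the minimal-norm solution, computed via the Moore-Penrose pseudoinverse, is the unique element $z$ of the lemma. A NumPy sketch with an arbitrary random example:

```python
import numpy as np

rng = np.random.default_rng(2)

# D = {x : A x = b}, a nonempty closed convex subset of R^5
A = rng.standard_normal((2, 5))
b = rng.standard_normal(2)

# The minimal-norm element of D is given by the pseudoinverse
z = np.linalg.pinv(A) @ b
assert np.allclose(A @ z, b)

# Every element of D is z plus a vector in ker A; all have norm >= ||z||
for _ in range(100):
    w = rng.standard_normal(5)
    w -= A.T @ np.linalg.solve(A @ A.T, A @ w)   # project w onto ker A
    x = z + w                                    # so x lies in D
    assert np.linalg.norm(x) >= np.linalg.norm(z) - 1e-9
```

Here $z$ is orthogonal to $\ker A$, so $\|z+w\|^{2}=\|z\|^{2}+\|w\|^{2}\geq \|z\|^{2}$, exactly the mechanism of the uniqueness argument above.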
The lemma may also hold in certain Banach spaces that are not Hilbert spaces; this question will be investigated in the next chapter.
For a nonempty subset $E\subset {\mathfrak {H}}$, define $E^{\bot }$ to be the intersection of the kernels of the linear functionals $u\mapsto \langle u,v\rangle$ taken over all $v\in E$. (In other words, $E^{\bot }$ is the set of all $x\in {\mathfrak {H}}$ that are orthogonal to every $y\in E$.) Since the kernel of a continuous function is closed and the intersection of linear spaces is again a linear space, $E^{\bot }$ is a closed (linear) subspace of ${\mathfrak {H}}$. Finally, if $x\in E\cap E^{\bot }$, then $0=\langle x,x\rangle =\|x\|^{2}$ and $x=0$.
3.5 Lemma Let ${\mathcal {M}}$ be a linear subspace of a pre-Hilbert space. Then $z\in {\mathcal {M}}^{\bot }$ if and only if $\|z\|=\inf\{\|z+w\|\,;\,w\in {\mathcal {M}}\}$.
Proof: If $z\in {\mathcal {M}}^{\bot }$, then $\|z+w\|^{2}=\|z\|^{2}+\|w\|^{2}\geq \|z\|^{2}$ for every $w\in {\mathcal {M}}$, and the infimum is attained at $w=0$. Conversely, suppose $\|z\|\leq \|z+w\|$ for every $w\in {\mathcal {M}}$. For $w\in {\mathcal {M}}$ and a scalar $t$,
 $0\leq \|z+tw\|^{2}-\|z\|^{2}=2\operatorname {Re} ({\bar {t}}\langle z,w\rangle )+|t|^{2}\|w\|^{2}$,
and taking $t=-s\langle z,w\rangle$ with $s>0$ gives $2s|\langle z,w\rangle |^{2}\leq s^{2}|\langle z,w\rangle |^{2}\|w\|^{2}$. Letting $s\to 0$ forces $\langle z,w\rangle =0$; hence $z\in {\mathcal {M}}^{\bot }$. $\square$
3.6 Theorem (orthogonal decomposition) Let ${\mathfrak {H}}$ be a Hilbert space and ${\mathcal {M}}\subset {\mathfrak {H}}$ be a closed subspace. For every $x\in {\mathfrak {H}}$ we can write
 $x=y+z$
where $y\in {\mathcal {M}}$ and $z\in {\mathcal {M}}^{\bot }$, and $y$ and $z$ are uniquely determined by $x$.
Proof: Clearly $x-{\mathcal {M}}=\{x-w\,;\,w\in {\mathcal {M}}\}$ is convex, and it is also closed since a translate of a closed set is again closed. Lemma 3.4 now gives a unique element $y\in {\mathcal {M}}$ such that $\|x-y\|=\inf\{\|x-w\|\,;\,w\in {\mathcal {M}}\}$. Let $z=x-y$. By Lemma 3.5, $z\in {\mathcal {M}}^{\bot }$. For the uniqueness, suppose we have written:
 $x=y'+z'$
where $y'\in {\mathcal {M}}$ and $z'\in {\mathcal {M}}^{\bot }$. By Lemma 3.5, $\|x-y'\|=\inf\{\|x-w\|\,;\,w\in {\mathcal {M}}\}$. But, as noted earlier, such $y'$ must be unique; i.e., $y'=y$ and hence $z'=z$. $\square$
3.7 Corollary Let ${\mathcal {M}}$ be a subspace of a Hilbert space ${\mathfrak {H}}$. Then
 (i) ${\mathcal {M}}^{\bot }=\{0\}$ if and only if ${\mathcal {M}}$ is dense in ${\mathfrak {H}}$.
 (ii) ${\mathcal {M}}^{\bot \bot }={\overline {\mathcal {M}}}$.
Proof: By continuity, $\langle x,{\overline {\mathcal {M}}}\rangle \subset {\overline {\langle x,{\mathcal {M}}\rangle }}$. (Here, $\langle x,E\rangle$ denotes the image of the set $E$ under the map $y\mapsto \langle x,y\rangle$.) This gives:
 ${\mathcal {M}}^{\bot }={\overline {\mathcal {M}}}^{\bot }$ and so ${\mathfrak {H}}={\overline {\mathcal {M}}}\oplus {\mathcal {M}}^{\bot }$
by the orthogonal decomposition. (i) follows. Similarly, we have:
 ${\mathfrak {H}}={\mathcal {M}}^{\bot }\oplus {\mathcal {M}}^{\bot \bot }={\mathcal {M}}^{\bot }\oplus {\overline {\mathcal {M}}}$.
Hence, (ii). $\square$
3.8 Theorem (representation theorem) Every continuous linear functional $f$ on a Hilbert space ${\mathfrak {H}}$ has the form:
 $f(x)=\langle x,y\rangle$ with a unique $y\in {\mathfrak {H}}$, and $\|f\|=\|y\|_{\mathfrak {H}}$
Proof: Let ${\mathcal {M}}=f^{-1}(\{0\})$. Since $f$ is continuous, ${\mathcal {M}}$ is closed. If ${\mathcal {M}}={\mathfrak {H}}$, then take $y=0$. If not, by Theorem 3.6, there is a nonzero $z\in {\mathfrak {H}}$ orthogonal to ${\mathcal {M}}$. By replacing $z$ with $z\|z\|^{-1}$ we may suppose that $\|z\|=1$. For any $x\in {\mathfrak {H}}$, since $zf(x)-f(z)x$ is in the kernel of $f$ and thus is orthogonal to $z$, we have:
 $0=\langle zf(x)-f(z)x,z\rangle =\langle z,z\rangle f(x)-f(z)\langle x,z\rangle =f(x)-f(z)\langle x,z\rangle$
and so:
 $f(x)=\langle x,{\overline {f(z)}}z\rangle$
The uniqueness follows since $\langle x,y_{1}\rangle =\langle x,y_{2}\rangle$ for all $x\in {\mathfrak {H}}$ means that $y_{1}-y_{2}\in {\mathfrak {H}}^{\bot }=\{0\}$. Finally, we have the identity:
 $\|y\|=\langle {y \over \|y\|},y\rangle \leq \|f\|\leq \|y\|$
where the last inequality is Schwarz's inequality. $\square$
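In $\mathbf {C} ^{n}$ the representation theorem is transparent: a linear functional $f(x)=\sum _{k}c_{k}x_{k}$ is represented by $y={\bar {c}}$, and $\|f\|=\|y\|$. A NumPy illustration with an arbitrary coefficient vector (a sanity check, not a proof):

```python
import numpy as np

rng = np.random.default_rng(4)

# A continuous linear functional on C^4, given by a coefficient vector c
c = rng.standard_normal(4) + 1j * rng.standard_normal(4)
f = lambda x: c @ x           # f(x) = sum_k c_k x_k, linear in x

# The representing vector: f(x) = <x, y> = sum_k x_k conj(y_k), so y = conj(c)
y = c.conj()

x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(f(x), np.vdot(y, x))   # np.vdot conjugates its first argument

# ||f|| = ||y||: the supremum of |f| over unit vectors is attained at y/||y||
assert np.isclose(abs(f(y / np.linalg.norm(y))), np.linalg.norm(y))
```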
3.9 Exercise Using Lemma 1.6 give an alternative proof of the preceding theorem.
In view of Theorem 3.6, for each $x\in {\mathfrak {H}}$, we can write: $x=y+z$ where $y\in {\mathcal {M}}$, a closed subspace of ${\mathfrak {H}}$, and $z\in {\mathcal {M}}^{\bot }$. Denote each such $y$, which is uniquely determined by $x$, by $\pi (x)$. The function $\pi$ then turns out to be a linear operator. Indeed, for given $x_{1},x_{2}\in {\mathfrak {H}}$, we write:
 $x_{1}=y_{1}+z_{1},x_{2}=y_{2}+z_{2}$ and $x_{1}+x_{2}=y_{3}+z_{3}$
where $y_{j}\in {\mathcal {M}}$ and $z_{j}\in {\mathcal {M}}^{\bot }$ for $j=1,2,3$. By the uniqueness of decomposition
 $\pi (x_{1})+\pi (x_{2})=y_{1}+y_{2}=y_{3}=\pi (x_{1}+x_{2})$.
Similar reasoning shows that $\pi$ commutes with scalars. Now, for $x=y+z\in {\mathfrak {H}}$ (where $y\in {\mathcal {M}}$ and $z\in {\mathcal {M}}^{\bot }$), we have:
 $\|x\|^{2}=\|\pi (x)\|^{2}+\|z\|^{2}\geq \|\pi (x)\|^{2}$
That is, $\pi$ is continuous with $\|\pi \|\leq 1$. In particular, when ${\mathcal {M}}$ is a nonzero space, there is $x_{0}\in {\mathcal {M}}$ with $\pi (x_{0})=x_{0}$ and $\|x_{0}\|=1$, and consequently $\|\pi \|=1$. Such a $\pi$ is called an orthogonal projection (onto ${\mathcal {M}}$).
The next theorem gives an alternative proof of the Hahn-Banach theorem (in the Hilbert space setting).
3 Theorem Let ${\mathcal {M}}$ be a linear (not necessarily closed) subspace of a Hilbert space ${\mathfrak {H}}$. Every continuous linear functional $f$ on ${\mathcal {M}}$ can be extended to a unique continuous linear functional on ${\mathfrak {H}}$ that has the same norm and vanishes on ${\mathcal {M}}^{\bot }$.
Proof: Since ${\mathcal {M}}$ is a dense subset of the Banach space ${\overline {\mathcal {M}}}$, by Theorem 2.something we can uniquely extend $f$ so that it is continuous on ${\overline {\mathcal {M}}}$. Define $g=f\circ \pi _{\overline {\mathcal {M}}}$. By the same argument used in the proof of Theorem 2.something (Hahn-Banach) and the fact that $\|\pi _{\overline {\mathcal {M}}}\|=1$, we obtain $\|f\|=\|g\|$. Since $g=0$ on ${\mathcal {M}}^{\bot }$, it remains to show the uniqueness. For this, let $h$ be another extension with the desired properties. Since the kernel of $f-h$ is closed and thus contains ${\overline {\mathcal {M}}}$, $f=h$ on ${\overline {\mathcal {M}}}$. Hence, for any $x\in {\mathfrak {H}}$,
 $h(x)=(h\circ \pi _{\overline {\mathcal {M}}})x=(f\circ \pi _{\overline {\mathcal {M}}})x=g(x)$.
The extension $g$ is thus unique. $\square$
3 Theorem Let ${\mathcal {M}}_{n}$ be an increasing sequence of closed subspaces, and ${\mathcal {M}}$ the closure of ${\mathcal {M}}_{1}\cup {\mathcal {M}}_{2}\cup ...$. If $\pi _{{\mathcal {M}}_{n}}$ denotes the orthogonal projection onto ${\mathcal {M}}_{n}$, then $\pi _{{\mathcal {M}}_{n}}(x)\to x$ for every $x\in {\mathcal {M}}$.
Proof: Let ${\mathcal {N}}=\{x\in {\mathcal {M}}\,;\,\pi _{{\mathcal {M}}_{n}}(x)\to x{\text{ as }}n\to \infty \}$. Then ${\mathcal {N}}$ is closed. Indeed, if $x_{j}\in {\mathcal {N}}$ and $x_{j}\to x$, then
 $\|\pi _{{\mathcal {M}}_{n}}(x)-x\|\leq 2\|x-x_{j}\|+\|\pi _{{\mathcal {M}}_{n}}(x_{j})-x_{j}\|$
(using $\|\pi _{{\mathcal {M}}_{n}}\|\leq 1$), and so $x\in {\mathcal {N}}$. Since each ${\mathcal {M}}_{k}\subset {\mathcal {N}}$ (for $x\in {\mathcal {M}}_{k}$ we have $\pi _{{\mathcal {M}}_{n}}(x)=x$ once $n\geq k$), it follows that ${\mathcal {M}}={\overline {\bigcup {\mathcal {M}}_{n}}}\subset {\mathcal {N}}$, and the proof is complete. $\square$
Let $({\mathfrak {H}}_{j},\langle \cdot ,\cdot \rangle _{j})$, with norms $\|\cdot \|_{j}$, be Hilbert spaces. The direct sum ${\mathfrak {H}}_{1}\oplus {\mathfrak {H}}_{2}$ is defined as follows: let ${\mathfrak {H}}_{1}\oplus {\mathfrak {H}}_{2}=\{(x_{1},x_{2})\,;\,x_{1}\in {\mathfrak {H}}_{1},x_{2}\in {\mathfrak {H}}_{2}\}$ and define
 $\langle x_{1}\oplus x_{2},y_{1}\oplus y_{2}\rangle =\langle x_{1},y_{1}\rangle _{1}+\langle x_{2},y_{2}\rangle _{2}$.
It is then easy to verify that $({\mathfrak {H}}_{1}\oplus {\mathfrak {H}}_{2},\langle \cdot ,\cdot \rangle )$ is a Hilbert space. It is also clear that this definition generalizes to a finite direct sum of Hilbert spaces. (For an infinite direct sum of Hilbert spaces, see Chapter 5.)
Recall from the previous chapter that an isometric surjection between Banach spaces is called "unitary".
3 Lemma (Hilbert adjoint) Let $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ be a linear operator with domain $\operatorname {dom} T\subset {\mathfrak {H}}_{1}$. Define $V:{\mathfrak {H}}_{1}\oplus {\mathfrak {H}}_{2}\to {\mathfrak {H}}_{2}\oplus {\mathfrak {H}}_{1}$ by $V(x_{1}\oplus x_{2})=x_{2}\oplus (-x_{1})$. (Clearly, $V$ is a unitary operator.) Then $(V\operatorname {gra} T)^{\bot }$ is the graph of some linear operator if and only if $T$ is densely defined.
Proof: Set ${\mathcal {M}}=(V\operatorname {gra} T)^{\bot }$. First suppose ${\mathcal {M}}$ is a graph, and let $u\in (\operatorname {dom} T)^{\bot }$. Then
 $\langle 0\oplus u,V(v\oplus Tv)\rangle =\langle 0,Tv\rangle _{2}-\langle u,v\rangle _{1}=0$ for every $v\in \operatorname {dom} T$.
That is to say, $0\oplus u\in {\mathcal {M}}$, which is the graph of a linear operator by assumption. Thus, $u=0$, and $T$ is densely defined. For the converse, suppose $f\oplus u_{1},f\oplus u_{2}\in {\mathcal {M}}$. Then
 $0=\langle f\oplus u_{j},V(v\oplus Tv)\rangle =\langle f,Tv\rangle _{2}-\langle u_{j},v\rangle _{1}\qquad (j=1,2)$
and so $\langle u_{1}-u_{2},v\rangle _{1}=0$ for every $v$ in the domain of $T$, which is dense. Thus, $u_{1}=u_{2}$, and ${\mathcal {M}}$ is the graph of a function, say, $S$. The linearity of $S$ can be checked in a similar manner. $\square$
Remark: In the proof of the lemma, the linearity of $T$ was never used.
For a densely defined $T$, we thus obtain a linear operator, which we call $T^{*}$. It is characterized uniquely by:
 $0=\langle f,Tu\rangle _{2}-\langle T^{*}f,u\rangle _{1}=\langle f\oplus T^{*}f,V(u\oplus Tu)\rangle$ for every $u\in \operatorname {dom} T$,
or, more commonly,
 $\langle Tu,f\rangle _{2}=\langle u,T^{*}f\rangle _{1}$ for every $u\in \operatorname {dom} T$.
Furthermore, $T^{*}f$ is defined if and only if
 $u\mapsto \langle Tu,f\rangle ,\qquad u\in \operatorname {dom} T,$
is a continuous functional. The operator $T^{*}$ is called the Hilbert adjoint (or just adjoint) of $T$. If $T$ is closed in addition to having dense domain, then
 $(V'\operatorname {gra} T^{*})^{\bot }=(V'(V\operatorname {gra} T)^{\bot })^{\bot }=\operatorname {gra} T^{\bot \bot }=\operatorname {gra} T$
Here, $V'(x_{2}\oplus x_{1})=-x_{1}\oplus x_{2}$, so that $V'V$ is the identity. By the above lemma, $T^{*}$ is then densely defined. More generally, if a densely defined operator $T$ has a closed extension $S$ (i.e., $\operatorname {gra} T\subset \operatorname {gra} S={\overline {\operatorname {gra} S}}$), then $S$ and $S^{*}$ are both densely defined, and it follows that $\operatorname {gra} S^{*}\subset \operatorname {gra} T^{*}$. That is, $T^{*}$ is densely defined and $T^{**}$ exists. That $S=T^{**}$ follows from the next theorem.
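For an everywhere-defined bounded operator on finite-dimensional spaces (a matrix), the adjoint is simply the conjugate transpose, and the defining identity $\langle Tu,f\rangle _{2}=\langle u,T^{*}f\rangle _{1}$ can be sanity-checked numerically (a NumPy aside with random example data):

```python
import numpy as np

rng = np.random.default_rng(5)

# A bounded operator T : C^3 -> C^5; its adjoint is the conjugate transpose
T = rng.standard_normal((5, 3)) + 1j * rng.standard_normal((5, 3))
Ts = T.conj().T

u = rng.standard_normal(3) + 1j * rng.standard_normal(3)
f = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# <Tu, f>_2 = <u, T*f>_1 ; with our convention <a,b> = np.vdot(b, a),
# since np.vdot conjugates its first argument
assert np.isclose(np.vdot(f, T @ u), np.vdot(Ts @ f, u))

# T** = T
assert np.allclose(Ts.conj().T, T)
```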
3 Theorem Let $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ be a densely defined operator. If $T^{*}$ is also densely defined, then
 ${\overline {\operatorname {gra} T}}=\operatorname {gra} T^{**}=\operatorname {gra} S$
for any closed extension $S$ of $T$.
Proof: As above,
 $(V'\operatorname {gra} T^{*})^{\bot }=\operatorname {gra} T^{\bot \bot }$
Here, the left-hand side is the graph of $T^{**}$. For the second identity, since $\operatorname {gra} S$ is a Hilbert space, it suffices to show $(\operatorname {gra} T)^{\bot }\cap \operatorname {gra} S=\{0\}$. But this follows from Lemma 3.something. $\square$
The next corollary is obvious but important in applications.
3 Corollary Let ${\mathfrak {H}}_{1},{\mathfrak {H}}_{2}$ be Hilbert spaces, and $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ a closed densely defined linear operator. Then $u\in \operatorname {dom} T$ if and only if there is some $K>0$ such that:
 $|\langle T^{*}f,u\rangle |\leq K\|f\|$ for every $f\in \operatorname {dom} T^{*}$
3 Lemma Let $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ be a densely defined linear operator. Then $\operatorname {ker} T^{*}=(\operatorname {ran} T)^{\bot }.$
Proof: $f$ belongs to either side if and only if:
 $0=\langle T^{*}f,u\rangle _{1}=\langle f,Tu\rangle _{2}$ for every $u\in \operatorname {dom} T$.
(Note that $\langle f,Tu\rangle =0$ for every $u$ implies $f\in \operatorname {dom} T^{*}$, since the zero functional is continuous.) $\square$
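For matrices the lemma reads $\ker T^{\mathsf {T}}=(\operatorname {col} T)^{\bot }$, which can be checked numerically via the SVD: the left singular vectors with zero singular value span both sides. A NumPy sketch on a random rank-deficient example:

```python
import numpy as np

rng = np.random.default_rng(6)

# A rank-2 operator T : R^4 -> R^5, so ran T and its complement are nontrivial
T = rng.standard_normal((5, 2)) @ rng.standard_normal((2, 4))

# Left singular vectors with zero singular value span ker T^T
U, s, Vt = np.linalg.svd(T)
null_Tt = U[:, 2:]                 # T has rank 2
assert np.allclose(T.T @ null_Tt, 0, atol=1e-10)

# Each such f is orthogonal to the range of T: <f, Tu> = 0 for all u
u = rng.standard_normal(4)
assert np.allclose(null_Tt.T @ (T @ u), 0, atol=1e-10)
```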
In particular, a closed densely defined operator has closed kernel. As an application we shall prove the next theorem.
3 Theorem Let $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ be a closed densely defined linear operator. Then $T$ is surjective if and only if there is a $K>0$ such that
 $\|f\|_{2}\leq K\|T^{*}f\|_{1}$ for every $f\in \operatorname {dom} T^{*}$.
Proof: Suppose $T$ is surjective. Then $\operatorname {ker} T^{*}=(\operatorname {ran} T)^{\bot }=\{0\}$, and it suffices to show the estimate for $f\in \operatorname {dom} T^{*}$. Let $u\in (\operatorname {ker} T)^{\bot }$ with $Tu=f$. Denoting by $G$ the inverse of $T$ restricted to $(\operatorname {ker} T)^{\bot }$, we have:
 $\|f\|_{2}^{2}=\langle Tu,f\rangle _{2}=\langle u,T^{*}f\rangle _{1}\leq \|T^{*}f\|_{1}\|Gf\|_{1}\leq \|T^{*}f\|_{1}\|G\|\,\|f\|_{2}$
The last inequality holds since $G$ is continuous by the closed graph theorem. To show the converse, let $g\in {\mathfrak {H}}_{2}$ be given. The estimate shows that $T^{*}$ is injective, so we can define a linear functional $L$ on $\operatorname {ran} T^{*}$ by $L(T^{*}f)=\langle f,g\rangle _{2}$ for $f\in \operatorname {dom} T^{*}$. Moreover,
 $|L(T^{*}f)|=|\langle f,g\rangle _{2}|\leq \|f\|_{2}\|g\|_{2}\leq K\|g\|_{2}\|T^{*}f\|_{1}$ for every $f\in \operatorname {dom} T^{*}$.
Thus, $L$ is continuous on the range of $T^{*}$. It follows from the Hahn-Banach theorem that we may assume $L$ is defined and continuous on ${\mathfrak {H}}_{1}$. Thus, by Theorem 3.8, we can write $L(\cdot )=\langle \cdot ,u\rangle _{1}$ with some $u\in {\mathfrak {H}}_{1}$. Since $f\mapsto \langle T^{*}f,u\rangle _{1}=\langle f,g\rangle _{2}$ is then continuous on $\operatorname {dom} T^{*}$, we have $u\in \operatorname {dom} T^{**}$ and
 $L(T^{*}f)=\langle f,g\rangle _{2}=\langle T^{*}f,u\rangle _{1}=\langle f,T^{**}u\rangle _{2}$ for every $f\in \operatorname {dom} T^{*}$.
Hence, $Tu=T^{**}u=g$. $\square$
3 Corollary Let $T,{\mathfrak {H}}_{1},{\mathfrak {H}}_{2}$ be as given in the preceding theorem. Then $\operatorname {ran} T$ is closed if and only if $\operatorname {ran} T^{*}$ is closed.
Proof: Define $S:{\mathfrak {H}}_{1}\to \operatorname {ran} T$ by $S=T$. It thus suffices to show that $S^{*}$ is surjective when $T$ has closed range (or equivalently, $S$ is surjective). Suppose $S^{*}f_{j}$ is convergent. The preceding theorem gives:
 $\|f_{j}-f_{k}\|_{2}\leq K\|S^{*}(f_{j}-f_{k})\|_{1}\to 0$ as $j,k\to \infty$.
Thus, $f_{j}\oplus S^{*}f_{j}$ is Cauchy in the graph of $S^{*}$, which is closed. Hence, $S^{*}f_{j}$ converges within the range of $S^{*}$. The converse holds since $T^{**}=T$. $\square$
We shall now consider some concrete examples of densely defined linear operators.
3 Theorem $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ is continuous if and only if $T^{*}$ is continuous. Moreover, when $T$ is continuous,
 $\|T\|^{2}=\|T^{*}T\|=\|TT^{*}\|=\|T^{*}\|^{2}$.
Proof: It is clear that $T^{*}$ is defined everywhere, and its continuity is a consequence of the closed graph theorem. Conversely, if $T^{*}$ is continuous, then $T^{**}$ is continuous and $T=T^{**}$. For the second part,
 $\|T^{*}f\|_{1}^{2}=\langle TT^{*}f,f\rangle _{2}\leq \|T\|\,\|T^{*}f\|_{1}\,\|f\|_{2}$ for every $f$.
Thus, $T^{*}$ is continuous with $\|T^{*}\|\leq \|T\|$. In particular, $TT^{*}$ is continuous, and so:
 $\|T^{*}f\|_{1}^{2}=\langle TT^{*}f,f\rangle _{2}\leq \|TT^{*}\|\,\|f\|_{2}^{2}$ for every $f$.
That is to say, $\|T^{*}\|^{2}\leq \|TT^{*}\|\leq \|T\|\,\|T^{*}\|\leq \|T\|^{2}$. Applying this result to $T^{*}$ in place of $T$ completes the proof. $\square$
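Numerically, the operator norm of a matrix is its largest singular value, and the $C^{*}$-identity of the theorem can be verified directly (a NumPy sanity check on a random example):

```python
import numpy as np

rng = np.random.default_rng(7)

T = rng.standard_normal((4, 3)) + 1j * rng.standard_normal((4, 3))
Ts = T.conj().T

op_norm = lambda A: np.linalg.norm(A, 2)   # spectral norm = operator norm

# ||T||^2 = ||T*T|| = ||TT*|| = ||T*||^2
assert np.isclose(op_norm(T)**2, op_norm(Ts @ T))
assert np.isclose(op_norm(T)**2, op_norm(T @ Ts))
assert np.isclose(op_norm(T), op_norm(Ts))
```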
The identity in the theorem shows that $B({\mathfrak {H}})$ is a $C^{*}$-algebra, a topic taken up in Chapter 6.
3 Lemma Let $S,T\in B({\mathfrak {H}})$. If $\langle Tx,x\rangle =\langle Sx,x\rangle$ for every $x\in {\mathfrak {H}}$, then $S=T$.
Proof: Let $R=T-S$. We have $0=\langle R(x+y),x+y\rangle =\langle Rx,y\rangle +\langle Ry,x\rangle$ and $0=i\langle R(x+iy),x+iy\rangle =\langle Rx,y\rangle +i^{2}\langle Ry,x\rangle$. Summing the two, we get: $0=2\langle Rx,y\rangle$ for all $x,y\in {\mathfrak {H}}$. Taking $y=Rx$ gives $0=\|Rx\|^{2}$ for all $x\in {\mathfrak {H}}$; that is, $R=0$. $\square$
Remark: the above lemma is false if the underlying field is $\mathbf {R}$.
3 Corollary A linear operator $U:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ is unitary if and only if $U^{*}U$ and $UU^{*}$ are identities.
Proof: Suppose $U$ is unitary. Since $\langle U^{*}Ux,x\rangle =\|Ux\|^{2}=\langle x,x\rangle$ for every $x$, the preceding lemma shows that $U^{*}U$ is the identity. Since $UU^{*}U=U$, $UU^{*}$ is the identity on the range of $U$, which is ${\mathfrak {H}}_{2}$ by surjectivity. Conversely, since $\|Ux\|_{2}^{2}=\langle U^{*}Ux,x\rangle _{1}=\|x\|_{1}^{2}$, $U$ is an isometry, and $UU^{*}=1$ implies $U$ is surjective. $\square$
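A unitary matrix is easily produced from the QR factorization of a random complex matrix, and the corollary's characterization can then be checked numerically (a NumPy aside):

```python
import numpy as np

rng = np.random.default_rng(8)

# A unitary matrix from the QR factorization of a random complex matrix
A = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
U, _ = np.linalg.qr(A)

I = np.eye(4)
assert np.allclose(U.conj().T @ U, I)    # U*U = 1
assert np.allclose(U @ U.conj().T, I)    # UU* = 1

# ...and U is an isometry:
x = rng.standard_normal(4) + 1j * rng.standard_normal(4)
assert np.isclose(np.linalg.norm(U @ x), np.linalg.norm(x))
```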
Curiously, the hypothesis on linearity can be omitted:
3 Theorem If $U:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ is a function such that
 $\|U(x)-U(y)\|_{2}=\|x-y\|_{1}$
for every $x$ and $y$ and $U(0)=0$, then $U$ is a real-linear operator (and so unitary when the spaces are real; over $\mathbf {C}$, complex conjugation shows that complex-linearity may fail).
Proof: Note that $U$ is continuous. Since $\|U(x)\|_{2}=\|U(x)-U(0)\|_{2}=\|x\|_{1}$, we have:
 $\|x-y\|_{1}^{2}=\|U(x)-U(y)\|_{2}^{2}=\|x\|_{1}^{2}-2\operatorname {Re} \langle U(x),U(y)\rangle _{2}+\|y\|_{1}^{2}$.
Thus,
 $\operatorname {Re} \langle x,y\rangle _{1}=\operatorname {Re} \langle U(x),U(y)\rangle _{2}$
It now follows, expanding and using the two identities above, that
 $\|U(\alpha x+y)-\alpha U(x)-U(y)\|_{2}^{2}=\|\alpha x+y\|_{1}^{2}-2\operatorname {Re} \langle \alpha x+y,\alpha x+y\rangle _{1}+\|\alpha x+y\|_{1}^{2}=0$
for any $x,y\in {\mathfrak {H}}_{1}$ and real scalar $\alpha$. $\square$
There is an analog of this result for Banach spaces (the Mazur-Ulam theorem). See, for example, http://www.helsinki.fi/~jvaisala/mazurulam.pdf
3 Exercise Construct an example so as to show that an isometric operator (i.e., a linear operator that preserves norm) need not be unitary. (Hint: a shift operator.)
A densely defined linear operator $T$ is called "symmetric" if $\operatorname {gra} T\subset \operatorname {gra} T^{*}$. If equality holds, then $T$ is called "self-adjoint". In light of Theorem 3.something, every self-adjoint operator is closed and densely defined. If $T$ is symmetric, then since $T^{**}$ is an extension of $T$,
 $\operatorname {gra} T\subset \operatorname {gra} T^{*}\cap \operatorname {gra} T^{**}$.
3 Theorem Let $T_{j}:{\mathfrak {H}}_{j}\to {\mathfrak {H}}_{j+1}$ be densely defined linear operators for $j=1,2$. Then $\operatorname {gra} (T_{1}^{*}\circ T_{2}^{*})\subset \operatorname {gra} (T_{2}\circ T_{1})^{*}$, where equality holds if $T_{j}^{**}=T_{j}$ $(j=1,2)$ and $T_{1}^{*}\circ T_{2}^{*}$ is closed and densely defined.
Proof: Let $u\in \operatorname {dom} (T_{1}^{*}\circ T_{2}^{*})$. Then
 $\langle T_{2}\circ T_{1}v,u\rangle =\langle T_{1}v,T_{2}^{*}u\rangle =\langle v,T_{1}^{*}\circ T_{2}^{*}u\rangle$ for every $v\in \operatorname {dom} (T_{2}\circ T_{1})$.
But this says precisely that $u\in \operatorname {dom} (T_{2}\circ T_{1})^{*}$ with $(T_{2}\circ T_{1})^{*}u=T_{1}^{*}\circ T_{2}^{*}u$. Hence, $(T_{2}\circ T_{1})^{*}$ is an extension of $T_{1}^{*}\circ T_{2}^{*}$. For the second part, the fact we have just proved gives:
 $\operatorname {gra} (T_{1}^{*}\circ T_{2}^{*})\subset \operatorname {gra} (T_{2}\circ T_{1})^{*}=\operatorname {gra} (T_{2}^{**}\circ T_{1}^{**})^{*}\subset \operatorname {gra} (T_{1}^{*}\circ T_{2}^{*})^{**}$. $\square$
3 Theorem Let ${\mathfrak {H}}_{1},{\mathfrak {H}}_{2}$ be Hilbert spaces. If $T:{\mathfrak {H}}_{1}\to {\mathfrak {H}}_{2}$ is a closed densely defined operator, then $T^{*}T$ is a self-adjoint operator (in particular, densely defined and closed).
Proof: In light of the preceding theorem, it suffices to show that $T^{*}T$ is closed. Let $u_{j}\in \operatorname {dom} T^{*}T$ be a sequence such that $(u_{j},T^{*}Tu_{j})$ converges to a limit $(u,v)$. Since
 $\|Tu_{j}-Tu_{k}\|_{2}\leq 2(\|T^{*}T(u_{j}-u_{k})\|_{1}+\|u_{j}-u_{k}\|_{1})$,
there is some $f\in {\mathfrak {H}}_{2}$ such that $\|Tu_{j}-f\|_{2}\to 0$. It follows from the closedness of $T^{*}$ that $T^{*}f=v$. Since $\|u_{j}-u\|_{1}+\|Tu_{j}-f\|_{2}\to 0$ and $T$ is closed, $Tu=f$, and so $T^{*}Tu=T^{*}f=v$. $\square$
3 Theorem Let $T$ be a symmetric densely defined operator on a Hilbert space ${\mathfrak {H}}$. If $T$ is surjective, then $T$ is self-adjoint and injective, and $T^{-1}$ is self-adjoint and bounded.
Proof: If $Tu=0$, then for every $v\in \operatorname {dom} T$,
 $0=\langle Tu,v\rangle =\langle u,Tv\rangle$,
so $u=0$ whenever $T$ has dense range (which holds here, as $T$ is surjective). Thus, $T$ is injective. Since $T^{-1}$ is closed (by Lemma 2.something) and $\operatorname {ran} T={\mathfrak {H}}$, $T^{-1}:{\mathfrak {H}}\to \operatorname {dom} T$ is a continuous linear operator by the closed graph theorem. Finally, we have:
 $\operatorname {gra} T^{-1}=V\operatorname {gra} T\subset V\operatorname {gra} T^{*}=\operatorname {gra} (T^{*})^{-1}=\operatorname {gra} (T^{-1})^{*}$.
Here, $V(x_{1}\oplus x_{2})=x_{2}\oplus x_{1}$, and the inclusion is in fact an equality since the domains of $T$ and $T^{*}$ coincide. Hence, $T^{-1}$ is self-adjoint. Since we have just proved that the inverse of a self-adjoint operator is self-adjoint, $T=(T^{-1})^{-1}$ is self-adjoint. $\square$
3 Theorem Let ${\mathcal {M}}$ be a closed linear subspace of a Hilbert space ${\mathfrak {H}}$. Then $\pi$ is an orthogonal projection onto ${\mathcal {M}}$ if and only if $\pi =\pi ^{*}=\pi ^{2}$ and the range of $\pi$ is ${\mathcal {M}}$.
Proof: The direct part is clear except for $\pi =\pi ^{*}$. But we have:
 $\langle \pi (x),x\rangle =\|\pi (x)\|^{2}$
since $\pi (x)$ and $x-\pi (x)$ are orthogonal. Thus, $\langle \pi (x),x\rangle$ is real for every $x$, which gives $\langle \pi (x),x\rangle ={\overline {\langle \pi (x),x\rangle }}=\langle \pi ^{*}(x),x\rangle$, and so $\pi =\pi ^{*}$ by the lemma above. For the converse, we only have to verify $x-\pi (x)\in {\mathcal {M}}^{\bot }$ for every $x$. But we have: $\pi (x-\pi (x))=0$ and $\operatorname {ker} (\pi )=\operatorname {ker} (\pi ^{*})=(\operatorname {ran} (\pi ))^{\bot }={\mathcal {M}}^{\bot }$. $\square$
We shall now turn our attention to the spectral decomposition of a compact self-adjoint operator. Let $T:{\mathfrak {H}}\to {\mathfrak {H}}$ be a compact operator.