> [!Definition] Definition (Orthonormal)
> A set of [[Euclidean Vector|Euclidean vectors]] $\underline{v}_{1},\dots,\underline{v}_{s}$ is *orthonormal* iff the [[Dot Product in Real n-Space|dot product]] of every pair satisfies $\underline{v}_{i}\cdot \underline{v}_{j} = \delta_{ij}$ for all $i,j=1,\dots,s$, where $\delta_{ij}$ is the [[Kronecker Delta Function]].

> [!info] Remark
> In other words, $\underline{v}_{1},\dots,\underline{v}_{s}$ are orthonormal iff they all have length $1$ and they are pairwise [[Angle Between Nonzero Real Vectors|orthogonal]].

# Properties

> [!NOTE] Lemma (Reading Components)
> Suppose $v_{1},\dots,v_{s}$ are orthonormal and $v= \lambda_{1}v_{1}+\dots+\lambda_{s} v_{s}$. Then $$\lambda_{i}=v_{i}\cdot v \quad \text{for all } i=1,\dots,s.$$
>
> *Proof*. Since $v_{i}\cdot v_{i}=1$ and $v_{i}\cdot v_{j} =0$ for $i\neq j$, we have $$v_{i}\cdot v = v_{i} \cdot (\lambda_{1}v_{1}+\dots+\lambda_{s}v_{s})= \lambda_{1}\,v_{i}\cdot v_{1} +\lambda_{2}\,v_{i}\cdot v_{2}+\dots+ \lambda_{s}\, v_{i} \cdot v_{s}=\lambda_{i}.$$

> [!NOTE] Proposition (Orthonormal vectors are linearly independent)
> If $v_{1},\dots,v_{s}$ are orthonormal, then they are [[Linear Independence|linearly independent]].
>
> *Proof*. Suppose $\lambda_{1}v_{1}+\dots+\lambda_{s}v_{s}=0_{V}$. By the above lemma, $$\lambda_{i} = v_{i}\cdot(\lambda_{1}v_{1}+\dots+\lambda_{s}v_{s})=v_{i}\cdot 0_{V} = 0 \quad \text{for all } i=1,\dots,s.$$

# Examples

> [!Example]
> Every orthonormal set of two vectors in $\mathbb{R}^2$ has the form $\{ (\cos \theta,\sin\theta)^{T}, \pm(-\sin \theta,\cos \theta)^{T} \}$ for some angle $\theta$; the [[Standard basis of real n-space|standard basis]] is the case $\theta=0$.

> [!Example] Example (Orthogonal projection)
> If $\underline{v}=(a_{1},\dots,a_{n})^{T} \in\mathbb{R}^{n}$ and $\underline{e}_{1},\dots,\underline{e}_{n}$ is the standard basis of $\mathbb{R}^{n}$, then the component of $\underline{v}$ in the direction of $\underline{e}_{i}$ is given by $a_{i}=\underline{e}_{i}\cdot \underline{v}$, for each $i=1,\dots,n$, which is the length of the projection of $\underline{v}$ onto $\underline{e}_{i}$.
>
> More generally, consider the following picture: the vector $\lambda \underline{\hat{w}}$, which has length $|\lambda|$, is the [[Orthogonal Projection in Real n-Space]] of $\underline{v}$ onto the line through the unit vector $\underline{\hat{w}}$, though the required scalar multiple $\lambda$ is yet to be calculated.
>
> ![[orthogonal projection.png|400]]
>
> Define $\underline{n}=\underline{v}-\lambda \underline{\hat{w}}$. Then $$\underline{n}\cdot \underline{\hat{w}}=\underline{v} \cdot \underline{\hat{w}} - \lambda || \underline{\hat{w}} ||^{2} = \underline{v} \cdot \underline{\hat{w}} - \lambda$$ since $||\underline{\hat{w}}||=1$. Therefore $\underline{n}$ is orthogonal to $\underline{\hat{w}}$ iff $\lambda= \underline{v} \cdot \underline{\hat{w}}$.

^400afe

# Applications

- [[Gram-Schmidt orthogonalisation in real n-space]].
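
As a concrete illustration of this application, here is a minimal sketch of the classical Gram-Schmidt process, assuming NumPy is available; `gram_schmidt` and the two input vectors are illustrative choices, not part of any library. It turns linearly independent vectors into an orthonormal set, using the reading-components formula $\lambda = v\cdot u$ at each step.

```python
import numpy as np

def gram_schmidt(vectors):
    """Orthonormalise linearly independent vectors (classical Gram-Schmidt)."""
    basis = []
    for v in vectors:
        v = np.asarray(v, dtype=float)
        # Remove the component of v along each vector found so far;
        # the coefficient v @ u is the reading-components formula above.
        for u in basis:
            v = v - (v @ u) * u
        basis.append(v / np.linalg.norm(v))
    return basis

# Two independent vectors in R^3 become an orthonormal pair.
u1, u2 = gram_schmidt([[1.0, 1.0, 0.0], [1.0, 0.0, 1.0]])
assert np.isclose(u1 @ u1, 1) and np.isclose(u2 @ u2, 1)
assert np.isclose(u1 @ u2, 0)
```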
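
The reading-components lemma itself can also be checked numerically. The sketch below (again assuming NumPy; the angle `theta` and the coefficients $3$ and $-2$ are arbitrary) builds the rotated orthonormal pair from the first example, confirms $v_{i}\cdot v_{j}=\delta_{ij}$, and recovers the coefficients of a linear combination via dot products.

```python
import numpy as np

theta = 0.7  # arbitrary angle; any value yields an orthonormal pair
v1 = np.array([np.cos(theta), np.sin(theta)])
v2 = np.array([-np.sin(theta), np.cos(theta)])

# v_i . v_j = delta_ij: with the vectors as columns of V, V^T V = I.
V = np.column_stack([v1, v2])
assert np.allclose(V.T @ V, np.eye(2))

# Reading components: for v = 3 v1 - 2 v2 the lemma gives lambda_i = v_i . v.
v = 3 * v1 - 2 * v2
assert np.isclose(v1 @ v, 3) and np.isclose(v2 @ v, -2)
```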
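
The projection example admits the same kind of check. In this sketch the vector `v` and direction `w` are illustrative: we normalise `w`, compute $\lambda=\underline{v}\cdot \underline{\hat{w}}$, and confirm that the residual $\underline{n}=\underline{v}-\lambda \underline{\hat{w}}$ is orthogonal to $\underline{\hat{w}}$.

```python
import numpy as np

v = np.array([3.0, 1.0, 2.0])  # illustrative vector in R^3
w = np.array([1.0, 1.0, 0.0])  # spans the line we project onto

w_hat = w / np.linalg.norm(w)  # unit vector along the line
lam = v @ w_hat                # lambda = v . w_hat
n = v - lam * w_hat            # residual n = v - lambda w_hat

# n is orthogonal to w_hat, so lam * w_hat is the orthogonal projection.
assert np.isclose(n @ w_hat, 0.0)
```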