> [!NOTE] Lemma (Equivalence of component-wise convergence and convergence in $\mathbb{R}^n$)
> A sequence $(x_{j})$ of vectors in $\mathbb{R}^n$ [[Convergence|converges]] to $x_{0}\in \mathbb{R}^n$ iff, for each $i\in \{ 1,2,3,\dots,n \}$, $\lim_{ j \to \infty } x_{j,i}=x_{0,i}$, where $x_{j}=(x_{j,1},\dots,x_{j,n})$ and $x_{0}=(x_{0,1},\dots,x_{0,n})$.
###### Proof
($\implies$) To see that convergence implies component-wise convergence, note that $\forall i\in \{ 1,\dots, n \}, \; \lvert x_{0,i} - x_{j,i} \rvert = \sqrt{ (x_{0,i}-x_{j,i})^2 } \leq \sqrt{ \sum_{k=1}^n (x_{0,k}-x_{j,k})^2 } = \lVert x_{0} - x_{j} \rVert.$
Now, by the definition of convergence, for all $\varepsilon>0$, $\exists N \in \mathbb{N}$ such that $j \geqslant N \Rightarrow \lVert x_0-x_j \rVert<\varepsilon$, and therefore $j \geqslant N \Rightarrow \lvert x_{0, i}-x_{j, i}\rvert<\varepsilon, \; \forall i \in\{1, \ldots, n\}$, i.e., $\lim_{j \to \infty} x_{j, i}=x_{0, i}$ for every $i \in\{1, \ldots, n\}$.
$(\impliedby)$ Given $\varepsilon>0$ and $i \in\{1, \ldots, n\}$, $\exists N_i$ such that $j \geqslant N_i \Rightarrow \lvert x_{0, i}-x_{j, i}\rvert<\varepsilon$. Set $N:=\max \left\{N_1, \ldots, N_n\right\}$. Then
$$
j \geqslant N \Rightarrow \left\lVert x_0-x_j\right\rVert:=\left(\sum_{i=1}^n\left(x_{0, i}-x_{j, i}\right)^2\right)^{1 / 2}<\left(n \varepsilon^2\right)^{1/2}=\sqrt{n}\, \varepsilon.
$$
Since $\varepsilon>0$ was arbitrary, applying the above with $\varepsilon / \sqrt{n}$ in place of $\varepsilon$ gives $\lVert x_0 - x_j \rVert < \varepsilon$ for all sufficiently large $j$, i.e., $\lim_{j \to \infty} x_j = x_0$. $\blacksquare$
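The two inequalities used in the proof can be checked numerically. The sketch below (a hypothetical example, not part of the proof) takes the sequence $x_j = (1/j,\, 1 + 1/j^2)$ in $\mathbb{R}^2$, which converges to $x_0 = (0, 1)$, and verifies that each component error is bounded by the Euclidean norm of the difference, while the norm is bounded by $\sqrt{n}$ times the largest component error:

```python
import math

x_0 = (0.0, 1.0)  # assumed limit of the example sequence

def x(j):
    # Example sequence x_j = (1/j, 1 + 1/j^2) converging to (0, 1)
    return (1.0 / j, 1.0 + 1.0 / j**2)

def norm_error(j):
    # Euclidean norm ||x_0 - x_j||
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x_0, x(j))))

def component_errors(j):
    # Component-wise errors |x_{0,i} - x_{j,i}|
    return tuple(abs(a - b) for a, b in zip(x_0, x(j)))

for j in (10, 100, 1000):
    errs = component_errors(j)
    # Each component error is at most the norm of the difference (forward direction)
    assert all(e <= norm_error(j) + 1e-15 for e in errs)
    # The norm is at most sqrt(n) times the largest component error (converse direction)
    assert norm_error(j) <= math.sqrt(2) * max(errs) + 1e-15
```

Both assertions hold for every $j$, mirroring the two halves of the proof: component errors shrink whenever the norm does, and the norm shrinks whenever all component errors do.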