> [!NOTE] Theorem (Rank-Nullity formula)
> Let $\varphi:V\to W$ be a [[Linear Map|linear map]] such that $V$ is [[Finite Dimensional Vector Space|finite dimensional]]. Then its [[Image of Linear Map|image]] and [[Kernel of Linear Map|kernel]] are finite dimensional and their [[Dimension of Vector Space|dimensions]] satisfy $\dim \text{Im } \varphi + \dim \ker \varphi = \dim V.$
> In other words, $\text{rk }\varphi + \text{nullity } \varphi =\dim V$ where $\text{rk }\varphi$ and $\text{nullity }\varphi$ denote the [[Rank-Nullity Formula|rank]] and [[Nullity of Linear Map|nullity]] of $\varphi.$

**Proof 1**: Both $\ker\varphi \subset V$ and $\text{Im }\varphi \subset W$ are finite dimensional [[Kernel and Image of a Linear Map are Subspaces#^616c3c|because]] $V$ is. Pick bases $u_{1},\dots,u_{s}$ of $\ker \varphi$ and $w_{1},\dots,w_{r}$ of $\text{Im }\varphi$. Since each $w_{i}$ lies in the image, there are $v_{1},\dots,v_{r}\in V$ which satisfy $\varphi(v_{i})=w_{i}$ for $i=1,\dots,r$. We claim that $B=\{ u_{1},\dots,u_{s}, v_{1},\dots,v_{r} \}$ is a basis of $V$. This will complete the proof since $s= \dim \ker \varphi$ and $r=\dim \text{Im } \varphi$.

To show that $B$ **spans** $V$, consider any $v\in V$ and set $w= \varphi(v)\in \text{Im }\varphi$. Since $w_{1},\dots,w_{r}$ is a basis of $\text{Im } \varphi$, there exist unique $\mu_{1},\dots,\mu_{r} \in\mathbb{R}$ so that
$$w = \mu_{1}w_{1}+\dots+\mu_{r}w_{r}.$$
The key is to note that $v'=\mu_{1}v_{1}+\dots+\mu_{r} v_{r}$ also maps to $w$. By linearity of $\varphi$:
$$\begin{array}{rcl}\varphi(v-v')&=&\varphi(v)-\mu_1\varphi(v_1)-\dots-\mu_r\varphi(v_r)\\&=&w-\mu_1w_1-\dots-\mu_rw_r\\&=&0_W\end{array}$$
so that $v- v' \in \ker \varphi$. Therefore there exist scalars $\lambda_{1},\dots,\lambda_{s}$ so that
$$v-v' = \lambda_{1} u_{1} +\dots+\lambda_{s} u_{s}.$$
Rearranging gives $v = \lambda_{1}u_{1}+\dots+\lambda_{s} u_{s}+\mu_{1}v_{1}+\dots+\mu_{r}v_{r}$, so that $v\in \langle B \rangle$, as required.

To show that $B$ is **linearly independent**, suppose there are scalars $\lambda_{i},\mu_{j}$ such that
$$\lambda_{1}u_{1}+\dots+\lambda_{s}u_{s}+\mu_{1}v_{1}+\dots+\mu_{r} v_{r} = 0_{V}.\tag{1}$$
Applying $\varphi$, and noting that $\varphi(u_{i})=0_{W}$ and $\varphi(v_{j})=w_{j}$, we have
$$0_{W}+\mu_{1}w_{1}+\dots+\mu_{r}w_{r}=0_{W}.$$
Since $w_{1},\dots,w_{r}$ are linearly independent, being a basis of $\text{Im }\varphi$, we have $\mu_{1}=\dots=\mu_{r}=0$. Equation $(1)$ now reads
$$\lambda_{1}u_{1}+\dots+\lambda_{s}u_{s}=0_{V}.$$
Since $u_{1},\dots,u_{s}$ are linearly independent, being a basis of $\ker \varphi$, we have $\lambda_{1}=\dots=\lambda_{s}=0$. So all coefficients in $(1)$ are necessarily zero, and so $B$ is linearly independent.

**Proof 2:** Let $n=\dim V$ and $m = \dim \text{Im }\varphi$. Pick bases for $V$ and $\text{Im }\varphi$ and consider the [[Linear Map#^467233|matrix]] $A\in\mathbb{R}^{m\times n}$ that represents $\varphi$ with respect to the given bases. Compute the [[Left Multiplication Linear Map of Real Matrix#^453fd6|Smith normal form]] $EAF$ of $A$ and let $r$ be its [[Smith Normal Form for Real Matrix|rank]]. Then
$$\text{rk }\varphi = \dim \text{Im }L_{A} = \dim \text{Im }L_{EAF}=r$$
and
$$\text{nullity }\varphi = \dim\ker L_{A} = \dim \ker L_{EAF} = n-r,$$
and the result follows since $r+(n-r)=n$.
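As a quick sanity check (an added example, not taken from the cited notes), consider the projection $\varphi:\mathbb{R}^{3}\to\mathbb{R}^{2}$, $\varphi(x,y,z)=(x,y)$. With respect to the standard bases of $\mathbb{R}^{3}$ and $\text{Im }\varphi=\mathbb{R}^{2}$ its matrix is
$$A=\begin{pmatrix}1&0&0\\0&1&0\end{pmatrix},$$
which is already in Smith normal form with $r=2$. Indeed $\ker\varphi=\{(0,0,z):z\in\mathbb{R}\}$, so $\text{rk }\varphi+\text{nullity }\varphi=2+1=3=\dim\mathbb{R}^{3}$, as the theorem predicts.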
**Proof 3**: Since $V$ is finite dimensional, so is $U=\ker \varphi \subset V$. Let $U'\subset V$ be a [[Complement to Vector Subspace|complement]] to $U.$ Define the linear map
$$\begin{align} \psi: U' & \to \text{Im }\varphi \\ u &\mapsto \varphi(u) \end{align}$$
which is simply the restriction of $\varphi$ to $U'$. We claim that $\psi$ is an [[Linear Isomorphism|isomorphism]]. This completes the proof since then, by the [[Complement to Vector Subspace#^6489d8|dimension of a complement]] and the [[Linear Isomorphism#^aa66b3|dimension of isomorphic vector spaces]],
$$\dim V = \dim U + \dim U' = \dim \ker \varphi + \dim \text{Im }\varphi.$$
Firstly, $\psi$ is injective: for $u\in U'$ we have $\psi(u)=0_{W}$ only if $u\in\ker\varphi$, but $U'\cap\ker\varphi=\{ 0_{V} \}$ since $U'$ is a complement to $\ker\varphi$, so $\psi(u)=0_{W}$ iff $u=0_{V}$, as required ([[Linear Map#^537e6e|Linear map is injective iff kernel only contains zero vector]]).

It remains to show that $\psi$ is surjective. Suppose $w\in \text{Im }\varphi$. Then there is $v\in V$ so that $\varphi(v)=w$. Using the [[Complement to Vector Subspace#^d77d91|direct sum lemma]], write $v=u+u'$ with $u\in U$ and $u'\in U'$. Then
$$\psi(u')=\varphi(u')=\varphi(v-u)=\varphi(v)-\varphi(u)=w-0_{W}=w,$$
so $w\in \text{Im }\psi$, as required.
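Continuing the projection example above (again an added illustration rather than part of the cited notes): here $U=\ker\varphi=\langle e_{3}\rangle$ and $U'=\langle e_{1},e_{2}\rangle$ is a complement to $U$ in $\mathbb{R}^{3}$. The restriction
$$\psi:U'\to\text{Im }\varphi,\qquad (x,y,0)\mapsto(x,y)$$
is visibly an isomorphism, so $\dim\mathbb{R}^{3}=\dim U+\dim U'=1+2$ recovers $\dim\ker\varphi+\dim\text{Im }\varphi$.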