Methods of linear algebra such as row reduction allow one to solve systems of linear equations.
The following theorem applies to systems of non-linear equations.
# Statements
> [!NOTE] Theorem (Implicit function)
> Let $U\subset \mathbb{R}^{n} \times \mathbb{R}^{m}$ be open and $f:U\to \mathbb{R}^{m}$. Let $(a,b)\in \mathbb{R}^{n}\times \mathbb{R}^{m}$ be such that $f(a,b)=0$ and $f$ is [[Continuous Differentiability|continuously differentiable]] on a neighbourhood of $(a,b)$. Let $M$ be the matrix formed by the last $m$ columns of $\partial f(a,b)$ (we do not bother to invent notation for this - recall that $\partial f(a,b)$ is an $m\times (n+m)$ matrix). If $M$ is invertible (i.e. $\det M\neq 0$), then there exist: an open neighbourhood $A\subset \mathbb{R}^{n}$ of $a$; an open neighbourhood $B \subset \mathbb{R}^{m}$ of $b$; and a continuously differentiable function $g:A\to B$ such that $g(a)=b$ and $f(x, g(x))=0, \quad \forall x \in A.$
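A concrete example with $n=m=1$: take $f(x,y)=x^{2}+y^{2}-1$ and $(a,b)=(0,1)$, so the zero set of $f$ is the unit circle. Here $\partial f(x,y)=\begin{pmatrix}2x & 2y\end{pmatrix}$, so $M=\begin{pmatrix}2\end{pmatrix}$ is invertible, and the theorem is witnessed by, for instance, $A=(-1,1)$, $B=(0,2)$ and $g(x)=\sqrt{ 1-x^{2} }$, which satisfies $f(x,g(x))=0$ for all $x\in A$. At the point $(1,0)$, by contrast, $M=\begin{pmatrix}0\end{pmatrix}$ is not invertible, and indeed no such $g$ can exist on any neighbourhood of $1$, since the circle has no points with first coordinate greater than $1$.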
Requiring that the last $m$ columns of $\partial f(a,b)$ form an invertible matrix is equivalent, up to relabelling the coordinates of $\mathbb{R}^{n} \times \mathbb{R}^{m}$, to requiring that $\partial f(a,b)$ have full rank.
Zero can be replaced with any 'regular value' of $f$.
Discuss geometric interpretations, i.e. submanifolds of $\mathbb{R}^{n}$ (TBC, see [[Submanifolds of Euclidean Space]]).
# Proofs
Apply the [[Inverse function theorem]] to $\begin{align}
F: U &\to \mathbb{R}^{n}\times \mathbb{R}^{m} \\
(x,y) &\mapsto (x,f(x,y)).
\end{align}$
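In slightly more detail (a sketch): $F$ is continuously differentiable near $(a,b)$, and $\partial F(a,b)=\begin{pmatrix}I_{n} & 0 \\ N & M\end{pmatrix}$, where $N$ denotes the matrix of the first $n$ columns of $\partial f(a,b)$. This block-triangular matrix has determinant $\det M\neq 0$, so the inverse function theorem applies and $F$ has a continuously differentiable local inverse $F^{-1}$ on a neighbourhood of $F(a,b)=(a,0)$. Since $F$ leaves the first $n$ coordinates unchanged, so does $F^{-1}$; writing $F^{-1}(x,z)=(x,h(x,z))$ and setting $g(x):=h(x,0)$ gives a continuously differentiable $g$ with $F(x,g(x))=(x,0)$, i.e. $f(x,g(x))=0$, for all $x$ in a suitable open neighbourhood $A$ of $a$.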