> [!Theorem] Theorem (General Solution of $(2.1)$)
> Given any two [[Linear Independence|linearly independent]] solutions $x_{1}$ and $x_{2}$ of the ODE in $(2.1)$, every solution of $(2.1)$ is of the form
> $$x(t)=l_{1}x_{1}(t)+l_{2}x_{2}(t), \quad t \in (\alpha,\beta)$$
> for some $l_{1},l_{2}\in \mathbb{C}$.

**Proof**
Assume that $x$ is a solution to $(2.1)$ and, for some $t_{0}\in (\alpha,\beta)$, let $x_{0}=x(t_{0})$ and $v_{0}=\dot{x}(t_{0})$. If $x$ can be written in the form $l_{1}x_{1}+l_{2}x_{2}$, then the coefficients $l_{1}$ and $l_{2}$ must satisfy
$$\begin{align} x_{0}&=l_{1}x_{1}(t_{0})+l_{2}x_{2}(t_{0}), \\ v_{0}&=l_{1}\dot{x}_{1}(t_{0}) + l_{2} \dot{x}_{2}(t_{0}). \end{align} \tag{2.2}$$
This is a $2\times 2$ system of linear equations in $l_{1}$ and $l_{2}$. By way of contradiction, suppose its coefficient matrix is [[Singular Matrix|singular]] (equivalently, the [[Wronskian|Wronskian]] vanishes at $t_{0}$):
$$0 = \det \begin{pmatrix} x_{1}(t_{0}) & x_{2}(t_{0}) \\ \dot{x}_{1}(t_{0}) & \dot{x}_{2}(t_{0}) \end{pmatrix} = x_{1}(t_{0}) \dot{x}_{2}(t_{0})-x_{2}(t_{0})\dot{x}_{1}(t_{0}).$$
Assume for simplicity that $x_{2}(t_{0})\neq 0$ and $\dot{x}_{2}(t_{0}) \neq 0$. We obtain
$$\frac{x_{1}(t_{0})}{x_{2}(t_{0})} = \frac{\dot{x}_{1}(t_{0})}{\dot{x}_{2}(t_{0})} =: \bar{c} \in \mathbb{C}.$$
Then $x_{1}$ and $\bar{c}x_{2}$ are both solutions of $(2.1)$ with the same initial data at $t_{0}$, so by uniqueness of solutions $x_{1}(t)=\bar{c}x_{2}(t)$ for all $t\in(\alpha,\beta)$, which contradicts the fact that $x_{1}$ and $x_{2}$ are linearly independent. Consequently, the matrix of the system $(2.2)$ is invertible and the coefficients $l_{1}$ and $l_{2}$ are uniquely determined. By the above lemma, we know that $l_{1}x_{1}+l_{2}x_{2}$ is a solution to $(2.1)$. But $x$ is also a solution with the same initial data at $t_{0}$; hence, by uniqueness, indeed $x(t)=l_{1}x_{1}(t)+l_{2}x_{2}(t)$ for all $t\in(\alpha,\beta)$. $\square$
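As a concrete sanity check of the system $(2.2)$, the following sketch (my own illustration, not part of the notes) takes the ODE $\ddot{x}+x=0$, whose linearly independent solutions are $x_{1}(t)=\cos t$ and $x_{2}(t)=\sin t$, and solves the $2\times 2$ system for $l_{1}, l_{2}$ from given initial data. The coefficient matrix is exactly the Wronskian matrix from the proof, and here $\det = \cos^{2}t_{0}+\sin^{2}t_{0}=1\neq 0$, so the system is always uniquely solvable:

```python
import numpy as np

def coefficients(t0, x0, v0):
    """Solve system (2.2) for (l1, l2) given x(t0) = x0, x'(t0) = v0,
    for the example ODE x'' + x = 0 with x1 = cos, x2 = sin."""
    # Wronskian matrix of (x1, x2) at t0 from the proof above
    W = np.array([[np.cos(t0),  np.sin(t0)],
                  [-np.sin(t0), np.cos(t0)]])
    # det W = 1, so W is never singular and the solution is unique
    return np.linalg.solve(W, np.array([x0, v0]))

# At t0 = 0: x(0) = l1 and x'(0) = l2, so we expect l1 = 2, l2 = -3
l1, l2 = coefficients(t0=0.0, x0=2.0, v0=-3.0)
```

With $t_{0}=0$ the matrix is the identity, so the coefficients coincide with the initial data; for other $t_{0}$ the same call still returns the unique $(l_{1}, l_{2})$, mirroring how invertibility of the Wronskian matrix drives the proof.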