Linear algebra is the study of vector spaces. Vector spaces abstract some properties of $\mathbb{R}^{n}$, or in general $K^{n}$ where $K$ is some [[Field (Algebra)|field]] (fields themselves abstract some properties of $\mathbb{R}$). Note, however, that $\mathbb{R}^{n}$ has some added structure in that it has an inner product.
Why should one study vector spaces and their morphisms (linear maps) as opposed to just studying column vectors in $K^{n}$ and matrices in $K^{n\times n}$? A virtue of linear algebra is that one distinguishes a vector from its coordinate representations (i.e. coordinate-free thinking, TBC).
# Definitions
###### Vector space axioms
A vector space over a [[Field (Algebra)|field]] $\mathbb{F}$ is a triple $(V,+,\cdot)$ consisting of a non-empty set $V$ together with a [[Binary Operation|binary operation]] $+$ (called addition) and a [[Binary Function|binary function]] $\cdot:\mathbb{F}\times V\to V$ (called scalar multiplication) such that $(V,+)$ is an [[Groups|abelian group]] and $V$ is closed under scalar multiplication by elements of $\mathbb{F}$: for any $v\in V$ and any $\lambda\in\mathbb{F}$ there is an element $\lambda v\in V$, and this multiplication satisfies the rules:
1. $\lambda(v+w)=\lambda v+\lambda w$
2. $(\lambda+\mu)v=\lambda v+\mu v$ (rules 1–2 say scalar multiplication is [[Distributivity|distributive]] over vector and scalar addition)
3. $\lambda(\mu v)=(\lambda \mu)v$
4. $1v=v$

for any $\lambda,\mu\in\mathbb{F}$ and $v,w\in V$.
We usually denote the additive identity by $0_{V}$ and, for any $v\in V$, its additive inverse by $-v.$
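
As a quick sanity check, the following minimal sketch (my own illustration, assuming Python with numpy, which is not part of this note) verifies rules 1–4 numerically for $V=\mathbb{R}^{3}$ over $\mathbb{F}=\mathbb{R}$:

```python
import numpy as np

rng = np.random.default_rng(0)
v, w = rng.standard_normal(3), rng.standard_normal(3)  # vectors in V = R^3
lam, mu = rng.standard_normal(2)                       # scalars in F = R

assert np.allclose(lam * (v + w), lam * v + lam * w)  # rule 1
assert np.allclose((lam + mu) * v, lam * v + mu * v)  # rule 2
assert np.allclose(lam * (mu * v), (lam * mu) * v)    # rule 3
assert np.allclose(1.0 * v, v)                        # rule 4
```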
Relate with [[Modules]]: a module satisfies the same axioms but with the field of scalars replaced by a ring.
> [!Example]-
> 1. The [[Ring of Polynomial Forms|ring of univariate polynomials with real coefficients in the indeterminate]] $x$, denoted $\mathbb{R}[x]$, is a vector space over $\mathbb{R}$.
> 2. The set of functions $V = \left\{ f: \mathbb{R} \to \mathbb{R} : f \text{ is twice continuously differentiable and } \frac{d^{2}f}{dx^{2}}+9f = 0 \right\}$ is a vector space.
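
For the second example, the solutions are exactly $f(x)=a\cos 3x+b\sin 3x$. A numerical sketch (my own, again assuming Python/numpy) checks that a linear combination of two solutions still satisfies the ODE:

```python
import numpy as np

x = np.linspace(0.0, 2 * np.pi, 10_001)
f = np.cos(3 * x)      # one solution of f'' + 9f = 0
g = np.sin(3 * x)      # another solution
h = 2.0 * f - 5.0 * g  # an arbitrary linear combination

# second derivative by central finite differences
h_xx = np.gradient(np.gradient(h, x), x)

# the ODE residual vanishes (up to discretisation error) away from the endpoints
assert np.max(np.abs(h_xx + 9 * h)[5:-5]) < 1e-3
```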
# Properties
> [!NOTE] Additive identities & inverses
> Let $V$ be a vector space over $\mathbb{F}$. For any $\lambda\in\mathbb{F}$ and $v\in V$ we have
> 1. $\lambda 0_{V}=0_{V}$ and $0v=0_{V}$
> 2. $(-1)v=-v$ and $(-\lambda)v=-(\lambda v)=\lambda(-v)$.
>
> *Proof*.
> 1. $v+0_{V} =v\implies\lambda(v+0_{V})=\lambda v\implies\lambda v + \lambda 0_{V}=\lambda v\implies\lambda0_{V}=0_{V}$ by cancellation in $(V,+)$. Similarly, $0v=(0+0)v=0v+0v\implies 0v=0_{V}$.
> 2. Note that $\lambda v+(-\lambda)v=(\lambda+(-\lambda))v=0v=0_{V}$, so $(-\lambda)v=-(\lambda v)$ by uniqueness of additive inverses. Taking $\lambda=1$ gives $(-1)v=-v$, and then $\lambda(-v)=\lambda((-1)v)=(-\lambda)v$ by rule 3.

> [!info] Vector Subspace
> A [[Vector subspace|vector subspace]] is a non-empty subset of $V$ that is closed under linear combinations.

> [!info] Span
> The [[Span of Subset of Vector Space|span]] of a non-empty subset of $V$ is the set of all linear combinations of its elements. Note that a span is therefore a vector subspace of $V$.
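
Span membership can be tested computationally: $v\in\langle v_{1},\dots,v_{s}\rangle$ iff appending $v$ as a column does not increase the rank of the matrix with columns $v_{1},\dots,v_{s}$. A sketch (my own, Python/numpy; the helper `in_span` is hypothetical):

```python
import numpy as np

def in_span(v, generators):
    """v is in the span iff appending it does not increase the column rank."""
    M = np.column_stack(generators)
    return np.linalg.matrix_rank(np.column_stack([M, v])) == np.linalg.matrix_rank(M)

v1, v2 = np.array([1.0, 0.0, 1.0]), np.array([0.0, 1.0, 1.0])
assert in_span(np.array([2.0, 3.0, 5.0]), [v1, v2])      # equals 2*v1 + 3*v2
assert not in_span(np.array([0.0, 0.0, 1.0]), [v1, v2])  # not a linear combination
```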

> [!info] Finite Dimensional
> A vector space $V$ is [[Vector spaces|finite dimensional]] iff there is a finite subset $\{ v_{1},\dots,v_{s} \}\subset V$ that spans $V$, i.e. $\langle v_{1},\dots,v_{s} \rangle=V$.
>
> Note that every FDVS over $\mathbb{R}$ of dimension $n$ is [[Linear Isomorphism|linearly isomorphic]] to $\mathbb{R}^{n}$: the coordinate map with respect to any basis gives such an isomorphism, as in the sketch below.
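
A sketch of the correspondence (my own, Python/numpy): identifying $\mathbb{R}[x]_{\leq 2}$ with $\mathbb{R}^{3}$ via monomial coordinates, linear operations on coordinate vectors agree with linear operations on the polynomials themselves:

```python
import numpy as np

# coordinates w.r.t. the monomial basis 1, x, x^2 of R[x]_{<=2}
p, q = np.array([1.0, 2.0, 0.0]), np.array([0.0, -1.0, 3.0])  # 1 + 2x and -x + 3x^2

def evaluate(coeffs, x):
    """Evaluate the polynomial with the given monomial coordinates at x."""
    return sum(c * x**k for k, c in enumerate(coeffs))

x = np.linspace(-1.0, 1.0, 101)
# linearity of the coordinate map: combining coordinates in R^3
# agrees with combining the polynomials as functions
assert np.allclose(evaluate(2 * p + q, x), 2 * evaluate(p, x) + evaluate(q, x))
```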

> [!info] New spaces from old
> See [[Direct sum of vector spaces]].

> [!NOTE] Definition (Inner Product)
> An [[Inner products|inner product]] on a real vector space $V$ associates a scalar $\langle v,w \rangle\in\mathbb{R}$ to any $v,w\in V$, subject to the following rules:
> 1. Commutativity: $\langle v,w\rangle = \langle w,v\rangle$ for any $v,w\in V.$
> 2. Bilinearity: $\langle \lambda_{1}v_{1}+\lambda_{2}v_{2},w\rangle = \lambda_{1} \langle v_{1},w\rangle+ \lambda_{2}\langle v_{2},w\rangle$ for any $v_{1},v_{2},w\in V$ and $\lambda_{1}, \lambda_{2} \in \mathbb{R}$ (by commutativity, linearity in the first slot gives linearity in both).
> 3. Positive Definite: For any $v\in V,$ $\langle v,v\rangle \geq 0,$ and furthermore $\langle v,v\rangle=0$ iff $v=0_{V}.$
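
The rules are easy to check numerically for a concrete example, say $\langle f,g\rangle=\int_{0}^{1}f(x)g(x)\,dx$ on polynomials (a sketch of my own, Python/numpy, with the integral approximated on a grid):

```python
import numpy as np

x = np.linspace(0.0, 1.0, 100_001)

def ip(f, g):
    """<f, g> = integral of f(x) g(x) over [0, 1], approximated on a grid."""
    return np.mean(f(x) * g(x))

f = lambda t: 1 + 2 * t
g = lambda t: t ** 2
h = lambda t: 3 * t - 1

assert np.isclose(ip(f, g), ip(g, f))                         # commutativity
assert np.isclose(ip(lambda t: 2 * f(t) + 3 * g(t), h),
                  2 * ip(f, h) + 3 * ip(g, h))                # bilinearity
assert ip(f, f) > 0                                           # positive definiteness
```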

> [!Example] Examples (finite spanning sets)
> For $\mathbb{R}^{n}$ where $n\in \mathbb{N}$, the standard basis $\underline{e}_{1},\dots,\underline{e}_{n}$ provides a finite spanning set.
>
> For $\mathbb{R}[x]_{\leq d}$ where $d\in \mathbb{N}$, the monomials $1,x,x^{2}, \dots,x^{d}$ provide a finite spanning set.
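
Computationally (my own sketch, Python/numpy), a finite set spans $\mathbb{R}^{n}$ iff the matrix with those vectors as its columns has rank $n$:

```python
import numpy as np

# four vectors in R^3, written as the columns of a matrix
vectors = np.array([[1.0, 0.0, 2.0, 1.0],
                    [0.0, 1.0, 1.0, 1.0],
                    [0.0, 0.0, 0.0, 3.0]])

# the set spans R^3 iff the column space is all of R^3, i.e. the rank is 3
assert np.linalg.matrix_rank(vectors) == 3
```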
# Properties of Finite Dimensional Vector Spaces
Any finite dimensional vector space $V$ has a [[Basis of Vector Space#^cbe5c8|basis]] by the [[Basis of Vector Space#^df1788|sifting lemma]]. Moreover, any two bases of $V$ have the same size, known as the *[[Basis of Vector Space#^b1e1c3|dimension]]* of $V.$ Note [[Dimension of Sum of Finite Dimensional Vector Subspaces (Dimension Formula)]].
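
The sifting lemma is constructive: walk through a finite spanning set and discard each vector that already lies in the span of those kept so far. A sketch (my own, Python/numpy, using rank to test linear independence):

```python
import numpy as np

def sift(spanning):
    """Extract a basis from a finite spanning set: keep each vector only if it
    is linearly independent of the vectors kept so far (rank test)."""
    basis = []
    for v in spanning:
        candidate = basis + [v]
        if np.linalg.matrix_rank(np.array(candidate)) == len(candidate):
            basis.append(v)
    return basis

spanning = [np.array([1.0, 0.0, 0.0]),
            np.array([2.0, 0.0, 0.0]),   # dependent on the first, sifted out
            np.array([0.0, 1.0, 0.0]),
            np.array([0.0, 1.0, 1.0])]
assert len(sift(spanning)) == 3  # matches dim R^3
```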
Note that [[Basis of Vector Space#^61f832|subspaces]] of $V$ are also finite dimensional, with dimension at most that of $V$.
Moreover, any [[Linear Independence#^fbfe78|linearly independent subset]] of $V$ has at most $d$ elements, where $d$ is the dimension of $V.$ On the other hand, if a vector space is not finite dimensional then it [[Linear Independence#^d0db64|contains an infinite linearly independent subset]].
Note that the [[Direct sum of vector spaces|direct sum]] of two FDVS is also an FDVS.
An FDVS over $\mathbb{R}$ equipped with an inner product is known as a [[Euclidean spaces|Euclidean space]].
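
For instance (my own sketch, Python/numpy), any symmetric positive-definite matrix $A$ makes $\mathbb{R}^{2}$ a Euclidean space via $\langle v,w\rangle_{A}=v^{\top}Aw$:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])  # symmetric positive definite (eigenvalues 1 and 3)

def ip(v, w):
    """<v, w>_A = v^T A w, a non-standard inner product on R^2."""
    return v @ A @ w

v, w = np.array([1.0, -1.0]), np.array([0.5, 2.0])
assert np.isclose(ip(v, w), ip(w, v))  # commutativity, since A = A^T
assert ip(v, v) > 0                    # positive definiteness for v != 0
```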