# Definition(s)
> [!NOTE] Definition (Stochastic Processes)
> A stochastic process is a family of [[Random Variables|random variables]] $\{ X_{t}(\omega) \mid t \in T \}$ indexed by a set $T$ and defined on the same [[Probability Space|probability space]] $(\Omega, \mathcal{F}, \mathbb{P})$.
**Terminology**:
- The **state space** of $\{ X_{t} \}_{t\in T}$ is defined as the set of all values taken by the random variables.
- If $T=\mathbb{N}$, then the stochastic process is said to be a **discrete-time** stochastic process. A prime example of a discrete-time stochastic process is given by [[Discrete-time Markov chains|discrete-time Markov chains]] (a simulation sketch of a simple discrete-time process follows this list).
- If $T=[0,\infty)$, then the stochastic process is said to be a **continuous-time** stochastic process. [[Brownian Motion]] is a continuous-time stochastic process $(X_{t})_{t\geq 0}$ used to model the movement of a particle in a fluid.
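
As a concrete illustration (an assumed example, not part of the definition above), the sketch below simulates a simple symmetric random walk: a discrete-time stochastic process with index set $T=\mathbb{N}$ and state space $\mathbb{Z}$. Each run of the simulation produces one sample path $t \mapsto X_{t}(\omega)$ for a fixed $\omega$.

```python
# Minimal sketch: one sample path of a simple symmetric random walk,
# a discrete-time stochastic process with state space Z.
import numpy as np

rng = np.random.default_rng(seed=0)

def simulate_random_walk(n_steps: int) -> np.ndarray:
    """Return (X_0, X_1, ..., X_n) with X_0 = 0 and i.i.d. +/-1 increments,
    i.e. one realisation omega -> (X_t(omega))_{t in N}."""
    steps = rng.choice([-1, 1], size=n_steps)        # i.i.d. increments
    return np.concatenate(([0], np.cumsum(steps)))   # partial sums give X_t

path = simulate_random_walk(10)
print(path)  # one sample path, e.g. [0, 1, 0, 1, 2, ...]
```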