> [!NOTE] Master Theorem (for polynomial work)
> Given a recurrence of the form $T(n)=aT\left( \left\lceil \frac{n}{b} \right\rceil \right)+\Theta(n^{d})$ for constants $a\geq 1$ and $b>1$, the following statements are true:
> 1. If $\frac{a}{b^{d}}<1$, then $T(n)=\Theta(n^{d})$.
> 2. If $\frac{a}{b^{d}}=1$, then $T(n)=\Theta(n^{d}\log_{b}n)$.
> 3. If $\frac{a}{b^{d}}>1$, then $T(n)=\Theta\left( n^{d}\left( \frac{a}{b^{d}} \right)^{\log_{b}n} \right)=\Theta(n^{\log_{b}a})$.
>
> *Proof*. Note that the maximum recursion depth is $h:=\log_{b}n$. 🔑 At depth $i$ of the recursion tree, there are $a^{i}$ subproblems, each of size $\frac{n}{b^{i}}$. Hence the total time is
> $$T(n)=\sum_{i=0}^{h} a^{i}\left( \frac{n}{b^{i}} \right)^{d} = n^{d}\sum_{i=0}^{h}\left( \frac{a}{b^{d}} \right)^{i}.$$
> In case 1, the sum is bounded above by the constant $\frac{1}{1-\frac{a}{b^{d}}}$ and below by $1$, so it is $\Theta(1)$. In case 2, the sum is $h+1=\Theta(\log_{b}n)$. In case 3, using [[Asymptotic Value of Partial Sum of Geometric Series]], the sum is $\Theta\left( \left( \frac{a}{b^{d}} \right)^{h} \right)$; since $\left( \frac{a}{b^{d}} \right)^{\log_{b}n}=\frac{a^{\log_{b}n}}{b^{d\log_{b}n}}=\frac{n^{\log_{b}a}}{n^{d}}$, this gives the stated bound $\Theta(n^{\log_{b}a})$.

# Applications

Generalisation: See https://www.cs.cornell.edu/courses/cs3110/2012sp/lectures/lec20-master/mm-proof.pdf.

Consider a problem that can be solved by the following [[Divide & Conquer Algorithm]]:

```pseudo
\begin{algorithm}
\begin{algorithmic}
\Procedure{p}{input $x$ of size $n$}
  \If{$n$ < some constant $k$}
    \State Solve $x$ directly without recursion
  \Else
    \State Create $a$ subproblems of $x$, each having size $n/b$
    \State Call procedure $p$ recursively on each subproblem
    \State Combine the results from the subproblems
  \EndIf
\EndProcedure
\end{algorithmic}
\end{algorithm}
```

The runtime of the algorithm, denoted $T(n)$, can be expressed as the [[Recurrence Relation]]: $T(n)=aT\left( \left\lceil \frac{n}{b} \right\rceil \right)+\Theta(n^{d})$, where $\Theta(n^{d})$ denotes the asymptotic cost of creating the $a$ subproblems and combining their results for an input of size $n$.

Consequences: see [[Merge Sort]]; see [[Binary Search]] (worked examples below).
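
As a quick check of the three cases, here are a few standard recurrences worked through the theorem. The merge sort and binary search bounds should agree with the linked notes; the other two recurrences are generic illustrations chosen here, not taken from those notes.

$$
\begin{aligned}
&\text{Merge sort:} && T(n)=2T\!\left(\tfrac{n}{2}\right)+\Theta(n), && \tfrac{a}{b^{d}}=\tfrac{2}{2}=1 && \Rightarrow\ T(n)=\Theta(n\log n) && \text{(case 2)}\\
&\text{Binary search:} && T(n)=T\!\left(\tfrac{n}{2}\right)+\Theta(1), && \tfrac{a}{b^{d}}=\tfrac{1}{1}=1 && \Rightarrow\ T(n)=\Theta(\log n) && \text{(case 2)}\\
&\text{Quadratic combine:} && T(n)=2T\!\left(\tfrac{n}{2}\right)+\Theta(n^{2}), && \tfrac{a}{b^{d}}=\tfrac{2}{4}<1 && \Rightarrow\ T(n)=\Theta(n^{2}) && \text{(case 1)}\\
&\text{Karatsuba:} && T(n)=3T\!\left(\tfrac{n}{2}\right)+\Theta(n), && \tfrac{a}{b^{d}}=\tfrac{3}{2}>1 && \Rightarrow\ T(n)=\Theta(n^{\log_{2}3}) && \text{(case 3)}
\end{aligned}
$$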
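
As a concrete instance of the divide-and-conquer template above, here is one possible merge sort in Python: $a=2$ subproblems of size $n/2$ and a $\Theta(n)$ combine step, so $T(n)=2T(n/2)+\Theta(n)$ and case 2 gives $\Theta(n\log n)$. This is a sketch for illustration only and may differ in detail from the version in [[Merge Sort]].

```python
def merge_sort(xs: list) -> list:
    """Divide-and-conquer template with a = 2, b = 2, Theta(n) combine,
    so T(n) = 2T(n/2) + Theta(n) and the master theorem gives Theta(n log n)."""
    if len(xs) <= 1:                 # n below a constant: solve directly
        return xs
    mid = len(xs) // 2
    left = merge_sort(xs[:mid])      # recurse on the a = 2 subproblems of size n/2
    right = merge_sort(xs[mid:])
    return _merge(left, right)       # Theta(n) combine step

def _merge(left: list, right: list) -> list:
    """Merge two sorted lists in linear time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out
```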
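
The case analysis itself is mechanical, so a small helper can report which case a given $(a,b,d)$ falls into. This is a minimal sketch, not part of the theorem or the linked notes: the function name `master_theorem` and its string output format are made up here, and the near-equality test on the ratio is only for illustration.

```python
from math import isclose, log

def master_theorem(a: int, b: float, d: float) -> str:
    """Classify T(n) = a*T(n/b) + Theta(n^d) by the ratio a / b^d and
    return the resulting Theta-bound as a string (illustrative sketch)."""
    if a < 1 or b <= 1:
        raise ValueError("the theorem requires a >= 1 and b > 1")
    ratio = a / b ** d
    if isclose(ratio, 1.0):              # case 2: every level costs about the same
        return f"Theta(n^{d} * log n)"
    if ratio < 1:                        # case 1: the root level dominates
        return f"Theta(n^{d})"
    return f"Theta(n^{log(a, b):.3f})"   # case 3: the leaves dominate, n^(log_b a)

# Checking the worked examples above:
print(master_theorem(2, 2, 1))  # merge sort     -> Theta(n^1 * log n)
print(master_theorem(1, 2, 0))  # binary search  -> Theta(n^0 * log n) = Theta(log n)
print(master_theorem(3, 2, 1))  # Karatsuba      -> Theta(n^1.585)
```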