# The Master Theorem

Learn the master theorem for divide-and-conquer algorithms.

## Divide-and-conquer algorithm

A typical divide-and-conquer algorithm solves a problem of size $n$ as follows:

**Divide:** Break the problem into $a$ subproblems of size at most $n/b$ and solve them recursively. We assume that $a ≥ 1$ and $b > 1$.

**Conquer:** Combine the answers found by the recursive calls into an answer for the initial problem.

Therefore, the algorithm makes $a$ recursive calls to problems of size $n/b$.
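
As one common instance of this template, merge sort makes $a = 2$ recursive calls on halves of the input ($b = 2$). A minimal Python sketch (assuming a list of mutually comparable elements):

```python
def merge_sort(arr):
    # Base case: a problem of size n = 1 (or 0) is solved directly.
    if len(arr) <= 1:
        return arr
    # Divide: break into a = 2 subproblems of size about n/b = n/2.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Conquer: merge the two sorted halves; this takes O(n) time,
    # so the work outside the recursive calls is O(n^d) with d = 1.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]
```

Here the divide step is trivial (slicing the list), and essentially all of the $O(n)$ extra work happens in the conquer step.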

**Note:** $b$ does not necessarily divide $n$, so instead of $n/b$, we should usually write $\lceil n/b \rceil$. Still, we write $n/b$; it simplifies the notation and does not break the analysis.

Assume that everything that is done outside of recursive calls takes time $O(n^d)$ where $d$ is a non-negative constant. This includes preparing the recursive calls and combining the results.

Since this is a recursive algorithm, we also need to specify conditions under which a problem is solved directly rather than recursively (with no such base case, a recursion would never stop). This usually happens when $n$ becomes small. For simplicity, we assume that it is $n = 1$ and that the running time of the algorithm is equal to $1$ in this case.

Therefore, by denoting the running time of the algorithm on problems of size $n$ by $T(n)$, we get the following *recurrence relation* on $T(n)$:

$T(n) = \begin{cases} 1 & \text{if }~~ n = 1, \\ aT(\frac{n}{b})+O(n^d) & \text{if }~~ n > 1. \end{cases}$
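
One way to get a feel for the recurrence is to evaluate it directly. The sketch below takes the $O(n^d)$ term to be exactly $n^d$ (an assumption made purely for illustration) and rounds $n/b$ up, as in the note above:

```python
def T(n, a, b, d):
    """Evaluate T(n) = a * T(ceil(n/b)) + n^d with T(1) = 1."""
    # Base case: a problem of size 1 takes time 1.
    if n <= 1:
        return 1
    # Recursive case: a subproblems of size ceil(n/b),
    # plus n^d work outside the recursive calls.
    return a * T(-(-n // b), a, b, d) + n ** d
```

For example, with $a = 2$, $b = 2$, $d = 1$ (the merge sort parameters), `T(1024, 2, 2, 1)` evaluates to $1024 \cdot 10 + 1024 = 11264$, matching the closed form $n \log_2 n + n$ for powers of two.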

Below, we show that we can find out the growth rate of $T(n)$ by looking at the parameters $a$, $b$, and $d$. Before doing this, we illustrate how this recurrence relation captures some well-known algorithms.
