A recurrence is a way of expressing the running time of a recursive algorithm. A general form of a recurrence, which is used in the [[master method]], is given by
$T(n) =
\begin{cases}
\Theta(1) & \text{if } n \le 1 \\
a T \Big( \frac{n}{b} \Big) + f_{split}(n) + f_{combine}(n) & \text{if } n > 1
\end{cases}
$
where $a$ is the number of recursive calls made at each level of recursion, $n/b$ is the size of each subproblem, and $f_{split}$ and $f_{combine}$ are the costs of splitting the problem and combining the subproblem results, respectively.
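The recurrence above is not tied to any particular algorithm; as an assumed concrete illustration, merge sort fits this form with $a = 2$ recursive calls, subproblems of size $n/2$, a split step (the slicing) costing $\Theta(n)$, and a combine step (the merge) also costing $\Theta(n)$. A minimal sketch:

```python
def merge_sort(items):
    # Base case: a list of length <= 1 is already sorted; cost is Theta(1).
    if len(items) <= 1:
        return items

    # Split: two subproblems of size n/2 (a = 2, b = 2);
    # the slicing here is f_split(n) = Theta(n).
    mid = len(items) // 2
    left = merge_sort(items[:mid])
    right = merge_sort(items[mid:])

    # Combine: merge the two sorted halves; f_combine(n) = Theta(n).
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged


print(merge_sort([5, 2, 4, 7, 1, 3, 2, 6]))  # [1, 2, 2, 3, 4, 5, 6, 7]
```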
To solve this recurrence, we can use the **expansion method**. For concreteness, take $a = b = 2$ and assume the split and combine steps each run in linear time, so that their combined cost at each level simplifies to $C_1 n$.
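Under those assumptions, repeatedly substituting the recurrence into itself unrolls as follows (the intermediate steps are spelled out here for illustration):
$\begin{aligned}
T(n) &= 2 T\Big(\frac{n}{2}\Big) + C_1 n \\
&= 2\Big[ 2 T\Big(\frac{n}{4}\Big) + C_1 \frac{n}{2} \Big] + C_1 n = 4 T\Big(\frac{n}{4}\Big) + 2 C_1 n \\
&= 4\Big[ 2 T\Big(\frac{n}{8}\Big) + C_1 \frac{n}{4} \Big] + 2 C_1 n = 8 T\Big(\frac{n}{8}\Big) + 3 C_1 n \\
&\;\;\vdots
\end{aligned}$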
The general form after expanding $k$ times is
$T(n) = 2^k T\Big( \frac{n}{2^k} \Big) + k C_1 n$
We expand until the argument reaches the base case, where $T(1) = \Theta(1)$ as defined above; writing that constant as $C_0$, the first term becomes $2^k T(1) = 2^k C_0$.
How many times will we recurse to get to the base case? In other words, what value should $k$ take? Since the problem size is halved at each level, the base case is reached when $\frac{n}{2^k} = 1$, that is, after $k = \lg n$ levels. Substituting $\lg n$ for $k$ solves the recurrence:
$T(n) = 2^{\lg n} C_0 + C_1 n \lg n = C_0 n + C_1 n \lg n$ (using $2^{\lg n} = n$),
which can be expressed as $\Theta(n \lg n)$.
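As a quick numerical sanity check (not part of the original derivation), we can evaluate the recurrence directly and compare it with the closed form; the constants $C_0$ and $C_1$ below are arbitrary illustrative values, and the two expressions agree exactly whenever $n$ is a power of two.

```python
from math import log2

# Arbitrary illustrative constants; any positive values work.
C0, C1 = 3, 5

def T(n):
    """Evaluate the recurrence T(n) = 2*T(n/2) + C1*n directly, with T(1) = C0."""
    if n <= 1:
        return C0
    return 2 * T(n // 2) + C1 * n

def closed_form(n):
    """The solved form C0*n + C1*n*lg(n)."""
    return C0 * n + C1 * n * log2(n)

# For powers of two the recurrence and the closed form agree exactly.
for k in range(11):
    n = 2 ** k
    assert T(n) == closed_form(n), (n, T(n), closed_form(n))
print("recurrence matches C0*n + C1*n*lg n for n = 1, 2, 4, ..., 1024")
```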