Complexity Analysis
Building the heap
As mentioned in the previous lesson, the time complexity of building a heap is $O(N)$. This is not obvious right away so let’s analyse the algorithm.
The entire build of the heap can be expressed as $N$ heapify calls. Each heapify call takes $O(h)$ time, where $h$ is the height of the tree.
When this lesson says heapify, it means the top-to-bottom heapify (sift-down); the same argument can be made for the bottom-to-top heapify (sift-up) as well.
Each heapify call admits a tighter upper bound than $h$. For example, for a leaf node, heapify is just $1$ operation, since there is no node below to propagate to.
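For reference, here is a minimal sketch of bottom-up heap building with top-to-bottom heapify (sift-down) for a max-heap stored in a vector. The names `siftDown` and `buildHeap` are illustrative, not from the lesson:

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>
#include <cassert>

// Sift-down (top-to-bottom heapify): push a[i] down until the
// max-heap property holds in its subtree of size n.
void siftDown(std::vector<int>& a, std::size_t i, std::size_t n) {
    while (true) {
        std::size_t largest = i;
        std::size_t l = 2 * i + 1, r = 2 * i + 2;  // children of i
        if (l < n && a[l] > a[largest]) largest = l;
        if (r < n && a[r] > a[largest]) largest = r;
        if (largest == i) break;          // heap property restored
        std::swap(a[i], a[largest]);
        i = largest;                      // continue one level down
    }
}

// Bottom-up build: leaves (indices >= n/2) are already heaps,
// so sift-down only the internal nodes, last to first.
void buildHeap(std::vector<int>& a) {
    std::size_t n = a.size();
    for (std::size_t i = n / 2; i-- > 0; )
        siftDown(a, i, n);
}
```

Note how each `siftDown` call does work proportional to the height of the node it starts from, which is exactly the quantity the analysis below sums up.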
Number of operations
Step by step:
 The $1st$ element (the root): $h$ levels below it, so $h$ operations.
 The next two elements ($2nd$ to $3rd$): $h-1$ operations each.
 The next four elements ($4th$ to $7th$): $h-2$ operations each.
 And so on…
 Second-last level: $2$ operations each.
 Leaf nodes (last level): $1$ operation each.
For the sake of simplicity, let’s assume $N$ is a power of $2$ so that we work with a complete binary heap.
Also, recall that the number of nodes at any level can be expressed in terms of $N$: the count halves with every level going up, since two child nodes connect to one parent node.

Nodes in the last level (the leaves) = $\frac{N}{2}$. Let’s call this level $0$.

Nodes in the second-last level, i.e. level $1$, = $\frac{N}{4}$.

Nodes in level $2$ = $\frac{N}{8}$.

And so on… until the last level, level $logN$, which has only $1$ node: the root.
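These level counts can be sanity-checked with a short sketch (`sumLevelCounts` is a hypothetical helper, not from the lesson). For a power-of-two $N$, the counts $\frac{N}{2} + \frac{N}{4} + \dots + 1$ sum to $N-1$, i.e. essentially all $N$ nodes under the lesson's simplifying assumption:

```cpp
#include <cstddef>
#include <cassert>

// Sums the node counts per level (N/2 leaves, then N/4, ..., then
// the single root) for a power-of-two N, as listed above.
std::size_t sumLevelCounts(std::size_t N) {
    std::size_t total = 0;
    for (std::size_t nodes = N / 2; nodes >= 1; nodes /= 2)
        total += nodes;   // nodes at this level
    return total;         // N/2 + N/4 + ... + 1 = N - 1
}
```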
Compute time per level
List the number of operations required per level of the tree, rather than per node, because the operation count is the same for all nodes at the same level.

Level $0$: $1$ operation for $\frac{N}{2}$ nodes.

Level $1$: $2$ operations for $\frac{N}{4}$ nodes.

Level $2$: $3$ operations for $\frac{N}{8}$ nodes.

…

Level $logN$: $logN$ operations for $1$ node.
Summing over all levels:
$T(N) = [1 * \frac{N}{2}] + [2 * \frac{N}{4}] + [3 * \frac{N}{8}] + ... + [(logN-1)*2] + [logN*1]$
$T(N) =\sum_{h=1}^{logN} h *\frac{N}{2^{h}}$
$T(N) =N*\sum_{h=1}^{logN} h *\frac{1}{2^{h}}$
We can extend the summation to $\infty$ to get the upper bound.
$T(N)=N*\sum_{h=1}^{\infty} h *\frac{1}{2^{h}}$
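As a quick numerical sanity check before the algebra below, the partial sums of this series can be computed directly and seen to approach $2$ (`partialSum` is a hypothetical helper, not from the lesson):

```cpp
#include <cmath>
#include <cassert>

// Evaluates the partial sum of sum_{h=1}^{k} h / 2^h, which
// converges to 2 as k grows.
double partialSum(int k) {
    double s = 0.0;
    for (int h = 1; h <= k; ++h)
        s += h / std::pow(2.0, h);
    return s;
}
```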
This summation can be evaluated using the sum of an infinite geometric progression (GP).
The sum of an infinite GP with $x<1$ is:
$\sum_{n=0}^{\infty}x^{n} = \frac{1}{1-x}$
Differentiate both sides
$\sum_{n=0}^{\infty}nx^{n-1} = \frac{1}{(1-x)^2}$
Multiply by $x$
$\sum_{n=0}^{\infty}nx^{n} = \frac{x}{(1-x)^2}$
Putting $x = \frac{1}{2}$, we get exactly the summation we need.
$\sum_{n=0}^{\infty}\frac{n}{2^n} = \frac{\frac{1}{2}}{(1-\frac{1}{2})^2}=\frac{\frac{1}{2}}{\frac{1}{4}}=2$
Substituting this result back into the equation above:
$T(N)=N*2=O(N)$
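The $T(N) \le 2N$ bound can also be observed empirically. The sketch below (hypothetical name `buildHeapCountingSteps`, assuming a max-heap built bottom-up with sift-down) counts constant-time heapify steps, one per level visited, and the total stays below $2N$:

```cpp
#include <vector>
#include <algorithm>
#include <cstddef>
#include <cassert>

// Builds a max-heap bottom-up and counts constant-time heapify
// steps (one per level visited by a sift-down). The derivation
// above says this count is at most 2N.
std::size_t buildHeapCountingSteps(std::vector<int>& a) {
    std::size_t n = a.size(), steps = 0;
    for (std::size_t i = n / 2; i-- > 0; ) {
        std::size_t cur = i;
        while (true) {
            ++steps;                      // one O(1) heapify step
            std::size_t largest = cur;
            std::size_t l = 2 * cur + 1, r = 2 * cur + 2;
            if (l < n && a[l] > a[largest]) largest = l;
            if (r < n && a[r] > a[largest]) largest = r;
            if (largest == cur) break;    // subtree is a heap
            std::swap(a[cur], a[largest]);
            cur = largest;                // descend one level
        }
    }
    return steps;
}
```

Running this on an ascending array (a worst case for building a max-heap) keeps the step count under $2N$, matching the analysis.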
In the next lesson, we’ll see how to use the STL priority queue for heap operations.