## Solution

Let’s try the most common form of the divide-and-conquer strategy. Split the input sequence into two halves, $LeftHalf$ and $RightHalf$, and make a recursive call for each of them. This computes all inversions that lie entirely within one half. But it does not reveal the number of split inversions, i.e., the number of pairs $(a_i, a_j)$ such that $a_i$ lies in the left half, $a_j$ lies in the right half, and $a_i > a_j$.
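To make the definition concrete, here is a brute-force count of split inversions for a small array (the helper name `count_split_inversions_naive` is ours, not from the text):

```python
def count_split_inversions_naive(left_half, right_half):
    """Count pairs (a, b) with a in left_half, b in right_half, and a > b."""
    return sum(1 for a in left_half for b in right_half if a > b)

# Example: split [4, 1, 3, 2] into [4, 1] and [3, 2].
# The split inversions are (4, 3) and (4, 2), so the count is 2.
```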

Stop and think: Consider an element $x$ in $LeftHalf$. What is the number of split inversions that $x$ belongs to?

Given an array $List$ and an integer $x$, let $List_x$ denote the number of elements in $List$ that are smaller than $x$. The answer to the question above is $RightHalf_x$: the number of elements in $RightHalf$ that are smaller than $x$. Our goal is therefore to compute $List_x$ rapidly.

This way, we face the following problem: given a sequence of integers $List$ and an integer $x$, find the number of elements in $List$ that are smaller than $x$. For an unordered array this takes $O(|List|)$ time, since each element has to be checked, but for a sorted array it takes only $O(\log |List|)$ time using binary search.

Exercise break: Show how to implement a method $CountSmaller(List, x)$ for counting the number of elements of $List$ that are smaller than $x$ in time $O(\log_2 |List|)$.
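One way to implement $CountSmaller$ on a sorted list is with binary search; a sketch using Python's standard `bisect_left`, whose return value (the leftmost insertion point of $x$) is exactly the number of elements strictly smaller than $x$:

```python
from bisect import bisect_left

def count_smaller(sorted_list, x):
    """Number of elements of sorted_list strictly smaller than x.
    Runs in O(log |sorted_list|) time via binary search; assumes
    sorted_list is sorted in non-decreasing order."""
    return bisect_left(sorted_list, x)

# Example: count_smaller([1, 3, 5, 7], 5) returns 2 (the elements 1 and 3).
```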

This way, we arrive at the following divide-and-conquer algorithm:

```
CountInversions(List):
    if |List| ≤ 1:
        return 0
    inversions ← 0
    LeftHalf ← left half of List
    RightHalf ← right half of List
    inversions ← inversions + CountInversions(LeftHalf)
    inversions ← inversions + CountInversions(RightHalf)
    sort RightHalf
    for x in LeftHalf:
        inversions ← inversions + CountSmaller(RightHalf, x)
    return inversions
```
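The pseudocode above can be sketched in Python as follows, using `bisect_left` as the $CountSmaller$ subroutine (function names are ours):

```python
from bisect import bisect_left

def count_inversions(lst):
    """Divide-and-conquer inversion count following the pseudocode above."""
    if len(lst) <= 1:
        return 0
    mid = len(lst) // 2
    left_half, right_half = lst[:mid], lst[mid:]
    # Inversions lying entirely within one half.
    inversions = count_inversions(left_half) + count_inversions(right_half)
    # Split inversions: for each x in the left half, count the
    # elements of the (sorted) right half that are smaller than x.
    right_sorted = sorted(right_half)
    for x in left_half:
        inversions += bisect_left(right_sorted, x)
    return inversions

# Example: [4, 1, 3, 2] has 4 inversions: (4,1), (4,3), (4,2), (3,2).
```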

The running time $T(n)$ (where $n$ is the length of $List$) satisfies the recurrence relation

$T(n) \le 2T(n/2) + O(n \log n).$

The $O(n \log n)$ term accounts for two things: sorting $RightHalf$ and answering $n/2$ $CountSmaller$ queries. This recurrence cannot be plugged into the master theorem directly, as the term $O(n \log n)$ is not of the form $O(n^d)$ for a constant $d$. Still, we can analyze it in the same fashion: the recursion tree has $\log_2 n$ levels, the total size of all subproblems on every level is $n$, and the total time spent on every level is $O(n \log n)$. Therefore, the total running time is $O(n \log^2 n)$. Instead of formally proving this, we will improve the algorithm so that it runs in $O(n \log n)$ time.
