The algorithm above is tested on five test cases: 0, 5, 7, 10, and 14, and it runs fine on all of them. Let’s test it on a larger value, e.g., 60, by uncommenting that test case in the code. We observe that the algorithm keeps running and eventually times out: the number of recursive calls grows exponentially with the input, so the computation cannot finish in a reasonable time.
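The recursive algorithm under discussion is presumably along these lines (the function name `fib` is an assumption; the original code is not shown here):

```python
def fib(n):
    # Base cases: fib(0) = 0, fib(1) = 1
    if n < 2:
        return n
    # Every other call spawns two further recursive calls,
    # so the total number of calls grows exponentially with n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # → 55
print(fib(14))  # → 377
```

For small inputs like these, the result comes back instantly; for `fib(60)`, the same code makes on the order of 2^60 calls.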
Time complexity
This algorithm has exponential time complexity, specifically O(2^n). This is because each call (for n ≥ 2) makes two further recursive calls, so the number of calls roughly doubles at every level of the recursion tree.
Space complexity
As the maximum depth of the recursive call tree is n, and each call stores its data on the stack, the space complexity of this approach is O(n).
Dynamic programming to the rescue
Dynamic programming rescues crashing recursive algorithms by optimizing their performance and managing memory use. It achieves this through the following techniques:
Top-down technique such as memoization
Bottom-up technique such as tabulation
DP stores the results of subproblems, thus avoiding the repetitive computation that makes pure recursion so expensive. By ensuring that each subproblem is solved only once and reusing these precomputed answers, DP reduces the number of recursive calls and the depth of the call stack. This not only avoids timeouts but also significantly speeds up execution by transforming exponential time complexity into a more manageable polynomial (for Fibonacci, linear) complexity. For a deeper understanding of dynamic programming, check out this blog.
The applicability of dynamic programming depends upon the following two properties:
Optimal substructure: This property means that an optimal solution to a problem can be effectively constructed from optimal solutions of its subproblems. In dynamic programming, this property is crucial because it ensures that the problem can be broken down into smaller, manageable parts, and by solving each part optimally, the solution to the overall problem can be derived.
Overlapping subproblems: This property is present when the recursive algorithm revisits the same subproblems repeatedly. Dynamic programming exploits this by storing the results of these subproblems in a table. Once a subproblem has been solved, its result is stored and reused (thus avoiding the work of re-solving it), which drastically reduces the computational cost and eliminates the redundant calculations that make naive recursion so slow.
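The overlapping-subproblems property is easy to observe by instrumenting a naive recursive Fibonacci (a sketch with an assumed `fib` name, not the lesson’s actual code) to count how often each subproblem is solved:

```python
from collections import Counter

calls = Counter()

def fib(n):
    calls[n] += 1  # record every time the subproblem fib(n) is solved
    if n < 2:
        return n
    return fib(n - 1) + fib(n - 2)

fib(10)
print(calls[2])  # → 34: fib(2) alone is re-solved 34 times for fib(10)
```

With memoization or tabulation, each of these subproblems would be solved exactly once.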
Let’s see how DP rescues the crashing recursive algorithms using memoization and tabulation.
Top-down technique: Memoization
The top-down approach, commonly known as the memoization technique, enhances the recursive solution. It avoids repeatedly recomputing the same subproblems by storing their results in an array.
The basic idea is to check whether the array already contains the result of the Fibonacci number we are trying to find before calculating it. This way, we reduce the number of recursive calls by reusing the results already stored in the array.
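A minimal sketch of this idea, using a dictionary as the cache rather than the array described above (names here are illustrative, not the lesson’s code):

```python
def fib_memo(n, cache=None):
    # cache maps k -> fib(k); created once per top-level call
    if cache is None:
        cache = {}
    if n in cache:
        return cache[n]  # reuse the stored result instead of recursing again
    if n < 2:
        return n
    cache[n] = fib_memo(n - 1, cache) + fib_memo(n - 2, cache)
    return cache[n]

print(fib_memo(60))  # → 1548008755920, computed with only O(n) calls
```

Because each subproblem is solved once, `fib_memo(60)` finishes instantly where the plain recursion timed out.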
The solution of the Fibonacci number problem using memoization is given below: