# Recursion Runtime Patterns

## Case 1: $O(N\log{N})$ and $O(N^2)$

### Quicksort

The complexity depends on how evenly each partition splits the data.

- Time:
  - Best: $T(N) = 2T(N/2) + O(N) = O(N\log{N})$
  - Worst: $T(N) = T(N - 1) + T(1) + O(N) = O(N^2)$
  - Non-worst (any constant-fraction split): $T(N) = T(N/10) + T(9N/10) + O(N) = O(N\log{N})$
- Space:
  - Best: $O(\log{N})$
  - Worst: $O(N)$
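The recurrences above can be read directly off a minimal sketch. This version uses the Lomuto partition scheme with the last element as pivot (one choice among many; the analysis holds for any partition scheme):

```python
def quicksort(arr, lo=0, hi=None):
    """In-place quicksort. The two recursive calls mirror the recurrence:
    a balanced split gives 2T(N/2), a degenerate split gives T(N-1) + T(1)."""
    if hi is None:
        hi = len(arr) - 1
    if lo >= hi:
        return
    pivot = arr[hi]            # Lomuto partition: last element as pivot
    i = lo
    for j in range(lo, hi):    # O(N) partitioning work at this level
        if arr[j] < pivot:
            arr[i], arr[j] = arr[j], arr[i]
            i += 1
    arr[i], arr[hi] = arr[hi], arr[i]
    quicksort(arr, lo, i - 1)  # left part
    quicksort(arr, i + 1, hi)  # right part
```

The space bounds come from the recursion depth: $O(\log N)$ when splits are balanced, $O(N)$ when every split peels off one element.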

### Mergesort

- Time:
  - Best: $T(N) = 2T(N/2) + O(N) = O(N\log{N})$
  - Worst: also $O(N\log{N})$, since the partition always halves the array.
- Space:
  - $O(\log{N} + N) = O(N)$: $O(\log{N})$ for the call stack plus an $O(N)$ auxiliary array for merging.
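A minimal sketch that makes the $O(N)$ auxiliary space visible (this version allocates new lists on each merge rather than reusing one buffer, which keeps the code short at the cost of extra constant-factor allocation):

```python
def mergesort(arr):
    """Always splits in half, so T(N) = 2T(N/2) + O(N) = O(N log N)."""
    if len(arr) <= 1:          # base case: T(1) = O(1)
        return arr
    mid = len(arr) // 2
    left = mergesort(arr[:mid])    # T(N/2)
    right = mergesort(arr[mid:])   # T(N/2)
    # Merge step: O(N) work into an O(N) auxiliary list
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```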

## Case 2: $O(\log{N})$ and $O(N)$

- Time: $T(N) = T(N/2) + O(1) = O(\log{N})$
- Space: $O(\log{N})$ for the call stack.
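Recursive binary search is a standard instance of this recurrence (my example; the section above does not name one):

```python
def binary_search(arr, target, lo=0, hi=None):
    """Each call discards half the range and does O(1) work:
    T(N) = T(N/2) + O(1) = O(log N)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1              # target not present
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)
```

An iterative version would bring the space down to $O(1)$; the $O(\log N)$ space above is purely the recursion depth.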

### Binary Tree Traversal

- Time:
  - Best: $T(N) = 2T(N/2) + O(1) = O(N)$ if the tree is perfect or bushy.
  - Worst: $T(N) = T(N - 1) + T(1) + O(1) = O(N)$ if the tree is spindly.
  - Note: the runtime is $O(N)$ either way, since every node is visited exactly once.
- Space:
  - Best: $O(\log{N})$ if the tree is perfect or bushy.
  - Worst: $O(N)$ if the tree is spindly.
- Variant:
  - If the visit step costs more than $O(1)$, say $O(N)$, the analysis resembles Quicksort's: the runtime depends on how evenly the tree splits, and $O(N)$ is the total work done across each level of the tree.
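A minimal in-order traversal sketch; the `Node` class is an assumed helper, not part of the notes above:

```python
class Node:
    def __init__(self, val, left=None, right=None):
        self.val, self.left, self.right = val, left, right

def inorder(node, visit):
    """Visits every node exactly once, so time is O(N) regardless of shape.
    Space is the recursion depth: O(log N) bushy, O(N) spindly."""
    if node is None:
        return
    inorder(node.left, visit)
    visit(node.val)            # O(1) visit; a costlier visit changes the analysis
    inorder(node.right, visit)
```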

## Case 3: $O(2^N)$ and $O(N!)$

### Combination

- $N$ is the total number of elements.
- Initially, call combination(0, 0).
- At each depth, combination() branches into up to $N$ recursive calls, each on a strictly smaller remainder.
- Time: $T(N) = T(N - 1) + T(N - 2) + \ldots + T(1) = O(2^N)$
- Space: $O(N)$ for the call stack.
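A subset-enumeration sketch of this pattern (the two-argument combination(0, 0) above is simplified here to a single start index; this is an assumed interpretation of the notes):

```python
def combinations(nums):
    """Enumerates all 2^N subsets: each recursive call handles a strictly
    smaller suffix, matching T(N) = T(N-1) + T(N-2) + ... + T(1) = O(2^N)."""
    result = []
    path = []

    def combination(start):
        result.append(path[:])             # record the current subset
        for i in range(start, len(nums)):  # up to N branches at this depth
            path.append(nums[i])
            combination(i + 1)             # recurse on the remaining suffix
            path.pop()                     # backtrack; stack depth stays O(N)

    combination(0)
    return result
```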

### Permutation

- Time: $T(N) = N \cdot T(N - 1) = O(N!)$
- Space: $O(N)$ for the call stack.
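A backtracking sketch of the same recurrence, using a `used` array (one common implementation choice; swapping in place is another):

```python
def permutations(nums):
    """N choices at depth 0, N-1 at depth 1, and so on:
    T(N) = N * T(N-1) = O(N!). Stack depth (and path/used) is O(N)."""
    result = []
    path = []
    used = [False] * len(nums)

    def permute():
        if len(path) == len(nums):
            result.append(path[:])
            return
        for i in range(len(nums)):
            if used[i]:
                continue
            used[i] = True
            path.append(nums[i])
            permute()
            path.pop()          # backtrack
            used[i] = False

    permute()
    return result
```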

## Case 4: DP

### Fibonacci

- Time: $T(N) = T(N - 1) + T(N - 2) + O(1) = O(1.618^N)$, which is loosely bounded by $O(2^N)$. (The tight base is the golden ratio $\phi \approx 1.618$.)
- Space: $O(N)$ for the call stack depth.
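The naive recursion, for reference:

```python
def fib(n):
    """Naive recursion: T(N) = T(N-1) + T(N-2) + O(1).
    Subproblems are recomputed many times, giving exponential time;
    the call stack only goes N deep, so space is O(N)."""
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```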

### Fibonacci with DP

- Time:
  - Number of subproblems ($N$) $\times$ exclusive time to solve each subproblem ($O(1)$)
  - $T(N) = O(N)$
- Space:
  - Max depth of call stack $\times$ space used by each subproblem
  - $O(N)$
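A memoized sketch using `functools.lru_cache` (one way to apply DP here; an explicit table or bottom-up loop works equally well):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized: only N distinct subproblems, O(1) exclusive work each,
    so time is O(N); the cache and call stack each take O(N) space."""
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```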

More to come as you work through DP problems.

Junhao Wang
a software engineering cat