Recursive Algorithms
The time complexity of recursive algorithms is usually not as simple to determine as that of their iterative counterparts.
For that reason, we will first go through the theory behind it, and then practice on some examples.
1. Theoretical Approach
For simplicity, we won't consider indirect recursive calls, and we will only consider deterministic algorithms.
1.1 Main Result
Let $\mathcal{A}$ be a recursive algorithm accepting $x$ as input.
We will call $T(x)$ the execution cost of $\mathcal{A}$ on $x$.
We will suppose that, on the input $x$, $\mathcal{A}$ makes $k$ recursive calls with respective inputs $x_1, \dots, x_k$.
We will suppose that the iterative (non-recursive) portion of $\mathcal{A}$ has cost $f(x)$.
We have the following result:

$$T(x) = f(x) + \sum_{i=1}^{k} T(x_i)$$
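As a quick illustration (this factorial example is ours, not part of the original text): a recursive factorial makes a single recursive call on $n-1$ and does a constant amount of iterative work, so the result above unrolls to

$$T(n) = \Theta(1) + T(n-1), \qquad T(0) = \Theta(1) \;\Longrightarrow\; T(n) = \Theta(n)$$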
1.2 GCD Example
Consider the following algorithm with inputs $a \ge b \ge 0$:
```python
# Assuming a >= b
def gcd(a, b):
    if b == 0:
        return a
    return gcd(b, a % b)
```
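For instance (the sample values below are purely illustrative):

```python
>>> gcd(48, 18)   # remainders: 48 % 18 = 12, 18 % 12 = 6, 12 % 6 = 0
6
```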
We can establish the running time as:

$$T(a, b) = \Theta(1) + T(b, a \bmod b), \qquad T(a, 0) = \Theta(1)$$
So the time complexity of this algorithm is proportional to the number of recursive calls, i.e. to the number of iterations of the Euclidean algorithm.
Now we will prove that this number of iterations is $O(\log b)$. Let $r_0 = a$, $r_1 = b$ and $r_{k+1} = r_{k-1} \bmod r_k$ be the sequence of remainders computed by the algorithm; we first show that $r_{k+2} < \frac{r_k}{2}$:
- If $r_{k+1} \le \frac{r_k}{2}$, then $r_{k+2} < r_{k+1} \le \frac{r_k}{2}$ and the result is immediate.
- Else, we have $r_{k+1} > \frac{r_k}{2}$, so that $r_{k+2} = r_k \bmod r_{k+1} = r_k - r_{k+1}$ with $r_k - r_{k+1} < \frac{r_k}{2}$.

With $r_1 = b$, and by induction, we can prove that:

$$r_{2k+1} \le \frac{b}{2^k}$$

Now, we will estimate an index from which the remainder must be zero:

We have $r_{2k+1} \le \frac{b}{2^k} < 1$ as soon as $k > \log_2 b$; since the remainders are non-negative integers, this forces $r_{2k+1} = 0$. So the algorithm performs at most $2\log_2 b + O(1)$ recursive calls, and the complexity of gcd is $O(\log b)$.
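As a sanity check (this instrumented variant and the sample inputs are ours, not part of the original text), we can count the recursive calls and compare them against the $2\log_2 b$ bound:

```python
import math

def gcd_calls(a, b):
    """Return (gcd, number of recursive calls), assuming a >= b >= 0."""
    if b == 0:
        return a, 0
    g, calls = gcd_calls(b, a % b)
    return g, calls + 1

# A few illustrative inputs; the last pair is two consecutive Fibonacci numbers (worst case).
for a, b in [(48, 18), (10**9, 987654321), (832040, 514229)]:
    g, calls = gcd_calls(a, b)
    print(a, b, "gcd =", g, "calls =", calls, "bound ~", round(2 * math.log2(b), 1))
```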
2. Master Theorem
2.1 Importance
Usually, a recursive algorithm will divide a problem of size $n$ into $a$ subproblems of size $\frac{n}{b}$, with an additional iterative cost of $f(n)$.
The cost $T(n)$ of such an algorithm can be described using this recurrence formula:

$$T(n) = a\,T\!\left(\frac{n}{b}\right) + f(n)$$

The Master Theorem gives a sharp estimate for the asymptotic behaviour of $T(n)$ based on the values of $a$ and $b$ and the function $f$.
2.2 Table
Let $c^\star = \log_b a$ be the critical exponent.

| Description | Condition | Result |
|---|---|---|
| Work to split/recombine a problem is dwarfed by subproblems. | $f(n) = O(n^c)$ with $c < c^\star$ | $T(n) = \Theta(n^{c^\star})$ |
| Work to split/recombine a problem is comparable to subproblems. | $f(n) = \Theta(n^{c^\star} \log^k n)$ with $k \ge 0$ | $T(n) = \Theta(n^{c^\star} \log^{k+1} n)$ |
| Work to split/recombine a problem dominates subproblems. | $f(n) = \Omega(n^c)$ with $c > c^\star$ (and $a f(n/b) \le \kappa f(n)$ for some $\kappa < 1$ and large $n$) | $T(n) = \Theta(f(n))$ |
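To make the table concrete, here is a hedged sketch (binary search is our own illustrative example, not discussed in the original text). Recursive binary search makes $a = 1$ call on a half-sized subproblem ($b = 2$) with constant extra work, so $c^\star = \log_2 1 = 0$ and $f(n) = \Theta(1) = \Theta(n^{c^\star} \log^0 n)$: the second case gives $\Theta(\log n)$.

```python
def binary_search(A, x, lo=0, hi=None):
    """Return an index of x in the sorted list A, or -1 if absent.

    Recurrence: T(n) = T(n/2) + Theta(1) -> case 2 of the Master Theorem -> Theta(log n).
    """
    if hi is None:
        hi = len(A)
    if lo >= hi:                            # empty range: x is not present
        return -1
    mid = (lo + hi) // 2
    if A[mid] == x:
        return mid
    if A[mid] < x:                          # recurse on the right half
        return binary_search(A, x, mid + 1, hi)
    return binary_search(A, x, lo, mid)     # recurse on the left half
```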
2.3 Merge Sort Example
```python
def merge(A, B):
    """Merge two sorted lists A and B into a single sorted list."""
    C = []
    i = 0
    j = 0
    while i < len(A) or j < len(B):
        if i == len(A):            # A is exhausted: take from B
            C.append(B[j])
            j += 1
        elif j == len(B):          # B is exhausted: take from A
            C.append(A[i])
            i += 1
        elif A[i] < B[j]:          # take the smaller head element
            C.append(A[i])
            i += 1
        else:
            C.append(B[j])
            j += 1
    return C

def merge_sort(A):
    """Sort A by recursively sorting both halves and merging them."""
    n = len(A)
    if n < 2:                      # an array of size 0 or 1 is already sorted
        return A
    B, C = A[:n//2], A[n//2:]      # split into two halves
    B = merge_sort(B)
    C = merge_sort(C)
    return merge(B, C)
```
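A quick usage example (the input list is just an illustration):

```python
>>> merge_sort([5, 2, 9, 1, 5, 6])
[1, 2, 5, 5, 6, 9]
```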
2.3.1 merge Complexity

Each iteration of the while loop appends exactly one element to C, so merging two arrays of sizes $p$ and $q$ performs $p + q$ iterations. The complexity of merge is therefore $\Theta(p + q)$.
2.3.2 merge_sort Complexity
Let $T(n)$ be the cost of a merge sort of an array of size $n$.
The merge step operates on two arrays of sizes $\lfloor n/2 \rfloor$ and $\lceil n/2 \rceil$. So it has a complexity of:

$$\Theta\!\left(\left\lfloor \tfrac{n}{2} \right\rfloor + \left\lceil \tfrac{n}{2} \right\rceil\right) = \Theta(n)$$

Now, we can establish this recurrence relation for $T$:

$$T(n) = 2\,T\!\left(\frac{n}{2}\right) + \Theta(n)$$

We have $a = b = 2$, so $c^\star = \log_2 2 = 1$, and $f(n) = \Theta(n) = \Theta(n^{c^\star} \log^0 n)$. So we are in the second case of the Master Theorem.
So, the complexity of merge sort is:

$$T(n) = \Theta(n \log n)$$
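As a closing sanity check (this counting experiment is ours, not from the original text), we can count how many elements the merge steps append in total and compare against $n \log_2 n$:

```python
import math

def merge_sort_count(A):
    """Return (sorted list, total elements appended across all merges).

    A hypothetical instrumented variant of merge_sort, used only for this experiment;
    it reuses the merge function defined above.
    """
    n = len(A)
    if n < 2:
        return A, 0
    B, cb = merge_sort_count(A[:n//2])
    C, cc = merge_sort_count(A[n//2:])
    merged = merge(B, C)
    return merged, cb + cc + len(merged)

for n in [2**10, 2**14]:
    _, ops = merge_sort_count(list(range(n, 0, -1)))   # reverse-sorted input
    print(n, "merge work =", ops, "n*log2(n) =", int(n * math.log2(n)))
```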