Complexity Theory
1. Introduction
Complexity Theory is a sub-field of computer science in which we estimate the cost of algorithms mathematically, independently of the underlying hardware.
2. Model of computation
2.1 Definition
To calculate the cost of an algorithm, we need a model of computation $M$.
A model of computation specifies which instructions are allowed in our algorithms and what each of them costs.
By default we will use the Random Access Machine (RAM) model, in which we make the following assumptions:
- Each memory access (read/write) takes constant time.
- Each arithmetic instruction takes at most constant time on standard types.
- Each bit-wise instruction takes at most constant time on standard types.
- Each Boolean test takes at most constant time on standard types.
Here, by standard types we mean integers with at most $m_{\text{Integer}} = 64$ bits, and floating-point numbers with at most $m_{\text{Float}} = 128$ bits.
2.2 Algorithm Cost
Using the model $M$, any algorithm can be decomposed into a sequence of the elementary instructions described above.
To simplify the analysis, we set the cost of each elementary instruction to $1$.
2.3 Example
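As a minimal sketch, consider summing an array of $n$ numbers and charging one unit per elementary RAM instruction. The instrumentation below and the exact per-instruction charges are our own illustrative choices, not part of the model itself:

```python
import random

def array_sum_with_cost(a):
    """Sum an array while charging 1 unit per elementary RAM instruction."""
    cost = 0
    total = 0       # one memory write
    cost += 1
    for x in a:     # per iteration we charge:
        cost += 1   #   one Boolean test (loop condition)
        cost += 1   #   one memory read (the element)
        cost += 2   #   one addition + one memory write
        total += x
    cost += 1       # one final, failing loop test
    return total, cost

n = 100
a = [random.randint(0, 9) for _ in range(n)]
total, cost = array_sum_with_cost(a)
# With these charges, cost = 4n + 2: affine in n, so the running time grows linearly.
```

Changing the per-instruction charges only changes the constants, not the linear growth, which is exactly why the simplification in 2.2 is harmless.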
3. Big O-Notation
3.1 Mathematical Definition
This is not a fully general definition, but for all practical purposes it works well in a competitive programming setting.
Let $f, g \in \mathcal{F}(\mathbb{R}_+, \mathbb{R}_+)$ be two real-valued functions with positive values.
By definition, we say that $f$ is asymptotically bounded from above by $g$ if:
$$\exists K \in \mathbb{R}_+^*,\ R \in \mathbb{R}_+ \ /\ \forall x > R,\ f(x) \le K g(x)$$
If $g$ is eventually strictly positive, this is equivalent to:
$$\exists K \in \mathbb{R}_+^*,\ R \in \mathbb{R}_+ \ /\ \forall x > R,\ \frac{f(x)}{g(x)} \le K$$
We say that $f$ is eventually bounded from above by some constant multiple of $g$, or roughly speaking $f$ is a big O of $g$, and we write:
$$f = \mathcal{O}(g)$$
3.2 Properties
| Name | Equation | Utility |
|---|---|---|
| Reflexive | $f = \mathcal{O}(f)$ | Generally not useful |
| Transitive | $f = \mathcal{O}(g)$ and $g = \mathcal{O}(h) \implies f = \mathcal{O}(h)$ | Get an easier upper bound |
| Sum | $f_1 + f_2 = \mathcal{O}(\max(f_1, f_2))$ | Get an easier upper bound |
| Scaling | $f = \mathcal{O}(g) \implies f = \mathcal{O}(k \cdot g)\ \forall k \in \mathbb{R}_+^*$ | Remove constants |
| Product | $f_1 = \mathcal{O}(g_1)$ and $f_2 = \mathcal{O}(g_2) \implies f_1 f_2 = \mathcal{O}(g_1 g_2)$ | Simplify products |
| Cumulative Sum | $f$ increasing $\implies \sum_{i=1}^{n} f(i) = \mathcal{O}\left(\int_{1}^{n+1} f(x)\,dx\right)$ | Bound a sum by an integral |
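To build intuition for the definition, one can check a concrete pair of witnesses $K$ and $R$ numerically. This is only a sanity check on sampled points, not a proof; the functions and witnesses below are our own example:

```python
def is_big_o_witness(f, g, K, R, xs):
    """Check f(x) <= K * g(x) on every sampled point x > R."""
    return all(f(x) <= K * g(x) for x in xs if x > R)

f = lambda x: 5 * x**2 + 3 * x + 7   # claim: f = O(x^2)
g = lambda x: x**2

# K = 6 and R = 10 are valid witnesses: 5x^2 + 3x + 7 <= 6x^2 once x^2 >= 3x + 7.
ok = is_big_o_witness(f, g, K=6, R=10, xs=range(1, 10_000))
```

Note that many witness pairs work; by the Scaling property, only the existence of some pair matters.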
4. Big Ω-Notation
4.1 Mathematical Definition
This is not a fully general definition, but for all practical purposes it works well in a competitive programming setting.
Let $f, g \in \mathcal{F}(\mathbb{R}_+, \mathbb{R}_+)$ be two real-valued functions with positive values.
By definition, we say that $f$ is asymptotically bounded from below by $g$ if:
$$\exists K \in \mathbb{R}_+^*,\ R \in \mathbb{R}_+ \ /\ \forall x > R,\ f(x) \ge K g(x)$$
If $g$ is eventually strictly positive, this is equivalent to:
$$\exists K \in \mathbb{R}_+^*,\ R \in \mathbb{R}_+ \ /\ \forall x > R,\ \frac{f(x)}{g(x)} \ge K$$
We say that $f$ is eventually bounded from below by some constant multiple of $g$, or roughly speaking $f$ is a big Ω of $g$, and we write:
$$f = \Omega(g)$$
4.2 Properties
| Name | Equation | Utility |
|---|---|---|
| Reflexive | $f = \Omega(f)$ | Generally not useful |
| Transitive | $f = \Omega(g)$ and $g = \Omega(h) \implies f = \Omega(h)$ | Get an easier lower bound |
| Sum | $f_1 + f_2 = \Omega(\min(f_1, f_2))$ | Get an easier lower bound |
| Scaling | $f = \Omega(g) \implies f = \Omega(k \cdot g)\ \forall k \in \mathbb{R}_+^*$ | Remove constants |
| Product | $f_1 = \Omega(g_1)$ and $f_2 = \Omega(g_2) \implies f_1 f_2 = \Omega(g_1 g_2)$ | Simplify products |
| Cumulative Sum | $f$ increasing $\implies \sum_{i=1}^{n} f(i) = \Omega\left(\int_{1}^{n} f(x)\,dx\right)$ | Bound a sum by an integral |
| Duality | $f = \mathcal{O}(g) \iff g = \Omega(f)$ | Turn upper bounds into lower bounds |
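The Duality row can be sanity-checked numerically. The sketch below, on sampled points with witnesses of our own choosing, shows that the same constant $K$ that witnesses $f = \mathcal{O}(g)$ gives $1/K$ as a witness for $g = \Omega(f)$:

```python
f = lambda x: 2 * x + 1
g = lambda x: x

K = 3
xs = range(2, 1_000)
upper = all(f(x) <= K * g(x) for x in xs)        # f = O(g), witness K
lower = all(g(x) >= (1 / K) * f(x) for x in xs)  # g = Omega(f), witness 1/K
# The two checks are the same inequality divided by K, which is the duality.
```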
5. Big Θ-Notation
5.1 Mathematical Definition
This is not a fully general definition, but for all practical purposes it works well in a competitive programming setting.
Let $f, g \in \mathcal{F}(\mathbb{R}_+, \mathbb{R}_+)$ be two real-valued functions with positive values.
By definition, we say that $f = \Theta(g)$ if $f = \mathcal{O}(g)$ and $f = \Omega(g)$.
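Equivalently, for a Θ bound the ratio $f/g$ must eventually stay between two fixed positive constants. A quick sampled check (not a proof; the band $[1/2, 2]$ is our own choice of witnesses), for instance with $f(n) = n^5 + 10n - 50$ and $g(n) = n^5$:

```python
f = lambda n: n**5 + 10 * n - 50
g = lambda n: n**5

# For n >= 3 the ratio f(n)/g(n) stays inside [1/2, 2] and tends to 1,
# witnessing f = Theta(n^5).
ratios = [f(n) / g(n) for n in range(3, 1_000)]
in_band = 0.5 <= min(ratios) and max(ratios) <= 2
```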
6. Examples
| Function | Big O | Big Ω | Big Θ |
|---|---|---|---|
| $f(n) = n^5 + 10n - 50$ | $\mathcal{O}(n^5)$ | $\Omega(n^5)$ | $\Theta(n^5)$ |
| $f(n) = n - \frac{1}{n}$ | $\mathcal{O}(n)$ | $\Omega(n)$ | $\Theta(n)$ |
| $f(n) = e^n - n$ | $\mathcal{O}(e^n)$ | $\Omega(e^n)$ | $\Theta(e^n)$ |
| $f(n) = 2 + \sin n$ | $\mathcal{O}(1)$ | $\Omega(1)$ | $\Theta(1)$ |
| $f(n) = \exp(\sin^2 n^{10}) + n^2$ | $\mathcal{O}(n^2)$ | $\Omega(n^2)$ | $\Theta(n^2)$ |
| $f(n) = \sum_{k=1}^{n} 3^k$ | $\mathcal{O}(3^n)$ | $\Omega(3^n)$ | $\Theta(3^n)$ |
| $f(n) = \exp(n^2(\sin n + 2))$ | $\mathcal{O}(\exp(3n^2))$ | $\Omega(\exp(n^2))$ | none |
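The last function has no Θ bound because its exponent $n^2(\sin n + 2)$ keeps swinging between $n^2$ and $3n^2$, so no single $\exp(c \cdot n^2)$ sandwiches it. A quick numerical look at the normalized exponent (sampled integers, illustration only):

```python
import math

# log f(n) / n^2 = sin(n) + 2 oscillates within [1, 3] forever instead of
# converging, which is why f admits no Theta(exp(c * n^2)) bound.
vals = [math.sin(n) + 2 for n in range(1, 10_000)]
spread = (min(vals), max(vals))  # near 1 and near 3, with no convergence
```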