Week 2: Asymptotics

Chris Tralie

Build Your Own Definition (BYOD)

Below is a little list to help you build the 4 different asymptotic definitions we need in this course. Start with a function f(N) and a function g(N).

If there exists a constant M and ________________ for all N ≥ M, then f(N) is ________________.

Little-o Notes And Complexity Hierarchies


Big-O Exercises

Exercise 1

Devise an algorithm whose best polynomial bound for time complexity is O(N^3). This algorithm doesn't have to do anything useful, but you should at least come up with pseudocode. If you're interested after you complete the other exercises, you can implement this, time it, and see whether the plot looks like a cubic.
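One possible shape for such an algorithm (a sketch only, and certainly not the only answer) is a triple nested loop that does constant work in the innermost body; the function and its name below are just an illustration.

```python
# A deliberately useless O(N^3) algorithm: count all ordered
# triples (i, j, k) of indices into a list. The triple nested
# loop runs N*N*N times with constant work per iteration, so the
# best polynomial bound on its time complexity is O(N^3).
def count_triples(arr):
    N = len(arr)
    count = 0
    for i in range(N):          # N iterations
        for j in range(N):      # N iterations for each i
            for k in range(N):  # N iterations for each (i, j)
                count += 1      # constant work in the innermost loop
    return count
```

Timing this for a range of N values and plotting the results is one way to try the "see if it looks like a cubic" part of the exercise.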


Exercise 2

Consider the function \[ f(N) = N^2 \log_2 N + N^3 \] Prove that this function is O(N^3). That is, find a constant c and a number n_0 so that \[ f(N) \leq c N^3 \text{ for all } N \geq n_0 \]
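Before writing the proof, it can help to sanity-check a candidate bound numerically. The choice c = 2, n_0 = 1 below is one guess to test (motivated by the fact that log_2 N ≤ N for N ≥ 1), not the only valid pair.

```python
import math

# f(N) = N^2 log2(N) + N^3 from the exercise
def f(N):
    return N**2 * math.log2(N) + N**3

# Numerically check the candidate bound f(N) <= 2*N^3 for N >= 1.
# A passing check is not a proof, but a failing one would tell us
# to pick a different c or n_0.
for N in range(1, 1000):
    assert f(N) <= 2 * N**3
```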


Exercise 3

Let's say we have a timing function \[ T(N) = a \log_2(N^2) \]

Prove that T(N) is O(log(N)). As a hint, recall that \[ \log_a(x^b) = b \log_a(x) \]
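A quick numerical check can make the hint concrete: if T(N) really is O(log(N)), then the ratio T(N) / log_2(N) should be (roughly) the same constant for every N. The value a = 3 below is an arbitrary choice for illustration.

```python
import math

a = 3  # an arbitrary constant for illustration

# T(N) = a * log2(N^2) from the exercise
def T(N):
    return a * math.log2(N**2)

# The power rule log_a(x^b) = b log_a(x) predicts that every
# ratio below is close to 2*a = 6, up to floating point error.
ratios = [T(N) / math.log2(N) for N in [2, 10, 100, 10**6]]
print(ratios)  # each entry is approximately 6.0
```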

Exercise 4

We stated that there are algorithms for sorting with timing functions \[ T(N) = aN\log_2(N) \] for some constant a*. Assuming that Python's built-in sorted function for lists takes this amount of time, devise an algorithm where it is impossible to come up with a better time complexity bound than O(N^2 log(N)).
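To build intuition for the assumption in this exercise, here is a rough timing sketch of Python's built-in sorted on random lists. The exact timings (and the constant a) are machine-dependent; the point is only that doubling N should make the runtime a bit more than double, consistent with N log(N) growth.

```python
import random
import time

# Rough empirical look at how sorted() scales: if its timing
# function is roughly a*N*log2(N), then doubling N should
# slightly more than double the elapsed time.
for N in [10**5, 2 * 10**5, 4 * 10**5]:
    data = [random.random() for _ in range(N)]
    start = time.perf_counter()
    result = sorted(data)
    elapsed = time.perf_counter() - start
    print(f"N = {N:7d}: {elapsed:.4f} s")
```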

Footnote*

Note that the base 2 isn't necessary, because we could apply the change of base formula \[ \log_2(x) = \log_b(x) / \log_b(2) \] and fold the coefficient 1/log_b(2) into the constant a. However, it's clearer to understand what the algorithm is doing if we leave the base 2 in there, as we will see later.
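The change of base identity is easy to verify numerically; the values x = 37 and b = 10 below are arbitrary test choices.

```python
import math

x = 37.0  # arbitrary positive test value
b = 10    # arbitrary alternative base

# log2(x) equals log_b(x) / log_b(2), so switching the base of the
# logarithm only changes the leading constant, never the growth rate.
lhs = math.log2(x)
rhs = math.log(x, b) / math.log(2, b)
assert math.isclose(lhs, rhs)
```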