Understanding Big-O

Big O measures time complexity (in number of operations) and space complexity (in amount of memory allocated).

Some rules:

  • Drop the constants $O(2n+6) = O(n)$
  • A loop nested within a loop is $O(n^2)$
    • Note that different polynomial degrees are distinct: $O(n^3) \neq O(n^2)$
  • Drop non-dominant terms $O(n^2 + n) = O(n^2)$
    • Think of this as a limit: as $n$ grows, the $n^2$ term dominates the complexity
  • Constant time $O(1)$ means the number of operations does not grow with the input size
    • e.g. one addition, one index lookup, etc.
  • $O(\log n)$ means the work scales with the order of magnitude of the data: each step cuts the remaining data down by a constant factor (e.g. binary search halves it)
  • $O(n \log n)$ is the best achievable for a comparison-based sorting algorithm (e.g. merge sort)
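The $O(\log n)$ rule above can be sketched with binary search; the function name here is just illustrative:

```python
def binary_search(sorted_items, target):
    """O(log n): each comparison discards half of the remaining items."""
    lo, hi = 0, len(sorted_items) - 1
    while lo <= hi:
        mid = (lo + hi) // 2              # look at the middle element
        if sorted_items[mid] == target:
            return mid                    # found: return its index
        elif sorted_items[mid] < target:
            lo = mid + 1                  # discard the lower half
        else:
            hi = mid - 1                  # discard the upper half
    return -1                             # target not present
```

Doubling the list length adds only one extra comparison in the worst case, which is what "scales with the order of magnitude" means in practice.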

Different terms for inputs

If you have two separate inputs, you cannot simplify them both down to $O(n)$ or $O(n^2)$; you can only simplify them down to $O(a+b)$ (one loop after the other) or $O(a \times b)$ (one loop nested inside the other).
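A minimal sketch of the two-input rules; the function names are hypothetical:

```python
def one_after_the_other(a_items, b_items):
    """O(a + b): one pass over each input, back to back."""
    total = 0
    for a in a_items:          # a operations
        total += a
    for b in b_items:          # b operations
        total += b
    return total

def nested(a_items, b_items):
    """O(a * b): every element of one input paired with every element of the other."""
    out = []
    for a in a_items:
        for b in b_items:
            out.append((a, b))
    return out
```

Because `a` and `b` can grow independently, neither complexity can be collapsed to a single variable.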

Lists

Big O of common list operations:

  • .append() and .pop() are $O(1)$ since no re-indexing is required.
  • .pop(i) and .insert(i, x) both require re-indexing the elements after position i, so these operations are $O(n)$.
  • Traversing a list is $O(n)$, but accessing a value at a specific index is $O(1)$.
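The list costs above, annotated on a small example:

```python
nums = [10, 20, 30, 40]

nums.append(50)        # O(1): added at the end, no re-indexing
last = nums.pop()      # O(1): removed from the end, no re-indexing

first = nums.pop(0)    # O(n): every remaining element shifts left one slot
nums.insert(0, 5)      # O(n): every existing element shifts right one slot

x = nums[2]            # O(1): direct index access
total = 0
for n in nums:         # O(n): traversal touches every element once
    total += n
```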

Quick reference:

  • $O(1)$ — constant
  • $O(\log n)$ — divide and conquer
  • $O(n)$ — proportional
  • $O(n^2)$ — a loop within a loop
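A minimal sketch of the loop-within-a-loop pattern; the duplicate check is just an illustrative example:

```python
def has_duplicate(items):
    """O(n^2): for each element, scan every other element."""
    for i in range(len(items)):
        for j in range(len(items)):
            if i != j and items[i] == items[j]:
                return True
    return False
```

For each of the $n$ outer iterations the inner loop runs $n$ times, giving $n \times n = n^2$ comparisons in the worst case.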