
A cheat guide to runtime complexity

Jazz Peh, April 28, 2020

This guide is curated to help you prepare for technical interviews. I've compiled all the useful information, including how to calculate complexity, into this guide.

Big-O Complexity Chart


Disclaimer: I didn't create this chart; it's adapted from https://www.bigocheatsheet.com/. All credit goes to Eric.

| Type | Name | Example |
| --- | --- | --- |
| O(1) | Constant | Adding an element at the front of a linked list |
| O(log n) | Logarithmic | Finding an element in a sorted array |
| O(n) | Linear | Finding an element in an unsorted array |
| O(n log n) | Linear Logarithmic | Merge Sort |
| O(n²) | Quadratic | Shortest path between 2 nodes in a graph |
| O(n³) | Cubic | Matrix Multiplication |
| O(2ⁿ) | Exponential | Tower of Hanoi Problem |
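
To make the first rows of the chart concrete, here is a minimal sketch contrasting O(n) and O(log n) search over the same sorted array. The function names are my own for illustration; binary search only works because the array is sorted.

```javascript
// O(n): linear search may have to scan every element.
function linearSearch(arr, target) {
  for (let i = 0; i < arr.length; ++i) {
    if (arr[i] === target) return i;
  }
  return -1;
}

// O(log n): binary search halves the remaining range on every step.
function binarySearch(arr, target) {
  let lo = 0, hi = arr.length - 1;
  while (lo <= hi) {
    const mid = Math.floor((lo + hi) / 2);
    if (arr[mid] === target) return mid;
    if (arr[mid] < target) lo = mid + 1;
    else hi = mid - 1;
  }
  return -1;
}
```

Both return the index of the target (or -1), but for an array of a million elements the linear scan may do a million comparisons while the binary search does about twenty.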

Common Data Structure Operations

Below are some common data structures and the time and space complexity of their operations.

Physical Data Structures

There are two types of physical data structures: arrays and linked lists.

Array

| Operation | 1D Array Time | 1D Array Space | 2D Array Time | 2D Array Space |
| --- | --- | --- | --- | --- |
| create() | O(1) | O(n) | O(1) | O(mn) |
| insert() | O(1) | O(1) | O(1) | O(1) |
| traverse() | O(n) | O(1) | O(mn) | O(1) |
| access[] | O(1) | O(1) | O(1) | O(1) |
| search() | O(n) | O(1) | O(mn) | O(1) |
| delete() | O(1) | O(1) | O(1) | O(1) |
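
A quick sketch of the O(1) vs O(n) rows of the array table (variable names are mine for illustration):

```javascript
const arr = [4, 8, 15, 16];

// access[] — O(1): a direct index lookup costs the same regardless of size.
const third = arr[2];

// traverse() — O(n): every element is visited exactly once.
let sum = 0;
for (const x of arr) sum += x;

// search() — O(n): in the worst case indexOf scans the whole array.
const idx = arr.indexOf(16);
```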

Logical Data Structures

Below are some common types of logical data structures.

Stack

| Operation | Array Time | Array Space | Linked List Time | Linked List Space |
| --- | --- | --- | --- | --- |
| createStack() | O(1) | O(n) | O(1) | O(1) |
| push() | O(1) | O(1) | O(1) | O(1) |
| pop() | O(1) | O(1) | O(1) | O(1) |
| peek() | O(1) | O(1) | O(1) | O(1) |
| isEmpty() | O(1) | O(1) | O(1) | O(1) |
| isFull() | O(1) | O(1) | N/A | N/A |
| deleteStack() | O(1) | O(1) | O(1) | O(1) |
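
A minimal array-backed stack sketch showing why every row is O(1). Note the table assumes a fixed-size array (hence the O(n) createStack() space and the isFull() row); JavaScript arrays grow dynamically, so this version has no capacity and no isFull().

```javascript
class Stack {
  constructor() { this.items = []; }                    // createStack() — O(1)
  push(x) { this.items.push(x); }                       // O(1): append at the top
  pop() { return this.items.pop(); }                    // O(1): remove from the top
  peek() { return this.items[this.items.length - 1]; }  // O(1): read the top
  isEmpty() { return this.items.length === 0; }         // O(1)
}
```

All operations touch only the top of the stack, which is why none of them depends on how many elements are stored.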

Queue

| Operation | Array Time | Array Space | Linked List Time | Linked List Space |
| --- | --- | --- | --- | --- |
| createQueue() | O(1) | O(n) | O(1) | O(1) |
| enqueue() | O(1) | O(1) | O(1) | O(1) |
| dequeue() | O(1) | O(1) | O(1) | O(1) |
| peek() | O(1) | O(1) | O(1) | O(1) |
| isEmpty() | O(1) | O(1) | O(1) | O(1) |
| isFull() | O(1) | O(1) | N/A | N/A |
| deleteQueue() | O(1) | O(1) | O(1) | O(1) |
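
A sketch of the linked-list column: keeping both a head and a tail pointer is what makes enqueue() and dequeue() O(1) (a singly linked list with only a head pointer would pay O(n) on one end).

```javascript
class Queue {
  constructor() { this.head = null; this.tail = null; } // createQueue() — O(1)
  enqueue(x) {                                          // O(1): append at the tail
    const node = { value: x, next: null };
    if (this.tail) this.tail.next = node;
    else this.head = node;
    this.tail = node;
  }
  dequeue() {                                           // O(1): remove from the head
    if (!this.head) return undefined;
    const v = this.head.value;
    this.head = this.head.next;
    if (!this.head) this.tail = null;
    return v;
  }
  peek() { return this.head ? this.head.value : undefined; } // O(1)
  isEmpty() { return this.head === null; }                   // O(1)
}
```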

Tree

Binary Tree

The table below compares the Array and Linked List implementations of a Binary Tree.

| Operation | Array Time | Array Space | Linked List Time | Linked List Space |
| --- | --- | --- | --- | --- |
| createTree() | O(1) | O(n) | O(1) | O(1) |
| insert() | O(1) | O(1) | O(n) | O(n) |
| delete() | O(n) | O(1) | O(n) | O(n) |
| search() | O(n) | O(1) | O(n) | O(n) |
| traverse() | O(n) | O(1) | O(mn) is not applicable here; O(n) | O(n) |
| deleteTree() | O(1) | O(1) | O(1) | O(1) |

Using a linked list is more space efficient: creating the tree is O(1), compared to O(n) for an array, which must be allocated up front. Hence, the preferred implementation is the linked list.
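Insertion into a linked binary tree is O(n) in the table because, to keep the tree complete, we must search level by level for the first vacant position. A hedged sketch of that level-order insert (the node shape `{ value, left, right }` is my own choice):

```javascript
// Level-order insertion into a linked binary tree — O(n):
// BFS from the root until a node with a free child slot is found.
function insert(root, value) {
  const node = { value, left: null, right: null };
  if (!root) return node;                 // empty tree: new node is the root
  const queue = [root];
  while (queue.length) {
    const cur = queue.shift();
    if (!cur.left) { cur.left = node; break; }
    queue.push(cur.left);
    if (!cur.right) { cur.right = node; break; }
    queue.push(cur.right);
  }
  return root;
}
```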

Binary Search Tree & AVL Tree

| Operation | BST Avg. Time | BST Avg. Space | BST Worst Time | BST Worst Space | AVL Time | AVL Space |
| --- | --- | --- | --- | --- | --- | --- |
| createTree() | O(1) | O(1) | O(1) | O(1) | O(1) | O(1) |
| insertNode() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
| deleteNode() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
| search() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
| traverse() | O(n) | O(n) | O(n) | O(n) | O(n) | O(n) |
| deleteTree() | O(1) | O(1) | O(1) | O(1) | O(1) | O(1) |
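
The O(log n) average cases come from the BST ordering property: each comparison discards half the (balanced) tree. The O(n) worst case is a degenerate tree where every node has one child, turning the search into a linear scan. A minimal sketch (function names are mine):

```javascript
// BST insert — O(log n) on a balanced tree, O(n) on a degenerate chain.
function bstInsert(root, value) {
  if (!root) return { value, left: null, right: null };
  if (value < root.value) root.left = bstInsert(root.left, value);
  else root.right = bstInsert(root.right, value);
  return root;
}

// BST search — each step descends one level, discarding the other subtree.
function bstSearch(root, value) {
  if (!root) return false;
  if (value === root.value) return true;
  return value < root.value
    ? bstSearch(root.left, value)
    : bstSearch(root.right, value);
}
```

An AVL tree keeps the same code path but rebalances after each insert/delete, guaranteeing the O(log n) column even in the worst case.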

Heap (Using Array)

| Operation | Time | Space |
| --- | --- | --- |
| initHeap() | O(1) | O(n) |
| peekTopOfHeap() | O(1) | O(1) |
| sizeOfHeap() | O(n) | O(1) |
| insertValue() | O(log n) | O(log n) |
| deleteValue() | O(log n) | O(log n) |
| extractMin() / extractMax() | O(log n) | O(log n) |
| deleteHeap() | O(1) | O(1) |
| createHeap() / heapSort() | O(n log n) | O(n log n) |
| heapify() | O(n) | O(n) |
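
The O(log n) rows follow from the heap's shape: it is a complete binary tree stored in an array, so its height is log n, and insert/extract only walk one root-to-leaf path. A hedged min-heap sketch (helper names are mine):

```javascript
// insertValue() — O(log n): append at the end, bubble up toward the root.
function heapPush(heap, x) {
  heap.push(x);
  let i = heap.length - 1;
  while (i > 0) {
    const parent = Math.floor((i - 1) / 2);
    if (heap[parent] <= heap[i]) break;
    [heap[parent], heap[i]] = [heap[i], heap[parent]];
    i = parent;
  }
}

// extractMin() — O(log n): move the last element to the root, sift it down.
function heapPop(heap) {
  const min = heap[0];
  const last = heap.pop();
  if (heap.length) {
    heap[0] = last;
    let i = 0;
    for (;;) {
      const l = 2 * i + 1, r = 2 * i + 2;
      let smallest = i;
      if (l < heap.length && heap[l] < heap[smallest]) smallest = l;
      if (r < heap.length && heap[r] < heap[smallest]) smallest = r;
      if (smallest === i) break;
      [heap[smallest], heap[i]] = [heap[i], heap[smallest]];
      i = smallest;
    }
  }
  return min;
}
```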

Hash Table

| Operation | Avg. Time | Worst Time | Space |
| --- | --- | --- | --- |
| search() | O(1) | O(n) | O(n) |
| insert() | O(1) | O(n) | O(n) |
| delete() | O(1) | O(n) | O(n) |
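
In practice you rarely implement a hash table yourself; JavaScript's built-in Map gives the average-case O(1) behaviour from the table (the O(n) worst case corresponds to pathological collisions, which a good hash function makes unlikely):

```javascript
const table = new Map();
table.set('apple', 3);            // insert() — O(1) average
table.set('pear', 7);
const found = table.get('apple'); // search() — O(1) average
table.delete('pear');             // delete() — O(1) average
```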

Graphs

Common Operations

| Operation | Time | Space |
| --- | --- | --- |
| Breadth First Search | O(V + E) | O(E) |
| Depth First Search | O(E + V) | O(V) |
| Topological sort | O(E + V) | O(E + V) |

Common Algorithms for Graph Problems

| Algorithm | Single Source Time | Single Source Space | All Pair Time | All Pair Space |
| --- | --- | --- | --- | --- |
| Breadth First Search | O(V²) | O(E) | O(V³) | O(EV) |
| Dijkstra | O(V²) | O(V) | O(V³) | O(EV) |
| Bellman Ford | O(VE) | O(V) | O(V²E) | O(V²) |
| Floyd Warshall | N/A | N/A | O(V³) | O(V²) |
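
BFS is O(V + E) because each vertex is enqueued at most once and each edge is examined at most once. A minimal sketch over an adjacency list (the plain-object adjacency representation is my own choice):

```javascript
// Breadth First Search — O(V + E): every vertex and edge is processed once.
function bfs(adj, start) {
  const visited = new Set([start]);
  const order = [];
  const queue = [start];
  while (queue.length) {
    const v = queue.shift();      // each vertex dequeued once: O(V) total
    order.push(v);
    for (const w of adj[v] || []) {  // each edge inspected once: O(E) total
      if (!visited.has(w)) {
        visited.add(w);
        queue.push(w);
      }
    }
  }
  return order;
}
```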

Sorting Algorithms

| Type | Time | Space | Stable |
| --- | --- | --- | --- |
| Bubble sort | O(n²) | O(1) | Yes |
| Selection sort | O(n²) | O(1) | No |
| Insertion sort | O(n²) | O(1) | Yes |
| Bucket sort | O(n log n) | O(n) | Yes* |
| Merge sort | O(n log n) | O(n) | Yes |
| Quick sort | O(n log n) | O(n)* | No |
| Heap sort | O(n log n) | O(1) | No |

\* Bucket sort is stable provided the sort used within each bucket is stable.
\* Quick sort uses O(log n) stack space on average; O(n) is the worst case.
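
Merge sort earns both its O(n log n) time (log n levels of halving, O(n) merge work per level) and its "Yes" in the stable column: using `<=` in the merge keeps equal elements in their original order. A compact sketch:

```javascript
// Merge sort — O(n log n) time, O(n) extra space, stable.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));
  const out = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    // `<=` takes from the left half on ties, preserving original order.
    out.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return out.concat(left.slice(i), right.slice(j));
}
```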

How to find time complexity

This section shows how to calculate the time complexity of iterative and recursive algorithms.

Iterative

function findBiggestNum(arr) {
  let biggestNum = arr[0];                  // ------------------O(1)
  for (let i = 1; i < arr.length; ++i) {    // ---------O(n)--|--O(n)
    if (arr[i] <= biggestNum) continue;     // O(1)--|--O(1)--|
    biggestNum = arr[i];                    // O(1)--|
  }
  return biggestNum;                        // ------------------O(1)
}

Time complexity: O(1) + O(n) + O(1) = O(n)


Recursive #1

let highest = -Infinity;  // start below any element so all-negative arrays work

function findBiggestNum(arr, n) {     // ----T(n)
  if (n === -1)                       // ----O(1)---|---Base condition
    return highest;                   // ----O(1)---|

  if (arr[n] > highest)               // ----O(1)
    highest = arr[n];                 // ----O(1)

  return findBiggestNum(arr, n-1);    // ----T(n-1)
}

Equation 1: T(n) = O(1) + T(n-1)

The O(1) term is the non-recursive work in each call: O(1) + O(1) + O(1) + O(1) = O(1)

Base Condition: T(-1) = O(1)

When n == -1 the recursion ends. Substituting n = -1 into T(n) gives T(-1), and since the base case does only constant work, T(-1) = O(1).

Equation 2: T(n-1) = O(1) + T((n-1)-1)

Equation 3: T(n-2) = O(1) + T((n-2)-1)

T(n) = 1 + T(n-1)
     = 1 + (1 + T((n-1)-1)) # substitute `Equation 2`
     = 2 + T(n-2)
     = 2 + (1 + T((n-2)-1)) # substitute `Equation 3`
     = 3 + T(n-3)           # notice the pattern; generalise with k
     = k + T(n-k)
     = (n+1) + T(n-(n+1))   # pick k = n+1 to reach the base condition T(-1)
     = n + 1 + T(-1)
     = n + 1 + 1            # T(-1) = 1
     = O(n)
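
One way to sanity-check the derivation is to count the calls empirically: for an array of length n the function is invoked n + 1 times (indices n-1 down to -1), i.e. the call count grows linearly. A sketch of the same function instrumented with a counter (the `calls` variable is my addition for illustration):

```javascript
let highest = -Infinity;  // start below any element so all-negative arrays work
let calls = 0;            // instrumentation: how many times the function runs

function findBiggestNum(arr, n) {
  ++calls;
  if (n === -1) return highest;        // base condition
  if (arr[n] > highest) highest = arr[n];
  return findBiggestNum(arr, n - 1);
}

const result = findBiggestNum([4, 9, 2], 2); // start at the last index
```

For this length-3 array the function runs 4 times, matching T(n) = n + 2 constant-work steps for starting index n = 2.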

Recursive #2

function search(num, arr, start, end) {      // ----T(n)
  if (start >= end)                          // ----O(1) ---|--- Base condition:
    return arr[start] === num                // ----O(1) ---|    one element left,
      ? start : false;                       // ----O(1) ---|    check it directly

  let mid = Math.floor((start + end) / 2);   // ----O(1)
  if (arr[mid] > num)                        // ----O(1)
    return search(num, arr, start, mid - 1); // ----T(n/2)
  else if (arr[mid] < num)                   // ----O(1)
    return search(num, arr, mid + 1, end);   // ----T(n/2)
  else                                       // ----O(1)
    return mid;                              // ----O(1)
}

Equation 1: T(n) = T(n/2) + 1

Base Condition: T(1) = O(1)

If the array is of size 1, we can perform it in constant time.

Equation 2: T(n/2) = T(n/4) + 1

Equation 3: T(n/4) = T(n/8) + 1

T(n) = T(n/2) + 1
     = T(n/4) + 1 + 1       # substitute `Equation 2`
     = T(n/4) + 2
     = T(n/8) + 1 + 2       # substitute `Equation 3`
     = T(n/8) + 3
     = T(n/2^k) + k         # notice the pattern; generalise with k
     = T(1) + log n         # reach the base condition T(1) by setting n/2^k = 1, i.e. k = log n (see below)
     = 1 + log n
     = O(log n)

[n/2^k = 1] => [n = 2^k] => [k = log n]
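
As with the linear case, the derivation can be sanity-checked by counting calls: doubling the array size should add roughly one extra call. A self-contained sketch of the binary search with a counter threaded through (the `counter` argument is my addition for illustration):

```javascript
function countedSearch(num, arr, start, end, counter) {
  counter.calls++;
  if (start >= end)                          // base condition: one element left
    return arr[start] === num ? start : false;
  const mid = Math.floor((start + end) / 2);
  if (arr[mid] > num) return countedSearch(num, arr, start, mid - 1, counter);
  if (arr[mid] < num) return countedSearch(num, arr, mid + 1, end, counter);
  return mid;
}

const arr = Array.from({ length: 1024 }, (_, i) => i); // sorted 0..1023
const counter = { calls: 0 };
const idx = countedSearch(700, arr, 0, arr.length - 1, counter);
```

With n = 1024 and log₂ 1024 = 10, the call count stays at or below about 11 — the `1 + log n` predicted by the derivation — instead of the 1024 a linear scan could need.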


