A cheat guide to run time complexity
Jazz Peh, April 28, 2020
This guide is curated to help you prepare for technical interviews. I’ve compiled all the useful information, including how to calculate complexity, in this guide.
Big-O Complexity Chart
Disclaimer: I didn’t create this chart; it’s adapted from https://www.bigocheatsheet.com/. All credit goes to Eric.
Types | Name | Example |
---|---|---|
O(1) | Constant | Adding an element at the front of a linked list |
O(log n) | Logarithmic | Finding an element in a sorted array (binary search) |
O(n) | Linear | Finding an element in an unsorted array |
O(n log n) | Linear Logarithmic | Merge Sort |
O(n²) | Quadratic | Shortest path between 2 nodes in a graph |
O(n³) | Cubic | Matrix Multiplication |
O(2ⁿ) | Exponential | Tower of Hanoi Problem |
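To get a feel for how quickly these classes diverge, here is a small sketch (my own illustration, not part of the original chart) that prints the approximate number of steps each class needs for n = 1,000:

```js
// Approximate step counts for n = 1,000 under each complexity class.
const n = 1000;
const growth = {
  "O(1)":       1,
  "O(log n)":   Math.log2(n),        // ~10
  "O(n)":       n,                   // 1,000
  "O(n log n)": n * Math.log2(n),    // ~10,000
  "O(n^2)":     n ** 2,              // 1,000,000
  "O(n^3)":     n ** 3,              // 1,000,000,000
  "O(2^n)":     2 ** n,              // ~10^301, utterly infeasible
};

for (const [name, steps] of Object.entries(growth)) {
  console.log(`${name.padEnd(11)} ~${steps.toExponential(2)} steps`);
}
```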
Common Data Structure Operations
Below are some common data structures and the time and space complexity of their operations.
Physical Data Structures
There are two types of physical data structures: arrays and linked lists.
Array
Operations | 1D Array (Time) | 1D Array (Space) | 2D Array (Time) | 2D Array (Space) |
---|---|---|---|---|
create() | O(1) | O(n) | O(1) | O(mn) |
insert() | O(1) | O(1) | O(1) | O(1) |
traverse() | O(n) | O(1) | O(mn) | O(1) |
access[] | O(1) | O(1) | O(1) | O(1) |
search() | O(n) | O(1) | O(mn) | O(1) |
delete() | O(1) | O(1) | O(1) | O(1) |
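To tie this back to the chart above, here is a minimal singly linked list sketch (my own; the `Node` and `LinkedList` names are just for illustration) showing why inserting at the front is O(1) while searching is O(n):

```js
// Minimal singly linked list: inserting at the head touches a constant
// number of pointers, so it is O(1) regardless of list length.
class Node {
  constructor(value, next = null) {
    this.value = value;
    this.next = next;
  }
}

class LinkedList {
  constructor() {
    this.head = null;
  }

  insertAtFront(value) {        // O(1): no traversal needed
    this.head = new Node(value, this.head);
  }

  search(value) {               // O(n): may visit every node
    for (let node = this.head; node !== null; node = node.next) {
      if (node.value === value) return node;
    }
    return null;
  }
}
```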
Logical Data Structures
Below are some common types of logical data structures.
Stack
Operations | using Array (Time) | using Array (Space) | using Linked List (Time) | using Linked List (Space) |
---|---|---|---|---|
createStack() | O(1) | O(n) | O(1) | O(1) |
push() | O(1) | O(1) | O(1) | O(1) |
pop() | O(1) | O(1) | O(1) | O(1) |
peek() | O(1) | O(1) | O(1) | O(1) |
isEmpty() | O(1) | O(1) | O(1) | O(1) |
isFull() | O(1) | O(1) | N/A | N/A |
deleteStack() | O(1) | O(1) | O(1) | O(1) |
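As a rough sketch (my own, assuming a fixed capacity so that `isFull()` is meaningful), a stack backed by an array keeps every operation at O(1):

```js
// Stack backed by an array with a fixed capacity.
// push, pop, peek, isEmpty and isFull all run in O(1) time.
class Stack {
  constructor(capacity) {
    this.items = [];
    this.capacity = capacity;
  }

  push(value) {
    if (this.isFull()) throw new Error("Stack overflow");
    this.items.push(value);                                    // O(1) amortised
  }

  pop() {
    return this.items.pop();                                   // O(1)
  }

  peek() {
    return this.items[this.items.length - 1];                  // O(1)
  }

  isEmpty() { return this.items.length === 0; }                // O(1)
  isFull()  { return this.items.length === this.capacity; }    // O(1)
}
```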
Queue
Operations | using Array (Time) | using Array (Space) | using Linked List (Time) | using Linked List (Space) |
---|---|---|---|---|
createQueue() | O(1) | O(n) | O(1) | O(1) |
enqueue() | O(1) | O(1) | O(1) | O(1) |
dequeue() | O(1) | O(1) | O(1) | O(1) |
peek() | O(1) | O(1) | O(1) | O(1) |
isEmpty() | O(1) | O(1) | O(1) | O(1) |
isFull() | O(1) | O(1) | N/A | N/A |
deleteQueue() | O(1) | O(1) | O(1) | O(1) |
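A queue over a linked list stays O(1) at both ends only if we keep a `tail` pointer alongside `head`; a minimal sketch of that idea (the names are my own):

```js
// Queue backed by a singly linked list. Keeping a tail pointer means
// enqueue appends in O(1); dequeue removes from the head in O(1).
class QueueNode {
  constructor(value) {
    this.value = value;
    this.next = null;
  }
}

class Queue {
  constructor() {
    this.head = null;
    this.tail = null;
  }

  enqueue(value) {                 // O(1)
    const node = new QueueNode(value);
    if (this.tail) this.tail.next = node;
    else this.head = node;
    this.tail = node;
  }

  dequeue() {                      // O(1)
    if (!this.head) return null;
    const value = this.head.value;
    this.head = this.head.next;
    if (!this.head) this.tail = null;
    return value;
  }

  isEmpty() { return this.head === null; }   // O(1)
}
```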
Tree
Binary Tree
The table below compares the Array and Linked List implementations of a Binary Tree.
Operations | using Array (Time) | using Array (Space) | using Linked List (Time) | using Linked List (Space) |
---|---|---|---|---|
createTree() | O(1) | O(n) | O(1) | O(1) |
insert() | O(1) | O(1) | O(n) | O(n) |
delete() | O(n) | O(1) | O(n) | O(n) |
search() | O(n) | O(1) | O(n) | O(n) |
traverse() | O(n) | O(1) | O(n) | O(n) |
deleteTree() | O(1) | O(1) | O(1) | O(1) |
Using a Linked List is more space efficient, as creating a binary tree is O(1) compared to O(n) for an Array; hence, the preferred implementation is a Linked List.
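Here is a rough sketch of that node-based (linked) representation (the class names are my own): creating the tree only allocates a root reference, which is where the O(1) comes from, while traversal still has to visit every node.

```js
// Node-based binary tree: creating the tree allocates nothing but a
// root reference (O(1)); traversal still visits every node (O(n)).
class TreeNode {
  constructor(value) {
    this.value = value;
    this.left = null;
    this.right = null;
  }
}

class BinaryTree {
  constructor() {
    this.root = null;     // createTree(): O(1), no upfront allocation
  }

  // Pre-order traversal: O(n) time, and up to O(n) space for the
  // recursion stack on a fully skewed tree.
  traverse(node = this.root, visit = console.log) {
    if (node === null) return;
    visit(node.value);
    this.traverse(node.left, visit);
    this.traverse(node.right, visit);
  }
}
```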
Binary Search Tree & AVL Tree
Operations | BST Avg. (Time) | BST Avg. (Space) | BST Worst (Time) | BST Worst (Space) | AVL Tree (Time) | AVL Tree (Space) |
---|---|---|---|---|---|---|
createTree() | O(1) | O(1) | O(1) | O(1) | O(1) | O(1) |
insertNode() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
deleteNode() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
search() | O(log n) | O(log n) | O(n) | O(n) | O(log n) | O(log n) |
traverse() | O(n) | O(n) | O(n) | O(n) | O(n) | O(n) |
deleteTree() | O(1) | O(1) | O(1) | O(1) | O(1) | O(1) |
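The gap between the average and worst case for a plain Binary Search Tree comes from balance: a search like the sketch below (my own, over `{ value, left, right }` nodes) halves the remaining nodes at each step on a balanced tree (O(log n)), but on a fully skewed tree it discards only one node per step (O(n)), which is exactly what AVL rebalancing prevents.

```js
// Search in a BST made of { value, left, right } nodes.
// Balanced tree: each step discards half the nodes -> O(log n).
// Fully skewed tree: each step discards a single node -> O(n).
function bstSearch(node, target) {
  if (node === null) return null;            // not found
  if (target === node.value) return node;    // found
  return target < node.value
    ? bstSearch(node.left, target)
    : bstSearch(node.right, target);
}
```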
Heap (Using Array)
Operations | Time | Space |
---|---|---|
initHeap() | O(1) | O(n) |
peekTopOfHeap() | O(1) | O(1) |
sizeOfHeap() | O(n) | O(1) |
insertValue() | O(log n) | O(log n) |
deleteValue() | O(log n) | O(log n) |
extractMin() / extractMax() | O(log n) | O(log n) |
deleteHeap() | O(1) | O(1) |
createHeap() / heapSort() | O(n log n) | O(n log n) |
heapify() | O(n) | O(n) |
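As a sketch of why insertion is O(log n): the new value goes to the end of the array and is “bubbled up”, swapping with its parent at most once per level, so the work is bounded by the tree height. (A min-heap is assumed here, and the function name is my own.)

```js
// Insert into an array-backed min-heap. The new element swaps upward
// at most once per level of the tree, so insertion is O(log n).
function heapInsert(heap, value) {
  heap.push(value);                       // place at the bottom
  let i = heap.length - 1;
  while (i > 0) {
    const parent = Math.floor((i - 1) / 2);
    if (heap[parent] <= heap[i]) break;   // heap property restored
    [heap[parent], heap[i]] = [heap[i], heap[parent]];  // bubble up
    i = parent;
  }
}

// Example: const h = []; [5, 3, 8, 1].forEach(v => heapInsert(h, v));
```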
Hash Table
Operations | Time (Avg.) | Time (Worst) | Space |
---|---|---|---|
search() | O(1) | O(n) | O(n) |
insert() | O(1) | O(n) | O(n) |
delete() | O(1) | O(n) | O(n) |
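The O(1) average vs. O(n) worst case comes down to collisions. With separate chaining, a decent hash spreads keys across buckets so each chain stays short; if every key lands in the same bucket, search degrades to a linear scan of that chain. A minimal chaining sketch (my own, with a deliberately simple hash):

```js
// Hash table with separate chaining. With a good hash function each
// bucket holds O(1) entries on average; if all keys collide into one
// bucket, search() degrades to an O(n) scan of that single chain.
class HashTable {
  constructor(size = 16) {
    this.buckets = Array.from({ length: size }, () => []);
  }

  hash(key) {
    let h = 0;
    for (const ch of String(key)) {
      h = (h * 31 + ch.charCodeAt(0)) % this.buckets.length;
    }
    return h;
  }

  insert(key, value) {                      // O(1) average
    this.buckets[this.hash(key)].push([key, value]);
  }

  search(key) {                             // O(1) average, O(n) worst
    for (const [k, v] of this.buckets[this.hash(key)]) {
      if (k === key) return v;
    }
    return undefined;
  }
}
```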
Graphs
Common Operations
Operations | Time | Space |
---|---|---|
Breadth First Search | O(V + E) | O(E) |
Depth First Search | O(V + E) | O(V) |
Topological sort | O(V + E) | O(V + E) |
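The O(V + E) bound for BFS comes from visiting each vertex once and examining each edge at most once (twice for an undirected graph). A sketch over an adjacency list (the graph shape is my own assumption):

```js
// Breadth First Search over an adjacency list, e.g.
// const graph = { a: ["b", "c"], b: ["d"], c: ["d"], d: [] };
// Each vertex is enqueued once and each edge examined once: O(V + E).
function bfs(graph, start) {
  const visited = new Set([start]);
  const queue = [start];
  const order = [];

  while (queue.length > 0) {
    const vertex = queue.shift();     // Array.shift() is O(n); a linked-list queue keeps this O(1)
    order.push(vertex);
    for (const neighbour of graph[vertex]) {
      if (!visited.has(neighbour)) {
        visited.add(neighbour);
        queue.push(neighbour);
      }
    }
  }
  return order;
}
```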
Common Algorithms for Graph problems
Algorithms | Single Source Shortest Path (Time) | Single Source Shortest Path (Space) | All Pairs Shortest Path (Time) | All Pairs Shortest Path (Space) |
---|---|---|---|---|
Breadth First Search | O(V²) | O(E) | O(V³) | O(EV) |
Dijkstra | O(V²) | O(V) | O(V²) | O(EV) |
Bellman Ford | O(VE) | O(V) | O(VE²) | O(V²) |
Floyd Warshall | N/A | N/A | O(V³) | O(V²) |
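Floyd Warshall’s O(V³) time and O(V²) space fall straight out of its shape: a V×V distance matrix updated inside three nested loops over the vertices. A compact sketch (assuming `dist[i][j]` holds the edge weight, `Infinity` when there is no edge, and `0` on the diagonal):

```js
// Floyd-Warshall all-pairs shortest paths.
// Three nested loops over V vertices -> O(V^3) time;
// the V x V distance matrix -> O(V^2) space.
function floydWarshall(dist) {
  const V = dist.length;
  const d = dist.map(row => row.slice());   // copy so the input is untouched
  for (let k = 0; k < V; k++) {
    for (let i = 0; i < V; i++) {
      for (let j = 0; j < V; j++) {
        if (d[i][k] + d[k][j] < d[i][j]) {
          d[i][j] = d[i][k] + d[k][j];      // shorter path via vertex k
        }
      }
    }
  }
  return d;
}
```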
Sorting Algorithms
Types | Time | Space | Stable |
---|---|---|---|
Bubble sort | O(n²) | O(1) | Yes |
Selection sort | O(n²) | O(1) | No |
Insertion sort | O(n²) | O(1) | Yes |
Bucket sort | O(n log n) | O(n) | Yes* |
Merge sort | O(n log n) | O(n) | Yes |
Quick sort | O(n log n) | O(n)* | No |
Heap sort | O(n log n) | O(1) | No |
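Merge sort is a handy reference for the O(n log n) rows: the array is halved about log n times, each level does O(n) merging work, the temporary arrays account for the O(n) space, and taking from the left half on ties keeps it stable. A sketch:

```js
// Merge sort: log n levels of splitting, O(n) merge work per level
// -> O(n log n) time; the temporary arrays give O(n) extra space.
// Taking from the left half on ties (<=) keeps equal elements in their
// original order, which is what makes the sort stable.
function mergeSort(arr) {
  if (arr.length <= 1) return arr;
  const mid = Math.floor(arr.length / 2);
  const left = mergeSort(arr.slice(0, mid));
  const right = mergeSort(arr.slice(mid));

  const merged = [];
  let i = 0, j = 0;
  while (i < left.length && j < right.length) {
    merged.push(left[i] <= right[j] ? left[i++] : right[j++]);
  }
  return merged.concat(left.slice(i), right.slice(j));
}
```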
How to find time complexity
This section shows how to calculate the time complexity of iterative and recursive algorithms.
Iterative
function findBiggestNum(arr) {
  let biggestNum = arr[0];                 // ------------------O(1)
  for (let i = 1; i < arr.length; ++i) {   // ---------O(n)--|--O(n)
    if (arr[i] <= biggestNum) continue;    //     O(1)--|--O(1)--|
    biggestNum = arr[i];                   //     O(1)--|
  }
  return biggestNum;                       // ------------------O(1)
}
Time complexity: O(1) + O(n) + O(1) = O(n)
Recursive #1
let highest = 0;
function findBiggestNum(arr, n) {    // ----T(n)
  if (n === -1)                      // ----O(1)---|---Base condition
    return highest;                  // ----O(1)---|
  if (arr[n] > highest)              // ----O(1)
    highest = arr[n];                // ----O(1)
  return findBiggestNum(arr, n-1);   // ----T(n-1)
}
Equation 1: T(n) = O(1) + T(n-1)
The non-recursive work per call is O(1) + O(1) + O(1) + O(1) = O(1).
Base Condition: T(-1) = O(1)
If `n === -1`, the recursion will end. So if the recursive method is `T(n)`, we replace `n` with `-1` to get `T(-1)`, and since the base condition is `O(1)`, `T(-1) = O(1)`.
Equation 2: T(n-1) = O(1) + T((n-1)-1)
Equation 3: T(n-2) = O(1) + T((n-2)-1)
T(n) = 1 + T(n-1)
     = 1 + (1 + T((n-1)-1))    # substitute `Equation 2`
     = 2 + T(n-2)
     = 2 + (1 + T((n-2)-1))    # substitute `Equation 3`
     = 3 + T(n-3)              # notice the pattern, substitute with k
     = k + T(n-k)
     = (n+1) + T(n-(n+1))      # attempt to reach the base condition T(-1), so replace k with n+1
     = n + 1 + T(-1)
     = n + 1 + 1               # T(-1) = 1
     = O(n)
Recursive #2
function search(num, arr, start, end) {        // ----T(n)
  if (start === end) {                         // ----O(1) ---|--- Base Condition
    if (arr[start] === num)                    // ----O(1) ---|
      return start;                            // ----O(1) ---|
    else                                       // ----O(1) ---|
      return false;                            // ----O(1) ---|
  }
  let mid = Math.floor((start + end) / 2);     // ----O(1)
  if (arr[mid] > num) {                        // ----O(1)
    return search(num, arr, start, mid);       // ----T(n/2)
  }
  else if (arr[mid] < num) {                   // ----O(1)
    return search(num, arr, mid + 1, end);     // ----T(n/2)
  }
  else if (arr[mid] === num)                   // ----O(1)
    return mid;                                // ----O(1)
}
Equation 1: T(n) = T(n/2) + 1
Base Condition: T(1) = O(1)
If the array is of size 1, we can perform it in constant time.
Equation 2: T(n/2) = T(n/4) + 1
Equation 3: T(n/4) = T(n/8) + 1
T(n) = T(n/2) + 1
     = (T(n/4) + 1) + 1    # substitute `Equation 2`
     = T(n/4) + 2
     = (T(n/8) + 1) + 2    # substitute `Equation 3`
     = T(n/8) + 3
     = T(n/2^k) + k        # notice the pattern, substitute with k
     = T(1) + log n        # attempt to make it T(1) since it is the base condition; see below how n/2^k = 1 gives k = log n
     = 1 + log n
     = O(log n)

n/2^k = 1  =>  n = 2^k  =>  k = log n