
Big O Analysis Master Sheet

Complete reference for asymptotic notation, complexity analysis techniques, and time/space complexity bounds for algorithms and data structures.


Master asymptotic notation and complexity analysis with this comprehensive reference guide.

Quick Reference Table

| Notation | Name | Definition | Example |
|----------|------|------------|---------|
| $O(f(n))$ | Big O | Upper bound | $O(n^2)$ |
| $\Omega(f(n))$ | Big Omega | Lower bound | $\Omega(n \log n)$ |
| $\Theta(f(n))$ | Big Theta | Tight bound | $\Theta(n)$ |
| $o(f(n))$ | Little o | Strict upper bound | $o(n^2)$ |
| $\omega(f(n))$ | Little omega | Strict lower bound | $\omega(n)$ |

Common Complexity Classes

Time Complexities (Best to Worst)

O(1)        - Constant
O(log n)    - Logarithmic
O(n)        - Linear
O(n log n)  - Linearithmic
O(n²)       - Quadratic
O(n³)       - Cubic
O(2^n)      - Exponential
O(n!)       - Factorial

Growth Rate Comparison

For large $n$, the growth rates are ordered as: $O(1) < O(\log n) < O(n) < O(n \log n) < O(n^2) < O(n^3) < O(2^n) < O(n!)$
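This ordering is easy to feel numerically. The sketch below (illustrative only; `growth_table` is a helper name invented here) tabulates a few growth functions for increasing $n$:

```python
import math

def growth_table(sizes):
    """Tabulate common growth functions for each n in sizes."""
    rows = []
    for n in sizes:
        rows.append({
            "n": n,
            "log n": round(math.log2(n), 1),
            "n log n": round(n * math.log2(n)),
            "n^2": n ** 2,
            "2^n": 2 ** n,
        })
    return rows

for row in growth_table([4, 8, 16]):
    print(row)
```

Even at $n = 16$, $2^n$ is already three orders of magnitude larger than $n^2$; doubling the input barely moves $\log n$ but squares $2^n$.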

Big O Notation Deep Dive

Definition

$f(n) = O(g(n))$ if there exist positive constants $c$ and $n_0$ such that $0 \leq f(n) \leq c \cdot g(n)$ for all $n \geq n_0$.

Key Properties

  • Transitivity: If $f = O(g)$ and $g = O(h)$, then $f = O(h)$
  • Scaling: $O(c \cdot f(n)) = O(f(n))$ for any constant $c > 0$
  • Addition: $O(f(n) + g(n)) = O(\max(f(n), g(n)))$
  • Multiplication: $O(f(n) \cdot g(n)) = O(f(n)) \cdot O(g(n))$
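The definition above can be checked empirically for concrete witnesses $c$ and $n_0$. This is a finite-range sanity check, not a proof (`is_bounded` is a hypothetical helper):

```python
def is_bounded(f, g, c, n0, n_max=10_000):
    """Empirically check 0 <= f(n) <= c * g(n) for all n0 <= n <= n_max."""
    return all(0 <= f(n) <= c * g(n) for n in range(n0, n_max + 1))

# f(n) = 3n + 5 is O(n): the constants c = 4, n0 = 5 witness the bound,
# since 3n + 5 <= 4n exactly when n >= 5.
assert is_bounded(lambda n: 3 * n + 5, lambda n: n, c=4, n0=5)

# c = 3 is too small: 3n + 5 > 3n for every n, so no n0 can work.
assert not is_bounded(lambda n: 3 * n + 5, lambda n: n, c=3, n0=5)
```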

Algorithm Analysis Techniques

1. Counting Operations

def linear_search(arr, target):
    for i in range(len(arr)):      # O(n) iterations
        if arr[i] == target:       # O(1) comparison
            return i               # O(1) return
    return -1                      # O(1) return
# Total: O(n)

2. Recurrence Relations

Master Theorem: For recurrences of the form $T(n) = aT(n/b) + f(n)$:

| Case | Condition | Solution |
|------|-----------|----------|
| 1 | $f(n) = O(n^c)$ where $c < \log_b a$ | $T(n) = \Theta(n^{\log_b a})$ |
| 2 | $f(n) = \Theta(n^c \log^k n)$ where $c = \log_b a$ | $T(n) = \Theta(n^c \log^{k+1} n)$ |
| 3 | $f(n) = \Omega(n^c)$ where $c > \log_b a$ | $T(n) = \Theta(f(n))$ |

Example: Merge Sort

  • $T(n) = 2T(n/2) + O(n)$
  • $a = 2$, $b = 2$, $f(n) = \Theta(n)$
  • $\log_2 2 = 1$ and $f(n) = \Theta(n^1)$, so Case 2 applies with $k = 0$
  • $T(n) = \Theta(n \log n)$
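A minimal merge sort makes the recurrence concrete: two recursive calls on halves, plus a linear-time merge (sketch written here for illustration; the standard algorithm, not a specific library's version):

```python
def merge_sort(arr):
    """T(n) = 2T(n/2) + O(n) merge  ->  Theta(n log n) by Master Theorem."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])     # T(n/2)
    right = merge_sort(arr[mid:])    # T(n/2)
    # O(n) merge of the two sorted halves
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:      # <= keeps the sort stable
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```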

3. Loop Analysis

Nested Loops:

for i in range(n):           # O(n)
    for j in range(n):       # O(n) for each i
        # O(1) operation
# Total: O(n²)

Dependent Loops:

for i in range(n):           # O(n)
    for j in range(i):       # O(i) for each i
        # O(1) operation
# Total: O(1 + 2 + ... + n) = O(n²)
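Counting the body executions directly confirms the dependent-loop analysis (helper name invented for this sketch):

```python
def count_inner_iterations(n):
    """Count how many times the O(1) body of the dependent loop runs."""
    count = 0
    for i in range(n):
        for j in range(i):   # inner loop runs i times on iteration i
            count += 1
    return count

# The body runs 0 + 1 + ... + (n - 1) = n(n - 1)/2 times -> O(n^2).
assert count_inner_iterations(10) == 10 * 9 // 2
```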

Data Structure Complexities

Arrays and Lists

| Operation | Array | Dynamic Array | Linked List |
|-----------|-------|---------------|-------------|
| Access | $O(1)$ | $O(1)$ | $O(n)$ |
| Search | $O(n)$ | $O(n)$ | $O(n)$ |
| Insertion | $O(n)$ | $O(1)$ amortized | $O(1)$ |
| Deletion | $O(n)$ | $O(n)$ | $O(1)$ |

Note: linked-list insertion and deletion are $O(1)$ only given a reference to the node; finding that position first costs $O(n)$.
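The $O(1)$ amortized append comes from capacity doubling. This toy model (it tracks only capacity and copy work, not the elements; CPython's `list` uses a similar but more refined growth strategy) counts the total elements moved by resizes:

```python
class DynamicArray:
    """Toy doubling model: tracks capacity and resize copy work only."""
    def __init__(self):
        self.capacity = 1
        self.size = 0
        self.copies = 0                # total elements moved during resizes

    def append(self, _x):
        if self.size == self.capacity:
            self.copies += self.size   # copy everything to a bigger buffer
            self.capacity *= 2         # double the capacity
        self.size += 1

arr = DynamicArray()
for k in range(1024):
    arr.append(k)
print(arr.copies)  # 1023
```

Total copy work is $1 + 2 + 4 + \dots + 512 = 1023 < 2n$, so the average cost per append is a constant, even though an individual append can cost $O(n)$.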

Trees

| Tree Type | Search | Insert | Delete | Notes |
|-----------|--------|--------|--------|-------|
| Binary Search Tree | $O(h)$ | $O(h)$ | $O(h)$ | $h$ can be $O(n)$ worst case |
| AVL Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Self-balancing |
| Red-Black Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Self-balancing |
| B-Tree | $O(\log n)$ | $O(\log n)$ | $O(\log n)$ | Optimized for disk I/O |
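The $h = O(n)$ worst case for an unbalanced BST is easy to trigger: inserting keys in sorted order produces a chain. A minimal sketch:

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Unbalanced BST insert: O(h) per operation."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def height(root):
    """Height of the tree; a single node has height 0."""
    if root is None:
        return -1
    return 1 + max(height(root.left), height(root.right))

# Sorted insertions degenerate into a right-leaning chain: h = n - 1.
root = None
for key in range(100):
    root = insert(root, key)
print(height(root))  # 99
```

Self-balancing trees (AVL, red-black) exist precisely to keep $h = O(\log n)$ under any insertion order.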

Hash Tables

| Operation | Average | Worst Case |
|-----------|---------|------------|
| Search | $O(1)$ | $O(n)$ |
| Insert | $O(1)$ | $O(n)$ |
| Delete | $O(1)$ | $O(n)$ |

Load Factor: $\alpha = \frac{n}{m}$, where $n$ = number of elements and $m$ = table size.
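With separate chaining, a lookup scans one bucket's chain, and the average chain length is exactly $\alpha = n/m$, so average lookups cost $O(1 + \alpha)$. A minimal sketch (hypothetical helper; real hash tables also resize to keep $\alpha$ bounded):

```python
def build_chained_table(keys, m):
    """Insert keys into m buckets by chaining on hash(k) % m."""
    buckets = [[] for _ in range(m)]
    for k in keys:
        buckets[hash(k) % m].append(k)
    return buckets

keys = list(range(1000))
table = build_chained_table(keys, m=100)
alpha = len(keys) / len(table)   # load factor n/m = 10.0
# Average lookup scans ~alpha entries; worst case is one bucket with all n.
```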

Sorting Algorithm Complexities

| Algorithm | Best Case | Average Case | Worst Case | Space | Stable |
|-----------|-----------|--------------|------------|-------|--------|
| Bubble Sort | $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes |
| Selection Sort | $O(n^2)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | No |
| Insertion Sort | $O(n)$ | $O(n^2)$ | $O(n^2)$ | $O(1)$ | Yes |
| Merge Sort | $O(n \log n)$ | $O(n \log n)$ | $O(n \log n)$ | $O(n)$ | Yes |
| Quick Sort | $O(n \log n)$ | $O(n \log n)$ | $O(n^2)$ | $O(\log n)$ | No |
| Heap Sort | $O(n \log n)$ | $O(n \log n)$ | $O(n \log n)$ | $O(1)$ | No |

Graph Algorithm Complexities

| Algorithm | Time Complexity | Space Complexity | Use Case |
|-----------|-----------------|------------------|----------|
| BFS | $O(V + E)$ | $O(V)$ | Shortest path (unweighted) |
| DFS | $O(V + E)$ | $O(V)$ | Topological sort, cycles |
| Dijkstra | $O((V + E) \log V)$ | $O(V)$ | Shortest path (weighted) |
| Floyd-Warshall | $O(V^3)$ | $O(V^2)$ | All-pairs shortest path |
| Kruskal's MST | $O(E \log E)$ | $O(V)$ | Minimum spanning tree |
| Prim's MST | $O(E \log V)$ | $O(V)$ | Minimum spanning tree |
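BFS achieves $O(V + E)$ because each vertex enters the queue at most once and each edge is examined at most once. A minimal sketch over an adjacency-list dict:

```python
from collections import deque

def bfs_distances(adj, source):
    """Unweighted shortest-path distances in O(V + E)."""
    dist = {source: 0}
    queue = deque([source])
    while queue:
        u = queue.popleft()          # O(1) per vertex
        for v in adj[u]:             # each edge touched once overall
            if v not in dist:        # visit each vertex at most once
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

adj = {0: [1, 2], 1: [3], 2: [3], 3: []}
print(bfs_distances(adj, 0))  # {0: 0, 1: 1, 2: 1, 3: 2}
```

Using `deque` matters: `list.pop(0)` is $O(n)$ per pop and would silently degrade the whole traversal to $O(V^2)$.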

Space Complexity Analysis

Stack Space in Recursion

def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)
# Space: O(n) due to call stack
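The call-stack cost disappears if the recursion is rewritten as a loop; time stays $O(n)$ but auxiliary space drops to $O(1)$ (ignoring the size of the big-integer result itself):

```python
def factorial_iterative(n):
    """Same O(n) time as the recursive version, but O(1) call-stack space."""
    result = 1
    for k in range(2, n + 1):
        result *= k
    return result

print(factorial_iterative(5))  # 120
```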

Auxiliary Space vs Total Space

  • Auxiliary Space: Extra space used by algorithm
  • Total Space: Auxiliary space + input space

Common Space Patterns

  • In-place algorithms: $O(1)$ extra space
  • Recursive algorithms: $O(h)$, where $h$ is the recursion depth
  • Dynamic programming: often $O(n)$ or $O(n^2)$ for memoization
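Memoized Fibonacci is the classic example of trading $O(n)$ space for an exponential-to-linear time improvement: the cache holds one entry per subproblem, so naive $O(2^n)$ time becomes $O(n)$ time and $O(n)$ space.

```python
def fib_memo(n, memo=None):
    """Memoization: O(n) time and O(n) space instead of O(2^n) time."""
    if memo is None:
        memo = {}
    if n <= 1:
        return n
    if n not in memo:                 # each subproblem solved once
        memo[n] = fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]

print(fib_memo(30))  # 832040
```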

Analysis Cheat Sheet

Quick Rules

  1. Drop constants: $O(2n) = O(n)$
  2. Drop lower-order terms: $O(n^2 + n) = O(n^2)$
  3. Different inputs use different variables: $O(a + b)$, not $O(n)$
  4. Nested loops multiply: $O(n) \times O(m) = O(nm)$
  5. Consecutive loops add: $O(n) + O(m) = O(n + m)$
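Rules 3 to 5 show up together when an algorithm takes two independent inputs. A sketch (function names invented here): the nested version is $O(a \cdot b)$, while building a set first gives $O(a + b)$ time at the price of $O(b)$ space.

```python
def common_elements(a, b):
    """O(a * b): nested loops over two independent inputs."""
    return [x for x in a for y in b if x == y]

def common_elements_fast(a, b):
    """O(a + b) time: one O(b) set build, then one O(a) scan."""
    seen = set(b)                       # O(b) extra space
    return [x for x in a if x in seen]

print(common_elements_fast([1, 2, 3], [2, 3, 4]))  # [2, 3]
```

Calling either cost "$O(n^2)$" or "$O(n)$" would be wrong unless $a$ and $b$ are known to have the same size.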

Common Pitfalls

  • Confusing average vs worst case
  • Ignoring space complexity
  • Not considering different input sizes
  • Assuming best-case scenarios

Interview Tips

  1. Always clarify: What are we optimizing for? Time or space?
  2. State assumptions: Input size, data distribution, constraints
  3. Analyze all cases: Best, average, worst
  4. Consider trade-offs: Time vs space complexity
  5. Verify with examples: Test your analysis with small inputs

Practical Examples

Example 1: Two Sum

def two_sum_naive(nums, target):
    for i in range(len(nums)):           # O(n)
        for j in range(i+1, len(nums)):  # O(n)
            if nums[i] + nums[j] == target:
                return [i, j]
    return []
# Time: O(n²), Space: O(1)

def two_sum_optimal(nums, target):
    seen = {}                            # O(n) space
    for i, num in enumerate(nums):       # O(n) time
        complement = target - num
        if complement in seen:           # O(1) average
            return [seen[complement], i]
        seen[num] = i                    # O(1) average
    return []
# Time: O(n), Space: O(n)

Example 2: Finding Duplicates

def has_duplicate_sort(nums):
    nums.sort()                     # O(n log n)
    for i in range(1, len(nums)):   # O(n)
        if nums[i] == nums[i-1]:
            return True
    return False
# Time: O(n log n), Space: O(1)

def has_duplicate_set(nums):
    seen = set()                    # O(n) space
    for num in nums:                # O(n) time
        if num in seen:             # O(1) average
            return True
        seen.add(num)               # O(1) average
    return False
# Time: O(n), Space: O(n)

Mathematical Foundations

Logarithm Properties

  • $\log_2 n$: binary logarithm (the default in CS)
  • $\log(ab) = \log a + \log b$
  • $\log(a^k) = k \log a$
  • $\log_b n = \Theta(\log n)$ for any constant base $b > 1$: the base doesn't matter for Big O
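The base-doesn't-matter rule follows from the change-of-base identity $\log_b n = \log_2 n / \log_2 b$: switching bases multiplies by a constant. A quick numerical check:

```python
import math

# log_2(n) / log_10(n) equals log_2(10) for every n > 1:
# changing the base only rescales by a constant factor.
n = 1_000_000
ratio = math.log2(n) / math.log10(n)
print(round(ratio, 4))  # 3.3219, i.e. log2(10), independent of n
```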

Summation Formulas

  • $\sum_{i=1}^{n} i = \frac{n(n+1)}{2} = O(n^2)$
  • $\sum_{i=1}^{n} i^2 = \frac{n(n+1)(2n+1)}{6} = O(n^3)$
  • $\sum_{i=0}^{n} 2^i = 2^{n+1} - 1 = O(2^n)$
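All three identities are exact, so they can be verified directly for any concrete $n$:

```python
n = 50

# Triangular numbers: sum of the first n integers.
assert sum(range(1, n + 1)) == n * (n + 1) // 2

# Sum of the first n squares.
assert sum(i * i for i in range(1, n + 1)) == n * (n + 1) * (2 * n + 1) // 6

# Geometric series of powers of two.
assert sum(2 ** i for i in range(n + 1)) == 2 ** (n + 1) - 1
```

These are the sums behind dependent loops ($O(n^2)$), triple nesting ($O(n^3)$), and exhaustive subset enumeration ($O(2^n)$).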

Quick Reference Summary

Remember: Big O analysis helps us understand how algorithms scale. Focus on the growth rate for large inputs, not exact operation counts.

Golden Rule: When in doubt, trace through your algorithm with increasing input sizes and observe the pattern!
