ALGORITHM ANALYSIS

Complexity Analysis Reference

Essential reference for algorithm complexity analysis including time and space complexity, best/worst/average cases, and practical analysis techniques.

Tags: complexity-analysis, time-complexity, space-complexity, big-o, algorithm-analysis

Cheat Sheet Overview

Level: Intermediate · Reading time: 20 minutes · 8 sections · Category: algorithm-analysis

Complexity Analysis & Big O

🎯 Complexity Hierarchy (Fast → Slow)

Common Complexity Classes

| Notation | Name | Example | n=10 | n=100 | n=1,000 | n=10,000 |
|---|---|---|---|---|---|---|
| O(1) | Constant | Hash lookup | 1 | 1 | 1 | 1 |
| O(log n) | Logarithmic | Binary search | 3 | 7 | 10 | 13 |
| O(n) | Linear | Array scan | 10 | 100 | 1K | 10K |
| O(n log n) | Linearithmic | Merge sort | 30 | 700 | 10K | 130K |
| O(n²) | Quadratic | Bubble sort | 100 | 10K | 1M | 100M |
| O(n³) | Cubic | Floyd-Warshall | 1K | 1M | 1B | 1T |
| O(2ⁿ) | Exponential | Subset enumeration | 1K | 10³⁰ | 10³⁰⁰ | 10³⁰⁰⁰ |
| O(n!) | Factorial | Brute-force TSP | 3.6M | 10¹⁵⁷ | 10²⁵⁶⁷ | 10³⁵⁶⁵⁹ |
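The table's operation counts can be reproduced directly. A minimal sketch (the `growth` function name is illustrative, not from the source):

```python
import math

def growth(n):
    """Approximate operation counts for common complexity classes at input size n."""
    return {
        "O(1)": 1,
        "O(log n)": math.log2(n),
        "O(n)": n,
        "O(n log n)": n * math.log2(n),
        "O(n^2)": n ** 2,
        "O(2^n)": 2 ** n,  # Python integers are arbitrary precision, so this stays exact
    }

for n in (10, 100, 1000):
    print(n, {k: f"{v:.3g}" for k, v in growth(n).items()})
```

Rounding these values recovers the table entries, e.g. log₂(1,000) ≈ 10 and 1,000 · log₂(1,000) ≈ 10K.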

⏱️ Practical Time Limits (1 Second Execution)

Maximum Input Size by Complexity

| Time Complexity | Max n (1 sec) | Max n (1 min) | Algorithm Examples |
|---|---|---|---|
| O(1) | effectively unlimited | effectively unlimited | Hash table operations, array access |
| O(log n) | ~10¹⁸ | ~10¹⁸ | Binary search, balanced BST operations |
| O(n) | ~10⁸ | ~6×10⁹ | Linear search, single loop |
| O(n log n) | ~10⁶ | ~10⁷ | Merge sort, heap sort, efficient algorithms |
| O(n²) | ~10⁴ | ~2.5×10⁵ | Bubble sort, simple DP, nested loops |
| O(n³) | ~500 | ~4×10³ | Floyd-Warshall, matrix multiplication |
| O(2ⁿ) | ~25 | ~35 | Subset enumeration, some backtracking |
| O(n!) | ~12 | ~15 | Brute-force permutations, TSP |
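These limits follow from the common rule of thumb that a machine executes roughly 10⁸ simple operations per second. A quick feasibility check under that assumption (function name and budget are illustrative):

```python
# Rough feasibility check, assuming ~10**8 simple operations per second.
OPS_PER_SECOND = 10 ** 8

def is_feasible(ops, seconds=1.0):
    """Return True if `ops` operations fit the given time budget."""
    return ops <= OPS_PER_SECOND * seconds

n = 10 ** 4
print(is_feasible(n * n))    # O(n^2) at n = 10^4: 10^8 ops, borderline
print(is_feasible(n ** 3))   # O(n^3) at n = 10^4: far over budget
```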

📊 Algorithm Complexity Reference

Sorting Algorithms

| Algorithm | Best | Average | Worst | Space | Stable | In-Place |
|---|---|---|---|---|---|---|
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes |
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes |
| Radix Sort | O(d·n) | O(d·n) | O(d·n) | O(n+k) | Yes | No |
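Merge sort illustrates several table columns at once: O(n log n) in every case, O(n) auxiliary space, and stability. A minimal sketch:

```python
def merge_sort(a):
    """Stable O(n log n) merge sort using O(n) auxiliary space."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left = merge_sort(a[:mid])
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:   # <= keeps equal keys in original order (stability)
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 2, 1]))  # [1, 2, 2, 5, 9]
```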

Search Algorithms

| Algorithm | Data Structure | Time | Space | Prerequisites |
|---|---|---|---|---|
| Linear Search | Unsorted array | O(n) | O(1) | None |
| Binary Search | Sorted array | O(log n) | O(1) | Sorted data |
| Hash Table Lookup | Hash table | O(1) avg | O(n) | Good hash function |
| BST Search | Balanced BST | O(log n) | O(1) | Balanced tree |
| BFS | Graph | O(V + E) | O(V) | Graph representation |
| DFS | Graph | O(V + E) | O(V) | Graph representation |
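Binary search is the canonical O(log n) / O(1)-space entry in this table; the iterative form below halves the search range each step:

```python
def binary_search(arr, target):
    """O(log n) search in a sorted array; returns an index or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if arr[mid] == target:
            return mid
        if arr[mid] < target:
            lo = mid + 1      # discard the left half
        else:
            hi = mid - 1      # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))   # 3
print(binary_search([1, 3, 5, 7, 9], 4))   # -1
```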

Dynamic Programming Problems

| Problem | Time Complexity | Space Complexity | Optimization Possible |
|---|---|---|---|
| Fibonacci | O(n) | O(n) → O(1) | Rolling variables |
| LCS (2 strings) | O(nm) | O(nm) → O(min(n,m)) | Rolling array |
| Edit Distance | O(nm) | O(nm) → O(min(n,m)) | Rolling array |
| Knapsack 0-1 | O(nW) | O(nW) → O(W) | 1D array |
| Coin Change | O(nS) | O(nS) → O(S) | 1D array |
| Matrix Chain | O(n³) | O(n²) | Knuth optimization possible |
| Optimal BST | O(n³) | O(n²) | Knuth optimization possible |
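The rolling-array idea from the LCS row can be sketched as follows: each DP row depends only on the previous one, so O(nm) space drops to O(min(n, m)) (the `lcs_length` name is illustrative):

```python
def lcs_length(s, t):
    """LCS length in O(nm) time and O(min(n, m)) space via a rolling row."""
    if len(s) < len(t):
        s, t = t, s                      # keep t as the shorter string
    prev = [0] * (len(t) + 1)            # DP row for the previous prefix of s
    for ch in s:
        curr = [0] * (len(t) + 1)
        for j, cj in enumerate(t, 1):
            curr[j] = prev[j - 1] + 1 if ch == cj else max(prev[j], curr[j - 1])
        prev = curr
    return prev[-1]

print(lcs_length("ABCBDAB", "BDCABA"))  # 4
```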

Graph Algorithms

| Algorithm | Time Complexity | Space Complexity | Use Case |
|---|---|---|---|
| BFS | O(V + E) | O(V) | Unweighted shortest path |
| DFS | O(V + E) | O(V) | Graph traversal, connectivity |
| Dijkstra's | O((V + E) log V) | O(V) | Non-negative weighted shortest path |
| Bellman-Ford | O(VE) | O(V) | Negative weights, cycle detection |
| Floyd-Warshall | O(V³) | O(V²) | All-pairs shortest paths |
| Kruskal's MST | O(E log E) | O(V) | Minimum spanning tree |
| Prim's MST | O((V + E) log V) | O(V) | Minimum spanning tree |
| Topological Sort | O(V + E) | O(V) | DAG ordering |
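The O((V + E) log V) bound for Dijkstra's algorithm comes from pushing at most one heap entry per edge relaxation. A minimal binary-heap sketch (the graph format and names are illustrative):

```python
import heapq

def dijkstra(graph, src):
    """O((V + E) log V) shortest paths; graph maps node -> list of (neighbor, weight)."""
    dist = {src: 0}
    pq = [(0, src)]                      # min-heap of (distance, node)
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist[u]:
            continue                     # stale entry superseded by a shorter path
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

g = {"a": [("b", 1), ("c", 4)], "b": [("c", 2)], "c": []}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 1, 'c': 3}
```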

🧮 Complexity Analysis Techniques

Master Theorem for Divide & Conquer

For recurrences of the form T(n) = aT(n/b) + f(n), with a ≥ 1 and b > 1:

| Case | Condition | Solution | Example |
|---|---|---|---|
| Case 1 | f(n) = O(n^c) where c < log_b(a) | T(n) = Θ(n^(log_b a)) | T(n) = 8T(n/2) + n² → Θ(n³) |
| Case 2 | f(n) = Θ(n^c logᵏ n) where c = log_b(a) | T(n) = Θ(n^c log^(k+1) n) | T(n) = 2T(n/2) + n → Θ(n log n) |
| Case 3 | f(n) = Ω(n^c) where c > log_b(a) | T(n) = Θ(f(n)) | T(n) = 2T(n/2) + n² → Θ(n²) |

Case 3 additionally requires the regularity condition a·f(n/b) ≤ c·f(n) for some constant c < 1.
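As a sanity check on Case 2, the merge-sort recurrence can be expanded directly and compared against n log₂ n (names here are illustrative, not from the source):

```python
import math

def T(n):
    """Total work for T(n) = 2T(n/2) + n with T(1) = 1 (Master theorem Case 2)."""
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^k this expands to n * (log2(n) + 1), i.e. Theta(n log n).
for n in (2 ** 10, 2 ** 14):
    print(n, T(n), round(n * math.log2(n)))
```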

Common Recurrence Patterns

| Recurrence | Solution | Algorithm Example |
|---|---|---|
| T(n) = T(n-1) + O(1) | O(n) | Linear search, factorial |
| T(n) = T(n-1) + O(n) | O(n²) | Selection sort, insertion sort |
| T(n) = 2T(n/2) + O(1) | O(n) | Binary tree traversal |
| T(n) = 2T(n/2) + O(n) | O(n log n) | Merge sort, quick sort (avg) |
| T(n) = T(n/2) + O(1) | O(log n) | Binary search |
| T(n) = 2T(n-1) + O(1) | O(2ⁿ) | Naive Fibonacci, subset enumeration |
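The exponential last row can be observed empirically by counting calls in the naive Fibonacci recursion (the instrumented helper below is illustrative):

```python
def fib_calls(n):
    """Return (fib(n), total recursive calls) for the naive double recursion."""
    if n <= 1:
        return n, 1
    a, ca = fib_calls(n - 1)
    b, cb = fib_calls(n - 2)
    return a + b, 1 + ca + cb

for n in (10, 20, 25):
    value, calls = fib_calls(n)
    print(n, value, calls)  # the call count grows exponentially in n
```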

📝 Loop Analysis Quick Rules

# Single loop over n elements
for i in range(n):          # O(n)
    pass                    # constant work per iteration

# Nested loops
for i in range(n):          # O(n²) total
    for j in range(n):
        pass                # constant work

# Loop with halving
while n > 1:                # O(log n)
    n = n // 2              # range shrinks by half each iteration

# Loop with doubling
i = 1
while i < n:                # O(log n)
    i = i * 2               # counter doubles each iteration

# Dependent nested loops: 0 + 1 + ... + (n-1) = n(n-1)/2 iterations
for i in range(n):          # O(n²) total
    for j in range(i):      # inner bound depends on i
        pass                # constant work
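The dependent-loop rule can be verified by counting iterations: the total is the triangular number n(n−1)/2, which is still Θ(n²). A quick check (function name is illustrative):

```python
def dependent_loop_count(n):
    """Count iterations of: for i in range(n): for j in range(i)."""
    count = 0
    for i in range(n):
        for j in range(i):
            count += 1
    return count

n = 100
print(dependent_loop_count(n), n * (n - 1) // 2)  # both 4950
```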

💾 Space Complexity Analysis

Space Complexity Categories

| Space Usage | Notation | Example | Notes |
|---|---|---|---|
| Constant | O(1) | Variables, pointers | Size doesn't grow with input |
| Logarithmic | O(log n) | Recursion stack (balanced) | Recursive calls on halved input |
| Linear | O(n) | Arrays, lists | Proportional to input size |
| Quadratic | O(n²) | 2D arrays, matrices | Nested data structures |

Common Space Optimizations

# Space-optimized Fibonacci
def fibonacci_optimized(n):
    if n <= 1: return n
    prev, curr = 0, 1
    for i in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr
# Space: O(1) instead of O(n)

# Space-optimized DP (rolling array)
def knapsack_optimized(weights, values, capacity):
    dp = [0] * (capacity + 1)
    for i in range(len(weights)):
        for w in range(capacity, weights[i] - 1, -1):
            dp[w] = max(dp[w], dp[w - weights[i]] + values[i])
    return dp[capacity]
# Space: O(W) instead of O(nW)

🚀 Performance Tips

When to Choose Different Complexities

  • O(1): Hash tables for frequent lookups
  • O(log n): Binary search in sorted data
  • O(n): Single pass algorithms, unavoidable for reading input
  • O(n log n): Efficient sorting, divide-and-conquer
  • O(n²): Small datasets, simple implementations
  • O(2ⁿ): Small datasets, exhaustive search needed
  • O(n!): Very small datasets, permutation problems

Red Flags in Complexity

⚠️ Avoid these patterns for large inputs:

  • Nested loops without early termination
  • Recursive calls without memoization (repeated subproblems)
  • Repeated string concatenation in loops (can be O(n²) total)
  • Frequent array resizing (prefer preallocation or amortized growth)
  • Sorting inside an inner loop (hoist the sort out)
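The string-concatenation red flag deserves a concrete fix: building a string with repeated `+=` can copy the whole prefix on every append (CPython sometimes optimizes this, but it isn't guaranteed), while `''.join` makes a single linear pass. A minimal before/after sketch:

```python
def concat_loop(parts):
    """Accumulate with += ; each append may copy the whole prefix -> O(n^2) worst case."""
    s = ""
    for p in parts:
        s += p
    return s

def concat_join(parts):
    """Single O(n) pass over all pieces."""
    return "".join(parts)

parts = ["ab"] * 1000
assert concat_loop(parts) == concat_join(parts)  # same result, different growth rate
```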
