🌍
Why does this topic matter?
Every significant algorithm in the second half of this course — Trees, Graphs, Dynamic Programming, Divide & Conquer — is built on top of recursion. If recursion feels magical and confusing, those topics will feel impossible. But here's the secret: recursion is just a function that trusts itself to solve a smaller version of the same problem. Once that mental click happens, it unlocks everything else. Backtracking is equally powerful — it is the engine behind Sudoku solvers, chess engines, and constraint-satisfaction systems.

📖 Before We Start — The Big Picture

Imagine you're asked to open the smallest of 5 nested Matryoshka dolls. You don't think about all 5 at once — you open the outermost one, set it aside, and repeat the same action. Eventually you reach the smallest doll (the base case) — it has nothing inside. Then you stop. That's recursion.

The call stack is the "pile of dolls you set aside." Each time your function calls itself, a new stack frame is added to the top of the pile. When the base case is reached, frames are removed one by one, in the reverse order they were added (LIFO — Last In, First Out). StackOverflowError means you never found the smallest doll — you kept opening forever.

Backtracking extends this: after opening a doll, if you discover it's not what you're looking for, you put it back and try a different one. This "try → explore → undo" pattern is used in every constraint-satisfaction problem: N-Queens, Sudoku, Word Search.

🔗
How it connects: Lecture 4 (Java 8+) introduced functional thinking — input → output, no side effects. Recursion is the same mindset applied to algorithms. Lectures 9 (Sorting — Merge Sort, Quick Sort) and 15 (Trees — every traversal) are almost entirely built on recursive thinking mastered here.

🧠 The Mental Model — Think, Trust, Induct

💡
Golden Rule for Recursion:
  1. Think: What is the smallest version of this problem I can solve?
  2. Trust: Assume the recursive call WORKS correctly for smaller input.
  3. Induct: Use that trusted result to build the answer for the current input.

Never try to trace the full recursion tree in your head. Trust the math.

Recursion is a function calling itself on a smaller version of the same problem, until it hits a base case (the simplest possible answer). It is the backbone of Trees, Graphs, DP, Divide & Conquer, and Backtracking — mastering it here pays dividends for the rest of the course.

⚙️ How Recursion Works — The Anatomy

Every recursive function has exactly two parts:

Part           | Purpose                                                              | Consequence if missing
Base Case      | The terminating condition — returns without a recursive call         | Infinite recursion → StackOverflowError
Recursive Case | Calls itself with a strictly simpler input (smaller, shorter, fewer) | Infinite recursion if input never shrinks
☕ Java
// Template: every recursive function follows this skeleton
returnType solve(params) {

    // ① BASE CASE — stop condition
    if (simplest problem) {
        return known_answer;
    }

    // ② RECURSIVE CASE — shrink & trust
    smaller = shrink(params);           // make problem smaller
    subResult = solve(smaller);       // TRUST this works ✓
    return combine(subResult, current); // build answer from sub-result
}

Concrete Example: Factorial

☕ Java
int factorial(int n) {
    // BASE CASE: 0! = 1  (we know this cold)
    if (n == 0) return 1;

    // TRUST: factorial(n-1) gives (n-1)! correctly
    // INDUCT: n! = n × (n-1)!
    return n * factorial(n - 1);
}
factorial(4)
├── calls factorial(3)
│   ├── calls factorial(2)
│   │   ├── calls factorial(1)
│   │   │   ├── calls factorial(0)
│   │   │   │   └── returns 1 ← BASE CASE
│   │   │   └── returns 1 × 1 = 1
│   │   └── returns 2 × 1 = 2
│   └── returns 3 × 2 = 6
└── returns 4 × 6 = 24 ← FINAL ANSWER
Time: O(n) · Space: O(n) call stack

📚 The Call Stack — What Actually Happens in Memory

Each recursive call pushes a new stack frame onto the call stack. It holds: local variables, parameters, and the return address. When a base case is reached, frames are popped off in LIFO order.

⚠️
StackOverflowError — The JVM has a default stack size (~512KB–1MB). Deep recursion without a base case (or with n ≈ 10,000+) will crash. For very deep recursion, consider: (a) iterative rewrite, (b) tail recursion (compiler optimization), or (c) increasing JVM stack size with -Xss.
Stack for factorial(3)
factorial(0) → 1    ← TOP (pushed last, popped first) · BASE
factorial(1) → 1×1
factorial(2) → 2×1
factorial(3) → 3×2  ← BOTTOM (pushed first)
↑ Each box = one stack frame in memory
Space = depth of recursion tree
factorial(n) → depth n → O(n) space
Binary search → depth log n → O(log n) space
Fibonacci (naive) → depth n → O(n) space
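
The iterative rewrite mentioned in the warning above is often the simplest fix: a loop keeps stack space at O(1) no matter how large n gets. A minimal sketch (the method name `factorialIterative` is ours, not from the lecture):

```java
public class IterativeFactorial {
    // Same math as recursive factorial, but no stack frames pile up:
    // an accumulator variable replaces the chain of pending multiplications.
    static long factorialIterative(int n) {
        long result = 1;
        for (int i = 2; i <= n; i++) {
            result *= i;
        }
        return result;
    }
}
```

A call like `factorialIterative(100_000)` would overflow `long` arithmetically, but it can never throw StackOverflowError.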

🛑 Base Cases — The Foundation of Correctness

Types of Base Cases

Type                    | When to use                                | Example
Empty / null check      | Working on arrays, strings, linked lists   | if (arr.length == 0) return;
Single element          | Arrays or lists that bottom out at size 1  | if (lo == hi) return arr[lo];
Index out of bounds     | Grid/string traversal problems             | if (i < 0 || i >= n) return 0;
Counter reaches 0       | Countdown problems, power, multiply        | if (exp == 0) return 1;
Target found / exceeded | Search, sum, path problems                 | if (target == 0) return true;
⚠️
Common Pitfall — Missing edge in base case:
For fibonacci(n), beginners write if (n == 0) return 0; but forget if (n == 1) return 1;. Without it, fib(1) calls fib(-1), which recurses forever — n only moves further from the base case. Always ask: what are ALL the trivial inputs?

🌿 Types of Recursion

1. Linear Recursion — One call per frame

Makes exactly one recursive call per invocation. The recursion tree is a straight line (depth = n).

☕ Java · Linear Recursion
// Sum of array: sum(arr, n) = arr[n-1] + sum(arr, n-1)
int arraySum(int[] arr, int n) {
    if (n == 0) return 0;                    // base case
    return arr[n-1] + arraySum(arr, n-1);   // linear: ONE call
}

// String reversal: reverse(s) = reverse(rest of s) + first char
String reverse(String s) {
    if (s.length() <= 1) return s;
    return reverse(s.substring(1)) + s.charAt(0);
}

2. Tail Recursion — Call is the LAST thing

The recursive call is the very last operation — no work done after it returns. Some languages/compilers optimize this to avoid stack growth (tail-call optimization). Java does NOT do TCO by default, but the pattern is worth knowing.

☕ Java · Tail Recursion
// NOT tail recursive: multiply happens AFTER the call
int factorial(int n) {
    if (n == 0) return 1;
    return n * factorial(n-1);  // multiply happens after return!
}

// Tail recursive: accumulator carries the result
int factTail(int n, int acc) {
    if (n == 0) return acc;              // acc has the full answer
    return factTail(n-1, n * acc);      // call IS the last operation
}
// Call: factTail(5, 1)
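
Because the tail call is the last operation, it can be converted mechanically into a loop — which is exactly what TCO-capable compilers do. A sketch of that translation (our own, mirroring `factTail` above):

```java
public class FactTailLoop {
    // Loop form of factTail: the accumulator parameter becomes a local variable,
    // and each "recursive call" becomes one loop iteration with updated n and acc.
    static int factLoop(int n, int acc) {
        while (n != 0) {      // base-case test becomes the loop condition
            acc = n * acc;    // the new argument values of the would-be tail call
            n = n - 1;
        }
        return acc;           // base case returns the accumulator
    }
}
```

`factLoop(5, 1)` returns 120, matching `factTail(5, 1)` — same computation, zero stack growth.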

3. Tree Recursion — Multiple calls per frame

Makes more than one recursive call. The call tree branches — this is what creates exponential complexity in naive implementations.

☕ Java · Tree Recursion
// Fibonacci: TWO recursive calls → tree recursion
int fib(int n) {
    if (n <= 1) return n;                 // base cases: fib(0)=0, fib(1)=1
    return fib(n-1) + fib(n-2);         // TWO calls → branching!
}
// Time: O(2ⁿ) — exponential! Same subproblems recomputed.
// Fix: Memoization (DP Lecture) brings it to O(n).
fib(4)
├── fib(3)
│   ├── fib(2)
│   │   ├── fib(1) → 1
│   │   └── fib(0) → 0
│   └── fib(1) → 1
└── fib(2)            ← fib(2) computed AGAIN! Wasteful.
    ├── fib(1) → 1
    └── fib(0) → 0
Total nodes: 9 for fib(4). For fib(40) → ~330 MILLION calls!

4. Mutual Recursion — Functions calling each other

☕ Java · Mutual Recursion
// is n even? → if n==0: yes. else ask: is (n-1) odd?
boolean isEven(int n) {
    if (n == 0) return true;
    return isOdd(n - 1);   // delegates to isOdd!
}

boolean isOdd(int n) {
    if (n == 0) return false;
    return isEven(n - 1);  // delegates back to isEven!
}

🚀 Fixing Tree Recursion: Memoization

💡
Memoization = Recursion + Caching. The key insight: in tree recursion, the same subproblems are solved repeatedly. Cache the answer the first time, return it from cache thereafter. This turns O(2ⁿ) → O(n).
☕ Java · Memoization Pattern
// Without memoization: O(2ⁿ) — recomputes fib(2), fib(3) many times
int fib(int n) {
    if (n <= 1) return n;
    return fib(n-1) + fib(n-2);   // O(2ⁿ) — exponential
}

// WITH memoization: O(n) — each n solved exactly once
int[] memo = new int[n+1];  // -1 = not computed yet
Arrays.fill(memo, -1);

int fibMemo(int n, int[] memo) {
    if (n <= 1) return n;
    if (memo[n] != -1) return memo[n]; // ✔ CACHE HIT — skip recomputation
    memo[n] = fibMemo(n-1, memo) + fibMemo(n-2, memo);
    return memo[n];                      // store before returning
}

// For non-integer keys, use HashMap:
Map<String,Integer> cache = new HashMap<>();
int solve(String state) {
    if (cache.containsKey(state)) return cache.get(state);
    // ... compute result ...
    cache.put(state, result);
    return result;
}
Approach                  | Time           | Space              | When to use
Naive recursion           | O(2ⁿ) or worse | O(n) stack         | Understanding / tiny inputs
Memoization (top-down DP) | O(n × states)  | O(n) stack + cache | Overlapping subproblems, natural recursion structure
Tabulation (bottom-up DP) | O(n × states)  | O(n) array         | No stack overflow risk, slightly faster in practice
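
For comparison with the memoized version, here is a bottom-up tabulation sketch for Fibonacci — the third row of the table (`fibTab` is our name):

```java
public class FibTab {
    // Bottom-up DP: fill the table from the base cases upward.
    // No recursion at all, so no stack-overflow risk;
    // each entry is computed exactly once → O(n) time, O(n) array.
    static int fibTab(int n) {
        if (n <= 1) return n;              // base cases: fib(0)=0, fib(1)=1
        int[] dp = new int[n + 1];
        dp[0] = 0;
        dp[1] = 1;
        for (int i = 2; i <= n; i++) {
            dp[i] = dp[i - 1] + dp[i - 2]; // same recurrence, iterated forward
        }
        return dp[n];
    }
}
```

Note the space can be squeezed to O(1) by keeping only the last two values instead of the whole array.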

🔷 Core Recursion Patterns

🎯
Pattern Recognition Guide
These patterns appear in 80% of recursion interview questions. Learn to identify which one applies before writing code.

Pattern 1 — Parameterized Recursion (carry state forward)

Pass extra parameters to carry accumulated state downward through the call tree. The base case uses the accumulated state as the answer.

☕ Java · Parameterized
// Print 1 to N using recursion (no return value — side effects)
void print1toN(int i, int n) {
    if (i > n) return;      // base case: done
    System.out.println(i);  // print BEFORE call → 1,2,3,...,n
    print1toN(i + 1, n);    // carry state: i increases each call
}
// Call: print1toN(1, n)

// Print N to 1 — just move print AFTER call!
void printNto1(int i, int n) {
    if (i > n) return;
    printNto1(i + 1, n);    // go to base first
    System.out.println(i);  // print on the WAY BACK → n,n-1,...,1
}
// KEY INSIGHT: Code BEFORE recursive call = top-down (going in)
//              Code AFTER recursive call  = bottom-up (coming back)
🔑
The Before/After Insight — Critical for Trees!
Code written before the recursive call executes while going deeper (pre-order).
Code written after the recursive call executes while returning (post-order).
This single insight explains Pre/In/Post-order tree traversals entirely.

Pattern 2 — Functional Recursion (return the answer)

The function returns a value that is built up from smaller sub-answers. Classic mathematical recursion style.

☕ Java · Functional
// Power function: x^n
double power(double x, int n) {
    if (n == 0) return 1;                       // x^0 = 1
    if (n < 0)  return power(1/x, -n);          // handle negatives
    // Naive: O(n) — multiply x n times
    return x * power(x, n-1);                   // O(n) time
}

// Optimized: Fast Power (Binary Exponentiation) — O(log n)
double fastPower(double x, long n) {
    if (n == 0) return 1;
    if (n < 0)  return fastPower(1/x, -n);
    double half = fastPower(x, n/2);            // compute HALF, not n-1!
    if (n % 2 == 0) return half * half;          // even: x^n = (x^(n/2))²
    else            return x * half * half;       // odd:  x^n = x*(x^(n/2))²
}
// fastPower(2, 10): 10→5→2→1→0  (only 4 levels deep vs 10)

Pattern 3 — Subsets / Subsequences (include or exclude)

⭐ The Include/Exclude Pattern

At each element, make a binary decision: include it or exclude it.
This generates all 2ⁿ subsets of an n-element array.
Template: solve(index, currentList) → two calls: include arr[index] or skip it.

☕ Java · Subsets / Power Set
void subsets(int[] arr, int idx, List<Integer> current, List<List<Integer>> result) {
    // BASE CASE: processed all elements → add current subset to result
    if (idx == arr.length) {
        result.add(new ArrayList<>(current)); // ⚠️ add a COPY, not reference!
        return;
    }

    // CHOICE 1: INCLUDE arr[idx]
    current.add(arr[idx]);
    subsets(arr, idx + 1, current, result);

    // CHOICE 2: EXCLUDE arr[idx] (backtrack — undo the add)
    current.remove(current.size() - 1);   // ← THIS IS BACKTRACKING
    subsets(arr, idx + 1, current, result);
}
// For arr=[1,2,3] (include branch first): [1,2,3], [1,2], [1,3], [1], [2,3], [2], [3], []
// Total 2³ = 8 subsets. Time: O(2ⁿ × n), Space: O(n) call stack
subsets([1,2], idx=0, [])
├── Include 1: ([1], idx=1)
│   ├── Include 2: ([1,2], idx=2) → ADD [1,2]
│   └── Exclude 2: ([1], idx=2)   → ADD [1]
└── Exclude 1: ([], idx=1)
    ├── Include 2: ([2], idx=2)   → ADD [2]
    └── Exclude 2: ([], idx=2)    → ADD []

Pattern 4 — Permutations (arrange all elements)

☕ Java · Permutations
void permutations(String s, String chosen, List<String> result) {
    if (s.isEmpty()) {
        result.add(chosen); // base case: all chars placed
        return;
    }
    for (int i = 0; i < s.length(); i++) {
        char c = s.charAt(i);
        // pick character c, remove from remaining s
        permutations(s.substring(0,i) + s.substring(i+1), chosen + c, result);
    }
}
// "ABC" → 3! = 6 permutations: ABC, ACB, BAC, BCA, CAB, CBA
// Time: O(n! × n) — n! permutations, O(n) string work each. Space: O(n) recursion depth

// ─── Efficient version using swap (in-place, avoids String concat) ───
void permuteSwap(char[] arr, int start, List<String> res) {
    if (start == arr.length) { res.add(new String(arr)); return; }
    for (int i = start; i < arr.length; i++) {
        swap(arr, start, i);              // place arr[i] at position start
        permuteSwap(arr, start + 1, res); // fix start, permute rest
        swap(arr, start, i);              // BACKTRACK: restore
    }
}

void swap(char[] a, int i, int j) { char t = a[i]; a[i] = a[j]; a[j] = t; }

↩️ Backtracking

Technique           | What it does                                            | Undo step? | Typical use
Pure DFS            | Explore every path to completion                        | No         | Reachability, path existence
Backtracking        | Explore a choice, then undo it before trying the next   | Yes        | Generate all solutions (subsets, perms, combos)
Pruned Backtracking | Same + skip branches that can't lead to a valid answer  | Yes        | N-Queens, Sudoku, Combination Sum with target
📌
Backtracking = Recursion + Undo
It is a systematic way to try all possibilities by making a choice, exploring, then undoing that choice before trying the next one. Think of it as a depth-first search over a decision tree, pruning branches that can't lead to a valid solution.

The Universal Backtracking Framework

☕ Java · Universal Backtracking Template
void backtrack(int idx, List<Integer> current, List<List<Integer>> result, int[] input) {

    // ① Is the current state a valid complete solution?
    if (isSolution(idx, current)) {
        result.add(new ArrayList<>(current)); // ✓ collect it
        return;
    }

    // ② Try all choices at this decision point
    for (int i = idx; i < input.length; i++) {

        // ③ PRUNE: skip impossible choices early
        if (!isValid(input[i], current)) continue;

        // ④ MAKE the choice
        current.add(input[i]);

        // ⑤ EXPLORE deeper with this choice
        backtrack(i + 1, current, result, input);

        // ⑥ UNDO the choice (backtrack)
        current.remove(current.size() - 1);
    }
}

Pruning — The Key to Efficiency

Without pruning, backtracking explores every branch. Pruning cuts off branches that cannot possibly lead to a valid solution, often making the algorithm orders of magnitude faster in practice.

Problem          | Pruning Condition                            | Effect
N-Queens         | Skip columns/diagonals already attacked      | Drastically reduces tree size
Combination Sum  | Skip if remaining sum < 0                    | No point going deeper
Sudoku           | Skip if digit already in row/col/box         | 99%+ of branches pruned
Word Search      | Skip if grid cell visited or out of bounds   | Prevents revisiting cells
Subsets (no dup) | Sort + skip duplicate elements at same level | Avoids duplicate subsets
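
As a worked instance of the Combination Sum row, here is a sketch of the universal template with its pruning line added (LC #39 style, reuse allowed; the method name `combine` is ours):

```java
import java.util.ArrayList;
import java.util.List;

public class CombinationSum {
    // Backtracking with pruning: a branch is abandoned the moment
    // a candidate cannot fit in the remaining sum.
    static void combine(int[] candidates, int start, int remaining,
                        List<Integer> current, List<List<Integer>> result) {
        if (remaining == 0) {                        // valid complete solution
            result.add(new ArrayList<>(current));    // collect a COPY
            return;
        }
        for (int i = start; i < candidates.length; i++) {
            if (candidates[i] > remaining) continue; // PRUNE: would overshoot
            current.add(candidates[i]);              // MAKE the choice
            combine(candidates, i, remaining - candidates[i],
                    current, result);                // i, not i+1 → reuse allowed
            current.remove(current.size() - 1);      // UNDO (backtrack)
        }
    }
}
```

For candidates [2,3,6,7] and target 7 this yields [2,2,3] and [7] — every other branch is cut off by the prune before it can recurse.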
📖 Deep Dive: N-Queens (Classic Backtracking)

Place N queens on an N×N chessboard so no two queens attack each other. A queen attacks any piece in the same row, column, or diagonal.

☕ Java · N-Queens
void nQueens(int row, int n, char[][] board, List<List<String>> result) {
    if (row == n) {
        result.add(buildBoard(board)); // all rows filled ✓
        return;
    }
    for (int col = 0; col < n; col++) {
        if (isSafe(row, col, board, n)) { // PRUNE: is this cell safe?
            board[row][col] = 'Q';         // PLACE queen
            nQueens(row + 1, n, board, result); // explore next row
            board[row][col] = '.';         // REMOVE queen (backtrack)
        }
    }
}

boolean isSafe(int row, int col, char[][] board, int n) {
    // Check column above
    for (int r = 0; r < row; r++)
        if (board[r][col] == 'Q') return false;
    // Check upper-left diagonal
    for (int r = row-1, c = col-1; r >= 0 && c >= 0; r--, c--)
        if (board[r][c] == 'Q') return false;
    // Check upper-right diagonal
    for (int r = row-1, c = col+1; r >= 0 && c < n; r--, c++)
        if (board[r][c] == 'Q') return false;
    return true;
}
// Time: O(N!) worst case, but pruning makes it much faster in practice
// 4-Queens: 2 solutions | 8-Queens: 92 solutions
Optimization: Use three boolean arrays cols[], diag1[], diag2[] to check safety in O(1) instead of O(n). diag1[row-col+n] and diag2[row+col] uniquely identify diagonals.
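
A sketch of that O(1)-check variant, here counting solutions rather than collecting boards (the names and the n-1 diagonal offset are our choices; the note's row-col+n offset works the same with a slightly larger array):

```java
public class NQueensFast {
    // cols[c], diag1[r-c+n-1], diag2[r+c] mark attacked lines,
    // so the safety check is three array reads instead of three O(n) scans.
    static int countSolutions(int row, int n,
                              boolean[] cols, boolean[] diag1, boolean[] diag2) {
        if (row == n) return 1;                  // all queens placed → one solution
        int count = 0;
        for (int col = 0; col < n; col++) {
            int d1 = row - col + n - 1, d2 = row + col;
            if (cols[col] || diag1[d1] || diag2[d2]) continue; // O(1) PRUNE
            cols[col] = diag1[d1] = diag2[d2] = true;          // PLACE
            count += countSolutions(row + 1, n, cols, diag1, diag2);
            cols[col] = diag1[d1] = diag2[d2] = false;         // BACKTRACK
        }
        return count;
    }

    static int solve(int n) {
        return countSolutions(0, n, new boolean[n],
                new boolean[2 * n - 1], new boolean[2 * n - 1]);
    }
}
```

`solve(4)` finds 2 solutions and `solve(8)` finds 92, matching the counts quoted above.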
📖 Deep Dive: Sudoku Solver
☕ Java · Sudoku Solver
boolean solveSudoku(char[][] board) {
    for (int r = 0; r < 9; r++) {
        for (int c = 0; c < 9; c++) {
            if (board[r][c] != '.') continue; // skip filled cells
            for (char d = '1'; d <= '9'; d++) {
                if (isValid(board, r, c, d)) {
                    board[r][c] = d;               // PLACE digit
                    if (solveSudoku(board)) return true; // solved!
                    board[r][c] = '.';            // BACKTRACK
                }
            }
            return false; // no digit worked → propagate failure
        }
    }
    return true; // all cells filled → solved!
}

boolean isValid(char[][] board, int row, int col, char d) {
    for (int i = 0; i < 9; i++) {
        if (board[row][i] == d) return false;       // row check
        if (board[i][col] == d) return false;       // col check
        // 3×3 box check: map to box start corner
        int br = 3*(row/3) + i/3, bc = 3*(col/3) + i%3;
        if (board[br][bc] == d) return false;
    }
    return true;
}

⏱ Complexity Analysis for Recursive Algorithms

📐
Master Theorem (simplified): For T(n) = a·T(n/b) + O(nᵈ):
  • If d > log_b(a) → O(nᵈ)
  • If d = log_b(a) → O(nᵈ log n)
  • If d < log_b(a) → O(n^log_b(a))
Algorithm          | Recurrence         | Time       | Space (stack)              | Why
Factorial / Linear | T(n)=T(n-1)+O(1)   | O(n)       | O(n)                       | n levels, O(1) work each
Binary Search      | T(n)=T(n/2)+O(1)   | O(log n)   | O(log n)                   | halves each time
Merge Sort         | T(n)=2T(n/2)+O(n)  | O(n log n) | O(log n) stack + O(n) aux  | log n levels × O(n) merge
Fibonacci (naive)  | T(n)=T(n-1)+T(n-2) | O(2ⁿ)      | O(n)                       | exponential tree, depth n
Subsets            | T(n)=2T(n-1)+O(1)  | O(2ⁿ)      | O(n)                       | 2 choices per element
Permutations       | T(n)=n·T(n-1)+O(1) | O(n!)      | O(n)                       | n! leaves
Fast Power         | T(n)=T(n/2)+O(1)   | O(log n)   | O(log n)                   | halves exponent each time
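
As a worked check of the Merge Sort row against the simplified Master Theorem above:

```latex
T(n) = 2\,T(n/2) + O(n)
\;\Rightarrow\; a = 2,\ b = 2,\ d = 1,
\qquad \log_b a = \log_2 2 = 1 = d
\;\Rightarrow\; T(n) = O(n^d \log n) = O(n \log n).
```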

💪 In-Lecture Practice Problems

Work through these in order. Each one builds on the previous pattern. Don't look at the solution before attempting!

Problem 01 · Linear Recursion Warmup
Sum of Digits of a Number
Easy · Amazon, TCS · Linear Recursion
Problem 02 · Fibonacci + Memoization
Fibonacci Number (3 ways — Naive → Memo → DP)
Easy → Medium · Google, Amazon, Microsoft · Tree Recursion, Memoization Preview · LC #509
Problem 03 · Subsets Pattern
Generate All Subsets (Power Set)
Medium · Google, Facebook, Amazon · Include/Exclude · LC #78
Problem 04 · Combination Sum
Combination Sum (reuse allowed)
Medium · Google, Amazon, Apple · Backtracking + Pruning · LC #39
Problem 05 · Hard Backtracking
Word Search in a Grid
Hard · Google, Facebook, Microsoft, Amazon · Grid DFS + Backtracking · LC #79
Problem 06 · Fibonacci in Disguise
Climbing Stairs
Easy · Google, Amazon, Apple · Tree Recursion → DP
Problem 07 · Constraint-Based Backtracking
Generate Parentheses
Medium · Google, Amazon, Facebook · Backtracking · Pruning
Problem 08 · Permutation Pattern
Permutations of an Array
Medium · Google, Amazon, Facebook · Backtracking · Swap
Problem 09 · Classic Backtracking
N-Queens Problem
Hard · Google, Amazon, Facebook · Backtracking · O(1) Pruning
Problem 10 · State-Space Backtracking
Letter Combinations of a Phone Number
Medium · Google, Amazon, Facebook, Microsoft · Backtracking · Mapping

📝 Assignment

📋
Lecture 5 Assignment — 32 Problems
Complete the assignment before moving to Lecture 6. It includes 6 Easy, 9 Medium, 7 Hard problems, 3 complexity exercises, and 7 bonus problems — all with LeetCode/GFG links.

📄 Open Assignment →

✅ Lecture Completion Checklist

Check each item off as you master it. Don't proceed to Lecture 6 until all are checked.

I can explain recursion using Think-Trust-Induct without notes
I can draw the call stack for factorial(5) on paper
I know the difference between code BEFORE vs AFTER a recursive call
I can identify all 4 types of recursion by reading code
I can code the Include/Exclude pattern for subsets from memory
I can code the universal backtracking template from memory
I solved Combination Sum without looking at the solution first
I can state why Fibonacci naive is O(2ⁿ) and memoized is O(n)
I can explain pruning and implement it in N-Queens
I completed at least 10 problems from the Assignment file
🚀
You're ready for Lecture 6: Bit Manipulation
Bit manipulation is the third pillar of foundations and appears constantly in optimization problems at FAANG. Short lecture (~3 days) with high return.
← Topic 4: Java 8+ & Streams · Topic 6: Bit Manipulation →