Memoization
Memoization is caching for recursion. If you've already computed the answer for certain inputs, just look it up instead of computing it again. For problems with overlapping subproblems, it often turns exponential recursion into polynomial (frequently linear) time.
The Core Idea
Before computing something, check if you've done it before. After computing, save the result. That's it. The magic is that overlapping subproblems get solved once instead of many times.
Memoization is top-down dynamic programming. You write natural recursion, then add caching. It's often easier to reason about than bottom-up DP, and just as efficient.
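The check-then-save pattern can be captured once in a decorator. A minimal sketch (the name `memoize` is illustrative; in practice `functools.lru_cache` does this job):

```python
from functools import wraps

def memoize(fn):
    cache = {}                    # results keyed by argument tuple

    @wraps(fn)
    def wrapper(*args):
        if args not in cache:     # compute each input at most once
            cache[args] = fn(*args)
        return cache[args]

    wrapper.cache = cache         # exposed for inspection
    return wrapper

@memoize
def square(n):
    return n * n
```

Any pure function of hashable arguments can be wrapped this way.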
Classic: Fibonacci
Without memoization: O(2^n). With memoization: O(n).
```python
def fib(n, memo={}):
    if n <= 1:
        return n
    if n in memo:
        return memo[n]
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]
```
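One subtlety: `memo={}` works because Python evaluates default arguments once, at definition time, so every call shares the same dict. Here that's exactly what makes the cache persist across top-level calls, but it's the same mechanism behind the classic mutable-default pitfall. A standalone check (repeating `fib` so the snippet runs on its own):

```python
def fib(n, memo={}):
    # The default dict is created once and shared by every call,
    # so results cached in one call are reused by later calls.
    if n <= 1:
        return n
    if n in memo:
        return memo[n]
    memo[n] = fib(n - 1, memo) + fib(n - 2, memo)
    return memo[n]

fib(10)
fib.__defaults__[0]   # the shared cache, already populated
```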
Python's Built-in: @lru_cache
Python makes it even easier:
```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)
```
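A bonus of `lru_cache` is instrumentation: the wrapped function gains `cache_info()` for hit/miss statistics and `cache_clear()` to reset it. For example:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

fib(10)
info = fib.cache_info()   # CacheInfo(hits=8, misses=11, ...)
fib.cache_clear()         # drop the cache, e.g. between benchmarks
```

On Python 3.9+, `functools.cache` is shorthand for `lru_cache(maxsize=None)`.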
Classic: Coin Change
(One catch: `@lru_cache` requires hashable arguments, and a list of coins isn't hashable. Memoize an inner helper keyed on the remaining amount instead.)

```python
from functools import lru_cache

def coinChange(coins, amount):
    coins = tuple(coins)  # lru_cache keys must be hashable

    @lru_cache(maxsize=None)
    def best(remaining):
        if remaining == 0:
            return 0
        if remaining < 0:
            return float('inf')
        min_coins = float('inf')
        for coin in coins:
            min_coins = min(min_coins, best(remaining - coin) + 1)
        return min_coins

    return best(amount)
```
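Since memoization is top-down DP, the same recurrence can also be filled bottom-up. A sketch for comparison (`coin_change_bottom_up` is an illustrative name, not from the original):

```python
def coin_change_bottom_up(coins, amount):
    # Tabulation fills the same subproblems smallest-first instead of
    # on demand; dp[a] = fewest coins summing to a (inf if impossible).
    INF = float('inf')
    dp = [0] + [INF] * amount
    for a in range(1, amount + 1):
        dp[a] = min((dp[a - c] + 1 for c in coins if c <= a), default=INF)
    return dp[amount]

coin_change_bottom_up([1, 2, 5], 11)   # 3 coins: 5 + 5 + 1
```

Both versions do the same O(amount × len(coins)) work; top-down only visits amounts actually reachable, bottom-up avoids recursion depth limits.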
When to Use Memoization
- Overlapping subproblems - Same inputs computed multiple times
- Optimal substructure - Answer builds from smaller answers
- Recursive solutions - Natural top-down approach
- State can be hashed - Inputs work as dictionary keys
If your recursive function is pure (the same inputs always produce the same outputs), it can be memoized. If it depends on external or mutable state, a cached result may be stale, so it can't.
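The "state can be hashed" point in practice: ints, strings, and tuples work as cache keys directly; lists and dicts must be converted (e.g. `tuple(lst)`) first. A small illustrative example where the state is naturally hashable (`grid_paths` is a made-up name):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def grid_paths(rows, cols):
    # Monotone lattice paths from (0, 0) to (rows, cols): a pure
    # function of two ints, so the argument pair hashes cleanly.
    if rows == 0 or cols == 0:
        return 1
    return grid_paths(rows - 1, cols) + grid_paths(rows, cols - 1)
```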