Memoization in JavaScript: Speed Up Expensive Functions with Caching

TL;DR:
Memoization is a caching technique that stores the results of expensive function calls and returns the cached result when the same inputs occur again. You’ll learn:
- What memoization is and when to use it
- How to write a simple `memoize` helper
- Dealing with multiple or complex arguments
- Advanced patterns: LRU caches, `WeakMap`, async memoization
- Integrating memoization in React with `useMemo`
What Is Memoization?
Memoization means “remembering” the output of a function for a given set of inputs. Instead of recomputing a result you already know, you return the cached value—dramatically cutting down on work:
```js
// Without memoization: O(2ⁿ) Fibonacci
function fib(n) {
  return n < 2 ? n : fib(n - 1) + fib(n - 2)
}
```
Calling `fib(35)` can take noticeable time. With memoization, each `fib(k)` is computed once:
```js
function memoizedFib() {
  const cache = { 0: 0, 1: 1 }
  return function fib(n) {
    if (cache[n] != null) return cache[n]
    cache[n] = fib(n - 1) + fib(n - 2)
    return cache[n]
  }
}

const fib = memoizedFib()
console.log(fib(35)) // fast even on the first call: each fib(k) is computed only once
```
When to Use Memoization
- Pure functions: no side-effects, output depends only on inputs.
- Expensive or recursive computations: Fibonacci, pathfinding, combinatorics.
- Repeated calls with same args: parsing, data transformation, formatting.
- Component render optimizations: React selectors or derived state.
Avoid memoizing functions that interact with external state, I/O, or randomness.
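As a quick illustration of that last rule, here is a minimal sketch (the `greet` function and its inline cache are purely illustrative) of how memoizing an impure function freezes stale output:

```js
// Illustrative impure function: output depends on the clock, not just the input
function greet(name) {
  return `Hello ${name}, it is ${new Date().toLocaleTimeString()}`
}

// Naively caching it pins the first answer forever
const greetCache = new Map()
function memoizedGreet(name) {
  if (!greetCache.has(name)) greetCache.set(name, greet(name))
  return greetCache.get(name)
}

console.log(memoizedGreet('Ada')) // e.g. "Hello Ada, it is 10:00:00"
// Much later, still the same timestamp:
console.log(memoizedGreet('Ada')) // "Hello Ada, it is 10:00:00"
```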
Building a Simple Memoize Helper
A reusable `memoize` helper takes any single-argument function and returns a cached version:
```js
function memoize(fn) {
  const cache = new Map()
  return function (arg) {
    if (cache.has(arg)) {
      return cache.get(arg)
    }
    const result = fn(arg)
    cache.set(arg, result)
    return result
  }
}

// Usage
const slowSquare = (n) => {
  // imagine a CPU-intensive task
  for (let i = 0; i < 1e8; i++);
  return n * n
}

const fastSquare = memoize(slowSquare)

console.time('First call')
console.log(fastSquare(10)) // computes
console.timeEnd('First call')

console.time('Second call')
console.log(fastSquare(10)) // cached
console.timeEnd('Second call')
```
Handling Multiple & Complex Arguments
To memoize functions with more than one argument, or with non-primitive arguments, you need a key serializer or nested `Map`s:
```js
// Use a unique Symbol as the sentinel key so it can never collide with an argument
const RESULT = Symbol('result')

function memoizeMulti(fn) {
  const cache = new Map()
  return function (...args) {
    let node = cache
    for (const arg of args) {
      if (!node.has(arg)) node.set(arg, new Map())
      node = node.get(arg)
    }
    if (node.has(RESULT)) {
      return node.get(RESULT)
    }
    const result = fn(...args)
    node.set(RESULT, result)
    return result
  }
}

// Or use a JSON key (beware of property order & functions):
function memoizeJSON(fn) {
  const cache = new Map()
  return function (...args) {
    const key = JSON.stringify(args)
    if (cache.has(key)) return cache.get(key)
    const result = fn(...args)
    cache.set(key, result)
    return result
  }
}
```
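Continuing with `memoizeMulti` from above, a short usage sketch (the `distance` function is hypothetical) shows that the nested `Map`s key arguments by identity, so passing new-but-equal objects recomputes:

```js
// Hypothetical two-argument, pure function
const distance = (a, b) => Math.hypot(a.x - b.x, a.y - b.y)
const fastDistance = memoizeMulti(distance)

const p = { x: 0, y: 0 }
const q = { x: 3, y: 4 }

fastDistance(p, q) // 5, computed
fastDistance(p, q) // 5, cached (same object references)
fastDistance({ x: 0, y: 0 }, { x: 3, y: 4 }) // 5, recomputed: new objects, new keys
```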
Advanced Patterns
1. LRU Cache
Limit memory by evicting the least-recently-used entry:
```js
class LRUCache {
  constructor(limit = 50) {
    this.limit = limit
    this.cache = new Map()
  }

  get(key) {
    if (!this.cache.has(key)) return
    const value = this.cache.get(key)
    // delete + re-set moves the key to the end of the Map's insertion order,
    // marking it as most recently used
    this.cache.delete(key)
    this.cache.set(key, value)
    return value
  }

  set(key, value) {
    if (this.cache.size >= this.limit) {
      // delete oldest (the first key in insertion order)
      const oldestKey = this.cache.keys().next().value
      this.cache.delete(oldestKey)
    }
    this.cache.set(key, value)
  }
}

function memoizeLRU(fn, limit) {
  const lru = new LRUCache(limit)
  return function (arg) {
    const hit = lru.get(arg)
    if (hit !== undefined) return hit
    const result = fn(arg)
    lru.set(arg, result)
    return result
  }
}
```
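A brief usage sketch (the `expensiveTransform` function is hypothetical). Note that `memoizeLRU` treats `undefined` as a cache miss, so functions that legitimately return `undefined` will always recompute:

```js
// Hypothetical CPU-heavy, pure transformation
const expensiveTransform = (key) => {
  for (let i = 0; i < 1e7; i++); // simulate work
  return key.toUpperCase()
}

const cachedTransform = memoizeLRU(expensiveTransform, 100) // keep at most 100 results

cachedTransform('hello') // computed and cached
cachedTransform('hello') // cached, and refreshed as most recently used
```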
2. WeakMap for Object Keys
Prevent memory leaks when caching by object identity:
```js
function memoizeWeak(fn) {
  const cache = new WeakMap()
  return function (obj) {
    if (cache.has(obj)) return cache.get(obj)
    const result = fn(obj)
    cache.set(obj, result)
    return result
  }
}
```
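Usage sketch (the `summarize` function is hypothetical): keys must be objects, since a `WeakMap` throws on primitive keys such as strings or numbers, and a cached entry becomes collectable once its key object is unreachable:

```js
// Hypothetical pure function over an object
const summarize = (user) => `${user.name} (${user.posts.length} posts)`
const cachedSummarize = memoizeWeak(summarize)

let user = { name: 'Ada', posts: [1, 2, 3] }
cachedSummarize(user) // computed
cachedSummarize(user) // cached by object identity

user = null // once the object is unreachable, its cache entry can be garbage-collected
```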
3. Async Memoization
Cache Promises to avoid duplicate in-flight requests:
```js
function memoizeAsync(fn) {
  const cache = new Map()
  return function (arg) {
    if (cache.has(arg)) return cache.get(arg)
    const promise = fn(arg).catch((err) => {
      // don't cache failures: evict so the next call retries
      cache.delete(arg)
      throw err
    })
    cache.set(arg, promise)
    return promise
  }
}
```
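A usage sketch against a hypothetical `/api/users/:id` endpoint; because the Promise itself is cached, concurrent callers share one in-flight request:

```js
// Hypothetical fetch wrapper around a made-up /api/users/:id endpoint
const getUser = memoizeAsync((id) =>
  fetch(`/api/users/${id}`).then((res) => res.json())
)

const a = getUser(42)
const b = getUser(42)
console.log(a === b) // true: both callers await the same Promise, one network request
```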
Memoization in React
React offers `useMemo` and `useCallback` for caching derived values and callbacks across renders:
```jsx
import { useMemo } from 'react'

function ExpensiveList({ items }) {
  // Re-sort only when `items` changes
  const sorted = useMemo(() => {
    return items.slice().sort((a, b) => a.value - b.value)
  }, [items])

  return <List data={sorted} />
}
```
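`useCallback` plays the same role for function values. A minimal sketch, assuming a hypothetical `Row` child wrapped in `React.memo`:

```jsx
import { useCallback } from 'react'

function SelectableList({ items, onSelect }) {
  // Same handler reference until `onSelect` changes, so memoized
  // children receive stable props and can skip re-rendering
  const handleSelect = useCallback((id) => onSelect(id), [onSelect])

  return items.map((item) => (
    // <Row> is a hypothetical, React.memo-wrapped child component
    <Row key={item.id} item={item} onSelect={handleSelect} />
  ))
}
```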
For cross-component or global caching, you can wrap pure functions with your own `memoize` helper and import them anywhere.
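For example, a shared module might export memoized selectors that any component can import (the file and function names here are hypothetical):

```js
// selectors.js (hypothetical module importing the memoize helper built earlier)
import { memoize } from './memoize'

// Pure, potentially expensive derivation shared across the app
export const totalPrice = memoize((cart) =>
  cart.items.reduce((sum, item) => sum + item.price, 0)
)
```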
Pitfalls & Best Practices
- Stale Cache: clear or expire entries if inputs become invalid.
- Memory Growth: use LRU or WeakMap for long-lived apps.
- Key Collisions: avoid naive JSON serialization for complex or unordered data (see the sketch after this list).
- Side-Effects: never memoize functions with side-effects or non-deterministic output.
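To make the key-collision point concrete, here is how `JSON.stringify` can treat logically equal inputs as different keys, and distinct inputs as the same key:

```js
// Logically equal arguments, different property order: different cache keys
JSON.stringify([{ a: 1, b: 2 }]) // '[{"a":1,"b":2}]'
JSON.stringify([{ b: 2, a: 1 }]) // '[{"b":2,"a":1}]'

// Distinct arguments that serialize identically: colliding cache keys
JSON.stringify([undefined]) // '[null]'
JSON.stringify([null])      // '[null]'
```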
Conclusion
Memoization is a powerful, easy-to-implement strategy to speed up pure, expensive functions in JavaScript. Start with a simple closure-based cache, then evolve to handle multiple args, limit memory with LRU, and integrate with React’s hooks. Use it judiciously to shave off milliseconds—or seconds—in your apps, without sacrificing correctness.
Next Up: Try building your own `memoize` helper in your codebase, and measure the difference on a CPU-heavy task!