What is Memoization?
Memoization is an optimization technique where you cache the results of expensive function calls.
Imagine you solve a math problem and write the answer in a notebook. Next time you see the same problem, you just check your notebook instead of solving it again.
That’s exactly what memoization does: it caches the result of a function call. When the function is called again with the same arguments, it returns the cached value instead of recalculating the result.
This can dramatically improve performance for functions that are called repeatedly with the same inputs.
Why Use Memoization?
- Avoids repeating expensive calculations.
- Speeds up performance for functions with repeated inputs.
- Especially useful in recursion, data processing, and UI rendering.
- Improves user experience by reducing lag or delay in the UI.
Basic Concept
Here's a simple example to illustrate the concept of memoization.
Without Memoization:
The function has no memory of previous calls, so it recalculates the result every time, even when the inputs are the same.
function slowAdd(a, b) {
  console.log("Calculating..."); // Shows when computation happens
  // Simulate an expensive operation
  for (let i = 0; i < 1000000000; i++) {} // Busy wait
  return a + b;
}
console.log(slowAdd(5, 3)); // Takes time, logs "Calculating..."
console.log(slowAdd(5, 3)); // Takes time again, logs "Calculating..."
With Memoization:
You add a cache so the result is stored after the first calculation.
const memoizedAdd = (() => {
  const cache = {};
  return function (a, b) {
    const key = `${a},${b}`; // Create a unique key for the arguments
    if (key in cache) {
      console.log("Retrieved from cache");
      return cache[key];
    }
    console.log("Calculating...");
    // Simulate an expensive operation
    for (let i = 0; i < 1000000000; i++) {}
    const result = a + b;
    cache[key] = result; // Store the result
    return result;
  };
})();
console.log(memoizedAdd(5, 3)); // Takes time, logs "Calculating..."
console.log(memoizedAdd(5, 3)); // Instant, logs "Retrieved from cache"
Here,
- On the first call, the inputs (5, 3) are not in the cache, so the calculation runs and the result is saved.
- On the second call with the same inputs, the result is found in the cache and returned instantly.
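As a side note, the cache above is a plain object; a `Map` works just as well and makes clearing and size checks straightforward. Here is a minimal sketch of the same example (the name `memoizedAddMap` is illustrative):
const memoizedAddMap = (() => {
  const cache = new Map();
  return function (a, b) {
    const key = `${a},${b}`;
    if (cache.has(key)) {
      return cache.get(key);
    }
    const result = a + b; // The "expensive" work would happen here
    cache.set(key, result);
    return result;
  };
})();
console.log(memoizedAddMap(5, 3)); // Computes and caches
console.log(memoizedAddMap(5, 3)); // Served from the Map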
Creating a Basic Memoization Function
You can create a general-purpose memoize function that works for any pure function.
function memoize(fn) {
  const cache = {};
  return function (...args) {
    const key = JSON.stringify(args);
    if (key in cache) {
      console.log('Cache hit for:', args);
      return cache[key];
    }
    console.log('Cache miss for:', args);
    const result = fn.apply(this, args);
    cache[key] = result;
    return result;
  };
}
// Usage
const expensiveFunction = (n) => {
  console.log(`Computing for ${n}...`);
  let result = 0;
  for (let i = 0; i < n * 1000000; i++) {
    result += i;
  }
  return result;
};
const memoizedFunction = memoize(expensiveFunction);
console.log(memoizedFunction(100)); // Slow first time
console.log(memoizedFunction(100)); // Fast second time
console.log(memoizedFunction(200)); // Slow for new input
console.log(memoizedFunction(100)); // Fast for cached input
Here,
- The `memoize` function is a utility that adds caching to any function. If the function is called again with the same arguments, it returns the saved result instantly instead of recalculating.
- `function memoize(fn) { const cache = {}; ... }`: `cache` is like a notebook that saves the results for given inputs.
- `return function(...args) { const key = JSON.stringify(args); ... }`: this returns a new function that wraps the original `fn`. It accepts any number of arguments using `...args`. `JSON.stringify(args)` converts the array of arguments into a string (e.g., `[100]` → `"[100]"`), which is used as the cache key.
- `if (key in cache) { ... }`: if the same input has been seen before, the saved result is returned.
- Otherwise it is a cache miss: the original function is called with all arguments using `fn.apply(this, args)`, the result is stored in `cache[key]`, and then the `result` is returned.
- Usage: `const expensiveFunction = (n) => { ... };` simulates a heavy task; it loops millions of times to compute a value. `const memoizedFunction = memoize(expensiveFunction);` wraps `expensiveFunction` in `memoize()` to avoid recalculating the same input.
- When you run this: the first call with `100` is slow and logs "Cache miss..." and "Computing for 100..."; the second call with `100` is fast and logs "Cache hit..."; `200` is a new input, so it is slow again; the final call with `100` is served instantly from the cache.
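Because the recursion itself can go through the memoized wrapper, a classic use case is a naive recursive Fibonacci. Here is a minimal sketch using the `memoize` helper above (its `console.log` calls will print a hit or miss for every sub-problem, so the output is a bit noisy):
// Each sub-problem is computed once and cached; without the cache,
// fib(35) would make tens of millions of recursive calls.
const fib = memoize((n) => (n < 2 ? n : fib(n - 1) + fib(n - 2)));
console.log(fib(35)); // Fast: the cache fills up as the recursion runs
console.log(fib(35)); // Instant: the top-level result itself is cached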
Memoization with TTL (Time To Live)
Sometimes, you don’t want to keep data cached forever. Instead, you want it to expire after some time.
TTL (Time To Live) is how long a cached entry should be kept, in milliseconds.
Why Use TTL?
- TTL helps auto-expire old cache.
- API results may change over time.
- You may want to refresh values after a few seconds/minutes.
- TTL keeps cache fresh and relevant.
- It's great for API calls, search, or anything time-sensitive.
Example of TTL
function memoizeWithTTL(fn, ttl = 60000) { // Default TTL: 1 minute
  const cache = {}; // Stores result + timestamp
  return function (...args) {
    const key = JSON.stringify(args);
    const now = Date.now();
    if (key in cache && now - cache[key].timestamp < ttl) {
      console.log("Cache hit (not expired)");
      return cache[key].value;
    }
    console.log("Cache miss or expired");
    const result = fn.apply(this, args);
    cache[key] = {
      value: result,
      timestamp: now,
    };
    return result;
  };
}
// Example: API call simulation
const fetchUserData = (userId) => {
  console.log(`Fetching data for user ${userId} from API...`);
  return { id: userId, name: `User ${userId}`, data: Math.random() };
};
const memoizedFetchUserData = memoizeWithTTL(fetchUserData, 5000); // 5 second TTL
console.log(memoizedFetchUserData(1)); // Fetches from "API"
console.log(memoizedFetchUserData(1)); // Uses cache
// Wait 6 seconds...
setTimeout(() => {
  console.log(memoizedFetchUserData(1)); // Fetches again (cache expired)
}, 6000);
Here,
- `function memoizeWithTTL(fn, ttl = 60000) { const cache = {}; ... }`: `fn` is the original function and `ttl` is how long a cached result stays valid (default: 60 seconds). The cache stores each result together with a timestamp.
- `const key = JSON.stringify(args); const now = Date.now();`: a unique key is created from the input arguments, and `now` stores the current time in milliseconds.
- `if (key in cache && now - cache[key].timestamp < ttl) { ... }`: if the key is cached and not expired, the cached result is returned.
- Otherwise (cache miss or expired): the original function is called, the `result` and its `timestamp` are stored, and the `result` is returned.
- Simulating the API call: `const fetchUserData = (userId) => { ... }` simulates a fake API call with changing data.
- `const memoizedFetchUserData = memoizeWithTTL(fetchUserData, 5000);` wraps the API call with TTL-based memoization (5-second TTL).
- The first `memoizedFetchUserData(1)` is a cache miss and calls the "API"; the second call is a cache hit and returns instantly.
- After 6 seconds the cached entry has expired, so the call inside `setTimeout` hits the "API" again.
Note: the TTL check relies on Date.now(), so changes to the system clock can make cached entries expire too early or too late.
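The same wrapper also works for asynchronous functions, because the Promise returned by the function is what gets cached. Below is a minimal sketch under that assumption; the URL is a placeholder, not a real endpoint.
const fetchUser = (userId) =>
  fetch(`https://api.example.com/users/${userId}`) // Placeholder URL
    .then((res) => res.json());
const cachedFetchUser = memoizeWithTTL(fetchUser, 5000);
cachedFetchUser(1).then(console.log); // Network request
cachedFetchUser(1).then(console.log); // Same cached Promise, no new request
One caveat: a rejected Promise is cached too, until the TTL expires, so in practice you may want to delete failed entries from the cache.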
LRU (Least Recently Used) Cache
An LRU cache remembers only the most recently used items.
If the cache gets full, it removes the least recently used item to make space for new ones.
You can think of it like a small bag. If you put in too many things, you have to throw out the item you haven’t used in a while.
Why Use LRU?
- Keeps memory usage under control.
- Useful when working with limited storage.
- Automatically removes unused or old values.
Example of LRU
function memoizeWithLRU(fn, maxSize = 100) {
  const cache = new Map();
  return function (...args) {
    const key = JSON.stringify(args);
    if (cache.has(key)) {
      // Move to the end (most recently used)
      const value = cache.get(key);
      cache.delete(key);
      cache.set(key, value);
      return value;
    }
    // If the cache is full, remove the least recently used (first) item
    if (cache.size >= maxSize) {
      const firstKey = cache.keys().next().value;
      cache.delete(firstKey);
    }
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}
// Example usage
const expensiveCalculation = (n) => {
  console.log(`Computing for ${n}`);
  return n * n * n;
};
const memoizedCalc = memoizeWithLRU(expensiveCalculation, 3); // Max 3 items
console.log(memoizedCalc(1)); // Computes
console.log(memoizedCalc(2)); // Computes
console.log(memoizedCalc(3)); // Computes
console.log(memoizedCalc(4)); // Computes, removes 1 from cache
console.log(memoizedCalc(1)); // Computes again (was evicted)
Here,
- `function memoizeWithLRU(fn, maxSize = 100) { const cache = new Map(); ... }`: `fn` is the original expensive function and `maxSize` is the maximum number of items allowed in the cache. A `Map` is used because it preserves insertion order, which is exactly what LRU needs.
- `const key = JSON.stringify(args);`: converts the arguments into a string to use as the cache key.
- `if (cache.has(key)) { ... }`: if the key is already in the cache, remove it, re-insert it (moving it to the end of the Map), and return the cached result.
- `if (cache.size >= maxSize) { ... }`: if the cache is full, get the first key (the least recently used) and delete it to make space.
- `const result = fn.apply(this, args); cache.set(key, result); return result;`: run the original function, store the result, and return it.
- `const memoizedCalc = memoizeWithLRU(expensiveCalculation, 3);`: only the 3 most recent results are kept.
- Calls with `1`, `2`, and `3` each compute, so the cache stores [1, 2, 3]. Adding `4` exceeds the limit; `1` is the least recently used, so it is removed. Asking for `1` again is a miss, so it is recomputed.
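The run above only exercises eviction in insertion order. The short sketch below reuses `memoizeWithLRU` and `expensiveCalculation` to show the "recently used" part: touching an old entry moves it to the end of the Map, so a different entry gets evicted.
const calc = memoizeWithLRU(expensiveCalculation, 3);
calc(1); calc(2); calc(3); // Cache order: 1, 2, 3
calc(1);                   // 1 is touched, order becomes: 2, 3, 1
calc(4);                   // Cache is full, 2 (now least recently used) is evicted
calc(1);                   // Still cached, no recomputation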
Performance Considerations
Memoization helps improve speed by remembering results, but it uses extra memory to store those results.
Memoization trades memory for performance. If used too much or without limits, it can cause high memory usage.
You can build a helper that not only memoizes a function but also tracks:
- How many times the cache was used
- How many times it had to compute
- How big the cache is
- The hit rate (efficiency)
Tracking hits/misses helps you see how well memoization is working. It's useful for debugging and tuning performance in real-world apps.
function memoizeWithMemoryTracking(fn) {
  const cache = {};
  let hitCount = 0;
  let missCount = 0;
  const memoized = function (...args) {
    const key = JSON.stringify(args);
    if (key in cache) {
      hitCount++;
      return cache[key];
    }
    missCount++;
    const result = fn.apply(this, args);
    cache[key] = result;
    return result;
  };
  memoized.stats = () => ({
    cacheSize: Object.keys(cache).length,
    hitCount,
    missCount,
    hitRate: hitCount / (hitCount + missCount),
  });
  memoized.clearCache = () => {
    Object.keys(cache).forEach((key) => delete cache[key]);
    hitCount = 0;
    missCount = 0;
  };
  return memoized;
}
// Usage
const expensiveFunction = (n) => {
  console.log(`Computing for ${n}`);
  let result = 0;
  for (let i = 0; i < n * 1000000; i++) result += i;
  return result;
};
const memoizedFn = memoizeWithMemoryTracking(expensiveFunction);
console.log(memoizedFn(10));
console.log(memoizedFn(10));
console.log(memoizedFn.stats()); // { cacheSize: 1, hitCount: 1, missCount: 1, hitRate: 0.5 }
Here,
- `cache` stores past results, `hitCount` counts how many times a result was fetched from the cache, and `missCount` counts how many times the function actually ran.
- `const key = JSON.stringify(args);`: converts the arguments to a unique key so results can be stored and retrieved.
- `if (key in cache) { ... }`: if the key exists, it is a cache hit.
- If the key doesn’t exist, it is a cache miss: the result is computed, stored, and returned.
- Track cache stats: `memoized.stats = () => ({ ... });` returns the current stats: `cacheSize` (total items in the cache), `hitCount` (how many times the cache was used), `missCount` (how many times the original function ran), and `hitRate` (the fraction of requests served from the cache).
- Clear the cache: `memoized.clearCache = () => { ... }` is a helper to reset everything (good for tests or memory cleanup).
- The first `memoizedFn(10)` is slow (cache miss), the second is fast (cache hit), and `memoizedFn.stats()` then reports `{ cacheSize: 1, hitCount: 1, missCount: 1, hitRate: 0.5 }`.
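A short follow-up sketch showing `clearCache` in action; note that `hitRate` reads as NaN right after a reset, because no calls have been counted yet.
memoizedFn.clearCache();
console.log(memoizedFn.stats()); // { cacheSize: 0, hitCount: 0, missCount: 0, hitRate: NaN }
console.log(memoizedFn(10));     // Recomputed: the cache was emptied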
When Should You Use Memoization?
- For expensive computations, mathematical calculations, data processing.
- For recursive functions like fibonacci, factorial, tree traversal.
- For API calls, when same data is requested multiple times.
- For pure functions that always return the same output for the same input.
- For DOM queries, like repeated lookups with expensive CSS selectors (a short sketch follows below).
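For example, here is a minimal sketch of the DOM case using the `memoize` helper from earlier (the selector is illustrative). It assumes the matched elements do not change between calls; if the DOM updates, the cached list goes stale.
const queryAll = memoize((selector) =>
  Array.from(document.querySelectorAll(selector))
);
queryAll(".card .title"); // Hits the DOM
queryAll(".card .title"); // Returns the cached array of elements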
When Should You Not Use Memoization?
- For simple operations, such as basic math, where the work is cheaper than the cache lookup.
- For functions with side effects (that modify global state).
- For rarely called functions.
- For functions where input data is always changing.
- For memory-constrained environments where memory is more important than speed.
Best Practices for Memoization
- Profile Before You Optimize
- Don’t memoize everything.
- Measure performance before and after applying memoization.
- Use browser dev tools or Node.js profilers.
- Memoization only helps when:
- The function is expensive (slow).
- The function is called repeatedly with the same inputs.
- Choose a Reasonable Cache Size
- Caching everything forever = memory issues.
- For long-running apps (like SPAs), limit cache size.
- Use an LRU (Least Recently Used) cache to keep only the most-used items.
- Example: keep last 50 or 100 entries, not all.
- Use TTL for Time-Sensitive Data
- Some data should expire.
- If you’re memoizing API results or data that might go stale, use TTL (Time To Live).
- Example: Cache weather data for 5 minutes, not forever.
- Handle Edge Cases
- Not all inputs are simple.
- Be careful with `null`, `undefined`, `NaN`, and complex objects or arrays.
- Always normalize arguments (e.g., using `JSON.stringify`) or write a custom key generator.
- Clear the Cache When Needed
- Long-lived caches can grow silently.
- Provide a way to clear the cache manually.
- Useful for user logouts, state resets, or freeing memory in large apps.
- Use Weak References for Large Objects or DOM
- Avoid memory leaks from heavy or DOM-related objects.
- Use `WeakMap` or `WeakRef` to memoize large or DOM-dependent values.
- This allows JavaScript to garbage collect them when they are no longer needed.
Common Mistakes of Memoization
1. Memoizing Impure Functions
Problem: Memoization only works for pure functions, those that always return the same result for the same input and have no side effects.
// Don't do this
const badExample = memoize((x) => {
  console.log("Side effect!"); // This will only run once
  return x * 2;
});
badExample(2); // Logs: Side effect!
badExample(2); // Cached! No log, side effect skipped
Why it’s a problem: If your function has side effects (e.g., logging, changing DOM, making network requests), memoization will skip them after the first call.
Fix: Only memoize pure and deterministic functions.
2. Reference Equality Issues with Objects
Problem: Memoizers that key the cache on object identity (for example, using the object itself as a Map or WeakMap key) compare arguments by reference, not by content.
// Assume memoizedFn caches results keyed by the object reference itself
const obj1 = { a: 1 };
const obj2 = { a: 1 };
memoizedFn(obj1); // Cache miss
memoizedFn(obj2); // Also a cache miss, even though the content is the same
Why it’s a problem: Even though `obj1` and `obj2` look the same, they are different objects in memory.
Fix:
- Use primitives as arguments whenever possible.
- Or create a custom key generator that serializes inputs (e.g., using `JSON.stringify()` with care).
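Here is a sketch of the second fix: a memoize variant that accepts a custom key generator, so the caller decides how object arguments map to cache keys. The names `memoizeWithKey` and `keyFn` are illustrative.
function memoizeWithKey(fn, keyFn = JSON.stringify) {
  const cache = new Map();
  return function (...args) {
    const key = keyFn(args); // The caller controls how arguments become a key
    if (cache.has(key)) {
      return cache.get(key);
    }
    const result = fn.apply(this, args);
    cache.set(key, result);
    return result;
  };
}
// Key only on the fields that matter, so structurally equal objects share a cache entry
const area = memoizeWithKey(
  (rect) => rect.width * rect.height,
  ([rect]) => `${rect.width}x${rect.height}`
);
console.log(area({ width: 2, height: 3 })); // Computes 6
console.log(area({ width: 2, height: 3 })); // Cache hit, even though it is a different object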
3. Memory Leaks
Problem: Memoization keeps results in memory, which can build up over time and lead to high memory usage.
const memoizedFn = memoize(expensiveFunction);
Why it’s a problem: If never cleared, cache can grow endlessly.
Fix:
- Use a `.clearCache()` method (if available) to reset the cache.
- Use a TTL (Time To Live) to auto-expire old entries.
- Use a `WeakMap` or an LRU strategy to limit growth.
👉 Next tutorial: JavaScript Debounce and Throttle