Caching Essentials
September 5, 2025 · 3 min read
Caching means storing frequently used data in a fast, temporary store such as memory, Redis, a CDN, or the browser cache.
It reduces expensive operations like hitting the database or calling an external API.
Think of it like remembering the answer to a question—you don’t need to look it up every time.
Why Use Caching?
- Speed → faster response times
- Scalability → fewer DB/API hits
- Cost saving → less compute and bandwidth
- Resilience → fallback data when backend is slow/unavailable
Cache Flow Example
Without cache: Client → App → Database on every request; each call repeats the same slow work.
With cache: Client → App → Cache; on a hit the database is skipped entirely, and on a miss the result is fetched once and stored for next time.
Caching Mechanism
The most common pattern is cache-aside (lazy loading): the application checks the cache first; on a miss it loads the data from the database and writes the result back to the cache for the next request.
Example in Code
import { createClient } from "redis";
const redis = createClient();
await redis.connect(); // node-redis v4: connect() is async
async function getUser(id) {
// 1. Check cache
const cached = await redis.get(`user:${id}`);
if (cached) {
console.log("Cache HIT");
return JSON.parse(cached); // FAST
}
// 2. If not in cache → fetch from DB
console.log("Cache MISS");
const user = await db.findUserById(id); // SLOW (db stands in for your data-access layer)
// 3. Store in Redis with TTL (60 seconds)
await redis.set(`user:${id}`, JSON.stringify(user), { EX: 60 });
return user;
}
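Calling it twice back to back shows both paths (the id 42 is arbitrary):
await getUser(42); // logs "Cache MISS", reads the DB, caches the result for 60s
await getUser(42); // logs "Cache HIT", served straight from Redis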
Cache Lifecycle & TTL
Every cache entry has a lifecycle:
- Insert → data added to cache
- Live → used while valid
- Expire → removed when TTL (Time-To-Live) ends
- Evict → removed early if cache is full (policy-based)
✅ TTL ensures freshness → no permanently stale data.
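With the Redis client from the example above, you can watch an entry move through this lifecycle (session:abc is a made-up key):
// Insert: store with a 60-second lifetime
await redis.set("session:abc", JSON.stringify({ user: 42 }), { EX: 60 });
// Live: TTL reports the seconds remaining
console.log(await redis.ttl("session:abc")); // e.g. 59
// Expire: after 60 seconds the key is gone
// await redis.get("session:abc") → null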
Cache Eviction Policies
When the cache runs out of space, it must decide which items to remove. That decision is made using an eviction policy.
Common Policies
1. LRU (Least Recently Used)
- Removes the item not accessed for the longest time.
- Widely used, balances performance.
2. LFU (Least Frequently Used)
- Removes the item with the lowest access frequency.
- Good when certain items are always popular.
3. FIFO (First In, First Out)
- Removes the oldest inserted item, regardless of usage.
- Simple, but may evict useful items.
4. MRU (Most Recently Used)
- Removes the most recently accessed item.
- Rarely optimal, but useful for cyclic scans where the item just read is the least likely to be reused.
5. Random
- Removes a random item.
- Simple, sometimes effective in distributed caches.
6. TTL (Time To Live)
- Items expire after a set lifetime.
- Prevents stale data.
7. ARC (Adaptive Replacement Cache)
- Hybrid of LRU + LFU, adapts dynamically.
- Smart but complex.
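To make LRU concrete, here is a minimal in-memory sketch (not production code) using a JavaScript Map, whose insertion order doubles as a recency list; the class name and capacity parameter are illustrative:
class LRUCache {
  constructor(capacity) {
    this.capacity = capacity;
    this.map = new Map(); // Map remembers insertion order
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    // Re-insert to mark this entry as most recently used
    const value = this.map.get(key);
    this.map.delete(key);
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.capacity) {
      // Evict the least recently used entry: the first key in the Map
      const lru = this.map.keys().next().value;
      this.map.delete(lru);
    }
  }
}
In Redis, the same choice is made with the maxmemory-policy setting (allkeys-lru, allkeys-lfu, volatile-ttl, and so on).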
Key Takeaway
- Use caching to speed up performance and reduce load.
- Always set a TTL to avoid stale data.
- Combine LRU + TTL → practical and widely used (see the sketch below).
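One way to combine them is to extend the LRUCache sketch above so each entry also carries its own deadline; expiresAt and ttlMs are illustrative names:
class LRUCacheWithTTL extends LRUCache {
  set(key, value, ttlMs) {
    // Store the value together with its expiry timestamp
    super.set(key, { value, expiresAt: Date.now() + ttlMs });
  }
  get(key) {
    const entry = super.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expiresAt) {
      this.map.delete(key); // expired: drop it and treat as a miss
      return undefined;
    }
    return entry.value;
  }
}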
In short: Cache what’s expensive, expire what’s old, and evict smartly when full.