Caching Strategies


Memory Cache

Fast session-level reads

  • Lowest latency
  • Clears on reload

Persistent Cache

LocalStorage / IndexedDB

  • Survives reloads
  • Invalidation is manual/hard

Server Cache

Redis / app-layer cache

  • Shared across users
  • Coherency + eviction complexity

Edge/CDN

Cache near the user

  • Excellent global latency
  • Needs precise cache keys

Core Lens

Caching is a consistency decision first, and a speed decision second. Define freshness guarantees before choosing cache tiers.

Flow

Request → Cache hit? → Origin fetch → Write-back

Understanding different caching approaches and when to use client-side vs server-side caching for optimal performance.


Client-Side Caching

Store data in the browser to reduce network requests and provide instant access to frequently used data.

Memory Cache (React Query, SWR, Zustand)

Advantages

  • ✓ Fastest access (in-memory)
  • ✓ Automatic garbage collection
  • ✓ Smart invalidation with stale-while-revalidate
  • ✓ Works with any data type

Disadvantages

  • ✗ Lost on page refresh
  • ✗ Limited by available memory
  • ✗ Not shared across tabs

🎯 Best For: API responses, frequently accessed data during session
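The tradeoffs above come through in a minimal in-memory cache with per-entry TTL. This is a hedged sketch of what libraries like React Query and SWR manage for you; the `MemoryCache` class and its injectable clock are illustrative, not any library's real API:

```typescript
// Minimal in-memory cache with per-entry TTL.
// Illustrative only -- real libraries add revalidation, GC, and dedup.
class MemoryCache<V> {
  private store = new Map<string, { value: V; expiresAt: number }>();

  // The clock is injectable so expiry can be tested without waiting.
  constructor(private now: () => number = Date.now) {}

  set(key: string, value: V, ttlMs: number): void {
    this.store.set(key, { value, expiresAt: this.now() + ttlMs });
  }

  get(key: string): V | undefined {
    const entry = this.store.get(key);
    if (!entry) return undefined;
    if (entry.expiresAt <= this.now()) {
      this.store.delete(key); // lazily evict expired entries
      return undefined;
    }
    return entry.value;
  }
}

// Usage: everything here is lost on page refresh -- it lives in JS memory.
let t = 0;
const cache = new MemoryCache<string>(() => t);
cache.set("user:1", "Ada", 1000);
console.log(cache.get("user:1")); // "Ada"
t = 1001;
console.log(cache.get("user:1")); // undefined (expired)
```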

LocalStorage / SessionStorage

Advantages

  • ✓ Persists across page refreshes
  • ✓ Simple key-value API
  • ✓ Synchronous access

Disadvantages

  • ✗ ~5MB storage limit
  • ✗ Strings only (need JSON.parse/stringify)
  • ✗ Blocks main thread
  • ✗ No expiration mechanism
  • ✗ Vulnerable to XSS

🎯 Best For: User preferences, theme settings, non-sensitive cached data
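A thin wrapper can paper over the two biggest gaps (strings only, no expiration). The sketch below is written against a `StorageLike` interface, an assumption introduced here so the same code runs outside a browser; in the browser you would pass `window.localStorage` as the backend:

```typescript
// Storage-like subset of the browser localStorage API, so the wrapper
// can be exercised with an in-memory stand-in outside the browser.
interface StorageLike {
  getItem(key: string): string | null;
  setItem(key: string, value: string): void;
  removeItem(key: string): void;
}

// Adds the two things localStorage lacks: JSON values and expiration.
// Illustrative sketch; never store sensitive data here (XSS risk).
class ExpiringStorage {
  constructor(
    private backend: StorageLike,
    private now: () => number = Date.now,
  ) {}

  set(key: string, value: unknown, ttlMs: number): void {
    const wrapped = { value, expiresAt: this.now() + ttlMs };
    this.backend.setItem(key, JSON.stringify(wrapped));
  }

  get<T>(key: string): T | null {
    const raw = this.backend.getItem(key);
    if (raw === null) return null;
    const { value, expiresAt } = JSON.parse(raw);
    if (expiresAt <= this.now()) {
      this.backend.removeItem(key); // expired: clean up and report a miss
      return null;
    }
    return value as T;
  }
}

// Usage with an in-memory shim (swap in window.localStorage in the browser).
const mem = new Map<string, string>();
const shim: StorageLike = {
  getItem: (k) => mem.get(k) ?? null,
  setItem: (k, v) => void mem.set(k, v),
  removeItem: (k) => void mem.delete(k),
};
const prefs = new ExpiringStorage(shim);
prefs.set("theme", { dark: true }, 60_000);
console.log(prefs.get<{ dark: boolean }>("theme")); // { dark: true }
```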

IndexedDB

Advantages

  • ✓ Large storage (50MB+ per origin)
  • ✓ Stores complex objects, blobs, files
  • ✓ Asynchronous (doesn't block UI)
  • ✓ Indexed queries for fast lookups
  • ✓ Transactional (ACID compliant)

Disadvantages

  • ✗ Complex API
  • ✗ No built-in expiration
  • ✗ Requires wrapper libraries (Dexie, idb)
  • ✗ Can be cleared by user

🎯 Best For: Offline apps, large datasets, file/blob storage, PWAs

Service Worker Cache API

Advantages

  • ✓ Full offline support
  • ✓ Intercepts network requests
  • ✓ Caches entire responses (HTML, CSS, JS, images)
  • ✓ Background sync capabilities

Disadvantages

  • ✗ Complex to implement
  • ✗ Cache invalidation challenges
  • ✗ HTTPS required
  • ✗ Debugging is difficult

🎯 Best For: PWAs, offline-first apps, static asset caching

Server-Side Caching

Cache data on the server to reduce database load and improve response times for all users.

CDN (Content Delivery Network)

Advantages

  • ✓ Global edge locations (low latency)
  • ✓ Handles traffic spikes
  • ✓ Reduces origin server load
  • ✓ Built-in DDoS protection

Disadvantages

  • ✗ Cache invalidation delay
  • ✗ Not suitable for dynamic content
  • ✗ Cost increases with traffic
  • ✗ Configuration complexity

🎯 Best For: Static assets, images, CSS, JS, public API responses

Redis / Memcached

Advantages

  • ✓ Sub-millisecond response times
  • ✓ Supports complex data structures
  • ✓ Built-in TTL (expiration)
  • ✓ Pub/sub for cache invalidation
  • ✓ Horizontal scaling (Redis Cluster)

Disadvantages

  • ✗ Additional infrastructure
  • ✗ Memory-bound (expensive at scale)
  • ✗ Data loss on restart (without persistence)
  • ✗ Cache stampede risk

🎯 Best For: Session storage, database query results, rate limiting, real-time leaderboards
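The workhorse pattern here is "check the cache, else run the query, then store with a TTL". A hedged sketch follows; the `CacheClient` interface is an assumption shaped loosely like a Redis client's `get`/`set`-with-expiry, not any library's exact signature, so adapt it to your actual client:

```typescript
// Async key-value client shaped like a Redis client's get/set-with-TTL.
// Illustrative interface -- map these calls onto your real client library.
interface CacheClient {
  get(key: string): Promise<string | null>;
  set(key: string, value: string, ttlSeconds: number): Promise<void>;
}

// Try the cache first; on a miss, run the expensive query and populate
// the cache with a TTL so subsequent requests (from any user) hit.
async function getOrCompute<T>(
  client: CacheClient,
  key: string,
  ttlSeconds: number,
  query: () => Promise<T>,
): Promise<T> {
  const hit = await client.get(key);
  if (hit !== null) return JSON.parse(hit);
  const fresh = await query();
  await client.set(key, JSON.stringify(fresh), ttlSeconds);
  return fresh;
}
```

Because the cache is shared, one user's miss warms the cache for everyone, which is exactly what makes server-side caching worth the extra infrastructure.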

Cache Invalidation Strategies

"There are only two hard things in Computer Science: cache invalidation and naming things." — Phil Karlton

Time-Based (TTL)

Cache expires after a set duration.

Pros: Simple, predictable, no manual intervention
Cons: Stale data until expiry, choosing right TTL is hard

Event-Based (Publish/Subscribe)

Invalidate cache when data changes.

Pros: Always fresh data, real-time updates
Cons: Complex to implement, requires message queue
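The mechanics can be sketched in-process: writers publish a change event, and every subscribed cache drops its entry for that key. This is a deliberately minimal stand-in; in production the bus would be Redis pub/sub or a message queue, and the class names here are illustrative:

```typescript
// In-process sketch of event-based invalidation: a writer publishes a
// change event and each subscribed cache evicts the affected key.
// Real systems route these events through Redis pub/sub or a queue.
type Listener = (key: string) => void;

class InvalidationBus {
  private listeners: Listener[] = [];
  subscribe(fn: Listener): void {
    this.listeners.push(fn);
  }
  publish(key: string): void {
    this.listeners.forEach((fn) => fn(key));
  }
}

// Usage: the cache subscribes, so the next read after a write is a miss.
const bus = new InvalidationBus();
const productCache = new Map<string, string>();
bus.subscribe((key) => productCache.delete(key));

productCache.set("product:7", "v1");
bus.publish("product:7"); // a writer updated product 7
console.log(productCache.has("product:7")); // false -- next read refetches
```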

Version-Based (Cache Busting)

Include version/hash in cache key (e.g., app.v2.js).

Pros: Instant invalidation, safe deployments
Cons: Can't update cached items, storage grows
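The version is usually a content hash baked into the filename, so changed content gets a brand-new URL and stale caches are simply bypassed. Bundlers like Vite and webpack do this automatically; the sketch below shows the idea with Node's `crypto` module (the `hashedFilename` helper is illustrative):

```typescript
import { createHash } from "node:crypto";

// Version-based cache busting: derive the filename from a content hash.
// Same contents -> same name (cache-friendly); any change -> new name.
// Bundlers (Vite, webpack) generate these hashed names automatically.
function hashedFilename(name: string, contents: string): string {
  const hash = createHash("sha256").update(contents).digest("hex").slice(0, 8);
  const dot = name.lastIndexOf(".");
  return `${name.slice(0, dot)}.${hash}${name.slice(dot)}`;
}

console.log(hashedFilename("app.js", "console.log('v1')"));
// e.g. app.d3f1a2b4.js -- the hashed asset can be cached "forever"
```

Because the URL itself changes on every deploy, these assets can be served with a very long `max-age` without any risk of users running stale code.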

Stale-While-Revalidate

Serve stale content while fetching fresh data in background.

Pros: Fast response, eventual consistency
Cons: Users may see stale data briefly
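The core of the pattern fits in a few lines: answer immediately from the cache (even if stale) and kick off a background refresh so the next read is fresh. A hedged sketch, assuming a caller-supplied fetcher; real SWR implementations add request dedup, retries, and error policies:

```typescript
// Stale-while-revalidate: serve the cached value instantly and refresh
// in the background. Only a cold cache waits for the network.
// Illustrative sketch -- libraries add dedup, retries, error handling.
class SwrCache<V> {
  private store = new Map<string, V>();

  async get(key: string, fetcher: () => Promise<V>): Promise<V> {
    const cached = this.store.get(key);
    // Always start a refresh; it updates the cache for the NEXT read.
    const refresh = fetcher().then((fresh) => {
      this.store.set(key, fresh);
      return fresh;
    });
    if (cached !== undefined) {
      refresh.catch(() => {}); // background failure: keep serving stale
      return cached;           // fast path: stale but instant
    }
    return refresh; // cold cache: wait for the first fetch
  }
}
```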

Caching Patterns

Cache-Aside (Lazy Loading)

  1. App checks cache for data
  2. If miss → fetch from database
  3. Store in cache for future requests

Best for: Read-heavy workloads, data that's expensive to compute
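The three steps translate directly into code. A minimal sketch, assuming a `Map` as the cache and a caller-supplied `db` function standing in for the real database query:

```typescript
// Cache-aside: the application, not the cache, owns the lookup logic.
// Illustrative sketch with a Map cache and a fake database function.
const userCache = new Map<string, string>();

async function readUser(
  id: string,
  db: (id: string) => Promise<string>, // stand-in for the real DB query
): Promise<string> {
  const key = `user:${id}`;
  const hit = userCache.get(key); // 1. app checks the cache
  if (hit !== undefined) return hit;
  const row = await db(id);       // 2. miss -> fetch from the database
  userCache.set(key, row);        // 3. store for future requests
  return row;
}
```

Note the cache is only populated on demand ("lazy loading"), so memory is spent only on data that is actually read.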

Write-Through

  1. App writes to cache
  2. Cache synchronously writes to database

Best for: Data that must be consistent, write-then-read patterns
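In sketch form, the defining property is that the database write happens inside the same synchronous operation as the cache write, so a read immediately after a write always sees the new value. The `WriteThroughCache` class and its Map-backed "database" are illustrative stand-ins:

```typescript
// Write-through: every write hits the cache AND the backing store as one
// operation, so cache and database never disagree after a write.
// Illustrative sketch with Maps standing in for both tiers.
class WriteThroughCache<V> {
  constructor(
    private cache: Map<string, V>,
    private db: Map<string, V>, // stand-in for the real database
  ) {}

  write(key: string, value: V): void {
    this.cache.set(key, value); // 1. write to cache
    this.db.set(key, value);    // 2. synchronously write to the database
  }

  read(key: string): V | undefined {
    return this.cache.get(key) ?? this.db.get(key);
  }
}
```

The cost is write latency: every write pays for both tiers, which is why this suits consistency-critical data rather than high-volume write streams.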

Write-Behind (Write-Back)

  1. App writes to cache
  2. Cache asynchronously writes to database (batched)

Best for: Write-heavy workloads, analytics, logging
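A sketch of the same idea: writes land in memory immediately and are persisted later in batches. The explicit `flush()` here is illustrative; a real implementation would flush on a timer or queue-size threshold, and the crucial caveat is that a crash before flushing loses the queued writes:

```typescript
// Write-behind (write-back): writes are absorbed in memory and flushed
// to the database later in batches. Fast writes, but queued updates are
// lost if the process dies before flush(). Illustrative sketch; real
// systems flush on a timer or when the dirty set grows large.
class WriteBehindCache<V> {
  private cache = new Map<string, V>();
  private dirty = new Set<string>();

  constructor(private db: Map<string, V>) {} // stand-in database

  write(key: string, value: V): void {
    this.cache.set(key, value); // 1. fast, in-memory write
    this.dirty.add(key);        // remember what still needs persisting
  }

  flush(): number {
    for (const key of this.dirty) this.db.set(key, this.cache.get(key)!);
    const flushed = this.dirty.size;
    this.dirty.clear();
    return flushed; // 2. batched write to the database
  }
}
```

Note that repeated writes to the same key collapse into a single database write at flush time, which is exactly why this pattern shines for analytics and logging workloads.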

Comparison Table

Cache Type           | Storage      | Persistence  | Use Case
---------------------|--------------|--------------|----------------------
Memory (React Query) | ~50MB        | Session only | API responses
LocalStorage         | ~5MB         | Permanent    | User preferences
IndexedDB            | 50MB+        | Permanent    | Offline data, files
Service Worker       | 50MB+        | Permanent    | Offline apps, PWAs
CDN                  | Unlimited    | TTL-based    | Static assets
Redis                | Configurable | Optional     | Sessions, DB results

Decision Guide

Use Client-Side Caching when:

  • Data is user-specific (preferences, cart, session data)
  • You need offline access
  • Reducing network requests for repeat visits
  • Building PWAs or offline-first apps

Use Server-Side Caching when:

  • Data is shared across users (product catalog, public content)
  • Database queries are expensive
  • You need to reduce backend load
  • Global edge caching for low latency

Best Practices

  1. Cache at multiple layers. Use CDN for static assets, Redis for DB results, React Query for API responses.
  2. Set appropriate TTLs. Balance freshness vs performance based on data volatility.
  3. Plan for cache misses. Cold start performance still matters.
  4. Monitor cache hit rates. Low hit rates indicate caching strategy issues.
  5. Handle cache stampede. Use locking or staggered TTLs to prevent thundering herd.
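The stampede guard in point 5 is often implemented as "single-flight" deduplication: when many callers miss the same key at once, only one fetch runs and the rest await the in-flight promise. A hedged sketch (the `SingleFlight` class is illustrative; staggered/jittered TTLs are a complementary technique applied when setting expiry times):

```typescript
// Single-flight guard against cache stampede: concurrent misses on the
// same key share ONE origin fetch instead of each hitting the origin.
// Illustrative sketch of the locking approach from best practice 5.
class SingleFlight<V> {
  private inFlight = new Map<string, Promise<V>>();

  run(key: string, fetcher: () => Promise<V>): Promise<V> {
    const existing = this.inFlight.get(key);
    if (existing) return existing; // join the request already in flight
    const p = fetcher().finally(() => this.inFlight.delete(key));
    this.inFlight.set(key, p);
    return p;
  }
}
```

Combined with a cache in front of it, this turns a thundering herd of N concurrent misses into a single origin request whose result all N callers share.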