Caching Strategies: IMemoryCache to Redis
For .NET engineers who know:
IMemoryCache, IDistributedCache, StackExchange.Redis, [ResponseCache], and cache invalidation patterns in ASP.NET Core
You’ll learn: How NestJS handles in-process and distributed caching with @nestjs/cache-manager, how client-side caching with TanStack Query replaces server-side output caching, and how to design a layered caching architecture that maps to what you know from .NET
Time: 15-20 min read
The .NET Way (What You Already Know)
ASP.NET Core provides caching at multiple layers:
IMemoryCache — in-process, per-server, fast:
public class ProductService
{
private readonly IMemoryCache _cache;
private readonly IProductRepository _repo;
public async Task<Product?> GetProductAsync(string id)
{
var cacheKey = $"product:{id}";
if (_cache.TryGetValue(cacheKey, out Product? cached))
{
return cached;
}
var product = await _repo.GetByIdAsync(id);
if (product != null)
{
_cache.Set(cacheKey, product, new MemoryCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
SlidingExpiration = TimeSpan.FromMinutes(2),
Size = 1,
});
}
return product;
}
}
IDistributedCache with Redis — shared across instances, slower than in-process:
// Registration
builder.Services.AddStackExchangeRedisCache(options =>
{
options.Configuration = builder.Configuration.GetConnectionString("Redis");
options.InstanceName = "MyApp:";
});
// Usage — same cache-aside flow as IMemoryCache, but the API is byte-oriented and serialization is manual
public async Task<Product?> GetProductAsync(string id)
{
var key = $"product:{id}";
var bytes = await _distributedCache.GetAsync(key);
if (bytes != null)
{
return JsonSerializer.Deserialize<Product>(bytes);
}
var product = await _repo.GetByIdAsync(id);
if (product != null)
{
await _distributedCache.SetAsync(
key,
JsonSerializer.SerializeToUtf8Bytes(product),
new DistributedCacheEntryOptions
{
AbsoluteExpirationRelativeToNow = TimeSpan.FromMinutes(10),
});
}
return product;
}
[ResponseCache] — HTTP response caching at the controller action level:
[HttpGet("{id}")]
[ResponseCache(Duration = 60, VaryByQueryKeys = new[] { "id" })]
public async Task<IActionResult> GetProduct(string id)
{
var product = await _productService.GetProductAsync(id);
return Ok(product);
}
This framework gives you a consistent interface regardless of the backing store, plus HTTP-level caching for client-visible responses.
The NestJS Way
NestJS handles caching through @nestjs/cache-manager, which wraps the cache-manager library. The interface is similar to IDistributedCache — a generic get/set/del API — with pluggable stores: in-memory (default) or Redis (@keyv/redis or cache-manager-ioredis).
On top of server-side caching, the modern JS stack adds a layer .NET engineers often miss: client-side server-state caching with TanStack Query. This offloads a significant class of caching (avoiding redundant API calls) to the browser, which changes how you think about [ResponseCache] and HTTP cache headers.
Installation
# Core cache manager
npm install @nestjs/cache-manager cache-manager
# Redis store
npm install @keyv/redis keyv
# For ETag / HTTP cache header utilities
npm install etag
In-Memory Caching
// app.module.ts
import { Module } from '@nestjs/common';
import { CacheModule } from '@nestjs/cache-manager';
@Module({
imports: [
CacheModule.register({
isGlobal: true,
ttl: 60 * 1000, // Default TTL: 60 seconds (milliseconds in cache-manager v5+)
max: 1000, // Maximum number of items in the in-memory store
}),
],
})
export class AppModule {}
// product.service.ts
import { Injectable, Inject } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
import { ProductRepository } from './product.repository';
import { Product } from './product.entity'; // adjust to wherever your Product type lives
@Injectable()
export class ProductService {
constructor(
@Inject(CACHE_MANAGER) private readonly cache: Cache,
private readonly repo: ProductRepository,
) {}
async getProduct(id: string): Promise<Product | null> {
const cacheKey = `product:${id}`;
// Try cache first — equivalent to _cache.TryGetValue(...)
const cached = await this.cache.get<Product>(cacheKey);
if (cached !== undefined) {
return cached;
}
// Cache miss — load from DB
const product = await this.repo.findById(id);
if (product) {
// Store with a specific TTL (ms) — equivalent to AbsoluteExpirationRelativeToNow
await this.cache.set(cacheKey, product, 10 * 60 * 1000); // 10 minutes
}
return product ?? null;
}
// Cache invalidation — equivalent to _cache.Remove(key)
async invalidateProduct(id: string): Promise<void> {
await this.cache.del(`product:${id}`);
}
}
Redis Integration
For distributed caching across multiple instances, swap the in-memory store for Redis:
// app.module.ts
import { CacheModule } from '@nestjs/cache-manager';
import { createKeyv } from '@keyv/redis';
@Module({
imports: [
CacheModule.registerAsync({
isGlobal: true,
useFactory: () => ({
stores: [
createKeyv(process.env.REDIS_URL ?? 'redis://localhost:6379', {
namespace: 'myapp', // Key prefix — equivalent to InstanceName in .NET
}),
],
ttl: 60 * 1000,
}),
}),
],
})
export class AppModule {}
The ProductService code does not change — the Cache interface is the same regardless of the backing store. This mirrors how IDistributedCache abstracts over the backing store in .NET, except NestJS handles serialization automatically (values are serialized to JSON internally).
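One consequence of that automatic serialization is worth calling out: once a value round-trips through the Redis store it is whatever JSON can represent, so a Date comes back as an ISO string rather than a Date instance. A minimal sketch of the pitfall (the standalone function is just for illustration; in practice this happens inside a service with the Cache injected):
// date-round-trip.example.ts — illustrative only
import { Cache } from 'cache-manager';

async function dateRoundTripExample(cache: Cache): Promise<void> {
  const product = { id: '42', name: 'Widget', updatedAt: new Date() };
  await cache.set('product:42', product, 60_000);

  const cached = await cache.get<typeof product>('product:42');
  // With the Redis store, cached.updatedAt is now an ISO-8601 string, not a Date.
  // Revive it explicitly if downstream code expects a Date instance:
  const updatedAt = cached ? new Date(cached.updatedAt) : undefined;
  console.log(updatedAt);
}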
Multi-Layer Caching: L1 (In-Memory) + L2 (Redis)
The most robust production setup mirrors what some .NET teams implement manually: an in-memory L1 cache in front of Redis L2. Cache hits in L1 cost microseconds; Redis hits cost ~1ms.
// app.module.ts — two-tier cache
import { CacheModule } from '@nestjs/cache-manager';
import { createKeyv } from '@keyv/redis';
import Keyv from 'keyv';
@Module({
imports: [
CacheModule.registerAsync({
isGlobal: true,
useFactory: () => ({
stores: [
// L1: in-memory (fast, per-instance)
new Keyv({ ttl: 30 * 1000 }), // 30 second in-memory cache
// L2: Redis (shared, survives restarts)
createKeyv(process.env.REDIS_URL, { namespace: 'myapp' }),
],
// cache-manager checks L1 first, falls back to L2, then writes back to L1
}),
}),
],
})
export class AppModule {}
Cache Interceptor (the [ResponseCache] Equivalent)
NestJS ships a CacheInterceptor that caches entire controller responses, equivalent to [ResponseCache] on an ASP.NET Core action. Apply it at the controller or method level:
// products.controller.ts
import { Controller, Get, Param, UseInterceptors } from '@nestjs/common';
import { CacheInterceptor, CacheTTL, CacheKey } from '@nestjs/cache-manager';
@Controller('products')
@UseInterceptors(CacheInterceptor) // Cache all responses from this controller
export class ProductsController {
constructor(private readonly productService: ProductService) {}
@Get()
@CacheTTL(30 * 1000) // Override TTL: 30 seconds for this endpoint
async listProducts() {
return this.productService.findAll();
}
@Get(':id')
@CacheKey('product-by-id') // Custom cache key prefix
@CacheTTL(10 * 60 * 1000) // 10 minutes
async getProduct(@Param('id') id: string) {
return this.productService.getProduct(id);
}
}
Apply the interceptor globally (equivalent to app.UseResponseCaching() globally):
// app.module.ts
import { Module } from '@nestjs/common';
import { CacheInterceptor } from '@nestjs/cache-manager';
import { APP_INTERCEPTOR } from '@nestjs/core';

@Module({
  providers: [
    {
      provide: APP_INTERCEPTOR,
      useClass: CacheInterceptor,
    },
  ],
})
export class AppModule {}
The CacheInterceptor only caches GET requests and uses the URL as the cache key by default. For more sophisticated key generation (varying by user, query params, or custom headers), extend the interceptor:
// custom-cache.interceptor.ts
import { CacheInterceptor } from '@nestjs/cache-manager';
import { Injectable, ExecutionContext } from '@nestjs/common';
@Injectable()
export class UserAwareCacheInterceptor extends CacheInterceptor {
// Override the key generation to include the authenticated user ID
// Equivalent to VaryByHeader or custom IResponseCachePolicy in .NET
protected trackBy(context: ExecutionContext): string | undefined {
const request = context.switchToHttp().getRequest();
const baseKey = super.trackBy(context);
if (!baseKey) return undefined;
// Scope cached data per user — prevents user A seeing user B's cached data
const userId = (request.user as { id: string } | undefined)?.id ?? 'anon';
return `${baseKey}:${userId}`;
}
}
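Applying the custom interceptor works exactly like the stock one. A sketch (OrdersController, its service, and findForUser are illustrative names, not from the examples above):
// orders.controller.ts — responses cached per URL + authenticated user via trackBy()
import { Controller, Get, Req, UseInterceptors } from '@nestjs/common';
import { CacheTTL } from '@nestjs/cache-manager';
import { UserAwareCacheInterceptor } from './custom-cache.interceptor';
import { OrdersService } from './orders.service';

@Controller('orders')
@UseInterceptors(UserAwareCacheInterceptor)
export class OrdersController {
  constructor(private readonly ordersService: OrdersService) {}

  @Get()
  @CacheTTL(30 * 1000) // 30 seconds, keyed by URL + user ID via the custom trackBy()
  async listMyOrders(@Req() req: { user?: { id: string } }) {
    return this.ordersService.findForUser(req.user?.id);
  }
}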
Cache Invalidation Strategies
Cache invalidation is where most .NET engineers already know the hard truths. The patterns translate directly:
// invalidation.service.ts
import { Injectable, Inject } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
import { Redis } from 'ioredis';

@Injectable()
export class CacheInvalidationService {
  constructor(
    @Inject(CACHE_MANAGER) private readonly cache: Cache,
    // Direct Redis client for operations cache-manager does not expose;
    // register it yourself (e.g. via @nestjs-modules/ioredis or a custom provider)
    @Inject('REDIS_CLIENT') private readonly redis: Redis,
  ) {}

  // Single key invalidation — equivalent to _cache.Remove(key)
  async invalidateProduct(id: string): Promise<void> {
    await this.cache.del(`product:${id}`);
  }

  // Pattern-based invalidation ("delete every key matching product:category:<category>:*")
  // is not supported by cache-manager natively; it requires direct Redis access with
  // SCAN + DEL. See Gotcha #3 below for a full implementation.

  // Tag-based invalidation with Redis Sets.
  // No direct .NET equivalent, but far cheaper than scanning key patterns:
  // when setting a cache entry, record its key in a tag set, then invalidate
  // every member of the set when the tag is invalidated.
  async setWithTag(key: string, value: unknown, ttlMs: number, tags: string[]): Promise<void> {
    await this.cache.set(key, value, ttlMs);
    // Record the key in each tag's set (tags expire slightly after the cache entries)
    for (const tag of tags) {
      await this.redis.sadd(`tag:${tag}`, key);
      await this.redis.expire(`tag:${tag}`, Math.ceil(ttlMs / 1000) + 60);
    }
  }

  async invalidateByTag(tag: string): Promise<void> {
    const keys = await this.redis.smembers(`tag:${tag}`);
    if (keys.length > 0) {
      await Promise.all(keys.map((key) => this.cache.del(key)));
      await this.redis.del(`tag:${tag}`);
    }
  }
}
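Wiring the tag helpers into the product flow might look like the sketch below. These are methods inside the earlier ProductService, assuming it injects CacheInvalidationService as cacheInvalidation, the repository exposes findByCategory and update, and Product has a category field (all illustrative assumptions):
// product.service.ts — tag list entries on write, invalidate by tag on update
async getProductsByCategory(category: string): Promise<Product[]> {
  const key = `products:category:${category}`;
  const cached = await this.cache.get<Product[]>(key);
  if (cached !== undefined) return cached;

  const products = await this.repo.findByCategory(category);
  // Tag the entry so a later write to this category can invalidate it in one call
  await this.cacheInvalidation.setWithTag(key, products, 60 * 1000, [`category:${category}`]);
  return products;
}

async updateProduct(id: string, changes: Partial<Product>): Promise<Product> {
  const product = await this.repo.update(id, changes);
  await this.cacheInvalidation.invalidateProduct(id);
  await this.cacheInvalidation.invalidateByTag(`category:${product.category}`);
  return product;
}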
ETag Support
ETags let the browser cache a response and revalidate it cheaply — the server returns 304 Not Modified with no body if the resource hasn’t changed. ASP.NET Core handles this automatically for static files; for MVC action responses you typically add a filter or middleware. The same is true in NestJS: add it with a small middleware:
// middleware/etag.middleware.ts
import { Injectable, NestMiddleware } from '@nestjs/common';
import { Request, Response, NextFunction } from 'express';
import * as etag from 'etag';
@Injectable()
export class ETagMiddleware implements NestMiddleware {
use(req: Request, res: Response, next: NextFunction): void {
// Intercept response and add ETag header
const originalSend = res.send.bind(res);
res.send = (body: unknown): Response => {
if (req.method === 'GET' && res.statusCode === 200) {
const bodyStr = typeof body === 'string' ? body : JSON.stringify(body);
const tag = etag(bodyStr);
res.setHeader('ETag', tag);
res.setHeader('Cache-Control', 'private, must-revalidate');
// Check If-None-Match header — 304 if ETag matches
if (req.headers['if-none-match'] === tag) {
res.statusCode = 304;
return originalSend('');
}
}
return originalSend(body);
};
next();
}
}
// app.module.ts
import { Module, MiddlewareConsumer, NestModule } from '@nestjs/common';
import { ETagMiddleware } from './middleware/etag.middleware';

@Module({
  // ...imports, controllers, providers
})
export class AppModule implements NestModule {
  configure(consumer: MiddlewareConsumer) {
    consumer.apply(ETagMiddleware).forRoutes('*');
  }
}
Client-Side Caching with TanStack Query
This is a layer .NET engineers often underestimate because the server renders everything in server-centric apps. In a React or Vue SPA consuming a NestJS API, TanStack Query (formerly React Query) provides in-browser caching of server state. It reduces the number of API calls, handles loading/error states, and manages cache staleness — without any server-side code.
// hooks/useProduct.ts (React)
import { useQuery, useMutation, useQueryClient } from '@tanstack/react-query';
// Query key factory — use strings/arrays as stable cache keys
const productKeys = {
all: ['products'] as const,
list: (filters: ProductFilters) => ['products', 'list', filters] as const,
detail: (id: string) => ['products', 'detail', id] as const,
};
// Fetching with automatic caching
export function useProduct(id: string) {
return useQuery({
queryKey: productKeys.detail(id),
queryFn: () => fetch(`/api/products/${id}`).then(r => r.json()),
staleTime: 5 * 60 * 1000, // Data considered fresh for 5 minutes — no refetch
gcTime: 10 * 60 * 1000, // Keep in cache (unused) for 10 minutes
retry: 3, // Retry failed requests 3 times
});
}
// Mutation with cache invalidation
export function useUpdateProduct() {
const queryClient = useQueryClient();
return useMutation({
mutationFn: (data: UpdateProductDto) =>
fetch(`/api/products/${data.id}`, {
method: 'PUT',
body: JSON.stringify(data),
headers: { 'Content-Type': 'application/json' },
}).then(r => r.json()),
// Invalidate and refetch after update — equivalent to removing the key from IMemoryCache
onSuccess: (updatedProduct) => {
// Invalidate the list (may have changed)
queryClient.invalidateQueries({ queryKey: productKeys.all });
// Optimistically update the detail cache with the new data
queryClient.setQueryData(productKeys.detail(updatedProduct.id), updatedProduct);
},
});
}
The mental model shift: TanStack Query is not a replacement for server-side caching. It is an additional layer. A request that TanStack Query serves from its cache never reaches the server. A request that does reach the server can still hit the NestJS cache before reaching the database.
CDN and HTTP Cache Headers on Render
For public, non-personalized content, set HTTP cache headers and let a CDN handle it:
// products.controller.ts
import { Controller, Get, Param, Res } from '@nestjs/common';
import { Response } from 'express';
import { ProductService } from './product.service';

@Controller('products')
export class ProductsController {
constructor(private readonly productService: ProductService) {}
@Get('catalog')
async getPublicCatalog(@Res({ passthrough: true }) res: Response) {
// Equivalent to [ResponseCache(Duration = 300, Location = ResponseCacheLocation.Any)]
res.setHeader('Cache-Control', 'public, max-age=300, s-maxage=3600, stale-while-revalidate=60');
res.setHeader('Vary', 'Accept-Encoding, Accept-Language');
return this.productService.getPublicCatalog();
}
@Get(':id')
async getProduct(@Param('id') id: string, @Res({ passthrough: true }) res: Response) {
// Private — per-user, not cacheable by CDN
res.setHeader('Cache-Control', 'private, max-age=60, must-revalidate');
return this.productService.getProduct(id);
}
}
On Render, static assets and public API responses are cached by Render’s built-in CDN automatically when Cache-Control: public is set.
Key Differences
| Concept | ASP.NET Core | NestJS |
|---|---|---|
| In-memory cache | IMemoryCache | @nestjs/cache-manager (in-memory store) |
| Distributed cache | IDistributedCache + .AddStackExchangeRedisCache() | @nestjs/cache-manager + @keyv/redis store |
| Cache interface | IMemoryCache.TryGetValue / IDistributedCache.GetAsync | cache.get<T>(key) / cache.set(key, value, ttl) |
| Response caching | [ResponseCache] attribute | @UseInterceptors(CacheInterceptor) |
| HTTP cache middleware | app.UseResponseCaching() | Custom middleware or the express-cache-controller package |
| Serialization | Manual (JsonSerializer) for IDistributedCache | Automatic (cache-manager serializes internally) |
| Key prefix / namespace | InstanceName in options | namespace in Keyv store config |
| Cache TTL | AbsoluteExpirationRelativeToNow | ttl in milliseconds (v5+) |
| Sliding expiration | SlidingExpiration | Not natively in cache-manager — refresh the TTL manually on each hit (see the sketch after this table) |
| Pattern invalidation | SCAN + DEL in StackExchange.Redis | Direct ioredis required — not in cache-manager abstraction |
| ETag support | Built-in for static files; filter/middleware for MVC responses | Manual middleware or the etag package |
| CDN caching | Cache-Control headers + Azure CDN | Cache-Control headers + Render CDN / Cloudflare |
| Client-side cache | Not in scope for server code | TanStack Query (browser) — major new layer |
| Output caching | [OutputCache] (.NET 7+) | CacheInterceptor (similar) |
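On the sliding-expiration row: cache-manager has no SlidingExpiration equivalent, but you can emulate it by re-setting an entry with its TTL on every hit, at the cost of an extra write. A minimal sketch (the helper name is illustrative):
// sliding-ttl.helper.ts — emulate SlidingExpiration: every hit pushes the expiry forward
import { Cache } from 'cache-manager';

export async function getWithSlidingTtl<T>(
  cache: Cache,
  key: string,
  ttlMs: number,
): Promise<T | undefined> {
  const value = await cache.get<T>(key);
  if (value !== undefined) {
    // Re-writing the same value resets the TTL, mimicking sliding expiration
    await cache.set(key, value, ttlMs);
  }
  return value;
}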
Gotchas for .NET Engineers
1. cache-manager v5 changed TTL units from seconds to milliseconds
If you find examples online showing ttl: 60 and wonder why your cache expires in 60 milliseconds rather than 60 seconds, you have hit the breaking change in cache-manager v5. In v4 and earlier, TTL was in seconds. In v5, TTL is in milliseconds. This change affects every cache.set() call and the default TTL in CacheModule.register().
// Wrong — using seconds (cache-manager v4 style) — data expires in 0.01 seconds
await this.cache.set('key', value, 10);
// Correct — using milliseconds (cache-manager v5+)
await this.cache.set('key', value, 10 * 1000); // 10 seconds
Check your cache-manager version and read the CHANGELOG before copying examples. Most StackOverflow answers and many blog posts still show the v4 API.
2. cache.get() returns undefined for a cache miss, not null
In .NET, IDistributedCache.GetAsync() returns null when a key is not found. In cache-manager, cache.get() returns undefined. This means a simple if (cached) check will incorrectly treat a cached value of false, 0, or empty string as a cache miss.
// Wrong — treats falsy cached values as cache misses
const cached = await this.cache.get<number>('score');
if (cached) { // Fails when cached score is 0
return cached;
}
// Correct — check for undefined explicitly
const cached = await this.cache.get<number>('score');
if (cached !== undefined) {
return cached;
}
This distinction also matters if you intentionally cache null to record a “known non-existent” entry (the cache-aside null-caching pattern). cache-manager will store null as a valid value — the check must be !== undefined.
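If you adopt that null-caching pattern, the cache-aside flow from earlier changes only slightly. A sketch, reusing the ProductService fields from above (the 1-minute negative TTL is an arbitrary choice):
// product.service.ts — cache-aside with negative caching for missing IDs
async getProduct(id: string): Promise<Product | null> {
  const key = `product:${id}`;
  const cached = await this.cache.get<Product | null>(key);
  if (cached !== undefined) {
    return cached; // may legitimately be null: a cached "not found"
  }

  const product = await this.repo.findById(id);
  if (product) {
    await this.cache.set(key, product, 10 * 60 * 1000); // 10 minutes
  } else {
    // Remember the miss briefly so repeated lookups for a bad ID don't hit the database
    await this.cache.set(key, null, 60 * 1000); // 1 minute
  }
  return product ?? null;
}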
3. Pattern-based cache invalidation requires bypassing the cache-manager abstraction
IDistributedCache in .NET doesn’t support pattern deletion either — you’d use StackExchange.Redis.IDatabase directly for KEYS or SCAN. The same is true in NestJS: cache-manager only provides get, set, del, and reset. For “delete all keys matching product:*”, you need a direct ioredis client.
// cache.service.ts — inject ioredis alongside cache-manager for pattern operations
// (assumes a REDIS_CLIENT provider registered elsewhere, e.g. a custom provider wrapping ioredis)
import { Injectable, Inject } from '@nestjs/common';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
import { Redis } from 'ioredis';
@Injectable()
export class CacheService {
constructor(
@Inject(CACHE_MANAGER) private readonly cache: Cache,
@Inject('REDIS_CLIENT') private readonly redis: Redis,
) {}
async invalidateByPrefix(prefix: string): Promise<void> {
// SCAN is safer than KEYS in production — non-blocking
let cursor = '0';
const keysToDelete: string[] = [];
do {
const [nextCursor, keys] = await this.redis.scan(
cursor,
'MATCH', `myapp:${prefix}:*`,
'COUNT', 100,
);
cursor = nextCursor;
keysToDelete.push(...keys);
} while (cursor !== '0');
if (keysToDelete.length > 0) {
await this.redis.del(...keysToDelete);
}
}
}
Never use KEYS * in production — it is a blocking O(N) operation that will pause Redis for the duration. Always use SCAN.
4. CacheInterceptor caches based on the request URL — not on the response content
If two requests hit the same URL but receive different responses (because the handler has side effects, or the database changed between requests), CacheInterceptor returns the same cached response for both. This is correct behavior — but it means you must not apply CacheInterceptor to endpoints whose responses depend on:
- The authenticated user’s identity (unless you override trackBy() to include the user ID in the key)
- Request headers like Accept-Language or Authorization
- Write operations (POST, PUT, DELETE) — the interceptor already skips non-GET requests, but be aware
The trackBy() method is your VaryByHeader / VaryByQueryKeys equivalent. Override it whenever the response varies by something other than the URL.
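As another sketch, a trackBy() override that varies the key by Accept-Language (the class name is illustrative):
// language-aware-cache.interceptor.ts — vary the cache key by Accept-Language
import { CacheInterceptor } from '@nestjs/cache-manager';
import { ExecutionContext, Injectable } from '@nestjs/common';

@Injectable()
export class LanguageAwareCacheInterceptor extends CacheInterceptor {
  protected trackBy(context: ExecutionContext): string | undefined {
    const baseKey = super.trackBy(context);
    if (!baseKey) return undefined;
    const request = context.switchToHttp().getRequest();
    const language = request.headers['accept-language'] ?? 'default';
    return `${baseKey}:${language}`;
  }
}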
5. TanStack Query staleTime and gcTime are not the same thing — and developers confuse them constantly
Coming from server-side caching, you think of one TTL. TanStack Query has two:
- staleTime: How long data is considered fresh. During this period, the hook returns cached data without any background refetch — roughly equivalent to max-age in HTTP caching.
- gcTime (formerly cacheTime): How long unused data stays in the in-memory cache before being garbage collected — think of it as the IMemoryCache TTL that starts once the component unmounts.
useQuery({
queryKey: ['products'],
queryFn: fetchProducts,
staleTime: 5 * 60 * 1000, // 5 min: no network request if data is this fresh
gcTime: 10 * 60 * 1000, // 10 min: keep in memory even after component unmounts
});
// What this means in practice:
// - 0 to 5 min after last fetch: returns cached data, no network request
// - After 5 min (data is stale): returns cached data BUT triggers background refetch
// - After component unmounts: data stays in cache for 10 min in case it's needed again
// - After 10 min unmounted: data is garbage collected
The default staleTime is 0 — data is stale immediately, so every new mount, window refocus, or reconnect triggers a background refetch. For data that doesn’t change often (product catalog, user profile), set a meaningful staleTime.
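If most queries in the app share the same freshness rules, set them once as QueryClient defaults instead of repeating them in every hook (the values here are illustrative):
// queryClient.ts — app-wide defaults; individual hooks override only when needed
import { QueryClient } from '@tanstack/react-query';

export const queryClient = new QueryClient({
  defaultOptions: {
    queries: {
      staleTime: 60 * 1000,   // treat data as fresh for 1 minute by default
      gcTime: 10 * 60 * 1000, // keep unused data in memory for 10 minutes
      retry: 2,               // retry failed requests twice
    },
  },
});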
6. In-memory cache is per-instance and is lost on restart
This is the same limitation as IMemoryCache in .NET — but it catches Node.js engineers by surprise more often because Node.js processes restart more frequently (PM2, container restarts, Render re-deploys). Every restart warms the cache from zero.
Design for this: warm critical caches on startup using a @Timeout() job (see Article 4.6), and prefer Redis for any data where cache misses under load are a problem. Use in-memory only as an L1 in front of Redis, or for data that is genuinely cheap to recompute.
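A sketch of that startup warm-up, assuming @nestjs/schedule is installed and the repository exposes a findTopViewed() method (both are assumptions, not part of the earlier examples):
// cache-warming.service.ts — pre-populate the cache once on startup
import { Injectable, Inject, Logger } from '@nestjs/common';
import { Timeout } from '@nestjs/schedule';
import { CACHE_MANAGER } from '@nestjs/cache-manager';
import { Cache } from 'cache-manager';
import { ProductRepository } from './product.repository';

@Injectable()
export class CacheWarmingService {
  private readonly logger = new Logger(CacheWarmingService.name);

  constructor(
    @Inject(CACHE_MANAGER) private readonly cache: Cache,
    private readonly repo: ProductRepository,
  ) {}

  @Timeout(0) // runs once, immediately after the application starts
  async warmProductCache(): Promise<void> {
    // findTopViewed() is a hypothetical repository method; adapt to your schema
    const products = await this.repo.findTopViewed(20);
    await Promise.all(
      products.map((p) => this.cache.set(`product:${p.id}`, p, 10 * 60 * 1000)),
    );
    this.logger.log(`Warmed cache with ${products.length} products`);
  }
}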
Hands-On Exercise
Build a layered caching system for a product catalog API.
Requirements:
- Set up @nestjs/cache-manager with a two-tier configuration: a 30-second in-memory L1 and a 10-minute Redis L2.
- In ProductService, implement the cache-aside pattern for:
  - getProduct(id) — cache individual products for 10 minutes; cache the null result (key not found) for 1 minute to prevent a thundering herd on invalid IDs
  - listProducts(category, page) — cache paginated lists for 60 seconds, keyed by products:list:${category}:${page}
- Create a ProductCacheService that handles tag-based invalidation:
  - When a product is updated, invalidate product:${id} and all products:list:* keys matching the product’s category
  - Use SCAN rather than KEYS for pattern deletion
- Apply CacheInterceptor to the GET /products/catalog endpoint (public, no auth), with a custom trackBy() that includes the Accept-Language header in the cache key for multi-language support.
- Add ETag support to GET /products/:id so that clients receive a 304 when the product hasn’t changed. Use a hash of updatedAt as the ETag value.
- Create a React hook useProductList(category: string, page: number) using TanStack Query with:
  - staleTime: 60 * 1000 (match the server cache TTL)
  - gcTime: 5 * 60 * 1000
  - A proper query key using the factory pattern
Stretch goal: Implement a cache warming job using @Timeout(0) that, on startup, loads the 20 most-viewed products from the database and pre-populates the Redis cache. Add a BullMQ job to refresh the top-20 list every hour.
Quick Reference
| Task | ASP.NET Core | NestJS |
|---|---|---|
| Register in-memory cache | AddMemoryCache() | CacheModule.register({ max: 1000 }) |
| Register Redis cache | AddStackExchangeRedisCache(...) | CacheModule.register({ stores: [createKeyv(url)] }) |
| Get from cache | _cache.TryGetValue(key, out T value) | await cache.get<T>(key) (returns undefined on miss) |
| Set in cache | _cache.Set(key, value, options) | await cache.set(key, value, ttlMs) |
| Delete from cache | _cache.Remove(key) | await cache.del(key) |
| Clear all | (no built-in for IMemoryCache) | await cache.clear() |
| Inject cache | IMemoryCache or IDistributedCache | @Inject(CACHE_MANAGER) private cache: Cache |
| HTTP response caching | [ResponseCache(Duration = 60)] | @UseInterceptors(CacheInterceptor) + @CacheTTL(60000) |
| Vary by user | VaryByHeader = "Authorization" | Override trackBy() in custom CacheInterceptor |
| Global interceptor | app.UseResponseCaching() | { provide: APP_INTERCEPTOR, useClass: CacheInterceptor } |
| TTL unit | Seconds (TimeSpan) | Milliseconds (cache-manager v5+) |
| Cache miss value | null | undefined |
| Serialization | Manual (JsonSerializer) | Automatic (cache-manager handles it) |
| Key prefix | InstanceName option | namespace in Keyv store |
| Pattern invalidation | IServer.Keys() (SCAN) + KeyDeleteAsync | Direct ioredis SCAN + DEL |
| ETag | Built-in for static files; middleware/filter for MVC | Manual middleware or the etag npm package |
| Client-side caching | Not applicable | TanStack Query (useQuery, staleTime, gcTime) |
| CDN / public cache | Cache-Control: public, max-age=300 | Same header via res.setHeader() |
Cache-Control header cheat sheet:
// Private, short-lived (user-specific data)
res.setHeader('Cache-Control', 'private, max-age=60, must-revalidate');
// Public, CDN-cacheable (catalog, static content)
res.setHeader('Cache-Control', 'public, max-age=300, s-maxage=3600, stale-while-revalidate=60');
// Never cache (sensitive data, real-time)
res.setHeader('Cache-Control', 'no-store, no-cache, must-revalidate');
// Revalidate every time but allow stale while checking
res.setHeader('Cache-Control', 'no-cache'); // Always revalidate with server
TanStack Query decision guide:
| Data type | staleTime | gcTime | Notes |
|---|---|---|---|
| User profile (infrequent changes) | 5 min | 10 min | Invalidate on update mutation |
| Product catalog (changes daily) | 5 min | 10 min | Background refetch acceptable |
| Shopping cart (session-bound) | 0 (always fresh) | 5 min | User actions mutate frequently |
| Real-time data (prices, stock) | 0 | 1 min | Refetch on window focus |
| Static reference data (countries) | 1 hour | 24 hours | Rarely changes |
Further Reading
- NestJS Caching — the official @nestjs/cache-manager guide, including Redis configuration and the CacheInterceptor
- cache-manager v5 Migration Guide — covers the seconds-to-milliseconds TTL change and the new store API
- TanStack Query — Caching Overview — the definitive explanation of staleTime, gcTime, and the query lifecycle
- TanStack Query — Query Key Factories — the community-recommended pattern for managing query keys at scale
- Redis Keyspace Notifications — for building reactive cache invalidation triggered by Redis key expiry events
- HTTP Caching — MDN — the authoritative reference for Cache-Control, ETags, and conditional requests
- Cloudflare Cache Rules — configuring CDN caching without changing application code, complementing your Cache-Control headers