Server-Timing Headers — Debug Slow Pages Without Guessing
Updated April 2026
Your API takes 800ms. Is it the database? The cache miss? The template render? Without Server-Timing headers, you are guessing. With them, the browser DevTools show you exactly where the time went.
The Server-Timing header format
# Single metric
Server-Timing: db;dur=45.2

# Multiple metrics
Server-Timing: db;dur=45.2, cache;dur=2.1, render;dur=12.8

# With description
Server-Timing: db;desc="User query";dur=45.2, cache;desc="Redis lookup";dur=2.1
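To make the format concrete, here is a rough sketch of a parser for values like the ones above. It is a hypothetical helper, not a full RFC 8941 structured-field parser, and it assumes descriptions contain no commas or semicolons:

```javascript
// Parse a Server-Timing header value into an array of { name, desc, dur }.
// Each comma-separated entry is a metric name plus optional ;desc="..." and ;dur=... params.
function parseServerTiming(value) {
  return value.split(",").map((entry) => {
    const [name, ...params] = entry.trim().split(";");
    const metric = { name: name.trim() };
    for (const p of params) {
      const [key, raw] = p.split("=");
      if (key.trim() === "dur") metric.dur = parseFloat(raw);
      if (key.trim() === "desc") metric.desc = raw.replace(/^"|"$/g, "");
    }
    return metric;
  });
}
```

For example, `parseServerTiming('db;desc="User query";dur=45.2, cache;dur=2.1')` yields one object per metric, with `dur` as a number.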
Add to your backend
Express / Node.js
app.get("/api/dashboard", async (req, res) => {
  const timings = [];

  // Measure database query
  const dbStart = performance.now();
  const user = await db.query("SELECT * FROM users WHERE id = ?", [req.userId]);
  timings.push(`db;desc="User query";dur=${(performance.now() - dbStart).toFixed(1)}`);

  // Measure cache lookup
  const cacheStart = performance.now();
  const cached = await redis.get(`dashboard:${req.userId}`);
  timings.push(`cache;desc="Redis";dur=${(performance.now() - cacheStart).toFixed(1)}`);

  // Measure external API
  const apiStart = performance.now();
  const metrics = await fetchMetrics(req.userId);
  timings.push(`metrics_api;desc="Metrics service";dur=${(performance.now() - apiStart).toFixed(1)}`);

  res.setHeader("Server-Timing", timings.join(", "));
  res.json({ user, metrics });
});
FastAPI / Python
import time
from fastapi import Response
@app.get("/api/dashboard")
async def dashboard(response: Response, user_id: int):
    timings = []

    t = time.perf_counter()
    user = await db.fetch_user(user_id)
    timings.append(f'db;desc="User query";dur={((time.perf_counter() - t) * 1000):.1f}')

    t = time.perf_counter()
    cached = await redis.get(f"dashboard:{user_id}")
    timings.append(f'cache;desc="Redis";dur={((time.perf_counter() - t) * 1000):.1f}')

    response.headers["Server-Timing"] = ", ".join(timings)
    return {"user": user}
Django middleware
import time
class ServerTimingMiddleware:
    def __init__(self, get_response):
        self.get_response = get_response

    def __call__(self, request):
        start = time.perf_counter()
        response = self.get_response(request)
        duration = (time.perf_counter() - start) * 1000
        response["Server-Timing"] = f'total;dur={duration:.1f}'
        return response
View in Chrome DevTools
Open DevTools → Network → click any request → Timing tab. Your Server-Timing metrics appear at the bottom of the waterfall breakdown, each as a labelled bar with its duration.
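The same metrics are exposed to page JavaScript through the PerformanceServerTiming API (entries have `name`, `description`, and `duration` fields; cross-origin responses must also send a Timing-Allow-Origin header to be readable). `summarizeServerTiming` below is a hypothetical helper, not a browser built-in:

```javascript
// Turn a performance entry's serverTiming array into a readable one-line summary.
function summarizeServerTiming(entry) {
  return entry.serverTiming
    .map((m) => `${m.name}${m.description ? ` (${m.description})` : ""}: ${m.duration}ms`)
    .join(", ");
}

// In the browser:
// const [nav] = performance.getEntriesByType("navigation");
// console.log(summarizeServerTiming(nav));
```

This is handy for shipping the timings to your own analytics instead of only eyeballing them in DevTools.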
Add a reusable timing utility
// utils/timing.js — reuse across routes
class ServerTimer {
  constructor() {
    this.metrics = [];
  }

  start(name, desc) {
    return { name, desc, start: performance.now() };
  }

  end(timer) {
    const dur = (performance.now() - timer.start).toFixed(1);
    this.metrics.push(
      timer.desc
        ? `${timer.name};desc="${timer.desc}";dur=${dur}`
        : `${timer.name};dur=${dur}`
    );
  }

  header() {
    return this.metrics.join(", ");
  }
}
// Usage
const t = new ServerTimer();
const dbTimer = t.start("db", "User query");
const user = await db.getUser(id);
t.end(dbTimer);
res.setHeader("Server-Timing", t.header());
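To avoid repeating the setup in every route, the timer can be attached once in middleware. This is a sketch under assumptions: `serverTimingMiddleware` and `res.locals.time` are hypothetical names, not part of Express, and the `res.end` patch assumes the header is set before headers are flushed:

```javascript
// Express-style middleware: gives each request a time(name, desc) helper and
// writes the Server-Timing header once, just before the response is sent.
function serverTimingMiddleware(req, res, next) {
  const metrics = [];
  res.locals = res.locals || {};

  // time(name, desc) starts a measurement and returns a stop() function
  res.locals.time = (name, desc) => {
    const start = performance.now();
    return () => {
      const dur = (performance.now() - start).toFixed(1);
      metrics.push(desc ? `${name};desc="${desc}";dur=${dur}` : `${name};dur=${dur}`);
    };
  };

  // Patch res.end so the header goes out even if a handler forgets to set it
  const originalEnd = res.end.bind(res);
  res.end = (...args) => {
    if (metrics.length) res.setHeader("Server-Timing", metrics.join(", "));
    return originalEnd(...args);
  };

  next();
}
```

In a handler this reads as `const stop = res.locals.time("db", "User query"); await db.getUser(id); stop();` with no header bookkeeping per route.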
One caution: Server-Timing is visible in the browser's Network tab to anyone with DevTools open, not just to you. Use generic metric names, and never include database hostnames, internal service names, or query content in names or descriptions.
Audit your performance headers → EdgeFix