Designing Idempotent APIs: Why Your POST Endpoint Needs to Handle Duplicates
A user clicks Buy. Nothing happens. They click again. Two charges.

An idempotent operation produces the same result whether it runs once or N times. GET, PUT, and DELETE are idempotent; POST is not. Duplicates show up in practice through:

- Network retries: the mobile app retries on timeout, but the server already processed the first request.
- Load balancer retries: an upstream timeout triggers a retry to a different backend.
- User double-clicks: the button wasn't disabled fast enough.

Without idempotency, each retry creates a duplicate. The fix: the client generates a UUID and sends it as a header, and the server checks it before processing.

```
POST /api/orders
Idempotency-Key: 550e8400-e29b-41d4-a716-446655440000

{"product_id": "prod_123", "quantity": 2}
```
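On the client side, the crucial detail is that the key is generated once per logical operation and reused on every retry. A minimal sketch — `postWithRetry` and the flaky transport are illustrative stand-ins, not a real API:

```typescript
import { randomUUID } from "node:crypto";

// Hypothetical transport; a real one would POST to /api/orders with
// the key in the Idempotency-Key header.
type Send = (key: string, body: unknown) => Promise<{ status: number }>;

async function postWithRetry(
  send: Send,
  body: unknown,
  maxAttempts = 3
): Promise<{ status: number }> {
  const key = randomUUID(); // one key per logical operation...
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await send(key, body); // ...reused on every retry
    } catch (err) {
      lastError = err; // e.g. a timeout: retry with the SAME key
    }
  }
  throw lastError;
}

// Demo: a transport that times out once, then succeeds.
const seenKeys: string[] = [];
let calls = 0;
const flaky: Send = async (key) => {
  seenKeys.push(key);
  if (++calls === 1) throw new Error("timeout");
  return { status: 201 };
};

postWithRetry(flaky, { product_id: "prod_123", quantity: 2 }).then((res) => {
  console.log(res.status, seenKeys[0] === seenKeys[1]); // → 201 true
});
```

Because both attempts carry the same key, the server can recognize the second attempt as a duplicate of the first.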
On the server: check whether the key exists in Redis. If it does, return the cached response; if not, process the request and cache the result.

```typescript
import { Request, Response, NextFunction } from "express";
import Redis from "ioredis";

const redis = new Redis(process.env.REDIS_URL!);
const TTL = 86400; // 24 hours

interface CachedResponse {
  status: number;
  body: unknown;
}

export function idempotency() {
  return async (req: Request, res: Response, next: NextFunction) => {
    if (req.method !== "POST") return next();

    const key = req.headers["idempotency-key"] as string | undefined;
    if (!key) return next();

    const cacheKey = `idem:${req.path}:${key}`;

    // Replay the cached response if we've seen this key before
    const cached = await redis.get(cacheKey);
    if (cached) {
      const r: CachedResponse = JSON.parse(cached);
      return res.status(r.status).json(r.body);
    }

    // Lock to prevent concurrent duplicates; the 30s expiry frees the
    // lock if the handler dies before responding
    const lock = await redis.set(`lock:${cacheKey}`, "1", "EX", 30, "NX");
    if (!lock) {
      return res.status(409).json({ error: "Request in progress" });
    }

    // Intercept res.json to cache the response on the way out
    const origJson = res.json.bind(res);
    res.json = function (body: unknown) {
      // Cache only non-5xx responses: transient server errors should stay retryable
      if (res.statusCode < 500) {
        redis.setex(cacheKey, TTL, JSON.stringify({ status: res.statusCode, body }));
      }
      redis.del(`lock:${cacheKey}`);
      return origJson(body);
    };

    next();
  };
}
```
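The same replay/lock/cache flow, distilled so it runs without Redis or Express — a Map and a Set stand in for the Redis calls, and `handle`/`process` are illustrative names:

```typescript
// In-memory stand-ins for the Redis calls in the middleware above
const cache = new Map<string, string>(); // cached responses (idem:*)
const locks = new Set<string>();         // in-progress keys (lock:*)

type Result = { status: number; body: unknown };

// process() is a placeholder for the real route handler
async function handle(key: string, process: () => Promise<Result>): Promise<Result> {
  const cached = cache.get(key);
  if (cached) return JSON.parse(cached); // replay cached response

  if (locks.has(key)) return { status: 409, body: { error: "Request in progress" } };
  locks.add(key); // SET ... NX
  try {
    const result = await process();
    if (result.status < 500) cache.set(key, JSON.stringify(result)); // cache 2xx/4xx only
    return result;
  } finally {
    locks.delete(key); // release the lock
  }
}

// Demo: the second call with the same key replays instead of reprocessing
let processed = 0;
(async () => {
  const order = async () => ({ status: 201, body: { id: `ord_${++processed}` } });
  const first = await handle("idem:/api/orders:abc", order);
  const second = await handle("idem:/api/orders:abc", order);
  console.log(processed, first.status, second.status); // → 1 201 201
})();
```

The business logic runs exactly once; every later call with the same key gets the stored response.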
The database-level approach: enforce uniqueness on the key itself.

```sql
CREATE TABLE orders (
  id              UUID PRIMARY KEY DEFAULT gen_random_uuid(),
  idempotency_key UUID UNIQUE NOT NULL,
  product_id      TEXT NOT NULL,
  quantity        INTEGER NOT NULL,
  total           DECIMAL(10,2) NOT NULL,
  created_at      TIMESTAMPTZ DEFAULT now()
);
```

```sql
INSERT INTO orders (idempotency_key, product_id, quantity, total)
VALUES ($1, $2, $3, $4)
ON CONFLICT (idempotency_key) DO NOTHING
RETURNING *;
```
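The handler pattern around those statements: try the insert, and when ON CONFLICT returns no row, read the existing one. A standalone sketch in which a Map simulates the unique index — in production, the two functions below would be the SQL above, issued through a driver such as pg:

```typescript
interface Order {
  id: string;
  idempotency_key: string;
  product_id: string;
  quantity: number;
}

// Map simulates the UNIQUE index on idempotency_key
const ordersByKey = new Map<string, Order>();
let nextId = 1;

// Stands in for: INSERT ... ON CONFLICT (idempotency_key) DO NOTHING RETURNING *
// (returns null when the key already exists, like an empty RETURNING set)
function insertOrder(key: string, productId: string, quantity: number): Order | null {
  if (ordersByKey.has(key)) return null;
  const order = { id: `ord_${nextId++}`, idempotency_key: key, product_id: productId, quantity };
  ordersByKey.set(key, order);
  return order;
}

// Handler: insert, or on conflict fetch and return the existing row
function createOrder(key: string, productId: string, quantity: number): { status: number; order: Order } {
  const inserted = insertOrder(key, productId, quantity);
  if (inserted) return { status: 201, order: inserted };
  // Stands in for: SELECT * FROM orders WHERE idempotency_key = $1
  return { status: 200, order: ordersByKey.get(key)! };
}

const a = createOrder("550e8400", "prod_123", 2);
const b = createOrder("550e8400", "prod_123", 2); // duplicate, e.g. a retry
console.log(a.status, b.status, a.order.id === b.order.id); // → 201 200 true
```

Returning 200 for the replay is one convention; caching and replaying the original status (as in the Redis approach) is another. Either way, only one row ever exists.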
If the insert returns no row, the key conflicted: query the existing row and return it. The UNIQUE constraint guarantees no duplicates even under concurrent load.

Stripe's idempotency design is the gold standard:
- Key scoping: Keys scoped per API key, not global.
- Request fingerprinting: Reusing a key with different params returns 400, not silent cache hit.
- 24-hour TTL: Short enough to reclaim storage, long enough for retries.
- In-progress detection: Same key currently processing returns 409.

Request fingerprinting comes down to hashing the body and comparing on reuse:

```typescript
import crypto from "node:crypto";

// CachedRequest and HttpError are app-level types: the cached record
// must store a bodyHash alongside the response.
function validateIdempotencyReuse(
  cached: CachedRequest,
  incoming: Request
): void {
  const hash = crypto
    .createHash("sha256")
    .update(JSON.stringify(incoming.body))
    .digest("hex");
  if (cached.bodyHash !== hash) {
    throw new HttpError(400, "Idempotency key reused with different request body");
  }
}
```
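For that comparison to work, the body hash has to be captured when the key is first used. A standalone sketch (a Map stands in for the cache; the record shapes are assumptions); note that `JSON.stringify` is key-order-sensitive, so a canonical serializer is safer in production:

```typescript
import { createHash } from "node:crypto";

interface CachedRequest {
  bodyHash: string;
  response: { status: number; body: unknown };
}

// Map stands in for Redis/DB storage of cached requests
const store = new Map<string, CachedRequest>();

// Note: JSON.stringify is key-order-sensitive; canonicalize in production
const hashBody = (body: unknown): string =>
  createHash("sha256").update(JSON.stringify(body)).digest("hex");

// Returns the cached response on a true replay, throws on key reuse
// with a different body, and returns null for a brand-new key.
function checkKey(key: string, body: unknown): CachedRequest["response"] | null {
  const cached = store.get(key);
  if (!cached) return null;
  if (cached.bodyHash !== hashBody(body)) {
    throw new Error("Idempotency key reused with different request body"); // → HTTP 400
  }
  return cached.response;
}

// Called once the real handler has produced a response
function remember(key: string, body: unknown, status: number, resBody: unknown): void {
  store.set(key, { bodyHash: hashBody(body), response: { status, body: resBody } });
}

// Demo
remember("k1", { quantity: 2 }, 201, { id: "ord_1" });
console.log(checkKey("k1", { quantity: 2 })?.status); // → 201
let reuseError = "";
try {
  checkKey("k1", { quantity: 3 }); // same key, different body
} catch (e) {
  reuseError = (e as Error).message;
}
```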
Test the guarantees directly: a duplicate key returns the same response, the same key with a different body returns 400, and two concurrent requests yield one 201 and one 409.

A production checklist:

- Never let the server generate keys: the client controls retry identity.
- Scope keys: `idem:user:path:key`, not just `idem:key`.
- Cache only 2xx/4xx responses, not 5xx: transient errors should be retryable.
- Always set a TTL (24 hours is standard).
- For multi-step operations, use DB transactions and write the idempotency key in the same transaction.

Idempotent APIs prevent duplicate charges, orders, and emails. The flow:

- Client sends a unique key.
- Server checks whether the key has been seen.
- If seen: return the cached response.
- If new: process, cache, return.

Start with DB UNIQUE constraints. Add Redis for speed. Add request fingerprinting for Stripe-level safety. Your users will double-click. Your network will retry. Design for it.

Part of my Production Backend Patterns series. Follow for more practical backend engineering, and if this article helped you, consider buying me a coffee on Ko-fi!