When the algorithm beat the AI
- The problem
- A high-volume classification pipeline was routing thousands of inbound records per minute. An LLM felt like the default answer; the team had already drafted a prompt and a budget.
- What was considered
- An LLM classifier, a small fine-tuned transformer, and a deterministic rules-plus-lookup approach, each evaluated against the actual input distribution from a week of production traffic.
- The decision
- Rules plus lookup. The real input space was narrower than it looked, and an LLM at that volume would have added latency, cost, and a new failure mode to operate without moving the accuracy needle.
- Outcome
- Millisecond-range latency, near-zero variable cost, and a classifier the on-call engineer could reason about at 3 a.m. Reach for AI when the input space is genuinely unbounded, not by default.
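The shape of a rules-plus-lookup classifier like the one described above can be sketched in a few lines. Everything here is hypothetical: the labels, the lookup keys, and the predicates stand in for whatever the week of production traffic actually surfaced.

```python
# Hypothetical sketch of a rules-plus-lookup classifier: an exact-match
# table for the common cases, ordered predicate rules for the rest, and
# an explicit fallback label instead of a guess.

# Exact-match table built from observed production inputs (illustrative).
LOOKUP = {
    "invoice overdue": "billing",
    "reset password": "account",
}

# Ordered (predicate, label) pairs; first match wins (illustrative).
RULES = [
    (lambda text: "refund" in text, "billing"),
    (lambda text: "login" in text or "password" in text, "account"),
]

def classify(record: str) -> str:
    key = record.strip().lower()
    if key in LOOKUP:                 # O(1) exact match first
        return LOOKUP[key]
    for predicate, label in RULES:    # cheap linear scan over a short list
        if predicate(key):
            return label
    return "needs_review"             # explicit fallback, easy to monitor
```

Because every path is a dictionary hit or a short predicate scan, latency stays in the millisecond range, and the fallback label makes the failure mode visible rather than silent.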