Every layer of review makes you 10x slower
PR review latency and queuing
- Many report PR turnaround in days, not hours; 5-hour “wall-clock” delays are seen as optimistic.
- The core issue is queuing and context switching: reviewers batch reviews once or twice a day and are otherwise busy with their own work.
- A few teams enforce fast SLAs (e.g. 4 hours) via tooling and culture, but some find this interrupt-driven style stressful.
- Queuing-theory framing appears: if review throughput is fixed, 10x more PRs just creates massive backlogs and staleness.
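The queuing-theory point can be made concrete with the standard M/M/1 sojourn-time formula, W = 1/(μ − λ): as PR arrivals approach fixed review capacity, latency blows up nonlinearly. A minimal sketch, with purely hypothetical rates:

```python
def mean_time_in_system(arrival_rate: float, service_rate: float) -> float:
    """Mean time a PR spends waiting + in review for an M/M/1 queue.

    Requires arrival_rate < service_rate; otherwise the backlog grows forever.
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrivals meet or exceed review capacity")
    return 1.0 / (service_rate - arrival_rate)

# A reviewer who clears 10 PRs/day; watch latency as arrivals approach capacity.
mu = 10.0
for lam in (5.0, 8.0, 9.0, 9.9):  # PRs arriving per day (hypothetical)
    print(f"{lam:4.1f} PRs/day in -> {mean_time_in_system(lam, mu):5.2f} days per PR")
```

At half capacity a PR turns around in a fifth of a day; at 99% utilization the same reviewer takes ten days per PR, which is the staleness the thread describes.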
AI-generated code and review bottlenecks
- Consensus: AI speeds coding but not review; if reviews are the bottleneck, more PRs can reduce team throughput.
- Reviewers complain of “AI slop”: duplicated helpers, random changes, unreviewed agent output, and rising defect rates.
- Some argue one engineer + AI can replace a team if they self-review rigorously; others say this ignores org politics, risk, and queue constraints.
- Strong push that authors must review AI output before burdening peers.
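The "more PRs can reduce throughput" claim follows from simple flow balance: when AI-assisted authors open PRs faster than fixed review capacity can absorb them, the backlog grows linearly and every PR waits longer. A toy simulation with hypothetical rates:

```python
def backlog_over_time(opened_per_day: float, reviewed_per_day: float, days: int):
    """End-of-day review backlog sizes over `days` days, starting from empty."""
    backlog, history = 0.0, []
    for _ in range(days):
        # Reviewers clear at most their fixed capacity; the rest carries over.
        backlog = max(0.0, backlog + opened_per_day - reviewed_per_day)
        history.append(backlog)
    return history

# Review capacity fixed at 10 PRs/day (hypothetical numbers throughout).
before = backlog_over_time(opened_per_day=8, reviewed_per_day=10, days=10)
after = backlog_over_time(opened_per_day=15, reviewed_per_day=10, days=10)
print("opening  8/day:", before[-1], "PRs queued")  # capacity absorbs the flow
print("opening 15/day:", after[-1], "PRs queued")   # +5 PRs/day of permanent backlog
```

Nothing about code generation speed changes the second line; only adding review capacity, or authors self-reviewing so each PR consumes less of it, does.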
Alternatives and process experiments
- Suggestions to “shift review left”: up-front design sessions, daily alignment, pair/mob programming, and trunk-based development.
- Some teams effectively stop formal code review, relying on trust, strong safety nets, and fast rollback. Others see that as dangerous outside low-risk domains.
- Support for rotating “review duty” / support rotas to reduce ambiguity and speed reviews.
Planning vs doing
- Sharp split: some say an hour of planning saves 10 hours of work; others claim that in software it is often the reverse.
- Many describe a hybrid: minimal initial design, early spike/prototype, then iterative redesign as reality contradicts plans.
- Up-front design is seen as valuable for interfaces and risk boundaries, but overplanning is called out as “progress theater.”
Org size, approvals, and culture
- Multiple layers of management/approvals scale delays superlinearly; big organizations trade speed for perceived risk reduction.
- Several note incentives favor visible feature delivery over reviewing or deleting code, leading to neglected review queues and nitpicky, low-value comments.
- Trust, small teams, clear ownership, and sandboxed/blast-radius-limited environments are repeatedly cited as keys to moving fast without drowning in review.