Optimizers need a rethink
Optimization vs Language Semantics
- Several comments argue that once an “optimization” is guaranteed, it effectively becomes part of the language or library semantics (e.g., mandatory RVO, guaranteed tail calls, value types).
- Others emphasize keeping behavior and performance specs separate: a simple semantic model plus a separate, stable performance contract is seen as valuable but hard to maintain.
- Unreliability is framed as intrinsic to optimizations; if users must rely on something, it should be promoted into the language design rather than left to heuristics.
Need for Guarantees and User Control
- Many participants want “assert-style” controls: annotations or pragmas that either guarantee a transform (inline, no allocation, vectorization, constant time) or fail compilation.
- There is frustration with opaque optimizer behavior (e.g., -ffast-math changing program results, hit-or-miss autovectorization, and inlining hints being ignored).
- Some see load-bearing optimizations (no-GC, tail calls, escape analysis) as language-design bugs if not guaranteed; others stress the complexity and slowness of adding such features.
Verification, Security, and Constant Time
- Multiple comments describe work on formal equivalence proofs from spec → C → machine code, inspired by seL4, aiming to reduce effort via SMT/model checking.
- Optimizer bugs are reported as having introduced timing side channels in cryptographic code; this motivates user-guided or extensible optimizers and better ways to express timing constraints.
- Hardware and languages rarely provide constant-time guarantees; some see this as primarily a language-design problem, others as a hardware and tooling gap.
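The constant-time problem can be made concrete with a typical branch-free comparison. Nothing in C or C++ semantics guarantees the optimizer preserves the timing property, which is the gap these comments describe (a standard pattern from cryptographic code, sketched here, not taken from any particular library):

```cpp
#include <cstddef>
#include <cstdint>

// "Constant-time" byte comparison: no early exit and no data-dependent
// branches, so timing does not reveal where the first mismatch occurs.
// The catch: the compiler only has to preserve the *result*, not the
// access pattern. An optimizer is free to reintroduce an early exit,
// which is exactly the class of timing side channel reported in
// cryptographic code.
int ct_equal(const uint8_t *a, const uint8_t *b, size_t n) {
    uint8_t diff = 0;
    for (size_t i = 0; i < n; ++i) {
        diff |= (uint8_t)(a[i] ^ b[i]);  // accumulate every difference
    }
    return diff == 0;  // single data-independent check at the end
}
```

Because the property lives outside the language's observable behavior, it cannot be expressed to the compiler today, supporting the view that this is a language-design (or tooling) gap rather than purely a hardware one.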
Databases and Query Planners as Analogy
- SQL planners are compared to compiler optimizers: powerful but unpredictable, often forcing users to read plans and fight heuristics.
- Some want the ability to “freeze” or explicitly specify query plans or receive stable plan identifiers/metrics; others note vendors already offer “frozen plans” or similar.
- There’s tension between dynamic re-planning (good for changing data) and reproducible performance.
Debugging, Tooling, and Complexity
- Optimizations make debugging hard; mapping optimized code back to source is likened to reconstructing a cow from hamburger.
- Suggestions include better optimization reports, performance regression testing, plan/IR lockfiles, and tiered language/implementation “levels.”
- Commenters note diminishing gains from traditional optimizers (referencing Proebsting's law, the observation that compiler optimization advances double program performance only roughly every 18 years), suggesting future effort may be better spent on programmer productivity and verification rather than ever-more-complex heuristic passes.