C++26: Erroneous behaviour

Erroneous behaviour & uninitialized variables

  • Central topic: the new “erroneous behaviour” category (well-defined but incorrect behavior that implementations are encouraged to diagnose), especially for uninitialized variables.
  • One line of discussion asks whether this is just a compromise for hard‑to‑analyze cases (e.g., passing the address of an uninitialized variable across translation units); others agree many such cases can’t be reliably detected.
  • A detailed comment contrasts four options:
    1. Make initialization mandatory (breaks tons of existing code).
    2. Keep undefined behavior (UB) and best‑effort diagnostics.
    3. Zero‑initialize by default (kills existing diagnostic tools and creates subtle logic bugs).
    4. “Erroneous behaviour”: keep diagnostics valid, avoid UB, but still mark it as programmer error.
  • Skeptics argue that once behavior becomes reliable (e.g., always zeroed), people will depend on it, making #3 and #4 similar in practice and undermining the “erroneous” label.
  • Others point out the security dimension (infoleaks of stale stack contents and padding bytes), and praise compiler mitigations such as pattern initialization (e.g. Clang/GCC’s -ftrivial-auto-var-init=pattern) and the new [[indeterminate]] attribute for opting out where the initialization cost matters (sketched below).
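  • A minimal sketch of what the change means in practice, assuming a C++26 compiler that implements erroneous behaviour for uninitialized reads (the concrete value such a variable ends up holding is implementation‑defined):

        #include <cstdio>

        int main() {
            int a;                    // no initializer
            std::printf("%d\n", a);   // before C++26: undefined behaviour;
                                      // in C++26: erroneous behaviour -- `a` holds a fixed,
                                      // implementation-defined value, the program is no longer
                                      // UB, and compilers are encouraged to diagnose the read
            int b [[indeterminate]];  // C++26 opt-out attribute for hot paths:
                                      // reading `b` stays undefined behaviour, as before
            (void)b;                  // discarded-value expression, not an actual read
        }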

Safety, performance, and diagnostics

  • Some worry “erroneous behaviour” is a cosmetic change to claim “less UB” without real teeth.
  • Others stress performance/compatibility trade‑offs: strict mandatory initialization (#1) is seen as politically impossible, and fully defined zero‑initialization (#3) conflicts with existing sanitizers, which rely on an uninitialized read still counting as an error (the #3/#4 contrast is sketched below).
  • There’s concern that compilers, which are only recommended (not required) to diagnose erroneous behaviour, might still skip the checks for performance reasons or on niche targets.
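  • A sketch of why #3 and #4 differ in practice, built around a hypothetical forgotten initialization (the sanitizer flag is Clang’s; how aggressively #4 gets diagnosed is up to each implementation):

        #include <cstdio>
        #include <vector>

        // The author meant to start `sum` from some baseline but never set it.
        int total(const std::vector<int>& xs) {
            int sum;                      // BUG: uninitialized
            for (int x : xs) sum += x;    // first read of the uninitialized value
            return sum;
        }

        int main() {
            std::printf("%d\n", total({1, 2, 3}));
            // Today: UB, and MemorySanitizer (clang -fsanitize=memory) reports the read.
            // Under #3 (zero-init by default): reliably prints 6 -- possibly the wrong
            //   answer, silently, with nothing left for a tool to report.
            // Under #4 (erroneous behaviour): the result is well defined, but the read is
            //   still a diagnosable error, so sanitizers and warnings remain conforming.
        }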

C++ ergonomics, safety, and long‑term future

  • A long‑time user vents that C++ is effectively “over”: backwards compatibility plus fundamental flaws (the type system, implicit conversions, initialization rules, the preprocessor, UB) make real fixes impossible, while continual feature accretion keeps increasing complexity.
  • Counterpoint: huge existing C++ codebases (hundreds of devs, billion‑dollar rewrites) cannot realistically be migrated wholesale, so incremental improvements—even if imperfect—are valuable.
  • Some see C++ as inevitably following COBOL/Fortran: shrinking but still standardized for decades (C++29, C++38…), with individual developers informally “freezing” at older standards like C++11.
  • Others say they now use C++ mostly as “nicer C” and do not expect it to ever feel truly safe/ergonomic.

Backwards compatibility, profiles, and breaking changes

  • Debate over whether C++ should break compatibility to gain Rust‑like safety. One side calls the compatibility obsession overdone, arguing that ancient code can stay on older compilers and doesn’t need new features like coroutines anyway.
  • Opposing view: compatibility and legacy knowledge are C++’s main competitive advantage; a breaking “new C++” would be competing in Rust’s niche without offering enough differentiation.
  • “Safety profiles” are discussed: intended as opt‑in subsets that ban unsafe features. Critics highlight severe technical issues (how profiles would interact across translation units and headers, with the risk of ODR violations) and note that the current profile proposals are early and contentious.

New syntaxes and safer subsets

  • Several propose a “modern syntax over the same semantics” (like Reason/OCaml, Elixir/Erlang): new grammar, const‑by‑default, better destructuring, clearer initialization, local functions—but compiled to standard C++ for perfect interop.
  • Existing experiments like cppfront/cpp2 are cited; some disagree with their specific design choices (e.g., not making locals const‑by‑default).
  • Another safety proposal is Safe C++ (via Circle), claiming full memory safety without breaking source compatibility. Supporters call it a “monumental” effort and criticize the committee for effectively shutting it down via new evolution principles; others note that porting such a deep compiler change across vendors is nontrivial.

Rust vs C++: safety, domains, and ecosystem

  • Strong Rust advocates claim “no reason to use C++ anymore” for new projects, asserting Rust does “everything better” as a language; they concede C++ remains preferable for quick prototyping, firmware, some interfacing, and because of existing ecosystems.
  • C++ defenders counter with domains where C++ still dominates: high‑performance numerics, AI inference, HFT, browser engines, console/VFX toolchains, GPU work, and mature GUIs (Qt, game engines, vendor tools).
  • Rust proponents point to evolving GUI/game stacks (egui, Slint, Bevy) and FFI, but others respond these are far from matching Qt, Unreal, Godot, console devkits, or GPU tooling (RenderDoc, Nsight, etc.).
  • Safety comparison: one side emphasizes that safe Rust “never segfaults” in practice; another points to known soundness bugs and LLVM miscompilations but agrees they’re rare and contrived compared to everyday C++ errors.
  • Some argue that with good tests, sanitizers, and linters, modern C++ can be nearly as safe as Rust in practice for many domains (a concrete sanitizer example follows this list); others reply that Rust’s type system makes high‑coverage testing and reasoning about design easier.
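  • As a concrete instance of the “tests plus sanitizers” workflow, here is the kind of everyday bug both sides have in mind (the build line is only illustrative; AddressSanitizer and UBSan are standard Clang/GCC options):

        #include <vector>

        // Compiles cleanly; AddressSanitizer catches it at run time, e.g.
        //   g++ -std=c++20 -g -fsanitize=address,undefined dangle.cpp
        int main() {
            std::vector<int> v{1, 2, 3};
            const int& first = v[0];   // reference into the vector's current buffer
            v.push_back(4);            // may reallocate, invalidating `first`
            return first;              // likely heap-use-after-free: ASan reports it here;
                                       // Rust's borrow checker rejects the equivalent code
                                       // at compile time
        }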

Culture, standard library, and “good bones”

  • There’s a recurring theme that many C++ pain points are cultural/ergonomic rather than strictly technical: bad defaults (non‑const locals, multiple initialization syntaxes; see the snippet at the end of this section), non‑composing features, and an inconsistent standard library.
  • Several view C++’s “bones” (low‑level control, metaprogramming power, C ABI interop) as excellent, but the standard library and defaults as the real mess; they note that custom libraries and internal “dialects” can mitigate this.
  • A few commenters like modern C++ and find it elegant if you stick to a curated subset plus tooling; others see only “wizards and sunk‑cost nerds” willingly writing modern C++ and urge the community to move on instead of eternally patching it.
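  • To make the “bad defaults and multiple initialization syntaxes” complaint concrete, a small illustration in plain standard C++ (nothing here is proposal‑specific):

        #include <vector>

        int main() {
            int a = 1;          // copy-initialization
            int b(2);           // direct-initialization
            int c{3};           // list-initialization (rejects narrowing conversions)
            int d{};            // value-initialization: zero
            int e;              // default-initialization: indeterminate
                                // (an erroneous value as of C++26)
            auto f = 4;         // deduced -- and, like every local above, mutable by default

            std::vector<int> v1(3, 7);   // three elements, each equal to 7
            std::vector<int> v2{3, 7};   // two elements, 3 and 7: braces vs parentheses
                                         // silently change the meaning

            (void)e;            // never actually read the uninitialized one
            return a + b + c + d + f + static_cast<int>(v1.size() + v2.size());
        }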