Stop making me memorize the borrow checker
Borrow checker & learning curve
- Many see the borrow checker as central to Rust: you must internalize its rules to be productive, much like a type system.
- Supporters say this pushes you to reason about ownership, lifetimes, and data layout upfront, leading to safer, cleaner designs and better habits even in other languages.
- Critics argue that this “mental tax” is high: you end up architecting for the borrow checker rather than for the problem, and you are effectively still “doing memory management,” just at compile time.
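The rule being internalized is concrete: at any point you may hold either any number of shared borrows or exactly one mutable borrow, never both. A minimal sketch (the `total` helper and the values are illustrative, not from the thread):

```rust
// Core aliasing rule the borrow checker enforces: many shared (&T)
// borrows OR one exclusive (&mut T) borrow, never both at once.

fn total(scores: &[i32]) -> i32 {
    scores.iter().sum()
}

fn main() {
    let mut scores = vec![10, 20, 30];

    // A shared borrow, used and then released:
    let t = total(&scores);

    // No shared borrow is live anymore, so mutation is allowed:
    scores.push(t);

    // Rejected (error E0502) if uncommented: `r` would alias `scores`
    // while `push` needs exclusive access.
    // let r = &scores[0];
    // scores.push(0);
    // println!("{r}");

    assert_eq!(scores, [10, 20, 30, 60]);
}
```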
Refactoring, brittleness, and exploratory work
- A recurring complaint: large refactors can fail late. You can redesign a big chunk of code, pass typechecking, then hit borrow-check errors that force wide, mechanical rewrites (changing ownership, lifetimes, introducing Rc/Arc/RefCell, indices, etc.).
- This makes Rust feel brittle for R&D and greenfield projects with evolving architectures, and better suited to rewrites where the shape is already known.
- Some mitigate by prototyping minimal cores first, or starting with more cloning / higher-level patterns, then optimizing.
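One of those higher-level escape hatches looks like this: `Rc` provides shared ownership and `RefCell` moves the borrow check to runtime, trading some performance and static guarantees for flexibility during exploration. A hedged sketch (the `bump` helper is hypothetical):

```rust
use std::cell::RefCell;
use std::rc::Rc;

// Rc<RefCell<T>>: shared ownership (Rc) plus interior mutability
// (RefCell), so two "owners" can mutate the same value. Borrow
// conflicts are caught at runtime (a panic) instead of compile time.
fn bump(counter: &Rc<RefCell<i32>>) {
    *counter.borrow_mut() += 1;
}

fn main() {
    let counter = Rc::new(RefCell::new(0));
    let alias = Rc::clone(&counter); // cheap pointer copy, not a deep clone

    bump(&alias);
    bump(&counter);

    assert_eq!(*counter.borrow(), 2);
}
```

The usual advice in the thread applies: prototype with patterns like this, then replace the hotspots with plain ownership once the architecture has settled.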
Comparisons: C/C++, GC languages, Go, others
- Rust vs C/C++:
  - Rust turns many C/C++ minefields (use-after-free, dangling pointers, data races) into compile-time errors.
  - In C/C++, you either memorize enough UB rules and rely on tooling, or risk latent bugs; Rust makes those constraints explicit and enforced.
- Rust vs GC languages (Java, C#, Python, etc.):
  - GC advocates say manual/ownership-style memory management complicates code and refactoring; GC lets you design with fewer constraints.
  - Rust advocates counter that, once the borrow checker is internalized, GC brings little benefit at more runtime cost; they prefer explicit ownership to hidden heuristics.
- Rust vs Go: Go is praised for its simple, fast compile-run tooling but criticized for a weaker type system; Rust offers stronger guarantees at the cost of complexity.
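The classic C minefield of returning a pointer to a stack local illustrates the first comparison: in C it compiles and dangles; the Rust equivalent is rejected outright. A small sketch (function names are illustrative):

```rust
// The C version of this compiles and returns a dangling pointer:
//
//   char *dangling(void) { char buf[] = "oops"; return buf; }
//
// The naive Rust translation is a compile error (E0106/E0515),
// because no lifetime can make the returned reference valid:
//
//   fn dangling() -> &String {
//       let s = String::from("oops");
//       &s // `s` is dropped at the end of this function
//   }
//
// The idiomatic fix: transfer ownership out instead of borrowing.
fn not_dangling() -> String {
    String::from("fine")
}

fn main() {
    assert_eq!(not_dangling(), "fine");
}
```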
Idiomatic Rust & patterns
- Common advice: stop writing C++/Java/OOP in Rust. Prefer:
  - Ownership at outer scopes; pass references into functions.
  - Tree-shaped ownership; use indices or arenas when you truly need graphs/cycles.
  - Functional core, imperative shell; heavy use of enums, pattern matching, immutability.
  - Cloning and reference counting where performance is “good enough,” then optimize hotspots.
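The index/arena idea above can be sketched as follows: nodes live in a `Vec` owned in one place, and edges are plain `usize` indices, so cyclic structure never needs lifetime-laden references (the `Graph` type here is illustrative):

```rust
// Index-based graph: the Vec owns every node; edges refer to nodes by
// position. Cycles are fine because indices carry no borrows.
struct Graph {
    nodes: Vec<String>,
    edges: Vec<(usize, usize)>, // (from, to)
}

impl Graph {
    fn new() -> Self {
        Graph { nodes: Vec::new(), edges: Vec::new() }
    }

    fn add_node(&mut self, name: &str) -> usize {
        self.nodes.push(name.to_string());
        self.nodes.len() - 1
    }

    fn add_edge(&mut self, from: usize, to: usize) {
        self.edges.push((from, to));
    }

    fn neighbors(&self, id: usize) -> Vec<usize> {
        self.edges
            .iter()
            .filter(|&&(from, _)| from == id)
            .map(|&(_, to)| to)
            .collect()
    }
}

fn main() {
    let mut g = Graph::new();
    let a = g.add_node("a");
    let b = g.add_node("b");
    g.add_edge(a, b);
    g.add_edge(b, a); // a cycle, no Rc/RefCell needed

    assert_eq!(g.neighbors(a), vec![b]);
    assert_eq!(g.neighbors(b), vec![a]);
}
```

The trade-off, often noted in such threads, is that indices are not checked by the compiler: a stale index is a logic bug rather than a borrow error.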
Safety vs productivity trade-offs
- Pro-Rust view: extra upfront effort saves time later—fewer memory-safety bugs, safer refactors, more confidence in correctness.
- Critical view: encoding lifetimes/ownership and rich types into APIs can “calcify” them; fundamental design shifts require large, mechanical edits, hurting iteration speed.
- Some argue Rust is excellent when memory safety and performance are paramount (systems, critical backends, embedded), but overkill for many applications where GC or higher-level languages suffice.
Ecosystem, tooling, and language evolution
- Debate over many small crates vs a “fat” standard library:
  - Many small dependencies let the ecosystem evolve faster but increase supply-chain risk; a strong stdlib with funded maintainers could be safer.
  - Others see a “folk stdlib” of popular crates as natural and preferable.
- Tooling: rustc diagnostics and rust-analyzer are improving, but borrow checking still runs after typechecking, limiting early IDE feedback.
- Several commenters expect future language designs (or Rust editions) to address some of these ergonomics, seeing Rust as a major step, not the final answer.