Tesla's ‘Robotaxis’ Keep Crashing—Even With Human ‘Safety Monitors’ Onboard

Waymo vs. Tesla: Maturity and Direction

  • Many see Waymo as “years ahead” of Tesla, already operating driverless services in multiple cities, while Tesla’s robotaxis remain limited pilots with safety drivers.
  • Some argue Tesla may never achieve true self‑driving without changing direction (e.g., adding lidar), though others note multiple companies can eventually reach the goal.
  • There’s concern Tesla is already losing any first‑mover advantage as others commercialize.

Sensors and “Premature Optimization”

  • A major thread blames Tesla’s vision‑only approach and early decision to drop lidar, characterizing it as optimizing for cost before having a robust working system.
  • Waymo’s use of lidar and HD maps is framed as the opposite strategy: accept higher hardware cost to gain reliable performance and operational data, then optimize cost later.
  • Several posters note lidar prices have already dropped dramatically and will likely continue to fall, undermining Tesla’s original cost argument.

Economics and User Priorities

  • Debate over whether robotaxis will compete mainly on price per mile or on comfort/style.
  • Some think Tesla and Chinese OEMs can dominate if they reach low cost per mile; others argue the car's cost per km is only a modest share of the fare, so safety, comfort, and brand will matter more (a rough decomposition follows this list).
  • Long digression on how Americans value time, image, and convenience over pure transport cost.
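
A rough decomposition sketch can illustrate the "modest share" claim. Every dollar figure below is a hypothetical placeholder, not a number from the thread; it only shows the structure of the argument.

```python
# Illustrative robotaxi fare decomposition. All dollar figures are made-up
# placeholders chosen only to show the structure of the argument; the thread
# provides no real numbers.

cost_per_km = {
    "vehicle depreciation + energy": 0.15,  # the part a cheaper car actually reduces
    "insurance + liability":         0.10,
    "cleaning + maintenance":        0.08,
    "remote monitoring + support":   0.12,
    "fleet ops, parking, deadhead":  0.15,
    "platform margin / overhead":    0.20,
}

fare = sum(cost_per_km.values())
for item, cost in cost_per_km.items():
    print(f"{item:32s} ${cost:.2f}/km  ({cost / fare:5.1%} of fare)")
print(f"{'total fare':32s} ${fare:.2f}/km")
```

On these placeholder numbers, even halving the vehicle line item cuts the fare by under 10%, which is the thread's argument for why safety, comfort, and brand could matter as much as raw cost per mile.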

Crash Rates, Safety, and Data Transparency

  • Cited figures: ~4 Tesla robotaxi crashes in ~250k miles versus roughly one Waymo crash per ~98k miles, with Tesla's vehicles carrying safety drivers and Waymo's not. Some posters claim Tesla's rate is ~10× that of human drivers; others challenge the methodology (see the back‑of‑envelope sketch after this list).
  • Posters stress that comparisons must consider severity, fault, and driving context (urban vs highway), as well as interventions by safety drivers—data Tesla does not disclose.
  • Waymo is praised for detailed public safety datasets; Tesla is criticized for redactions and avoiding regimes (like California permits) that require reporting.
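
A back‑of‑envelope calculation shows how a "~10× humans" figure can be derived from the numbers quoted in the thread. The human‑driver baseline below is an assumption, not a sourced statistic; changing it moves the ratio accordingly.

```python
# Crash-rate comparison using the figures quoted in the thread.
# The human-driver baseline is an ASSUMED placeholder, not a sourced number.

PER_MILLION = 1_000_000  # express everything as crashes per million miles

def rate(crashes: float, miles: float) -> float:
    """Crashes per million miles."""
    return crashes / miles * PER_MILLION

tesla_rate = rate(4, 250_000)    # ~4 crashes in ~250k miles (with safety drivers)
waymo_rate = rate(1, 98_000)     # ~1 crash per ~98k miles (no safety driver)
human_rate = rate(1, 600_000)    # ASSUMPTION: ~1 crash per ~600k human-driven miles

print(f"Tesla robotaxi : {tesla_rate:5.1f} crashes per 1M miles")
print(f"Waymo          : {waymo_rate:5.1f} crashes per 1M miles")
print(f"Human baseline : {human_rate:5.1f} crashes per 1M miles (assumed)")
print(f"Tesla vs human : {tesla_rate / human_rate:.1f}x")
```

With these inputs the ratio lands near 10×, but as the thread stresses, the comparison ignores crash severity, fault, urban‑versus‑highway mix, and undisclosed safety‑driver interventions, so the number is indicative at best.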

Media Framing and Bias

  • Several see the Miami Herald piece as a “hit” or clickbait, pointing to an unrelated burned‑Tesla video at the top of the article and its emphasis on very low‑speed incidents.
  • Others counter that Tesla’s broader Autopilot/FSD safety record justifies skepticism and scrutiny tougher than the individual fender‑benders alone would suggest.

Trust, Liability, and Readiness

  • Some argue machines must be an order of magnitude safer than humans to be socially accepted, given accountability concerns.
  • One current FSD user reports heavy daily use but says it is clearly not ready for unsupervised operation, still making “silly” and sometimes dangerous errors.
  • Broader worry that companies are prioritizing hype and stock price over transparent safety metrics, eroding public trust in AVs generally.