Lessons from the PG&E outage
Disaster planning and remote-ops limits
- Several commenters argue that robust “power-outage mode” should have been standard for San Francisco, given predictable earthquake/outage risk.
- Others note Waymo did have protocols (treating dark signals cautiously and phoning home), but these didn’t scale: the outage produced a spike in remote-assistance requests, overloading the control center and causing cars to block intersections.
- Commenters debate whether this reveals poor disaster planning (“didn’t understand the complexity at scale”) or normal production learning (“plans weren’t sufficient; they’ll iterate like any complex system”).
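The overload dynamic described above is a standard capacity problem: a remote-assistance pool sized for everyday demand can be saturated when a city-wide outage makes many cars phone home at once. A minimal sketch, using entirely hypothetical numbers (request rates, handling time, and operator count are assumptions, not figures from Waymo or the blog post):

```python
# Sketch: why a request spike can saturate a fixed pool of remote
# operators even when steady-state capacity is ample. All numbers
# below are illustrative assumptions, not real Waymo figures.

def utilization(requests_per_min: float, minutes_per_request: float,
                operators: int) -> float:
    """Offered load (request rate x handling time) per operator.

    Values above 1.0 mean demand exceeds capacity: the queue of
    waiting vehicles grows without bound, and each stopped car
    waits longer and longer for guidance.
    """
    return requests_per_min * minutes_per_request / operators

# Normal day: 2 requests/min, 3 min each, 10 operators -> 60% busy.
normal = utilization(2.0, 3.0, 10)

# Outage: dark signals everywhere quadruple the request rate.
# Offered load is now 2.4x capacity -> unbounded backlog.
outage = utilization(8.0, 3.0, 10)

print(f"normal: {normal:.0%}, outage: {outage:.0%}")
# -> normal: 60%, outage: 240%
```

The point of the sketch is that the failure is nonlinear: below 100% utilization, waits stay bounded; once correlated demand pushes load past 100%, no amount of waiting clears the queue, which matches the intersection-blocking behavior commenters describe.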
Fail-safe vs traffic disruption
- One camp prefers conservative behavior: if the system is uncertain, stopping is safer than improvising in a disaster scenario (floods, fires, riots, blacked-out signals). Extra congestion during a rare emergency is seen as acceptable.
- Others argue a “fail-safe” that clogs intersections during an emergency is itself unsafe and must be engineered out with additional layers of protection.
Outrage, public roads, and human-driver context
- Some are specifically angry that a private company’s equipment blocked public roads during an emergency and worry about a single firm’s malfunction gridlocking a city.
- Others downplay the outrage, comparing this to everyday human-caused blockages and emphasizing that human drivers kill ~40k people per year in the US. They stress consistency: tolerating massive human-driver risk while demanding perfection from AVs is, in this view, incoherent.
- Counterpoint: better infrastructure, driver education, and road design (citing countries with far lower fatality rates) could slash fatalities independently of automation, and Waymo’s blockage remains a separate problem either way.
Connectivity and remote assistance
- Questions raised about whether connectivity issues (cell congestion, dropped packets) contributed to the “backlog.”
- Some note AVs typically use multiple carriers with business-priority SIMs; it remains unclear from the blog post how much of the delay was caused by the network versus sheer request volume.
- Suggestions like Starlink are criticized as unnecessary in dense urban areas and environmentally problematic.
Handling edge cases and human directions
- Concerns about reliance on remote operators for unusual states (dark signals, construction, traffic officers, ferries).
- Commenters note Waymo claims to interpret traffic officers’ hand signals; others remain skeptical it’s always autonomous rather than human-assisted.
- Meta-critique: if many rare situations require explicit software updates, the system risks becoming a growing pile of special cases.
Learning and regulation
- Supporters see a major advantage in fleet-wide updates: once improved, all vehicles “learn” instantly.
- Critics view Waymo’s blog as marketing spin with insufficient acknowledgment of responsibility, using the incident as an argument for stronger AV/AI regulation.