Tesla blows past stopped school bus and hits kid-sized dummies in FSD tests

Scope of the Failure

  • Core issue highlighted: FSD fails to stop for a school bus with flashing red lights and an extended stop arm on the opposite side of an undivided road, then hits child-sized dummies and continues driving.
  • Many commenters say this is not an edge case but a failure of basic, legally mandated behavior that every minimally competent human driver handles.
  • Several note that the real test isn’t “detecting the kid” but obeying the bus stop signal and slowing/stopping when visibility is obstructed.

Driver Responsibility vs. System Behavior

  • Some argue Tesla’s disclaimers (“requires fully attentive driver”) legally shift responsibility to humans.
  • Others counter that at the speeds shown there is little realistic time for a human to take over; if the system is marketed as “Full Self-Driving,” it must be judged as such.
  • Clarification from users: Tesla’s “pay attention” nags are separate from its collision warnings and automatic emergency braking; the concern is that neither mechanism appeared to intervene in this scenario.

Regulation, Testing, and Accountability

  • Broad agreement that the U.S. lacks a dedicated safety certification regime for self‑driving systems; current crash testing focuses mostly on occupant protection, not on how automated systems behave toward pedestrians.
  • Debate over who should test:
    • One camp: manufacturers test, regulators “trust but verify.”
    • Strong opposing camp: history shows companies routinely hide safety problems; independent, pre‑market testing by public agencies is necessary, with severe penalties for deceit.
  • Some argue regulation will only arrive after many deaths; others call this unacceptable and point to the creation of agencies like the FDA and EPA as proof that independent oversight is essential.

Role and Credibility of Anti‑Tesla Testers

  • Skeptics say the Dawn Project and similar groups have strong anti‑Tesla and commercial incentives and produce non‑reproducible “gotcha” videos, so their results require independent validation.
  • Defenders reply that consumer and activist groups historically forced auto safety reforms; bias doesn’t automatically invalidate the results, especially when regulators are weakened or politically interfered with.
  • Several criticize Tesla’s repeated appeal to “the next version” of FSD; they see a pattern in which present failures are excused by pointing to hypothetical future improvements, without transparent data to support the claims.

Technical and Design Questions

  • Discussion of whether FSD is trained to imitate human driving habits (e.g., rolling stops) rather than following explicitly encoded traffic rules.
  • Many argue a self‑driving system should at minimum:
    • Always stop for active school bus stop signals on undivided roads.
    • Slow to a crawl or stop whenever its view is obstructed by large vehicles.
    • Use sensors and planning to ensure it never drives faster than it can safely stop within the distance it can actually see (see the sketch after this list).
  • Some suggest Tesla’s camera‑only approach makes such scenarios harder to handle than they are for systems using lidar or radar, though whether the sensor suite is the root cause here is unclear.
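
To make the sight-distance point concrete, here is a minimal sketch of the constraint in the last bullet list: drive no faster than a speed from which the vehicle can stop within the distance it can currently see, with the school-bus rule layered on top as a hard override. All function names, thresholds, and parameter values (deceleration, reaction time, crawl speed) are illustrative assumptions for this sketch, not anything drawn from Tesla’s actual planner.

```python
import math

# Illustrative planning constants -- assumptions for this sketch, not Tesla parameters.
COMFORT_DECEL_MPS2 = 3.5   # assumed braking deceleration (m/s^2)
REACTION_TIME_S = 0.5      # assumed perception/actuation latency (s)
CRAWL_SPEED_MPS = 2.0      # assumed "crawl" speed when the view is badly obstructed (m/s)

def max_safe_speed(visible_distance_m: float,
                   decel: float = COMFORT_DECEL_MPS2,
                   reaction_time: float = REACTION_TIME_S) -> float:
    """Largest speed v such that reaction distance plus braking distance fits
    inside the currently visible, unobstructed distance:
        v * t_react + v^2 / (2 * a) <= d_visible
    Solving the quadratic for v gives the positive root below."""
    a, t, d = decel, reaction_time, visible_distance_m
    if d <= 0:
        return 0.0
    return -a * t + math.sqrt((a * t) ** 2 + 2 * a * d)

def target_speed(visible_distance_m: float,
                 school_bus_stop_signal_active: bool,
                 undivided_road: bool) -> float:
    """Combine the hard school-bus rule with the sight-distance constraint."""
    if school_bus_stop_signal_active and undivided_road:
        return 0.0  # legally mandated full stop for both directions on an undivided road
    v = max_safe_speed(visible_distance_m)
    # If the view is badly obstructed, fall back to a crawl rather than creeping at v.
    return min(v, CRAWL_SPEED_MPS) if visible_distance_m < 10.0 else v

# Example: a large vehicle limits the clear view ahead to 15 m.
print(f"{max_safe_speed(15.0):.1f} m/s")                       # ~8.6 m/s under these assumptions
print(target_speed(15.0, school_bus_stop_signal_active=True,
                   undivided_road=True))                        # 0.0: must stop for the bus
```

Under these assumed numbers, a clear view of only 15 m caps speed at roughly 8.6 m/s (about 31 km/h); the argument in the thread is that a system should respect a margin of this kind whenever a stopped bus blocks its view.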