Tesla withheld data, lied, misdirected police to avoid blame in Autopilot crash
Alleged evidence tampering and legal issues
- Many see Tesla’s behavior (withholding crash data, steering police requests to omit key logs, misrepresenting what data existed) as clear obstruction/tampering that would be criminal for individuals.
- Others argue corporations and their lawyers are incentivized to “not talk to police” and only comply narrowly with subpoenas, though even they call the non-disclosure “obviously problematic.”
- Several expect Tesla to appeal the civil verdict, but note that the cover‑up, not the crash itself, likely drove the ~$329M in damages.
Deletion of “snapshot_collision” file
- A long subthread debates the car uploading a tarball (“snapshot_collision_airbag-deployment.tar”) within minutes of the crash and then deleting the local copy.
- Some engineers say auto‑deleting temporary archives is standard embedded-systems hygiene (avoid ENOSPC, flash wear, resale/privacy concerns).
- Others counter that once airbags fire, data stops being “temporary” and becomes critical evidence akin to a black box; actively deleting it post‑upload looks deliberate and unjustifiable.
- Key complaint: not that a temp file was unlinked, but that Tesla later hid the existence of uploaded crash snapshots from investigators and plaintiffs.
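The policy the two camps are arguing about can be sketched in a few lines. This is a hypothetical illustration of the "upload, then delete the local copy" pattern and the exception commenters want for airbag events; the function name, the `upload` callback, and the retention rule are all assumptions for illustration, not Tesla's actual firmware behavior.

```python
import os
import tarfile
import tempfile

def handle_crash_snapshot(log_dir: str, airbag_deployed: bool, upload):
    """Archive crash logs, upload them, then decide the local copy's fate.

    Hypothetical policy: temporary archives are normally unlinked after a
    confirmed upload (routine flash-wear / ENOSPC hygiene), but anything
    tied to an airbag deployment is retained locally as potential
    black-box-style evidence.
    """
    archive = os.path.join(tempfile.gettempdir(), "snapshot_collision.tar")
    with tarfile.open(archive, "w") as tar:
        tar.add(log_dir, arcname="snapshot")
    upload(archive)            # assumed callback, e.g. POST to a telemetry endpoint
    if airbag_deployed:
        return archive         # keep: data is now evidence, not a temp file
    os.unlink(archive)         # routine cleanup for non-critical snapshots
    return None
```

The dispute in the thread maps onto that one `if`: whether an airbag deployment should flip the archive from "temp file to reclaim" to "evidence to preserve."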
Responsibility: driver vs Tesla vs regulators
- Facts cited from the other Electrek article: driver was speeding on a city street, using 2019 Autopilot outside its intended domain, bent down to pick up a phone, and had his foot on the accelerator overriding automatic braking. Jury allocated ~⅔ fault to the driver, ~⅓ to Tesla.
- One camp: Autopilot is just advanced cruise control; the driver is always fully responsible, and blaming Tesla for not geofencing or over‑nagging is unfair, especially given on‑screen and manual warnings.
- Opposing camp: Tesla let Autopilot/Autosteer run where it knew the system was unsafe, failed to issue available “take over immediately” warnings, and marketed capabilities far beyond reality; that’s contributory negligence.
- There’s extended debate over whether NHTSA/FTC share blame for weak or delayed regulation vs this being primarily a Tesla problem.
Marketing, naming, and user expectations
- Huge disagreement over terms like “Autopilot” and “Full Self‑Driving (Supervised).”
- Critics: For normal drivers, “Full Self‑Driving” and “Autopilot” reasonably imply the car can truly drive itself; years of CEO statements and promo videos reinforced this. Legal fine print and later warnings don’t neutralize that.
- Defenders: In aviation/marine contexts, autopilot still requires pilot supervision; drivers must read and accept clear in‑car warnings and manuals, and are licensed to understand the tech they use.
- Some note that many Tesla owners don’t understand the distinction between basic Autopilot and FSD, yet their behavior (hands‑off, eyes‑off driving) shows they trust the system to “save them,” a trust juries now treat as a foreseeable result of Tesla’s marketing.
Corporate accountability and punishment
- Many commenters argue corporate fines alone aren’t a deterrent; they want criminal charges for executives and even a “corporate death penalty” (judicial dissolution) in egregious cases.
- Others warn that past criminal convictions (e.g., Arthur Andersen) effectively destroyed firms and distorted markets, which makes prosecutors and courts reluctant.
- There’s debate over whether $329M is “pocket change” vs a material hit to Tesla’s free cash flow, and whether it will actually change behavior.
Trust, data, and alternatives
- Several say the core reputational issue isn’t just safety performance but trust: Tesla keeps data secret when it’s unfavorable while selectively leaking crash telemetry to attack at‑fault drivers in the press.
- Suggested fixes include automatic mirroring of crash data to a neutral third party accessible to police and victims under subpoena.
- Waymo is repeatedly contrasted as more transparent and cautious; some say they’d trust Waymo, not Tesla, to drive their children.
Meta: media, imagery, and HN reaction
- Electrek’s AI‑generated hero image is widely criticized as trashy/clickbait for a serious topic; some say the site is essentially a Tesla‑traffic blog, not journalism.
- A few note how hard it is for victims’ families to learn the truth without civil discovery, and criticize Tesla fans who frame lawsuits only as “cash grabs” instead of attempts to understand and prevent future deaths.