Self-Driving Car Insurance
Liability and Responsibility
- Core debate: if a system is marketed as “self‑driving” or sold as a subscription service, why is the human still paying for liability insurance?
- Current reality: most jurisdictions still treat the person in the driver’s seat as the legal “operator,” regardless of automation level, similar to being responsible for a company car, a horse, a pet, or a minor child.
- Several commenters argue that if liability always stays with the human, then these systems are just “driver assist,” and calling them “autonomous” or “Full Self‑Driving” is misleading.
- Others note that contracts and future laws could shift primary liability to manufacturers (examples cited: robotaxis, limited Mercedes programs), but that raises questions about economic feasibility and how much vehicle prices would need to rise.
Tesla & Lemonade Insurance Economics
- Tesla already tried its own insurance; commenters cite data showing loss ratios above 100% (claims paid exceeding premiums earned), suggesting it was subsidizing premiums to make Teslas/FSD more attractive and boost sales and stock price (see the back-of-the-envelope sketch after this list).
- Some argue expansion stalled because the unit economics didn’t work, despite Tesla’s data advantage and cheap access to parts.
- Lemonade’s 50% FSD discount may be:
- Genuine risk-based pricing if FSD is meaningfully safer, or
- A subsidized marketing/data‑gathering play by an unprofitable insurtech.
- There is speculation that Tesla may be backstopping or otherwise supporting Lemonade, but the thread offers no concrete evidence.
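
A back-of-the-envelope sketch of the two numbers being argued over: what a loss ratio above 100% implies, and how much claim cost would have to fall for a 50% discount to be pure risk-based pricing rather than a subsidy. Every figure below is a hypothetical placeholder, not data from the thread.

```python
# Back-of-the-envelope insurance arithmetic; all numbers are hypothetical
# placeholders, not figures cited in the thread.

def loss_ratio(claims_paid: float, premiums_earned: float) -> float:
    """Claims paid divided by premiums earned; above 1.0 the insurer
    pays out more in claims than it collects in premiums."""
    return claims_paid / premiums_earned

# Hypothetical book of business: $120M of claims against $100M of premiums.
print(f"loss ratio: {loss_ratio(120e6, 100e6):.0%}")  # 120% -> underwriting loss

# For a 50% discount to be purely risk-based (same target loss ratio),
# expected claim cost per policy must also fall by roughly 50%.
base_premium = 2400.0      # hypothetical annual premium for manual driving
target_loss_ratio = 0.70   # hypothetical target the insurer prices toward
base_claims = base_premium * target_loss_ratio
required_claims = (base_premium * 0.5) * target_loss_ratio
print(f"expected claims per policy: ${base_claims:.0f} -> ${required_claims:.0f} "
      f"({required_claims / base_claims:.0%} of baseline)")
```

With these placeholders, a 120% loss ratio means paying out $1.20 per $1.00 collected, and the 50% discount only works actuarially if FSD roughly halves expected claim costs.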
Safety of FSD vs Human Driving
- Pro‑FSD voices: anecdotes of FSD preventing crashes; some drivers report using it for ~90% of miles and feeling notably safer than in other high‑end cars.
- Skeptics: erratic behavior, constant disengagements, the stress of supervising, and the “teaching a teenager” analogy.
- Critiques of Tesla safety stats (see the selection-bias sketch after this list):
- FSD more often used in easy, low‑risk conditions (e.g., highways).
- Crashes occurring shortly after a disengagement might be counted as “manual” rather than attributed to FSD.
- A 52% crash reduction is seen by some as surprisingly low if the system were truly superior.
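
A minimal sketch of the mileage-mix critique: assume, purely hypothetically, that FSD is used mostly on low-risk highway miles and is no safer per mile than a human on any road type. A raw crashes-per-mile comparison can still show a large apparent reduction. All rates and mileage mixes below are invented for illustration.

```python
# Selection-bias illustration: raw per-mile crash comparisons are flattered
# when FSD miles concentrate in easy conditions. All numbers are invented.

# Hypothetical crash rates (crashes per million miles) by road type for a human driver.
human_rate = {"highway": 0.5, "city": 2.0}

# Hypothetical mileage mixes: FSD used mostly on highways, manual skews to city streets.
fsd_miles    = {"highway": 0.9, "city": 0.1}
manual_miles = {"highway": 0.4, "city": 0.6}

def blended_rate(mix: dict[str, float], rates: dict[str, float]) -> float:
    """Mileage-weighted crash rate for a given road-type mix."""
    return sum(mix[k] * rates[k] for k in mix)

# Assume FSD is exactly as safe as a human on each road type (zero real benefit).
fsd_observed = blended_rate(fsd_miles, human_rate)
manual_observed = blended_rate(manual_miles, human_rate)

apparent_reduction = 1 - fsd_observed / manual_observed
print(f"apparent crash reduction from mileage mix alone: {apparent_reduction:.0%}")
```

With these invented numbers, the mix effect alone produces an apparent reduction in the same ballpark as the debated 52% figure, which is the skeptics’ point: the headline number is hard to interpret without controlling for where and when the system is engaged.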
Behavior Monitoring and Privacy
- Lemonade’s program uses Tesla’s fleet API to distinguish FSD vs manual miles and to rate “how you drive.”
- Likely implications: higher rates for aggressive or inattentive manual driving; essentially a telematics/safe-driver program (a minimal rating sketch follows this list).
- Several commenters object to pervasive telemetry and location tracking, with some physically disabling vehicle modems; others note that phones already leak similar data.
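
A minimal sketch of how a telematics-style program could turn per-trip data into a premium multiplier, assuming hypothetical trip fields (FSD miles, hard-braking events, night miles) and made-up weights. It does not reflect Lemonade’s actual model and makes no real Fleet API calls.

```python
# Hypothetical usage-based-insurance rating; data shape and weights are invented.
from dataclasses import dataclass

@dataclass
class Trip:
    miles: float
    fsd_miles: float    # miles driven with FSD engaged
    hard_brakes: int    # harsh-braking events while driving manually
    night_miles: float  # manual miles driven late at night

def premium_factor(trips: list[Trip]) -> float:
    """Return a multiplier applied to a base premium (1.0 = neutral)."""
    total = sum(t.miles for t in trips) or 1.0
    fsd_share = sum(t.fsd_miles for t in trips) / total
    brakes_per_100mi = 100 * sum(t.hard_brakes for t in trips) / total
    night_share = sum(t.night_miles for t in trips) / total

    factor = 1.0
    factor *= 1.0 - 0.5 * fsd_share          # discount scaled by FSD usage
    factor *= 1.0 + 0.05 * brakes_per_100mi  # surcharge for aggressive manual driving
    factor *= 1.0 + 0.2 * night_share        # surcharge for risky manual night miles
    return round(factor, 3)

trips = [Trip(miles=30, fsd_miles=27, hard_brakes=0, night_miles=0),
         Trip(miles=10, fsd_miles=2,  hard_brakes=3, night_miles=5)]
print(premium_factor(trips))
```

The actual signals and weights an insurer uses are unknown; the point is only that once per-mile FSD/manual attribution exists, conventional usage-based pricing follows naturally from it.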
Impact on Insurance Markets
- If supervised FSD measurably cuts claims, insurers will increasingly price around it, effectively pushing adoption.
- Concerns that, if unsupervised robo‑vehicles become much safer, premiums for the shrinking pool of human drivers could spike, especially if that pool skews toward riskier drivers or those with older, cheaper cars.
- Counterargument: which drivers migrate first (e.g., drunk or heavy‑mileage drivers) will heavily affect the risk profile of the remaining manual pool (illustrated by the toy pool model after this list).
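
A toy adverse-selection model with invented driver segments and claim costs, showing how the average cost of the remaining manual pool depends on which segment migrates to autonomous vehicles first.

```python
# Toy model of the shrinking manual-driver pool; all numbers are invented.

# Hypothetical pool: (segment, share of drivers, expected annual claim cost).
pool = [("low-risk",  0.6,  800.0),
        ("average",   0.3, 1500.0),
        ("high-risk", 0.1, 4000.0)]

def average_claim_cost(pool, migrated: dict[str, float]) -> float:
    """Average expected claims for drivers who stay manual, after a
    fraction of each segment migrates to autonomous vehicles."""
    remaining = [(share * (1 - migrated.get(name, 0.0)), cost)
                 for name, share, cost in pool]
    total_share = sum(s for s, _ in remaining)
    return sum(s * c for s, c in remaining) / total_share

today = average_claim_cost(pool, {})
low_first = average_claim_cost(pool, {"low-risk": 0.8})   # safe drivers adopt AVs first
high_first = average_claim_cost(pool, {"high-risk": 0.8}) # risky drivers adopt AVs first

print(f"today:                  ${today:.0f} per remaining manual driver")
print(f"low-risk leave first:   ${low_first:.0f}")
print(f"high-risk leave first:  ${high_first:.0f}")
```

With these numbers, the remaining manual pool’s average claim cost rises from about $1,330 to roughly $1,820 if low-risk drivers leave first, but falls to about $1,100 if high-risk drivers leave first.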
Ownership, Robotaxis, and Transit
- Some see the point of self‑driving as avoiding ownership and insurance entirely by using robotaxis.
- Others fear “forever subscriptions” and loss of ownership/control, preferring private cars as personal spaces and criticizing SaaS‑style mobility.
- Debate extends to whether self‑driving fleets could replace buses and traditional transit; some claim minivan fleets could beat transit economics, while others point to capacity, congestion, and union politics.
Ethical and Legal Ambiguities
- Discussion around whether humans in the loop serve mainly as “entities you can jail” when automation fails.
- Questions raised about how law should treat machine decision‑making, manufacturer responsibility, and the ethics of shifting blame through contracts and branding (e.g., “FSD Supervised”).