© Unsplash/Jenny Ueberberg
The accident occurred in 2019: a Model 3 crashed into a tree at around 105 km/h (65 mph) on a road in California. The driver of the vehicle died, but the two passengers, an adult and an 8-year-old child, escaped with injuries. The surviving adult passenger later decided to take the matter to court. According to him, Autopilot was engaged at the time of the accident and could have been responsible for the crash.
The case led Tesla to once again deploy a strategy that has served it well in court. Rather than establishing from the vehicle's logs whether Autopilot was actually engaged, as it has done in some previous cases, the brand instead pointed to the responsibility of the driver, who had allegedly consumed alcohol: his blood alcohol level was 0.05%, below California's legal limit of 0.08%. Tesla added that the driver must be ready to regain control of the vehicle at any time while the driver assistance system is engaged.
Tesla's Autopilot found "not guilty" in a fatal accident
Four days later, after deliberations, the verdict came in. In many US states, a jury decides the outcome of such proceedings, and that is precisely what happened here: by 9 votes to 3, the jury found that Tesla was not liable in this case. It is a first for a trial over Autopilot technology (level 2 autonomy) in a fatal accident. In other recent cases involving only injuries, the firm has deployed the same argument with similar success.
Tesla's attitude contrasts with that of other manufacturers. Electrek recalls, for example, that Mercedes is beginning to accept responsibility in the event of an accident involving its Drive Pilot system (level 3 autonomy) in the United States. Asked about the question recently during a conference call, Tesla's CEO said: "A lot of people assume that we are legally responsible, judging by the lawsuits. We are clearly not being allowed to escape this, whether we like it or not."
The firm emphasizes in its documentation that the driver must always remain attentive and is solely responsible when using Autopilot. A message to that effect, which the driver must accept, is also displayed on the vehicle's screen on first use. This way of operating is regularly criticized, however. At issue is the semantics deployed by the brand (doesn't the term "Autopilot" suggest that the car is already capable of driving itself entirely on its own?), but also the whole "consent" process surrounding this type of legal notice.
Indeed, according to Electrek, the warning message is only displayed once and never reappears afterwards. And users are accustomed to accepting such messages without even reading them. Beyond that, not all drivers keep their attention on the road or their hands on the wheel, despite the reminders regularly displayed in the vehicle.