
Tesla Model 3 in “Full Self-Driving” mode

Photo: Mike Blake / REUTERS

With its "Autopilot," a multi-level driver assistance system, the electric car manufacturer Tesla has so far made more negative than positive headlines. The list of accidents that have, or may have, something to do with the system is long by now. Some of them were fatal for the occupants.

The Washington Post says it has now, for the first time, reconstructed in detail an accident that may be the first fatal crash involving the highest level of the assistance system: the so-called "Full Self-Driving," which is still only available as a beta version.

The crash occurred in 2022 in Evergreen, Colorado. According to the report, Hans von Ohain, a Tesla employee who was driving the car, and Erik Rossiter were returning from a golf outing when the car left the road, collided with a tree and burst into flames.

The Washington Post cites a recording from the emergency call center in which von Ohain is said to have been using an "automatic driving function on the Tesla" that "simply ran off the road." Rossiter, who survived the accident as a passenger, stated that von Ohain was using "Full Self-Driving." However, this cannot be verified because the fire destroyed the car's data recordings. According to the newspaper, Tesla did not respond to inquiries about the incident.

Von Ohain died in the accident after he was unable to free himself from the burning vehicle. According to the article, the investigation report lists "smoke inhalation and thermal injuries" as the cause of death. An autopsy found that von Ohain had a blood alcohol level of 2.6 per mille (0.26 percent), more than three times the legal limit.

Courts are already dealing with Tesla’s “Autopilot”

Tesla notes on its website that "Autopilot" is intended for an "attentive driver" who "keeps his hands on the steering wheel and can take over at any moment." The current report suggests that human and technical failure may have come together in this case: On the one hand, at that blood alcohol level, von Ohain was in no condition to drive a car, according to the article.

On the other hand, the accident analysis points to evidence in the wreck, such as rolling tire marks, indicating that the motor continued to drive the wheels after the impact. "Given the crash dynamics and the way the vehicle left the road without any signs of a sudden maneuver," this is consistent with an active driver assistance feature, the lead investigator is quoted as saying in the article.

The question of whether drivers or vehicle manufacturers are responsible for such accidents is now also occupying the courts. A California court recently heard a case about a 2019 accident in which a Tesla Model 3 left the road in Autopilot mode, killing the driver and seriously injuring two passengers. The court ruled that Tesla could not be held responsible for the accident. According to the Washington Post, at least nine more such cases are expected to go to trial this year.

But Elon Musk had claimed that the car could drive itself and was better than a human driver, the victim's widow is quoted as saying in the article: "We were sold a false sense of security."

lki