“Unreasonable Risk”: Feds Say Tesla’s Autopilot Results in Hundreds of Crashes, Multiple Fatalities

April 28, 2024

(The Epoch Times)—Tesla’s Autopilot system has contributed to at least 467 vehicle crashes, with 14 resulting in deaths and many others that caused serious injuries, according to federal authorities who say there is a “critical safety gap” in the technology.

The U.S. Department of Transportation is investigating Tesla’s December 2023 recall, which covered more than 2 million vehicles, to determine whether the company’s updates to its Autopilot driving systems were sufficient to prevent driver distraction. The department analyzed 956 crashes that were alleged to involve the automaker’s driver assistance technology.

The department’s National Highway Traffic Safety Administration on April 26 posted documents to its website suggesting that an additional 20 crashes have occurred since Tesla’s recall, a development that concerns investigators. The more than 2 million vehicles that Tesla recalled represent nearly all the vehicles the company had sold at that point.

Tesla’s December 2023 recall affected all of its vehicles with Autopilot or driver assistance systems. This includes Tesla’s Model 3, Model X, Model S, and Model Y cars, as well as its Cybertruck. Autopilot’s driver monitoring system is supposed to detect force from the driver’s hands on the steering wheel and issue alerts when that force is absent.

The safety administration asked the automaker to recall its vehicles after a two-year investigation into Tesla’s Autopilot system. Specifically, the agency probed multiple instances of Teslas crashing into roadside emergency vehicles while using the Autopilot systems.

The automaker said the “prominence and scope of the system’s controls may be insufficient to prevent driver misuse.”

The National Highway Traffic Safety Administration on April 26 said its two-year investigation into Tesla’s Autopilot system was done to find any problems with the technology that “created an unreasonable risk” to driver and vehicle safety, and “involved extensive crash analysis, vehicle evaluations, and assessment of vehicle control authority and driver engagement technologies.”

The administration’s Office of Defects Investigation found 13 crashes that involved one or more fatalities and several others that caused serious injuries in which “foreseeable driver misuse” of the Autopilot system played a key role.

For owners of vehicles that use driver assistance technology, Tesla sent an online software update to increase driver warnings, including when a driver’s hands leave the steering wheel. The agency found 20 additional crashes after Tesla sent the update.

Tesla also stated that the software fix requires the driver to opt in and allows the driver to reverse it, according to the agency.

Autopilot Allegedly Kills Motorcyclist

A 2022 Tesla Model S using the Autopilot driving system was allegedly linked to a motorcyclist’s death in Seattle, Washington, on April 19.

The vehicle owner told a Washington State Patrol trooper after the crash that he was using the driver assistance technology while looking at his phone and driving the car.

“The next thing he knew there was a bang and the vehicle lurched forward as it accelerated and collided with the motorcycle in front of him,” the police officer wrote in the affidavit.

The trooper arrested the 56-year-old driver on suspicion of vehicular homicide after the driver said he had been using Autopilot mode while operating the car inattentively and using his cell phone, “putting trust in the machine to drive for him,” according to the document.

Authorities found the motorcyclist, Jeffrey Nissen, 28, underneath the car, and he was pronounced dead at the scene.

However, authorities are still investigating the crash and have not verified whether the driver was using Autopilot at the time of the incident.

The federal agency probing Tesla said it was looking for defects in Autopilot’s driver monitoring system, which is supposed to alert drivers when their hands are no longer touching the steering wheel. Some experts have alleged that the monitoring system is defective and have also criticized its limitations during night driving.

Teslas have cameras to monitor drivers on the road, but they do not possess night vision capabilities, and Autopilot is still operable when the cameras are covered.

The initial investigation began in 2021 after 11 reports surfaced of Teslas striking parked emergency vehicles while using Autopilot. Between June 8, 2022, and April 25, vehicles using the Autopilot system were involved in 467 crashes that resulted in 14 fatalities.

Tesla has said that neither Autopilot nor its more advanced “Full Self Driving” system can drive the vehicle on its own, despite what the latter’s name suggests.

The latter technology was linked to 75 crashes and one death, according to the investigation.

Tesla CEO Elon Musk has previously promised a fleet of robotaxis powered by “Full Self Driving” that would generate revenue for both the company and vehicle owners, whose cars could operate as taxis when they would otherwise be parked. The technology is still being tested and has faced years of delays after Mr. Musk said it would be ready by 2020.

The company says drivers must always be ready to take control of the wheel when using Autopilot or “Full Self Driving” and that they do not give the vehicle complete autonomous control.


The Epoch Times has reached out to Tesla for comment. The Associated Press contributed to this report.


