Tesla Autopilot and Full Self-Driving involved in hundreds of crashes and dozens of fatalities - NHTSA


The National Highway Traffic Safety Administration (NHTSA) investigated 956 accidents involving Tesla electric cars with Autopilot and Full Self-Driving (FSD) features. The investigation covered incidents that occurred between January 2018 and August 2023, and the total number of such accidents is likely higher.

NHTSA began the investigation after several incidents in which Tesla cars crashed into ambulances parked on the side of the road. Most of these incidents occurred at night, when the cars' software ignored warning measures such as hazard lights, flares, road cones, and illuminated arrow boards.

These accidents (some of which also involved other vehicles) resulted in 29 fatalities. There were also 211 accidents in which "the front plane of the Tesla collided with a vehicle or obstacle in its path." These accidents, often the most severe, led to 14 deaths and 49 injuries.

In its investigation, the agency found that Autopilot, and in some cases FSD, was not designed to keep the driver engaged. Tesla says it warns customers to stay attentive when using Autopilot and FSD, that is, to keep their hands on the wheel and eyes on the road. However, NHTSA found that in many cases drivers became overconfident and lost focus, and when the moment to react arrived, it was often too late.

The agency found that in 59 accidents, Tesla drivers had "5 or more seconds" to react before crashing into another object. In 19 of these accidents, the danger was visible for 10 or more seconds before the collision. Reviewing crash logs and data provided by Tesla, NHTSA found that in most analyzed accidents, drivers did not brake or steer to avoid danger.

NHTSA also compared Tesla's Level 2 (L2) automation features with comparable systems offered by other automakers. Unlike those systems, Autopilot tends to take the driver out of the loop rather than assist them, which "hinders" drivers from staying engaged in the task of driving. Tesla is an outlier in the industry in its approach to L2 technology, pairing weak driver-engagement requirements with permissive operational capabilities. Even the brand name "Autopilot" can mislead consumers into thinking the system is more capable than it actually is, whereas other manufacturers use more cautious terms such as "assist."

NHTSA concludes that drivers using Autopilot or the more advanced Full Self-Driving system "were not adequately engaged in the task of driving," and that Tesla's technology "did not properly ensure that drivers focused their attention on the task of driving."

NHTSA acknowledges that its study may be incomplete due to "gaps" in Tesla's telemetry data. This could mean that accidents involving Autopilot and FSD are far more numerous than the agency was able to detect.

Source: The Verge
