The agency has also opened a fresh investigation into the effectiveness of the most recent software update.
After reviewing hundreds of collisions, including 13 fatal crashes that resulted in 14 deaths, the National Highway Traffic Safety Administration (NHTSA) has concluded its investigation of Tesla’s Autopilot driver assistance system. The agency determined that drivers’ improper use of the system was a cause of these incidents.
NHTSA also concluded that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software did not prioritize driver attentiveness. Because Tesla “did not adequately ensure that drivers maintained their attention on the driving task,” drivers using Autopilot or the company’s Full Self-Driving technology “were not sufficiently engaged.”
Between January 2018 and August 2023, the agency investigated roughly one thousand collisions, which together resulted in 29 fatalities. NHTSA set aside approximately 489 of these crashes because there was “insufficient data to make an assessment,” the other party was at fault, or the Tesla driver was not actually using Autopilot at the time.
The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path” while Autopilot or FSD was engaged. These crashes caused 14 deaths and 49 serious injuries. In 78 of them, the agency found that drivers had enough time to react but failed to do so: despite having at least five seconds to act, they did not brake or steer to avoid the hazard.
This is where longstanding complaints about the software come into play. According to NHTSA, drivers simply became overconfident, trusting the system to handle any hazards, and by the time they needed to react it was already too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the agency noted. The mismatch between drivers’ expectations and Autopilot’s actual capabilities created a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”
NHTSA also took issue with Autopilot’s branding, calling it misleading and suggesting it leads drivers to believe the software is fully in control. Rival companies tend to use more modest terms such as “driver assist,” whereas “autopilot” implies a system acting on its own. Tesla is also under investigation over its branding and marketing by the California Department of Motor Vehicles and the California Attorney General.
In response to the report, Tesla has said it cautions customers to pay attention while using Autopilot and FSD, according to The Verge. The company says the software includes regular prompts reminding drivers to keep their hands on the wheel and their eyes on the road. NHTSA and other safety groups argue these warnings do not go far enough and are “insufficient to prevent misuse.” Despite those criticisms, Tesla CEO Elon Musk has promised the company will continue to “go balls to the wall for autonomy.”
The findings may represent only a small fraction of the actual number of crashes involving Autopilot and FSD. NHTSA said that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes…” Because Tesla only obtains data from certain types of collisions, the agency says the company collects data on roughly 18 percent of crashes reported to police.
With all of that in mind, the agency has opened yet another investigation into Tesla, this one examining the over-the-air (OTA) software fix issued in December after two million vehicles were recalled. NHTSA will evaluate whether Tesla’s Autopilot recall fix is effective enough.