NewTechMania | Tech Revolution Mastering The Insights
    Automobile

After linking Tesla Autopilot to 14 fatalities, the NHTSA ends its investigation into the technology

By Tanuja Sharma · 26 April 2024 · 5 Mins Read

The agency has opened a fresh investigation into the effectiveness of the most recent software updates.

After reviewing hundreds of collisions, including 13 fatal crashes that resulted in 14 deaths, the National Highway Traffic Safety Administration (NHTSA) has concluded its investigation of Tesla’s Autopilot driver-assistance system. The agency determined that drivers’ misuse of the system was the cause of these incidents.

The NHTSA also concluded that “Tesla’s weak driver engagement system was not appropriate for Autopilot’s permissive operating capabilities.” In other words, the software did not prioritize driver attentiveness. Because Tesla “did not adequately ensure that drivers maintained their attention on the driving task,” drivers using Autopilot or the company’s Full Self-Driving (FSD) technology “were not sufficiently engaged.”

Between January 2018 and August 2023, the agency investigated roughly 1,000 collisions, which together resulted in 29 fatalities. The NHTSA found that about 489 of these crashes had “insufficient data to make an assessment.” In the remainder, either the other party was at fault or the Tesla driver was not using Autopilot properly.

The most serious were 211 crashes in which “the frontal plane of the Tesla struck a vehicle or obstacle in its path,” often while Autopilot or FSD was engaged. These crashes caused 14 fatalities and 49 serious injuries. In 78 of them, the agency found, drivers had enough time to react yet failed to do so: despite having at least five seconds to act, they neither braked nor steered to avoid the hazard.

Complaints lodged against the software play a role here. According to the NHTSA, drivers simply became overconfident, trusting the system to handle any potential hazards, and by the time they responded it was too late. “Crashes with no or late evasive action attempted by the driver were found across all Tesla hardware versions and crash circumstances,” the agency noted. The mismatch between drivers’ expectations and Autopilot’s actual capabilities created a “critical safety gap” that led to “foreseeable misuse and avoidable crashes.”

The NHTSA also took issue with Autopilot’s branding, which it deemed misleading because it suggests to drivers that the software is in complete control. Competing companies typically use phrases such as “driver assist” instead; the term “autopilot” implies a system acting on its own. Tesla is separately under investigation for false branding and marketing by the California Department of Motor Vehicles and the state’s Attorney General.

In response, The Verge reports, Tesla has said that it cautions customers to pay attention while using Autopilot and FSD. According to the company’s statement, the software issues regular indicators reminding drivers to keep their hands on the wheel and their eyes on the road. The NHTSA and other safety organizations say these warnings do not go far enough and are “insufficient to prevent misuse.” Tesla CEO Elon Musk has nonetheless pledged that the company will continue to “go balls to the wall for autonomy.”

The findings may represent only a fraction of the actual number of crashes involving Autopilot and FSD. The NHTSA said that “gaps in Tesla’s telematic data create uncertainty regarding the actual rate at which vehicles operating with Autopilot engaged are involved in crashes…” The agency asserts that Tesla collects data on only about 18 percent of crashes reported to police, meaning it receives data from only certain kinds of collisions.

In light of all this, the agency has opened yet another investigation into Tesla. This one examines the over-the-air (OTA) software fix distributed in December, following the recall of two million vehicles; the NHTSA will determine whether the Autopilot recall remedy Tesla implemented is actually effective.
