New questions about driver-assist technology have surfaced as the U.S. National Highway Traffic Safety Administration (NHTSA) launches a probe into nearly 2.9 million Tesla vehicles equipped with the company’s Full Self-Driving (FSD) system. Scrutiny of how these advanced systems affect road safety has intensified following widely reported incidents involving red-light violations and erratic lane changes. Drivers and authorities now grapple with the challenges posed by evolving features like FSD, renewing the debate over trust and responsibility behind the wheel. Consumers eager for innovation may now reassess how they use Tesla’s FSD, since safety remains a top concern for both manufacturers and regulators.
Earlier investigations and regulatory reviews of Tesla’s Autopilot features focused mostly on crash events and driver misuse, and their outcomes prompted incremental software updates from the automaker rather than wide-reaching recalls. That has shifted with this large-scale probe covering almost 3 million vehicles: current scrutiny centers on FSD-specific errors such as running red lights, an area less emphasized in prior examinations. Earlier claims often revolved around the ambiguous handoff between automation and human oversight, whereas the present probe examines concrete traffic-law violations potentially linked to specific software versions. NHTSA’s data-driven emphasis on human error as the main cause of crashes continues, but its attention to identifiable failure patterns in automated systems sets this probe apart.
What Prompted the NHTSA’s Latest Investigation?
The agency cited concerns after receiving multiple reports of potentially hazardous behavior attributed to the FSD system, in which Teslas allegedly entered intersections against red lights and, in some cases, caused collisions. Six documented incidents involved Teslas running red lights, four of which resulted in significant injuries. Several additional complaints alleged that vehicles failed to stop properly or misinterpreted traffic signals while FSD was engaged.
How Does Tesla Respond to Investigation and Safety Concerns?
While Tesla has yet to issue an official response to the ongoing probe, the company maintains that FSD is a supervised system requiring driver attention at all times. The automaker’s technical notes state:
“Drivers must remain attentive and ready to take control of the vehicle at all times when FSD is activated.”
Yet, unresolved questions linger about FSD’s handling of intersections and signal detection. Tesla recently began rolling out FSD (Supervised) V14.1, which is said to offer improved lane management and intersection performance — changes the company hopes will address earlier criticisms.
Is Human Error a Greater Risk Than Automation Flaws?
Despite rising scrutiny of autonomous features, NHTSA’s own statistics highlight the persistent risk posed by human drivers. According to the agency, distracted driving caused more than 3,000 deaths in 2023, dwarfing the handful of events linked to software errors so far. NHTSA officials stress:
“Our top priority is to ensure the safety of all road users — regardless of the technology in use.”
Unreported minor violations by both human drivers and automated systems remain a widespread concern, complicating direct comparisons between the two.
Addressing the intersection of innovation and safety, NHTSA’s current probe into Tesla’s FSD software underscores both regulatory caution and public interest in advanced driver-assistance systems. The contrast between earlier Tesla investigations and the current one highlights the complexity of regulating systems that evolve continuously through software updates. Owners of Tesla models equipped with FSD (Supervised) may benefit from reviewing release notes and familiarizing themselves with the system’s limitations. The persistent presence of human error in crash statistics also calls for balanced discussion: while automation introduces new challenges, its actual risk profile may change as the software matures. Evaluating these systems requires careful monitoring by both regulators and consumers, paired with transparent reporting of all types of road incidents.