Tesla Faces Renewed Scrutiny as Self-Driving System Under Investigation for Traffic Violations and Accidents

The U.S. National Highway Traffic Safety Administration (NHTSA) has opened a new investigation into approximately 2.88 million Tesla vehicles equipped with the company’s “Full Self-Driving” (FSD) system. Regulators are concerned that the system may violate traffic laws and contribute to crashes. Reports indicate that Teslas operating on FSD have been involved in numerous incidents, including running red lights, drifting into oncoming lanes, and colliding with other vehicles at intersections.

Alarming Incidents and Safety Concerns

According to recent reports, 58 instances have been documented in which Teslas operating under FSD allegedly failed to properly recognize or obey traffic signals. Notably, six of these cases involved vehicles running red lights and subsequently colliding with other cars. One Houston driver reported that their Tesla stopped at green lights but failed to recognize red ones — behavior that recurred even during a test drive — and that Tesla reportedly declined to address the issue. The NHTSA is also investigating reports of FSD-equipped Teslas failing to navigate railroad crossings safely, including at least one near-miss with an oncoming train.

Regulatory Action and Legal Challenges

This is not Tesla’s first encounter with regulatory scrutiny. The company’s Autopilot and FSD systems are already the subject of investigations, lawsuits, and legal penalties. In one notable case, a California jury awarded $329 million in damages after an Autopilot-related crash caused fatal injuries. Tesla is also under investigation over its limited Robotaxi service in Austin, Texas, where passengers reported inconsistent driving behavior, including speeding and erratic maneuvers, despite the presence of safety drivers. Separately, Tesla is contesting a false advertising lawsuit filed by the California Department of Motor Vehicles (DMV), which claims that the “Full Self-Driving” name is misleading because the software still requires constant human supervision. To address these concerns, Tesla recently renamed the feature “Full Self-Driving (Supervised).”

Latest Software Update and Future Risks

Just days before the investigation was announced, Tesla released a new FSD software update. The NHTSA notes, however, that the current version of the system has already exhibited behaviors that violate traffic safety laws, raising questions about its readiness for widespread deployment. If the agency’s inquiry determines that the software poses significant safety risks, it could lead to a recall.

Industry Trends and Consumer Advice

Meanwhile, other automakers like Lucid Motors and General Motors are advancing their own autonomous driving technologies, including hands-free highway driving features. Despite these developments, experts warn that “self-driving” systems still require vigilant human oversight. Drivers should remain alert and ready to intervene at all times when using semi-autonomous features.

For consumers, understanding the limitations of current autonomous systems is critical. Regularly consulting trusted resources such as the official [National Highway Traffic Safety Administration](https://www.nhtsa.gov/) website can help keep drivers informed about ongoing safety issues and regulatory updates in the autonomous vehicle sector.

Engage and Stay Safe

How comfortable are you with the idea of an AI controlling your vehicle? Share your thoughts with us at [CyberGuy.com](https://cyberguy.com/). Meanwhile, stay vigilant and informed about the evolving landscape of autonomous driving technology and safety regulations.

Ethan Cole

I'm Ethan Cole, a tech journalist with a passion for uncovering the stories behind innovation. I write about emerging technologies, startups, and the digital trends shaping our future. Read me on x.com