Tesla FSD near miss! Driver crashes after takeover. Check details

Tesla driver narrowly avoids train crash on FSD, blames system malfunction and a delayed reaction due to trust in the system. Incident highlights dangers of overreliance on driver-assistance technology.

Dashcam footage shows the car, allegedly on autopilot, barreling through heavy fog towards a train crossing with no signs of slowing down.

Following former UK Prime Minister Boris Johnson’s recent praise for Tesla’s Full Self-Driving (FSD) system, the technology finds itself back in the headlines – but for concerning reasons this time. A close call for a Tesla Model 3 driver has reignited concerns about the capabilities of the FSD system.

The incident, shared on online forums, depicts the Tesla navigating a rural road with visibility significantly hampered by fog. As the car approaches the train crossing, the warning lights flash, but the vehicle maintains its speed. Only at the last possible moment does the driver take control, swerving to avoid a collision with the train.

Unfortunately, the last-ditch manoeuvre wasn’t without consequence. The Tesla careened into a pole, sustaining damage to the front right side and leaving the driver with minor injuries. While fortunate to escape a more serious accident, the incident raises critical questions about driver reliance on FSD and the system’s limitations.


The driver, who has owned the car for a year, claims this isn’t the first time FSD has malfunctioned at a train crossing. They allege two similar incidents in the past six months and are now seeking data from those events.

The driver acknowledges some responsibility, admitting to a delayed reaction due to trusting FSD to operate like other driver-assistance features. “You tend to trust it,” he explained, “like adaptive cruise control that slows down for slower cars. But when it doesn’t, you’re forced to take control.” This highlights a potential pitfall of FSD – a system that can lull drivers into a false sense of security, leading to delayed reactions in critical situations.

Furthermore, the driver contested the speed maintained by the car on autopilot. They claim a recent software update caused the Tesla to exceed the 55 mph speed limit on the rural road, putting them in a situation where even a perfect response might not have prevented the accident entirely.

Tesla’s own manual is clear: FSD requires “constant attention” and driver readiness to take immediate action. However, the name “Full Self-Driving” creates confusion, leading many owners to believe the system offers a higher level of autonomy than it actually does (Level 2 on the SAE scale, which necessitates constant human supervision). This confusion has arguably contributed to several high-profile accidents involving Teslas on autopilot.

The incident serves as a stark reminder of the limitations of FSD and the dangers of overreliance on such systems. While driver-assistance features can be valuable tools, they should never be a substitute for active and attentive driving. Regulatory bodies around the world are scrutinising Tesla’s use of the term “Full Self-Driving,” and this latest event is likely to further fuel that discussion.

First Published Date: 22 May 2024, 19:29 IST
