November 11, 2024 at 3:21 pm

Drivers Testing Tesla’s Full-Self-Driving Mode For More Than One Thousand Miles Have Serious Concerns

by Trisha Leigh


It seems like there have been more concerns than successes when it comes to Tesla and their dream of putting self-driving cars on the road.

This latest report is no different: testers who drove the cars more than 1,000 miles in self-driving mode came away with serious concerns.

The team of researchers came from independent testing firm AMCI and found the system's capabilities to be "suspect" due to dangerous and unpredictable infractions like running a red light.


They determined that even if the full self-driving mode is impressive in some respects, in others there is a whole lot left to be desired.

Tesla CEO Elon Musk thinks the cars are ready to be set loose as fully autonomous vehicles, but the firm disagrees. The drivers were forced to intervene over 75 times, or once every 13 miles.


The average driver in the States covers 35-40 miles per day, so that's a lot of intervening on a regular basis.

Director Guy Mangiamele gave a statement about the test.

“What’s most disconcerting and unpredictable is that you may watch FSD successfully negotiate a specific scenario many times – often on the same stretch of road or intersection – only to have it inexplicably fail the next time.”

They do note that FSD was successful a good amount of the time as well, like the time it pulled into a gap to let other cars pass on a tight road, for example.

That said, it also ran red lights while driving at night and disregarded a double yellow line, veering into oncoming traffic around a curve. This particular issue has been ongoing, and since the driver had to intervene to prevent a crash, it appears not to have been solved.

According to these tests, the cars pose risks in FSD even with humans behind the wheel, so letting "cybertaxis" loose on the world should be a ways off.

"When drivers are operating with FSD engaged, driving with their hands in their laps or away from the steering wheel is incredibly dangerous. As you will see in the videos, the most critical moments of FSD miscalculation are split-second events that even professional drivers, operating with a test mindset, must focus on catching."

Not exactly the report Musk was hoping for, I’m sure.

But he'll be glad for all the safeguards when they save him millions in lawsuits.

Probably.

If you enjoyed that story, check out what happened when a guy gave ChatGPT $100 to make as much money as possible, and it turned out exactly how you would expect.