Cleantech & EVs News

Elon Musk claims there has been no crash in Tesla’s Full Self-Driving Beta over a year into the program

Elon Musk claims that Tesla has not had a single crash in its Full Self-Driving Beta program since it started over a year ago, but that's as much proof that the testers are being careful as it is that the system is safe.

“Full Self-Driving Beta” (FSD Beta) is an early version of Tesla’s self-driving software that is currently being tested by a fleet of Tesla owners selected by the company through its “safety test score.”

The software enables the vehicle to drive autonomously to a destination entered in the car’s navigation system, but the driver needs to remain vigilant and ready to take control at all times. Tesla started the program in October 2020, and it has now pushed the software to several thousand customers.

The test program has been criticized for putting advanced autonomous features in the hands of customers while leaving the responsibility with them by calling it a level two autonomous system in beta testing. Tesla has defended itself by saying that it has been careful to roll out the features slowly, and only to customers it deems “safer drivers.”

In response to a comment by Tesla shareholder Ross Gerber on Twitter, CEO Elon Musk confirmed yesterday that Tesla believes there still hasn’t been a single accident in the Full Self-Driving Beta program over a year after the launch.

That would mean he is disputing a previous crash report to the National Highway Traffic Safety Administration (NHTSA). A Model Y owner in the FSD Beta program claimed in a complaint to NHTSA that the system caused a crash, but the complaint couldn’t be confirmed.

If that report was indeed inaccurate, it is impressive that Tesla hasn’t had an accident in what is likely millions of miles on FSD Beta. NHTSA says that, on average, there’s an accident every 500,000 miles for human drivers (i.e., all drivers), so a fleet covering millions of miles would be expected to log several accidents.
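For a sense of scale, here is a minimal back-of-the-envelope sketch using the NHTSA average cited above. The fleet mileage totals are assumptions for illustration only; Tesla has not published an official FSD Beta mileage figure.

```python
# Back-of-the-envelope comparison of FSD Beta mileage against the
# NHTSA human-driver baseline of roughly one accident per 500,000 miles.
# The mileage totals below are hypothetical, for illustration only.

HUMAN_MILES_PER_ACCIDENT = 500_000  # NHTSA average cited in the article

def expected_accidents(fleet_miles: float) -> float:
    """Accidents a typical human-driven fleet would average over fleet_miles."""
    return fleet_miles / HUMAN_MILES_PER_ACCIDENT

for miles in (1_000_000, 2_000_000, 5_000_000):  # assumed fleet totals
    print(f"{miles:>9,} miles -> ~{expected_accidents(miles):.0f} expected accidents")

# Output:
# 1,000,000 miles -> ~2 expected accidents
# 2,000,000 miles -> ~4 expected accidents
# 5,000,000 miles -> ~10 expected accidents
```

Even at the low end of those assumed totals, a human-driven fleet would be expected to record a couple of accidents, which is why a clean record, if accurate, stands out.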

Electrek’s Take

While impressive, it is probably more proof that Tesla owners in the FSD Beta program are being careful than that the system itself is safe, because we have seen plenty of videos where FSD Beta would have caused an accident had the driver not taken control.

It’s a “so far, so good” situation, but we know that accidents are inevitable. Once one happens, I expect to see a significant ramp-up in criticism of Tesla’s approach to testing its self-driving system. In the meantime, Tesla is enjoying a lot of data from a test fleet that is not only free but made up of customers who paid a lot of money to test the system.

We can argue about whether this is right, but you can’t argue that, as a business, this is one hell of a move.



Author: Fred Lambert
Source: Electrek
