Cleantech & EVs News

Tesla FSD Beta tried to kill me last night

I was testing Tesla’s latest Full Self-Driving (FSD) Beta update (v11.4.7) last night, and an aggressive new bug nearly made me crash at highway speed twice.

Please take this as a public service announcement.

I received the new FSD Beta v11.4.7 update on my Model 3 this week.

Tesla has been known to have a “two-step forward, one step back” process with FSD Beta.

Last night, I decided to test it on my way to Montreal, and I found one of those “back steps,” and it almost got me into a potentially deadly crash.

I was on FSD Beta with the speed set at 118 km/h (73 mph) on Highway 20 toward Montreal, and the system automatically moved to the left lane to pass a car.

As I was passing the car, I felt FSD Beta veering aggressively to the left toward the median strip. Fortunately, I use FSD Beta as recommended by Tesla, which means my hands on the wheel and my eyes on the road.

I was able to steer back toward the road, which disengaged FSD Beta. It was super scary: I almost lost control while correcting FSD Beta, and since I was passing a vehicle at the time, I could have crashed into it if I had overcorrected.

When you disengage Autopilot/FSD Beta, Tesla encourages you to send a message about why you disengaged the system.

I did that, but I wasn’t sure what happened, so my message was something like: “Autopilot just tried to kill me, so please fix it.”

Despite having a storage device connected, I didn’t see the camera button to record what happened. It’s something I only noticed after this update.

A few moments later, I gave FSD Beta another shot, and I was actually able to repeat the problem.

As I moved to the left lane again, I was way more alert, and when FSD Beta again veered to the left toward the median strip, this time I saw one of those median U-turn sections reserved for emergency vehicles:

FSD Beta tried to enter it at full speed. I again was able to correct it in time and sent Tesla a bug report, though it cut me off before I could explain what happened. It should be clear if they can pull the video.

This is a brand new behavior for FSD Beta – for me, at least.

Tesla Autopilot used to try to take exit ramps it wasn’t supposed to in the early days, but it was something that Tesla fixed a while ago. I haven’t had that happen to me in years.

Top comment by Mark Wegman



One important life lesson is to know your own limitations. Earlier versions of FSD might have been more humble and more useful. Up until 11.4.4, I found that my Tesla would do a good job on highways. Now it’s a bit like Mad Max: it changes lanes inappropriately. In the past, it sort of knew it didn’t know when to change lanes, so it didn’t. Now it seems to know a little more, but not really enough.

Suppose an exit I want to take is a bit more than a mile away and I’m in the right-hand lane, going maybe 5 mph slower than I’d like because there’s a car in front of me. Now it will decide to switch lanes, pass the car, and then come close to cutting off the car it just passed to get to the exit. This is impolite at best. I can’t seem to convince it to be a polite driver no matter what settings I try. Sounds like Fred’s recent experience is a more aggressive version of the same problem.


And this is actually a lot more dangerous than a surprise exit ramp, because those median-strip U-turn areas have no ramp to slow down in. FSD Beta basically tried to take a sharp left turn at 118 km/h (73 mph). It could have been deadly.

Considering it happened twice in a two-minute timeframe, this is likely a new bug that crept into FSD Beta.

Hopefully, Tesla can quickly fix this before anything bad happens.

In the meantime, please always use Autopilot and FSD Beta as intended, which means with your hands on the steering wheel and your eyes on the road.


Author: Fred Lambert
Source: Electrek
