A Tesla-Stan Thought This Autopilot “Save” Was Amazing But No


You may remember last week, when I wrote about an absurdly hyperbolic claim of alleged life-saving by a Tesla driving via its Level 2 semi-autonomous Autopilot system; this week, I have another one. This one is a little different, though, because it was brought to my attention by a Tesla-stan who was very sure it would end up with us writing “something positive about Tesla.” Well, friend, you’re in luck, because that’s exactly what will happen. I say this because I’m positive that Tesla on Autopilot drove like a moron.

Just so you know the full story, let me explain. Our helpful Tesla-loving friend slid into my DMs to tell me this:

Screenshot: Twitter

First, he complained we never say anything nice about Tesla, which is bullshit — we just gave the Model Y a glowing review, after all.

He linked to this article on Teslarati that claims a BMW X5 attempted a PIT maneuver on a Tesla Model 3 that was driving on Autopilot, and that the Autopilot system’s reaction saved the Tesla and its occupant.

The article included this tweet, which in turn quoted the original tweet with video of the event, so we can see it for ourselves:

I wanted to include the outer-shell tweet just because it adds more of the “Autopilot saves lives” narrative and all that. But let’s watch that video to see what’s going on here.

The first thing you may notice is that, despite the article’s claim, that BMW wasn’t doing anything like a PIT maneuver at all, which is accomplished by the pursuing car pushing its front quarter against the rear corner of the target car, forcing it to spin.

I don’t see anything that looks like that here, and even assuming this was an attempted PIT maneuver is a pretty gigantic leap. I mean, I get that road rage exists, but who the hell is just casually PIT-maneuvering other drivers on the road? That’s a really extreme conclusion to jump to, and not one borne out by the video.

To be fair, the PIT maneuver business wasn’t mentioned in the tweet, and seems to have been brought up first in that Teslarati story, maybe to add a bit of drama to what really just looks like a mild freakout by the Tesla because it was passed.

What actually appears to be happening here is that the Autopilot-guided Tesla was camping out in the passing lane, and the BMW passed it. Instead of slightly slowing down to allow the BMW to pass safely, the Tesla maintained (or possibly even increased) its speed, then over-reacted to the passing BMW, swerving onto the shoulder and coming alarmingly close to tagging the concrete barrier.

Sure, the BMW’s pass was a bit aggressive and close, but it didn’t have to be that close a call. A human driver would have seen the BMW bearing down on their tail and likely slowed down a bit to allow more room for the pass, and, ideally, taken the damn hint and gotten out of the passing lane altogether.

That’s the other thing here: How long was that Tesla camped out in the passing lane? I see that the Tesla passes a white work pickup on its right, so, okay, that’s reasonable, but why did it remain in that lane after that, especially with a car behind it clearly attempting to go faster or pass on its own?

Now, that doesn’t excuse the BMW’s too-tight pass, which cuts off the Tesla a bit. But, much like the situation we saw last week, this would not have been an issue for most remotely reasonable human drivers. Again, the culprit seems to be that Autopilot doesn’t understand that sometimes it makes the most sense to reduce speed a bit to avoid a potentially dangerous situation.

The Tesla could have just slowed down a little bit and that would have given the BMW plenty of room to pass with a greater margin of safety. It’s not like there was anyone behind the Tesla once the BMW overtook it, as we can see in the video:

Screenshot: Twitter

And the reason we get that nice, clear view of the lane behind the Tesla from its rear-facing, side-mounted camera is that, instead of just slowing down a bit, the Tesla reacts by swerving dangerously onto the shoulder, kicking up dust and coming alarmingly close to the concrete barrier.

Screenshot: Twitter

In short, it looks like the Tesla wasn’t aware of the BMW at all, and panicked when it finally did notice it, just as the BMW was completing its overtaking maneuver, at which point the Tesla swerved, needlessly.

Any human driver would be aware of several things that Autopilot does not seem to be aware of: there’s a car behind you, that car appears to be approaching rapidly, and that car is about to pass you.

A dickhead human driver might do what the Tesla did and maintain or increase speed while the BMW was attempting to pass — an aggressive pass, sure, but hardly anything that unusual. A decent driver would have, again, let off the throttle, let the BMW pass, and then, ideally, gotten out of the passing lane.

So, in this example that was sent to me specifically as an opportunity for me to finally redeem myself and give Autopilot some of the praise it desperately craves, I’m afraid I’ve failed. I’ve failed because I can’t praise Autopilot here: it is driving like an idiot, and if this is the sort of thing being held up as an example of Autopilot “saving lives,” then I think we’d be better off fending for ourselves, thanks.

The Twitter thread is mostly composed of people praising Autopilot and its actions, blaming the BMW’s aggressive driving, and claiming that any attempt to note that Autopilot was camped out in the passing lane is “victim blaming.”

There are a lot of interesting reactions going on here. The fact that Autopilot reacted dramatically and quickly (if needlessly and dangerously) is treated as an example of its superiority:

These responses seem to be so taken by the fact that the car swerved on its own that they fail to consider that the reaction itself was dangerous and just bad driving.

Elon’s assistant there isn’t wrong — a human wouldn’t have reacted like that, because most humans would have just let the SUV pass without incident.

I’m sure autonomous vehicles will one day develop to a point where they’re better than human drivers, but we are in no way there yet, and holding up examples like this one as “life-saving” doesn’t do anyone any good at all.

Besides, it’s still a Level 2 system, which is inherently flawed for human, not technological, reasons.

That’s not to say these shouldn’t be publicized and shown — they should, absolutely. But they need to be looked at objectively, and used to learn from. Based on this and the one we saw last week, I’d hope Tesla’s Autopilot engineers are studying these incidents and developing algorithms that help the cars understand there are times when slightly reducing speed can keep everyone much safer.
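To make that concrete, here’s a minimal, purely illustrative sketch in Python of the kind of “ease off when someone’s bearing down on you” rule I mean. Every name and threshold in it is invented for illustration, and none of it reflects Tesla’s actual software:

```python
# A purely hypothetical sketch, not Tesla's actual Autopilot logic:
# a toy version of the rule described above, where a car that detects
# someone closing fast from behind sheds a little speed to widen the
# passing margin, instead of holding speed and swerving later.

from dataclasses import dataclass
from typing import Optional


@dataclass
class TrailingVehicle:
    gap_m: float        # distance behind us, in meters
    closing_mps: float  # closing speed, m/s (positive = gaining on us)


def adjust_set_speed(current_mps: float,
                     trailing: Optional[TrailingVehicle],
                     min_gap_m: float = 25.0,
                     fast_close_mps: float = 4.0,
                     ease_off_mps: float = 2.0) -> float:
    """Return a slightly reduced cruise speed when a vehicle is bearing
    down on our tail; otherwise keep the current set speed."""
    if trailing is None:
        return current_mps
    if trailing.gap_m < min_gap_m and trailing.closing_mps > fast_close_mps:
        # Yield a couple of m/s so the overtaking car clears us sooner.
        return max(current_mps - ease_off_mps, 0.0)
    return current_mps


# Example: cruising at ~31 m/s (~70 mph) with a car 20 m back, gaining 5 m/s.
print(adjust_set_speed(31.0, TrailingVehicle(gap_m=20.0, closing_mps=5.0)))
# -> 29.0: ease off and let them by, rather than hold speed and swerve later.
```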

So, Tesla-stan who slid into my DMs looking for me to finally give some credit to Autopilot, here you go: I absolutely credit Autopilot for driving significantly worse than a human on a relatively low-traffic road, in perfect weather and great visibility, causing a potentially dangerous situation that didn’t need to happen.

Thank you for reaching out.

Source: gizmodo.com