
A Tesla That The Driver Says Was On Autopilot Crashed Into A Stopped Police Car, Again


Photo: Florida Highway Patrol

Recently, we reported on the NHTSA opening an investigation into crashes in which Teslas with their Level 2 semi-automated Autopilot driving system active struck emergency vehicles, like police cars or fire trucks. It appears that another such crash may have happened this Saturday morning, when a Tesla Model 3 hit a police car that was stopped on the side of the road to help a motorist in Orlando, Florida. Thankfully, nobody was injured, but this does seem to reinforce the idea that Autopilot has a problem with authority.

A statement from the Florida Highway Patrol (FHP) reported that an FHP trooper had stopped to assist the driver of a disabled 2012 Mercedes-Benz GLK 350 on the shoulder of I-4 near Orlando.

The 2019 Tesla Model 3, which the driver claims was in Autopilot mode, sideswiped the cop car, then hit the stricken Mercedes as well, perhaps just to be really thorough and certain that everyone’s day would be ruined.

An investigation is currently underway to confirm the reports about the status of Autopilot and what role it played in the crash. It’s certainly not impossible that this information could prove to be wrong, but, then again, this is exactly the sort of incident that’s been seen with Autopilot before, hence the NHTSA investigation.


Photo: Florida Highway Patrol

I’ve made my stance on Autopilot, and, really, all Level 2 semi-automated driving systems, quite clear: they suck, not necessarily for technical reasons, but for conceptual ones that have to do with how human beings—the primary target market of Teslas and many other new cars—interact with them. And I’m not alone in thinking this.

Humans are pretty good at avoiding emergency vehicles parked on the side of the highway. Autopilot seems to be quite bad at it. If Autopilot were being used properly in this instance, the human driver would have seen that the car was about to drive smack into the police car and taken over.

But the problem is that when a system is doing nearly all of the actual driving—as Autopilot often can in highway situations, and as the system is generally demonstrated doing—humans are terrible at keeping their attention focused on monitoring it.

It’s the human’s fault, sure, but it’s also the result of a bad system that doesn’t take into account how human beings actually work. It’s like that terrible hockey puck mouse Apple made about 20 years ago: technically, it was fine, but the design didn’t take into account how human hands were actually shaped, and as a result, it was a disaster.

It’s worth mentioning, too, that those mice didn’t crash into cop cars along the side of the road. Autopilot and other L2 systems are making the same basic mistake by ignoring how humans actually work.

I’m curious to see the results of this investigation, and whether video from the car can be pulled. If it does show Autopilot was in control, I’d hope that would spur Tesla to really focus on improving the system’s ability to avoid parked cars on the sides of roads in an update as soon as possible.
