
We Can’t Trust Humans Not To Screw Up In Audi’s New Semi-Autonomous System


Photo credit: Audi

Audi has lately been showing off the new semi-autonomous system planned for the 2018 Audi A8 sedan. The automaker confidently says the system, dubbed Traffic Jam Pilot, will allow drivers to “focus on a different activity that is supported by the car, such as watching the on-board TV,” so long as they pay some attention to the road. This probably isn’t a good idea for the public.

Audi says the traffic jam pilot is the “world’s first” semi-autonomous system to check in at Level 3 on the Society of Automotive Engineers’ scale of driving automation.

How’s Level 3 defined? Here’s how the SAE’s six levels break down:
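Level 0, no automation: the human does all the driving.

Level 1, driver assistance: the car supports steering or speed, but not both (think adaptive cruise control).

Level 2, partial automation: the car manages steering and speed together, but the driver has to monitor the road the entire time. Tesla’s Autopilot sits here.

Level 3, conditional automation: the car drives itself under set conditions, and the driver has to take over when prompted.

Level 4, high automation: the car handles all driving within a defined domain, with no takeover expected.

Level 5, full automation: the car drives itself everywhere, no human required.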


The Society of Automotive Engineers calls Level 3 “conditional automation,” which is about as apt a phrase as it gets. It’s a notch above Tesla’s Autopilot, in that Audi says drivers will no longer have to “continuously monitor the car and can focus on another activity supported by the on-board infotainment system.”


“They must merely remain alert and capable of taking over the task of driving when the system prompts them to do so,” Audi says.

Sounds reasonable enough, sure, and on paper the step up from Level 2 to Level 3 seems obvious and logical. But there are significant concerns about the public use of Level 3 autonomy, many of them valid.

“Having a human there to resume control is very difficult,” Bryan Reimer, an MIT researcher who studies driver behavior, told Wired earlier this year.


Plenty of autonomous tech developers have admitted as much; Tesla engineers, for instance, actually expected that people would stop focusing on the road, even though Autopilot is nowhere near capable of handling all driving functions.

So here, a disaster scenario is easily conceivable. Drivers can engage the system only if their A8 is traveling at less than 37 mph and a physical barrier separates the two directions of travel. It handles stopping, accelerating, steering and braking, and a camera checks whether the driver is able to resume control of the wheel if needed.

Here’s more of how Audi explains it:

The prompt to take over is given in three phases – ranging from visual and acoustic warnings all the way to an emergency brake application. If the speed exceeds 60 km/h (37.3 mph) or the traffic begins to clear, the traffic jam pilot informs drivers they need to resume driving themselves. If the driver ignores this prompt and the subsequent warnings, the A8 is braked until it stops completely in its lane.
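
To make that handoff logic concrete, here’s a minimal sketch of how a three-phase escalation like the one Audi describes might be structured. The 60 km/h threshold and the warnings-then-brake sequence come from the quote above; every name, timing, and function below is a hypothetical illustration, not Audi’s actual software.

```python
# A toy model of the three-phase takeover escalation Audi describes:
# visual warning, then acoustic warning, then an emergency stop in lane.
# All names and timings here are invented for illustration; this is not
# Audi's implementation.
import time

SPEED_LIMIT_KMH = 60  # traffic jam pilot hands back control above this

# (phase name, seconds the driver gets to respond) -- timings invented
WARNING_PHASES = [("visual warning", 4.0), ("acoustic warning", 4.0)]

def needs_takeover(speed_kmh: float, traffic_clearing: bool) -> bool:
    """The two handover triggers Audi names: speed or clearing traffic."""
    return speed_kmh > SPEED_LIMIT_KMH or traffic_clearing

def run_takeover(driver_takes_over) -> str:
    """Escalate through the warnings; brake to a stop if all are ignored."""
    for phase, wait_s in WARNING_PHASES:
        print(f"Prompting driver: {phase}")
        time.sleep(wait_s)  # window for the driver to respond
        if driver_takes_over():
            return "driver resumed control"
    # Final phase: every prompt was ignored, so stop in the lane.
    print("Applying emergency brake")
    return "stopped in lane"

# The pilot checks its triggers each cycle; here, speed creeps above 60 km/h
# and the driver never responds, so the car walks through all three phases.
if needs_takeover(speed_kmh=65.0, traffic_clearing=False):
    print(run_takeover(lambda: False))  # -> "stopped in lane"
```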

Again, all good in theory, but humans have proven themselves extremely inept at resuming the wheel when prompted. Making Level 3 work requires an intricate dance between tech and human, and that’s exactly why some automakers have shied away from it, aiming instead to jump from driver-assist functions, adaptive cruise control for instance, straight to full autonomy.

“Level 3 may turn out to be a myth,” John Krafcik, the ex-Hyundai North America boss turned chief executive of Google’s self-driving car project, previously said of autonomous cars. “Perhaps it’s just not worth doing.”

Volvo’s in the same boat. CEO Hakan Samuelsson has said that no sensor yet exists that can give drivers more than a few seconds’ notice to resume control of the wheel and avoid a crash.

“We don’t believe in five seconds, 10 seconds,” Samuelsson told Bloomberg. “It could even be dangerous. If you are doing something else, research shows that it will take two minutes or more before you can come back and take over. And that’s absolutely impossible. That really rules out Level 3.”


That a slightly more advanced system than Autopilot or GM’s Super Cruise could help people adapt to autonomous cars quicker isn’t that persuasive an argument for Level 3, either. Ford’s former CEO cited a potential fatal accident, akin to the death of Tesla owner Joshua Brown, as something that could shake public confidence and derail any momentum toward the development of self-driving cars.

Human beings are fickle, and he’s almost certainly right. Tesla CEO Elon Musk thinks Level 3 is smart because, even with the risks, it could shave an extra percentage point or two off the total fatal crashes in a year. But if one fatal crash transpires in a Level 3 system and permanently damages the public appetite for fully automated cars, why risk it?

Several companies are aiming to roll out fully automated vehicles in limited circumstances by early next decade. It’s a small, measured strategy, but perfecting the technology in a limited capacity is a more reasonable way to integrate robot cars into the public sphere than selling cars that promise you can watch TV while the car drives.


I’m interested to try out the Audi system, and the fact that it’s limited to 37 mph is at least one safeguard that could prevent a terrible accident from occurring in the A8.

But that’s contingent on Audi selling the new A8 only to consumers who won’t take the traffic jam pilot to its extreme limits: people who aren’t inattentive or slow to respond to the system’s prompt to retake the wheel. That seems like quite a gamble.
