It’s happening too often. Someone spots a Tesla owner sleeping while motoring down the freeway, their car under the control of Tesla’s Autopilot driver assistance system. Next thing you know, it’s all over social media.
You may wonder how Tesla was able to release this product onto public roads. Are there no regulations covering such features? Isn’t this a safety issue? According to a report from the Los Angeles Times, it really comes down to a lack of oversight from the government.
The Trump administration focused its efforts on rolling back fuel economy requirements. Its argument for doing so was that cars would become both cheaper and safer. That didn’t happen, and it’s a mystery why Trump thought it would. One explanation is that he didn’t know shit about cars.
Unfortunately, fuel economy and emissions control rollbacks were just about the only things Trump’s NHTSA did get around to doing. NHTSA’s important regulatory oversight work stalled for four years with no director at the helm. Now, the Biden administration has a backlog of neglected tasks to dig through. As the Times report shows, NHTSA has been pretty much hands-off when it comes to driver-assistance systems, specifically when it comes to Tesla’s misleadingly named Autopilot:
Officially, the National Highway Traffic Safety Administration discourages such behavior, running a public awareness campaign last fall with the hashtag #YourCarNeedsYou. But its messaging competes with marketing of Tesla itself, which recently said it will begin selling a software package for “Full Self Driving” — a term it has used since 2016 despite objections from critics and the caveats in the company’s own fine print — on a subscription basis starting this quarter.
That NHTSA has so far declined to confront Tesla directly on the issue is firmly in character for an agency that took a hands-off approach to a wide range of matters under the Trump administration.
“Inactive,” is how Carla Bailo, chief executive of the Center for Automotive Research, summed up NHTSA’s four previous years. “Dormant,” said Jason Levine, executive director at the Center for Auto Safety. “No direction,” said Bryant Walker Smith, a professor and expert in autonomous vehicle law at the University of South Carolina.
The agency went the full Trump term without a Senate-confirmed administrator, leaving deputies in charge. It launched several safety investigations into Tesla and other companies, but left most unfinished. “A massive pile of backlog” awaits the Biden administration, said Paul Eisenstein, publisher of The Detroit Bureau industry news site.
While NHTSA has been absent on a number of issues, its lack of oversight on autonomous driving is perhaps the biggest. The Times says Level 2 autonomy is the biggest safety challenge since Ralph Nader’s Unsafe At Any Speed. Silly Nader references aside, the Times does have a point.
How to deal with emerging autonomous driving technologies is a long-term issue. But one thing is for sure: the way Tesla uses its customers as beta testers raises alarm bells with experts.
Whoever takes charge must balance the long-term potential for next-generation cars to reduce pollution, traffic and greenhouse gases against the near-term risks of deploying buggy new technologies at scale before they’re fully vetted. In the “move fast and break things” style of Silicon Valley, Tesla Chief Executive Elon Musk has embraced those risks.
While other driverless car developers — from General Motors’ Cruise, to Ford’s Argo AI, to Amazon’s Zoox, to Alphabet’s Waymo, to independent Aurora and more — all take an incremental, slow rollout approach with professional test drivers at the wheel, Tesla is “beta testing” its driverless technology on public roads using its customers as test drivers.
Musk said last month that Tesla cars will be able to fully drive themselves without human intervention on public roads by late this year. He’s been making similar promises since 2016. No driverless car expert or auto industry leader outside Tesla has said they think that’s possible.
While law professor Smith is impressed by Tesla’s “brilliant” ability to use Tesla drivers to collect millions of miles of sensor data to help refine its software, “that doesn’t excuse the marketing, because this is in no way full self-driving. There are so many things wrong with that term. It’s ludicrous. If we can’t trust a company when they tell us a product is full self-driving, how can we trust them when they tell us a product is safe?”
The Detroit Bureau’s Eisenstein is even harsher. “Can I say this off the record?” he said. “No, let me say it on the record. I’m appalled by Tesla. They’re taking the smartphone approach: Put the tech out there, and find out whether or not it works. It’s one thing to put out a new iOS that caused problems with voice dictation. It’s another thing to have a problem moving 60 miles per hour.”
A late 2016 NHTSA directive under the Obama administration identified “predictable abuse” as a potential defect in autonomous driving tech deployment. Unfortunately, under Trump, NHTSA did nothing with it. For context, the directive came about a year after the software that enabled Autopilot driver assistance in the Tesla Model S was released.
NHTSA’s inaction drew ire from another federal safety agency, the National Transportation Safety Board. The NTSB — which is best known for its investigations of plane and train incidents — cited predictable abuse as a factor in a 2018 incident in which a Tesla Model X crashed into a concrete divider.
Part of the issue is the lack of transparency from Musk and Tesla about how safe the Autopilot driver-assist system actually is, along with a lack of data in general. From the Times:
Musk regularly issues statistics purporting to show that Autopilot and Full Self Driving are on balance safer than cars driven by humans alone. That could be, but even if Musk’s analysis is sound — several statisticians have said it is not — the data is proprietary to Tesla, and Tesla has declined to make even anonymized data available to university researchers for independent confirmation. Tesla could not be reached — it disbanded its media relations department last year.
***
In 2019, after a series of Tesla battery fires, NHTSA launched a probe of the company’s software and battery management systems. Later, the agency said allegedly defective cooling tubes that could cause leaks were being investigated as well. At the time, the agency did not make public information it held about battery cooling tubes prone to leakage that were installed in early versions of the Model S and Model X.
Since late 2016, many Tesla drivers had been complaining about “whompy wheels” on their cars — a tendency for the suspension system to break apart, which sometimes caused a wheel to collapse or fall off the car. Chinese drivers lodged similar complaints, and last October, China authorities ordered a recall of 30,000 Model S and Model X cars. A Tesla lawyer wrote NHTSA a letter arguing no U.S. recall was necessary and blamed driver “abuse” for the problems in China. NHTSA said in October it is “monitoring the situation closely.”
Four days before Biden’s inauguration, NHTSA announced that defects in Tesla touchscreen hardware can make the car’s rear-view camera go blank, among other problems. Rather than order a recall, NHTSA said it asked Tesla to voluntarily recall approximately 158,000 Model S and Model X cars for repair. On Feb. 2, Tesla agreed to recall 135,000 of those cars.
Check out the full Los Angeles Times report; it’s well worth the read!