Autonomous Angst

We’ve been on board with the autonomous car revolution for a while now. From the heady days of Apple’s maybe-defunct Project Titan to counting the self-driving cars we spot in Palo Alto (simple pleasures), we’re all for it.

There’s something about the idea of a fleet of autonomous cars that not only speaks to the Total Recall fan in us but also feels like a fulfillment of the sci-fi promises of our childhood. I mean, we don’t have flying cars (yet), but we can at least have safe, efficient robot cars.


What seemed like an easy slide into the age of autonomy is proving to be more of a hard slog than we expected. By this point, we were supposed to have 10 million autonomous vehicles on the road. There are a lot of reasons for the delay in getting us our glorious tech paradise, and safety is proving to be one of the most troublesome.

This week, the National Transportation Safety Board criticized (to put it mildly) Tesla for not doing enough to prevent drivers from misusing its Autopilot feature. The NTSB’s report also found that the National Highway Traffic Safety Administration’s hands-off approach to regulating new vehicle technology regularly overlooks the risks of advanced driver-assistance features like those found in vehicles made by Tesla, GM, and Audi.

With three fatal crashes to answer for, the heat Tesla’s taking is understandable. But the real issue here is the NHTSA’s voluntary approach to safety tracking and regulation of new features.

This isn’t to say the NHTSA isn’t taking control where they can. Last week they partially suspended operations of shuttle company EasyMile after a passenger in Ohio was injured aboard one of its self-driving shuttles.

It’s a learning curve, and we’re behind. “Fixing problems after people die is not a good highway approach,” Robert Malloy, the director of highway safety at the NTSB, told board members. The NTSB has no power to change federal or state regulations, but they can make recommendations, and they’re doing so.

The thing is, they’ve been doing so for a while, and not much has changed.

One consistent note in all reports of autonomous-vehicle crashes is user error. Human users, that is. Partially automated cars don’t really drive themselves, and the assumption that you can just kick back and take a nap while your Tesla cruises you to Whole Foods is, well, going to cause some problems.

NTSB Chairman Robert Sumwalt pulls no punches: “If you own a car with partial automation, you do not own a self-driving car. So don’t pretend that you do,” Sumwalt said. “This means that when driving in the supposed self-driving mode you can’t sleep. You can’t read a book. You can’t watch a movie or TV show. You can’t text. And you can’t play video games.”

FINE, Robert. Spoilsport.

We’re hopeful (those childhood sci-fi dreams again) that these are just bumps in the inevitable road to an autonomous paradise.

