In 1986, for reasons that now seem absurd, the Audi 5000 became the victim of a national panic over “sudden acceleration incidents.” These were, allegedly, events in which the car shot forward even though the driver was not stepping on the gas, but was rather pressing on the brake as hard as possible.
There had always been a certain number of these incidents reported to regulators. Regulators didn’t do much with them, because they assumed what you are probably assuming: The drivers were not, in fact, stepping on the brake, but were flooring the gas.
Then in 1986, the New York Times wrote an article on the phenomenon. It mentioned the Audi only in passing, but it caught the eye of a woman on Long Island who had had two such accidents in her Audi. She formed a support group for similarly afflicted Audi owners and got a Ralph Nader group involved. The next thing you knew, “60 Minutes” was doing a broadcast on the issue, and as P.J. O’Rourke would later write, “Audis began jumping and leaping and cavorting in suburban driveways like killer whales at Sea World, and the sky turned legal-pad yellow with law suits.”
Eventually, the National Highway Traffic Safety Administration got involved and wrote up a report which found that ... yup, these drivers were stepping on the gas instead of the brakes, often with horrific results. That didn’t save Audi: its sales collapsed, and the company nearly pulled out of the U.S. market. (To add insult to injury, Audi owners sued the company for the lost resale value of their cars.)
I’ve been thinking about that history this morning because of the Tesla crash in which a driver using the “Autopilot” feature was killed when the car drove itself under a tractor-trailer. It appears to have been something of a freak accident -- a white trailer riding high against a bright sky, so that Autopilot didn’t detect the truck in its path. Tesla says that “Neither Autopilot nor the driver noticed the white side of the tractor trailer against a brightly lit sky,” but this may be because, according to the driver of the truck, the driver of the Tesla appeared to be watching a movie.
The immediate lesson of this is something that experts have been telling self-driving-car overenthusiasts for quite a while: It’s going to be longer than you think before we have a truly road-safe car that can drive itself all the time. But the wider lesson is that even good products are vulnerable to bad safety news arriving at the wrong time.
The firestorm that hit Audi in the ’80s was not something the company deserved; it was mostly just dumb bad luck. The Audi 5000 was the lowest-priced German luxury car on the market, and many American drivers new to such cars found its design unfamiliar, because the gas pedal and the brake sat very close together.
But note that the Audi pedal layout, which may have contributed somewhat to the craze, was in fact good design: you can brake faster if the pedal that stops the car is closer to the one that makes it go. That this actually represented progress toward safety did not save Audi from a flaying in the court of public opinion.
I’ve long thought that Tesla made a mistake calling its system “Autopilot”; it sounds nifty, but the implicit promise it makes amounts to sticking a sign on every car saying “Please, someone file a class-action lawsuit.” Watching a movie while driving is not the recommended use of Autopilot. The company says: “Always keep your hands on the wheel. Be prepared to take over at any time.” But if you call your system “Autopilot,” you can’t be surprised when some drivers watch a movie while using that mode. We all want self-driving cars so badly that some people are behaving as if they’re already here. That sets up expectations that are bound to be dashed.
Tesla drivers may still be safer using Autopilot than driving themselves -- yes, even while watching a movie. But we are more afraid of being driven, unawares, into a tractor-trailer than we are of our own mistakes. We’re also more afraid of the unknown than of the status quo. That means that however rare this sort of thing actually is, high-profile cases like this one, coming at the wrong moment, have the potential to derail both regulatory progress and market adoption of self-driving cars.
That may not be a perfectly rational calculation. But the failure of human beings to make perfectly rational calculations is exactly why we’re trying to get them out of the business of driving in the first place.
By Megan McArdle
Megan McArdle is a Bloomberg View columnist who writes on economics, business and public policy. She is the author of “The Up Side of Down.” –Ed.
(Bloomberg)