Who gets the blame when the government knowingly allows the public roads to be used as a test track for dangerous technologies?
That’s the question the National Highway Traffic Safety Administration (NHTSA) should be asking itself as it “probes” the self-crashing cars that electric car manufacturer Tesla has been testing on public roads … with living crash test dummies.
With NHTSA’s full awareness.
Tesla has never hidden that it offers self-crashing technology; it simply advertises it as “self-driving” capability, styled Autopilot. It has been offering it for years. The media has been reporting on it for years. The crashes — and deaths — have been happening for years. The most recent crash happened in Texas last week, when a 2019 Model S equipped with Tesla’s Autopilot system left the road “at a high rate of speed” and self-drove into a tree.
Two men were killed. Apparently, neither one was in the driver’s seat.
This is not, however, the first time. Or the 10th time.
There have been at least two dozen other crashes over the past several years involving Teslas equipped with Autopilot technology. Videos of people deliberately asleep behind the wheel of moving Teslas have circulated on social media; some have hundreds of thousands of views and have made national network news. Owners openly brag about not driving their cars.
NHTSA, of course, is perfectly aware of this. It is merely selective about “safety” — calling into question whether that is its raison d’être … or its excuse.
The goriest example — literally — of the federal agency’s indifference to “safety” is the recent fiasco over lethally defective Takata airbags. Even though NHTSA has issued a recall and knows that there are still tens of thousands of cars on the road equipped with these bags — which can and already have killed — it refuses to allow car dealers or independent mechanics to even temporarily disable them until all of them can be replaced.
Perhaps that’s because doing so would set a dangerous regulatory precedent for the disabling of a government-mandated “safety” device. It also might raise uncomfortable questions about the mandating of “safety” technology generally, since no “safety” technology — including even seat belts — is perfectly “safe.” Sometimes they kill. They may, on balance, save more lives. But when it is our lives at stake, perhaps we ought to be the ones weighing the risks and rewards?
But then, there’s no power for government “safety” apparatchiks in that.
Is it possible that NHTSA has looked away from the debris left in the wake of self-crashing Teslas because of the political importance of coddling Tesla, the company driving the push for electric cars?
It’s noteworthy that the same federal “safety” apparat that hasn’t shown much interest in self-crashing Teslas has also not shown much interest in self-immolating Teslas.
On a per-capita basis, Teslas are more fire-prone than ’70s Pintos — which never just caught on fire. You at least had to run into one first. But several Teslas have gone up in smoke while parked because of battery fires. These fires burn extremely hot, too. It took four hours to extinguish the fire that resulted after the Autopiloted Texas Tesla piloted itself into a tree.
There is clearly a problem. Mr. Magoo could see it.
NHTSA doesn’t seem especially interested.
But the self-crashing problem is interesting for different reasons. If a Tesla catches fire because of a defect in the battery, it’s obvious who is at fault. Or at least it’s obvious who isn’t at fault.
You can’t blame the owner. All he did was buy the thing — and park it.
But when a “self-driving” Tesla self-crashes itself because no self was driving the thing, who is responsible for the crash?
Is it the person who wasn’t driving?
Who was encouraged not to drive by the car’s manufacturer, which built the car to “self” drive and marketed that capability as a feature, meant specifically to attract buyers? Is it reasonable to expect such buyers not to use the feature?
Tesla’s lawyers say the company cautions buyers of its Autopiloted cars to always be “prepared to intervene” — that is to say, prepared to drive when the car appears likely to crash if someone doesn’t intervene. As in, keep your foot covering the brake and your hands on the wheel or at least hovering over it. No checking emails!
This is pretty cheeky stuff — right up there with the infamous catalytic converter “test pipe” of the late 1970s that came with a warning that it must never be used in place of the catalytic converter, notwithstanding it was made for precisely that purpose.
We won’t tell if you won’t.
Similarly, “self-driving” technology is made precisely to let the driver not have to drive — else why bother with it? What would be the point of “self-driving” tech that requires the driver to be “ready to intervene” at any moment?
The point, of course, is to bait customers with a feature that no other car offers. Teslas are expensive — and impractical, being electric cars. They are very quick — briefly. They take a long time to get moving again. How to counteract these negatives? By sexing things up with features that wow the prospective owner and the owner’s friends and family.
Look what it’ll do!
There is nothing intrinsically wrong with this — provided no one gets hurt. More specifically, provided innocent bystanders aren’t hurt. Tesla’s Autopilot isn’t just a threat to those behind (or asleep at) the wheel. It is a threat to anyone in the car’s path.
Tesla is responsible for that. Not the code. Not the tech. So are the drivers who use the tech irresponsibly.
But NHTSA ought to be held accountable for not dealing with these safety concerns.
Else why bother with NHTSA at all?