A guy was killed by his auto-piloted Tesla last week (news story here) and Uncle is looking into it. Of course he won’t do anything about it. Because “if it saves even one life” is very selectively applied. It applies only when whatever the danger happens to be is something Uncle wants to use as an excuse to impose yet another mandate.
Never to rescind one.
See, for instance, air bags.
Or — lately — cars that drive themselves.
Uncle very much wants such cars and so is prepared to do nothing about their potential — and now actual — lethality.
Because, you see, the point is not that cars that drive themselves are “safe” (they’re not, bear with me) though that is much talked up (like air bags, which also aren’t “safe”) and used as the pretext for force-feeding them to us. Note that. We are never given the choice. Never offered whatever the thing in question is and allowed to weigh the pros and cons and then choose for ourselves. Free people are not merely allowed such latitude, they are entitled to it. It is not bequeathed, conditionally, by political parents in a remote bureaucracy but respected as an inviolable moral principle.
In a free society, that is.
But we are not free except to do as we are told.
Back to this Tesla thing.
The “driver” (who wasn’t) in the recent lethal incident was reportedly doing something else besides paying attention to what was in front of him. In this case, a big rig making a left turn across his path of travel. The Tesla’s autopilot did not grok the big rig and drove right into it.
Right under it, actually.
According to Tesla, “autopilot is getting better all the time but it is not perfect and still requires the driver to remain alert.” (Italics added.)
Then why bother with autopilot?
Isn’t the touted benefit of cars that drive themselves this idea that “drivers” no longer have to remain alert? That they can take a siesta or play Candy Crush or watch a DVD or do some work on their laptops? If they have to remain alert, they cannot also do those things.
Remaining alert means keeping one’s eyes on the road at all times — not occasionally. It means being prepared to react to changing conditions.
Like a tractor trailer turning left in front of you.
The imbecility of all this makes my teeth ache.
Vehicular autopilot is often likened to autopilot in commercial airplanes but the parallel doesn’t parse. Airplanes don’t just take off and fly wherever, however. Their flight plans are filed in advance and strictly adhered to, their course (speed and altitude) strictly monitored the entire time. Spur-of-the-moment deviation is not allowed. The airspace is controlled at all times to keep one airplane away from another airplane. Pilots have very little latitude to control their aircraft’s flight path.
And that’s what we’re really getting at here.
Autopilot in cars makes sense only if all cars are similarly under control. If you have to file a “flight plan” before you go anywhere, and your course is monitored and subject to control the entire time. Then it might be possible to avert incidents such as the one described above. The auto-driving Tesla would have known about the big rig’s intention to turn long before the turn ever occurred, and accommodations could have been made by each vehicle’s computer brain.
Lovely, if you don’t mind the idea of no longer being in control of your vehicle.
For automated cars to be “safe,” it is necessary — mandatory — that the caprice of human drivers be taken entirely out of the picture. Else, the vagaries of human imperfection will lead to accidents — and that is not “safe” and so Uncle will step in.
And human imperfection behind the wheel continues to worsen as less and less is expected of human drivers as drivers. Even former basic competences, like the ability to parallel park a car, are no longer required because technology can handle that now. Every incremental dumbing-down, from ABS back in the 1980s through to present-day cars that brake automatically and can come to a complete stop without the “driver” even touching the brake pedal, has been at least tacitly an effort to get the driver out of the driver’s seat.
To render him a passenger.
It is possible that Tesla and Google and the rest of the juggernaut don’t consciously grok the fact that what they are pushing requires the driver to become a passenger. You can’t, on the one hand, fit cars with systems that invite the driver to stop driving and, on the other, expect him to “remain alert.”
You’re either a driver.
Or you’re not.
What’s it going to be?