I recently test drove a new car with an autopilot system. Turn on the cruise control and the car not only maintains whatever speed you set, it also steers itself through curves and keeps the car from wandering out of its lane.
You are supposed to keep your hands on the wheel, of course. But who wouldn’t be tempted to take them off?
Just to see?
Well, one thing I saw was that the system works (in a clumsy, overcautious old-person sort of way) until it doesn't. A klaxon sounds; a flashing yellow display appears, advising that the autopilot system has disabled itself because it lost track of the road.
Because the sensors, the electronic eyes that scan the road to keep the vehicle in its lane and (hopefully) out of the ditch, became suddenly glaucomic for one reason or another. It happens in bright daylight (glare, perhaps?) and also when it's rainy (fog, one assumes).
Ice buildup is a problem in the winter if you can’t park overnight indoors. And even if you can, once you’re out on the road, if the air temperature is well below freezing, ice can and will form on the car’s exposed exterior surfaces — including the surfaces of the sensors which are critical to the operation of these self-driving technologies.
But I've also had a test car literally come to a dead stop in the middle of the road, for reasons known only to the code. It happened on a bright, clear day. I almost had the dashboard for lunch (I never buckle up for safety) because the sudden halt was not expected.
I am able to type all of this because it was my good luck that day no Kenworth was behind me when the car I was in decided to brake violently and stop in the middle of the road — my attempts to countermand this by mashing the accelerator pedal being as futile as Justin Bieber punching a brick wall.
Point being, these systems are fallible.
As are we, of course. But there is a big difference between running off the road because you weren’t paying attention and running off the road because the car wasn’t — or no longer could.
Particularly when the car is being counted on to not run off the road, and when you — the erstwhile driver — have been encouraged (whether tacitly or overtly) to pay less or no attention to the car’s progress.
This is the never-mentioned Catch-22 of these automated driving systems. On the one hand, people are still expected to keep their hands on the wheel and their eyes on the road.
On the other hand, technology is tempting them not to.
Otherwise, what is the point? If you have to pay attention, or are expected to, then the automated driving tech is useless at best and a distraction at worst. It is also an expense, but never mind that.
A part-time driver is like a part-time pregnancy.
You either is — or you isn’t.
If you isn’t, then the tech had better work — your life literally depends on it. If it cannot be depended on to work — all the time — then you had better be ready to intervene, assuming your life has value to you.
This is like not having your cake — or eating it, either.
It places people in an impossible position. Who is ultimately responsible for controlling the vehicle, and for the consequences if the vehicle becomes uncontrollable? People are being encouraged to hand off more and more control over the car to technology, but the tech doesn't get the ticket, or the felony indictment and lawsuit, when the car runs someone over.
Now factor in the absence of interventionary controls such as a steering wheel and brake pedal. This is actively being considered, both by the car industry and by Uncle (see here), on the usual eggs-must-be-broken-to-make-omelets basis.
On the upside, we'll know who, or rather what, to blame, post-mortem.
But that is cold comfort (literally) when your mortal remains are lying on a slab awaiting the embalming machine because there was nothing you could do when the tech hiccupped — and the car took a nosedive off the road, carrying you along for the ride.
Got a question about cars — or anything else? Click on the “ask Eric” link and send ’em in!