Dan Gardner is the New York Times best-selling author of Risk, Future Babble, Superforecasting (co-authored with Philip E. Tetlock), and How Big Things Get Done (co-authored with Bent Flyvbjerg). His books have been published in 26 countries and 20 languages. Prior to becoming an author, Gardner was an award-winning investigative journalist.

The long and dangerous road to self-driving cars

Published in The Globe and Mail, January 2, 2019

Roughly 1,700 Canadians are killed and 10,000 seriously injured on the roads each year. Nine in 10 collisions are the result of human error.

These two facts mean autonomous vehicles – driverless cars – have the potential to dramatically reduce a major cause of death and suffering. But for all the excited talk, we are a very long way from roads and highways filled with safe, reliable, driverless cars. Some observers think that will take a decade. Others say more. Whatever the correct forecast, we are in for – in fact, have already entered – a long transition in which cars are not autonomous but are increasingly automated.

This interim could be dangerous. Unless we rethink how we train and license drivers, rising automation could actually increase the death toll.

The problem is the paradox of automation: As a system becomes increasingly automated, and human contributions to the system decrease, the importance of those contributions increases.

Aviation is the perfect illustration.

Thanks to enormous advances in automation, pilots really don’t do much in a modern passenger jet. They take off and land. In between, automated systems take care of the rest. “For 90 per cent of the trip or more, the plane will be on autopilot,” notes Ken Pennie, retired chief of the Royal Canadian Air Force (and an associate of my firm). “The computer basically flies the plane.”

With such a large percentage of every flight now automated, the pilots are much more marginal than in the past, their skills less important. That is, until something goes wrong.

It could be a malfunction in the automated systems, a mechanical breakdown, unanticipated clear-air turbulence or some other nasty surprise in the environment. Even a collision with birds can turn calm routine into panicked emergency – as when a flock of geese struck US Airways Flight 1549 and forced Captain Chesley (Sully) Sullenberger to make his famous landing on the Hudson River.

In those rare moments, the pilots’ skills are not only needed. Everything depends on them.

As automation develops in cars, we can expect to see the same paradox of automation at work. Cruise control has been common for many years, but cars increasingly offer adaptive cruise control, which brakes automatically when the vehicle ahead gets too close, and lane assistance, which warns the driver when the car is drifting out of its lane or steers to keep it in. On the standard six-level categorization scheme for automated vehicles – where Level 0 is an ordinary car and Level 5 is full automation with no need for a driver – features like these qualify as Level 1.

The next step is already in place in some Tesla vehicles. The company's "Autopilot" system offers adaptive cruise control, lane centring, lane changes (with driver approval), highway entry and exit, and self-parking.

These features will likely multiply and advance and, gradually, become standard in a steadily growing share of all cars sold. Drivers will do less and less driving. Until something goes wrong.

Automated systems will malfunction, mechanical problems will occur, snowstorms will blot out white lines on highways. Some of these disruptions will be expected, others will be more surprising and a few will be completely unpredictable. But in every case, drivers will have to take control.

Will they be able to? That’s unclear now. But experience in aviation suggests that with current driver training and licensing systems, the answer will be no. From the earliest days of civil and military aviation, pilots logged their hours – meaning time in the air piloting an aircraft. More hours meant more experience using their skills. More hours meant a better pilot.

But automation threatened to undo that equation because pilots spent less time in the air actually using their skills. The solution? Simulators and licences.

Pilots are required by law and their employers to spend specified amounts of time in flight simulators, practising what they would have to do in the event of an emergency. And they are regularly tested to ensure that if they have to intervene, they can. If they can’t deliver, they can’t fly.

This system helped ensure that when Sully Sullenberger had to land on the Hudson River, he could.

The skills of drivers, by contrast, are currently checked when they first get a licence. And that’s it. Until they are very old, the law never again asks them to prove that they are capable of safely handling a multitonne slab of speeding metal and glass.

That worked because, as time passed, drivers drove. That not only maintained drivers’ skills; it improved them. An experienced driver was a better driver.

But as automation advances, that equation will break down. Drivers will use their skills less and less. Those skills will diminish. When emergencies require drivers to take control, they will fail. And people will die.

A difficult choice is coming: Either we institute training and licensing systems similar to those for pilots – requiring drivers to be tested periodically in simulators and revoking the licences of those who cannot handle emergencies.

Or we accept the possibility that the already terrible death toll on the roads will worsen in the many years before we reach the driverless Nirvana.