Self-driving cars 'too law-abiding' to be safe

Programmed vehicles chalk up higher crash rates, especially in chaotic traffic: US study

SOUTHFIELD (Michigan) • The self-driving car, the cutting-edge creation that is supposed to lead to a world without accidents, is achieving the exact opposite right now: The vehicles have racked up a crash rate double that of cars with human drivers.

The glitch? They obey the law all the time, without exception. This may sound like the right way to program a robot to drive a car, but good luck trying to merge onto a chaotic, jam-packed highway with traffic flying along well above the speed limit. It tends not to work out well.

As the accidents have piled up - all minor scrapes for now - the arguments among programmers at places like Google and Carnegie Mellon University are heating up: Should they teach the cars how to commit infractions from time to time?

"It's a constant debate inside our group," said Professor Raj Rajkumar, co-director of the General Motors-Carnegie Mellon Autonomous Driving Collaborative Research Lab in Pittsburgh. "And we have basically decided to stick to the speed limit. But when you go out and drive the speed limit on the highway, pretty much everybody on the road is just zipping past you. And I would be one of those people."

Last year, he offered test drives to Congress members in his lab's self-driving Cadillac SRX sport utility vehicle. The car performed perfectly, except when it had to merge onto I-395 South and swing across three lanes of traffic in 137m to head towards the Pentagon. The car's cameras and laser sensors detected traffic in a 360-degree view but didn't know how to trust that drivers would make room in the ceaseless flow, so the human minder had to take control.

Prof Rajkumar said: "We don't want to get into an accident because that would be front-page news. People expect more of autonomous cars."

It turns out, though, that their accident rate is twice as high as that of conventional cars, according to a study by the University of Michigan Transportation Research Institute in Ann Arbor. Driverless vehicles have never been at fault, the study found: They are usually hit from behind in slow-speed crashes by inattentive or aggressive humans unaccustomed to machine motorists that always follow the rules and proceed with caution.

"It's a dilemma that needs to be addressed," Dr Rajkumar said.

It is similar to the thorny ethical issues driverless car creators are wrestling with: how to programme the vehicles to make life-or-death decisions in an accident. For example, should an autonomous vehicle sacrifice its occupant by swerving off a cliff to avoid killing a school bus full of children?

California is urging caution in the deployment of driverless cars. It published proposed rules this week that require a human always to be ready to take the wheel and compel companies creating the cars to file monthly reports on their behaviour.

BLOOMBERG


A version of this article appeared in the print edition of The Straits Times on December 19, 2015, with the headline Self-driving cars 'too law-abiding' to be safe.