Will a driverless car deliberately try to kill you?

The short answer is: quite possibly, yes.

While the general consensus is that autonomous cars will reduce the road toll, one speaker at the International Driverless Cars Conference raised some potentially disturbing ethical issues.

Associate Professor Robert Sparrow of the School of Philosophical, Historical and International Studies at Monash University asked delegates to consider some wider implications, such as whether a driverless car should ever kill its occupant for the greater good.

More than 90 percent of car crashes are attributable to human error, so removing the human from the equation should result in a huge reduction in crashes. But removing the errors made by drivers also means removing their capacity for making ethical decisions in real time, or delegating it to someone else such as the manufacturer.

Prof Sparrow argues that the road toll is always, in a sense, a public policy decision; a compromise, a trade-off between speed, driver competence and the value of human life. Inevitably, until we reach the theoretically unlikely position of driverless cars never crashing, machines will be called upon to face some potentially terrifying ethical quandaries.

As he explains, “We need to build ethics into driverless cars because sometimes, driverless cars will be forced to choose what to crash into.”

Taking this further, in some circumstances an automated vehicle should arguably be programmed to kill its occupant, such as when the choice is between crashing into pedestrians or crashing into a tree. As Prof Sparrow observes, that could be a bit hard to sell to people.

Or how about this scenario? Imagine an unavoidable collision in which the car must hit one of two motorcyclists. One is wearing a helmet; the other is not. The driverless car must make a decision, and because the rider wearing the helmet is statistically less likely to be killed, the car will choose to hit him rather than the more vulnerable helmetless rider. “So,” points out Sparrow, “if you’re a law-abiding cyclist, you’ve just made yourself a target for autonomous vehicles.”
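The selection rule behind this scenario — steer toward whichever target minimises expected fatalities — can be sketched in a few lines. This is purely an illustrative sketch of the kind of logic Sparrow is warning about, not any manufacturer’s actual algorithm; the probability figures are invented for the example.

```python
# Illustrative sketch only: a naive "minimise expected fatalities" rule.
# The probabilities below are invented for this example, not real crash data.

def choose_target(options):
    """Pick the collision target with the lowest expected number of deaths."""
    return min(options, key=lambda o: o["p_death"] * o["people"])

options = [
    {"target": "helmeted rider",   "p_death": 0.4, "people": 1},
    {"target": "helmetless rider", "p_death": 0.9, "people": 1},
]

chosen = choose_target(options)
print(chosen["target"])  # the helmeted rider — exactly Sparrow's worry
```

A rule this simple reproduces the perverse incentive Sparrow identifies: the safety-conscious rider, by lowering his own probability of death, becomes the statistically “cheaper” target.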

Prof Sparrow posited another hypothetical situation in which a driverless school bus is faced with the choice of crashing into a tree or into a single cyclist. “It looks as though, if you’re really just concerned to save human lives, what you should do is run over the one cyclist,” he said, adding “That makes a Google Bus a very dangerous thing to be around.”

Taking things to a logical (although perhaps not very practical) conclusion, Prof Sparrow asked, “If human error is the cause of most car crashes, should humans be allowed to drive at all?” He went on to argue that driving by humans may eventually be made illegal, for the greater good. “If you take the safety argument seriously, then this should be mandated, and we should be moving to take the existing fleet off the road as soon as possible.”

One of the biggest hurdles for autonomous vehicles is who will be held accountable. In a statement that surprised many, Volvo Car Group President and CEO Håkan Samuelsson said the company would accept full liability whenever one of its cars is in autonomous mode.

But if a human being is “partially” driving the car which may well be “communicating” with other vehicles and infrastructure, who or what is responsible if there’s a crash?

“It’s one thing to say that the company who sold me the car is responsible for the road fatality (but) if that car is communicating with the infrastructure of the road (and) with other vehicles then there’s a tremendous dispersion of responsibility,” Sparrow said. “I think (that) is going to be really difficult. It’s one thing to say ‘you’re driving my car, you’re responsible for what happens as a result’… it’s another thing to say ‘this car doesn’t have a steering wheel, (the manufacturer is) driving it’.”

We’ve said it before, and we’ll say it again: the issues involved in autonomous vehicles are complicated. However, Prof Sparrow suggested he is not “overly concerned”, because removing human error from driving would undoubtedly make the roads safer. “However,” he added, “taking the driver out of the equation places somebody else in the situation of decision-maker and you really want to read the fine print if you’re getting into this vehicle.”

This article was prepared using Prof Robert Sparrow’s presentation to the 2015 International Driverless Cars Conference held in Adelaide.