The Moral Dilemma Of Driverless Cars

Wed 30th Jan 2019

A new survey has found that, in a future of driverless cars, motorists would put a child pedestrian’s safety before their own if a crash were inevitable.

The AA surveyed 21,000 of its members, asking them morally tough questions about driverless car technology, and found that 59 per cent of drivers would risk their own life if a child ran out into the road. It wasn’t an overwhelming majority choosing the child’s safety over their own: 800 of those who responded said they would prefer the car to keep driving straight rather than risk their own safety. Four per cent of drivers wanted their car to carry on and run over the child, while two per cent chose an option that saw the car swerve off the road and hit an elderly couple.

The answers show that, despite the technology progressing faster than ever, there are still moral questions to be considered for those times when things do go wrong.

Edmund King, president of the AA, said: “Of those who could make a choice, a clear majority decided to put themselves in danger, perhaps indicating they accept the risks and potential fallibilities of the technology.

“The driverless dilemma is a common question for programmers of autonomous vehicles. The number of people who avoided giving a definitive answer shows this is a difficult live or let die dilemma.”

The AA’s research follows the publication last year of a controversial website called the Moral Machine, which presented motorists with a series of crash scenarios and asked them to decide who should survive. The concept was the idea of two scientists from America, who hoped their data would help inform car makers when developing driverless car software.