Should A Self-Driving Car Kill The Baby Or The Grandma?

Image: Wikimedia Commons
Different cultures give different answers, and there is clearly no single moral standard shared across nations. When AI systems are created, however, their designers must begin with a moral judgment about how those systems will behave. ⁃ TN Editor

The infamous “trolley problem” was put to millions of people in a global study, revealing how much ethics diverge across cultures.

In 2014 researchers at the MIT Media Lab designed an experiment called Moral Machine. The idea was to create a game-like platform that would crowdsource people’s decisions on how self-driving cars should prioritize lives in different variations of the “trolley problem.” In the process, the data generated would provide insight into the collective ethical priorities of different cultures.

The researchers never predicted the experiment’s viral reception. Four years after the platform went live, millions of people in 233 countries and territories have logged 40 million decisions, making it one of the largest studies ever done on global moral preferences.

A new paper published in Nature presents the analysis of that data and reveals how much cross-cultural ethics diverge on the basis of culture, economics, and geographic location.

The classic trolley problem goes like this: You see a runaway trolley speeding down the tracks, about to hit and kill five people. You have access to a lever that could switch the trolley to a different track, where a different person would meet an untimely demise. Should you pull the lever and end one life to spare five?

The Moral Machine took that idea to test nine different comparisons shown to polarize people: should a self-driving car prioritize humans over pets, passengers over pedestrians, more lives over fewer, women over men, young over old, fit over sickly, higher social status over lower, law-abiders over law-benders? And finally, should the car swerve (take action) or stay on course (inaction)?

Rather than pose one-to-one comparisons, however, the experiment presented participants with various combinations, such as whether a self-driving car should continue straight ahead to kill three elderly pedestrians or swerve into a barricade to kill three youthful passengers. 
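To make the setup concrete, here is a minimal sketch in Python of how one such forced choice could be encoded and how a batch of responses might be tallied. The class and function names (Outcome, Scenario, tally_spared) and the field choices are illustrative assumptions for this article, not the Moral Machine's actual data model or code.

from dataclasses import dataclass
from collections import Counter

@dataclass(frozen=True)
class Outcome:
    """One possible outcome of a dilemma: who dies if the car takes this option."""
    group: str   # e.g. "pedestrians" or "passengers" (assumed labels)
    count: int   # number of characters in the group
    ages: str    # e.g. "elderly" or "young"
    action: str  # "stay" (continue straight) or "swerve"

@dataclass(frozen=True)
class Scenario:
    """A forced choice between two outcomes, as shown to a participant."""
    option_a: Outcome
    option_b: Outcome

def tally_spared(responses):
    """Count how many characters of each age group were spared.

    `responses` is a list of (Scenario, choice) pairs, where choice is
    "a" or "b" and names the outcome the participant accepted (i.e. the
    group that dies); the other option's group is therefore spared.
    """
    spared = Counter()
    for scenario, choice in responses:
        saved = scenario.option_b if choice == "a" else scenario.option_a
        spared[saved.ages] += saved.count
    return spared

# The dilemma described above: continue straight and kill three elderly
# pedestrians, or swerve into a barricade and kill three young passengers.
dilemma = Scenario(
    option_a=Outcome(group="pedestrians", count=3, ages="elderly", action="stay"),
    option_b=Outcome(group="passengers", count=3, ages="young", action="swerve"),
)
print(tally_spared([(dilemma, "a")]))  # choosing "a" spares the three young passengers

Aggregating choices like this over millions of responses, broken down by country, is roughly the kind of analysis the researchers describe, though their actual methods are more sophisticated than this toy tally.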

The researchers found that countries’ preferences differ widely, but they also correlate highly with culture and economics. For example, participants from collectivist cultures like China and Japan are less likely to spare the young over the old—perhaps, the researchers hypothesized, because of a greater emphasis on respecting the elderly.

Similarly, participants from poorer countries with weaker institutions are more tolerant of jaywalkers versus pedestrians who cross legally. And participants from countries with a high level of economic inequality show greater gaps between the treatment of individuals with high and low social status.

And, in what boils down to the essential question of the trolley problem, the researchers found that the sheer number of people in harm’s way wasn’t always the dominant factor in choosing which group should be spared. The results showed that participants from individualistic cultures, like the UK and US, placed a stronger emphasis on sparing more lives given all the other choices—perhaps, in the authors’ views, because of the greater emphasis on the value of each individual. 

Read full story here…

Cal (Guest):
The owner and/or driver should be the first choice to die, as they made the choice to buy/have the vehicle. Was part of that choice the idea that others might be chosen to die instead of them facing the consequences of their actions? It has been advertised that way: that OTHERS who were not involved in that decision would pay the price. Those who buy/use these vehicles MUST be the ones who bear whatever consequences come because of them.