
Moral Machine Lets You Decide Who Gets Killed by a Faulty Autonomous Car

With the advent of self-driving vehicles, a major concern is how they would respond to trolley problems. These are moral questions, such as whether a self-driving car full of elderly passengers should crash to avoid puppies in the crosswalk, or whether it is acceptable to run over two criminals to save one doctor. Whose lives are worth more, those of senior citizens or seven-year-olds? MIT researchers have developed a game dubbed the Moral Machine that lets players make similar ethical calls. The primary goal of the game is to build a crowd-sourced picture of human opinion on how machines should decide when faced with moral dilemmas.

Participants in the game are asked 13 questions, each with just two options. In every scenario, a self-driving car with sudden brake failure must make a decision: continue straight ahead and hit whatever is in front of it, or swerve and hit whatever lies in its new path.
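The game's structure, a series of binary dilemmas whose answers are tallied across many respondents, can be sketched as a minimal data model. This is purely illustrative; the class and field names below are assumptions, not MIT's actual implementation:

```python
from dataclasses import dataclass, field
from collections import Counter

@dataclass
class Scenario:
    """One binary dilemma: the car stays its course or swerves."""
    stay_victims: list            # who is hit if the car continues ahead
    swerve_victims: list          # who is hit if the car swerves
    votes: Counter = field(default_factory=Counter)

    def record_choice(self, choice: str) -> None:
        # each participant picks exactly one of the two options
        if choice not in ("stay", "swerve"):
            raise ValueError("choice must be 'stay' or 'swerve'")
        self.votes[choice] += 1

    def consensus(self) -> str:
        # the crowd-sourced answer: the most common choice so far
        return self.votes.most_common(1)[0][0]

# A dilemma like those in the game: pedestrians ahead vs. a barrier on the swerve path.
s = Scenario(stay_victims=["pregnant woman"], swerve_victims=["boy", "doctor"])
for choice in ["swerve", "stay", "stay"]:
    s.record_choice(choice)
print(s.consensus())  # -> stay
```

Aggregating simple binary votes like this is what lets the researchers compare how different populations weigh the same dilemma.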

The Moral Machine features all types of people in different scenarios: children, the elderly, men, and women. There are executives, criminals, the homeless, and nondescript bystanders. In one question, participants are asked to choose between saving a pregnant woman in a car or a boy, a female doctor, a female executive, and two women athletes. The scenarios raise more nuanced questions: should a passenger who never complained about the speeding car be saved? Or should we rely on airbags and other safety features in a crash rather than swerving into unprotected civilians?

The game also poses more basic questions, such as whether AI should intervene at all if doing so would save more lives, or stay passive rather than actively changing events in a way that makes it responsible for someone's death. Most players find the decisions tough even in clear-cut situations; imagine how hard it will be for self-driving cars amid chaotic road conditions.

The trolley problem was first formulated in the late 1960s. The question: is it more just to pull a lever, sending a trolley down a different track where it will kill one person, or to leave the trolley on its course, where it will kill five? It is an inherently moral problem, and slight variations can significantly change how people choose to answer.

At the end of the Moral Machine, the game informs test-takers that their answers were part of a data-collection effort by scientists at MIT for research into autonomous machine ethics and society. However, participants can opt out of submitting their data.

Featured Image Credit: TechCrunch

David Waiganjo
