The future is racing towards us on the wheels of autonomous vehicles (AVs), and with it comes a complex web of moral dilemmas. Should an AV swerve toward two people in order to spare a larger group of bystanders? How should these life-or-death decisions be programmed? MIT researchers delved into these questions with the “Moral Machine” survey, a monumental experiment with unprecedented reach that revealed distinct global preferences, as well as regional variations, concerning the ethics of AVs.
The “Moral Machine” Survey: A Global Experiment
The Moral Machine experiment, led by Edmond Awad, a postdoc at the MIT Media Lab, reached over 2 million online participants from more than 200 countries. Designed as a multilingual online game, participants weighed in on versions of the classic “Trolley Problem,” revealing how they would prefer AVs to act in potentially fatal situations.
The survey identified three preferences that participants most strongly endorsed:
- Sparing Human Lives over Animals: The value of human life takes precedence.
- Saving Many over Few: More lives saved is better.
- Preserving the Lives of the Young over the Elderly: A preference whose strength varied considerably by region.
While these preferences held broadly across the globe, Awad noted that the degree of agreement varied: countries in the “eastern” cluster, for example, showed a weaker preference for sparing younger people over the elderly.
Uncovering Clusters of Moral Preferences
The dataset, nearly 40 million individual decisions from 233 countries and territories, was analyzed as a whole and also broken into subgroups based on demographic characteristics such as age, education, gender, income, and political and religious views.
The researchers did not find significant differences based on these characteristics, but they identified “western,” “eastern,” and “southern” clusters of countries. For instance, the southern cluster showed a stronger tendency to favor sparing young people rather than the elderly, especially compared to the eastern cluster.
Informing Public Discussion
Preferences like these should be part of the public discourse surrounding AVs. According to Awad, knowing them could, in principle, inform how the software that controls AVs is written.
“The question is whether these differences in preferences will matter in terms of people’s adoption of the new technology when [vehicles] employ a specific rule,” Awad says.
Iyad Rahwan, the study’s senior author, notes that public interest in the platform surpassed expectations, allowing for both public engagement and valuable insights.
You Be the Judge: The Future of Autonomous Vehicles
The Moral Machine experiment, published in Nature, is not just a study; it’s an invitation for public engagement in decisions that will shape our future.
With the rapid proliferation of AVs, the time has come for a robust dialogue about their moral programming. The decisions we make now will influence how we navigate the ethical roads of tomorrow.
What kind of rules should govern the autonomous vehicles of the future? How do we balance the scales of ethics, technology, and human values?
You be the judge.
Join the conversation with OnderLaw as we explore these critical questions and more. Stay engaged with us and help shape the future. If you or a loved one have been injured in an accident, contact us today for your free, no-obligation consultation.