
Tuesday, August 6, 2019

econlife - When Cars Have To Decide Whether to Kill Grandma by Elaine Schwartz


Imagine for a moment that an out-of-control streetcar is hurtling toward five people. Sitting at the controls, the driver can do nothing and let those five individuals die. Or he can flip a switch and instead hit a single person.

Doing nothing results in five fatalities. Acting will cause one person’s death. What to do?

It might all depend on where you live.

But first, the reason we need to know.

Autonomous Vehicles

Because self-driving vehicles will have to make decisions, a group at MIT wanted to identify what people believe is ethical. Yes, most of those decisions involve the everyday traffic rules we (are supposed to) observe. However, especially in emergencies, some will be about life and death. A machine will have to decide whether to strike someone who is old or young. It could have to choose between harming men or women, more or fewer people, animals or humans. An autonomous vehicle can act or do nothing.

The Study

To see the ethical driving decisions that self-driving vehicles should make, those MIT researchers created a survey. Called the Moral Machine, it presented 13 scenarios to millions of people in 233 countries and territories. Every scenario involved a situation in which someone had to die; the respondents’ task was deciding who.

The geography of the Moral Machine:




The Results

Among the participants, there was general agreement on three moral preferences (see the toy sketch after this list):


  • humans over animals
  • save many rather than a few
  • preserve the young instead of the old
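
Purely for illustration, here is how those three consensus preferences might be caricatured as a toy scoring rule in Python. This is not the Moral Machine’s method and not how any real autonomous vehicle is programmed; the class, weights, and scenario below are hypothetical.

# Toy sketch only: encoding the survey's three consensus preferences
# (humans over animals, many over few, young over old) as a scoring rule.
# All names and weights are invented for illustration.
from dataclasses import dataclass

@dataclass
class Party:
    is_human: bool
    count: int
    is_young: bool  # crude stand-in for age

def score(party: Party) -> float:
    """Higher score = stronger claim to be spared under this toy rule."""
    s = 0.0
    if party.is_human:
        s += 10.0            # humans over animals
    s += 1.0 * party.count   # save many rather than a few
    if party.is_young:
        s += 2.0             # preserve the young instead of the old
    return s

def spare(a: Party, b: Party) -> str:
    """Return which of two groups the toy rule would spare (ties go to 'a')."""
    return "a" if score(a) >= score(b) else "b"

# Hypothetical example: one young pedestrian vs. five elderly pedestrians.
print(spare(Party(True, 1, True), Party(True, 5, False)))  # prints 'b'

Even this caricature points to the problem the researchers found next: the moment specific weights must be chosen, people stop agreeing.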


However, it wasn’t quite that easy. Moving from general preferences to specific answers, researchers uncovered much more division.

Geographically, the responses to the Moral Machine questions divided into three groups:


  • The Western Cluster: Values related to Christianity characterized the answers from North America and some of Europe.
  • The Eastern Cluster: People with a Confucian or Islamic heritage from places that included Japan, Indonesia, and Pakistan tended to have similar opinions.
  • The Southern Cluster: The third group, not explicitly tied to specific religious roots, was from Central and South America, France, and former French colonies.

Nature Magazine illustrated the differences through a moral compass:





In addition to geography and religion, ethics varied by gender, age, education, income, and politics. Institutions mattered. In more affluent countries with established institutions (like Finland and Japan), survey participants were more likely to condemn an illegal jaywalker. As for age, respondents in the Southern Cluster tended to spare the young rather than the aged.

Our Bottom Line: Externalities

For self-driving cars, ethical programming has become a reality. The countless choices it requires will create externalities. Defined as the impact of a decision on an uninvolved third party, an externality can be positive or negative.

Now, though, we can see that what people call positive or negative will vary.

I wonder if that means having self-driving cars with a Western, an Eastern, or a Southern personality.

My sources and more: I recommend taking a look at Nature’s article on The Moral Machine Experiment.  From there, you might also enjoy this 99% Invisible podcast and this German Ethics Commission report, and do be sure to take MIT’s moral machine quiz.


Ideal for the classroom, econlife.com reflects Elaine Schwartz’s work as a teacher and a writer. As a teacher at the Kent Place School in Summit, NJ, she’s been an Endowed Chair in Economics and chaired the history department. She’s developed curricula, was a featured teacher in the Annenberg/CPB video project “The Economics Classroom,” and has written several books including Econ 101 ½ (Avon Books/Harper Collins). You can get econlife on a daily basis! Head to econlife.

Thursday, September 28, 2017

econlife - Is Your Self-Driving Car Ethical Enough? by Elaine Schwartz



Assume for a moment that you are driving a car and suddenly see some debris in the road. If you veer to the side, into a crowd of pedestrians, you can avoid injuring yourself.

Harm yourself or someone else? As you approach the debris, the decision is yours to make.

Where are we going? To self-driving cars that make decisions for us. 



AV Ethical Contradictions


In a survey on the most moral way to program AVs (autonomous vehicles), the results were somewhat inconsistent. When asked whether, in an emergency, the car should kill its occupant or 10 pedestrians, most respondents chose to sacrifice the occupant.


Similarly, respondents consistently favored minimizing casualties whenever the choice was between protecting the car’s passengers and sparing people outside it.

All of that changed, though, when survey respondents were personally affected. They would not buy a car programmed to sacrifice their own safety or their family’s well-being.

As a result, autonomous vehicle designers face philosophical decisions that carry an economic impact. How they reconcile our aversion to self-harm with the need for “the greater good” will affect sales.
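
To make that tension concrete, here is a deliberately oversimplified Python sketch of how a single tunable weight could tilt an emergency maneuver toward the car’s occupants or toward everyone else. The parameter, names, and numbers are invented for illustration; no manufacturer’s actual software is being described.

# Hypothetical sketch of the designers' dilemma: one tunable weight that
# trades off harm to occupants against harm to people outside the car.
def expected_harm(occupants_hurt, outsiders_hurt, occupant_weight):
    """Weighted harm under a toy policy: occupant_weight > 1 favors the
    people inside the car; occupant_weight = 1 is the impartial,
    casualty-minimizing setting respondents endorse for everyone else."""
    return occupant_weight * occupants_hurt + outsiders_hurt

def choose(options, occupant_weight=1.0):
    """Pick the maneuver with the lowest weighted harm."""
    return min(options, key=lambda o: expected_harm(
        o["occupants_hurt"], o["outsiders_hurt"], occupant_weight))

swerve = {"name": "swerve", "occupants_hurt": 1, "outsiders_hurt": 0}
stay = {"name": "stay", "occupants_hurt": 0, "outsiders_hurt": 2}

print(choose([swerve, stay], occupant_weight=1.0)["name"])  # 'swerve': fewest total hurt
print(choose([swerve, stay], occupant_weight=5.0)["name"])  # 'stay': buyer-protective setting

Whatever the weight turns out to be, someone has to pick it, and that choice is exactly the sales-versus-greater-good tradeoff described above.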

Our Bottom Line: Externalities


For AVs, ethical programming is far from abstract. Creating positive and negative externalities, the decisions that AV programmers make will affect pedestrians and passengers, regulators, insurance companies…the list is endless.

For now, though, when you see one of Google’s self-driving cars, you can wonder whether it is ethical.

My sources…For a specific look at AV ethical dilemmas, these articles from the NY Times and Science are ideal. Then, for the broader picture, Quartz takes us to robotic morality.


Ideal for the classroom, econlife.com reflects Elaine Schwartz's work as a teacher and a writer. As a teacher at the Kent Place School in Summit, NJ, she’s been an Endowed Chair in Economics and chaired the history department. She’s developed curricula, was a featured teacher in the Annenberg/CPB video project “The Economics Classroom,” and has written several books including Econ 101 ½ (Avon Books/Harper Collins). You can get econlife on a daily basis! Head to econlife.
