Wednesday Word: Self-Driving Ethics

April 28, 2021 | Matthew C. Green

In May, Yellowstone National Park will begin operating two unmanned shuttles to ferry tourists from place to place in the Canyon area. The vehicles are part of a pilot program assessing the feasibility of such technology in America’s national parks.

While seen as a step toward the future by some, others have greeted the program with skepticism—even hostility—and plenty of questions. How will these vehicles keep from hitting park visitors? Or strolling buffalo? What will stop them from driving over a cliff?

Although these concerns are mitigated by the fact that the shuttles will have onboard operators who can wrest control from the machines if needed, experiments with driverless vehicles in Yellowstone and elsewhere bring to the surface a long-standing ethical conundrum.

Of Trolleys and Shuttles

In 1967, Oxford philosopher Philippa Foot introduced what has become famous as the “trolley problem.”1 This thought experiment has since been presented in various forms, but the basic idea is this: An out-of-control trolley car is hurtling down the tracks toward five people certain to be killed on impact. Between the trolley and these people, however, is a switch that will divert the car to another track, sparing the five. But there is a problem here, too: someone is walking on the alternate track and will be killed if the trolley is diverted.

What should you do if you’re the trolley driver? Is the most ethical response to spare the five by actively redirecting the trolley toward the one person? Or is it better to remain passive, refusing to commit a wrong act against the one person who would be safe but for your intervention, even though five will die instead of one?

In a short blog article, we can’t begin to broach a solution to such a dilemma, let alone settle a controversial and complex issue such as self-driving vehicles. But suffice it to say, those programming the potential travel paths of self-driving vehicles must make their decisions within some sort of ethical framework.

Ethical Frameworks: Conscious and Unconscious

That framework guides their science, and it applies to everyone who rides in, or stands in the potential path of, such vehicles. It also informs the answers to various practical questions: If a choice must be made, does the vehicle hit a tourist or a buffalo? A buffalo or a bear? What if striking a buffalo instead of three pedestrians risks serious injury or death for the six people on the shuttle? Suppose the trolley problem re-emerges as the shuttle problem, with the vehicle heading toward five pedestrians and the only feasible alternative being to swerve toward one person. Or what if a choice must be made between a group of schoolchildren on a field trip and a group of senior citizens on a tour?

A person’s ethical framework will guide such decisions. All of us have—indeed, must have—such a framework, even if it is unconsciously or uncritically adopted from the surrounding culture. And it guides our choices in life.

In 2014, MIT researchers launched “Moral Machine,” an online experiment to discern the attitudes of people worldwide about how self-driving vehicles should respond in such matters of life and death. Their results, published in 2018,2 are interesting, but even more telling is the ethical framework assumed by their study.

The team observed in its report, “Before we allow our cars to make ethical decisions, we need to have a global conversation to express our preferences to the companies that will design moral algorithms, and to the policymakers that will regulate them.”3 At least to some extent, the basis for these ethical decisions is human preference.

A Solid Ethical Underpinning

As followers of Christ, whether we are dealing with self-driving shuttles or political questions or business affairs, we cannot allow morality to be merely a matter of public opinion. We believe we have a solid foundation for our ethical decisions in the knowledge of God and His ways as presented in Scripture, and we should live guided by the wisdom of the indwelling Holy Spirit.

As Paul puts it in Romans 12:2, we must “not be conformed to this world, but be transformed by the renewing of [our] mind, so that [we] may prove what the will of God is, that which is good and acceptable and perfect.”

This doesn’t mean the answers to all moral questions will be easy and immediately clear. But, we will be directed by God’s good, perfect will and unchanging love, instead of by this world’s priorities, which mutate with the values and fashions of the day.

And that’s a much better guide than human opinion.

1 The Trolley Problem first appeared in Foot’s paper “The Problem of Abortion and the Doctrine of the Double Effect.”

2 Results were published in the journal Nature, https://www.nature.com/articles/s41586-018-0637-6

3 https://www.bbc.com/news/technology-45991093
