NSF Awards $556K Grant to Study Dilemmas Presented by Autonomous Vehicles

Prof. Nicholas Evans' research expertise includes the ethics of emerging technologies.

10/05/2017

Media contacts: Christine Gillette, 978-934-2209 or Christine_Gillette@uml.edu and Nancy Cicco, 978-934-4944 or Nancy_Cicco@uml.edu

LOWELL, Mass. – Should your self-driving car protect you at all costs? Or should it steer you into a ditch – potentially causing serious injury – to avoid hitting a school bus full of children? 

Those are the kinds of questions that preoccupy Nicholas Evans, a UMass Lowell assistant professor of philosophy who teaches engineering ethics and studies the ethical dilemmas posed by emerging technologies, including drones and self-driving vehicles.

“You could program a car to minimize the number of deaths or life-years lost in any situation, but then something counterintuitive happens: When there’s a choice between a two-person car and you alone in your self-driving car, the result would be to run you off the road,” Evans said. “People are much less likely to buy self-driving vehicles if they think theirs might kill them on purpose and be programmed to do so.”
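To see why a pure minimize-the-deaths rule produces that outcome, consider a toy calculation. This is a hypothetical sketch, not code from Evans' project; the scenario and casualty figures are invented for illustration:

```python
# Hypothetical "minimize expected deaths" rule applied to Evans' example:
# a two-person car versus your own single-occupant self-driving car.
options = {
    "swerve_off_road": {"expected_deaths": 1},  # your car: 1 occupant (you)
    "hold_course":     {"expected_deaths": 2},  # other car: 2 occupants
}

# The rule simply picks whichever option kills the fewest people.
choice = min(options, key=lambda name: options[name]["expected_deaths"])
print(choice)  # -> "swerve_off_road": the algorithm sacrifices its own passenger
```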

Now Evans has won a three-year, $556,650 National Science Foundation grant to construct ethical answers to questions about autonomous vehicles, translate them into decision-making algorithms for the vehicles and then test the public health effects of those algorithms under different risk scenarios using computer modeling. 

He will be working with two fellow UMass Lowell faculty members, Heidi Furey, a lecturer in the Philosophy Department, and Yuanchang Xie, an assistant professor of civil engineering who specializes in transportation engineering. The research team also includes Ryan Jenkins, an assistant professor of philosophy at California Polytechnic State University, and experts in public health modeling at Gryphon Scientific.

Although the technology of autonomous vehicles is new, the ethical dilemmas they pose are age-old, such as how to strike the balance between the rights of the individual and the welfare of society as a whole. That’s where the philosophers come into the equation.

“The first question is, ‘How do we value, and how should we value, lives?’ This is a really old problem in engineering ethics,” Evans said. 

He cited the cost-benefit analysis that Ford Motor Co. performed back in the 1970s, after engineers designing the new Pinto realized that its rear-mounted gas tank increased the risk of fires in rear-end crashes. Ford executives concluded that redesigning or shielding the gas tanks would cost more than payouts in lawsuits, so the company did not change the gas tank design. 

Most people place a much higher value on their own lives and those of their loved ones than car manufacturers or juries do, Evans said. At least one economist has proposed a “pay-to-play” model for decision-making by autonomous vehicles, with people who buy more expensive cars getting more self-protection than those who buy bare-bones self-driving cars. 

While that offends basic principles of fairness because most people won’t be able to afford the cars with better protection, Evans said, “it speaks to some basic belief we have that people in their own cars have a right to be saved, and maybe even saved first.” 

Understanding how computers “think” – by sorting through thousands of possible scenarios according to programmed rules and then rapidly discarding 99.99 percent of them to arrive at a solution – can help create better algorithms that maintain fairness while also providing a high degree of self-protection, Evans said. For example, the self-driving car approaching the school bus could be programmed to first discard all options that would harm its own passenger, then sort through the remaining options to find the one that causes least harm to the school bus and its occupants, he said.
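In code, that two-stage process might look like a lexicographic filter. The following is a minimal sketch of the approach Evans describes, with invented maneuvers and harm scores, not an actual vehicle algorithm:

```python
# Hypothetical two-stage filter: protect the passenger first, then
# minimize harm to everyone else. Harm scores are illustrative (0 = none).
options = [
    {"name": "brake_hard",   "harm_to_passenger": 0, "harm_to_others": 6},
    {"name": "swerve_left",  "harm_to_passenger": 4, "harm_to_others": 1},
    {"name": "swerve_right", "harm_to_passenger": 0, "harm_to_others": 3},
    {"name": "hold_course",  "harm_to_passenger": 0, "harm_to_others": 9},
]

# Stage 1: discard every option that would harm the car's own passenger.
safe_for_passenger = [o for o in options if o["harm_to_passenger"] == 0]

# Stage 2: among the remaining options, pick the one that harms others least.
best = min(safe_for_passenger, key=lambda o: o["harm_to_others"])
print(best["name"])  # -> "swerve_right"
```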

Although it’s not quite that simple – most people would agree that a minor injury to the autonomous vehicle’s occupant is worth it to prevent serious injuries to 20 or 30 schoolchildren – it’s a good starting point for looking at how much risk is acceptable and under what circumstances, according to Evans. 

Evans and his team will also look at other issues, including the role of insurance companies in designing algorithms and the question of how many autonomous vehicles must be on the road before they reduce the overall number of accidents and improve safety.

The NSF also asked Evans and his team to look at potential cybersecurity issues with autonomous vehicles. Today’s cars could be vulnerable to hacking through unsecured Bluetooth and Wi-Fi ports installed for diagnostic purposes, but large-scale hacking of self-driving cars is potentially much more dangerous.

There are also important privacy questions involving the data that an autonomous vehicle’s computer collects and stores, including GPS data and visual images from the car’s cameras, Evans said.

UMass Lowell is a national research university located on a high-energy campus in the heart of a global community. The university offers its 18,000 students bachelor’s, master’s and doctoral degrees in business, education, engineering, fine arts, health, humanities, sciences and social sciences. UMass Lowell delivers high-quality educational programs, vigorous hands-on learning and personal attention from leading faculty and staff, all of which prepare graduates to be ready for work, for life and for all the world offers. www.uml.edu