What if a U.S. Air Force pilot, using a computer chip implanted in her brain, could fly a plane remotely to bomb a target?
In this scenario, is it ethical for the Air Force to implant the chip in the first place, when the pilot has no medical or psychological deficits to prevent her from flying a plane from the cockpit? And what happens when she leaves the service for civilian life?
Those are the kinds of questions that Nicholas Evans, an assistant professor of philosophy who specializes in bioethics and the ethics of technology, will examine under a $209,749 grant from The Greenwall Foundation, an independent bioethics institute. The grant comes with a three-year appointment as a Faculty Scholar in Bioethics.
“The U.S. military is one of the largest funders of scientific research in the world, and often, when they want to test something, they will test it on enlisted personnel,” Evans says.
“When the tests involve prosthetics, vaccinations, surgery or drugs to treat serious disease or disability, you can make a strong argument in favor of them, as long as the military obtains informed consent from the research participants. But it’s far less clear whether it’s OK to take someone who is not sick and give them drugs or surgery.”
Evans says the military follows the Common Rule of research ethics by which many U.S. government agencies abide. But the Common Rule is typically interpreted as applying to research for medical purposes: It makes no mention of testing enhancements on “warfighters,” an umbrella term for all active-duty service members, Evans says.
Under the grant, “The Ethics of Warfighter Enhancement Research,” Evans will try to catalog the enhancement research currently underway or being explored by the Department of Defense and then come up with a schema for analyzing it ethically.
As part of that, he will look at the role of “dual use” research: research into technologies and treatments with both civilian medical and military enhancement applications. One example: a computer-human interface that can allow a mostly paralyzed person to maneuver a wheelchair or activate a prosthetic limb — and could in the future allow a military pilot to fly a plane remotely or a military analyst to communicate instantaneously with a computer that can crunch huge amounts of data.
There are looming ethical questions about the relationship between military leaders and warfighters that need to be answered before such technologies are deployed, Evans says.
“One of the cornerstones of military ethics is that soldiers have the freedom to object to unethical, illegal orders that they’re given. But what happens if there’s a chip in your brain? Would you be able to object, or would someone else control your actions?” he asks.
Previous research has shown that when enhancements are removed, people can become depressed. But leaving the chips in leads to other ethical dilemmas, he says.
“Once you start modifying your body, what does that mean when you leave the military? Does the government still have a claim on your body if there’s a chip in you? What if someone steals you to find out what the U.S. does to its soldiers? Now, you’re a security risk,” he says.
The ethical framework Evans is developing has implications for research into enhancements in civilian life, too, such as genetic modifications to human embryos, performance-enhancing and muscle-building drugs used by athletes, and so-called “study drugs,” prescription stimulants that are sometimes misused by students to help them focus longer and sleep less, he says.
The Greenwall Foundation Faculty Scholars program grants provide support to junior faculty to help them become leaders in their fields and obtain tenure. Mentorship is built into the grant, Evans says.
Evans’ previous research includes two National Science Foundation-funded projects, one of which examined the ethics of decision-making algorithms for self-driving cars. He also received an earlier Greenwall Foundation grant for work on dual-use neurotechnologies.