
Professor’s Research Explores How Humans Respond When Autonomous Systems Report for Duty


Author(s)

Tamara Chapman

Senior Managing Editor


Tamara.Chapman@du.edu

Feature

Julia Macdonald

When the Department of Defense (DOD) needs to get up to speed on the social, cultural, behavioral and political trends affecting the nation’s security, it calls in a special kind of special forces.

Via its Minerva Research Initiative, named for the Roman goddess of wisdom and strategic warfare, the department seeks insight — and data — from the ranks of the nation’s top researchers.     

Count Julia Macdonald of the University of Denver’s Josef Korbel School of International Studies among them. An assistant professor affiliated with the Korbel School’s Sié Chéou-Kang Center for International Security and Diplomacy, Macdonald focuses her research on state threat assessments, use-of-force decisions, and U.S. military strategy and effectiveness.

In her latest role, Macdonald is spearheading DU’s contribution to a multi-university study on autonomous systems (think self-driving cars or self-piloted aircraft) and their use in a military context. Titled “The Disruptive Effects of Autonomy: Ethics, Trust and Organizational Decision-making,” the three-year project is funded by a $1 million Minerva grant awarded to the University of Pennsylvania, with research teams from DU’s Sié Center, the Naval War College and Yale University.

Minerva funding comes with no expectations of outcomes, Macdonald explains. “They don’t tell us what they want to hear. They don’t even know sometimes what they want to hear.”

But they do know that, to devise effective strategy, they need to know more about what Macdonald considers an “impending reality.”

“Seeing some of these technologies coming into our everyday lives, like self-driving cars and even Siri on your iPhone, has made everyone aware that we’re much closer to the widespread use of autonomous systems than perhaps it would seem,” she says.

That gives her research urgency, so Macdonald is already hard at work addressing a long list of questions about how human beings respond to autonomous systems controlling their lives in high-risk situations and how public support for the use of force might change when these systems are deployed.

Along with that, Macdonald says, “We’re interested in things like trust. How do military personnel feel about putting their lives on the line when there is not a human being watching out for them, when it’s a machine? Will they be more or less willing to use those machines, and how will that affect military organizations in general?”

This project builds on Macdonald’s earlier research with Jacquelyn Schneider of the Naval War College on unmanned aerial vehicles, also known as drones. The public has grown accustomed to — and appears supportive of — the military use of drones, Macdonald notes. But that doesn’t mean that everyday citizens, their congressional representatives or the troops on the ground will take kindly to autonomous systems. After all, drones are controlled by humans, who can monitor a mission remotely and use their judgment to modify or abort it, even at a moment’s notice.

“An autonomous system doesn’t have any of that,” Macdonald explains. “The human being obviously will have programmed the technology, but the technology will have the ability to change direction, reselect the target. A human being will not be involved in those decisions. Once you let it loose, it’s loose.”

Macdonald’s research will include interviews with military leaders, policy makers and industry executives developing many of these systems. Just as important, she and her research team will talk with the uniformed men and women who serve alongside these systems.

To assess the general public’s response to — and possible support of — autonomous systems in a military operation, the research team also plans to employ “survey experiments.” These, Macdonald explains, “are a little like questionnaires, but they are set up to present people with manipulatable scenarios. ‘If this has happened, what would your response be?’”

A typical scenario might inform survey respondents that the U.S. military plans to use autonomous systems in an action against terrorist organizations overseas. Subsequent scenarios might assign agency to other parties — the federal government, perhaps, or a contractor acting on its behalf.

“[Survey experiments] are essentially a way of doing a lab experiment, as you would in a scientific environment, where you can manipulate different variables,” Macdonald says. “Then you can see how the variables affect the responses.” It’s highly possible, she adds, that the public will regard a decision made by the military one way and a decision by government another way — even if the outcome is the same.
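To make the logic of such an experiment concrete, the sketch below (purely illustrative, not drawn from the study itself) shows in Python how respondents might be randomly assigned to one of two otherwise identical scenarios that differ only in the actor named, with average support then compared across conditions. The scenario texts, the 1-to-5 support scale and the names run_survey and simulated_respondent are all hypothetical.

```python
import random
from statistics import mean

# Hypothetical scenario variants: an identical action, attributed
# to a different actor in each version.
SCENARIOS = {
    "military": "The U.S. military deploys an autonomous system "
                "against a terrorist organization overseas.",
    "government": "The federal government deploys an autonomous system "
                  "against a terrorist organization overseas.",
}

def run_survey(num_respondents, get_support):
    """Randomly assign each respondent to one scenario variant and
    record their stated support (here, a score from 1 to 5)."""
    responses = {variant: [] for variant in SCENARIOS}
    for _ in range(num_respondents):
        variant = random.choice(list(SCENARIOS))  # random assignment
        responses[variant].append(get_support(SCENARIOS[variant]))
    return responses

# Stand-in for real respondents: simulated answers on a 1-5 scale.
def simulated_respondent(scenario_text):
    return random.randint(1, 5)

results = run_survey(1000, simulated_respondent)
for variant, scores in results.items():
    print(f"{variant}: mean support {mean(scores):.2f} (n={len(scores)})")
```

Because assignment is random, any systematic difference in average support between the two groups can be attributed to the manipulated variable, which is the inferential logic Macdonald describes.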

Because the findings from Minerva projects are not classified, they can be shared with the general public, and Macdonald expects the information will be of interest to industry, the foreign policy establishment and educators. After all, widespread use of autonomous systems will no doubt lead to ethical dilemmas, job losses and job creation, and the need to retrain accordingly.

Concerned citizens also have a stake in the issue. “This is a reality,” Macdonald says, “that we all have to face.”