Killer Robot Use In Dallas Raises Ethical Questions

Dallas police used a robot to deliver and detonate a bomb to stop a suspected sniper who killed five officers and injured nine others. This deployment raises ethical questions.

Dawn Kawamoto, Associate Editor, Dark Reading

July 11, 2016

3 Min Read


A bomb-toting robot's deployment to stop a suspected sniper in last week's Dallas shootings marks what is believed to be one of the first uses of a robot to take a human life, rather than protect it, outside of a warfare situation. The action raises ethical questions.

The suspect, Micah Xavier Johnson, allegedly killed five police officers and wounded nine others in downtown Dallas during a peaceful rally protesting the recent police shootings in Minnesota and Louisiana that killed two black men, according to media reports.

Dallas Police Chief David Brown, in a press conference aired on NBC, said that after negotiations with the suspect broke down and he began firing on police officers, the department decided to use the robot it normally deploys to remove bombs to deliver one instead.

 "There was no other option but to use our bomb robot and place a device on its extension for it to detonate where the suspect was. Other options would have exposed our officers to great danger," Brown said.


The decision by the Dallas police to use a robot to kill a person is believed to be a first outside of military warfare, according to a report in Fortune. The military used MARCbot robots with land mines strapped to them during the second Iraq War.

The robot used in the Dallas shootings was operated under manual control by a human, even though technology exists to have bots with artificial intelligence perform tasks autonomously, according to a report in USA Today.

"When it comes to life and death, you want people making those decisions," Martial Hebert, director of Carnegie Mellon University's Robotics Institute, told USA Today.

When it comes to robot and drone use in the military, there are currently three major global debates underway relating to ethics, Ryan Calo, a University of Washington law professor who focuses on robotics policy and law, told The Verge.


One debate centers on whether society should allow a robot to make a decision to kill, or whether a human being should always be involved in the process, Calo said.

A second debate examines the process that is used to select who should be on a kill list and the deployment of drones to terminate those on the list.

Finally, if nations fielded armies made up entirely of robots, would conflicts become more violent because no human lives, operators included, would be at risk?

The ethical debate for local law enforcement using weaponized robots and drones has not yet reached similar proportions.

"Maybe there should be policies in place for how robots are used, maybe we can be considered about the overuse of robots in policing, but this debate is not connected to the greater debate about the military use of robots, in my view," Calo told The Verge.


About the Author(s)

Dawn Kawamoto

Associate Editor, Dark Reading

Dawn Kawamoto is an Associate Editor for Dark Reading, where she covers cybersecurity news and trends. She is an award-winning journalist who has written and edited technology, management, leadership, career, finance, and innovation stories for such publications as CNET's News.com, TheStreet.com, AOL's DailyFinance, and The Motley Fool. More recently, she served as associate editor for technology careers site Dice.com.
