DoD issues new directive on use of autonomous drones

Industrial Robot

ISSN: 0143-991X

Article publication date: 26 April 2013

Citation: (2013), "DoD issues new directive on use of autonomous drones", Industrial Robot, Vol. 40 No. 3. https://doi.org/10.1108/ir.2013.04940caa.004

Publisher: Emerald Group Publishing Limited

Copyright © 2013, Emerald Group Publishing Limited



Article Type: News. From: Industrial Robot: An International Journal, Volume 40, Issue 3

The US Defense Department has issued a new directive on the use of autonomous and semi-autonomous weapon systems, an attempt to regulate a technology that officials say could be years from becoming reality.

The directive, released November 27, focuses on systems that can select and engage targets without intervention by a human operator. Non-lethal autonomous systems, such as electronic attack or cyberspace systems, fall outside its scope, as do technologies such as the Patriot missile system, which have autonomous functions but still require human supervision.

Any autonomous and semi-autonomous weapon systems “shall be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force,” the doctrine reads.

Humans still must play an oversight role, with the ability to activate or deactivate system functions should the need arise. Systems will also be required to go through “rigorous” verification and validation and operational test and evaluation stages to catch potential malfunctions before the systems ever see active duty.

Once through the testing stages, systems will require the approval of the undersecretary of defense for policy, the undersecretary of defense for acquisition, technology and logistics, and the chairman of the Joint Chiefs of Staff before their activation.

The overall goal of the new rules is to avoid “unintended engagements,” defined in the doctrine as “damage to persons or objects that human operators did not intend to be the targets of US military operations, including unacceptable levels of collateral damage beyond those consistent with the law of war, (rules of engagement), and commander’s intent.”

The new rules are not in place to discourage the development of an autonomous weapon system, said David Ochmanek, deputy assistant secretary for policy force development, who described the doctrine as “flexible.” “What it does say is that if you want to (develop an autonomous weapon system), there will be a rigorous review process, and if you expect it to be approved, you will be asked to show that your software and hardware has been subject to rigorous test and validation,” Ochmanek said.

While the department is looking toward the future, the report’s authors do not expect to need the new regulations any time soon. “This directive is, for once, out ahead of events,” Ochmanek said. “This isn’t something where we all of a sudden realized someone’s out there about to develop a Terminator and decided we better get a directive out. That’s not the case.”

Ochmanek declined to guess at a timetable for the development of the technology, but said, “I can say with confidence that there is no development program going on right now that would create an autonomous weapons system.”

The robotic military UAV that can identify enemies and hunt them down has long been a staple of science fiction. But even when autonomous military systems become a reality, they are unlikely to resemble anything out of “Star Wars” or “The Matrix”, since such systems would still inherently lack necessary human qualities.

“When you hear folks talk about this outside the Pentagon, in reports, they tend to leap to the hardest case […] something making a judgment call that [is] hard for people to make,” said a defense official involved with the drafting of the new doctrine.

The official gave the example of two cars driving on the ground, one carrying an ally and the other an enemy. A machine would have to process an enormous amount of disparate data to decide which car to target.

“We don’t want to build a robot for that. Machines won’t have an advantage in that case,” said the official, who added that DoD would have a series of meetings with interested parties to brief them on the new doctrine.

The specter of that “hardest case” was raised in a November 19 Human Rights Watch (HRW) report, “Losing Humanity: The Case Against Killer Robots.” The report warned of the need to regulate autonomous devices, “which would inherently lack human qualities that provide legal and non-legal checks on the killing of civilians.”

Figure 2. Drones continue to make headlines as policies evolve

Ochmanek denied any connection between the release of the HRW report and the new doctrine, which was in development for 18 months with the help of representatives from the Joint Staff, DoD’s acquisitions office, the Office of the General Counsel, the US military services, and the research and development community (Figure 2).
