Report to Congress on Lethal Autonomous Weapon Systems

The following is the Nov. 17, 2021 Congressional Research Service In Focus report, Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems.

From report

Lethal autonomous weapon systems (LAWS) are a special class of weapon systems that use sensor suites and computer algorithms to independently identify a target and employ an onboard weapon system to engage and destroy the target without manual human control of the system. Although these systems are not yet in widespread development, it is believed they would enable military operations in communications-degraded or -denied environments in which traditional systems may not be able to operate.

Contrary to a number of news reports, U.S. policy does not prohibit the development or employment of LAWS. Although the United States does not currently have LAWS in its inventory, some senior military and defense leaders have stated that the United States may be compelled to develop LAWS in the future if U.S. competitors choose to do so. At the same time, a growing number of states and nongovernmental organizations are appealing to the international community for regulation of or a ban on LAWS due to ethical concerns.

Developments in both autonomous weapons technology and international discussions of LAWS could hold implications for congressional oversight, defense investments, military concepts of operations, treaty-making, and the future of war.

U.S. Policy

Definitions. There is no agreed definition of lethal autonomous weapon systems that is used in international fora. However, Department of Defense Directive (DODD) 3000.09 (the directive), which establishes U.S. policy on autonomy in weapons systems, provides definitions for different categories of autonomous weapon systems for the purposes of the U.S. military. These definitions are principally grounded in the role of the human operator with regard to target selection and engagement decisions, rather than in the technological sophistication of the weapon system.

DODD 3000.09 defines LAWS as “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.” This concept of autonomy is also known as “human out of the loop” or “full autonomy.” The directive contrasts LAWS with human-supervised, or “human on the loop,” autonomous weapon systems, in which operators have the ability to monitor and halt a weapon’s target engagement. Another category is semi-autonomous, or “human in the loop,” weapon systems that “only engage individual targets or specific target groups that have been selected by a human operator.” Semi-autonomous weapons include so-called “fire and forget” weapons, such as certain types of guided missiles, that deliver effects to human-identified targets using autonomous functions.
