Lethal Autonomous Weapons Systems, as defined by the U.S. Department of Defense, are “weapon system[s] that, once activated, can select and engage targets without further intervention by a human operator.”
Such systems, colloquially known (especially by their opponents) as "Killer Robots" or "Slaughterbots," have drawn criticism from a number of actors, including a coalition known as the Campaign to Stop Killer Robots.
These activists generally favor a legally binding instrument of international law, most likely a treaty, sometimes referred to as a "Killer Robot Ban Treaty." Such a treaty has received support from several countries, as tracked by Human Rights Watch, and has been discussed in the working group of the UN Convention on Certain Conventional Weapons (ibid.).
Formal, legally binding arms control has proven elusive, however, leading some to speak of the "end of arms control." Moreover, several of the world's most powerful militaries, including those of the United States, the United Kingdom, and Russia, strongly oppose such a ban.
A Killer Robot Ban Treaty also faces a fundamental definitional problem: weapons such as homing munitions possess elements of autonomy under conventional definitions, yet they are integral to many military operations and have been in use since World War II, while broader definitions may sweep in many applications of artificial intelligence. Senior defense leaders, including Bob Work, described as the "father of the Pentagon's push for artificial intelligence," have also criticized such a ban.
Will the United States sign a Treaty on the Prohibition of Lethal Autonomous Weapons Systems by 2030?
This question will resolve positively if the United States Government formally agrees to an international treaty that purports to ban either the possession or the use of Lethal Autonomous Weapons Systems as defined above, and the agreement is announced on the website of the U.S. Department of State's Office of Treaty Affairs or reported by a mainstream news source or wire service (e.g., the AP or The New York Times).