Assassination by autonomous weapon by 2025?


One of the many areas in which automation is steadily advancing is weapons systems. Advances in machine learning systems that can parse photos and video, recognize faces, maneuver in complex 3-dimensional spaces, and so on, can in principle enable new weapons systems that operate largely or wholly without human guidance.

As described here, such weapons raise a number of strategic and ethical questions involving the threshold of conflict, arms races, and who (or what) chooses to take human lives. Several campaigns have arisen calling for an international ban on lethal autonomous weapons.

One major concern raised by such campaigns, articulated for example in this open letter, is that an arms race in autonomous weapons could lead to cheap, widely available, highly effective weapons that could be used for political purposes including suppression of dissent or assassinations. For example, a swarm of tiny drones with facial recognition systems could seek out particular individuals (or groups) and kill them with toxins or small close-range explosives.

Will a credible media report indicate that an autonomous weapon system has been used to kill a political figure by the start of 2025?

Positive resolution requires that:

  • the figure killed is in a leadership role of a political group – either a government or other organization built around political ends, and

  • the target is identified by the autonomous system itself, according to some criteria, rather than by other means of surveillance (which may be used to localize the target but not to select the target from, for example, nearby people), and

  • no other "unintended" people are significantly harmed in the attack.
