Good Practice for the Development of Autonomous Weapons: Ensuring the Art of the Acceptable, Not the Art of the Possible
Tony Gillespie, RUSI Journal, 21 January 2021
UK Integrated Review 2021, Law and Ethics, Technology
Advanced surveillance systems can autonomously identify military targets, but an automated decision to attack without human assessment and authorisation will almost certainly breach international law. Separating decisions from actions clarifies the role of machine-made decisions and the human ability to assess them and authorise action. High levels of autonomy in a weapon system place new responsibilities on organisations and personnel at every stage of procurement and use. In this article, Tony Gillespie builds on recent UN expert discussions to argue that detailed legal reviews at all procurement stages, including pre-development, are needed to ensure compliance with international law. Similar reviews are also needed for automated systems in the decision-making process.
BANNER IMAGE: The XQ-58A Valkyrie demonstrator, a long-range, high subsonic unmanned air vehicle on its inaugural flight at Yuma Proving Grounds, Arizona, 5 March 2019. Courtesy of US Air Force/Joshua Hoskins