Advanced surveillance systems can autonomously identify military targets. A consequent automated decision to attack, made without human assessment and authorisation, would almost certainly breach international law. Separating decisions from actions clarifies the role of machine-made decisions and the human ability to assess them and to authorise action. High levels of autonomy in a weapon system place new responsibilities on organisations and personnel at every stage of procurement and use. In this article, Tony Gillespie builds on recent UN expert discussions to propose that detailed legal reviews at all procurement stages, including pre-development, are needed to ensure compliance with international law. Similar reviews are also needed for automated systems in the decision-making process.
BANNER IMAGE: The XQ-58A Valkyrie demonstrator, a long-range, high subsonic unmanned air vehicle on its inaugural flight at Yuma Proving Grounds, Arizona, 5 March 2019. Courtesy of US Air Force/Joshua Hoskins