Disruptive Technologies Programme
How can the UK secure advantage with disruptive strategic technologies, most notably AI, in an era of heightened geopolitical competition? We examine the opportunities and risks these technologies present for national security and prosperity, and chart routes to strategic advantage in the context of rapid technological change.
States are rapidly developing policies and strategies to ensure that they benefit from the opportunities presented by disruptive strategic technologies such as AI. RUSI uses its unique policy and geopolitical expertise to bridge the intellectual and conceptual gaps between technology and policy experts. We assess the effectiveness of current UK and international strategies to design, manage and use disruptive technologies across defence and security – with AI as our principal focus.
Rapid and accelerating advances in technological innovation are taking place in an era of heightened geopolitical competition. Disruptive strategic technologies such as AI often rely on technical capabilities – sensors, computing power and data – that are sourced through global supply chains.
The convergence of technology and geopolitics, of public and private actors, of expert communities and of different technologies may have societal consequences that are hard to anticipate, all the more so when those who develop new technologies and those who devise policies and strategies struggle to understand one another.
The research programme looks at applications of disruptive strategic technologies across both the defence and civilian sectors. In defence, we examine target acquisition, insider threat detection and automated logistics. Across wider society, the programme explores the national security implications of disruptive strategic technologies in areas such as terrorism, cyber, climate and regional power dynamics.
Disruptive strategic technologies can be used maliciously or have unintended consequences for societies. As states develop technology capabilities to advance national security and prosperity agendas, it is important to acknowledge the limits of these technologies and explore how to mitigate related risks.
Programme team
James Sullivan
Director, Cyber Research
Cyber
Emma De Angelis
Director, Special Projects
Publications
Dr Pia Hüsch
Research Fellow
Cyber
Noah Sylvia
Research Analyst for C4ISR
Military Sciences
Natasha Buckley
Research Analyst
Cyber
Tom Keatinge
Director, CFS
Centre for Finance and Security
Kathryn Westmore
Senior Research Fellow
Centre for Finance and Security
Fellows
August Cole
Associate Fellow
Dr Keith Dear
Associate Fellow
Professor Sir Anthony Finkelstein CBE FREng DSc MAE FCGI
Distinguished Fellow
Ludovica Meacci
Associate Fellow
Dr Marion Oswald MBE
Associate Fellow
Dr Brad Pietras
Associate Fellow; Senior Commercial Advisor to DARPA
Huw Roberts
Associate Fellow
Engagement and outcomes
As part of this programme of work, we will develop new analytical tools and methods, and build new partnerships, combining geopolitical strategy with technology foresight to address this research area.
RUSI’s partner-based programme of activity on disruptive strategic technologies encourages open and informed dialogue between technology experts and policymakers to bridge the knowledge gap and devise effective solutions to the practical and policy problems the UK and its allies face in an era of rapid technological change.
We are uniquely positioned to facilitate knowledge sharing and mutual learning between the security and defence communities and the technological disruptors who are reframing our present and future. We explore the essential role of technology and policy partnerships with the private sector and other states, the two critical pillars required if the UK and its allies are to secure a strategic advantage over adversaries.
In the pursuit of improved security and prosperity for the UK and its allied community, our programme focuses on themes such as innovation and capability, geopolitical competition and alliances, and risks and mitigations.