The UK as an AI Superpower: The Government’s Challenge Begins at Home


Team effort: more public-private partnerships will be needed to help realise the UK government's AI ambitions.


There is significant pressure on the public sector to mainstream AI at pace – after all, the UK has ambitions to be a global AI superpower. But even with the best intentions, fast-paced adoption and implementation of AI in the public sector will involve substantial challenges. We outline several of these and point to possible solutions.

The widespread adoption of AI technologies comes with much promise. In the public sector in particular, AI integration has the potential to bring greater efficiency, for example by speeding up administrative services, thereby saving taxpayer money and offering better services to citizens. On a more macro level, AI integration in the public sector – civilian and military alike – is also expected to secure an economic and military advantage for the UK.

The government already envisages widespread adoption of AI and is, for instance, encouraging civil servants to familiarise themselves with generative AI at work. The Cabinet Office is currently running a trial of an AI-powered chatbot, intended to make an interactive document analysis tool accessible to all civil servants. However, while the aim is for all government departments to use AI and support the wider goal of becoming a 'global AI superpower', in practice many hurdles stand in the way of public sector adoption. These range from a lack of skills and training in day-to-day engagement with AI technologies to a lack of flexible procurement and technical leadership.

Knowledge and Skills in the Private Sector

A key challenge the public sector faces is that much of the knowledge and experience around AI development, research and application currently lies in the private sector, whose deep pockets drive R&D and attract much sought-after talent. No public spending commitments come close to the budgets of the private sector – compare the UK's £900 million commitment to build a supercomputer to the $35 billion committed by AWS to data centres in Virginia. Of course, many consultancies offer well-paid advice to public sector entities on how to navigate this challenge. But it will take more, and different kinds of, partnerships with the private sector to enable greater information and knowledge exchange between public and private organisations. Such partnerships are needed not only to enable the public sector to reap the benefits of AI but also to design effective regulation and governance to mitigate any risks that arise. The National Cyber Security Centre's Industry 100 scheme, through which public institutions host cyber security experts from the private sector for one day a week or month, is a laudable model that could be replicated in an AI context. This would allow the public sector to benefit from private-sector expertise while the private sector gains access to a wider network and insights into the government's work.

Learning from Defence Procurement

Challenges also arise from the public sector's slow procurement of AI technologies. The Ministry of Defence (MoD), for example, spent £25 billion with UK industry in 2022/23. Yet industry has long complained about the MoD as a customer, and these complaints extend to its procurement of AI. The MoD published its strategy for exploiting AI (the 'Defence Artificial Intelligence Strategy') in 2022, but has since faced criticism for failing to sufficiently detail how it will become 'the world's most effective, efficient, trusted and influential Defence organisation for our size [in terms of AI]'.

'A greater range of partnerships is needed not only to enable the public sector to reap the benefits of AI but also to design effective regulation and governance to mitigate any risks that arise'

Companies, both large and small, have not received clear demand signals on the MoD’s operational objectives, hindering the deployment of successful AI in defence. Besides a lack of clarity on priorities and requirements (which, granted, the Defence AI Centre hopes to rectify), uncertainty persists over how to integrate AI with existing systems. The MoD contains a myriad of ‘architectural standards, patterns and interfaces’, thereby limiting data accessibility (the lifeblood of AI functionality) and AI integration into decision-making, analysis and training processes. Similarly, the rather disparate efforts at AI procurement across a host of defence entities – from front-line commands to the Defence Science and Technology Laboratory – inhibit AI’s efficiency benefits. The Integration Design Authority (IDA) will alleviate some of these frictions by incorporating a level of AI oversight into its efforts; however, it cannot bear sole responsibility for this task, as the IDA was created to facilitate integration across the whole of Defence.

Furthermore, most AI innovation occurs in smaller firms and start-ups, particularly small and medium-sized enterprises (SMEs), while many of the MoD's procurement processes are overwhelmingly oriented towards the largest defence contractors – the so-called 'defence primes'. The regulatory framework requires experience of navigating the complex MoD bureaucracy as well as sufficient capital to sustain a firm through the long process. SMEs often have neither the £25,000 monthly revenue stream needed to stay afloat nor the decades of organisational knowledge about the MoD's security and classification protocols: their choice is either to fail or to pivot away from the MoD towards easier customers. Ongoing efforts to shorten innovation cycles include the new 'Integrated Procurement Model', with its focus on software-oriented procurement, which requires maintaining a stronger relationship with companies in order to ensure constant, iterative updates. To secure the best technological advantage, MoD and wider public procurement need to become more agile so as to include smaller companies that provide cutting-edge technology, even if they cannot yet deliver at scale.

Changing the Public Sector Career Mindset

The MoD at least has an AI strategy; 76% of government departments don't, even though over a third of the departments surveyed on their use of AI are already deploying it. Declaring the aim of becoming a global AI superpower is an ambitious target, but it is by no means sufficient as guidance and vision for government departments. The UK's public sector needs more tangible goals to work towards. This means identifying in more concrete terms what being an 'AI superpower' looks like over the next 3, 5, 10 and 20 years.

'Only when its people support the government's aim of being an AI superpower will the UK be able to turn ambition into reality'

Setting out and implementing such a vision needs credible leadership at all levels and in all government departments. Yet the career civil servants and generalists who often fill these positions frequently lack the technological understanding and vision that a technologist from the private sector, or even a specialist within the public sector, might offer. External hiring therefore needs to become more agile, offering private-sector technology leaders and specialists the opportunity to join the public sector at a senior level and for a shorter period than is currently the case, while still benefitting from full integration and the trust of public sector colleagues. Similarly, technology experts already working within the public sector must have a route into leadership positions, not only so that they can offer their expertise at new levels but also to show that a long-lasting technology career within the public service is possible. Only by hiring and retaining adequate technology leadership will the public sector be able to successfully roll out AI technologies.

Navigating the Talent Terrain

Technically focused leaders are crucial for planning and envisioning strategies, but they are entirely reliant upon the skills held within government departments to enact their policies. Unfortunately, the National Audit Office's autumn 2023 survey of government bodies found that 70% of respondents described a lack of skills as a barrier to AI adoption. Difficulties in hiring and retaining individuals with the necessary skills to develop and deploy AI have intensified in recent years, exacerbated by uncompetitive salaries relative to the private sector (which faces its own skills shortage). Every part of the UK government needs greater expertise in machine learning, data management and cloud computing, a broader spectrum of software skills, and lawyers trained in AI ethics and the corresponding legal compliance. Yet the demand for skills must extend beyond experts. Non-technical employees must familiarise themselves with simple data processes and understand the function of their organisation's models in order to trust the outputs.

The government is demonstrating its commitment to equipping younger generations with the skills for a data-driven world. But this investment is geared towards the long-term upskilling of the UK economy and does not mitigate the current shortcomings of an under-skilled civil service. Currently, the government is filling this gap with 'contractors, agency workers, and temporary staff', who comprise a third of 'digital and data professionals'. Despite the government's commitment to reduce reliance upon temporary workers in order to cut costs, the lead time on large-scale upskilling indicates that the government will need to continue to buy in skills until newly developed skills have matured. This transition must be accelerated by including data skills as part of on-the-job learning for all civil service roles, providing a boost in digital literacy without major additional expenditure on formal training. Similarly, personnel management must become more porous to permit expertise to flow out of and back into government with ease. Flexible career paths between the public and private sectors are needed to ensure that private-sector knowledge, ideas and skills can be brought into government at no additional cost to the taxpayer.

A People Challenge as Much as a Technology One

Much can be debated about what the right aim is for a grand AI strategy, how the public sector can procure AI technologies and who has the vision and leadership to work towards AI implementation in the public sector. Most importantly, however, the UK cannot become an AI superpower unless the ambition is reflected in wider society. This includes early access to learning opportunities and safe ways to experiment with human-machine teaming, as well as a clear signal of job opportunities rather than threats to the labour market. Critically, it also requires strong engagement with ethical concerns and other risks that need to be addressed to meaningfully assure the responsible use of AI. Only when its people support the government's aim of being an AI superpower will the UK be able to turn ambition into reality.

The views expressed in this Commentary are the authors’, and do not represent those of RUSI or any other institution.



WRITTEN BY

Dr Pia Hüsch

Research Fellow

Cyber


Noah Sylvia

Research Analyst for C4ISR

Military Sciences


