WhatsApp Hack Calls into Question Government Use of Commercial Spyware
As government use of commercial spyware becomes increasingly commonplace, stronger checks and balances will be needed to ensure software providers are acting in the public interest.
First identified as providing surveillance software to government agencies in 2016, Israeli technology firm NSO Group Technologies recently hit the headlines again – this time for allegedly exploiting a vulnerability in the encrypted messaging service WhatsApp to covertly install its Pegasus spyware onto targets’ devices. The ‘WhatsApp hack’, discovered by security researchers, raises questions over the lawful creation, purchase and use of so-called ‘spyware-as-a-service’. Are greater controls now needed for private companies that develop spyware for government agencies? Should these companies be required to adhere to an industry-wide ethical framework as part of export controls? And what measures can be taken to ensure these tools do not fall into the hands of cyber-criminals?
NSO Group’s Pegasus spyware exploited a ‘zero-day vulnerability’ in WhatsApp – a flaw previously unknown to the developer – and its discovery has raised concerns from all corners of civil society about the creation and development of spyware technology. The vulnerability was a ‘buffer overflow’: a flaw that arises when a program writes more data into a fixed-size area of memory (a buffer) than that area was allocated to hold, so that the excess spills over and overwrites adjacent memory. NSO Group’s exploit reportedly triggered the overflow by placing a WhatsApp voice call to the target – which did not even need to be answered – and used the overwritten memory to install its malware. Once installed, the malware gives the attacker full control over the target’s device, allowing them to remotely extract all data or monitor users by taking control of the camera and microphone.
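By way of illustration only – the precise details of the WhatsApp flaw have not been made public, and the code below is a hypothetical toy, not NSO Group’s exploit – the following C sketch shows the general principle of a buffer overflow: input copied without a length check spills into adjacent memory that the program later trusts.

```c
/* Illustrative toy only -- NOT the actual WhatsApp/Pegasus exploit.
 * It shows the general principle of a buffer overflow: data copied
 * without a length check spills into adjacent memory. */
#include <stdio.h>
#include <string.h>

/* Hypothetical call-setup state: a fixed-size field for caller data sits
 * directly in front of a flag the program later trusts. */
struct call_state {
    char caller_id[16];   /* fixed-size buffer for incoming data       */
    char run_privileged;  /* adjacent byte: 0 = normal, 1 = privileged */
};

/* Vulnerable parser: copies attacker-controlled input into the state with
 * no bounds check, so anything beyond 16 bytes lands on run_privileged. */
static void parse_caller_id(struct call_state *s, const unsigned char *data, size_t len)
{
    /* missing check: len must not exceed sizeof s->caller_id */
    memcpy(s, data, len);
}

int main(void)
{
    struct call_state s = {0};

    /* 17 bytes of 'signalling data': the 17th byte overwrites the flag. */
    unsigned char payload[sizeof s];
    memset(payload, 'A', sizeof payload);
    payload[16] = 1;

    parse_caller_id(&s, payload, sizeof payload);
    printf("run_privileged is now %d\n", s.run_privileged);  /* prints 1 */
    return 0;
}
```

In a real attack, the overwritten memory is typically used to redirect the program’s execution, which is what allows spyware to be installed remotely without any action by the victim.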
The exploitation of software vulnerabilities to gain control of devices falls into a category of surveillance legally classified as ‘equipment interference’ (also known as ‘exceptional access’ or ‘computer and network exploitation’). The use of such techniques would normally constitute a criminal offence under the Computer Misuse Act, but surveillance legislation gives law enforcement and intelligence agencies powers to use them lawfully under warrant. While there is no legal obligation to report vulnerabilities to the developer of the affected software, there is a serious ethical question about a company keeping quiet about a major vulnerability in an app used by 1.5 billion people worldwide, particularly when the targeted individuals are reported to be human rights activists and journalists and the company’s clients include countries known for disregarding human rights. An incident such as this raises the question of whether there should be a legal requirement to report such vulnerabilities when they are discovered.
As technology has matured, end-to-end encryption has reduced the value of traditional passive interception as a means of gathering intelligence, shifting capability development towards equipment interference – targeting an individual’s device rather than data in transit. Earlier this month, Director General of MI5 Andrew Parker summarised the challenge posed by encryption as ‘the haystack is bigger and the needle smaller’. Equipment interference has therefore become a linchpin of surveillance, providing a more targeted approach that can be conducted within a legal framework. Companies such as NSO Group sell these capabilities in small packages at extremely high cost, enabling highly targeted and efficient surveillance.
This situation can be characterised as an endless game of ‘cat and mouse’ between intelligence agencies and technology companies: agencies develop capabilities to exploit software vulnerabilities; the developer patches those vulnerabilities; and the agencies are forced to find another way into the device. Although the WhatsApp zero-day vulnerability has now been patched, it is safe to assume that a host of other capabilities are being developed and used to exploit vulnerabilities in other apps.
In a detailed and revealing blog post, Ian Levy (technical director of the National Cyber Security Centre) and Crispin Robinson (technical director for cryptanalysis at GCHQ) suggested that equipment interference allows intelligence agencies to monitor communications on platforms such as WhatsApp without compromising end-to-end encryption. However, questions around the nature of ‘lawful hacking’ become increasingly complex when the capabilities are provided by a third-party commercial entity. Most of the population will never be targeted by these capabilities. But commercial entities are not required to adhere to the same security standards as government agencies, which raises the question of whether such companies secure their hacking tools well enough to prevent them from being stolen, whether by external attackers or insiders.
That said, even major nation states appear to have difficulty preventing their surveillance technology from being stolen and becoming a tool for cybercrime. The WannaCry ransomware attack of May 2017 was built on EternalBlue, an exploit allegedly stolen from the US National Security Agency (NSA); it needed only one unsecured device to spread across an entire network. The attack exploited a vulnerability that Microsoft had already patched, but many organisations had been slow to apply the update, and it caused major disruption and financial damage across the UK’s National Health Service, as well as countless other European organisations. This, coupled with the Snowden leaks, raises serious questions about the NSA’s ability to secure its own hacking tools.
As to the fundamental question of how governments or companies decide what to do with vulnerabilities they discover, GCHQ, for example, has an internal Equities Process to determine whether to disclose a vulnerability to the software developer or retain it for exploitation. The decision is based on an assessment of whether ‘there is a clear and overriding national security benefit in retaining a vulnerability’, and GCHQ claims that ‘the starting position is always that disclosing a vulnerability will be in the national interest’. However, it is unclear whether the private companies that develop these products weigh the danger a vulnerability poses to wider society. NSO Group claims to have an internal business ethics committee, consisting of employees and external counsel, which vets sales to prospective clients – but this is neither an impartial nor an independent body. Many would argue that such self-imposed ethical oversight amounts to little more than an empty gesture, and that legally binding export controls are the only way to ensure these powerful capabilities are not used unlawfully.
By selling to states which subsequently used the technology to spy on activists and journalists, NSO Group has attracted legal challenges and scrutiny from the likes of Amnesty International and Citizen Lab. Many countries conduct surveillance operations outside a proper legal framework, and concerns have been raised about the supply of surveillance capabilities to countries with dubious human rights records. While the sale of surveillance tools is governed by international export control agreements (most notably the Wassenaar Arrangement), questions remain over whether these voluntary agreements provide sufficient reassurance or enforcement. Israel, for example, is not a participating state of the Wassenaar Arrangement, meaning that NSO Group is not bound by its controls.
Stronger export controls and oversight measures would go a long way towards regulating who firms such as NSO Group can sell to and how they allow their technology to be used. Greater oversight would reassure the public – particularly activists and journalists, who are common targets – that these companies are acting in the public interest and that surveillance capabilities are not being used in contravention of their rights and freedoms. But to have any effect, such controls would need to be signed up to by the very countries that seem unlikely to be interested in doing so.
Government use of commercial spyware will only become more commonplace in the coming years. Greater transparency will be needed for citizens to be reassured that these tools are used in a way that respects their civil liberties and does not compromise their cyber security.
Sneha Dawda is a Research Analyst in Cyber Threats and Cyber Security at RUSI. Her research focuses on cyber security governance, emerging technology risks, diversity in cyber security, and international governance.
Alexander Babuta is a Research Fellow in National Security Studies at RUSI. His research focuses on policing, intelligence and technology.
The views expressed in this Commentary are the authors', and do not represent those of RUSI or any other institution.