Mind the Gaps: Russian Information Manipulation in the United Kingdom
This commentary is part of a new CSIS project exploring the impact of Russian and Chinese information operations in democratic nation-states. Part I of the project examines Russian disinformation campaigns in the United Kingdom and Germany and Chinese disinformation campaigns in Australia and Japan. Read the first piece on Japan here and on Germany here.
In today’s conflicts, nonmilitary tools are becoming as prevalent as military ones. Cyberattacks, information operations, and the leveraging of economic investment for political influence have all become instruments of some countries’ foreign and security policies. The world has watched in disbelief as Russian intelligence operatives attempted to hack the computers of international organizations such as the Organization for the Prohibition of Chemical Weapons, meddled in other countries’ elections, and assassinated former Russian military officers and intelligence agents.
As U.S. and European leaders try to analyze and anticipate information manipulation operations and other malign activities by actors such as Russia and China, a look at the manifestations of recent Russian information manipulation in the United Kingdom provides a helpful case study in understanding Russian objectives, measures countries can take to reduce vulnerabilities that make them more susceptible to influence operations, and ways to deter or diminish the impact of future information manipulation efforts.
The Vulnerabilities: Societal Polarization and Regulatory Gaps
By many measures, the United Kingdom is one of the more resilient societies in the world. Its government demonstrates accountability and exercises appropriate checks and balances among the different branches of power. Its media landscape is robust, diverse, and largely trusted by the public, including by diasporas and minority groups. The Russian diaspora community is generally well off and integrated into broader society. And while the large amount of Russian money circulating in the London real estate market has raised concerns that properties were purchased using corrupt proceeds from offshore companies, British business and academic elites have not proved susceptible to “capture” through economic influence.
Yet almost by virtue of their open nature, democracies will have societal vulnerabilities. In the case of the United Kingdom, political polarization and gaps in regulatory regimes surface as the primary ones. As such, Russian information manipulation efforts in the United Kingdom have focused on exploiting this polarization in UK society and taking advantage of regulatory gaps to achieve two objectives: weakening the United Kingdom internally and diminishing the United Kingdom’s position in the world. Both speak to Russia’s zero-sum mentality that a strong, stable United Kingdom, NATO, and European Union present a threat to Russia.
In terms of Russia’s aim to weaken UK society, recent Russian disinformation efforts were directed at those outside the mainstream of UK society: far-right groups, the British Muslim community, and Scottish and Northern Irish separatists. In each of these cases, Russia accentuated existing, yet not necessarily problematic, differences in UK society—between migrants and UK-born whites, between “good Muslims” who are traditional in their beliefs and those who are more liberal, and between unionists and separatists. Those monitoring the UK disinformation landscape note that the efforts appear to be event-driven, spiking just before a major decision or vote or after a potentially controversial event. The former was evident around the June 2016 Brexit referendum, when officials saw a significant uptick in bot-generated tweets linked to Russia-based Twitter accounts in the days before the vote. In terms of post-event information manipulation, UK officials noted an increase in Russian bot activity in the weeks following both the Skripal attack and the U.S.-UK-French airstrikes in Syria.
Concerning its broader goal of undermining the United Kingdom’s role in the world, Russian efforts focused on circulating negative stories about NATO and the European Union, supporting the Leave campaign, and generating skepticism around a future U.S.-UK trade deal. Russia aimed to discredit the United Kingdom and NATO by circulating false stories about UK forces in Estonia as part of NATO’s Enhanced Forward Presence. They pushed targeted messages suggesting NATO is unreliable and its troops unwelcome. Concerning the European Union, narratives focused on discrediting the European Union by painting it as ineffective, corrupt, and infringing upon UK sovereignty. The UK Electoral Commission also investigated allegations that Russia directed financing to the Leave campaign. Its findings are detailed in an October 2019 “Russia Report” that has yet to be released publicly. This financing was likely enabled by a loophole in UK campaign finance law that does not require disclosure of political donations if they are from “the beneficial owners of non-British companies that are incorporated in the EU and carry out business in the UK.” Arguably, the leak by Russian hackers of sensitive U.S.-UK trade documents on Reddit is the most concerning in terms of impact in that the (seemingly authentic) documents were then used by the Labour Party as proof that the National Health Service would be part of a future post-Brexit trade deal with the United States.
Interestingly, Russian disinformation did not necessarily advocate a specific position or take sides. Its purpose was simply to introduce confusion, doubt, and misinformation into existing debates. Often, Russian trolls and bots floated multiple false narratives as “trial balloons” to see which would be most successful, only later doubling down on those that got the most interest. In many cases, Russia first tested these messages on less-regulated fringe platforms, either to avoid detection or to refine the disinformation through user feedback before moving it into the mainstream.
The United Kingdom has since launched a whole-of-government effort to address many of these vulnerabilities. This includes measures on both the supply and demand sides. On the supply side, the focus is on limiting the inflow and spread of disinformation into the UK information environment, effectively creating “deterrence by denial.” This includes partnerships with social media companies to get them to remove manipulated content (e.g., deepfake videos) or to block and shut down fake social media accounts and bots. In April 2019, the UK Department for Digital, Culture, Media and Sport, in cooperation with the Home Office, published an Online Harms White Paper that recommends both legislative and non-legislative measures to enhance online safety and hold social media companies more accountable. The government subsequently solicited feedback on the paper with a view toward determining which of the report’s recommendations should be implemented.
While government cooperation with social media companies is a good start, more collaboration among social media platforms themselves is needed to compare tactics and contain the spread of disinformation. A model for such collaboration exists in the form of the Global Internet Forum to Counter Terrorism, which focuses on tackling extremist and violent content on social media platforms. A second line of effort is proactive monitoring of the information space to anticipate triggers and issues that are likely to be exploited by adversaries. In the near term, upcoming UK trade negotiations with the European Union and United States—including the sensitive areas of healthcare and agriculture—are considered prime candidates for exploitation, as are stories encouraging separatist sentiments as the United Kingdom exits the European Union.
Yet perhaps even more important are actions on the demand side. Insofar as it is inevitable that some degree of disinformation will circulate through the information space for the foreseeable future, governments also should focus on building societal resilience. This starts with enhancing awareness and media literacy, including educating the public and politicians about the existence of disinformation in the media landscape and training them on how to spot it. In this regard, the UK government runs a number of digital media literacy campaigns for politicians and the public, such as the popular “Don’t Feed the Beast” campaign, which offers a checklist for determining whether a story is disinformation. Government organizations also partner closely with nongovernment entities, including community leaders, academics, and the private sector, to identify and reach out to vulnerable audiences. Interestingly, the United Kingdom’s approach is agnostic to the disinformation actor, focusing instead on vulnerabilities in the information environment and among receiving audiences.
Borrowing lessons from classic deterrence, we know that Russia pushes until it meets costs and then stops. In this respect, deterrence by denial alone will not be sufficient to change behavior. “Deterrence by punishment” should also be a part of any successful counter-disinformation strategy. Here again the UK case offers some indicators of what deters Russia. In the case of the Skripal poisoning, UK officials’ success was due to several factors. The first was coordinated messaging. Rather than each department issuing its own response (creating gaps for Russia to exploit), the various stakeholders coordinated their response through the Cabinet Office, resulting in a single unified message issued out of the Prime Minister’s Office. Second, this messaging was followed by public release of the evidence, including the identities of the Russian agents, closed-circuit television footage of them around the crime scene, and records of their hotel stays and flights. Finally, this was followed by punitive measures in the form of multiple countries’ expulsion of Russian diplomats in retaliation for the attack. Taken together, this swift, coordinated response backed by the imposition of punitive measures on an international scale exposed Russian malign activities and incompetence, embarrassing Russia in the eyes of its citizens. Over time, such reputational damage could cause more serious problems for the Russian government vis-à-vis the Russian people.
Ensuring open societies are not exploited by adversaries requires constant vigilance, attention to resilience, and good governance. Left unchecked, disinformation efforts will erode the foundations of democracy, such as trust and freedom of expression. By analyzing previous disinformation efforts—and being honest about the individual and societal vulnerabilities that help enable them—the hope is that governments can gradually reduce the frequency and impact of such efforts.
This commentary was made possible by the Information Access Fund (IAF) administered by the Democracy Council of California. The opinions, conclusions, or recommendations contained herein are those of the authors and should not be interpreted as necessarily representing the official policies or endorsements, either express or implied, of the IAF or the U.S. government.
Rachel Ellehuus is deputy director and senior fellow with the Europe Program at the Center for Strategic and International Studies in Washington, D.C.
Commentary is produced by the Center for Strategic and International Studies (CSIS), a private, tax-exempt institution focusing on international public policy issues. Its research is nonpartisan and nonproprietary. CSIS does not take specific policy positions. Accordingly, all views, positions, and conclusions expressed in this publication should be understood to be solely those of the author(s). © 2020 by the Center for Strategic and International Studies. All rights reserved.