Written evidence submitted by Dr. Emma L Briant, University of Essex
A Response to Recent Interim Reports and Proposals by: U.S. Senate Select Committee on Intelligence; & U.K. Parliament Digital, Culture, Media and Sport Select Committee Inquiry into Fake News.
Responses to the Problem of ‘Fake News’ and Digital Propaganda in Democracies

This year, a series of whistleblowers, journalistic investigations and public inquiries exposed systems in which the public, whose online activities are increasingly monitored and monetized, have been made progressively more vulnerable to powerful actors abusing data for propaganda targeting.1 This is enabled by digital platforms and influence industry applications that consumers trust, and which obscure their central purpose as part of their business model. Inquiries have interrogated the respective roles of the campaigns themselves; foreign actors such as Russia; digital media platforms; and influence industry companies, their business models and methodologies. As the inquiries mature, policymakers are proposing solutions to the problem of ‘fake news’ and to digital campaign practices that may undermine democracy. The proposals of the Digital, Culture, Media and Sport [DCMS] Select Committee’s Fake News Inquiry (July 2018) and of Sen. Mark Warner (U.S. Senate Select Committee on Intelligence) largely focus on: Information Operations (IO) and coordinated responses to Russia; privacy and transparency measures largely aimed at encouraging better behavior from digital platforms such as Facebook; and public media education. Central to these debates has been the extent to which platforms like Facebook are complicit, whether by enabling mis- or dis-information and the misuse of data, or by failing to act. Scholarly proposals rightly emphasize the need to address the monopoly position of these platforms by forcing data portability and enabling competition and plurality (see, for example, Baron et al., 2017; Freedman, 2018; Tambini, 2017). However, a central question remains about the influence industry itself.
As democratic governments sought extended powers of surveillance and information warfare to counter threats after 9/11, they also helped to build a digital infrastructure and a corresponding influence industry that expanded their range of action at home and abroad (Bakir, 2018; Briant, 2015). My own research focuses on propaganda, and my submission helped expose the role of Cambridge Analytica (CA) and its parent company, SCL Group, indicating problems that remain largely unaddressed by recent proposals.2 Importantly, if UK and US responses are likely to include more IO targeting Russia, it is unfortunate that both reports fail to address the fact that the company central to the scandal emerged out of IO contracting for the US and UK governments and NATO. Policymakers must consider whether oversight and intelligence mechanisms were adequate, since they failed to identify or prevent a developing problem, and the public must see changes that will ensure there can be no recurrence of these issues with another contractor.
1 The scope was far-reaching, including the UK’s EU Referendum, the 2016 US election and other international elections such as those in Nigeria (2014) and Kenya (2013/2017).
2 See my evidence to the Fake News Inquiry in April and June (Briant, 2018a; 2018b).