The term “fake news” has become a prominent feature of public discourse. Indeed, the idea of “fake news” has brought to the fore a number of key concerns of modern global society, including whether and how social media platforms should be regulated and, more critically, the potentially subversive role of online disinformation (and its spread on such platforms) in undermining democracy and the work of democratic institutions.

In response to the growing concerns over “fake news” and online disinformation, the European Union established an independent High Level Expert Group (HLEG) in early 2018 to develop policy recommendations to counter online disinformation. In March 2018, the HLEG published a report entitled ‘A Multidimensional Approach to Disinformation’, which sets out a series of short- and longer-term responses and actions for various stakeholders, including media practitioners and policymakers, to consider in formulating frameworks to effectively address these issues. The report recognises that online disinformation is a complex and multifaceted issue, but also a symptom of the broader social shift toward a globalised, digital society.

The Information Law and Policy Centre (ILPC) at the Institute of Advanced Legal Studies, University of London, convened a seminar at the end of April to discuss the HLEG’s report and the phenomenon of “fake news”. The seminar forms part of the ILPC’s public seminar series, in which contemporary issues in information law and policy are discussed and deliberated with key stakeholders and experts in the field. Held on the evening of 30 April 2018, the seminar consisted of an expert panel of discussants from both academia and the media.

Chaired by ILPC Director, Dr Nóra Ni Loideain, the panel comprised Professor Rasmus Kleis Nielsen, Professor of Political Communication at the University of Oxford, Director of Research at the Reuters Institute for the Study of Journalism, and a member of the HLEG; Dr Dimitris Xenos, Lecturer in Law at the University of Suffolk; Matthew Rogerson, Head of Public Policy at Guardian Media Group; and Martin Rosenbaum, Executive Producer, BBC Political Programmes.

After an introduction and overview of the report by the Chair, Professor Nielsen (speaking in his personal capacity) opened the panel discussion by noting that “fake news” is part of a broader crisis of confidence between the public, the press and democratic institutions. He cautioned against the use of the term “fake news”, highlighting that it is dangerous and misleading, having been sensationalised for political ends. He stressed that the HLEG had taken deliberate steps to curb the further popularisation of the term by not using it in its reports and policy documents, and pointed out that significant steps to redress the issue cannot be taken until we have a deeper understanding of just how widespread the problem is, or is not. That said, Professor Nielsen emphasised that in addressing such issues we must start from the principles we seek to protect, particularly freedom of expression and open democracy.

Accordingly, Professor Nielsen set out what he saw as the six key recommendations of the HLEG’s report, noting that these recommendations come as a package to be realised and implemented together:

  • Abandon the term “fake news”;
  • Set aside funding for independent research in order to develop a better understanding of the scope and nature of the issue, noting too that little is known about the issue outside of the West and Global North countries;
  • Call for platform companies to share more data with fact-checkers, albeit in a privacy-compliant manner;
  • Call for public authorities at all levels to share machine-readable data to better support professional journalists and fact-checkers;
  • Invest in media literacy at all levels of the population; and
  • Develop collaborative approaches between stakeholders.

Our second discussant, Dr Xenos, offered a light critique of the HLEG’s report. He agreed with the HLEG’s focus on ‘disinformation’ relating to materials and communications that can cause ‘public harm’, and with its contextual targets involving parliamentary elections and important ‘sectors, such as health, science, finance’, etc. He pointed out that, as the institutions of power (such as the EU’s organs) are very often the original sources of the information that is subsequently treated by various media organisations and communicative platforms, the same standards and safeguards should apply to the communications and materials produced and published by those institutions. Dr Xenos referred to his own contribution and to the recent reports of the EU Ombudsman concerning the EU Commission, the EU Council and their very wide range of experts, which highlight serious deficiencies in decision- and policy-making that undermine the basic democratic safeguards the HLEG’s report targets, such as transparency, accountability, avoidance of conflicts of interest, and access to relevant information. In this respect, Dr Xenos argued that the proposal for a fact-checking system of media communications covering the decisions and policies of the institutions of power is unrealistic when no such system and safeguards apply to the original communications, materials and decisions of those institutions that the media may (if and when) subsequently cover.

He emphasised the need for independent academic insight that can offer analysis of events ex ante, in contrast to the traditional ex post analysis of journalism. However, Dr Xenos also said that the role of academics is undermined where appropriate research focus is lacking or there is a conflict of interest – an issue which, he believes, also concerns those media organisations and platforms controlled by private corporations. In support of his claims, he referred to a recent study and its subsequent coverage by both the UK and US media. He welcomed the HLEG’s suggestions for access to and analysis of platforms’ data, and for algorithmic accountability in the dissemination of information.

Responding to Dr Xenos, Dr Ni Loideain noted that the report consistently emphasised the need for evidence-based decision- and policy-making.

Following on from Dr Xenos’ remarks, Matt Rogerson from the Guardian Media Group emphasised the critical role online disinformation can play in determining the outcome of elections. Matt noted that current politics is decided at the margins, with over a hundred constituencies won or lost on swings of under 10%, margins that Facebook, for example, can greatly influence, given the high numbers of people who get their news updates only from that source. Matt also noted how, in the wake of the Cambridge Analytica scandal, tech companies are becoming increasingly hesitant to be open about their policies and activities. He further highlighted that citizens’ knowledge and understanding of the various news agencies and news brands varies, pointing to a study which demonstrated that citizens had greater trust in the news items offered by certain broadsheet newspapers than in those of particular tabloid titles.

Recognising, therefore, that there is currently strong media diversity and a plurality of news brands, Matt spoke of how this must be preserved and protected. One important issue, he noted, was the need for stronger visual cues on platforms such as Facebook, so that users could readily distinguish the branding of Guardian news items from that of other, less trusted news sources. Matt also raised concerns about the impact of programmatic digital advertising, which effectively decouples brand advertising from the context in which it is seen on online platforms, reducing the accountability of the advertiser as a result.

On how to create trust between news organisations and the wider public, Matt drew our attention to the importance of diversity within media houses and social media platforms. Highlighting the lack of gender and ethnic diversity within the tech industry, as well as the related monopoly of Silicon Valley companies over the industry as a whole, Matt noted that the effect of this was that there is little competition between platforms to raise standards and do the right thing by society. He recommended revisiting competition regulation to drive competition and diversity.

Martin Rosenbaum from the BBC, also speaking in his personal capacity, was the final discussant to offer his response to the report. Martin echoed the sentiments of Matt and Professor Nielsen in cautioning against downplaying the issue of disinformation and the effect it can have on society. He noted that disinformation can take various forms, including users sharing information without knowing or caring whether it is true. Moreover, Martin emphasised how disinformation more broadly can erode trust in established news agencies and public institutions by generating, as Professor Nielsen had also observed, a general crisis of truth.

Martin additionally mentioned how the ready consumption and sheer volume of news that users receive on Facebook represents a divorce between the source of the information (for example, the BBC) and the way in which it is distributed and reaches the consumer. He explained how this works to undermine the trustworthiness of certain kinds of media and information.

In speaking of mechanisms through which to counter disinformation, Martin noted the BBC’s commitment to journalism that is accurate, fair and impartial, which underpins its position as one of the most trusted news sources globally. He further noted how the BBC has put in place various accountability mechanisms to handle complaints effectively. Martin also spoke of the need for media literacy and for involving younger generations in news-making, reporting, and spotting “fake news”.

Responding to the claims made earlier by Dr Xenos, Martin argued that specialist journalists are most often best placed to fact-check news stories. Lastly, Martin pointed out that chat apps are also complicit in the dissemination of “fake news” items, and that these platforms are much harder to regulate and monitor in terms of the content they handle.

Following Martin’s contribution, the discussion opened up and various questions were posed to the panel regarding the scope and definition of disinformation and how the issue intersects with the fundamental principle of freedom of expression. There was a broad consensus that unnecessary government regulation that may impinge upon free speech should be avoided. Other issues raised included the tension between the call for open data and data sharing and the coming into force of the GDPR this month.

Dr Rachel Adams, ILPC Early Career Researcher