Category Archives: Information law and policy

Annual Conference 2017 Resources

The Information Law and Policy Centre held its third annual conference on 17 November 2017. The conference's theme was: 'Children and Digital Rights: Regulating Freedoms and Safeguards'.

The conference brought together regulators, practitioners, civil society, and leading academic experts, who examined the key legal frameworks and policies being used and developed to safeguard children's digital freedoms and rights. These frameworks include the UN Convention on the Rights of the Child, related provisions (such as consent, transparency, and profiling) under the UK Digital Charter, and the Data Protection Bill, which will implement the EU General Data Protection Regulation.

The following resources are available online:

  • Full programme
  • Presentation: ILPC Annual Conference, Baroness Beeban Kidron (video)
  • Presentation: ILPC Annual Conference, Anna Morgan (video)
  • Presentation: ILPC Annual Conference, Lisa Atkinson (video)
  • Presentation: ILPC Annual Conference, Rachael Bishop (video)

AI trust and AI fears: A media debate that could divide society

In this guest post, Dr Vyacheslav Polonski, Researcher, University of Oxford examines the key question of trust or fear of AI.

We are at a tipping point of a new digital divide. While some embrace AI, many people will always prefer human experts even when they’re wrong.

Unless you live under a rock, you probably have been inundated with recent news on machine learning and artificial intelligence (AI). With all the recent breakthroughs, it almost seems like AI can already predict the future. Police forces are using it to map when and where crime is likely to occur. Doctors can use it to predict when a patient is most likely to have a heart attack or stroke. Researchers are even trying to give AI imagination so it can plan for unexpected consequences.

Of course, many decisions in our lives require a good forecast, and AI agents are almost always better at forecasting than their human counterparts. Yet for all these technological advances, we still seem to deeply lack confidence in AI predictions. Recent cases show that people don’t like relying on AI and prefer to trust human experts, even if these experts are wrong.

If we want AI to really benefit people, we need to find a way to get people to trust it. To do that, we need to understand why people are so reluctant to trust AI in the first place.

Continue reading

How websites watch your every move and ignore privacy settings

In this guest post, Yijun Yu, Senior Lecturer, Department of Computing and Communications, The Open University examines the world’s top websites and their routine tracking of a user’s every keystroke, mouse movement and input into a web form – even if it’s later deleted.

Hundreds of the world's top websites routinely track a user's every keystroke, mouse movement and input into a web form – even before it's submitted and even if the form is later abandoned – according to the results of a study from researchers at Princeton University.

And there's a nasty side-effect: personally identifiable data, such as medical information, passwords and credit card details, could be revealed when users surf the web – without them knowing that companies are monitoring their browsing behaviour. It's a situation that should alarm anyone who cares about their privacy.

The Princeton researchers found it was difficult to redact personally identifiable information from browsing behaviour records – even, in some instances, when users have switched on privacy settings such as Do Not Track.
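The mechanism behind this kind of tracking is straightforward to sketch. The snippet below is a minimal, hypothetical illustration (not any real vendor's script) of how a session-replay recorder can retain every intermediate value a user types, including text deleted before the form is ever submitted; the `ReplayRecorder` name and event shape are invented for this example.

```javascript
// Hypothetical sketch of a session-replay recorder. A real tracking
// script would attach record() to document-level 'input', 'keydown'
// and 'mousemove' listeners, so data is captured as it is typed,
// not when the form is submitted.
class ReplayRecorder {
  constructor() {
    this.events = [];
  }

  // Called on every input event: stores the field name, its current
  // value, and a timestamp.
  record(fieldName, value) {
    this.events.push({ fieldName, value, at: Date.now() });
  }

  // What actually leaves the browser: the full edit history of the
  // field, not just the final submitted value.
  payload() {
    return JSON.stringify(this.events);
  }
}

const rec = new ReplayRecorder();
rec.record('card-number', '4111');
rec.record('card-number', '4111 1111');
rec.record('card-number', '');   // user deletes the value -- too late
console.log(rec.events.length);  // prints 3: every intermediate value was kept
```

This is why redaction after the fact is so hard: once the intermediate values are in the recorded event stream, "deleting" the input in the browser changes nothing about what has already been captured.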

Continue reading

Who’s responsible for what happens on Facebook? Analysis of a new ECJ opinion

In this guest post Lorna Woods, Professor of Internet Law at the University of Essex, provides an analysis of the new ECJ opinion. This post first appeared on the blog of Steve Peers, Professor of EU, Human Rights and World Trade Law at the University of Essex.

Who is responsible for data protection law compliance on Facebook fan sites? That issue is analysed in a recent opinion of an ECJ Advocate-General, in the case of Wirtschaftsakademie (full title: Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein v Wirtschaftsakademie Schleswig-Holstein GmbH, in the presence of Facebook Ireland Ltd, Vertreter des Bundesinteresses beim Bundesverwaltungsgericht).

This case is one more in a line of cases dealing specifically with the jurisdiction of national data protection supervisory authorities, a line of reasoning which seems to operate separately from the Brussels I Recast Regulation, which concerns the jurisdiction of courts over civil and commercial disputes. While this is an Advocate-General's opinion, and therefore not binding on the Court, if followed by the Court it would consolidate the Court's prior broad interpretation of the Data Protection Directive. While this might be the headline, it is worth considering a perhaps overlooked element of the data economy: the role of the content provider in supplying the individuals whose data is harvested.

Continue reading

Guilty until proven innocent? How a legal loophole is being used to name and shame children

In this guest post, Faith Gordon, University of Westminster, explores how, under UK law, a child's anonymity is not entirely guaranteed. Faith is speaking at the Information Law and Policy Centre's annual conference – Children and Digital Rights: Regulating Freedoms and Safeguards – this Friday, 17 November.

Under the 1948 Universal Declaration of Human Rights, each individual is presumed innocent until proven guilty. A big part of protecting this principle is guaranteeing that public opinion is not biased against someone who is about to be tried in the courts. In this situation, minors are particularly vulnerable and need all the protection that can be legally offered. So when you read stories about cases involving children, they are often accompanied by the line that the accused cannot be named for legal reasons.

However, a loophole exists: a minor can be named before being formally charged. And as we all know in this digital age, being named comes with consequences – details or images shared of the child are permanent. While the right to be forgotten is the strongest for children within the Data Protection Bill, children and young people know that when their images and posts are screenshot they have little or no control over how they are used and who has access to them.

Continue reading

Ethical issues in research using datasets of illicit origin

In this guest post Dr Daniel R. Thomas, University of Cambridge reviews research surrounding ethical issues in research using datasets of illicit origin. This post first appeared on “Light Blue Touchpaper” weblog written by researchers in the Security Group at the University of Cambridge Computer Laboratory.

On Friday at IMC I presented our paper “Ethical issues in research using datasets of illicit origin” by Daniel R. Thomas, Sergio Pastrana, Alice Hutchings, Richard Clayton, and Alastair R. Beresford. We conducted this research after thinking about some of these issues in the context of our previous work on UDP reflection DDoS attacks.

Data of illicit origin is data obtained by illicit means, such as the exploitation of a vulnerability or an unauthorized disclosure; in our previous work this was leaked databases from booter services. We analysed existing guidance on ethics, and papers that used data of illicit origin, to see which issues researchers are encouraged to discuss and which issues they actually did discuss. We find wide variation in current practice. We encourage researchers using data of illicit origin to include an ethics section in their paper, explaining why the work was ethical so that the research community can learn from it. At present, in many cases the positive benefits, as well as the potential harms, of research remain entirely unidentified. Few papers record explicit Research Ethics Board (REB) (aka IRB/Ethics Committee) approval for the activity described, and the justifications given for exemption from REB approval suggest deficiencies in the REB process. It is also important to focus on the "human participants" of research rather than the narrower "human subjects" definition, as not all the humans that might be harmed by research are its direct subjects.

The paper and the slides are available.

Co-existing with HAL 9000: Being Human in a World with AI

This event took place at the Information Law and Policy Centre at the Institute of Advanced Legal Studies on Monday, 20 November 2017.

Date
20 Nov 2017, 17:30 to 20 Nov 2017, 19:30
Venue
Institute of Advanced Legal Studies, 17 Russell Square, London WC1B 5DR

Description

As part of the University of London’s Being Human Festival, the Information Law and Policy Centre will be hosting a film and discussion panel evening at the Institute of Advanced Legal Studies.

One of the Centre's key aims is to promote public engagement by bringing together academic experts, policy-makers, industry, artists, and key civil society stakeholders (such as NGOs and journalists) to discuss issues and ideas in information law and policy that are relevant to the public interest and will capture the public's imagination.

This event will focus on the implications posed by the increasingly significant role of artificial intelligence (AI) in society and the possible ways in which humans will co-exist with AI in future, particularly the impact that this interaction will have on our liberty, privacy, and agency. Will the benefits of AI only be achieved at the expense of these human rights and values? Do current laws, ethics, or technologies offer any guidance with respect to how we should navigate this future society?

The primary purpose of this event is to encourage engagement and interest, particularly from young adults (15–18 years), in the implications for democracy, civil liberties, and human rights posed by the increasing role of AI in society, which affects their everyday decision-making as humans and citizens. A limited number of places for this event will also be available to the general public.

Continue reading

Fellowship opportunity at Parliament: Investigating the impact of Parliament on legislation

Legal researchers might be interested in the following fellowship opportunity at UK Parliament…

The UK Parliament is currently piloting an academic fellowship scheme that offers academic researchers, from different subject areas and at every stage of their career, the opportunity to work on specific projects from inside Westminster’s walls.

We are now in the second phase of this scheme. This involves an ‘Open call’ which offers academics the opportunity to come and work in Parliament on a project of their own choosing, as long as they can demonstrate that it is relevant, and will contribute, to the work of Parliament.

One area of interest to Parliament is the impact of Parliament on legislation. We are interested in working with academics with knowledge and/or experience of identifying, tracking and assessing impact, to help us better understand, and identify empirically, the influence of MPs' and Peers' scrutiny on legislation.

As a bill passes through Parliament, MPs and Peers examine the proposals contained within it at both a general level (debating the general principles and themes of the bill) and a detailed level (examining the specific proposals put forward in the bill, line by line). In so doing, MPs and Peers debate the key principles and main purpose(s) of a bill and flag up any concerns or specific areas where they think amendments (changes) are needed. More information about the different stages in the passing of a bill is provided on the parliamentary website.

We are interested in developing a series of case studies that examine how Peers’ scrutiny of legislation has shaped the focus, content or tone of legislation as it becomes an Act (given Royal Assent). This can include:

  • Direct influence, for example an amendment tabled by a Peer is successful and is agreed to by the government and incorporated directly into the bill.
  • Indirect influence, for instance when an amendment tabled by a Peer is not successful but its substance is subsequently introduced by the government itself (without the role of the Peer who tabled it in the first instance being acknowledged).

We envisage that the case studies will look at a government bill scrutinized by the House of Lords and trace the outcome/s of amendments tabled and debated at each stage of the bill’s scrutiny.

The choice of bills to focus on will be decided in conjunction with the academic. This will require the Fellow to:

  • understand the intentions of the amendments tabled
  • understand how the amendment related to, and interacted with the bill as drafted
  • produce an explanation of the outcome in each case
  • draft a concise written account of the House’s impact on the bill.

The Scheme is open to academics (researchers with PhDs) employed at any of the 33 universities holding Impact Acceleration Award funding from either the Economic and Social Research Council (ESRC) or the Engineering and Physical Sciences Research Council (EPSRC). There are opportunities for flexible working including both part-time and remote working.

The deadline for submitting an expression of interest to the Scheme is midnight on 4th September 2017.

For more information about the Academic Fellowship Scheme see: http://www.parliament.uk/mps-lords-and-offices/offices/bicameral/post/fellowships/parliamentary-academic-fellowship-scheme/

If you would like to know more about this opportunity, please get in touch with Dr Caroline Kenny (kennyc@parliament.uk), at The Parliamentary Office of Science and Technology.

Submissions to the Law Commission’s consultation on ‘Official Data Protection’: Guardian News and Media

The Law Commission has invited interested parties to write submissions commenting on the proposals outlined in a consultation report on ‘official data protection’. The consultation period closed for submissions on 3 May, although some organisations have been given an extended deadline. (For more detailed background on the Law Commission’s work please see the first post in this series). 

The Information Law and Policy Centre is re-publishing on our blog some of the submissions written by stakeholders and interested parties in response to the Law Commission's consultation report (pdf). In due course, we will collate the submissions on a single resource page. If you have written a submission for the consultation that you would like (re-)published, please contact us.

Please note that none of the published submissions reflect the views of the Information Law and Policy Centre which aims to promote and facilitate cross-disciplinary law and policy research, in collaboration with a variety of national and international institutions.

The fourteenth submission in our series is the response submitted by Guardian News and Media. The executive summary outlines that Guardian News and Media is “very concerned that the effect of the measures set out in the consultation paper (‘CP’) would be to make it easier for the government to severely limit the reporting of public interest stories”.

Download (PDF, 912KB)

(Previous submissions published in this series: Open Rights Group, CFOI and Article 19, The Courage Foundation, Liberty, Public Concern at Work, The Institute of Employment Rights, Transparency International UK, the National Union of Journalists, English PEN, Reporters Without Borders and Index on Censorship, the Open Government Network, Lorna Woods, Lawrence McNamara and Judith Townend, Global Witness, and the British Computer Society.)