Category Archives: Data Protection

New Special Issue of Communications Law: Information control in an ominous global environment

The Information Law and Policy Centre is pleased to announce the publication of a special issue of the Communications Law journal based on papers submitted for our annual workshop last November. The journal articles are available via direct subscription, through the Lexis Library (IALS member link) and (coming soon) Westlaw.

In the following editorial for the special issue, Dr Judith Townend, Lecturer in Media and Information Law, University of Sussex (and outgoing Director of the ILPC at the Institute of Advanced Legal Studies), and Dr Paul Wragg, Associate Professor of Law, University of Leeds, discuss the challenges of information control in an ominous global environment.

This special issue of Communications Law celebrates the first anniversary of the Information Law and Policy Centre (ILPC) at the Institute of Advanced Legal Studies. It features three contributions from leading commentators who participated in the ILPC’s annual conference ‘Restricted and redacted: where now for human rights and digital information control?’, which was held on 9 November 2016 and sponsored by Bloomsbury Professional.

The workshop considered the myriad ways in which data protection laws touch upon fundamental rights, from internet intermediary liability, investigatory and surveillance powers, media regulation and whistle-blower protection to ‘anti-extremism’ policy. We were delighted with the response to our call for papers. The conference benefited from a number of provocative and insightful papers from academics including Professor Gavin Phillipson, Professor Ellen P Goodman, Professor Perry Keller and Professor David Rolph, as well as from Rosemary Jay, Mélanie Dulong de Rosnay, Federica Giovanella and Allison Holmes, whose papers are published in this edition.

The date of the conference, by happenstance, gave extra piquancy to the significance of our theme. News of Donald J Trump’s election triumph spoke to (and continues to speak to) an ominous and radically changed global environment in which fundamental rights protection takes centre stage. But as Trump’s presidency already shows, those rights have become impoverished in the rush to promote nationalism in all its ugly forms.

In the UK, the populism that threatens to rise above all other domestic values marks a similar threat, in which executive decision-making is not only championed but also provokes popular dissent when threatened by judicial oversight. The Daily Mail’s claim that High Court judges were ‘enemies of the people’ when they sought to restrict the exercise of unvarnished executive power reminds us that fundamental rights are seriously undervalued.

Perhaps we should not be surprised at these events and their potential impact on communications law. In February 2015, at the ILPC’s inaugural conference, Dr Daithí Mac Síthigh delivered a powerful paper in which he noted the rise of this phenomenon in the government’s thinking on information law and policy under the Coalition Government of 2010-15. In his view, following an ‘initial urgency’ of libertarianism, the mood changed to one of internet regulation or re-regulation. Such a response to perceived disorder, though not unusual, was ‘remarkable’ given how the measures in this field adopted during the final stages of the last government had been ‘characterised by the extension of State power in a whole range of areas.’ We should also note the demise of liberalism in popular thought. That much-criticised notion, which underpins all fundamental rights, seems universally disclaimed as something weak and sinister. All of this speaks to a worrisome future in which the fate of the Human Rights Act remains undecided.

Concerns like these animate the papers in this special issue. The contribution from leading data protection practitioner Rosemary Jay, Senior Consultant Attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice, is entitled ‘Heads and shoulders, knees and toes (and eyes and ears and mouth and nose…)’. Her paper discusses the rise of biometric data and the restrictions on its use introduced by the General Data Protection Regulation. As she notes, sensitive personal data arising from biometric data might be more easily shared, leading to a loss of individual autonomy. It is not hard to imagine the impact unrestricted data access would have – the prospective employer who offers the job to someone else because of concerns about an applicant’s cholesterol levels; the partner who leaves after discovering a family history of mental ill health; the bank that refuses a mortgage because of drinking habits. As Jay concludes, consent will play a major role in regulating this area.

In their paper, Federica Giovanella and Mélanie Dulong de Rosnay discuss community networks, a grassroots alternative to commercial internet service providers. They discuss the liability issues arising from open wireless local access networks after the landmark Court of Justice of the EU decision in McFadden v Sony Music Entertainment Germany GmbH. As they conclude, the decision could prompt greater regulation of, and political involvement in, the distribution of materials through these networks which may well represent another threat to fundamental rights.

Finally, Allison M Holmes reflects on the impact on fundamental rights of the status imposed on communication service providers (CSPs). As Holmes argues, privacy and other human rights are threatened because CSPs are not treated as public actors when retaining communications data. This status, she says, ought to change, and she argues convincingly as to how that may be achieved.

Communicating Responsibilities: The Spanish DPA targets Google’s Notification Practices when Delisting Personal Information

In this guest post, David Erdos, University Lecturer in Law and the Open Society, University of Cambridge, considers the 2016 Resolution made by the Spanish Data Protection Authority in relation to Google’s approach to de-listing personal information. 

The Court of Justice’s seminal decision in Google Spain (2014) represented the beginning rather than the endpoint of specifying the European data protection obligations of search engines when indexing material from the web and, as importantly, of ensuring adherence to those obligations.

In light of its over 90% market share of search, this issue largely concerns Google (Bing and Yahoo come in a very distant second and third place).  To its credit, Google signalled an early willingness to comply with Google Spain.  At the same time, however, it construed the decision narrowly.  Google argued that it only had to remove specified URL links following ex post demands from individual European citizens and/or residents (exercising the right to erasure (A. 12c) and/or objection (A. 14)), only as regards searches made under their name, only on European-badged search services (e.g. .uk, .es) and only where the processing violated European data protection standards, not where the processing was judged to be in the ʻpublic interestʼ.

It also indicated that it would inform the Webmasters of the ʻoriginalʼ content when de-listing took place (although it signalled that it would stop short of its usual practice of providing a similar notification to individual users of its services, opting instead for a generic notice only).

In the subsequent two and a half years, Google’s approach has remained in broad terms relatively stable (although from early 2015 it stopped notifying Webmasters when de-listing material from malicious porn sites (p. 29), and from early 2016 it has deployed (albeit imperfect) geolocation technology to block the return of de-listed results when using any version of the Google search engine (e.g. .com) from the European country from where the demand was lodged).
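In outline, the mechanism keys each de-listing to the country from which the demand was lodged and suppresses the affected URLs for name searches originating there, whatever domain serves the query. The following minimal sketch (in Python) is purely illustrative: the data model, the function names and the IP-to-country mapping are assumptions for exposition, not Google’s actual, non-public implementation.

```python
# Illustrative sketch of geolocation-based de-listing, NOT Google's
# actual implementation (which is not public). A real system would use
# a GeoIP database rather than a hard-coded lookup table.
DELISTINGS = {
    # (name searched, country where the demand was lodged) -> blocked URLs
    ("jane doe", "ES"): {"https://example.com/old-story"},
}

def lookup_country(ip_address: str) -> str:
    """Stand-in for a GeoIP lookup mapping a client IP to a country code."""
    return {"81.40.0.1": "ES", "8.8.8.8": "US"}.get(ip_address, "??")

def filter_results(name: str, results: list, client_ip: str) -> list:
    """Suppress de-listed URLs for name searches made from the country
    where the demand was lodged, regardless of which domain
    (.com, .es, .uk...) the searcher uses."""
    country = lookup_country(client_ip)
    blocked = DELISTINGS.get((name.lower(), country), set())
    return [url for url in results if url not in blocked]

results = ["https://example.com/old-story", "https://example.com/other"]
print(filter_results("Jane Doe", results, "81.40.0.1"))  # suppressed in Spain
print(filter_results("Jane Doe", results, "8.8.8.8"))    # still visible from the US
```

The “albeit imperfect” caveat in the text maps onto the weak point of any such scheme: IP-based geolocation can be evaded with proxies or VPNs, so the blocking is only as reliable as the location inference.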

Many (although not all) of these limitations are potentially suspect under European data protection law, and indeed private litigants have already brought a number of challenges (some successful, some not).  No doubt partly reflecting their very limited resources, European Data Protection Authorities (DPAs) have adopted a selective approach, targeting only those issues which they see as the most critical.  Indeed, the Article 29 Working Party’s November 2014 Guidelines focussed principally on two concerns:

  • Firstly, that the geographical scope of de-listing was too narrow. To ensure “effective and complete protection” of individual data subjects, it was necessary that de-listing be “effective on all relevant domains, including .com”.
  • Secondly, that communication to third parties of data concerning de-listing identifiable to particular data subjects should be both very limited and subject to strong discipline. Routine communication “to original webmasters that results relating to their content had been delisted” was simply unlawful and, whilst in “particularly difficult cases” it might in principle be legitimate to contact such publishers prior to making a de-listing decision, even here search engines must then “take all necessary measures to properly safeguard the rights of the affected data subject”.

Since the release of the Guidelines, the French DPA has famously (or infamously, depending on your perspective) adopted a strict interpretation of the first concern, requiring de-listing on a completely global scale and fining Google €100K for failing to do this.  That action has now been appealed before the French Conseil d’État and has attracted much attention, including from Google itself.  In contrast, much less publicity has been given to the issue of third party communication.

Nevertheless, in September 2016 the Spanish DPA issued a Resolution fining Google €150K for disclosing information identifiable to three data subjects to Webmasters, and ordering it to adopt measures to prevent such practices recurring.  An internal administrative appeal lodged by Google against this has since been rejected and a challenge in court now seems inevitable.  This piece explores the background to, nature of and justification for this important regulatory development.

The Determinations Made in the Spanish Resolution

Apart from the fact that they had formally complained, there was nothing unusual in the three individual cases analysed in the Spanish Resolution.  Google had simply followed its usual practice of informing Webmasters that, under data protection law, specified URLs had been deindexed against a particular (albeit not directly specified) individual name.  Google sought to defend this practice on four separate grounds:

  • Firstly, it argued that the information provided to Webmasters did not constitute personal data at all. In contrast, the Spanish regulator argued that where the URL led to a webpage in which only one natural person was mentioned, directly identifiable data had been reported; even where several people were mentioned, the information was still indirectly identifiable, since a simple procedure (e.g. conducting a search on names linked to the webpage in question) would render the information fully identified.  (Google’s argument here in any case seemed to be in tension with its practice, since September 2015, of inviting contacted Webmasters to notify Google of any reason why the de-listing decision should be reconsidered – this would only really make sense if the Webmaster could deduce what specific de-listing had in fact taken place.)
  • Second, it argued that, since its de-listing form stated that it “may provide details to webmaster(s) of the URLs that have been removed from our search results”, any dissemination had taken place with the individual’s consent. Drawing especially on European data protection’s requirement that consent be “freely given” (A. 2 (h)), this was also roundly rejected.  In using the form to exercise their legal rights, individuals were simply made to accept as a fait accompli that such dissemination might take place.
  • Third, it argued that dissemination was nevertheless a “compatible” (A. 6 (1) (b)) processing of the data given the initial purpose of its collection, finding a legal basis as “necessary” for the legitimate interests (A. 7 (f)) of Webmasters regarding this processing (e.g. to contact Google for a reconsideration). The Spanish DPA doubted that Webmasters could have any legitimate interest here: “search engines do not recognize a legal right of publishers to have their contents indexed and displayed, or displayed in a particular order”; the Court of Justice had referenced only the interests of the search engine itself and of Internet users who might receive the information; and the Court had, furthermore, been explicit that de-listing rights applied irrespective of whether the information was erased at source or even whether publication there remained lawful.  In any case, it emphasized that any such interest had (as article 7 (f) explicitly states) to be balanced with the rights and freedoms of data subjects, which the Court had emphasized must be “effective and complete” in this context.  In contrast, Google’s practice of essentially unsafeguarded disclosure of the data to Webmasters could result in the effective extinguishment of the data subject’s rights, since Webmasters had variously republished the deindexed page at another URL, published lists of all de-indexed URLs, or even published a specific news story on the de-listing decision.
  • Fourth, Google argued that its practice was an instantiation of the data subject’s right to obtain from a controller “notification to third parties to whom the data have been disclosed of any rectification, erasure or blocking” carried out in compliance with the right to erasure “unless this proves impossible or involves a disproportionate effort” (A. 12 (c)). The Spanish regulator pointed out that, since the data in question had originally been received from rather than disclosed to Webmasters, this provision was not even materially engaged.  In any case, Google’s interpretation of it was in conflict with its purpose, which was to ensure the full effectiveness of the data subject’s right to erasure.

Having established an infringement of the law, the Spanish regulator had to consider whether to pursue this as an illegal communication of data (judged ʻvery seriousʼ under Spanish data protection law) or only as a breach of secrecy (judged merely ʻseriousʼ).  In the event, it plumped for the latter and issued a fine of €150K, in the mid-range of that set out for ʻseriousʼ infringements.  As previously noted, it also ordered Google to adopt measures to prevent re-occurrence of these legal failings and required that these be communicated to the Spanish DPA.

Analysis

The Spanish DPA’s action tackles a systematic practice which has every potential to fundamentally undermine practical enjoyment of rights to de-listing, and it is therefore at least as significant as the ongoing regulatory developments in France which relate to the geographical scope of those rights.  The regulator was entirely right to find that personal data had been disseminated, that this had been done without consent, that the processing had nothing to do with the right (which, in any case, is not an obligation) of data subjects to have third parties notified in certain circumstances, and that this processing was “incompatible” with the initial purpose of data collection, which was to give effect to data subjects’ legal rights to de-listing.

It is true that the Resolution was too quick to dismiss the idea that original Webmasters do have “legitimate interests” in guarding against unfair de-listings of content.  Even in the absence of a de jure right to such listings, these interests are grounded in their fundamental right to “impart” information (and ideas), an aspect of freedom of expression (ECHR, art. 10; EU Charter, art. 11).  In principle, these rights and interests justify search engines making contact with original Webmasters, at the least (as the Working Party itself indicated) in particularly difficult de-listing cases.

However, even here dissemination must (as the Working Party also emphasized) properly safeguard the rights and interests of data subjects.  At the least this should mean that, prior to any dissemination, a search engine concludes a binding and effectively policeable legal contract prohibiting Webmasters from disseminating the data in identifiable form.  (In the absence of this, Webmasters outside European jurisdiction or engaged in special/journalistic expression cannot necessarily themselves be criticized for making use of the information received in other ways.)

In stark contrast to this, Google currently engages in blanket and essentially unsafeguarded reporting to Webmasters, a practice which has resulted in a breakdown of effective protection for data subjects not just in Spain but also in other European jurisdictions such as the UK – see here and here.  Having been put on such clear notice by this Spanish action, it is to be hoped that Google will seriously modify its practices.  If not, then regulators would have every right to treat this in the future as a (yet more serious) illegal and intentional communication of personal data.

Future Spanish Regulatory Vistas

The cases investigated by the Spanish DPA in this Resolution also involved the potential dissemination of data to the Lumen transparency database (formerly Chilling Effects), which is hosted in the United States; the potential for subsequent publication of identifiable data on its publicly accessible database; and even the potential for a specific notification to be provided to Google users conducting relevant name searches stating that “[i]n response to a legal requirement sent to Google, we have removed [X] result(s) from this page.  If you wish, you can get more information about this requirement on LumenDatabase.org.”

This particular investigation, however, failed to uncover enough information on these important matters.  Google was adamant that it had not yet begun providing information to Lumen in relation to data protection claims post-Google Spain, but stated that it was likely to do so in the future in some form.  Meanwhile, it indicated that the specific Lumen notifications found on name searches regarding two of the claimants concerned pre-Google Spain claims variously made under defamation, civil privacy law and data protection.  (Even putting the data protection claim to one side, such practices would still amount to a processing of personal data, and also highlight the often marginal and sometimes arbitrary distinctions between these closely related legal causes of action.)

Given these complications, the Spanish regulator decided not to proceed directly on these matters but rather to open more wide-ranging investigatory proceedings concerning both Google’s practices in relation to disclosure to Lumen and the notification provided to search users.  Both sets of investigatory proceedings are ongoing.  Such continuing work highlights the vital need for active regulatory engagement to ensure that the individual rights of data subjects are effectively secured.  Only in this way will basic European data protection norms continue to ʻcatch upʼ not just with Google but with developments online generally.

David Erdos, University Lecturer in Law and the Open Society, Faculty of Law & WYNG Fellow in Law, Trinity Hall, University of Cambridge.

(I am grateful to Cristina Pauner Chulvi and Jef Ausloos for their thoughts on a draft of this piece.)

This post first appeared on the Inforrm blog. 

Call for papers: Critical Research in Information Law

Deadline 15 March 2017

The Information Law Group at the University of Sussex is pleased to announce its annual PhD and Work in Progress Workshop on 3 May 2017. The workshop, chaired by Professor Chris Marsden, will provide doctoral students with an opportunity to discuss current research and receive feedback from senior scholars in a highly focused, informal environment. The event will be held in conjunction with the Work in Progress Workshop on digital intermediary law.

We encourage original contributions critically approaching current information law and policy issues, with particular attention to the peculiarities of information law as a field of research. Topics of interest include:

  • internet intermediary liability
  • net neutrality and media regulation
  • surveillance and data regulation
  • 3D printing
  • the EU General Data Protection Regulation
  • blockchain technology
  • algorithmic/AI/robotic regulation
  • platform neutrality, ‘fake news’ and ‘anti-extremism’ policy.

How to apply: Please send an abstract of 500 words and brief biographical information to Dr Nicolo Zingales by 15 March 2017. Applicants will be informed by 30 March 2017 if selected. Submission of draft papers by selected applicants is encouraged, but not required.

Logistics: 11am-1pm, 3 May, in the Moot Room, Freeman Building, University of Sussex.

Afternoon Workshop: all PhD attendees are registered to attend the afternoon workshop (2pm-5.30pm, room F22) without charge (programme here).

Financial Support: the Information Law Group can reimburse economy-class rail fares within the UK. Please inform the organizers if you need financial assistance.

Your next social network could pay you for posting

In this guest post, Jelena Dzakula from the London School of Economics and Political Science considers what blockchain technology might mean for the future of social networking. 

You may well have found this article through Facebook. An algorithm programmed by one of the world’s biggest companies now partially controls what news reaches 1.8 billion people. And this algorithm has come under attack for censorship, political bias and for creating bubbles that prevent people from encountering ideas they don’t already agree with.

Now a new kind of social network is emerging that has no centralised control like Facebook does. It’s based on blockchain, the technology behind Bitcoin and other cryptocurrencies, and promises a more democratic and secure way to share content. But a closer look at how these networks operate suggests they could be far less empowering than they first appear.

Blockchain has received an enormous amount of hype thanks to its use in online-only cryptocurrencies. It is essentially a ledger or database in which information is stored in “blocks” that are linked historically to form a chain, saved on every computer that uses it. What is revolutionary about it is that this ledger is built using cryptography by a network of users rather than by a central authority such as a bank or government.

Every computer in the network has access to all the blocks and the information they contain, making the blockchain system more transparent, accurate and also robust, since it has no single point of failure. The absence of a central authority controlling blockchain means it can be used to create more democratic organisations owned and controlled by their users. Very importantly, it also enables the use of smart contracts for payments. These are programs that automatically implement and execute the terms of a legal contract.
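To make the chaining idea concrete, here is a minimal sketch in Python of a hash-linked ledger. It is purely illustrative (the block fields and the use of SHA-256 are assumptions for exposition, not any particular cryptocurrency’s format), and it omits the consensus and mining machinery that real networks use to agree on the chain:

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's full contents, including its link to the previous block."""
    payload = json.dumps(block, sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def make_block(data: str, prev_hash: str) -> dict:
    """Create a block that is chained to its predecessor via prev_hash."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

# Build a tiny chain: each block stores the hash of the block before it.
chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block("Alice pays Bob 5", prev_hash=block_hash(chain[-1])))
chain.append(make_block("Bob publishes a post", prev_hash=block_hash(chain[-1])))

def is_valid(chain: list) -> bool:
    """Check every link: editing an earlier block breaks all later links."""
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

print(is_valid(chain))                   # True
chain[1]["data"] = "Alice pays Bob 500"  # tamper with recorded history
print(is_valid(chain))                   # False: the tampering is detectable
```

Because every participant holds a copy of the chain and can run the same validity check, no single authority is needed to vouch for the ledger’s integrity.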

Industry and governments are developing other uses for blockchain aside from digital currencies, from streamlining back office functions to managing health data. One of the most recent ideas is to use blockchain to create alternative social networks that avoid many of the problems the likes of Facebook are sometimes criticised for, such as censorship, privacy, manipulating what content users see and exploiting those users.


Book launch: ‘Private Power, Online Information Flows and EU Law: Mind The Gap’

Book launch at: The Conservatory, Bloomsbury Publishing Plc
50 Bedford Square
London
WC1B 3DP
6pm – 8pm, 31 January 2017

This event is FREE but registration is required on Eventbrite.

Speaker: Angela Daly

With guest speakers: Professor Chris Marsden, University of Sussex; Dr Orla Lynskey, London School of Economics and Political Science

About the Book

This monograph examines how European Union law and regulation address concentrations of private economic power which impede free information flows on the Internet to the detriment of Internet users’ autonomy. In particular, competition law, sector specific regulation (if it exists), data protection and human rights law are considered and assessed to the extent they can tackle such concentrations of power for the benefit of users.

Using a series of illustrative case studies – Internet provision (including the net neutrality debate), search, mobile devices and app stores, and the cloud – the work demonstrates the gaps that currently exist in EU law and regulation. It is argued that these gaps exist due, in part, to the overarching trends currently guiding the regulation of economic power, namely neoliberalism, under which only a situation of market failure can invite ex ante rules, and to the lobbying of regulators and legislators by those in possession of such economic power to achieve outcomes which favour their businesses. Given the systemic, extra-legal nature of the reasons why these gaps exist, solutions from outside the system are proposed at the end of each case study.

Praise for the Book

‘This is a richly textured, critically argued work, shedding new light on case studies in information law which require critical thinking. It is both an interesting series of case studies (notably cloud computing, app stores and search) that displays original and deeply researched scholarship and a framework for critiquing neoliberal competition policy from a prosumerist and citizen-oriented perspective.’ – Professor Chris Marsden, University of Sussex.

Call for Papers: Deadline 27/1: 4th Winchester Conference on Trust, Risk, Information and the Law

Date: Wednesday 3 May 2017
Venue: West Downs Campus, University of Winchester, Hampshire, UK
Book Online at University of Winchester Events

The Fourth Interdisciplinary Winchester Conference on Trust, Risk, Information and the Law (#TRILCon17) will be held on Wednesday 3 May 2017 at the West Downs Campus, University of Winchester, UK.  The overall theme for this conference will be:

Artificial and De-Personalised Decision-Making: Machine-Learning, A.I. and Drones

The keynote speakers will be Professor Katie Atkinson, Head of Computer Science, University of Liverpool, an expert in Artificial Intelligence and its application to legal reasoning, and John McNamara, IBM Senior Inventor, who will speak on ‘Protecting trust in a world disrupted by machine learning’.

Papers and Posters are welcomed on any aspect of the conference theme.  This might include, although is not restricted to:

  • Machine learning and processing of personal information;
  • Artificial intelligence and its application to law enforcement, legal reasoning or judicial decisions;
  • Big Data and the algorithmic analysis of information;
  • Implications of the Internet of Things;
  • Machine based decision-making and fairness;
  • Drone law and policy;
  • Trust and the machine;
  • Risks of removing the human from – or leaving the human in – the process;
  • Responsibility, accountability and liability for machine-made decisions.

The conference offers a best poster prize judged against the following criteria: 1) quality, relevance and potential impact of the research presented; 2) visual impact; 3) effectiveness of the poster as a way of communicating the research.

Proposals for workshops are also welcome.  Workshops offer organisers the opportunity to curate panels or research/scholarship activities on an aspect of the conference theme in order to facilitate interdisciplinary discussion.

This call for papers/posters/workshops is open to academics, postgraduate students, policy-makers and practitioners, and in particular those working in law, computer science & technology, data science, information rights, privacy, compliance, statistics, probability, law enforcement & justice, behavioural science and health and social care.

Abstracts for papers are invited for consideration.  Abstracts should be no more than 300 words in length.  Successful applicants will be allocated 15-20 minutes for presentation of their paper plus time for questions and discussion.

Abstracts for posters are invited for consideration.  Abstracts should be no more than 300 words in length.  Please note that accepted poster presenters will be required to email an electronic copy of their poster no later than a week before the conference.  Accepted poster presenters will also need to deliver the hard copy of their poster to the venue no later than 9am on the date of the conference to enable it to be displayed during the day.

Workshop proposals should summarise the workshop theme and goals, organising committee and schedule of speakers, panels and/or talks.  Proposals should be no more than 500 words.  Workshops should be timed to be 1.5-2 hours in length.

Abstracts and proposals, contained in a Word document, should be emailed to trilcon17@winchester.ac.uk.  Please include name, title, institution/organisation details and email correspondence address.  The deadline for submission of abstracts/proposals is Friday 27 January 2017.  Successful applicants will be notified by 17 February 2017.  Speakers/poster presenters/workshop organisers will be entitled to the early registration discounted conference fee of £80 and will be required to book a place at the conference by 28 February in order to guarantee inclusion of their paper/poster/workshop.

Speakers will be invited to submit their paper for inclusion in a special edition of the open access eJournal, Information Rights, Policy & Practice.

To book a place at the conference, please click here to visit the Winchester University Store and click on academic conferences.

For more information, please contact the conference team at trilcon17@winchester.ac.uk

Information Law and Policy Centre’s annual workshop highlights new challenges in balancing competing human rights


Our annual workshop and lecture – held earlier this month – brought together a wide range of legal academics, lawyers, policy-makers and interested parties to discuss the future of human rights and digital information control.

A number of key themes emerged in our panel sessions including the tensions present in balancing Article 8 and Article 10 rights; the new algorithmic and informational power of commercial actors; the challenges for law enforcement; the liability of online intermediaries; and future technological developments.

The following write up of the event offers a very brief summary report of each panel and of Rosemary Jay’s evening lecture.

Morning Session

Panel A: Social media, online privacy and shaming

Helen James and Emma Nottingham (University of Winchester) began the panel by presenting their research (with Marion Oswald) into the legal and ethical issues raised by the depiction of young children in broadcast TV programmes such as The Secret Life of 4, 5 and 6 Year Olds. They were also concerned with the live-tweeting that accompanied these programmes, noting that very abusive tweets could be directed at the children taking part.


Open letter in the Daily Telegraph: Concerns with ‘information sharing’ provisions in the Digital Economy Bill

Associate research fellow at the Information Law and Policy Centre and lecturer in media and information law at the University of Sussex, Dr Judith Townend, is among the signatories of this letter published on the letters page of the Telegraph on 25/11/2016 [subscription required].

SIR – We wish to highlight concerns with “information sharing” provisions in the Digital Economy Bill.

The Bill puts government ministers in control of citizens’ personal data, a significant change in the relationship between citizen and state. It means that personal data provided to one part of government can be shared with other parts of government and private‑sector companies without citizens’ knowledge or consent.

Government should be strengthening, not weakening, the protection of sensitive information, particularly given the almost daily reports of hacks and leaks of personal data. Legal and technical safeguards need to be embedded within the Bill to ensure citizens’ trust. There must be clear guidance for officials, and mechanisms by which they and the organisations with whom they share information can be held to account.

The Government’s intention is to improve the wellbeing of citizens, and to prevent fraud. This makes it especially important that sensitive personal details, such as income or disability, cannot be misappropriated or misused – finding their way into the hands of payday-loan companies, for example. Information sharing could exacerbate the difficulties faced by the most vulnerable in society.

The Government should be an exemplar in ensuring the security and protection of citizens’ personal data. If the necessary technical and legal safeguards cannot be embedded in the current Bill and codes of practice, we respectfully urge the Government to remove its personal data sharing proposals in their entirety.

Dr Jerry Fishenden
Co-Chairman, Cabinet Office Privacy and Consumer Advisory Group (PCAG)

Renate Samson
Chief Executive, Big Brother Watch

Ian Taylor
Director, Association of British Drivers

Jo Glanville
Director, English PEN

Jodie Ginsberg
Chief Executive Officer, Index on Censorship

Dr Edgar Whitley
Co-Chairman, Cabinet Office PCAG and London School of Economics and Political Science

David Evans
Director of Policy, BCS – The Chartered Institute for IT

Dr Gus Hosein
Executive Director, Privacy International and Member of Cabinet Office PCAG

Rachel Coldicutt
Chief Executive Officer, Doteveryone

Roger Darlington
Chairman, Consumer Forum for Communications

Dr Kieron O’Hara
Associate Professor, Electronics and Computer Science, University of Southampton

Professor Angela Sasse
Head of Information Security Research, University College London and Member of Cabinet Office PCAG

Dr Judith Townend
Lecturer in Media and Information Law, University of Sussex

Dr Louise Bennett
Chairman, BCS Security Group and Member of Cabinet Office PCAG

StJohn Deakins
Chief Executive Officer, CitizenMe

Rory Broomfield
Director, The Freedom Association

Sarah Gold
Director and Founder, Projects by IF

Jim Killock
Director, Open Rights Group

Guy Herbert
General Secretary, NO2ID and Member of Cabinet Office PCAG

Dr George Danezis
Professor of Security and Privacy Engineering, University College London and Member of Cabinet Office PCAG

Jamie Grace
Senior Lecturer in Law, Sheffield Hallam University

Eric King
Visiting Professor, Queen Mary University

Josie Appleton
Director, Manifesto Club

Jen Persson
Co-ordinator, Defend Digital Me

Dr Chris Pounder
Director, Amberhawk and Member of Cabinet Office PCAG

Sam Smith
medConfidential and Member of Cabinet Office PCAG

‘Tracking People’ research network established

A new research network has been established to investigate the legal, ethical, social and technical issues which arise from the use of wearable, non-removable tagging and tracking devices.

According to the network’s website, tracking devices are increasingly being used to monitor a range of individuals including “offenders, mental health patients, dementia patients, young people in care, immigrants and suspected terrorists”.

The interdisciplinary network is being hosted at the University of Leeds and aims to foster “new empirical, conceptual, theoretical and practical insights into the use of tracking devices”.

The network is being coordinated by Professor Anthea Hucklesby and Dr Kevin MacNish. It will bring together academics, designers, policy-makers and practitioners to explore critical issues such as:

  • privacy;
  • ethics;
  • data protection;
  • efficiency and effectiveness;
  • the efficacy and suitability of the equipment design;
  • the involvement of the private sector as providers and operators;
  • the potential for discriminatory use.

Readers of the Information Law and Policy Centre blog might be particularly interested in a seminar event scheduled for April 2017 which will consider the “legal and ethical issues arising from actual and potential uses of tracking devices across a range of contexts”.

For further information, check out the network’s website or email the team to join the network.

Full Programme: Annual Workshop and Evening Lecture

Restricted and Redacted: Where now for human rights and digital information control?

The full programme for the Information Law and Policy Centre’s annual workshop and lecture on Wednesday 9th November 2016 is now available (see below).

For both events, attendance will be free of charge thanks to the support of the IALS and our sponsor, Bloomsbury’s Communications Law journal.

To register for the afternoon workshop please visit this Eventbrite page.
To register for the evening lecture please visit this Eventbrite page.

Please note that for administrative purposes you will need to book separate tickets for the afternoon and evening events if you would like to attend both.

PROGRAMME

10.45am: REGISTRATION AND COFFEE 

11.15am: Welcome

  • Judith Townend, University of Sussex
  • Paul Wragg, University of Leeds
  • Julian Harris, Institute of Advanced Legal Studies, University of London

11.30am-1pm: PANEL 1 – choice between A and B

Panel A: Social media, online privacy and shaming

Chair: Asma Vranaki, Queen Mary University of London

  1. David Mangan, City, University of London, Dissecting Social Media: Audience and Authorship
  2. Marion Oswald, Helen James, Emma Nottingham, University of Winchester, The not-so-secret life of five year olds: Legal and ethical issues relating to disclosure of information and the depiction of children on broadcast and social media
  3. Maria Run Bjarnadottir, Ministry of the Interior in Iceland, University of Sussex, Does the internet limit human rights protection? The case of revenge porn
  4. Tara Beattie, University of Durham, Censoring online sexuality – A non-heteronormative, feminist perspective

Panel B: Access to Information and protecting the public interest

Chair: Judith Townend, University of Sussex

  1. Ellen P. Goodman, Rutgers University, Obstacles to Using Freedom of Information Laws to Unpack Public/Private Deployments of Algorithmic Reasoning in the Public Sphere
  2. Felipe Romero-Moreno, University of Hertfordshire, ‘Notice and staydown’, the use of content identification and filtering technology posing a fundamental threat to human rights
  3. Vigjilenca Abazi, Maastricht University, Mapping Whistleblowing Protection in Europe: Information Flows in the Public Interest

1-2pm: LUNCH 

2-3.30pm: PANEL 2 – choice between A and B

Panel A: Data protection and surveillance

Chair: Nora Ni Loideain, University of Cambridge

  1. Jiahong Chen, University of Edinburgh, How the Best Laid Plans Go Awry: The (Unsolved) Issues of Applicable Law in the General Data Protection Regulation
  2. Jessica Cruzatti-Flavius, University of Massachusetts, The Human Hard Drive: Name Erasure and the Rebranding of Human Beings
  3. Wenlong Li, University of Edinburgh, Right to Data Portability (RDP)
  4. Ewan Sutherland, Wits University, Wire-tapping in the regulatory state – changing times, changing mores

Panel B: Technology, power and governance

Chair: Chris Marsden, University of Sussex

  1. Monica Horten, London School of Economics, How Internet structures create closure for freedom of expression – an exploration of human rights online in the context of structural power theory
  2. Perry Keller, King’s College, London, Bringing algorithmic governance to the smart city
  3. Marion Oswald, University of Winchester and Jamie Grace, Sheffield Hallam University, Intelligence, policing and the use of algorithmic analysis – initial conclusions from a survey of UK police forces using freedom of information requests as a research methodology
  4. Allison Holmes, Kent University, Private Actor or Public Authority? How the Status of Communications Service Providers affects Human Rights

3.30-5pm: PANEL 3 – choice between A and B

Panel A: Intermediary Liability

Chair: Christina Angelopoulos, University of Cambridge

  1. Judit Bayer, Miskolc University, Freedom and Diversity on the Internet: Liability of Intermediaries for Third Party Content
  2. Mélanie Dulong de Rosnay, Félix Tréguer, CNRS-Sorbonne Institute for Communication Sciences and Federica Giovanella, University of Trento, Intermediary Liability and Community Wireless Networks Design Shaping
  3. David Rolph, University of Sydney, Liability of Search Engines for Publication of Defamatory Matter: An Australian Perspective

Panel B: Privacy and anonymity online

Chair: Paul Wragg, University of Leeds

  1. Gavin Phillipson, University of Durham, Threesome injuncted: has the Supreme Court turned the tide against the media in online privacy cases?
  2. Fiona Brimblecombe, University of Durham, European Privacy Law
  3. James Griffin, University of Exeter and Annika Jones, University of Durham, The future of privacy in a world of 3D printing

5-6pm: TEA BREAK / STRETCH YOUR LEGS

6-8pm: EVENING LECTURE AND DRINKS

Lecture Title: Heads and shoulders, knees and toes (and eyes and ears and mouth and nose…): The impact of the General Data Protection Regulation on use of biometrics.

Biometrics are touted as one of the next big things in the connected world. Specific reference to biometrics and genetic data has been included for the first time in the General Data Protection Regulation. How does this affect existing provisions? Will the impact of the Regulation be to encourage or to restrict the development of biometric technology?

  • Speaker: Rosemary Jay, Senior Consultant Attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice.
  • Chair: Professor Lorna Woods, University of Essex
  • Respondents: Professor Andrea Matwyshyn, Northeastern University and Mr James Michael, IALS