Tag Archives: data protection

Book launch: ‘Private Power, Online Information Flows and EU Law: Mind The Gap’

Book launch at: The Conservatory, Bloomsbury Publishing Plc
50 Bedford Square
London
WC1B 3DP
6pm – 8pm, 31 January 2017

This event is FREE but registration is required on Eventbrite.

Speaker: Angela Daly

With guest speakers: Professor Chris Marsden, University of Sussex; Dr Orla Lynskey, London School of Economics and Political Science

About the Book

This monograph examines how European Union law and regulation address concentrations of private economic power which impede free information flows on the Internet to the detriment of Internet users’ autonomy. In particular, competition law, sector specific regulation (if it exists), data protection and human rights law are considered and assessed to the extent they can tackle such concentrations of power for the benefit of users.

Using a series of illustrative case studies – Internet provision (including the net neutrality debate), search, mobile devices and app stores, and the cloud – the work demonstrates the gaps that currently exist in EU law and regulation. It is argued that these gaps exist due, in part, to the overarching trend currently guiding the regulation of economic power, namely neoliberalism, under which only a situation of market failure can invite ex ante rules, buoyed by the lobbying of regulators and legislators by those in possession of such economic power to achieve outcomes which favour their businesses. Given the systemic, and extra-legal, nature of the reasons why these gaps exist, solutions from outside the system are proposed at the end of each case study.

Praise for the Book

‘This is a richly textured, critically argued work, shedding new light on case studies in information law which require critical thinking. It is both an interesting series of case studies (notably cloud computing, app stores and search) that displays original and deeply researched scholarship and a framework for critiquing neoliberal competition policy from a prosumerist and citizen-oriented perspective.’ – Professor Chris Marsden, University of Sussex.

Information Law and Policy Centre’s annual workshop highlights new challenges in balancing competing human rights

Our annual workshop and lecture – held earlier this month – brought together a wide range of legal academics, lawyers, policy-makers and interested parties to discuss the future of human rights and digital information control.

A number of key themes emerged in our panel sessions including the tensions present in balancing Article 8 and Article 10 rights; the new algorithmic and informational power of commercial actors; the challenges for law enforcement; the liability of online intermediaries; and future technological developments.

The following write-up offers a brief summary of each panel and of Rosemary Jay’s evening lecture.

Morning Session

Panel A: Social media, online privacy and shaming

Helen James and Emma Nottingham (University of Winchester) began the panel by presenting their research (with Marion Oswald) into the legal and ethical issues raised by the depiction of young children in broadcast TV programmes such as The Secret Life of 4, 5 and 6 Year Olds. They were also concerned with the live-tweeting which accompanied these programmes, noting that very abusive tweets could be directed towards children taking part in the programmes.

‘Tracking People’ research network established

A new research network has been established to investigate the legal, ethical, social and technical issues which arise from the use of wearable, non-removable tagging and tracking devices.

According to the network’s website, tracking devices are increasingly being used to monitor a range of individuals including “offenders, mental health patients, dementia patients, young people in care, immigrants and suspected terrorists”.

The interdisciplinary network is being hosted at the University of Leeds and aims to foster “new empirical, conceptual, theoretical and practical insights into the use of tracking devices”.

The network is being coordinated by Professor Anthea Hucklesby and Dr Kevin MacNish. It will bring together academics, designers, policy-makers and practitioners to explore critical issues such as:

  • privacy;
  • ethics;
  • data protection;
  • efficiency and effectiveness;
  • the efficacy and suitability of the equipment design;
  • the involvement of the private sector as providers and operators;
  • the potential for discriminatory use.

Readers of the Information Law and Policy Centre blog might be particularly interested in a seminar event scheduled for April 2017 which will consider the “legal and ethical issues arising from actual and potential uses of tracking devices across a range of contexts”.

For further information, check out the network’s website or email the team to join the network.

Full Programme: Annual Workshop and Evening Lecture

Restricted and Redacted: Where now for human rights and digital information control?

The full programme for the Information Law and Policy Centre’s annual workshop and lecture on Wednesday 9th November 2016 is now available (see below).

For both events, attendance will be free of charge thanks to the support of the IALS and our sponsor, Bloomsbury’s Communications Law journal.

To register for the afternoon workshop please visit this Eventbrite page.
To register for the evening lecture please visit this Eventbrite page.

Please note that for administrative purposes you will need to book separate tickets if you would like to attend both the afternoon and evening events.

PROGRAMME

10.45am: REGISTRATION AND COFFEE 

11.15am: Welcome

  • Judith Townend, University of Sussex
  • Paul Wragg, University of Leeds
  • Julian Harris, Institute of Advanced Legal Studies, University of London

11.30am-1pm: PANEL 1 – choice between A and B

Panel A: Social media, online privacy and shaming

Chair: Asma Vranaki, Queen Mary University of London

  1. David Mangan, City, University of London, Dissecting Social Media: Audience and Authorship
  2. Marion Oswald, Helen James, Emma Nottingham, University of Winchester, The not-so-secret life of five year olds: Legal and ethical issues relating to disclosure of information and the depiction of children on broadcast and social media
  3. Maria Run Bjarnadottir, Ministry of the Interior in Iceland, University of Sussex, Does the internet limit human rights protection? The case of revenge porn
  4. Tara Beattie, University of Durham, Censoring online sexuality – A non-heteronormative, feminist perspective

Panel B: Access to Information and protecting the public interest

Chair: Judith Townend, University of Sussex

  1. Ellen P. Goodman, Rutgers University, Obstacles to Using Freedom of Information Laws to Unpack Public/Private Deployments of Algorithmic Reasoning in the Public Sphere
  2. Felipe Romero-Moreno, University of Hertfordshire, ‘Notice and staydown’, the use of content identification and filtering technology posing a fundamental threat to human rights
  3. Vigjilenca Abazi, Maastricht University, Mapping Whistleblowing Protection in Europe: Information Flows in the Public Interest

1-2pm: LUNCH 

2-3.30pm: PANEL 2 – choice between A and B

Panel A: Data protection and surveillance

Chair: Nora Ni Loideain, University of Cambridge

  1. Jiahong Chen, University of Edinburgh, How the Best Laid Plans Go Awry: The (Unsolved) Issues of Applicable Law in the General Data Protection Regulation
  2. Jessica Cruzatti-Flavius, University of Massachusetts, The Human Hard Drive: Name Erasure and the Rebranding of Human Beings
  3. Wenlong Li, University of Edinburgh, Right to Data Portability (RDP)
  4. Ewan Sutherland, Wits University, Wire-tapping in the regulatory state – changing times, changing mores

Panel B: Technology, power and governance

Chair: Chris Marsden, University of Sussex

  1. Monica Horten, London School of Economics, How Internet structures create closure for freedom of expression – an exploration of human rights online in the context of structural power theory
  2. Perry Keller, King’s College, London, Bringing algorithmic governance to the smart city
  3. Marion Oswald, University of Winchester and Jamie Grace, Sheffield Hallam University, Intelligence, policing and the use of algorithmic analysis – initial conclusions from a survey of UK police forces using freedom of information requests as a research methodology
  4. Allison Holmes, Kent University, Private Actor or Public Authority? How the Status of Communications Service Providers affects Human Rights

3.30-5pm: PANEL 3 – choice between A and B

Panel A: Intermediary Liability

Chair: Christina Angelopoulos, University of Cambridge

  1. Judit Bayer, Miskolc University, Freedom and Diversity on the Internet: Liability of Intermediaries for Third Party Content
  2. Mélanie Dulong de Rosnay, Félix Tréguer, CNRS-Sorbonne Institute for Communication Sciences and Federica Giovanella, University of Trento, Intermediary Liability and Community Wireless Networks Design Shaping
  3. David Rolph, University of Sydney, Liability of Search Engines for Publication of Defamatory Matter: An Australian Perspective

Panel B: Privacy and anonymity online

Chair: Paul Wragg, University of Leeds

  1. Gavin Phillipson, University of Durham, Threesome injuncted: has the Supreme Court turned the tide against the media in online privacy cases?
  2. Fiona Brimblecombe, University of Durham, European Privacy Law
  3. James Griffin, University of Exeter and Annika Jones, University of Durham, The future of privacy in a world of 3D printing

5-6pm: TEA BREAK / STRETCH YOUR LEGS

6-8pm: EVENING LECTURE AND DRINKS

Lecture Title: Heads and shoulders, knees and toes (and eyes and ears and mouth and nose…): The impact of the General Data Protection Regulation on use of biometrics.

Biometrics are touted as one of the next big things in the connected world. Specific reference to biometrics and genetic data has been included for the first time in the General Data Protection Regulation. How does this affect existing provisions? Will the impact of the Regulation be to encourage or to restrict the development of biometric technology?

  • Speaker: Rosemary Jay, Senior Consultant Attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice.
  • Chair: Professor Lorna Woods, University of Essex
  • Respondents: Professor Andrea Matwyshyn, Northeastern University and Mr James Michael, IALS

Information Law and Policy Centre Annual Lecture and Workshop

An afternoon workshop and evening lecture to be given by leading information and data protection lawyer Rosemary Jay.

Restricted and Redacted: Where now for human rights and digital information control?

The Information Law and Policy Centre is delighted to announce that bookings are now open for its annual workshop and lecture on Wednesday 9th November 2016, this year supported by Bloomsbury’s Communications Law journal.

For both events, attendance will be free of charge thanks to the support of the IALS and our sponsor, although registration will be required as places are limited.

To register for the afternoon workshop please visit this Eventbrite page.

To register for the evening lecture please visit this Eventbrite page.

Please note that for administrative purposes you will need to book separate tickets if you would like to attend both the afternoon and evening events.

AFTERNOON WORKSHOP/SEMINAR 
11am – 5pm (lunch and refreshments provided)

For the afternoon part of this event we have an excellent set of presentations lined up that consider information law and policy in the context of human rights. Speakers will offer an original perspective on the way in which information and data interact with legal rights and principles relating to free expression, privacy, data protection, reputation, copyright, national security, anti-discrimination and open justice.

We will be considering topics such as internet intermediary liability, investigatory and surveillance powers, media regulation, freedom of information, the EU General Data Protection Regulation, whistleblower protection, and ‘anti-extremism’ policy. The full programme will be released in October.

EVENING LECTURE BY ROSEMARY JAY, HUNTON & WILLIAMS
6pm-7.30pm (followed by reception)

The afternoon workshop will be followed by a keynote lecture to be given by Rosemary Jay, senior consultant attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice.

Brexit: “You don’t know what you’ve got till it’s gone”

Brexit IT law scrabble

In the following editorial, Professor Lilian Edwards considers the implications of the Brexit vote for information law and assesses the mood amongst the academic community in the aftermath of the EU Referendum.

The article was first published in Volume 13, Issue 2 of SCRIPT-ed: A Journal of Law, Technology and Society. Professor Edwards’ views do not represent those of the Information Law and Policy Centre or the Institute of Advanced Legal Studies. 

On 23 June 2016 a slim majority of UK voters decided we should leave the EU in one of the great political upsets of British political history. On 24 June, the next day, CREATe,[1] the RCUK copyright and business models centre which I have helped run since 2012, ran a one-day festival at the Royal Society of the Arts in London. This was designed to be a showcase and celebration of four years of working at the cutting edge of copyright and how it either helps or hinders the creative industries and arts. Hundreds of academics signed up to show and see, including the Director of CREATe, Martin Kretschmer of Glasgow University, from Germany by birth, and many others from all over Europe and beyond.

It was a classic international IT/intellectual property event: analysing laws made throughout the world to regulate globalised cultural markets, transnational data and product flows, disruptive technologies that disregard borders, and audiences as likely to listen to music made in Brazil via decentralised P2P networks, as watch Netflix series made in the US, or use smartphones made in Japan to watch Hindi pop videos on YouTube.

In the event, the CREATe Festival became more of a wake. Reportedly, experienced academics, who thought themselves hardened to trauma by years of bombardment from REF, TEF and NSS, were almost in tears at the first session. This writer, derelict of duty, was not there to corroborate, still staring like a rabbit in the headlights at the TV in a hotel bedroom in Docklands, where the dominant tech, business and financial workers were almost equally in shock.

So, Brexit. As the dust not so much settles as temporarily accumulates while we work out what on earth happens next, what are the implications for IT law and UK academe? Are they really as bad as they seemed that morning?

Pokémon Go has revealed a new battleground for virtual privacy

Andres Guadamuz, University of Sussex

People have been lingering outside Boon Sheridan’s house all through the night. The designer lives in an old church in Massachusetts that has been designated a “gym” in the new smartphone game Pokémon Go. Because the game requires players to visit places in the real world, Sheridan now has to put up with people regularly stopping outside his building to play.

It has got to the point where he has started wondering if there is anything the law can do in situations like this. He wrote on Twitter: “Do I even have rights when it comes to a virtual location imposed on me? Businesses have expectations, but this is my home.” This problem of virtual activities impinging on physical spaces is only likely to grow with the increasing popularity of the augmented reality used in games such as Pokémon Go to overlay digital landscapes on real ones. But there may be a way to deal with this before it becomes a serious legal problem for more people.

Pokémon Go encourages players to interact with their actual environment by using realistic maps of their surroundings as part of the game. Certain landmarks, monuments and public buildings are tagged as “stops”, where players can collect items, and some public spaces including churches, parks and businesses are tagged as “gyms”, where users can battle each other.

It is the tagging element that has prompted a few interesting legal questions about the role of augmented reality. The game’s developer, Niantic, is using a combination of data from Google Maps and user-generated tags collected from an earlier game called Ingress. This data is used to identify real-life spots as either a stop or a gym. But what happens when the data mistakenly identifies a house as a public space, as happened to Sheridan?

As it turns out, Niantic offers people the chance to highlight problems with a location. And in the grand scheme of things, whether a person’s house is mis-tagged in a game does not seem like something worthy of new laws, particularly when the developer offers to correct any errors. But Pokémon Go is just the beginning. The game has proven the potential of augmented reality to appeal to a very large audience, so we can expect many other applications of the technology to come our way.

The wild success of location-based gaming may bring about a horde of imitators, so expect a new generation of augmented reality gaming to hit the app stores soon. The technology’s potential also goes beyond gaming, so we can expect more mainstream applications of geo-tagging and location-based interaction, especially with the growth of wearable technology such as fitness trackers. You can imagine that soon we will have a world in which every house, every car, even every person could come with an added virtual tag full of data. The potential for innovation in this area is staggering.

But what if your house is tagged in a global database without your permission and, valuing your privacy, you do not want passersby to know that you live there? Or what if a commercially sensitive database identifies your business with incorrect data and you cannot reach the developer or they refuse to amend it? People looking for businesses in your area may miss you and go to a competitor that is correctly listed. Even more worrying, what if your house was previously occupied by a sex offender and is tagged in an outdated database with that information?

The problems would go far beyond what is happening with Sheridan’s house. These cases could have real negative effects on people’s lives, privacy, or business prospects.

The potential for trouble will be worse with the launch of apps that allow users to tag public or private buildings themselves. Why will abusers and trolls bother spray-painting a house, when they can geo-tag it maliciously? Paint washes away, but data may be more difficult to erase.

My proposal is to extend data protection legislation to virtual spaces. At the moment, data protection is strictly personal as it relates to any information about a specific person, known as a data subject. The data subject has a variety of rights, such as having the right to access their data and rectify and erase anything that is inaccurate or excessive.

Protecting objects

Under my proposal, the data subject’s rights would remain as they are, but the law would contain a new definition, that of the data object. This relates to data about a specific location. The rights of data objects would be considerably more limited than those of a data subject. But classifying them like this would take advantage of the data-protection mechanisms that already exist for when someone is intrinsically linked to a location.

In other words, just tagging a location in an augmented reality database wouldn’t violate data protection law. But mis-tagging a location as a public space in a way that could impinge on people’s enjoyment of that location could trigger action by the regulator to have the tag amended, removed or even erased. This would be especially useful for private spaces such as Sheridan’s house. If the app developer failed to make a change to the data, the property owner could make a request to the data protection authority, which would then force the developer to change the data – or face fines.
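Purely by way of illustration – this sketch is not part of the original proposal, and all names in it are invented – the rectification workflow described above can be thought of as a simple state machine: a hypothetical tag record moves from disputed to amended if the developer corrects it, or is escalated to the regulator if the developer does not.

```python
from dataclasses import dataclass
from enum import Enum, auto

class TagStatus(Enum):
    ACTIVE = auto()      # tag is live in the database
    DISPUTED = auto()    # flagged as inaccurate by the property owner
    AMENDED = auto()     # corrected by the developer
    ESCALATED = auto()   # referred to the data protection authority

@dataclass
class DataObjectTag:
    """A hypothetical geo-tag attached to a physical location."""
    location: str
    label: str  # e.g. "gym" or "stop" in a location-based game
    status: TagStatus = TagStatus.ACTIVE

    def dispute(self) -> None:
        # The property owner flags the tag as inaccurate, e.g. a
        # private home mis-tagged as a public space.
        self.status = TagStatus.DISPUTED

    def resolve(self, developer_amends: bool) -> None:
        # If the developer corrects the tag, the dispute ends there;
        # otherwise the owner may escalate to the regulator, which
        # could compel amendment or impose fines.
        if developer_amends:
            self.status = TagStatus.AMENDED
        else:
            self.status = TagStatus.ESCALATED

tag = DataObjectTag(location="old church, Massachusetts", label="gym")
tag.dispute()
tag.resolve(developer_amends=False)
print(tag.status.name)  # ESCALATED
```

The point of the sketch is only that a “data object” would need far less machinery than a data subject: a location, a label, and a dispute-resolution status.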

There are limits to this proposal. Such a regime would only apply to companies based in the same country as the data protection regulator. So, for example, European countries wouldn’t be able to force Niantic to make changes to Pokémon Go’s tags, because the company is based in the US. There would also need to be strict restrictions on exactly what counts as a data object and what is worth amending or deleting, otherwise the system could be abused.

But one thing is already certain: Pokémon Go is just the beginning of a new world of location-based data applications, and we need to find better ways to protect our digital rights in that space.

Andres Guadamuz, Senior Lecturer in Intellectual Property Law, University of Sussex

This article was originally published on The Conversation. Read the original article.

Photo: Eduardo Woo, CC BY-SA 2.0

Analysing the Advocate General’s opinion on data retention and EU law

Last week, the Advocate General published an opinion on a case brought to the European Court of Justice concerning the compatibility of the UK and Sweden’s data retention laws with EU law.

In a detailed analysis, Lorna Woods, Professor of Internet Law at the University of Essex, considers the potential implications of the opinion for national data retention regimes (including the UK’s Investigatory Powers Bill) and the legal tensions which arise from the Advocate General’s opinion. This post first appeared on Professor Steve Peers’ EU Law Analysis blog.

The Advocate General’s opinion concerns two references from national courts, both of which arose in the aftermath of the invalidation of the Data Retention Directive (Directive 2006/24) in Digital Rights Ireland and deal with whether the retention of communications data en masse complies with EU law.

The question is important for the regimes that triggered the references, but in the background is a larger question: can mass retention of data ever be human rights compliant? While the Advocate General clearly states this is possible, things may not be that straightforward.

“Right to be forgotten” requires anonymisation of online newspaper archive

In this post, Hugh Tomlinson QC discusses the implications of a ruling in the Belgian justice system for the application of the “right to be forgotten” for news organisations. Tomlinson is a member of Matrix Chambers and an editor of the Inforrm blog. The post was first published on the Inforrm blog and is cross-posted here with permission. 

In the case of Olivier G v Le Soir (29 April 2016, n° C.15.0052.F [pdf]) the Belgian Court of Cassation decided that, as the result of the “right to be forgotten”, a newspaper had been properly ordered to anonymise the online version of a 1994 article concerning a fatal road traffic accident.

The applicant had been convicted of a drink driving offence as a result of the accident but his conviction was spent and the continued online publication of his name was a violation of his Article 8 rights which outweighed the Article 10 rights of the newspaper and the public.

Whistleblowers and journalists in the digital age

Snowden

Dr Aljosha Karim Schapals, research assistant at the Information Law and Policy Centre, reports on a research workshop hosted by the University of Cardiff on Digital Citizenship and the ‘Surveillance Society’.

A workshop led by researchers at the Cardiff School of Journalism, Media and Cultural Studies (JOMEC) on 27th June in London shared the findings of an 18-month ESRC-funded research project examining the relationships between the state, the media and citizens in the wake of the Snowden revelations of 2013.

It was the concluding event of a number of conferences, seminars and workshops organised by the five principal researchers: Dr Arne Hintz (Cardiff), Dr Lina Dencik (Cardiff), Prof Karin Wahl-Jorgensen (Cardiff), Prof Ian Brown (Oxford) and Dr Michael Rogers (TU Delft).

Broadly speaking, the Digital Citizenship and the ‘Surveillance Society’ (DCSS) project has investigated the nature, opportunities and challenges of digital citizenship in light of US and UK governmental surveillance as revealed by whistleblower Edward Snowden.

Touching on more general themes such as freedom of expression, data privacy and civic transparency, the project aligns with the research activities of the Information Law and Policy Centre, which include developing work on journalism and whistleblower protection, and discussions and analysis of the Investigatory Powers Bill.