Tag Archives: privacy

CFP: Bytes, Bodies and Souls: Interrogating Human Digitalisation

Kent Law School, in conjunction with the Eastern Academic Research Consortium, invites early-career academics and postgraduate research students to participate in the “Bytes, Bodies and Souls: Interrogating Human Digitalisation” workshop, to be held on 30th May 2017.

The workshop aims to bring together researchers across the social sciences, humanities, sciences and other relevant disciplines who are interested in examining the consequences, possibilities, and limitations of human digitalisation.

Papers and posters are welcome on any aspect of the conference theme. Topics may include, but are not restricted to:

  • Big Data and its challenges
  • The role and impact of the Internet of Things
  • Digital ownership and appropriation processes
  • Privacy, surveillance, and control
  • The role of algorithms in the governance of human digitalisation
  • Politics of digital humans from cyber activism to post-truth
  • Digital human aesthetics; the forging of a digital soul

Abstracts for papers (no more than 300 words) are invited for consideration. Successful applicants will be allocated 15 minutes to present their paper, plus time for questions and discussion.

Abstracts for posters (no more than 300 words) are also invited. Accepted poster presenters will need to deliver a hard copy of their poster to the venue no later than 9am on the day of the workshop so that it can be displayed throughout the day.

Submissions should be sent as a Word document to a.m.holmes@kent.ac.uk. Please include your name, title, institution and email address, and state whether you wish to be considered for a paper or a poster presentation. The deadline for submission is Friday 3rd March 2017. Successful applicants will be notified by 19th March 2017.

Your next social network could pay you for posting

In this guest post, Jelena Dzakula from the London School of Economics and Political Science considers what blockchain technology might mean for the future of social networking. 

You may well have found this article through Facebook. An algorithm programmed by one of the world’s biggest companies now partially controls what news reaches 1.8 billion people. And this algorithm has come under attack for censorship, political bias and for creating bubbles that prevent people from encountering ideas they don’t already agree with.

Now a new kind of social network is emerging that has no centralised control like Facebook does. It’s based on blockchain, the technology behind Bitcoin and other cryptocurrencies, and promises a more democratic and secure way to share content. But a closer look at how these networks operate suggests they could be far less empowering than they first appear.

Blockchain has received an enormous amount of hype thanks to its use in online-only cryptocurrencies. It is essentially a ledger or a database where information is stored in “blocks” that are linked historically to form a chain, saved on every computer that uses it. What is revolutionary about it is that this ledger is built using cryptography by a network of users rather than a central authority such as a bank or government.

Every computer in the network has access to all the blocks and the information they contain, making the blockchain system more transparent, accurate and also robust, since it has no single point of failure. The absence of a central authority controlling blockchain means it can be used to create more democratic organisations owned and controlled by their users. Very importantly, it also enables the use of smart contracts for payments. These are pieces of code that automatically implement and execute the terms of a legal contract.
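The hash-linked structure described above can be sketched in a few lines of code. The following is a minimal illustration of the principle only (it is not how Bitcoin or any production blockchain is implemented, and omits the network, consensus and cryptographic signing layers entirely):

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a block whose identity depends on its own data AND its predecessor's hash."""
    payload = {"data": data, "prev_hash": prev_hash}
    block = dict(payload)
    # Hashing the previous block's hash into this one is what forms the "chain":
    # altering any earlier block changes its hash and breaks every later back-link.
    block["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    return block

def verify_chain(chain):
    """Recompute each block's hash and check every back-link to its predecessor."""
    for i, block in enumerate(chain):
        expected = hashlib.sha256(
            json.dumps({"data": block["data"], "prev_hash": block["prev_hash"]},
                       sort_keys=True).encode()
        ).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain, then tamper with the middle block.
chain = [make_block("genesis", prev_hash="0" * 64)]
chain.append(make_block("post: hello", chain[-1]["hash"]))
chain.append(make_block("post: world", chain[-1]["hash"]))

assert verify_chain(chain)
chain[1]["data"] = "post: tampered"
assert not verify_chain(chain)
```

Because every participant holds a full copy of the chain and can run the same verification, tampering is detectable by anyone, which is the transparency and robustness property the paragraph above describes.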

Industry and governments are developing other uses for blockchain aside from digital currencies, from streamlining back office functions to managing health data. One of the most recent ideas is to use blockchain to create alternative social networks that avoid many of the problems the likes of Facebook are sometimes criticised for, such as censorship, privacy, manipulating what content users see and exploiting those users.

Continue reading

Case Preview: PNM v Times Newspapers, Open justice and the privacy of suspects – Hugh Tomlinson QC

In this guest post, Hugh Tomlinson QC previews an appeal to the Supreme Court in a case that considers where the balance lies between rights to privacy and the principle of open justice. The post was first published on the Inforrm blog

On 17 and 18 January 2017, a seven judge Supreme Court will hear the claimant’s appeal against the decision of the Court of Appeal in the case of PNM v Times Newspapers ([2014] EWCA Civ 1132).

That Court had upheld the judge’s view that, on the basis of the “open justice principle”, information mentioned in open court concerning a person who was arrested but not charged could be reported.  

Background

The claimant was one of a number of men arrested in March 2012 in connection with a Thames Valley Police investigation into allegations of child sex grooming and prostitution.  The claimant was released on bail and was subsequently notified that he was to be released without charge.

Nine men were charged and a criminal trial took place at the Central Criminal Court between January and May 2013.  The claimant was not a party or a witness at the criminal trial.  On 25 January 2013 an order under section 4(2) of the Contempt of Court Act 1981 was made prohibiting publication of any report which referred to evidence which might identify or tend to identify him.

On 14 May 2013, seven of the defendants were convicted of numerous serious sexual offences.  A further order under section 4(2) of the Contempt of Court Act 1981 was made on the claimant’s application.  It prohibited disclosure of details of applications made to the Court by Thames Valley Police (which concerned certain of the claimant’s property).

The claimant’s full name was mentioned in open court when a police officer said that a witness had failed to pick him out on an identification parade.  He was also mentioned in the course of cross-examination, in speeches and in the summing up.

At the conclusion of the criminal trial the Judge declined to discharge the section 4(2) order until a decision was made as to whether the claimant would be charged.  In July 2013 the police notified the claimant that he was not going to be charged.  The Times and the Oxford Mail applied to discharge the section 4(2) order but, before the judge had handed down his ruling, the claimant applied to the High Court for an injunction.

By an application made on 15 October 2013 against the Times, the Oxford Mail and two journalists, the claimant sought an order preventing publication of the fact of his arrest on suspicion of committing serious sexual offences against children, and associated information, because of the damage he feared such publication might cause to him and his family, including his children.

Continue reading

Why the rise of wearable tech to monitor employees is worrying


In this guest post, Ivan Manokha, Departmental Lecturer in International Political Economy at the University of Oxford, considers the use of wearable technology in the workplace and the potential privacy implications of collecting the data of employees. 

An increasing number of companies are beginning to digitally monitor their employees. While employers have always scrutinised their workers’ performance, the rise of wearable technology to keep tabs has more of a dystopian edge to it. Monitoring has become easier, more intrusive and is not just limited to the workplace – it’s 24/7.

Devices such as Fitbit, Nike+ FuelBand and Jawbone UP, which can record information related to health, fitness, sleep quality, fatigue levels and location, are now being used by employers who integrate wearable devices into employee wellness programmes.

One of the first was BP America, which introduced Fitbit bracelets in 2013. By 2015 at least 24,500 of BP’s employees were using them, and more and more US employers have followed suit. For instance, in the same year Vista Staffing Solutions, a healthcare recruitment agency, started a weight-loss programme using Fitbits and wifi-enabled bathroom scales. Appirio, a consulting company, started handing out Fitbits to employees in 2014.

In the UK similar projects are under consideration by major employers. And this trend will only intensify in the years to come. By 2018, estimates suggest that more than 13m of these devices will be part of worker wellness schemes. Some analysts say that by the same year, at least 2m employees worldwide will be required to wear health-and-fitness trackers as a condition of employment.

According to some, this is a positive development. Chris Brauer, an academic at Goldsmiths, University of London, argues that corporate managers will now be comparable to football managers. They will be equipped with a dashboard of employee performance trajectories, as well as their fatigue and sleep levels. They will be able to pick only the fittest employees for important business meetings, presentations, or negotiations.

It seems, however, that such optimism overlooks important negative and potentially dangerous social consequences of using this kind of technology. History here offers a word of warning.

Historical precedent

The monitoring of workers’ health outside the workplace was once attempted by the Ford Motor Company. When Ford introduced a moving assembly line in 1913 – a revolutionary innovation that enabled complete control over the pace of work – the increase in productivity was dramatic. But so was the rise in worker turnover. In 1913, every time the company wanted to add 100 men to its factory personnel, it was necessary to hire 963, as workers struggled to keep up with the pace and left shortly after being recruited.

Ford’s solution to this problem was to double wages. In 1914, the introduction of a US$5 a day wage was announced, which immediately led to a decline in worker turnover. But high wages came with a condition: the adoption of healthy and moral lifestyles.

The company set up a sociology department to monitor workers’ – and their families’ – compliance with its standards. Investigators would make unannounced calls on employees and their neighbours to gather information on living conditions and lifestyles. Those deemed insufficiently healthy or morally upright were immediately disqualified from the US$5 wage level.

Analysing Ford’s policies, Italian political philosopher and revolutionary Antonio Gramsci coined the term “Fordism” for this social phenomenon. It signalled fundamental changes to labour, which became much more intense after automation. Monitoring workers’ private lives to control their health, Gramsci argued, was necessary to preserve “a certain psycho-physical equilibrium which prevents the physiological collapse of the worker, exhausted by the new method of production”.

Parallels today

Today, we are faced with another great change to how work is done. To begin with, the “great doubling” of the global labour force has led to the increase in competition between workers around the world. This has resulted in a deterioration of working and employment conditions, the growth of informal and precarious labour, and the intensification of exploitation in the West.

So there has been a significant increase in the average number of hours worked and in the intensity of labour. For example, research carried out by the Trades Union Congress in 2015 found that the number of people working more than 48 hours a week in the UK was rising, and warned of a risk of “burnout Britain”.

Indeed, employee burnouts have become a major concern of employers. A UK survey of human resources directors carried out in 2015 established that 80% were afraid of losing top employees to burnout.

Ford’s sociology department was shut down in the early 1920s for two reasons. It became too costly to maintain amid increasing competition from other car manufacturers, and employee resistance to home visits by inspectors was growing, as those visits were increasingly seen as too intrusive into workers’ private lives.

Wearable technology, however, does not suffer from these inconveniences. It is not costly and it is much less obviously intrusive than surprise home visits by company inspectors. Employee resistance appears to be low, though there have been a few attempts to fake the results of the tracking (for example, workers strapping their employer-provided Fitbits onto their dogs to boost their “activity levels”). The idea of being tracked has mostly gone unchallenged.

Labour commodified to the extreme

But the use of wearable technology by employers raises a range of concerns. The most obvious is the right to privacy. The use of wearable technology goes significantly further than computer systems where emails are already logged and accessible to employers.

Surveillance becomes continuous and all-encompassing, increasingly unconfined to the workplace, and also constitutes a form of surveillance which penetrates the human body. The right to equal employment opportunities and promotion may also be compromised if employers reserve promotion for those who are in a better physical shape or suffer less from fatigue or stress.

It may also be argued that the use of wearable technology takes what the Hungarian economic historian Karl Polanyi called the “commodification” of human labour to an extreme. Monitoring worker health both inside and outside the workplace involves the treatment of people as machines whose performance is to be maximised at all costs. However, as Polanyi warned, human labour is a “fictitious commodity” – it is not “produced” for sale to capital as a mere tool. To treat it as such risks ultimately leading to a “demolition of society”.

To protect individual rights, systems have been introduced to regulate how the data gathered on employees is stored and used. One possible solution is to require that the data collected by trackers be anonymised. For example, Sociometric Solutions, one company that collects and monitors employee data on behalf of employers, charts only broader patterns and connections to productivity, rather than individual performance.
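The aggregation approach can be illustrated with a toy example. This is a hypothetical sketch of the general technique (the team names, step counts and minimum group size are all invented for illustration; it is not Sociometric Solutions’ actual method):

```python
from collections import defaultdict
from statistics import mean

# Hypothetical per-employee tracker readings: (employee_id, team, daily_steps)
readings = [
    ("e01", "sales", 7200), ("e02", "sales", 5400), ("e03", "sales", 9100),
    ("e04", "support", 4300), ("e05", "support", 6800),
]

MIN_GROUP_SIZE = 3  # suppress groups too small to hide an individual

def team_averages(rows, k=MIN_GROUP_SIZE):
    """Report only per-team averages, and only for teams of at least k people."""
    by_team = defaultdict(list)
    for _, team, steps in rows:
        by_team[team].append(steps)
    return {team: round(mean(v)) for team, v in by_team.items() if len(v) >= k}

print(team_averages(readings))  # the two-person "support" team is suppressed
```

The employer sees a broad pattern (a team’s average activity) but never an individual’s numbers, and groups too small to provide anonymity are withheld entirely, which is the kind of compulsory aggregation the paragraph above has in mind.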

This, however, does not address concerns about the increasing commodification of human labour that comes with the use of wearable technology, or any potential threats to society. To prevent this, it is perhaps necessary to consider an outright ban on its use by employers.

Ivan Manokha, Departmental Lecturer in International Political Economy, University of Oxford

This article was originally published on The Conversation. Read the original article.

How the UK passed the most invasive surveillance law in democratic history


In this guest post, Paul Bernal, Lecturer in Information Technology, Intellectual Property and Media Law at the University of East Anglia, reflects on the passage of the Investigatory Powers Bill. The legislation was recently passed in Parliament and given Royal Assent on 29 November 2016.

You might not have noticed thanks to world events, but the UK parliament recently approved the government’s so-called Snooper’s Charter and it has now become law. This nickname for the Investigatory Powers Bill is well earned. It represents a new level and nature of surveillance that goes beyond anything previously set out in law in a democratic society. It is not a modernisation of existing law, but something qualitatively different, something that intrudes upon every UK citizen’s life in a way that would have been inconceivable even a decade ago.

Continue reading

Information Law and Policy Centre’s annual workshop highlights new challenges in balancing competing human rights


Our annual workshop and lecture – held earlier this month – brought together a wide range of legal academics, lawyers, policy-makers and interested parties to discuss the future of human rights and digital information control.

A number of key themes emerged in our panel sessions including the tensions present in balancing Article 8 and Article 10 rights; the new algorithmic and informational power of commercial actors; the challenges for law enforcement; the liability of online intermediaries; and future technological developments.

The following write-up offers a very brief summary of each panel and of Rosemary Jay’s evening lecture.

Morning Session

Panel A: Social media, online privacy and shaming

Helen James and Emma Nottingham (University of Winchester) began the panel by presenting their research (with Marion Oswald) into the legal and ethical issues raised by the depiction of young children in broadcast TV programmes such as The Secret Life of 4, 5 and 6 Year Olds. They were also concerned with the live-tweeting which accompanied these programmes, noting that very abusive tweets could be directed towards children taking part in the programmes.

Continue reading

‘Tracking People’ research network established

A new research network has been established to investigate the legal, ethical, social and technical issues which arise from the use of wearable, non-removable tagging and tracking devices.

According to the network’s website, tracking devices are increasingly being used to monitor a range of individuals including “offenders, mental health patients, dementia patients, young people in care, immigrants and suspected terrorists”.

The interdisciplinary network is being hosted at the University of Leeds and aims to foster “new empirical, conceptual, theoretical and practical insights into the use of tracking devices”.

The network is being coordinated by Professor Anthea Hucklesby and Dr Kevin MacNish. It will bring together academics, designers, policy-makers and practitioners to explore critical issues such as:

  • privacy;
  • ethics;
  • data protection;
  • efficiency and effectiveness;
  • the efficacy and suitability of the equipment design;
  • the involvement of the private sector as providers and operators;
  • the potential for discriminatory use.

Readers of the Information Law and Policy Centre blog might be particularly interested in a seminar event scheduled for April 2017 which will consider the “legal and ethical issues arising from actual and potential uses of tracking devices across a range of contexts”.

For further information, check out the network’s website or email the team to join the network.

Full Programme: Annual Workshop and Evening Lecture

Restricted and Redacted: Where now for human rights and digital information control?

The full programme for the Information Law and Policy Centre’s annual workshop and lecture on Wednesday 9th November 2016 is now available (see below).

For both events, attendance will be free of charge thanks to the support of the IALS and our sponsor, Bloomsbury’s Communications Law journal.

To register for the afternoon workshop please visit this Eventbrite page.
To register for the evening lecture please visit this Eventbrite page.

Please note that for administrative purposes you will need to book separate tickets for the afternoon and evening events if you would like to come to both events.

PROGRAMME

10.45am: REGISTRATION AND COFFEE 

11.15am: Welcome

  • Judith Townend, University of Sussex
  • Paul Wragg, University of Leeds
  • Julian Harris, Institute of Advanced Legal Studies, University of London

11.30am-1pm: PANEL 1 – choice between A and B

Panel A: Social media, online privacy and shaming

Chair: Asma Vranaki, Queen Mary University of London

  1. David Mangan, City, University of London, Dissecting Social Media: Audience and Authorship
  2. Marion Oswald, Helen James, Emma Nottingham, University of Winchester, The not-so-secret life of five year olds: Legal and ethical issues relating to disclosure of information and the depiction of children on broadcast and social media
  3. Maria Run Bjarnadottir, Ministry of the Interior in Iceland, University of Sussex, Does the internet limit human rights protection? The case of revenge porn
  4. Tara Beattie, University of Durham, Censoring online sexuality – A non-heteronormative, feminist perspective

Panel B: Access to Information and protecting the public interest

Chair: Judith Townend, University of Sussex

  1. Ellen P. Goodman, Rutgers University, Obstacles to Using Freedom of Information Laws to Unpack Public/Private Deployments of Algorithmic Reasoning in the Public Sphere
  2. Felipe Romero-Moreno, University of Hertfordshire, ‘Notice and staydown’, the use of content identification and filtering technology posing a fundamental threat to human rights
  3. Vigjilenca Abazi, Maastricht University, Mapping Whistleblowing Protection in Europe: Information Flows in the Public Interest

1-2pm: LUNCH 

2-3.30pm: PANEL 2 – choice between A and B

Panel A: Data protection and surveillance

Chair: Nora Ni Loideain, University of Cambridge

  1. Jiahong Chen, University of Edinburgh, How the Best Laid Plans Go Awry: The (Unsolved) Issues of Applicable Law in the General Data Protection Regulation
  2. Jessica Cruzatti-Flavius, University of Massachusetts, The Human Hard Drive: Name Erasure and the Rebranding of Human Beings
  3. Wenlong Li, University of Edinburgh, Right to Data Portability (RDP)
  4. Ewan Sutherland, Wits University, Wire-tapping in the regulatory state – changing times, changing mores

Panel B: Technology, power and governance

Chair: Chris Marsden, University of Sussex

  1. Monica Horten, London School of Economics, How Internet structures create closure for freedom of expression – an exploration of human rights online in the context of structural power theory
  2. Perry Keller, King’s College, London, Bringing algorithmic governance to the smart city
  3. Marion Oswald, University of Winchester and Jamie Grace, Sheffield Hallam University, Intelligence, policing and the use of algorithmic analysis – initial conclusions from a survey of UK police forces using freedom of information requests as a research methodology
  4. Allison Holmes, Kent University, Private Actor or Public Authority? How the Status of Communications Service Providers affects Human Rights

3.30-5pm: PANEL 3 – choice between A and B

Panel A: Intermediary Liability

Chair: Christina Angelopoulos, University of Cambridge

  1. Judit Bayer, Miskolc University, Freedom and Diversity on the Internet: Liability of Intermediaries for Third Party Content
  2. Mélanie Dulong de Rosnay, Félix Tréguer, CNRS-Sorbonne Institute for Communication Sciences and Federica Giovanella, University of Trento, Intermediary Liability and Community Wireless Networks Design Shaping
  3. David Rolph, University of Sydney, Liability of Search Engines for Publication of Defamatory Matter: An Australian Perspective

Panel B: Privacy and anonymity online

Chair: Paul Wragg, University of Leeds

  1. Gavin Phillipson, University of Durham, Threesome injuncted: has the Supreme Court turned the tide against the media in online privacy cases?
  2. Fiona Brimblecombe, University of Durham, European Privacy Law
  3. James Griffin, University of Exeter and Annika Jones, University of Durham, The future of privacy in a world of 3D printing

5-6pm: TEA BREAK / STRETCH YOUR LEGS

6-8pm: EVENING LECTURE AND DRINKS

Lecture Title: Heads and shoulders, knees and toes (and eyes and ears and mouth and nose…): The impact of the General Data Protection Regulation on use of biometrics.

Biometrics are touted as one of the next big things in the connected world. Specific reference to biometrics and genetic data has been included for the first time in the General Data Protection Regulation. How does this affect existing provisions? Will the impact of the Regulation be to encourage or to restrict the development of biometric technology?

  • Speaker: Rosemary Jay, Senior Consultant Attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice.
  • Chair: Professor Lorna Woods, University of Essex
  • Respondents: Professor Andrea Matwyshyn, Northeastern University and Mr James Michael, IALS

Information Law and Policy Centre Annual Lecture and Workshop

An afternoon workshop and evening lecture to be given by leading information and data protection lawyer Rosemary Jay.

Restricted and Redacted: Where now for human rights and digital information control?

The Information Law and Policy Centre is delighted to announce that bookings are now open for its annual workshop and lecture on Wednesday 9th November 2016, this year supported by Bloomsbury’s Communications Law journal.

For both events, attendance will be free of charge thanks to the support of the IALS and our sponsor, although registration will be required as places are limited.

To register for the afternoon workshop please visit this Eventbrite page.

To register for the evening lecture please visit this Eventbrite page.

Please note that for administrative purposes you will need to book separate tickets for the afternoon and evening events if you would like to come to both events.

AFTERNOON WORKSHOP/SEMINAR 
11am – 5pm (lunch and refreshments provided)

For the afternoon part of this event we have an excellent set of presentations lined up that consider information law and policy in the context of human rights. Speakers will offer an original perspective on the way in which information and data interact with legal rights and principles relating to free expression, privacy, data protection, reputation, copyright, national security, anti-discrimination and open justice.

We will be considering topics such as internet intermediary liability, investigatory and surveillance powers, media regulation, freedom of information, the EU General Data Protection Regulation, whistleblower protection, and ‘anti-extremism’ policy. The full programme will be released in October.

EVENING LECTURE BY ROSEMARY JAY, HUNTON & WILLIAMS
6pm-7.30pm (followed by reception)

The afternoon workshop will be followed by a keynote lecture given by Rosemary Jay, senior consultant attorney at Hunton & Williams and author of Sweet & Maxwell’s Data Protection Law & Practice.

Continue reading

Data Retention and the Automated Number Plate Recognition (ANPR) System: A Gap in the Oversight Regime


The Advocate General’s Opinion in the recent Watson/Tele2 case re-emphasises the importance of considered justification for the collection and storage of personal data which has implications for a variety of data retention regimes. In this post, Lorna Woods, Professor of Internet Law at the University of Essex, considers the legal position of the system used to capture and store vehicle number plates in the UK.

The Data Retention Landscape

Since the annulment of the Data Retention Directive (Directive 2006/24/EC) in Digital Rights Ireland (Case C-293/12), it has become clear that the mass retention of data – even for the prevention of terrorism and serious crime – needs to be carefully justified. Cases such as Schrems (Case C-362/14) and Watson/Tele2 (Case C-698/15) re-emphasise this approach. This trend can also be seen in the case law of the European Court of Human Rights, such as Zakharov v. Russia (47143/06) and Szabo v. Hungary (11327/14 and 11613/14).

Not only must there be a legitimate public interest in the interference with individuals’ privacy and data protection rights, but that interference must be necessary and proportionate. Mechanisms must exist to ensure that surveillance systems are not abused: oversight and mechanisms for ex ante challenge must be provided. It is this recognition that seems to be part of the motivation for the Investigatory Powers Bill currently before Parliament, which deals – in the main – with interception and surveillance of electronic communications.

Yet this concern is not limited to electronic communications data, as the case concerning passenger name record (PNR) data currently before the Court of Justice (Opinion 1/15) and other ECtHR judgments on biometric data retention (S and Marper v. UK (30562/04 and 30566/04)) illustrate. Despite the UK government’s response to this jurisprudence, one area seems to have been overlooked, at least with regard to a full oversight regime: automated number plate recognition (ANPR) and the retention of the associated data.

Continue reading