Category Archives: Privacy

Two Research Associate Posts at Hertie School of Governance

For the project ‘Evolving Internet Interfaces: Content Control and Privacy Protection’ within the Deutsche Forschungsgemeinschaft (German Research Foundation) research group on ‘Overlapping Spheres of Authority and Interface Conflicts in the Global Order’ (www.osaic.eu), the Hertie School is looking to hire:

2 Research Associates (m/f)
26 hours/week

The contract duration is 36 months. The envisaged start date is 1 June 2017. Salary is in accordance with TV-L Berlin.

CFP: Bytes, Bodies and Souls: Interrogating Human Digitalisation

Kent Law School, in conjunction with the Eastern Academic Research Consortium, invites early career academics and postgraduate research students to participate in the “Bytes, Bodies and Souls: Interrogating Human Digitalisation” workshop to be held on 30th May, 2017.

The workshop aims to bring together researchers across the social sciences, humanities, sciences and other relevant disciplines who are interested in examining the consequences, possibilities, and limitations of human digitalisation.

Papers and Posters are welcomed on any aspect of the conference theme. This may include, although is not restricted to:

  • Big Data and its challenges
  • The role and impact of the Internet of Things
  • Digital ownership and appropriation processes
  • Privacy, surveillance, and control
  • The role of algorithms in the governance of human digitalisation
  • Politics of digital humans from cyber activism to post-truth
  • Digital human aesthetics; the forging of a digital soul

Abstracts for papers are invited for consideration. Abstracts should be no more than 300 words in length. Successful applicants will be allocated 15 minutes for the presentation of their paper plus time for questions and discussion.

Abstracts for posters are invited for consideration. Abstracts should be no more than 300 words in length. Accepted poster presenters will need to deliver the hard copy of their poster to the venue no later than 9 am on the day of the workshop to allow it to be displayed throughout the day.

Submissions should be sent in a Word document format to a.m.holmes@kent.ac.uk. Please include name, title, institution, and email correspondence address and whether you wish to be considered for a paper or poster presentation. The deadline for submission is Friday 3rd March 2017. Successful applicants will be notified by the 19th March 2017.

Your next social network could pay you for posting

In this guest post, Jelena Dzakula from the London School of Economics and Political Science considers what blockchain technology might mean for the future of social networking. 

You may well have found this article through Facebook. An algorithm programmed by one of the world’s biggest companies now partially controls what news reaches 1.8 billion people. And this algorithm has come under attack for censorship, political bias and for creating bubbles that prevent people from encountering ideas they don’t already agree with.

Now a new kind of social network is emerging that has no centralised control like Facebook does. It’s based on blockchain, the technology behind Bitcoin and other cryptocurrencies, and promises a more democratic and secure way to share content. But a closer look at how these networks operate suggests they could be far less empowering than they first appear.

Blockchain has received an enormous amount of hype thanks to its use in online-only cryptocurrencies. It is essentially a ledger or a database where information is stored in “blocks” that are linked historically to form a chain, saved on every computer that uses it. What is revolutionary about it is that this ledger is built using cryptography by a network of users rather than a central authority such as a bank or government.

Every computer in the network has access to all the blocks and the information they contain, making the blockchain system more transparent, accurate and also robust, since it does not have a single point of failure. The absence of a central authority controlling blockchain means it can be used to create more democratic organisations owned and controlled by their users. Very importantly, it also enables the use of smart contracts for payments. These are pieces of code that automatically execute the terms of a contract.
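The “blocks linked historically to form a chain” idea can be illustrated with a short sketch. This is a toy model, not Bitcoin’s actual implementation (it omits mining, consensus and networking entirely): each block stores the hash of its predecessor, so altering any historical block invalidates every later link, which is what makes the ledger tamper-evident.

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str
    prev_hash: str

    def hash(self) -> str:
        # Hash the block's full contents, including the link to its predecessor.
        payload = json.dumps([self.index, self.data, self.prev_hash])
        return hashlib.sha256(payload.encode()).hexdigest()

def make_chain(entries):
    """Build a toy ledger: each block commits to the hash of the one before it."""
    chain = []
    prev = "0" * 64  # a genesis block conventionally points at an all-zero hash
    for i, data in enumerate(entries):
        block = Block(i, data, prev)
        chain.append(block)
        prev = block.hash()
    return chain

def is_valid(chain) -> bool:
    """Every block must reference the actual current hash of its predecessor."""
    return all(chain[i].prev_hash == chain[i - 1].hash()
               for i in range(1, len(chain)))

chain = make_chain(["alice pays bob 5", "bob pays carol 2"])
assert is_valid(chain)

# Tampering with an earlier block breaks every subsequent link.
chain[0].data = "alice pays bob 500"
assert not is_valid(chain)
```

Because every participant holds a copy of the chain and can re-run this validity check, a tampered copy is immediately distinguishable from the honest ones — no bank or government needs to vouch for the ledger.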

Industry and governments are developing other uses for blockchain aside from digital currencies, from streamlining back office functions to managing health data. One of the most recent ideas is to use blockchain to create alternative social networks that avoid many of the problems the likes of Facebook are sometimes criticised for, such as censorship, privacy violations, manipulation of what content users see and exploitation of those users.


Case Preview: PNM v Times Newspapers, Open justice and the privacy of suspects – Hugh Tomlinson QC

In this guest post, Hugh Tomlinson QC previews an appeal to the Supreme Court in a case that considers where the balance lies between rights to privacy and the principle of open justice. The post was first published on the Inforrm blog

On 17 and 18 January 2017, a seven-judge Supreme Court will hear the claimant’s appeal against the decision of the Court of Appeal in the case of PNM v Times Newspapers ([2014] EWCA Civ 1132).

That Court had upheld the judge’s view that, on the basis of the “open justice principle”, information mentioned in open court concerning a person who was arrested but not charged could be reported.  

Background

The claimant was one of a number of men arrested in March 2012 in connection with a Thames Valley Police investigation into allegations of child sex grooming and prostitution.  The claimant was released on bail and was subsequently notified that he was to be released without charge.

Nine men were charged and a criminal trial took place at the Central Criminal Court between January and May 2013.  The claimant was not a party or witness at the criminal trial.  On 25 January 2013 an order under section 4(2) of the Contempt of Court Act 1981 was made prohibiting publication of any report which referred to evidence which might identify or tend to identify him.

On 14 May 2013, seven of the defendants were convicted of numerous serious sexual offences.  A further order under section 4(2) of the Contempt of Court Act 1981 was made on the claimant’s application.  It prohibited disclosure of details of applications made to the Court by Thames Valley Police (which concerned certain of the claimant’s property).

The claimant’s full name was mentioned in open court when a police officer said that a witness had failed to pick him out on an identification parade.  He was also mentioned in the course of cross-examination, in speeches and in the summing up.

At the conclusion of the criminal trial the Judge declined to discharge the section 4(2) order until a decision was made as to whether the claimant would be charged.  In July 2013 the police notified the claimant that he was not going to be charged.  The Times and the Oxford Mail applied to discharge the section 4(2) order but, before the Judge had handed down his ruling, the claimant applied to the High Court for an injunction.

By an application made on 15 October 2013 against the Times, the Oxford Mail and two journalists, the claimant sought an order to prevent publication of the fact of his arrest on suspicion of committing serious sexual offences against children and associated information because of the fear of the damage such publications may cause to him and his family, including his children.


Why the rise of wearable tech to monitor employees is worrying


In this guest post, Ivan Manokha, Departmental Lecturer in International Political Economy at the University of Oxford, considers the use of wearable technology in the workplace and the potential privacy implications of collecting the data of employees. 

An increasing number of companies are beginning to digitally monitor their employees. While employers have always scrutinised their workers’ performance, the rise of wearable technology to keep tabs has more of a dystopian edge to it. Monitoring has become easier, more intrusive and is not just limited to the workplace – it’s 24/7.

Devices such as Fitbit, Nike+ FuelBand and Jawbone UP, which can record information related to health, fitness, sleep quality, fatigue levels and location, are now being used by employers who integrate wearable devices into employee wellness programmes.

One of the first was BP America, which introduced Fitbit bracelets in 2013. In 2015, at least 24,500 of BP’s employees were using them, and more and more US employers have followed suit. For instance, the same year, Vista Staffing Solutions, a healthcare recruitment agency, started a weight-loss programme using Fitbits and wifi-enabled bathroom scales. Appirio, a consulting company, started handing out Fitbits to employees in 2014.

In the UK similar projects are under consideration by major employers. And this trend will only intensify in the years to come. By 2018, estimates suggest that more than 13m of these devices will be part of worker wellness schemes. Some analysts say that by the same year, at least 2m employees worldwide will be required to wear health-and-fitness trackers as a condition of employment.

According to some, this is a positive development. Chris Brauer, an academic at Goldsmiths, University of London, argues that corporate managers will now be comparable to football managers. They will be equipped with a dashboard of employee performance trajectories, as well as their fatigue and sleep levels. They will be able to pick only the fittest employees for important business meetings, presentations, or negotiations.

It seems, however, that such optimism overlooks important negative and potentially dangerous social consequences of using this kind of technology. History here offers a word of warning.

Historical precedent

The monitoring of workers’ health outside the workplace was once attempted by the Ford Motor Company. When Ford introduced a moving assembly line in 1913 – a revolutionary innovation that enabled complete control over the pace of work – the increase in productivity was dramatic. But so was the rise in worker turnover. In 1913, every time the company wanted to add 100 men to its factory personnel, it was necessary to hire 963, as workers struggled to keep up with the pace and left shortly after being recruited.

Ford’s solution to this problem was to double wages. In 1914, the introduction of a US$5 a day wage was announced, which immediately led to a decline in worker turnover. But high wages came with a condition: the adoption of healthy and moral lifestyles.

The company set up a sociology department to monitor workers’ – and their families’ – compliance with its standards. Investigators would make unannounced calls upon employees and their neighbours to gather information on living conditions and lifestyles. Those who were deemed insufficiently healthy or morally upright were immediately disqualified from the US$5 wage level.

Analysing Ford’s policies, Italian political philosopher and revolutionary Antonio Gramsci coined the term “Fordism” for this social phenomenon. It signalled fundamental changes to labour, which became much more intense after automation. Monitoring workers’ private lives to control their health, Gramsci argued, was necessary to preserve “a certain psycho-physical equilibrium which prevents the physiological collapse of the worker, exhausted by the new method of production”.

Parallels today

Today, we are faced with another great change to how work is done. To begin with, the “great doubling” of the global labour force has led to the increase in competition between workers around the world. This has resulted in a deterioration of working and employment conditions, the growth of informal and precarious labour, and the intensification of exploitation in the West.

So there has been a significant increase in the average number of hours worked and an increase in the intensity of labour. For example, research carried out by the Trades Union Congress in 2015 discovered that the number of people working more than 48 hours in a week in the UK was rising, and it warned of a risk of “burnout Britain”.

Indeed, employee burnouts have become a major concern of employers. A UK survey of human resources directors carried out in 2015 established that 80% were afraid of losing top employees to burnout.

Ford’s sociology department was shut down in the early 1920s for two reasons. It became too costly to maintain in the context of increasing competition from other car manufacturers, and employee resistance to home visits by inspectors, which were increasingly seen as too intrusive into workers’ private lives, was growing.

Wearable technology, however, does not suffer from these inconveniences. It is not costly and it is much less obviously intrusive than surprise home visits by company inspectors. Employee resistance appears to be low, though there have been a few attempts to fake the results of the tracking (for example, workers strapping their employer-provided Fitbits onto their dogs to boost their “activity levels”). The idea of being tracked has mostly gone unchallenged.

Labour commodified to the extreme

But the use of wearable technology by employers raises a range of concerns. The most obvious is the right to privacy. The use of wearable technology goes significantly further than computer systems where emails are already logged and accessible to employers.

Surveillance becomes continuous and all-encompassing, increasingly unconfined to the workplace, and also constitutes a form of surveillance which penetrates the human body. The right to equal employment opportunities and promotion may also be compromised if employers reserve promotion for those who are in a better physical shape or suffer less from fatigue or stress.

It may also be argued that the use of wearable technology takes what the Hungarian economic historian Karl Polanyi called the “commodification” of human labour to an extreme. Monitoring worker health both inside and outside the workplace involves the treatment of people as machines whose performance is to be maximised at all costs. However, as Polanyi warned, human labour is a “fictitious commodity” – it is not “produced” for sale to capital as a mere tool. To treat it as such risks ultimately leading to a “demolition of society”.

To protect individual rights, systems have been introduced to regulate how the data gathered on employees is stored and used. One possible solution is to make the data collected by trackers anonymous by default. For example, Sociometric Solutions, one company that collects and monitors employee data on behalf of employers, only charts broader patterns and connections to productivity, rather than individual performance.

This, however, does not address concerns about the increasing commodification of human labour that comes with the use of wearable technology and any potential threats to society. To prevent this, it is perhaps necessary to consider imposing an outright ban on its use by employers altogether.

Ivan Manokha, Departmental Lecturer in International Political Economy, University of Oxford

This article was originally published on The Conversation. Read the original article.

Call for Papers: Deadline 27/1: 4th Winchester Conference on Trust, Risk, Information and the Law

Date: Wednesday 3 May 2017
Venue: West Downs Campus, University of Winchester, Hampshire, UK
Book Online at University of Winchester Events

The Fourth Interdisciplinary Winchester Conference on Trust, Risk, Information and the Law (#TRILCon17) will be held on Wednesday 3 May 2017 at the West Downs Campus, University of Winchester, UK.  The overall theme for this conference will be:

Artificial and De-Personalised Decision-Making: Machine-Learning, A.I. and Drones

The keynote speakers will be Professor Katie Atkinson, Head of Computer Science, University of Liverpool, an expert in Artificial Intelligence and its application to legal reasoning, and John McNamara, IBM Senior Inventor, who will speak on ‘Protecting trust in a world disrupted by machine learning’.

Papers and Posters are welcomed on any aspect of the conference theme.  This might include although is not restricted to:

  • Machine learning and processing of personal information;
  • Artificial intelligence and its application to law enforcement, legal reasoning or judicial decisions;
  • Big Data and the algorithmic analysis of information;
  • Implications of the Internet of Things;
  • Machine based decision-making and fairness;
  • Drone law and policy;
  • Trust and the machine;
  • Risks of removing the human from – or leaving the human in – the process;
  • Responsibility, accountability and liability for machine-made decisions.

The conference offers a best poster prize judged against the following criteria: 1) quality, relevance and potential impact of the research presented; 2) visual impact; 3) effectiveness of the poster as a way of communicating the research.

Proposals for workshops are also welcome.  Workshops offer organisers the opportunity to curate panels or research/scholarship activities on an aspect of the conference theme in order to facilitate interdisciplinary discussion.

This call for papers/posters/workshops is open to academics, postgraduate students, policy-makers and practitioners, and in particular those working in law, computer science & technology, data science, information rights, privacy, compliance, statistics, probability, law enforcement & justice, behavioural science and health and social care.

Abstracts for papers are invited for consideration.  Abstracts should be no more than 300 words in length.  Successful applicants will be allocated 15-20 minutes for presentation of their paper plus time for questions and discussion.

Abstracts for posters are invited for consideration.  Abstracts should be no more than 300 words in length.  Please note that accepted poster presenters will be required to email an electronic copy of their poster no later than a week before the conference.  Accepted poster presenters will also need to deliver the hard copy of their poster to the venue no later than 9am on the date of the conference to enable it to be displayed during the day.

Workshop proposals should summarise the workshop theme and goals, organising committee and schedule of speakers, panels and/or talks.  Proposals should be no more than 500 words.  Workshops should be timed to be 1.5-2 hours in length.

Abstracts and proposals, contained in a Word document, should be emailed to trilcon17@winchester.ac.uk.  Please include name, title, institution/organisation details and email correspondence address.  The deadline for submission of abstracts/proposals is Friday 27 January 2017.  Successful applicants will be notified by 17 February 2017.  Speakers/poster presenters/workshop organisers will be entitled to the early registration discounted conference fee of £80 and will be required to book a place at the conference by 28 February in order to guarantee inclusion of their paper/poster/workshop.

Speakers will be invited to submit their paper for inclusion in a special edition of the open access eJournal, Information Rights, Policy & Practice.

To book a place at the conference, please click here to visit the Winchester University Store and click on academic conferences.

For more information, please contact the conference team at trilcon17@winchester.ac.uk

How the UK passed the most invasive surveillance law in democratic history


In this guest post, Paul Bernal, Lecturer in Information Technology, Intellectual Property and Media Law at the University of East Anglia, reflects on the passage of the Investigatory Powers Bill. The legislation was recently passed in Parliament and given Royal Assent on 29 November 2016.

You might not have noticed thanks to world events, but the UK parliament recently approved the government’s so-called Snooper’s Charter and it has now become law. This nickname for the Investigatory Powers Bill is well earned. It represents a new level and nature of surveillance that goes beyond anything previously set out in law in a democratic society. It is not a modernisation of existing law, but something qualitatively different, something that intrudes upon every UK citizen’s life in a way that would have been inconceivable even a decade ago.

Information Law and Policy Centre’s annual workshop highlights new challenges in balancing competing human rights


Our annual workshop and lecture – held earlier this month – brought together a wide range of legal academics, lawyers, policy-makers and interested parties to discuss the future of human rights and digital information control.

A number of key themes emerged in our panel sessions including the tensions present in balancing Article 8 and Article 10 rights; the new algorithmic and informational power of commercial actors; the challenges for law enforcement; the liability of online intermediaries; and future technological developments.

The following write up of the event offers a very brief summary report of each panel and of Rosemary Jay’s evening lecture.

Morning Session

Panel A: Social media, online privacy and shaming

Helen James and Emma Nottingham (University of Winchester) began the panel by presenting their research (with Marion Oswald) into the legal and ethical issues raised by the depiction of young children in broadcast TV programmes such as The Secret Life of 4, 5 and 6 Year Olds. They were also concerned with the live-tweeting which accompanied these programmes, noting that very abusive tweets could be directed towards children taking part in the programmes.


Open letter in the Daily Telegraph: Concerns with ‘information sharing’ provisions in the Digital Economy Bill

Associate research fellow at the Information Law and Policy Centre and lecturer in media and information law at the University of Sussex, Dr Judith Townend, is among the signatories of this letter published on the letters page of the Telegraph on 25/11/2016 [subscription required].

SIR – We wish to highlight concerns with “information sharing” provisions in the Digital Economy Bill.

The Bill puts government ministers in control of citizens’ personal data, a significant change in the relationship between citizen and state. It means that personal data provided to one part of government can be shared with other parts of government and private‑sector companies without citizens’ knowledge or consent.

Government should be strengthening, not weakening, the protection of sensitive information, particularly given the almost daily reports of hacks and leaks of personal data. Legal and technical safeguards need to be embedded within the Bill to ensure citizens’ trust. There must be clear guidance for officials, and mechanisms by which they and the organisations with whom they share information can be held to account.

The Government’s intention is to improve the wellbeing of citizens, and to prevent fraud. This makes it especially important that sensitive personal details, such as income or disability, cannot be misappropriated or misused – finding their way into the hands of payday-loan companies, for example. Information sharing could exacerbate the difficulties faced by the most vulnerable in society.

The Government should be an exemplar in ensuring the security and protection of citizens’ personal data. If the necessary technical and legal safeguards cannot be embedded in the current Bill and codes of practice, we respectfully urge the Government to remove its personal data sharing proposals in their entirety.

Dr Jerry Fishenden
Co-Chairman, Cabinet Office Privacy and Consumer Advisory Group (PCAG)

Renate Samson
Chief Executive, Big Brother Watch

Ian Taylor
Director, Association of British Drivers

Jo Glanville
Director, English PEN

Jodie Ginsberg
Chief Executive Officer, Index on Censorship

Dr Edgar Whitley
Co-Chairman, Cabinet Office PCAG and London School of Economics and Political Science

David Evans
Director of Policy, BCS – The Chartered Institute for IT

Dr Gus Hosein
Executive Director, Privacy International and Member of Cabinet Office PCAG

Rachel Coldicutt
Chief Executive Officer, Doteveryone

Roger Darlington
Chairman, Consumer Forum for Communications

Dr Kieron O’Hara
Associate Professor, Electronics and Computer Science, University of Southampton

Professor Angela Sasse
Head of Information Security Research, University College London and Member of Cabinet Office PCAG

Dr Judith Townend
Lecturer in Media and Information Law, University of Sussex

Dr Louise Bennett
Chairman, BCS Security Group and Member of Cabinet Office PCAG

StJohn Deakins
Chief Executive Officer, CitizenMe

Rory Broomfield
Director, The Freedom Association

Sarah Gold
Director and Founder, Projects by IF

Jim Killock
Director, Open Rights Group

Guy Herbert
General Secretary, NO2ID and Member of Cabinet Office PCAG

Dr George Danezis
Professor of Security and Privacy Engineering, University College London and Member of Cabinet Office PCAG

Jamie Grace
Senior Lecturer in Law, Sheffield Hallam University

Eric King
Visiting Professor, Queen Mary University

Josie Appleton
Director, Manifesto Club

Jen Persson
Co-ordinator, Defend Digital Me

Dr Chris Pounder
Director, Amberhawk and Member of Cabinet Office PCAG

Sam Smith
medConfidential and Member of Cabinet Office PCAG

‘Tracking People’ research network established

A new research network has been established to investigate the legal, ethical, social and technical issues which arise from the use of wearable, non-removable tagging and tracking devices.

According to the network’s website, tracking devices are increasingly being used to monitor a range of individuals including “offenders, mental health patients, dementia patients, young people in care, immigrants and suspected terrorists”.

The interdisciplinary network is being hosted at the University of Leeds and aims to foster “new empirical, conceptual, theoretical and practical insights into the use of tracking devices”.

The network is being coordinated by Professor Anthea Hucklesby and Dr Kevin MacNish. It will bring together academics, designers, policy-makers and practitioners to explore critical issues such as:

  • privacy;
  • ethics;
  • data protection;
  • efficiency and effectiveness;
  • the efficacy and suitability of the equipment design;
  • the involvement of the private sector as providers and operators;
  • the potential for discriminatory use.

Readers of the Information Law and Policy Centre blog might be particularly interested in a seminar event scheduled for April 2017 which will consider the “legal and ethical issues arising from actual and potential uses of tracking devices across a range of contexts”.

For further information, check out the network’s website or email the team to join the network.