
Conference: The Power Switch; How Power is Changing in a Networked World

Cripps Court auditorium, Magdalene College, 1-3 Chesterton Road, Cambridge
31 March 2017, 09:00 – 18:15

Registration for this conference is now open; please register here.
The full programme is available here.
Full fee £20 (includes refreshments and lunch)
Student and ECF fee £10 (includes refreshments and lunch)

Recent decades have seen the rise of a number of large US technology companies – Alphabet (Google’s holding company), Facebook, Microsoft, Amazon and Apple – which have achieved global dominance in their original fields (digital technology) and are now moving into other markets (healthcare, mobility, hotels, media, to name just four).

The scale and reach—as well as the wealth—of these corporations revives old concerns about corporate power and its regulation (for example in relation to monopoly and data protection). But their dominance also raises new questions deriving from the distinctive affordances of digital technology and the companies’ mastery of it.

In what ways is the power that they wield different from older kinds of corporate power? How should the power flowing from mastery of the technology be conceptualised? What kinds of regulatory approaches are viable in this new environment? Where does corporate responsibility begin and end in applications of Artificial Intelligence? And can the nation-state effectively regulate these new global entities?

This symposium, which is hosted by the Technology and Democracy project at CRASSH, will consider these and related issues.

Your next social network could pay you for posting

In this guest post, Jelena Dzakula from the London School of Economics and Political Science considers what blockchain technology might mean for the future of social networking. 

You may well have found this article through Facebook. An algorithm programmed by one of the world’s biggest companies now partially controls what news reaches 1.8 billion people. And this algorithm has come under attack for censorship, political bias and for creating bubbles that prevent people from encountering ideas they don’t already agree with.

Now a new kind of social network is emerging that has no centralised control like Facebook does. It’s based on blockchain, the technology behind Bitcoin and other cryptocurrencies, and promises a more democratic and secure way to share content. But a closer look at how these networks operate suggests they could be far less empowering than they first appear.

Blockchain has received an enormous amount of hype thanks to its use in online-only cryptocurrencies. It is essentially a ledger or a database where information is stored in “blocks” that are linked historically to form a chain, saved on every computer that uses it. What is revolutionary about it is that this ledger is built using cryptography by a network of users rather than a central authority such as a bank or government.
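The hash-linking described above can be sketched in a few lines of Python. This is a toy illustration, not any real blockchain implementation: each block's identifier is a hash of its contents plus its predecessor's hash, so altering any earlier block breaks every link after it.

```python
import hashlib
import json

def block_hash(data, prev_hash):
    """Hash a block's contents together with its predecessor's hash."""
    payload = json.dumps({"data": data, "prev_hash": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

def make_block(data, prev_hash):
    return {"data": data, "prev_hash": prev_hash,
            "hash": block_hash(data, prev_hash)}

# A three-block chain starting from a fixed "genesis" hash.
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))

def is_valid(chain):
    """Recompute every hash: tampering with any block is detectable."""
    return all(b["hash"] == block_hash(b["data"], b["prev_hash"])
               and (i == 0 or b["prev_hash"] == chain[i - 1]["hash"])
               for i, b in enumerate(chain))

print(is_valid(chain))   # True
chain[1]["data"] = "Alice pays Bob 500"
print(is_valid(chain))   # False: the tampered block no longer matches its hash
```

Because every computer in the network holds a copy of the chain and can run a check like `is_valid`, no single party can quietly rewrite history.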

Every computer in the network has access to all the blocks and the information they contain, making the blockchain system more transparent, accurate and also robust since it does not have a single point of failure. The absence of a central authority controlling blockchain means it can be used to create more democratic organisations owned and controlled by their users. Very importantly, it also enables the use of smart contracts for payments. These are codes that automatically implement and execute the terms of a legal contract.
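A smart contract in this sense is just code whose terms execute themselves when a condition is met. The sketch below is a hypothetical escrow contract in plain Python (no real blockchain platform is assumed): payment moves from payer to payee automatically once delivery is confirmed, with no bank or intermediary deciding whether to release it.

```python
# Hypothetical illustration of a smart contract as self-executing code:
# the "contract" releases an agreed payment once its condition is met.

class EscrowContract:
    def __init__(self, payer, payee, amount):
        self.payer, self.payee, self.amount = payer, payee, amount
        self.released = False  # ensures the payment happens at most once

    def confirm_delivery(self, balances):
        """Execute the contract's terms: move funds on confirmed delivery."""
        if not self.released and balances[self.payer] >= self.amount:
            balances[self.payer] -= self.amount
            balances[self.payee] += self.amount
            self.released = True
        return balances

balances = {"buyer": 100, "seller": 0}
contract = EscrowContract("buyer", "seller", 30)
contract.confirm_delivery(balances)
print(balances)  # {'buyer': 70, 'seller': 30}
```

On a real blockchain the contract code and the balances would live on the shared ledger, so every node could verify that the transfer followed the agreed terms.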

Industry and governments are developing other uses for blockchain aside from digital currencies, from streamlining back-office functions to managing health data. One of the most recent ideas is to use blockchain to create alternative social networks that avoid many of the problems the likes of Facebook are sometimes criticised for, such as censorship, privacy violations, manipulation of what content users see and exploitation of those users.


Social media and crime: the good, the bad and the ugly


Social media has revolutionised how we communicate. As part of a series for The Conversation, Alyce McGovern, UNSW Australia and Sanja Milivojevic, La Trobe University summarise how social media is affecting crime and criminal justice.  

The popularity of social media platforms such as Facebook, Twitter and Snapchat has transformed the way we understand and experience crime and victimisation.

Previously, it’s been thought that people form their opinions about crime from what they see or read in the media. But with social media taking over as our preferred news source, how do these new platforms impact our understanding of crime?

Social media has also created new concerns in relation to crime itself. Victimisation on social media platforms is not uncommon.

However, it is not all bad news. Social media has created new opportunities for criminal justice agencies to solve crimes, among other things.

Thus, like many other advancements in communication technology, social media has a good, a bad and an ugly side when it comes to its relationship with criminal justice and the law.

Lorna Woods: Safe Harbour – Key Aspects of the ECJ Ruling

On Tuesday (6 October) the Court of Justice of the European Union (ECJ) declared that the Safe Harbour agreement that allows the movement of digital data between the EU and the US was invalid. The case was brought by Max Schrems, an Austrian student and privacy campaigner who, in the wake of the Snowden revelations of mass surveillance, challenged the way in which technology companies such as Facebook transferred data to the US. In this guest post, which originally appeared on the LSE Media Policy Project blog, Professor Lorna Woods of the University of Essex explains some key aspects of the judgment.

This case arises from a challenge to the transfer of personal data from the EU (via Ireland) to the United States, which relied on Commission Decision 2000/520 stating that the Safe Harbour system in place in the United States was ‘adequate’ as permitted by Article 25 of the Data Protection Directive. While the national case challenged this assessment, the Irish data protection authority (DPA) took the view that it had no freedom to make any other decision – despite the fact that the Irish authorities and courts considered that the system did not meet the standards of the Irish constitution – because the European Commission decision was binding on it. The questions of the validity and status of the Decision were referred to the Court of Justice of the European Union (ECJ).

The Advocate General, a senior ECJ official who advises on cases, took the view that the Commission’s decision could not limit the powers of DPAs granted under the directive and that the US system was inadequate, particularly as regards the safeguards against mass surveillance (a more detailed review of the AG’s Opinion can be found here). The ECJ has now ruled, following very swiftly on from the Opinion. The headline: the Commission’s decision is invalid. There is more to the judgment than this.

Powers of DPAs and Competence

The ECJ emphasised that the Commission cannot limit the powers granted by the Data Protection Directive, but at the same time Commission decisions are binding and benefit from a presumption of legality. Nonetheless, especially given the importance of the rights at stake, individuals should be able to complain and ask a DPA to investigate. DPAs remain responsible for oversight of data processing on their territory, which includes the transfer of personal data outside the EU. The ECJ resolves this conundrum by distinguishing between the right and power of investigation and challenge to Commission decisions, and the declaration of such decisions’ invalidity. While the former remains with DPAs, the latter, following longstanding jurisprudence, remains with the ECJ.

Validity of Decision 2000/520

The ECJ noted that there is no definition of what is required by way of protection for the purposes of Article 25 of the Data Protection Directive. According to the ECJ, there were two aspects to be derived from the text of Article 25: the requirement that protection be ‘adequate’ in Article 25(1), and the fact that Article 25(6) refers to the requirement that protection must be ensured. The ECJ agreed with the Advocate General that this Article is ‘intended to ensure that the high level of that protection continues where personal data is transferred to a third country’ (para [72], citing the Advocate General’s Opinion at para [139]), which seems higher than ‘adequate’ might at first suggest. That requirement does not, however, mean that protection in third (non-EU) countries must be identical, but rather that it is equivalent (para [73]) and effective (para [74]). This implies an on-going assessment of the rules and their operation in practice, in which the Commission has very limited room for discretion.

The Court concluded that the Decision was unsound. It did so on the basis that mass surveillance is unacceptable, that there was no legal redress and that the Decision did not look at the effectiveness of enforcement. It steered clear of determining whether the self-certification system itself could ever be fit for purpose, basing its reasoning only on certain elements of the Commission’s decision (which were, however, so linked with the rest that their demise meant the entire decision fell).


This is a judgment with very far reaching implications, not just for governments but for companies the business model of which is based on data flows. It reiterates the significance of data protection as a human right, and underlines that protection must be at a high level. In this, the ECJ is building a consistent line of case law – and case law that deals not just with mass surveillance (Digital Rights Ireland) but activities by companies (Google Spain) and private individuals (Rynes).

At a practical level, what happens today with the Decision declared invalid? Going forward, will there be more challenges looking not just at mass surveillance but at big data businesses self-certifying? What will happen to uniformity in the EU? Different Member States may well take different views. This should also be understood against the Weltimmo judgment of last week, according to which more than one Member State could have the competence to regulate a multinational business (irrespective of where that business has its registered office in the EU). Finally, what does this mean for the negotiation of the Data Protection Regulation? The political institutions had agreed that the Regulation would not offer lower protection than the Data Protection Directive, but now we might have to examine this directive more closely.

Lorna Woods: Schrems v Data Protection Commissioner – The beginning of the end for safe harbour?

The Advocate General of the European Court of Justice has delivered his non-binding legal opinion in Schrems v. Data Protection Commissioner, a case brought by an Austrian citizen against the Irish Data Protection Commissioner concerning the transfer of Facebook data to US servers.  Professor Lorna Woods, University of Essex, reports and comments on the opinion – and its potential implications – in this guest post. 

Case C-362/14: Schrems v. Data Protection Commissioner

Opinion of the Advocate General


The Data Protection Directive imposes relatively high standards of data protection on those processing data in the EU. It also prohibits the transfer of data to non-EU countries unless an adequate level of protection for the processing of data is ensured there too. Under Article 25(6) of the Data Protection Directive, the Commission can determine that a third country ensures an adequate level of protection of personal data by reason of its domestic law or of the international commitments it has entered into. Should the Commission adopt a decision to that effect, transfer of personal data to the third country concerned would be permissible.

The Commission adopted Decision 2000/520 pursuant to that provision accepting that the ‘Safe Harbor’ system in the United States provided a satisfactory level of protection. It sets out certain principles but mainly operates on a basis of self-certification, although the US authorities may intervene.  A number of mechanisms, combining private dispute resolution and oversight by the public authorities, exist to check compliance with the ‘safe harbor’ principles.

Decision 2000/520 permits the limitation of these principles, ‘to the extent necessary to meet national security, public interest, or law enforcement requirements’ and ‘by statute, government regulation, or case law that create conflicting obligations or explicit authorisations, provided that, in exercising any such authorisation, an organisation can demonstrate that its non-compliance with the Principles is limited to the extent necessary to meet the overriding legitimate interests furthered by such authorisation’. The reference concerns the legitimacy of these arrangements in the light of the Data Protection Directive and the EU Charter of Fundamental Rights.

The case was originated by an Austrian national who had signed up to Facebook, run in Europe by Facebook Ireland. All data is, however, transferred to the US parent company. Following the Snowden revelations, Schrems challenged the level of protection in the USA against state surveillance, with particular reference to the PRISM programme, under which the NSA obtained unrestricted access to mass data stored on servers in the United States.

The Irish Data Protection Commissioner refused to investigate the complaint as according to the Irish statute, Decision 2000/520 of the Commission was final (s. 11(2)(a) Data Protection (Amendment) Act 2003). The decision was reviewed before the High Court which found that if the matter were to be determined solely by Irish law, s. 11(2)(a) would end the matter. It recognised, however, that implementation of EU law must be carried out in the light of the EU Charter. The High Court then referred questions to the Court of Justice as to whether the Data Protection Commissioner was absolutely bound by Decision 2000/520.

Competence of the Irish Data Protection Commission

The Data Protection Commissioner argued that its responsibility relates to the application of the Irish legislation in particular cases; conversely, the assessment of the adequacy of the US system overall is the responsibility of the European Commission. Section 11(2)(a) reflects this division and meant that the Irish Data Protection Commissioner could not act on Schrems’s complaint.

Given the important role of the national authorities in the overall system of protection (para 63), AG Bot concluded that the power conferred by the Data Protection Directive on the Commission does not affect the powers which the Directive has conferred on the national supervisory authorities, so a national regulator could investigate matters notwithstanding the Commission’s decision (para 61). Art 8(3) of the Charter, which occupies ‘the highest level of the hierarchy of rules in EU law’ (para 72), requires independence (see also Case C-288/12 Commission v. Hungary and Cases C-293/12 and C-594/12 Digital Rights Ireland), and it is precisely this quality that would be curtailed were national authorities unable to investigate a claim on its merits.

So, while the Commission plays an important role in ensuring uniformity of approach across EU Member States and its decision is binding, this cannot justify a summary dismissal of a complaint without looking into its merits (para 85). Uniformity achieved by virtue of a Commission decision, such as Decision 2000/520, ‘can continue only while that finding [of adequacy] is not called in question’ (para 89).

Here, not only has the Commission decision been criticised by others, but the Commission has also expressed its concerns and has entered into negotiation with a view to remedying the problem.

In reaching these conclusions, Bot – referring to earlier case law – emphasised that the orientation of the Directive is towards ensuring privacy. Further, the Directive must be understood in the light of the Charter; not only that, but Member States must ensure that they do not rely on interpretations of the Directive which would be inconsistent with the Charter rights (paras 99-100, relying on Case C-131/12 Google Spain and Cases C-411/10 and C-493/10 NS). Here, the existence of an irrebuttable presumption was inconsistent with the duty of Member States to interpret EU law in a manner consistent with the Charter (para 104).

Validity of Decision 2000/520

Bot then noted that it is within the scope of the court’s powers to question of its own motion the validity of an instrument which it has been asked to interpret (going back as far as Case 62/76 Strehl). The review would consider only those aspects of the safe harbour scheme that had been discussed – specifically the PRISM programme and the generalised surveillance of citizens by the NSA.

While the normal position is that a decision is assessed as at the time at which the decision was taken, the ECJ has recognised that circumstances might subsequently come to light which change that position. Bot suggested that this was one such case and that the review should be carried out by reference to the current legal and factual context.

The first issue is the determination of ‘adequate’. Bot argued that the purpose of the limitation on transfers was to ensure continuity of protection under the Data Protection Directive, which is described as a high level of protection. So while the means to ensure that level of protection might differ from the system in the EU, the level must be the same. Consequently, ‘the only criterion that must guide the interpretation of that word is the objective of attaining a high level of protection of fundamental rights…’ (para 142).

The Advocate General took two points as read: that the NSA engaged in surveillance; and that EU citizens had no mechanism for complaint in the USA. So,

‘the law and practice of the United States allow the large-scale collection of the personal data of citizens of the Union …without those citizens benefiting from effective judicial protection’ (para 158).

Specifically, the law enforcement derogations are too broadly worded and allow reliance on those derogations beyond what is strictly necessary. Such widespread access constitutes an interference with Art 8 EUCFR, a fact exacerbated by the secrecy surrounding these activities. While interferences can in principle be justified, here the Advocate General suggested that it was

‘extremely doubtful that the limitations at issue in the present case … [respect] the essence of Articles 7 and 8 of the Charter’ (para 177).

The exceptions are neither sufficiently precisely defined nor proportionate. The Advocate General referred back to Digital Rights Ireland to highlight that the legislature’s discretion in this context is limited because of the significance of the right to data protection, and that limits on the right must be restricted to what is strictly necessary. He highlighted the mass and indiscriminate nature of the surveillance carried out, which is ‘inherently disproportionate and constitutes an unwarranted interference’ (para 200).

It follows that third countries cannot be regarded as ensuring an adequate level of protection where such mass surveillance is permitted. Further, the safe harbour scheme – which relies on the FTC and private dispute resolution mechanisms – does not provide sufficient guarantees in terms of preventing abuse. It also permits discrimination between US and EU citizens in terms of access to protection. In addition, then, to the interference with Articles 7 and 8 EUCFR, there was no right to an effective remedy, in breach of Article 47 EUCFR.

The Advocate General concluded that:

  • a national regulatory authority is not precluded from investigating a complaint where there is a Commission decision such as Decision 2000/520; and
  • Decision 2000/520/EC is invalid.

This is the latest in a line of opinions and judgments which have emphasised the need to protect privacy and ensure data protection, and which have run contrary to the industry lobby approach of ‘we make money from it therefore it is legal’. If the Court of Justice follows this line of reasoning, this case will have very far-reaching consequences, not just for Facebook but for all US data companies relying on the safe harbour scheme or similar. Of course the court is not bound by the opinion of the Advocate General, but it should be noted that in data protection cases where the court has departed from the Advocate General (e.g. in Google Spain), it has been more, not less, concerned about data protection. Certainly, Digital Rights Ireland indicates the court is no fan of mass surveillance.

As regards the declaration of invalidity of Decision 2000/520, it should be noted that the decision is very much tied up with concerns about the activities of the NSA and the discriminatory treatment of EU citizens. That link between mass surveillance and inherent disproportionality does not automatically translate to other forms of data usage. It remains to be seen whether the “umbrella agreement” on data protection (see here) which has just been agreed between the EU and US (but which is still subject to European Parliament approval) will resolve these issues. One key point is the ending of the discrimination between US and EU citizens in terms of the rights to complain (via the adoption of the US Judicial Redress Bill).

Aside from this, there are some points which will affect any future decision as to adequacy:

  • The level of protection can no longer be viewed as ‘adequate’ in the ordinary English sense, but must be a continuation of the high level of protection envisaged by the Directive; this may well be difficult given current practice in the US regarding tracking and using such data for purposes to which subjects have not given consent;
  • It is questionable what level of enforcement will be required – is self-certification together with the possibility of legal action sufficient, or is the Advocate General really suggesting there is a need for an independent regulator (see paras 207-208)? While the issue was not discussed, the FTC has started taking action against companies that claimed to self-certify but did not comply with the terms of the safe harbour agreement (see here, here and here).
  • The Commission may be obliged to review any such decision in the light of changing circumstances, and should not leave systems which are clearly inadequate in place.

In the absence of a safe harbour agreement, companies seeking to transfer data to the US will have to use other mechanisms, such as ‘Binding Corporate Rules’ or ‘Standard Contractual Terms’. These are individually approved by national regulators.

The first part of the Opinion dealt with the position of national regulatory authorities, opening up the possibility for national regulators to challenge levels of protection they consider too low. Will this drive standards of protection upwards with regard to third countries? Quite apart from this open question, we should note that the Advocate General took the opportunity to make some general points about the need to respect fundamental rights and not to rely on interpretations of the law that are inconsistent with those rights.

While they were addressed to the making of the decision, they reiterate that the focus of the directive is the protection of privacy and the respect for data protection; the free movement of data seems to come a poor second whatever the data industry and the legal basis for the directive might have to say. Such an approach has relevance to the interpretation of the directive more generally. This reliance on fundamental rights arguments may also have significance as the EU institutions seek to finalise the long-awaited Data Protection Regulation.