The ILPC’s Annual Conference and Lecture for 2018, Transforming Cities with AI: Law, Policy, and Ethics took place on Friday, 23 November, 9.30am–5.30pm, at the Institute of Advanced Legal Studies, 17 Russell Square, London WC1B 5DR.

For the full conference programme, please see here.

Recordings of the panels are available here.


Baroness Onora O’Neill, Emeritus Professor of Philosophy (University of Cambridge) and Crossbench Member of the House of Lords, delivered this year’s ILPC Annual Lecture, entitled ‘Ethics for Communication’. Baroness O’Neill set out a new approach to thinking about the role that ethics can and should play in communications and, in particular, information and communications technology (ICT). She also commented on the history of attempts to control speech acts through censorship of various kinds.

This history spans from Plato’s disdain for written records as a step removed from the truth they sought to represent (indeed, the teachings of Socrates that we have today survive only because Plato recorded them), to John Stuart Mill’s distinction between self-expression and other forms of speech acts.

Baroness O’Neill went on to critique the role ethics currently plays in today’s discourse on data and artificial intelligence, arguing that the term ‘data ethics’ is a misnomer: there is nothing ethical about data itself, although data can be used, handled and developed in ways that are ethical.

Baroness O’Neill commented that there are long-recognised norms guiding the ethics of speech acts (speech aimed at communicating) that go beyond the human rights paradigm of “freedom of expression” and “access to information”. Such norms include clarity, truthfulness, relevance, civility and decency, amongst many others.

As such, Baroness O’Neill called for an ethics for communication, rather than an ethics of communication. An ethics for communication moves beyond addressing the relationship between ethics and communication, or the extent to which communication is ethical, and instead names a decisive purpose for which communication must be directed.


Baroness O’Neill’s lecture launched the ILPC Annual Conference 2018, which featured keynote panels and academic sessions with policymakers, practitioners, industry, civil society and academic experts from the fields of law, computer engineering, history, economics, sociology and philosophy.

Throughout the day, speakers and audience members engaged in lively debates and discussions on the laws and policies that govern and regulate the AI-driven systems that are transforming our daily interactions, communications, and relationships with the public and private sectors, technology and one another. These debates were multidisciplinary and cross-sector, with insights brought from around the world by attendees from countries including the UK, Ireland, France, Belgium, the Netherlands, Italy, Spain, Turkey, Canada, the US and Kenya.


The conference keynote panel included leading figures from government, industry, academia, and civil society, with Tony Porter (Surveillance Camera Commissioner), Helena U. Vrabec (Legal and Privacy Officer, Palantir Technologies), Peter Wells (Head of Policy, Open Data Institute) and Baroness O’Neill. This panel was chaired by Dr Nóra Ní Loideáin (ILPC) with Silkie Carlo (Chief Executive, Big Brother Watch) as discussant.

An impressive range of topics and issues was addressed by the panel. Tony Porter noted the complex legislative patchwork of oversight (‘a murder of regulators’) governing matters of AI-driven surveillance, such as CCTV enabled with facial recognition and automated number plate recognition technologies. On a more encouraging note, Helena Vrabec highlighted the positive effect that the GDPR has had within corporate culture, particularly in generating high-level conversations on the privacy and ethical implications posed by the use of predictive analytics.

Peter Wells spoke of the societal value to be gained by viewing data as public infrastructure and the role that ‘data trusts’ could play in this space. Silkie Carlo stressed the importance of ensuring proper oversight and clear legislative frameworks for emerging technologies, and described the regular public engagement work and Freedom of Information research undertaken by Big Brother Watch to promote a wider understanding of the use of AI-driven systems.


The first academic panel of the conference focussed on the legal and ethical implications of smart cars. Chaired by Dr Rachel Adams (ILPC), the panel included Maria Christina Gaeta (University of Naples), who spoke on the use of personal data in smart cars, arguing for the development of stricter legal enforcement beyond the GDPR in order to regulate this area more effectively.

Speaking on the ethical dimensions of smart cars, Professor Roger Kemp (University of Lancaster), the second panellist, drew on his wealth of experience in policy-making on transport-related matters in discussing a range of issues, from the ineffectiveness of safety pilot testing to the behavioural psychology of such technologies.

The discussant for this panel was Dr Catherine Easton (University of Lancaster), who discussed her work on the rights of persons with disabilities, the need for smart cars to be developed to be fully autonomous, and the shift towards conceptualising smart cars as a service rather than just a product.


The second (parallel) academic panel was chaired by Peter Coe (ILPC Research Associate), with Professor Hamed Haddadi (Imperial College London) as discussant, and examined the different governance mechanisms and policy narratives around public trust and oversight that have framed the development of AI-decision making systems to date.

Gianclaudio Malgieri (Vrije Universiteit Brussel) spoke on ‘The Great Development of Machine Learning, Behavioural Algorithms, Micro-Targeting and Predictive Analytics’, observing that issues of trust in this area go beyond the mere protection of private life in private spaces, extending to the protection of cognitive freedom. He also highlighted the role that data protection impact assessments could play in improving governance in this space.

Dr Jedrzej Niklas’s (LSE) presentation concerned improving the accountability of automated decision-making within public services. He put forward an analytical framework that identifies how and where current accountability mechanisms warrant updating. This framework includes recognising the following ‘critical points’: a) layers within the system (software, input data, policies); b) life stages of systems (legislative process, design of technological tools, actual use); c) actors involved in those stages (public administration, civil society); and d) the balance of power and relationships between those actors.

Matthew Jewell (University of Edinburgh) spoke on the importance of the policy narratives that underpin emerging technologies within smart cities and explored the accountability benefits to be gained from acknowledging the existence of ‘distrust’ within these new systems. Dr Yseult Marique (University of Essex) and Dr Steven Van Garsse (University of Hasselt) presented a joint paper on the increasing use of public-private partnerships within smart cities and highlighted the challenges and governance gaps within procurement contracts. In particular, drawing on case studies from the UK and Belgium, they noted the use of private-sector-focussed contracts for the procurement of public services, as opposed to the use of public sector contracts.


The third panel of academics and practitioners was chaired by Sophia Adams Bhatti (Law Society of England and Wales), with Alexander Babuta (Royal United Services Institute) as discussant, and addressed the use and governance of AI-driven systems within the criminal justice sector.

Chief Superintendent David Powell (Hampshire Constabulary) and Christine Rinik (University of Winchester) presented a joint paper on ‘Policing, Algorithms and Discretion’, drawn from interviews with front-line professionals who are prospective users of such systems. Dr John McDaniel (University of Wolverhampton) spoke on the critical need to ensure effective evaluation of the potential impact of AI-driven systems on police decision-making processes.

Marion Oswald (University of Winchester) presented an insightful paper on how key legal principles from administrative law could guide our ‘Algorithm-Assisted Future’ within the criminal justice sphere. Dr Nóra Ní Loideáin (ILPC) addressed how AI could be used to improve the oversight and safeguards of predictive policing systems, as provided for under the EU Criminal Justice and Police Data Protection Directive and the UK Data Protection Act 2018.


The last panel of the conference brought together an interdisciplinary range of speakers to discuss the use of AI technologies both in cities and in legal administration. Chaired by Dr Rachel Adams (ILPC), this panel included a presentation by Dr Edina Harbinja (Aston University) on the use of AI in intestacy and the execution of wills, and a presentation by Professor Andrew McStay (Bangor University) on smart advertising in cities and the use of AI technologies in emotion detection.

In addition, Robert Bryan and Emily Barwell (BPE Solicitors LLP) delivered an interactive presentation on the regulatory regime governing AI technologies. They spoke specifically on the role of transparency and unpacked in detail what this means in context. The last presentation on this panel was delivered by Dr Joaquin Sarrion-Esteve (University of Madrid), who spoke on his work on the human rights impact of AI and the development of rights standards for AI-based city governance.

The discussant for this panel was Damian Clifford (Leuven) who discussed the role of the GDPR, and specifically its provisions relating to transparency and the rights of the data subject.


Professor Hamed Haddadi (Imperial College London), Dr Laura James (University of Cambridge) and Marion Oswald (University of Winchester) concluded the conference proceedings with some reflections and insights. In particular, they noted the importance of realising both the benefits and limits of empowering and educating the public, alongside the essential shift in corporate culture that must take place in order for the design and development of data-driven systems to be intelligible to the public, secure, accountable and trustworthy.

Also highlighted were the need to focus more on the enforcement of existing legal frameworks and governance, as opposed to the hasty development of new laws, and the welcome impact that the GDPR has had in making privacy a reputational selling point for companies. This panel was chaired by Dr Nóra Ní Loideáin (ILPC).

On a final note, the ILPC is grateful to all of its speakers and audience members who contributed to a dynamic day of rich policy and academic discussions and looks forward to welcoming everyone back for its forthcoming events in 2019.

With thanks to Bloomsbury Publishing and the John Coffin Memorial Trust Fund for their sponsorship of these events.