
Ada Lovelace Institute curates ‘Ethics & Society’ at CogX 2020

Announcing the Ada Lovelace Institute-curated day of talks at virtual CogX 2020 – the international festival of AI.

26 May 2020

Reading time: 5 minutes

Ethics & Society, 8 June 2020, 10am-6.45pm BST

This year’s CogX festival theme framed the question: how do we get the next 10 years right? On Monday 8 June from 10am to 6.45pm BST on the Ethics & Society stage, Ada brought together 26 thought-leaders from policy, academia and industry to tackle the knotty, real-life trade-offs of the benefits and harms emerging technologies bring to people and society.


Session 1: Does Whitehall need more WEIRDos?

10:00 – 10:45 BST

As attitudes to government use of citizens’ data become ever-more contested, this panel asks: who do we need in Whitehall to ensure data is managed for the public good?

What expertise or understanding will promote the equitable treatment of data for everyone in society, support a process that values diversity, anticipate unintended consequences, deliver robust and trustworthy oversight, and secure fair partnerships that use data for social good? And who can reverse the tide of opinion on the public’s trust in government use of their data? 

Moderator: John Naughton, Open University. Panellists: Chi Onwurah, Labour MP for Newcastle upon Tyne Central & Shadow Minister for Digital, Science & Technology; Jack Stilgoe, University College London; Jeni Tennison, Open Data Institute


Session 2: How do we ensure the voices of citizens are heard?

12:00 – 12:45 BST

As we enter a new industrial revolution, many people are being left behind by technological advances, exacerbating trends of political disenfranchisement and polarisation.

Bringing the wider public into discussions around technology policy and governance is critical to the legitimacy and quality of technology, and to ensuring it works for people and society. The session explores the value of public deliberation and societal debate, even at difficult times of lockdown.

Moderator: Matthew Taylor, RSA. Panellists: Simon Burall, Involve; Chris Carrigan, use MY data; Kerry Furini, citizen and digital public engagement participant; Anja Thieme, Microsoft Research Cambridge


Session 3: Exit through the App Store?

14:00 – 14:45 BST

As epidemiological evidence, public health knowledge and testing capacity increase, debate is growing around digital immunity certificates.  

The UK government has invested significantly in data-driven technologies as tools to support the transition from emergency lockdown measures in response to COVID-19, and immunity passports are under consideration. This panel asks: what does it mean for society to introduce immunity status as a metric for population-scale decision-making about who has freedom of movement, the ability to go to work, or the right to travel?

This session follows the Ada Lovelace Institute rapid evidence review Exit through the App Store?, published in April 2020, on the technical considerations and societal implications of using technology to transition from the COVID-19 crisis.

Moderator: John Thornhill, Financial Times. Panellists: Claire Craig, Queen’s College Oxford and a member of the AI Council; Husayn Kassai, Onfido; Jonathan Montgomery, University College London


Session 4: Investigating and prosecuting AI

15:00 – 15:45 BST

Investigative journalists, activists and lawyers are leading the way in uncovering problematic AI systems and bringing public attention to their inequitable impacts. 

Much of what we understand about problematic and harmful AI systems – from content curation algorithms that promote radical and extreme material, to biased sentencing algorithms, to automated benefits systems that make false fraud accusations – we owe to strategic litigation and investigative journalism. As a transparency mechanism, this work underpins the activities of organisations working to make AI and its impacts equitable.

This panel will be an opportunity for those on the frontline of investigating and litigating AI systems to share their tactics, unpack the skills needed to investigate AI and hold it to account, and examine the structural, financial and evidentiary obstacles lawyers and journalists face.

Moderator: Martha Spurrier, Liberty. Panellists: Cori Crider, Foxglove; Ravi Naik, AWO; Adam Satariano, New York Times


Session 5: What does ‘good’ look like in a technosociety?

17:00 – 17:45 BST

Speakers will bring evidence of the benefits and harms of technologies – including automation, artificial intelligence, platform data aggregation and the algorithmic remodelling of the public sphere – to the conditions necessary for society to thrive.

With expertise in government, law, civil society and media, they will ask, what is ‘good’, who is responsible for ‘good’, and how can we embed ‘good’ into structures and processes to make technology work for people and society? 

Moderator: Sarah Drinkwater, Omidyar Tech & Society Lab. Panellists: Thomas Hughes, Director of Oversight Board Administration, Facebook; Safiya Umoja Noble, University of California, Los Angeles (UCLA) and author, Algorithms of Oppression


Session 6: The Ethics Panel to End All Ethics Panels…

18:00 – 18:45 BST

Ethics has taken on a mantle of mistrust, tarnished by ‘ethics washing’ and the proliferation of ethics codes. We’ve all sometimes felt that we’d like this to be the last ever ethics panel we attend. 

But we can’t allow these critical societal challenges to slip through our fingers through inertia. Instead, we must mobilise, challenge and ask what’s wrong with the way we talk about ethics, and why do we need to change the narrative (and ethics panels) to make sure data and AI really start working for people and society? 

Moderator: Alison Powell, London School of Economics and JUST AI. Panellists: Brent Mittelstadt, Oxford Internet Institute; Jonnie Penn, author and AI researcher; Evan Selinger, Rochester Institute of Technology; Shannon Vallor, Edinburgh Futures Institute (EFI), University of Edinburgh