This year’s CogX festival theme framed the question: ‘How do we get the next 10 years right?’ On Monday 8 June from 10am to 6.45pm BST on the Ethics & Society stage, Ada brought together 26 thought leaders from policy, academia and industry to tackle the knotty, real-life trade-offs between the benefits and harms that emerging technologies bring to people and society.
10:00 – 10:45 BST
As attitudes to government use of citizens’ data become ever-more contested, this panel asks: who do we need in Whitehall to ensure data is managed for the public good?
What expertise or understanding will promote the equitable treatment of data for everyone in society, support a process that values diversity, anticipate unintended consequences, deliver robust and trustworthy oversight, and secure fair partnerships that use data for social good? And who can reverse the tide of opinion on the public’s trust in government use of their data?
Moderator: John Naughton, Open University. Panellists: Chi Onwurah, Labour MP for Newcastle upon Tyne Central & Shadow Minister for Digital, Science & Technology; Jack Stilgoe, University College London; Jeni Tennison, Open Data Institute
12:00 – 12:45 BST
As we enter a new industrial revolution, many people are being left behind by technological advancement, exacerbating trends of political disenfranchisement and polarisation.
Bringing the wider public into discussions around technology policy and governance is critical to the legitimacy and quality of technology, and to ensuring it works for people and society. The session explores the value of public deliberation and societal debate, even at difficult times of lockdown.
Moderator: Matthew Taylor, RSA. Panellists: Simon Burall, Involve; Chris Carrigan, use MY data; Kerry Furini, Citizen and digital public engagement participant; Anja Thieme, Microsoft Research Cambridge
14:00 – 14:45 BST
As epidemiological evidence, public health knowledge and testing capacity increase, debate is growing around digital immunity certificates.
The UK government has invested significant policy attention in data-driven technologies as tools to support the transition from emergency lockdown measures in response to COVID-19, and immunity passports are under consideration. This panel asks: what does it mean for society to introduce immunity status as a metric for population-scale decision-making about who has freedom of movement, the ability to go to work, or the right to travel?
This session follows the Ada Lovelace Institute rapid evidence review Exit through the App Store? published in April 2020 on the technical considerations and societal implications of using technology to transition from the COVID-19 crisis.
Moderator: John Thornhill, Financial Times. Panellists: Claire Craig, Queen’s College, Oxford and a member of the AI Council; Husayn Kassai, Onfido; Jonathan Montgomery, University College London
15:00 – 15:45 BST
Investigative journalists, activists and lawyers are leading the way in uncovering problematic AI systems and bringing public attention to their inequitable impacts.
Much of what we understand about problematic and harmful AI systems – from content-curation algorithms that promote radical and extreme material, to biased sentencing algorithms, to automated benefits systems that make false fraud accusations – we owe to strategic litigation and investigative journalism. As a transparency mechanism, this work underpins the activities of organisations working to make AI and its impacts equitable.
This panel will be an opportunity for those on the frontline of investigating and litigating AI systems to share their tactics, unpack the skills needed to investigate AI and hold it to account, and examine the structural, financial and evidentiary obstacles lawyers and journalists face.
Moderator: Martha Spurrier, Liberty. Panellists: Cori Crider, Foxglove; Ravi Naik, AWO; Adam Satariano, New York Times
17:00 – 17:45 BST
Speakers will bring evidence of the benefits and harms of technologies – including automation, artificial intelligence, platform data aggregation and the algorithmic remodelling of the public sphere – to the conditions necessary for society to thrive.
With expertise in government, law, civil society and media, they will ask: what is ‘good’, who is responsible for ‘good’, and how can we embed ‘good’ into structures and processes to make technology work for people and society?
Moderator: Sarah Drinkwater, Omidyar Tech & Society Lab. Panellists: Thomas Hughes, Director of Oversight Board Administration, Facebook; Safiya Umoja Noble, University of California, Los Angeles (UCLA) and author, Algorithms of Oppression
18:00 – 18:45 BST
Ethics has taken on a mantle of mistrust, tarnished by ‘ethics washing’ and the proliferation of ethics codes. We’ve all sometimes felt that we’d like this to be the last ever ethics panel we attend.
But we can’t allow these critical societal challenges to slip through our fingers out of inertia. Instead, we must mobilise, challenge, and ask what’s wrong with the way we talk about ethics, and why we need to change the narrative (and ethics panels) to make sure data and AI really start working for people and society.
Moderator: Alison Powell, London School of Economics and JUST AI. Panellists: Brent Mittelstadt, Oxford Internet Institute; Jonnie Penn, author and AI researcher; Evan Selinger, Rochester Institute of Technology; Shannon Vallor, Edinburgh Futures Institute (EFI), University of Edinburgh