About

The Ada Lovelace Institute is an independent research and deliberative body with a mission to ensure data and AI work for people and society.

Ada will promote informed public understanding of the impact of AI and data-driven technologies on different groups in society.

Latest news

Recruitment

Job vacancy: Network Director

The Ada Lovelace Institute (Ada) and the Arts & Humanities Research Council (AHRC, part of UK Research & Innovation) are seeking a Director for a new Network for AI & Ethics, to be established in partnership between the two organisations for an initial period of one year.

Speech

Speech to the PICTFOR Parliamentary Summer Reception

What is the next digital revolution and how can the UK further embrace it to remain a world-leading digital economy? How can industry and government ensure citizens remain central to emerging tech and the changing world?

by Carly Kind

News

Nuffield Foundation publishes roadmap for AI ethics research

A new report from the Nuffield Foundation and the Leverhulme Centre for the Future of Intelligence at the University of Cambridge sets out a broad roadmap for work on the ethical and societal implications of technologies driven by algorithms, data and AI (ADA).

Speech

Data science and the case for ethical responsibility

Tim Gardam, Chief Executive of the Nuffield Foundation, recently delivered a speech to the British Computer Society on the history of data ethics and the importance of the Ada Lovelace Institute.

by Tim Gardam

Blog

Public deliberation could help address AI’s legitimacy problem in 2019

The importance of public legitimacy – the broad base of public support that allows companies, designers, public servants and others to design and develop AI that delivers beneficial outcomes – was illustrated by a series of highly controversial public events in 2018.

by Reema Patel

Speech

The ethical and political questions raised by AI

Picture a system that makes decisions that have a huge impact on a person’s prospects and life course; or even that makes decisions that are literally life and death. Imagine that system is hugely complex, and also opaque: it is very hard to see how it comes to the conclusions it does. A system that is discriminatory by its nature: it sorts people into winners and losers; but the criteria by which it does so are not clear.

by Dr Stephen Cave

Speech

Digital technologies have yet to earn their ethical spurs

In words that are at one and the same time prophetic, optimistic and haunting, Ada Lovelace wrote: “A new, a vast, and a powerful language is developed for the future use of analysis, in which to yield its truths so that these may become of more speedy and […]

by Onora O'Neill

News

First board members appointed to lead Ada Lovelace Institute

The Nuffield Foundation has appointed the first board members to lead the strategic development of the Ada Lovelace Institute, an independent research and deliberative body with a mission to ensure data and Artificial Intelligence (AI) work for people and society.

Publication

Our prospectus

The Ada Lovelace Institute is an independent research and deliberative body with a mission to ensure data and AI work for people and society. Our new prospectus sets out how the Ada Lovelace Institute will promote informed public understanding of the impact of AI and data-driven technologies on different groups of people. It will guide […]

Blog

Data for the common good – framing the debate

Over the past few years, debates about data have frequently made headline news. To help us better understand data, including its uses and ethical implications, various analogies have been used. Although analogies can help us get a better grasp of this complex issue, we should be wary of the limitations of these comparisons.

by Reema Patel

Blog

Keeping society in the loop about data ethics and AI

There is a growing expectation that technologies and algorithms should align with, and reflect, commonly held public and societal values. But to make this happen, society needs to be kept ‘in the loop’. Do we need an implicit social contract between those developing and designing the tech and those who may be affected by it?

by Reema Patel

News

Letter to the Financial Times: Embed ethical thinking in tech culture

Your editorial “Making decisions that computers cannot” (May 22) identifies a central issue surrounding the relationship between the artificial intelligence sector and the ethical and social frameworks in which it operates. You ask who will be left to speak for the common good. The Nuffield Foundation is funding the Ada Lovelace Institute, in partnership […]

by Tim Gardam

Blog

UK wants to lead the world in tech ethics…but what does that mean?

Last week we surely reached peak hype on tech ethics with a headline in The Sun, which was responding to the House of Lords’ AI report, the latest of a growing number of interventions from government, industry and civil society seeking to grapple with the ethical and social issues technology is posing. Public debate on tech […]

by Imogen Parker

Speech

Social Well-being and Data Ethics

I would like in a moment to set out the proposal for an independent Convention on Data Ethics and Artificial Intelligence. This has been developed by the Nuffield Foundation over recent months in partnership with The Alan Turing Institute, the Royal Statistical Society, the Nuffield Council on Bioethics, the Wellcome Trust, the Royal Society, the British […]

by Tim Gardam