The Algorithmic State

Our work

Shaping a positive vision for citizens’ interactions with the state through data and AI

The information age offers new possibilities for public service delivery, and our Algorithmic State programme is working to articulate a positive vision for a state that harnesses the power of data appropriately. We will research, evaluate and propose evidence-based strategies, working at different levels of government and looking particularly at the impact on vulnerable and at-risk groups.


The use of data fundamentally affects the relationship between citizens and the state, and between people and services. It shapes the individual and group experience of rights. It also touches on deep, longstanding moral and political debates about the ‘social contract’: the balance struck between individual freedom and state authority.

To date, algorithmic decision-making systems in public services have often been developed in piecemeal, localised and invisible ways, affecting critical decisions like citizens’ access to benefits and social care. In many cases, they have been developed without scrutiny, common approaches, or an overarching vision for the role data should play at the front line.

The need for evaluation, and a positive vision for effective data-driven statecraft, has become more urgent. COVID-19 brings new pressures and new disruption, creating both an opportunity and a need to reimagine the delivery of services. Immediate practice needs careful consideration, and choices made now may shape the experience of rights, services and protections in the future.


Programme aims

  1. To build a shared understanding of how algorithmic decision-making systems function, including tracking how emergency responses to COVID-19 are deploying those systems.
  2. To explore the impact of algorithmic decision-making systems on individuals and society.
  3. To articulate a positive vision for how data and AI can be used in a way that truly benefits society, starting from those who are vulnerable.

What we will do

Our first projects in this programme aim to build the evidence base, ensure transparency and promote the adoption of a common vocabulary:

1. Taxonomy and openness

We will draft and test a taxonomy of terms used to refer to algorithmic decision-making systems, and work to ensure its adoption.

The taxonomy, produced with the expert help of Dr Michael Veale (UCL), aims to establish a common language to refer to algorithmic decision-making systems and underpin a policy call for mandatory reporting on their use.

2. Review and development of approaches for algorithmic assessment and audit

In collaboration with DataKind UK, our report Examining the Black Box looks at approaches to assessing algorithmic systems to understand their function, their compliance with regulation and their impact on people and society.

We are following this with a series of workshops developing methodologies for regulatory inspection of algorithms, with a focus on three domains: digital media platforms, in collaboration with Reset; pricing and competition, in collaboration with Inclusive Competition; and equalities, in collaboration with the Institute for the Future of Work.

3. Ethnographic and technical case study of a local authority’s use of algorithmic decision-making systems during COVID-19

We will produce an in-depth case study of a local authority currently deploying data analytics and predictive analytics to support service delivery around homelessness, children’s social care and adult social care.

The foundation of this will be an ethnographic study conducted in collaboration with Dr Hannah Knox (UCL). The case study will also seek to observe how the local authority’s use of data analytics is being adapted, changed or accelerated by the COVID-19 crisis.

4. Predictive analytics and children’s social care

In collaboration with the Nuffield Family Justice Observatory, we hosted a one-day seminar on predictive analytics in use in children’s social care, identifying a ‘research, policy and practice roadmap’. Further work is currently paused in response to COVID-19.


Find out more:

-> Read our report: Examining the Black Box: Tools for assessing algorithmic systems
-> Contact the Algorithmic State team