Regulatory inspection of algorithmic systems

Establishing mechanisms and methods for regulatory inspection of algorithmic systems, sometimes known as 'algorithm audit'.

Regulatory inspection of algorithmic systems is a vital component of algorithmic accountability – necessary for a world where data and AI work for people and society.

Our report, Technical methods for regulatory inspection of algorithmic systems, reviews technical auditing approaches from academia, industry and civil society, and identifies how and where they may be applicable as part of a regulatory inspection process. It details existing technical approaches for auditing online platforms and makes suggestions for how these techniques could be used to audit content-recommendation and moderation systems.


Project background

As algorithmic systems become more critical to decision-making across many parts of society, there is increasing interest in how they can be scrutinised and assessed for societal impact and regulatory and normative compliance.

Regulatory inspection of algorithmic systems, sometimes referred to as ‘algorithm audit’, is a broad approach, focused on an algorithmic system’s compliance with regulation or norms. It requires a number of different tools and methods, and would likely be conducted by regulators or auditing professionals.

Many regulators and other audit bodies worldwide have not previously had to engage with the idea of inspecting algorithms. As policymakers contemplate expanding the remit of regulators to include algorithm inspection, there are numerous gaps to address, both in the legal remit and powers available to conduct inspections, and in organisational capacity and skills.

This role is increasingly crucial: for regulators in many areas to have sufficient oversight of the impact of algorithmic systems, they will need the knowledge, skills and approaches to thoroughly inspect algorithmic systems and scrutinise how they function, both technically and within the relevant social context.

Project overview

In Examining the Black Box, written in collaboration with DataKind UK, we identified priority questions for research and practice:

  • What legal powers do regulators need and how should they be defined, either generically or sectorally?
  • What skills and capabilities do regulators need, and how can these best be developed and shared?
  • What mechanisms are in place to enable regulators to share both successes and failures in developing and using inspection tool suites, to facilitate learning and improvement?

We explored these questions in a series of workshops, focusing on the practicalities and challenges of regulatory inspection across different domains. Insights and recommendations from our first workshop form the basis of Inspecting algorithms in social media platforms, a joint briefing from the Ada Lovelace Institute and Reset. Read the briefing.

Our subsequent report, Technical methods for regulatory inspection, was developed through desk-based research and a synthesis of technical documentation and grey and academic literature on auditing methodologies. It was also informed by policy analysis of white papers and draft legislation related to online harms, primarily in a UK and European context.
