Regulatory inspection of algorithmic systems

Establishing mechanisms and methods for regulatory inspection of algorithmic systems, sometimes known as 'algorithm audit'.

As algorithmic systems become more critical to decision-making across many parts of society, there is increasing interest in how they can be scrutinised and assessed for societal impact and regulatory and normative compliance.

Regulatory inspection of algorithmic systems, sometimes referred to as ‘algorithm audit’, is a broad approach focused on an algorithmic system’s compliance with regulation or norms. It will require a range of different tools and methods, and is likely to be conducted by regulators or auditing professionals.

Regulatory inspection: a crucial part of the algorithmic accountability ecosystem

Many regulators and other audit bodies worldwide have not previously had to engage with the idea of inspecting algorithms. As policymakers contemplate expanding the remit of regulators to include algorithm inspection, there are gaps to address both in the legal remit and powers available to conduct inspections, and in organisational capacity and skills.

This role is increasingly crucial: for regulators in many areas to have sufficient oversight of the impact of algorithmic systems, they will need the knowledge, skills and approaches to inspect algorithmic systems thoroughly and to scrutinise how they function, both technically and within the relevant social context.

Priorities for research and practice

In Examining the Black Box, written in collaboration with DataKind UK, we identified priority questions for research and practice:

  • What legal powers do regulators need and how should they be defined, either generically or sectorally?
  • What skills and capabilities do regulators need, and how can these best be developed and shared?
  • What mechanisms are in place to enable regulators to share both successes and failures in developing and using inspection tool suites, to facilitate learning and improvement?

Expert workshops: regulatory inspection of algorithms in different domains

We are hosting a series of expert workshops to explore these questions and the practicalities and challenges of regulatory inspection across different domains. We aim to further the conversation in each area, as well as to draw insights across them.

Insights and recommendations from our first workshop form the basis of Inspecting algorithms in social media platforms, a joint briefing from the Ada Lovelace Institute and Reset. Read the briefing.

Further work and opportunities

Regulatory inspection of algorithmic systems is a vital component of algorithmic accountability – necessary for a world where data and AI work for people and society.

We expect to report on approaches to regulatory inspection of algorithmic systems across different domains later in the year. If you are interested in this work, you can keep up to date on our website, Twitter or our fortnightly newsletter.
