Examining the Black Box: Tools for assessing algorithmic systems is aimed primarily at policymakers, to inform more accurate and focused policy conversations around algorithmic audits and impact assessments. It is also intended to help those creating, commissioning or interacting with an algorithmic system to understand what methods and approaches exist to assess and evaluate that system.
Examining the Black Box is a joint report from the Ada Lovelace Institute and DataKind UK that clarifies the terminology around algorithmic audits and impact assessments and surveys the current state of research and practice.
- Cited in a joint letter to the European Commission calling for regulatory inspection of algorithms
- Influenced the thinking and terminology ('inspection' rather than 'audit') used in the joint paper 'Algorithm Inspection and Regulatory Action' (a confidential briefing) from Demos, Doteveryone, Global Partners Digital, the Institute for Strategic Dialogue and Open Rights Group
- Beginning to be referenced in academic conference papers, e.g. at ICML
- Featured in an article on the Society for Computers and Law website
Can transparency bring accountability to public-sector algorithmic decision-making (ADM) systems?
A look at the transparency mechanisms that should be in place to enable us to scrutinise and challenge algorithmic decision-making systems.
Identifying common language for algorithm audits and impact assessments.
The failure of the A-level algorithm highlights the need for a more transparent, accountable and inclusive process in the deployment of algorithms.