The Ada Lovelace Institute is hiring a researcher for a project evaluating methods of algorithm auditing and impact assessment.
Exploring how we can ensure that algorithmic systems and those deploying them are truly accountable.
Accountability for algorithms: a response to the CDEI review into bias in algorithmic decision-making
Reviewing bias is welcome, and stopping the amplification of historic inequalities is essential.
Developing foundational tools to enable accountability of public administration algorithmic decision-making systems.
Establishing mechanisms and methods for regulatory inspection of algorithmic systems, sometimes known as 'algorithm audit'.
A review of existing UK mechanisms for transparency, and their contribution to making information about the implementation of algorithmic systems publicly available.
Establishing systems, powers and capabilities to scrutinise algorithms and their impact.
Joint briefing with Reset, offering insights and recommendations towards a practical route forward for regulatory inspection of algorithms.
A ten-point checklist for the Test and Trace team working on contact tracing app 2.0, based on the findings of rapid online public deliberation.
Identifying common language for algorithm audits and impact assessments.
Our rapid evidence review 'Exit through the app store?' evaluates evidence to support the immediate deployment of digital contact tracing.
Data is bringing huge benefits in the fight against Covid-19, but we must remain vigilant about NHS plans to collaborate with tech giants.