In September 2019, the Ada Lovelace Institute published Beyond face value, the first UK survey of public attitudes to facial recognition technologies. The survey responded to a deficit of information, in a fast-moving area of technological development and deployment, about what people wanted and whom they trusted to make and use these technologies.
It explored when the British public are prepared to accept the use of facial recognition technology, where there is clear public benefit and appropriate safeguards, and when they want the Government to impose restrictions on its use.
Responding to citizen concerns, we led the call for a moratorium, or pause, on the use and deployment of facial recognition technology. That call has since been reinforced by a diverse range of organisations, including the UK House of Commons Science and Technology Select Committee.
The timeliness of the investigation was reinforced by the subsequent actions of US technology companies: IBM withdrew from the facial recognition technology market, citing concerns about ‘racial profiling and violations of basic human rights and freedoms’; Microsoft stopped selling facial recognition systems to police departments until a federal law is in place; and Amazon banned police use of its facial recognition technology for a year.
The global move towards questioning the legitimate uses of this technology is the result of years of work by organisations like the Cardiff-based Data Justice Lab and New York-based AI Now, and by researchers Joy Buolamwini, Timnit Gebru and Deborah Raji, who have collected evidence of injustices in social and technical infrastructures and systems, and worked to shift power imbalances.
Publication of the survey coincided with the High Court judgement on the use of live facial recognition technology by South Wales Police. In response to the judgement, Carly Kind, Director of the Ada Lovelace Institute, said:
‘Today’s judgement on the legality of police trials of facial recognition in South Wales brings into sharp relief the gap between public expectations and current regulation. The first national survey of public opinion on facial recognition technology published today by the Ada Lovelace Institute shows that the majority of people in the UK want police use of this technology restricted.
‘The fact that the deployment of a new technology, with which a significant proportion of the public are not comfortable, can be deemed technically compliant underlines the need for an informed public discourse on what new restrictions and safeguards are needed. Facial recognition technology may be lawful, but that does not mean its use is ethical, especially outside of the very limited circumstances examined by the court in this case.’
Evidence from policymakers and regulators
Jonathan Bamford, Director of Strategic Policy at the Information Commissioner’s Office: ‘The Information Commissioner’s Office welcomes this valuable insight into public attitudes to facial recognition technology, and is very pleased to support the Ada Lovelace Institute’s Citizens’ Biometrics Council, following on from its own extensive citizen engagement work using citizen juries to understand public attitudes to explainability in AI. The Council will bring a much-needed informed public perspective to the development of public policy on emerging biometric technologies such as facial recognition and fingerprint scanning.’
Rt Hon Norman Lamb MP, Chair of the Science and Technology Committee: ‘My committee has already called on the Government to issue a moratorium on the current use of facial recognition technology, and for no further trials to take place until a legislative framework has been introduced, guidance on trial protocols has been published, and an oversight and evaluation system has been established.
‘The recent report by the Ada Lovelace Institute, Beyond face value, provides insight into public attitudes towards the use of facial recognition, and underscores the need for an independent review of the governance ecosystem for biometric data.’
- Financial Times: Facial recognition’s risks demand a temporary halt