
The technical and ethical distinction between random and non-random biometric data

Professors Clive Bowman and Peter Grindrod CBE on why we need a legal distinction between random and non-random biometrics.

6 September 2019

Reading time: 5 minutes

Image: gel electrophoresis of DNA fragments.

Biometrics (metrics related to human physiological or behavioural characteristics) are now used for a variety of applications, from unlocking mobile phones to passport checks to identifying suspects in police investigations. Facial recognition technology is one form of biometric technology, and it has been at the centre of much debate over the summer, both in the news (for example, in relation to its use in King's Cross and by the police) and in academic research.

In our recent paper, we highlight a simple but important distinction that has so far been overlooked in the conversation and that raises specific ethical concerns. While all biometrics are unique identifiers, not all of them are random. For instance, someone's fingerprints are a unique and random identifier. Facial traits and DNA, by contrast, are not random at all, insofar as a person's facial traits probabilistically match those of their biological relatives and a person's DNA partially matches portions of their relatives' DNA.

The constraints on the randomness of DNA and facial traits can be statistically modelled by estimating genealogy. Indeed, this is precisely the basis for their use in forensic investigations, as in infamous cases such as that of the Golden State Killer. However, this also means that, whenever we collect non-random biometric data about a person, we are also partially collecting their relatives' biometric data.
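
To make the point concrete, here is a minimal, illustrative sketch of the idea behind a kinship (familial) search, not taken from our paper: toy STR-style DNA profiles are compared by counting shared alleles per locus, and profiles belonging to close relatives will, on average, score higher than those of unrelated people. Real familial searching uses likelihood ratios over validated marker panels; the allele values below are invented for illustration.

```python
# Illustrative sketch only: toy 'DNA profiles' as allele pairs at a few STR loci.
# Allele values are invented; real kinship searches use likelihood ratios.

def shared_allele_score(profile_a, profile_b):
    """Count alleles shared between two profiles, summed over common loci."""
    score = 0
    for locus in profile_a.keys() & profile_b.keys():
        remaining = list(profile_a[locus])
        for allele in profile_b[locus]:
            if allele in remaining:
                remaining.remove(allele)   # count each shared allele once
                score += 1
    return score

# Hypothetical profiles: a parent and child share at least one allele per locus,
# whereas an unrelated person often shares none.
parent    = {"D3S1358": (15, 16), "vWA": (17, 18), "FGA": (21, 24)}
child     = {"D3S1358": (15, 14), "vWA": (18, 19), "FGA": (24, 22)}
unrelated = {"D3S1358": (17, 18), "vWA": (14, 15), "FGA": (19, 20)}

print(shared_allele_score(parent, child))      # higher score: likely kin
print(shared_allele_score(parent, unrelated))  # lower score
```

The point of the sketch is that the parent's profile, once stored, carries information about the child, who never provided a sample.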

The ethical problems with indirect identification

Serious issues have been raised concerning the use of biometrics in general: the proportionality of their use, how to effectively gather people's consent when biometrics are collected in public spaces, and people's right to have their biometric data deleted. There are also legitimate concerns over the future use of biometric databases. However, the point we wish to stress is that the ethical problems raised here, and the solutions we may find to them in the context of random biometrics (e.g. fingerprints, iris prints), do not transfer by default to non-random biometrics, which confront us with further problems alongside the ones just listed. When unique, non-random biometrics are collected, the number of people who may become indirectly identifiable from them far exceeds the number of people directly identified by the biometric data.

If someone is required to give up their unique, non-random biometrics, especially in the context of a criminal investigation, then their biological kin also become identifiable through that same biometric data. This raises a range of concerns, in particular when the data are stored and indexed in a searchable database or watchlist. As we wrote in our paper, kin are effectively dragooned onto such databases, without any qualifying cause and without their consent, awareness or clear warnings. This constitutes an unwarranted intrusion into the privacy of an individual who is not of any interest to police, and touches upon their data protection rights without their knowledge or consent.

This issue is particularly worrying if we consider the large volume of unique, non-random biometric data currently stored in databases in the United Kingdom, such as the National DNA Database and the Police National Custody Image Database, which seem to treat random and non-random biometrics according to the same principles.

The National DNA Database, established in 1995, mandatorily collects the DNA of anyone who is arrested in the UK and, as of 2019, holds the genetic profiles of more than 5.5 million people, that is, 8.2% of the UK population.1 In 2012, the profiles of 1,766,000 people were deleted from the database as a consequence of the UK Protection of Freedoms Act. The National DNA Database has been used to identify suspects via biometric kinship searches, matching the DNA of potential suspects with the DNA already stored. If, on average, each person has two living blood relatives, then perhaps twice as many people become potentially identifiable without giving explicit consent as those whose profiles were legitimately collected. Even if this may have positive consequences for criminal investigations, the ethical problems remain. And, although the practice is still in its infancy, we can expect facial image databases to be used more and more frequently for kinship matching, raising similar concerns.
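
As a back-of-the-envelope check on these figures, the sketch below uses the numbers quoted above together with the assumed average of two living blood relatives per profiled person, and ignores overlap between families, so it is indicative only.

```python
# Figures quoted in the text; the two-relative average is an assumption,
# and overlap between relatives of different profiled people is ignored.
profiles_held = 5_500_000
uk_population = 67_000_000          # UNFPA 2019 estimate (see footnote 1)
avg_living_blood_relatives = 2      # assumed average

directly_profiled_share = profiles_held / uk_population
indirectly_identifiable = profiles_held * avg_living_blood_relatives

print(f"{directly_profiled_share:.1%} of the population directly profiled")  # ~8.2%
print(f"~{indirectly_identifiable:,} relatives potentially identifiable")    # ~11,000,000
```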

The risk we are facing, by treating random and non-random biometrics in the same way, is that many individual rights may go unprotected.

We believe that a legal distinction should be drawn between the two kinds of biometrics and that the ethical implications of this distinction should be recognised. We use our paper to outline areas that require conceptual development as well as legal solutions, in particular with regard to the use of biometric kinship searches for policing purposes. We call for a meaningful public debate that puts human rights and people's consent at the heart of decisions on the use of biometric data, before problematic uses of the technology, which are already common, become unquestioned and normalised practices.

Ada’s view: the need for a comprehensive overhaul of biometrics regulation

Peter Grindrod and Clive Bowman’s research provides an example of how the legal and policy framework for the governance of biometrics is no longer fit for purpose. The fuller implications of major technological advancements urgently need to be taken into account to inform new models for regulation and the rule of law. Calls for change are increasingly pressing and are only partially addressed. Changes tabled for oversight of biometrics in the Scottish criminal justice system, changes to the guidance from the Information Commissioner’s Office, and legal changes being recommended for ‘artificial intelligence’ industries by the European Commission, all require co-ordinated review.

Our research shows that concern about the inadequate governance of biometrics is shared by the British public, who are deeply uncomfortable with the normalisation of surveillance and the deployment of facial recognition technology for commercial benefit. We will now take this public engagement further, setting up a Citizens’ Biometric Council supported by the Information Commissioner’s Office. We will also support an independent review of biometrics governance and fresh principles for legislation, as was recommended in part by the UK Biometrics Commissioner and by the House of Commons Science and Technology Committee in 2019. Through this process we will address the ethical implications of rapidly advancing biometrically driven applications through an approach that is collaborative and human rights-focussed.

About the authors

Clive Bowman is a Visiting Professor at the Mathematical Institute, University of Oxford. Professor Peter Grindrod CBE is Director of the Oxford-Emirates Data Science Lab at the University of Oxford.

Footnotes

  1. Based on a UK population of 67 million (UNFPA 2019 estimate).