
Digital identity verification: a problem of trust

Why plans to use biometrics in the UK Digital Identity and Attributes Trust Framework need careful consideration.

Edgar Whitley

24 May 2022

Reading time: 9 minutes

[Image: a close-up of a hand passing a British passport to someone else]

In the coming months, anyone in the UK who needs to prove their identity to employers and landlords will be able to do so digitally. For those who have current UK Government-issued photo ID documents (passports and driving licences), a key element of the new process will be automated facial recognition. This means that the matching of an individual’s face with the official photograph on their document will be done by a machine.

Despite claims that facial recognition technologies can simplify the identity checking process, there are growing concerns about the effectiveness of the technology and the downsides of making this the default route for identity checking services. These concerns risk undermining public trust in these processes.

In an attempt to address these issues, the Government has developed the UK Digital Identity and Attributes Trust Framework (‘the Framework’), a set of specifications, agreements and rules for organisations to follow if they want to offer services that provide secure and trustworthy evidence of digital identities (who a person is) or attributes (things about them). Legislation to formalise the status of the Framework will form part of the Data Reform Bill, announced in the Queen’s Speech in May 2022.

The Framework is expected to cover the activities of public and commercial services, including the One Login for Government service, and private companies that already know things about their customers and might wish to monetise this data.

Recent examples of the use of facial recognition for identification in Scotland and in the USA raise important questions for the operation of the Framework with regard to the accessibility of services, the need to reconcile the expectations of non-expert users with a workable certification scheme, and the broader regulatory environment for biometric identification.

The Framework follows an earlier call for evidence that sought views on how the Government could support the secure use of digital identities in a way that is fit for the UK’s growing digital economy. The Framework, which also aims to make it more straightforward for individuals to prove something about themselves digitally, is expected to evolve over time in light of the experience accumulated using it in real-world circumstances.

The first commercial application of the Framework will be to enable people to prove their right to work or rent a property, under new guidance for employers and landlords. Certified organisations will be able to use document validation techniques to carry out identity checks on British and Irish citizens with a valid passport. At the time of writing, there are no certified identity providers, although five bodies have applied to become accredited to certify organisations against the Framework.

During the COVID-19 pandemic, work-from-home restrictions highlighted the extent to which many identity-related transactions, such as finalising employment and rental contracts, were reliant on face-to-face meetings and the physical inspection of identity documents.

In response, the UK introduced temporary changes to the guidance for employers carrying out right-to-work checks. These changes included the ability to send scans of documents, rather than originals, and to carry out checks via video calls rather than in person. While these temporary measures alleviated some of the immediate pressure on the process, they remain susceptible to fraud and are not accessible to all.

The Framework can also be seen as a key step towards replacing these temporary fixes with more carefully considered proposals for reliable, business-as-usual processes, for example by certifying digital identity service providers against specified requirements on privacy and data protection, fraud management, security and inclusion.

Some of the techniques for checking an identity are relatively straightforward, for example, basic checks of the passport itself or comparing the data held on the personal details page with that on the passport chip. Organisations may also use facial recognition techniques to compare the face of the individual with the image on their passport.
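To make the ‘relatively straightforward’ checks concrete, the sketch below shows the kind of consistency comparison involved: the personal details printed on the data page should match the copy stored on the passport chip. This is an illustration only; real implementations follow the ICAO Doc 9303 standards and read the chip over an authenticated session, and the field names and values here are hypothetical placeholders, not taken from the Framework or any real product.

```python
# Illustrative sketch only: field names and values are hypothetical.

DOCUMENT_FIELDS = ("surname", "given_names", "date_of_birth",
                   "document_number", "expiry_date")

def fields_match(data_page: dict, chip_data: dict) -> bool:
    """Compare the details printed on the data page with the copy stored
    on the chip; any mismatch suggests the document has been altered."""
    return all(data_page.get(field) == chip_data.get(field)
               for field in DOCUMENT_FIELDS)

# Example with made-up values: a genuine document carries identical data
# in both places, so the check passes.
data_page = {"surname": "EXAMPLE", "given_names": "ALEX",
             "date_of_birth": "1990-01-01", "document_number": "123456789",
             "expiry_date": "2030-01-01"}
chip_data = dict(data_page)
print(fields_match(data_page, chip_data))  # True
```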

Having an automated system for comparing an individual’s face with their photograph on a government-issued document, such as a passport, can simplify employment or rental checks significantly. This is because the process effectively ‘piggybacks’ on the identity checks undertaken when the passport was issued. Additionally, for British and Irish citizens, their right to work and rent properties is derived from their nationality.
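In outline, and as a generic sketch rather than any certified provider’s actual method, the automated comparison reduces to computing a similarity score between two face ‘embeddings’, one derived from a live capture and one from the document photograph, and accepting the match if the score clears a threshold. The threshold value and the toy vectors below are assumptions for illustration.

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norms = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norms

# Illustrative threshold: real systems tune this against measured
# false-accept (fraud) and false-reject (lockout) rates.
MATCH_THRESHOLD = 0.8

def faces_match(live: list[float], document: list[float]) -> bool:
    return cosine_similarity(live, document) >= MATCH_THRESHOLD

# Toy three-dimensional "embeddings" (real models use hundreds of dimensions).
print(faces_match([0.2, 0.9, 0.4], [0.25, 0.85, 0.45]))  # True
```

The threshold is where the trade-offs begin: set too leniently, it lets fraudulent matches through; set too strictly, it rejects legitimate users, a failure mode that matters in the cases discussed below.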

However, not everyone has appropriate government-issued photo identity documents, such as passports. In the UK, passports are only needed for international travel and currently cost at least £75.50. Obtaining or renewing a passport will not be a high priority for all, especially as the cost-of-living crisis hits home. Similarly, the pandemic has significantly decreased the number of young people with a driving licence.

Identity checks are a prerequisite for employment and renting properties, which themselves are integral components of a dignified and fulfilling life. So it is important to ensure that the availability of one technology for this task doesn’t crowd out more inclusive options, particularly when alternative ways of checking someone’s identity are likely to be a lot more costly (to both the individual and the service provider).

Even when individuals do hold the appropriate documents, there are still widely acknowledged problems with automated facial recognition technology. Examples include Uber drivers who lost their jobs when automated face-scanning software failed to recognise them, or passport applications automatically rejected because of wrongly identified ‘problems’ with the photograph.

These issues are often a consequence of biases in the datasets used to train the automated identification systems. In simple terms: if the datasets do not represent society, then the resulting systems will not be usable by everyone in society, even if they have the relevant documents.
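One way this plays out in practice is in a system’s false non-match rate (the rate at which genuine users are rejected) differing across demographic groups. The sketch below computes that breakdown from a set of evaluation results; the groups and figures are invented purely to illustrate the disparity.

```python
from collections import defaultdict

# Hypothetical evaluation records: (demographic group, was the genuine
# user's face accepted?). The values are invented for illustration.
results = [
    ("group_a", True), ("group_a", True), ("group_a", True), ("group_a", False),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

counts = defaultdict(lambda: [0, 0])  # group -> [rejections, trials]
for group, accepted in results:
    counts[group][0] += 0 if accepted else 1
    counts[group][1] += 1

for group, (rejections, trials) in counts.items():
    # Each rejection of a genuine user is a false non-match.
    print(f"{group}: false non-match rate = {rejections / trials:.0%}")
# group_a: false non-match rate = 25%
# group_b: false non-match rate = 75%
```

A headline accuracy figure averaged over everyone can hide exactly this kind of gap, which is why per-group evaluation matters.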

More generally, addressing such biases involves acknowledging that the problem exists across society as a whole, and that the causal relationships involved in these biases are often complex and indirect. Trying to resolve questions of bias is hard and can be undertaken in a superficial or tokenistic way.

Concerns also arise when identity service providers put personal data to surprising uses. For example, the Scottish Government used a third-party identity checking service for its COVID app, to ensure that the right vaccination records were linked to the right person. Facial recognition was a key part of this process.

However, as a recent reprimand from the Information Commissioner’s Office notes, despite the high level of engagement with the Scottish Government, it was only shortly before the app was launched that the ICO learned that the third party intended to keep images provided by users during the registration process for five days, in order to train its proprietary algorithms. Not only did this seem unnecessary for the app’s functioning, but users would not reasonably expect their faces to be used to improve the services offered by a commercial organisation, even if this is stated in the privacy policy for the service.

Alongside formal reprimands, concerns about the use of facial recognition technologies can become political flashpoints when used to support access to public services, voting or social media. The ID.me service in the United States, for individuals claiming unemployment benefits and filing tax returns, is a case in point. The rationale behind the use of ID.me was similar to that in the UK: use facial biometrics to confirm identity against an official document (as well as against databases of faces) as a way of reducing fraud.

According to reports, when the ID.me system failed, individuals could fall back on manual methods of proving their identity, but these under-resourced (human) systems were frequently overwhelmed and failed to provide the level of service expected. As a consequence, individuals found themselves unable to access key government services. The resulting uproar led the US tax authorities to reconsider the role that ID.me (and facial recognition more generally) plays in accessing services.

Just as UK digital identity service providers will need to be certified against the Framework, so ID.me was certified as a best-in-class provider and the services it offered conformed to nationally agreed standards. This highlights a divide between the expectations of ordinary users and the requirements of nationally agreed standards. It is unlikely that this divide can be resolved by simply adjusting the standards.

This raises important challenges for the new Office for Digital Identities and Attributes (OfDIA), initially part of the Department for Digital, Culture, Media and Sport (DCMS), which will provide the interim governance function for the Framework.

The first challenge for OfDIA is to ensure that digital identity services for important activities, such as proving a person’s right to work or rent, are as inclusive as possible. Ongoing work by government and industry on this point must continue to ensure that technologically appealing solutions don’t end up creating a two-tier system, reinforcing existing data divides or producing solutions that are overly burdensome.

A second challenge is reconciling a workable certification scheme with the needs and expectations of the non-expert users of these services. Although it is often claimed that the public may struggle to understand the complexities of something like the Framework, there is growing evidence that people are fully able to understand complex issues when information is presented in accessible, digestible and relevant ways, and to take action for themselves, including opting out of particular forms of processing.

As the ID.me case demonstrates, relying on compliance with standards or certification against the Framework is not necessarily sufficient for maintaining confidence in identity services. Asking people for their views on what additional considerations the certification process should include would go a long way towards securing their support.

A third challenge is ensuring that digital identity service providers are clear about what they do with the data they collect. Improving transparency is an important step, since ignoring privacy policies is natural behaviour, in part because such policies rarely convey information in a way users understand and need. Transparency needs to go beyond privacy policies, however, especially in the absence of clear regulation of biometric data more generally.

One suggestion for OfDIA to consider is to build on the Open Banking Customer Experience Guidelines, which recommend including a clear data monetisation statement that explains how a company makes money from people’s data. When coupled with inclusive and acceptable alternative ways for people to prove things about themselves, such transparency could empower people to make active choices about the digital identity services they wish to engage with.
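As an illustration of what such a statement might look like (this is hypothetical wording, not taken from the Guidelines): ‘We make money by charging the organisation that asked you to prove your identity. We do not sell your photograph, or the information extracted from your documents, to anyone else.’ A plain statement of this kind is far easier to act on than a clause buried in a privacy policy.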

More broadly, it is useful to draw attention to established design principles, such as those of the Government Digital Service. Services should always start with people’s needs and not with the interests of providers or of the organisations using them.


The Ada Lovelace Institute has been researching public attitudes towards, and the governance of, biometric data. The forthcoming Ryder Review, commissioned from an independent legal team led by Matthew Ryder, and our policy report consider the regulations, accountability mechanisms and institutional frameworks required for the responsible deployment of biometric technologies.


Image credit: Maksims Grigorjevs
