Evidence review

Who cares what the public think?

UK public attitudes to regulating data and data-driven technologies

Aidan Peppin

5 May 2022

Reading time: 55 minutes


A draft of this evidence review served as the basis for a virtual roundtable we held on 31 March 2022, convening a cross section of academia, policy and civil society to discuss public attitudes towards the regulation of data in the UK.

Background

The need to meaningfully understand public attitudes towards data regulation has become urgent in the UK.

To ensure data policy and governance are aligned with societal values and needs, and worthy of public trust, it is vital to understand people’s perspectives and experiences in relation to data and data-driven technologies. It is therefore imperative that, as the UK Government develops renewed policy, strategy, guidance and regulation about data, policymakers have a robust understanding of relevant public attitudes.

This briefing paper is intended to support policymakers to build that understanding by presenting a review of evidence about public attitudes towards data that helps address the question: what does the UK public think about data regulation?

Introduction

Data is increasingly central to public services, innovation and civic life. This is highlighted through recent UK Government work, such as the ‘Data: A New Direction’ consultation in autumn 2021. This work aims to revisit data governance to strike a balance between permitting data use and protecting people’s rights.

There is a growing number of published studies on public attitudes towards data. However, in 2020 a comprehensive academic review of this research identified how the complex, context-dependent nature of public attitudes to data renders any single study unlikely to provide a conclusive picture. The authors of that review recommended that policymakers should ‘look beyond headline findings about public perceptions of data practices [and] engage with the breadth of evidence available across academic disciplines, policy domains and the third sector’.1

To support policymakers to engage with the breadth of existing research, we have reviewed evidence from nearly 40 recent studies of UK public attitudes to data. The evidence presented in this paper is drawn from the body of research the Ada Lovelace Institute has used to inform our own policy positions and responses to recent government consultations.

Given the urgent nature of the question of how to regulate data with public support, this paper is not an exhaustive synthesis of every piece of research published on public attitudes towards data. Instead, it represents a curated overview of credible research that is relevant to recent data-related consultations, inquiries and debates.

In our review of this research, five key findings have emerged:

Summary of findings

  1. There is consistent evidence of public support for more and better regulation of data and data-driven technologies. (But more research is needed on what the public expects ‘better’ regulation to look like.)
  2. The UK public want data-driven innovation, and they expect it to be ethical, responsible and focused on public benefit. (Determining what constitutes ‘public benefit’ from data requires ongoing engagement with the public.)
  3. People want clearer information about data practices and how to enact their data rights. (But what this looks like in practice is not yet fully understood.)
  4. Creating a trustworthy data ecosystem is critical to protecting against potential public backlash or resistance. (Emerging research suggests that regulation is a key component in those ecosystems.)
  5. Public concerns around data should not be dismissed as lack of awareness or understanding, and simply raising awareness about the benefits of data will not increase public trust. (More research is needed to understand the connection between awareness of and attitudes to data.)

In this paper we present the evidence that underpins each of these findings. We also highlight gaps in our collective understanding and identify avenues for future research that policymakers, industry, academia and civil society must work together to address.

Detailed findings

1. There is consistent evidence of public support for more and better regulation of data and data-driven technologies

 

But more research is needed on what the public expects ‘better’ regulation to look like.

Findings from many surveys, focus groups and public dialogues conducted in recent years consistently indicate that the UK public does want data and digital regulation to be strengthened, and that any moves to deregulate this area would not achieve public support.

This evidence suggests that people expect regulation – and the governance structures that underpin it – to ensure that people’s rights and interests are protected, their privacy is preserved, and the power of large technology companies and other data controllers is held to account. Many people feel current regulation does not do this effectively. This evidence also shows that better regulation of data is necessary to ensure public trust in the use of data.

Evidence

  • In 2020, a team of researchers from the Living With Data research project published an extensive review of literature related to public attitudes towards data. The review concluded that existing research suggests regulation is a requirement for fairer data practices.2
  • UK research organisation Doteveryone conducted two large-scale surveys of UK public attitudes towards digital technology, in 2018 and 2020. In the 2020 survey, 64% of respondents thought that Government should regulate online services more heavily, even if that comes with disadvantages such as making it harder for small businesses to make money or creating fewer choices for consumers. Respondents identified the Government (54%) and independent regulators (48%) as ‘having the most responsibility for directing the impacts of technology on people and society’. In 2018, Doteveryone reported that 66% of respondents felt the Government should play a role in enforcing rules that ensure people and society are treated fairly by technology companies.3
  • Findings from the first wave of the Centre for Data Ethics and Innovation’s (CDEI) Public Attitudes to Data and AI tracker were published in March 2022. The tracker found that just 26% of the public report knowing at least a fair amount about digital regulation, and that few people express confidence that there are protections in place around digital technologies:

    31% of people agree that ‘the digital sector is regulated enough to protect my interests’, compared with 30% who disagree.

    The survey also found that ‘respondents were more likely to be willing to share data if [strict] rules are [in] place to protect them as users’.4

  • The CDEI’s 2020 Trust in COVID-19 Technology poll found that fewer than half of people (43%) trust that the right rules and regulations are in place to ensure digital technology is used responsibly. A similar proportion (44%) wouldn’t know where to raise their concerns if they were unhappy with how a digital technology was being used.5
  • A 2020 study by UK academics looked at public perceptions of good data management. They found that, of a range of models for data management, the most preferred option was a ‘personal data store’ that would offer individuals direct control over their personal data. The second-most preferred option was a ‘regulatory public body overseeing “how organizations access and use data, acting on behalf of UK citizens”’. Among other experiments the researchers conducted, regulation and regulatory oversight consistently ranked highly among various options, alongside individual control and consent or opt-out mechanisms.6
  • In the context of biometrics specifically, the Ada Lovelace Institute’s 2019 survey found that 55% of people agree that the Government should limit police use of facial recognition to specific circumstances.7 In a subsequent public deliberation exercise called the Citizens’ Biometrics Council, members of the public developed 30 recommendations for the governance of biometric technologies. Nine of these recommendations focused on regulation, legislation and oversight of biometrics, and several specifically called for new regulation for biometrics, beyond the existing GDPR, to ensure people’s rights are protected and data controllers and processors are held to account.8
  • A public dialogue on the ethics of location data, commissioned by the Geospatial Commission and UKRI’s Sciencewise programme, was carried out by Traverse and the Ada Lovelace Institute in 2021. The dialogue participants said that effective regulation, accountability and transparency are essential for ensuring public trust in the use of location data. They also thought that data collectors should be accountable to regulators and data subjects, with consequences for breaches or misuse. Many participants questioned whether current regulation and regulators are effective in achieving this.9
  • A literature review conducted by Administrative Data Research UK in 2020 reported that several studies ‘identified an increase in public acceptance [of data use] after study participants were informed about’ data protection and governance mechanisms.10 This research also suggests that there is public support for penalties if data is misused and for laws to regulate access to data.
  • Results from the Information Commissioner’s Office (ICO) 2020 and 2021 annual public attitudes trackers show that the proportion of people who agree that ‘current laws and regulations sufficiently protect personal information’ increased from 33% to 49% between 2019 and 2020, but dropped to 42% in 2021.11 12
  • In 2021, Which? conducted online deliberative focus groups with 22 people, focusing on specific components of the ‘Data: A New Direction’ consultation.13 They found that some participants ‘called for AI and algorithms to be more regulated than at present, given the potential for negative impacts on people’. Some participants also thought that ‘unchallenged solely automated decisions would further skew the imbalance of power between consumers and businesses and give companies more ways to rescind responsibility if something went wrong.’ 
  • In 2021 Ada conducted citizens’ juries on the use of data during health emergencies, such as the COVID-19 pandemic. The juries found that even in times of health crises, good governance that includes appropriate regulation is essential for public trust in the use of data.14

Future research avenues: data regulation

This evidence sends a strong signal that the UK public want more and better regulation of data-driven technologies. However, more research is needed to understand exactly what the public expects this regulation to look like.

This has been explored in some domains, such as biometrics (for example, Ada’s Citizens’ Biometrics Council asked members of the public to set out their recommendations for the governance of biometrics). But for other domains, such as health or finance, what exactly the public thinks regulators and policymakers should focus on is not yet clear.

More research is also needed to understand how the public considers trade-offs and tensions that arise in the context of data regulation. While the Doteveryone survey, for example, suggested people want tougher regulation even at the expense of consumer choice, this remains an under-explored topic. The Geospatial Commission’s dialogue on location data offered some useful findings here, demonstrating that public dialogue and deliberation are fruitful methods for studying such trade-offs.

Finally, it is important to note that, while we have drawn on research that we feel is robust, some surveys have limited sample sizes, especially in terms of the number of people surveyed from relevant marginalised groups, such as digitally excluded people. Any future research must ensure the views of marginalised groups are meaningfully included.

2. The UK public wants data-driven innovation, but expects it to be ethical, responsible and focused on public benefit

 

Determining what constitutes ‘public benefit’ from data requires ongoing engagement with the public.

Research shows that people expect the Government to support innovation and for public bodies and commercial companies to use new data-driven technologies to improve services and tackle societal problems. At the same time, they expect the Government to use regulation and oversight to limit how and where these technologies can be used.

In our view, evidence that people want both the benefits of data and regulatory limits on it is not inconsistent or contradictory; it is reflective of how the public understand and experience both the potential good and the potential harm of data use. People want the benefits of data-driven innovation to be realised, but to minimise the harms, they want it to be safe, ethical, responsible and to put the good of the public first.

Evidence

  • A series of focus groups and participatory workshops conducted in 2019 by the Royal Society of Arts (RSA), the Open Data Institute and Luminate found that

    people feel positive about the benefits of data-driven technologies, but want greater rights over its use, and to see Government regulate companies’ uses of data.15

  • Our 2019 survey of public attitudes towards the use of facial recognition found support for the technology in certain circumstances. We found 70% support for the use of facial recognition by the police in criminal investigations, 54% support for its use to unlock smartphones and 50% support for its use in airports. This compares with very low support in other circumstances: just 7% in supermarkets to track shopper behaviour, 6% to monitor pupils’ expressions in schools and 4% in hiring processes. From this we concluded that any support for the use of data-driven systems like facial recognition is conditional on the context in which it is deployed, with use cases where there is clear public benefit enjoying more support. Importantly, people’s support for the use of facial recognition in any scenario ‘assumes appropriate safeguards were in place’.7
  • Our Citizens’ Biometrics Council reiterated this conclusion, finding that the use of biometric data and technologies can bring benefits in certain circumstances, but a prerequisite for their use is more effective regulation, independent oversight and strict standards for their development and deployment.8
  • A 2021 public dialogue commissioned by Understanding Patient Data and the National Data Guardian explored how health data can be used for public good. Participants concluded that data use should be subject to a ‘level of governance’ that balances the use of data for innovation with ethical and responsible practice, to ensure public benefit. Understanding Patient Data reported that participants argued for the need to ‘embed on-going evaluation of public benefit throughout the data life cycle’ and that, in the case of NHS data use, ‘public benefit must always outweigh profit.’18 19
  • The Centre for Data Ethics and Innovation conducted a set of focus groups about Trust in Data in 2021. Through the use of 12 case studies, they found that participants were most supportive of data use cases when the benefits to society are clear and substantial, as well as when there is an intuitive and direct link between the data collected and the purposes for its use.20
  • An Imperial College London-led survey of attitudes to health data sharing, published in 2020, found that ‘the more commercial the purpose of the receiving institution (e.g., for an insurance or technology company), the less often respondents were willing to share their anonymised personal health information’ in the UK and the USA.21
  • Deloitte and Reform’s 2017/18 State of the State report included a survey of 1,000 adults across the UK. It found that trust in Government use of data is ‘driven by a belief that it uses data for the good of society […] and its use is regulated’.22
  • In 2020, the Ada Lovelace Institute conducted an online public deliberation on the Government’s use of data-driven technologies to address the COVID-19 crisis. Participants in the dialogue expressed how, even in extreme circumstances like a pandemic, data-driven technologies must comply with data-privacy regulations. Participants expected to see regulators take an active role in overseeing the use of data, and clear standards for its use and development.23
  • In 2021 the UK Geospatial Commission and Sciencewise commissioned a public dialogue on the ethics of location data use. Delivered by Traverse and the Ada Lovelace Institute, the dialogue found that participants thought that why location data is used and who benefits from it are important when considering whether location data use is ethical and trustworthy, and that benefits to members of the public or wider society should be prioritised.24
  • Researchers on the Observatory for Monitoring Data-Driven Approaches to COVID-19 (OMDDAC) programme surveyed UK public perceptions of data sharing for COVID-19 purposes in 2021. They found that people are more willing to share data if it will help address an emergency, but that the nature of the data, the urgency of the issue, and who will access or use the data all affect levels of comfort in data-sharing. The researchers conclude that ‘it cannot be assumed that the urgency of a global pandemic leads to people disregarding their concerns and engaging with data-sharing initiatives’.25
  • Academics from the Me and My Big Data project surveyed the UK public, segmenting respondents by digital literacy. In 2021 they reported that people at all levels of online engagement are more favourable towards data collection when it is used for consumers’ benefit rather than companies’ benefit.26

Future research avenues: public benefit

Evidence that the public expects data-driven innovation both to deliver benefits and to be ethical, responsible and focused on public benefit helps us understand the public’s priorities and desires. However, existing research has not yet offered detailed analysis of what the public consider ‘public benefit’ to be, nor what they expect to happen when tensions arise: for example, if a company uses data to deliver a public service, but increases its market value in the process, do the public consider this to be ‘responsible data use in the public interest’?

Some research explores this in specific contexts, particularly in health, such as Understanding Patient Data and the National Data Guardian’s Putting Good Into Practice report cited above, but more research is needed into how the public think tensions around the proportionate use of data in the public interest should be navigated. In the meantime, while this research is undertaken, policymakers should heed the fact that, for the public, ‘innovation’ is not beneficial unless it is ethical and responsible.

It is also likely that what counts as ‘public benefit’ is context dependent and will vary from case to case. This means that ongoing work, including public participation, will be needed to align data innovation with wider concepts of public benefit.

3. People want clearer information about data practices and how to enact their data rights

 

But what this looks like in practice is not yet fully understood.

There is a large body of research on issues of transparency and people’s rights in relation to data use. This research suggests that people often want more specific, granular and accessible information about what data is collected, who it is used by, what it is used for and what rights data subjects have over that use. Often, people want this information so that they can make informed decisions, such as whether or not to consent to data collection, or who to raise a complaint with.

Evidence

  • In 2021 the CDEI commissioned Britain Thinks to conduct qualitative research with members of the UK public about transparency of algorithmic decision-making systems. They found that participants expected two ‘tiers’ of information that should be made publicly available. ‘Tier 1’ information should be made available to the general public and includes explanations of what data is collected, how that data is used, and how data subjects’ privacy is ensured. ‘Tier 2’ information should contain more technical detail, and be made available for expert scrutiny and for those members of the public with particular interest.27
  • Participants in a series of focus groups conducted by the RSA said that transparency around data use is critical, and that they expect companies and organisations to be clear about how they use data. The RSA’s report concluded that, overall, people want greater honesty and transparency around data use.28
  • The ICO’s 2020 annual tracker found that

    only 37% of people agreed that companies and organisations are open and transparent about how they collect and use personal information,

    and only 31% agreed that it is easy to find out how personal information is stored and used. These figures dropped to 33% and 30% respectively in 2021.29

  • In 2020 researchers at the University of Sheffield conducted a major survey of UK public attitudes towards data practices, called the Living With Data survey. It found that

    83% of respondents want to know who has access to data about them and 80% want to know where data about them is stored.30

  • Understanding Patient Data has reported extensively on transparency and communication around the use of health data. Their research, based on workshops with health professionals and members of the public, has found that the language and processes around health data are confusing for many patients and members of the public. Most people feel clear, accessible explanations of what health data is collected, how it’s used and how it’s protected are crucial to building public trust in data use.31
  • In 2018 the Academy of Medical Sciences conducted a series of dialogues with patients, the public and healthcare professionals. They found that there are ‘strong expectations from patients and the public for transparency around the use of data-driven technologies’.32
  • Ada’s 2020 citizens’ juries on COVID-19 technologies found that the public want a transparent evidence base around uses of data, particularly in high-stakes scenarios. A lack of transparency can generate suspicion and distrust: people feel it should be easy to know what data is held, by whom, for what purpose and for how long.33
  • Online deliberative focus groups conducted by Which? found that participants felt that the ability to challenge AI decisions was a ‘right not a privilege’, and safeguards like consent to data use remain important to the public. Which? reported that ‘consumers want to continue to be able to actively choose which cookies are consented to when they use the internet and what data is collected about them.’34
  • The ICO and the Alan Turing Institute conducted two citizens’ juries in 2019, as part of an initiative called Project ExplAIn.35 These juries found that ‘the importance placed on an explanation for automated decisions’ varies depending on the scenario, and ‘a majority [of participants] felt that explanations for AI decisions should be offered in situations where non-AI decisions come with an explanation.’ In some high-stakes scenarios, or scenarios that are more technical than social, such as medical diagnosis, participants felt system accuracy was more important than transparency.
  • Ofcom’s 2020/21 Adult Media Use & Attitudes Survey reported that ‘the most common reasons internet users gave for allowing companies to collect and use their data were having appropriate reassurances on the protection and use of their data: including that they can opt out at any point and the company will stop using their data; or that their information will not be shared with other companies.’36
  • The 2021 Lloyds Consumer Digital Index found that

    36% of people who are offline say that more transparency about the data organisations have and how they are using it would encourage them to get online.

    44% of those offline say that the ability to easily stop organisations from using their data would encourage them to get online.37

Future research avenues: transparency

Though a vast body of research literature points towards a public desire for transparency, there is limited understanding of what the public expect this to look like in practice. Some studies, such as those conducted by the CDEI and Understanding Patient Data, cited above, have contributed to this understanding, but more detailed research is needed into how the public expect transparency, and the ability to enact their rights, to work in practice.

Further, there is little research that has focused on the link between transparency and understanding. Surveys about public awareness of data practices and GDPR, for instance, often do not explore whether transparency increases awareness, or whether awareness is sufficient or satisfactory. Nor does this work drill down into the specifics of people’s awareness: what people know, as opposed to what they are simply ‘aware’ of. This ‘awareness/understanding’ challenge is common in public attitudes studies, particularly polling, and requires further research in relation to data. Qualitative and co-production methodologies will therefore likely be key to better understanding what meaningful transparency should look like in practice.

4. Creating a trustworthy data ecosystem is critical to protecting against potential public backlash or resistance

 

Emerging research suggests that regulation is a key component in those ecosystems.

Building or ensuring ‘public trust’ in the use of data is a common and laudable priority for policymakers and technology innovators. This is for good reason: without public trust, data innovation cannot proceed with social licence or legitimacy, and there is a risk of public backlash or resistance. Instances of this backlash are already evident in responses to the Cambridge Analytica scandal,38 and more recently, the responses to the A-level grading algorithm39 and the GP Data for Planning and Research (GPDPR) programme.40

There is a rich body of research relating to public trust in data use. Our analysis of this evidence suggests that efforts to ‘build public trust’ too often place the burden on the public to be ‘more trusting’, and do little to address the underlying issue. Instead, policymakers and regulators should focus on encouraging more trustworthy practices from data innovators and data processors.

Evidence

  • A 2014 study by Ipsos MORI for the Royal Statistical Society found a ‘data trust deficit’ in the UK, where the NHS and public institutions are among the most trusted when it comes to data use, but social media companies, technology companies and retail companies are the least trusted, and national and local government bodies fare somewhere in the middle.41
  • In 2022, the CDEI’s Public Attitudes to Data and AI tracker showed that the ‘data trust deficit’ found by the Royal Statistical Society remains today. Average trust in managing data was highest for the NHS (74%), banks (66%) and universities (63%). It was lowest for social media companies (33%) and big tech (46%). Local councils (51%), utilities providers (51%) and local independent businesses (49%) fared in the middle. Notably, the Government scored low in this survey, just below big tech at 44%. The CDEI reported that people’s willingness to share data is ‘closely related to trust in the organisation [that uses the data] to act in the public’s best interest’.42
  • Doteveryone’s 2020 survey found high levels of distrust in technology companies: only 19% of respondents agreed that tech companies design products and services with users’ best interests in mind, with qualitative data linking these responses to a lack of trust.43 Their 2018 survey found that

    only 25% of people say they trust technology companies ‘to do the right thing’.44

  • In 2017 and 2018, the Open Data Institute surveyed public attitudes towards sharing personal data across the UK, France, Germany, Belgium and the Netherlands. It found only 2% of people in the UK trust social media companies with data, 22% trust online retailers, and 94% say trust is an important factor when deciding whether or not to share data. Findings are similar in the European countries surveyed.45
  • The Living With Data survey found that trust in data use breaks down into three categories of trust in organisations to: a) keep personal data safe, b) gather and analyse data in responsible ways, and c) be open and transparent about what they do with data. Using this framework, they found that

    67% to 69% of people trust the NHS with data, contrasted against only 5% of people who trust social media companies

    across all three categories.46

  • A 2020 report by the Ada Lovelace Institute drew on multiple public dialogues and concluded that public trust in data use is dependent on not just privacy and data protection, but on whether digital interventions are effective and whether the organisations involved were perceived to be trustworthy.47
  • The ICO found that public trust in companies and organisations storing and using data ‘shifted towards the neutral middle point in 2020’. Low trust dropped from 38% to 28%, and high trust dropped from 32% to 27%, while ‘neither trusting nor distrusting’ increased from 30% to 45%, suggesting growing ambivalence around data use.48 Alongside these shifts, the ICO’s findings also reiterate the ‘data trust deficit’: higher trust in public services using personal data, lower trust in social media and online platforms, and mid-level trust in government.

Future research avenues: trust

Creating ‘trustworthy data ecosystems’ is a complex challenge and will not be solved by policymakers or regulation alone. Our view is that the evidence presented above shows that trustworthy data ecosystems are core to public trust, but it does not show how to achieve those ecosystems. Entire research communities are dedicated to tackling various components that might be part of such ecosystems – such as transparency around data processors’ intentions, appropriate governance and rules, effective safeguards and protections for data subjects – but it is clear that it will be some time before this work yields change at scale.

We recommend that, to complement this emerging research field, policymakers and regulators should shift their mindset away from ‘building public trust’ and focus instead on designing policy, legislation and strategies that encourage more trustworthy data practices from innovators and data controllers. This will go a long way towards fostering trustworthy data ecosystems, incentivising more trustworthy practices, and realising the ultimate goal: widespread public trust in data use.

5. Public concerns around data should not be dismissed as lack of awareness or understanding, and simply raising awareness of the benefits of data will not increase public trust

 

More research is needed to understand the connection between awareness of and attitudes to data.

Public concerns around an issue are commonly attributed to a lack of understanding, knowledge or awareness. This is particularly so in the case of data, where there is often a tendency to reach for solutions that would ‘raise public awareness’ or ‘inform the public’ about the benefits of data and related regulation.

While there is some evidence of correlation between higher awareness of data use and higher support for it, the assumption that higher awareness causes higher support is flawed. Beyond the correlation-causation fallacy,49 this logic follows a widely criticised ‘deficit model’ of the public understanding of science and technology, which falsely assumes that the more a person knows about a technoscientific issue, the more they will support it.50 Such assumptions fail to recognise that people’s concerns about data arise at both high and low levels of understanding, and that those concerns often persist or strengthen after people are given information about data use.

Moreover, there is increasing evidence that public trust in and support for data-driven technologies is correlated with factors such as digital exclusion or ethnicity, rather than awareness. This points to a corollary argument for taking public trust seriously: in an increasingly complex digital world, trust in data is likely to be a contributing factor to digital inequality.

Evidence

  • Multiple public deliberations and public dialogues – such as our Citizens’ Biometrics Council, the Geospatial Commission’s public dialogue on location data ethics, and multiple citizens’ juries on the use of health data – provide strong evidence that informing people about data does not necessarily mean they will become more supportive of its use. These methods are designed to increase awareness and understanding by providing participants with information, access to experts, and time and support to consider this evidence carefully. Such methods consistently report that as participants develop informed views, they recognise the benefits of data-driven innovation, but this does not diminish their concerns about harms that might arise from data use. Importantly, many dialogues about data end with people recognising both the benefits and disbenefits of data, and concluding that neither wide deregulation nor blanket bans will work: instead, nuanced approaches to regulation are required.
  • In 2021, Which? conducted online deliberative focus groups with 22 people, focusing on specific components of the ‘Data: A New Direction’ consultation. Which? reported that ‘consumers are fully able to understand complex issues such as automated decision-making when they are presented with information in accessible, digestible and relevant ways.’ They also found that, following consideration of information about automated decision-making and cookie notices, the participants showed expectations for better regulation and oversight that ensures data subjects have control over data use and their rights are protected (see above).51
  • The Living With Data survey, mentioned above, found that people want to know more about data uses, but that those who know most about them are the most concerned. Grouping respondents into four clusters, researchers found that the most well-informed respondents are more likely to be critical of data practices, and moderately well-informed respondents are more likely to be cautious. In other words, people who are better informed about data uses are more likely to hold negative attitudes towards them. This suggests that increased awareness of data uses does not result in increased trust or acceptance.52
  • Focus groups conducted by Britain Thinks for the CDEI in 2021 found people often have negative views on data use because the bad examples are more memorable. These findings suggest these negative views are not a result of lack of awareness or understanding.53
  • Academics on the ‘Me and My Big Data’ project published research in 2021 about the UK public’s data literacy, outlining five categories that can be used to describe people’s knowledge, understanding and awareness about data and related issues. They found an overall majority of people are ‘uncomfortable with third-party sharing of data’. ‘Limited users’ – those with the lowest data literacy – are the second most uncomfortable with third-party sharing of personal data (71%), and ‘General users’ – who have just-above average data literacy – are the most uncomfortable with third-party sharing of personal data (74%). Those with much higher-than-average data literacy scores are ‘happier’ with data collection to deliver consumer benefit, but a majority of them (66%) still report discomfort around third-party data use.54
  • According to the Lloyds Consumer Digital Index, among the 14.9 million people in the UK with low digital engagement, 74% are concerned about using (digital) sites/tools to enter personal details. Among the 9.8 million with very high digital engagement, 58% are concerned about using sites/tools to enter personal details. 51% of non-users say they are worried about privacy and security and having their identity taken, and 44% say they are worried about how organisations use their data (up by more than 10 percentage points since 2020).37
  • Findings from the 2019 Oxford Internet Survey show that 70% of respondents are not comfortable with companies tracking them online, and non-users are more likely to be concerned about privacy threats online (72% versus 52% among users). Only 29% of non-users think that technology is making things better. It is important to note that higher concern is related to usage, not necessarily awareness.56
  • The ICO’s 2020 trust and confidence survey suggests that different demographic groups hold different views about different kinds of data use. It found that ‘Non-BAME respondents (60%) are significantly more likely to have a high level of trust and confidence in the Police storing and using their personal information than BAME respondents (50%).’57

Future research avenues: awareness of data

The studies cited here have drawn connections between understanding and attitudes, but they have not explicitly explored this connection. This evidence suggests that understanding alone does not engender trust; in fact, it may generate more critical views of data. It is also increasingly clear that trust is integral to closing some of the digital divides that contribute to unequal experiences and outcomes for people online, but exactly how trust and digital divides are connected remains poorly understood.

This means that, as noted by academics on the Living With Data project, more ‘analysis is needed [to] understand the relationship between awareness or understanding of data uses and attitudes towards them.’58

Conclusion

It is clear that data – and the technologies built on it – will continue to define our societies and affect every member of the public. It is therefore paramount that people’s perspectives, attitudes and experiences directly shape the way data is governed. This is vital if we want the data-driven world we build to work for everyone, and if the UK wants to be world-leading in data governance and innovation.

The research reviewed here represents only part of a vast body of work conducted in academia, civil society, the public sector and industry. Much of this research covers public attitudes towards data beyond just regulation: covering issues like inequality, privacy, agency and more. As data and data-driven technologies become increasingly central to our everyday lives, it is ever more important to bring these evidence-based insights together, identify key messages and themes, and develop policy that benefits people and society first and foremost.

The research reviewed for this briefing raises further questions that researchers and policymakers must consider:

  • How do the public expect regulation to balance the minimisation of harm with realising the benefits of data innovation?
  • How do the public define and determine what constitutes public benefit in data innovation?
  • What practical mechanisms for transparency meet public expectations?
  • What can public perspectives tell us about what trustworthy data use looks like in practice?
  • How can we build a more in-depth and robust understanding of the relationship between people’s awareness of data practices and their attitudes towards them?

In our report Participatory Data Stewardship, we describe a framework for involving the public in the governance of data.59 One approach is to conduct regular surveys that track public attitudes over time, and efforts to conduct such surveys will be informative for policymakers.60 For these surveys to be meaningful, they will need to ensure granular representation of digitally excluded and other marginalised groups.

However, the value- and context-based nature of many of these questions – such as what constitutes a use of data for the public benefit – means traditional research methods may struggle to provide concrete answers or meaningfully incorporate public perspectives. Here, more participatory, deliberative and dialogue-based methods will be required, such as citizens’ assemblies, Government-led public dialogue, and co-design or ethnographic practices. More experimental methods will be needed too, such as randomised controlled trials or futures thinking. These methods complement public opinion surveys, because participants are supported to give informed and reasoned views, which are of more value to effective, evidence-based policymaking than survey responses that cannot offer the same depth of consideration.

There is one other key finding that this evidence review offers: the value of public participation lies not only in helping align data use with societal values, but also in offering vital signals for what trustworthy, responsible and ethical data use looks like in practice.

We conclude this review by recommending that the Government, academia, industry and civil society work together to ensure public participation, engagement and attitudes research meaningfully informs and is embedded in the UK’s future data regulation.

 

 

Notes on the evidence included in this paper

  • Public attitudes research often describes public ‘expectations’. Where we have reported what the public ‘expect’ in this review, we use the term to mean what the public feel is necessary from data practices and regulation. ‘Expectation’, in this sense, does not refer to what people predict will happen.
  • Data and digital technologies (like the internet, artificial intelligence and computer algorithms) are deeply intertwined. In this review, we focus on research into public attitudes towards data specifically, but draw on research about digital technologies more broadly where we feel it is applicable, relevant and informative.
  • We focus on research conducted in the UK within recent years. The oldest evidence included dates from 2014, and the vast majority has been published since 2018. We chose this focus to ensure findings are relevant, given that events of recent years have had a profound influence on public attitudes towards technology, such as the Cambridge Analytica scandal, the growth and prominence of large technology companies in society and the impacts of the COVID-19 pandemic.
  • Various methodologies are used in the research we have cited, from national online surveys to deliberative dialogues, qualitative focus groups and more. Each of these methods has different strengths and limitations. We place no hierarchy on the value of any one particular method; each has a role to play, and the strengths of one approach complement the limits of another. It is the collective value of drawing from a range of robust methods that we prioritise, as this helps to address the complex and context-dependent nature of public attitudes towards data.


  1. Kennedy, H. et al. (2020). Public understanding and perceptions of data practices: a review of existing research. University of Sheffield. Available at: https://livingwithdata.org/project/wp-content/uploads/2020/05/living-with-data-2020-review-of-existing-research.pdf.
  2. Kennedy, H. et al. (2020). Public understanding and perceptions of data practices: a review of existing research. University of Sheffield. Available at: https://livingwithdata.org/project/wp-content/uploads/2020/05/living-with-data-2020-review-of-existing-research.pdf
  3. Miller, C., Kitcher, H., Perera, K., Abiola, A., (2020) People, Power and Technology: The 2020 Digital Attitudes Report. London: doteveryone. Available at: https://doteveryone.org.uk/wp-content/uploads/2020/05/PPT-2020_Soft-Copy.pdf (Accessed: 4 March 2021); and Miller, C., Coldicutt, R., Kos, A., (2018) People, Power, Technology. London: doteveryone. Available at: https://doteveryone.org.uk/wp-content/uploads/2018/06/People-Power-and-Technology-Doteveryone-Digital-Attitudes-Report-2018.compressed.pdf (Accessed: 30 November 2021).
  4. Centre for Data Ethics and Innovation. (2022). Public Attitudes to Data and AI Tracker: Wave 1, p.35. Available at: https://www.gov.uk/government/publications/public-attitudes-to-data-and-ai-tracker-survey (Accessed: 15 November 2021).
  5. Centre for Data Ethics and Innovation. (2020). Trust in technology: COVID-19. Available at: https://cdei.blog.gov.uk/wp-content/uploads/sites/236/2020/07/CDEI-Trust-in-Technology-Public-Attitudes-Survey-1.pdf (Accessed: 4 March 2021).
  6. Hartman, T. et al. (2020). ‘Public perceptions of good data management: Findings from a UK-based survey’, Big Data & Society, 7(1). doi: 10.1177/2053951720935616.
  7. Ada Lovelace Institute. (2019). Beyond face value: public attitudes to facial recognition technology. Available at: https://www.adalovelaceinstitute.org/report/beyond-face-value-public-attitudes-to-facial-recognition-technology/ (Accessed: 23 February 2021).
  8. Peppin, A., Patel, R. and Parker, I. (2021). The Citizens’ Biometrics Council. Ada Lovelace Institute. Available at: https://www.adalovelaceinstitute.org/report/citizens-biometrics-council/ (Accessed: 29 March 2022).
  9. McCool, S., Maxwell, M., Peppin, A., et al. (2021). Public dialogue on location data ethics. Geospatial Commission, Traverse, Ada Lovelace Institute. Available at: https://www.gov.uk/government/publications/public-dialogue-on-location-data-ethics (Accessed: 28 January 2022).
  10. Waind, E. (2020). Trust, Security and Public Interest: Striking the Balance, p.18. Administrative Data Research UK. Available at: https://www.adruk.org/fileadmin/uploads/adruk/Trust_Security_and_Public_Interest-_Striking_the_Balance-_ADR_UK_2020.pdf (Accessed: 2 December 2021).
  11. Worledge, M. and Bamford, M. (2020). ICO Trust and Confidence Report. Harris Interactive and Information Commissioner’s Office. Available at: https://ico.org.uk/media/about-the-ico/documents/2618178/ico-trust-and-confidence-report-2020.pdf (Accessed: 29 March 2022).
  12. Worledge, M. and Bamford, M. (2021) ICO Annual track findings, 2021. Information Commissioner’s Office. Available at: https://ico.org.uk/media/about-the-ico/documents/2620165/ico-trust-and-confidence-report-290621.pdf (Accessed: 29 March 2022).
  13. Which? (2021). The Consumer Voice: Automated Decision Making and Cookie Consents proposed by “Data: A new direction”. Available at: https://www.which.co.uk/policy/digital/8426/consumerdatadirection. (Accessed: 19 November 2021).
  14. Peppin, A., Patel, R., Alnemr, N., Machirori, M. and Gibbon, K. (forthcoming) Report on Citizens’ Juries on data governance during pandemics. Ada Lovelace Institute.
  15. Samson, R., Gibbon, K. and Scott, A. (2019). About Data About Us. The RSA. Available at: https://www.thersa.org/globalassets/pdfs/reports/data-about-us-final-report.pdf (Accessed: 2 December 2021).
  16. Ada Lovelace Institute. (2019). Beyond face value: public attitudes to facial recognition technology. Available at: https://www.adalovelaceinstitute.org/report/beyond-face-value-public-attitudes-to-facial-recognition-technology/ (Accessed: 23 February 2021).
  17. Peppin, A., Patel, R. and Parker, I. (2021). The Citizens’ Biometrics Council. Ada Lovelace Institute. Available at: https://www.adalovelaceinstitute.org/report/citizens-biometrics-council/ (Accessed: 29 March 2022).
  18. Hopkins Van Mil. (2021). Putting Good into Practice: A public dialogue on making public benefit assessments when using health and care data. Available at: https://www.gov.uk/government/publications/putting-good-into-practice-a-public-dialogue-on-making-public-benefit-assessments-when-using-health-and-care-data (Accessed: 15 April 2021).
  19. Harrison, T. (2021). ‘What counts as a “public benefit” for data use?’. Understanding Patient Data. Available at: http://understandingpatientdata.org.uk/news/what-counts-public-benefit-data-use (Accessed: 28 January 2022).
  20. CDEI and Britain Thinks. (2021). Trust in Data, p.33. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1049179/Trust_In_Data_-_Publishable_Report__1.pdf (Accessed: 17 March 2022).
  21. Ghafur, S. et al. (2020). ‘Public perceptions on data sharing: key insights from the UK and the USA’. The Lancet Digital Health, 2(9), pp. e444–e446. doi: 10.1016/S2589-7500(20)30161-8
  22. Deloitte and Reform. (2018). Citizens, government and business: the state of the State 2017-18, p.3. Available at: https://www2.deloitte.com/content/dam/Deloitte/uk/Documents/public-sector/deloitte-uk-the-state-of-the-state-report-2017.pdf (Accessed: 1 December 2021).
  23. Ada Lovelace Institute and Traverse. (2020). Confidence in a crisis? Available at: https://www.adalovelaceinstitute.org/report/confidence-in-crisis-building-public-trust-contact-tracing-app/ (Accessed: 4 March 2021).
  24. McCool, S. et al. (2021). Public dialogue on location data ethics. Geospatial Commission, Traverse, Ada Lovelace Institute. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1040807/Accessible_Public_dialogue_on_location_data_ethics_Engagement_report.pdf.
  25. Sutton, S. et al. (2021). Survey of Public Perceptions of Data Sharing for COVID-19 related purposes. The Observatory for Monitoring Data-Driven Approaches to COVID-19 (OMDDAC). Available at: https://www.omddac.org.uk/wp-content/uploads/2021/08/WP3-Snapshot.pdf (Accessed: 7 December 2021).
  26. Yates, P. S. J. et al. (2021). Understanding Citizens Data Literacies Research Report, p. 125. Available at: https://www.liverpool.ac.uk/media/livacuk/humanitiesampsocialsciences/meandmybiddata/Understanding,Citizens,Data,Literacies,Research,,Report,Final.pdf (Accessed: 18 January 2022).
  27. Centre for Data Ethics and Innovation. (2021). Complete transparency, complete simplicity. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/995014/Complete_transparency__complete_simplicity_-_Accessible.pdf.
  28. Samson, R., Gibbon, K. and Scott. A. (2019). About Data About Us. The RSA. Available at: https://www.thersa.org/globalassets/pdfs/reports/data-about-us-final-report.pdf (Accessed: 2 December 2021).
  29. Worledge, M. and Bamford, M. (2021). ICO Annual track findings, 2021. Information Commissioner’s Office. p.21. Available at: https://ico.org.uk/media/about-the-ico/documents/2620165/ico-trust-and-confidence-report-290621.pdf (Accessed: 29 March 2022).
  30. Kennedy, H. et al. (2021) Living with Data survey report. University of Sheffield. Available at: https://livingwithdata.org/resources/living-with-data-survey-results/.
  31. Understanding Patient Data. (2017). What are the best words to use when talking about data? Available at: http://understandingpatientdata.org.uk/what-are-best-words-use-when-talking-about-data (Accessed: 2 December 2021).
  32. The Academy of Medical Sciences. (2018). Our data-driven future in healthcare. Available at: https://acmedsci.ac.uk/file-download/74634438 (Accessed: 2 December 2021).
  33. Ada Lovelace Institute and Traverse (2020) Confidence in a crisis? Available at: https://www.adalovelaceinstitute.org/report/confidence-in-crisis-building-public-trust-contact-tracing-app/ (Accessed: 4 March 2021).
  34. Which? (2021). The Consumer Voice: Automated Decision Making and Cookie Consents proposed by “Data: A new direction”, p.8. Available at: https://www.which.co.uk/policy/digital/8426/consumerdatadirection (Accessed: 19 November 2021).
  35. Citizens’ Juries C.I.C. and Jefferson Centre. (2019). Artificial intelligence (AI) & explainability: Citizens’ Juries Report. Available at: http://assets.mhs.manchester.ac.uk/gmpstrc/C4-AI-citizens-juries-report.pdf (Accessed: 18 January 2022).
  36. Ofcom. (2021). Adults’ Media Use and Attitudes 2020/21, p.2. Available at: https://www.ofcom.org.uk/__data/assets/pdf_file/0025/217834/adults-media-use-and-attitudes-report-2020-21.pdf.
  37. Lloyds Bank. (2021). Consumer Digital Index. Available at: https://www.lloydsbank.com/banking-with-us/whats-happening/consumer-digital-index.html (Accessed: 31 January 2022).
  38. Butow, D. (2018). ‘Trust in Facebook has dropped by 66 percent since the Cambridge Analytica scandal’. NBC News. Available at: https://www.nbcnews.com/business/consumer/trust-facebook-has-dropped-51-percent-cambridge-analytica-scandal-n867011 (Accessed: 3 March 2022).
  39. Porter, J. (2020). ‘UK ditches exam results generated by biased algorithm after student protests’. The Verge. Available at: https://www.theverge.com/2020/8/17/21372045/uk-a-level-results-algorithm-biased-coronavirus-covid-19-pandemic-university-applications (Accessed: 3 March 2022).
  40. Jayanetti, C. (2021). ‘NHS data grab on hold as millions opt out’. The Observer. Available at: https://www.theguardian.com/society/2021/aug/22/nhs-data-grab-on-hold-as-millions-opt-out (Accessed: 3 March 2022).
  41. Varley-Winter, O. and Shah, H. (2014). Royal Statistical Society research on trust in data and attitudes toward data use and data sharing. Royal Statistical Society. Available at: https://www.statslife.org.uk/images/pdf/rss-data-trust-data-sharing-attitudes-research-note.pdf (Accessed: 30 March 2021).
  42. Centre for Data Ethics and Innovation. (2022). Public Attitudes to Data and AI Tracker: Wave 1, pp.31-32. Available at: https://www.gov.uk/government/publications/public-attitudes-to-data-and-ai-tracker-survey (Accessed: 15 November 2021).
  43. Miller, C., Kitcher, H., Perera, K., Abiola, A., (2020) People, Power and Technology: The 2020 Digital Attitudes Report. London: doteveryone. Available at: https://doteveryone.org.uk/wp-content/uploads/2020/05/PPT-2020_Soft-Copy.pdf (Accessed: 4 March 2021).
  44. Miller, C., Coldicutt, R. and Kos, A., (2018) People, Power, Technology. London: doteveryone. Available at: https://doteveryone.org.uk/wp-content/uploads/2018/06/People-Power-and-Technology-Doteveryone-Digital-Attitudes-Report-2018.compressed.pdf (Accessed: 30 November 2021).
  45. Open Data Institute. (2018). ‘Who do we trust with personal data?’. Available at: https://theodi.org/article/who-do-we-trust-with-personal-data-odi-commissioned-survey-reveals-most-and-least-trusted-sectors-across-europe/ (Accessed: 4 March 2021).
  46. Kennedy, H. et al. (2021). Living with Data survey report, p.27. University of Sheffield. Available at: https://livingwithdata.org/resources/living-with-data-survey-results/.
  47. Ada Lovelace Institute. (2020). No green lights, no red lines. Available at: https://www.adalovelaceinstitute.org/report/covid-19-no-green-lights-no-red-lines/ (Accessed: 4 March 2021).
  48. Worledge, M. and Bamford, M. (2020). ICO Trust and Confidence Report. Harris Interactive and Information Commissioner’s Office. Available at: https://ico.org.uk/media/about-the-ico/documents/2618178/ico-trust-and-confidence-report-2020.pdf. (Accessed: 29 March 2022).
  49. See: Hansen, H. (2020). ‘Fallacies’, in Zalta, E. N. (ed.) The Stanford Encyclopedia of Philosophy, Summer 2020 edition. Metaphysics Research Lab, Stanford University. Available at: https://plato.stanford.edu/archives/sum2020/entries/fallacies/ (Accessed: 31 January 2022).
  50. Bauer, M. W., Allum, N. and Miller, S. (2007). ‘What can we learn from 25 years of PUS survey research? Liberating and expanding the agenda’. Public Understanding of Science, 16(1), pp. 79-95. doi: 10.1177/0963662506071287
  51. Which? (2021). The Consumer Voice: Automated Decision Making and Cookie Consents proposed by “Data: A new direction”. Available at: https://www.which.co.uk/policy/digital/8426/consumerdatadirection
  52. Kennedy, H. et al. (2021). Living with Data survey report. University of Sheffield. Available at: https://livingwithdata.org/resources/living-with-data-survey-results/. (Accessed: 29 March 2022).
  53. CDEI and Britain Thinks. (2021). Trust in Data. Available at: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/1049179/Trust_In_Data_-_Publishable_Report__1.pdf (Accessed: 17 March 2022).
  54. Yates, P.S.J. et al. (2021). Understanding Citizens Data Literacies Research Report, p. 125. Available at: https://www.liverpool.ac.uk/media/livacuk/humanitiesampsocialsciences/meandmybiddata/Understanding,Citizens,Data,Literacies,Research,,Report,Final.pdf (Accessed: 18 January 2022).
  55. Lloyds Bank. (2021). Consumer Digital Index. Available at: https://www.lloydsbank.com/banking-with-us/whats-happening/consumer-digital-index.html (Accessed: 31 January 2022).
  56. Blank, G., Dutton, W. H. and Lefkowitz, J. (2019). Perceived Threats to Privacy Online: The Internet in Britain, the Oxford Internet Survey, 2019. SSRN Scholarly Paper ID 3522106. Rochester, NY: Social Science Research Network. Available at: https://doi.org/10.2139/ssrn.3522106
  57. Worledge, M. and Bamford, M. (2020). ICO Trust and Confidence Report. Harris Interactive and Information Commissioner’s Office. Available at: https://ico.org.uk/media/about-the-ico/documents/2618178/ico-trust-and-confidence-report-2020.pdf.
  58. Kennedy, H. et al. (2021). Living with Data survey report. University of Sheffield. Available at: https://livingwithdata.org/project/wp-content/uploads/2021/10/living-with-data-2020-survey-full-report-final-v2.pdf
  59. Ada Lovelace Institute. (2021). Participatory data stewardship. Available at: https://www.adalovelaceinstitute.org/report/participatory-data-stewardship/ (Accessed: 10 January 2022).
  60. The CDEI public attitudes tracker is a good example of such regular surveying. See: Centre for Data Ethics and Innovation. (2022). Public Attitudes to Data and AI Tracker: Wave 1. Available at: https://www.gov.uk/government/publications/public-attitudes-to-data-and-ai-tracker-survey (Accessed: 15 November 2021).
