On 9 September 2020, the UK Government published its National Data Strategy for open consultation which closed on 9 December 2020.
The strategy aims to explore how the UK can enable better use of data ‘across businesses, government, civil society and individuals’ and centres on four pillars: data foundations, data skills, data availability and responsible data.
The Ada Lovelace Institute welcomes this consultation and supports the Government’s aim to ensure data is used responsibly and in the interests of people and society. If used well, data offer substantial opportunities for social wellbeing and more effective public services, through research and tailored interventions.
While we applaud the immense effort involved in distilling a single strategy on data (and all its uses) into one coherent document, we have some reflections on how it could go further.
We have organised our thinking around developing commitments in the strategy – particularly on transparency. We also argue there are some points in the strategy, and the missions themselves, where priorities or emphasis need reworking. Finally, we make suggestions about missing elements that would strengthen the UK’s approach to data. Here we summarise the main points made in our consultation response.
Moving from principles to practice – responsible data
To inform our response to the consultation, we aimed to stimulate discussion around whether the priorities outlined in the National Data Strategy will add up to responsible data practice, and to consider what might be missing.
One of the major challenges for responsible data is putting principles and aims into meaningful practice, so we ask how the Government might achieve its objectives with data, drawing on international practice and the latest research.
Public sector transparency measures should create meaningful understanding
A commitment to greater Government transparency is welcome and an essential component of greater accountability. Transparency has become a key value in the field of data ethics; however, it has in the past tended to take the form of high-level rhetorical positioning rather than substantive proposals. To add value, the Government needs to ensure it is committed to creating meaningful understanding of public sector algorithmic practice. See our article on Meaningful transparency and (in)visible algorithms for a discussion of the challenges in delivering meaningful transparency, in particular the varying goals transparency can serve.
To be meaningful, transparency mechanisms must cover a full evaluation of the sociotechnical system around an algorithm, including explicit articulation of values being propagated through the systems. Models to consider include Helsinki and Amsterdam’s algorithmic registers.
We recommend that the NDS goes further in its transparency commitments by adopting a mandatory transparency obligation on public sector organisations using algorithms, as recommended in the CDEI’s recent bias report. Building such an obligation, and indeed developing a suitable mechanism or register for transparency, will require the development of a shared language. Currently, the lack of common terminology around ‘algorithms’, ‘AI’ and ‘ADMS’ makes it difficult to clarify the object of study. In considering this, the team may be interested in Ada’s work with Dr Michael Veale developing a taxonomy of ADMS.1
Plans for transparency need accountability more clearly built-in
However well designed, transparency mechanisms do not equal accountability. The NDS must include commitments to strengthen accountability more explicitly in the work linked to transparency: there have been several high-profile examples of harmful data practice that was only stopped through legal challenge or public backlash, damaging trust in data practice.
Strengthening ongoing monitoring and evaluative feedback loops would improve both practice and trust. Options the Government could consider include expanded requirements for a fuller public ‘umbrella’ impact assessment, combining data protection, algorithmic, equalities and human rights impact assessments into a single process.2 As our Examining the Black Box report highlights, algorithm audits and impact assessments are useful tools for regulators and third-party auditors to evaluate the potential harms and effects of an algorithmic system prior to its launch and after its deployment.
The NDS could strengthen public accountability by setting out requirements for audits and impact assessments and requiring documentation to be made accessible to third-party assessors.
Another mechanism for preventative accountability would be to build in participatory structures for citizens to be involved in scrutiny and decision-making. We recommend that more explicit consideration of, and consultation with, individuals who may face risks from algorithmic systems is built in.3
A change in emphasis
We also identify areas in the strategy, and the missions themselves, where priorities or emphasis need reworking. These include:
- Acknowledging risks as well as opportunities
- Not presenting regulation as a barrier to innovation
- Aiming for quality over quantity of data
- Distinguishing between different types of data
- Personal data having value does not equal commodification
- Focusing on building trustworthy uses of data over building trust in the use of data
Acknowledge risks as well as opportunities
Exploring the very real risks data can pose, and acknowledging instances where the Government’s own practice has fallen short, could offer useful lessons for others. This, alongside clearer structures to consider and mitigate risks, or to assess whether to move forward, would give actors confidence to move considerately, sustainably and without fear of backlash – a real and legitimate anxiety.
Effective mechanisms for people to critically engage with and scrutinise different uses of data and questionable technology should be encouraged as part of shaping the ambitions for achieving wide public benefits from the responsible use of data. There is a clear opportunity for the National Data Strategy to articulate how it would enable individuals and a wide range of groups in society to contribute to a national vision for data that benefits us all, and to resource a range of initiatives that seek to balance the imperative to innovate with the imperative to ensure that the benefits of innovation are fairly and inclusively distributed.
Regulation should not be presented as a barrier to innovation
The National Data Strategy’s underlying messages are around reaping ‘the benefits of greater data use’ and maintaining ‘a fit-for-purpose legal and regulatory regime’ while emphasising that the UK’s ‘data regime will support vibrant competition and innovation, building trust and maintaining high data protection standards without creating unnecessary barriers to data use’.
The NDS should be cautious about any language which may be perceived as setting regulation as a barrier to innovation.
Data protection legislation can be perceived as complex and difficult to work with. However, these difficulties may stem from perceived burdens and a lack of knowledge and understanding, rather than from actual barriers in the law.
Amid rapid technological change, a responsible approach requires creating incentives for companies to engage with data protection regulation early in the process and providing the necessary clarification to innovators. Better regulatory understanding equips organisations to innovate more effectively and efficiently and to get the best out of technological advancements.
Innovative efforts that concentrate on data protection by design and by default, and that check for compliance early in the process, would not only yield better outcomes for society but also offer competitive advantages.
In this light, compliance is an enabler of responsible innovation and represents a competitive advantage. In other words, innovation goes hand in hand with regulation, mutually supporting and reinforcing one another.
Aim for quality rather than quantity
One prevalent narrative around data is to collect as much as possible now and consider what it can be used for later. But it is well understood that more data does not in itself guarantee better decision-making.
Rather, organisations can improve and innovate better when they focus on the quality of data: data that is collected for a specific purpose, and is proportionate, necessary and relevant. Incorrect and poor-quality data is cumbersome and costly, reducing productivity and hindering development. The shift from sheer quantity of data to better, more thoughtful data collection opens up important business opportunities.
The UK Government has an opportunity to change the paradigm from ‘more data is better’ and break with the ‘move fast and break things’ philosophy that has dominated the data and technology space. Instead, the emphasis should be on innovative processes that require less, better-quality data, supported along three axes: quality of information, quality of process and quality of governance.
Distinguish between different types of data
The National Data Strategy speaks about data in general terms, encompassing personal data as well as aggregate, anonymised, IoT or machine data. New technologies, new types of processing and the ever-growing amount of data pose continual challenges to defining what is and is not personal data, and in a growing number of circumstances the risks of handling non-personal or anonymised data are as acute as those of handling personal data under the GDPR.
Therefore, the practical challenges posed by processing personal, anonymised or aggregated data require the application of a high bar of security and protection, regardless of the type of data used.
To this end, the strategy should be prepared to address potential harms arising not only from data processing that can affect the rights and freedoms of individuals, but also from data uses of any nature that have wider societal impact.
Personal data having value does not equal commodification
The National Data Strategy emphasises how ‘data has become a significant modern, economic asset’. While we can attach an economic judgment to data, when it comes to personal data we should be clear about the distinction between personal data having value and commodifying it.
As was highlighted in our expert panel on data ownership,4 there are several essential goods in our society that we do not subject to market valuation, such as clean air or clean water. Similarly, personal data should not be the subject of monetary estimations.
Focus on building trustworthiness over trust
One of the five broad areas of work acknowledged as part of the mission of the National Data Strategy is to ‘build trust’ in uses of data through, for example, a national engagement campaign on the societal benefits of government use of data.
Aims to build trust and promote benefits through engagement at a national, rather than local, level run the risk of falling into a ‘deficit model’, which implies the problem to solve is the public’s lack of trust or understanding of the benefits. Under this model, solutions come in the form of informing people, or listening to their concerns only to dispel them and convince people that their concerns are the result of being misinformed. Deficit models do not recognise the complexity of public perspectives and what underpins them, and as such often serve to weaken, not strengthen, trust.5
Rather than focusing on building trust in the use of data, the focus must be on building trustworthy uses of data. This shift moves away from a deficit model and recognises that the challenge is not asking people to be more trusting, but creating data practices that can be trusted. This means drawing on mechanisms for responsible data practice, transparency and accountability.
Finally, we make suggestions about missing elements that would strengthen the UK’s approach to data.
More specificity on which values or ethics
The NDS needs to be more specific about which values or ethics it is scaffolding its approach on, including clarity as to why it is prioritising those values, and an acknowledgement of trade-offs. To ensure ethics translates into practice, legislation could set precise public policy goals that all parties need to meet – for example, protecting against discrimination, health misinformation or election manipulation – and encourage a shift from a ‘Can I build it?’ to a ‘Should I build it?’ approach.
Independent oversight mechanisms
Government should strengthen independent oversight mechanisms, particularly for novel or experimental data practices. In Exit through the App Store? we recommended the creation of a Group of Advisors on Technology in Emergencies, to complement any internal ethics reviews or frameworks.
Monitoring the use and impact of ethical frameworks
The strategy’s focus is mainly on softer governance to deliver responsible data. While frameworks and guidance can be the right tools, the Government should monitor their use and impact – for example, by evaluating the uptake and impact of various frameworks and considering where some voluntary approaches could be strengthened.
Our research with frontline bodies (forthcoming) indicates that when resources are stretched the priority is on meeting legal obligations rather than using ethical frameworks: there need to be incentives for both legal and ethical use.6
Our response to the consultation is based on our research, public deliberation and expert convening.
As part of our consultation response, we have been working in partnership with the Open Data Institute, Royal Statistical Society, Institute for Government and Centre for Public Data to co-ordinate discussion around the strategy’s four pillars and think about how best to achieve the aims outlined in the National Data Strategy.
We paid particular attention to pillar four – ‘Responsibility: driving safe and trusted use of data’ – and are particularly indebted to the guest speakers and participants at our roundtable on responsible data held on Wednesday 11 November 2020. They include Swee Leng Harris (Luminate), Mathias Vermeulen (AWO), Lynne Currie (ICO), Raegan MacDonald (Mozilla), Seb Bacon (Datalab, University of Oxford), Kristina Irion (University of Amsterdam), Jesper Lund (IT-Pol) and Sanjay Sharma (Brunel University).
The content of our consultation response is not directly endorsed by the experts.
Getting data right: perspectives on the UK National Data Strategy 2020
Read the summary of the discussions from the ‘Getting data right’ series of workshops held with experts and practitioners about how the four pillars of the Strategy could best be addressed and realised practically and sustainably.
We welcome others’ views on this: please get in touch by emailing firstname.lastname@example.org with ‘UK Data Strategy’ in the subject line.
The DCMS notes that the Government’s response will be published in due course following the consultation’s closure at 11:45 pm on Wednesday 9 December 2020. It will take into account all responses submitted to the consultation and will be based on careful consideration of the points made in them.
We look forward to seeing the final output.
- Safak, C. and Parker, I. (2020) ‘Meaningful transparency and (in)visible algorithms’, Ada Lovelace Institute, 15 October 2020. Available at: https://www.adalovelaceinstitute.org/blog/meaningful-transparency-and-invisible-algorithms/ and forthcoming work.
- For a fuller discussion, see the ‘Getting data right: perspectives on the UK National Data Strategy 2020’ event summary, Pillar 4 (Event 4) | Responsibility: driving safe and trusted use of data, page 23, 24 November 2020. Available at: https://theodi.org/article/getting-data-right-perspectives-on-the-uk-national-data-strategy-2020/
- For a discussion of participatory practices with marginalised communities, see: Patel, R. and Peppin, A. (2020) ‘Making the invisible visible: what public engagement uncovers about privilege and power in data systems’, Ada Lovelace Institute, 5 June 2020. Available at: https://www.adalovelaceinstitute.org/blog/public-engagement-uncovers-privilege-and-power-in-data-systems/
- Pavel, V. (2020) ‘Ownership or rights: what’s the path to achieving true agency over data?’, Ada Lovelace Institute, 10 August 2020. Available at: https://www.adalovelaceinstitute.org/blog/ownership-or-rights-whats-the-path-to-achieving-true-agency-over-data/
- Sturgis, P. and Allum, N. (2004) ‘Science in Society: Re-Evaluating the Deficit Model of Public Attitudes’, Public Understanding of Science, 13(1), pp. 55–74.
- See forthcoming report from the Ada Lovelace Institute on ‘Learning data lessons: data access and sharing during COVID-19’, expected in December 2020.