
Common governance of data: appropriate models for collective and individual rights

From Elinor Ostrom’s design principles for governing the commons to mechanisms that ensure collective and individual data rights: what steps to take?

Diane Coyle

30 October 2020

Reading time: 6 minutes

The idea of the commons is an evocative one. For some political traditions, developments such as the enclosure of the commons during the Industrial Revolution or the fencing of the American West stand as symbols of the usurpation of traditional collective rights and the immiseration of workers under capitalism. The association casts a warm glow over the protection of commons.

At the same time, Garrett Hardin famously gave us the powerful concept of the ‘tragedy of the commons’, the over-use of resources that had no legal owners. His solution was to assign individual property rights to common resources, creating the incentive for their maintenance; whereas Elinor Ostrom demonstrated that alternative, collective institutional frameworks could – sometimes – manage them successfully.

In the digital economy, the concept of a commons has been applied to intellectual property. As corporate interests have extended legal ownership of intangible assets, particularly copyright, a number of lawyers and economists have made the countervailing case for extension of the public domain.

This is contested terrain. Companies such as John Deere and General Motors are arguing in the US courts that the presence of sophisticated software in their vehicles, using real-time information, means that users who thought they were purchasing a tangible asset of their own are in fact renting a bundle of metal, software and data from the corporation. Is this not another, modern, form of enclosure, a land-grab by the powerful?

Data is at the frontier of this conflict. One reason for this is simply the increasing awareness of the importance of data, as digital technology is used ever more widely in work and home life – who can fail to be aware of its role after the experiences of lockdown in 2020? But apart from everyday experience, there is also a growing body of research pointing to the greater success of firms that use digital technologies more effectively. People have also become more aware of companies ‘harvesting’ the data generated by their use of tech in order to sell advertising, and of the vastly damaging consequences of a business model based on the drive for clicks.

Another reason for the key role of data is that it is unlike other tangible and even intangible goods. To understand its distinctiveness, the economic concept of ‘rivalry’ is central. Physical, tangible goods are rivalrous: only one person can use or consume them. I eat the apple, and nobody else can. I install the machine in my factory, and no other businessperson can use it. Intangible goods are non-rivalrous – although other users can often be excluded from their use by legal or technical means. A software programme, once written, can be used over and over again without depletion. Similarly, data, once created and stored, can be re-used without depletion.

This makes the concept of the commons as applied to data a metaphor rather than an analytical construct. The traditional commons – land, fish, forests – are all tangible and rival. The tragedy of the commons lies in exactly that fact. Data, on the other hand, could be used many times over. Indeed, the creation of economic value requires the re-use of intangibles such as ideas, creative outputs and data. What economists refer to as ‘knowledge spillovers’ are the dynamos of increasingly intangible growth in economic value.

All the same, the legal norm applied to data almost everywhere at present is that it is property that can be owned. Hence the court battles over whether it is owned by the Midwestern farmer or the John Deere company. Hence the presumption that Facebook and Google can use ‘our’ data for their commercial purposes, once we have ticked the T&C box, handing over legal ownership – as if it were a scarce, rivalrous resource rather than a non-rivalrous one. It is the Garrett Hardin solution to the tragedy of the commons applied to something that is not a commons in the economic sense.

There is another sense, though, in which the commons metaphor does fit, and that is the inherently relational character of data, or rather of its useful content. A data record in itself, even if about a personal characteristic such as my heartbeat or location at a given moment, has no information content; it does not help me decide on a course of action or enable me to bring about a different outcome. Yet it is the information on which I can act that gives it value (of any kind, economic or intrinsic). Is my heartbeat normal or worrying? That can only be answered by knowing the population average. Am I in the right place (to get where I want to, or to be sold a coffee from a nearby cafe)? That depends on geospatial and other information.

Conventional property rights make conflicts over who ‘owns’ this value inevitable, and hence the growing interest in forms of data governance that could deliver trustworthy access to data, such as data trusts or data boxes. A classic commons problem can be tackled by assigning private ownership and access rights; the challenge with non-rivalrous data is to assign common ownership and access rights.

This means that Elinor Ostrom’s well-known ‘design principles’ for common governance are exactly the right place to start thinking about appropriate rights. They need reinterpretation for the world of intangible, non-rivalrous common resources. In the Bennett Institute’s report for the Nuffield Foundation, we sketched the interpretation of the design principles for this context (see the table on p. 28 of the report).

One challenge is that the data economy exists at huge scale, whereas Ostrom’s studies investigated small groups, which made it possible to overcome the challenge of monitoring behaviour and to apply social sanctions when needed. The scale of the institutional framework needed to govern data makes the trustworthiness of the institutions concerned vital. Another key difference is the conflict of imperatives in the data context: the need for access to create personal and social value by using data versus privacy requirements and the need to create economic incentives to invest in data, ensure its quality and store it securely.

Nature gave us many common but potentially scarce physical resources. When it comes to data, we have the possibility of building new common resources that are naturally abundant, and only artificially scarce. This is just the start of a necessary period of institutional and legal innovation – as was needed in previous epochs of technological revolution before their benefits worked for everyone – but Elinor Ostrom’s work gives us the starting point.

This article is the third in a series about data stewardship. Across the series, researchers and practitioners working in different organisations and contexts, who each have a unique perspective on data stewardship, will share practical experience and research ideas.


It’s not possible or desirable for one person or organisation to decide what a ‘good’ use of data is. That’s why we hope this series and our research will help push forward thinking on how to govern data for good and ensure diverse voices contribute to defining it.


Let us know what you think – Tweet @AdaLovelaceInst using #DataStewardship or email hello@adalovelaceinstitute.org with Data Stewardship in the subject line.

Image credit: Orbon Alija
