The release of the Facebook files has refuelled discussions around the regulation of Big Tech. Today we publish the first in a series of blog posts exploring entrenched problems alongside some of the potential transformations that could lead to more ambitious visions for data use and regulation.
Every few months a major scandal involving a large technology company triggers an urgent public conversation about the need to regulate Big Tech. Yet the aftermath of well-publicised incidents such as Cambridge Analytica, Amazon’s sale of facial-recognition technology to US police during the Black Lives Matter protests and the proliferation of the anti-vax movement on Facebook has failed to move the dial in any significant way.
The recent release of the Facebook files and papers already has a different profile. This may be due to context: they hit the news sandwiched between a global Facebook outage and the company’s high-profile rebranding as Meta. They also arrive at a time of increased regulatory attention – from a steady drumbeat of US congressional hearings on Facebook’s business practices to plans to regulate social-media platforms in Europe, Canada, Australia and elsewhere. Above all, the release of the Facebook files may prove different because its contents appear to validate many of the deepest concerns regulators have about the company’s practices.
The files contain an unprecedented number of leaked documents, which shed light on Facebook’s business practices and have intensified questions around measures to mitigate damaging algorithms and exploitative data use, propelling discussions about regulating online platforms into the mainstream media.
Of course, these debates are not new. In the USA there have been calls from academics and politicians to break up Big Tech, as well as more recent (unsuccessful) attempts by regulators. What may be new about the Facebook files is that they are diverting the discussion away from regulating monopolistic control of online platforms, towards other regulatory solutions that address the structural underpinnings of companies like Facebook.
There are different proposals for what form these solutions might take. Frances Haugen, the Facebook whistleblower, disagrees with breaking up platforms and believes oversight of the algorithms is the key. While some advocate for a ban on surveillance advertising, others such as Roger McNamee, one of the early investors in Facebook, are calling for ‘legislation to address three related problems across the entire technology world: safety, privacy and competition’. This indicates a growing understanding that a single-lens approach to regulating large platforms will not be sufficient.
These questions about the types of interventions needed in order to address the concentration of power and data around large platforms have been the focus of our Rethinking data work at the Ada Lovelace Institute. The programme seeks to identify significant trends in the tech sector and explore a range of interventions (legislative, market-based, governance, public sector and societal norms) that could lead to a more balanced vision of our relationship to data-driven technologies.
At the centre of the Rethinking data programme is a group of experts who have come together to try to understand how power is exercised or asserted by large tech companies, and to unpack the challenges and opportunities created by different legal, technical, market and governance interventions that could lead to transformations from the current status quo.
The Rethinking data working group started with questions such as: what changes in the data-governance ecosystem are necessary to enable a countervailing vision for data – one that makes the case for data’s social value, tackles asymmetries of power and data injustice, and promotes responsible and trustworthy use of data? Last year, we began to explore what the future of data use and regulation could look like 10 to 15 years from now, and which interventions would be most promising in enabling more ambitious visions.
Drawing on futures methodologies and expert deliberations, as well as commissioned research on issues such as opening up platforms via interoperability, promising market interventions to support ambitious transformations in the digital ecosystem, and new horizons for data processing and the tension between personal and non-personal data, we will publish our final report – designed to provoke and inform debate among policymakers, technology developers and civil society – at the beginning of 2022.
Today we’re launching a series of blog posts to initiate that debate, with a focus on both the problems and solutions.
Leading theorists and practitioners will unpack and elaborate on some of the specific problems that our research has uncovered with the current digital ecosystem, including tech centralisation and concentration of power, data dominance and infrastructure dependencies, failed enforcement and fragmentation of regulation.
At the same time, we’re inviting an open conversation about transformative ideas and promising interventions for achieving ambitious visions for the future of data use and regulation.
To kickstart the series, we’re releasing a multipart contribution on the value and function of interoperability measures – a potential solution not only for regulating Big Tech platforms, but also for enabling new types of infrastructure to emerge. Other contributions will explore questions such as how to address the collective harm that data-driven technologies can produce, and what measures are needed to give new data intermediaries a real chance to play a positive and meaningful role in the data economy.
Transforming the digital ecosystem with interoperability
Following the release of the Facebook files, investigations and parliamentary hearings have been launched in the USA, UK and EU. Whistleblower Frances Haugen has testified before the US Senate and given evidence in the UK Parliament in the context of the draft Online Safety Bill. The IMCO Committee in the European Parliament, amid disagreements on certain rules, postponed its vote on the package of digital proposals – scheduled for 8 November 2021 – in order to hear the whistleblower’s testimony. Among the critical elements of regulation that Haugen is proposing are oversight of algorithms and ‘full access to data for research not directed by Facebook’.
Discussions about regulating large platforms have re-intensified. In the US, they are contributing to the legislative initiative to modify internet platform liability for third-party content (Section 230). In the UK, they are relevant to the draft Online Safety Bill, which imposes a duty of care on platforms to protect their users from harmful content. In the EU, they are informing the negotiations on compromise amendments to the Digital Services Act (DSA) and the Digital Markets Act (DMA) – two legislative proposals aimed, respectively, at creating a transparency and accountability regime for online intermediaries and at enabling a new framework for competition that reins in the power of ‘gatekeeper’ platforms.
Some of the core elements of the DSA and DMA, highlighted early on by Margrethe Vestager, Executive Vice-President of the European Commission, were ensuring fair use of data and interoperability. The European Data Protection Supervisor (EDPS) issued an opinion indicating that ‘increased interoperability can help to address user lock-in and ultimately create opportunities for services to offer better data protection’. However, while the DMA provision allowing interoperability with connected services that a platform is developing is a step forward, some voices have warned that, without making such rules mandatory, this might not ‘be enough to reset the rules for big tech’.
A more ambitious response would be to mandate the functional separation of very large platforms via interoperability. This is one of the key concepts discussed in the Rethinking data working group as one of the potential interventions to support ambitious transformations in the digital ecosystem. Related to this analysis are links to more recent developments in China, where ‘super large platforms’ will be required to support interoperability, and previous reports in the US that examine the risks and benefits of an interoperability regime alongside how to mitigate privacy challenges.
For a deeper dive into the opportunities and challenges brought by this potential transformation and how it could work in practice, we’re launching today a multipart commission from Ian Brown, a leading specialist on internet regulation and pro-competition mechanisms. Read part 1: From ‘walled gardens’ to open meadows.
Image credit: bagotaj