
Getting under the hood of big tech

Auditing standards in the EU Digital Services Act

Jenny Brennan, Alexandru Circiumaru

15 March 2022


How much hate speech is there on YouTube? Is Facebook showing job adverts only to users from certain racial demographics? Over the last decade, despite tech platforms’ evolving content moderation policies and approaches, it has remained nearly impossible to answer questions like these about the behaviour of platforms and the nature of the content being recommended on them.

That is beginning to change – policymakers are now moving to regulate big platforms like Facebook, Google, Twitter and TikTok with legislation such as the European Union’s Digital Services Act (DSA), which provides regulators with new powers to investigate tech platforms. How EU policymakers design and implement those powers will have consequences globally for people and society.

The DSA creates requirements for transparency, content moderation and assessing the risk of harm posed to users, including whether platforms meet their own standards on harmful content. Under the Act, regulatory inspection, sometimes referred to as ‘audit’, is expected to be the primary tool for verifying that very large platforms (those which, under Article 25 of the proposed Act, offer their services to 45 million or more average monthly EU users) comply with the legislation.

However, the Act includes little detail on what an audit entails and how one should be carried out. Instead, it defers to European and international standards bodies to develop voluntary industry standards for audits (Article 34.1.d). In the absence of clear direction from EU policymakers and considering the degree of technical expertise needed, there is a real risk that platforms themselves will set the standards for audits by default, while existing contributions in this field from academia and civil society are overlooked.

The final text of the DSA is currently being negotiated by EU legislators and is expected to be adopted before the end of June. The Council of the European Union (‘the Council’) and the European Parliament have already met twice to debate their versions of the original proposal, initially published by the European Commission (‘the Commission’) in December 2020. All three institutions agree that auditing is an important component of the regulation, but there are differences of opinion between them as to what an audit should consist of. 

In this blog, we make the case for detailed auditing guidelines under the DSA that can support audits as a rigorous, meaningful mechanism for holding big tech platforms accountable. We examine the forms of audit identified in the DSA and the challenges European policymakers and regulators will have to grapple with. Based on our previous research, we also offer suggestions on the best ways forward for the upcoming discussions to finalise the Act.

Three types of auditing in the DSA

The DSA presents three forms of auditing, intended respectively to monitor, investigate and check the compliance of very large online platforms, such as whether they are enforcing their own terms and conditions, whether their transparency reporting is accurate and whether the required complaint handling is in operation.

  1. Monitoring – ongoing monitoring by regulators could be put in place to identify concerns or non-compliance.

    For monitoring, the Commission will be able to mandate access to, and explanations of, databases and algorithms, and will have the option to appoint ‘independent external experts and auditors’ to assist with monitoring (Article 57).
  2. Investigating – regulators will investigate specific issues or instances of suspected or previously identified non-compliance.

    The DSA identifies that audits can be used for specific investigations into possible infringements (Article 50), as well as for enforcement after they have been found – for instance, additional audits can be requested by a regulator to assess whether steps taken to rectify infringements are effective.
  3. Annual compliance – an annual assessment of compliance with legislation.

    The DSA places a requirement on very large online platforms to pay for annual audits conducted by external independent experts to report on compliance with the Act and related Codes of Conduct (Article 28).

None of these three forms of audit is ‘ready to go’ with a clearly defined standard method or process. 

There is a burgeoning field of algorithm auditing, developed primarily in the contexts of academia and investigative journalism, with the participation of some industry actors. In Technical Methods for Regulatory Inspection of Algorithmic Systems in Social Media Platforms, the Ada Lovelace Institute surveyed the landscape and identified six technical auditing techniques that could be used by a regulator or independent auditor.
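To give a flavour of what one of those techniques can look like in practice, here is a minimal, hypothetical sketch in Python of a ‘sock-puppet’-style audit: scripted test accounts request recommendations and the auditor measures how often the returned items are flagged by a policy classifier. The recommendation client and the classifier below are stand-ins invented for the sketch (they are not real platform APIs), and a real audit would need far more care around sampling, account creation and research ethics.

```python
import random
from typing import Callable, Iterable


def demo_recommendation_client(account_id: str) -> list[str]:
    """Stand-in for a real platform call: returns made-up item IDs.

    A real sock-puppet audit would drive fresh accounts against the live
    platform (or a platform-provided research interface) instead.
    """
    rng = random.Random(account_id)
    return [f"item-{rng.randint(0, 999)}" for _ in range(20)]


def demo_policy_classifier(item_id: str) -> bool:
    """Stand-in for a real policy check (e.g. a hate-speech model or human review)."""
    return int(item_id.split("-")[1]) % 17 == 0


def violation_rate(
    accounts: Iterable[str],
    get_recommendations: Callable[[str], list[str]],
    violates_policy: Callable[[str], bool],
) -> float:
    """Share of recommended items that the auditor's classifier flags."""
    flagged = total = 0
    for account in accounts:
        for item in get_recommendations(account):
            total += 1
            flagged += violates_policy(item)
    return flagged / total if total else 0.0


if __name__ == "__main__":
    sock_puppets = [f"puppet-{i}" for i in range(50)]
    rate = violation_rate(sock_puppets, demo_recommendation_client, demo_policy_classifier)
    print(f"Estimated share of flagged recommendations: {rate:.1%}")
```

In practice the hard part is not the arithmetic but obtaining representative accounts and recommendations – which is exactly where regulatory data-access powers matter.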

Alongside these technical approaches, auditors will need to be able to review documentation and interview staff (working on product, moderation and policy) to inform their understanding of the platform in practice. However, how regulators and appointed auditors can best combine these approaches isn’t yet established – and this is a challenge for DSA policymakers, as well as for those considering similar legislation elsewhere, such as the UK’s Online Safety Bill and Canada’s online harms legislation.

EU legislators need to determine how the three forms of auditing they have identified will be carried out in practice, to make them robust and resilient. This is a complex and sensitive decision whose significance extends beyond Europe: the EU’s ‘first-mover’ position in setting regulatory standards is likely to influence regulation elsewhere.

Standards and guidelines can’t rely on best industry practices

The initial draft of the DSA text proposes the development and implementation of voluntary industry standards for annual compliance audits, set by relevant European and international standardisation bodies. It does not provide further details on what audits may look like for monitoring and investigation. 

In its proposed amendments, the Council emphasises the role of ‘best industry practices’ for audits, the need for standards and guidelines, and the importance of making those standards publicly available. Meanwhile, the Parliament proposes a safeguard against standardisation bodies failing to deliver or agree on a standard: that the Commission itself be empowered to define specifications for audits (and other standards needed in the Act) in further legislation (known as implementing acts).

The proposals mark a significant step forward in recognising the importance of good audit practice, of standards and guidelines, and of developing them transparently. In the absence of precise descriptions of audits in the DSA itself, these extra elements will be essential for ensuring that inspections are rigorous and effective. However, the current proposals place too much emphasis on the notion of ‘best industry practices’. This is problematic for two reasons.

First, there is no agreed method or technical standard for auditing a single platform in relation to a single issue (for example, hate speech on Facebook), let alone for a single issue across all large platforms. Second, the focus on ‘industry’ leaves aside the wealth of research on compliance and accountability mechanisms coming from academia and civil society.

Many individuals and organisations in these fields have developed significant experience in auditing methods through their willingness to ask difficult questions of big tech platforms. Indeed, investigative journalism, academic and civil society audits have been an important source of awareness for regulators – from identifying that the YouTube algorithm recommends videos that violate its own content policies on violent content and hate speech, to surfacing the promotion of content encouraging eating disorders to teenagers on Instagram.

An ecosystem of inspection

The authors of the DSA must recognise that regulators sit within, and rely on, a wider ecosystem of inspection. Auditing, as proposed in the DSA, will be conducted by a mix of actors – regulators themselves and independent auditors, either paid for by platforms or commissioned by regulators. 

This ecosystem of inspection will be made up of three types of auditor:

  • First-party audit: where the audit is conducted by the platform itself, e.g. Twitter’s analysis of its own platform in France, Germany, Spain, Japan, the United Kingdom, Canada and the United States found that, in all but Germany, tweets posted by accounts from the political right received more algorithmic amplification than those from the political left.
  • Second-party audit: where the audit is commissioned by the platform but conducted by a separate organisation, e.g. Facebook commissioned a civil rights audit of its platform, published in 2020, which found failures to protect civil rights on a range of issues, from voter suppression to hate speech.
  • Third-party audit: where the audit is conducted independently of the platform. This could be the regulator itself, or another organisation such as a research institute, NGO or journalist, e.g. The Markup used its Citizen Browser auditing tool to find that Facebook continued recommending ‘health groups’ – a frequent source of COVID-19 misinformation – to users after it had committed to stop. (A simplified sketch of this kind of panel-based audit follows this list.)
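As a simplified illustration of a third-party, panel-based approach (in the spirit of tools like Citizen Browser), the sketch below aggregates group recommendations collected from a consenting panel of users and reports how often groups on an auditor-maintained watchlist were recommended. The panel data and the watchlist are made up for the example; they stand in for whatever a real collection tool would gather.

```python
from collections import Counter

# Hypothetical panel data: group recommendations observed for each
# consenting panellist by the auditing tool over some period.
panel_recommendations = {
    "panellist-01": ["Gardening Tips", "Miracle Cures Now", "Local Hiking"],
    "panellist-02": ["Miracle Cures Now", "Book Club", "Natural Healing Truth"],
    "panellist-03": ["Local Hiking", "Book Club"],
}

# Hypothetical auditor-maintained watchlist of groups previously linked
# to health misinformation.
watchlist = {"Miracle Cures Now", "Natural Healing Truth"}


def watchlist_exposure(recommendations: dict[str, list[str]], watchlist: set[str]):
    """Count recommendations of watchlisted groups, overall and per group."""
    per_group = Counter()
    exposed_panellists = 0
    for groups in recommendations.values():
        hits = [g for g in groups if g in watchlist]
        per_group.update(hits)
        exposed_panellists += bool(hits)
    return per_group, exposed_panellists


if __name__ == "__main__":
    per_group, exposed = watchlist_exposure(panel_recommendations, watchlist)
    print(f"{exposed}/{len(panel_recommendations)} panellists were recommended a watchlisted group")
    for group, count in per_group.most_common():
        print(f"  {group}: recommended {count} times")
```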

An effective ecosystem would see these forms of audit complementing each other – for instance, a third-party audit could confirm or contradict a first-party audit. As such, regulators should be empowered to support the health of this ecosystem – by enabling a community and marketplace of trusted independent auditors; empowering independent auditing from academia and civil society; and penalising platforms that seek to disrupt or obstruct independent auditing.

Currently, many of the auditing methods we’ve identified in our research break down because platforms do not provide the relevant information or data access necessary for audits. The DSA is an opportunity to resolve this, granting powers to compel platforms to provide the relevant data and APIs to third-party auditors, who can undertake their own independent audits. 
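What that data access could look like in practice is still an open design question. As a rough sketch only, the helper below shows the kind of paginated retrieval an auditor-facing endpoint might support; the page format and ‘next_page_token’ field are assumptions made for illustration (the DSA does not define such an endpoint), so the example uses a stubbed fetch function rather than a real service.

```python
from typing import Callable, Iterator, Optional

# Hypothetical response shape: {"items": [...], "next_page_token": str or None}.
Page = dict


def iterate_items(fetch_page: Callable[[Optional[str]], Page]) -> Iterator[dict]:
    """Follow pagination tokens until a (hypothetical) auditor-facing endpoint is exhausted."""
    token: Optional[str] = None
    while True:
        page = fetch_page(token)
        yield from page.get("items", [])
        token = page.get("next_page_token")
        if not token:
            break


def stub_fetch(token: Optional[str]) -> Page:
    """Stand-in for a real platform data-access endpoint, returning canned pages."""
    pages = {
        None: {"items": [{"id": "a"}, {"id": "b"}], "next_page_token": "p2"},
        "p2": {"items": [{"id": "c"}], "next_page_token": None},
    }
    return pages[token]


if __name__ == "__main__":
    for item in iterate_items(stub_fetch):
        print(item["id"])
```

Whatever form mandated access takes, details such as predictable pagination and documented response schemas are the kind of specifics that auditing guidelines would need to pin down for independent re-analysis to be feasible.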

Many third-party auditors from civil society organisations and academic labs describe their relationship with social media firms as one in which platforms treat them as adversaries rather than partners. In many cases, online platforms like Facebook have actively disrupted efforts to run these audits. To mitigate this threat to accountability, there have been calls to enshrine research access in the DSA, which should be considered by EU policymakers looking to support a strong ecosystem of inspection. 

The creation of auditing powers in the DSA is an important step forward for platform accountability, but for those powers to achieve their aims, EU policymakers must recognise the importance of who gets to define how audits work in practice and ensure that this isn’t left to industry alone. The development of auditing guidelines and standards needs to support, and be informed by, academia, civil society and the users and communities that audits will affect.

The DSA is an important piece of legislation, both because of its goals for accountability and the way in which it attempts to achieve them. Getting auditing right is central to its success, and the remaining weeks of legislative debate, before the final version is adopted, are crucial. EU policymakers must ensure that robust auditing is able to support compliance with the Act and truly hold large platforms accountable.
