
Powerful Platforms Should Be Regulated — But Does the European Commission’s Proposed Audit Process Risk Capture?

Published on 12 Jan 2022
Written by Johann Laux and David Sutcliffe
Dr Johann Laux, Postdoctoral Researcher, Oxford Internet Institute, explains more in conversation with David Sutcliffe, Senior Science Writer, Oxford Internet Institute.

In Conversation: Dr Johann Laux and David Sutcliffe, Oxford Internet Institute

The platform economy is booming, and at the same time fears are rising that the huge market power of giants such as Facebook and Google could result in information asymmetries vis-à-vis their competitors, consumers, and regulators. In its efforts to create “a safer and more open digital space”, the European Commission has recently proposed a Digital Markets Act (DMA) to address the competitiveness of digital markets, and a Digital Services Act (DSA) to tackle the dissemination of illegal content on platforms.

In his article, “Taming the few: Platform regulation, independent audits, and the risks of capture created by the DMA and DSA”, published in the Computer Law & Security Review, Dr Johann Laux, together with co-authors Dr Sandra Wachter (Associate Professor and Senior Research Fellow, OII) and Dr Brent Mittelstadt (Senior Research Fellow, OII), discusses some possible weaknesses in the current proposal.

While legislation is clearly needed to address the market power of digital platforms, increase the transparency of their data-driven practices, and manage risks stemming from the content they disseminate, the DSA imposes stricter oversight rules only on ‘very large online platforms’ (VLOPs). Platforms below this threshold, and the harms they may cause, would therefore sit in a regulatory blind spot. VLOPs could also leverage their market power against their new mandatory auditors and risk assessors, a threat Laux and colleagues theorise as ‘audit capture’. Were this to happen, societal risks could go undetected or be downplayed.

We spoke to Johann about what could be done to ensure that consumers and citizens receive the full protection of these ambitious legislative proposals.

David: The proposed package of legislation seems self-evidently a good thing, and much needed. However, I guess you’re saying that oversight of harms (in the DSA) should perhaps be proportional to the size of a platform, rather than “on / off” according to a fairly arbitrary cut-off point. Just because a platform is small enough to sit below the radar doesn’t mean it can’t cause harm.

Johann: The DSA doesn’t leave smaller platforms completely unregulated, but it does introduce special (i.e. extra) rules for what it calls ‘very large online platforms’ – online platforms like Facebook which currently provide their service to at least 45 million monthly active recipients on average. These platforms must regularly check whether they create ‘systemic risks’, for example through the dissemination of illegal content, discrimination, or manipulation. Of course, a platform with only 20 million active users could easily create such risks, too. Think of a misinformation campaign during elections, targeting only a subgroup of receptive voters. There is an element of pragmatism in adopting this threshold model, as screening for systemic risks is costly and resources are then allocated to where the largest number of people is exposed to such risks. There is, however, also an element of protectionism in the chosen threshold. European platforms are currently often smaller than their American competitors. While this may represent a sensible approach from a European perspective, it does leave a blind spot in the detection of risks.

David: What sorts of specific “societal harms” is the DSA trying to mitigate? And why would companies require regulation to force them to deal with these harms?

Johann: It is simply time for an update of the regulation of online harms. In the history of the internet, the question has always been, “Who is liable for the content shared on digital platforms?” The current law in the European Union, the e-Commerce Directive, was adopted in 2000. Companies like Facebook, Twitter, and YouTube were not even founded back then—but these and other services have since changed our political discourse, our social lives, and our decision-making as consumers online.

The DSA expresses great concern over the misuse of these large platforms’ services. The European Commission has proposed to focus on three areas in which it deems an added level of oversight to be necessary: the dissemination of illegal content such as hate speech; the impact on fundamental rights, such as freedom of expression, the right to a private life, and the right to non-discrimination; and lastly, the manipulation of platforms’ services, for example through fake accounts and the use of bots. It is important to note that the current proposals are likely to undergo changes during the EU’s legislative process. Either way, if passed into law, the largest platforms would be incentivised to take a hard look at what is happening on their apps and websites.

David: Audit capture and the “adverse incentive structures” of the DSA are a central concern of your article – I guess you’re saying that the market power of platforms could be enough to shape any audit / oversight process to their advantage?

Johann: This is a paradoxical feature of the DSA as proposed by the European Commission. On the one hand, it acknowledges the powerful impact that very large platforms have on our digital lives. On the other, it ignores the market power these platforms have when selecting their auditors and assessors. The novel oversight of systemic risks that the DSA seeks to introduce will be implemented partly by external auditors and risk assessors, who could be companies, non-profit organisations, or researchers at a university. However, there is a real risk that, in order to get hired by the relatively few large platforms in question, auditors will need to ‘play ball’.

Auditors will depend on the platforms for the data and information they need to do their work. For an academic in a highly competitive job market, access to proprietary data on a large platform is also a huge advantage. Just consider the recent debate about whether Instagram is bad for the mental health of teenage girls, which was based on internal, non-public research by Facebook, Instagram’s parent company. Moreover, if you are an auditor with data science skills and you want to change jobs, the most lucrative positions will likely be in the industry you are currently auditing. Thus, even with the best critical intentions, external auditors may well have incentives to cater to the platforms’ interests. In our paper, we call these mechanisms ‘audit capture’.

David: You draw comparisons with the 2007-2008 financial crash and the problems with the credit rating market that contributed to it, pointing out that having independent auditors doesn’t necessarily guarantee their quality or effectiveness. What might we usefully learn from the crash and what followed – and could we apply those lessons to the problems we see with online platforms?

Johann: Without changes to the Commission’s DSA proposal, some auditors could emerge as repeat players once they have been selected by the powerful platforms and performed well in the eyes of the audited. In turn, this could lead to a few auditors dominating the emerging audit market under the new regulation. Such market concentration is likely to be detrimental to the quality of audits and assessments. The financial crisis of 2007-2008 provides a historical example: just three credit rating agencies (with a combined market share of over 90%) assigned inflated ratings to subprime mortgage securities in order to increase their revenues. These agencies were also concerned with their reputations – their most valuable asset – in relation to one another. This led to ‘herding’ behaviour, where they factored their rivals’ evaluations into their own ratings.

The lesson to be learned from the financial crisis is that merely establishing an obligation to be assessed by independent auditors does not necessarily guarantee high-quality audits and effective governance. In the aftermath, regulators introduced measures to increase independence. For example, under EU law certain auditors now have to rotate after seven years with the same client. To prevent revolving doors, there is also a freezing period of two years in which auditors must not take up a key management position in an entity they audited. France has a system of joint audits in which companies are required to appoint two different audit firms working together – which also helps smaller audit firms to enter the market. One could also think of a system in which the regulator – rather than the corporation – appoints the auditor.

Measures like these could all help to reduce the risk of audit capture under the DSA. I am very pleased to see that the European Parliament seems to be pushing in this direction now, suggesting that auditors must not have provided any other service to the platform they are hired to audit in the twelve months before the audit, and cannot work for the same platform for another year afterwards. Our paper discusses several such options to increase the independence of auditors.

David: It was obvious, when high street banks suddenly went into meltdown, that stronger financial regulation and oversight were needed. The 2008 crash played out in front of us. What sort of equivalent shock – or “platform crisis” – would be required for tighter regulation of platform companies to become similarly blindingly obvious? Could there be an equivalent “global platform shock” that we all feel?

Johann: I think the misinformation crisis is already playing out right in front of us. Look at the interference in democratic elections, the spreading of false claims about the Covid vaccine, or climate change denial campaigns. Of course, we don’t have an equivalent of the Dow Jones Index dropping 50% in a few days and thus making us all aware of the crisis we are in. Instead of what you called a “global platform shock”, we are experiencing a gradual rise in temperature in our societies, brought about by misinformation and manipulation. But we are not simply frogs being boiled alive – we are aware of what is going on and can act accordingly.

David: So are regulators catching up with the market challenges of these giant, powerful companies – which have huge market share, and sit on vast troves of data? You’ve pointed out some potential (specific) issues in your article: what would you say about the state of things in general?

Johann: I think regulators are indeed catching up, whether it’s in Washington, Brussels, or London. The DSA is not the only new regulation that the European Union has proposed. It has a sister law, the Digital Markets Act (DMA), which is concerned with reintroducing competition into Europe’s platform markets. To a large degree, the DMA puts into law decisions that European market authorities and courts have already issued against big tech companies. But it is not so much the breadth of the regulatory agenda that worries me. What is lacking is depth, especially when it comes to the enforcement of these new laws. I doubt that we currently have the capacity in regulatory institutions to implement the necessary oversight mechanisms at all times, for all platforms. In our research in the OII’s Governance of Emerging Technologies project, we are therefore working on new tools that regulators and platforms can use to screen their algorithmic processes for problematic outcomes.

David: Finally – not to bring up the Brexit issue – but what is happening in the UK on this front? Is the UK Government already on to this? Or are we back to a situation of Europe leading on (safety) regulation, with the UK no longer able to benefit?

Johann: With the Cambridge Analytica data scandal during the Brexit vote, the UK got first-hand experience of the risks that the DSA is trying to address. Of course, after Brexit the DSA will not apply in the UK. Instead, the UK is working on its own content regulation rules for big platforms, the so-called Online Safety Bill. There is thus some regulatory competition between the UK and the EU. It will be interesting to see which approach passes the test of time.

David Sutcliffe was talking to Dr Johann Laux, a Postdoctoral Researcher on the OII’s Governance of Emerging Technologies project, where he focuses on the legal, ethical, and social implications of emerging technologies, and how they are governed. He tweets at: https://twitter.com/Johann_Laux

Read the full article: Johann Laux, Sandra Wachter, Brent Mittelstadt (2021) Taming the few: Platform regulation, independent audits, and the risks of capture created by the DMA and DSA. Computer Law & Security Review 43, 105613.

The article is a deliverable of the ‘A right to reasonable inferences in advertising and financial services’ project, supported by the Miami Foundation, Luminate Group, the British Academy (grants PF\170151 and PF2\180114), and the EPSRC via the Alan Turing Institute (grant EP/N510129/1).

The authors of the article declare they have no conflicts of interest.
