Visiting Policy Fellow
Jas Johl is a Visiting Policy Fellow at the Oxford Internet Institute whose work focuses on global efforts to manage digital identity, privacy, and emerging technology.
The removal of a BBC documentary critical of Indian Prime Minister Narendra Modi by both YouTube and Twitter earlier this year has renewed critical focus on the censorship capabilities of Big Tech companies, which increasingly leverage their control of data flows and social infrastructure to enact political goals and to compete with—and even supplant—the regulatory role of public institutions.
While researchers have drawn attention to government-led campaigns to spread computational propaganda and censor online content, Web3 proponents have advocated for the decentralization of technology and the building of “censorship-resistant” applications that are governed by code rather than humans, as a means to counteract the increasing centralization of censorial decision-making in the hands of a few key tech monoliths.
But calling for greater degrees of decentralization alone, without grounding in the forms of governance and institutions that have meaningfully succeeded in distributing decision-making authority at lower levels, is unlikely to yield sufficient results. Attempts at decentralizing technology would be better served by deliberately working with and drawing on existing social structures rather than trying to replace them.
A federated model of technology, analogous to political federalism, is a non-centralized structural arrangement of technology that offers the potential for meaningful checks and balances to combat both censorship and increasing centralization of power.
When the term centralization is used in technology, it broadly refers to a network with a central authority in control of the data, decision-making, and network functions. Facebook, for example, has complete control over its features, content moderation, and who can and cannot join the platform, and all data is stored on servers or databases controlled by the company.
The concentration of control makes it easier for centralized technology companies to censor the flow of information on the platform. Twitter, Facebook, Netflix, and YouTube, for example, have removed or restricted access to certain posts, videos, accounts, or topics based on their own content policies or a myriad of politically motivated governmental requests.
While centralization allows for quick removal of dangerous content, it puts control over what constitutes “dangerous” in the hands of a privileged few. Big Tech companies employ limited numbers of content moderators, leaving decision-making agency over information in the hands of authorities distant from the relevant groups and outside the social context of the countries for which they censor content. (Countries with outsize legislative and economic weight also command greater moderation attention: 16% of Facebook’s content moderators were located in Germany, which accounts for only 1.5% of its global user base, due to the country’s passage of the Network Enforcement Act in 2017.)
Decentralization, which for many holds promise as a way to empower people to act decisively within their own social contexts, broadly refers to systems in which control and decision-making power are distributed among multiple entities in a network. There are a number of decentralized platforms like Mastodon (a microblogging service), Diaspora (a social network), and PeerTube (a video sharing platform) that offer alternatives to traditional social networks like Twitter, YouTube, and Facebook, by enabling the operation of web infrastructure and services without centralized ownership or control.
However, decentralized technologies present their own challenges with regard to censorship; for example, they have frequently been utilized by terror groups who leverage cryptocurrencies, peer-to-peer file storage, and encrypted communication to organize attacks. In 2022, European Union lawmakers introduced a new Digital Services Act, in part to hold Big Tech companies accountable for the spread of information that carries potential “societal risk,” including terrorist content. Yet researchers have pointed to the regulation’s insufficiency in addressing the digital harms of decentralized extremist networks.
Rather than looking to decentralization of technology alone as a panacea for censorship and centralized power, we should instead ask what configurations of technology mirror real-world arrangements of decision-making authority that have led to meaningful checks and balances.
A federated system of technology, analogous to U.S. federalism in some ways, with its distribution of powers across local, state, and national governments, has the potential to equip those closest to a problem, and often with greater knowledge and a larger stake in its resolution, to coordinate.
In a federated model of technology, different servers are connected together to form a network, but each maintains its own local autonomy and control, including the ability to set policy for the use of its own resources, while allowing members to connect across the network.
For example, the federated social media platform Mastodon allows users to choose which servers to join and which communities to interact with, and each server has its own governance in the form of moderation policies and content guidelines. Each individual server administrator has the authority to set the rules for the users of their server, but they have no control over activity on other servers.
This distribution of control gives federated systems like Mastodon a degree of censorship resistance: anyone can run their own instance that is not subject to one central authority, and if a single server is taken offline, the rest of the network can continue to operate independently.
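The architecture described above can be sketched in a few lines of code. The following is a deliberately simplified toy model, not Mastodon’s actual implementation (which federates servers via the ActivityPub protocol); all names here are hypothetical. It illustrates the two key properties: each server applies only its own local moderation policy, and the loss of any one server does not stop the rest of the network.

```python
# Toy model of a federated network. Illustrative sketch only: real
# federated platforms like Mastodon use the ActivityPub protocol;
# the names and policy mechanism here are hypothetical.
from dataclasses import dataclass, field

@dataclass
class Server:
    name: str
    blocked_words: set = field(default_factory=set)  # this server's local policy
    online: bool = True
    timeline: list = field(default_factory=list)

    def accept(self, post: str) -> bool:
        """Apply this server's own rules; no central authority is consulted."""
        if not self.online:
            return False
        if any(word in post for word in self.blocked_words):
            return False
        self.timeline.append(post)
        return True

def broadcast(post: str, servers: list) -> dict:
    """Deliver a post across the federation; each server decides independently."""
    return {s.name: s.accept(post) for s in servers}

a = Server("a.example", blocked_words={"spam"})  # strict local policy
b = Server("b.example")                          # no word filter
c = Server("c.example")
c.online = False  # one server going down does not affect the others

result = broadcast("hello spam world", [a, b, c])
# a rejects the post under its own policy; b accepts it; c is simply offline
```

The point of the sketch is that the decision logic lives entirely inside each `Server`: there is no global moderation function that every post must pass through, which is what distinguishes this arrangement from the centralized model described earlier.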
Of course, that’s not a guarantee against censorship: federated systems are still part of a larger institutional structure. While the majority of Mastodon users are concentrated on a small set of servers, almost all of those servers rely on a small set of infrastructure service providers, companies that have themselves been pushed to the forefront of global legal and governmental battles over censorship.
It’s critical to remember that, as within a federal model of government, a federated system of technology cannot produce a meaningful distribution of decision-making authority if the institutions meant to uphold the checks and balances are themselves unfair or unrepresentative.
Researchers have drawn attention to the extensive ways in which existing societal inequalities and individual biases are embedded within the algorithmic structures of current technology. Leaving the architectural arrangement of technology in the hands of organizations that are largely homogeneous – as is the case in the U.S. tech sector – will reproduce their values, as well as further entrench the biases and decision-making authority of those who make up the organizations themselves.
We should ask not only what kind of technology federation is desirable, but also a federation of whom, and for what purpose.
The Mozilla Foundation recently launched its own instance of Mastodon. Imagine what a federated arrangement of technology – led by a non-profit actor like Mozilla, alongside major civic and human rights organizations – could bring in terms of institutional checks and balances to a space currently saturated by private-sector actors driven by the ethos of surveillance capitalism: monitoring user activity and selling personal data to advertisers.
Such federated cooperative models – in which actors are charged with developing technology founded on non-market values such as solidarity and democratic ownership, and with achieving fair labor conditions through digital forms of collective bargaining – are technologically feasible, but would require strong state support and regulatory frameworks to compete with private monopolies backed by immense financial resources.
Meaningfully addressing the increasing centralization of technological power, censorship, and control requires solutions informed by social as well as technical realities. We need federated models of technology – led by coalitions of civic organizations – that deliberately work with and draw on existing social structures to extend offline and online networks of trust, rather than trying to replace them.