Nancy is a DPhil student at the OII. Her research concerns the public perception and user experience of emerging technologies.
Why adopting a feminist approach to tech governance can help ensure a healthy internet and trustworthy AI.
Oxford Internet Institute doctoral researchers Rutendo Chabikwa and Nancy Salem put forward the case for adopting a feminist perspective to tech governance. In doing so they share their highlights from the first ever MozFest House Kenya gathering where the topic was discussed at length as part of a wider debate on the internet and trustworthy AI.
On 21–22 September 2023, the Mozilla Foundation held its first regional MozFest House in Kenya. MozFest House: Kenya was a gathering that brought together “builders, researchers, policymakers, activists, civil society organisations, and philanthropy to connect and explore critical issues related to a healthy internet and trustworthy AI”.
The festival not only centred African perspectives on responsible technology, but also highlighted how Africa can and does contribute to global conversations on a healthy internet and emerging technologies. For example, the second day featured a debate and discussion panel that brought together individuals with deep experience to discuss structural interventions in AI. Amidst these insightful conversations, we, Nancy Salem and Rutendo Chabikwa, doctoral researchers at the Oxford Internet Institute, took part by facilitating a hybrid session titled The Case for Feminist Tech Governance, supported by the OII’s Dieter Schwarz Foundation-funded Research Programmes on AI & Work, and AI, Government, and Politics.
The aim of the session was to push forward the existing framework of feminist tech governance and suggest more holistic and contextualised approaches. MozFest created an interdisciplinary, collaborative space that allowed us to uncover new insights that can inform tech governance, not just on the African continent but beyond. The workshop was split into two parts: ‘(Re)defining Feminist Tech Governance’, led by Rutendo, and a discussion of how user-focused research can contribute to thinking about priorities for governance, led by Nancy.
(Re)Defining Feminist Tech Governance & Trust and Safety
To take a fully feminist approach to the discussion, we began by recognising that technology is socio-political in nature. As Dr. Kim Crayton puts it, “Tech is Not Neutral, Nor is it Apolitical”. This is important because it allows us to understand that the same systems of oppression present in society – racism, sexism, ableism, neoliberal capitalism, heteronormativity, and so on – affect how tech functions, how it is used, and the effects it has on society. It follows that treating tech governance as a feminist issue goes beyond simply adding women and gender minorities to the systems that uphold tech development and deployment today. These systems are already exclusionary and inequitable, and technology can exacerbate that exclusion and its harms. The most poignant reflection was the recognition that access to technology is not a panacea: the focus on ‘closing the digital divide’, while important to a certain extent, will not prevent the harms that digital technologies can inflict on women and other gender minorities.
With this in mind, we discussed and unpacked an example of a contemporary approach to policy, to see whether we could arrive at a more tech-specific alternative. We began with a framework used in feminist foreign policy, as there are moves to apply it to tech governance: the 3Rs framework of Rights, Resources, and Representation, which centres equal rights for all, equitable access to resources, and equal representation. It is necessary that this framework be interrogated from African perspectives, as it is mostly discussed and upheld by Global North states, with the Global South as the ground upon which it is implemented. In discussing each of these pillars in turn, participants in the session provided critical insights into feminist approaches to tech governance and trust and safety.
Rights: The understanding of rights in the digital age needs to be expanded and made more nuanced to protect against harms that are specific to digital technologies. Examples such as the right to be forgotten and the right to privacy highlight this complexity. Beyond expanding how we understand and speak of rights, however, it is important that we also contextualise them. One example we discussed was that of access and privacy being intertwined in African contexts, where some women and gender minorities only have access to technology through shared devices and/or spaces. How do we create policies that can protect these rights in such complex settings?
Another important discussion on the ‘rights’ pillar concerned recognising the power relations in the rights discourse. As Rancière put it, “the Rights of Man are the rights of those who have not the rights that they have and have the rights that they have not” (2004). The human rights discourse is one in which the power to protect the powerless is bestowed upon, or taken up by, one group over another. Participants suggested that empowering people to seek recourse should be central to the protection of rights.
Resources: The current framing of this pillar focuses on equitable access to resources, which in the case of technology includes access to technological devices. While this is indeed important, the understanding of resources in the tech sector should be expanded in two ways. The first is to include the resources required to develop technologies. The physical infrastructure upon which digital technologies exist is the result of exploitative extraction practices: many of the minerals used in making tech hardware are found in areas of extreme conflict and extracted through multiple human rights abuses. Thus, the challenge of technology begins before we even have a tech product. A truly feminist understanding of this issue considers the power imbalance and oppression that make tech products possible, and tech governance that takes a feminist approach would take the sourcing of these materials into consideration.
The second, a contribution from the participants, is the importance of recognising humans as resources in the development of technology, whether through providing the data that further develops specific algorithms or through gig labour. Participants argued that reframing gig workers as a resource for tech companies, rather than as customers or users, may provide more protections for women and gender minorities, thus pushing for a feminist approach to trust and safety policy.
Representation: The most important contribution from this discussion was the importance of moving beyond the representation of women in the coding and making of technologies. Building on the acknowledgement that technology is socio-political in nature, participants suggested that communities of interest must also be involved in the deployment of technologies. This offered a natural transition into discussing how user experience research can inform tech governance and trust and safety policies.
Socio-technical Research and Feminist Tech Governance
Following the discussion of the socio-political nature of technology, our session also examined how technologies are socio-technical; that is, they are formed at the intersection of social structures and technical capabilities. As MozFest brings together a highly interdisciplinary, cross-sectoral community, our conversation highlighted socio-technical research as one area where academics, builders, and civil society can work together to identify priorities for tech governance: specifically, how methodologies across these sectors centre users of technology in different ways (from ethnography to UX research), and how we can move from these insights to thinking about governance. We shared research from two projects examining the lived experience of women and gender minorities in the gig economy, in Cairo and internationally, and illustrated two findings.
The first is that research focusing on users as they navigate technology in their day-to-day lives can illustrate the complexity of assessing safety mechanisms. In the case of the gig economy, we discussed how mechanisms that are supposed to make drivers feel safer can set off a chain reaction of events. The option to report passengers who harass or bully drivers on apps is often visible to those passengers, creating additional safety risks. In interviews with workers, the idea of a more discreet option was discussed, with reference to the SOS call feature on an iPhone. This prompted a discussion about institutional memory and learning about trust and safety across different kinds of apps. Governance mechanisms can support learning from successful safety mechanisms and stop the replication of mechanisms that are ‘easy’ design fixes but do not reduce harm.
The second, related point is that technologies are not used in isolation. Women and gender minorities use WhatsApp, location-sharing options, and other functions to stay safe while working in the gig economy. Research on driver safety therefore requires looking across the design of many apps and how they interact with one another. It is difficult to think about the regulation of Uber, or the gig economy more broadly, without thinking about the regulation of WhatsApp, which drivers often use to keep safe while driving. Our session highlighted that this, amongst other reasons, is why holistic governance, and not individualised platform regulation, is necessary.
The way forward
Discussion at the session further highlighted that governance structures are not enough if people do not feel able to seek redress and remedy when their rights are violated. Socio-technical research can identify when technologies are designed in ways that make redress difficult, or that introduce a new set of trust and safety issues, as in the case of gig workers finding reporting options unreliable when they are visible to passengers. Finding trust and safety issues that arise from design leads us to ask how this indicates a gap or oversight in governance. Work to understand the lived experience of users can lend evidence towards principles of effective, equitable governance.