Research, news, and openings
Visit the Synthetic Society Lab’s website.
The scale and intimacy of sensitive data gathered about us, combined with powerful algorithms, make it possible to study human lives and societal patterns in ways previously unimaginable. Tech platforms and digital services use this data to shape our choices, relationships, and interactions. The potential for tangible harms, from privacy violations to algorithmic discrimination, is immense. Yet independent researchers and journalists lack the tools to monitor these harms within opaque and complex digital infrastructures.
How can we ensure researchers have secure, privacy-preserving, and equitable access to “research-grade” data to best study algorithms? How can they measure the real-world impact of algorithms when direct access is denied or limited?
How can we make sure AI models are developed in ways that protect people’s privacy, ensure fairness, and prevent unintended harm, especially when sharing AI models or the information they produce? How can we build public interest technology that truly benefits everyone, and give people and community groups a stronger voice in creating a fairer future?
We advance human-centred computing research to study the impact of data and algorithms on society. Our research guides the development of algorithms and infrastructures that are sustainable, safe, and serve the public interest, helping to ensure that the power of digital technologies is accountable to everyone. Our research is built on three core areas.
Senior Research Fellow
Luc conducts human-centred computing research to understand how data and algorithms impact society. They work to make digital power visible to the public and guide the development of accountable, sustainable, and safe algorithms for all.
DPhil Student
Andrew holds a B.S. in Applied Mathematics from Yale University and an MSc in Social Data Science from the OII. He is a Clarendon Scholar and was previously a Thouron Prize winner at the University of Cambridge (Pembroke College).
Researcher
Postdoctoral Researcher
Alex develops statistical and machine learning methods that provide formal privacy guarantees to individuals, using techniques such as local differential privacy.
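Local differential privacy, mentioned above, can be illustrated with the classic randomized-response mechanism, in which each person perturbs their own answer before sharing it. This is a minimal illustrative sketch only, not code from the lab; all names and parameters here are invented for the example.

```python
import math
import random

def randomized_response(true_bit: bool, epsilon: float) -> bool:
    """Report the true bit with probability e^eps / (e^eps + 1); otherwise flip it.
    Each individual's single report satisfies epsilon-local differential privacy."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    return true_bit if random.random() < p_truth else not true_bit

def estimate_true_rate(reports: list, epsilon: float) -> float:
    """Debias the aggregate: E[report] = p*rate + (1 - p)*(1 - rate)."""
    p_truth = math.exp(epsilon) / (math.exp(epsilon) + 1)
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_truth)) / (2 * p_truth - 1)

# Hypothetical example: 100,000 people, 30% hold a sensitive attribute.
random.seed(0)
population = [random.random() < 0.3 for _ in range(100_000)]
reports = [randomized_response(bit, epsilon=1.0) for bit in population]
estimate = estimate_true_rate(reports, epsilon=1.0)
```

No single report reveals a person's true answer with confidence, yet the debiased aggregate recovers the population rate, which is the core trade-off such techniques formalize.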
Research Assistant
Sofia completed her MSc in Social Data Science and now works as a Research Assistant at the OII, focusing on algorithmic fairness, machine learning, and interactive data visualization.
DPhil Student
Lujain is a DPhil student in Social Data Science at the OII. Her research sits at the intersection of AI governance and human-centred computing, particularly examining how user autonomy and control are undermined in human-AI interactions.
MSc Student
Francisco is a Chevening-WHT scholar pursuing an MSc in Social Data Science. They hold a master’s degree in Sociology from UFRJ and focus their research on digital methods, digital politics, and culture.
DPhil Student
Juliette is a Clarendon Scholar at the OII, conducting research on data access, privacy-enhancing technologies, and algorithm auditing.
21 May 2025
Oxford researchers reveal how AI language models encode a flawed and binary understanding of gender, posing significant risks for transgender, nonbinary, and even cisgender individuals.
9 January 2025
Computer scientists at the Oxford Internet Institute, Imperial College London, and UCLouvain have developed a new mathematical model which could help people better understand the risks posed by AI and assist regulators in protecting peoples’ privacy.
19 July 2024
UKRI has announced its latest round of 68 Future Leaders Fellowships, of which the OII's Dr Luc Rocher was one of three Oxford recipients. The funds will enable them to work on making privacy technologies more transparent and accountable.
Image caption: Anne Fehres and Luke Conroy & AI4Media / Better Images of AI / CC BY 4.0.