
Safer Internet Day 2023 – OII Director Victoria Nash and OII Visiting Policy Fellow Lisa Felton give their perspective on how to make the online world safer for children

Published on
7 Feb 2023
Written by
Victoria Nash and Lisa Felton
OII Director Vicki Nash, and OII Visiting Policy Fellow Lisa Felton, share their reflections on Safer Internet Day 2023.

As we mark Safer Internet Day, OII Director, Associate Professor and Senior Policy Fellow Dr Vicki Nash and Visiting Policy Fellow Lisa Felton share, in their latest blog, their perspective on how we can make the online world a safer place for children.

Parenting and guiding children in an ever-changing online world is a huge challenge. In the UK we are lucky that our schools teach children about online safety, but we still need to make space for talking and listening to children about their experiences online. We also need proportionate and effective regulation to ensure that companies do their bit, ideally with children’s voices and interests at its heart. We don’t have this yet, but after five years in the making, it seems that the UK will have the ambitious, broad-ranging Internet safety legislation that we were promised back in 2017.

The Online Safety Bill is back before Parliament for its third reading, following a series of amendments wrought as much by party political chaos as by parliamentary scrutiny. Given the potential for further amendments, it is hard to be sure of the final result, but the journey to date has exposed serious weaknesses that will be hard to fix at this late stage.

Early aspirations were for a potentially world-leading law which could keep children safe and tackle the worst online harms. A wide-ranging consultation sought to identify consensus about both priorities and approaches, and the resulting Online Harms White Paper, released in 2019, proposed a bold new framework under which providers of online services would be given an overarching duty of care for their customers. That approach was still evident in both the draft legislation and the published Bill, but along the way the Bill has gained and lost new sections, like a gaudy Christmas tree decorated by fighting children. Controversy has raged about the inclusion (and exclusion) of ‘legal but harmful’ content for adults, whilst committee scrutiny has introduced valuable new measures, such as duties regarding age-gated access to online pornography and the publication of non-consensual intimate imagery. Despite a general sense of positive forward motion, it is hard not to feel that important opportunities have been lost along the way.

Five years is a long time on the Internet. When the first green paper was published there was no Truth Social, no BeReal, and TikTok had yet to make its mark. This was before Frances Haugen’s whistle-blowing testimony, before Molly Russell’s tragic death, before the pandemic that left us dependent on online spaces for human connection, education and work. In those five years we have learnt a great deal more about the best and worst features of social media and online platforms. In particular, the role of platform business models in driving an attention economy has come sharply into focus. This could and should have been reflected in the legislation we have in front of us. The most significant gap is a failure to focus foremost on regulating online services as systems that shape how we interact, rather than just as content hosts. This might sound horribly abstract, but it has some really important implications. A systems-based approach would mean holding companies responsible for all the design, moderation and governance decisions they take, rather than simply the content they host. That content is created, shared and responded to by individuals, but it is the platforms that decide whether and how it is amplified, who it is pushed to, and what measures to take to mitigate risk.

At present, the UK Bill tries to juggle both content and systems regulation and, as a result, does neither very well. The legislation mentions regulating algorithms and ‘functionalities’, and the imposition of a limited duty of care implies a recognition that companies must consider risks when designing their products. But the focus on content takes precedence throughout. This is a major missed opportunity. Systems-based regulation has several key advantages beyond better aligning duties with responsibilities. First, it is worth remembering that users of online services face a variety of risks, many of which are not really reducible to problematic content. There are harmful behaviours, such as bullying, grooming or fraud, whilst other risks to wellbeing arise from predatory corporate behaviour or unethical data exploitation. Regulation that prioritises content risks thus offers less protection than duties to design systems in ways that respect users’ interests more broadly. A second key benefit of a systems-focused approach is that it would require in-scope companies to address risks ex ante, rather than dealing with problematic content once it is posted. Surely it would be better to require companies to anticipate the risks that might arise around functions such as live-streaming, and to put measures in place to deal with them, than to depend upon their ability to remove illegal content once it has been created.

Unfortunately, the lack of focus on systems design is not the only obvious gap in the Online Safety Bill. By its very nature, the Bill addresses risks and harms, not opportunities and benefits. Preventing harm is perhaps the first duty of the state. But online, as elsewhere, risks often go hand in hand with opportunities, and difficult trade-offs are involved in deciding when to intervene and how far to go. In the context of the Online Safety Bill, this observation carries two implications. Of most concern is the possibility that over-zealous application of the legislation will reduce freedom of expression and access to information, despite the Bill’s explicit commitments to the contrary; this applies particularly to children and teenagers. A very careful balancing act must be achieved in the proposed use of age verification and age assurance technologies if the result is not simply to cut young users off from the services and practices they value so highly.

Secondly, moving from platform operators’ responsibilities to the regulator’s, it is vital that Ofcom’s independence and expertise are protected from undue political interference. Current proposals to allow both Parliament and the Secretary of State to intervene at key moments in the regulatory process risk damaging Ofcom’s ability to assess compliance in the round, with due consideration given to the difficult trade-offs that such decision-making must involve. Finding space for expert views in this process would also inspire confidence that such due consideration will be given. Academics have much to offer in providing evidence of risk and harm, whilst the voices of children and young people should also be heard if codes of conduct are to be informed by a rich understanding of what these online services mean to those who use them most. Finally, it is also essential to apply higher standards to larger providers, as in the Digital Services Act, which strikes a better balance between protecting children across all services and ensuring that the largest providers take steps that are reasonable and appropriate in relation to their services.

It is not too late for change, and for the UK to lead in moving to a systems-based approach. A tiered approach based on the size and reach of services may be a more proportionate way to achieve this, adding requirements for independent auditing, increased transparency and guidelines to ensure a more objective and robust approach. Ultimately, a greater focus on how services work, rather than a narrow focus on specific types of content, is more likely to adapt to new risks and drive effective results in the long term.

Vicki Nash is Director, Associate Professor and Senior Policy Fellow at the Oxford Internet Institute. Find out more about Dr Nash’s research interests and published work.

Lisa Felton is a Visiting Policy Fellow at the Oxford Internet Institute and Head of External Affairs Strategy and Global Policy Programmes at Vodafone Group.