
Expert comment: Why digital literacy on the curriculum is an opportunity for a more proactive response to AI in schools

Published on 17 Nov 2025
Written by Rebecca Eynon
The OII's Professor Rebecca Eynon welcomes the proposed curriculum changes, and urges an active role for young people in shaping AI.

Professor Rebecca Eynon, from the Oxford Internet Institute (OII) and the Department of Education, sees the proposed reforms to the school curriculum as a positive step towards equipping students to respond to social and technological change. However, she argues that these should not merely equip young people to be passive users of generative technologies, but should also empower them to actively shape AI's future.

The recent curriculum and assessment review recommends strengthening digital literacy at each key stage of the curriculum, to equip pupils for an era of rapid social and technological change. It includes recommendations to improve the clarity of the contents of the computing curriculum; to replace the much-critiqued GCSE Computer Science with a Computing GCSE that is broader in focus; and to map the teaching of digital literacy and the use of technology across all subject disciplines.

There is a lot that is positive in these recommendations. They are rooted in concerns among education experts about a lack of digital skills amongst students that needs to be addressed from primary school onwards; the desirability – but also the challenge – of teaching digital literacies across the curriculum; the low uptake of computer science, particularly by girls and those who are economically disadvantaged or culturally marginalised; and questions of how best to respond to developments in generative AI.

Indeed, these concerns echo many of the findings from our ongoing Economic and Social Research Council education project, Towards Equity Focused Edtech, in which we carried out rich ethnographic research in secondary schools in England. In significant contrast to the pervasive discourse around digitally savvy youth, we found students with notable gaps in core ICT skills such as word processing, managing files, or sending emails. In many schools there was a lack of clarity around who was responsible for teaching digital literacy, or where in the curriculum it should be taught. We also found significant variation in digital infrastructures across schools, leading to inequalities in access to and use of technologies for learning, as well as confusion and concern amongst educators and students alike, in all schools, about whether and how to use AI "appropriately".

Towards a proactive approach

The proposals put forward by the review will likely be welcomed by many schools that identify with these current problems. However, as the government takes these recommendations forward, it is important that they do not inadvertently lead to a reactive approach to AI. It is unquestionable that we need to "equip young people for a world that is changing quickly", but it does not follow that we must simply prepare them for some kind of inevitable AI future. We must instead recognise that young people (and indeed all of us) make the future, and are making it now. AI did not magically appear. It is made and used by people, and reflects past cultural, economic and political choices and values. Nor is AI fixed; it can be changed. AI is not, therefore, just something to react to, but something that people should actively shape in relation to the kinds of education, and indeed society, we want.

This requires, then, a rejection of any notion that AI, or the future, is inevitable, and a proactive, not reactive, response to AI in schools. An important strand of this response is the development of digital literacy for students. Three elements are needed to ensure this is done as a proactive response – one that promotes criticality, inclusion and responsibility.

Criticality

The review sets out important foundations of digital literacy (and relatedly media literacy), which will enable young people to have the knowledge and skills to engage with learning, and to participate in social life and use technology safely. It is important that young people are not positioned as “end users” of fixed AI technologies. Instead, they should be supported in becoming citizens who can use and engage with technology critically in the richest sense – including economic, political and cultural factors.

For example, students should be taught not only to identify misinformation and disinformation, but also to learn about the complex sociological, as well as technical, reasons why they occur and their social implications. Other areas of work could include the wider political economy of AI that favours powerful companies in certain parts of the globe, the environmental costs of AI, or ideas around surveillance capitalism. This approach would support young people in becoming not just responsible and discerning users of AI, but people who can potentially change it through their use, forms of refusal, or the re-design of AI.

Inclusion

Design is a key aspect of digital literacy, offering students ways to reflect on and make visible social injustices while examining how technology's affordances and values can support or hinder inclusion. This might involve creating digital artefacts that express community realities, using coding to explore bias and discrimination in AI, or participating in design projects that address the needs of their school or local community. Such projects could improve students' sense of self and awareness of inequities in school and society, and promote both a stronger sense of social responsibility and an awareness of the limits of technology in solving social problems.

Responsibility

Generative AI is error-prone, often biased, and can be inaccurate. Rather than holding the companies that build AI to account, individuals are tasked with developing the appropriate knowledge and skills to identify and deal with these problems, thus moving responsibility away from developers and onto individuals.

However, teaching students how to question and critique generative technologies should not be treated as a panacea for biased, unregulated, and problematic AI. There is a risk that a focus on digital literacies responsibilises young people for dealing with the significant problems of many AI products. Thus, beyond the national curriculum, there is a societal responsibility – one that does not fall on young people alone – to find ways to better govern and regulate AI that address the multiple environmental, legal and social costs of such technology.

Developing the agenda

The review is a productive basis for developing a digital literacy agenda that forms part of a proactive response to AI in schools. But in determining the new curriculum's details and how it will operate, it is important that varied voices and expertise are part of defining and setting the terms. This includes academic experts, those working in the third sector and, crucially, teachers. In the past, commercial voices have held too much sway.

Of course, the digital literacy curriculum can only go so far. It will be taken up and engaged with in varied ways by teachers and students, and is only one aspect of a proactive response to AI in schools. However, it is an important starting point in efforts to support young people and teachers in working towards social and educational change.

This article was first published as an Expert Opinion piece, available at the University of Oxford website.

