
New report calls for decisive government action to help build public trust in AI


Published on 24 Sep 2025
Written by Helen Margetts and Jakob Mökander
A new paper from the Tony Blair Institute for Global Change, featuring a guest contribution from the OII’s Professor Helen Margetts, calls for decisive government action to build public trust in AI.

A new paper from the Tony Blair Institute for Global Change (TBI), featuring a guest contribution from the OII’s Professor Helen Margetts, warns that without decisive action to include the public in AI development, a lack of trust will undermine the government’s growth agenda and leave disadvantaged groups behind. Entitled ‘Building Public Trust in AI to Accelerate Adoption’, TBI’s paper argues that with AI adoption central to the government’s growth strategy, as set out in the AI Opportunities Action Plan, ministers must take urgent steps to ensure the entire population is equipped with the confidence and skills to benefit. Otherwise, public scepticism will undermine progress.

Jakob Mökander, Director of Science and Technology Policy at TBI and OII alumnus, said: “We simply can’t ignore the opportunity AI presents for growth – in building a thriving tech sector, attracting more capital to the UK, and enhancing productivity across the economy. Off the back of last week’s deal with the US, Britain has the perfect window to show the world what’s possible, and lead in tech diffusion and application.

“To realise this, we need to prove to people that AI will work for them, not happen to them. We need to clear the fog and mystique around AI development and show people it’s a helpful tool just like any other.

“Building this trust across the entire country, not just those who are already evangelists for AI, must be an urgent priority if we want to bring the benefits of AI to Britain, and realise the government’s growth agenda.”

To bridge the trust gap, the Institute argues, the government must follow a strategy of active public engagement combined with policies that build justified trust in AI.

The authors recommend a two-pronged strategy. First, the government must ensure that AI systems are worthy of the public’s trust. This means ensuring that all AI tech is robustly tested for safety, with proportionate regulation in place to protect people’s interests.

Second, the government must deliver a national programme of outreach and communication efforts to improve public attitudes. At the centre of this strategy should be the principle of talking about AI in a ‘human’ way; communication should not get stuck on infrastructure and the ‘nuts and bolts’ of the tech but showcase how it can help in the day-to-day.

Also recommended is a national rollout of AI training programmes, developed with employers, unions, professional bodies and further education providers, that target people of all backgrounds, not just those already working with AI. These efforts should be supported by engagement initiatives, potentially including ‘AI Open Houses’ that invite citizens into cutting-edge AI labs, and an accessible, publicly broadcast lecture series.

With AI central to the government’s growth agenda, the Institute argues that ministers must build public confidence through a high-priority national programme of AI education and outreach to ‘clear the fog’ surrounding its development.

The authors’ recommendations are based on new polling from TBI and Ipsos, which finds that 39% of Britons see AI as a risk to the economy. Critically, analysis of the new polling data shows that trust and optimism about AI are closely correlated with regular usage, which in turn is linked to being younger, male and on a higher income.

Guest author Professor Helen Margetts, Professor of Society and the Internet at the Oxford Internet Institute, said:

“People are adopting AI tools more rapidly than any previous digital technology, with half of respondents using these tools in the last 12 months and 26% using them weekly. But enthusiasm for AI is strongly related to people’s trust in public institutions to manage and regulate these technologies, so we need to ensure that we have an institutional landscape fit for our AI future.”

The survey of 3,727 UK adults reveals that AI usage in the UK is highly varied; while more than half of respondents had not used AI in the last 12 months, 26% use it weekly. Critically, 38% cite a lack of trust in AI outputs as the main barrier to adoption, higher than any other factor.

The polling demonstrates that usage and confidence are sharply divided by age, gender and income. Adults under 35 are far more likely to view AI positively, while older demographics are significantly more sceptical. Women are six percentage points less likely than men to see AI as an opportunity for society, even when accounting for income and education.

Confidence in using the technology is also associated with income: higher earners are the most likely to say they are confident using AI, even in sectors most exposed to disruption, such as finance, professional services and IT.

The findings highlight a striking correlation: 56% of people who have never used AI see it as a societal risk, compared with just 26% of those who use it weekly. This ‘experience gap’ means that those already benefiting from AI are more likely to trust it, while others remain sceptical.

Mökander concluded:

“There is currently a gap between the government’s narrative and public attitudes towards AI. Building trust in AI will be key if the government is to succeed in delivering the AI Opportunities Action Plan and its wider growth agenda.

“Most people don’t need or want to know the technicalities of large language models, semiconductors and neural networks. If we talk about AI only at this level, we just put up more of a smokescreen.

“What they want is reassurance that the technology is safe to use and works in their interests. They want to understand how it’s going to improve their lives: their children’s education and career prospects, their jobs, their pockets, and the way they interact with government.

“Governments have a legitimate role not only in ensuring that AI is safe and effective, but also in building healthy public attitudes towards the risks and benefits. With good governance and increased public engagement, we can build the trust needed for the government’s growth agenda.”

Notes to editors

About the report

Download ‘What the UK Thinks about AI: Building public trust to accelerate AI adoption’. Authors: Jakob Mökander, Jo Puddick, Naman Goel, Alan Wager, Tim Rhydderch, Helen Margetts, Daniel Cameron and Ben Roff.

About the data

The polling was conducted by Ipsos, using its KnowledgePanel, on behalf of TBI between 30 May and 4 June 2025, with a nationally representative sample of 3,727 UK adults.

