
Case update 31 May 2024

Ofcom must revise rules on political deepfakes

Political deepfakes could upend this election, but Ofcom’s draft guidance to social media companies falls short. It’s time for the regulator to spell out what the law demands.

With only five weeks to go before the country goes to the polls, the fight against political disinformation has never been more important.

There’s nothing new about people publishing dubious statistics or questionable claims during an election. But political deepfakes – pictures, audio or video of political figures created with AI tools – can be made with a few clicks, and spread like wildfire through social media networks primed to boost the most extreme content.

Technology is outpacing legislation, but – as we pointed out to the Crown Prosecution Service last month – the Online Safety Act sets out provisions that could help to protect voters from political deepfakes.


The communications regulator Ofcom is preparing guidance on how social media companies should navigate these choppy waters, laying out their duties to tackle content that is illegal under the act. But legal advice commissioned by Good Law Project says its latest draft is “deficient”, which could let social media companies and other online service providers off the hook.

Deepfakes are only mentioned twice in Ofcom’s draft, despite their increasing influence on elections around the globe, and it doesn’t make clear how to tell whether a deepfake might be illegal content.

Ofcom admits that it can be hard to tell whether content is illegal by just looking at what it says, and suggests that companies will need to assess “contextual information”. But it doesn’t say how companies can take this contextual information into account – despite the fact that tools designed to detect deepfakes already exist.

Unless the regulator updates its guidance, social media companies and other communication platforms won’t fulfil their duty to tackle deepfakes.

It’s too late to bring in new legislation to cover this election, but existing laws do provide some of the tools we need. It’s time for Ofcom and the Crown Prosecution Service to enforce these rules and take meaningful action against political deepfakes. Our democracy is at stake.

  • In our fractured media environment, it’s easy for political deepfakes to fly under the radar. If you spot one, let us know at legal@goodlawproject.org