AI-generated clips of political figures saying things they never said are a threat to democracy. But there are already laws that the Crown Prosecution Service could use against political deepfakes.
Advanced AI technology can now create convincing images, audio clips and videos of people in places they’ve never been, doing things they never did and saying things they never said. Deepfakes of political figures could be misused to spread false information, shift public opinion and influence elections.
Last year’s elections in Slovakia saw a tide of deepfakes unleashed across social media, and both Rishi Sunak and Keir Starmer have already been targeted with deepfake content. But the police have taken no action, and the Electoral Commission has warned that it has no powers to tackle this scourge. This means that fake, AI-generated videos, photos or audio clips could warp the UK’s democratic debate and the outcome of our elections.
The Online Safety Act will presumably require social media platforms to take action, but it’s not yet clear how it will be implemented. Given that political deepfakes are already proliferating, this leaves the next general election – likely taking place within a matter of months – wide open to interference.
Legal advice commissioned by Good Law Project (which can be read here and here) says that deepfakes relating to political figures could be illegal. The Representation of the People Act 1983 includes a provision that anyone who “makes or publishes any false statement of fact in relation to the candidate’s personal character or conduct” before or during an election for the purpose of influencing its outcome “shall be guilty of an illegal practice”. And the Online Safety Act 2023 creates a new criminal offence of sending a message intended “to cause non-trivial psychological or physical harm to a likely audience”.
But to date, there have been no prosecutions of this kind in the UK. The Crown Prosecution Service is failing the public on two counts: it is letting offenders off the hook and leaving prosecutors to struggle with outdated guidance.
That’s why we’ve written to the Crown Prosecution Service, asking it to confirm that deepfakes aimed at influencing an election could be illegal and to publish updated guidance in this rapidly moving area.
There’s no time to waste. It’s time to safeguard our democracy by taking action on political deepfakes.