Channel 4 journalist Cathy Newman has described the ‘dehumanising’ moment she came face to face with an explicit deepfake pornography video of herself.
The 49-year-old, who fronts the channel’s evening news bulletins, was investigating videos made with artificial intelligence when she was made aware of a clip that superimposed her face onto the body of an adult film actress performing sex acts.
Ms Newman is one of more than 250 British celebrities believed to have been targeted by sick internet users who create the uncannily realistic videos without the consent of their victims – a practice that is set to become a criminal offence in Britain.
Footage of the veteran reporter viewing the video of her computer-generated doppelganger aired in a recent report on Channel 4 News; she says the experience has haunted her, not least because the perpetrator is ‘out of reach’.
The investigation into five popular deepfake sites found more than 4,000 famous individuals who had been artificially inserted into adult films without their knowledge to give the impression they were carrying out sex acts.
Channel 4 journalist Cathy Newman says she felt ‘utterly dehumanised’ after viewing a deepfake pornography clip featuring her face superimposed on the body of an adult actress
This is not Cathy Newman – but a deepfake video featuring her face superimposed on that of an adult actress in a pornographic film
Deepfake apps have been advertised on social media despite pledges to crack down on their proliferation
Footage aired as part of the report showed a clearly disturbed Newman watching as her AI-generated double crawled towards the camera.
‘This is just someone’s fantasy about having sex with me,’ she says as she watches the disturbing clip.
‘If you didn’t know this wasn’t me, you would think it was real.’
Writing in a national newspaper, Ms Newman said she had expected to be ‘relatively untroubled’ by watching a video a twisted stranger had made superimposing her face on that of an adult performer – but she came away from the experience ‘disturbed’.
‘The video was a grotesque parody of me. It was undeniably my face but it had been expertly superimposed on someone else’s naked body,’ she said in The Times.
‘Most of the “film” was too explicit to show on television. I wanted to look away but I forced myself to watch every second of it. And the longer I watched, the more disturbed I became. I felt utterly dehumanised.
‘Since viewing the video last month I have found my mind repeatedly returning to it. It’s haunting, not least because whoever has abused me in this way is out of reach, faceless and therefore far beyond the limited sanctions presently available.’
The broadcaster also told BBC Radio 4’s Today programme she found the footage ‘violating’.
She continued: ‘It was a sort of disgusting fantasy that someone had had and decided to create of me in all sorts of different sexual positions.
‘It was really disturbing and violating is really the best way to describe it.’
In its investigation, Channel 4 News claimed the individuals most targeted by deepfake pornography are women who are not in the public eye.
Newman also spoke to Sophie Parrish, who started a petition before the law was changed, after the person who created digitally altered pornography of her was detained by police but faced no further legal action.
She told the PA news agency in January that she was sent Facebook messages from an unknown user, which included a video of a man masturbating over her and using a shoe to pleasure himself.
‘I felt very, I still do, dirty – that’s one of the only ways I can describe it – and I’m very ashamed of the fact that the images are out there,’ she said.
Deepfake apps have proliferated online as artificial intelligence technology becomes more advanced – both in the speed with which it can generate pictures and video and in the realism of the footage it can produce.
Matters came to a head in January when singer Taylor Swift became a victim of deepfaking, with sexually explicit AI-generated images of her – created using Microsoft’s AI image-generation software – shared widely on social media.
The sharing of sexually explicit deepfake images is already a criminal offence in Britain under the Online Safety Act – but new proposals will also criminalise the creation of the pictures with malicious intent.
While Ms Newman has voiced her support for the government’s proposals to criminalise the creation of malicious deepfakes, she believes the legislation may not be able to bring down even a fraction of those behind the sick videos.
She added: ‘I think one of the problems with this new law that’s being proposed is that this is a worldwide problem.
‘So we can legislate in this jurisdiction, but it might have no impact on whoever created my video or the millions of other videos that are out there.
‘I think speed is also of the essence here, because we discovered in our research that more deepfake porn videos were created in 2023 than in all other years combined, so this is increasing exponentially.’
Taylor Swift is among those to be targeted by sick internet users creating fake sexually explicit images using artificial intelligence software
Love Island contestant Cally Jane Beech took to social media in January to reveal she had been a victim of deepfake porn
Advancements in artificial intelligence technology mean deepfake videos can be made using just a single picture of a person’s face – with the hallmarks of artificially generated images becoming harder to spot as the tech improves.
Channel 4 estimates that more deepfake videos were created in 2023 than in all other years combined since the advent of the trend.
The government says it wants the creation of sexually explicit deepfake images to be punishable with an unlimited fine.
Victims and safeguarding minister Laura Farris said: ‘The creation of deepfake sexual images is despicable and completely unacceptable irrespective of whether the image is shared.
‘It is another example of ways in which certain people seek to degrade and dehumanise others – especially women.’
Cally Jane Beech, a former Love Island contestant who has herself been a victim of deepfakes, has welcomed the new proposals.
‘What I endured went beyond embarrassment or inconvenience,’ she said.
‘Too many women continue to have their privacy, dignity, and identity compromised by malicious individuals in this way and it has to stop. People who do this need to be held accountable.’
The former Miss Great Britain previously shared her shock in a video on Instagram after being sent a deepfake of herself.
She said in January: ‘I was sent a picture from someone that said, “this is a nude picture of you”, and I said that’s impossible because I’ve never sent a nude picture or posted any nude pictures anywhere.
‘It was what looked to be me – my face, my body, but an AI version of boobs and private parts. The first thing I did was panic.
‘Obviously someone has gone on there and done that picture of me and put it on the internet. I want to raise awareness of this – I think it is massively important.
‘I knew there was such a thing as AI but I didn’t realise this kind of thing was happening.’