There are growing concerns over deep fake naked images of women circulating online.
The latest developments in artificial intelligence (AI) mean that apps can take a photo of a fully clothed person and generate a picture or video of them naked.
Dublin Rape Crisis Centre (DRCC) is calling on the Government to implement a full awareness campaign around deep fake images in light of this new technological risk.
On The Pat Kenny Show, DRCC chief executive Rachel Morrow said the images being created by these AI apps are “incredibly realistic”.
“You can use deep fake AI to create images, to create video, to create audio - they are incredibly realistic,” she said.
“They take the images, perhaps of somebody that you know, or a celebrity, a friend, a family member, and you put them into an AI generator, and what it spits out is sexualised content.
“Some people refer to it as deep fake porn.
“There's lots of websites that are easily accessible and it's extremely worrying, not just the content itself, but the proliferation and the accessibility of this type of image.”
While social media platforms still do some policing of nude images, these AI apps are being promoted to users through advertising on those same platforms.
“This was brought to our attention on a particular website by a journalist with The Journal and he was being fed through advertising a link to download an app that would generate this imagery for him,” Ms Morrow said.
“So, he queried this with [DRCC] and I believe that it has since been taken down, which is obviously very welcome.
“These websites are very, very accessible and there are hundreds of millions of videos and people who are victims of this.
“Many of them don't even know that they are a victim, because, you know, they aren't using the site.”
Ms Morrow said that because the images are AI-generated and are not actual naked pictures, some people believe these crimes are victimless, but that is not true.
“Our message is that there are many victims [of this crime] we hear about in Dublin Rape Crisis Centre,” she said.
“There's new legislation going through in the UK and many women have been speaking out about this problem because of that legislation.
“Many of these people are celebrities; there are politicians who have been particularly targeted.
"I suppose this is a new form of controlling women, because when women are successful - look at Taylor Swift - there was a lot of coverage about it when it happened to her and X was very slow to take down the imagery that was associated with her.
“But women are being targeted, successful women. I suppose for successful women or well-known women or celebrities, they have a platform that they can talk about this, but for other people who maybe don't have that platform it is very, very traumatising.”
Ms Morrow said Coco’s Law, the Harassment, Harmful Communications and Related Offences Act 2020, covers deep fake imagery, video and audio, but there have been no prosecutions relating to deep fake imagery as of yet.
Girl holding phone crying. Image: Alamy