AI-Powered 'Nudifying' Tools Are Now Used To Degrade And Dehumanize Women
What in the new level of hell is this?
Even if you've never taken a single nude picture, a photo of your 'naked' body could be circulating somewhere on the internet right this very second.
Yes, you read that correctly.
The act of 'nudification' - taking an image and stripping the subject nude using AI-powered software - is the latest internet trend that is gaining a worrying amount of traction. Anyone with an internet connection can do it at no cost. And millions of people already are.
One website in particular - which I won't name here because I know many creeps read my work for some reason - is now exploding with popularity. It has reportedly received 38 million hits in the first seven months of 2021. And its bio reads:
'Undress any girls with any dress. Superior than Deepnude. The threat to girls.'
Well, at least they got that last bit right. This disgusting practice is absolutely a threat to girls. And it's precisely things like this that only add more fuel to the fire of sexual harassment and abuse women already receive both online and in real life.
But why are websites like this even allowed to exist? And what can be done about this scary internet trend?
The 'nudification' and 'deepfake' problem is spiraling out of control
It's not the first time AI has been used for illicit purposes. Another nudifying tool, called DeepNude, was launched in 2019, but it was quickly taken down after the potential for abuse became apparent. Yet even though DeepNude is gone, its code is still in circulation - including on the new 'nudifier' website that has recently boomed in popularity.
The site in question even boasts that its 'state of the art' deep-learning algorithm is so sophisticated there is not a woman in the world, regardless of race, nationality, body shape, or type of clothing she is wearing, who is safe from being 'nudified.' But it obviously doesn't work on men.
And to make matters worse, the website isn't confined to the dark web or even delisted from Google's search engine index. It operates out in the open and even encourages users to promote it on social media.
They actually have a 'referral' program that works similarly to the one we have here, on Medium. Users can publicly share a personalized referral link and earn rewards for each new person who clicks on it. And they often share it with pictures of 'nudified' women.
What's particularly worrying about this is that the vast majority of people using these tools target people they know, according to deepfake experts. This means that if anyone you know has used that website, chances are they might have used your picture. Or a picture of your daughter, sister, wife, or female friend.
If this utterly horrifying practice doesn't show the grim and increasingly dangerous reality of being a woman on the internet, I don't know what else will.
But even before 'nudifying' became a thing, women had already been victims of a similar practice for a couple of years - AI-generated 'deepfake' porn. Unlike 'nudifier' tools, this technology superimposes victims' faces onto existing nude bodies.
And it's still being used today.
We simply can't keep dismissing the issues of 'deepfakes' and 'nudification.' Both of these trends can have a profound impact on the lives of many women and girls.
Yet another way to make life more dangerous and difficult for women
Some years ago, one of my nude pictures started circulating in my friend group. The guy I had sent it to originally - who I was dating at the time - thought it would be 'fun' to share it with his mates. And then his mates shared it with other people. And then eventually I found out about it. Someone I used to work with commented that my butt looked better naked than in the jeans I was wearing that day.
Yikes. It's not a great experience, let me tell you that. It's humiliating. Degrading. And embarrassing.
Yes, I took that photo. And I consented to that photo being seen by the guy I was dating. But I didn't agree to anybody else seeing it.
Now, imagine if I hadn't taken that picture, yet ended up in a similar situation - because some asshole decided to 'nudify' one of the photos available on my social media profiles. It no longer matters whether these images are authentic; they create a new reality in which the victim must live. Without her consent. And even though the picture isn't real, that doesn't make the situation any less humiliating or embarrassing. It still very much is.
This is a sickening trend.
It warrants nothing but condemnation.
Because it not only reduces women to our body parts, with no rights or agency, but it's also yet another way to make our lives more dangerous and difficult. Just think about it. How many women will lose their jobs or relationships because of this? How many will have their lives put at risk?
There are a lot of places in the world where a woman could face real violence just for posting a 'provocative' selfie, let alone if fake naked pictures of her were circulating. Not that long ago, a Saudi Arabian woman was allegedly murdered and buried in the desert by her brother because of her Snapchat selfies. Another woman from the same country was reportedly murdered just for having a secret Instagram account.
We must do more to prevent and criminalize image-based sexual abuse. Otherwise, it will never stop.
We aren't doing enough
One in five women has experienced online sexual abuse in recent years. But despite the number of women who fall victim to it and the immeasurable harm these websites and tools pose to women everywhere, there has been little meaningful intervention to date.
In 2015, England and Wales introduced 'image-based sexual abuse' laws which made it illegal to disclose 'private sexual photographs and films with intent to cause distress.' But this legislation only relates to images or videos originally consented to by the victim. This means that distributing intimate sexual images online created without consent using 'deepfake' or 'nudifying' digital technologies falls outside the law and goes completely unpunished.
The victims of these disgusting practices - who are almost exclusively women and girls - are often left with little to no legal recourse as their lives are turned upside down.
And even when these images or videos are reported, they are rarely taken down. You'd likely have a much easier time getting a social media platform to remove copyrighted material - as required by law - than 'deepfake' nudes. Because social media companies typically face zero liability for it. So why would they care?
In the UK, former Culture Secretary Maria Miller is now calling on the government to address the now apparent failings of the 2015 legislation and criminalize 'deepfake' and 'nudification' image abuse.
It's about time distributing sexual images online without consent was recognized for what it is - a sexual offense. And not only in the UK.
Otherwise, these websites will keep popping up, subjecting many women and girls to humiliation, degradation, and dehumanization. Apathetic tech companies will keep aiding their global spread. And people who engage in this disgusting practice will continue to see it as merely a joke, a laugh, or just something men do on account of being men.
It isn't a joke. And it isn't something that men 'just do.'
It's an unacceptable sex crime, and we must do more to prevent it.
As long as we're allowing these websites and tools to function, every woman and girl with publicly accessible images of herself is vulnerable.
We can't let that happen.
We can't let the online world become yet another space where women and girls suffer the same violence and harassment they already experience offline.
This story was originally published on Medium.