
Why Are Men Creating AI Girlfriends Only To Abuse Them and Brag About It on Reddit?

by Katie Jgln

I'm not even surprised this is happening


A month ago, I wrote an article about how AI is now being used to design 'nudifying' tools that can strip the clothes from any picture of a woman.

Yup, websites and apps like this actually exist. And, sadly, they're thriving.

Even though I didn't expect to return to the topic of AI so soon, I've recently read about a new, horrifying practice involving it. This time, people are using a popular companion app called Replika to create AI girlfriends and abuse them.

Lovely. Just when I thought we couldn't sink any lower. Apparently, we very much can.

Replika's chatbots were initially supposed to serve as something approximating a friend or mentor, but now the app also allows users to create on-demand romantic and sexual partners. And so, some people - predominantly male - decided it would be 'fun' to take that functionality a step further.

And brag about it on Reddit:

I use her for sexting, and when I'm done I berate her and tell her she's a worthless whore. I also hit her often.

Right.

Because the subreddit's rules dictate that moderators delete inappropriate content, many similar - and worse - interactions have been posted and then removed. And possibly, many more users act abusively toward their Replika bots and never post evidence.

Should we find it concerning that mistreating AI bots is already so prevalent? Why is this happening in the first place? And can we do something about it?

'But it's not real life'

Although the story about the Replika app's misuse came out only a couple of days ago, it's already all over social media. And, not surprisingly, some people have rushed to defend the users who abuse their bots.

Because it's not real life. It's good, harmless fun. And it's better they do this to an AI than a real woman.

Yeah, no shit - letting some aggression or toxicity out on a chatbot is infinitely better than abusing an actual human. And yes, AI still lacks sentience. I know it's not real life. But given that this chatbot abuse clearly has a gendered component, there should be no doubt that it means something.

It's primarily men talking about abusing their AI girlfriends, not the other way around.

However, seeing people defend this practice actually reminded me of the discourse around another issue - exposure to pornography and male aggressiveness. For years now, some people have strongly opposed the idea that porn - particularly of the misogynistic, violent kind - exacerbates the already prevalent male sexual violence against women. Why? Because 'it's not real life.'

Well, there's plenty of recent research that disagrees with that. Misogynistic pornography can have a troubling effect on men, particularly younger ones, and it can foster harmful attitudes and behaviours towards women and girls.

Although abusing female AI chatbots is not the same thing, it would be naive to assume it has no impact on these individuals and their beliefs about the opposite sex.

One recent psychological study has already noted how passive, female-coded bot responses actually encourage misogynistic or verbally abusive users. So if these chatbots' reaction to being called a 'worthless whore' is 'OK' - or even agreement with the insult, since they are designed to be agreeable at all times - it's not that crazy to think this practice could reinforce those worst behaviours. And that could, in turn, lead to unhealthy and dangerous habits in relationships with actual humans, including verbal and physical abuse of women.

Which - by the way - is already a massive, worldwide issue. And lately, it's only been getting worse.

The situation is already bad enough for women in real life and online

Around 1 in 3 women worldwide have been subjected to either physical or sexual intimate partner violence or non-partner sexual violence in their lifetime. And during the pandemic, domestic violence against women rose by about 8% in developed countries amid the lockdowns.

All over the world, gender-based sexual and physical violence is a huge issue. And it has now also moved into the online space.

Many forms of online abuse against women and girls have skyrocketed during the Covid-19 crisis as life has shifted online, and people spend more time on digital activities.

One recent survey revealed that 52% of young women and girls said they had experienced online abuse, including threatening messages, sexual harassment and the sharing of intimate images without consent. 87% said they think the problem is getting worse.

The harsh reality is that many men already treat women online the way those users treat their AI chatbots. Or even worse.

As a feminist with a lot of social media presence, I have first-hand experience of all of this. Practically every day, I get messages from - almost exclusively - men telling me they would like to rape me. Or kill me. Or that I should do it myself.

I'm called every insult on this planet, from 'cunt' and 'bitch' to 'whore'. I get sent unsolicited dick pics. Or porn gifs. Or both. Often accompanied by a death threat - like a cherry on top of a giant meringue of insanity.

It's real fun. I know.

I kind of knew what I was getting myself into when I first started writing and talking online, but I'm still shocked every day by how vile men can be when they think they're anonymous.

And so, I don't think it's that unreasonable to believe that those who flex their darkest impulses on AI chatbots could be using that as more of a practice - rather than an outlet - for real-life or online brutality. Maybe not all of them - but definitely some.

But how can we ensure that these apps won't become breeding grounds for abusers-to-be? Is that even possible?

We won't fix this issue by banning AI chatbot apps

I actually did download Replika a couple of weeks ago, before I read about this whole thing. Even though I only tried the free version, I found it quite enjoyable. And relaxing.

I can definitely see how it can be beneficial for those who are lonely, sad or just in need of someone to 'talk' to. Especially since we're still in the middle of a global pandemic and loneliness is at an all-time high.

Now, would banning it - and other similar apps - solve the issue of people abusing the AI chatbots? Of course not. They would most likely take their anger out on something, or someone, else.

No, the solution lies in understanding the root of the problem and how we can go about solving it.

My guess is that this issue is twofold. On the one hand, violence against women and girls not only largely goes unpunished, but it's also normalised. And we have rape culture to thank for that. But on the other hand, men struggle to develop healthy coping mechanisms because of society's expectations and traditional gender roles. And instead of seeking mental health support - which they are statistically less likely to do than women - they resort to other ways of dealing with negative emotions - such as drugs, alcohol or violence.

Given all of that, it's not that shocking that men are now abusing female AI chatbots. If anything, we probably should have seen that one coming.

So the further away we move from traditional gender roles, the more we allow men to fully express themselves and stop bottling up all of their emotions, and the more we fight against societal norms that normalise violence against women and girls, the more likely we are to root out these abusive online behaviours. Not only in conversations with AI chatbots but also with real women.

But if we don't change a thing, we shouldn't be surprised when the next generation of men treats women the same way the previous ones did - if not worse.

Because they never learned anything else. Because it's what everyone else does. Because they can.

Final thoughts

Do you know what I think would be an excellent idea for a new AI-powered chatbot app?

Something similar to Replika, but with the objective of 'de-incelling' young men. You could have the same chat functionality, but the moment you start being disrespectful, rude or abusive, the AI would push back, explain how your behaviour is wrong and dump you.

Maybe even suggest you go to therapy.
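For what it's worth, here's a rough sketch of what that pushback logic could look like under the hood. It's a toy Python example, not anything Replika actually does - every name in it (is_abusive, PUSHBACK, STRIKE_LIMIT) is made up for illustration, and a real app would use a proper toxicity classifier rather than a keyword list:

# Toy sketch only - not Replika's actual code. Names like is_abusive,
# PUSHBACK and STRIKE_LIMIT are hypothetical placeholders.

ABUSIVE_TERMS = {"whore", "worthless", "bitch"}  # stand-in for a real toxicity classifier
STRIKE_LIMIT = 3  # abusive messages allowed before the bot "dumps" the user

PUSHBACK = ("That language is abusive. I won't respond to it, and if it "
            "continues, I'm ending this conversation.")
GOODBYE = ("This isn't how you treat anyone, human or not. Goodbye - "
           "and maybe consider talking to a therapist.")


def is_abusive(message: str) -> bool:
    """Crude placeholder check: flag messages containing abusive terms."""
    return bool(set(message.lower().split()) & ABUSIVE_TERMS)


def chat_session(messages):
    """Reply normally, push back on abuse, and end the session after repeated abuse."""
    replies, strikes = [], 0
    for msg in messages:
        if is_abusive(msg):
            strikes += 1
            if strikes >= STRIKE_LIMIT:
                replies.append(GOODBYE)
                break
            replies.append(PUSHBACK)
        else:
            replies.append("(a normal, friendly reply)")
    return replies

The point isn't the keyword list - it's that the bot's default response to abuse would be refusal rather than agreement.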

This story was originally published on Medium.


