
New Technological Advancements, New Ways to Exploit Women

What are deepfakes and how are they being used?

By KB

What is a deepfake?

Deepfake software uses a form of AI to swap faces in a video, creating the illusion that a person did or said something that never occurred. Several major deepfake tools are currently available, including "Faceswap," "DeepFaceLab," and "Deepfakes Web." Much of this software is free; anyone who downloads it can use it, and some versions are even available in app stores.

If you are more curious about the technology and details behind this software, check out this video below.
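At its most basic, a face swap is just one region of pixels replacing another; what makes modern deepfakes dangerous is that neural networks (autoencoders and GANs) generate a replacement face matched to the target's pose, expression, and lighting. As a toy illustration only (this is not how any of the tools above actually work internally), here is a minimal sketch in Python with NumPy that blends a "face" region from one image into another:

```python
import numpy as np

def naive_region_swap(target, source, box, alpha=0.8):
    """Toy illustration: blend a region from `source` into `target`.

    Real deepfake tools do far more: a neural network generates a new
    face matched to the target's pose and lighting. This sketch only
    alpha-blends raw pixels, to show the basic idea of replacing one
    region of an image with another.
    """
    y0, y1, x0, x1 = box
    result = target.astype(float).copy()
    patch = source[y0:y1, x0:x1].astype(float)
    # Alpha-blend the source patch over the matching target region.
    result[y0:y1, x0:x1] = alpha * patch + (1 - alpha) * result[y0:y1, x0:x1]
    return result.astype(target.dtype)

# Two stand-in 8x8 grayscale "images": a dark target, a bright source.
target = np.zeros((8, 8), dtype=np.uint8)
source = np.full((8, 8), 200, dtype=np.uint8)

swapped = naive_region_swap(target, source, box=(2, 6, 2, 6))
```

The gap between this crude pixel paste and a convincing deepfake is exactly what the neural network fills in, which is why the results have become so hard to detect.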

How are they being used?

The main selling point for the everyday user is 'fun.' Many viral videos exhibit this technology, making the tool seem like a low-stakes, "just for laughs" gag.

In the Ted Talk below, Supasorn Suwajanakorn speaks about some of the positive uses of this software. His intentions behind the technology, for example, were educational: it is often a better and more accurate experience to learn history from people who lived through a specific event, such as the Holocaust. The idea was to record Holocaust survivors and, with the help of AI, create interactive interviews, so that learners feel as though they are having a conversation with the survivor. The preservation behind this idea is an incredible way to mark history; however, the technology can easily fall into the wrong hands.

There have been cases of online users exploiting women's faces and bodies with deepfake software, in some cases to create pornographic content.

As Supasorn Suwajanakorn notes, as the technology gets better, it becomes harder to spot what is real and what isn't. This creates a new problem for women and for everyone who is a victim of nonconsensual deepfakes.

Who is being targeted?

In some instances, a person's face has been used in a video for no reason other than their photos being public, but many of the most publicized cases involve celebrities.

Kristen Bell, an actress known for Frozen, The Good Place, Veronica Mars, and more, spoke out on Vox in 2020 about being a victim of pornographic deepfake videos. In these videos, Bell's face was superimposed onto the bodies of porn performers. As she says in the interview, "It's hard to think about that I'm being exploited."

Scarlett Johansson is another well-known celebrity who has been a repeated target of pornographic deepfake videos. According to Vox, 96% of deepfakes are pornographic.

Given the long history of women being denied control over their own bodies, this new form of harassment through nonconsensual videos is alarming.

Previously, deepfakes were easy to spot, but it has become increasingly difficult to tell the difference, blurring the line between what is real and what is fake. It bears repeating: not everything you see online is true.

Isn’t this illegal?

As of now, the answer is both yes and no. Deepfakes themselves are not illegal. However, if one is used in a pornographic fashion, the victim may be able to make a copyright or defamation claim. Legal options for victims are few, and so the majority of this harassment goes unresolved or unnoticed.

What are the next steps?

An article by Fight The New Drug explains that victims of pornographic deepfakes suffer trauma similar to that of sexual assault survivors.

The psychological effects of these videos are far more detrimental than the "fun" the software is marketed as, creating an unsafe online environment, especially for women.

Their list for victims’ wellbeing is as follows:

  • Connect with online organizations who support survivors
  • Take your case to law enforcement or hire an attorney
  • Record evidence of your abuse
  • Cope with the psychological effects of your abuse
  • Remember that this isn’t your fault
  • Focus on what you do have control over
  • Take precautions for the future
It is both discouraging and disappointing to see this as a consistent issue, with such an alarming number of these videos made nonconsensually. Questions float around, such as, "Am I doing enough to protect myself?" and "Should I stop posting on social media altogether?" There is no easy answer to this; there is no easy answer to the sexualization of women, or the increase of rape in my city, or the laws trying to go against women's rights, or the pornographic deepfakes circulating the web.

So, to gain a bit of control, I research and share what I've learned in hopes that I can bring more awareness to such issues.

NOTE: This is not at all to undermine the validity of sex workers. Sexual exploitation, harassment, or abuse is never justifiable, in any form. This article points out the exploitation of women through deepfakes, where unknowing victims are used without their consent, creating a new form of sexual harassment.


