
How Deepfakes Reinforce Misogynistic Objectification of Women

By: Ayoola Fadahunsi

According to a study of deepfakes conducted by Sensity AI, 90 to 95 percent of deepfakes are pornographic videos, and of those videos, 90% depict women. Celebrities, public figures, and everyday women alike face a real risk of finding themselves the star of a pornographic video they never participated in; although they were never part of the video, their faces suggest otherwise. The use of deepfakes in revenge porn and non-consensual pornographic videos strips away what little power women have in society and reinforces the narrative that women exist only for the "male gaze."

Throughout history, men have objectified women and suppressed their sexuality. Women's sexuality has long been defined by men's standards, from how women should dress to how they should behave in the workplace and at home. The use of deepfakes in revenge porn and non-consensual pornographic videos gives men more power over women and continues this sexist cycle. Unrestricted access to deepfake technology makes it easy for men to take revenge on ex-partners or fulfill their own fantasies. As deepfake pornography spreads, women in the public eye face yet another threat to their livelihoods. Female politicians are already scrutinized beyond measure; if fabricated sexual videos of them circulate, they will lose further credibility, as male counterparts and constituents focus on the provocative deepfakes rather than the real work they do every day.

Women already have to work twice as hard to accomplish half of what men do; adding another form of harm to their oppression is inherently immoral. Viewed pragmatically, deepfake porn damages women's careers and their standing in society. But the violation goes beyond that: it raises questions of privacy, consent, and responsibility. Because the women in these videos have not consented to the use of their faces and bodies, creating the videos is immoral. Their privacy is violated because their images circulate in the public sphere, where any user can access them. The values of privacy and consent are not upheld when deepfakes are used in non-consensual pornography. But how do we fix the problem, and who is responsible for the harms caused?

That question remains unanswered: who should be held responsible for this continued objectification of women? Some might argue that the person who creates the video should be held responsible; others, the distributor of the videos; still others, the developer of the deepfake code or app. I assert that distribution sites like PornHub.com and Reddit should monitor what is published on their platforms and remove videos that use the faces of celebrities, public figures, and ordinary people without their consent. They can use deepfake recognition algorithms to identify such videos and take them down. These distributors must take responsibility for what is shared because they control the platform; it would be far harder to track down each creator and compel them to remove a video, or to punish them for making it in the first place.
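The moderation approach described above can be sketched as a minimal, hypothetical pipeline in which a hosting site scores each upload with a deepfake detector and acts on the score. Everything here is an illustrative assumption, not any real site's system: the `Upload` class, the `deepfake_score` field, and the thresholds are invented for the sketch, and the detector itself is out of scope, represented only by a precomputed score.

```python
# Hypothetical sketch of platform-side deepfake moderation.
# Assumption: some detection model has already assigned each upload a
# deepfake_score between 0.0 (likely authentic) and 1.0 (likely synthetic).

from dataclasses import dataclass


@dataclass
class Upload:
    video_id: str
    deepfake_score: float  # output of an assumed deepfake detector


# Illustrative policy thresholds, chosen for the example only.
REMOVAL_THRESHOLD = 0.9  # high confidence: take the video down automatically
REVIEW_THRESHOLD = 0.5   # medium confidence: route to a human moderator


def moderate(upload: Upload) -> str:
    """Return the moderation action for one upload based on its score."""
    if upload.deepfake_score >= REMOVAL_THRESHOLD:
        return "remove"
    if upload.deepfake_score >= REVIEW_THRESHOLD:
        return "human_review"
    return "allow"


# Example queue of uploads with hypothetical detector scores.
queue = [Upload("a1", 0.97), Upload("b2", 0.62), Upload("c3", 0.12)]
actions = {u.video_id: moderate(u) for u in queue}
print(actions)
```

The point of the sketch is the division of labor the essay argues for: the platform, not the victim, applies the detector and bears the cost of removal, with a human-review band to limit false takedowns.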

Even if there are no established methods to hold people responsible for revenge porn and non-consensual pornographic videos, we must actively find ways to mitigate their harms against women. Governments must add deepfake pornography to the categories of content prohibited on pornographic sites. With the implementation and enforcement of such laws, distributors will be incentivized to reduce the amount of deepfake pornography. The government must actively work to protect women's rights and their privacy.
