

Claire Nguyen

Mrs. Shaffer

English 1B

29 April 2024

Image and Likeness: An Exploration of the Harms of Deep Fake Pornography

AI is taking the world by storm, and seemingly every aspect of the professional world has been influenced by its effects. It has left especially striking and controversial marks on the entertainment industry. For example, the writers of the recent Disney movie Wish have

come under fire for their lyrics sounding AI-generated. These lazy shortcuts may lead to some

funny, although slightly annoying, consequences, but there is a much more sinister side to

AI-generated entertainment content, specifically in one of its most naturally sinister, though lucrative, corners: pornography. With the rise of AI image generators, it has become incredibly easy to produce deep fake pornography, in which one person’s face is superimposed onto the body of another person performing pornographic acts. This raises many questions lawmakers

will need to answer. Is non-consensual deep fake pornography a form of harassment? Is it

protected speech to share these images and videos on the internet? How can we even catch this

kind of content online to take it down, and can we prosecute its proliferators? Each of these

questions requires a certain amount of nuance to answer. However, in most cases, the creation of deep fake pornography is immoral but legal, while the online sharing of deep fake pornography is gravely immoral and can and should be illegal.


Before continuing, it is prudent to provide definitions of common terms to be used in this

essay, most prominently deep fake and pornography. The definition of pornography is hotly debated, but for the purposes of this paper, the working definition of pornography will be

“sexually explicit material” (Willoughby). However, many other caveats may be added to this

definition, such as that the material in question must have been created with the intention to

arouse. Most scholars agree that it is nearly impossible to define pornography, but that with

proper context, it is easy to determine what is or is not pornographic. The definition of a “deep

fake” is much simpler by comparison. A deep fake is “algorithmically synthesized material

wherein the face of a person is superimposed onto another body” (Karasavva). Consolidating these two definitions, deep fake pornography is a form of AI-generated content in which one person’s face is superimposed onto another body and the content is pornographic in nature.

Deep fake pornography is a widespread phenomenon that targets certain demographics

much more than others. Deeptrace Labs published a study entitled “The State of Deepfakes” with some shocking statistics about the proliferation of deep fake pornography. As of 2019, there were 14,678 deep fake videos online, a 100% increase since 2018, and 96% of these videos were pornographic in nature (Ajder 8). Additionally, the people whose faces are used in deep fake pornography, herein referred to as either subjects or victims of deep fake pornography, overwhelmingly fall into a few populations. According to “The State of Deepfakes”,

essentially 100% of the subjects of deep fake pornography are women, and 99% of them work in

the entertainment industry (Ajder 8). By numbers alone, one can see that deep fake pornography

is becoming exponentially more widespread and that it targets some of society’s most vulnerable

and most hated.


Though there is little contemporary research on the subject, it can be assumed that the

creation of this content is occurring much more frequently now than in 2019. In 2022, the release

of Stable Diffusion, an AI image generator that can be downloaded directly to a private machine, made the production of AI pornography easier than ever. The model was trained on pornography, allegedly including child pornography, and places no restrictions on requests because its code is open source (Hern). At this point, anyone whose images are publicly available online can be made the subject of deep fake pornography. Anyone with a computer can create deep fake pornography

out of any image they desire, and the socio-psychological effects of this are yet to be studied.

In the past, the sharing of sexual media without the consent of the person on screen has

been called “revenge porn,” and deep fake pornography can be considered a form of this type of

harassment. In the past, the risk of becoming a victim of traditional revenge porn could be mitigated by making sure no sexually explicit images or videos of oneself existed. With the advent of deep fake pornography, one must make sure there are no available images of one's face at all, which is impractical and sometimes impossible in this day and age. The onus has thus shifted from the victim to the government, or even to society as a whole.

There are many moral subtleties which color the issue of deep fake porn. For example,

there is the pervert's dilemma. The pervert’s dilemma is the idea that “ethically, there is little

separating a deep fake from a sexual fantasy” (Maddocks). Both sexual fantasies and deep fake pornography take a person who did not consent to be viewed sexually and disregard this lack of

consent. However, while most people would be embarrassed to be found in possession of deep

fake porn, they find sexual fantasies to be a regular part of human sexuality.

The question then becomes whether deep fake porn should be treated as a regular part of

human sexuality as well. In “The Real Threat of Deepfake Pornography: A Review of Canadian
Policy,” author Vasileia Karasavva argues that the dangers posed by the consumption of deep

fake pornography make it an automatically immoral practice. She states deep fake pornography

“could create unrealistic expectations about sexual performance, likes and dislikes, and the

willingness of partners to engage in certain acts” (Karasavva). Deepfake pornography takes a real person and creates a fictional, sexual, and objectified version of them for the user’s pleasure. If the user begins to confuse this fictional person with the real person, the user

may even go as far as to assume that they consent to sexual acts because of what they saw on

screen. Essentially, deep fake porn could lead to rape in reality. Sexual fantasies are much less

egregious than deepfake porn, as there is no tangible evidence that they ever occurred, but the

same concerns about deep fake porn can be applied to sexual fantasies about real people. The difference with deep fake porn is that we could, at least in theory, prosecute the proliferators of this heinous material.


For some opponents of the regulation of deep fake pornography, the possible

infringement of free speech outweighs the benefits that come from taking this media off the

internet. For example, as Alex Barber writes in his legal article “Freedom of expression meets

deepfakes”, “categorising [deep fake pornography] as harassment or defamation would quickly

and easily make suppression compatible with legitimate freedom-of-expression

principles” (Barber). Freedom of expression is a wonderful civil right provided to inhabitants of the United States, and it has led to many impactful movements such as the Civil Rights

Movement. Barber and other opponents of the regulation of deep fake pornography worry that if

deep fake pornography distribution is regulated, the government will use this regulation as a

precedent to regulate other forms of free expression. This is a valid concern, as court case
precedents can lead to a slippery slope and unintended consequences. Barber further argues that

sharing deepfake pornography may not even harm women. He states “some pornography may or

may not subordinate women as such (over and above harming the individuals

depicted)” (Barber). Barber warns against knee-jerk reactions to deep fake pornography and

encourages proponents of regulation to read empirical studies which detail how much harm is

done to subjects of deepfake pornography.

However, I believe that, overall, Barber is overcomplicating the issue. Our “knee-jerk reactions”, as Barber would call them, are real reflections of our moral compass and our inherent

desire to seek the good and banish the evil. They can be a truly valuable tool for lawmakers who

hope to do the least harm. Additionally, speech can be regulated without infringing on free

speech as a whole. For example, it is illegal to make death threats or to scream “Fire!” in a crowded building. Even more relevantly, it is illegal to post revenge porn online. Attorney Deborah C. England writes, “California is one of 42 states (and the District of Columbia) that explicitly outlaw nonconsensual pornography” (England). In almost the entirety of the country,

citizens are simultaneously disallowed from posting nude photos of a person without their

consent, while still being allowed to set up encampments in universities in order to “Free

Palestine”. Judges have the ability to see nuance in issues; precedents are not so cut and dried that a single one could dismantle United States citizens’ constitutional rights. The argument, based on empirical studies, that pornography does not subjugate women ignores a gross amount of evidence from non-empirical observations. First of all, just because an empirical study makes an

assertion does not mean what it is asserting is correct. Data can be manipulated in many ways,

such as by asking leading questions on surveys or by only surveying narrow demographic groups.

Companies that stand to benefit, such as those who distribute deep fake pornography, have an
incentive to fund these studies. Studies on whether or not sexual activity harms children conflict

on the issue, but that does not mean that children should be exposed to sexual activity.

Just as we can tell by basic biological facts that children should not be having sex, we can

tell by basic biological and moral principles that consent is necessary for even someone’s

likeness to be used in deep fake pornography. Sex is a highly vulnerable act, especially for a

woman. Because it naturally leads to the woman carrying a baby for nine months, women are

generally very selective with their sexual partners. Sex carries a special meaning for humans,

because it is the only act which allows the participants to create a new life. This is why rape is

wrong, considered by most to be worse than assault and battery, even if the woman’s body is not

harmed. When a rapist sexually assaults a woman, he would, in the natural course of things, essentially take away the woman’s right to choose when she becomes pregnant, and she would be forced to propagate his DNA even if she feels it is not worth propagating. This is why sex has historically been reserved for the confines of marriage: it is an act that has grave, although

wonderful, consequences. Deep fake pornography leads to people lusting over a woman and

objectifying her without her consent. It ignores the biological reality of sex: that it is historically

private and consensual. The privacy is gone because the material is all over the internet, and most of the time, consent is the least of the creator’s concerns. This kind of AI-generated content is an

abomination, an affront to the dignity of the human person, and a slandering of the once-marital

act.

The non-consensual sharing of deep fake porn is obviously an epidemic which needs to

be quashed. However, some difficulties arise around the concept of prosecuting criminals who

spread deep fake pornography. Thankfully, law enforcement is not without recourse. As

discussed in the paragraphs above, deep fake pornography causes significant emotional and
psychological harm to those who become non-consensual subjects of it. Unfortunately, due to

widespread online anonymity, law enforcement is often unable to track down users who share this kind

of content. Though the actual perpetrators who share deepfake porn may not be feasibly

locatable, the explicit material itself can be detected and removed by AI deepfake detection software. Such software would theoretically have the ability to pinpoint deepfake pornography so it can be removed from social media platforms like Twitter. Artificial intelligence technology is rapidly improving, so this detection software is reasonably realistic to implement. It would take a large amount of money and time to hire programmers to create such a technology, but the foundations of such an application are already in place. Early versions of this technology already exist. Gerrit De Vynck, author of the

Washington Post article “The AI deepfake apocalypse is here. These are the ideas for fighting it”

writes, “Some companies, including Reality Defender and Deep Media, have built tools that

detect deepfakes based on the foundational technology used by AI image generators” (De

Vynck). Like other forms of AI, this detection software will need to be consistently improved as

the software that creates deepfakes improves. This technology will be expensive to maintain, but

it will be worth the investment if law enforcement and social media companies care about

protecting women.

When society loses sight of the value of the human person, everything else falls

apart. When women are turned into objects, meant to be used for the gratification of others, we

become no better than animals. When one of the most important biological functions in the

history of the world can be substituted with a computer screen, we can know that the train has

come off the track. Deep fake porn is just a symptom of a greater issue: an innate human desire

to take and to use. Excused by the Kantian standards that promote the idea of no objective
reality, we become free to do whatever we want, unbound by the chains of morality or “human

decency”. In fact, we are to consider ourselves as no different from animals, and the idea of

“human decency” as a whole is just a pipe dream, a “social construct” as one may call it. We

have a responsibility to bring light to the realities of deep fake porn, and protect a woman’s right

to remain clothed. After all, we should control our desires, not the other way around.
