
Week 6: Governing the Internet – Content Moderation and Online Harms

Dr Joanne Gray
Digital Cultures/MECO
We acknowledge the tradition of
custodianship and law of the Country on
which the University of Sydney campuses
stand. We pay our respects to those who
have cared and continue to care for Country.
Content Moderation
The old way (Source: Home Care Assistance) vs the new way (Source: MemeGenerator)
The University of Sydney 3
Part 1: Why Free Speech is Tricky
“The free communication of ideas and opinions is one of the most
precious of the rights of man. Every citizen may, accordingly, speak,
write, and print with freedom, but shall be responsible for such
abuses of this freedom as shall be defined by law.”
Declaration of the Rights of Man and the Citizen, France, 1789

“Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof; or abridging the freedom of speech, or of the press; or the right of the people peaceably to assemble, and to petition the Government for a redress of grievances.”
First Amendment, Constitution of the United States, 1791

Key principles underpinning freedom of speech: rights of the individual
• Protection of the individual against the tyranny of the state or the majority
• The “right to be wrong”, or to hold a dissenting or heretical view
• Truth as emerging from dialogue among free individuals – “sunlight as the best disinfectant”
• “That the only purpose for which power can be rightfully exercised over any member of a civilized community, against his will, is to prevent harm to others. His own good, either physical or moral, is not a sufficient warrant ... Over himself, over his own body and mind, the individual is sovereign.” (John Stuart Mill, On Liberty, 1859)

Key principles underpinning freedom of speech: media as watchdog/the ‘Fourth Estate’
• Term ‘Fourth Estate’ first used by British MP Edmund Burke in 1787
• The other three ‘Estates’ were the Lords/nobility/crown, the Church, and the legislature (in the UK, the House of Commons)
• Idea that a press free of government interference can give voice to ‘the people’ and their concerns
• A free press also functions as a watchdog on the exercise of power
• The three ‘estates’ today are the Executive, the Judiciary and the Legislature
• Some authors have referred to the Internet as a ‘Fifth Estate’ – the exercise of networked online scrutiny over the media as well as government (e.g. Wikileaks)
• Bill Dutton (2009), ‘The fifth estate emerging through the network of networks’, Prometheus, 27(1), pp. 1–15

Key principles underpinning freedom of speech: the public sphere
• Jürgen Habermas (1962), The Structural Transformation of the Public Sphere (translated into English, 1989) – see also Habermas, ‘The Public Sphere’, New German Critique, 1974
• The public sphere as an open ‘space’ between the state and the citizenry (the private sphere) where freedom of assembly, expression and association are guaranteed, and where open deliberation can occur
• Civil society as a ‘third space’ alongside government and business
• Early capitalism promoted the ‘bourgeois public sphere’ of coffee houses and debating societies – this became the modern press

Complications/limitations in practice
• Harms principle
• Harm to others (e.g. defamation laws – I can’t call you a Nazi without proof)
• Harm to society (e.g. well-being of children, promotion of crime or violence)
• Public order/national security
• Sedition laws continue to exist – limits on calls to bring down the government
• Statements that may endanger others e.g. promotion of terrorist acts
• Rights in conflict – where to draw the line between harm and speech
• Do individuals have the right to make racist statements?
• What is the balance between public responsibilities and private rights?
• Offensiveness
• Restrictions on public circulation of sexually explicit content (pornography)
• Censorship and classification – different balances on public availability of content

Complications in practice
• Responsible speech
• Do people have obligations to the wider community in terms of what they say publicly?
• Public interest
• Are particular statements of potential benefit to others?
• Role of media in the wider community
• Should the media, as powerful public institutions, be socially responsible in what they say and do?
• Do people in the media or other key areas of public life (e.g. sports, entertainment) have social responsibilities?

Rugby player Israel Folau. Source: The New Daily
Singer Guy Sebastian. Source: Pedestrian TV.

Changing political economy of speech
• Development of mass communication
• As media takes an increasingly mass form, there is concern about the power of mass communications to manipulate public
opinion
• Concentration of media ownership
• As media become larger, they are increasingly owned by large corporations (media barons) with their own commercial
interests
• Professionalisation of journalism
• Journalists bring their own “news values” to reporting which may distort public opinion – “gatekeeper” function
• Media management
• A series of new professions emerge associated with the management of media messages across business, government and
other fields
• Different traditions
• In many non-Western countries, the adversarial model of media/government relations is not seen as optimal from the
perspective of their national development

Conclusion to Part 1
• Underpinning all discussions about content moderation and online harms are long debated philosophical ideas about freedom of
speech and freedom of expression – these ideas were most clearly articulated in the Western Enlightenment
• Key ideas here are:
1. Free individuals have a sovereign right to hold opinions, even if they are unpopular;
2. A free media is vital to holding the powerful to account;
3. Truth and knowledge are best advanced through the free exchange of different opinions in the public sphere.
• Challenges in practice include:
• The harms principle
• Public order and national security
• Rights in conflict
• Offensiveness
• Responsible speech and consideration of others
• Political economy of communications
• Unequal power of different groups of speakers
• Different political and philosophical traditions

Part 2: The Internet, Platforms and Speech

Source: Peter Steiner, The New Yorker, 1993.
The Internet as a free speech platform
• Users engaged as content creators
and not just consumers –
interactive media platforms
• Ease of setting up a new site – end
to channel scarcity
• Unmediated nature of speech and
conversation – no gatekeepers
• Potentially global reach of content
– not tied to national authorities
• Alternative platforms to
mainstream media – greater
diversity of voices
Indymedia was established in Seattle, WA, USA in 1999. Its slogan, and that of similar movements around the world, was “Don’t hate the media, be the media”.
The liberating potential of the Internet for speech
• Howard Rheingold, The Virtual Community (1994) – ‘The technology that makes virtual
communities possible has the potential to bring enormous leverage to ordinary citizens
at relatively little cost’ (Rheingold, 1994, p. 4)
• John Perry Barlow, ‘Declaration of the Independence of Cyberspace’ (1996) - ‘We are
creating a world where anyone, anywhere may express his or her beliefs, no matter how
singular, without fear of being coerced into silence or conformity’.
• Mike Godwin, Cyber Rights (1998) – ‘Give people a modem and a computer and access to the Net, and it’s far more likely that they will do good than otherwise. This is because freedom of speech is itself a good … societies in which people can speak freely are better than societies in which they can’t’ (Godwin, 1998, p. 23)

The University of Sydney 14


But how far did this go?
• LICRA v. Yahoo! (2000) – French court required Yahoo! to block French users’ access to Nazi memorabilia on its site
• Dow Jones v. Gutnick (2002) – Australian High Court found that Dow Jones’s online publication could be subject to Australian defamation law
• China’s Golden Shield Project – initiated in 1998 and completed in
2003 – designed to manage flow of foreign online content into China
• Sometimes also referred to as the “Great Firewall of China”
• “If you open the window for fresh air, you have to expect some flies
to blow in” (Deng Xiaoping)

What do platforms do? They moderate content
• Moderation is ‘essential, constitutional, definitional’ of a platform … ‘everything on a
platform is designed and orchestrated’ (Gillespie, 2018, p.21)
• A platform without governance is not possible; governance is as central to platforms as
are data, algorithms, and interfaces.
• Content moderation is intrinsic to platforms
• It is particularly vital to platforms where user-created content and advertising co-exist,
such as social media platforms (Facebook, YouTube, Twitter etc.)

Why do platforms moderate content? (Tarleton Gillespie, ‘All Platforms Moderate’)
• Removing restricted and illegal content
• Managing the user experience of the platform
• Making the platform suitable for advertisers to place products on
• Avoiding breaches of copyright
• Avoiding public controversies
• Avoiding criticism from governments and politicians
• ‘Whether they want to or not, platforms find that they must serve as setters of norms, interpreters of laws, arbiters of taste, adjudicators of disputes, and enforcers of whatever rules they choose to establish. Having in many ways taken custody of the web, they now find themselves its custodians. The challenge for platforms, then, is exactly when, how, and why to intervene.’ (Gillespie, 2018, p. 5)

But they wish to avoid the appearance of moderating content
• Belief in freedom of speech
• ‘the free speech wing of the free speech party’ (Dick Costolo, Twitter CEO, 2013)
• Concerns about precedent
• Facebook ‘should not be the arbiters of truth. That would be a bad position for us to be in
and not what we should be doing’ (Mark Zuckerberg, Facebook CEO, 2020)
• Concerns about legal action
• Trump sues Twitter, Google and Facebook alleging 'censorship'
• Potential impact on profitability – will users leave?
• Avoiding the appearance of being a ‘publisher’ or a ‘media company’
• ‘Moderation is hard to examine, because it is easy to overlook—and that is intentional. Social
media platforms are vocal about how much content they make available, but quiet about how
much they remove’ (Gillespie, 2018, pp. 6-7)

Moderation as a practice
• Sits between legal requirements and guidelines for use of the platform (negative), and desire to
build user communities that promote ongoing engagement and interaction (positive)
• Different platforms apply their own moderation practices differently
• Platforms such as Reddit may be ‘edgier’ than Facebook
• ‘Free speech’ platforms such as Gab and Parler have emerged as Facebook and Twitter apply content rules more strongly
• The communities that platforms cater to differ greatly – what may be ‘obscene’ or ‘offensive’
in one context may not be in another – challenge of consistent guidelines
• ‘Platforms answer the question of distribution differently from the early web or traditional media.
But they do offer the same basic deal: we’ll handle distribution for you—but terms and conditions
will apply. These terms may be fewer and less imposing, though you may be asked to do more of
the labor of posting, removing, maintaining, tagging, and so on. But the platform still acts as a
provider.’ (Gillespie, 2018, p. 16)

Conclusion to Part 2
• The Internet was originally envisaged as a free speech platform – a ‘technology of freedom’ – and that idea remains powerful
• This has always been challenged in practice by different laws and different political and cultural traditions
• The platformisation of the Internet has seen content moderation become increasingly important
• Platforms moderate content for legal and ethical reasons, but also to enhance the overall user experience
• Different platforms apply different rules to content moderation – they largely set these rules themselves, subject to criminal law

Part 3: The Growing Pressures on Content Moderation

Source: The Washington Post, 2020.
Content moderation and community management
• Content moderation has its roots in community management, but has been scaled up as
social media platforms have become ever larger
• Public scandals are an important driver of content moderation decisions
• Content moderation was initially concerned with activities that are clearly illegal (e.g. child exploitation)
• In more recent times, it has had to address a more diverse range of sources of online
harms, whether to individuals, groups or social systems

Reddit and #gamergate
• Reddit was founded in 2005. It is an extended bulletin board system, where users are encouraged to join online communities (‘subreddits’) and to upvote/downvote the contributions of other users
• Registering to use the site is easy, and moderators were initially volunteer community managers, allowed considerable latitude in how they managed their communities

Reddit: Dive into anything (formerly ‘The front page of the Internet’)

Reddit and #gamergate
• A series of controversies arose on Reddit over 2013–2015
• 2013: False identification of the Boston marathon bomber on the site; the misidentified person subsequently took his own life
• 2014: Private sexual photos from a celebrity photo hack were distributed through the subreddit The Fappening, as well as on 4chan
• 2014: Women in games, such as developer Zoe Quinn and critic Anita Sarkeesian, were subjected to orchestrated harassment and abuse on Reddit and 4chan
• 2015: Reddit CEO Ellen Pao resigned after less than a year in the role, citing ongoing harassment from the Reddit community as she changed content moderation policies

Reddit and #gamergate
• Adrienne Massanari, “#Gamergate and The Fappening: How Reddit’s algorithm, governance, and culture support toxic technocultures”, New Media & Society, 19(3), pp. 329–346
• Massanari identifies the culture of Reddit as a geek/nerd culture, and one that is very male-dominated
• The culture of upvoting and algorithmic sorting is seen as promoting cultures that are highly resistant to alternative voices, e.g. women gamers
• Such sites also seem to have been recruitment grounds for what would become known as the alt-Right
• Content moderators often struggle to manage these subreddits, both because of algorithmic biases and because they are expected to identify with the community
Content moderation as labour
• Sarah Roberts (2019), Behind the Screen: Content Moderation in the Shadows of Social Media (New Haven, CT: Yale University Press)
• Content moderation became an increasingly important activity in the 2010s, although it has a history as long as the Internet’s
• Commercial content moderation is frequently outsourced by the major digital platform companies
• It has also been moved offshore to lower-wage countries such as India and the Philippines

What commercial content moderation is
• ‘Commercial content moderation is the organized practice of screening user-generated
content posted to internet sites, social media, and other online outlets. The activity of
reviewing user-generated content may take place before the material is submitted for
inclusion or distribution on a site, or it may take place after material has already been
uploaded.
• In particular, content screening may be triggered as a result of complaints about material
from site moderators or other site administrators, from external parties (for example,
companies alleging misappropriation of material they own), or from other users who are
disturbed or concerned by what they have seen and then trigger mechanisms on a site,
an action called the “flagging” of content, to prompt a review by professional
moderators’ (Roberts 2019, p. 34).

What content moderators do
• Content moderators work with a manual or set of guidelines for whether to restrict or
remove content
• Machine-automated detection programs are often used to support the process by auto-
identifying potentially problematic content
• Attempts to fully automate the process have failed
• Content moderators often experience various forms of trauma in the work that they
undertake
• Content moderators have taken industrial action against companies such as Google and
Facebook, and sought compensation for exposure to online harms

The Cleaners (2018 documentary)

Conclusion to Part 3
• Content moderation has its roots in community management, but today is more typically done at industrial scale for the largest social media platforms
• The work is frequently outsourced to third parties or “offshored” to developing countries
• Such work is difficult labour, and often quite traumatic for those involved
• It typically involves a mix of human labour and machine-automated detection – it has been difficult to fully automate such work

Thank you
