
Article

Picturing Diversity: Netflix's Inclusion Strategy and the Netflix Recommender Algorithm (NRA)

Television & New Media, 2023, Vol. 24(3) 281–297
© The Author(s) 2022
Article reuse guidelines: sagepub.com/journals-permissions
DOI: 10.1177/15274764221102864
journals.sagepub.com/home/tvn

Olivia Khoo
School of Media, Film and Journalism, Monash University, Melbourne, VIC, Australia
Email: Olivia.khoo@monash.edu

Abstract
This essay asks two related questions: what is unique about streaming services
(and Netflix specifically) that generates a greater investment in the diversity of
its content, and how does the technology associated with streaming, in particular
algorithmic recommendation systems, facilitate an engagement with diversity and
inclusion? To answer these questions the essay considers the relationship between
Netflix’s Inclusion Strategy, its Recommender Algorithm, and the diversity of its
content, exploring the complex set of relations that exist between the way Netflix
recommends content to its audience and its (perceived) diversity.

Keywords
Netflix, algorithm, diversity, inclusion, race, recommendation system

Introduction
The zeitgeist shift signaled by social movements such as #MeToo, #5050x2020,
#TimesUp, and #OscarsSoWhite marks an acknowledgment by the film and television
industries that they need to do better in terms of improving the representation of, and
respect toward, marginalized groups in front of and behind the camera. In this land-
scape, streaming services are arguably leading developments in the industry in this
area.1 In January 2021, Netflix released the first report of its newly launched Inclusion
Strategy, examining all films and series commissioned by the company between 2018
and 2019 against twenty-two indices of diversity and inclusion, including sexuality,
disability, race, and gender (Myers 2021). It showed many positive gains over this
two-year period in relation to the diversity of on-screen talent as well as creators, pro-
ducers, writers, directors, and cinematographers. The Netflix Inclusion report is
acknowledged by the company as being U.S.-centric, with race and ethnicity data only
collected from Netflix’s U.S. employees, although it is noted that women make up
51.7 percent of Netflix’s global workforce (Myers 2022).
As I write this essay from an Australian context, primarily with access to the
Australian Netflix catalog, the debate around diversity and inclusion in the Australian
screen industries is also growing. Screen Australia’s recent report, Seeing Ourselves:
Reflections on Diversity in TV Drama (2016) notes that the majority of characters we
see on Australian television continue to be straight, able-bodied, and Anglo-Celtic,
despite the increasing diversity of our population and audiences (Australian Bureau
of Statistics 2016). Yet, as the report also observes, there is a “new momentum and
appetite for change across the industry,” particularly in relation to issues of inclusion
and diversity.
Netflix is a global streaming service, and diversity matters. But diversity also
means different things regionally. It involves telling stories that reflect the local popu-
lation, although from the perspective of a global company like Netflix these stories
must also cross borders. After all, the Netflix Inclusion Strategy is not only about
diversity; it is also about reach. The unexpected popularity of the South Korean series
Squid Game on Netflix in 2021 demonstrated how a niche Korean-language series
can become a global hit: Squid Game became the most highly viewed Netflix series
in 94 countries, surpassing Bridgerton, a period drama with a racially diverse cast, as
the most watched Netflix series of all time (Spangler 2021). Where traditional televi-
sion is more conservative in its programming to meet the demands of advertisers,
subscription-based services such as Netflix are thriving by offering diversity, or at
least the semblance of diversity, to their audiences.
This essay seeks to address two related questions: what is unique about streaming
services (and Netflix specifically) that generates a greater investment in the diversity
of its content, and how does the technology associated with streaming, in particular
algorithmic recommendation systems, facilitate an engagement with diversity and
inclusion? To put this another way, are streaming services presenting more diverse
stories than traditional television services or are they simply better at marketing and
promoting to niche audiences? To answer these questions, I consider the relationship
between Netflix’s Inclusion Strategy, its Recommender Algorithm, and the diversity
of its content, exploring the complex set of relations that exist between the way Netflix
recommends content to its audience and its (perceived) diversity.
The essay proceeds with a discussion of the Netflix Recommender Algorithm, a
collective term for a series of proprietary computational tools developed by the com-
pany since the early 2000s. As a subscription-based service, Netflix relies on a loyal
subscriber base; it needs both to grow this membership and to carry a broad cata-
log that will appeal to this audience. In this regard, personalization systems and algo-
rithmic filtering are extremely important to how Netflix curates its content for its
subscribers. The essay then goes on to explore Netflix’s use of a new algorithm in
2017 to deliver personalized artwork to its subscribers, providing, at least on the sur-
face, more diversity of content by imagining (and literally imaging) its catalog in
multiple and alternate ways.
Methodologically, the essay combines an analysis of articles from Netflix’s techni-
cal and research blogs, and company media releases, with news reports and social
media posts about Netflix algorithms, particularly instances where encounters with
these algorithms “are brought to the fore during breakdowns, accidents, and controver-
sies” (Gaw 2021, 7). I provide a semiotic analysis of specific instances of artwork
personalization to show how recommender systems cater to different taste communi-
ties and, in doing so, appeal to a diversity of audiences.

The Netflix Recommender Algorithm (NRA)


Recommendation systems are “algorithmic tools that internet platforms use to identify
and recommend content, products, and services that may be of interest to their users”
(Singh 2020, 6). Algorithms themselves can be defined as “a set of instructions, rules,
and calculations designed to solve problems” (Benjamin 2019, 11). Netflix’s “problem”
is how to appeal to and diversify its audience so as to retain and increase its subscriber
base. Unlike other major streaming services, Netflix’s financial success relies solely
on its ability to attract and retain subscribers. Of its main competitors, Apple TV chan-
nels users into larger media ecosystems, and Amazon Prime Video forms part of a
broader e-commerce entity. Other streaming sites such as YouTube rely on targeted
advertising for their revenue. Netflix, however, has a major incentive to retain and
grow its stable of subscribers. So valuable are its recommendation systems that in
October 2006 the company launched the Netflix Prize, a contest offering US$1m to
the first team to develop a recommendation system capable of predicting movie rat-
ings with at least 10 percent greater accuracy than its existing system, Cinematch.
Concluding on 21 September 2009, the competition drew more than fifty thousand
participants from 186 countries (Hallinan and Striphas 2016).2
Central to Netflix’s brand and business is the Netflix Recommender System (NRS),
“a collection of proprietary algorithms used to recommend content to users and per-
sonalize nearly every aspect of their experience on the platform” (Pajkovic 2021, 3).
The NRS is responsible for approximately 80 percent of total hours streamed on Netflix,
with the remaining 20 percent from search. The combined impact of personalization
and recommendation was valued at an estimated one billion dollars per year in reve-
nue as at 2015 (Gomez-Uribe and Hunt 2015, 5).
The NRS uses a combination of content-based filtering and collaborative filtering
algorithms to recommend content. Content-based filtering relies on a user's individu-
alized past data (their viewing history, scrolling behavior, and watch time) and makes
recommendations similar to those a user has previously demonstrated interest in
(Pajkovic 2021, 3). Collaborative filtering relies on larger trends among Netflix’s
global users, making recommendations based on the interests and preferences of other
users identified as having similar tastes (Singh 2020, 8–9). Previously, Netflix relied
on collaborative filtering data from users in a specific country or region; now recom-
mendations are gathered from algorithmically grouped “taste communities” around
the world, of which there are currently over two thousand (Pajkovic 2021, 4). As
Netflix seeks to commission original content around the world, its recommender algo-
rithms have been employed at global scale, sharing data across the more than 190
countries Netflix operates in (Pajkovic 2021, 3).
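
To make the distinction between the two filtering approaches concrete, the sketch below gives a minimal Python illustration. The catalog, tags, viewing histories, and scoring are invented for the example and bear no relation to Netflix's proprietary models, which operate over vastly richer behavioral data.

    from collections import Counter

    # Toy catalog: each title carries descriptive tags, loosely analogous to
    # the "micro genre" labels discussed later in the essay.
    CATALOG = {
        "Bridgerton": {"period-drama", "romance", "diverse-cast"},
        "The Crown": {"period-drama", "british", "drama"},
        "Squid Game": {"korean", "thriller", "survival"},
        "Nailed It!": {"baking", "comedy", "competition"},
        "Set It Up": {"romance", "comedy", "workplace"},
    }

    def content_based(history, k=2):
        """Recommend unwatched titles whose tags overlap most with the user's history."""
        profile = Counter(tag for title in history for tag in CATALOG[title])
        scores = {title: sum(profile[tag] for tag in tags)
                  for title, tags in CATALOG.items() if title not in history}
        return sorted(scores, key=scores.get, reverse=True)[:k]

    def collaborative(history, other_users, k=2):
        """Recommend titles watched by users whose histories overlap with this user's."""
        scores = Counter()
        for other in other_users:
            overlap = len(set(history) & set(other))
            for title in other:
                if title not in history:
                    scores[title] += overlap
        return [title for title, _ in scores.most_common(k)]

    my_history = ["Bridgerton"]
    taste_community = [["Bridgerton", "The Crown"], ["Bridgerton", "Squid Game"], ["Nailed It!"]]
    print(content_based(my_history))                   # based on what a title is like
    print(collaborative(my_history, taste_community))  # based on who else watched it

In practice the two signals are combined and weighted across a far richer set of behavioral features, but the sketch captures the basic asymmetry: content-based filtering looks at what a title is, while collaborative filtering looks at who else has watched it.
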
The level of detail ascribed to the data Netflix collects is key to the accuracy of its
recommendations. Each user’s experience of the Netflix homepage is algorithmically
generated, including the rows of titles displayed, the ordering of those rows, as well as
suggested titles (“Because You Watched” rows) (Gomez-Uribe and Hunt 2015, 13:04).
The NRS thus consists of not one but a number of different algorithms to personalize
a user’s experience of the platform. For the remainder of this essay, I focus on a spe-
cific algorithm which recommends personalized artwork to consider Netflix’s engage-
ment with diversity through its “picturing” of diversity. What counts as “diversity”
differs between countries according to the representation of social, linguistic, and cul-
tural diversity at a local level. In this essay, diversity of content is viewed from the
perspective of Netflix’s American and Australian catalogs. While studies of cultural
diversity tend to focus on how accurately representations on screen reflect society
(Turner 2020, 20), this needs to be combined with a consideration of the structures
and people that make up a media organization, and the media's platform-specific
affordances.

Personalized Artwork
In December 2017 Netflix reported the use of a new algorithm to deliver personalized
artwork to its subscribers. This algorithm is capable of choosing a particular image out
of several different versions to best support a given recommendation (Gomez-Uribe
and Hunt 2015, 13:05). Previously, images on the Netflix site were limited to official
promotional material supplied by a production company, namely movie posters or
DVD cover art (Brincker 2021, 87). However, internal research showed that thumb-
nails of promotional posters were the major influencer on what to watch while brows-
ing, with 82 percent of audiences deciding on the basis of these images (Nelson 2016).
Nick Nelson, Head of Product Creative at Netflix, notes that users spend an average of
1.8 seconds considering thumbnail images of each title before deciding whether to
view it or move on. This means there is only a very short amount of time to capture a
user’s interest. Nelson (2016) observes, “we know that if you don’t capture a mem-
ber’s attention within ninety seconds, that member will likely lose interest and move
onto another activity. Knowing we have such a short time to capture interest, images
become the most efficient and compelling way to help members discover the perfect
title as quickly as possible.”3 The personalized artwork algorithm is central to what
Brincker (2021) calls Netflix’s “data-driven analytics to optimize the power of
pictures” (p. 87).
The Netflix personalization algorithms are tested primarily through what are
referred to as A/B tests. These tests measure the effectiveness of recommendation
variants by comparing control and experimental groups of Netflix users (Gomez-Uribe
and Hunt 2015, 13:09). Each group receives alternate recommendations, and user
engagement and subscriber retention rates are correlated with various algorithm vari-
ants. In the case of personalized artwork, the A/B testing model tracks which images
receive the most clicks in different markets (Netflix 2016). The online learning frame-
work supporting the use of A/B testing in relation to artwork personalization involves
“contextual bandits.” Netflix engineers Chandrashekar et al. (2017) explain: “Rather
than waiting to collect a full batch of data, waiting to learn a model, and then waiting
for an A/B test to conclude, contextual bandits rapidly figure out the optimal personal-
ized artwork selection for a title for each member and context.” The contextual bandit
model selects the artwork most likely to elicit engagement for each Netflix user based
on their viewing context.
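
As a rough illustration of the bandit logic described above, the Python sketch below uses a simple epsilon-greedy policy to pick, per viewing context, the artwork variant with the best observed engagement. The contexts, image file names, and simulated rewards are invented; Netflix's production system, as Chandrashekar et al. (2017) describe it, uses considerably more sophisticated policies and contextual features.

    import random
    from collections import defaultdict

    class ArtworkBandit:
        """Epsilon-greedy stand-in for a contextual bandit: per viewing context,
        learn which artwork variant of a title tends to earn the most engagement."""

        def __init__(self, variants, epsilon=0.1):
            self.variants = variants    # candidate thumbnails for a single title
            self.epsilon = epsilon      # fraction of impressions used for exploration
            self.shows = defaultdict(lambda: defaultdict(int))      # context -> variant -> impressions
            self.rewards = defaultdict(lambda: defaultdict(float))  # context -> variant -> summed reward

        def choose(self, context):
            """Mostly exploit the best-known variant for this context; occasionally explore."""
            if random.random() < self.epsilon:
                return random.choice(self.variants)
            def mean_reward(variant):
                n = self.shows[context][variant]
                return self.rewards[context][variant] / n if n else 0.0
            return max(self.variants, key=mean_reward)

        def update(self, context, variant, reward):
            """Record the engagement observed after showing a variant in a context."""
            self.shows[context][variant] += 1
            self.rewards[context][variant] += reward

    bandit = ArtworkBandit(["ensemble_cast.jpg", "lead_couple.jpg", "supporting_cast.jpg"])
    for _ in range(2000):  # simulated impressions
        context = random.choice(["watches_romance", "watches_comedy"])
        shown = bandit.choose(context)
        preferred = "lead_couple.jpg" if context == "watches_romance" else "ensemble_cast.jpg"
        bandit.update(context, shown, 1.0 if shown == preferred and random.random() < 0.6 else 0.0)

    bandit.epsilon = 0.0  # switch off exploration to report the learned favorites
    print(bandit.choose("watches_romance"))  # tends toward "lead_couple.jpg"
    print(bandit.choose("watches_comedy"))   # tends toward "ensemble_cast.jpg"
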
While the personalization of promotional images is not new and different film ter-
ritories often have localized marketing campaigns, in the case of Netflix this material
is decided by algorithms in real time. As Lobato (2019) notes, "Netflix users do not
experience the catalog as a static list or schedule, but rather as a series of interactive,
personalized recommendations” (p. 251). Beyond appealing to users to simply click
on an inviting image, Netflix also examines quality of engagement to avoid learning a
model that recommends “clickbait” images: “ones that entice a member to start play-
ing but ultimately result in low-quality engagement” (Chandrashekar et al. 2017).
Improving engagement (the length of time viewing Netflix content) is “strongly
correlated with improving retention” (Gomez-Uribe and Hunt 2015, 13:09).
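
One way to make the distinction between raw clicks and "quality of engagement" concrete is as a reward function that discounts plays abandoned almost immediately. The sketch below, with invented thresholds, is intended only to illustrate the idea rather than to reproduce Netflix's metric; a reward of this kind could stand in for the value fed back to the bandit's update step in the previous sketch.

    def engagement_reward(clicked, minutes_watched, title_runtime_minutes):
        """Reward that discounts 'clickbait' plays: a click only counts if it leads
        to meaningful watch time relative to the title's length."""
        if not clicked:
            return 0.0
        completion = min(minutes_watched / title_runtime_minutes, 1.0)
        # A play abandoned almost immediately earns nothing, so an image that
        # entices clicks but misrepresents the title is not reinforced.
        return completion if completion >= 0.1 else 0.0

    print(engagement_reward(True, 2, 60))   # 0.0  -- clickbait-style abandonment
    print(engagement_reward(True, 45, 60))  # 0.75 -- sustained engagement
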
Netflix’s personalized recommendations and personalized visuals mean that it
can respond to diverse audiences in a more agile way, but does this translate to an
active engagement with diversity and inclusion? Netflix argues that its online learn-
ing algorithms or contextual bandits are “learning an unbiased model on an ongoing
basis” (Chandrashekar et al. 2017). However, is this model truly “unbiased”?
Hallinan and Striphas (2016) point to Netflix’s “complex alchemy of audiovisual
matchmaking” (p. 117). This “complex alchemy” is generally not demystified or
publicized to Netflix users. In fact, Netflix’s personalization of artwork was only
brought to wider public attention in 2018 when some users began noticing the use of
minor supporting Black cast members in promotional thumbnails, drawing attention on social
media to this perceived targeting of content based on race.
On 18 October 2018, writer and podcaster Stacia L. Brown (@slb79) tweeted:
“Other Black @netflix users: does your queue do this? Generate posters with the
Black cast members on them to try to compel you to watch?” Brown was commenting
on a personalized promotional image of Lauren Miller Rogen’s Like Father (2018),
starring Kristen Bell, Kelsey Grammer and Seth Rogen. The thumbnail on Brown’s
page, however, featured Black cast members Leonard Ouzts and Blaire Brooks, both
of whom have only minor roles, or what Brown notes was “10 cumulative minutes of
screen time [and] 20 lines between them, tops.”4
Other users followed by posting their own examples of image personalization on
social media. Tobi Aremu, a Brooklyn-based filmmaker, reported to The Guardian that
the romantic comedy Set It Up (Claire Scanlon, 2018) was promoted to him with an
image of supporting cast members Lucy Liu (who is Asian American), and Taye Diggs
(who is African American), instead of principal cast members, Zoey Deutch and Glen
Powell, both of whom are white (Sharf 2018). British romantic comedy Love Actually
(Richard Curtis, 2003), with a predominantly white ensemble cast, has also been
reported as featuring a promotional poster foregrounding Black actor Chiwetel Ejiofor
as the romantic lead (Andrews 2018).
Perhaps the most recent high-profile example, and a case of “reverse” profiling,
resulted in Nicole Byer, host of popular Netflix baking show Nailed It! speaking out.
Byer was contacted by a fan who sent her a promotional poster of Nailed It! featuring
only her two white, male co-stars—co-host and chef Jacques Torres and show assistant
director Wes (Shepherd 2019). In a series of since-deleted Tweets, Byer wrote:

If Netflix didn’t sign my checks and give me a huge platform and opportunity to showcase
my comedy, I would talk about how f***** up and disrespectful this is to me a black
woman. . . . I would talk about how this essentially whitewashing for more views. But
they sign my checks and I’m honestly so happy and grateful to and for the show and no
sarcasm I love my job and wish to keep it so I’ll be quiet (Shepherd 2019).

Byer explains her reason for deleting the tweets: “I talked to one of the execs on my
show about it and the thrilling conclusion is the removal of the image and a conversa-
tion about how the thumbnails are made and selected that I’m happy with”
(@nicolebyer, 29 May 2019).
Netflix has denied using race or other markers of identity as factors in its personaliza-
tion data: “We don’t ask members for their race, gender, or ethnicity, so we cannot use
this information to personalize their individual Netflix experience. The only information
we use is a member’s viewing history” (Iqbal 2018). Rochelle King, Netflix’s vice presi-
dent of product creative, adds, “In general, a person’s race, gender or ethnicity is not a
great indicator of what that person will actually enjoy watching. Time after time, we
have seen that great stories transcend borders and that an individual’s tastes are complex
and multifaceted, going well beyond basic demographic attributes” (Nguyen).5 And yet
these “basic demographic attributes,” including race and ethnicity, are seemingly mapped
onto “taste communities,” creating the potential for filter bubbles.
Alexander (2016) points to a contradiction between “the notion that we have
reached an ‘on-demand utopia’ in which we are finally free to develop our own taste,
and the neoliberal reality of filter bubbles” (p. 94). She argues that Netflix’s personal-
ization system disempowers users and encourages instant gratification based on exist-
ing preferences: “we are no longer serendipitously exposed to [new and unfamiliar]
films” (p. 94). Bucher (2018) affirms: “What we see is no longer what we get. What
we get is what we did and that is what we see” (p. 2). Algorithmic biases are therefore
evident “both in the content that is available to us and in what is not recommended”
(Siles et al. 2019, 511). Described as “deceptive,” “creepy,” “misrepresentative,” and
creating the “clickbait” culture it seeks to avoid, personalized imagery “[plays] on the
desire for Black content (recognized by the Netflix algorithm) and then utilized by
Netflix (in their clickbait image production) to serve more white mainstream American
culture” (Brincker 2021, 90).
Algorithmic Cultures and Bias


The transparency and accountability of algorithms that filter, hierarchize, and recom-
mend have been scrutinized in terms of their implications and real-world consequences
including discrimination and the reproduction of existing power structures (Rieder
et al. 2018, 51). Theorists such as Benjamin (2019) have argued convincingly that
racial and other biases are built into algorithms, perpetuating the biases of the pro-
grammers behind them: “bias enters through the backdoor of design optimization in
which the humans who create the algorithms are hidden from view” (Benjamin 2019,
11). In Race After Technology, Benjamin (2019) explains what she refers to as the New
Jim Code: “the employment of new technologies that reflect and reproduce existing
inequities but that are promoted and perceived as more objective or progressive than
the discriminatory systems of a previous era” (pp. 5–6).6 While technologies may be
developed to address different forms of discrimination or bias, they may well end up
reproducing them (Benjamin 2019, 47).
Noble (2018) has also highlighted how algorithms “reinforce oppressive social rela-
tionships and enact new modes of racial profiling,” which she terms “technological
redlining” (p. 1). Noble writes:

Part of the challenge of understanding algorithmic oppression is to understand that
mathematical formulations to drive automated decisions are made by human beings.
While we often think of terms such as ‘big data’ and ‘algorithms’ as being benign, neutral,
or objective, they are anything but (p. 1).

Others such as Couldry and Mejias (2019) have established a connection between
algorithms and “data colonialism”: a form of exploitation that “combines the preda-
tory extractive practices of historical colonialism with the abstract quantification
methods of computing” (p. 337). Algorithms construct and enforce regimes of power
and knowledge, which then become further normalized. “In ranking, classifying, sort-
ing, predicting, and processing data, algorithms . . . help to make the world appear in
certain ways rather than others” (Bucher 2018, 3). Siles et al. (2019) caution even
more expansively, “at stake in the establishment of algorithmic recommendation sys-
tems are the conditions for the redefinition of subjectivity and culture itself” (p. 499).
Algorithms can enact and perpetuate racial discourses, including the ways in which
we think about and experience race through our engagement with various media forms
(Joyrich 2009, 2). In the case of artwork personalization, the selection of minor actors
of marginalized races or ethnicities being used to “sell” a program as diverse only
masks the lack of diversity actually being presented, and feeds into pre-conceived
ideas about what an ethnically or racially diverse audience member might want to see
as part of the same “taste community.” This taste community is seemingly correlated
with race by ostensibly “unbiased” algorithms.
On the other hand, some of Netflix’s more progressive and racially diverse series
can end up being “whitewashed” in their promotional imagery. One of Netflix’s most
popular series to date has been Bridgerton, produced by Shonda Rhimes and debuting
on Netflix on 25 December 2020. The series is ground-breaking for many reasons, not
least of which is its incorporation of non-white cast in lead roles in a series set in
Regency era London. Bridgerton was the most watched Netflix series at the time of its
debut and has been renewed for a second season in 2022. It presents an alternate his-
tory of a racially diverse society where the Queen is biracial (Queen Charlotte is
played by Guyanese-British actress Golda Rosheuvel), as is the leading man, the Duke
of Hastings (played by Regé-Jean Page, whose mother is Zimbabwean and whose
father is English). However, in several of the promotional thumbnails for the series,
not a single Black actor is presented, making the series appear as another all-white
Sense and Sensibility or Downton Abbey targeting viewers of traditional British period
dramas. Only one thumbnail features the interracial romance at the heart of the series,
between the leading couple Lady Daphne (Phoebe Dynevor) and the Duke of Hastings.
A thumbnail with a much smaller image of the couple does not clearly show their
faces, or allow viewers to readily ascertain their ethnicity (Figures 1–5).

[Figures 1–5. Artwork personalization of the Netflix series Bridgerton (2020–2022): period drama with an all-white cast, or intercultural romance?]
Whether this promotional strategy leads to more audiences tuning in and having
their expectations of a genre pleasantly disrupted, or whether it leads to disappoint-
ment and lower audience engagement, remains to be seen. In the case of Bridgerton,
the show’s popularity would appear to suggest the former. Nevertheless, I argue that
there is a level at which the ground-breaking and progressive nature of the series is
neutralized (or worse, negated) through its promotional whitewashing, which does not
encourage viewers to make a choice to watch something outside of their comfort zone.
As Joyner (2016) has emphasized in relation to discovering ethnically and racially
diverse content, the problem is “not in what we’re being shown, but in what we’re not
being shown. . . . [I]t’s not until you express specific interest in ‘black’ content that
you see how much of it Netflix has to offer. . . . [T]o the new viewer, whose prefer-
ences aren’t yet logged and tracked by Netflix’s algorithm, ‘black’ movies and shows
are, for the most part, hidden from view.”
A consequence of this “picturing of diversity” is related to the fact that Netflix
produces original films and television shows based on viewing data collected from
audiences. Thus, recommendation systems can also influence decisions about the
kinds of future programs we might see. Programs such as Orange is the New Black
(created by Jenji Kohan and first airing on Netflix in 2013), and The Chair (created by
Amanda Peet and Annie Julia Wyman in 2021), have been leading the way in terms of
foregrounding ethnic and racial diversity in front of and behind the camera. Through
their popularity, it is apparent that there is a market for Netflix to create more diverse
programs for its subscribers.
Concerns over algorithmic bias, particularly in artwork personalization, intersect
with Netflix’s diverse company profile in multifaceted ways. If algorithms can be
programmed for taste reproduction, can they, through a diverse workforce, also be
programmed for diversity?

Netflix: Sowing the Seeds


On 1 August 2017 Netflix launched the #FirstTimeISawMe campaign. Through the
company’s social media channels, Netflix featured short videos with well-known cre-
ative artists including directors Spike Lee and Ava DuVernay, speaking about the
importance of creative control and representation in the media for minority groups. In
another video featuring its own employees, Netflix showcased the diversity of its
staff, asking employees to reflect on the first time they saw themselves reflected in
the media in terms of their race, sexuality or gender. Labeling themselves (in bold
text on screen) as “Gay and Middle Eastern,” “Strong, Outspoken and Dominican,”
and “Christian, a Mom and Asian American” (much like the descriptive tags of its
micro genres) these individuals recall the powerful moments in films and television
programs when they felt themselves truly represented on screen. While the
#FirstTimeISawMe campaign celebrates Netflix’s own inclusive media and diverse
programming, it has circulated more widely on social media to initiate a conversation
about media representation, diversity, and its impact on audiences and creators.
Netflix has made a pointed decision to hire programmers and other employees from
diverse backgrounds. The company has a dedicated Director of Inclusion Recruiting,
Kabi Gishuru, whose job is to train employees in addition to using inclusive hiring
practices to increase the company’s employee diversity, “spotting bias in the interview
process, sourcing candidates in non-traditional ways, and helping hiring managers
identify the perspectives missing on their teams” with the aim to create an
environment, policies and practices “that not only invite people in, but when they get
in they feel there is a level of investment in them” (Myers 2021). In February 2021,
Netflix announced the creation of the Netflix Fund for Creative Equity, which will
invest $100 million over the next five years in organizations that help underrepre-
sented communities train and find jobs in film and television (Sarandos 2021).
In collaboration with organizations that support and promote technical, creative,
and business leaders from under-represented groups, Netflix also builds diverse net-
works to increase its hiring pool with the aim of hiring more inclusively. It has partner-
ships with /dev/color, which connects Black software engineers, technologists, and
executives to companies, Techqueria, which serves the largest global community of
Latinx professionals in the technology industry, and TalentoTotal, a diversity and
inclusion development program in the United States that promotes Afro-Latino and
Indigenous (ADI) people to become business leaders across Latin America. These
efforts at addressing systemic issues that have excluded particular groups from the
entertainment and technology industries arguably have a flow on effect in terms of
how Netflix approaches its content. Chief Content Officer for Netflix, Sarandos (2021)
notes, “inclusion behind the camera exponentially increases inclusion in front of the
camera, and . . . both depend on ensuring that the Netflix executives commissioning
these stories are also diverse.”
In July 2020 Que Minh Luu joined Netflix as Director of Content for Australia and
New Zealand. Luu is a former executive producer for the ABC where she championed
diverse content such as The Heights, Diary of An Uber Driver, and Retrograde. Luu is
the daughter of refugees who fled Vietnam to Australia by boat in the late 1970s. She
also speaks during interviews of having a diversity and inclusion lens: “There needs to
be more points of view contributing to the conversation” (Bizzaca 2021; see also
Tadros 2021). Since launching locally in 2015, Netflix has made over 50 Australian
titles, including co-productions with television networks. As Que Minh Luu notes,
“[Netflix executives] have always been quick to make clear that being inclusive
doesn’t mean that you can’t be commercial” (Bizzaca 2021). Netflix’s focus on diver-
sity and inclusion serves a corporate agenda as much as a social one.
Beyond the corporate speak that “pair[ing] . . . culture with diversity and inclusion
. . . unlocks our ability to innovate, to be creative, to solve problems,” the Netflix
Inclusion report concludes with this simple statement that through diverse stories
“we’re able to better entertain our current and future members” (Myers 2021). In an
upbeat video published on the Netflix website, Vice President of Inclusion Strategy at
Netflix, Myers (2021) comments, “My team’s vision is to equip everyone with a diver-
sity lens, which is to say that as they do their job, they’re thinking about who’s not
here. Are we gathering all the perspectives?” The idea of inclusion “taking root” at
Netflix suggests that the company has invested many years in developing a strategy
that is now firmly implanted as part of its culture and ethos, so much so that Netflix
operates with a “diversity lens” in all aspects of its operations. The key question is how
this “rooted” form of inclusion informs Netflix’s design of its algorithms and trans-
lates to audience engagement (and ultimately retention).
Netflix executives deploy all the appropriate diversity and inclusion language and
are training employees in this language. The company has held workshops on topics of
privilege, bias, intersectionality, and allyship, particularly in years where Black and
Asian communities have been disproportionately affected by the COVID-19 pandemic
and experienced hate crimes (Myers 2021). Yet, as Ahmed and Swan (2006) note,
“One of the primary defences of the language of diversity is that it is more ‘inclusive’,
precisely because it does not name a specific social category (such as gender, race and
class). But what are the terms of this inclusion? Who is included by the term?” (p. 96).
Netflix notes that it needs to improve its recruitment of Latinx, Middle Eastern/North
African, American Indian/Alaskan Native, and Native Hawaiian/Pacific Islander com-
munities, and also its representation of the LGBTQ+ community and characters with
disabilities, which currently make up only 4 percent of leads in film and 1 percent in TV
series, and 1 percent of series leads, respectively (Myers 2021). Netflix has published its
diversity data quarterly on its jobs site since 2013, with a plan to develop a roadmap for
ongoing improvement of diversity and inclusion. Currently, women make up nearly half
of its U.S. workforce (47.1%), including at the leadership level (directors and above:
47.8%). Nearly half of its U.S. workforce (46.4%) is made up of people from one or
more underrepresented racial and/or ethnic backgrounds, including Black, Latinx or
Hispanic, Indigenous, Middle Eastern, Asian, and Pacific Islander backgrounds (Myers
2021). As noted earlier, the Netflix Inclusion report is highly U.S. focused, with inclu-
sion and representation outside the U.S. an area where growth is needed. The final piece
of the Inclusion strategy involves the audience itself. How do audiences engage with
Netflix’s personalization systems and is there room to move within these systems?

Whither the Audience?


Charges of algorithmic bias have been met by two contrasting perspectives when it
comes to the agency of the audience. On one side is the argument that personalization
systems result in the diminished autonomy of audiences. While users are ostensibly
offered a plethora of individualized options and choices, the effect of personalized
recommendations is to reduce agency and autonomy by presenting them with a nar-
rower range of (similar) content (Arnold 2016, 50). If agency is defined as the capacity
to act (or to have acted differently), recommendation algorithms control our ability to
act by hiding certain options from view. Through algorithmic predictions, the NRS
takes actions on behalf of (or away from) the user. As Arnold (2016) notes, “Although
Netflix’s brand identity centers on notions of user choice, its algorithms work to
actively negate choice” (p. 59).
The opacity of Netflix’s recommendation systems is harnessed to the company’s
advantage. There is no real transparency or accountability about how the NRS operates
or makes decisions. Furthermore, Netflix offers users “only a limited set of controls
over how algorithmic decision-making shapes their platform experience” (Singh 2020,
33). Users cannot opt out of personalization features, including, and especially, receiv-
ing title suggestions. The lack of transparency makes it difficult both to analyze and to
counter problematic recommendations derived from these systems (Singh 2020, 6).
The other side of the debate suggests that there is a correlation between the use of
SVOD platforms and an increase in the quantity and diversity of content consumed,
despite recommendation systems offering a narrower range of content over time. As
Limov (2020) suggests, “users are transformed by recommendations, even as their
participation transforms the algorithms in turn” (p. 6307). Limov (2020) observes,
“recommendation systems can be conceptually understood as channelling users’
attention to the (somewhat) unfamiliar, adding a dimension of discoverability that
improves the accessibility of content on streaming platforms” (p. 6307). Gillespie
(2014) adds, “recommendation algorithms map our preferences against others, sug-
gesting new or forgotten bits of culture for us to encounter” (p. 167).
These viewpoints are not polar opposites, and research in this area has been limited
by a lack of knowledge on how users actually respond to recommendation algorithms
on a daily basis or incorporate them into their lives (Siles et al. 2019, 499). In the
Australian context, cultural theorist Turner (2019) signals more broadly that there is a
“gap in our knowledge of how individuals and households consume television, across
platforms and devices, in domestic spaces” (p. 222), and this requires us to adapt
modes of audience research employed in earlier studies of television audiences. It is
not yet clear what kind of impact recommender algorithms can or will have in terms
of transforming film and television cultures and developing greater engagement with
more diverse content. A more balanced approach between the two opposing views
suggests that the relationship between users and algorithms can be framed in terms of
“mutual domestication”:

While algorithms participate in the maintenance and perpetuation of certain cultural
codes, they also learn from them and can be incorporated into our daily lives . . .
differently, shaping culture differently. The relationship between platform, technology
(algorithm) and people is cyclical. We watch on the basis of recommendations, but then
these perpetuate certain other recommendations (Siles et al. 2019, 516).

Within these cyclical processes there are a range of actors, including the programmers
and engineers who develop these systems, the users who interact with them, and the
company that deploys them (Pajkovic 2021, 3).
While resistance against the technical aspects of recommendation systems is not
easily achievable, it may be possible to resist the cultural biases inscribed in Netflix’s
recommendations, “expressed in the constant recommendation of content that users
consider stereotypical” (Siles et al. 2019, 511). Audiences can also actively exercise
agency through their social networks. Recommendations from friends, peers, and
other social networks also shape this relationship (Frey 2019, 167), as the example of
Bridgerton shows, despite the persistent whitewashing of its promotional imagery.
Algorithms thus form part of broader situated practices of sociality. Bucher (2018)
posits the notion of “programmed sociality” as a heuristic. Within certain circum-
scribed parameters, algorithms are dynamic rather than fixed and participate in wider
networks of sociality, both human and non-human (Bucher 2018, 4).
Thus, while it may not be possible to actively circumvent the Netflix recommender
algorithms to see more diverse content on a user’s home page, it might be possible to
resist aspects of personalization through other forms of “programmed sociality.” The
way Netflix users relate to the media they are presented with is not only technologi-
cally determined. While algorithms may express the practices and biases of their
designers, they are also shaped by users’ social and cultural codes, and in the case of
Netflix, by its diverse company profile, at least in the United States.

Conclusion
In this essay I analyzed the relationship between streaming platforms, recommender
algorithms, and cultural diversity to consider the question of whether streaming
services present more diverse stories and facilitate a greater engagement with
diversity and inclusion than traditional “free-to-air” television, or whether they are
simply better at marketing and promoting diversity to audiences. I focused specifi-
cally on the Netflix Recommendation System, and in particular Netflix’s personal-
ization of artwork.
Where free-to-air television has been slow to cater to diverse audiences (with nota-
ble exceptions, for example, in the Australian context, the nation’s public broadcast-
ers, the Special Broadcasting Service (SBS) and the Australian Broadcasting
Corporation (ABC)), other platforms, including streaming services, have to some
degree filled a gap. Competition between a growing stable of streaming services argu-
ably leads to more diversity, as platforms themselves have been required to engage
with different audiences to grow their subscription base. Streaming platforms do not
have the limitations of scheduling that free-to-air television has; however, they must
capture audiences’ attention as soon as they connect to the platform (Ranaivoson 2019,
14). Recommender systems are designed to assist users in making choices based on
their previous selections. This does lead to biases in what is presented, and indeed, in
the case of artwork personalization, how they are presented.
I have presented the argument that recommender algorithms take actions on behalf
of (or away from) users, who are not necessarily given access to the full range of
Netflix’s catalog. While there is greater diversity being presented on the surface, the
question remains whether this results in sustained engagement with what is being
watched, with users transformed by the algorithmic recommendations to extend their
viewing habits, or whether recommender systems merely reinforce existing consumer
tastes, perpetuating consumption of similar products with only the “illusion of diver-
sity” (giving the impression that the Netflix catalog is much bigger than it actually is)
(Ranaivoson 2019, 112). Further audience research needs to be conducted in this area.
Beyond the level of on-screen representation, the value of diversity and inclusion
can be found in having a greater pool of creative talent to draw from, to create stories
that will have relevance to a broader cross section of the population, and to tell more
innovative stories. As the Screen Australia (2016) report Seeing Ourselves: Reflections
on Diversity in TV Drama notes, the lack of diversity and inclusion:
is limiting the relevance of our industry and our most popular forms of cultural expression.
It is having commercial implications, as audiences seek relevant content elsewhere in
material produced overseas. And it is undermining our ability to innovate and connect
with the storytelling potential of our increasingly diverse population.

Netflix has acknowledged the value of a diverse workforce by placing its Inclusion
strategy at the heart of its operations. From a cultural perspective, we can also point to
a broader shift in audiences’ media consumption toward more diverse content; that is,
reflecting the progressiveness of contemporary audiences and their commitment to
diversity and to a more accurate representation of the make-up of their society.
The power of algorithms, as Bucher (2018) notes, is “through the kinds of encoun-
ters and orientations algorithmic systems seem to be generative of” (p. 3). Algorithms
are relational, processual, and cultural (Striphas 2015). The picturing of diversity may
for now be algorithmically generated but more diverse content is slowly making an
appearance if you know where to find it (or it will eventually find you).

Declaration of Conflicting Interests


The author declared no potential conflicts of interest with respect to the research, authorship,
and/or publication of this article.

Funding
The author received no financial support for the research, authorship, and/or publication of this
article.

Notes
1. The Horowitz State of Viewing & Streaming study (Horowitz Research 2020) reports that
55 percent of viewers surveyed find that streaming services such as Netflix, Hulu, and
Disney Plus do better at showcasing stories by and about people of color than broadcast
and cable. Numbers are higher among viewers from under-represented groups: 58 per-
cent of African-Americans, 61 percent of Hispanics, and 64 percent of Asian-Americans
(Umstead 2020, 18).
2. The winning team, BellKor’s Pragmatic Chaos, was able to improve results by 10.06 percent.
3. See also Krishnan (2016). Vice President of Product innovation and Personalization algo-
rithms, Carlos Gomez-Uribe, said that the company employs more than 800 engineers
responsible for developing recommendation algorithms and maintaining the personaliza-
tion of the site (Novak 2017, 163).
4. https://twitter.com/slb79/status/1052776984231718912?ref_src=twsrc%5Etfw%7Ctwca
mp%5Etweetembed%7Ctwterm%5E1052776984231718912%7Ctwgr%5E%7Ctwcon%
5Es1_c10&ref_url=https%3A%2F%2Fwww.buzzfeednews.com%2Farticle%2Fnicolengu
yen%2Fnetflix-recommendation-algorithm-explained-binge-watching
5. https://www.buzzfeednews.com/article/nicolenguyen/netflix-recommendation-algorithm-
explained-binge-watching.
6. Based on Michelle Alexander’s thesis in The New Jim Crow, Benjamin examines a shift
from “explicit racialization to a colorblind ideology that masks the destruction wrought by
the carceral system . . .” (p. 9).
References
Ahmed, Sara, and Elaine Swan. 2006. “Doing Diversity.” Policy Futures in Education 4 (2):
96–100.
Alexander, Neta. 2016. “Catered to Your Future Self: Netflix’s ‘Predictive Personalization’ and
the Mathematization of Taste.” In The Netflix Effect: Technology and Entertainment in the
21st Century, edited by K. McDonald and D. Smith-Rowsey, 81–97. London: Bloomsbury
Academic.
Andrews, Simon. 2018. “Netflix Said to be Deceiving Black Users With ‘Creepy’ Posters.”
Screen Geek, October 21. https://www.screengeek.net/2018/10/21/netflix-posters-deceiv-
ing-black-users/.
Arnold, Sarah. 2016. “Netflix and the Myth of Choice/Participation/Autonomy.” In The Netflix
Effect: Technology and Entertainment in the 21st Century, edited by K. McDonald and
D. Smith-Rowsey, 49–62. London: Bloomsbury Academic.
Australian Bureau of Statistics (ABS). 2016. “Census.” https://www.abs.gov.au/statistics.
Benjamin, Ruha. 2019. Race After Technology. New York: Polity.
Bizzaca, Caris. 2021. “Podcast – How to Pitch to Netflix ANZ.” Screen Australia, May 21.
https://www.screenaustralia.gov.au/sa/screen-news/2021/05-21-podcast-how-to-pitch-to-
netflix-anz.
Brincker, Maria. 2021. “Disoriented and Alone in the ‘Experience Machine’ – On Netflix, Shared
World Deceptions and the Consequences of Deepening Algorithmic Personalization.”
SATS 22 (1): 75–96.
Bucher, Taina. 2018. If. . .Then: Algorithmic Power and Politics. Oxford: Oxford University
Press.
Chandrashekar, Ashok, Fernando Amat, Justin Basilico and Tony Jebara. 2017. “Artwork
Personalization at Netflix.” Netflix Technology Blog, December 8. https://netflixtechblog.
com/artwork-personalization-c589f074ad76.
Couldry, Nick, and Ulises A. Mejias. 2019. “Data Colonialism: Rethinking Big Data’s Relation to
the Contemporary Subject.” Television and New Media 20 (4): 336–349.
Frey, Mattias. 2019. “The Internet Suggests: Film, Recommender Systems, and Cultural
Mediation.” JCMS: Journal of Cinema and Media Studies 59 (1): 163–9.
Gaw, Fatima. 2021. “Algorithmic Logics and the Construction of Cultural Taste of the Netflix
Recommender System.” Media Culture & Society. Published electronically October 25
2021. doi:10.1177/01634437211053767.
Gillespie, Tarleton. 2014. “The Relevance of Algorithms.” In Media Technologies: Essays on
Communication, Materiality, and Society, edited by T. Gillespie, P. J. Boczkowski, and
K. A. Foot, 167–93. Cambridge, MA: The MIT Press.
Gomez-Uribe, Carlos A., and Neil Hunt. 2015. “The Netflix Recommender System: Algorithms,
Business Value, and Innovation.” ACM Transactions on Management Information Systems
6: 1–19.
Hallinan, Blake, and Ted Striphas. 2016. “Recommended for You: The Netflix Prize and the
Production of Algorithmic Culture.” New Media & Society 18 (1): 117–37.
Horowitz Research. 2020. “State of Viewing and Streaming 2020.” https://www.horowitzre-
search.com/syndicated-research/2020-studies/state-of-viewing-streaming-2020/.
Iqbal, Nosheen. 2018. “Film Fans See Red Over Netflix ‘Targeted’ Posters for Black Viewers.”
The Guardian, October 21. https://www.theguardian.com/media/2018/oct/20/netflix-film-
black-viewers-personalised-marketing-target.
Joyner, April. 2016. “Blackflix: How Netflix’s Algorithm Exposes Technology’s Racial Bias.”
Marie Claire, February 29. https://www.marieclaire.com/culture/a18817/netflix-algo-
rithms-black-movies/.
Joyrich, Lynne. 2009. “Preface: Bringing Race and Media Technologies Into Focus.” Camera
Obscura: Feminism Culture and Media Studies 24 (1): 1–5.
Krishnan, Gopal. 2016. “Selecting the Best Artwork for Videos Through A/B Testing.” Netflix
Technology Blog. https://netflixtechblog.com/selecting-the-best-artwork-for-videos-
through-a-b-testing-f6155c4595f6.
Limov, Brad. 2020. “Click It, Binge It, Get Hooked: Netflix and the Growing U.S. Audience for
Foreign Content.” International Journal of Communication 14: 6304–6323.
Lobato, Ramon. 2019. Netflix Nations: The Geography of Digital Distribution. New York, NY:
New York University Press.
Myers, Vernā. 2021. “Inclusion Takes Root at Netflix: Our First Report.” January 13. https://
about.netflix.com/en/news/netflix-inclusion-report-2021
Myers, Vernā. 2022. “Our Progress on Inclusion: 2021 Update.” February 10. https://about.
netflix.com/en/news/our-progress-on-inclusion-2021-update
Nelson, Nick. 2016. “The Power of a Picture.” Netflix. https://about.netflix.com/en/news/the-
power-of-a-picture.
Noble, Safiya Umoja. 2018. Algorithms of Oppression: How Search Engines Reinforce Racism.
New York, NY: New York University Press.
Novak, Alison. 2017. “Narrowcasting, Millennials and the Personalization of Genre in Digital
Media.” In The Age of Netflix: Critical Essays on Streaming Media, Digital Delivery and
Instant Access, edited by C. Barker and M. Wiatrowski, 162–81. Jefferson, NC: McFarland
& Company.
Pajkovic, Niko. 2021. “Algorithms and Taste-Making: Exposing the Netflix Recommender
System’s Operational Logics.” Convergence: The International Journal of Research into
New Media Technologies 28: 214–35.
Ranaivoson, Heritiana. 2019. “Online Platforms and Cultural Diversity in the Audiovisual
Sectors: A Combined Look at Concentration and Algorithms.” In Audio-Visual Industries
and Diversity: Economics and Policies in the Digital Era, edited by L. A. Albornoz and
M. T. G. Leiva, 100–18. London and New York, NY: Routledge.
Rieder, Bernhard, Ariadna Matamoros-Fernández and Òscar Coromina. 2018. “From Ranking
Algorithms to ‘Ranking Cultures’: Investigating the Modulation of Visibility in YouTube
Search Results.” Convergence: The International Journal of Research into New Media
Technologies 24 (1): 50–68.
Sarandos, Ted. 2021. “Building a Legacy of Inclusion: Results From Our First Film and Series
Diversity Study.” Netflix. https://about.netflix.com/en/news/building-a-legacy-of-inclusion.
Screen Australia. 2016. “Seeing Ourselves: Reflections on Diversity in TV Drama.” https://
www.screenaustralia.gov.au/fact-finders/reports-and-key-issues/reports-and-discussion-
papers/seeing-ourselves.
Sharf, Zack. 2018. “Netflix Accused of Promoting Content by Targeting Viewers’ Race,
But Company Says That’s Impossible.” Indiewire, October 23. https://www.indiewire.
com/2018/10/netflix-accused-targeting-viewers-race-posters-thumbnails-1202014458/.
Shepherd, Jack. 2019. “Netflix Accused of ‘Whitewashing’ by Nailed It! Presenter Nicole
Byer.” The Independent, May 29. https://www.independent.co.uk/arts-entertainment/tv/
news/netflix-nicole-byer-nailed-it-whitewash-jacques-torres-wes-a8934311.html.
Siles, Ignacio, Johan Espinoza-Rojas, Adrián Naranjo and María Fernanda Tristán. 2019.
“The Mutual Domestication of Users and Algorithmic Recommendations on Netflix.”
Communication Culture & Critique 12: 499–518.
Singh, Spandana. 2020. “Why Am I Seeing This? How Video and E-Commerce Platforms
Use Recommendation Systems to Shape User Experiences.” New America, March. https://
www.newamerica.org/oti/reports/why-am-i-seeing-this/case-study-netflix/.
Spangler, Todd. 2021. “Squid Game is Decisively Netflix No. 1 Show of All Time With 1.65 Billion
Hours Streamed in First Four Weeks, Company Says.” Variety, November 16. https://variety.
com/2021/digital/news/squid-game-all-time-most-popular-show-netflix-1235113196/.
Striphas, Ted. 2015. “Algorithmic Culture.” European Journal of Cultural Studies 18 (4–5):
395–412.
Tadros, Edmund. 2021. “Netflix’s Local Content Boss Talks Diversity and Inclusion.”
Australian Financial Review, February 26. https://www.afr.com/companies/media-and-
marketing/netflix-s-local-content-boss-talks-diversity-and-inclusion-20210225-p575xq.
Turner, Graeme. 2019. “Approaching the Cultures of Use: Netflix, Disruption and the
Audience.” Critical Studies in Television: The International Journal of Television Studies
14 (2): 222–32.
Turner, Graeme. 2020. “Dealing With Diversity: Australian Television, Homogeneity and
Indigeneity.” Media International Australia 174 (1): 20–8.
Umstead, R. Thomas. 2020. “Finding Diversity on Streaming Services.” Multichannel.com,
May 18. https://www.nexttv.com/news/finding-diversity-on-streaming-services.

Author Biography
Olivia Khoo is Associate Professor in Film and Screen Studies at Monash University, Australia.
She is the author of Asian Cinema: A Regional View (Edinburgh University Press, 2021).
