
When you go to Google and type in "climate change is," you're going to see different results depending on where you live and the particular things that Google knows about your interests. That's not by accident; that's a design technique.

What I want people to know is that everything they're doing online is being watched, is being tracked. Every single action you take is carefully monitored and recorded.

A lot of people think Google's just a search box and Facebook's just a place to see what my friends are doing. What they don't realize is that there are entire teams of engineers whose job is to use your psychology against you.

I was the co-inventor of the Facebook like button. I was the president of Pinterest. Google. Twitter. Instagram.

There were meaningful changes happening around the world because of these platforms. I think we were naive about the flip side of that coin.

We get rewarded in hearts, likes, thumbs-up, and we conflate that with value, and we conflate it with truth. A whole generation is more anxious, more depressed.

I always felt like, fundamentally, it was a force for good. I don't know if I feel that way anymore.

Facebook discovered that they were able to affect real-world behavior and emotions without ever triggering the user's awareness. They are completely clueless.

Fake news spreads six times faster than true news. We're being bombarded with rumors.

If everyone's entitled to their own facts, there's really no need for people to come together; in fact, there's really no need for people to interact. We have less control over who we are and what we really believe.

If you want to control the population of your country, there has never been a tool as effective as Facebook.

We built these things, and we have a responsibility to change it. The intention could be: how do we make the world better?

If technology creates mass chaos, loneliness, more polarization, more election hacking, more inability to focus on the real issues, we're toast. This is checkmate on humanity.

Hi, everyone, and welcome. I'm very excited to have this incredibly important conversation with the people behind the new documentary called "The Social Dilemma."

Before we dive in, and there's an awful lot to talk about, a lot of issues that are raised by this documentary, I wanted to first introduce our panelists, and in the era of Zoom, if you all can wave as I introduce you, just to help people know which one is you. Jeff Orlowski is the director; he's also the producer and cinematographer of the award-winning films Chasing Coral and Chasing Ice. Tristan Harris is a good friend of mine, I'd like to say, and I hope you think so too, Tristan; he is the president and co-founder of the Center for Humane Technology and a former Google design ethicist, and he's been called the closest thing Silicon Valley has to a conscience. Tim Kendall is CEO of Moment (hi, Tim); he's also the former president of Pinterest and the former director of monetization at Facebook. Cathy O'Neil is a professor and entrepreneur, and I hope you all notice that her hair matches... is that your blanket, Cathy, or your chair? ("Yes, I needed my blanket.") All right, it matches your blanket. Cathy is kind of a brainiac: she earned a PhD in math from Harvard, was a postdoc in the MIT math department, and was a professor at Barnard College, where she published a number of research papers in arithmetic algebraic geometry, which makes me break into a cold sweat, Cathy, I hope you know that. Anyway, she is in the film and has a lot to say. Meanwhile, Rashida Richardson is a visiting scholar at Rutgers Law School and the Rutgers Institute for Information Policy and Law, where she specializes in race, emerging technologies, and the law. So I'm thrilled, as I said, to be able to have this conversation; this is an area that I've been interested in for some time, so thank you so much for doing this. And Jeff, let me start with you. As I mentioned, your past work includes Chasing Ice and Chasing Coral, which really focused on climate change, so I'm curious: what made you want to turn your lens to social media and technology, a very different subject in so many ways?

Different in some ways and very similar in other ways. I think our team has always been interested in big problems and big challenges, and the existential threat of climate change was where we spent a lot of time. Then, learning from Tristan and others in the film about what's at stake here, the seriousness of the way our technology is reprogramming civilization, we realized this was a huge, huge issue facing society. Ironically, it's my own filter bubble that got me into this: it was seeing posts from Tristan and other friends. I was in a small group of people who were hearing these conversations a couple of years ago, and it was that insight that said, wait a second, there's a much, much bigger story here, and we set out to explore that.

You know, I've produced documentaries as well on big, thorny social issues, and I'm curious what some of the challenges were for you in actually putting this documentary together, because I know it took three years, Jeff. ("Yeah, it was.") So I imagine there were parts of it that weren't so easy.

Right, it was a big project and a big undertaking. When I first started speaking with Tristan and getting Tristan's perspective, we started reaching out to other former employees from the companies as well, and it was difficult to get people to just be willing to speak on the record. I feel like I had to twist Tim's arm a bit to get him to speak on camera. It was challenging because I was very, very curious about the perspective of the people who were inside the companies, and that was sort of a foundational backbone. Then, from that thinking, we reached out to Rashida and Cathy and others to kind of surround that insight and that knowledge and give us a perspective here. We also really wanted the film to be very accessible to the general public, and so as a creative team we were really thinking through: how do we bring it to life, how do we make it interesting, how do we make it accessible, how do we get the public to think about this in a different way? As much as I love all of the brilliance in all of the talking heads, and I say that with dear love and affection, not everybody wants to watch a documentary, and we were really trying to figure out how to make a film that a lot more people would want to come and see, and then to have conversations like this to follow up.

And you know, I think one of the great things about the film, Jeff, is that it really takes a deeper dive into these concepts that people may kind of sense but are somewhat oblivious to, because they are so keyed in to the technology and what it allows you to do, and the world it opens up to you, that they don't think of some of the repercussions. And Tristan, you've been thinking about this for a long time. As I mentioned, you've been called the conscience of Silicon Valley, and I think it's probably instructive for people to hear your backstory just a little bit, and what made you flip the switch and say, wait a second, this is not a good thing that I'm doing.

Yeah, thanks, Katie, it's good to see you here, and thank you for doing this. You know, I was at Stanford studying for a computer science degree, and my friends in college started Instagram. A lot of my friends were the same age; I'm the same age as Zuckerberg. A lot of my friends, when we were in college, would talk about all these positive, social-impact-driven things we wanted to do in the world with technology, with computer science. And I saw more and more of my friends get sucked down the kind of rabbit hole of building these big technology companies that would get lots of engagement and growth, and less and less of our choices had to do with "hey, how can we make the world better," and more and more of our choices had to do with how we could keep people engaged and suck people in. I noticed that it just became this race to go deeper and deeper into human psychology to figure out a deeper way to manipulate our really lizard-brain instincts. The founders of Instagram and I studied at a lab at Stanford called the Persuasive Technology Lab, where we learned many of these things, and as the film talks about, I had a background in magic and behavioral economics, in kind of how the mind is fooled. I saw that more and more of it had to do with this trickery, and that it would create this huge confusion if we didn't, as a tech industry, come together and say we have a moral responsibility to get this right, and we also have a moral responsibility to have regulation that can create the incentives for technology companies to do the right thing, which is not the case currently.

Yeah, well, let's talk about it. They really sort of play on this Pavlovian response, in many ways, what you describe as the lizard brain. I want to use a couple of terms here. By the way, I encourage everybody watching this panel to really go back and watch the movie, because Jeff, and everyone involved, you all did a great job of really outlining the host of issues, from addiction to polarization, that really are part and parcel of the digital world. But real quickly, Tristan, can you just give us kind of an appetizer? One of the concepts in the film is: if the platform is free, you are the product. What do you mean by that exactly?

Well, you know, if you just ask people how much they have paid for their Facebook account recently, and people think for a second, they realize they haven't paid at all. Well, then how is it worth more than 500 billion dollars as a company? The answer is that we are the product. So long as we are the product, meaning advertisers pay so that we are influenced, that means that we're worth more if we're addicted, distracted, outraged, polarized, and disinformed, because that means the attention-mining model was successful. We're worth more when we're kind of domesticated into this hyper-attention-switching, distracted, addicted kind of human. And I think the big confusion is that when we look in the mirror of technology, we've been told these are just neutral platforms; they're just showing us a reflection in the mirror: this is who you are, you have a lizard brain. But it's really a fun-house mirror that has amplified the lizard-brain parts of ourselves. We like to say, since the film deals in conspiracy theories, that lizard people do actually walk among us and run the world; it's just that we're the lizard people, because it's actually taken our lizard brain and made that the pilot of our choices, which is dangerous when you zoom out and look at what it's doing to democracy and to society around the world.

Let's talk about addiction, though, before we talk about polarization, which of course so many people are going to be interested in hearing about, especially because we're so close to an incredibly important election. Tim, I'm curious from you about addiction. You yourself admitted that you have trouble putting your phone away; you say that when you know you should be spending quality time with your kids, you're in a closet checking emails. And I know Tristan has really examined the mechanisms that are used to addict people, whether it's bright colors or, you know, and we can talk to Cathy in a minute about AI and how that is contributing. But Tim, from a monetization point of view, how does Facebook get us so that we feel like we're going cold turkey when we're not close to our devices?

Well, I think, you know, it all started... I mean, they prey on human weakness, and I think the light bulb really went off for them when they invented photo tagging 15 years ago. They realized that, oh, if we let Katie know that a picture of her just got posted to Facebook, we send her an email, she comes to the site 100 percent of the time, and she stays for this amount of time. And so there's this notion with these technologies; I think the insight there was: wow, if we can prey on, you know, Katie's social self-consciousness, or kind of all of our phobias about a bad picture being posted online, we can get incredible engagement. And the dimensions on which they prey on our prefrontal cortex have just kept being added to, right, all the terms that Tristan just explained. I think what's especially scary is that they've preyed on all these dimensions, right: our need for belonging, our need to express anger, our need to watch a car crash. They preyed on all of these, and there are only so many people in the world and there's only so much time, but these services and their value are predicated on consistent and high rates of growth, which basically means they have to get better and better at it, which happens at our expense.
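
To make the notification mechanism Tim describes concrete, here is a minimal hypothetical sketch of a photo-tag trigger. Nothing in it is Facebook's actual code; the class, function names, and message text are invented for illustration, and the point is simply that the system is judged by the click and the session that follow the email, not by whether the interruption was good for the person receiving it.

from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    email: str

def send_email(address: str, subject: str, body: str) -> None:
    # Stand-in delivery channel for this sketch: just print the message.
    print(f"To: {address}\nSubject: {subject}\n{body}\n")

def on_photo_tagged(tagged_user: User, tagger_name: str) -> None:
    # Hypothetical engagement trigger: when someone is tagged in a photo,
    # notify them immediately. Uncertainty about how they look in the photo
    # (the social self-consciousness Tim mentions) makes a visit near-certain.
    subject = f"{tagger_name} tagged a photo of you"
    body = "Open the app to see the photo and what your friends are saying."
    send_email(tagged_user.email, subject, body)

if __name__ == "__main__":
    on_photo_tagged(User("u123", "katie@example.com"), "a friend")

The metric such a trigger is evaluated on is re-engagement (did the email produce a visit, and how long did the visit last), which is why, as Tim says, the dimensions being preyed on only ever get added to.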

Well, so what was it for you? I'm sure you were making a boatload of money at Facebook. What employee were you, by the way?

I think I was around 95 or something.

So you had probably, you know, a nice chunk of change working there, but then you decided, like, I can't do this. Was there an "aha"?

Yeah, that wasn't... I mean, I'd love to give myself credit for that; that was 10 years ago, so I'd love to give myself credit for seeing the future 10 years ago, but I didn't. I just wanted to go to another company, Pinterest in this case, that did similar things. I think we can argue it's doing less harm in the world, by a pretty wide margin, compared to Facebook at the moment, but the mechanisms are the same in terms of the algorithms being used to bring people onto the platform and have them spend increasing amounts of time by convincing them, in the case of Pinterest, that they need things. And so my reckoning, if you will, came probably right when I had my first daughter, so six years ago, when I started to realize that my phone and the services on my phone were more interesting than my first child. And notwithstanding sort of my core values, which is that I wanted more than anything to be the best dad I could be, I couldn't get my behavior in congruence with that. So that misalignment, that sort of psychological misalignment, was a red alert for me, and I started to talk more about it in my job at Pinterest, to talk more about it publicly, and then ultimately decided to leave Pinterest to really try to help, you know, bring technology to bear to help individuals and families with this problem.

And that's what really motivated you to start Moment?

Yeah, well, there was a different founder of Moment, but I joined an existing company and then became the CEO.

Cathy, you've explored the biases of algorithms a lot in your work. Can you explain, for people who may not be super well versed in tech, sort of the whole concept of algorithms and the role they play in our lives?

Yeah, and if you don't mind, I'm going to answer the same question you asked Tristan and Tim, which is how I saw the light, because it'll also explain that. I was a mathematician until I became a quant at a hedge fund in 2006, and I kind of just got this front-row view of the credit crisis. I saw in particular, if you remember, the AAA ratings on mortgage-backed securities that were essentially lies. Those were algorithms, those were risk algorithms, trying to convince us to trust them, and I was like, that is a big old mathematical lie that really made a problem that was already a problem, the housing bubble, much worse, especially when it exploded in front of us dramatically all of a sudden. So I quit, or I quit after a while; I'll skip some details, but I became a data scientist, once again drinking the Kool-Aid of, oh, now I'm a data scientist instead of a hedge fund quant, I can do good with data. But what I saw was that these algorithms that I was now using to predict humans instead of the market were also flawed, and moreover they were very directly related to this concept that both Tristan and Tim have described, which is making people feel like their self-worth is on the line. I was deciding whether somebody was worthy of an opportunity online, and moreover I realized that I'm good at my job, I'm a good mathematician, I'm a good data scientist, and what I was doing was super dumb, Katie. I was like: how much are you worth, where do you live, are you a man or a woman? I was deciding who was worthy, but what I essentially was doing was saying: you have the profile of a lucky person, and I'm going to give you an option, I'm going to make you luckier; you, on the other hand, have the profile of an unlucky person, based on your browser history, and if you're unlucky, I'm going to make you unluckier. I was segregating the world into the lucky, who I was going to make luckier, and the unlucky, who I was going to make unluckier. And what I saw was that every data scientist was doing this. We were all segregating the world and streamlining people into these paths that were basically propagating the past, and it was going to create a feedback loop and lots of harm.
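
A minimal sketch of the kind of scoring Cathy is describing may help. Everything here is invented for illustration (the features, the historical profit table, the offers); the structure is what matters: the model only asks whether a profile resembles profiles that were profitable before, so "lucky" profiles get better options, "unlucky" ones get worse options, and the resulting data reinforces the split.

# Hypothetical ad-targeting scorer: it never asks what a person deserves,
# only whether their profile looks like profiles that converted profitably
# in the past.
PAST_PROFIT = {
    # (zip_prefix, browsed_luxury_travel) -> historical profit per impression
    ("940", True): 1.8,
    ("940", False): 0.9,
    ("601", True): 0.4,
    ("601", False): -0.2,
}

def score(profile: dict) -> float:
    key = (profile["zip"][:3], profile["browsed_luxury_travel"])
    return PAST_PROFIT.get(key, 0.0)

def choose_offer(profile: dict) -> str:
    # "Lucky" profiles see the good deal; "unlucky" ones see the predatory ad.
    return "discounted travel offer" if score(profile) > 0.5 else "high-interest loan ad"

# Feedback loop: the outcomes these offers generate become next year's
# training data, so the past keeps propagating.
for profile in [{"zip": "94022", "browsed_luxury_travel": True},
                {"zip": "60101", "browsed_luxury_travel": False}]:
    print(profile["zip"], "->", choose_offer(profile))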

Can you give me real-world examples, Cathy? ("Sure.") I'm sort of following you, but not exactly. Can you give me a real-world example for our viewers?

Sure, yeah. I was working in ad tech, which is basically what Facebook is built on, so I was deciding who gets sort of an offer in the world of travel: Expedia, CheapTickets, that was the world I was working in. But the same exact methodologies were being used in insurance: based on your profile, you might get diabetes, so I'm not sure about how much this health insurance should cost; based on your profile, people like you didn't always pay back their loan, so we're going to charge you extra. And moreover, people like you were willing to pay more for car insurance, even if you wouldn't actually represent a higher risk, but if you're willing to pay more, we're going to charge you more. So there were all sorts of ways that old-school scamming was becoming digitized, and moreover it was getting legitimized, because we all trusted big data, we all trusted algorithms. It was basically taking advantage of people.

And actually preying on them, depending on their socioeconomic status and past behavior, right?

One hundred percent. In fact, a VC came to our firm, and again, it was just a travel-industry ad tech firm, and he said to the entire company, and he was an architect, I should say, of the internet, because he was a VC deciding who to give money to in the world of ad tech, he said, you know, I'm dreaming of the day when all I see are ads for trips to Aruba and jet skis, and I never again have to look at another University of Phoenix ad, because those aren't for people like me. And I was like, oh wait, who are those for, actually? And I looked into it, and it was single Black mothers who are poor enough to qualify for financial aid and who don't know the difference between a private college and a for-profit college. So they're literally preying on people who, you know, are trying to make their lives better for their children. It was absolutely predatory, but that was actually his vision, his intention. So yes, Katie, it was really demographic; it was very related to race and gender and money.

So I was concerned with lots of different algorithms and predictive algorithms that were representing really important decisions in people's lives, like what I said: college admissions, getting a job, getting insurance, getting a loan. And I was also concerned with political information, and that's where I came to Facebook and social media, because I was studying the extent to which political campaigns know so much about us that there's this asymmetry of information: they know more about me than I know about them. They get to decide which of the 20 things I agree with most that they want to tell me about, and even if I go to their website, the cookies follow me; they still have my profile, they can still show me the one thing that their candidate agrees with me on and ignore everything else. And what's worse, and this is something that I only partly predicted when my book came out in 2016, before the election, is they don't even have to give us information; they can literally give us propaganda to prevent us from wanting to vote in the first place. They can literally commit voter suppression, and we did see that, and they could predict who it's going to work on. So what's worse, Katie, is that people like you, who are journalists and would know better and would recognize this as propaganda, they're not going to show that to you. They have a score for every user about how likely that user is to object to what they're saying, and they're not going to show that to people who would know better. So it was really pernicious, and I needed to say something about it, so I quit my job in data science and wrote that book. I mean, I'm just saying that the algorithms that we built on a daily basis were definitely making inequality worse, and they were definitely destroying democracy.

Rashida, I was going to ask you about that, since you really are a student of this intersection of race, technology, and law. How do you see it? What kind of societal effects do you see when it comes to these big internet companies and this information age really polluting society and exacerbating some of the biggest social ills that we face today?

Yeah, so we're seeing a lot of compounding of probably the ugliest parts of society happening in any setting where data is being used to generate outcomes, and that's definitely part of the process within social media platforms. If you accept as the ground truth that we live in a socially inequitable society, then if you're using data that reflects our society, it's going to amplify that. But then, when you compound that with the fact that many of these algorithms and systems are being designed by a very homogeneous and small minority of people, mostly white men, their worldview is also being imported into these systems, and a lot of the imbalances that Cathy just outlined, as in who is lucky, who's not lucky, who's deserving of opportunity, or who's deserving of certain benefits, that's all playing out at a global scale, and not just on one platform but on multiple. So it really sort of skews our view of reality, in a way where you may see and consume information in a way that confirms your worldview and makes you think, oh, we live in an equitable society, when in reality it's the complete opposite that's happening. It also serves to worsen what the status quo already is in many ways, in that, if we stick with the advertising examples, if you're advertising jobs and you're using an algorithm that has a racial and gender bias, then that's going to further compound who's getting the higher-paying jobs or who's getting better opportunities in society, and that's happening across many different domains. In general, it's an interesting area. I came into this because I was working on civil rights issues and saw that big data, tech, and AI were all sort of infiltrating all of these issues and compounding them, making the issues that I was trying to help fix harder to fix. And I think we're seeing that across the world, whether we're talking about voting rights issues, school segregation, or even equitable employment; these technologies are in many ways just worsening what's already not a great start.

I remember exploring this when I did a National Geographic hour on this and talked to Tristan about it, and they were just starting to realize how biased AI was, for that very reason, Rashida: because of the people who were creating this intelligence, it was from their real-world view. Would the problem be mitigated in any way, shape, or form, Cathy and Rashida, and anybody else can chime in, if in fact there was more diversity among the people coming up with these algorithms or this artificial intelligence? Because, you know, I remember typing "doctor" into Google and looking at the images, and at the time they were all pretty much white guys in white coats; it did not show diversity, and that was reflecting the people who were programming the AI. So how much of the problem would be solved, Cathy, you first, if we had a more diverse group of people doing this? It seems to me that it would help, but the problem is even much bigger than that, right?
29:12
than that right yeah it's a great
29:15
question
29:16
you know i actually run an algorithmic
29:18
auditing company now
29:19
and so this is exactly the question we
29:21
ask our clients like
29:22
how do we how do we get a lot more um
29:27
diverse viewpoints into the design
29:30
and the implementation of this algorithm
29:32
and and what i've noticed katie is
29:34
that you don't actually need to have
29:35
them all in the data science team the
29:37
data science team
29:38
honestly has been way overloaded they
29:41
have been in charge of not only coding
29:43
but basically
29:44
making ethical decisions deciding
29:47
on all sorts of things that they really
29:50
have no business or expertise to do
29:51
so the way i have framed it in my
29:54
company
29:55
is you start with the list of
29:57
stakeholders like who cares about this
29:59
algorithm
30:00
like if you have customers if
30:04
say say it's a mortgage lending company
30:06
like
30:07
the customers care what does it mean for
30:09
them for this to be successful or
30:10
a failure if if you deny people
30:13
mortgages
30:14
unfairly those are a stakeholder that
30:17
you have to bring into this conversation
30:19
and it's very quite it's in in fact
30:21
quite obvious
30:22
that people of color black people
30:24
especially have been redlined from
30:26
mortgages so you should specifically
30:28
talk to people
30:29
representing that stakeholder group
30:31
about what would mean
30:33
for this mortgage company to fail and
30:35
so you
30:36
you get their opinion you bring them
30:38
into the conversation and this is before
30:40
the algorithm is built
30:42
or at least before it's deployed and the
30:44
idea is that the data scientists instead
30:46
of like making these decisions
30:47
for the sake of everyone without having
30:49
any historical information or knowledge
30:51
about how mortgages have historically
30:53
been denied
30:55
to people that they should be told
30:58
after the fact after the values have
31:00
been laid down
31:01
here's what the values are you have to
31:04
make these tests to make sure they're
31:05
fitting the criterion that we want and
31:07
that's your job is to translate those
31:09
values into code it's not to make those
31:12
values to decide on those values
31:14
if you will so the answer is we
31:16
absolutely must realize
31:18
that algorithms are everywhere there are
31:20
you know we don't have humans making
31:22
decisions
31:23
anymore college admissions offices in
31:25
the
31:26
next few months are going to be
31:27
decimated by budget cuts
31:29
they're going to be turned into
31:30
essentially algorithms and the question
31:32
we have to ask is
31:34
well for whom are they going to fail you
31:36
know who who gets to decide
31:39
what it means for this to work or fail
31:40
and we have to make that a larger and
31:42
broader conversation
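
One way to read Cathy's point about translating values into code is as an acceptance test that the data science team has to pass after the stakeholders have set the values. Here is a minimal sketch, assuming a simple gap check on mortgage denial rates across groups; the data format and the 5-percentage-point threshold are invented for illustration, and a real audit would use whatever criterion the stakeholders actually chose.

from collections import defaultdict

def denial_rates(decisions):
    # decisions: iterable of (group, approved) pairs.
    totals, denials = defaultdict(int), defaultdict(int)
    for group, approved in decisions:
        totals[group] += 1
        if not approved:
            denials[group] += 1
    return {g: denials[g] / totals[g] for g in totals}

def check_denial_gap(decisions, max_gap=0.05):
    # The values (which groups to compare, how big a gap is tolerable) are set
    # by the stakeholders; the data scientists' job is only to make this pass.
    rates = denial_rates(decisions)
    gap = max(rates.values()) - min(rates.values())
    return gap <= max_gap, rates, gap

if __name__ == "__main__":
    sample = [("group_a", True), ("group_a", True), ("group_a", False),
              ("group_b", True), ("group_b", False), ("group_b", False)]
    ok, rates, gap = check_denial_gap(sample)
    print(rates, "gap =", round(gap, 2), "PASS" if ok else "FAIL")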

And Tristan, I mean, you were kind of the ethics guy until you weren't, right? Why don't these companies have any interest in having someone present, or a team of people present, talking about the larger societal impacts of some of these technologies? Is it because they don't care?

Yeah. First of all, I agree with everything that Cathy just shared. It's heartbreaking when you see the scale of harms that are being generated in places where there's no representation. I think a good example of this, first of all, is that I think it's the case that seventy percent of Facebook users are outside the US, which means that the assumptions of the design team sitting there in Menlo Park are going to be incomplete or incorrect 70 percent of the time. Then you have the issue of these technology companies really aggressively going into markets in places where there isn't infrastructure. They're going into Myanmar, into different countries on the African continent, and they're actually establishing the infrastructure. So they're saying, hey, when you get a cell phone, we're going to do a deal with the telecom, so you get a cell phone and a phone number, and when you get your first phone in Myanmar, Facebook comes built into the phone, you get a Facebook account. And this is important: it actually crowds out other players who could be there. So, for example, you go into a country that didn't even have internet before, and suddenly all the content that's generated in those local languages is actually all on Facebook; there aren't, you know, 20 other online publications who can actually compete with what's on Facebook, all that material is there. And then you have a situation like in Myanmar, where the government spreads fake news that starts, as is described in the film, to go after the Rohingya minority group, and there's no counterbalancing force. In fact, in that case, in 2013 or 2014, there were only four Burmese speakers who could even do the content moderation or would have any idea what's going on in that country. Now, zooming out, you say, okay, Facebook is managing about 80 elections per year. We all care about the upcoming US election, we care about how well Facebook deals with some of the issues that are present now in this country, but then you zoom out and ask: do they have teams that speak the languages of all the countries that they have colonized? It's a form of digital colonialism when they create infrastructure where they don't even have the staff in those countries, the people who are representative. And across the board I would just add that in every case, if you don't have expertise, or the stakeholders, as Cathy said, where the people closest to the pain are not closest to the decision-making, not closest to the power, that's a huge problem.

And I wanted to add, Katie, if I could just add there: you just said something about the people inside the companies and kind of their intentions, and one of the things from my experience, in having these conversations over the last few years, is that I think there are a lot of people at the companies who do mean well, who are well intentioned, yet they're stuck with this problem, they're stuck with this business model. There's a phrase in coding that I have heard, "inherent vice": even acknowledging the problems of the code, they still built around those problems, and now they've grown to a scale that is affecting all of society. So even if they wanted to make changes, you have to overturn the entire business model, you have to overturn the entire valuation by the stock market. So when you talk about getting diversity of opinion into these companies, diversity in a lot of ways is needed, but I think we also need to really question the fundamental business model of what's driving these companies. I often make a comparison to the fossil fuel industry: the fossil fuel industry seemed really awesome to humanity when we started it, and it gave us great power and opportunity, but years later we're seeing the consequences of that. It's the same thing with our social media technology and this business model of attention, this targeted advertising business model. We need to figure out how we can flip this model upside down and create financial models that work in alignment with people, in alignment with humanity and society, and that can take all of these factors into consideration, that are really aligned with the public good.

And Tim, why can't they change their business model? I mean, how much money does Mark Zuckerberg need?

Well, I think it's a little more complicated than just his wealth, but I think the trick is what Jeff said: the business model is advertising, and the value of the company is a multiple of the amount of revenue, and it's predicated on that revenue continuing to grow and grow and grow. So if they were to map a path where they were to segue off of that business to something else, let's just say asking users to pay, so much value would be destroyed that the company probably wouldn't be recognizable. Or at least, I can't come up with a creative way for how they would do that of their own accord in a way that wouldn't just destroy hundreds of billions of dollars of shareholder value. I just don't see a tractable path, which is why I think that governments are going to have to play a role, and I think we as individuals are going to have to play a role, in terms of having our own reckoning and applying our own pressure to these companies.
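
To see why "so much value would be destroyed," it helps to spell out the multiple-of-revenue arithmetic Tim is referring to. The numbers below are placeholders, not Facebook's actual financials; the point is only that when the market prices a company at revenue times a growth-dependent multiple, moving to a slower-growing, lower-revenue model collapses the implied value by hundreds of billions.

def implied_value_billions(annual_revenue_b: float, growth_rate: float) -> float:
    # Toy valuation: revenue (in $B) times a multiple that rises with expected
    # growth. The 10x base and the 40x growth sensitivity are invented.
    multiple = 10 + 40 * growth_rate
    return annual_revenue_b * multiple

ads_model = implied_value_billions(annual_revenue_b=70, growth_rate=0.25)   # ad-driven, fast growth
paid_model = implied_value_billions(annual_revenue_b=30, growth_rate=0.05)  # user-paid, slow growth

print(f"ads model:  ~${ads_model:,.0f}B implied value")
print(f"paid model: ~${paid_model:,.0f}B implied value")
print(f"value destroyed by switching: ~${ads_model - paid_model:,.0f}B")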

Speaking of a reckoning, you know, Rashida, we have seen this phenomenal social justice movement unfold in the last six months or so, following the murder of George Floyd and of course Ahmaud Arbery and so many other instances that we almost hear about on a weekly if not daily basis. And I was curious: you said that these technologies make it harder for you to do your work, and I'm curious if you can explain why.

Yes. So I guess part of explaining why I made that comment is that a lot of my work has been looking at not only commercial uses, like social media platforms, but also uses of algorithms and data-driven technologies within the government sector. So I've written about the use of predictive policing and risk assessments in the criminal justice system, which are just accelerating the racial disparities within that sector, but they're often...

Could you explain what that means, Rashida, for people?

Yeah. So predictive policing is a technology that relies on police and other data to predict who may be a victim or a perpetrator of a crime, or where a crime may occur. And risk assessments are actuarial tools that attempt to use historical data and perform statistical analysis to then predict decisions for similarly situated individuals, usually for judges. So in both of these examples, they're technologies that are relying on historical data from a really racially biased and flawed system to make predictions about the future, and they're often adopted under the guise that they're more objective or fairer or impartial, when the reality is they're just further concretizing the inequities that already exist in our society and then giving this gloss of fairness on top.
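
A minimal sketch of the structural problem Rashida describes in risk assessment tools: an actuarial score fit to historical outcomes simply reports whatever disparity those outcomes contain. The neighborhoods, records, and numbers below are invented for illustration.

# Toy actuarial risk score: a frequency table over past cases, which is roughly
# what many deployed tools amount to. If historical policing concentrated
# arrests in certain neighborhoods, the "risk" reported for people from those
# neighborhoods is really a record of that enforcement pattern, now wearing a
# gloss of objectivity.
HISTORICAL_CASES = [
    # (neighborhood, prior_arrests, rearrested_within_two_years)
    ("heavily_policed", 2, True), ("heavily_policed", 1, True),
    ("heavily_policed", 1, False), ("lightly_policed", 1, False),
    ("lightly_policed", 0, False), ("lightly_policed", 0, False),
]

def risk_score(neighborhood: str, prior_arrests: int) -> float:
    # Share of "similarly situated" past cases that were re-arrested.
    similar = [again for hood, priors, again in HISTORICAL_CASES
               if hood == neighborhood and priors >= prior_arrests]
    return sum(similar) / len(similar) if similar else 0.0

# Two people with identical records get different scores purely because of
# where past enforcement happened.
print(risk_score("heavily_policed", 1))   # ~0.67
print(risk_score("lightly_policed", 1))   # 0.0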

So when I give those examples, the challenge is that you're not only dealing with the mythology that comes along with anything related to math, tech, or science that we have in society, but you're also grappling with these deeper structural issues that we really haven't learned how to deal with as a society, and trying to tackle them all at once. And then if you take that out and look at the private sector, you're dealing with similar issues, in that, like we've talked about, misinformation, and hate speech is another issue, these aren't new issues in society, but the design of these platforms is such that they amplify these issues. And we're also dealing with a society that hasn't dealt with those issues, at least legally, in an equitable or manageable way to date. So we're asking technology companies to solve problems we can't really solve, and then we're integrating similarly flawed technologies into our social systems and public systems that already have deep cracks in them, and asking them to solve those along with the tech issues on top.

What do you make of it, Rashida, when Facebook says that they have AI that can, you know, track and stop things like hate speech and misinformation? Is that a legitimate stance for that company to take? I mean, do they really have that? Because it seems, and Tristan, you and I have talked about this, that when there's a piece of misinformation out there, by the time it's corrected, it's traveled through thousands if not millions of people. My daughter, actually, her job involves correcting misinformation on Facebook; she's very, very busy. But I always say, gosh, by the time you can evaluate it, it's been seen and shared over and over and over again. So can their AI really stop things, or is that just a PR thing?

It's PR. Tech isn't going to solve the tech problem; I think that's just a fundamental thing to understand. But also, like I said, these are very complicated issues, in that if there isn't a clear roadmap, toolkit, or rubric for individuals to follow, it means they're making these subjective judgments at scale, with no clear guidance on how to make what are really consequential decisions. So I do think it is the responsibility of Facebook, Twitter, and these other platforms to monitor and understand how their platforms are contributing to these problems, but I do think it's misguided to suggest that AI is somehow going to solve, or be the silver-bullet solution to, problems that are amplified by their technologies.

But Tim, how do they do that when there's so much information going through these pipes? How do you keep up with it? I guess isn't that sort of the nub of the issue? And, you know, I'm particularly interested as the election is just around the corner: so much false information, so many complete, blatant lies that are being consumed and processed and then shared, and they're so powerfully influential. You know, even with friends of mine, I can tell when they've been fed stuff over and over and over again, because we sometimes can't even have a conversation; because of the algorithms, they might have read something, and then they just become this repository for like-minded content, and then I can't really have a conversation with them, and it's very frustrating, you know?

Yeah. I mean, I'm not sure exactly what the question is, but I'm on the receiving end of that too, and I think it does go back to Jeff's point, and then other points that folks have made, about the incentives. Misinformation is really good for their business; the spread and engagement of incendiary content is terrific for them. Why? Because it plays to the attention-economy idea and it keeps us hooked. Tristan should talk to this, because he'll give the great wrapper around it, but it plays on our weaknesses, it plays on our animal brain at the lowest level, and that's really good for business, because when that part of my brain is being tapped, I go into an unconscious mode where I am sucked in, sucked into the phone, and I spend an inordinate amount of time there, and the more time I spend on that platform, the more money they make from me. And if I really become a convert, I help spread it to other people, and then they make even more money from them.

Tristan, do you want to add to that?

Yeah. Just to sort of sum up where that leads, and to double down on what Tim is saying: I think Jeff uses this line that this is a polarization-for-profit business model. Let's say Facebook has two possible versions of the news feed they could show. Imagine one news feed called the challenge feed, where everything you see when you swipe challenges and expands and adds more nuance to your view of reality, every single swipe you make. And the other feed is called the affirmation feed: it just shows you why you're right, you're right, you're right, you're even more right than you thought, here's even more evidence about why the other side is even more crazy, more whatever it is that you hate. Which of those two feeds is going to keep you coming back? Right: the affirmation feed.
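
A minimal sketch of the two hypothetical feeds Tristan contrasts: both rank the same posts, but one optimizes a predicted-engagement score (which in practice rewards agreement and outrage) while the other puts view-expanding posts first. The posts, scores, and field names are invented for illustration.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    agrees_with_user: bool       # does it confirm the user's existing view?
    predicted_engagement: float  # clicks and comments the ranking model expects

POSTS = [
    Post("You were right all along", True, 0.92),
    Post("The other side is even crazier", True, 0.88),
    Post("A nuanced take that challenges you", False, 0.35),
    Post("Data that complicates your view", False, 0.30),
]

def affirmation_feed(posts):
    # Optimize for what keeps you coming back: predicted engagement.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

def challenge_feed(posts):
    # Put posts that expand or challenge your view first.
    return sorted(posts, key=lambda p: (p.agrees_with_user, -p.predicted_engagement))

print([p.title for p in affirmation_feed(POSTS)])
print([p.title for p in challenge_feed(POSTS)])

The business answer to "which keeps you coming back" is the first ranking, which is the polarization-for-profit point.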

And so I think the most nuanced point in the film, which I think is so critical for people to get, is that these technology systems have taken this shared reality and put it through a paper shredder, where we each get our own, three billion Truman Shows, each one of us with our own reality. And if those realities are not compatible with each other, which they're not, because the way I get you your information is to show you different facts of a completely different kind and frame than the other facts, it makes it impossible to have a conversation. And if we cannot have consensus and agreement, when you disagree with someone you can't just say, I don't want you on the planet, I don't want you voting anymore. For anything we want to do, whether it's climate change, racism, inequality, any issue we want to tackle, it depends on us having a shared consensus view, a semi-reliable consensus reality, and we need that to be able to do anything. Which is why I think this is the existential threat that undergirds all the other issues, that makes our already difficult problems unsolvable. Now, the thing that gives me hope is that the film will create a shared truth about the breakdown of that shared truth, so that instead of being caught in the mess of all of it, we actually have a shared conversation about the breakdown of our shared conversation, and that's an empowering place to stand. In fact, my favorite thing that I think Jeff recommends people do after watching the film is to watch it with other people you politically disagree with, and you both, at the end of the film, open up Facebook on your phones, and then you reality-swap: you trade phones, and you see what it would be like to live in their feed. You know, if you saw my feed, you'd see climate-apocalyptic news and escalating US-China relations, and you'd say, oh my god, it makes sense why Tristan looks terrified all the time. You have a deeper empathy for where each of us is coming from. And I think that's probably the biggest thing we can do, because honestly the companies are not going to be able to magically change all of these things. As Tim just said, not only is it against their shareholder pressure and their business model, but even if they put 100 percent of their resources into solving the issue, the growth rate of the harms, the conspiracy theories, the mental health problems, the degradation of our public square, has far exceeded any of the product changes and developments that they can make. So we're left with culture, and the thing that gives me hope is that this film, being simultaneously released in 190 countries and in 30 languages, has shown... I'm getting about 100 messages on Instagram per hour from people all around the world saying, oh my god, this is why we got the Five Star Movement in Italy, this is why we got Bolsonaro in Brazil. And I think that shared context, that collective global will, is what we're going to be able to use to hopefully create a government response, and also to wake up the public that we've lost our shared conversation.

Before we go, I'm going to go around the horn and get solutions, but, you know, it seems to me that this is going to be a real problem for the election. We're hearing about Russian interference, right, again; we're hearing, obviously, about these echo chambers and how people are getting affirmation, not information. You know, in my business, cable news is sort of following this model, right, appealing to one side or the other, engagement through enragement, as my friend Kara Swisher says. So, you know, is anything being done? I think what's being done before the election is sort of really lip service. Can you explain, whoever wants to, what Facebook is doing before November 3rd and what impact it will actually have? Anybody?

I can't speak to the specifics of what Facebook is doing, but in my mind it's political ads, right, the week before the election. I feel like anything they do is a band-aid solution; they're plugging holes in a sinking ship, and this is not actually addressing the fundamental problem. Just like with climate change, where we have broken down our environmental ecosystem, these platforms have broken down our information ecosystem. We're literally just re-warping and reshaping our sense of truth and understanding and shared conversation. And just like with climate change, even if we solved it right now, we would have a decade or more of inertia from the carbon buildup; the same problem exists here, in my mind, with our information breakdown, where these are not easy things to solve or undo. We've been living through this mind warp for a period of time now, so the idea of solving stuff between now and the election is kind of a moot thing in my mind. We can take little band-aid steps, but we've already been fractured into different ways of seeing the world that are not going to undo quickly.

Right. In fact, you know, a lot of these issues really haven't changed where people stand; I mean, they're pretty deeply entrenched. I interviewed Nate Silver, and he said there's been very little movement from one camp to another; people are kind of sticking with their tribalistic instincts, so you can tell that they are being fed. But what's frustrating to me is that people don't trust the media, and people don't even understand what truth actually is and what facts actually are. Everything is being called into question, and for somebody who's been a journalist for a long time, that's exceedingly frustrating. Cathy, what are we going to do about potential Russian interference? Just kind of say, oh well, I guess we're going to have to live with it?

You know, it is really difficult, and I agree with you that it is trust itself that has been injured, and not just trust about any particular topic, but trust in the concept of authority and expertise. I'm writing a book now about how shame plays into all this, because I think that the sort of fuel that divides us and makes us more and more tribal is literally shame, and wanting to be outraged and shame the other side. So it is really hard, and I agree with Tim that it's not going to happen overnight, even if we turned off Facebook tomorrow, which I really wish we could do. I do want to say, though, that there are smaller, easier problems to solve that would go a long way in some of the economic aspects of this. To be clear, there are, as I said before, algorithms deciding almost any bureaucratic choice we go through; anytime we touch a bureaucracy, it's an algorithm now, and they are not even being asked to comply with existing laws. That's how outside the system, that's how blindly trusted, algorithms have been in the last 10 or 15 years as they've popped up everywhere. So if you don't mind, I'm going to say I don't know how to solve this big problem of Facebook, democracy, and trust, but I do want to say that we can solve smaller problems that are embedded inside this larger problem, like mortgage loans, like college admissions, like who gets a job. You know, to Rashida's point earlier, it's all about the historical propagation of bias. So let's ask LinkedIn to show us exactly how they matchmake between people looking for jobs and people who are looking to hire, because I doubt that that's ever been vetted, and that's an algorithm that I think we can all agree will be extremely, extremely important in the coming months as we recover from this great depression. When all these people who have lost jobs need new jobs, they're going to go online, and they're going to be told which jobs exist by an algorithm. And are they going to be shown all the jobs? No, they're going to be shown the jobs that the platform they're on decides they deserve. Do you see what I mean? So this stuff is both economic and civic. We've been talking about the civic side of it, like what it means to be an informed citizen in the age of misinformation; that's a tough problem. But we can talk about who deserves a job and make sure that that works, and I don't think that's irrelevant to this question of a civic-minded nation: we have to actually have an economic existence in order to get back to trust.

So what do you think the solution is, Tristan? You said watch the film, trade Facebook feeds. That's a really nice idea, and I hope it happens, but I think you would concede that we need more.
53:58
short term and there's a long term
54:00
long term we need to move to a
54:02
totally humane
54:03
and just and equal and fair
54:06
digital infrastructure and as cathy
54:09
and rashid have pointed out
54:11
there's a quote from from mark
54:12
andreessen that software is eating the
54:14
world which is that
54:15
software is replacing technology's
54:17
replacing every aspect of our physical
54:19
infrastructure in our physical decision
54:20
making
54:21
we're going to hand over to the machines
54:23
so we have to have and i think kathy has
54:25
said this
54:26
kind of like an fda for algorithms or
54:27
some notion of responsibility
54:29
for the ways that these things are
54:31
steering society fundamentally and
54:32
that's a bigger conversation
54:34
when it comes to the shorter term and
54:35
we're talking about the election you
54:37
asked katie about
54:38
about what are we going to do about
54:40
about russia
54:41
you know i have friends right now at the
54:43
stanford cyber policy center who are
54:44
tracking multiple networks of hundreds
54:46
of thousands of accounts that can be
54:48
activated overnight in the days
54:49
or months days or weeks leading up to
54:52
the election and drop
54:53
news it's important for people to
54:54
realize first of all it's not just
54:56
russia it's china
54:57
iran saudi arabia everyone's in on the
54:59
game now
55:01
the kgb officers used to spend 25
55:04
percent of the time
55:05
manufacturing disinformation story
55:07
that's how you were successful at your
55:08
job was 25
55:09
of your time was inventing plausible
55:11
things that could happen
55:13
so now you imagine that you know while
55:15
we've been we've been obsessed with
55:16
protecting our physical borders of this
55:18
country and we have a
55:19
you know department of defense and we
55:20
spend billions of dollars a year on that
55:22
while we're protecting our physical
55:24
borders you know if russia are trying to
55:25
try to fly a plane into the united
55:27
states
55:27
they're going to be shot down by the
55:28
department of defense but if they try to
55:30
fly an
55:31
information bomb into the united states
55:34
into facebook
55:35
they're met with a white glove and an
55:36
advertising algorithm that says yeah
55:38
exactly which zip code zip code or
55:39
conspiracy theory group would you like
55:41
to target
55:42
and i think we have to realize that our
55:43
digital borders are completely
55:44
unprotected
55:45
which means that people have to exercise
55:47
a totally new degree
55:49
of cynicism and skepticism about what
55:51
they read on facebook except if it's
55:53
probably from their closest uh
55:55
friends that they know
and it's not just the news, by the way, it's the comment threads. russia, they work in threes, so they'll actually control the way a conversation goes and generate conflict to steer it one way or the other. i think we fundamentally cannot trust the information we're seeing, and we need a cultural movement, a global cultural awakening, around this understanding: these are not authoritative platforms. that doesn't mean don't trust anything, it means we have to recognize that these technology platforms are not trustworthy places to get information, and i really hope the film delivers that message. it's perhaps the one thing we can agree on, that these things have actually torn us apart, and, on a bipartisan level, how they've eroded the mental health of our children, which everyone agrees is an enormous problem and is not a partisan issue. that's the one thing that gives me hope.

i thought that was really well done, with the young girl, where you can see she's just crushed when somebody makes fun of her ears. even adults get crushed when people say mean things about them, and you can only imagine, for a young developing person who's trying to establish his or her sense of self, how horrible that is. my daughter, who's 29, says gosh, i wish i grew up in the 70s when there was no social media, because that's when i grew up, and i'm like, i hear you. it was hard enough as a teenager in the 70s to get through it relatively unscathed, i don't know how you do it today.
could i share one thing on that, katie, before, i know you're probably wrapping up. we're evolved to care when people don't approve of us or don't like us, because that's an existential issue for survival in a tribe. if the people in your tribe don't like you or are saying negative things about you, our attention is going to be sent to that instantly, and it also sticks with us, so we end up revisiting that negative thing one person said over and over again in our minds. i've noticed that even with the film coming out, about 99.9 percent of all the feedback is incredibly positive, but of course what does my mind hold on to? it's the 0.1 percent of people who hate it, who said the worst possible things about me or anybody else.

hey tristan, welcome to my world.

yeah, well, this is the thing, katie, celebrities and public intellectuals and public figures in society have dealt with this for a long time, but now we've subjected every person to the masses, and it's this big gladiator tournament where if you say one thing that's wrong, with the context collapse, it's never been easier to see infinite evidence, clicking deeper and deeper down the rabbit hole of people who want to hate on you. and i'm especially concerned about how that's affecting teenagers, because they're so vulnerable. as you said, it affects all of us, but it affects young people the most.
rashida, clearly it sounds like we need the government to act, and then i look at what's going on with the government right now, and it seems to me that more often than not they do nothing. well, not only do they do nothing, but they actually seem to be exploiting the divide and polarization that is resulting from all the things we've been discussing. so i guess my question to you is, should the government get involved, and what do you think the appetite is for some kind of oversight, some kind of regulation?

yes, the government should get involved, but i think where you go from answering that becomes more complicated, because we're not talking about just one issue. you could have civil rights enforcement, antitrust enforcement, privacy enforcement, and even when i name those three categories, we don't necessarily have regulatory or legal frameworks for addressing the exact concerns we're talking about. we are grappling with very complex and nuanced issues, so i do want to give people in government some credit that they're trying to learn in real time and figure out what to do. but there's also a range of actions that need to be taken, because we're talking about companies that are not only functioning as communication technologies but also as advertising companies, so if you take a sectoral approach, which do you do, can you do both? and then i think we're also dealing with, as i said earlier, a compounded effect of societal problems that have always existed and that we never really wanted to grapple with, so in some ways we also need more robust and structural reform to address some of these issues. i do think there is an appetite, or at least an understanding, that something needs to be done, but what's happening, not only here in the us but internationally, is that every government is trying to grapple with where to start, because we're dealing with so many different issues, and it's hard to prioritize one fix over another or figure out how many different forms of regulation can work together to address the myriad of issues we've discussed today and in the film.
but rashida, when it's operating to the benefit of some of these authoritarian governments that are popping up all over the world, it seems to me that they're not going to be really jonesing to address these problems, because they've come to power as a result of the problems they ostensibly should be fixing, right, you guys?

yeah, you are dealing with some governments that have a self-interest in maintaining the status quo, and that's not simply in authoritarian regimes but also in democracies. but you're also dealing with the problem that the majority of these companies are based in the united states, are u.s. companies, and are therefore regulated by our lack of laws here. so you're dealing with the need for national reforms for how issues are addressed within certain jurisdictional boundaries, but also with a global problem, with global-scale companies, and with our inability as a globe to understand how to regulate and deal with issues that are not isolated within the borders of one country.
right. tim and jeff, what do you think? are you optimistic that something's going to be done, and if you could wave a magic wand, tim, and good luck, how would you fix this huge problem that's, as rashida said, so multifaceted?

yeah, i think philosophically i'm more focused on bottoms-up, probably because i'm just more comfortable with that, which is starting with the individual. when i think about global systemic problems like climate change, or even cigarettes, the evolution of cigarettes from something that was marketed as actually good for you, with doctors endorsing it, to now, where it's absolutely socially rejected, i think two things happened. people had a reckoning with, in the climate change case, their own contribution to it: do i have an suv in my driveway, what is my carbon footprint, am i contributing to this problem that i now feel stronger and stronger about, what changes can i make? that is all to say that at an individual level, one of the things i spend my time on is trying to show people a really tight feedback loop about the impact the phone has on them. the thing that happened with cigarettes is that it became really clear it was going to kill me, and the thing that's happened with sugar is that it's become really clear i'm going to get diabetes and have a lot of problems beyond that. with the compulsive and addictive usage of phones right now, we don't have that feedback loop. we can't see our brain changing, even though there's very clear data that it does; we can't see the onset of depression and anxiety, even though there's very clear data that it happens. so if i had a magic wand, i would try to create transparency around that feedback loop, so we knew incrementally, when i spend four hours on my phone tonight, what the cost is individually.
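a minimal sketch of the kind of tight feedback loop tim is describing, assuming you already have a day's worth of app sessions exported in some form; the app names, durations, and the "what did this cost me" framing are made-up illustrations, not anything a real phone os produces in this exact shape.

# toy feedback loop: turn a day's app sessions into an immediate, concrete
# readout, in the spirit of the cigarettes and sugar comparisons tim makes.
from datetime import timedelta

sessions = [  # (app, minutes) for one day -- illustrative numbers only
    ("social", 47), ("social", 33), ("video", 62),
    ("messaging", 18), ("social", 41),
]

def daily_feedback(sessions):
    per_app = {}
    for app, minutes in sessions:
        per_app[app] = per_app.get(app, 0) + minutes
    total = sum(per_app.values())
    print(f"total screen time today: {timedelta(minutes=total)}")
    for app, minutes in sorted(per_app.items(), key=lambda kv: -kv[1]):
        print(f"  {app:<10} {minutes:>4} min")
    # surface the trade-off tonight, not in a research paper years later
    print(f"that is about {total / 60:.1f} hours not spent sleeping, reading, "
          f"or with the people in the room.")

daily_feedback(sessions)

the interesting design question is not the arithmetic, it is that the readout arrives the same evening, which is what makes it a feedback loop rather than a statistic.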
katie, i think my optimism comes from my pessimism, the same as with climate change, in that i think things are going to get worse, and at some point we as a society are going to push back and realize, no, this is not acceptable, this is not how we want to move forward. the stakes will be so high and so apparent to everybody that we say no, we need to about-face, we need to change the system, and whether that comes from the goodwill of the fossil fuel industry, the goodwill of the tech industry, or from regulation, those are the only paths forward that i see. the status quo is unsustainable, it will just continue to break down and rip apart our society in a way the public will not go for. my hope is that the film, and countless people like everyone on this conversation and many, many more working in the industry, can continue to raise the alarm, talk about the problems and define the problems, so that we have a shared definition of this is what we need to address, this is what we need to fix, we agree on these basic facts, and so we can look towards the solutions and what they look like. and then i'm super eager to hear from rashida and kathy and the team, and many, many more people, about what meaningful policy looks like. that's so beyond my expertise and my knowledge, but i'm hungry and eager to hear it. i want somebody to put together a list of let's do these things, and if we do these things it makes a step in the right direction.
while you were completing that thought, i just looked at my screen time and it says 18 hours and 13 minutes. tristan, is that really bad?

more than any of us.

you guys, i'm writing a book and i use this to write. so my social networking, two hours and nine minutes, that's bad too, huh?
i think we have to be careful, and i especially want the parents out there not to be too hard on themselves, because what makes the situation inhumane is when we have a systemic problem for which we only have individual solutions. you see this massive problem, and then the burden is only on what i can do, or look at my screen time. just as i wouldn't want you to feel bad about how often you use your arm if you had a timer counting how many minutes or hours a day you use your arm, the slab of glass we have here is not by itself evil. i'm more worried about the broader climate change of culture, the systemic issues that are really, really bad. we need a full-court-press solution: individual actions we can take, and a cultural movement around them, which i hope the film starts, but never without the broader link to how we change the system.

so for everybody, whether they download moment, or turn off notifications, or take their whole school and say, you know what, as a school we're going to move all the kids off of tiktok or instagram together, one thing parents should know is that it's not about an individual taking their teenager off of instagram when all my friends are still on instagram, still talking, and that's where all the homework and dating conversations are. that's not a viable solution, we need a group migration. so as we do all these individual things, each of those people should stay part of this conversation, they can do it through the center for humane technology or any one of these other groups, but they need to become part of the movement to ask for and demand much greater regulation and change. we don't want a world where it's just the individual, we want everyone to do the individual things and to participate in demanding the more systemic change that we need.

on an earlier call i was on today, someone asked, can we put the genie back in the bottle? well, there's an interesting story of tylenol in the 1980s where they literally did just that. they found there was poison in the tablets, and they could have lied about it and said, actually there's no poison, we just have to make sure our stock price keeps going up, we're going to pretend it's not happening, and meanwhile people keep dying and they keep shipping the tylenol. instead they quite literally put the genie back: they took it off the shelf until they invented the tamper-proof container, and that invention was when they put it back on the market. and because they were transparent and honest about the problem, at first their stock price dropped dramatically, but then it went back to even higher than it was before. so i think there are several things, short term, to stop the bleeding, where we need them to do everything they can, including turning off algorithmic amplification, not recommending facebook groups, and twitter can untrend october, which is to say not have these trending topics that are easily gameable by foreign actors. there's a long litany of these kinds of things, and the diversity of the solutions needs to be reflected in people not just on this call but in the broader community. that said, this is going to take a long time, so it's one of those multi-stage things, just like climate change. and i think kathy should probably chime in.
yeah, i just wanted to mention that i wrote a piece for bloomberg that came out a couple of days ago about tiktok's algorithm. there's a whole question about who gets to own the algorithm even if tiktok is sold, and i made the obvious data science point that recommendation engines can be manipulated. i outlined how, if i control tiktok's algorithm and i know there's an anti-vaxxing viral video, or a cluster of viral videos around anti-vaxxing, i could amplify those in the recommendation engine or i could de-emphasize them. and i think of that as a good thing, personally, because i know that meetup.com, for example, ad hoc changed their recommendation engine to make it less sexist, and that's something i want to see.
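the mechanism cathy is describing is essentially a re-weighting of ranking scores before the sort that decides what gets shown. here is a toy sketch in python; the videos, clusters, scores, and weights are invented for illustration and do not reflect tiktok's or meetup's real systems.

# whoever controls the ranking step of a recommendation engine can boost or
# bury a whole cluster of items by scaling its scores before sorting.
def rank(videos, cluster_weights, k=3):
    """videos: list of dicts with 'id', 'cluster', and a base 'score'
    (e.g. predicted engagement). cluster_weights lets the operator
    amplify (>1), leave alone (1), or de-emphasize (<1) each cluster."""
    def adjusted(v):
        return v["score"] * cluster_weights.get(v["cluster"], 1.0)
    return [v["id"] for v in sorted(videos, key=adjusted, reverse=True)][:k]

videos = [
    {"id": "cooking-101", "cluster": "cooking",   "score": 0.71},
    {"id": "vax-myth-1",  "cluster": "anti-vax",  "score": 0.84},
    {"id": "vax-myth-2",  "cluster": "anti-vax",  "score": 0.78},
    {"id": "local-news",  "cluster": "news",      "score": 0.66},
    {"id": "gardening",   "cluster": "hobbies",   "score": 0.60},
]

print(rank(videos, {}))                 # neutral: the anti-vax cluster takes 2 of the top 3 slots
print(rank(videos, {"anti-vax": 0.2}))  # de-emphasized: it disappears from the top slots
print(rank(videos, {"anti-vax": 3.0}))  # amplified: it crowds out nearly everything else

the point is not the ten lines of code, it is that this knob exists in every engagement-ranked feed, and someone is always choosing where to set it.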
you see, i would say that the thing that's most disingenuous about zuckerberg's trips to congress is this idea that by doing nothing he's somehow being objective, that by not putting his thumb on the scale he's taking the default and most reasonable position. no. he acts as if it's value-free, but it's actually, of course, value-laden. the results of the choices they've made in maximizing engagement, as tim mentioned, which of course encourages divisiveness and tribalism and disagreement, that was a choice, a very value-laden choice, and they now have the choice to do something else. you can say, hey, we should de-emphasize misinformation, propaganda and anti-science rhetoric. they could do that. are they going to do it? i doubt it. but the long-term point is that it is actually quite possible to do, katie. it's also quite possible to regulate algorithms and force them to follow rules, and as i've said a couple of times now, my goal in the short term is to convince us, and regulators and lawmakers and policymakers, that once you've written a law in plain english, you can force an algorithm to follow that law. we don't yet know what the laws need to be for facebook and social media, but they need to be something like: this has to work for the public good, not just for your profit. and once we translate that, we can force them to follow that rule.
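what "translate a plain-english rule into something an algorithm must follow" could look like in practice is an audit that runs against the system's actual output. a minimal sketch, where the rule, the 1 percent threshold, and the log format are hypothetical placeholders rather than any actual law or platform interface:

# a regulator-style compliance check over a log of served recommendations.
PLAIN_ENGLISH_RULE = ("recommendations flagged as known misinformation must not "
                      "exceed 1 percent of all impressions served")

def complies(impressions_log, max_misinfo_share=0.01):
    """impressions_log: list of dicts, one per served recommendation, with a
    boolean 'flagged_misinfo' field set by an independent fact-check feed."""
    if not impressions_log:
        return True
    flagged = sum(1 for imp in impressions_log if imp["flagged_misinfo"])
    return flagged / len(impressions_log) <= max_misinfo_share

# illustrative audit over a tiny fake log: 30 flagged out of 1000 served
log = [{"flagged_misinfo": False}] * 970 + [{"flagged_misinfo": True}] * 30
print(PLAIN_ENGLISH_RULE)
print("complies:", complies(log))  # 3 percent flagged -> False, out of compliance

the hard part, as rashida points out, is agreeing on the rule, the threshold, and who does the flagging; once that is written down, enforcement can be as mechanical as the check above.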
and jeff, you wanted to say something, and you're going to have to wrap up anyway, so why don't you close things out?

oh, well, i was going to add one comment to what kathy was saying there. i just heard this recently, from a reliable source but not somebody inside facebook, that apparently as comments get closer and closer to the line of the terms of service, as things get more extreme, they spread exponentially, and those are the ones being algorithmically tamped down along the terms-of-service boundary. it's interesting that the more extreme something gets, the more rapidly it spreads, but that's something i've heard and it's very speculative, and to your point, kathy, i don't think it's addressing all the aspects of misinformation we're discussing here.
but katie, just to close from my perspective, i'm so grateful for this team and everybody we have here, for sharing their stories, sharing their voices, and helping to elevate this issue. i knew nothing about this going into it a couple of years ago, and it was through all of these conversations that i got to see how there is this invisible change happening in our society, through this code we interact with every day, and nobody realizes what's hiding on the other side of their screen. my hope, through all of these voices, is that people can wake up and recognize that there is something we need to pay attention to as a society, something we need to demand a change to, something we need to demand transparency into. just as kathy was saying, there are these opaque algorithms that drive our lives, that we have no insight into how they operate, and that's really scary. it touches each and every one of us, and we don't know what's driving it. so my biggest hope is that this can be a wake-up call, and that our society can rally together and say we want to do something about this.

thank you guys so much for this conversation, it was so critically important and so informative. rashida, tristan, kathy, tim, jeff, i really enjoyed speaking to all of you, and i cannot encourage everyone enough to watch the social dilemma. you did a fantastic job and took a lot of complicated issues and made them extremely accessible, i think, for the average person. it's on netflix, it's called the social dilemma, and please check it out. take care everybody, thank you so much.

thank you. thanks katie, thanks everybody. thanks everybody.