Seminar 3-2
Google Search Bias
► ‘CEO’16
► ‘woman’17
► ‘girl’18
► ‘Latinas and Asian women’19
► Google’s autocomplete function used to generate offensive and biased
results.20
16
Daniel Cossins, ‘Discriminating algorithms: 5 Times AI showed prejudice’, New Scientist, 27 April 2018. Available online at
https://www.newscientist.com/article/2166207-discriminating-algorithms-5-times-ai-showed-prejudice/ (Last accessed on 12 July 2019).
17
Jonathan Cohn, ‘Google’s algorithms discriminate against women and people of colour’, The Conversation, 25 April 2019. Available online at
https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516 (Last accessed on 12 July 2019).
18
Ibid.
19
Ibid.
20
Ibid.
Biased Data → Biased AI
► AI programmes do not always make impartial judgements/decisions because
they are trained on biased human/real-world data.1
► Recruitment2
► Healthcare3
► Criminal justice system4
► Internet Search5
► AI bias could worsen inequality6
1
Brian Resnick, ‘Yes, artificial intelligence can be racist’, Vox, 24 January 2019. Available online at
https://www.vox.com/science-and-health/2019/1/23/18194717/alexandria-ocasio-cortez-ai-bias (Last accessed on 23 June 2019).
2
Ibid.
3
Ibid.
4
Ibid.
5
Jonathan Cohn, ‘Google’s algorithms discriminate against women and people of colour’, The Conversation, 25 April 2019. Available online at
https://theconversation.com/googles-algorithms-discriminate-against-women-and-people-of-colour-112516 (Last accessed on 12 July 2019).
6
Ibid.
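The slide's central claim — that a model trained on biased historical decisions will reproduce them — can be illustrated with a deliberately minimal sketch. Everything below (the records, the keywords, the majority-vote 'model') is invented for teaching purposes and does not represent any real recruitment system:

```python
from collections import Counter

# Toy historical hiring records: (keyword_in_cv, hired).
# The data encodes a past human bias: CVs containing "women's"
# (e.g. "women's chess club captain") were systematically rejected.
history = [
    ("captain", True), ("captain", True), ("captain", False),
    ("women's", False), ("women's", False), ("women's", False),
]

def train(records):
    """'Learn' a rule per keyword: the majority label in the data."""
    votes = {}
    for keyword, hired in records:
        votes.setdefault(keyword, Counter())[hired] += 1
    return {k: c.most_common(1)[0][0] for k, c in votes.items()}

model = train(history)
# The model faithfully reproduces the bias in its training data:
# it has learned nothing about merit, only about past decisions.
print(model["captain"])   # True  -> recommend hiring
print(model["women's"])   # False -> recommend rejecting
```

The point of the toy is that nothing in the code is malicious: the bias enters entirely through the training data, which is exactly the mechanism the slide describes.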
Class Activity 6: Google Auto-complete
► Enter the incomplete search terms in the list below (Batches A-D) into Google Search
and examine the results of Google’s Auto-complete function.
► Type the results of the auto-complete function into the Zoom chat box.
► Some of the results may contain highly offensive terms and phrases. If the results are
highly offensive, do not type them in the Zoom chat box. Instead, simply type the
following in the Zoom chat box: Search term ‘XXXX’ produces an offensive auto-
complete result.
► List of incomplete search terms:
► Batch A: Climate change is; Climate change is not; Climate change should; Climate
change does
► Batch B: Foreigners will; Refugees should; Why are Japanese; Why are Chinese;
Why are Asian; Chinese students are; Indians have; Pakistanis are; Africans are
► Batch C: Why are Singaporeans; Singaporeans are; Singaporeans have; Singaporeans
cannot
► Batch D: Wives should; Why wives should; Why do wives
Class Activity 7: Google Images
► Enter the following search terms into Google Images.
► Describe the Google Images search results in the Zoom chat box. Do you notice any biased patterns in the search
results?
► List of search terms:
► ‘CEO’; CEO-style; CEO-look
► ‘Professor’; Professor style; College Professor style;
► ‘Teacher’; Teacher look
► ‘nurse practitioner’; nurse look
► ‘telemarketers’
► ‘receptionist’
► ‘chef’
► shopping
► Search terms in quotation marks are taken from (1) Jennifer Langston, ‘Who’s a CEO? Google image results can shift gender biases’, UW
News, 9 April 2015. Available online at
https://www.washington.edu/news/2015/04/09/whos-a-ceo-google-image-results-can-shift-gender-biases/ (Last accessed on 24 September
2020); and (2) Oliver Duke-Williams, ‘How well do Google Image results represent reality?’, UCL Centre for Digital Humanities (UCLDH
Blog), 23 June 2015. Available online at https://blogs.ucl.ac.uk/dh/2015/06/23/how-well-do-google-image-results-represent-reality/ (Last
accessed on 24 September 2020).
Class Activity 6 & 7: Follow-up (Part 1)
► Perhaps you have noticed offensive auto-complete results and biased patterns
in your Google Images search results.
► Why were these offensive/biased results generated?
► Type your response in the Zoom chat box.
Class Activity 6 & 7: Follow-up (Part 2) - Biased Data → Biased AI
7
This case study draws on ProPublica’s analysis. See Jeff Larson, Surya Mattu, Lauren Kirchner and Julia Angwin, ‘How we analyzed the COMPAS Recidivism
Algorithm’, ProPublica, 23 May 2016. Available online at https://www.propublica.org/article/how-we-analyzed-the-compas-recidivism-algorithm (Last
accessed on 12 July 2019).
8
Ibid.
9
Ibid.
PredPol – Predictive Policing in the United States
► What is the goal of predictive policing?
► Predictive policing programmes are trained on ‘police-recorded data’.10
► Are ‘police-recorded data’ biased? Lum and Isaac: Yes, they reflect racial
prejudices/assumptions.11
► Drug crime – Drug use and Drug arrests (Oakland, California case study by
Kristian Lum and William Isaac)12
► What are the consequences for predictive policing? Lum and Isaac: ‘The
locations that are flagged for targeted policing are those that were … already
over-represented in the historical police data.’13
10
Kristian Lum and William Isaac, ‘To Predict and Serve?’, Significance 13 (2016), p. 15.
11
Ibid.
12
Ibid., pp. 16-19.
13
Ibid., p. 18.
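Lum and Isaac's feedback-loop argument can be sketched in a few lines. The districts, rates, and counts below are invented; only the mechanism — patrols are sent where past records are densest, and only patrolled areas generate new records — follows their description:

```python
import random

random.seed(0)  # fixed seed so the run is reproducible

# Two districts with the SAME true crime rate, but district A is
# over-represented in the historical police records.
true_rate = {"A": 0.3, "B": 0.3}
recorded = {"A": 20, "B": 5}   # biased starting data

for day in range(200):
    # 'Predictive' step: patrol the district with the most recorded crime.
    target = max(recorded, key=recorded.get)
    # Patrols only observe crime where they are sent, so only the
    # targeted district can add to the record.
    if random.random() < true_rate[target]:
        recorded[target] += 1

# District B is never patrolled again; A's lead only grows.
print(recorded)
```

Because district A starts ahead and only the patrolled district can accumulate new records, the initial imbalance is amplified rather than corrected — the feedback loop the quotation on this slide describes.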
Facial Recognition Technology
► China is a world leader in facial recognition technology.
► YITU
► Megvii
► SenseTime
► Hikvision
► Facial Recognition and Race – MIT case study involving facial recognition
technology from Microsoft, IBM, and Megvii14
► Higher misidentification rate for dark-skinned people. Why?15
14
Timothy Revell, ‘Face-recognition software is perfect – if you’re a white man’, New Scientist, 13 February 2018. Available online at
https://www.newscientist.com/article/2161028-face-recognition-software-is-perfect-if-youre-a-white-man/ (Last accessed on 12 July 2019).
15
Ibid.
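One way to see why an aggregate accuracy figure can mask this kind of disparity is to disaggregate error rates by group. The counts below are invented for illustration; the real figures are in the MIT study cited above:

```python
# Hypothetical per-group evaluation log: (group, prediction_was_correct).
results = (
    [("lighter-skinned", True)] * 98 + [("lighter-skinned", False)] * 2
    + [("darker-skinned", True)] * 65 + [("darker-skinned", False)] * 35
)

def error_rate(records, group):
    """Misidentification rate computed for one group only."""
    outcomes = [ok for g, ok in records if g == group]
    return 1 - sum(outcomes) / len(outcomes)

# A single aggregate figure looks respectable and hides the gap...
overall = sum(ok for _, ok in results) / len(results)
print(f"overall accuracy: {overall:.1%}")

# ...which only appears once results are broken down by group:
print(f"lighter-skinned error rate: {error_rate(results, 'lighter-skinned'):.0%}")  # 2%
print(f"darker-skinned error rate:  {error_rate(results, 'darker-skinned'):.0%}")   # 35%
```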
Class Activity 8
► Zoom Poll - When performing searches on Google, do you often look past
the first page of the search results? (Yes/No)
► What do I need to do in order to appear on the first page of Google? Let’s ask
Google!
► Google: how to get on the first page of google
► What type of advice do you see? Which companies/individuals/organizations
are offering the advice?
► Now that you know that the high-ranking search results are heavily
‘gamed’, do you think there’s a stronger case for modifying algorithms
in order to produce ‘sanitized’ results that serve the public interest?
► If you think there’s a stronger case for doing so, what can you do to
convince the tech giants to adjust their algorithms?
► Type your response in the Zoom chat box.
Healthcare
“To minimize this risk, data must be representative of the population at large and the
benefits it confers must be available to all. The digital divide is a threat to this ideal.
Social determinants of health still shape access to technology. About 11% of U.S. adults
do not use the Internet. Low-income communities and households are about four times
more likely than middle- or high-income communities to lack access to broadband
technology. Similarly, 21% of uninsured patients do not use the Internet and a much
larger percentage of patients do not seek health information online. Thus, differential
access to technology not only threatens the representativeness of the data that populate
our big data models and inform the resultant algorithms, but also undermines the
potential of big data to improve the lives of the most vulnerable people”
Said A. Ibrahim, Mary E. Charlson, and Daniel B. Neill, ‘Big Data Analytics and the Struggle for Equity in Health Care: The Promise and Perils’,
Health Equity 4 (2020), pp. 99-101. doi: 10.1089/heq.2019.0112. Available online at
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7133425/.
Class Activity 12: Survival of the Best Fit
► In preparation for the DD/MM seminar, please play the following online educational
game: 'Survival of the Best Fit'.
► It's available online at: https://www.survivalofthebestfit.com/
► You should be able to complete the game in about 10 minutes.
► 'Survival of the Best Fit' illustrates the dangers of AI bias through the operations of a
mock AI recruitment tool.
► Please post your reflections on the game and on AI bias in this thread.
Recruitment
► AI is also being used in recruitment decisions.
► But it may reinforce the biased trends in a company’s historical employment
data.32
► E.g. In 2018, Amazon revealed that its AI recruitment programme discriminated
against women.33
32
Juliet Eccleston, ‘AI for recruitment: can robots really offer HR teams complete bias-free hiring’. HRZone, 14 May 2019. Available online at
https://www.hrzone.com/talent/acquisition/ai-for-recruitment-can-robots-really-offer-hr-teams-complete-bias-free-hiring (Last accessed on 12 July 2019).
33
Isobel Asher Hamilton, ‘Why it’s totally unsurprising that Amazon’s recruitment AI was biased against women’, Business Insider (Singapore), 13 October 2018. Available
online at https://www.businessinsider.sg/amazon-ai-biased-against-women-no-surprise-sandra-wachter-2018-10/?r=US&IR=T (Last accessed on 12 July 2019).
Class Activity 11: The voices behind your AI voice assistants
…
► Siri: https://www.youtube.com/watch?v=z2bTymnb1uE
► Alexa: https://www.youtube.com/watch?v=HkanSVgHeYU
► Cortana: https://www.youtube.com/watch?v=YFweSyEQiv0
► Google Assistant: https://www.youtube.com/watch?v=JvbHu_bVa_g
► Do you notice a pattern here? Type your response into the Zoom chat
box.
Class Activity 11: Follow-up - The voices behind your AI voice
assistants …
► ‘I think that probably reflects what some men think about women – that they’re not
fully human beings.’
- Kathleen Richardson (Quoted in Adrienne Lafrance, ‘Why Do So Many Digital Assistants Have
Feminine Names?’, The Atlantic, 30 March 2016. Available online at
https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/.)
► ‘It’s much easier to find a female voice that everyone likes than a male voice that
everyone likes.’
- Clifford Nass (Quoted in Adrienne Lafrance, ‘Why Do So Many Digital Assistants Have
Feminine Names?’, The Atlantic, 30 March 2016. Available online at
https://www.theatlantic.com/technology/archive/2016/03/why-do-so-many-digital-assistants-have-feminine-names/475884/.)
Social Stability in the Age
of the Tech Giants
The Online Spread of Fake News and
Harmful Content
Social Media Algorithms and the Spread of Fake News and Harmful Content
34
Katherine Viner, ‘How technology disrupted the truth’, The Guardian, 12 July 2016. Available online at https://www.theguardian.com/media/2016/jul/12/how-
technology-disrupted-the-truth (Last accessed on 23 June 2019).
35
Ibid.
36
Ibid.
The Emergence of ‘filter bubbles’37
► Major tech platforms offer personalised results to their users, leading to the
emergence of ‘filter bubbles’.38
► E.g. Personalised search results.39
► To make matters worse –
Del Vicario, Bessi, Zollo, Petroni, Scala, Caldarelli, Stanley, and Quattrociocchi:
‘Users tend to aggregate in communities of interest, which causes
reinforcement and fosters confirmation bias, segregation, and
polarization.’40
37
Ishtiaque Hossain, ‘Filter bubbles are shrinking our minds’, HuffPost, 14 September 2019. Available online at https://www.huffingtonpost.in/ishtiaque-
hossain/filter-bubbles-are-shrinking-our-minds_a_21469747/ (Last accessed on 12 July 2019).
38
Ibid.
39
Ibid.
40
Michela Del Vicario, Alessandro Bessi, Fabiana Zollo, Fabio Petroni, Antonio Scala, Guido Caldarelli, H. Eugene Stanley, and Walter Quattrociocchi, ‘The
spreading of misinformation online’, Proceedings of the National Academy of Sciences of the United States of America 113 (2016), p. 558.
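The reinforcement dynamic described by Del Vicario et al. can be caricatured in a few lines of code. The topics and the engagement-only ranking rule below are invented; the point is only that a recommender with no diversity term converges on whatever the user first clicked:

```python
from collections import Counter

# A tiny content catalogue and one initial user click.
catalogue = ["left-politics", "right-politics", "sport", "science"]
clicks = Counter({"left-politics": 1})

def recommend(catalogue, clicks):
    """Rank purely by engagement history — no diversity term."""
    return max(catalogue, key=lambda topic: clicks[topic])

feed = []
for _ in range(10):
    topic = recommend(catalogue, clicks)
    feed.append(topic)
    clicks[topic] += 1  # the user engages with what they are shown

# Every recommendation is the same topic — the bubble never widens.
print(feed)
```

A single early preference, fed back through the ranking rule, locks the feed onto one topic — a crude but faithful picture of how personalisation plus engagement can produce a 'filter bubble'.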
Journalism Then and Now
► Humans have always (1) enjoyed being told things that they want to hear and
(2) sought the company of like-minded people.
► Respectable newspapers and news organisations have always adhered to strict
ethical and professional standards – Matt Yglesias: ‘a commitment to accuracy
and basic integrity.’ 41
► See, for example, the Editors’ Code of Practice in IPSO’s (Independent Press
Standards Organisation) website (https://www.ipso.co.uk/).
41
Matthew Yglesias, 'The Case against Facebook', Vox, 9 April 2018. Available online at https://www.vox.com/policy-and-politics/2018/3/21/17144748/case-
against-facebook (Last accessed on 17 July 2019).
Journalism Then and Now
► Today: Massive supply of online content providers not bound by traditional
ethical and professional standards.
► Margaret Simons: ‘Today, just about anyone with an internet connection and a
social media account has the capacity to publish news and views to the world.
This is new in human history.’42
► Social media algorithms that give people what they want to hear + Ease of
online coordination of ‘communities of interest’ + large supply of unprofessional
content = disruption of socio-political stability
42
Margaret Simons, ‘Journalism faces a crisis worldwide – we might be entering a new dark age’, The Guardian, 15 April 2017. Available online at
https://www.theguardian.com/media/2017/apr/15/journalism-faces-a-crisis-worldwide-we-might-be-entering-a-new-dark-age (Last accessed on
12 July 2019).
Social Stability in the Age
of the Tech Giants
Automation and Unemployment
The Impact of Automation on the Future of Work: The Optimistic View
► Automation will both destroy and create jobs; new jobs created will make up
for job losses.43
► History will repeat itself.44
► Higher productivity → Lower prices → Higher demand → Job creation45
► Higher profits → Higher wages → Higher consumption → Job creation46
► Lower prices → Higher real wages → Higher consumption → Job creation47
► Technological progress rarely results in the automation of entire jobs.48
43
Henry J. Holzer, ‘Will robots make job training (and workers) obsolete? Workforce development in an automating labor market’, Brookings, 19 June 2017. Available online
at https://www.brookings.edu/research/will-robots-make-job-training-and-workers-obsolete-workforce-development-in-an-automating-labor-market/ (Last accessed on 1
July 2019).
44
Ibid.
45
Sarah Kessler, ‘The optimist’s guide to the robot apocalypse’, Quartz, 9 March 2018. Available online at https://qz.com/904285/the-optimists-guide-to-the-robot-
apocalypse/ (Last accessed on 22 June 2019).
46
Ibid.
47
Ibid.
48
Ibid.
The Impact of Automation on the Future of Work: The Pessimistic View
49
Henry J. Holzer, ‘Will robots make job training (and workers) obsolete? Workforce development in an automating labor market’, Brookings, 19 June 2017. Available online
at https://www.brookings.edu/research/will-robots-make-job-training-and-workers-obsolete-workforce-development-in-an-automating-labor-market/ (Last accessed on 1
July 2019).
50
Gary Grossman, ‘It’s time for workers to worry about AI’, VentureBeat, 7 April 2019. Available online at https://venturebeat.com/2019/04/07/its-time-for-workers-to-worry-about-ai/ (Last
accessed on 22 June 2019).
51
For an explanation of UBI, see Kelsey Piper, ‘The important questions about universal basic income haven’t been answered yet’, Vox, 13 February 2019. Available online at
https://www.vox.com/future-perfect/2019/2/13/18220838/universal-basic-income-ubi-nber-study (Last accessed on 30 June 2019).
Class Activity 10: Automation and Unemployment
► Interactive Content: David Johnson, ‘Find out if a robot will take your job’,
Time, 21 April 2017. Available online at
https://time.com/4742543/robots-jobs-machines-work/ (Last accessed on 24 September 2020).
► Enter as many job categories as you can into the ‘Will a Robot Take my Job?’
box in the middle of the page. Please enter a broad variety of job categories:
blue-collar jobs, white-collar jobs, etc.
► Take note of the job categories that are highly vulnerable to automation and
the job categories that are less vulnerable to automation. Type your answers
in the Zoom chat box.
► Reflect on the similarities between the jobs in each list. What do these
similarities tell us about the potential and the limits of automation?
Social Institutions in the
Age of the Tech Giants
Surveillance Capitalism and the
Surveillance State
The Tech Giants – Natural Allies of Democracy?
► Arab Spring protests (2011)
► Protestors used major tech platforms to coordinate their uprisings and promote
their cause.52
► Commentators concluded that the rise of the tech giants would promote and
strengthen democracy.53
52
Zack Beauchamp, ‘Social Media is rotting democracy from within’, Vox, 22 January 2019. Available online at
https://www.vox.com/policy-and-politics/2019/1/22/18177076/social-media-facebook-far-right-authoritarian-populism (Last accessed on 23 June 2019).
53
Ibid.
The Tech Giants – A Danger to Democracy?
► Authoritarian groups within established democracies are using social media to
expand their influence and popularity.54
► The rise of ‘surveillance capitalism’ – defined as the collection and monetisation of
users’ data – is also now seen as a threat to democracy.55
► Cambridge Analytica scandal56
► The tech giants are able to threaten democracy by using their data for political
purposes.57
54
Zack Beauchamp, ‘Social Media is rotting democracy from within’, Vox, 22 January 2019. Available online at
https://www.vox.com/policy-and-politics/2019/1/22/18177076/social-media-facebook-far-right-authoritarian-populism (Last accessed on 23 June 2019).
55
The term ‘surveillance capitalism’ was coined by Shoshana Zuboff. See John Naughton’s interview with Zuboff, ‘“The goal is to automate us”:
welcome to the age of surveillance capitalism’, The Guardian, 20 January 2019. Available online at https://www.theguardian.com/technology/2019/jan/20/shoshana-zuboff-age-
of-surveillance-capitalism-google-facebook (last accessed on 26 June 2019).
56
See Alvin Chang, ‘The Facebook and Cambridge Analytica scandal, explained with a simple diagram’, Vox, 2 May 2018 [Available online at
https://www.vox.com/policy-and-politics/2018/3/23/17151916/facebook-cambridge-analytica-trump-diagram (Last accessed on 1 July 2019)]; and Sean Illing, ‘Cambridge Analytica, the shady data
firm that might be a key Trump-Russia link, explained’, Vox, 4 April 2018 [Available online at https://www.vox.com/policy-and-politics/2017/10/16/15657512/cambridge-analytica-facebook-
alexander-nix-christopher-wylie (Last accessed on 1 July 2019)].
57
Chase Johnson, ‘Big tech surveillance could damage democracy’, The Conversation, 4 June 2019. Available online at https://theconversation.com/big-tech-surveillance-could-damage-democracy-
115684 (Last accessed on 24 June 2019).
Big Data, Surveillance Capitalism, and the Surveillance State
► The Chinese tech giants’ business model is also built on surveillance capitalism.
► Baidu
► Alibaba
► Tencent
► Big Data has possibly strengthened the Chinese government’s control over society
and the economy.58
► Big Data has strengthened the Chinese government’s confidence in centralised economic
planning.59
► The concentration of data collection in the hands of a few Chinese tech giants has
increased the Chinese government’s confidence in its ability to shape and control society
and the economy.60
58
Sebastian Heilmann, ‘Big data reshapes China’s approach to governance’, The Financial Times, 29 September 2017. Available online at
https://www.ft.com/content/43170fd2-a46d-11e7-b797-b61809486fe2 (Last accessed on 24 June 2019).
59
Ibid.
60
Ibid.
The Social Credit System
► The Social Credit System: The scoring of citizens’ behaviour. Benefits/penalties
will be awarded/meted out according to one’s social credit score.61
► China is not the first or the only country to rate and shape human behaviour.
► FICO credit score62
► Schufa63
► COMPAS
► PredPol
61
Alexandra Ma, ‘China has started ranking citizens with a creepy “social credit” system – here’s what you can do wrong, and the embarrassing, demeaning ways
they can punish you’, Business Insider (Singapore), 29 October 2018. Available online at https://www.businessinsider.sg/china-social-credit-system-punishments-
and-rewards-explained-2018-4/?r=US&IR=T (Last accessed on 12 July 2019).
62
For an explanation of the FICO credit score, see Justin Pritchard, ‘How credit scores work and what they say about you’, The Balance, 15 December 2018.
Available online at https://www.thebalance.com/how-credit-scores-work-315541 (Last accessed on 30 June 2019).
63
Cathrin Schaer, ‘Germany edges toward Chinese-style rating of citizens’, Handelsblatt Today, 17 February 2018. Available online at
https://www.handelsblatt.com/today/politics/big-data-vs-big-brother-germany-edges-toward-chinese-style-rating-of-citizens/23581140.html?ticket=ST-953057-
mP0vUj9ysq7UzciqSdcT-ap2 (Last accessed on 30 June 2019).