jinfopoli_9_1_370
Advertising Infrastructure
ABSTRACT
Disinformation and other forms of manipulative, antidemocratic communication have emerged as a problem for Internet policy. While such operations are not limited to electoral politics, efforts to influence and disrupt elections have created significant concerns. Data-driven digital advertising has played a key role in facilitating political manipulation campaigns. Rather than stand-alone incidents, manipulation operations reflect systemic issues within digital advertising markets and infrastructures. Policy responses must include approaches that consider digital advertising platforms and the strategic communications capacities they enable. At their root, these systems are designed to facilitate asymmetrical relationships of influence.
Keywords: Disinformation, political advertising, infrastructure, targeted advertising, social media
This research was supported in part by the Government of Canada. The views expressed here are
the authors’ own.
8. As Full Fact points out, a “moral panic” around fake news could prompt overreactions that
threaten free speech. Full Fact.
9. Shane and Blinder.
10. Bradshaw and Howard.
11. U.S. v. Internet Research Agency.
12. Bradshaw and Howard.
Political Manipulation 375
13. Ibid.
14. Ghosh and Scott, “Russia’s Election Interference.”
15. McNair.
16. Kaye; Kim et al.; Valentino-DeVries.
17. Google, “Changing Channels,” 12.
376 JOURNAL OF INFORMATION POLICY
social feeds, mobile apps, websites, and other channels. Highly segmented
message targeting, through digital advertising, can help spur “organic”
amplification and generate human assets for information operations.18
around characteristics and traits that have not been self-disclosed by the
targets.30 Such inferences have been made available to target, or exclude,
politically sensitive groups for social media ad campaigns.31
36. Enwemeka.
37. Ghosh and Scott, “Digital Deceit I.”
38. HubSpot. "What Is Deep Learning?" https://blog.hubspot.com/marketing/what-is-deep-learning.
39. Kaptein et al.; Berkovsky, Kaptein, and Zancanaro, 18.
variations to see which are the most effective. Advertisers can use such tools to determine what issues resonate with particular targets as well as test for fears or prejudices that can be invoked to influence political behavior.
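The variant-testing workflow described above amounts to comparing response rates across ad versions and keeping the winner. A minimal sketch, using entirely hypothetical variant names and counts (the article describes no specific tool or data):

```python
# Minimal sketch of ad-variant ("A/B") testing as described above.
# Variant names and counts are hypothetical illustrations.
from collections import Counter

impressions = Counter({"variant_a": 1000, "variant_b": 1000, "variant_c": 1000})
clicks = Counter({"variant_a": 12, "variant_b": 47, "variant_c": 8})

def click_through_rates(impressions, clicks):
    """Return each variant's click-through rate (clicks / impressions)."""
    return {v: clicks[v] / impressions[v] for v in impressions}

def best_variant(impressions, clicks):
    """Pick the variant whose message drew the strongest response."""
    rates = click_through_rates(impressions, clicks)
    return max(rates, key=rates.get)

print(best_variant(impressions, clicks))  # prints "variant_b"
```

Real ad platforms automate this loop at scale, which is what lets an advertiser probe which appeals, including fears or prejudices, draw the strongest response from a given segment.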
43. Ariely.
44. Calo; Shaw.
45. PHD Media.
46. Some of the landmark contributions to this line of critique include Williamson; Packard; Ewen; McClintock.
47. Zuboff; Ghosh and Scott, “Digital Deceit I.”
They exploited social unrest and human cognitive biases. The divi-
sive propaganda Russia used to influence American thought and
steer conversations for over three years wasn’t always objectively false.
The content designed to reinforce in-group dynamics would likely
have offended outsiders who saw it, but the vast majority wasn’t hate
speech. Much of it wasn’t even particularly objectionable. But it was
absolutely intended to reinforce tribalism, to polarize and divide . . . 55
Digital ad systems offer a great advantage for such efforts over mass audi-
ence print and broadcast media. First, microtargeting allows advertisers
61. Allbright.
62. Penzenstadler, Heath, and Guynn.
63. Facebook Business.
64. Google. “Political Content.”
65. House of Commons of Canada, “Disinformation and ‘Fake News’: Interim Report,” 37.
66. Ravel, Woolley, and Sridharan.
67. Ibid.
68. Turow.
Discussion of Transparency
Major tech companies such as Facebook and Google have already started
to implement their own policies requiring certain types of political ads in
some countries to include a disclaimer naming a sponsor and to go through
a verification process. These verification processes, however, have proved
feeble. Just before the 2018 midterm election, a VICE News investigation
team “applied to buy fake ads on behalf of all 100 sitting U.S. senators,
including ads ‘Paid for by’ Mitch McConnell and Chuck Schumer. All 100
sailed through the system, indicating that just about anyone can buy an
ad identified as ‘Paid for by’ a major U.S. politician.”71 Even if measures
are put in place to prevent advertisers from impersonating elected officials,
Data Rights
71. Turton.
72. Information Commissioner’s Office; Bradshaw, Neudert, and Howard.
73. Ghosh and Scott, “Digital Deceit II”; Greenspon and Owen.
74. Ghosh and Scott, “Digital Deceit II,” 22.
75. General Data Protection Regulation, Article 4(11).
76. Woodrow Hartzog, "Policy Principles for a Federal Data Privacy Framework in the United States," testimony before the U.S. Senate Committee on Commerce, Science and Transportation (2019).
77. General Data Protection Regulation, Article 4(11).
78. Dillet.
79. Ravel, Woolley, and Sridharan, 14.
80. Chester and Montgomery.
81. Rothchild.
82. Centre for International Governance Innovation.
83. Sample was 6,387 adults in France, Germany, the United Kingdom, and the United States. RSA Security.
84. House of Commons of Canada. “Disinformation and ‘Fake News’: Final Report.”
85. Bradshaw, Neudert, and Howard; McCann and Hall.
possible to give individuals control over how their data is used by different
advertisers.
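Such per-advertiser control could take the form of a consent record keyed to individual advertisers rather than a single blanket opt-in. A hypothetical sketch, with illustrative names not drawn from any real platform:

```python
# Hypothetical sketch of granular, per-advertiser consent
# (class and field names are illustrative assumptions).
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    """Tracks which advertisers a user has permitted to target them."""
    user_id: str
    granted: set = field(default_factory=set)

    def grant(self, advertiser: str) -> None:
        self.granted.add(advertiser)

    def revoke(self, advertiser: str) -> None:
        self.granted.discard(advertiser)

    def may_target(self, advertiser: str) -> bool:
        # Default-deny: only explicitly granted advertisers may target.
        return advertiser in self.granted

ledger = ConsentLedger("user-123")
ledger.grant("acme_ads")
print(ledger.may_target("acme_ads"))   # prints True
print(ledger.may_target("other_ads"))  # prints False
```

The default-deny design choice mirrors the opt-in logic of GDPR-style consent: absent an explicit grant, no advertiser may use the profile.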
One implementation of granularity would apply consent not only
86. In addition to seeking blanket consent from its users, Facebook has also “bundled” con-
sent to advertising within its more general terms of service provision. At the time of this writing,
privacy regulators in several EU countries are investigating this issue as it pertains to Facebook
and other major ad platforms.
87. To the best of our knowledge, the GDPR is not clear on whether consent is required to be
obtained by advertisers that use the built-in targeting capacities of an ad platform like Facebook.
88. "Guidelines on Automated Individual Decision-Making and Profiling for the Purposes of Regulation 2016/679" (Article 29 Data Protection Working Party, October 3, 2017), https://ec.europa.eu/newsroom/article29/item-detail.cfm?item_id=612053.
under the rationale that market forces will diversify the tech services land-
scape and give consumers more choices that enhance privacy and reduce
manipulative targeting.
95. Hartzog, "Policy Principles for a Federal Data Privacy Framework in the United States."
96. House of Commons of Canada. “Disinformation and ‘Fake News’: Final Report.”
97. Bradshaw, Neudert, and Howard.
collected; keeping data only for as long as necessary; and limiting access
to only those who truly need it.”98
2. Advertising profile information or certain categories therein could be
106. “IPA to Call for Moratorium on Micro-Targeted Political Ads Online,” accessed
March 5, 2019, https://ipa.co.uk/news/ipa-to-call-for-moratorium-on-micro-targeted-political-
ads-online#.
107. Ibid.
108. Singer.
Bibliography
Angwin, Julia, and Terry Parris Jr. “Facebook Lets Advertisers Exclude Users by Race.”
ProPublica, October 28, 2016. Accessed March 15, 2019. https://www.propublica.org/
article/facebook-lets-advertisers-exclude-users-by-race.
Angwin, Julia, Madeleine Varner, and Ariana Tobin. “Facebook Enabled Advertisers to Reach
‘Jew Haters.’” ProPublica, September 14, 2017. Accessed March 15, 2019. https://www.
propublica.org/article/facebook-enabled-advertisers-to-reach-jew-haters.
Ariely, Dan. Predictably Irrational: The Hidden Forces That Shape Our Decisions. New York:
Harper Collins, 2008.
Beckett, Lois. “Trump Digital Director Says Facebook Helped Win the White House.”
The Guardian, October 9, 2017, sec. Technology. https://www.theguardian.com/
technology/2017/oct/08/trump-digital-director-brad-parscale-facebook-advertising.
Berkovsky, Shlomo, Maurits Kaptein, and Massimo Zancanaro. “Adaptivity and Personalization
in Persuasive Technologies.” In Proceedings of the Personalization in Persuasive
Technology Workshop, Persuasive Technology 2016, edited by R. Orji, M. Reisinger,
M. Busch, A. Dijkstra, A. Stibe, and M. Tscheligi, Salzburg, Austria, April 5, 2016.
Bey, Sebastian, Giorgio Bertolin, Nora Biteniece, Edward Christie, and Anton Dek.
“Responding to Cognitive Security Challenges.” NATO STRATCOM Centre
of Excellence, January 2019. Accessed March 15, 2019. https://stratcomcoe.org/
responding-cognitive-security-challenges.
Bodine-Baron, E., T. Helmus, A. Radin, and E. Treyger. Countering Russian Social Media
Influence. Santa Monica, CA: Rand Corporation, 2018. Accessed March 15, 2019. https://
www.rand.org/content/dam/rand/pubs/research_reports/RR2700/RR2740/RAND_
European Commission and High Representative of the Union for Foreign Affairs and Security Policy. Action Plan against Disinformation (No. JOIN(2018) 36 final), 2018a. Accessed March 15, 2019. https://ec.europa.eu/commission/sites/beta-political/files/
McCann, D., and M. Hall. Blocking the Data Stalkers. New Economics Foundation, 2018. Accessed March 15, 2019. https://neweconomics.org/uploads/files/NEF_Blocking_Data_Stalkers.pdf.
“RSA Data Privacy & Security Survey 2019: The Growing Data Disconnect between Consumers
and Businesses.” RSA Security, February 6, 2019. https://www.rsa.com/content/dam/en/
misc/rsa-data-privacy-and-security-survey-2019.pdf.
Court Case
U.S. v. Internet Research Agency, 18 U.S.C. §§ 2, 371, 1349, 1028A (U.S. Dist., D.C., 2018).