NEWS FEATURE
Correction for “News Feature: The genuine problem of fake news,”
by M. Mitchell Waldrop, which was first published November 16,
2017; 10.1073/pnas.1719005114 (Proc Natl Acad Sci USA
114:12631–12634).
The editors note that on page 12632, right column, first para-
graph, lines 4–7, “Dan Gillmor, a technology writer who heads the
Knight Center for Digital Media Entrepreneurship at Arizona
State University” should have instead appeared as “Dan Gillmor,
professor of practice and director of the News Co/Lab at Arizona
State University.” The online version has been corrected.
Published under the PNAS license.
www.pnas.org/cgi/doi/10.1073/pnas.1721073115
NEWS FEATURE
The genuine problem of fake news
Intentionally deceptive news has co-opted social media to go viral and influence millions.
Science and technology can suggest why and how. But can they offer solutions?
M. Mitchell Waldrop, Science Writer
In 2010 computer scientist Filippo Menczer heard a conference talk about some phony news reports that had gone viral during a special Senate election in Massachusetts. “I was struck,” says Menczer. He and his team at Indiana University Bloomington had been tracking early forms of spam since 2005, looking mainly at then-new social bookmarking sites such as https://del.icio.us/. “We called it social spam,” he says. “People were creating social sites with junk on them, and getting money from the ads.” But outright fakery was something new. And he remembers thinking to himself, “this can’t be an isolated case.”

Of course, it wasn’t. By 2014 Menczer and other social media watchers were seeing not just fake political headlines but phony horror stories about immigrants carrying the Ebola virus. “Some politicians wanted to close the airports,” he says, “and I think a lot of that was motivated by the efforts to sow panic.”

By the 2016 US presidential election, the trickle had become a tsunami. Social spam had evolved into “political clickbait”: fabricated money-making posts that lured millions of Facebook, Twitter, and YouTube users into sharing provocative lies—among them headlines claiming that Democratic candidate Hillary Clinton once sold weapons to the Islamic State, that Pope Francis had endorsed Republican candidate Donald Trump, and (from the same source on the same day) that the Pope had endorsed Clinton.

Social media users were also being targeted by Russian dezinformatsiya: phony stories and advertisements
Fig. 1. Fabricated social media posts have lured millions of users into sharing provocative lies. Image courtesy of Dave
Cutler (artist).
designed to undermine faith in American institutions, the election in particular. And all of it was circulating through a much larger network of outlets that spread partisan attacks and propaganda with minimal regard for conventional standards of evidence or editorial review. “I call it the misinformation ecosystem,” says Melissa Zimdars, a media scholar at Merrimack College in North Andover, MA.

Call it misinformation, fake news, junk news, or deliberately distributed deception, the stuff has been around since the first protohuman whispered the first malicious gossip (see Fig. 2). But today’s technologies, with their elaborate infrastructures for uploading, commenting, liking, and sharing, have created an almost ideal environment for manipulation and abuse—one that arguably threatens any sense of shared truth. “If everyone is entitled to their own facts,” says Yochai Benkler, codirector of the Berkman Klein Center for Internet & Society at Harvard University, echoing a fear expressed by many, “you can no longer have reasoned disagreements and productive compromise.” You’re “left with raw power,” he says, a war over who gets to decide what truth is.

If the problem is clear, however, the solutions are less so. Even if today’s artificial intelligence (AI) algorithms were good enough to filter out blatant lies with 100% accuracy—which they are not—falsehoods are often in the eye of the beholder. How are the platforms supposed to draw the line on constitutionally protected free speech, and decide what is and is not acceptable? They can’t, says Ethan Zuckerman, a journalist and blogger who directs the Center for Civic Media at Massachusetts Institute of Technology. And it would be a disaster to try. “Blocking this stuff gives it more power,” he says.

So instead, the platforms are experimenting in every way they can think of: tweaking their algorithms so that news stories from consistently suspect sites aren’t displayed as prominently as before, tying stories more tightly to reputable fact-checking information, and expanding efforts to teach media literacy so that people can learn to recognize bias on their own.

“There are no easy answers,” says Dan Gillmor, professor of practice and director of the News Co/Lab at Arizona State University in Tempe, AZ. But, Gillmor adds, there are lots of things platforms as well as science communicators can try.

Tales Signifying Nothing

Once Menczer and his colleagues started to grasp the potential extent of the problem in 2010, he and his team began to develop a system that could comb through the millions of publicly available tweets pouring through Twitter every day and look for patterns. Later dubbed Truthy, the system tracked hashtags such as #gop and #obama as a proxy for topics, and tracked usernames such as @johnmccain as a way to follow extended conversations.

That system also included simple machine-learning algorithms that tried to distinguish between viral information being spread by real users and fake grassroots movements—“astroturf”—being pushed by software robots, or “bots.” For each account, says Menczer, the algorithms tracked thousands of features, including the number of followers, what the account linked to, how long it had existed, and how frequently it tweeted. None of these features was a dead giveaway. But collectively, when compared with the features of known bots, they allowed the algorithm to identify bots with some confidence. It revealed that bots were joining legitimate online communities, raising the rank of selected items by artificially retweeting or liking them, promoting or attacking candidates, and creating fake followers. Several bot accounts identified by Truthy were subsequently shut down by Twitter.
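The account features Menczer describes—follower counts, linking behavior, account age, tweet frequency—lend themselves to a simple illustration. The sketch below is not Truthy’s actual algorithm (which tracked thousands of features with more sophisticated machine learning); it is a minimal nearest-centroid rule with invented feature names and toy data, showing how comparing an account against the profiles of known bots and known humans can yield a rough classification:

```python
# Illustrative sketch of feature-based bot scoring (not Truthy's actual code).
# An account is flagged as a likely bot when its feature vector sits closer
# to the centroid of known bots than to the centroid of known humans.

from dataclasses import dataclass
from math import sqrt


@dataclass
class Account:
    followers: int          # number of followers
    links_per_tweet: float  # fraction of tweets containing links
    age_days: int           # how long the account has existed
    tweets_per_day: float   # how frequently it tweets


def features(a):
    # Real systems would normalize/scale these; omitted for brevity.
    return [a.followers, a.links_per_tweet, a.age_days, a.tweets_per_day]


def centroid(accounts):
    vecs = [features(a) for a in accounts]
    n = len(vecs)
    return [sum(v[i] for v in vecs) / n for i in range(len(vecs[0]))]


def distance(u, v):
    return sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))


def is_probable_bot(account, known_bots, known_humans):
    f = features(account)
    return distance(f, centroid(known_bots)) < distance(f, centroid(known_humans))


# Toy labeled data: bots here are young, link-heavy, hyperactive accounts.
known_bots = [Account(10, 0.9, 30, 200.0), Account(5, 0.8, 10, 150.0)]
known_humans = [Account(500, 0.2, 2000, 5.0), Account(300, 0.1, 1500, 3.0)]

print(is_probable_bot(Account(8, 0.85, 20, 180.0), known_bots, known_humans))   # → True
print(is_probable_bot(Account(400, 0.2, 1800, 4.0), known_bots, known_humans))  # → False
```

As the article notes, no single feature is a dead giveaway; the point of the sketch is that the combination of features, compared collectively against labeled examples, is what carries the signal.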