
Education for Information 34 (2018) 185–197
DOI 10.3233/EFI-180209
IOS Press

Disinformation, dystopia and post-reality in social media: A semiotic-cognitive perspective

Rebeka F. Guarda, Marcia P. Ohlson and Anderson V. Romanini∗


School of Communications and Arts, University of São Paulo, SP, Brazil

Recent political events, such as Brexit (UK) and the election of Donald Trump (USA), have made it clear that political marketing is using ‘Big Data’ intensively. Information gathered from social media networks is organized into digital environments and has the power to determine the outcome of elections, plebiscites and popular consultations. New advertising and persuasion mechanisms have been created that undermine the reliability of the traditional mass media communication familiar to the general audience. Consequently, ‘fake news’ and ‘alternative facts’ have emerged along with the notion of ‘post-truth’, which names a state of affairs in which public opinion has been contaminated by these strategies. Drawing on the pragmatic-semiotic concepts developed by Peirce, such as belief, mental habits, controlled action, final opinion, truth, and reality, we argue that the ‘global village’ (McLuhan, 2008) may be at a dangerous fork in the road. Peirce’s ‘scientific method’ rests on (1) the formulation of hypotheses, (2) the deduction of their consequences, and (3) the design of experiments that test our beliefs against results to be critically evaluated by communities of researchers. This fork in the road, which is rapidly evolving into a dystopia built and reaffirmed by the spread of disinformation on social networks, points towards a ‘post-reality’ that may offer an illusory and brief comfort zone for those who live in it, but may also represent a tragedy with no turning back for our entire civilization.

Keywords: Disinformation, fake news, belief, public opinion, pragmatism, Peirce

1. Introduction

The spread of disinformation is a topic that has gained increasing visibility worldwide, especially after indications that this type of practice may
have influenced the outcome of political events, such as the 2016 U.S. elections and
the Brexit decision, which signaled the withdrawal of Britain from the European
Union. In 2016, this discussion was incorporated into the international public sphere
debate after The Economist published an article entitled “Art of the Lie” (2016),
which focused on the term ‘post-truth’ and blamed the internet and social media for
the dissemination of lies told by politicians, such as Donald Trump. A few months
later, Oxford Dictionaries selected ‘post-truth’ as the word of the year, describing it
as an adjective “relating to or denoting circumstances in which objective facts are less

∗ Corresponding author: Anderson V. Romanini, School of Communications and Arts, University of São Paulo, Brazil. E-mail: vinicius.romanini@usp.br.

0167-8329/18/$35.00 © 2018 – IOS Press and the authors. All rights reserved

influential in shaping public opinion than appeals to emotion and personal belief”
(Oxford Dictionaries, 2016).
Little by little, untrue information that seemed to be created and spread in order to obtain advantages started being called ‘fake news’ by mass media communication businesses and researchers alike. BuzzFeed media editor Craig Silverman, one of the people responsible for popularizing the term, used it for the first time on Twitter in 2014 (Silverman, 2017). In January 2017, at the first presidential
press conference since Election Day, Donald Trump refused to take questions from CNN reporter Jim Acosta, saying he would not answer questions from CNN because the network was ‘fake news’ (CNN, 2017). The establishment and popularization of this term in the public sphere, and the dispute over the grounds on which journalistic information is based, intensified when U.S. Counselor to President Trump Kellyanne Conway used the term “alternative facts” during a “Meet the Press” interview on January 22, 2017. In it, she defended White House Press Secretary Sean Spicer’s false statement about the attendance numbers at Donald Trump’s Inauguration Day (Swaine, 2017).
‘Post-truth’ can be understood as the by-product of a phenomenon long addressed in the psychological literature, ‘confirmation bias’, that is, the tendency to assess information selectively, accepting only evidence that supports an initial belief or hypothesis. “Confirmation bias, as the term is typically used in the psychology literature, connotes the seeking or interpreting of evidence in ways that are partial to existing beliefs, expectations, or a hypothesis in hand” (Nickerson, 1998, p. 175). From the systemic point of view of pragmatic communication theory (Watzlawick, 1968), which is directly related to the psychology of communication, ‘confirmation bias’ produces positive feedback on a predisposition toward disagreement and conflict, leading communication agents into a spiral that gravitates towards a violent clash of opinions (idem, pp. 28–32).
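The systemic point can be made concrete with a toy simulation (our illustration, not drawn from Watzlawick or Nickerson; the agents, the gain parameter and the update rule are invented for the sketch): two agents who ignore disconfirming evidence drift steadily apart even when fed the very same neutral stream of evidence.

```python
import random

def update(belief, evidence, gain=0.1):
    """One-sided belief update: evidence that matches the agent's
    current stance is absorbed; contradicting evidence is ignored."""
    if (evidence > 0) == (belief > 0):  # accept confirming evidence only
        belief += gain * evidence
    return belief

random.seed(42)
a, b = 0.1, -0.1  # two agents with mildly opposed initial stances

for _ in range(100):
    e = random.uniform(-1, 1)  # the same neutral evidence for both
    a = update(a, e)
    b = update(b, e)

# Identical, balanced input still polarizes the pair: each agent keeps
# only what confirms its stance, a positive-feedback spiral.
print(f"agent A: {a:+.2f}  agent B: {b:+.2f}")
```

The bias needs no distorted input: the asymmetry of the update alone produces the escalating disagreement described above.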

2. Disinformation and public opinion

Since mid-2017, the academic debate concerning disinformation and ‘fake news’
has been growing quickly. However, many researchers highlight that this topic is not entirely new, since lies and manipulation have always existed, especially those deliberately spread to achieve specific goals in public opinion. Examples are common in the literature and relate mainly to war propaganda and to inflamed debates between rival political groups, especially during election periods or social uprisings. More broadly, disinformation and the promotion of ignorance, fundamentalism, and prejudice through inaccurate information have been pointed out as main strategies for social domination or for the demobilization of protests in the face of injustices and attacks on fundamental human rights.
Assuredly, studies about the ideological use of mass media communication are
a fundamental part of the critical theories inspired by Marx in the 20th century.
Adorno, Bakhtin, Gramsci and Baudrillard have all written about the role of mass media communication and its strategies of (mis)representing social reality and the social production of meaning, so as to ensure the maintenance of the status quo and the reproduction of the forms of domination of one class over another. In his book “Marxism and the Philosophy of Language”, first published in 1930 and one of the best-known works of the so-called “Bakhtin Circle”, Voloshinov (1973) emphasized that linguistic signs are dynamic entities, capable of simultaneously reflecting and refracting existence according to the ideological perspective that conditions social dialogue:
Every sign, as we know, is constructed between socially organized persons in
the processes of their interaction. Therefore, the forms of signs are conditioned
above all by the social organization of the participants involved and also by the
immediate conditions of their interaction. When those forms change, so does
the sign. And it should be one of the tasks of the study of ideologies to trace
the social life of the verbal sign. Only so approached can the problem of the
relationship between sign and existence find its concrete expression; only then
will the process of the causal shaping of the sign by existence stand out as a
process of genuine existence-to-sign transit, of genuine dialectical refraction of
existence in the sign. (p. 21, author’s emphasis).1
In the 1980s, Baudrillard (1994) exposed the set of simulacra and simulations produced by the colonization of our representations through advertising strategies. According to him, human experience itself has become a simulation, since signs have become stronger than reality. In this context, the so-called simulacra are simulations disconnected from reality, i.e., they represent elements that do not exist and yet, paradoxically, become models for reality. Even in functionalist mass communication research, the excess of information has been criticized for its impact on public opinion concerning socially relevant topics. According to this school of thought, excessive information would generate confusion and passivity among the population instead of engagement for social change. Lazarsfeld and Merton (1948) coined the term
‘narcotizing dysfunction’ to describe the effect that the overwhelming flow of infor-
mation, produced by mass media, had on individuals, making them passive in their
social activism. According to this hypothesis, because of the amount of diverse in-
formation available, individuals spend more time trying to understand current issues

1 Author of important concepts such as dialogism and chronotope, the semioticist and philosopher,

Mikhail Bakhtin gathered a group of scholars around him that became known as “the Bakhtin Circle”, of
which Valentin Voloshinov was part. In this article, we have attributed the writing of “Marxism and the
Philosophy of Language” to Voloshinov, according to the edition published in 1973 by Seminar Press.
However, the authorship of this book is controversial since it has also been given to Bakhtin in a few
editions, as well as in the biography of “Mikhail Bakhtin”, published originally in 1984 by Katerina Clark
and Michael Holquist.
and less time actually conducting socially organized activities. Lazarsfeld and Merton (idem) stated that action strategies may be discussed but are rarely implemented by individuals who face this kind of information overflow. In other words, people have unconsciously substituted knowledge for action. Although information and
political messages have multiplied throughout traditional and online media, political
participation continues to drop. People pay increasing attention to media, but over-
exposure to media messages can confuse the audience, thus discouraging them from
getting involved in the political process.
These theories, however, were incapable of foreseeing the fast-paced technolog-
ical progress that started with the invention of the world wide web. Advancements,
such as social networking on digital platforms, mobile devices connected to geolo-
cation, and storage of huge amounts of data in ‘clouds’ (a network of computers
and memory units capable of storing and computing enormous amounts of digital
information) have exponentially increased the social importance of mass media in
the digital era. McLuhan (2008) was surely the 20th-century scholar who came clos-
est to our current reality as he wrote about the perceptive/cognitive convergence of
media and the ‘global village’ that has emerged with it. According to this Canadian
researcher, a society globalized by the flow of mediatized information would be sus-
tained by new types of logic, completely based on orality (in contrast to the linear
logic which sustains written communication), on abrasive and emotive communica-
tion between participants, and on the redefinition of the meaning and importance of
privacy.

3. New technologies, Big Data and filter bubble

Following McLuhan’s train of thought regarding media, Manovich (2001, pp. 218–219) proposed that the invention of the database represents the birth of a new cultural genre, which associates data and allows spectators to access fragmented information in an interactive manner through the customization of content filters. If the information contained in databases represents the past, collected and classified according to distinct logics, then the patterns extracted from databases can provide future possibilities specific to the personal context of each individual user or to the social context of a certain community.
Before moving on, it is important to point out the differences between previous methods for collecting and storing huge volumes of data, such as those conducted by banks and censuses, and those used in the current ‘Big Data’ phenomenon. Boyd and Crawford (2012) have defined ‘Big Data’ as a socio-technical phenomenon and explained that it “is less about data that is big than it is about the capacity to search, aggregate, and cross-reference large data sets” (p. 663). Contrary to the common belief that large sets of data would be enough to provide unbiased and truthful information, these authors alert us to the need to reflect critically on the origins, the means of access, the interests involved, and the biases related to that data. In addition, they
even state that social media users “are not necessarily aware of the multiple uses,
profits, and other gains that come from information they have posted” (ibid., p. 672).
In this perspective, Kitchin (2013, p. 262) indicates that ‘Big Data’ is “huge in vol-
ume (. . .), high in velocity (. . .), diverse in variety (. . .), exhaustive in scope (. . .),
fine-grained in resolution (. . .) and uniquely indexical in identification, relational in
nature (. . .), and flexible (. . .)”. In an article discussing the epistemological implications of the data revolution, Kitchin (2014) argues that ‘Big Data’ and new data analytics are reconfiguring the way research is conducted, as they favor the correlation of data over hypothesis testing. More than type, quantity or speed, ‘Big Data’ is associated with a new analysis methodology, enabled by the development of high-performance computers and artificial intelligence technologies capable of detecting patterns and constructing predictive models (ibid., p. 2).
Just and Latzer (2016) explain that the increasing flood of digital data created
the demand for automated algorithmic selection in order to deal with the massive
amounts of collected data. In this sense, ‘Big Data’ and the ‘algorithmic selection’
process are co-evolving: the first one is “a new economic asset class”, while the sec-
ond is “a new method of extracting economic and social value from big data” (ibid.,
p. 240). Automated algorithmic selection has been applied in the social sphere in
many different ways, which led Just and Latzer to argue that the algorithms them-
selves need to be evaluated as institutions and as key actors, since “they influence not
only what we think about but also how we think about it and consequently how we
act, thereby co-shaping the construction of individuals’ realities, structurally similar
but essentially different to mass media.” (Ibid., p. 254).
Pariser (2011) argues that the abundant flow of data circulating on the internet and the algorithms used by companies such as Google and Facebook lead to what he calls “personalization”. Based on the data collected from users, websites and social media, algorithms create predictions about who their users are and what they would like to do, and thus select the information each user receives. This process changes the way information circulates on the internet and leads to the ‘filter bubble’, an invisible mechanism that provides individuals only with information that is in line with their preferences, connecting people who have similar opinions and distancing people who think differently (ibid., p. 9). Among the consequences, “(. . .) personalized filters limit what we are exposed to and therefore affect the way we think and learn. They can upset the delicate cognitive balance that help us to make good decisions and come up with new ideas” (ibid., p. 83).
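The feedback loop behind personalization can be caricatured in a few lines of code (a deliberately minimal sketch; the catalog, the click model and the frequency-based score are invented here and bear no relation to any platform’s actual ranking system):

```python
from collections import Counter

def rank(catalog, history):
    """Order items by how often the user clicked that topic before:
    past clicks feed the ranking, and the ranking feeds future clicks."""
    clicks = Counter(history)
    return sorted(catalog, key=lambda topic: clicks[topic], reverse=True)

catalog = ["politics-left", "politics-right", "science", "sports"]
history = ["politics-left"]  # a single initial click seeds the loop

for _ in range(5):
    feed = rank(catalog, history)
    history.append(feed[0])  # the user clicks the top-ranked item

# After a few rounds the feed converges on one topic: the 'bubble'.
print(history)  # ['politics-left', 'politics-left', ...]
```

Nothing in the loop is malicious: preference-weighted ranking, fed back on itself, is already enough to narrow what the user is exposed to.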

4. Participatory lies on Web 2.0

Considering that our current context is influenced by ‘post-truth’, manipulated content has gained increasing space for growth, since emotions and personal beliefs have become more important than verified facts in the shaping of public opinion.
However, sharing content irresponsibly, that is, giving voice only to what is in ac-
cordance with our own opinions and beliefs without worrying about checking the
accuracy of the information, explains only part of the problem. Disinformation, on
the other hand, is a more complex phenomenon. In addition to increasing potential
dissemination, promoted by information and communication technologies as well as
social media, the “democratization” of content creation is also underway. Jenkins defines this as the ‘participatory culture’ of Web 2.0 (Jenkins, 2009) and states that this type of content creation, which was previously restricted to mass media communication businesses, can now be done (and is done) by any organization or individual interested in spreading ideas or ideologies. A common strategy used to disseminate
disinformation is through websites or social media profiles with titles that are similar
to the existing and supposedly trustworthy media. This clearly illustrates the para-
sitic strategies that feed on the reputation of traditional mass media communication
businesses.
In this new scenario, the academic community has been searching for a deeper
comprehension concerning the origins and implications of the informational chaos
that has reached global proportions with the aid of new technologies. Over the last
two years, the term ‘fake news’ has been in the spotlight. For instance, in an article
discussing the 2016 U.S. elections from the standpoint of the growth of online news
and social media platforms, Allcott and Gentzkow (2017), while studying news ar-
ticles containing political implications, defined fake news as “news articles that are
intentionally and verifiably false, and could mislead readers” (p. 213). However, be-
cause ‘fake news’ carries a vague definition and because it has been repeatedly used
by politicians who are interested in disqualifying news coverage, researchers and
journalists have searched for new terms capable of describing more precisely and
critically the processes of creating and sharing untrue information.

5. Many shades of mis- and disinformation

Floridi (2010), Professor of Philosophy and Ethics of Information at the University of Oxford, has tackled the contemporary ethical aspects of information in what he calls the ‘infosphere’, the new realm of digital information that has superseded the old analogical configurations of human culture. He defines mis- and disinformation as structured (digital) data that is simultaneously, although in different ways, semantic (meaningful), factual and untrue; misinformation being unintentionally untrue and disinformation being intentionally untrue.
When it comes to digital media, simple definitions seem to collapse. Wardle and
Derakhshan (2017) warn us that ‘information pollution’ is not limited to the news
and that the term ‘fake news’ is inadequate to describe the complexity of the phe-
nomenon. Thus, they have proposed a new conceptual framework for examining the
so-called ‘information disorder’:
Dis-information. Information that is false and deliberately created to harm a person, social group, organization or country.
Mis-information. Information that is false, but not created with the intention of causing harm.
Mal-information. Information that is based on reality, used to inflict harm on a person, organization or country. (2017, p. 20, authors’ emphasis).
Wardle and Derakhshan (2017) added that the analysis of information disorder should also consider the actors, messages and interpreters involved, as well as the diversity of form, motivation and dissemination. The nuances of controversial online content can be observed through the seven types of mis- and disinformation that Wardle outlined in her article “Fake News. It’s Complicated” (Wardle, 2017), which reveal the complexity of the phenomenon. They are: “1) satire or parody (no intention to cause harm but has the
potential to fool); 2) misleading content (misleading use of information to frame an
issue or individual); 3) imposter content (when genuine sources are impersonated);
4) fabricated content (new content is 100% false, designed to deceive and do harm);
5) false connection (when headlines, visuals or captions don’t support the content);
6) false context (when genuine content is shared with false contextual information);
and 7) manipulated content (when genuine information or imagery is manipulated to
deceive)” (Wardle, 2017).
Previously, these threats came in the form of text containing false information or, at best, an adulterated, old or out-of-context photo. Nowadays, we deal with extensive tampering that uses artificial intelligence and other techniques that were previously the exclusive domain of cinema and its special effects. In a similar
direction, Chesney and Citron (2018) argued that ‘deep fake’ technologies have ex-
panded the potential distortion of reality with techniques such as machine learning.
According to these authors, sophisticated technologies keep making falsifications
more real and more profound. Simultaneously, they have become more difficult to
detect, which can lead to problems, such as new forms of exploitation, sabotage and
threat to democracy. These characteristics are synchronized with the consolidation
of the so-called ‘Web 3.0’, which uses artificial intelligence, augmented reality, and
broadcasting of high-definition and real-time information as some of its pillars.

6. Disinformation meets Big Data

The most significant case at the intersection of ‘Big Data’ and disinformation was the 2016 U.S. presidential election, which involved claims of stolen data and of Russian interference aimed at favoring Republican candidate Donald Trump. In 2017, Google, Facebook and Twitter admitted that Russian operators had bought and used their services to spread false information and promote polarization within U.S. society (Isaac & Wakabayashi, 2017). A study conducted by jour-
nalist Jonathan Albright and published in The Washington Post revealed that posts
made by only six of the 470 Facebook accounts controlled by a Russian troll farm were shared more than 340 million times and generated more than 19.1 million interactions (Timberg, 2017).
A new scandal arose in March 2018, when The New York Times revealed that Cambridge Analytica, one of the companies responsible for Trump’s campaign, had used stolen data from millions of Facebook users to map out psychological profiles and craft personalized messages capable of influencing voters’ behavior (Rosenberg et al., 2018). The company collected this data through a purported personality test, without revealing that the information gathered would be used for election purposes. According to an article published in The Guardian, the company used ad-
ditional information obtained from geolocation to send messages and monitor effi-
ciency on platforms such as Facebook, YouTube and Twitter (Lewis & Hilder, 2018).
This leads us to the hypothesis that false news is a phenomenon associated not only
with communicational, social and political aspects, but that it also contains an eco-
nomic element. The creation of deliberately manipulated information has become a
new industry which operates through the financial compensation of its content cre-
ators on social media, e.g. ‘click factories’. Many websites and social media pages
that promote manipulated content rely on ‘clickbait’ for their financing and/or profit.
Researchers have not yet reached consensus on the true impact that false information had on the results of the 2016 U.S. elections, that is, on whether it actually determined how people voted. However, we can already infer that the increasingly quick and sophisticated techniques used for creating and disseminating disinformation are a threat to be considered in any electoral process. Moreover, people who share information that supports their beliefs are not the only ones contributing to the dissemination of this type of content. Web robots, known as bots, and armies of fake profiles have an enormous impact on the promotion of untrue information, and the level of technical sophistication such content has reached is capable of confusing even the most skeptical reader and the most qualified expert.
Furthermore, disinformation frequently produces extremist and polarized discourses that are strengthened on social media by the ‘bubble effect’, leading specific groups of people to protest in public spaces. The reasons for these protests and demonstrations vary, and they frequently promote prejudice and hatred against minorities, in forms that range from the artistic and the political to outright violence, such as the murders that occurred in India after fake news spread through WhatsApp (a popular messaging app). In this context, the ideological use of digital media takes the ‘narcotizing dysfunction’ to complete aporia and enables new forms of domination and exploitation, involving mainly cognition and the amount of time social media users spend in digital environments. Thus, a door leading to ‘post-reality’ is opened.

7. The semiotic-cognitive stance

All these kinds of mis- and disinformation, which thrive with little or no con-
straint on digital social media, seem to be producing a dystopic representation of
reality that is fueling the realm of post-truth. This phenomenon, unprecedented in our civilization, quickly associates the ‘confirmation bias’ (which makes it easier
and faster to disseminate ‘fake news’ that repeat age-old prejudices and misconcep-
tions) with the ‘narcotizing dysfunction’ (which hinders the establishment of new
beliefs and habits that would be capable of dealing with often urgent, last-minute
subjects). Semiotic refraction seems to be replacing reflection, resulting in the emer-
gence of ‘post-reality’, a type of parallel universe embedded into real life, i.e., a
simulacrum that seems to be more concrete than reality itself. For instance, climate change deniers, fed by fake news, continue to behave against the best practices suggested by the scientific community; some even burn fossil fuels unnecessarily as a way of expressing their biased views. ‘Post-reality’ may very well be the final and ultimate trap of our species, since we have already reached turning points in many fields, such as nuclear weapons escalation and global warming.
In the philosophical definitions mentioned above, Floridi applies truth values only to symbols, the one class of sign that can be semantic. The problem with Floridi’s definition, from a semiotic and pragmatic point of view, is that attributes such as “true” or “untrue” applied to a symbol are only a matter of belief. On the other hand, Peirce, the American philosopher, defined pragmatism as a method for clarifying ideas. He describes four ways of fixing a belief concerning the truth of symbols such as words, concepts, ideas, propositions and arguments (Peirce, 1877). Three of them are non-scientific and feed the positive feedback of the ‘confirmation bias’: (1) the ‘a priori method’ fixes beliefs by selecting only information that fits nicely into a rational system previously accepted as true and, in this sense, comes close to a coherence theory of truth; (2) the ‘method of tenacity’ fixes beliefs as one comes up with a hypothesis and holds it dear, even against all contrary evidence; (3) the ‘method of authority’ works as one uncritically accepts the opinions of another person, group or institution based solely on their reputation and status. These methods can arise individually or be mixed in different intensities whenever disinformation generates false beliefs circulating on social media.
The fourth method, referred to as the ‘scientific method’, is based on experience and on the precise concatenation of three kinds of rational argument: abduction (or hypothesis), deduction and induction. It works as follows: once a novelty appears before one’s eyes and produces curiosity and doubt, the inquirer must make an effort to formulate the best possible conjecture based on previous knowledge. Once the individual has reached an abductive hypothesis, he or she must deductively extract its possible consequences and, finally, proceed to test these findings in the real world. This method works better than the others because it humbly assumes its own fallibility, knowing that the first argumentative step is to elaborate a conjecture that must be reformulated whenever the empirical test fails to reach a stable belief. Additionally, it depends on a community of inquirers in continuous dialogue as they search for a true belief about a given matter. In fact, Peirce advocates for a logical
type of socialism and states that inquiry, as a normative purpose, should no longer be pursued by all members of a society when vital matters arise.
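Read operationally, the method is a loop. The sketch below is our schematic rendering only (the Hypothesis class, the inquire function and the toy number sequence are invented for illustration; Peirce, of course, never stated the method as code):

```python
from dataclasses import dataclass

@dataclass
class Hypothesis:
    claim: str        # the abduced conjecture
    prediction: int   # a consequence deduced from it (here just a number)

def inquire(conjectures, experiment):
    """Abduction-deduction-induction, schematically: test the deduced
    consequence of each conjecture against experience and keep the first
    belief that survives; otherwise doubt persists and inquiry continues."""
    for h in conjectures:              # abduction: best guesses first
        if experiment(h.prediction):   # induction: confront experience
            return h                   # a stable, though fallible, belief
    return None                        # reformulate and keep inquiring

# Toy usage: which rule continues the observed sequence 2, 4, 8?
guesses = [Hypothesis("add 2 each time", prediction=10),
           Hypothesis("double each time", prediction=16)]
best = inquire(guesses, experiment=lambda p: p == 16)  # next datum is 16
print(best.claim if best else "no stable belief yet")  # -> double each time
```

Even this caricature preserves the essential features: a failed test sends the inquirer back to abduction, and the verdict is always provisional, open to revision by further experience and by the community.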
The consequence of the ‘scientific method’ is that truth would be the final opinion, grounded in the ultimate mental habit developed by an ideal community of inquirers as they gather information through experience and share it in communicative exchange. This implies that we might never hold an ultimate true belief, but there is always hope that we can come as close to it as possible if sufficient efforts and resources are dedicated to the process. The ‘scientific method’ may be slow and cognitively demanding, but it is the only one capable of separating information from disinformation, because it grounds the relevant symbols in reality and thereby builds a trustworthy social opinion. Peirce suggests that our social beliefs are strengthened or weakened depending on how they perform when confronted with reality, in accordance with his pragmatic maxim: the meaning of an idea, symbol or concept lies in its general consequences, translated into social dispositions to act accordingly whenever needed. Chance is precisely the ratio between the success and failure of a belief when it is applied to experience, and this relation is logarithmic:
Any quantity which varies with chance might, therefore, it would seem, serve as a
thermometer for the proper intensity of belief. Among all such quantities there is
one which is peculiarly appropriate. When there is a very great chance, the feel-
ing of belief ought to be very intense. Absolute certainty, or an infinite chance,
can never be attained by mortals, and this may be represented appropriately by an
infinite belief. As the chance diminishes the feeling of believing should dimin-
ish, until an even chance is reached, where it should completely vanish and not
incline either toward or away from the proposition. When the chance becomes
less, then a contrary belief should spring up and should increase in intensity as
the chance diminishes, and as the chance almost vanishes (which it can never
quite do) the contrary belief should tend toward an infinite intensity. Now, there
is one quantity which, more simply than any other, fulfills these conditions; it is
the logarithm of the chance. (CP 2.676, author’s emphasis)2
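In modern notation, Peirce’s ‘thermometer’ is simply the log-odds (our restatement, reading his ‘chance’ as the odds of success to failure, with p the probability that the belief succeeds in experience):

```latex
% Belief intensity as the logarithm of the chance (the log-odds);
% p is the probability that the belief succeeds in experience.
B(p) = \log\frac{p}{1-p}, \qquad
B\big(\tfrac{1}{2}\big) = 0, \qquad
\lim_{p \to 1^{-}} B(p) = +\infty, \qquad
\lim_{p \to 0^{+}} B(p) = -\infty.
```

Belief vanishes at an even chance, would be infinite at certainty (which “can never be attained by mortals”), and turns into an ever more intense contrary belief as the chance falls below one half, exactly the behavior the quotation describes.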
All four of Peirce’s methods try to establish mental habits capable of grounding our beliefs and putting us in a state of inclination to act according to them, but only the ‘scientific method’ stimulates the democratic and responsible actions that are capable of dealing with socially complex subjects. Unfortunately, it is also the method most susceptible to the so-called ‘tragedy of the commons’, since it depends on the continuous engagement of all people involved in a given matter, and it is the method most people are inclined to abandon whenever their comforting beliefs are challenged by the hard facts of experience. In addition, even when people do engage in public debate on difficult and complex issues for a period of time (usually when the subject is set on mass and social media), the flow of

2 References to the Collected Papers of C. S. Peirce (Peirce, 1958) are given in the text and footnotes as a decimal number, referring to volume and paragraph, e.g. ‘2.276’ refers to Volume 2, paragraph 276.
information can be so overwhelming that the ‘narcotizing dysfunction’ hinders the population from taking genuine, decisive courses of action to change their realities.
Globalized corporations, such as Google and Facebook, which direct the digital flow of information according to their specific algorithms, can cause a number of effects:
1) Individuals gather more disinformation than truth when navigating digital social media.
2) Our disinformed beliefs are positively reinforced by a dystopian universe of chances represented in digital media.
3) The ‘authority’, ‘tenacity’ and ‘a priori’ methods are given prominence over the ‘scientific method’, while fake authorities, hate speech, and persuasive reassurance based on disinformation proliferate.
4) We share our beliefs during social exchange within limited and biased communities, such as the ‘bubbles’ of social media.
5) We develop social mental habits and types of rational action based on dystopic representations induced by ‘Big Data’ strategies.
When all of these phenomena coincide, we may speak of a ‘perfect disinformation storm’. As they continue to intensify, our civilization seems to stand at a dangerous fork in the road, in which beliefs and their corresponding actions are no longer grounded in reality, but in a ‘post-reality’ that emerges from disinformation and its actual consequences.
The effects of building dystopic realities in the current context of global informa-
tional disorder can become even greater and more harmful in developing countries,
such as Brazil and India. The penetration of electronic devices and digital media plat-
forms is deep in these countries, which contrasts with their populations’ low levels
of education. This new scenario demands preparation from governments and edu-
cation institutions in order to promote learning and stimulate citizens to develop a
more critical view of the available content. Instead of enabling the democratization
of content and technologies, databases are being used in strategies for manipulating
information and have provided even more powerful weapons to certain social groups, contributing to the concentration of power and increased inequality (O’Neil, 2016).
Our civilization as a whole seems to be affected by this state of affairs at a critical moment in our history. ‘Post-reality’ is the semiotic hell over whose front door Dante Alighieri’s warning should hang: “Lasciate ogni speranza, voi ch’entrate”3 (Inferno, Canto III, line 9).

8. Conclusion

Since digital information first made its appearance in the late 1940s, its mathematical and physical aspects began to stand out as computers, transmission infras-

3 “Abandon all hope, ye who enter here.”


tructures, and programming languages were put at the service of a fast-growing infosphere (Floridi, 2016). Only when it became clear that databases had become a new
cultural genre (allowing fast media convergence) and that ‘Web 2.0’ had revealed the
new era of participatory culture, did scientists and philosophers begin to fully under-
stand the cognitive aspects of the digital era foreseen by McLuhan. If information
has lost its meaning at the hands of engineers who are worried solely about the best
syntax for performance and efficiency, it is now time for the semantic and pragmatic
aspects of information to become the center of our concerns. It has become clear that
the massive quantity of information available on digital devices hinders our search
for truth, since these flows are managed by specific logics (algorithms, data analy-
sis techniques and platform policies) and the ‘bubble effect’, which prevents users
from having contact with varied content. Furthermore, deliberate manipulation of
information with the intention of obtaining advantages is led by social agents who
are very economically and politically powerful. In this perspective, the challenge of
searching for the truth requires combined efforts from governments, academia, press
and civil society to build common understandings that guide the debate, while fac-
ing the difficulty of distinguishing true information from fake information. Philosophy
of information, and especially its ethical consequences, has become a necessary if
not urgent matter. Several types of mis- and disinformation are spreading around the
web through social media, and the power of these strategies to undermine traditional
democracy has been shown as they have manipulated the formation of public opinion
regarding very sensitive subjects. It has become clear that a cognitive and semiotic
research approach is needed to understand how digital information, concerning spe-
cific social subjects, is turned into meaning. Peirce states that belief is a dynamic
mental habit shared by a community of interpreters. This encourages us to accept
that predicates, such as “true” and “false”, are normative and can only be applied in
the long run, after cautious investigation by a community of inquirers, and are al-
ways provisional and open to further review. Reality, in pragmatic philosophy, is the
object of the final representation built by an ideal community. If ‘fake news’, disin-
formation and ‘post-truth’ continue to grow in our societies, we will most likely see
a corresponding ‘post-reality’ being represented and shared, which would lead to an
even more deteriorated situation for our ethical considerations.

References

Alighieri, D., Vasta, D. M., & Steiner, C. (1984). La Divina Commedia. Torino: Paravia.
Allcott, H., & Gentzkow, M. (2017). Social Media and Fake News in the 2016 Election. Cambridge, MA: National Bureau of Economic Research.
Baudrillard, J. (1994). Simulacra and simulation. Ann Arbor: University of Michigan Press.
Chesney, R., & Citron, D. K. (2018). Deep Fakes: A Looming Challenge for Privacy, Democracy, and National Security. SSRN Electronic Journal.
CNN. (2017, January 11). Donald Trump shuts down CNN reporter: “You’re fake news.” [Video file]. Retrieved from https://www.youtube.com/watch?v=Vqpzk-qGxMU.
Floridi, L. (2013). The philosophy of information. Oxford: Oxford University Press.
Floridi, L. (2017). Semantic Conceptions of Information. In E. N. Zalta (Ed.), The Stanford Encyclopedia of Philosophy (Spring 2017 Edition). Retrieved from https://plato.stanford.edu/archives/spr2017/entries/information-semantic/.
Floridi, L. (2016). The 4th revolution: How the infosphere is reshaping human reality. Oxford: Oxford
University Press.
Isaac, M., & Wakabayashi, D. (2017, October 30). Russian Influence Reached 126 Million Through Facebook Alone. The New York Times. Retrieved July 31, 2018, from https://www.nytimes.com/2017/10/30/technology/facebook-google-russia.html.
Jenkins, H. (2009). Confronting the challenges of participatory culture: Media education for the 21st
century. Cambridge, Mass: MIT Press.
Kitchin, R. (November 1, 2013). Big data and human geography: Opportunities, challenges and risks.
Dialogues in Human Geography, 3, 262-267.
Kitchin, R. (July 10, 2014). Big Data, new epistemologies and paradigm shifts. Big Data & Society, 1,
481-518.
Lazarsfeld, P. F., & Merton, R. K. (1948). Mass Communication, Popular Taste and Organized Social
Action in L. Bryson (ed.), The Communication of Ideas. New York: Harper, 95-118.
Lewis, P., & Hilder, P. (March 23, 2018). Leaked: Cambridge Analytica’s blueprint for Trump victory. The
Guardian. Retrieved Jul 31 2018, from https://www.theguardian.com/uk-news/2018/mar/23/leaked-
cambridge-analyticas-blueprint-for-trump-victory.
Manovich, L. (2001). The language of new media. Cambridge, Mass: MIT Press.
McLuhan, M. (2008). The Gutenberg galaxy: The making of typographic man. Toronto: University of
Toronto Press.
O’Neil, C. (2016). Weapons of math destruction: How big data increases inequality and threatens democracy. New York: Crown Publishers.
Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. New York: The Penguin Press.
Peirce, C. S. (1877). The Fixation of Belief. Popular Science Monthly, 12, 1-15.
Peirce, C. S. (1994). The Collected Papers of Charles Sanders Peirce (C. Hartshorne, P. Weiss, & A. W. Burks, Eds.). Electronic edition. Charlottesville, VA: InteLex Corporation.
Oxford Dictionaries. (2016). Post-truth [Def. 1]. Oxford: Oxford University Press. Retrieved March 30, 2018, from https://en.oxforddictionaries.com/definition/post-truth.
Rosenberg, M., Confessore, N., & Cadwalladr, C. (March 17, 2018). How Trump Consultants Ex-
ploited the Facebook Data of Millions. The New York Times. Retrieved Jul 31 2018, from https://
www.nytimes.com/2018/03/17/us/politics/cambridge-analytica-trump-campaign.html.
Silverman, C. (December 31, 2017). I Helped Popularize The Term “Fake News” And Now I Cringe
Every Time I Hear It. BuzzFeed News. Retrieved from https://www.buzzfeednews.com/article/
craigsilverman/i-helped-popularize-the-term-fake-news-and-now-i-cringe.
Swaine, J. (2017, January 23). Donald Trump’s team defends ‘alternative facts’ after widespread protests. The Guardian. Retrieved from https://www.theguardian.com/us-news/2017/jan/22/donald-trump-kellyanne-conway-inauguration-alternative-facts.
Timberg, C. (October 5, 2017). Russian propaganda may have been shared hundreds of millions of times,
new research says. The Washington Post. Retrieved Jul 31 2018, from https://www.washingtonpost.
com/news/the-switch/wp/2017/10/05/russian-propaganda-may-have-been-shared-hundreds-of-
millions-of-times-new-research-says/?utm_term=.906996929e0a.
Voloshinov, V. N. (1973). Marxism and the Philosophy of Language (L. Matejka & I. R. Titunik, Trans.). New York and London: Seminar Press. (Original work published 1930).
Wardle, C. (February 16, 2017). Fake news. It’s complicated. First Draft News. Retrieved from https://
firstdraftnews.org/fake-news-complicated/.
Wardle, C., & Derakhshan, H. (2017). Information Disorder: Toward an interdisciplinary framework for
research and policy making. Council of Europe.
Watzlawick, P., Beavin, B. J. H., & Jackson, D. D. (1968). Pragmatics of human communication: A study
of interactional patterns, pathologies, and paradoxes. London: Faber and Faber.
Word of the Year is... (2016). Oxford Dictionaries. Oxford: Oxford University Press. Retrieved July 12, 2018, from https://en.oxforddictionaries.com/definition/post-truth.
