
THE GREAT HACKS: THE WEAPONIZATION OF TRUTH THROUGH THE SPREAD OF MISINFORMATION AND DISINFORMATION

Misinformation and disinformation abound on social media and other platforms accessible to the public, often leading to incorrect judgments, inciting negative public sentiment, and posing major dangers to public safety and social order. Scholars are therefore increasingly concerned about the propagation of disinformation via social networks, which is both unsettling and inescapable today.

In these complex times, fake news is much more than a name for inaccurate and misleading material masquerading as news; it has evolved into an emotive, weaponized word used to attack and discredit journalism. Misinformation and disinformation are objective social phenomena that arise in social interaction. The terms typically refer to material that has been widely disseminated, purposefully or accidentally, without factual foundation, validation, or explanation. The phenomenon has become a source of concern in the social sciences, journalism, computer science, and other disciplines of study.

With the advancement of internet technology, disinformation once propagated by word of mouth is now swiftly distributed via social media platforms. It exhibits fission-like spread: rapid proliferation, a broad range of effects, and profound impact. The mass of false information, rumors, and misleading claims circulating on social media not only causes public concern and threatens the public's physical and psychological health but also poses serious challenges to governance and the stability of social order.


The damaging character of misinformation and disinformation has made the notion of an "information pandemic," or infodemic, familiar to the general public. The term refers to the set of physical and psychological responses the general population experiences when confronted with disinformation: because it is impossible to determine the truth of the information, the spread of misinformation infiltrates everyone's life. During the COVID-19 outbreak, for example, the World Health Organization treated combating the infodemic as a vital part of its mission. With the rise of social media, the reach of the information pandemic grew, amplifying the harm posed by disinformation. When confronted with disinformation, uncertainty about the future, and a lack of access to reliable knowledge, the public's psychological pressure increases, generating anxiety and panic. At such moments, people are highly likely to be swept up in crowd behavior, magnifying mass fear, sparking collective social crises, and even producing social catastrophes driven by rumors and inaccurate information. Research has shown that the damage caused by misinformation on social media is especially severe owing to features of the medium such as rapid transmission, a broad range of influence, and profound effects. Consequently, it is critical to understand how disinformation spreads on social media and to regulate it.

A further instance of the weaponization of information is the 2016 presidential election in the United States. In the weeks and months leading up to November 8, social media platforms such as Twitter and Facebook were inundated with "fake news" (Howard, 2017). Following Donald Trump's election as the forty-fifth President of the United States, investigations showed that considerable foreign influence had played a part throughout the campaign, with efforts geared primarily at swaying the outcome of the election. Most fingers pointed straight at Russia and President Vladimir Putin's administration as the most probable culprits (National Intelligence Council, 2017).

It was far from the first time social media had been used in influence operations. A few years earlier, for example, the Islamic State terrorist group (ISIS) ran large Twitter campaigns to promote propaganda, foster radicalization, and recruit foreign fighters for its battles in Syria and Iraq (Klausen, 2015).

Governments and non-state actors alike conducted influence operations long before social media; what is new is their modern extent, intensity, and effect, all of which are expected to become more evident as digital platforms extend their reach across the internet and become more fundamental to our social, economic, and political lives. Democracies, which rely on the open and unfettered exchange of information, are especially vulnerable to the poison of influence operations that distribute fake news, misinformation, and propaganda. The whole structure of democratic government is predicated on an educated population with a shared sense of facts, shared public narratives, and firm confidence in the information supplied by institutions. This entire assemblage is under attack from well-planned influence operations, and the situation will only worsen as "deepfake" technologies enter the picture.

That said, small but significant adjustments are both achievable and necessary. In general, combating the issue means both countering influence operations directly and strengthening people's online "immunity" so they are less susceptible to misleading, incorrect, and divisive information. Broad-based educational campaigns aimed at raising user awareness of bogus information may help but are prohibitively expensive. Inoculating critical points (people) inside a network is more effective and less costly (Christakis & Fowler, 2011). Targeted interaction with people at the center of networks (those with high network centrality scores, in social network analysis jargon) might help develop herd immunity and lower susceptibility to bogus material (Halloran et al., 2002).
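To make the idea of targeting high-centrality users concrete, the minimal sketch below ranks users in a follower graph by betweenness centrality. It is illustrative only: it assumes the networkx library is available, and the graph, user names, and cutoff are hypothetical, not drawn from any real platform.

```python
# Minimal sketch: ranking users by network centrality to prioritize
# "inoculation" targets. Assumes a follower/retweet graph is available
# as an edge list; the edges and cutoff below are purely illustrative.
import networkx as nx

# Hypothetical directed graph: an edge (a, b) means "a follows b".
edges = [
    ("alice", "hub"), ("bob", "hub"), ("carol", "hub"),
    ("dave", "bob"), ("erin", "carol"), ("hub", "erin"),
]
G = nx.DiGraph(edges)

# Betweenness centrality scores users who sit on many shortest paths,
# i.e., those through whom information (and misinformation) must flow.
centrality = nx.betweenness_centrality(G)

# Target the top-k most central users for media-literacy outreach.
k = 2
targets = sorted(centrality, key=centrality.get, reverse=True)[:k]
print(targets)
```

In practice, the top-ranked users would be the priority audience for the targeted educational outreach described above, since "immunizing" them interrupts the most transmission paths at the lowest cost.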

Moreover, governments and conventional media organizations may collaborate to develop their own narratives of events to counteract the influence operations of others. The efficacy of such counter-narratives depends on the confidence users have in their sources; thus, acting quickly to stem the flow of disruptive influence operations aimed at undermining user trust is critical.

From the perspective of an English major, information must ultimately be filtered, managed, and verified through two complementary techniques. First, platforms increasingly draw on their massive user bases, encouraging individuals to flag and report potentially offensive material. The platforms then review the flagged material; if it violates a platform's terms of service or community rules, the content may be deleted and the account that posted it banned. Second, alongside these human-driven solutions, organizations should use automated detection technologies to identify and remove harmful information. With additional data, these techniques will continue to improve, decreasing exposure to content such as execution videos, "conspiracy films," and hate-filled tweets and posts.
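As a rough illustration of how these two complementary techniques fit together, the hypothetical sketch below combines a user-flagging threshold with a crude automated check. Every name, threshold, and blocklist term is invented for illustration; a real platform would use trained classifiers and staffed human review queues rather than a keyword list.

```python
# Hypothetical sketch of the two moderation techniques described above:
# user flagging plus automated detection. All names, thresholds, and
# blocklist terms are illustrative assumptions, not any platform's API.
from dataclasses import dataclass

REVIEW_THRESHOLD = 3                      # flags needed before human review
BLOCKLIST = {"execution", "hoax cure"}    # stand-in for a trained classifier

@dataclass
class Post:
    author: str
    text: str
    flags: int = 0

def user_flag(post: Post) -> bool:
    """Record one user report; return True once the post needs review."""
    post.flags += 1
    return post.flags >= REVIEW_THRESHOLD

def auto_detect(post: Post) -> bool:
    """Crude automated check; a real system would use a trained model."""
    return any(term in post.text.lower() for term in BLOCKLIST)

def moderate(post: Post) -> str:
    if auto_detect(post):
        return "removed"                  # automated takedown path
    if post.flags >= REVIEW_THRESHOLD:
        return "queued for human review"  # human-driven path
    return "visible"

p = Post("user123", "Share this hoax cure with everyone!")
print(moderate(p))                        # -> "removed"

q = Post("user456", "You won't believe this rumor")
for _ in range(3):
    user_flag(q)
print(moderate(q))                        # -> "queued for human review"
```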

REFERENCES:

Allcott, Hunt and Matthew Gentzkow. 2017. "Social Media and Fake News in the 2016 Election." Journal of Economic Perspectives 31 (2): 211–36. https://web.stanford.edu/~gentzkow/research/fakenews.pdf.

Christakis, Nicholas A. and James H. Fowler. 2011. Connected: The Surprising Power of Our Social Networks and How They Shape Our Lives — How Your Friends' Friends' Friends Affect Everything You Feel, Think, and Do. New York, NY: Back Bay Books.

Cook, John, Stephan Lewandowsky and Ullrich K. H. Ecker. 2017. "Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence." PLoS ONE 12 (5): e0175799. https://journals.plos.org/plosone/article/file?id=10.1371/journal.pone.0175799&type=printable.

Fisher, Marc, John Woodrow Cox and Peter Hermann. 2016. "Pizzagate: From rumor, to hashtag, to gunfire in D.C." The Washington Post, December 6.

Frank, Russell. 2015. "Caveat Lector: Fake News as Folklore." The Journal of American Folklore 128 (509): 315–32. doi:10.5406/jamerfolk.128.509.0315. www.researchgate.net/publication/281601869_Caveat_Lector_Fake_News_as_Folklore.

Giles, Martin. 2019. "Five emerging cyber-threats to worry about in 2019." MIT Technology Review. www.technologyreview.com/s/612713/five-emerging-cyber-threats-2019/.

Guo, L. and Y. Zhang. 2020. "Information flow within and across online media platforms: An agenda-setting analysis of rumor diffusion on news websites, Weibo, and WeChat in China." Journalism Studies 21 (15): 2176–95.

Li, L., H. Xia, R. Zhang, and Y. Li. 2019. "DDSEIR: A dynamic rumor spreading model in online social networks." In Proceedings of the International Conference on Wireless Algorithms, Systems, and Applications, 596–604.

Pennycook, G. and D. G. Rand. 2019. "Lazy, not biased: Susceptibility to partisan fake news is better explained by lack of reasoning than by motivated reasoning." Cognition 188: 39–50.
