
Advanced Sciences and Technologies for Security Applications

Anthony J. Masys
Ricardo Izurieta
Miguel Reina Ortiz Editors

Global
Health
Security
Recognizing Vulnerabilities, Creating
Opportunities
Advanced Sciences and Technologies
for Security Applications

Series Editor
Anthony J. Masys, Associate Professor, Director of Global Disaster Management,
Humanitarian Assistance and Homeland Security, University of South Florida,
Tampa, USA

Advisory Editors
Gisela Bichler, California State University, San Bernardino, CA, USA
Thirimachos Bourlai, West Virginia University, Statler College of Engineering and
Mineral Resources, Morgantown, WV, USA
Chris Johnson, University of Glasgow, Glasgow, UK
Panagiotis Karampelas, Hellenic Air Force Academy, Attica, Greece
Christian Leuprecht, Royal Military College of Canada, Kingston, ON, Canada
Edward C. Morse, University of California, Berkeley, CA, USA
David Skillicorn, Queen’s University, Kingston, ON, Canada
Yoshiki Yamagata, National Institute for Environmental Studies, Tsukuba, Ibaraki,
Japan
The series Advanced Sciences and Technologies for Security Applications comprises
interdisciplinary research covering the theory, foundations and domain-specific topics
pertaining to security. Publications within the series are peer-reviewed monographs
and edited works in the areas of:
– biological and chemical threat recognition and detection (e.g., biosensors, aerosols, forensics)
– crisis and disaster management
– terrorism
– cyber security and secure information systems (e.g., encryption, optical and
photonic systems)
– traditional and non-traditional security
– energy, food and resource security
– economic security and securitization (including associated infrastructures)
– transnational crime
– human security and health security
– social, political and psychological aspects of security
– recognition and identification (e.g., optical imaging, biometrics, authentication
and verification)
– smart surveillance systems
– applications of theoretical frameworks and methodologies (e.g., grounded theory,
complexity, network sciences, modelling and simulation)
Together, the high-quality contributions to this series provide a cross-disciplinary
overview of forefront research endeavours aiming to make the world a safer place.

The editors encourage prospective authors to correspond with them in advance of
submitting a manuscript. Submission of manuscripts should be made to the
Editor-in-Chief or one of the Editors.

More information about this series at http://www.springer.com/series/5540


Anthony J. Masys • Ricardo Izurieta
Miguel Reina Ortiz
Editors

Global Health Security


Recognizing Vulnerabilities, Creating
Opportunities
Editors

Anthony J. Masys
College of Public Health
University of South Florida
Tampa, FL, USA

Ricardo Izurieta
College of Public Health
University of South Florida
Tampa, FL, USA

Miguel Reina Ortiz
College of Public Health
University of South Florida
Tampa, FL, USA

ISSN 1613-5113 ISSN 2363-9466 (electronic)


Advanced Sciences and Technologies for Security Applications
ISBN 978-3-030-23490-4 ISBN 978-3-030-23491-1 (eBook)
https://doi.org/10.1007/978-3-030-23491-1

© Springer Nature Switzerland AG 2020


This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the
material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation,
broadcasting, reproduction on microfilms or in any other physical way, and transmission or information
storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology
now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors, and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG.
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
Foreword

Our small planet is effectively growing smaller every day. In the nineteenth century,
it was possible to believe that geographic distances could protect a country from
disruptions, natural disasters, and plagues raging far away on the other side of the
world. But this is no longer the case. Thanks to our international air travel network, it
is now possible to get from any city on the planet to any other city on the planet in
less than one day. And the number of people traveling internationally by air has
grown by over 5% a year for the past 10 years. Paralleling this growth in air travel is
a growth in shipping. Few of the goods we now consume are produced locally. Most
come from across the world. The increase in the movement of both people and goods
means that we are no longer sheltered from events that happen on the other side of
the world. The world has turned into a village.
And we are seeing more disasters and threats to the security in our planetary
village. Political instability and environmental changes caused by global warming
are displacing more and more people. In 2017, it was estimated that over 65 million
people were refugees. Displaced people often are forced to live in crowded, less than
ideal conditions which can breed disease, food insufficiency, and radicalism. Envi-
ronmental degradation and the incursion of people into previously wild habitats
spurs the development of zoonoses which can become devastating epidemics and
pandemics. Climate change can result in slow moving (e.g., the inundation of the
Pacific atoll island nations) and rapid (e.g., hurricanes) weather-related disasters that
leave people homeless and traumatized. And many of these factors can often afflict a
population at the same time, leading to complex humanitarian disasters that are very
difficult to address. A good example of this is the Ebola epidemic in North Kivu
Province of the Democratic Republic of the Congo in 2018. Despite deploying an
effective vaccine, the international community has struggled to contain this epi-
demic, as a result of political unrest, population movement, and distrust of outsiders
by the local population.
The authors of this volume highlight many of the challenges that confront our
global security environment today. These range from politically induced disasters to
food insecurity, to zoonoses, and to terrorism. More optimistically, the authors also


present some advances in technology that can help us combat these threats. Under-
standing the challenges that confront us and the tools we have to overcome them will
allow us to face our future with confidence.

Thomas Unnasch
Professor, Global and Planetary Health
College of Public Health, University of South Florida, Tampa, FL, USA
Contents

Part I Emerging Threats


Plagues, Epidemics and Pandemics . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3
Ricardo Izurieta
Agricultural Emergencies: Factors and Impacts in the Spread
of Transboundary Diseases in, and Adjacent to, Agriculture . . . . . . . . . 13
Ashley Hydrick
The Threat Within: Mitigating the Risk of Medical Error . . . . . . . . . . . 33
Simon Bennett
Climate Change, Extreme Weather Events and Global Health
Security a Lens into Vulnerabilities . . . . . . . . . . . . . . . . . . . . . . . . . . . . 59
Carson Bell and Anthony J. Masys
Global Health Biosecurity in a Vulnerable World – An Evaluation
of Emerging Threats and Current Disaster Preparedness Strategies
for the Future . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 79
Kristi Miley
The Emerging Threat of Ebola . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 103
Michelle LaBrunda and Naushad Amin

Part II Mitigation, Preparedness and Response and Recovery


Natural and Manmade Disasters: Vulnerable Populations . . . . . . . . . . . 143
Jennifer Marshall, Jacqueline Wiltshire, Jennifer Delva, Temitope Bello,
and Anthony J. Masys
Global Sexual Violence . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 163
Sara Spowart


Global Health Security and Weapons of Mass Destruction Chapter . . . . 187


Chris Reynolds
Antimicrobial Resistance in One Health . . . . . . . . . . . . . . . . . . . . . . . . . 209
Marie-jo Medina, Helena Legido-Quigley, and Li Yang Hsu
Food Security: Microbiological and Chemical Risks . . . . . . . . . . . . . . . . 231
Joergen Schlundt, Moon Y. F. Tay, Hu Chengcheng, and Chen Liwei

Part III Exploring the Technology Landscape for Solutions


Gaussianization of Variational Bayesian Approximations with
Correlated Non-nested Non-negligible Posterior Mean Random
Effects Employing Non-negativity Constraint Analogs and Analytical
Depossinization for Iteratively Fitting Capture Point, Aedes aegypti
Habitat Non-zero Autocorrelated Prognosticators: A Case Study
in Evidential Probabilities for Non-frequentistic Forecast
Epi-entomological Time Series Modeling of Arboviral Infections . . . . . . 277
Angelica Huertas, Nathanael Stanley, Samuel Alao, Toni Panaou,
Benjamin G. Jacob, and Thomas Unnasch
Simulation and Modeling Applications in Global Health Security . . . . . . 307
Arthur J. French
The Growing Role of Social Media in International Health Security:
The Good, the Bad, and the Ugly . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 341
Stanislaw P. Stawicki, Michael S. Firstenberg, and Thomas J. Papadimos

Part IV Leadership and Partnerships


Effecting Collective Impact Through Collective Leadership
on a Foundation of Generative Relationships . . . . . . . . . . . . . . . . . . . . . 361
Marissa J. Levine
Global Health Security Innovation . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 387
James Stikeleather and Anthony J. Masys

Index . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 427
Part I
Emerging Threats
Plagues, Epidemics and Pandemics

Ricardo Izurieta

1 Introduction

Plagues were on earth before mankind; in fact, they emerged along with life on earth
and have undergone three and a half billion years of evolution and adaptation. The
encounter of civilizations has facilitated the exchange of microorganisms, driving
the emergence of plagues and pandemics that have decimated populations. The
presence of the Spaniards in the Americas at the end of the fifteenth and beginning
of the sixteenth centuries resulted in the global expansion of plagues such as yellow
fever, variola, measles, rubella, syphilis and tuberculosis, among others. A few
centuries earlier, the thirteenth-century expansion of the Mongol empire had
facilitated the dissemination of the black plague.
Plagues should not be analyzed only in terms of their biological determinants but
also their social, economic, cultural and even moral determinants. The plague of the
late twentieth century -Human Immunodeficiency Virus/Acquired Immunodeficiency
Syndrome (HIV/AIDS)- cannot be seen merely as the genetic evolution and
adaptation of the Simian Immunodeficiency Virus (SIV) but also as a social
construction in which elements such as access to health care services and
therapeutics, knowledge about the disease, and even cultural and moral factors must
be incorporated into the analysis.

R. Izurieta (*)
College of Public Health, University of South Florida, Tampa, FL, USA
e-mail: rizuriet@health.usf.edu

© Springer Nature Switzerland AG 2020 3


A. J. Masys et al. (eds.), Global Health Security, Advanced Sciences
and Technologies for Security Applications,
https://doi.org/10.1007/978-3-030-23491-1_1

2 Discussion

As described in historical chronologies of the times of the Plague (years 1347–1350),
people reacted to the pandemic with acts of penitence to assuage God's anger.
Among the most striking demonstrations of self-inflicted pain and suffering were the
Processions of the Flagellants that took place in the Netherlands and Germany [1].
Similar reactions were seen in the times of cholera -the 1991 cholera pandemic of
the Americas- when the parishioners of the port of Guayaquil, Ecuador decided to
participate in the Easter Christ of Consolation Procession. In practice, these
religious ceremonies became epicenters of pathogen transmission rather than
ameliorating the epidemics and pandemics. A synergy of socioeconomic, cultural,
ecological and biological factors was observed during the Easter Christ of
Consolation Procession, in which parishioners of all the Catholic churches of the
main port of Guayaquil, Ecuador and its surroundings took part to ask for God's
mercy, since the cholera epidemic had already taken thousands of lives in the
country.
During the times of cholera in the Americas, the rampant expansion of the seventh
cholera pandemic, which originated in the Celebes Islands, Indonesia, reached Peru
in December 1990 and later spread into Ecuadorian territory. Just a few weeks after
Peru had declared a cholera epidemic, the first case of the disease was reported in
Ecuador on February 28, 1991. A fisherman who had traveled by small vessel to
Peru contracted cholera and initially infected the town of Bajoalto in El Oro
province. The epidemic spread rapidly. During the first epidemiological week, cases
were reported in widespread regions of the country such as Esmeraldas in the north,
Guayas in the center of the coastal zone, and parts of the Andean highlands. In a
matter of months, the epidemic had reached beyond the Andean mountains to the
waters of the Amazon River, spreading throughout the rainforest. At the peak of the
epidemic—the 17th epidemiological week (April 27-May 4, 1991)—more than 3000
new cases were reported in the country [2]. The causative agent was found to be
Vibrio cholerae O1 of biotype El Tor, serotype Inaba [3] (Fig. 1).
Since John Snow's classic work on the 1854 cholera outbreak in London, water has
been known to be one of the most important vehicles of cholera transmission [4]. As
in the case of Snow's work, our findings on the positive relationship between potable
water supply and increased cholera attack rates suggested the contamination of
municipal water supplies. Prior to the 1991 cholera epidemic in Ecuador, Guayaquil's
municipal water system had already reported low water pressure and other
deficiencies. Although the central tap water reservoir in the city had 0.1 p.p.m. of
free chlorine, no chlorine was detected at jury-rigged connections at the periphery of
the water system [5]. In short, deficiencies of the municipal water system within the
country explain the correlations found between high potable water coverage and
increased cholera attack rates. These inadequacies may have caused massive,
widespread contamination, with hundreds of patients overcrowding emergency
health care services.

Fig. 1 Cholera patient with severe dehydration (sunken eyes) under rapid intravenous water and
electrolytes infusion

Similar occurrences were described in Riohacha, Colombia, and in Trujillo and
Piura, Peru [6, 7]. In the Piura study, researchers concluded that "Piura's water
system [had] distributed the infection throughout the city," thereby demonstrating,
as Snow and others did in the mid-1800s, that untreated water in centralized
distribution systems still poses a hazard to public health [6]. Such common sources
of contamination were associated with one or more of the following deficiencies:
insufficiently maintained pipes, low or intermittent water pressure, illegal
(unmetered) water connections, water taps located below ground level, and
substandard levels of chlorine in the system. In the Riohacha case study,
epidemiological research identified the municipal drinking water supply as a risk
factor for both acute diarrheal diseases and cholera. Those findings showed that the
municipal water system was contaminated and was thus serving as a key vehicle for
cholera transmission [8].
But water was not the only vehicle of transmission of V. cholerae, as demonstrated
years later by Rita Colwell. The actual reservoir of V. cholerae is the brackish water
of estuaries all around the world. V. cholerae El Tor has been isolated from aquatic
estuaries where the temperature is higher than 10 °C and the salinity is between 2‰
and 2.5‰, which are optimal conditions for vibrio growth [9]. This correlation
between cholera O1, temperature and salinity was described by Hood et al. in two
studies conducted in Florida [6]. Miller et al. found that the optimum salt
concentration for the survival of cholera was 2.0‰; the salt concentration of the
rivers of Ecuador and of the Pacific Ocean was optimal for the development of the
bacterium, a phenomenon attributed to the increase in rain water during the rainy
season of March-May 1991 in Ecuador. The correlation between high rainfall during
the rainy season and the epidemic peaks of V. cholerae was indeed later
demonstrated in Ecuador [6, 8]. In addition, zooplankton plays an important role in
the survival of cholera, and the association with copepods is characteristic of
V. cholerae as of other vibrios such as Vibrio parahaemolyticus. Huq et al. have
stated that salinity and the presence of zooplankton are the main factors determining
the growth of vibrios; in this environment V. cholerae readily colonizes organisms
thanks to its ability to adhere to surfaces [6].
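For readers who want the cited thresholds made explicit, the following minimal sketch (not part of the original studies) simply encodes the temperature and salinity conditions described above; the water-sample values are hypothetical and used only for illustration.

```python
# A minimal sketch encoding the environmental conditions cited above:
# V. cholerae El Tor growth is favored where water temperature exceeds 10 °C
# and salinity falls between 2.0‰ and 2.5‰. Sample values are hypothetical.

def favorable_for_vibrio(temperature_c: float, salinity_permille: float) -> bool:
    """Return True if a water sample meets the growth conditions quoted in the text."""
    return temperature_c > 10.0 and 2.0 <= salinity_permille <= 2.5

samples = {
    "coastal estuary (hypothetical)": (26.0, 2.2),
    "cold Andean lake (hypothetical)": (8.0, 0.1),
}
for name, (temp, sal) in samples.items():
    print(f"{name}: favorable = {favorable_for_vibrio(temp, sal)}")
```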
Consequently, seafood became the second most common vehicle of transmission of
V. cholerae. The ancient practice of eating raw seafood or dried fish among the
Ecuadorian and Peruvian population factored into V. cholerae transmission in the
coastal regions. Substantial evidence points to the consumption of raw fishery
products as one of the greatest risks [10–13]. We found that the popular national dish
"ceviche" (containing marinated seafood) was a vehicle of V. cholerae transmission
[14]. The presence of V. cholerae in "conchas" (shellfish) in Ecuador was described
in the studies of Weber et al. [5]. Thus, it was concluded that seafood contamination,
handling and preparation, including the time the seafood is exposed to the citric acid
in the lemon juice marinade, affect the outcome of cholera cases. People consuming
raw or dried seafood unleashed a cholera epidemic that took thousands of lives.
On Good Friday, March 29, 1991, hundreds of thousands of parishioners from the
Catholic churches of Guayaquil and its satellite cities and counties congregated in
the streets to participate in the Easter Christ of Consolation Procession. As usual,
the route of the procession began early in the morning at the church of Cristo del
Consuelo, located at the intersection of Lizardo García and A streets, southwest of
Guayaquil. People were packed onto the streets while residents of the houses and
buildings along the route threw water to relieve the heat of those high-temperature
days typical of the rainy season.
Early in the afternoon, the National Epidemiological Surveillance System mounted
by the Ministry of Public Health received an emergency call reporting dozens of
cholera cases in an apartment located in one of the poor neighborhoods of the Febres
Cordero parish, close to the El Salado estuary. When the epidemiology brigade
arrived, about a dozen men and women were sitting or in bed with the classic
profuse watery diarrhea characteristic of cholera. Oral rehydration with Oral
Rehydration Salts (ORS) or intravenous rehydration with Ringer's Lactate was
established immediately. While the patients were being rehydrated, a case-control
study interview was administered to identify the foods consumed during the
previous 3 days by the patients and their family controls. The odds ratios clearly
pointed to the consumption of dried shrimp 2 days earlier as the main factor
associated with the profuse diarrhea. In a qualitative interview, the patients, all
members of an extended family, mentioned that relatives from Puna island had
brought dried shrimp to eat during Easter week in Guayaquil and to participate in
the Christ of Consolation Procession. Paradoxically, they had come to ask God to
protect Guayaquil and its surroundings from the cholera epidemic that had been
spreading like wildfire since its arrival in Ecuadorian territory 4 weeks earlier.
During the dinner of Holy Wednesday they had eaten fish and seafood and no meat
at all, as meat is banned by the Catholic church during that observance. That
weekend we visited Puna island to see whether there were more patients who needed
immediate rehydration in a sparsely populated zone with no health care services. A
few more cholera cases were found, also associated with the consumption of dried
shrimp. During the inspection of the water and sanitation systems, it was found that
the water was obtained from wells [15] and that the sanitation system consisted of
pit latrines. Consequently, contamination of the sea by sewage discharges was not
possible, and contamination of shrimp and fish in their natural environment was
plausible. The results of the case-control study were immediately faxed to the
Minister of Public Health, who forwarded the fax to the Ministry of Fishery. The
Ministry of Fishery ordered the document classified because of its possible impact
on shrimp exports. This was the first epidemiological evidence of the contamination
of fish and seafood in their natural marine environment. It also raises the following
questions: Since there were no refrigerators in nineteenth-century London, did
Londoners consume dried fish that could have been contaminated? Besides
waterborne transmission, was there transmission through seafood during the 1854
London epidemic? Was the cholera epidemic controlled not only by John Snow's
clever interventions but also by the development of protective natural immunity in
the population? (Fig. 2)

Fig. 2 Hundreds of thousands of parishioners of the port of Guayaquil, Ecuador, participating in
the Easter Christ of Consolation Procession
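To make the case-control reasoning described above concrete, here is a minimal sketch of an odds ratio calculation from a 2×2 exposure table; the counts are hypothetical and are used only to illustrate why an exposure such as dried shrimp would stand out in the interviews.

```python
# Minimal sketch of the odds-ratio logic behind a case-control food interview.
# The counts below are hypothetical illustrations, not the study's actual data.

def odds_ratio(exposed_cases, unexposed_cases, exposed_controls, unexposed_controls):
    """OR = (a*d) / (b*c) for a 2x2 table with cells a, b, c, d."""
    return (exposed_cases * unexposed_controls) / (unexposed_cases * exposed_controls)

# Hypothetical example: 10 of 12 cases ate dried shrimp vs. 3 of 14 family controls.
or_dried_shrimp = odds_ratio(exposed_cases=10, unexposed_cases=2,
                             exposed_controls=3, unexposed_controls=11)
print(f"Odds ratio for dried shrimp exposure: {or_dried_shrimp:.1f}")  # ~18.3
```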
Although the highest morbidity rates of cholera were observed in the coastal
provinces, the highest case fatality rates (CFR) were reported in the Andean
provinces, where there is a predominance of the indigenous Kichwa population,
descendants of the Incas. The presence of the cholera pandemic in Ecuador and the
Andean region showed us that, 500 years after the conquest of the Inca Empire by
the Spaniards, our indigenous populations were still ignored by the government
health system. The Kichwa and other indigenous ethnic groups were mistreated and
marginalized in health services managed by the "mestizos". Emergencies were
managed not on the basis of the severity of the case but on the racial background of
the patient. Many Kichwas preferred to die at home with their relatives instead of
dying alone in a hospital, mistreated by strangers who spoke to them in another
language [16].

Fig. 3 During burial ceremonies and celebrations of the Day of the Dead, countless cholera
outbreaks were reported among indigenous Kichwa communities. In these celebrations, relatives
of the deceased shared food and drinks, spreading the disease among the guests
For the Kichwa population, diseases are classified into those of the environment,
proper to the community, and those of the outside, or of God, which are caused by
the presence of, or contact with, whites and mestizos on indigenous lands. These
diseases can also be brought back by Indians who go to work in the cities of whites
and mestizos, on the coast, or elsewhere outside the community. Outside the
communities, in the cities, Kichwas can be contaminated by an uncontrolled way of
life and by excessive work, but mainly by continuous contact with white-mestizo
carriers of these diseases. Therefore, there is no cure within the community, and
these diseases must be treated in white-mestizo hospitals. They are considered a
punishment sent by God and are diagnosed and cured by white doctors in hospitals.
The healers and Yachacs (shamans) try not to intervene in this group of diseases. But
these illnesses are also attributed to punishment for breaking community values or
norms (carelessness, filth, alcoholism) or to the social degradation existing in the
communities [17, 18] (Fig. 3).
The paradoxical existence -from the merely bio-ecological point of view- of the
epidemic in the frosty Andean provinces leads us to propose a new, alternative
pattern of transmission. The Andean region, characterized as a zone of cold climate
in which temperatures are almost always below 10 °C, theoretically does not
constitute an ecological environment conducive to the maintenance of V. cholerae.
The fact that Vibrio cholerae can develop with remarkable adaptability even in the
cold freshwater of the Andes, as described in rivers in eastern Australia, and in
saltwater, as described in North American bays of the Gulf of Mexico, poses a
menace to human populations who do not have natural immunity. In conclusion, we
can accept that the microorganism was able to invade new ecological niches in the
Andean mountains and to find an immunologically virgin Kichwa population whose
socioeconomic status and characteristic culture harbored the conditions for an
explosive and lethal epidemic (Fig. 4).

Fig. 4 Lake San Pablo, whose waters were invaded by V. cholerae, establishing a new enclave of
the plague in the Andean mountains
In the historical analysis carried out by Glass and Black, cholera shows a cyclical
behavior; according to these analyses, epidemic outbreaks should be expected every
5–7 years [4]. The behavior of cholera worldwide seems to be closely related to the
climatic changes caused by the El Niño current in Latin America and the monsoons
in Southeast Asia. Likewise, the seasonal behavior of cholera has been reported in
several investigations [19], and the epidemic presumably adopted this seasonal and
cyclical behavior in the Andean zone. Therefore, the statement that cholera is also
linked to ecological and cultural conditions has a logical basis. Verification of a
seasonal and periodic cholera behavior in the Andean region would be evidence of
the connection of outbreaks of the disease with climate changes, which in turn would
be related to the cultural practices of the affected populations. The ethnocultural and
environmental factors would be integrated with socioeconomic determinants and
health infrastructure to constitute what could be called a syndemic pattern of
transmission of cholera in the Andean zone. As a matter of fact, there are strong
epidemiological and biological bases for the construction of this syndemic pattern of
Andean transmission, integrating the socioeconomic, religious, environmental and
ethnocultural factors of the region.

3 Conclusion

Although the persistence of cholera in certain provinces is mainly attributed to
socioeconomic conditions and the availability of sanitary infrastructure, these factors
do not completely explain the behavior of cholera in the Andean zone. Therefore, a
reasonable complementary explanation of the epidemic behavior and endemization
of cholera is the influence of ecological and cultural factors associated with ancestral
practices among the descendants of the pre-Hispanic ethnic groups that inhabit the
Andean highlands. After the epidemic of 1991, which attacked an immunologically
virgin population, the disease tended to disappear, with no evidence of endemization.
In the global context, recent epidemics have weakened the reputation of international
organizations, notably during the Ebola epidemic. Dr. Joanne Liu, Médecins Sans
Frontières (MSF) International President, denounced the fact that the United Nations
had not deployed the minimum necessary resources to tackle the exceptionally large
outbreak of Ebola virus. Despite repeated calls by non-governmental international
organizations such as MSF for a massive mobilization on the ground, the
international response was lethally inadequate [20].

References

1. Hays JN (2009) The burdens of disease: epidemics and human response in western history, 2nd
edn. Rutgers University Press, New Brunswick
2. Sempetegui R, Garcia L (1992) Colera. In: Sempertegui R, Naranjo P, Padilla M (eds)
Panorama Epidemilogico del Ecuador. Ministry of Public Health of Ecuador, Quito
3. Izurieta RA (2006) Death foretold in the times of cholera. Institute for the Study of Latin
America and the Caribbean, University of South Florida
4. Glass RI, Black RE (1992) The epidemiology of cholera. In: Barua D, Greeough W III (eds)
Cholera. Plenum Publishing Corporation, New York
5. Weber JT, Mintz ED, Cnizares R, Semiglia A, Gomez I, Sempertegui R, Davila A, Greene KD,
Puhr ND, Cameron DN, Tenover FC, Barret TJ, Bean NH, Ivey C, Tauxe RV, Blaske PA
(1994) Epidemic cholera in Ecuador: multidrug-resistance and transmission by water and
seafood. Epidemiol Infect 112:1–11
6. Ries A, Vugia D, Beingolea L, Palacios AM, Vasquez E, Wells GJ, Swerdlow D, Pollack M,
Dean N, Seminario L, Tauxe R (1992) Cholera in Piura: a modern urban epidemic. J Infect Dis
166:1429–1433

7. Swerdlow D, Mintz E, Roddriguez M, Tejada E, Ocampo C, Espejo L, Greene K, Saldana W,
Seminario L, Tauxe R, Wells J, Bean N, Ries A, Pollack M, Vertiz B, Blake P (1992)
Waterborne transmission of epidemic cholera in Trujillo, Peru: lessons for a continent at risk.
Lancet 340:28–32
8. Cardenas V, Saad C, Varona M, Linero M (1993) Waterborne cholera in Riohacha, Colombia,
1992. Bull PAHO 27(4):313–330
9. Colwell R, Anwarul H (1994) Environmental reservoir of Vibrio cholerae: the causative agent
of cholera. Ann N Y Acad Sci 740:44–54
10. Pan American Health Organization (1991) Risk of cholera transmission by foods. Bull PAHO
25(3):274–277
11. McIntyre RC, Tira T, Flood T, Blake P (1979) Modes of transmission of cholera in a newly
infected population on a atoll: implications for control measures. Lancet 1:311–314
12. Merson MH, Martin WT, Craig JP (1977) Cholera in Guam. Am J Epidemiol 105:349–361
13. Quick R, Thompson B, Zuniga A, Dominguez G, Brizuela E, Palma O, Almeida S, Valencia A,
Ries A, Bean H, Blake P (1995) Epidemic cholera in rural El Salvador: risk factors in a region
covered by a cholera prevention campaign. Epidemiol Infect 114:249–255
14. Izurieta R, Ochoa T, Narvaez A, Sempertegui R (1991) Investigacion del Brote de Colera en el
Recinto La Maria, Provincia del Oro. Epidemiology Bulletin, Ministry of Public Health of
Ecuador, vol 31
15. Cabrera J (2011) Estudio Hidrogeologico de la Isla Puna (Ecuador). Ingenieria de Minas.
Escuela Superior Politecnica del Litoral ESPOL
16. Narvaez A, Mathieu C (1992) El colera en las comunidades indigenas de Imbabura. Bol
Epidemiol Minist Salud Publica Ecuador 34:2–12
17. Izurieta R, Medina M (1995) Cholera: Report del brote de Salcedo. In: Narvaez A, Valcarcel M,
Betancourt Z (eds) Cholera in the Ecuadorian Highlands. Ministry of Public Health of Ecuador
and the Pan American Health Organization
18. Narvaez A, Izurieta R, Rodriguez N, Trujillo P, Vava M, Globet V (1998) Practicas preventivas
y concepciones sobre nosologia y causalidad del colera en comunidades indigenas de Imbabura
-Ecuador 1994–1996. Documento Minsiterio Salud Publica Ecuador
19. Colwell RR (1996) Global climate and infectious disease: the cholera paradigm. Science
274:2025–2031
20. Medicins Sans Frontieres (MSF) (2019) The failures of the international outbreak response.
https://www.msf.org/ebola-failures-international-outbreak-response
Agricultural Emergencies: Factors
and Impacts in the Spread
of Transboundary Diseases in,
and Adjacent to, Agriculture

Ashley Hydrick

1 Introduction: The Importance of Agriculture

In 2015, 40 years after the first call to end global hunger, the United Nations
(UN) made food security through sustainable agriculture the second Sustainable
Development Goal (SDG-2) [1]. In addition to providing for the basic human right of
food security, a robust agricultural industry is vitally essential to ensure the social,
economic, and political stability of a nation [2–4]. Food security is the state of
having access to a sufficient supply of food that is safe, nutritious, and meets a
group’s dietary needs and preferences to maintain a healthy lifestyle [5]. The state
where these criteria are not met is called food insecurity. Depending on the resource,
it is estimated that between 820 million and 940 million people worldwide live in
some form of chronic or recurring food insecurity, and about 108 million people live
in a state of food emergency [6–8]. The UN describes a two-fold intervention
approach to food insecurity: reducing the degree of exposure to the hazards that
contribute to food insecurity, or increasing the ability of communities to cope with
those hazards [5]. Hazards that contribute to food insecurity may include
human-made disasters (war or chemical/radiologic contamination), climatic events
(droughts, storms, or flooding), and disease or pest outbreaks that impact agriculture
[9, 10]. Any one or a combination of these events may constitute an agricultural
emergency, which will be defined in this chapter. Food aid has been a cornerstone of

The views and information presented are those of the author and do not represent the official
position of the U.S. Army Medical Department Center and School Health Readiness Center of
Excellence, the U.S. Army Training and Doctrine Command, or the Departments of Army/Navy/
Air Force, Department of Defense, or U.S. Government.

A. Hydrick (*)
Long Term Health Education and Training Program, U.S. Army, University of South Florida
College of Public Health, Tampa, FL, USA
e-mail: marie105@health.usf.edu

© Springer Nature Switzerland AG 2020 13


A. J. Masys et al. (eds.), Global Health Security, Advanced Sciences
and Technologies for Security Applications,
https://doi.org/10.1007/978-3-030-23491-1_2

humanitarian assistance for nearly half a century, but has been found to be costly and
unsustainable if the hazard cannot be immediately eliminated [4, 10–12]. Therefore,
development of agriculture toward food self-sufficiency has become the new focus
of world leaders in addressing the food insecurity challenges of today [4, 13]. The
purpose of this chapter is to introduce agricultural emergencies and provide some
case examples, specifically transboundary disease (TBD) events, that have threat-
ened sustainable agriculture worldwide. This discussion is not meant to provide a
detailed description of every threat to sustainable agriculture but will provide a basic
working knowledge of the area from which the reader can develop his or her own
knowledge base.

2 Food Self-Sufficiency: The New Goal of Food Security

According to the UN, investment in small farming operations and improving
agricultural technology can assist the transition of small subsistence farmers to high-
impact commercial productions that increase food security and improve social
capital by adding “marketable surpluses” [13]. In other words, the new goal of
many international organizations is the development of high-impact sustainable
agriculture to create a state of food self-sufficiency, which is the ability of a country
or community to meet or exceed its own food needs through domestic agricultural
production [4, 13, 14]. This section will introduce the importance of domestic
agriculture and food self-sufficiency in the world today.
The ability of a country to domestically produce even half of its own food
requirements means improved food security for communities, less vulnerability to
price fluctuations in the international market, increased participation in the global
food trade, and increased social stability and trust of the people [2, 6, 14]. Establish-
ment, or re-establishment, of local agricultural markets in war-torn areas has been
shown to improve the sense of social security and welfare, which increases regional
stability [12, 15]. Vulnerable populations, especially women, greatly benefit from
development of commercial agriculture and return of these markets through improv-
ing social capital [15]. It is estimated that 950 million people (16% of the world
population) cover their demand for agricultural products using international trade,
and there are about 2.5 billion small-scale farmers worldwide that subsist primarily
on their own agricultural production. Most of these individuals are in North Africa
and South Asia, which are also the most food insecure areas of the world [6, 16]. Con-
versely, many of the most food secure countries in the world are neither major
importers nor exporters of food, indicating that they are likely producing and con-
suming most of their food domestically (Table 1). These examples indicate that food
self-sufficiency can be associated with improving food security and social stability.

Table 1 Comparison of top food secure countries to top countries in agricultural trade
Rank Food security index Top food importing countries Top food exporting countries
1 Singapore China U.S.
2 Ireland U.S. Brazil
3 U.S. and U.K.a Germany Netherlands
4 Netherlands Japan Germany
5 Australia U.K. France
These rankings were obtained from the FAOSTAT (2017) and the Global Food Security Index
(2018) [17, 18]
a
The United States of America (U.S.) and the United Kingdom (U.K.) were tied for third rank

Table 2 Top producers of staple commodities worldwide


Rank Maize Rice Wheat Roots and tubers Potato Cottonseed
1 U.S.a China China Ethiopia China China
2 Chinaa India India D.R.a India India
3 Brazil Indonesia Russiaa Pakistan Russia Pakistan
4 Argentina Bangladesh U.S. Indonesia Ukraine U.S.
5 Mexico Vietnam Canada Namibia U.S. Brazil
These rankings were obtained from the FAOSTAT Countries by commodity page for the year 2016
[19]
a
China China (mainland), D.R. Dominican Republic, Russia Russian Federation, U.S. United States
of America

There are a number of societal benefits to attaining a state of food self-sufficiency,
but the social and economic benefits of this autonomous state are complicated
[14, 18]. The assumption is that food security should be intrinsically linked to robust
domestic agricultural production. However, Mainland China, which is a top producer
of nearly all staple commodities (Table 2), is also the number one global food
importer, and is still ranked 46th in the world on the Global Food Security Index
(GFSI) [17]. Conversely, the U.S. is a top staple food producer, food importer and
exporter, and is a top ranked food secure nation (Tables 1 and 2). Many countries,
like Saudi Arabia and United Arab Emirates, are able to meet their food requirements
almost purely through trade due to economic wealth stemming from other industry
[14]. The amount of global food trade is increasing with over one-sixth of agricul-
tural products entering the international market [14]. This allows for many countries
with no agricultural production capability to attain food security, while countries
with robust domestic agricultural systems supplement and improve stability of their
food resources through international trade [14, 16]. However, the concern is that, as
populations grow and resources become increasingly scarce, the lack of food self-
sufficiency in some countries will cause food insecurity and destabilize those
nations [16]. Therefore, food self-sufficiency would not only provide social, polit-
ical, and economic benefit at the local national level today, but is expected to also
improve sustainability and stability of agricultural systems into the future.

3 Agricultural Emergencies Definition

Agricultural emergencies are a threat to the stability and development of food self-
sufficiency worldwide. Many governmental and non-governmental organizations
refer to “agricultural emergencies” in several contexts, but there are limited, often
inconsistent, definitions or descriptions in literature and practice [20–23]. Some
organizations refer to agricultural emergencies exclusively in terms of disease
outbreak events [21], while others refer to emergencies in agriculture in broader
context to include the impacts of climatic disasters [20, 22–24]. In 1998, the UN
defined emergencies in the agricultural sector as those that threaten agricultural
production and livelihoods to constitute a general or food emergency [25]. Similarly,
Gilpen, Carabin, Regens, & Burden define agricultural emergencies as “any type of
event, regardless of intent, that jeopardizes the economic stability of any sector of
agriculture” [24]. These were the only two direct definitions of agricultural emer-
gencies that could be found by this author. Commonalities between these definitions
involve threats to production and economic stability on the large or small scale. This
aligns with the widely accepted general definitions of emergencies and disasters,
which are present or imminent hazardous conditions that threaten the lives and well-
being of persons and property, disrupt normal systems, and require immediate
coordinated response to prevent further damage or injury [20, 26]. Disasters are
differentiated from emergencies in that they exceed local coping capacities, thereby
requiring outside aid or assistance [26].
The agricultural emergency definitions lean heavily on easily quantifiable impacts,
such as monetary losses and production volume, making them most applicable to the
commercialized agricultural systems that are characteristic of developed nations.
There is some applicability to the small subsistence operations of the lesser developed
nations, but the coping capacity of those systems is often more quickly overcome
than in large-scale commercial agriculture [13, 27]. These definitions only partially
capture the variety of social considerations involved in agricultural emergencies,
which are generally poorly characterized in the literature, either over- or
under-sensationalizing the issue [2, 28].
emergencies, including psychosocial isolation, increases in suicides, and, in some
cases, breakdown of rule of law as a result of disease outbreak in the agricultural
systems [2]. Discussion of the economic impacts of agricultural emergencies is
important to meet the intent of the formal definitions, but this discussion will also
attempt to capture the social implications of the events presented.

4 Transboundary Agricultural Diseases

Transboundary diseases (TBDs) are highly communicable agents responsible for
high rates of morbidity and/or mortality in affected populations, and thus are usually
socially, economically, and politically significant [29, 30]. Animal TBDs are capable

of moving undetected over large geographic areas facilitated by airborne particles


and mechanical or biologic vectors, which can rapidly escalate outbreaks and makes
elimination of these diseases difficult [29–31]. There are several organizations
dedicated to the monitoring and containment of these diseases worldwide [29, 30,
32]. The World Organization for Animal Health (OIE) is one such agency that
provides surveillance and subject matter expertise for the World Trade Organization
(WTO) in all matters involving animals and TBDs. The OIE lists 117 internationally
reportable animal TBDs that affect species ranging from cattle and swine to bees and
amphibians [29, 32]. While agricultural TBDs have largely declined in developed
countries due to improved veterinary treatment, primary prevention, and biosecurity
measures, they have remained a major source of economic and political instability in
less affluent Asian, African, and South American nations [2, 9, 29, 33]. Outbreaks of
these diseases in non-endemic countries are diverse and complex large population
disasters that affect multiple species, and are increasing in frequency and scope of
impact worldwide [32–35]. This section will discuss important TBDs and several
important outbreaks that have occurred since the start of the twenty-first century.

4.1 Foot and Mouth Disease

One of the TBDs of greatest significance is foot and mouth disease (FMD), primarily
because of the severe economic and political repercussions of the disease
[36, 33]. FMD is a highly transmissible viral disease that causes fever and vesicular
lesions progressing to erosions on feet, mouth, and udders of cloven-hooved ani-
mals, especially swine and bovine species [29, 32]. This disease is associated with
medium to low mortality rates, especially in endemic zones, but is responsible for
significant morbidity and obvious production loss [29, 37]. It is estimated that
endemic zones, where the disease normally circulates, suffer losses of up to $21
billion annually due to the direct and indirect impacts of FMD [36]. Direct losses
from FMD encompass production losses, like decreased weight gain, reproductive
challenges, juvenile mortality, decreased milk production, and increased disease
burden from secondary infections [29, 37]. Additionally, FMD endemic countries
are subject to restrictions from affluent trade markets, which limits economic growth
and stability [2, 33, 37]. Epizootic outbreaks of FMD can rapidly escalate in cost in
non-endemic areas due, in part, to the direct and indirect impacts of the disease, but
also to the cost of disease elimination efforts to regain FMD-free status [9, 38]. These
risks are made more severe by the high mobility of the FMD virus, which can travel
on aerosolized particles for up to 30 miles across land and further on dust and
contaminated persons or equipment [29, 39–41]. Concerns over the challenges of
being declared FMD endemic have led countries to sometimes extreme responses to
outbreaks.
Between 2000 and 2001, multiple isolated FMD outbreaks became one of the first
seminal TBD events of the twenty-first century [33, 35, 38]. Table 3 summarizes
several of the larger 2000–2001 FMD outbreaks.

Table 3 Summary of 2000–2001 FMD outbreaks

                          UK (2001)    Uruguay (2000–2001)a   S. Korea (2000)   Japan (2000)
Duration                  7.5 months   4 months               1 month           1 month
Case No.                  2057         2033                   15                3
Animals Slaughtered       6.24 mil.    20,406                 2216              740
Financial cost (USD)      9.2 bil.     730 mil.               433 mil.          15 mil.
Financial cost (% GDP)b   0.6          3.4                    <0.1              <0.1

These numbers were obtained from the UN Food and Agriculture Organization FMD reports [38, 44]
a Uruguay experienced two isolated outbreaks of FMD in October 2000 (FMD Type O) and from April to August 2001 (FMD Type A)
b Estimated % of the country's 2001 gross domestic product (GDP)

Many countries, like Japan or Korea, "stamped out" their FMD outbreaks early due to lack of movement through
domestic markets [35]. Conversely, major outbreaks occurred in the United King-
dom (UK) and Uruguay, due to movement of infected animals through densely
populated live-animal markets [35]. This resulted in over 2000 confirmed cases in
each country with dire economic consequences [35, 38]. The UK opted for a
complete “stamp-out” approach, where all confirmed positive and exposed animals
were destroyed, that resulted in over six million head of livestock destroyed and a
financial cost of over nine billion dollars [29, 35, 42]. This outbreak cost the UK
approximately 100% of their estimated total domestic agricultural value, in addition
to threatening tourism and other industry [42, 43]. The number of culled animal
carcasses resulted in an environmental disaster and required mobilization from
military and foreign veterinary services to manage the infected animals and uphold
stop-movement orders issued by the government [14]. Similarly, from 2000 to 2001,
Uruguay experienced two separate FMD outbreaks that, combined, produced com-
parable case numbers to those seen in the UK, but resulted in lower cost and loss of
property than the UK outbreak [38]. The Uruguay government responded early to
these outbreaks with a “stamp-out” approach, which successfully eliminated the
2000 outbreak [35, 44]. However, when the disease occurred again in 2001 with
widespread transmission, the response became targeted culling with vaccination
[38, 45, 46]. This more conservative approach resulted in approximately 20,406
head of livestock destroyed, and still cost Uruguay $730 million (~50% value)
primarily resulting from the cost of vaccine distribution and prolonged status as an
FMD positive country [35, 36, 38]. However, Uruguay did not experience the
ecologic fallout seen in the UK outbreak, and local government capabilities were
not reported to have been exceeded [14, 46].
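The "% GDP" figures in Table 3 can be reproduced from the raw outbreak costs. The following minimal sketch does so under assumed, approximate 2001 GDP values; these GDP figures are rough assumptions and are not taken from the chapter.

```python
# Sketch of how the "% GDP" column in Table 3 can be derived.
# The 2001 GDP figures are rough assumptions (current USD); outbreak costs are
# the values given in Table 3.

outbreaks = {
    # country: (outbreak cost in USD, approximate 2001 GDP in USD, assumed)
    "UK":      (9.2e9, 1.5e12),   # GDP ~ $1.5 trillion (assumption)
    "Uruguay": (7.3e8, 2.1e10),   # GDP ~ $21 billion (assumption)
}

for country, (cost, gdp) in outbreaks.items():
    print(f"{country}: cost = {100 * cost / gdp:.1f}% of GDP")
# Output is close to the 0.6% and 3.4% shown in Table 3.
```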
These are overviews of impacts at the country level, but possibly more important
were the loss of income for large numbers of families, as well as loss of faith in
government authority [2, 36]. One study found that, in addition to the massive
economic loss produced by the UK FMD outbreak, farmers and response personnel

involved in the outbreak experienced profound psychological trauma, indicated a


loss of faith in governance, and isolation within their farming communities
[47]. Negative media coverage likely increased these social tensions and impact to
non-agricultural industries [28, 42]. In Uruguay, there was resistance to outbreak
management early, but that resistance declined and positive response was seen due to
the flexibility of the government response to the FMD outbreaks [38, 48]. The
political and economic impacts of FMD are obviously profound, but the social
impacts are more enduring and damaging. The comparison between these two
outbreaks shows that inflexibility of management can make the difference between
an agricultural emergency and a disaster.

4.2 Highly Pathogenic Avian Influenza

Avian influenza is another OIE-listed TBD that is politically and economically
significant, but this disease carries a more visceral social fear due to its zoonotic
potential [28, 32, 49]. Sometimes called fowl plague, avian influenza is caused by
the influenza A virus, which is the same one responsible for the annual seasonal flu
in humans [29, 32]. The majority of avian influenza strains are low pathogenic
(LPAI), resulting in variable morbidity and usually low mortality [29]. However,
several strains, especially those with H5 and H7 hemagglutinin markers, are capable
of developing into highly pathogenic avian influenza (HPAI) through a process
called antigenic drift (small changes in the viral genome) and shift (large changes
in the viral genome) [32]. Both LPAI and HPAI strains can cause significant
economic loss and food insecurity due to decreased productivity, depopulation in
controlling the disease, and direct mortality from the virus, up to 100% mortality in
HPAI affected populations [29, 49, 50]. These impacts are profound for large-scale
commercial operations, but losses from HPAI are felt most intensely by small-scale
farming families, because they are absolute losses in assets and income [51]. How-
ever, the even larger concern is that these avian circulating HPAI viruses will
undergo antigenic shift to become highly lethal human circulating influenza viruses,
which could become the next pandemic disaster [32, 52]. The combined economic
and zoonotic implications of HPAI makes this an important TBD for discussion in
this chapter.
Historically, HPAI is a sporadically occurring disease in wild and domestic birds,
capable of severe local impact, but poor extended propagation, due to the extreme
mortality of the agents [29]. However, since the start of the twenty-first century, two
HPAI strains have emerged as a focus of concern for experts due to their recurring
appearance in avian and human hosts [32, 53]. H5N1 is probably the best known and
most concerning HPAI strain that first emerged in Hong Kong (1997), and then on
Mainland China (2003) [29, 53]. Although the H5N1 strain has caused relatively
lower case numbers, it has continued to be a source of concern due to its wider host
range, which has made the virus widespread across multiple continents, higher
human case fatality rate (CFR) up to 60%, and relatively high communicability
(R0 = 2.26 in avian hosts; 1.14 in humans) [52–54]. The H7N9 LPAI/HPAI strain emerged as a
source of concern in China (2013) with higher case numbers than the H5N1 strain
(450 cases/3 years versus 683 cases/18 years) [32, 53]. Although the viral charac-
teristics of this strain are not fully understood, H7N9 appears to cause less severe
illness, with a human CFR of 22%, and has a narrower host range, which has caused
it to remain largely confined to China [30, 53, 54]. The emergence of these HPAI
strains have caused several social and economic impacts [51, 55, 56]. Reports are
highly variable for the economic impacts of the H5N1 emergence, but it is estimated
that those costs likely total in the billions (approx. 2% GDP/East Asia) in outbreak
response, animal production and market losses, not to mention the cost of human
disease and lives lost [51, 55]. Similarly, H7N9 emergence has resulted in over $5
million due to poultry industry losses and higher human case numbers [53, 56]. The
social impacts appear more profound for the H5N1 outbreak with consumer concern
over the disease causing up to 70% reduction in poultry product acquisition in some
countries, and producers indicating fear of returning to poultry production [55]. The
H7N9 outbreak produced less social fear among consumers and producers due to
transparency in public communications and non-urban impacts [56]. However, the
social and financial impacts of both viruses were elevated in severity for producers as
the majority of impacted farms were small-scale operations [51, 55]. Additionally,
continued circulation of both viruses is a point of anxiety among experts for future
pandemics [32, 52, 57].
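As a rough illustration of how the figures cited above compare, the short sketch below annualizes the reported case counts and converts the case fatality ratios into implied death tolls; the inputs are the rounded figures quoted in the text, so this is only a back-of-the-envelope approximation.

```python
# Back-of-the-envelope comparison of the two HPAI strains discussed above,
# using only the figures quoted in the text (cases, years of circulation, CFR).

strains = {
    # strain: (reported human cases, years of circulation, case fatality ratio)
    "H5N1": (683, 18, 0.60),
    "H7N9": (450, 3, 0.22),
}

for name, (cases, years, cfr) in strains.items():
    implied_deaths = cases * cfr          # CFR = deaths / confirmed cases
    cases_per_year = cases / years
    print(f"{name}: ~{cases_per_year:.0f} cases/year, ~{implied_deaths:.0f} implied deaths")
```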
Non-H5N1/H7N9 outbreaks have also been a source of great public health and
economic concern. Since 2014, there have been two significantly extended HPAI
outbreaks in internationally important poultry operations [9, 49, 50]. Starting in
December 2014, the U.S. experienced the largest agricultural TBD event in North
American history, which resulted in the loss of over 50 million birds, especially
egg-producing chickens, and cost the U.S. $897 million (1.8% industry value) in
response and recovery [9, 49, 58]. The outbreak involved three different HPAI
strains – H5N2, H5N8, and H5N1 – that were geographically dispersed with cases
centering in the Midwest, along the California coast, and Mississippi and Central
avian flyways [9]. With the U.S. producing around 18 million metric tons of poultry
products, exporting over three million, this outbreak impacted a major world com-
mercial poultry producer [29, 49, 59, 60]. Following the U.S. 2014–2015 outbreak,
thirty European countries experienced outbreaks of the same H5N8 HPAI with over
2000 positive site cases and over 6 million birds culled from 2016 to 2017 [50]. Like
the U.S. 2014–2015 outbreak, the European Union (EU) outbreak was geographi-
cally widespread to the northern and southern extents of the continent in association
with wild avian flyways [50, 61]. As the EU is another major producer of commer-
cial poultry products, exporting over 1 million metric tons, the economic impacts
were similarly profound [49, 62]. Evidence suggests that the U.S. outbreak did not
cause enough consumer concern or fear to significantly alter domestic product
consumption, but did produce supply shortages and decreased export numbers due
to trade restrictions [9, 59]. Similarly, an EU report indicated low perceived risk of
human disease transmission in affected countries, which lowered the overall social
impacts [61].

Overall, HPAI has been responsible for epidemic disease events in avian species
and thousands of human cases worldwide since the turn of the century [32]. The
impacts of these outbreaks have been experienced in the form of price instability,
food shortages, food insecurity, enduring trade restrictions well past the conclusion
of active disease transmission, and recurrent spillover of the zoonotic disease into
human populations [32, 49, 50, 60]. This discussion highlights the significance of
HPAI as an economic and political TBD, as well as the importance of these viruses as a
potential source of pandemic disease and an acute human health risk.

4.3 Crop Destroying Pests and Pathogens

Crop-destroying pests and pathogens are responsible for mass agricultural and
economic losses worldwide [63–65]. These agents are TBDs in the less obvious
sense, in that they do not emerge and re-emerge in geographically distant regions
like animal TBDs, but rather, remain strongly regionalized in their occurrence
[29, 63]. However, in recent years, the impacted regions are extending well beyond
historic boundaries and are resulting in emerging disease events in large areas
worldwide [63, 66]. On average, crop-destroying agents are responsible for up to
20% pre-harvest crop loss, and for the rejection of a further 10% of post-harvest product
from the food chain [63, 65]. With the world population rapidly growing, expected
to exceed nine billion by 2050, and the majority of that population obtaining
nutrition primarily through plant staples, like rice and wheat, losses associated
with crop-destroying agents are potentially the most impactful [16, 63, 67, 68].
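As a rough illustration of how the loss figures quoted above combine, the short sketch below multiplies the pre-harvest and post-harvest percentages. The assumption that the 10% post-harvest rejection applies to the product surviving the 20% pre-harvest loss is made here for illustration only; the chapter does not specify how the two figures interact.

```python
# Illustrative arithmetic only (assumption: the two loss stages apply sequentially).
pre_harvest_loss = 0.20    # up to 20% of the potential crop lost before harvest
post_harvest_loss = 0.10   # a further 10% of harvested product rejected from the food chain

surviving_fraction = (1 - pre_harvest_loss) * (1 - post_harvest_loss)
combined_loss = 1 - surviving_fraction

print(f"Fraction of potential production reaching the food chain: {surviving_fraction:.0%}")
print(f"Combined loss under this assumption: {combined_loss:.0%}")   # roughly 28%
```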
Crop-destroying insect pests are one of the most important agents that affect crop
yield today [64]. The desert locust is probably the oldest and most sensational
example of a crop-destroying insect pest causing profound social and economic
impacts [69]. These insects are always present at low levels throughout North Africa
and as far east as India, and emerge in outbreaks corresponding to prolonged or heavy
rains. Localized outbreaks are common occurrences to this day, emerging in several
locations throughout the locusts’ range annually, but if they are not controlled and
favorable swarm conditions continue, then outbreaks can develop into swarms or full
plagues [68, 70]. A full-sized plague is extremely costly and dangerous. The last full-
scale locust plague occurred from 2003 to 2005 and affected the food security and
livelihoods of over 8 million people across 13 million hectares of land in West
Africa, many of whom are small-scale subsistence farmers [6, 71, 72]. This locust
plague caused up to 100% crop and pasture destruction locally and cost the UN Food
and Agricultural Organization (FAO) approximately $570 million in control opera-
tions and over $90 million in 2004 food aid estimates alone [71]. Similarly, other
crop destroying insects, like caterpillars, cornsilk flies, grain beetles, and maize
weevils, are capable of massive production losses (over 77%, 22%, 63%, and 10%
respectively), but those losses are observed throughout the lifecycle of the crops
[64]. Additionally, insect pest control programs are time-consuming and expensive,
which can increase the economic impacts of crop destroying pests, and serve as a
source of social concern and commentary [65, 69, 73, 74]. The UN Food and
Agriculture Organization Desert Locust program, for example, cost over $18 million
to develop and $3 million to maintain annually [69, 71]. Finally, crop-destroying
insects can present a double threat to agricultural stability by vectoring crop-
destroying pathogens [75, 76].
Fungal pathogens are another significant emerging crop-destroying agent
[63]. There are a variety of important parasitic fungal agents that are destructive to
crop species with or without producing toxins that harm human health, like ergot or
aflatoxin [67, 75]. Recent estimates indicate that rice blast, a rice crop-destroying
fungus, is responsible for a 30% production loss, enough to feed over 60 million
people. The same study found that the development of blast-resistant rice strains
would improve U.S. production by over $69 million [66]. That same improvement
would likely be even more profound in the world’s top rice producers (Table 2).
Similarly, wheat blast is responsible for production losses ranging from 40% to
100% in affected regions worldwide [34]. Many of the most heavily impacted areas have
poor economic stability and limited biosecurity capability to prevent and treat for
these fungi, which represents one disastrous consequence of crop-destroying path-
ogens [6, 33, 63, 66]. However, concerns over the northward migration of fungal
pathogens, and the threat they pose to the agricultural security of affluent trade partners,
cause those trade partners to increase biosecurity measures, restricting trade
access and income for less affluent nations [33, 63]. As seen with FMD, the
two-fold consequence of production and trade loss associated with crop-destroying
pests and pathogens can have profound impacts on affluent commercial production
schemes, but can be devastating for the less affluent small-scale producers.

5 Agriculture Adjacent Transboundary Disease Events

In addition to causing disease in agriculturally significant species, many of the OIE
listed TBDs are maintained and vectored in wildlife species, including comparable
mammalian and arthropod species, which complicates the control of these diseases
[29, 77]. The agriculture-wildlife interface serves as a continuing outbreak source for
many TBDs, like avian influenza and classical swine fever, which have been
eliminated from agricultural production in most developed countries
[29, 32]. Throughout the world, wild avian, rodent, cloven-hooved species and
arthropod pests appear to be the most significant non-human vectors of agricultural
disease [32, 75–77]. Ongoing disease outbreaks in the agricultural sector can be
facilitated by the bi-directional transmission of disease across the agriculture-
wildlife interface, which will be discussed in greater detail in the next section
[50, 78]. The expansion of agricultural practices, and other human-centered activities,
to meet growing population demands likely increases the risk of disease outbreaks
from wildlife sources, which is a major threat [16, 77]. Control of TBDs at the
wildlife interface is made additionally complicated by inconsistent, sometimes
conflicting interagency policy within governments [78]. Concerns over these
challenges have led to an overall increase in literature on the topic, as well as the
development of monitoring programs to better detect disease occurrence at the
wildlife-agricultural interface [30, 31, 77, 78].
One area where the literature has been less consistent and more stagnant is
agriculture-adjacent diseases, which do not directly impact agricultural systems
[31, 77, 79]. There are many diseases that only impact wildlife but are capable of
secondary damage to agriculture, especially those diseases responsible for species
decline and genetic homogenization in ecosystems [31, 79]. While the impacts of
wildlife epizootic diseases are intensively studied from an ecologic standpoint, there
is limited research directed toward the secondary effects that these species extinc-
tions can have on anthropocentrically important issues like agriculture and public
health [79–81]. One example of this issue is found in the North American epizootic
outbreak of White Nose Syndrome.

5.1 White Nose Syndrome

White Nose Syndrome (WNoS) is an emerging fungal disease in North America
responsible for severe colony mortality of cave-hibernating bat species in 31 US
states and five Canadian provinces [31, 79]. Named for the white fungal plaques that
form around the nose, ears, and wings of affected bats, WNoS increases the return to
normothermia and subsequent wakefulness of hibernating bats, causing them to
consume their nutritional energy stores before winter's end and resulting in starvation
[31, 80]. The causative fungus for WNoS, Pseudogymnoascus destructans, is
endemic throughout Europe and Asia, but bat species native to those regions are
largely unaffected by the fungus [81, 82]. However, since its emergence in North
America in 2006, WNoS has been responsible for over 90% colony mortality and the
decline of some bat species almost to the point of extinction throughout the eastern
half of North America [79, 80]. In addition, survivors of WNoS are expected to
experience chronic sequelae leading to continued morbidity and mortality in later
hibernation seasons [31, 83]. Experimental studies have shown that the loss of
predator species, like the WNoS-affected bat species, in an area results in elevated
crop destruction due to insect infestation and vector-borne disease transmission
[75, 76]. Studies like the ones referenced here have led governmental and
non-governmental agencies to express concern over public health risks, such as
increasing disease transmission and worsening agricultural stability [79, 80, 84].
However, further research into this topic indicates that the relationship between the
ecologic and public health impacts of WNoS is likely more complicated than
expected [76, 81, 82, 84]. Research does support that the absence of a predator
increases disease transmission and spread of crop-destroying pests, but the loss of
predator diversity does not produce a statistically significant difference in experi-
mental agroecosystems [75, 76]. There are North American bat species that are as
resistant to WNoS as those in Europe, which could serve to fill some of the
ecological niches of the WNoS-susceptible bats [31, 76, 81, 82]. Yet other studies
indicate that this niche-fill theory may be limited in applicability due to the dietary
specificity of the different bat species [84]. This author could only find limited
research into the cause and effect relationship of the WNoS impacts in diverse
ecosystems and dynamically mobile populations. However, if this relationship
could be better established, WNoS could be a prime example of a large ecologic
outbreak disaster producing negative consequences at the agriculture-wildlife
interface.

6 Factors in the Spread of Transboundary Disease

Human activity is likely the most important factor in TBD and epizootic disease
movement and spread [31, 32, 35]. As discussed in the FMD section of this chapter,
the natural mobility of the virus can be increased from 30 miles to hundreds of miles
across land and sea through human activity [29, 39–41]. Indirect fomite transmission
was determined to be the main source of viral entry into each country in the U.K. and
Uruguay FMD outbreaks [40, 42, 45]. One comparative study of the early twenty-
first century FMD outbreaks found that live animal trade and exposure of dense
animal populations were the most significant factors determining the size and extent
of those outbreaks [35]. Similarly, the main risk factor for domestic poultry exposure
to the HPAI virus in the 2014–2015 U.S. and 2016–2017 European outbreaks was
indirect transmission through contaminated personnel [50, 58]. The incursion of
explorers and tourism enterprises into bat hibernation habitats has been implicated
in the importation and spread of WNoS in the U.S. [31, 82]. There is documented
concern over the risks of insect movement to distant regions through the interna-
tional trade of fresh fruits and vegetables; some of those insects may be vectoring
crop-destroying pathogens in addition to being, themselves, crop-destroying agents
[69, 75]. Human-driven genetic homogenization of crop species is suspected to
increase their vulnerability to disease transmission [34]. Finally, the expansion of
agricultural systems has encroached on wild habitats and increased the risk of
agricultural-wildlife disease transmission [78].
The likely contribution of climate change and climate variability as factors in the
spread of TBDs must also be addressed in this section [63, 66, 85]. The social and
political debate of climate change and the degree of human contribution to that
change is not within the scope of this discussion. However, scientists have noted
climate associated shifts in vector and disease patterns over the past two decades that
warrant concern [31, 32, 63, 66]. Normally occurring patterns in locust recession and
upsurge are intrinsically linked to fluctuating wet and arid weather patterns, with
the current habitat range expected to change due to climatic shifts [68, 70]. Climate
conditions can be important in the mobility of the FMD virus, with wind patterns and
atmospheric particulates serving to increase its spread, which became an
important consideration in the 2000 South Korean FMD outbreak response (Table 1)
[41]. Research suggests that crop destroying fungal pathogens are moving toward
the poles at a rate of eight kilometers per year, in a process that scientists largely
attribute to climate change [63, 66]. One commonly-referenced source indicates that
the melting of the permafrost in extreme northern countries may disturb previously
encased pathogen sources and increase the probability of re-emergence of diseases
that are currently controlled or eradicated [85].
The importance of the agriculture-wildlife interface in transboundary disease
cannot be overemphasized because, as previously discussed, many diseases are
maintained and vectored at that interface. In the case of the twenty-first century
HPAI outbreaks, while contamination from personnel was the
likely source of exposure for domestic poultry, the spread of the H5N1 and H5N8
viruses strongly corresponds with the wild avian flyways [9, 52]. It was noted in
earlier sections that insect species can vector crop-destroying pathogens, which are
responsible for millions, if not billions, of dollars in production losses worldwide
[63, 66]. Although wildlife did not serve as a significant vector in the twenty-first
century FMD outbreaks, FMD is just one of many economically and politically
important diseases that are monitored in wildlife populations worldwide
[30, 31]. As previously discussed in detail, the loss of species diversity in the wildlife
sector is likely to result in a decline of ecologic resiliency, which can lead to
secondary consequences that are expected to increase pest burden and vulnerability
of domestic species [79].
These challenges threaten agriculture and food security, but they cannot simply be
avoided. Climatic variations, for example, have always been unavoidable and, since
the threat cannot be eliminated entirely, an increasing degree of resiliency
must be attained through innovative development [5, 10, 13, 16]. In fact, it is human-
driven trade and technological/systems advancements that have been largely respon-
sible for disease eradication and increasing affluence worldwide [32–34]. It is
advancements in testing, vaccination, and treatment of the animal TBDs that have
led to their eradication in developed countries [32]. Similarly, genetic manipulation
of crop species may risk homogenization and decreasing resiliency to novel threats,
but it is also being used to develop crop strains that are resistant to the crop-
destroying pathogens [64, 66]. Another positive note is that, whether it is due to
management programs or inhibitory climate conditions, the desert locust has been in
recession since 2005, which is the longest recorded recession in over 50 years
[69, 70]. Moving forward, experts agree that a balanced trade-off of utility, innova-
tion, and security will be required for successful management of TBDs and epizootic
diseases into the future [33, 34, 65]. This will require continuous objective study of
these diseases in usual transmission and outbreak situations, in addition to the
implementation of well-informed and fluidly adaptive control systems. More
research and knowledge synthesis are required to determine the importance of
climatic factors and indirect agricultural-adjacent diseases on agricultural stability
and epizootic disease transmission.
7 Agroterrorism: Malicious Attacks on Food and Social Security

Terrorism and malicious attacks are the final major consideration for TBD transmis-
sion [67, 86, 87]. Agricultural terrorism (agroterrorism) and food terrorism are the malicious
introductions of disease-causing materials, especially biological agents, into the
food-chain, through the agricultural production or manufacturing/serving phases
respectively [67]. The food and agricultural sectors have been appealing targets for
opposing political and military forces throughout history [67, 87]. The reason for this
becomes obvious from the widespread civil, political, and social fear seen in naturally
occurring TBD outbreaks [2, 47, 86]. In short, these are attacks meant to sabotage the
food-chain to introduce food insecurity, increase social fear, cause economic and/or
political instability, or intentionally cause injury to select populations [2, 67]. This
final section will discuss the development and threats of agroterrorism in the world
today.
In ancient warfare, the launching of plague corpses and poisoning of water
sources were well-known tactics for weakening enemy troops with disease and
instilling fear [86, 87]. These methods advanced in the twentieth century through
improved scientific knowledge and technologic developments, with important
biowarfare research programs emerging in many countries including Japan, the
U.S., the Union of Soviet Socialist Republics (U.S.S.R.), and Iraq [87]. Today, most
nations have stopped research into these agents as offensive weapons following the
signing of the 1972 Biological and Toxin Weapons Convention [67]. However,
some governments including Syria, Iran, Iraq, and North Korea have continued their
research efforts despite urging from the international community [67, 87]. Moreover,
it is not just national governments that have researched the development and utility
of weaponized bioagents; these agents serve as effective, low-cost weapons for non-
state actors as well [67, 86]. There are several documented cases of ideologic or
religious groups, as well as unassociated individuals, that have attacked or threat-
ened food or agricultural systems to cause terror, make political or social statements,
or gain economic benefit [86]. Probably one of the best-known examples of a
modern food terrorism attack is the series of Rajneeshee cult attacks in August and September
of 1984 [67, 86]. The Rajneeshee cult was a religious terrorist group that intentionally
contaminated the salad bars of ten restaurants in The Dalles, Oregon with Salmonella
typhimurium, which is a common causative agent of gastroenteritis [67, 88]. This
attack, which resulted in a total of 751 cases between September and October, was
politically motivated with the goal of altering the outcome of a local election by
preventing voter participation due to illness [88]. In 2011, a South African man was
arrested after demanding $4 million from the U.S. and U.K. under threat of an FMD
attack on their agricultural systems, an act that was motivated by political discontent and
economic opportunism [67].
These threats against agricultural stability and food security have sparked
increased interest in research of food chain vulnerability [86]. Many world govern-
ments have developed lists of restricted and reportable pathogens that may be
associated with malicious attacks on the agricultural sector and food security, or on a
nation’s populous itself [32, 67]. One author has concluded that the most likely
biological attacks would be zoonotic, specifically Bacillus anthracis (Anthrax),
based on the preferences of historic biowarfare programs. While agents that cause
disease in humans would be most sensational, they are not required to cause massive
economic loss and unrest in a society [2, 28, 47, 89]. Although agro- and food
terrorism events have been small in number and scale, this area remains a potential
target for malicious attack [67]. This indicates the need for continuing monitoring
and research into the threats of biological agents that can be used as agro- or food
terrorism agents.

8 Conclusion

This chapter introduced the background and concepts of agricultural emergencies,
and discussed the issues and concerns in the occurrence and spread of
epizootic plant and animal diseases. TBD outbreaks, like the ones discussed here, are
socially, economically, and politically significant events that offer a diverse array of
impacts and challenges to agriculture and trade. These diseases cost millions to
billions of dollars in production and trade losses in endemic countries, as well as in
epizootic outbreaks, and they are also capable of causing social fear and civil
unrest. TBDs directly threaten human health and food security, and these diseases,
along with epizootic wildlife diseases like WNoS, are responsible for degradation of
ecosystems, which can increase the vulnerability of human activities to environmental
conditions and disease. Human activity, climate change and variability, and the
wildlife-agriculture interface are the most important drivers in the occurrence and
spread of these diseases under normal conditions. The threat of a malicious attack,
agro- and food terrorism, using TBD and other epizootic pathogens is a continuing
danger as well. These diseases are not new threats but are continuing issues in the
twenty-first century. The threats in the occurrence and spread of these diseases are
not easily eliminated and the potential severity of impacts caused by TBDs warrants
continuing research and monitoring.

References

1. United Nations (2018) Food security and nutrition and sustainable agriculture. In Sustainable
development goals: knowledge platform. [Online]. https://sustainabledevelopment.un.org/
topics/foodagriculture#
2. Lubroth J et al (2017) Linking animal disease and social instability. Rev Sci Tech 36
(2):445–457
3. UN General Assembly (1948) General assembly resolution 217 A: the universal declaration of
human rights. United Nations, New York
4. U.S. Agency for International Development (2018) Agriculture and food security. USAID What
we do. [Online] October 04, 2018. https://www.usaid.gov/what-we-do/agriculture-and-food-
security
5. Food and Agriculture Organization (2008) An introduction to the basic concepts of food
security, [prod.] United Nations. Food Security Information for Action: Practical Guidelines,
Rome
6. FAO, IFAD, UNICEF, WFP, and WHO (2018) The state of food security and nutrition in the
world 2018: building climate resilience for food security and nutrition. UN FAO, Rome
7. Food Security Information Network (2017) Global report on food crises. World Food
Programme/UN, Rome
8. Von Grebmer K et al (2017) 2017 global hunger index: the inequalities of hunger. International
Food Policy Institute, Washington, DC
9. McElwain TF, Thumbi SM (2017) Animal pathogens and their impact on animal health, the
economy, food security, food safety, and public health. International Office of Epizootics/
WHO, Rev Sci Tech 36(2, s.l)
10. Sandstrom J et al (2014) Future threats to agricultural food production posed by environmental
degradation, climate change, and animal and plant diseases – a risk analysis in three economic
and climate settings. Food Secur 6:201–215
11. Director General for International Cooperation and Development (2018) Food and nutrition
security. European Commission, Brussels
12. Trenchant J-P et al (2018) The impact of food assistance on food insecure populations during
conflict: evidence from a quasi-experiment. World Dev 2:e1–e18
13. Uwizera C (2015) Sustainable development: report of the Second Committee. s.l. : United
Nations General Assembly. A/70/472
14. Clapp J (2017) Food self-sufficiency: making sense of it, and when it makes sense, Food Policy,
vol 66. Food and Agricultural Organization/UN, Rome, pp 88–96
15. Upreti B, Ghale Y, Sony KC (2016) Effects of armed conflict on agricultural markets and post-
conflict engagement of women in export-led agriculture in Nepal. J Int Women's Stud
18:156–180
16. Fader M et al (2013) Spatial decoupling of agriculture production and consumption: quantifying
dependencies of countries on food imports due to domestic land and water constraints. Environ
Res Lett 8:e014046
17. The Economist Intelligence Unit (2018) The global food security index. The Economist.
[Online] Corteva Agriscience. https://foodsecurityindex.eiu.com
18. Food and Agriculture Organization (2017) FAOSTAT. [Online]. http://www.fao.org/faostat/en/
#home
19. Food and Agriculture Organization/UN (2017) Countries by commodity. Food and Agriculture
Statistics. [Online]. http://www.fao.org/faostat/en/#rankings/countries_by_commodity
20. Agriculture and Agri-Food Canada (2018) Agriculture emergency management. Industry,
markets and trade. [Online]. http://www.agr.gc.ca/eng/industry-markets-and-trade/agriculture-
emergency-management/?id¼1468878400208
21. Animal Health Australia/Plant Health Australia (2018) Farm biosecurity. [Online]. http://www.
farmbiosecurity.com.au/
22. Dennison K et al (2012) Planning and resource management best practices. s.l. : National
Alliance of State Animal and Agricultural Emergency Programs
23. U.S. Department of Agriculture. Emergency Response (2018) United States Department of
Agriculture Animal and Plant Inspection Service. [Online]. https://www.aphis.usda.gov/aphis/
ourfocus/emergencyresponse
24. Gilpen J et al (2009) Agriculture emergencies: a primer for first responders. Biosecurity
Bioterrorism: Biodefense Strategy 7:187–198
25. Food and Agricultural Organization/UN (1998) FAO’s emergency activities: technical hand-
book series. United Nations, Rome
26. UN Office for Disaster Risk Reduction (2018) terminology on disaster risk reduction. UNISDR
What We Do. [Online]. https://www.unisdr.org/we/inform/terminology#letter-d
27. FAO Pest Management Team (2015) The integrated production and pest management
programme. Food and Agricultural Organization, Rome
28. Klemm C, Hartmann T, Das E (2017) Fear-mongering or fact-driven? Illuminating the interplay
of objective risk and emotion-evoking form in response to epidemic news. Health Commun
34:1–11
29. Committee on Foreign and Emerging Diseases (2008) Gray book: foreign animal diseases.
United States Animal Health Association, St. Joseph
30. EMPRES-Animal Health Webmaster/UNFAO (2018) EMPRESTADs. Animal production and
health. [Online] UN Food and Agriculture Organization. http://www.fao.org/ag/againfo/
programmes/en/empres/diseases.asp
31. U.S. Geological Survey (2017) White Nose Syndrome. USGS Wildlife Health Center. [Online].
https://www.nwhc.usgs.gov/disease_information/white-nose_syndrome/
32. World Organization for Animal Health (2018) OIE-listed diseases, infections, and infestations
in force in 2018. World Organization for Animal Health, Paris. s.n
33. Shanafelt D, Perrings C (2018) The effect of the post 2001 reforms on FMD risks of the
international live animal trade. EcoHealth 15:1315–1318
34. McDonald B, Stukenbrock E (2016) Rapid emergence of pathogens in agro-ecosystems: global
threats to agricultural sustainability and food security. Philos Trans B:371, 2016026
35. McLaws M, Ribble C (2007) Description of recent foot and mouth disease outbreaks in
nonendemic area: exploring the relationship between early detection and epidemic size. Can
Vet J 48:1051–1062
36. Knight-Jones T, Rushton J (2013) The economic impacts of foot and mouth disease – what are
they, how big are they and where do they occur? Prev Vet Med 112:161–173
37. Rushton J, Knight-Jones T (2011) The impact of foot and mouth disease. Food and Agriculture
Organization & International Office of Epizootics, Rome
38. Committee on Commodity Problems (2002) Animal diseases: implications for international
meat trade. Intergovernmental Group on Meat and Dairy Products, Rome
39. Mohr S et al (2018) Manipulation of contact network structure and the impact on foot-and-
mouth disease transmission. Prev Vet Med 157:8–18
40. Rivas AL et al (2006) Human-mediated foot-and-mouth disease epidemic dispersal: disease and
vector clusters. J Veterinary Med Ser B 53:1–10
41. Joo Y-S et al (2002) Foot-and-mouth disease eradication efforts in the Republic of Korea.
Can J Vet Res 66:122–124
42. The Comptroller and Auditor General Office (2002) The 2001 outbreak of foot and mouth
disease. National Audit Office, London. HC 939
43. The World Bank. Agriculture, forestry, and fishing, value added (% of GDP) (2018) The World
Bank: data. [Online] The World Bank/UN. https://data.worldbank.org/indicator/NV.AGR.
TOTL.ZS
44. Ryan J (2000) FMD situation in Europe and other Regions 1999 and 2000. Food and Agricul-
tural Organization/UN, Rome
45. Canon A et al (2017) Module 5: vesicular diseases. National Veterinary Accreditation Program/
USDA. [Online]. https://nvap.aphis.usda.gov/VESIC/vesic0340.htm
46. Animal and Plant Health Inspection Service. AgLearn: FMD in Uruguay 2001. s.l. :
U.S. Department of Agriculture
47. Mort M et al (2005) The psychosocial effects of the 2001 UK foot and mouth disease epidemic
in a rural population: qualitative diary based study. BMJ 331:e1234
48. Jelen M (2001) Development in Uruguay: everyone pays price of foot-and-mouth. Interpress
Service News Agency
49. Ramos S, MacLachlan M, Melton A (2017) Impacts of the 2014–2015 highly pathogenic avian
influenza outbreak on the U.S. poultry sector. US Department of Agriculture World Agriculture
Outlook Board, Washington DC
50. Napp S et al (2018) Emergence and spread of highly pathogenic avian influenza A (H5N8) in
Europe in 2016–2017. Transboundary Emerg Dis 65:1–10
51. McLeod A et al Economic and social impacts of avian influenza. FAO Emergency Center for
Transboundary Animal Diseases Operations, Geneva
52. Peiris JS, de Jong M, Guan Y (2007) Avian influenza virus (H5N1): a threat to human health.
Clin Microbiol Rev 20(2):243–267
53. Bui C et al (2016) A systematic review of the comparative epidemiology of avian and human
influenza A H5N1 and H7N9 – lessons and unanswered questions. Transbound Emerg Dis
63:602–620
54. Bui CM et al (2017) Influenza A H5N1 and H7N9 in China: a spatial risk analysis. PLoS ONE
12(4):e0174980
55. Otte J et al (2008) Impacts of avian influenza virus on animal production in developing
countries. CAB Rev: Perspect Agric Vet Sci Nutr Nat Resour 3:1–18
56. Qiu W et al (2018) The impacts on health, society, and economy of SARS and H7N9 outbreaks
in China: a case comparison study. J Environ Public Health e2710185
57. Kahn R, Richt J (2013) The novel H7N9 influenza A virus: its present impact and indeterminate
future. Vector-Borne Zoonotic Dis 13:347–348
58. Veterinary Services Surveillance, Preparedness, and Response Services (2016) Final report for
the 2014–2015 outbreak of highly pathogenic avian influenza (HPAI) in the United States.
Animal and Plant Inspection Service/USDA, Riverdale
59. Martin J, Mezoughem C, Sandoval L (2014) Livestock and poultry: world markets and trade.
Foreign Animal Service/USDA, Riverdale
60. Kuberka L et al (2016) Livestock and poultry: world markets and trade. s.l. : Foreign Agricul-
ture Service/USDA
61. European Center for Disease Prevention and Control (2016) Rapid risk assessment: outbreaks
of highly pathogenic avian influenza A (H5N8) in Europe. European Union, Stockholm
62. van Horne PLM (2017) Competitiveness of the EU poultry meat sector, base year 2015.
Wageningen Economic Research, The Hague. 2017-005
63. Bebber D, Gurr S (2015) Crop-destroying fungal and oomycete pathogens challenge food
security. Fungal Genet Biol 74:62–64
64. Silva GA et al (2018) Spatial distribution and losses by grain destroying insects in transgenic
corn expressing the toxin Cry1Ab. PLoS ONE 13(8):e0201201
65. Kumar AN, Murugan K, Madhiyazhagan P (2013) Integration of botanicals and microbials for
management of crop and human pests. Parasitol Res 112:313–325
66. Nalley L et al (2016) Economic and environmental impact of rice blast pathogen (Magnaporthe
oryzae) alleviation in the United States. PLoS One 11:1–15
67. Keremidis H et al (2013) Historical perspective on agroterrorism: lessons learned from 1945 to
2012. Biosecurity Bioterrorism: Biodefense Strategy Pract Sci 11:S17–S24
68. Meynard C et al (2017) Climate-driven geographic distribution of the desert locust during
recession periods: subspecies’ niche differentiation and relative risks under scenarios of climate
change. Glob Chang Biol 23:4739–4749
69. World Summit on Food Security (2009) Transboundary animal and plant pests and diseases,
WSFS Secretariat. Food and Agricultural Organization, Rome
70. World Meteorological Organization/UN (2016) Weather and desert locusts. World Meteoro-
logical Organization and Food and Agricultural Organization/UN, Geneva. WMO-No. 1175
71. Cressman K (2015) Desert locust economics: a case study. Food and Agricultural Organization,
Rome
72. Ceccato P et al (2007) The desert locust upsurge in West Africa (2003–2005): information on
the desert locust early warning system and the prospects for seasonal climate forecasting. Int J
Pest Manag 53(1):7–13
73. Carson R (1962) Silent spring. Mariner Books, Boston
74. Hayes T, Hansen M (2017) From silent spring to silent night: agrochemicals and the
anthropocene. Elementa Sci Anthropocene 5(57):1–24
75. Maine JJ, Boyles JG (2015) Bats initiate a vital agroecological interaction in corn. Proc Natl
Acad Sci 112(40):12438–12443
76. Long E, Finke D (2015) Predators indirectly reduce the prevalence of an insect-vectored plant
pathogen independent of predator diversity. Oecologia 177:1067–1074
77. Wiethoelter A et al (2015) Global trends in infectious disease at the wildlife-livestock interface.
Proc Natl Acad Sci 112(31):9662–9667
78. Miller R, Farnsworth M, Malmberg J (2013) Diseases at the livestock-wildlife interface: status,
challenges, and opportunities in the United States. Prev Vet Med 110:119–132
79. U.S. Fish and Wildlife Service (2011) National plan for assisting states, federal agencies, and
tribes in managing White-Nose Syndrome in bats. U.S. Fish and Wildlife Service, Hadley.
National Response Plan
80. Alves DMCC, Terribile LC, Brito D (2014) The potential impact of White-Nose Syndrome on
the conservation status of North American bats. PLoS One 12:e107395
81. Zukal J et al (2016) White-nose syndrome without borders: Pseudogymnoascus destructans
infection tolerated in Europe and Palearctic Asia but not in North America. Nature 6:1–13
82. Hayman DTS et al (2016) Environment, host, and fungal traits predict continental-scale White-
Nose Syndrome in bats. Sci Adv 2:e1500831
83. Davy C et al (2016) Conservation implications of physiological carry-over effects in bats
recovering from white-nose syndrome. Conserv Biol 31:615–624
84. Clare E et al (2014) The diet of Myotis lucifugus across Canada: assessing foraging quality and
diet variability. Mol Ecol 23:3618–3632
85. Revich B, Podolnaya M (2011) Thawing of permafrost may disturb historic cattle burial
grounds in East Siberia. Glob Health Action 4:8482
86. Olson D (2012) Agroterrorism: threats to America’s economy and food supply, [prod.] Federal
Bureau of Investigation. Law Enforcement Bulletin, Washington, DC.: s.n.
87. Zilinkas RA (2017) A brief history of biological weapons programmes and the use of animal
pathogens as biological warfare agents. Rev Sci Tech 36(2):415–422
88. Torok T et al (1997) A large community outbreak of Salmonellosis caused by intentional
contamination of restaurant salad bars. J Am Med Assoc 278(5):389–395
89. Person B et al (2004) Fear and stigma: the epidemic within the SARS outbreak. Emerg Infect
Dis 10:358–363
The Threat Within: Mitigating the Risk
of Medical Error

Simon Bennett

University of Leicester, Leicester, UK
e-mail: sab22@leicester.ac.uk
1 Introduction

An irony of healthcare the world over is that patients die at the hands of doctors,
nurses, pharmacists, general practitioners and other medical professionals
[1, 2]. They die because the organisational culture within which medicine is prac-
ticed is not a learning culture. For a variety of reasons, information pertinent to
patient safety is either overlooked, ignored, devalued or not verbalised.
According to Professor Sir Brian Jarman, director of the Dr. Foster Unit at
Imperial College, London, National Health Service whistleblowers are “fired,
gagged and blacklisted”. According to the Professor, “nobody dare whistleblow in
the NHS” [3]. It is reasonable to assume that the possibility – however remote – of
being fired, gagged or blacklisted, influences the perceptions and behaviour of
employees, perhaps dissuading them from reporting under-resourcing, incompe-
tence, malpractice or error. The under, or non-reporting of error creates resident
pathogens or latent errors (see Reason’s [4] work for a definition of same) – the
preconditions of failure – within the healthcare system.
Systems-thinking [4–12] reveals the behaviours that foment the healthcare sys-
tem’s pathogenic culture.

1.1 Bullying

In a 2016 Guardian survey of 1500 medical professionals, 8 out of 10 claimed to
have been bullied: “[The on-line survey revealed that] 81% had experienced
bullying, and for almost half of them . . . it is still ongoing . . . . A third of victims said
they had been pushed out of their jobs, with many developing serious mental health
problems as a result, while almost three-quarters reported increased stress and panic
attacks” [13].
Bullying has both direct and indirect consequences. Respondents who claimed to
have been bullied took, on average, 108 days off work. According to Bowles and
Cooper [14], performance is linked to morale: “If there is one word which encapsu-
lates the benefits which accrue from a high morale organisation, it is this: perfor-
mance. This refers to performance at the individual level and that of the organisation
as a whole. Evidence for morale correlating highly with, and driving, performance is
strong and growing”. To the extent that bullying lowers morale, it is bad for
productivity and, potentially, health outcomes. Following publication of a damning
report into patient deaths at the National Health Service’s Gosport War Memorial
Hospital [15], Health and Social Care Secretary Jeremy Hunt told the BBC: “We do
have to tackle [the] blame culture” [16]. Employees who fear they might be
disciplined or sacked if they question a clinical judgement or practice are unlikely
to whistle-blow. Fear silences. Silence leaves error unreported and malpractice
unchallenged.

1.2 Normalisation

In her seminal analysis of the 1986 Challenger space shuttle disaster (STS-51-L
exploded shortly after launch when a seal in one of its solid rocket boosters
ruptured), sociologist Diane Vaughan developed the concepts “normalisation of
deviance” and “social organisation of mistake” [17] to explain NASA’s tolerance
of increasingly high levels of risk. According to Bennett [12], these processes obtain
in healthcare, where medical professionals’ quiescence or withdrawal supports the
normalisation of sub-optimal or dangerous practices.

1.2.1 Evidence for Normalisation

In late June 2018, an independent investigatory panel published its report into deaths
at Gosport War Memorial Hospital [15]. The Gosport Independent Panel (GIP)
found that, over a period of 11 years, the lives of more than 650 elderly patients
had been cut short as a consequence of the hospital’s prescribing regime. Specifi-
cally, as a consequence of the prescribing of dangerous levels of opioid drugs
(diamorphine, haloperidol, midazolam and hyoscine). Concerned about the prescrib-
ing regime, in the early 1990s some of Gosport’s nurses contacted the Royal College
of Nursing (RCN). At a December, 1991 meeting with hospital managers, nurses
were told to address their concerns to the responsible doctor, the senior sister and a
consultant geriatrician. According to the GIP, staff were told “to keep any concerns
within the ward, rather than taking their concerns to others outside the hospital”
[15]. The regime continued. More patients died.
Between 1998 and 2010, Hampshire Constabulary conducted three investiga-
tions. According to the GIP, each was flawed. Following publication of the GIP’s
report, Chief Constable Olivia Pinkney was contrite: “The force has always
acknowledged that the first two police investigations were not of a high quality.
The report makes clear a view from the Panel that the third did not look widely
enough” [18]. It is pertinent to ask why the first two investigations ‘were not of a high
quality’. Were investigators incompetent? Were investigators encouraged to ‘go
easy’ on an under-pressure and under-resourced NHS?
According to the Gosport Independent Panel, “during a certain period at Gosport
War Memorial Hospital, there was a disregard for human life and a culture of
shortening the lives of a large number of patients” [15]. The deliberate shortening
of lives at the Gosport War Memorial Hospital was symptomatic of an organisational
malaise. Specifically, an organisational culture that, over time and with the active
participation of clinicians and managers, normalised a dangerous practice – the
prescribing of high doses of opioids: “There was an institutionalised regime [for
institutionalised read normalised] of prescribing and administering ‘dangerous
doses’ of a hazardous combination of medication not clinically indicated or justified,
with patients and relatives powerless in their relationship with professional staff”
[15]. As mentioned, Gosport War Memorial Hospital’s governing elite (composed of
senior clinicians, nursing staff and managers) was central to the normalisation
process. Vaughan’s observations on the role of elites and organisational culture in
the social organisation of mistake are relevant to the Gosport scandal: “[The Chal-
lenger launch decision] reminds us of the power of political elites, environmental
contingencies, history, organisational structure and culture, as well as the impact of
incrementalism, routines, information flows and taken-for-granted assumptions in
shaping choice in organisations” [17].
What happened at Gosport is an object lesson in how not to organise for safety.
Safe organisations are mindful organisations [19, 20], a mindful organisation being
one in which, in the context of a just culture, staff are encouraged to voice concerns
and managers are expected to listen and act where appropriate. The Gosport War
Memorial Hospital was far from being a mindful organisation. Indeed, to the extent
that its governing elite brooked no dissent, Gosport was a mindless organisation. It
was an accident waiting to happen.

1.3 Clubbishness

Clinicians’ shared histories, experiences, training and social life promote a common
value-set and world-view, creating a clinician in-group. Under such conditions,
judgements and practices are unlikely to be questioned – even when those practices
are deviant and attract criticism. Social cohesion inhibits methodical scepticism.
According to Leyens et al. [21], in-groups look after their own: “[G]roup members
exhibit in-group favouritism biases . . . . People prefer their in-group to an out-group
. . . they interpret more leniently an ambiguous behaviour performed by an in-group
member than by an out-group member [and] they excuse more readily anti-
normative behaviours committed by an ingrouper than by an outgrouper . . . they
perceive bias in neutral reports of their conflict with an outgroup . . . and so on . . . .
[O]utgroup derogation [to derogate is to devalue or diminish] also serves to make
one’s group superior to the outgroup. People derogate outgroupers to feel better . . .
When under threat, they try to restore a positive image of their ingroup by denigrat-
ing the outgroup . . . . [I]ngroup favouritism and outgroup derogation reflect a
protection of the ingroup . . .”.

1.3.1 Evidence for Clubbishness

Between 1988 and 1995, clinicians at Alder Hey Children's Hospital, Liverpool,
England removed and stored organs from around 850 infants without the consent
of parents or guardians. The practice was not confined to Alder Hey. The public
inquiry into the scandal criticised both the senior clinician responsible for the
practice, Professor van Velzen, and other senior clinicians for failing to supervise
van Velzen: “Professor van Velzen was guilty of the following activities . . . [order-
ing] the unethical and illegal retention of every organ in every case for the overriding
purpose of research . . . falsifying statistics, records and work output . . . falsifying
research applications . . .falsifying postmortem reports ... failing to keep a proper
catalogue or record of the stored organs . . . . Alder Hey and the University
[of Liverpool], knowing of the risks inherent in the appointment of Professor van
Velzen . . . failed to supervise and performance-manage the new unit . . . Alder Hey
and the University failed to implement the job plan and additional supervision laid
down by their Joint Review in 1993 . . . The chief MLSOs [medical laboratory
scientific officers] were complicit in . . . van Velzen’s falsifications and the Service
Manager at Alder Hey allowed himself to be sidelined . . . Alder Hey and the
University permitted . . . van Velzen to abdicate his clinical duties and responsibil-
ities . . . Alder Hey and the University missed numerous opportunities to discipline
. . . van Velzen for justifiable reasons from 1989 onwards . . . Alder Hey and the
University failed to apply . . . proper audit procedures and management systems to
. . . van Velzen’s Unit of Fetal and Infant Pathology throughout his tenure when there
was reason for continuing audit” [22]. The origins of Alder Hey’s and the University
of Liverpool’s laissez-faire approach merit investigation. Was it incompetence that
led to van Velzen being left to his own devices? Or was it an assumption that, as a
colleague with numerous academic papers to his name, an important job to do and a
title (Professor), van Velzen was considered to be above suspicion? To what extent
did deference, collegiality, empathy and other psycho-social dynamics blind van
Velzen’s colleagues and employer to the immorality, illegality and organisational
risks inherent in his modus operandi?
Regarding the Gosport War Memorial Hospital scandal, none of Hampshire
Constabulary’s three investigations proved effective [15, 16]. It is not beyond the
realms of possibility that inter-elite empathy – encapsulated in the term 'old boy
network’ – undermined the integrity of the investigations. Further, it is not beyond
the realms of possibility that deference skewed investigators’ perceptions. The police
were investigating a social elite: the clinician responsible for Gosport’s prescribing
regime was married to a retired Royal Navy commodore (a senior rank). They shared
a home valued at £700,000 [23].
According to Fairlie [24], “The exercise of power in the United Kingdom (more
specifically, in England) cannot be understood unless it is recognised that it is
exercised socially”. Fairlie coined the term The Establishment to describe “the . . .
matrix of official and social relations within which power is exercised”. Taylor et al.
[25] argue that the Establishment constitutes “an exceptionally important social
group because of its immense influence and power”. Formal power (for example,
that vested in the civil service, local government, constabulary, judiciary, military,
state broadcasters, universities and NHS) is exercised within a network of elite social
relations that, accepting Fairlie’s thesis, influence how power is exercised. Clinicians
and senior police officers are Establishment figures, sharing broadly the same values,
interests, outlook and life-chances. The GIP report notes: “The senior management
of the hospital, healthcare organisations, Hampshire Constabulary, local politicians,
the coronial system, the Crown Prosecution Service, the General Medical Council
(GMC) and the Nursing and Midwifery Council (NMC) all failed to act in ways that
would have better protected patients and relatives, whose interests some subordi-
nated to the reputation of the hospital and the professions involved” [15]. The report
also notes that “When . . . relatives complained about the safety of patients and the
appropriateness of their care, they were consistently let down by those in authority –
both individuals and institutions” [15]. Therefore, according to the GIP, the
governing elite seemed more interested in protecting its own than in serving its
clients. The fact that the governing elite failed to act on information received, or even
to engage with clients in any meaningful way, suggests a superior, disdainful
attitude. It suggests arrogance.
Gosport’s governing elite dealt firmly with dissenting groups and dissenting
individuals. In 1991, nurse Sylvia Giffin, along with three colleagues, raised con-
cerns about Gosport’s prescribing regime. According to Ms. Giffin’s daughter, her
mother’s actions led to her being bullied: “It made her very unwell. They forced her
out, but tried to blame it on ill health. She suffered depression because of the
bullying, but carried on with her work . . . . Mum never wanted to point any fingers
at anyone. It wasn’t about any particular doctor. It was about the whole system. They
bullied her until she gave them some names” [26]. As noted by Leyens et al. [21],
in-groups protect themselves by denigrating and excluding (othering) those they
consider a threat.
1.4 Groupthink

Groupthink is a psychological process whereby, over time, the members of a
cohesive in-group begin to think and act in the same way. Groupthink engenders a
preferred world-view or narrative that anchors members’ opinions and guides their
actions. Members attend to data that supports the group’s preferred world-view, and
discredit or ignore data that contradicts it. Dissenters are silenced or marginalised
(othered). The group isolates itself. Cordery [27], who considers groupthink a
pathology, offers the following definition: “[Groupthink] can arise when the group
is highly cohesive, where there is strong leadership, where decision-making pro-
cedures are unstructured, where pressure to find a solution is intense, where member
homogeneity is high and where the group is insulated or closed off from external
sources of information or influence”. According to Janis [28], groupthink degrades
“mental efficiency, reality testing and moral judgement”.
Janis [28] identified eight characteristics and consequences of groupthink:
(a) An illusion of invulnerability – encourages risk-taking
(b) Contra-indications ignored or discounted – absence of introspection and
reflexivity
(c) Feelings of moral superiority – eschewing of ethical and moral introspection
(d) Stereotyping of non-believers – othering of non-believers (Mrs Thatcher
famously referred to Britain’s coal miners as The Enemy Within, effectively
framing Britain’s miners, so vital to the economy, as a ‘dangerous other’)
(e) Dissenters pressurised – suppression of counter-arguments and alternative
world-views
(f) Self-censorship – suppression/concealment of ideas that challenge the dominant
narrative
(g) Illusion of unanimity – assumption that the majority view is held by all
(h) Active suppression of contra-arguments by mindguards – suppression of incon-
venient truths.
All are susceptible to groupthink – even the educated. “[H]ighly cohesive groups
composed of well-qualified, well-motivated people sometimes fall into a pattern of
‘groupthink’ that can yield disastrous policy recommendations” notes Hackman
[29]. According to Janis [28], the Kennedy Administration’s 1961 Bay of Pigs
invasion – a poorly-planned anti-Castro coup – resulted from groupthink within
President John F. Kennedy’s inner-circle. Groupthink may have influenced the
Johnson Administration’s thinking on Vietnam. Specifically, its belief that it could
defeat an agile, adaptive insurgency with conventional forces, weapons and tactics.
It may have contributed to the 1986 Chernobyl nuclear power plant disaster that
spewed radioactive dust into the atmosphere: “The [plant] operators’ actions were
consistent with an illusion of invulnerability. It is likely that they rationalised away
any worries (or warnings) . . . . Their single-minded pursuit of repeated testing
implied an unswerving belief in the rightness of their actions. They . . .
underestimated the opposition: in this case, the system’s intolerance of being
operated within the forbidden reduced-power zone. Any adverse outcomes were
either seen as unlikely, or possibly not even considered at all. Finally, if any one
operator experienced doubts, they were probably self-censored before they were
voiced” [30].
Small-group pathologies such as groupthink may cause members to make poor
decisions. According to Mulenburg [31], organisational processes are often inade-
quate to the task of producing optimal decisions: “The evidence of practical expe-
rience shows . . . that decision making is seldom a precise, rational activity. In reality,
it is often plagued with bias, misconception and poor judgement. Decisions are often
poor choices made for expediency or out of ignorance of alternatives”.

1.4.1 Evidence for Groupthink

The behaviour of those Gosport War Memorial Hospital clinicians and managers
responsible for the prescribing regime that shortened the lives of hundreds of patients
suggests groupthink. Evidence for groupthink includes:
– Dismissal of relatives’ complaints
– Adherence to the prescribing regime in the face of opposition
– The questioning and devaluing of professional critique (such as that advanced by
the nurses who contacted the RCN)
– The othering of dissidents
– The prioritisation of institutional reputation over other considerations [15, 26].

2 Aviation’s Line Operations Safety Audit (LOSA): A Transferable Safety Methodology?

The Institute of Medicine’s ground-breaking report To Err is Human: building a
safer health system argued for a better understanding of the systemic causes of
medical error [32]. The Department of Health report An organisation with a memory
argued for a wider appreciation of the value of the systems approach in preventing
medical error [33]. The National Patient Safety Agency report Seven steps to patient
safety emphasised the importance of systems or holistic thinking to patient safety:
“The best way of ... reducing error rates is to target the underlying systems failures
rather than take action against individual members of staff .... A much wider
appreciation of the value of the systems approach in preventing, analysing and
learning from patient safety incidents [is required]” [34].
Despite such exhortations, medicine’s safety praxis has been little influenced by
the systems approach: “While being widely championed in patient safety, where
factors related to individuals, technology and the wider organisation are afforded
equal consideration ... there is ... evidence that the systems approach ... is still
underexploited and could be taken much further” [35].
The National Health Service’s failure to embrace systems-thinking in safety
management has occurred against a backdrop of an ever-growing contingent liability
for medical error: “With potential legal claims, mostly for clinical negligence, now
totalling more than £26 billion, the NHS is facing unsustainable liabilities .... There
is no sign of any improvement in reducing the incidence of harm being caused to
patients .... In the past decade [i.e. since 2004], the situation has deteriorated
substantially” [36].
In the context of limited progress in the application of systems-thinking in
National Health Service safety regimen and an ever-growing contingent liability
for medical error, it is suggested that the NHS considers introducing risk-manage-
ment tools from other domains. Specifically, that it considers adapting aviation’s line
operations safety audit (LOSA) methodology.

2.1 Systems-Thinking: Its Meaning and Expression in Aviation Safety Tools
2.1.1 Introduction

Systems-thinking is grounded in the social sciences. Component methodologies
include ethnography, participant observation, action research, oral history and
mass-observation. According to Waterson and Catchpole [35], systems-thinking is
not about applying the right type of knowledge to solve a problem. Rather, it is about
applying the right approach. It provides a problem-solving frame of reference – a
modus operandi. It gives one a way of foregrounding problems so they can be
reliably fixed.
Systems-thinking has a simple premise – that human error can be induced by
circumstance. For example, a poorly-designed display may cause a pilot to misread
an instrument [37]. Systems-thinking acknowledges the systemic origins of error:
“[H]uman mistakes ... rarely have a single underlying contributory factor. Error is
the product of design, procedures, training and/or the environment” [38].

2.1.2 Systems-Thinking in Aviation

Aviation has pioneered the systems-thinking approach to risk management and
accident investigation. Watershed systems-thinking-inspired reports include:
– Moshansky’s analysis of the 1989 Dryden accident
– Haddon-Cave’s analysis of the 2006 Nimrod loss
Complex systems – prone to systems phenomena such as emergence and practical
drift, and subject to social, economic and political pressures – are difficult to
understand and run [7, 8, 39–42]. According to Lagadec [43] and Perrow [6],
complexity creates vulnerability. Perrow [6] claims that accidents are the norm in
complex systems. According to Perrow, trying to prevent accidents by adding extra
layers of safety makes them more likely to occur (because adding extra layers of
safety increases the likelihood of unanticipated interactions between system com-
ponents and encourages operators to take risks (risk compensation)). Understanding
how systems work in reality is the sine qua non of trouble-free system operation.
The 1972 Florida Tri-Star loss (in which 101 died) and 1977 Tenerife runway
collision (in which 583 died) convinced the airline industry that it needed:
(a) A better understanding of flight operations
(b) Better teamwork, both on and around aircraft.
Human factors tools were developed. The first, Crew Resource Management
(CRM), improved teamwork and resource utilisation on the flight-deck, in the
cabin, on the ramp and in the maintenance hangar [38, 44, 45]. The second, the
line operations safety audit (LOSA), documented the lived reality of routine flight
operations [46].

2.1.3 The Line Operations Safety Audit

A line operations safety audit documents the system-as-found. It describes the vérité
of a system. Systems-thinking tools such as the line operations safety audit assume
system behaviour to be an emergent property of complex interactions (some of
which are hard to discern) between human and non-human actors or actants
(e.g. designers, regulators, employees, equipment, resourcing, rules, regulations,
personal ambition, organisational culture, corporate aspirations and the legal envi-
ronment) [47]. Systems-thinking challenges the false certainties of
reductionism [40].
Executed by trained observers familiar with flight operations, a LOSA reveals the
lived reality – the vérité – of flight-deck labour. Observers’ freedom to roam and
probe reflects the line operations safety audit methodology’s grounding in actor-
network theory (ANT). The actor-network theory research method requires that
investigators ‘follow the actors’ [47].
A LOSA is sensitive to risk-creating phenomena such as practical drift and
emergence. According to Woods et al. [48], emergence “... means that simple
entities, because of their interaction ... can produce far more complex behaviours
as a collective .... One common experience is that small changes ... can lead to huge
consequences ... [a product of] non-linear feedback loops ...” . A LOSA describes:
1. The threat environment (for example, sub-standard air traffic control or adverse
weather)
2. The number and type of errors made by flight crew (for example, a pilot’s
intentional non-compliance with a rule)
3. Coping mechanisms. Eurocontrol [49] notes: “[R]outine threats to the safety ... of
the system are constantly being managed by the system before they lead to serious
outcomes. The frequency of these threats and the ways in which they are managed
are known at a local level by ... operators ... but this information is often not
captured in any structured way by the organisation. [The LOSA] provides a
means by which this can be achieved”.
4. Good practice (for example, safety innovations introduced by front-line
personnel) [46].
The LOSA methodology draws on Hollnagel’s [50] Safety-II model of risk
management and on high reliability organisation (HRO) theory [51, 52]. Hollnagel’s
model recommends:
– The proactive management of safety
– The tailoring of safety initiatives to the object system (mapping a system’s
topography informs change)
– The placing of workers at the epicentre of the risk management process
(to capitalise on workers’ local knowledge, adaptability and capacity for
problem-solving bricolage).
A high-reliability organisation displays the following characteristics:
1. “Successful containment of unexpected events”
2. “Effective anticipation of potential failures”
3. “Just culture”
4. “Learning orientation”
5. “Mindful leadership” [52].
Aspects of mindful leadership include:
1. “Proactive commissions of audits to identify problems in the system”
2. “‘Bottom-up’ communication of ‘bad news’”
3. “Engagement with front line staff through site visits” [52].
Being a voluntarily-undertaken, ground-level safety audit composed of both
observational and interview data, the LOSA methodology helps create the conditions
necessary for high-reliability operation. Promoted by the International Civil Avia-
tion Organisation, the LOSA methodology has gained traction with the airline
industry: “In 1999, ICAO endorsed LOSA as the primary tool to develop counter-
measures to human error in aviation operations . . . and made LOSA the central focus
of its Flight Safety and Human Factors Programme for the period 2000 to 2004 . . . .
The number of operators joining LOSA has constantly increased since March 2001,
and includes major international operators from different parts of the world and
diverse cultures” [46].
Line operations safety audits are performed by pilots employed by the subject
airline within a supportive organisational framework. Upon securing the
co-operation of its pilot workforce, the airline trains a small cadre of volunteers in
the methodology. Exercises such as the one illustrated in Fig. 1 form part of the
training.
Once trained to the required standard, this cadre observes a predetermined
number of flights (sectors in aviation parlance) on selected routes. Pilot-observers
Fig. 1 Threat-and-error management (TEM) work-sheet training exercise [53]

record threats (for example, inclement weather) and errors (for example, level busts).
These are described and coded on standard proforma.
To qualify as a line operations safety audit, the audit process must meet the
benchmark set by the International Civil Aviation Organisation (ICAO). The ICAO
specifies ten operating characteristics for a LOSA:
1. “Jump-seat observations during normal flight operations”
2. “Joint management-pilot sponsorship” (a steering group composed of pilots and
managers oversees the process)
3. “Voluntary crew participation”
4. “De-identified, confidential . . . data collection”
5. “Targeted observation instrument” (a form or forms to record threats, errors,
interview and other pertinent data)
6. “Trusted, trained and calibrated observers”
7. “Trusted data-collection site” (ideally, the LOSA data should be stored and
archived by a trusted third-party)
8. “Data verification roundtables” (raw data is scanned for inaccuracies)
9. “Data-derived targets for enhancement” (data is used to develop an action plan)
10. “Feedback of the results to the line pilots” [46].
After the results have been communicated to the pilot group the action plan is
funded and launched and the next LOSA timetabled, creating a virtuous circle of
investigation, verification, improvement and validation (see Fig. 2).
Fig. 2 The LOSA methodology creates a perpetual virtuous circle of investigation and improvement

3 Knowledge Transfer: A Line Operations Safety Audit for Medicine

3.1 Introduction

To ensure that medical practitioners identify with, and accept the methodology, it is
suggested that the line operations safety audit (the term ‘line operations’ refers to
routine commercial aviation flight operations) nomenclature be dropped. Possible
new titles include:
– Routine activities safety audit (RASA)
– Health operations safety audit (HOSA) (preferred).
The health-care system’s culture of fear and intimidation (see Sect. 1.1, above)
has the potential to undermine a safety audit. Put simply, observers who fear that
what they say may constitute grounds for disciplinary action or dismissal may be
tempted either to overlook threats and errors, or to sanitise data. It is suggested,
therefore, that HOSAs are performed by persons not directly employed by the health
authority undergoing audit. One option would be to use medical students, specifi-
cally students engaged in the student-selected component (SSC) in patient safety.
Being an action-research project requiring good non-technical skills, a HOSA would
make an excellent student-selected component (SSC).

3.2 The Student-Selected Component (SSC) in Patient Safety as a Vehicle for Health Operations Safety Audits (HOSAs)

Those who study medicine at a British university spend 5 years as a medical student
(see Fig. 3).
Fig. 3 Medical student to junior doctor career path [54]

The United Kingdom General Medical Council’s (GMC’s) 2009 report Tomor-
row’s Doctors specified that medical degrees should provide students with an
element of choice. Student selected components meet that requirement. Student-
selected components aim to:
– Help students develop their research skills
– Encourage students to consider issues not addressed within the medical core
curriculum
– Boost self-confidence
– Develop students’ presentational skills
– Help students explore different career paths [55].
The GMC encourages educators to offer SSCs in various disciplines. For
example:
– A specialist area of medicine or surgery
– Social-psychological topics such as counselling, substance abuse or
homelessness
– Foreign languages [55].
Some universities offer SSCs in patient safety. The GMC sanctions a variety of
learning methods for SSCs, including:
– Formal tutorials
– Practicals
– Guided self-study
– Problem-solving exercises
– Role-playing exercises
– Small-group work
– E-learning
– Patient-based learning [55].
3.3 Potential Educational and Organisational Benefits of a HOSA-Based SSC in Patient Safety

3.3.1 Potential Educational Benefits

Drawing on theories of immersive/experiential learning [56] and action learning
[57, 58], a HOSA-based SSC in patient safety would offer students the opportunity
to:
– Conduct in vivo research into a complex, politically-loaded and difficult-to-solve
problem (patient harm)
– Work in the field in a bespoke problem-solving team
– Develop teamworking, observational, communication and analytical skills
– Generate new thinking (‘blue-sky’ thinking) to inform policy and action.
Using medical students to perform a HOSA would satisfy many of the require-
ments of a student-selected component. For example, participating in a HOSA
would:
– Improve the student’s research skills
– Boost self-confidence
– Improve communication skills
– Improve organisational skills
– Improve observational skills
– Hone data-recording skills
– Sharpen presentational skills.

3.3.2 Potential Organisational Benefits

Action-research is purposeful. Denscombe [59] observes: “Early on, action research
was ... seen as research specifically geared to changing matters ... this has remained a
core feature ...” . Lewin [60] characterises action-research as a “spiral of steps ...
composed of a circle of planning, action and fact-finding about the result of the
action”. A HOSA provides for the identification of strengths and weaknesses,
implementation of remedies and evaluation of same.
While medical students may lack the knowledge and experience of time-served
professionals (for example, doctors and nurses employed by a National Health
Service trust), several benefits accrue from students’ institutional independence,
academic training and academic orientation. Specifically:
– Students are unlikely to pander to special interests
– Students are relatively immune to organisational coercion
– Students are acutely aware of the requirements of academic research, including
the need for objectivity, reflexivity and verification
– Students have not been institutionalised by the healthcare establishment.
Fig. 4 The health operations safety audit (HOSA) data-capture form

3.4 Research Instrument

It is suggested that the standard University of Texas Human Factors Research Project
LOSA Threat and Error Management (TEM) data-capture form be simplified to
improve functionality. One possible configuration is shown in Fig. 4.
Observations made during a health operations safety audit are both described and
coded (see Figs. 5 and 6). It is suggested that a system of codes be developed with
input from academics versed in the HOSA methodology, doctors, nurses, para-
medics, pharmacists, laboratory technicians and other medical personnel. As with
a line operations safety audit, a health operations safety audit should be overseen by
a multi-disciplinary Steering Group. It is suggested that HOSA Steering Groups
draw members from both academia and health-care (in the case of a hospital, the
Steering Group should include academics, doctors, nurses, anaesthetists,
radiographers, pharmacists, paramedics, porters and other medical professionals).
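By way of illustration only, the sketch below shows one minimal way in which coded HOSA observations of the kind described above might be recorded and tallied before the data verification stage. It is written in Python; the field names, the example codes and the tally function are hypothetical assumptions introduced here for clarity, not part of the LOSA or HOSA specification, and any real coding structure would be developed by the multi-disciplinary Steering Group described above.

# Hypothetical sketch of how coded HOSA observations might be stored and
# summarised. Field names and example codes are illustrative only; a real
# code list would be agreed by the Steering Group.
from collections import Counter
from dataclasses import dataclass

@dataclass
class Observation:
    setting: str        # e.g. "A&E", "ward round", "pharmacy"
    category: str       # "threat", "error" or "good_practice"
    code: str           # agreed code, e.g. "T-STAFFING", "E-PRESCRIBE"
    description: str    # free-text account written by the student-observer
    managed: bool       # was the threat or error detected and managed in situ?

def tally(observations):
    """Count each coded threat/error and how many instances were managed."""
    counts = Counter((o.category, o.code) for o in observations)
    managed = Counter((o.category, o.code) for o in observations if o.managed)
    return {key: (counts[key], managed.get(key, 0)) for key in counts}

obs = [
    Observation("A&E", "threat", "T-STAFFING", "Two nurses short on the shift", True),
    Observation("ward round", "error", "E-PRESCRIBE", "Dose transcribed incorrectly", False),
]
for (category, code), (n, n_managed) in tally(obs).items():
    print(f"{category} {code}: observed={n}, managed={n_managed}")

A summary of this kind (how often each coded threat or error was observed, and how often it was managed before causing harm) is the sort of output that could feed the data verification roundtables and the subsequent action plan discussed below.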

3.5 Reflections on the HOSA Methodology

Acknowledging the role of reflective practice [61] in validating research, the HOSA
methodology is hereby critiqued. No methodology is perfect. There are several
potential issues:
(a) The data could be skewed by the Hawthorne effect. A person subject to
observation might modify her behaviour [25, 62–64]
Fig. 5 The health operations safety audit data-capture form – example I

Fig. 6 The health operations safety audit data-capture form – example II


(b) The data could be skewed by experimenter bias. Identification with the observer
might cause an observee to modify her behaviour
(c) The data could be skewed by observer bias. An observer’s values and pre-
conceptions might influence her choice of scenario and interpretation of same
(how she frames, understands and records what she sees and hears)
(d) The data could be skewed by the HOSA coding structure. Coding systems can
focus attention in a particular direction
(e) Observer cognitive overload might cause data to be misinterpreted or lost.
Information overload and prioritisation errors can result in observer task-
saturation, reducing situation awareness and accuracy [65]. Medical facilities
can be chaotic. Unlike an aircraft flight-deck, a medical facility (for example, an
accident and emergency (A&E) department) presents a dynamic, permeable
environment populated by actors with varied roles (for example, doctors, nurses,
ambulance crew, porters, cleaners, clerical workers, police officers). These
actors rotate through the scenario
(f) Knowledge-deficit may reduce accuracy. Student-observers might lack the
knowledge and experience necessary to make accurate observations under all
circumstances
(g) Opponents could use the fact that the HOSA methodology is an import (from
aviation) to label it inappropriate for healthcare
(h) Opponents could dismiss the methodology as derivative and second-hand. Blue-
sky thinking is a rarity, however. As Gilbert and Stoneman [66] explain: “[R]
esearch is never conducted without reference to other studies”.

3.6 The Mechanics of a Student-Administered HOSA

3.6.1 Introduction

Ideally, the HOSA should be administered by students who are about to complete
their formal academic studies. In the UK, this would mean using 5th-year medical
students. The HOSA should be conducted over 2 or 3 weeks. It should commence
with a short, bespoke training course.

3.6.2 HOSA Training

Regardless of students’ prior knowledge of quality-assurance methodologies and
techniques, and of patient-safety surveys, they must receive specific training in the
theory and practice of the health operations safety audit. While computer-based
training has a role, initial training should be in the form of a one or two-day course
delivered by a person familiar with the methodology. Training methods should
include:
– Traditional chalk-and-talk
– Short PowerPoint presentations
– Question-and-answer sessions
– Exercises, preferably conducted in a medical training facility, to familiarise
students with the HOSA data-capture form (see Fig. 4) and difficulties of note-
taking and coding under duress
– Patient-safety training videos. For example, consideration should be given to
presenting a patient-safety video through a HOSA lens. In the UK context, the
training video Recognising Risk and Improving Patient Safety – Mildred’s Story
[67] would seem a suitable candidate for this treatment.
At the end of the training day(s), the Trainer should pair the students into mixed
teams. Students’ preferences regarding co-workers should be respected. Pairing
creates a supportive, collegial environment for student-observers.

3.6.3 Fieldwork

The Trainer should be on hand to offer advice and support for the duration of the
fieldwork. In practice, this means rotating through the two-person teams to ensure
that the research instrument is being used as intended, and that observees are
co-operating. Students should record:
– Observees’ actions
– Relevant conversations (as best they can)
– Systemic influences on behaviour.
Students should also reflect on the degree to which their presence influenced
observees’ behaviour (the Hawthorne effect, whereby the act of observation influ-
ences observee behaviour).
Potentially, students could find themselves documenting so-called never-events.
NHS England [68] defines never-events as: “[S]erious incidents that are wholly
preventable as ... safety recommendations that provide strong systemic protective
barriers are available .... Each Never Event type has the potential to cause serious
patient harm or death. However, serious harm or death is not required to have
happened ... for [an] incident to be categorised as a Never Event”. Never-events
include:
– Wrong-site surgery
– Retained foreign object post-procedure
– Wrong-route administration of medication
– Scalding of patients [69].
The Friday of each fieldwork week should be an office day where the students
meet with the Trainer to discuss progress. At the end of the HOSA the students
should be required to demonstrate baseline competencies by, for example,
presenting key findings in a PowerPoint presentation to fellow students, academics
and health service personnel.
The HOSA data sheets should be verified by the Steering Group and a report
presented to the relevant health authority. The data verification stage (also known as
the data cleaning stage), during which the raw data is checked by subject experts for
consistency, is crucial for quality-control. The International Civil Aviation Organi-
sation describes the LOSA data verification stage in these terms: “Data-driven
programmes like LOSA require quality data management procedures and consis-
tency checks. For LOSA, these checks are done at data verification roundtables. A
roundtable consists of three or four department and pilots association representatives
who scan the raw data for inaccuracies . . . . The end product is a database that is
validated for consistency and accuracy according to the airline’s standards and
manuals, before any statistical analysis is performed” [46]. For Merritt and Klinect
[70], data verification is pivotal: “[A]irline representatives who are fleet experts
attend a data cleaning roundtable .... Together, they review the data against the
airline’s procedures, manuals and policies to ensure that events and errors have been
correctly coded. After the roundtable is completed, airline representatives are
required to sign off on the data set as being an accurate rendering of threats and
errors. Only then does analysis for the final report begin”.
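As a purely illustrative companion to the sketch in Sect. 3.4, the following hypothetical Python fragment shows the kind of simple automated consistency check that could be run over HOSA data sheets before the verification roundtable, so that the subject experts spend their time on judgement calls rather than on obvious omissions. The agreed code list and the Observation record are assumptions carried over from the earlier sketch, not part of the published LOSA procedure.

# Hypothetical pre-roundtable consistency check. AGREED_CODES stands in for
# the code list agreed by the Steering Group; Observation is the record type
# from the earlier sketch in Sect. 3.4.
AGREED_CODES = {"T-STAFFING", "T-WEATHER", "E-PRESCRIBE", "E-PROCEDURE"}

def flag_for_roundtable(observations):
    """Return records needing manual review: unknown codes or missing descriptions."""
    return [
        o for o in observations
        if o.code not in AGREED_CODES or not o.description.strip()
    ]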
The Steering Group should prepare two reports: the first one, a report that lists
every major observation with supporting data; the second one, a high-level summary
a couple of pages long. The Steering Group should strive for the widest possible
dissemination of the findings. It should offer to present the findings in seminars, at
board meetings and other forums. It should offer to write articles for health authority
journals and web-sites summarising the findings. To ensure continued buy-in, the
Steering Group must engage with collaborators. It must disseminate, and attend to,
feedback.

4 Conclusion

The U.K. Government acknowledges that the under-reporting, or non-reporting, of
incompetence, malpractice and error has contributed to patient-safety failures in the National
Health Service [71, 72]. Speaking in February, 2018 at the Sixth Annual World
Patient Safety, Science and Technology Summit in London, Jeremy Hunt, Health
and Social Care Secretary for the U.K. Government, observed: “In the UK we are
still very hierarchical in medicine .... It’s one of the only professions where we talk
about Mr. this, Dr. that, rather than the first name terms that are normally used .... [I]f
you’re a hierarchy, it means you’ve only got one pair of eyes spotting the mistake,
but if you remove the hierarchy you can have eight or nine pairs of eyes spotting ...
potentially lethal mistakes .... If you really want your hospital ... to be a safe one,
you’ve got to be really brave and be prepared to be honest when it’s not. And that
means a completely different approach to what has been traditional in healthcare
systems all over the world, where we are often very guarded ... about being open
when things are going wrong .... [A]ny clinician wants nothing more than to be
completely open and transparent about what happened, to learn where the lessons
need to be learned to make sure that that tragedy doesn’t ever happen again. But in
modern healthcare systems, we make that practically impossible. People are terrified
[that] if they’re open about what happens, they will be removed from the register,
they might get fired by their hospital, it would be bad for the reputation of their unit,
for the reputation of their trust – a thousand worries prevent the one thing that really
should be paramount, which is proper learning from that mistake and proper attempts
to make sure it can never be repeated” [73].
It would be wrong to conclude that the under-reporting, or non-reporting, of incompetence,
malpractice and error is a peculiarly British problem. It is a world-wide phenome-
non, contributing to in excess of 42 million adverse medical events each year. As the
director general of the World Health Organisation told the Sixth Annual World
Patient Safety, Science and Technology Summit: “Adverse events are now estimated
to be the 14th leading cause of death and injury globally. That puts patient harm in
the same league as tuberculosis and malaria. There are an estimated 421 million
hospitalisations in the world every year, and, on average, one in ten of those results
in adverse events” [74]. In a 2003 Canadian Institute for Health Information (CIHI)
survey “... about a quarter [of adults] said that an adverse event had occurred in their
own care, or that of a family member”. One half claimed that the adverse event had
produced “serious health consequences”. Three quarters claimed that it had “led to a
hospital visit or longer hospital stay”. One in 20 stated that it had resulted in the
death of a relative [75].
Adverse medical events are a drain on resources: “[M]edical errors aren’t just bad
medicine; they’re bad economics. The investments needed to improve patient safety
pale into insignificance compared with the costs of harm. The question, therefore, is
not whether we can afford the interventions that will keep patients safe. The question
is whether we can afford the status quo” [74].
Prescribing errors cost billions: “[M]edication-related harm ... has been estimated
at an annual global cost of $42bn” [2]. The human costs of prescribing errors include
pain, suffering, injury and, in extremis, death. In 2018 three British universities,
Sheffield, York and Manchester, published data on the human and financial costs of
prescribing errors. The i-newspaper’s Paul Gallagher reported: “More than 22,000
people could be dying in England every year after suffering fatal reactions to errors
in their medication .... Errors in medication are ... costing the NHS up to £1.6 billion
[annually]” [76].
Part of the solution to the problem of avoidable deaths in healthcare is the
importation of proven safety-management strategies. Faced with an unacceptably
high fatality rate and traumatised by disasters such as the 1977 Tenerife collision
(that killed hundreds), the air transport sector invested in systems-thinking-informed
proactive safety-management strategies and accident investigation techniques
[46, 77, 78].
One of the industry’s most potent tools is the line operations safety audit
[46, 70]. Adapted and re-packaged as the health operations safety audit, this meth-
odology could do much to reveal the weak points in healthcare systems, helping
managers and doctors focus scarce resources on the most dangerous and costly
failings (such as wrong-site surgery, diagnostic and prescribing errors). Student
selected components are a key element of UK medical degrees [55, 79]. The
patient-safety student-selected component could provide a reservoir of unconflicted
observers for health operations safety audits.
During the Second World War, the British Government’s Ministry of Information
looked to mass observation [80, 81], an ethnographic technique pioneered by
anthropologist Tom Harrisson, poet Charles Madge and film-maker Humphrey
Jennings, to document the lifestyles, opinions and aspirations of wartime Britons.
Techniques included self-reporting and the recording by researchers of conversa-
tions and activities in workplaces, public houses and sports venues.
The HOSA methodology draws on this tradition, documenting the lived reality
(the vérité) of the medical labour process, warts-and-all. Provided it is adequately
funded, implemented with the full co-operation of staff (medical and managerial)
and performed by unconflicted, enthusiastic observers, there is every reason to
believe that health operations safety audits will deliver tangible safety and efficiency
benefits to under-pressure hospitals, clinics and other medical facilities. Good news
for patients. Good news for staff. Good news for the Exchequer.
In 2017, Health and Social Care Secretary Jeremy Hunt ordered an investigation
into maternity care at the Shrewsbury and Telford NHS Trust. The Trust was failing
to learn lessons. As the Trust’s Medical Director, Dr. Edwin Borman, explained: “In
the case of foetal heart monitoring, we have identified a number of cases where
learning has not been fully implemented. We’ve put systems in place to make
improvements” [82]. In August, 2018, the investigation was widened after further
complaints were received [83]. Health operations safety audits could help transform
hospitals into learning institutions (see Fig. 2). Lives could be saved. Suffering could
be eased.
Biographical Note
During a 20-year career in aviation, the author has spent over 1500 h on the jump-
seat and circa 200 h on the ramp. He has observed: 232 x A319-operated sectors;
66 x A320-operated sectors; 62 x A321-operated sectors; 144 x B737-operated
sectors; 181 x B757-operated sectors; and 7 x A300-operated sectors. He has
observed eighteen National Police Air Service EC135 sorties (total flying
time 792 min). As part of his safety training he has performed a landing in a
737–300 simulator and has completed Safety and Emergency Procedures (SEP)
courses for several aircraft types, including the A319, B737 and B747–800. During
his 3-year contract with a UK-registered freight airline, the author contributed to a
line operations safety audit (LOSA) and helped deliver flight-crew human-factors
training courses.
References

1. Christianson MK, Sutcliffe KM, Miller MA, Iwashyna TJ (2011) Becoming a high reliability
organisation. Crit Care 15:314
2. Healtheuropa (2018) Sixth Annual World Patient Safety, Science and Technology Summit,
25th April. Available at: https://www.healtheuropa.eu/6th-annual-world-patient-safety-science-
technology-summit. Accessed 25 Aug 2018
3. Jarman cited in Bennett C (2018) After Gosport, who would want to be elderly and in hospital
now? The Guardian, 24 June 2018
4. Reason JT (1990) Human Error. Cambridge University Press, Cambridge
5. Turner BA (1978) Man-Made Disasters, 1st edn. Wykeham Publications, London
6. Perrow C (1984) Normal accidents: living with high-risk technologies. Basic Books, New York
7. Reason JT (2013) A Life in Error. Ashgate Publishing Ltd, Aldershot
8. Hollnagel E (2004) Barriers and accident prevention. Ashgate Publishing Ltd, Aldershot
9. Challenger R, Clegg CW, Robinson M (2010) Understanding crowd behaviours, vol. 1:
practical guidance and lessons identified. Her Majesty’s Stationery Office, London
10. Dekker SWA (2014) The field guide to understanding ‘Human Error’, 3rd edn. Ashgate
Publishing Ltd., Farnham
11. Bennett SA (2017) The March, 2011 Fukushima Daiichi nuclear power plant disaster – a
foreseeable system accident? In: Asia/Pacific security challenges: managing black swans and
persistent threats. Springer, Cham, pp 123–137
12. Bennett SA (2018) Getting to the heart of medical error and malpractice, July 2. Available at:
https://troymedia.com/2018/07/02/causes-medical-error-malpractice/. Accessed 23 Aug 2018
13. Johnson S (2016) NHS staff lay bare a bullying culture. The Guardian, 26 October 2016
14. Bowles D, Cooper C (2009) Employee morale: driving performance in challenging times.
Palgrave Macmillan, London
15. Gosport Independent Panel (2018) Gosport war memorial hospital. The report of the Gosport
independent panel. Her Majesty’s Stationery Office, London
16. Hunt cited in Rudgard O, Sawer P, Steafel E, Marshall F (2018) Hampshire police hand
investigation into Gosport hospital deaths to another force after admitting failings. The Daily
Telegraph, 21 June 2018
17. Vaughan D (1996) The Challenger launch decision. Risky technology, culture and deviance at
NASA. University of Chicago Press, Chicago
18. Pinkney cited in Rudgard O, Sawer P, Steafel E, Marshall F (2018) Hampshire police hand
investigation into Gosport hospital deaths to another force after admitting failings. The Daily
Telegraph, 21 June 2018
19. Weick KE, Sutcliffe KM, Obstfeld D (1999) Organising for high reliability: processes of
collective mindfulness. Res Organ Behav 21:81–123
20. Weick KE, Sutcliffe KM (2007) Managing the unexpected: resilient performance in an age of
uncertainty. Jossey-Bass, San Francisco
21. Leyens JP, Paladino PM, Rodriguez-Torres R, Vaes J, Demoulin S, Rodriguez-Perez A, Gaunt
R (2000) The emotional side of prejudice: the attribution of secondary emotions to ingroups and
outgroups. Personal Soc Psychol Rev 4(2):186–187
22. The Royal Liverpool Children’s Inquiry (2001) The Royal Liverpool Children’s inquiry:
summary and recommendations. Her Majesty’s Stationery Office, London
23. Rudgard O, Sawer P, Steafel E, Marshall F (2018) Hampshire police hand investigation into
Gosport hospital deaths to another force after admitting failings. The Daily Telegraph, 21 June
2018
24. Fairlie H (1955) Political commentary. The Spectator, 23 September
25. Taylor P, Richardson J, Yeo A, Marsh I, Trobe K, Pilkington A (1995) Sociology in focus.
Causeway Press, Ormskirk
26. Wilson cited in Paterson S (2018) ‘My mother was bullied out of her job for speaking out on Dr
Opiate scandal’: daughter of whistleblowing nurse says hundreds of lives could have been saved
if hospital chiefs had listened. Available at: http://www.dailymail.co.uk/news/article-5872585/
My-mother-bullied-job-speaking-Dr-Opiate-scandal.html. Accessed 27 July 2018
27. Cordery J (2002) Team working. In: Psychology at work. Penguin Books, London, pp 326–350
28. Janis IL (1972) Victims of groupthink. Houghton Mifflin, Boston
29. Hackman JR (2002) Why teams don’t work. In: Theory and research on small groups, Social
psychological applications to social issues, vol 4. Springer, Boston, pp 245–267
30. Glendon AI, Clarke SG, McKenna EF (2006) Human safety and risk management. CRC Press,
Boca Raton
31. Mulenburg J (2011) Crew resource management improves decisionmaking. ASK Magazine,
NASA, Washington, DC, 11 May, pp 11–13
32. Institute of Medicine (2000) To err is human: building a safer health system. National Academy
Press, Washington, DC
33. Department of Health (2000) An organisation with a memory. The Stationery Office, London
34. National Patient Safety Agency (2004) Seven steps to patient safety. National Patient Safety
Agency, London
35. Waterson P, Catchpole K (2015) Human factors in healthcare: welcome progress, but still
scratching the surface. BMJ Qual Saf 0:1–5
36. Stittle J (2014) In critical condition. Available at: https://www.icsa.org.uk/knowledge/gover
nance-and-compliance/analysis/news-analysis-in-critical-condition. Accessed 1 Oct 2016
37. Bennett SA (2016) Disasters and mishaps: the merits of taking a global view. In: Disaster
Forensics: understanding root cause and complex causality. Springer, Cham, pp 151–174
38. Harris D (2014) Improving aircraft safety. Psychologist 27(2):90–94
39. Dekker SWA (2006) Resilience engineering: chronicling the emergence of confused consensus.
In: Resilience engineering: concepts and precepts. Ashgate Publishing Ltd, Aldershot, pp 77–92
40. Shorrock S, Leonhardt J, Licu T, Peters C (2014) Systems thinking for safety: ten principles.
Eurocontrol, Brussels
41. Snook S (2000) Friendly fire: the accidental Shootdown of U.S. Black Hawks over Northern
Iraq. Princeton University Press, Princeton
42. Weir DTH (1996) Risk and disaster: the role of communications breakdown in plane crashes
and business failure. In: Accident and design. UCL Press, London, pp 114–126
43. Lagadec P (1993) Ounce of prevention worth a pound in cure. Management Consultancy, June
1993, 45
44. Bennett SA (2010) Human factors for maintenance engineers and others – a prerequisite for
success. In: Encyclopaedia of aerospace engineering. Wiley, Chichester, pp 4703–4710
45. Landry SJ (ed) (2018) Handbook of human factors in air transportation systems. CRC Press,
Boca Raton
46. International Civil Aviation Organisation (2002) Line Operations Safety Audit (LOSA). Inter-
national Civil Aviation Organisation, Montreal
47. Latour B (2005) Reassembling the social: an introduction to actor-network theory. Oxford
University Press, Oxford
48. Woods DD, Dekker S, Cook R, Johannsen L, Sarter N (2010) Behind human error, 2nd edn.
Ashgate Publishing Ltd, Aldershot
49. Eurocontrol (2016) Normal Operations Safety Survey (NOSS). Available at: http://www.
eurocontrol.int/articles/normaloperationssafetysurveynoss/. Accessed 15 Oct 2016
50. Hollnagel E (2014) Safety-I and safety-II. The past and future of safety management. Ashgate
Publishing Ltd, Aldershot
51. Roberts KH (1990) Some characteristics of one type of high reliability organisation. Organ Sci
1:160–176
52. Health and Safety Executive (2011) High reliability organisations: a review of the literature.
Health and Safety Laboratory, Buxton
53. Klinect JR, Wilhelm JA, Helmreich RL (2001) University of Texas human factors research
project threat-and-error management exercises version 9.0. University of Texas, Austin
54. Triggle N (2016) Junior doctors’ strike: all-out stoppage ‘a bleak day’. Available at: http://
www.bbc.co.uk/news/health36134103. Accessed 10 Oct 2016
55. National Health Service (2018) Medical student-selected components (SSCs). Available at:
https://www.healthcareers.nhs.uk/explore-roles/doctors/medical-school/medical-student-
selected-components-sscs/. Accessed 14 Aug 2018
56. Kolb D (1984) Experiential learning: experience as the source of learning and development.
Prentice Hall, Englewood Cliffs
57. Revans R (1980) Action learning: new techniques for management. Blond and Briggs, Ltd,
London
58. Leonard HS, Marquardt MJ (2010) The evidence for the effectiveness of action learning. Action
Learn: Res Pract 7(2):121–136
59. Denscombe M (2014) The good research guide: for smallscale social research projects.
McGrawHill Education, London
60. Lewin K (1946) Action research and minority problems. J Soc 2(4):34–46
61. Schön D (1983) The reflective practitioner, how professionals think in action. Basic Books,
New York
62. Mayo E (1945) The social problems of an industrial civilisation. Harvard University, Boston
63. Mayo E (1949) Hawthorne and the western electric company. Public Administration Concepts
Cases 149–158
64. Landsberger HA (1958) Hawthorne revisited. The New York State School of Industrial and
Labour Relations, Ithaca
65. Gordon S, Mendenhall P, O’Connor BB (2013) Beyond the checklist. What else healthcare can
learn from aviation teamwork and safety. ILR Press, Ithaca
66. Gilbert N, Stoneman P (2016) Researching social life, 4th edn. Sage, London
67. Allsop P, Overton S, Stewart N, Stewart P (2010) Recognising risk and improving patient safety
– Mildred’s story. University of Leicester Audio Visual Services, University of Leicester,
Leicester
68. NHS England (2016a) Never events. Available at: https://www.england.nhs.uk/patientsafety/
neverevents/. Accessed 24 Nov 2016
69. NHS England (2016b) Never events list 2015/16. Department of Health, London
70. Merritt A, Klinect J (2006) Defensive flying for pilots: an introduction to threat and error
management. University of Texas, The University of Texas Human Factors Research Project,
Austin
71. Triggle N (2018) Shipman, Bristol, Stafford, Morecambe Bay – and now Gosport. Available at:
https://www.bbc.co.uk/news/health-44550913. Accessed 20 June 2018
72. Healtheuropa (2018) Prioritising patient safety in the NHS, 26th April. Available at: https://
www.healtheuropa.eu/prioritising-patient-safety-in-the-nhs/85666/. Accessed 25 Aug 2018
73. Hunt cited in Healtheuropa (2018) Prioritising patient safety in the NHS, 26th April. Available
at: https://www.healtheuropa.eu/prioritising-patient-safety-in-the-nhs/85666/. Accessed
25 Aug 2018
74. Adhanom Ghebreyesus cited in Healtheuropa (2018) Sixth Annual World Patient Safety,
Science and Technology Summit, 25th April. Available at: https://www.healtheuropa.eu/6th-
annual-world-patient-safety-science-technology-summit. Accessed 25 Aug 2018
75. Canadian Institute for Health Information (2003) Health Care in Canada 2004 – a focus on safe
care. Canadian Institute for Health Information, Toronto
76. Gallagher P (2018) Medicine errors killing thousands of NHS patients. i-newspaper,
23 February
77. Moshansky VP (1992) Moshansky, commission of inquiry into the air Ontario accident at
Dryden, Ontario: final report (volumes 1–4). Minister of Supply and Services, Ottawa
78. Haddon-Cave C (2009) The nimrod review. An independent review into the broader issues
surrounding the loss of the RAF nimrod MR2 aircraft XV230 in Afghanistan in 2006. HC 1025.
Her Majesty’s Stationery Office, London
79. General Medical Council (2009) Tomorrow’s doctors. Outcomes and standards for undergrad-
uate medical education. General Medical Council, London
80. Madge C, Harrisson T (1937) Mass-observation (pamphlet). Frederick Muller, London
81. Icon Films (2018) Tom Harrisson – the barefoot anthropologist. Available at: https://iconfilms.
co.uk/productions/past-productions/tom-harrisson-the-barefoot-anthropologist.html. Accessed
27 Aug 2018
82. Borman cited in Roberts R (2017) Cluster of ‘avoidable’ baby deaths at NHS trust to be
investigated, 13 April. Available at: https://www.independent.co.uk/news/health/jeremy-hunt-
health-secretary-announces-investigation. Accessed 1 Sept 2018
83. British Broadcasting Corporation (2018) Shropshire baby and mother maternity deaths review
widened, 31 August. Available at: https://www.bbc.co.uk/news/uk-england-shropshire-
45366648. Accessed 1 Sept 2018
Climate Change, Extreme Weather Events and Global Health Security: A Lens into Vulnerabilities

Carson Bell and Anthony J. Masys

1 Introduction

Natural hazards have displaced an average of 26.4 million people per year since 2008
[42]. This number will only increase as climate change continues to exacerbate
displacement due to monsoon-related flooding, coastal erosion, cyclones, and salin-
ity intrusion [6, 58]. These natural hazards create vulnerability in communities
throughout countries like Bangladesh, Tuvalu, Somalia, and even the United States.
Extreme weather events such as Typhoon Haiyan, Hurricane Maria, and Hurri-
cane Irma shed light on regional and global health stressors that certainly impact
global health security. As noted by Lichtveld [33], ‘. . .the 2017 North Atlantic
hurricane season, as well as the droughts and flooding in the Caribbean over the
past two decades, illustrate the high potential for devastation resulting from these
storms’. Similarly, we see the impact of such events on vulnerable populations in the
South Pacific, whereby the effects have created what some might call ‘climate
refugees’. Vulnerability is defined as the condition determined by physical, social,
economic, and environmental factors and processes that increase the susceptibility of
a community/system to the impact of hazards [63]. Those most vulnerable to
climate-related hazards live across the globe in poor areas of developing countries.
Poverty is certainly an indicator of higher vulnerability, and vulnerability to climate
change intensifies elements of poverty [25].
Climate change is increasingly becoming a causal factor in human migration.
As described in Berchin et al. [7:147], ‘Climate change poses various threats to
humanity, especially regarding global vulnerable communities, which already suffer
from severe droughts and famine, instigating population displacement [16]. Climate
change increases the intensity of extreme weather events, provoking migrations and
displacements; thus, climate refugees are the subject of increasing attention world-
wide [8, 46]’.

2 Health Effects

It is widely reported in the scientific literature that the global climate is changing.
The Intergovernmental Panel on Climate Change (IPCC) estimates a rise in average
global temperatures between 1.8 and 4 °C by 2100, resulting in significant impacts
and consequences for communities. Such threats to global health include: changing
infectious disease patterns; water and food insecurity; extreme climatic events;
declining air quality, thereby reshaping the global health security landscape to one
rooted in transborder vulnerabilities ([5]:54). As described in ([5]:70–71),
‘climate change is expected to alter patterns of vector-borne infectious diseases and food-
and water-borne infectious diseases and increase disease incidence and prevalence. For
example, the incidence of Lyme disease has risen swiftly in Europe and North America,
where the geographical range and upper temperature limits of ticks, the disease vector, are
spreading northward. Likewise, climate change is projected to expand the geographic area
suitable for dengue transmission globally. Infectious disease transmission will also be
mediated by non-climatic factors, such as adaptation or socioeconomic development’.

3 Connection to Global Security

Images of war often appear when one first thinks about the term global
security. The Cambridge dictionary defines global security as the “protection of the
world from war and other threats.” Climate change presents tremendous “other”
threats to global security. There are three main ways in which climate change
presents threats to global security: border disputes, resource shortages, and migration
[11]. All of these threats can culminate in armed conflict in countries across the
globe, ultimately exacerbating population migration.
Border disputes have long existed, but climate change is changing our topogra-
phy at ever-increasing rates [11]. Rising sea levels are specifically altering the
physiography of all countries [67]. The results are realized through increased
possibilities of new shipping routes, receding coastlines, and the disappearance of
entire islands [11]. In addition, Werrell and Femia [70] identified specific borders at
risk of armed conflict based on climate-related features. The following
is a list of potential conflicts that could arise due to our changing physiography:
1. Fish wars in the South China Sea between Vietnam and China,
2. River wars at the China-India border: China controls the Tibetan Plateau, which
supplies water to over two billion people and which China could re-route to
supply more of its own country,
3. Cattle wars in Nigeria as desertification reduces the amount of land cattle can
graze on [70].
Another threat to global security created by the effects of climate change is
resource shortages. Types of resource shortages include lack of clean water and
farmable land. Intensifying temperatures, increased salinization of freshwater, and
water inundation all combine with climatic events like drought, cyclones, and floods
to further decrease the stock of these resources. Our population is only growing, and
resources are becoming more scarce. In South Sudan, drought has led to health
crises such as famine, which have increased conflict and mortality rates. In
2008, then U.N. Secretary-General Ban Ki-moon declared climate change “as big of
a threat as war,” underscoring the threat that such shortages pose to global security.
The third threat to global security is migration. Migration is defined in the
Oxford Dictionary as the movement of people to a new area or country in order to
find work or better living conditions [44]. People have been migrating to other places
since the beginning of our species [21]. We can migrate across a state, across a
country, or across the world. People migrate for many reasons, whether it is to pursue
better economic opportunity for their family, or to escape war or persecution [21]. In
2015, there were over 244 million migrants around the world, which is greater than
3% of the world’s population [23]. Globalization has increased the amount of
international migration across the globe, with migrants travelling longer distances
and coming from a greater diversity of origins [4]. People can migrate voluntarily
or be forced to move, though the decision to migrate is not dichotomous; rather, it
sits on a continuum of experience [17]. Many migrants who “voluntarily” migrate will list
sociopolitical and economic reasons why they left, a choice they likely would not
have made if they did not feel migrating was their only option to better their family’s
livelihood [17]. It is important to look beyond dichotomous descriptions of migrants
such as “skilled” and “unskilled,” “temporary” and “permanent,” as a complex act
such as migration cannot be simplified to a “one or the other” scenario with much
accuracy [17].
First, it is important to differentiate between the various terms under the
umbrella of the word “migrant.” A person can be classified as a voluntary
economic migrant, an asylum seeker, a refugee, or a forcibly displaced person
[65]. An economic migrant is a person who leaves purely for economic reasons and
cannot be considered for refugee status [65]. An asylum seeker is a person seeking
international protection.
In 1885, Ernst Ravenstein created the foundational laws of migration. Features
of the law include three aspects: (1) reasons, or motives to leave, (2) the distance
travelled, and (3) the migrants’ characteristics. Ravenstein also stated that migration
occurs in streams or steps. Stream migration involves the direct movement from
place X to place Y. Step migration also involves the movement from place X to
place Y, but with the addition of in-between steps of movement before the ultimate
destination of place Y [45]. Migration can be viewed within four themes of migra-
tion theory that exist within micro, meso, and macro scales [10, 41]. Micro scales
refer to an individual’s decision to move, possibly an individual cost/benefit analysis
[10]. Macro levels are the structural and objective conditions, such as violence or
civil war, that are crucial when considering those who have been forcibly displaced
[10]. Meso levels are a mix of macro and micro scales. The meso level refers to
systems and networks. Migration is a system influenced by networks that create
chain migration. The relative strengths of each should be considered when talking
about migration theory.
There are sociological, economic, geographical, and unifying themes of
migration theory. One of the most popular migration theories is a sociological
model created in 1966 by Everett Lee called the push/pull model. This model focuses
on the micro levels of migration [10, 26]. Lee explains migration as a decision of an
individual or family to migrate based on several factors: characteristics of origin,
characteristics of the destination, nature of intervening obstacles (e.g. cost, borders), and
the nature of the people [10]. There are factors that pull an individual or family to a
destination as well as factors that push an individual or family away from their
origin. This theory is simple, whereas migration in reality is complex. Lee’s push/
pull model also does not take into consideration the lack of autonomy many people
have over the decision to migrate. To understand migration, you have to look past
one’s desires. The theorization of migration has long consisted of disconnected
theories rather than theories building upon one another, which is partially due to the
difficulty of predicting migration patterns and the complexity of migration [4].
There are four kinds of causal factors of migration [10]. Root causes are the
structural and systemic factors that lay the groundwork for future forced displace-
ment. These relate to macro and meso factors listed above, and examples include a
weak state, severe social fragmentation, and economic underdevelopment. Next are
proximate causes which are the immediate circumstances that trigger movement
such as escalation of violence or conflict and persecution. These also relate to micro
and meso factors. The next causal factor is an enabling condition. This refers to the
actual journey to and stay in the destination country. Enabling conditions consist of
resources, legislation, border controls, and travel possibilities. The last causal factor
is a sustaining factor. This factor relates to what encourages continuous or chain
migration. Chain migration is the idea that resources like money and knowledge flow
through networks that make migrating to the same destination country more feasible
and attractive [10].
Here, we will focus on the specific macro and meso reasons why people
migrate. Matters pertaining to mass migration have been discussed in the literature,
including but not limited to Kaundert et al. [27] and Taylor and Masys [56]. In
regards to climate change, human mobility occurs when there are extreme events,
changing weather, glacial melt, and coastal inundation [68]. These interact with the
social, economic, political, and demographic factors of migration that are mentioned
above. Because of this, the World Food Programme [WFP] [76] considers climate
change to be a crisis multiplier. Not only are people experiencing natural disasters,
extreme heat, and drought, but they are also experiencing food crises brought on by
changes in the climate. WFP has spent over $23 billion in response to
climate-related disasters in the past decade [76]. There have been 6500 climate-
related disasters between 1980 and 2007 [18]. Certain regions of the world are more
vulnerable to these crises than others. Neoliberal capitalism is considered by many to
be guilty of this inequity by omission [18]. By having the resources to reduce
greenhouse gas emissions but not acting quickly enough, developed neoliberal countries
are exacerbating the negative effects of climate change [18]. In addition, the new
world of free trade is facilitating the increased displacement of pollution and other
negative externalities to poorer countries through the outsourcing of labor and
factories.
These climate injustices are manifested as socioeconomic injustices, interna-
tional injustices, and intergenerational injustices. Socioeconomic injustices arise
when people and countries do not have enough money to adapt to and prepare for the
future effects of climate change. The people who live in the areas most affected by climate
change are the same people who possess significant barriers to migrating, in response
to climate change, due to lack of both economic resources and social capital
[38]. International injustices refer to the disparities between the rich North and
global South [18]. In global negotiations, countries of the North call on the respon-
sibilities that “all” countries should share, when it is the North that produces more
greenhouse gases and experiences fewer of climate change’s consequences [18]. The
third injustice is intergenerational, meaning that we are creating the world that
our grandchildren will inherit. The damage we do will only impact them negatively.
Many migrants experience climate injustice, and it is these thousands of people who
are the climate refugees of the world.
Climate change is Earth’s greatest stressor, and soon to be the main cause of mass
migration movements [7, 38, 39]. There are many direct and indirect ways that
climate change is creating such a large number of climate refugees, because climate
change interacts with social, political, and demographic factors that coexist and
exacerbate events that cause migration [38]. They include soil salinity, sea level rise,
severe droughts, famine, and extreme weather. Some countries are more resilient
than others, bringing us back to the idea of climate injustice. The poorest nations will
bear the brunt of climate change’s effects. Similar characteristics of these countries
such as high population density, shortage of resources, poor urban planning, and
armed conflict all co-occur with climate change and together will intensify migration
levels [39]. Small Island Developing States [SIDS] such as the Republic of Kiribati,
the Maldives, and Tuvalu will be impacted to the point of disappearance and will be
discussed later in this chapter. In addition, Bangladesh is experiencing major strains
due to sea level rise. The Horn of Africa has produced millions of climate refugees due
to extreme famines caused by drought. Many even argue that environmental
conditions have been a cause of human migration since the origin of our species
[38, 50], while anthropogenic climate change has exacerbated those movements.
Considerable irony surrounds climate change and the existence of climate
refugees. First, there is the climate conundrum. The western, neoliberal world
encourages countries to increase their output so that they can create more stable
environments. However, if this process is carried out in the same way as in the
developed nations that transitioned from third-world to first-world status, it will severely
increase energy use and greenhouse gas emissions. This will
only exacerbate climate change and eventually de-stabilize the very same region
that development intended to stabilize [11]. Second, as more refugees seek refuge in
other countries due to climate change, anti-immigrant rhetoric in the media appears
to be increasing. There has recently been a rise in nationalism and populist
parties across the globe. Refugees are a physical symbol to people who are afraid of
their changing environments [3]. And many refugees face increased
danger on arriving in their new "safe" country. Climate refugees are an
emergent and undeniable reality [7].
The idea of recognizing the crossovers between climate change and migration is
not new. According to Warner [68], Graeme Hugo was one of the first people to
investigate the relationship between environmental change and human migration.
The Intergovernmental Panel on Climate Change [IPCC] also published information
about environmentally induced population movements in its first assessment
report in 1990. Many scholars mention climate-related migrations from several
centuries ago, when Europe experienced the Little Ice Age in the seventeenth century
and many fled due to lack of food, increased epidemics, and wars [39]. Some
even trace environmental migration as far back as mass migrations in Mesopotamia,
when a drought-induced famine forced many to flee around 3000 BCE [39]. Fast
forward to 2013, when journalists in the United States began publishing articles
about America's first climate refugees. First it was residents of
Newtok, Alaska. Newtok was first reported in the U.S. Geological Survey in 1949.
The people of Newtok were known as dip net people, or Qaluyaarmiut; Newtok itself
means rustling of grass. After three decades of living in Newtok, its native residents
realized their land was fast disappearing. Average rates of erosion ranged from
36 to 83 ft per year, and it became nearly impossible for fuel to be delivered to the
residents. With the help of Congress, the 450 residents of Newtok fled the slow-
moving climate disaster and relocated to Mertarvik. While a refugee typically crosses
international borders, it is important to note that climate change will affect migration
in all countries, even the United States.
Public Health Implications Environmentally induced forced migration involves
an array of resource and social disruptions that affect the health of climate refugees
[52]. In general, the changing climate impacts human health through heat-related
mortality and disaster-related injury [50]. It is also increasing the prevalence of
diseases such as Lyme disease, West Nile virus, and dengue, as warming
temperatures change the geographic and seasonal patterns of mosquitoes, ticks, and
other vectors [39, 50]. There are specific direct and indirect health risks related to
climate-caused migration [50], yet these are not the focus of much inter-
national discussion around environmental migration [39]. This is unfortunate because
the health impacts of migration will be a major source of loss of life and human
suffering [29]. Migration affects the spread of infectious diseases, as migrants can
carry infections with them or bring new infections back from their destination
country [38]. This, in conjunction with the expanding geographic range in which
disease hosts and vectors can survive as climates warm, puts the globe in a
dangerous position [39]. For example, droughts caused extreme famines in 2011 in
Somalia, Ethiopia, and neighbouring countries. Infectious disease prevalence in these
countries is higher because of migration, exacerbated by the weakened immune
systems caused by malnutrition and by health systems weakened by the famine [39].
During the last two decades, world humanitarian agencies have been able to lift
200 million people out of hunger, and chronic malnutrition in children globally has
decreased substantially, from 40% to 26% [75]. However, climate change threatens
this progress. Extreme weather events such as floods, droughts, and storms, as well as
long-term climatic risks such as sea level rise, are going to affect the areas of the world
that depend the most on agriculture. Climate change threatens all the facets
of food security and nutrition: food availability, food access, food
utilization, and food stability. The World Food Programme has created the Food
Insecurity and Climate Change Mapping Application, which depicts different food inse-
curity outcomes depending on emissions and adaptation scenarios. Countries in
Africa, which produce the lowest emissions globally, will experience the brunt of
food insecurity effects. In addition to those who rely heavily on agriculture, those
who work in food production and handling, such as farming, fishing, and forestry, will
also be negatively affected. Under-nutrition and micronutrient deficiencies increase
mortality and morbidity from communicable diseases [78]. Diseases such as
measles, acute respiratory infection, malaria, diarrheal diseases, tuberculosis, and
HIV can all be exacerbated in malnourished populations [78].
Increasing ocean acidification is decreasing fish stocks, compounding the effects of
overfishing driven by growing populations [39]. Climate change also means that less
drinking water will be available, increasing the rates of diseases caused by unsafe
drinking water such as cholera and other diarrheal diseases. A lack of water will also
harm the ability to grow crops, creating desertification that reduces
the areas where food can be grown, particularly in developing countries such as those in
Sub-Saharan Africa. These countries also rely heavily on subsistence farming,
compounding these impacts.

4 Discussion

4.1 Case Study: Bangladesh

Bangladesh is considered one of the countries most vulnerable to climate change in
the entire world [1, 6, 9]. Methods such as the livelihood vulnerability indices used by
Toufique and Islam [58] reveal four disaster-prone areas of Bangladesh, the most
vulnerable of which are the communities living along coastal regions. Due to the
imminent destruction of their homes from an estimated 46 cm of sea level rise by 2050,
people are moving into urban areas of Bangladesh with crumbling infrastructure that
are not prepared to make the adaptive changes necessary to compensate for rapidly
rising density [51]. The effects of climate change are not only a future concern.
Using the methodology of the Association for Climate Refugees [ACR], it is
estimated that six million individuals have already been displaced by the effects of
climate hazards in Bangladesh [64]. The problem is only expected to grow over the
next 40 years, with projections predicting upwards of 20 million displaced
Bangladeshis due to climate-related hazards [6]. By flooding their homes, natural
disasters force Bangladeshis to move into urban areas, where the less visible, but
equally disastrous, problems of rapid urbanization await.
The city of Dhaka, with a population of 14 million people, is undergoing a rapid
influx of people from rural areas [51]. However, Dhaka is ranked among the top
20 cities of the globe for its exposure to climate extremes [25]. Unplanned urban-
ization causes many health issues. Development often occurs without much plan-
ning, and surrounding areas of cities that should be conserved for productive
farmland or for conservation efforts are being used to house thousands of displaced
citizens seeking new opportunity [2]. When people move farther from city centers,
they must travel farther for work which increases consumption of fossil fuels and
puts drivers at risk of traffic-related injury [47]. In Dhaka, these surrounding areas
are much more prone to flooding, thus not alleviating any vulnerability [2]. The
burden of these unplanned migrations expressed through the negative impacts of
urbanization is doubled by the city’s own vulnerability to climate disasters. Dhaka is
not alone, for the next four largest cities of Bangladesh, Sylhet, Chittagong, Khulna,
and Rajshahi, though in different disaster zones, are all facing ever-increasing
minimum and maximum temperatures [51].
These cities are projected to hold 44% of the total population by 2030 [74], and
they are not prepared to handle such an influx. As poor rural dwellers seek refuge and
resilience in the city, demand for housing increases. This raises the price of
housing, making resilience unaffordable for the poor [25]. In addition to the lack of
affordable housing, water becomes an issue because rapid development
creates water sources that are more likely to flood and are less safe [25]. With
higher population density and thousands who cannot afford housing, more people
settle in slums. With slums overcrowded and lacking essentials like healthcare
and running water, the chance of disease outbreaks increases [47]. Higher crime and
weaker social connections further erode any hope of resilience for climate-
displaced people [15]. All of these combined determinants create an envi-
ronment far from equipped to handle the inevitable 20 million Bangladeshi citizens
who will be displaced by sea level rise alone [6].
Vulnerability is inextricably linked with the problem of urban poverty. For
example, the char areas of Bangladesh, land along the embankments of rivers, are the
poorest and most vulnerable to climate change [1]. The char areas experience regular
river erosion, and have very limited access to healthcare, education, and even
traditional communication methods [24]. These social inequalities, experienced by
the 5% of the Bangladeshi population that live on the chars, already create poor
health outcomes. Their present vulnerability is heightened by the more aggressive
flooding and erosion that climate change is causing. If they stay, their lives will be
destroyed. If they migrate to an urban area such as Dhaka, they are likely to live in a
slum where opportunities are slim and health outcomes do not differ from before
[15, 74].
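The livelihood vulnerability indices cited at the start of this case study [58] are typically built as composites of normalised indicators. The short Python sketch below illustrates the general idea under simple assumptions: min-max normalisation, equal weighting, and made-up indicator values. It is not the specific method or data used by Toufique and Islam, only a minimal illustration of how such an index can be assembled and compared across districts.

```python
# Minimal sketch of a composite livelihood vulnerability index.
# Indicator values, ranges, and weights are illustrative, not real survey data.

def min_max_normalise(value, lo, hi):
    """Scale a raw indicator onto [0, 1]."""
    return (value - lo) / (hi - lo)

# Hypothetical indicators for one coastal community:
# name -> (raw value, observed minimum, observed maximum)
indicators = {
    "households_in_flood_zone_pct": (72.0, 0.0, 100.0),
    "dependence_on_single_livelihood_pct": (64.0, 0.0, 100.0),
    "access_to_safe_water_pct": (38.0, 0.0, 100.0),  # higher access = less vulnerable
    "adult_literacy_pct": (45.0, 0.0, 100.0),         # higher literacy = less vulnerable
}

# Indicators for which a higher value reduces vulnerability are inverted.
inverted = {"access_to_safe_water_pct", "adult_literacy_pct"}

scores = []
for name, (value, lo, hi) in indicators.items():
    score = min_max_normalise(value, lo, hi)
    if name in inverted:
        score = 1.0 - score
    scores.append(score)

# Equal weighting here; published indices weight by numbers of sub-components.
lvi = sum(scores) / len(scores)
print(f"Composite vulnerability score: {lvi:.2f}")  # 0 = least, 1 = most vulnerable
```

Comparing composite scores of this kind across zones is what allows disaster-prone areas, such as the coastal and char communities discussed above, to be ranked; real indices draw on many more sub-components and survey-derived weights.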

4.2 Case Study: Tuvalu

Located halfway between Hawaii and Australia is a collection of nine coral atolls
known as Tuvalu [14]. Tuvalu was once a British colony that achieved independence
in 1978, but the country now has bigger issues to address [14]. The mean elevation in
Tuvalu is two meters, which means Tuvaluans are extremely vulnerable to climate
change [14]. Deforestation, beach erosion, and damage to coral reefs from increasing
water temperatures are exacerbating the effects of sea level rise in this small country
[14]. According to a study by Than, Singh, and Uma [57], the rate of sea level rise in
Tuvalu is 5.9 mm per year, about four times higher than the global average
of 1–2 mm per year. Soon, all 11,150 people in the country of Tuvalu might be at risk of
becoming climate refugees. In 2015, Category 5 Cyclone Pam displaced 45% of the
islands' people [61]. The government of Tuvalu has long anticipated this likely outcome
for its country; in 2000 it appealed to New Zealand and Australia about
accepting its citizens if sea level rise makes Tuvalu uninhabitable. Tuvalu is not
alone, for it is among the many small island developing states [SIDS] that will
experience the brunt of future climate disruption [19].
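To give a rough sense of scale for the rate cited above, the sketch below simply extrapolates 5.9 mm per year linearly against Tuvalu's two-metre mean elevation. This is purely illustrative arithmetic, not a projection: real sea level trajectories are nonlinear and scenario-dependent, and storm surge and saltwater intrusion threaten habitability long before land is permanently submerged.

```python
# Illustrative linear extrapolation of the 5.9 mm/year rate cited above [57].
# Real projections are nonlinear and scenario-dependent; this only gives a sense of scale.

rate_mm_per_year = 5.9   # observed local rate of sea level rise
mean_elevation_m = 2.0   # Tuvalu's mean elevation

for horizon_years in (30, 50, 80):
    rise_m = rate_mm_per_year * horizon_years / 1000.0
    share_of_elevation = rise_m / mean_elevation_m
    print(f"{horizon_years:>2} years: ~{rise_m:.2f} m of cumulative rise "
          f"(~{share_of_elevation:.0%} of mean elevation)")
```

Even under this simplistic assumption, the cumulative rise amounts to roughly a tenth of the mean elevation within three decades and about a quarter within 80 years, before any extreme event is considered.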
There is great irony when looking at small island developing states, like Tuvalu,
and climate change. Tuvalu emits 1.0 metric ton of carbon dioxide per capita according to
the World Bank [73]. In comparison, the United States emits 16.5 metric tons per capita,
Australia 15.83, and Canada 15.32 [76]. Carbon dioxide is a major
contributor to climate change, yet countries like Tuvalu, which produce the least
CO2, will experience the brunt of its effects, and sooner. Because of this
reality, many scholars view Tuvalu as a litmus test for global climate change;
however, this framing is criticized by many for its dehumanizing nature
[19]. It casts Tuvaluans as evidence of climate change rather than as human subjects
[20]. Are Tuvaluans accepting this seemingly inevitable fate as a litmus test?
The Government of Tuvalu has reached out to international partners to help
combat climate change. An agreement struck with the United Nations Development
Programme [UNDP] created a new climate resilience project that will benefit 30% of
Tuvalu's population. It will take place over the course of 7 years, hopefully posi-
tively impacting this tiny nation [61]. In addition, Tuvalu's Coastal Adaptation
Project has partnered with the Green Climate Fund to receive a $36 million
grant. This project will work to increase coastal protection coverage from 570 to
2780 m, focusing on the most highly populated islands of Tuvalu: Funafuti,
Nanumea, and Nanumaga. Tuvalu has gone beyond coastal protection to
ensuring its citizens are educated about climate change and how it will
change the physical geography of their island home. Primary schools have included
curricula to teach students about these topics [61]. Tuvalu will not be able to engineer
its way out of its potential demise without the help of other nations such as the
United States.

(Photograph: Japan Times 2018)

4.3 Case Study: South Sudan

Across the Pacific and Indian Oceans, 10,144 miles away from Tuvalu, is the
country of South Sudan. Though the two countries are far apart, South
Sudan also faces intensifying negative effects from climate change. Unlike
Tuvalu, the landlocked country's problem is not sea level rise but drought. On
February 20th, 2017, a famine was declared in South Sudan [62]. The State Depart-
ment estimates that 383,000 people have died in South Sudan due to famine and civil
war [13]. It also estimates that there are over 2.5 million refugees as a result of
political instability and food insecurity, in addition to 1.8 million internally displaced
persons [IDPs] [43]. Increased unpredictability in rainfall and rising
temperatures are increasing droughts in this nation. Drought causes greater losses
of crop and pasture land, as well as reducing key ecosystem habitats and water
resources. The decreased ability to produce crops in South Sudan has severe
impacts because 66% of the population lives on less than $2 a day and depends
almost solely on the crops and animals they can raise for food and household
income [14].

(Photograph: Al Jazeera News 2017)

South Sudan only recently gained independence from Sudan in 2011, and while
it is now its own nation state, it has been paralyzed by its tropical
climate and the lack of rainfall on which all of its agriculture depends. This,
coupled with increased conflict in the country, has given it the highest death rate in
the world, at 19.3 deaths per 1000 people [14]. Nearby, the Darfur War has been
labeled the "first climate change conflict," as the drought felt in this area of the world
has elevated political instability and is one of the critical factors in this war's
beginnings [77]. This reiterates the idea that climate change is the number one threat
to global security. Research is finding that with increasing temperatures comes
increased risk of civil war in Africa [12]. The international community needs to prepare
better for when drought begins to affect more places, in order to avoid the outcomes
to which South Sudan is succumbing.
UNDP projections indicate that South Sudan will feel global warming 2.5 times
more than the global average [62]. The United Nations cites high dependence on
agriculture, a recent history of conflict, and discriminatory political institutions as
factors that make a country more vulnerable to climate change; South Sudan
possesses all three [59]. Many argue that developing countries have an advantage in
being able to apply lessons learned about renewable energy, such as solar and wind,
to bypass traditional energy sources like coal and natural gas. However, resources
are being diverted from mitigation efforts to ensure the country's displaced have
enough food, clean water, and shelter. The United Nations Development Programme
[60] and the South Sudanese Government hope "to strengthen the hydro-meteorological
monitoring network across to accurately measure and predict extreme climatic events,
and to establish flood and drought early warning systems to reduce the vulnerability
and impact of communities to the environmental hazards."

5 Problem Framing: Revealing Vulnerabilities and Exploring the Possibility Space

As described by Moore and Westley [40], 'complex challenges demand complex
solutions. By their very nature, these problems are difficult to define'. Different ways
of seeing are thereby required to support strategic decision making. The complex
problem associated with climate-related extreme events and their influence on public
health and migration requires a framework that supports new ways of seeing, to reveal
vulnerabilities and explore the possibility/plausibility space in support of
strategic decision making. The nexus of global and local decision making '...complexifies
the decision environment...captured in one term...glocalization' [32:1]. In essence,
glocalization is replete with uncertainties and dynamic, interacting complexities.
This framework is comprised of the following:
• Cynefin framework to support problem framing
• Network thinking to support interdependency analysis
• Vulnerability analysis to support public health implications
• Scenario planning to support impact analysis

5.1 Cynefin Framework

Framing the climate change impact on global health security as a wicked problem
[37] replete with uncertainties, ambiguity and complexity lends itself to application
of the Cynefin framework [30, 53, 54]. Cynefin (kuh-ne-vin) is the Welsh
word for 'habitat or rootedness' and was used by Snowden as a metaphor for a
conceptual framework of 'time and space' for making decisions in complex situations.
Fig. 1 Cynefin Framework (Kurtz and Snowden [30])

The application of Cynefin helps to contextualize problems regarding causality
across 4 + 1 domains: Simple/obvious; Complicated; Complex; Chaotic; and Disor-
dered (Fig. 1).
The Cynefin Framework is apropos in contextualizing the climate change threats
to global health security. The nonlinear behavior of climate related effects on
vulnerable communities is well documented, as described through the case studies
of South Sudan and Tuvalu. The multivariate impacts of climate change span social,
political, economic, environmental and health security, which challenges traditional
linear, event-based approaches to problem framing.
In matters pertaining to the impact and influence of climate change on society,
interdependencies, interconnectivity, and indirect influences make mechanistic and lin-
ear approaches problematic. Understanding complex causality and influences
requires a more holistic perspective. As described in Kempermann [28:3]:
‘The Cynefin framework allows contextualization of situations that require a decision and
response by providing a reference language. The key assumption is that such situations fall
into one of five categories, which call for substantially different but definable conclusions
and mode of actions, and that awareness of these principal differences allows more struc-
tured insight and better-informed decisions’. . . Although it is only one of the five contexts
described by the Cynefin framework, complexity actually lies at the heart of the idea. The
entire framework has been designed in order to identify this critical category correctly and
allow decisions (actions) that are appropriate to the specific demands of these contexts, also
in relation to the others. The complex contexts are those of the “known unknowns,” in which
neither “best” nor “good” practice can be used, but “emergent practice” is needed. This
description alone already suggests, why for the research context this category is
fundamental.
The Cynefin framework helps situate our conceptual understanding of climate-
related disasters and thereby challenges linear, event-based mindsets, allowing us to
explore the uncertainty, ambiguity and complexity of the problem space. In so doing,
the Cynefin framework informs scenario development.

6 Network Thinking

Today we see unprecedented interconnectedness and interdependencies at the local
and global scale. As described by Sambharya and Rasheed [49:308] 'risks are
rarely confined to a nation, an industry, or a firm. Instead, today’s risks are systemic,
their contagion rapid, and their consequences devastating and unpredictable. This
calls for new approaches to understand measure and respond to risks’. As described
in Masys et al. [37], our notion of risk and ‘hyper-risks’ [22] must be considered
from a comparative ‘networked model’ that recognizes the complex interdepen-
dencies and interconnectivity of the risk ‘ecosystem’ and the underlying understand-
ing of failures, natural hazards and human-made disasters as it pertains to resilience.
Network thinking or more clearly a ‘network mindset’ [66] is essential for under-
standing the network structure, network behavior and the feedback/feedforward
effects resident within these systems. What emerges from the study of networks is
the insightful requirement to evaluate actions and behaviours not in isolation but
recognizing that cause and effect are complex and nonlinear [37]. The ‘networked’
understanding of hyper-risks [22] requires a more holistic approach to hazard
identification and risk management that transcends the linear agent-consequence
analysis. This network mindset described in Masys et al. [37] and Xu and Masys [79]
resonates with understanding the complex and risk landscape associated with climate
change and public health impacts.
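As a concrete, if deliberately simplified, illustration of what a network mindset adds over linear agent-consequence analysis, the Python sketch below builds a small directed dependency graph of the kind of climate-health interdependencies discussed in this chapter and asks which nodes sit on the most pathways. The nodes, the edges, and the use of the open-source networkx library are illustrative assumptions for this sketch, not an empirical model.

```python
# Toy interdependency network for the climate-health problem space.
# Nodes and edges are illustrative only; a real analysis would be built
# with stakeholders and empirical data.
import networkx as nx

G = nx.DiGraph()
G.add_edges_from([
    ("drought", "crop failure"),
    ("crop failure", "food insecurity"),
    ("food insecurity", "malnutrition"),
    ("malnutrition", "infectious disease burden"),
    ("sea level rise", "coastal displacement"),
    ("coastal displacement", "urban slum growth"),
    ("urban slum growth", "infectious disease burden"),
    ("drought", "conflict"),
    ("conflict", "forced migration"),
    ("forced migration", "urban slum growth"),
    ("forced migration", "health system strain"),
    ("infectious disease burden", "health system strain"),
])

# Betweenness centrality highlights nodes that lie on many indirect pathways:
# plausible leverage points that a linear cause-effect view tends to miss.
centrality = nx.betweenness_centrality(G)
for node, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True)[:3]:
    print(f"{node}: {score:.2f}")

# Downstream reach of a single shock, e.g. everything a drought can cascade into.
print("Drought cascades into:", sorted(nx.descendants(G, "drought")))
```

Even a toy graph makes the point of the hyper-risk literature [22, 37] concrete: shocks propagate along chains of dependency, so risks have to be assessed on the network as a whole rather than node by node.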

7 Vulnerability Analysis

Shocks (such as climate-related extreme events) stress our 'health security' ecosys-
tem, often resulting in failures at various scales and thereby posing serious threats
nationally, regionally and globally. The ecosystem analogy emphasizes the
interdependence of all actors in the environment. To better manage black swan
events that stress the health security ecosystem, a fundamental redesign of our
mental models and perspective is needed: essentially a paradigm shift in how we
view vulnerabilities and enable health security. Woods [72, p. 316] asks the
question:
How do people detect that problems are emerging or changing when information is subtle,
fragmented, incomplete or distributed across different groups involved in production pro-
cesses and in safety management. Many studies have shown how decision makers in
evolving situations can get stuck in a single problem frame and miss or misinterpret new
information that should force re-evaluation and revision of the situation assessment....
Given the current health security landscape and shocks to human systems char-
acterized by complexity and wickedness [34, 35], the concept of resilience
(supporting health security) encompasses a capacity to anticipate and manage risks
and the ability to survive threats and respond to challenges. This lies at the heart of
vulnerability. Understanding the interconnectivity and interdependencies in our
health security ecosystem supports a vulnerability analysis that can identify capa-
bility targets and resource requirements necessary to address anticipated and unan-
ticipated risks.
Through the lenses of Cynefin, network thinking and vulnerability analysis, scenario
planning emerges to support solution navigation across the 4 + 1 domains of
Cynefin.

8 Scenario Planning

With the onset of these climate-related disasters and the possibility of mass migra-
tions, scenario planning provides a well-established methodology to address such
uncertainty about the future occurrence and impacts of climate-related events.
As described in Wilkinson et al. [71, p. 301], plausibility-based scenarios are
useful approaches in situations characterized by increasing uncertainty and
complexity...emerging futures cannot be forecasted but can be imagined and
'lived in', offering a different perspective on learning about the present than history
alone provides...'plausibility-based scenarios offer reframing devices rather than
forecasting tools. Scenarios are not populated with facts but with perceptions,
assumptions and expectations.' In this way, scenario planning enables the develop-
ment of multiple narrative-based conceptual mappings of how possible futures might
unfold. This is particularly relevant with climate change and migration.
With the onset of these climate-related migrations, it is important to raise new
questions about how the international community perceives the risk of environmen-
tal induced migration as well as how best to prepare for the inevitable disruptive
events.
‘Things that have never happened before, happen all the time’ [48] challenges our
notion of these black swan events [36, 55]. As we continue to learn from past events,
we can apply scenario planning to anticipate plausible but unprecedented conditions,
and thereby expect 'black swan' surprises. With this in mind, scenario planning
explores the possibility/plausibility space (Fig. 2).
It is important to emphasize that scenario planning cannot predict the future; rather, it
is about creating a mindset that is comfortable working with uncertainty and
thinking the unthinkable, in order to facilitate the strategic conversation.
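One minimal way to make the possibility/plausibility space tangible is the classic two-axes scenario matrix: choose two critical uncertainties and treat each combination of their extremes as the seed of a distinct narrative. The axes and labels in the Python sketch below are illustrative assumptions for the climate-migration problem space, not a prescribed set.

```python
# Minimal scenario-matrix sketch: two critical uncertainties, four scenario frames.
# Axis choices are illustrative; in practice they emerge from a structured
# strategic conversation with stakeholders.
from itertools import product

uncertainties = {
    "pace of climate-driven displacement": ["gradual", "abrupt"],
    "international cooperation on migration": ["fragmented", "coordinated"],
}

axis_names = list(uncertainties)
for states in product(*uncertainties.values()):
    frame = dict(zip(axis_names, states))
    # Each combination is a seed for a narrative scenario, not a forecast.
    title = " / ".join(f"{axis}: {state}" for axis, state in frame.items())
    print(f"Scenario frame -> {title}")
```

Each resulting frame would then be developed into a full narrative and stress-tested against public health capabilities, which is where the vulnerability analysis described above feeds in.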
Fig. 2 Scenario Planning (http://quesucede.com/page/show/id/scenario-planning)

9 Conclusion

As described by Moore and Westley [40], 'complex challenges demand complex
solutions. By their very nature, these problems are difficult to define'. Extreme
weather (climate-related events) poses serious public health threats to vulnerable
communities and has resulted in mass migrations and displacements, creating what
some refer to as 'climate refugees'. Threats and consequences stemming from
climate-related events are transboundary and transnational, and thereby emerge as a
global health security concern by exacerbating existing inequalities and burdens of
disease, changing patterns of infectious diseases, and affecting food security.
Complexity is a challenge to strategic decision making. The need for community-
based adaptation to the health risks posed by climate change and stakeholder
engagement have been highlighted in the literature [5:70–71]. To facilitate this, a
framework is presented to support solution navigation. The framework is comprised
of:
• Cynefin framework to support problem framing
• Network thinking to support interdependency analysis
• Vulnerability analysis to support public health implications
• Scenario planning to support impact analysis
In essence, the framework teases out the complexities and interdepen-
dencies inherent in the climate refugee problem space. In so doing, strategic
interventions can be formulated, tested and deployed.
References

1. Alam M (2017) Livelihood cycle and vulnerability of rural households to climate change and
hazards in Bangladesh. Environ Manag 59(5):777. https://doi.org/10.1007/s00267-017-0826-3
2. Alam M, Rabbani G (2007) Vulnerabilities and responses to climate change for Dhaka. Environ
Urban 19(1):81–97. https://doi.org/10.1177/0956247807076911
3. Alvarez A (2017) Intervention III: global refugees in an age of climate change. Cross Curr
67(3):634–643. https://doi.org/10.1111/cros.12285
4. Arango J (2000) Explaining migration: a critical view. Int Soc Sci J 52(165):283–296. https://
doi.org/10.1111/1468-2451.00259
5. Araos M, Austin SE, Berrang-Ford L, Ford JD (2016) Public health adaptation to climate
change in large cities: a global baseline. Int J Health Serv 46(1):53–78
6. Barua P, Shahjahan M, Rahman M, Rahman S, Molla M (2017) Ensuring the rights of climate-
displaced people in Bangladesh. Forced Migr Rev 1(54):88–91. Retrieved from http://www.
fmreview.org/sites/fmr/files/FMRdownloads/en/resettlement.pdf
7. Berchin I, Valduga I, Garcia J, de Andrade Guerra J (2017) Climate change and forced
migrations: an effort towards recognizing climate refugees. Geoforum 84:147–150
8. Bettini G (2014) Climate migration as an adaption strategy: de-securitizing climate-induced
migration or making the unruly governable? Critical Studies on Security 2(2):180–195
9. Biswas H, Rahman T, Haque N (2016) Modelling the potential impacts of climate change in
Bangladesh: an optimal control approach. J Fundam Appl Sci 8(1):1–19. https://doi.org/10.
4314/jfas.v8i1.1
10. Boswell C (2002) New issues in refugee research Addressing the causes of migratory and
refugee movements: the role of the European Union. Working Paper No. 73. https://www.
unhcr.org/3e19ac624.pdf
11. Burleson E (2010) Climate change displacement to refuge. J Environ Law Litig 25(1):19–35.
Retrieved from https://digitalcommons.pace.edu/lawfaculty/767/
12. Burke M, Miguel E, Satyanath S, Dykema J, Lobell D (2009) Warming increases the risk of
civil war in Africa. Proc Natl Acad Sci 106(46):20670–20674
13. Checchi F, Testa A, Warsame A, Quach L, Burns R (2018) Estimates of crisis-attributable mortality
in South Sudan, December 2013–April 2018: a statistical analysis. London School of Hygiene & Tropical
Medicine, London. Retrieved from https://crises.lshtm.ac.uk/2018/09/26/south-sudan-2/
14. CIA Factbook (2018) https://www.cia.gov/library/publications/download/download-2018
15. Chowdhury M, Jahan F, Rahman R (2017) Developing urban space: the changing role of NGOs
in Bangladesh. Dev Pract 27(2):260. https://doi.org/10.1080/09614524.2017.1287162
16. Comenetz J, Caviedes C (2002) Climate variability, political crises, and historical population
displacements in Ethiopia. Global Environ Change B Environ Hazard 4:113–127
17. Erdal M, Oeppen C (2018) Forced to leave? The discursive and analytical significance of
describing migration as forced and voluntary. J Ethn Migr Stud 44(6):981–998. https://doi.org/
10.1080/1369183X.2017.1384149
18. Faber D, Schlegel C (2017) Give me shelter from the storm: framing the climate refugee crisis in
the context of neoliberal capitalism. Capital Nat Social 28(3):1–17. https://doi.org/10.1080/
10455752.2017.1356494
19. Farbotko C (2010) Wishful sinking: disappearing islands, climate refugees and cosmopolitan
experimentation. Asia Pac Viewp 51(1):47–60. https://doi.org/10.1111/j.1467-8373.2010.
001413.x
20. Farbotko C, Lazrus H (2010) The first climate refugees: contesting global narratives of climate
change in Tuvalu. Glob Environ Chang 22:382–390. https://doi.org/10.1016/j.gloenvcha.2011.
11.014
21. Hagen-Zanker J (2008) Why do people migrate? A review of the theoretical literature.
Maastricht Graduate School of Governance Working Paper No. 2008/WP002. Available at
SSRN: https://ssrn.com/abstract=1105657 or https://doi.org/10.2139/ssrn.1105657
22. Helbing D (2013) Globally networked risks and how to respond. Nature 497:51–59
23. International Organization of Migration [IOM] (2018) Migration and migrants: a global over-
view. Retrieved from http://www.iom.int/wmr/chapter-2
24. Islam M, Hossain D (2013) Island char resources mobilization (ICRM): changes of livelihoods
of vulnerable people in Bangladesh. Soc Indic Res 117(3):1033–1054. https://doi.org/10.1007/
s11205-013-0375-y
25. Jabeen H, Guy S (2015) Fluid engagements: responding to the co-evolution of poverty and
climate change in Dhaka, Bangladesh. Habitat Int 47:307–314. https://doi.org/10.1016/j.
habitatint.2015.02.005
26. Jain M (2016) Ten theories of migration. Retrieved from http://www.youtube.com/watch?
v=b4svgoodin8
27. Kaundert M, Masys AJ (2018) Mass migration, humanitarian assistance and crisis management:
embracing social innovation and organizational learning. In: Masys AJ (ed) Security by design.
Springer, Cham
28. Kempermann G (2017) Cynefin as reference framework to facilitate insight and decision-
making in complex contexts of biomedical research. Front Neurosci 11:634
29. Kolmannskog V (2008) Future floods of refugees: a comment on climate change, conflict and
forced migration. Norwegian Refugee Council, Oslo
30. Kurtz CF, Snowden DJ (2003) The new dynamics of strategy: sense-making in a complex and
complicated world. IBM Syst J 42(3):462–483
31. Lee B, Preston F, Green G (2012) Preparing for high-impact, low-probability events; lessons
from Eyjafjallajökull. Chatham House Report. January 2012
32. Leleur S (2012) Complex strategic choices: applying systemic planning for strategic decision
making. Springer, London
33. Lichtveld M (2018) Disasters through the lens of disparities: elevate community resilience as an
essential public health service. Am J Public Health 108(1):28–29
34. Masys AJ (2014) Disaster management: enabling resilience. Springer Publishing
35. Masys AJ (2016a) Exploring the security landscape- non-traditional security challenges.
Springer Publishing
36. Masys AJ (2016b) Disaster forensics: understanding root cause and complex causality. Springer
Publishing
37. Masys AJ, Ray-Bennett N, Shiroshita H, Jackson P (2014) High impact/low frequency extreme
events: enabling reflection and resilience in a hyper-connected world. 4th International Con-
ference on Building Resilience, 8–11 September 2014, Salford Quays, United Kingdom.
Procedia Economics and Finance 18 (2014) 772–779
38. McMichael C (2015) Climate change-related migration and infectious disease. Virulence
6(6):548–553. https://doi.org/10.1080/21505594.2015.1021539
39. McMichael A, Lindgren E (2011) Climate change: present and future risks to health—and
necessary responses. J Intern Med 270(5):401–413
40. Moore M, Westley F (2011) Surmountable chasms: networks and social innovation for
resilient systems. Ecol Soc 16(1):5. Retrieved from http://www.ecologyandsociety.org/vol16/
iss1/art5/ (4)
41. Najem S, Faour G (2018) Debye–Hückel theory for refugees’ migration. EPJ Data Science 7:22
42. Norwegian Refugee Council (2015) Global estimates 2015: people displaced by disasters.
Retrieved from http://www.internal-displacement.org/assets/library/Media/201507-global
Estimates-2015/20150713-global-estimates-2015-en-v1.pdf
43. O’Grady S (2018) A new report estimates that more than 380,000 people have died in South
Sudan’s civil war. In The Washington Post. Retrieved from https://www.washingtonpost.com/
world/africa/a-new-report-estimates-more-than-380000-people-have-died-in-south-sudans-civil-
war/2018/09/25/e41fcb84-c0e7-11e8-9f4f-a1b7af255aa5_story.html?utm_term=.c1b433a12e12
44. Oxford Dictionary (2018) Migration. Retrieved from https://en.oxforddictionaries.com/defini
tion/migration
45. Ravenstein E (1885) The laws of migration. J Stat Soc Lond 48(2):167–235. Retrieved from
http://www.worldcat.org/title/laws-of-migration/oclc/4670267
46. Reuveny R (2007) Climate change-induced migration and violent conflict. Polit Geogr
26:656–673
47. Roy M (2009) Planning for sustainable urbanisation in fast growing cities: mitigation and
adaptation issues addressed in Dhaka, Bangladesh. Habitat Int 33(3):276–286. https://doi.org/
10.1016/j.habitatint.2008.10.022
48. Sagan SD (1993) The limits of safety: organizations, accidents, and nuclear weapons. Princeton
University Press, NJ
49. Sambharya RB, Rasheed AA (2012) Global risk in a changing world: new paradigms and
practice. Organ Dyn 41:308–317
50. Schwerdtle P, Bowen K, McMichael C (2018) The health impacts of climate-related migration.
BMC Med 16:1. https://doi.org/10.1186/s12916-017-0981-7
51. Shahid S, Wang X, Harun SB, Shamsudin S, Ismail T, Minhans A (2016) Climate variability
and changes in the major cities of Bangladesh: observations, possible impacts and adaptation.
Reg Environ Chang 16(2):459–471
52. Shultz J, Rechkemmer A, Rai A, McManus K (2018) Public health and mental health implica-
tions of environmentally induced forced migration. Disaster Med Public Health Prep:1–7.
https://doi.org/10.1017/dmp.2018.27
53. Snowden D (2003) Complex knowledge. Building the knowledge economy: issues, applica-
tions, case studies, 805
54. Snowden DJ, Boone ME (2007) A leader's framework for decision making. Harv Bus Rev
85(11):68–76
55. Taleb NN (2007) The black swan: the impact of the highly improbable. Penguin Books Ltd,
London
56. Taylor I, Masys AJ (2018) Complexity and unintended consequences in a human security crisis:
a system dynamic model of the refugee migration to Europe. In: Masys AJ (ed) Security by
design. Springer, Cham
57. Than A, Singh A, Uma P (2009) Sea level threat in Tuvalu. Am J Appl Sci 6. https://doi.org/10.
3844/ajassp.2009.1169.1174
58. Toufique K, Islam A (2014) Assessing risks from climate variability and change for disaster-
prone zones in Bangladesh. Int J Disaster Risk Reduct 10(Part A):236–249. https://doi.org/10.
1016/j.ijdrr.2014.08.008
59. United Nations (2018) Climate shocks and humanitarian crises. Retrieved from https://www.
foreignaffairs.com/articles/world/2018-11-29/climate-shocks-and-humanitarian-crises
60. United Nations Development Programme [UNDP] (2018) Climate change and adaption.
Retrieved from http://www.ss.undp.org/content/south_sudan/en/home/ourwork/poverty
reduction/Climate-Change-and-Adaptation.html
61. United Nations Development Programme [UNDP] (2017a) Government of Tuvalu launches
new coastal protection project to bolster resilience to climate change. Retrieved from https://
reliefweb.int/report/tuvalu/government-tuvalu-launches-new-coastal-protection-project-bolster-
resilience-climate
62. United Nations Development Programme [UNDP] (2017b) Confronting climate change in
South Sudan. Retrieved from http://www.undp.org/content/undp/en/home/blog/2017/6/29/
Confronting-climate-change-in-South-Sudan.html
63. The United Nations International Strategy for Disaster Reduction (2004) Living with risk: a
global review of disaster reduction initiatives, vol 1. UN Publications, Geneva, p 16. Retrieved
from http://www.unisdr.org/files/657_lwr1.pdf
64. United Nations (2012) Climate displacement in Bangladesh: The need for urgent housing, land
and property (HLP) rights solutions. Retrieved from https://unfccc.int/files/adaptation/groups_
committees/loss_and_damage_executive_committee/application/pdf/ds_bangladesh_report.pdf
65. UN High Commissioner for Refugees [UNHCR] (2006) UNHCR master glossary of terms.
Retrieved October 2018 from http://www.refworld.org/docid/42ce7d444.html
66. Vespignani A (2009 July) Predicting the behavior of techno-social systems. Science
325(24):425–428
67. Willis J (2018) Sea level rise. Smithsonian Institute. Retrieved from https://ocean.si.edu/
through-time/ancient-seas/sea-level-rise
68. Warner K (2018) Coordinated approaches to large-scale movements of people: contributions
of the Paris agreement and the global compacts for migration and on refugees. Popul Environ
39(4):384–401. https://doi.org/10.1007/s11111-018-0299-1
69. Weick KE, Sutcliffe KM (2007) Managing the unexpected: resilient performance in an age of
uncertainty, 2nd edn. Wiley, San Francisco
70. Werrell C, Femia F (Ed) (2017) Epicenters of climate and security: the new geostrategic
landscape of the anthropocene. https://climateandsecurity.org/epicenters/
71. Wilkinson A, Kupers R, Mangalagiu D (2013) How plausibility-based scenario practices are
grappling with complexity to appreciate and address 21st century challenges. Technological
Forecasting and Social Change, Elsevier 80(4):699–710
72. Woods DD (2006) Essential characteristics of resilience. In: Hollnagel E, Woods DD, Leveson
N (eds) Resilience engineering: concepts and precepts. Ashgate Publishing, Aldershot, Hamp-
shire, pp 21–34
73. The World Bank (2018) CO2 emissions (metric tons per capita). Retrieved from https://data.
worldbank.org/indicator/EN.ATM.CO2E.PC?locations=TV
74. World Bank (2007) Dhaka: improving living conditions for the urban poor, Development Series
(paper no. 18). The World Bank Office, Dhaka. Retrieved from http://siteresources.worldbank.
org/BANGLADESHEXTN/Resources/295759-1182963268987/dhakaurbanreport.pdf
75. World Food Programme [WFP] (n.d.) Climate impacts on food security. Retrieved from https://
www.wfp.org/climate-change/climate-impacts
76. World Food Programme [WFP] (2018) Climate action. Retrieved from http://www1.wfp.org/
climate-action
77. World Food Programme (2017) The first climate change conflict. Retrieved from https://www.
wfpusa.org/articles/the-first-climate-change-conflict/
78. World Health Organization (2010) Communicable disease and severe food shortage. Retrieved
from http://apps.who.int/iris/bitstream/handle/10665/70485/WHO_HSE_GAR_DCE_2010_6_
eng.pdf;jsessionid=FD330923D276427CBF1738F18CD4E138?sequence=1
79. Xu T, Masys AJ (2016) Critical infrastructure vulnerabilities: embracing a network mindset.
In: Masys AJ (ed) Exploring the security landscape- non-traditional security challenges.
Springer, Cham
Global Health Biosecurity in a Vulnerable
World – An Evaluation of Emerging
Threats and Current Disaster Preparedness
Strategies for the Future

Kristi Miley

1 Introduction

Our planet is home to several billion individuals, with growth expected to continue at
alarming rates over the next century. In the past two hundred years, this growth has
driven industrialization across the globe and has led many countries to remarkable
innovations in technology and medicine. Unfortunately, with such expansion
comes a great burden to maintain global health safety and security, not just for humans
but for the entire delicate balance of all of earth's resident species. As technologies
have improved, so have the ways that one might use this knowledge for harm instead
of good. It was through the recognition of such acts of maleficence, along with public
health concerns regarding bioweapons, that the field of biosecurity was developed.
Our world is no stranger to the threats of disease. Historically, disease has been the
ultimate warrior, long before its sixth-century appearance, when plague rode into
Europe like a mythical dragon, cutting down an estimated quarter of the population
at that time [14]. Many will remember the terrorist attacks in the United States on
September 11, 2001, but there were also threats of a biological nature when letters
contaminated with anthrax spores breached the United States Postal Service less than
a month later [52]. If history has taught us anything regarding infectious diseases, it
is that we must understand our enemy. Breakthroughs in science have allowed us to
evaluate diseases in a manner not conceived of by our forefathers. When evaluating
the future applications of war, the days of the armory are not over; a
weapon more powerful than the bullet has simply been added to the arsenal. With
growing capabilities to duplicate a virus or to create a mutated or synthetic
pathogen, the knowledge to bring death to every corner of the globe could come in
the smallest of packages. Individuals now have the technology to design a patho-
gen from natural sources that could ultimately be developed into a weaponizable
organism.
Although many may perceive biosecurity as having "security in place to prevent
misuse or intentional release of pathogens or toxins", as defined by the World Health
Organization (WHO), biosecurity can be viewed more like an onion with many
layers that can include unintentional and/or natural events that inflict harm.
According to the Food and Agriculture Organization of the United Nations (FAO),
biosecurity includes risk management and safety associated with plant health, animal
health, human health, and the environment. There are many ways to characterize
biosecurity, but no matter how it is defined, biosecurity is a global health concern that
requires an organized worldwide effort with all stakeholders involved.
Whether one is concerned with agricultural biosecurity, food safety, research
biosecurity, animal health, or pandemic threats, it is important to understand that
each facet of biosecurity is interconnected with the others. We should
consider emerging biosecurity threats as a pebble tossed into a pond, wherein the
pond represents the world. Something may seem as tiny as a pebble at first, but the
effects of its ripple will eventually reach the farthest corners, and with a pandemic
the impacts could be catastrophic. Adding to the burden of health impacts, there is
growing concern that our planet is experiencing warming trends which may have
severe consequences for maintaining sustainable food sources in the future.
Biosecurity concerns are a global health issue and as such demand shared
understanding and coordination among all countries [28]. In global health practice,
collaboration between all stakeholders can assist in establishing the best outcome.
Biosecurity efforts should be approached in a similar fashion in order to provide
global protection from biorisks. This is not an easy task, as many obstacles lie in the
path to substantiating solid disaster management plans regarding biosecurity. Prob-
ably the most important question for all stakeholders is who will finance
the stages of managing biosecurity threats in the global health setting. Lesser
developed countries are unlikely to participate in biosecurity management efforts,
as they are less capable of providing the funding. Unfortunately, lesser developed
countries are also an easy target for a bioterrorist attack, as they lack the infrastruc-
ture to protect against biological weapons [44]. Biodefense preparedness strategies
can be quite costly; the United States has allocated billions to disaster management
and biosecurity [26]. However, the history of past outbreaks suggests that responding
to and recovering from a biological event is more costly
than the initial expense of preventive measures. Having an adequate preparedness
plan could reduce the dissemination of such an outbreak or attack, thereby reducing
post-incident costs.
2 Emerging Biosecurity Threats

2.1 Agriculture Biosecurity Threats

Global warming and its effects on disease transmission are a hot topic in current
research, yet remain controversial among many laymen. Although the
planet may undergo cyclical transitions in its weather patterns over time, climate
change appears to be occurring at a striking rate compared to historical data.
Though it is impossible to predict the future, warming trends are currently affecting
weather patterns with intensifying storms, increased precipitation and flooding,
rising temperatures in cooler regions, lengthy periods of drought in some
areas, and the subsequent spread of infectious diseases in food source crops [1].
With natural disasters on the rise, the planet is undergoing many changes that
may lead to disparities which presumably will far outstrip human capacity to
provide adequate resources for survival.
Florida, California, Hawaii, and many coastal regions in the United States are
rapidly becoming models of climate change, as well as potential research specimens
for the future study of global warming with respect to agricultural and human
biosecurity. This is due in large part to the many natural disasters that have affected
these areas in the past 20 years. Although citrus has been primarily affected
in Florida and California, other regions provide many of the major sustainable foods
which originate from plant sources and are susceptible to global warming trends;
emergence of disease may occur as higher wintertime temperatures allow infectious
organisms to survive for longer periods of time [1]. Global
warming trends could compound biosecurity threats, as plant pathogens will have an
increased ability to spread when regions with once cooler climates transition to more
subtropical temperatures. These warming ecosystems will feasibly provide a new
place for such diseases to call home, thereby expanding the range over which a food
source plant may become infected [1]. This could be further compounded by the
migration of humans into naïve environments, as intense weather patterns drive
movement into regions with more habitable conditions; migrants may bring with them
infectious materials that continue the spread of agricultural illnesses and ultimately
contribute to shortages of sustainable food.
There are many plant pathogens that pose a major threat to continued sustainable
farming and may one day grossly impact our ability to feed the world's population.
Many of these pathogens affecting our agricultural industry stem from bacterial
strains, viruses, and fungal infections. With the continued growth of the global
population there is widespread concern that plant pathogens may have severe effects
on sustainable crops in the near future. International travel and the current import/
export procedures for agricultural products may leave an open door which
could further threaten our ability to protect against fungal, viral, and bacterial
diseases currently impacting crops. When establishing safeguards for future food
sources, we also need to consider the potential for as-yet undiscovered pathogens,
which could arise from genetically modified sources.
The United States has already experienced plant pathogens that have caused total
destruction of crop yields. One example is the Asian citrus psyllid, which transmits
the bacterium that causes citrus greening and has devastated orange groves across
the country [43]. To date the state of Florida has lost thousands of acres of crop land
to this bacterium, which infects the soil and remnant root beds and is extremely
difficult to clear from the land; the land is then deemed unsuitable for citrus
plantations. The impact echoes beyond the loss of the fruit, to the loss of income to
the grower, as well as the subsequent change of land usage, as the remnants of the
plant disease create an environment that no longer sustains a grove. This can have
further impacts on global warming and greenhouse gases, as many of these destroyed
citrus groves are cleared and replaced with cattle. In terms of global warming, removal
of orange plantations can be considered a form of deforestation, which increases
greenhouse gases. Furthermore, there are implications surrounding the land use
changes, as many of these cleared grove lands are replaced with beef cattle for
production. Although this adds to the sources of protein in our food supply, it
also adds an excessive amount of methane gas from the abundance of cattle waste in
the newly deforested farmland.
In relation to food sources, bacteria tend to attract fear and attention from the
media, which in turn brings awareness and aids in protecting the public. However,
there are other threats to crops that are equal to bacterial threats, such as
fungal diseases. Unfortunately, plant fungi do not attract the same enthusiasm
when it comes to protection against these less conspicuous pathogens. Disparities in
research funding, as well as a lack of adequate replicate specimens for diagnosing
fungal pathogens, have created a dilemma for future agricultural biosecurity measures
that could aid in protecting against the potential spread of foreign fungal diseases to
naïve environments [10]. The ability to appropriately identify fungal diseases in
imported crops is also an issue due to the lack of systematic identification commu-
nicated internationally, all aspects of which are time sensitive if the spread of such
disease is to be contained to the host country in which the plant originated
[10]. Fungal infections can be quite complex, as some species are capable of latency
and may go undetected. This is especially concerning
when considering contaminated imported crops, as they may infect the land surround-
ing their transplantation to a new environment and destroy a farmer's crop yield.
Unfortunately, just as seen with the bacterium which causes citrus greening, fungal
diseases may go undetected for several years, making protection against these
agricultural biosecurity threats quite difficult.
Fungal disease is not only a global threat to food security; it has also been shown to
have detrimental effects on wildlife, and this could impact the future health of the
planet. Fungal diseases have been on the rise over the past decade, to a point where
we may soon encounter extinction events in some of the affected species. Since the
1990s there have been population declines noted in both bats and amphib-
ians due to fungal infections from Geomyces destructans and Batrachochytrium
dendrobatidis respectively [17]. The global impact of such infections may not be
apparent at this stage, but it is conceivable that a small nudge away from homeostatic
equilibrium could cause further implications downstream in the ecosystem.
Another important player that impacts planetary health through the spread of fungal
disease is Phytophthora spp., which causes destruction of flora from small crops, to
fruit trees, to hardwood forests [20]. Potato blight and the famine in mid-1800s
Ireland were due to Phytophthora infestans; furthermore, several species are known to
infect plants such as soybean, coconut, and even the mighty oak, causing
death and deforestation [1, 20]. When fungus destroys forests it can lead
to loss of habitat for wildlife and potentially fuel global warming by allowing an
increase in greenhouse gases in areas that are left barren by tree death. Because
fungi are inherently hardy, present in the environment, and capable of surviving at
length without a host organism, they present a complicated challenge to agricultural
biosecurity.
Genetically modified crops are a growing business and a new concern to
biosecurity, especially in the United States. The purpose of genetic alteration in
agriculture was to design sustainable crops that had heightened resistance to pests
which would have otherwise debilitated a non-modified plant. Although this appears
to be done for a great cause, there is still debate surrounding biosafety and human
health related to consuming a genetically modified food source. Biosecurity impli-
cations are mainly with respect to lack of uniformity in the regulation of the
genetically modified crop process and production, as well as the subsequent distri-
bution of goods to the consumer [21]. Further, environmental impacts are also of
biosecurity concern, especially upon importing genetically modified crops. With any
new science technology, unknown consequences can arise as previously noted in
genetically altered rice crops that led to unintentional herbicide resistant weeds
which impacted the environment [21]. Though novel in the fight to maintain
sustainable food sources for an ever-growing population in need, genetically mod-
ified crops require unified collaboration regarding regulatory processes that ensure
human and environmental health. The world has advanced to a place where genetic
splicing with altered gene expression and cloning may create a pathogenic plant
intended to destroy crop yield, and there must be adequate biosecurity strategies in
place to protect against this type of biowarfare application.

2.2 Natural Disasters and Threats to Biosecurity

Natural disasters have presented further concern to biosecurity and safety across the
globe. In the United States alone we have seen growing concerns with climate
change and the effects this may have on future natural disasters that will impact
the ability to prepare and protect the population. Recently, several regions in the
United States have experienced devastating natural events that will undoubtedly
affect future agricultural production. Although this may seem like a broken record, global warming trends are heavily influenced by deforestation and the release of greenhouse gases, both of which can be outcomes of natural disasters. Unfortunately, recent natural disasters have left barren land, from the devastating wildfires in California to the hurricane-force winds
ripping hardwood tree plantations to the ground in Florida. These types of natural
disasters bring with them ecological changes that could take centuries to repair,
leading to human displacement and widespread food shortages. Along with population displacement following disaster events comes the problem of providing safe environments with adequate supplies of potable water, clean sanitation, and protection from the elements. In many instances this requires military assistance or martial law to maintain order and protect those affected. Biosecurity preparedness is difficult to achieve with regard to natural disasters. No matter how prepared a community is, those affected will experience varying degrees of damage that must be
assessed to eliminate further threats to population safety. Flood waters can contaminate ground water, leading to increased threats of illness in the affected population. Crop damage from fires and floods can cause sustainable food source
shortages that will impact both the local residents and communities outside of the
affected region that rely on the import of those crops as a primary food source.
Although these may seem like short-term problems, recovery may be long, with impacts spanning several years.
Environmental safety concerns arise when dealing with natural disasters. For
example, the massive fires in California have decimated the land and have also degraded ambient air quality. Although the disaster region may not currently be habitable, the surrounding areas are still home to residents who are subject to smoke and soot that can have severe health implications. At minimum, the particulate
matter contained in the smoke can cause upper respiratory irritation at onset, but
may also lead to lung damage with prolonged exposure [46]. Natural disasters also
create an environment vulnerable to bioterrorism. Although not thought to be as worrisome in developed countries such as the United States, there is certainly concern that when disasters occur an enemy may use this vulnerability in affected populations to their advantage, especially in less-developed regions of the globe.
Infrastructures in poorer countries are already weak and therefore may present an
opportunity for biological threats during times of despair. Even in the United States
where emergency management and strong infrastructure exists, there is still concern
that an intrusive media presence may create a window of opportunity for civil
disorder that can lead to continued biosecurity challenges during natural disaster
recovery [6].

2.3 Food and Water Biosecurity Threats

Food and water biosecurity is incredibly complex to address due to industry, travel,
import/export issues, and whether an origin of contamination is from a natural
biological source, in a synthetic form, or of chemical nature. This is further
compounded by the potential for deliberate inoculation of a pathogen into the food
or water supply as an attempt of biowarfare. Generally, most food source contam-
inants are accidental in nature, as seen with recent outbreaks of E. coli in leafy
greens. Most foodborne illness can be attributed to two primary bacterial organisms:
E. coli and several Salmonella spp. that have been diagnosed in food animal sources,
mainly poultry [47]. Foodborne outbreaks can cause public discord and instill fear, which can ultimately reduce the desire to purchase certain foods long after the issue has been resolved. Food contaminants and agroterrorism are becoming increasingly important topics in biosecurity, as many biological agents are derived from nature and a skilled scientist with a little ingenuity could develop a pathogen deliverable into the food supply [28].
Dealing with threats to the water supply can also be difficult, as contaminants can
stem from many sources. As documented in the accounts of John Snow in the 1800s, cholera was one of the most devastating pandemics of its time and was caused by the bacterium Vibrio cholerae contaminating the water supply [14]. Unfortunately, cholera is not just a disease of the past, as it continues to affect less-developed countries each year, and outbreaks tend to occur during times of distress, including natural disasters [46]. Developed countries also have water
biosecurity issues that surround maintaining a safe potable water supply. These
include natural source contamination, such as raw sewage leaching into the water supply due to flooding during intense weather events. Chemical dumping has also
posed a threat to the water supply, causing chronic illness and diseases in the area
surrounding the illegal dump sites as noted from previous events such as the “Love
Canal” that predate the EPA’s stringent regulations [3]. Early cattle ranchers were also a part of this problem, as they would dig trenches to drench their cows with pesticides, which led to high concentrations of arsenic in the soil and in the aquifer [33]. The implications of this cattle treatment still present issues with potable water in some regions, and water testing is now a customary procedure when purchasing a home in an area where a private well is the major water source.

2.4 Animal Biosecurity Threats

The food animal industry presents many challenges for agricultural biosecurity.
Zoonotic diseases may be encountered through exposure to wildlife, particularly in
regions of vast grazing land as much of this land can border national forests and
wildlife preserves. Disease transmission in agricultural animal farms may occur in
these settings via wildlife crossover and migration through developed pasture land.
Although many pathogens that may infect wildlife and agricultural livestock are not currently considered transmissible to humans, there are several that are cause for concern, such as the Brucella spp. that can infect cattle, sheep, goats, and pigs; rabies virus; Hendra virus; Nipah virus; Marburg; Ebola; and Aphthae epizooticae, which causes Foot and Mouth Disease (FMD) in cattle [29, 40]. Although FMD is not thought to impact human health directly, there have been serious impacts on the food industry from this disease.
Previous research regarding zoonotic disease incidence has documented that
approximately 70–75% of emergent infectious diseases can be attributed to both
domestic animals and wildlife, which validates the need for biosecurity measures in
food animal production [12, 41]. Further, the costs incurred containing and controlling the spread of disease during an outbreak on food animal farms can reach into the billions. This was seen in the early 2000s in the United Kingdom, wherein Foot and Mouth Disease caused economic strain of epic proportions with the mass slaughter of millions of animals in an attempt to contain the problem [30]. Unfortunately, when FMD is suspected and subsequently diagnosed, farm activities are shut down and meat and dairy production can no longer continue [29]. The damage in this situation extends well beyond the loss of meat and dairy production within the given region, as the effects can be seen within the local communities that rely on these staples, as well as in the financial earnings generally received from export of these goods to other areas [50].
2009 was also a landmark year for animal-related emerging infectious disease with the outbreak of swine flu H1N1, which provided an example of human-to-swine-to-human transmission; swine influenza A viruses were later sequenced and determined to possess human origins [35]. This compatibility of infectious disease transmission between humans and swine is not a difficult concept to grasp; simply consider the use of pig valves in human cardiac procedures, which have the ability to function sustainably for upwards of 20 years. Of course, the major biosecurity concern with
swine and the influenza A virus is that influenza A has the capacity to frequently
recombine and potentially develop into a virus with greater virulence than previously
encountered, leaving the population vulnerable to a future pandemic that could span
the globe [35]. Zoonotic viruses may appear to be under control in some regions of
the world though they may reemerge due to migration of wildlife, or through
urbanization and international travel of humans, thus making biosecurity measures
regarding zoonotic pathogens of even greater importance [40].
The events surrounding FMD and swine flu illustrate the need for interdisci-
plinary approaches to agricultural biosecurity measures, employing all stake-
holders in the process of protecting against zoonotic diseases. There is a growing
need to involve effective collaboration between veterinarians, the CDC, USDA,
and agricultural stakeholders at the national, state, and community levels upon
initial diagnosis of an emerging infectious disease in order to halt subsequent
exposure across a given region and control the potential dissemination of disease.
Unfortunately, there is a disconcerting lack of concern, or possibly a lack of knowledge, regarding biosecurity among rural veterinary professionals. Previous research has illustrated that approximately 10% of veterinarians that handle cattle for food or dairy production did not consider biosecurity a priority, and only 30% of those surveyed claimed that they always wore disposable gloves during examinations at such cattle farms [41]. This raises the question for agricultural biosecurity of whether members of the veterinary community may themselves act as fomites, spreading emerging infectious diseases from the point source of onset. Another concern regarding agricultural biosecurity is that certain zoonotic diseases go unreported when illness occurs at small hobby farms where there is a gross lack of veterinary care, leading to simple burial of the deceased animal with no diagnosis or report of its occurrence.
Prions are another concern for biosecurity and human health as they relate to meat consumables. As noted in Europe in the late twentieth century, Bovine Spongiform Encephalopathy (BSE) triggered catastrophic damage to human and economic health, with more than 30,000 farms implicated and several human deaths due to variant Creutzfeldt-Jakob disease caused by a prion [42]. Though great effort took place to control the spread of disease in a timely fashion, the economic losses and the public’s wariness of the beef industry have spanned many years. “Mad Cow Disease” was simply a sad accident that might have been avoided had security measures been adequate to provide quality control preventing the contamination of cattle feed stocks with infected materials [34]. Prions remain poorly understood, which makes them unlikely suspects as a tool for bioterrorism. However, prions have the ability to manipulate the mammalian system, causing severe illness and death; as such, they should not be taken lightly when developing biosecurity strategies to prevent such an invasion.

2.5 Pandemics

Pandemics have plagued our world for thousands of years, possibly as long as
humans have walked upon the earth. Unfortunately, we have only been privy to the
definitive causes of such illnesses since the advent of microbial sciences and the
development of crude microscopic techniques. Outbreaks continue to pose a threat
to populations across the globe. Although there are numerous diseases on the radar as potential threats to biosecurity, with regard to pandemics throughout history there are four that seem to stand out: plague, influenza, bacterial meningitis, and cholera.
Plague has quite an interesting back story in that it was previously thought to have been a disease purely of filth and squalor due to its presumed mode of transportation, the rat. However, through early research it was discovered that the disease was in fact caused by the bacterium Yersinia pestis, which was transmitted to humans by way of an infected flea that also happened to feed on rats [14]. Plague outbreaks were responsible for approximately 75 million deaths in affected regions of the world during the 1300s [34]. Even with the invention of antibacterials and antibiotics, plague is still of concern to biosecurity as a potential biowarfare agent. This is further compounded by the limitations of current antibiotics, as antibiotic resistance has become a growing issue and there is a sheer lack of new antibiotics to replace them.
Influenza is an extremely old but strong player in today’s plethora of infectious diseases, with its first noted appearance dating back to the time of Hippocrates around 400 B.C.E., and it continues to cause severe outbreaks every few decades [14, 34]. There is some cause for concern that influenza could be used as a bioweapon, which creates a need for biosecurity measures to protect the population from such actions. Influenza has recently been genetically modified for experimental purposes that have been scrutinized for potential misuse, which has
created major concerns for threats of future pandemics from mutant forms of the
virus [15]. This not only puts an emphasis on biosafety concerns, but it also raises deeper questions regarding the potential difficulties surrounding creation of a vaccine against a mutant influenza strain.
Bacterial meningitis was first recorded several centuries ago and is caused by numerous bacteria, notably E. coli, streptococcal bacteria, Listeria, and Haemophilus influenzae type b, which infect the meninges surrounding the brain and spinal cord [34]. Just as noted with influenza, outbreaks occur fairly often in certain parts of the world. This is also a concern regarding antibiotic resistance and shows the need for novel pharmaceuticals to enter the drug discovery pipeline in order to combat these threats, both in the natural state of the disease process and in the event of misuse as a bioagent. Although these bacterial infections may not seem the deadliest killers to employ in bioterrorism, these bacteria may provide a likely ally for acts of war by individuals seeking to cripple health infrastructures in regions that may already be experiencing hardships of war.
Cholera, caused by Vibrio cholerae, is yet another frightening disease that has instigated many pandemics, causing death shortly after infection, and has been postulated to be an old disease dating back to ancient Sanskrit writings [14]. Since this bacterium is known to be transmitted to humans through contaminated food and/or water sources, it has been highly scrutinized as a possible device of bioterrorism, and outbreaks have the potential to occur in more ways than one, especially during times of war when an affected region’s infrastructure is disrupted [34]. We have already seen instances of bacterial bioterrorism with the use of anthrax, Bacillus anthracis, which was intentionally modified to increase toxic delivery for use as a bioweapon [34]. The implications of events such as this do not stop merely with the individuals that fall victim, but extend to the broader population, instilling terror that may take years for resilience to be restored.

2.6 Research and Synthetic Biosecurity Threats

In research, scientists are often faced with the dilemma of reporting novel findings that benefit humankind, as their research may also constitute a threat to our security if placed in the wrong hands. Although not as pronounced a presence in the realm of biosecurity, research biosecurity is necessary to thwart the potential threat that biological research may be utilized by outside parties in a destructive
fashion. Although a medical study may initially be designed to find the chink in a
virus’s armor in order to develop a novel vaccine to protect the population at large,
this research may also create a breach in biosecurity that would allow the use of
this information to create a genetically altered and possibly deadly super-pathogen.
Recent influenza research was a prime example of such a risk to biosecurity; even though the scientists affiliated with this novel research were applying their studies for the good of humankind, the publishing of their methodology was not permitted due to the potential ramifications of the research being replicated for the purposes
of creating a bioweapon [48]. In this example, the U.S. Department of Health and Human Services (DHHS), the U.S. National Institutes of Health (NIH), and the National Science Advisory Board for Biosecurity (NSABB) intervened in the publication of two research projects related to H5N1, and the authors had to agree to omit a substantial portion of relevant methodology that could be used to recreate the virus for malicious intent [48]. One can readily imagine the implications of the dispersal of such a super virus, in which mortality rates could skyrocket with no available countermeasures to handle a genetically altered pathogen.
As science, technology, and research continue to prosper, so does the potential for misuse or negligent use of select agents. There are currently select agent regulatory standards among several government organizations wherein guidelines for safe handling have been established. However, over the years changes have presented challenges in updating and amending these documents so as to prevent security breaches and/or harm. In order to promote adequate biosafety measures with regard to
select agent use, the National Science Advisory Board for Biosecurity and the
National Academies of Science and Trans Federal Taskforce have been active
proponents in ensuring that continued improvements are made to protect against
select agent misuse [26]. Although the global agenda for infectious disease research
is to promote collaborations across boundaries which will assist in the elimination of
disease-causing pathogens, there may be cultural barriers that could present issues
with properly securing select agent use throughout the world.
It is becoming increasingly important, as we train future researchers, that we maintain an ethical responsibility to incorporate biosecurity principles in research practice and academia [48]. Biosecurity has not yet made a substantial impact on the higher echelons of learning. It has been noted that there are few available courses and
inadequacies regarding the topic of biosecurity in many countries where research
academics are in high demand, including Australia, Japan, and even the United
States [31]. Although research biosecurity is only a small piece of the biosecurity
puzzle, gaining knowledge and understanding of biorisks is necessary to educate
student researchers of potential emerging threats, whether by accidental exposure or
by intentional misuse of research materials and methodology.

3 Biosecurity Surveillance, Policy, and Practice

3.1 Biosecurity Surveillance

Many countries rely heavily on the importation of goods for sustainable food sources
including items for production purposes. Biosecurity applications at the point of
entry can prevent subsequent dissemination of a pathogen through stringent border
control measures. There are many ways in which an infectious agent can travel into a naïve region that make biosecurity measures difficult, such as within goods for resale, on food source plants and in the soil or root beds, via infected wildlife, as well
as with the human traveler. However, there are checks and balances at work in most
developed countries that aid in the prevention of infectious disease transmission. In
the United States, the United States Department of Agriculture (USDA) plays a
dominant role in ensuring that food animals and agricultural crops are vetted
properly prior to transport, even within the country itself when crossing state lines.
Livestock that is to be sold for food consumption must be appropriately identified
through the USDA tag system, which makes it possible to trace the origin of the food animal in the event that a disease-causing pathogen is detected along the
path to meat production. Further, the USDA provides seed certification guidelines that allow a grower to know that the seeds he/she will cultivate are confirmed genetically pure and meet quality standards [49]. Since the events of September 11th, there has been an elevated effort to secure imported goods in order to protect public safety, and though it is uncertain whether the benefits have been worth the cost, millions of dollars have been applied to the biosecurity effort [45].
The recent advent of biosensor surveillance technologies has added to biosecurity strategies that could aid in preventing the unintentional importation of diseases. Biosensors have added a biological element to traditional chemical indicators, allowing quick, accessible surveillance of pathogens and acting as a gatekeeper for imported products [38]. An interesting benefit of biosensors is their ability to be multiplexed in platforms that could allow them to test for multiple pathogens at the same time. As advancements in an industrialized world continue, developing biosecurity screening techniques that are effective at diagnosing a potential threat to food safety and human health will be more important than in previous years. International travel and the need for continued export and import of goods to varying regions of the world will only exacerbate the need to employ sensor-based technologies capable of detecting infectious contaminants in a timely fashion, with accuracy and sensitivity to select pathogens known to have deadly consequences for the population. When addressing food security, it has been noted that some antibiotic-resistant strains of bacteria have been detected in raw meat products intended for consumption. These are of health concern both because infections from these types of bacteria can cause annual deaths in the hundreds in some cases, and because, with antibiotic resistance on the rise, they can cripple health practitioners’ ability to treat these infections, making it more important to have detection technologies available to keep such pathogens from entering the consumables market [47]. Food safety is not limited to infectious agents
in need of detection, as many allergens may also be present in food supplies, which is cause for concern to public health, especially in a world where international trade in commodities is commonplace. Food production processes often rely on multi-use food packaging plants, which may introduce contamination with various allergenic compounds that can create unintentional allergic reactions, such as those found in wheat, nuts, shellfish, milk products, and eggs [8]. Immunological biosensors have
already been developed to determine the presence of some allergenic compounds,
such as peanut proteins, but there is still growing concern that continued research is
necessary to apply this technology to other allergens that may one day pose a threat
to biosecurity in relation to food production [8]. As less-developed nations
continue to grow and achieve industrialized status, the need for enhancements in
biosecurity strategies will increase, creating challenges that will require continued
collaborations with health agencies and biosafety stakeholders in order to prevent
future dissemination of pathogenic agents.

3.2 Health Intelligence and Biosecurity Regulatory Agencies and Partnerships

There are several regulatory bodies, both national and international, that work diligently to protect against threats to biosafety and security, with an extended presence in the United States following the events of September 11th, 2001 and the anthrax incidents that followed. The United States Department of Defense is one agency which provides
an infrastructure of support mechanisms for potential biological events, with path-
ogen identification services, hospital provisions, need-based application of mass
treatments, and security support [6]. The U.S. Department of Health and Human
Services (DHHS) also has authority to create countermeasures against various
attacks to human health both in the biological sense and regarding those of chemical
and/or nuclear intent, wherein they are a supporting body for procurement of funds
to appropriate external medical entities for delivery to the population that is in need,
as well as funding for development of research strategies that evaluate potential
threats [19]. Public health research is not without its biosecurity concerns: the U.S. National Institutes of Health (NIH), through its biosecurity policy, draws on the support of the National Science Advisory Board for Biosecurity (NSABB) regarding dual use research issues, such as the misuse of pathogens to intentionally cause harm, and seeks to promote quality research that benefits public health while maintaining security of information, products, and technology that could be used with malicious intent [37].
biosafety procedures for all research personnel, as a non-intentional or accidental
exposure can be just as detrimental to human health as those that have been
developed as an act of terrorism.
As a constituent of DHHS, the Centers for Disease Control and Prevention (CDC)
aims at global health security through early detection, strategic prevention, and
responsiveness at onset with regards to environmental and human health [9]. The
CDC provides support to the research community, as well as the protection of public
health, regarding select agents through its select agent registry which aims at
preventing misuse or accidental exposure to pathogens [26]. The CDC also wel-
comes collaborative relationships with a variety of global associations working
towards improving the sharing of information, which can assist in early detection
and response to new or reemerging infectious threats [9]. The World Health Orga-
nization (WHO) seeks to provide an infrastructure for international health with the
goal of providing accessibility to health for all humankind. Just as the CDC looks at
disease detection and prevention strategies, the WHO has a focus on health risk
assessments and strategies for treatment and prevention of both communicable and
non-communicable diseases [51]. The planet is interconnected as one whole unit of life, and it is important to understand that borders do not prevent diseases from crossing such manmade boundaries. The Global Health Security Agenda (GHSA) stresses the interconnectedness of the world as a complete unit that requires the global participation of all public health stakeholders in order to prevent, protect against, and respond to biological threats [22].
As biosecurity is a fairly recent concept, there is still much to learn from history to protect against future threats to our planet’s welfare. Although we may continue to face challenges regarding globalization, resource depletion, and increased trade of consumable goods, the Food and Agriculture Organization of the United Nations (FAO) has developed a toolkit for addressing biosecurity in a growing world [18].
FAO aims to encourage shared approaches to biosecurity measures, in which
regions would move away from current exclusive practices to unified collaborations
that create integrative preparedness and response to infectious pathogens and envi-
ronmental challenges that may affect future populations [18]. Global biosecurity
requires a concerted effort from all stakeholders regardless of current status of
disease presence in a given region of the world, as international travel and trade
export will continue to present challenges in changing landscapes. Furthermore, climate factors will create environments that are conducive to the emergence of new diseases and the reemergence of previously known threats, as warming trends will allow diseases to spread across borders into naïve populations. Therefore, it has
become increasingly important that proponents of biosecurity and safety be engaged
in unified regulatory policies with continued reevaluation of standards so that
amendments can precede a potential threat.

3.3 Biosecurity Policies and Ethical Concerns

Although there are many legal instruments involved in managing potential biosecurity risks to human health, it is important to note that governing bodies
utilize a set of standards in the approach and implementation of biosecurity strategies
as noted through the work performed by Codex Alimentarius Commission (CAC),
the World Organization for Animal Health (OIE), and the Commission on
Phytosanitary Measures (CPM), which ultimately work towards protection of food
and animal biosafety that aid in protecting against spread of infectious agents and
potential pathogens present in international trade of goods [18, 23]. Biosecurity is an
integrated effort that requires shared principles and dissemination of information
across international boundaries in order to provide the greatest protection against
infectious agents. As previously mentioned, research guidelines and policies for
handling select agents, which are regulated by the CDC, are further supported by
NIH and the NSABB in order to control the potential misuse of infectious agents.
Research has taken the use of pathogens in public health examination to new heights
on the journey towards disease elimination strategies. This expansion in
experimentation with pathogens and toxins opens the door to accidental exposures, maleficent activities, and potential exposure to hazardous conditions which could
increase biosecurity threats. Although there are several policy guidelines that address
the needs of the biosafety realm, there are still some challenges surrounding a unified
system for international protection across all sectors of biosecurity, including but not limited to food safety, plant and animal health, genetically modified substances, and
environmental and planetary health [18].
The United States Department of Health and Human Services has notable biosafety policies, such as federal laws and protection acts, presidential directives, executive orders, and international treaties and resolutions, which have been implemented as protective measures against biosecurity risks [13]. Initially, many
of these executive orders and presidential directives were in response to fears of
nuclear warfare rather than potential use of pathogens as a bioweapon. However, biosecurity has grown to include a multitude of hazards to human health, including environmental causes, natural disasters, select agents, radiation, and genetically modified sources of infectious pathogens or toxins. This perplexing array of potential biosecurity risks underscores the need for biosafety and biodefense strategies that can continually be modified to fit the standards of an ever-changing world, especially as less-developed countries pave their path to an urbanized and industrialized
status.
When analyzing U.S. presidential directives related to biodefense strategies, one
such directive that sets the platform for discussion of biological threats is the
Homeland Biodefense Science and Technology Capability Review, which is han-
dled in collaboration with the National Science and Technology Council (NSTC)
and evaluates the environment and its sustainable resources, homeland security
issues, and areas of science, technology, engineering and mathematics (STEM)
[4]. Within this forum is the Biological Defense Research and Development Sub-
committee (BDRD), which assists in a collaborative effort regarding topics of
research, the investigation of threats to biosafety that can impact the environment
and its constituents, as well as known and/or emerging pathogens that will further aid
in addressing all aspects of biosecurity preparedness, response and recovery [4]. The
aim of directives that address biosecurity and biodefense is to gain a better
understanding of potential dangers in order to proactively design strategies that
can be either protective in nature or activated in response to an outbreak or biological
attack. These processes align the collaboration and activities of stakeholders to
ensure that appropriate infrastructure is in place and capable of meeting the needs
of the population during a threat of biological or environmental nature, as well as
potential acts of war.
When considering biosecurity ethical and privacy concerns, most issues involve dual use research, especially with respect to research on pathogenic agents that may be unintentionally or deliberately introduced into the population, putting humans at risk. Because research science is an inter-
disciplinary activity that encompasses a variety of nationalities from differing
cultural backgrounds, it is vulnerable to misconduct and misuse. Therefore it has
become increasingly important to have biosecurity ethical guidelines which can
protect the population and the environment from undue harm. With the continued
advancement of science technologies, ethical and privacy concerns will create
challenges for the health and research community. There have been documented incidents of ethical disregard in research, with ethical breaches and misconduct ranging from fraudulent behavior to loss of pathogenic specimens, which have sparked debate over whether today’s scientists are receiving adequate training in research integrity and biosecurity ethics [39].
One area that is fairly new to the biosciences, and of concern for ethical biosecurity, is synthetic biology, which may require even more elaborate ethical standards than traditional biological research due to the unknown effects that synthetic modifications may have on the future of health and the environment [25]. Public awareness
may create further challenges regarding ethical concerns for the use of synthetic biology, as it is still considered an up-and-coming field of research and its application has not reached far beyond the proponents of research and science technologies [11]. As public inquiry and perception ensue, questions regarding the implications of syn-
thetic biology and its utilization will create ethical challenges that may not currently
be addressed through traditional biological research and biosafety measures, espe-
cially with regards to synthetic modifications of pathogenic agents.
Privacy regarding biosecurity and research is also increasingly challenging given the continued advancement of science technologies and the general transparency that normally accompanies research and publication. This transparency to allow shared
information has created concern that maleficent individuals may exploit certain
research for harm rather than for the good in which it was intended. A recent research
study regarding Influenza illustrated the privacy debate as to whether or not certain
research should be published if it may adversely impact biosecurity due to misap-
propriation of the disseminated information [24]. Controversy surrounds the research itself, as the creation of a mutated H5N1, which in its natural state does not currently infect humans, may increase transmissibility to humans, thereby creating a potential influenza pandemic and an imminent threat to global biosecurity [16].
It is not difficult to understand the complexity in deciding whether to publish this
type of study or keep it private, as publishing may provide capabilities to replicate a
pathogen for misuse. Although research of this nature is warranted in the pursuit of
evaluating worst case scenario situations and future design of protective measures
with respect to highly pathogenic substances, the question remains as to whether the
resulting methodology should be open access.
The right to access controversial research creates further challenges for
biosecurity strategies that may require concerted collaboration on the possible
design of a high classification resource database system for approved researcher
accessibility. A limited access research publication database could provide an
outlet for dissemination of research that has the potential for dual use implications
and can be designed with similar standards to select agent policies which require
vetting of all approved research personnel, whereby only approved research and
health professionals would gain access. One might suggest that any research which
is meant to benefit humankind and/or the environment should be published and
accessible for the continued investigation to improve global health, and policies
that prohibit access to such information would be a disservice to the health initiatives designed to protect against disease. Further, unpublished studies will appear as though the work were never accomplished, as the benefits of that labor will not be shared across all disciplines working to improve health. To add to the challenge, there is a public divide and a pressing need to bridge the gap between the interests of research and protecting the public, as the biosecurity stakeholders include researchers, governmental agencies, health advocates, and the population at large, and everyone involved may have differing agendas or thoughts on how biosecurity relates to their needs [27].

4 Biosecurity Disaster Management

4.1 Preparedness

Current biosecurity preparedness strategies include advancements in research of biological agents for biorisk assessment through funding opportunities, increased
security regarding agricultural food resource contamination with the development of
the National Bio and Agro-Defense facility, surplus accommodations in the national
stockpile to aid in treatment of potential outbreaks, funding appropriations for the
improvement of health capabilities at the state and local levels, as well as global
interactions for the purpose of protecting against biological threats [4]. On the
international front, the Biological and Toxin Weapons Convention (BTWC) aids in stakeholder compliance with preventing the use of biological pathogens, toxins, or weapons of mass destruction, which acts as a biosecurity
measure to ensure the health and safety of the world’s population [5].
On the local level, first responders such as law enforcement agencies should have
preparedness plans in place and a general awareness regarding hazards and pro-
cedures for biological threats, as these individuals may ultimately be the ground zero
personnel that must secure an area and ensure public health safety while disseminating
information regarding the threat to the proper channels in order to provide the utmost
protection to the population [7]. Research is also an outlet for biosecurity prepared-
ness strategies, as studies will continue to evaluate pathogens to establish new
methods of protection and/or preparedness, as noted through proactive vaccinations
that prevent the spread of disease to a naïve region from endemic regions by way of
international travel. Biosecurity preparedness is not without limitations, as it is
presumably difficult to ascertain when and where a biological threat might occur.
There is no amount of preparedness that can completely eliminate all risks to human
health. Therefore, it is important to understand that biosecurity incorporates several aspects of disaster management beyond preparedness strategies: protection plans, mitigation procedures, response, recovery, and resilience. The National Health Security Strategy and Implementation Plan helps address these aspects of biosecurity emergencies, with support ranging from documented incidents to resilience strategies that create an engaged and informed public [36].

4.2 Protection

Protection agreements such as those established through the Biological and Toxin
Weapons Convention (BTWC) provide biosafety measures to protect against acts
of bioterrorism across boundaries wherein all active players are responsible for
protecting health by preventing the misuse of biological agents [5]. Surveillance
and detection strategies are also an integral part of biosecurity protection and rely
on collaborative efforts of the DHHS, DHS, USDA and many other organizations
at both the federal and local levels [4]. This illustrates the importance of continued
research in the application of biosensors to allow for implementation of surveil-
lance in imported agricultural products, as well as other areas of biosecurity
concern [38]. Of course, there are also concerns about the development of aerosolized biological weapons, wherein the application of a biosensor for indoor air quality could protect against dissemination throughout a structure by utilizing a controlled damper that closes upon detection of an aerosolized pathogen, for example using biosensors similar to those designed to detect nerve agents [2].
Ultimately, protection strategies rely on the availability of research regarding the threat, the effective communication of this information to stakeholders, and the expediency of those responding, so that protection can be introduced to the public as quickly as possible.
Biosafety plans establish a format for protection against the spread of diseases
through a system of checks and balances in which individuals are educated on the
risks of hazards in their immediate environment. Current plans to protect against
accidental or natural incidents should include multidisciplinary approaches in coop-
eration of all stakeholders and global partnerships, especially regarding the protec-
tion of fragile environments that may lack the appropriate infrastructure to
effectively combat a biosecurity threat [44]. Education is a key protective strategy
that can promote biosafety during the preparation process for natural disasters, such as hurricane readiness. Ethics education also has a protective value
for the research community with regards to dual use concerns, wherein cooperation
with standards of ethical research can protect against misuse or accidental exposure
to a biological agent [31].
When evaluating agricultural biosecurity, risk assessment and management
through quarantine procedures can limit the potential spread of plant pathogens
that could affect crop yields downstream after importation, whereby advanced
diagnostics and knowledgeable action can prevent the spread [50]. Novel sciences have the ability to assist in biosecurity protection in agriculture, such as genetic research, nanotechnologies, and biosensors, which could make screening imported agriculture for plant pathogens an easier undertaking [50]. Although
there are several protective measures throughout biosecurity disaster management, there are also limitations on protective strategies with regard to biological threats, one in particular being the lack of detection capabilities for certain highly pathogenic agents [36]. Further, with regard to highly pathogenic agents, there is a lack of ethical education in chiefly science-based curricula that needs to be
addressed in order to discuss the implications of dual use research [32]. There is also
a need to adapt protective strategies and education to the diverse needs of an
international population with a focus on strengthening global partnerships and
collaborations [48].

4.3 Mitigation

Regulatory frameworks provide biosecurity measures that reduce losses in the event of a biosecurity breach by way of quarantine or management and control procedures that limit exposure to surrounding areas. We have seen this in aspects of agricultural farming where entire crops had to be destroyed to prevent the spread of plant pathogens, as well as in food animal production, such as the destruction of thousands of cattle that had become infected with the prion that causes bovine spongiform encephalopathy
[17, 42]. As you might imagine, mitigation has expensive consequences, which illustrates why more funding needs to be allocated to the preparation and protection realm of biosecurity management so that adjustments can be made to potentially eliminate the expenditures for such losses. Unfortunately, economic damages are not the only losses that need mitigating in instances of agricultural and food animal biosecurity; there are also varying degrees of societal impacts that cannot be addressed in a monetary fashion, especially if these types of diseases were to deplete sustainable crop yield to a point that causes starvation in the affected population
[17]. Biological risks associated with natural disasters can also create economic and
societal issues, both from the standpoint of provisions for safe drinking water and from the health risks to affected communities that lack the appropriate water and
sanitation that are required to promote a hygienic environment. Of course, there are limitations with current biosecurity mitigation strategies, such as differing agendas of stakeholders, lack of global partnerships in some regions of the world that may encounter an event with a pathogenic agent, lack of a unified mitigation plan that speaks to all countries, and unknown emerging threats that have yet to present themselves, such as unforeseen implications of genetically modified pathogens.

4.4 Response

One important factor, possibly the most crucial in biosecurity response, is the ability
to diagnose the threat quickly and contain the area of exposure to prevent dissem-
ination of the threat, and this demands that first responders are knowledgeable in
handling threats of a potentially pathogenic source [7]. Biosecurity response requires
a collaborative and effective method of assessing the threat and adequate reactive-
ness to contain it, which further necessitates a joint effort between the emergency
management systems, the healthcare system, and the public health sector [36].
Communication is also an extremely important factor in the dissemination of
information during biosecurity response, not just between all responding organiza-
tions but also to the general public that may be in need. Response requires a triage of
the affected environment in order to determine emergent care needed and further
evaluation expectations regarding asymptomatic exposures, as well as determining
decontamination strategies that must be employed to best protect against further
spread to surrounding communities [6]. Although many biosecurity issues require
response from a seemingly bottom up perspective starting at the local level where the
initial noted case occurred, the potential for spread of disease is not necessarily
isolated to that area due to international trade and travel which may put the point
source of the pathogen in an entirely different region of the globe. These instances
may create limitations that hinder the ability to meet human needs after an incident as
health advocates attempt to locate the source of infection. Unfortunately there are
many vulnerabilities to biosecurity risks, as the human condition continues to evolve
with advancements in science and technologies more pressure will be placed on our
ability to effectively respond during an outbreak, natural disaster, or act of biowar-
fare. Continued research and global partnerships may provide the mechanism for
strengthening biosecurity response in the future.

4.5 Recovery

Factors involved in accommodating recovery require community involvement and outreach. The public’s needs must be met in a timely fashion with primary goals
focused on health and safety including medical assessments and treatment, shelter
accommodations if the event was caused by some form of natural disaster, and
provisions of food and clean potable water [6]. Recovery also extends well beyond
the immediate exposure, as seen with agricultural damages wherein crop yields are
destroyed. The citrus industry has had several impacts that required long-term recovery efforts, such as the medfly management program in the Valencia industry back in the early 2000s, and the impact on Florida’s citrus industry of citrus greening, which is still destroying groves across the state [50]. With regard to
biological agents, recovery efforts will require immediate action to properly inform
and protect the population, as well as provide vitality within the affected communi-
ties so that individuals will be empowered to persevere in spite of the catastrophic
event that they are faced with.

4.6 Resilience

In the broadest terms, resilience is community motivated and requires strengthening by local infrastructure, and the factors involved in creating an environment that will build resilience in the event of a biosecurity breach include personal assets, strong self-advocacy of needs, pooled resources, and diligent participation from local health
systems and the public health sector [36]. The impacts of a biosecurity breach can create stress across the region, from the infrastructure currently in place down to the individual residing in the affected area. This makes it more advantageous to establish strength and resilience skillsets at the local level as a form of preparedness strategy, rather than trying to establish pliability in the population after a pathogenic exposure or biowarfare event occurs. If a population is ill prepared for potential biosecurity threats, the ability of the affected communities to bounce back through the recovery process will diminish.
The world will continue to grow and change in ways that we may not perceive in the current landscapes of research and biosecurity challenges. In order to protect human health, animal health, and agriculture, and to improve environmental protection on a globalized planet, a dedicated, unified approach to biosecurity is required [18]. To do
this will require continued research regarding infectious pathogens that threaten
agriculture, the environment, and human health. There is also a need for sustained
collaboration and amendments to current biosecurity directives as changes in climate
and/or environmental status warrants such adjustments in order to protect the welfare
of sustainable agriculture. Preparedness and protection strategies should also be
addressed periodically to accommodate for change as this will strengthen the
investment regarding the safety and security of the population as a whole. This
may be accomplished with the understanding that what affects one region is not limited to that area by invisible boundaries, as pathogens can extend beyond such arbitrary lines that one chooses to draw in the sand. Adequate biosecurity measures should be actionable, allowing a quick reaction in the event of an outbreak or act of biowarfare, and must employ knowledgeable actors able to disseminate information to all stakeholders. This will allow appropriate biosafety measures to be taken so that outcomes are in the best interest of protecting human health and the environment, with concern placed on making a full recovery attainable
[18]. Biosecurity is a global health concern that cannot be achieved by one single source, organization, or country, and it demands effective collaboration among all stakeholders, including those that may not have the appropriate infrastructure to support adequate protection against infectious agents.

References

1. Anderson PK, Cunningham AA, Patel NG, Morales FJ, Epstein PR, Daszak P (2004) Emerging
infectious diseases of plants: pathogen pollution, climate change and agrotechnology drivers.
Trends Ecol Evol 19(10):535–544. https://doi.org/10.1016/j.tree.2004.07.021
2. Arduini F, Scognamiglio V, Moscone D, Palleschi G (2016) Electrochemical biosensors for
chemical warfare agents. In: Nikolelis DP, Nikoleli G-P (eds) Biosensors for security and
bioterrorism applications. Springer International Publishing, Cham, pp 115–139
3. Austin AA, Fitzgerald EF, Pantea CI, Gensburg LJ, Kim NK, Stark AD, Hwang SA (2011)
Reproductive outcomes among former Love Canal residents, Niagara Falls, New York. Environ
Res 111(5):693–701. https://doi.org/10.1016/j.envres.2011.04.002
4. BDRD – Biological Defense Research and Development Subcommittee (2016) Homeland biodefense science and technology capability review. Washington, DC. Retrieved from https://obamawhitehouse.archives.gov/sites/default/files/microsites/ostp/NSTC/biodefense_st_report_final.pdf
5. Bielecka A, Mohammadi AA (2014) State-of-the-art in biosafety and biosecurity in European
countries. Arch Immunol Ther Exp 62(3):169–178. https://doi.org/10.1007/s00005-014-0290-1
6. Buck G (2002) In: Clark S (ed) Preparing for biological terrorism; an emergency services
planning guide. Delmar, Thomson Learning, Albany
7. Budowle B, Beaudry JA, Barnaby NG, Giusti AM, Bannan JD, Keim P (2007) Role of law
enforcement response and microbial forensics in investigation of bioterrorism. Croat Med J 48
(4):437–449
8. Campuzano S, Montiel VR-V, Torrente-Rodríguez RM, Reviejo ÁJ, Pingarrón JM (2016)
Electrochemical biosensors for food security: allergens and adulterants detection. In: Nikolelis
DP, Nikoleli G-P (eds) Biosensors for security and bioterrorism applications. Springer Interna-
tional Publishing, Cham, pp 287–307
9. CDC (2016) CDC strategic framework. Retrieved from https://www.cdc.gov/about/organiza
tion/strategic-framework/index.html
10. Crous PW, Groenewald JZ, Slippers B, Wingfield MJ (2016) Global food and fibre security
threatened by current inefficiencies in fungal identification. Philos Trans R Soc Lond Ser B Biol
Sci 371(1709). https://doi.org/10.1098/rstb.2016.0024
11. Dabrock P, Braun M, Ried J, Sonnewald U (2013) A primer to 'bio-objects': new challenges at
the interface of science, technology and society. Syst Synth Biol 7(1–2):1–6. https://doi.org/10.
1007/s11693-013-9104-8
12. Degeling C, Johnson J, Kerridge I, Wilson A, Ward M, Stewart C, Gilbert G (2015)
Implementing a One Health approach to emerging infectious disease: reflections on the socio-
political, ethical and legal dimensions. BMC Public Health 15:1307. https://doi.org/10.1186/
s12889-015-2617-1
13. DHHS (2018) Public health emergency. Retrieved from https://www.phe.gov/preparedness/
14. Dobson M (2013) Disease, the extraordinary stories behind history’s deadliest killers. Quercus,
London
15. Du L, Li Y, Gao J, Zhou Y, Jiang S (2012) Potential strategies and biosafety protocols used for
dual-use research on highly pathogenic influenza viruses. Rev Med Virol 22(6):412–419.
https://doi.org/10.1002/rmv.1729
16. Faden RR, Karron RA (2012) Public health and biosecurity. The obligation to prevent the next
dual-use controversy. Science 335(6070):802–804. https://doi.org/10.1126/science.1219668
17. Fisher MC, Henk DA, Briggs CJ, Brownstein JS, Madoff LC, McCraw SL, Gurr SJ (2012)
Emerging fungal threats to animal, plant and ecosystem health. Nature 484(7393):186–194.
https://doi.org/10.1038/nature10947
18. Food and Agriculture Organization of the United Nations, Biosecurity Priority Area for Interdisciplinary Action (2007) FAO biosecurity toolkit. Biosecurity Priority Area for Interdisciplinary Action, Food and Agriculture Organization of the United Nations, Rome
19. Gronvall G (2008) Biodefense countermeasures: the impact of Title IV of the US Pandemic and
All-Hazards Preparedness Act. Emerg Health Threats J 1:e3. https://doi.org/10.3134/ehtj.08.
003
20. Hyun IH, Choi W (2014) Phytophthora species, new threats to the plant health in Korea. Plant
Pathol J 30(4):331–342. https://doi.org/10.5423/ppj.Rw.07.2014.0068
21. Ishii T, Araki M (2017) A future scenario of the global regulatory landscape regarding genome-
edited crops. GM Crops Food 8(1):44–56. https://doi.org/10.1080/21645698.2016.1261787
22. Jenkins AB (2015) The global health security agenda and biosafety associations. Appl Biosaf
20(4):172–174. https://doi.org/10.1177/153567601502000401
23. Jenkins B (2017) The global health security agenda and the role of the world organisation for
animal health. Rev Sci Tech 36(2):639–645. https://doi.org/10.20506/rst.36.2.2681
24. Keim PS (2012) The NSABB recommendations: rationale, impact, and implications. MBio 3
(1). https://doi.org/10.1128/mBio.00021-12
Global Health Biosecurity in a Vulnerable World – An Evaluation of. . . 101

25. Kelle A (2009) Synthetic biology and biosecurity. From low levels of awareness to a compre-
hensive strategy. EMBO Rep 10(Suppl 1):S23–S27. https://doi.org/10.1038/embor.2009.119
26. Lewis N, Campbell MJ, Baskin CR (2015) Information security for compliance with select
agent regulations. Health Secur 13(3):207–218. https://doi.org/10.1089/hs.2014.0090
27. MacIntyre CR (2015) biopreparedness in the age of genetically engineered pathogens and open
access science: an urgent need for a paradigm shift. Mil Med 180(9):943–949. https://doi.org/
10.7205/milmed-d-14-00482
28. Martensson PA, Hedstrom L, Sundelius B, Skiby JE, Elbers A, Knutsson R (2013) Actionable
knowledge and strategic decision making for bio- and agroterrorism threats: building a collab-
orative early warning culture. Biosecur Bioterror 11(Suppl 1):S46–S54. https://doi.org/10.
1089/bsp.2013.0039
29. Michelotti JM, Yeh KB, Beckham TR, Colby MM, Dasgupta D, Zuelke KA, Olinger GG
(2018) The convergence of high-consequence livestock and human pathogen research and
development: a paradox of zoonotic disease. Trop Med Infect Dis 3(2):55. https://doi.org/10.
3390/tropicalmed3020055
30. Miller J, Burton K, Fund J, Self A (2017) Process review for development of quantitative risk
analyses for transboundary animal disease to pathogen-free territories. Biores Open Access 6
(1):133–140. https://doi.org/10.1089/biores.2016.0046
31. Minehata M (2012) ‘Getting the biosecurity architecture right’ in the Asia-Pacific region. Med
Confl Surviv 28(1):45–58. https://doi.org/10.1080/13623699.2012.658625
32. Minehata M, Sture J, Shinomiya N, Whitby S (2013) Implementing biosecurity education:
approaches, resources and programmes. Sci Eng Ethics 19(4):1473–1486. https://doi.org/10.
1007/s11948-011-9321-z
33. Missimer TM, Teaf CM, Beeson WT, Maliva RG, Woolschlager J, Covert DJ (2018) Natural
background and anthropogenic arsenic enrichment in florida soils, surface water, and ground-
water: a review with a discussion on public health risk. Int J Environ Res Public Health 15
(10):2278. https://doi.org/10.3390/ijerph15102278
34. Moore P (2007) The little book of pandemics; 50 of the world’s most virulent plagues and
infectious diseases, 1st edn. Collins, New York
35. Nelson MI, Wentworth DE, Culhane MR, Vincent AL, Viboud C, LaPointe MP et al (2014)
Introductions and evolution of human-origin seasonal influenza a viruses in multinational swine
populations. J Virol 88(17):10110–10119. https://doi.org/10.1128/jvi.01080-14
36. NHSS, N. H. S. S (2018) National health security strategy and implementation plan. Retrieved
from https://www.phe.gov/Preparedness/planning/authority/nhss/Documents/nhss-ip.pdf
37. NIH (2018) National Institutes of Health; Biosecurity Policy. Retrieved from https://osp.od.nih.
gov/biotechnology/biosecurity-policy/
38. Nikolelis D (2016) Biosensors for security and bioterrorism applications. Springer, Cham
39. Novossiolova T, Sture J (2012) Towards the responsible conduct of scientific research: is ethics
education enough? Med Confl Surviv 28(1):73–84. https://doi.org/10.1080/13623699.2012.
658627
40. Plowright RK, Peel AJ, Streicker DG, Gilbert AT, McCallum H, Wood J et al (2016)
Transmission or within-host dynamics driving pulses of zoonotic viruses in reservoir-host
populations. PLoS Negl Trop Dis 10(8):e0004796. https://doi.org/10.1371/journal.pntd.
0004796
41. Renault V, Humblet MF, Moons V, Bosquet G, Gauthier B, Cebrian LM et al (2018) Rural
veterinarian’s perception and practices in terms of biosecurity across three European countries.
Transbound Emerg Dis 65(1):e183–e193. https://doi.org/10.1111/tbed.12719
42. Requena JR, Kristensson K, Korth C, Zurzolo C, Simmons M, Aguilar-Calvo P et al (2016) The
priority position paper: protecting Europe’s food chain from prions. Prion 10(3):165–181.
https://doi.org/10.1080/19336896.2016.1175801
43. Rogers M, Stansly P, Stelinski L (2016) 2016 Florida citrus pest management guide: Ch. 9
Asian citrus psyllid and citrus leafminer. In 2016 Florida citrus pest management guide
102 K. Miley

44. Shinwari ZK, Khalil AT, Nasim A (2014) Natural or deliberate outbreak in Pakistan: how to
prevent or detect and trace its origin: biosecurity, surveillance, forensics. Arch Immunol Ther
Exp 62(4):263–275. https://doi.org/10.1007/s00005-014-0298-6
45. Sikes BA, Bufford JL, Hulme PE, Cooper JA, Johnston PR, Duncan RP (2018) Import volumes
and biosecurity interventions shape the arrival rate of fungal pathogens. PLoS Biol 16(5):
e2006025. https://doi.org/10.1371/journal.pbio.2006025
46. Skolnik R (2016) Global health 101, 3rd edn. Jones & Bartlett Learning, Burlington
47. Starodub NF, Ogorodniichuk YO, Novgorodova OO (2016) Efficiency of instrumental analyt-
ical approaches at the control of bacterial infections in water, foods and feeds. In: Nikolelis DP,
Nikoleli G-P (eds) Biosensors for security and bioterrorism applications. Springer International
Publishing, Cham, pp 199–229
48. Sture J, Whitby S (2012) Preventing the hostile use of the life sciences and biotechnologies;
fostering a culture of biosecurity and dual use awareness. Conclusions. Med Confl Surviv 28
(1):99–105. https://doi.org/10.1080/13623699.2012.658629
49. USDA, N (2010) National plant materials manual, 4th edn. USDA, N, Washington,
DC. Retrieved from https://www.nrcs.usda.gov/Internet/FSE_DOCUMENTS/
stelprdb1042145.pdf
50. Waage JK, Mumford JD (2008) Agricultural biosecurity. Philos Trans R Soc Lond Ser B Biol
Sci 363(1492):863–876. https://doi.org/10.1098/rstb.2007.2188
51. WHO (2018) World Health Organization. Retrieved from https://www.who.int/about/what-we-
do/en/
52. Borio L, Frank D, Mani V, Chiriboga C, Pollanen M, Ripple M, Ali S, DiAngelo C, Lee J,
Arden J, Titus J, Fowler D, O'Toole T, Masur H, Bartlett J, Inglesby T (2001) Death due to
bioterrorism-related inhalational anthrax. JAMA 286(20):2554
The Emerging Threat of Ebola

Michelle LaBrunda and Naushad Amin
University of Central Florida, Orlando, FL, USA

1 Introduction

Ebola is one of the deadliest infectious diseases of the modern era. Over 50% of those
infected die. Prior to 1976, the disease was unknown or at least unreported. No one
knows exactly where it came from, but it is postulated that a mutation in an animal
virus allowed it to jump species and infect humans. In 1976 simultaneous outbreaks
of Ebola occurred in what is now South Sudan and the Democratic Republic of the
Congo (DRC). For 20 years, only sporadic cases were seen, but in 1995 a new
outbreak occurred killing hundreds in the DRC. Since that time the frequency of
these outbreaks has been increasing. It is uncertain why this is occurring, but many
associate it with increasing human encroachment into forested areas bringing people
and animals into more intimate contact, as well as with the increased mobility of previously remote populations. This chapter will navigate Ebola in the context of global health and
security.
This chapter has multiple objectives. The first is to provide a basic understanding of Ebola disease processes and outbreak patterns. The second is to explore the interplay between social determinants of health and Ebola. The role of technology in
spreading Ebola outbreaks will be explained as will Ebola’s potential as a bio-
weapon. Readers will gain understanding of the link between environmental degra-
dation and Ebola outbreaks.
This chapter will be divided into five main sections. These are (1) a case study;
(2) the Ebola disease process; (3) social determinants of health and Ebola; (4) Ebola in the modern era; and (5) the link between Ebola and environmental degradation. The case study provides an overview of many of the topics that will be discussed in detail in later sections. It will provide a humanitarian view of Ebola
and draw from concepts from all of the other areas. The case study is of a woman
who contracts Ebola. The story will be told from her perspective. She will explain why she thinks the outbreak has occurred. Her husband has died of Ebola
despite efforts of traditional healers. She will discuss burial rites in the context of her
religious beliefs.
The next section looks at the disease itself. The history, epidemiology, transmis-
sion, and signs/symptoms will be described. Prevention measures including the use
of personal protective equipment and vaccination strategies will be discussed. The
basics of diagnosis and treatment will be covered. The section will end with a
discussion of Ebola epidemics.
Social determinants of health play an important role in the epidemiology and transmission of Ebola. Factors impacting spread include high population mobility,
porous international borders, and ongoing conflict resulting in displaced
populations. Poverty, cultural beliefs and practices and prior ineffective public
health messages have all played a role in the emergence of Ebola.
The following section will explore Ebola in the era of technology. The role of air
travel in disease spread and the effectiveness of airport screening measures will be
discussed. Ebola’s potential for use in bioterrorism will also be discussed in this
section.
The relationship between environmental encroachment and disease emergence
will be explored. Global warming, and the impact of a growing population in Ebola
outbreaks will be explored.
The chapter will end with a discussion of future directions. In this last section, the importance of international collaborations for disease prevention and of public education programs will be discussed.

2 Case Study

Sia waited nervously in the small one-room house where she lived. She was waiting
for her brother-in-law to return with the body of her dead husband, Saa. He had died
yesterday of the bush illness that was killing so many in her community, Ebola the
outsiders called it.
Just 2 weeks ago, the world had seemed a different place. Sia had sat with the
other women of the Kissi tribe at church joking and planning for the upcoming rice
harvest. Yes, they practiced Christianity, but also followed the traditions of their
ancestors. Women in her village prayed to Jesus and God, but also to their ancestors.
Outsiders sometimes questioned how the Kissi could follow both Christianity and
their old traditions, but Sia had never seen a problem. Ancestors after all, were the
ones who communicated with God. When someone in the family died, they were
escorted to the realm of the ancestors where they were able to protect the living
family and speak to God on their behalf. Ancestors continued to live in the village,
but in their new form. Sia shivered thinking of what happened to those who died and
were not escorted to the realm of the ancestors. Ceremonies were usually performed
by the brother of the deceased. If the ceremonies were not done properly, a loved one
would become a wandering ghost instead of an ancestor. Wandering ghosts torment
the living bringing misfortune to everyone in the village, especially to the family that
failed to perform the proper rites. Sia did not like to think of such things, but there
had been several deaths in a nearby village and she could not help but to wonder if it
was the work of a wandering ghost.
That was the day it started. Saa was fine when he woke up, but while they were at
the church, he started to get sick. He got sick so quickly that Sia suggested that they
return home early so he could rest. It wasn’t a far walk, but by the time they arrived
home, Saa was having chills, headache, nausea, and said his joints hurt. While Saa
rested, Sia prepared a tonic to ease the pain and ward off evil spirits. Saa’s eyes were
red and he felt hot to touch. “A powerful spirit must be involved”, Sia thought to
herself. She couldn’t imagine who would have cursed her husband this way. He
hadn’t argued with anyone that she knew.
For 3 days Sia cared for her husband with special food, potions, and prayers. She
had even sacrificed a chicken, but instead of getting better he started vomiting and
having diarrhea. Obviously, she needed assistance from someone with greater
influence in the spirit realm. Kai, a local medicine-man of considerable power, agreed to help but needed time to make the necessary preparations. By that evening
Saa had stopped eating altogether and his gums started to bleed.
Kai belonged to a secret society that added to his powers. Sia was not allowed to
attend Kai’s ceremony but was told that Saa had cried blood and started to hiccough
uncontrollably. Kai was not able to defeat the evil spirits even with his most powerful
incantations.
Some of the villagers wanted to take Saa to a treatment center set up by some
foreigners to see if they could help him. Sia was hesitant, but by the next morning
Saa had developed a yellow color to his skin and was having black diarrhea, so she
agreed. After a bunch of questions Saa was taken into the camp that the foreigners set
up, but they would not let Sia or anyone else in the family enter. That was the last
time she had seen Saa alive.
Two days later Sia was informed that Saa had died. He was to be buried in a mass
grave and no one was allowed to see his body. Saa’s brother said that he thinks the
foreigners killed him. They weren’t really there to help but were part of a government plan
to destroy the Kissi. Workers in the camp were removing the internal organs of the
sick while they were still alive and selling them. That is why no one was allowed into
the camp or to bury the bodies properly. They weren’t just attacking the living, but
also trying to destroy the ancestors by preventing the death ceremony from
happening.
Luckily Saa’s brother knew people. It had cost everything that the family owned,
but the man driving the truck full of bodies agreed to meet a short distance from the
foreigners’ camp. He would give them the body there, but they were not to tell
anyone.
As Saa’s brother walked into the house carrying Saa’s body, Sia felt an over-
whelming sense of relief. All the worry gave her a headache and made her feel weak.
Now that they had Saa’s body it will be better. They will do the rituals this evening
and bury Saa in the morning. He will be able to walk with the ancestors.
3 Ebola Virus Disease

3.1 Introduction and History

One of the world’s deadliest pathogens, the Ebola virus made its first appearance in
1976 in not just one but two simultaneous outbreaks. The first of these outbreaks occurred in what is now Nzara, South Sudan, while the second occurred in a
small village community near the Ebola River bank in Yambuku, Democratic
Republic of Congo (DRC) [122]. Of the 318 known to be infected, 280 lost their
lives. Since that time, we have learned much about the Ebola virus and the disease it
causes.
Ebola virus is an uncommon virus which infects both human and non-human primates. It belongs to the family Filoviridae and is a negative-stranded RNA virus. When magnified, it appears as a filamentous structure (Fig. 1). The Ebolavirus genus has six
known species, Zaire, Sudan, Tai Forest (formerly Côte d’Ivoire ebolavirus),
Bundibugyo, Reston, and the recently described Bombali [48]. Reston is highly
pathogenic for non-human primates and pigs, and Bombali has been discovered in
free-tailed bats as part of ongoing research to discover the Ebola reservoir. The Zaire
species was responsible for the first Ebola virus outbreak in 1976 and is considered
to be the deadliest of the six [63]. Initially the disease caused by Ebola virus was
called Ebola hemorrhagic fever, but later studies showed that the hemorrhagic
manifestations were less common than initially thought and subsequently the name
was changed to Ebola Virus Disease (EVD).
Until 2014, only isolated, sporadic outbreaks of Ebola virus occurred, confined to Central Africa, with case counts in the hundreds or less and lasting only days to weeks. However, in March 2014 the World Health Organization (WHO) confirmed an epidemic of the Zaire species of Ebola virus emerging in West Africa. This outbreak lasted 2 years and grew to be one of the world’s deadliest epidemics, with 29,000 cases and 11,000 fatalities documented by the WHO.

Fig. 1 Transmission electron micrograph showing some of the ultrastructural morphology displayed by an Ebola virus virion. (Source: Centers for Disease Control and Prevention)
The index case of this epidemic is thought to be a 2-year-old child who became ill
in late 2013. The child eventually succumbed to the illness with symptoms of fever,
chills, vomiting, and black-tarry stool [117]. This was in Guinea, West Africa, a
country where Ebola supposedly did not exist. From here it spread to Liberia, Sierra
Leone, Nigeria, and Mali.
The natural reservoir of Ebola virus is not known with certainty, although
research has suggested that it may be bats. Human infection may occur through
direct contact with the mystery reservoir or through contact with infected primates.
This can occur when hunting and preparing bush-meat or via contact with body
fluids from an infected person. Ebola is highly transmissible.
The disease pattern of EVD has shifted over the last 10 years. Currently, Ebola
has been found across Central and West Africa, with occasional exported cases to
other regions. For reasons that remain unclear, outbreaks seem to be occurring with increasing
frequency. This may be linked to environmental degradation and increasing mobility
of local populations.
Ebola spreads through contaminated body fluids. Unfortunately, traditional
funerary practices across Africa put funeral attendees in contact with body fluids
from those who have died of Ebola. Initial international efforts to control Ebola
spread during outbreaks have often resulted in clashes and conflict as control
measures confront tradition. Inadequate public health messages, distrust of those
providing the health messages, political instability, and regional conflict have
allowed Ebola to spread and kill thousands when early containment could have
been within reach.
Ebola is one of the most fatal infectious diseases humans have encountered. Even
with the best medical care the disease is deadly. Unfortunately, the developing
countries where EVD occurs are not equipped with optimal medical or public health
facilities. To complicate the situation further, survivors of EVD are not hailed as
heroes, but instead may be left with chronic illness and stigmatized in their
communities.

3.2 Transmission

Transmission of Ebola disease is still being studied, but it is known that person-to-
person contact is the most common form of spread. Infection occurs primarily
through direct contact with body fluids from infected people or animals (Fig. 2).
Viral antigens have been isolated from the skin of those infected suggesting that skin
contact alone may be sufficient to spread disease [128]. It has also been shown that,
at least in primates, Ebola can be spread through intramuscular injection, and
inoculation can occur through contact of the conjunctiva or oral mucosa with
infected body fluids [61]. Blood, vomitus, and feces are the body fluids most likely to spread infection because of the frequency with which they are encountered during the course of the illness, but other fluids such as urine, semen, vaginal fluid,
tears, sweat, and breast milk also have potential for viral transmission [2, 5, 70, 88,
97, 112].
Fig. 2 Ebola ecology

Caring for a person infected with Ebola, whether at home or in the hospital, has been identified as a high-risk activity for acquiring Ebola. Household members who
provide direct care to an Ebola victim are 25–30 times more likely to contract Ebola
than household members who share a residence but do not participate in patient care
[9, 34].
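Relative-risk figures such as the 25–30-fold difference above come from comparing attack rates between exposure groups in household contact-tracing studies. The minimal Python sketch below shows the arithmetic involved; the counts used are entirely hypothetical and are not data from the cited studies.

# Illustrative only: relative risk of infection for caregivers versus
# non-caregiving household members. The counts below are hypothetical.
def attack_rate(cases, exposed):
    """Proportion of an exposure group that became infected."""
    return cases / exposed

caregivers_infected, caregivers_total = 14, 50            # hypothetical counts
non_caregivers_infected, non_caregivers_total = 1, 100    # hypothetical counts

rr = attack_rate(caregivers_infected, caregivers_total) / \
     attack_rate(non_caregivers_infected, non_caregivers_total)
print(f"Relative risk for caregivers vs. non-caregivers: {rr:.0f}x")  # 28x with these numbers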
Healthcare workers are also at high risk for acquiring Ebola. One study found the
risk of developing EVD for healthcare workers to be 100 times that of the general
community during an outbreak of Ebola in Sierra Leone [67]. There are many factors
contributing to the spread of Ebola amongst healthcare workers. The presentation of
Ebola is non-specific so early on in the disease process it may be diagnosed as
malaria, influenza, or other non-specific viral illness. If a patient is initially
misdiagnosed, then proper protective measures to limit the spread of Ebola will
not be initiated. Also, the use of personal protective equipment (PPE) including
gloves and gowns for routine patient care is less common in developing countries
than in more developed countries due to financial restrictions.
There is a risk of iatrogenic spread of Ebola. In the initial outbreak of 1976, health
care workers reusing glass syringes and needles in a community clinic may have
inadvertently caused spread of infection. The facility consisted of a 120-bed hospital
and a busy outpatient center which treated between 6000 and 12,000 people per
month. At the beginning of each day, nurses were given five syringes each which
were reused after a warm water rinse. Unfortunately, this is where Ebola made its first appearance. Potentially hundreds were exposed from this clinic alone [12, 122]. There have been many other instances where hospitals have turned into
epicenters for Ebola outbreaks [86, 103]. Early detection and isolation is key to
preventing similar incidents in the future.
The greatest risk of transmission of EVD from human to human occurs when a
patient is acutely ill. Risk also correlates with severity of illness: the sicker a patient is, the more infective she is. In the early phase of acute illness, the viral load is relatively low; however, it increases exponentially during the latter part of the acute illness, and high viral loads are associated with high mortality rates and infectivity [43].
Those who handle corpses of Ebola victims after death also run considerable risk
of acquiring the disease. Many funerary customs in Ebola-prone regions involve
extensive physical contact with the dead body. Despite the risk of transmission,
many still engage in these traditional practices. Without these preparations, some
local traditions hold that misfortune will plague the living and the dead will not be
able to pass into the spirit realm. Family who do not engage in expected funerary
practice may be viewed negatively in the communities where they live. One funeral
ceremony alone has been linked to 85 additional cases of Ebola [113].
Transmissibility of Ebola virus depends on the phase of infection of the
ill person. The viral load corresponds to the severity of illness [107]. In other words, the sicker a person is, the higher the concentration of viral particles in the bloodstream. As an ill person succumbs to Ebola, they become
more debilitated and require more care. At the same time, the viral load increases as
the victim declines. Because of this, family caring for the ill are more likely to be
infected in the later stages and corpses of those killed by Ebola are highly
infectious [47].
Even after a person has recovered from Ebola and no virus can be isolated from
blood, it may still be found in other tissues and able to transmit disease. Live virus
has been isolated from breastmilk after recovery, raising the issue of transmission from mother to infant [5]. Ebola has been isolated from semen up to 9 months after onset
of symptoms, in urine for 30 days, sweat for 40 days, aqueous humor of the eye for
14 weeks, and in cerebral spinal fluid for 10 months [35, 58, 70, 97, 112]. There has
been at least one case where a man who recovered from Ebola transmitted the
infection to a sexual partner 200 days after his initial illness [79].
To prevent sexual transmission of Ebola, the WHO recommends systematic
testing for Ebola virus in semen. For the first 3 months after infection, the semen
of male Ebola survivors should be assumed to be infectious. Three months after the day symptoms started, semen testing for Ebola should be initiated. If the result is negative, it should be repeated in 1 week. If the test is positive, it should be repeated monthly until a negative result is obtained. Once two consecutive negative results have been obtained, sexual activity can be resumed [119]. Vaginal secretions
have been found to contain virus up to 33 days after the initiation of symptoms, but
no official testing recommendations exist for vaginal secretions [29, 56].
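The WHO semen-testing schedule described above is essentially a small decision procedure. The Python sketch below encodes that logic; the intervals come from the recommendation as summarized in the text, while the function name and the way results are represented are illustrative assumptions.

# Illustrative sketch only: the WHO semen-testing schedule for male Ebola
# survivors as described in the text. Result values are hypothetical strings.
def next_step(months_since_onset, results):
    """Return guidance given months since symptom onset and a list of prior
    semen test results ('positive' or 'negative'), most recent last."""
    if months_since_onset < 3:
        return "Assume semen is infectious; begin testing 3 months after symptom onset."
    if not results:
        return "Initiate semen testing now."
    if results[-2:] == ["negative", "negative"]:
        return "Two consecutive negative results: sexual activity can be resumed."
    if results[-1] == "negative":
        return "Repeat the test in 1 week."
    return "Repeat the test monthly until a negative result is obtained."

# Hypothetical examples:
print(next_step(5, ["positive", "negative"]))   # -> repeat in 1 week
print(next_step(6, ["negative", "negative"]))   # -> sexual activity can be resumed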
Other methods of Ebola spread have been postulated, but do not appear to be
significant sources of transmission. Surfaces contaminated with body fluids produce
a theoretical risk of transmission, but no confirmed documented cases of fomite
transmission of Ebola exist. Ebola virus has been shown to persist in the environ-
ment supporting the need for close attention to decontamination of surfaces
[92, 116]. Medical procedures can augment disease spread if proper precautions
are not taken [66].
Hunting and capturing infected animals for bush meat or for trading on the black market as exotic pets can result in exposure to and transmission of Ebola. There have
been numerous instances of human infection resulting from contact with dead
primates [8, 87]. Contact with wild primates, especially those found dead should
be avoided to curb the risk of contracting Ebola.
There is another step in Ebola transmission that continues to be elusive. Humans
and other primates can catch Ebola from each other, but they are not the reservoir.
The reservoir is not known with certainty, but there is some evidence linking bats to
Ebola [48]. The evidence for bats as the Ebola reservoir is suggestive but not
compelling. Antibodies against Ebola have been found in bat species, but the
significance of this is unclear. Antibodies are formed when an organism has been
exposed to an infectious organism. This is evidence of exposure and immune
response, but not of long-term infection or viral shedding [33]. Only one small
study has ever isolated Ebola RNA from bats [72]. Attempts to infect bats then
isolate viral RNA or shedding have not met with success [62, 93]. As the systematic
search for the reservoir continues, negative findings are as important as positive ones. Plants and arthropods have not been shown to harbor Ebola [96, 105].

3.3 Clinical Presentation

Ebola Virus Disease is an acute febrile illness that has been associated with hemor-
rhagic manifestations. It has an incubation period of 2–21 days, but presentation of
symptoms is most common between day 6 and 12 after exposure [99]. It is unclear
whether or not infected people can transmit disease prior to developing symptoms,
but those with symptoms should be assumed to be contagious.
EVD typically begins with abrupt onset of malaise, fever, and chills. It is also
common to experience vomiting, headache, diarrhea, and loss of appetite early in the
disease course. The diarrhea can be profuse and water losses of up to 10 l per day
have been reported [76]. Dehydration and hypovolemia can result. Relative brady-
cardia can also be seen in Ebola [17]. A maculopapular rash commonly develops
5–7 days after onset of illness. The rash is not a consistent finding and seems to vary
from region to region [99].
Hemorrhage is the most dramatic symptom associated with EVD but is not as
common as first feared. Usually it manifests as gastrointestinal bleeding, but petechiae, ecchymoses, and bleeding of the oral mucosa can also be seen [121]. Bleeding is multifactorial and likely due to a combination of thrombocytopenia, coagulopathy from liver involvement, and in some instances disseminated intravascular coagulation (DIC).
EVD can involve a number of different organ systems. Neurologically, it can cause meningoencephalitis, confusion, chronic cognitive decline, and seizures.
Neurological symptoms typically occur 8–10 days after onset of illness [27, 32]. Car-
diomyopathy and respiratory muscle fatigue have been described [110]. Eye
involvement is also common early in the disease course and may persist. Patients
frequently report blurred vision, photophobia and blindness [102]. Laboratory find-
ings during the course of the infection can include leukopenia, elevated renal profile,
abnormal coagulation panel, thrombocytopenia, anemia, and elevated liver function
tests [69]. Hiccoughs are common late in the acute phase of illness.
Symptoms typically abate after 2 weeks of illness. Even after the acute illness has
resolved, Ebola victims can have long-term symptoms. These include fatigue,
insomnia, headaches, myalgias, arthralgias, cognitive decline, and hair loss. Uveitis
and hearing loss are both common after recovery from EVD [59, 97].
Even after the resolution of acute EVD, new symptoms can develop. In a 2016 study looking at early clinical sequelae, 76% of Ebola survivors developed arthralgias, 60% ocular symptoms, 24% auditory symptoms, and 18% uveitis [80]. Studies evaluating the long-term sequelae of EVD are ongoing.

3.4 Prevention

Prevention strategies for Ebola are numerous, but essentially boil down to avoiding all contact with skin and body fluids that could potentially harbor the Ebola virus. Of course, this is more easily said than done, especially in health care settings and for
families of those infected. Health care providers deal with rapidly changing condi-
tions often in limited resource settings and are at high risk for contracting Ebola if
prevention protocols are not followed. Families of Ebola victims face similar, but
even more daunting challenges. Ebola may be found in secretions of those who have
recovered for months or even years after the acute illness has resolved. While not
common, cases of transmission have occurred months after a person has recovered.
Active Ebola virus can persist in urine, vaginal secretions, breast milk, semen, ocular
fluid, and cerebrospinal fluid even after recovery making prevention more
challenging.
While not heavily researched as an effective prevention strategy, people who eat
bushmeat should be encouraged to take precautions to prevent Ebola infection. This
means avoiding contact with fluids from slaughtered animals as much as possible.
Ebola virus is inactivated by thorough cooking, so thorough cooking of bush meat should be encouraged [55].

3.4.1 Health Care Settings

Ebola is highly pathogenic and easily transmitted. Both the WHO and the Centers for Disease Control and Prevention (CDC) have published detailed guidelines on prevention which are
freely available online [54, 118, 124]. The WHO recommends the following key
elements to prevent transmission of Ebola virus in the hospital setting:
• Hand hygiene
• Gloves
• Facial protection (covering eyes, nose and mouth)
• Gowns (or overalls)
• Sharps safety
• Respiratory hygiene for both health care providers and patients
• Environmental cleaning
• Safe linen transport and cleaning
• Proper waste disposal
• Proper sanitation of patient care equipment
Ebola prevention requires attention to and special training in donning and remov-
ing personal protective equipment (PPE). Specific instructions and videos for use of this equipment are available at the WHO and CDC websites.
Health care workers who use PPE properly are well protected from Ebola infection, but can develop other health issues from the PPE itself. The suits are hot, uncomfortable, and require constant surveillance to ensure that all the equipment remains in place and undamaged. Areas prone to Ebola outbreaks tend to be hot, humid, and lack resources for air conditioning; wearing Ebola suits therefore creates a risk of heat-related illness and dehydration. The CDC has published guidelines for preventing heat-related illness for those providing care to
Ebola patients in hot African climates [18].

3.4.2 Prevention after Recovery

As previously mentioned, people who have survived initial Ebola infection may still be
able to transmit the disease to others. With proper preventive measures the risk of
transmission can be ameliorated. As with other aspects of Ebola, both the CDC and
WHO have published extensive guidelines available on their websites. For
healthcare workers, no special precautions are needed for basic patient care. The
CDC does recommend that additional PPE be used when caring for Ebola survivors
if contact with semen, urine, breast milk, spinal fluid, or intraocular fluid is
anticipated during patient care [57].
In the home, additional precautions may be needed. Cases of transmission
through sexual contact and breast milk have been described in the literature [28, 104].
CDC guidelines recommend abstinence from sexual activity of all types including
oral, anal, and vaginal. If abstinence is not possible then condoms and avoidance of
contact with semen is recommended. The WHO has recommended that semen be
tested 3 months after the onset of disease in men. If the test is negative, then it should
be repeated in 1 week. After two negative tests, sexual activity can be resumed. If the
test is positive, it should be repeated every month until a negative test is obtained.
Once a negative test occurs, it should be repeated in 1 week, and after two negatives
sexual activity can be resumed [56].
Maternity issues around Ebola are complex. It is unclear when it is safe for a
woman to become pregnant after recovering from Ebola. Some organizations have
suggested that a woman wait a few months prior to becoming pregnant, but so far
this recommendation has not been supported by clinical data. Breastmilk can
transmit Ebola virus from a mother who has recovered from EVD to her child. If
feasible, breastfeeding should be avoided. The data on Ebola transmission through breastfeeding are limited, and resources in Ebola-prone areas make repeat testing of breastmilk impractical. Suggested strategies have recommended avoiding breastfeeding for 2 months after recovery [10].

3.4.3 Travel

Travel restrictions may occur during Ebola outbreaks. It is generally accepted practice that those who have potentially been exposed to Ebola virus not travel for
21 days after the last possible day of their exposure. As an alternative for those at low
risk, close monitoring with no restrictions on travel may be done. Balancing indi-
vidual rights with community safety creates ethical and regulatory challenges in
cases of potential exposure. Additional information on monitoring and travel restric-
tion can be found at both the CDC and WHO websites.
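The 21-day window described above is simple date arithmetic; the short Python sketch below, which uses a hypothetical exposure date, illustrates how the end of a monitoring or travel-restriction period would be computed.

# Illustrative only: end of the 21-day monitoring/travel-restriction window
# counted from the last possible day of exposure. The date is hypothetical.
from datetime import date, timedelta

MONITORING_DAYS = 21  # monitoring period based on the maximum incubation period

last_exposure = date(2018, 11, 29)                         # hypothetical exposure date
monitoring_ends = last_exposure + timedelta(days=MONITORING_DAYS)
print(f"Monitoring/travel restriction ends on {monitoring_ends.isoformat()}")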

3.4.4 Vaccination

Vaccine development is under way, but there is currently no Food and Drug Administration (FDA) approved vaccine for Ebola. Currently, there are 14 different
clinical trials running with the goal of developing a safe and effective Ebola vaccine
[74]. An investigational vaccine called rVSV-ZEBOV is presently being used in
DRC under “compassionate use”. This vaccine is specific for the Zaire strain of
Ebolavirus. This same vaccine was previously administered to 16,000 volunteers
during an outbreak in 2015. So far, the vaccine appears safe with few side effects, but
insufficient data is available for licensing [120]. Preliminary reports suggest an
efficacy of 100%, but duration of protection is currently not known [82].

3.5 Diagnosis

Even though there are no specific therapies to treat Ebola, diagnosis is important to
prevent spread and to ensure administration of appropriate supportive care and
monitoring. Anyone who has had any potential exposure to Ebola in the last
21 days should be evaluated if symptoms of Ebola develop. While awaiting the
result of Ebola testing, appropriate infection control practices should be
implemented. Diagnosis is made by reverse-transcription polymerase chain reaction (RT-PCR). The test should be done 3 days after the onset of symptoms [7]; false negatives can occur if the specimen is collected before 72 h of symptom onset. A positive test confirms Ebola virus disease and indicates that the patient is infective. Consider repeat testing in patients whose clinical picture is highly suspicious for EVD but whose initial test is negative.
Ebola Virus Disease has a broad differential, and simultaneous testing for other
illnesses should be undertaken as clinically warranted. This differential includes
malaria, Lassa fever, typhoid fever, influenza, meningococcal meningitis (Neisseria
meningitidis), measles, Crimean-Congo Hemorrhagic Fever, Yellow Fever, Mar-
burg, and the familiar travelers’ diarrhea among many others [7].

3.6 Treatment

Supportive care is the only treatment for Ebola. There are no antimicrobial agents
proven to be effective in EVD. When possible, care should be provided at a facility
familiar with the clinical progression of Ebola. Supportive care in Ebola is no
different than for any other critically ill patient.
Give intravenous fluids to prevent dehydration and shock. Patients with Ebola
suffer from vomiting and diarrhea and may easily dehydrate. If intravenous fluids are
unavailable or prohibitively expensive, oral hydration should be undertaken. Ebola
can lead not only to hypovolemic shock, but also septic shock [11] so close patient
monitoring is warranted. Electrolytes will require close monitoring and should be
repleted as needed. Vasopressors may be required if blood pressure cannot be
maintained.
Ebola can result in significant hematological abnormalities [52, 83]. It can also
lead to liver failure followed by coagulopathy [60]. Thrombocytopenia, leukopenia,
and anemia are all common and treatment should be based on the specific abnor-
mality encountered.
Other management may include antipyretics, respiratory support, analgesics,
antimotility agents for diarrhea, antiemetics for nausea and vomiting, antibiotics,
nutritional support and renal replacement therapy. These and other supportive
measures must be tailored to the individual patient need.

4 Epidemiology and Outbreaks

The first reported outbreak of Ebola-like illness occurred in 1976 in Sudan and Zaire
[45] [now South Sudan and the Democratic Republic of Congo (DRC)]. It is
probable that sporadic outbreaks happened earlier but were not identified. Outbreaks
appear to be occurring more frequently than before. This is not only due to improved
detection techniques, but also due to environmental encroachment, increasing
population mobility, and changing weather patterns. The following section will
summarize data on known Ebola outbreaks that have occurred since 1976.

Known Ebola outbreaks since 1976 (date, location, cases and deaths, notes):

• 1976, DRC: 318 cases, 280 deaths. Zaire species. Index case was a 44-year-old male teacher who had just returned from a 10-day road trip; he had eaten monkey and antelope during his travel and presented with malaria symptoms [122].
• 1976, South Sudan: 284 cases, 151 deaths. Sudan species. Index cases were three people whose only common activity was employment in the same cotton factory; all presented with hemorrhagic fever. The only known common plant or animal exposures were cotton, rats, and mosquitos [123].
• 1979, South Sudan: 34 cases, 22 deaths. Sudan species. Index case was a male admitted to the hospital with nausea, vomiting, and fever; unknown animal exposure [4].
• 1994, Gabon: 52 cases, 32 deaths. Zaire species.
• 1995, DRC: 315 cases, 250 deaths. Zaire species.
• 1996, Gabon: 37 cases, 21 deaths. Zaire species.
• 1996, Gabon: 60 cases, 45 deaths. Zaire species.
• 2000, Uganda: 425 cases, 224 deaths. Sudan species.
• 2001, Gabon: 95 cases, 53 deaths. Zaire species.
• 2001, DRC: 57 cases, 43 deaths. Zaire species.
• 2002, DRC: 143 cases, 128 deaths. Zaire species.
• 2003, DRC: 35 cases, 29 deaths. Zaire species.
• 2004, South Sudan: 17 cases, 7 deaths. Sudan species.
• 2007, DRC: 264 cases, 187 deaths. Zaire species.
• 2007, Uganda: 149 cases, 37 deaths. Bundibugyo species.
• 2008, DRC: 32 cases, 15 deaths. Zaire species.
• 2012, Uganda: 11 cases, 4 deaths. Sudan species.
• 2012, DRC: 36 cases, 13 deaths. Bundibugyo species.
• 2012, Uganda: 6 cases, 3 deaths. Sudan species.
• 2014–2016, West Africa (started in Guinea, spread to Sierra Leone and Liberia, and then to Italy, Mali, Nigeria, Senegal, Spain, the United Kingdom, and the United States [39]): 28,616 cases, 11,310 deaths [38]. Zaire species [39]. According to the CDC, contact tracing linked the first case to an 18-month-old boy.
• 2014, DRC: 66 cases, 49 deaths. Zaire species. The index patient was a pregnant woman who butchered a monkey that had been found dead by her husband [77].
• 2017, DRC: 8 cases, 4 deaths. Zaire species.
• 2018, DRC: 54 cases, 33 deaths. Zaire species.
• 2018, DRC: outbreak in progress; as of 11/29/2018, 380 confirmed and 48 probable cases, with 200 confirmed and 28 probable deaths [126]. Zaire species.

Unless otherwise noted, information was obtained from the Centers for Disease Control and Prevention: https://www.cdc.gov/vhf/ebola/history/distribution-map.html
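As a rough illustration of how case fatality ratios are derived from counts such as those above, the Python sketch below divides deaths by reported cases for a few of the outbreaks in the list; the figures are copied from the list, and the code itself is only a worked example, not an analysis from the cited sources.

# Case fatality ratio (CFR) = deaths / reported cases, for selected outbreaks
# taken from the list above.
outbreaks = {
    "1976 DRC (Zaire species)": (318, 280),
    "1976 South Sudan (Sudan species)": (284, 151),
    "2000 Uganda (Sudan species)": (425, 224),
    "2014-2016 West Africa (Zaire species)": (28616, 11310),
}

for name, (cases, deaths) in outbreaks.items():
    cfr = deaths / cases
    print(f"{name}: CFR about {cfr:.0%}")
# e.g. 1976 DRC about 88%; 2014-2016 West Africa about 40%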

5 Society, Social Determinants of Health and Ebola

Social determinants of health are the conditions in which a person lives and grows.
There is no one list of these factors, but they are generally considered to include
influences such as school, (un)employment, the community where one resides, food,
and transportation. The factors are driven by forces outside of one’s sphere of control
such as poverty and war as well as some potentially self-directed choices such as
belief system and friend circle. For example, social determinants of health help explain why a 5.9 magnitude earthquake in Haiti collapses buildings and kills people, while the same earthquake on Guam causes no damage. Social determinants of health have significantly affected how Ebola has impacted affected countries.

5.1 Poverty

Poverty affects every aspect of life for most. According to World Bank data, the rate
of poverty in Sub-Saharan Africa is trending downwards but is still over 40% of the
population. Poverty leads to lack of education, limited medical resources, poor
nutrition, and crowded living conditions. People in poverty will eat a dead animal
if they find one because it may be all they have to eat. They are unlikely to seek
medical care outside of traditional healers because it is all they know and can afford.
They may insist on washing the bodies of the dead because traditions passed from generation to generation are their only frame of reference. All of this contributes to the spread of Ebola.
5.2 War and Conflict

Anyone who reads the history of the countries that make up the peri-equatorial region of Africa will quickly notice that the region has suffered from nearly continuous war
since even before the European occupation. There are pockets of stability in the
region, but conflict is a way of life for many. Conflict leads to destruction of
infrastructure, fear, stress, distrust, and population displacement. Currently, an
Ebola outbreak is occurring in DRC. Refugees from DRC continually flee into
neighboring countries, especially Uganda. Conflict-driven human movement is a
means by which Ebola can be spread. No widespread outbreak of Ebola has occurred
in a refugee camp, but these types of settlements are fertile soil where an outbreak
could start and flourish before an alarm is raised. The Ugandan government is
working with the International Federation of the Red Cross and Red Crescent
Societies (IFRC), UNICEF, and the WHO to develop an Ebola emer-
gency preparedness plan [53].

5.3 Limited Public Health Infrastructure and Scarcity of Health Providers

Political and economic instability across the region have resulted in a debilitated medical and
public health infrastructure. Official data is limited, but media sources have reported
that Liberia has experienced a severe shortage of trained health workers within the
country. Media sources list 207 general practitioners, 18 public health specialists,
15 pediatricians, 12 surgeons, 10 obstetrician-gynecologists, 6 ophthalmologists,
8 internists, 6 dentists, 4 psychiatrists, 4 family medicine specialists, 2 orthopedic
surgeons, 2 radiologists, 1 pathologist, 1 ear-nose-throat specialist, 1 veterinarian,
and 1 dermatologist as comprising the entire formally trained health community
(excluding nursing professionals) [3]. The CIA World Factbook lists the number of physicians per 1000 people as 0.02 for Liberia, 0.02 for Sierra Leone, 0.08 for Guinea, 0.09 for DRC, and 0.09 for Uganda [30]. Some of these numbers are almost 10 years old, making it difficult to assess the actual situation in the region. Regardless, it is a safe conclusion that none of these countries is even close to having the 1 physician per 1000 residents recommended by the WHO. Each of these countries is unique in the health care challenges it faces, and they are mentioned here only because they have all been touched by Ebola.
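To put these physician-density figures in perspective, the short Python sketch below compares the per-1000 values quoted above with the roughly 1-per-1000 benchmark attributed to the WHO in the text; the comparison is purely illustrative arithmetic.

# Physicians per 1,000 people (values quoted in the text) versus the
# approximately 1-per-1,000 benchmark mentioned above.
BENCHMARK = 1.0  # physicians per 1,000 residents

physicians_per_1000 = {
    "Liberia": 0.02,
    "Sierra Leone": 0.02,
    "Guinea": 0.08,
    "DRC": 0.09,
    "Uganda": 0.09,
}

for country, density in physicians_per_1000.items():
    shortfall = BENCHMARK / density
    print(f"{country}: {density} per 1,000 -> roughly {shortfall:.0f}x below the benchmark")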
Infrastructure development is generally associated with improved health and
decreased disease burden, but this is not always the case. Lack of infrastructure such as water and sanitation is thought to lead to increased transmission, while increased connectivity via road and boat is thought to increase the risk of transmission through an increased number of contacts [77].
5.4 Cultural Beliefs and Practices

One of the most fascinating aspects of Ebola occurs at the intersection of culture and
public health. For generations, a mixture of traditional beliefs and mainstream
religion has served as a cultural foundation in many tribal areas across Central and
Western Africa. Funerary practices in these tribes are some of the most important in
their belief system. It is these practices that have been exploited by the Ebola virus
allowing it to spread. Exposure has been associated with attendance of funerals and
contact with dead bodies in multiple countries [1, 9, 37]. As public health and
medical personnel tried to curb Ebola spread, conflict has occurred. Those most at
risk for Ebola suddenly felt threatened not only by the disease itself, but also by those who were trying to help, as their core beliefs were suddenly targeted.
From the perspective of the health care workers trying to save lives, the cultural
beliefs were generally considered as just another barrier to be surmounted. This lack
of understanding between those at risk and the health care workers led to conflict and distrust, which at times drove Ebola victims into hiding rather than seeking care.
Bribes were made, bodies were stolen, aid workers were attacked, and Ebola spread.
Some of the cultural beliefs common in Central and Western Africa will be discussed
here with the goal of fostering cultural understanding of disease. Given the diversity
of human beliefs, it is likely that future events will again pit disease control against traditional beliefs.
A good starting point in cultural sensitivity is viewing an idea from the point-of-
view of the other party. In the case of Ebola, it is important to understand what
different groups of people believe to be the etiology of disease. Most educated health
professionals view disease as an understandable biological process. Infections are
caused by microbes. In the case of Ebola, it is a filovirus. In many traditional African
cultures, disease is believed to be due to witchcraft [89].
Consultation with traditional healers is a common practice across Africa. In many
regions traditional healers are the only locally available medical provider. Even if
modern medical facilities exist, many will turn to the traditional healers first because
they are more trusted, and their beliefs tend to align more closely with those of the
community. There are many different traditional healing practices, sometimes tradi-
tions are passed down through generations in specific families. One description of a
traditional medical ceremony in Sudan describes a medicine man and his assistants.
First, ritualistic dance and chants are performed. Next the medicine man shows his
spiritual power by having a large rock placed on his abdomen and broken by an ax
while he remains still. Once his strength has been established, his attention can be
turned to his patient. The medicine man’s diagnosis is mental illness caused by evil
ancestors who have returned with the purpose of tormenting the patient. Incantations
are the treatment [106]. Beliefs and practices such as this are common in rural
Central Africa. In these societies, illness is viewed as a disruption in the relationship
between God, ancestors, and the person affected. Witchcraft, sorcery, angry ances-
tors, and evil spirits may all be at the root of disease and a powerful medicine man
can restore the proper balance in these relationships, thus curing disease [84, 91]. The full range of individual customs and beliefs associated with the cause and treatment of disease is too long to be included here, but those interested in additional information should read the articles cited in this section for additional details.
Traditional healers can be a great asset to a community, but there have been
unfortunate instances where they actually promoted the spread of Ebola. Some
traditional healers claimed to be able to cure Ebola. Unfortunately, their attempts at a cure have been known to spread the disease to those in attendance at curative ceremonies as well as to the healers themselves [78]. Traditional healers can also charge a significant amount of money, putting a family who is already dealing with the loss of a loved one under additional financial stress [51]. Not all traditional healers seek the
good of the community but instead are motivated by personal gain.
Many societies in Central Africa practice religious beliefs based on a combination
of mainstream religion and ancestor worship. Occult ceremonies, secret societies,
and rituals are common, and the details of these practices are often covert, only
known to a small subpopulation. The ceremonies may be benign such as the one
described in the preceding paragraph or may involve animal or human sacrifice
[15, 65]. While many of these practices involve sacrifice and exposure to blood no
studies have been published linking these activities to Ebola transmission.
It is the traditional funerary practices that have been most closely associated with
the spread of Ebola. Many Central and Western African cultures view the death
ceremony as one of the most important. When people die, they must be guided to the
realm of the ancestors. From this realm, ancestors are able to hear the requests and
see the needs of the living family and communicate these needs to God. The living
family prays directly to the ancestors. If death rites are not performed correctly, then
instead of becoming an ancestor, the deceased may become an angry ghost which
torments the family [101].
A common funerary practice in Liberia is for an elder family member to bathe the
body of the deceased. It is common for mourners to touch the face and kiss the
forehead of the deceased. In some traditions the spouse of the deceased continues to
share a bed with the corpse until the time of burial. Another tradition involves dance.
On the night prior to the funeral, men dance with the dead body while women wail.
Several traditions involve sacrifice and exposure to the blood of a bull as part of their
ceremony [101].
To prevent the spread of disease, the governments of Liberia and Guinea passed laws requiring safe burial teams, or cremation when the number of grave sites was insufficient for the number of bodies. Bribery of the health workers responsible for collecting and properly disposing of the bodies was reported numerous times and allowed Ebola to
persist in this region [50]. People stopped going to the health care facilities, and
families would try to hide the cause of death from officials.
At the height of the epidemic in Sierra Leone, the number of Ebola care beds was
insufficient for the number of patients. Many were transferred from facility to facility
and their families were not notified. Rumors began to spread that the Ebola facilities
were harvesting organs and killing people [81]. Poor communication resulted in
suspicion and distrust.
It took thousands of deaths, but finally both sides began to compromise. The
government and health care workers started to work with local religious leaders and
traditional healers to find solutions that would let the people honor the dead without
exposing themselves. Many Muslim leaders told their followers to abstain from
washing bodies until the outbreak ended. Bodies were buried with families nearby, and although they could not touch the bodies, prayers could be said. Burial teams
started to dress corpses in clothing requested by the family and often placed
requested jewelry. Once all sides compromised and started working together the
epidemic was able to be contained [81].
Even if someone survives Ebola the battle is not over. There is poor understand-
ing of disease and disease transmission. Survivors may be ostracized and shunned by
their communities because there is fear that they can spread disease. Survivors have
had their houses burned, families attacked, and lost their jobs due to irrational
community fear. During the West African Ebola outbreak, survivors were issued
certificates stating that they were no longer contagious in an attempt to combat social
stigma.

5.5 Conclusion

This is not to say that it is all gloom-and-doom in countries that have experienced
Ebola outbreaks. Social determinants of health are not isolated static elements.
Technology and globalization are bringing health improvements at an unprecedented
rate. If one reviews data for the countries where significant Ebola outbreaks have occurred (Guinea, Uganda, DRC, South Sudan, and Liberia), all of these countries have seen decreases in infant mortality rates and maternal mortality rates, and their extreme poverty rates have been steadily dropping over the last 20 years despite the presence of Ebola [46]. Anyone interested in additional information on measur-
able global trends, whether they be economic, or health based is encouraged to visit
Gapminder (www.gapminder.org).
Not every country that faces Ebola descends into a public health crisis. In July 2014 multiple cases of EVD were diagnosed in Lagos, Nigeria's largest and most densely populated city. The Nigerian Ministry of Health was able to rapidly contain the situation before a full-scale epidemic began. The Nigerian
government had access to trained health care providers able to do contact tracing, mobilized a rapid and efficient response, and worked closely with the WHO to implement standardized epidemiologic practices. The epidemic in
Nigeria was halted before it was able to start [40].
6 Ebola in the Technology Era

6.1 Screening

The concept of quarantine was first developed in the fourteenth century to control the
spread of plague [24]. Quarantine is a required separation of incoming people or
animals prior to mixing with the local population with the goal of preventing the
spread of disease. It is one of the oldest and most effective public health measures,
but very unpopular with those whose movements are restricted by quarantine.
Recently, Kaci Hickox, a nurse volunteering in Sierra Leone, returned to the US. She possibly had been exposed to the Ebola virus. Ms. Hickox was placed on a
mandatory home quarantine of 21 days, but she defied the quarantine order and
proceeded with her day-to-day activities [98]. In reality, she was at very low risk for
developing the disease, and there was essentially no risk for widespread Ebola
transmission in the US, but her unwillingness to comply with the quarantine brought
attention to many public issues surrounding quarantine. Specifically, the conflict
between individual civil liberty and the well-being of the general public [127]. Since
1944 when quarantine laws were first written technology has expanded drastically.
Surely there exists a technology that allows us to abolish the antiquated quarantine
system.
Whether through an intentional act of terrorism or accidental contagion spread,
travelers pose a significant threat to homeland security. Various measures have been
attempted to try to identify sick travelers with the goal of limiting epidemic spread.
The following is a discussion of currently available border control measures aimed
at preventing the spread of disease, an evaluation of the effectiveness of these
measures, and a discussion of technologies that may be of utility in the future in
preventing cross-border Ebola spread.

6.1.1 Pandemic Vulnerability and Travel

Two and a half million people fly in or out of the United States every day [44] and an
estimated one million more per day cross via land and sea [16]. With millions of
border crossings daily, transmission of communicable disease between remote
locations is inevitable. The vast majority of communicable diseases spread by
travelers are upper respiratory viruses such as the common cold or influenza.
Generally, these are self-limited illnesses with few long-term consequences. Every
few years, though, something new with greater lethality emerges and threatens the
security of US travelers, their contacts, and the broader population at home.
Ebola, Severe Acute Respiratory Syndrome (SARS), and even the relatively benign
Zika virus have made media headlines with travelers seen as potential harbingers of
disease.

Another factor that must be taken into account is the increasing population
density and urbanization. The United Nations (UN) predicts that 65% of all people
will live in cities by the year 2050 [109]. A megacity is defined as an urban
population of over ten million people. The first to reach megacity status was
New York City in the 1930s [31]. By 2018, the megacity count had risen to
37 [36]. Large numbers of people in a small area constitute a vulnerability when
looking at epidemic risk assessment. A single ill traveler arriving in a megacity has
the potential to start a local chain of infection that could rapidly spread to millions.
With the widespread availability and affordability of trains, planes, automobiles,
buses, and boats, it is easy for microbes as well as humans to travel rapidly across the
globe. Travel provides individual freedom for pleasure and commerce, but at the
expense of national security. Small disease outbreaks are continually occurring
across the globe. Multiple international monitoring systems are in effect and the
Centers for Disease Control and Prevention (CDC) has issued official recommendations for travel
restrictions for persons with higher-risk exposure to communicable diseases of
public health concern [114]. Briefly, these guidelines state that a person who
meets the following criteria will have their travel restricted [114]:
Be known or likely infectious with, or exposed to, a communicable disease that
poses a public health threat AND meet one of the following three criteria:
1. Be unaware of diagnosis, noncompliant with public health recommendations, or
unable to be located.
OR
2. Be at risk for traveling on a commercial flight, or internationally by any means.
OR
3. Travel restrictions are warranted to respond effectively to a communicable
disease outbreak or to enforce a federal or local public health order.
While the above criteria may be the best legally available option, they leave a
multitude of holes by which a person with a communicable illness could slip into a
US city and start a new epidemic. Ideally, additional layers of protection would
allow potentially ill travelers to be identified and detained prior to entry to the United
States.
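To make the structure of this rule easier to follow, the sketch below encodes it as a simple decision function. This is only an illustration of the logic described above, written in Python; the function and field names are hypothetical and are not taken from any CDC system.

from dataclasses import dataclass

@dataclass
class Traveler:
    # All fields are hypothetical stand-ins for the published criteria.
    infectious_or_exposed: bool            # known or likely infectious with, or exposed to, a threat
    unaware_noncompliant_or_lost: bool     # criterion 1
    likely_to_travel: bool                 # criterion 2: at risk of commercial or international travel
    restriction_needed_for_response: bool  # criterion 3: outbreak response or public health order

def travel_restriction_warranted(t: Traveler) -> bool:
    """Base condition AND any one of the three criteria."""
    if not t.infectious_or_exposed:
        return False
    return (t.unaware_noncompliant_or_lost
            or t.likely_to_travel
            or t.restriction_needed_for_response)

# Example: an exposed traveler who cannot be located would have travel restricted.
print(travel_restriction_warranted(Traveler(True, True, False, False)))  # True

The point of the sketch is simply that the rule is one broad condition combined with any of three narrower ones, which is what leaves the holes discussed above.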

6.1.2 Ebola, Travel, and Homeland Security

An infectious agent can travel across the globe in 24 h if spread via airplanes
[14]. This has important implications for those trying to prevent disease from
spreading. Land and boat entry into the United States present other challenges. The
sheer number of people crossing by land on a daily basis makes any screening
difficult. Boat traffic can also present unique screening challenges. A cruise ship, for
example, may arrive with thousands of people who all disembark within a short period
of time. Thorough screens are impractical in these situations. Even if screening
technology were employed allowing security agents to detect fever, there are so
many causes of fever that timely interpretation of the data would be difficult. With
so much international travel occurring, there is a continual search for ways to
improve screening for ill travelers with the goal of preventing importation of disease.
Many different methods have been tried, most centered on a specific pandemic
rather than continual monitoring. None has had great success. These methods have
included entry screens, exit screens, and post-entry monitoring.

6.1.3 Point-of-Entry Screening for Ill Travelers

The US Division of Quarantine is not only authorized but required to identify and
detain anyone entering the country with actual or suspected diphtheria, any viral
hemorrhagic fever including Ebola, cholera, tuberculosis, smallpox, plague, novel
influenza strains or yellow fever [22, 94, 95]. In theory, this is an excellent regula-
tion, but how can millions of travelers be efficiently screened and detained if
needed?
After the outbreak of SARS in 2003, many countries started using border
screening to try to identify possibly ill people in hopes of limiting the spread of
infectious disease; others jumped on board after the 2009 H1N1 influenza pandemic.
The issue then resurged in the wake of the 2014 Ebola outbreak in West Africa. As
with many things, there must be an understanding of the costs, potential benefits and
effectiveness of programs aimed at preventing a possible public health disaster.
An article by the CDC, published around the same time as the article
recommending travel restriction for high-risk individuals, concludes that border
screens are expensive and not effective in preventing the spread of disease
[100]. While point-of-entry screens are not yet considered an effective means of
controlling certain biosecurity threats, progress is being made.

Temperature Screens

Temperature screens have been developed with the goal of identifying people with
fever. What happens when a fever is detected depends on where a person is traveling
to and from, and the current state of outbreaks occurring in the world. There are
several types of temperature readers including ear gun thermometers, full body
infrared scanners, and hand-held infrared thermometers [68]. None of these methods
is highly effective and most screening devices can be fooled with minimal training
and effort. One study found that thermal screens were only about 70% effective in
detecting fever. The authors of this study concluded that temperature screens were
ineffective in identifying ill travelers [64].
The European Centre for Disease Prevention and Control (ECDC) has also investigated the
feasibility of using temperature screens to identify ill travelers and came to similar
conclusions. This report was done during the Ebola outbreak of 2014 and was geared towards
diagnosing travelers potentially infected with Ebola. They estimate that even under
ideal conditions 20% of symptomatic illnesses would be missed due to the low sensitivity of
temperature devices [41]. Additionally, it was concluded that those intentionally
trying to mask their temperature could easily do so and that those who had not
developed symptoms would be missed by the screen.

Even if fever screens were accurate and difficult to manipulate, they would still be
a poor screening measure. First of all, with many illnesses including chickenpox,
flu, the common cold and countless others, people can be contagious before a fever
starts. It is not yet known if an infected person can spread Ebola before symptoms
begin. Secondly, not all fevers indicate an infectious disease. Fevers can be due to
drug reactions, blood clots, and even cancer. Third, not everyone reacts to an
infection the same way. Some people naturally tend to have fever and others tend
not to. One expression commonly taught in medical schools across the US is, "the
older the colder". This is a reminder to students that elderly patients may never have
a fever even if they are extremely ill with an infectious disease. Lastly, what
constitutes a fever? The medical field defines fever as a temperature of 38 degrees
Celsius (100.4 °F) or higher. Are these same numbers valid for travelers, or should
different cutoffs be used? While temperature screens may have their place
in emergency settings, they are far from an ideal way of detecting an ill passenger,
and the day-to-day use of temperature screens is not generally considered an effective
means of identifying ill travelers.

Pre-departure Screens

When foreign agencies are cooperative, screening may be done prior to departure.
Exit screening was done during the Ebola outbreak of 2014 for travelers from West
Africa to the United States. The goal of exit screening is to identify those potentially
infected with a specific disease and prevent them from departing for the United
States until they can be medically cleared. The CDC considers this to be one of the
more effective forms of preventing disease importation to the United States
[26]. Departure screens are not routinely used except during times of known
outbreaks.
During the 2014 West African Ebola outbreak exit screening measures were
implemented. The general process used for screening during the outbreak was as
follows. Travelers were instructed to arrive earlier than they normally would for their
travel due to increased processing times. General instructions advised travelers
to postpone travel if they were ill. In addition to the regular airport screening,
all travelers were required to have their temperature taken and fill out a "Traveler
Public Health Declaration". Travelers who were febrile or considered at risk based
on the answers to their health declaration forms were detained and their travel
delayed [124]. During the Ebola outbreak the WHO provided resources for
predeparture screening that were detailed yet used easy-to-follow language and
included flow charts for those performing the screen. These resources provided basic
information on Ebola and its symptoms so that the illness and its presentation would be
familiar, directions on using personal protective equipment for those performing the
screening, written tools, and the public health declaration form. Additional resources
included a data collection log and a traveler information card that could be distributed
to travelers [124].

The Ebola screening was done in two steps, a primary screen and a secondary
screen. The primary screen included three questions: (1) Is the traveler febrile?
(2) Is the traveler demonstrating symptoms of Ebola? (3) Has the traveler marked
"yes" to any questions on the health declaration form? An affirmative response to
any of these questions resulted in secondary screening. Secondary screening
involved a public health interview and completion of the secondary health screening
form, a repeat temperature measurement, preferably with an accurate thermometer,
and a focused medical exam. If the secondary screen found a temperature below 101.5 °F,
no risk factors for Ebola in the public health interview, and no symptoms of Ebola on
the public health interview, the traveler was allowed to proceed to check-in. If the above
criteria were not met, check-in was denied until health clearance could be
obtained [124].
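As a rough illustration of how the two screening steps fit together, the following sketch expresses the decision flow in Python. The function names, field names, and the assumption that the 101.5 threshold is in degrees Fahrenheit are illustrative only and are not drawn from the WHO screening tools.

def primary_screen(febrile: bool, ebola_symptoms: bool, declared_risk: bool) -> bool:
    """Primary screen: any 'yes' answer refers the traveler to secondary screening."""
    return febrile or ebola_symptoms or declared_risk

def secondary_screen_clears(temp_f: float, risk_factors: bool, symptoms: bool) -> bool:
    """Secondary screen: True means the traveler may proceed to check-in."""
    # A temperature below 101.5 °F, no risk factors, and no symptoms are all required.
    return temp_f < 101.5 and not risk_factors and not symptoms

# Example: a traveler flagged only on the health declaration form, then cleared on follow-up.
if primary_screen(febrile=False, ebola_symptoms=False, declared_risk=True):
    may_check_in = secondary_screen_clears(temp_f=98.9, risk_factors=False, symptoms=False)
else:
    may_check_in = True  # no flags on the primary screen
print("Proceed to check-in:", may_check_in)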
This strategy was considered effective. The limitations include the time and
money required to implement the program, frustrating travel delays for travelers,
and the inability to identify illnesses other than Ebola or similar diseases. Its
usefulness is limited to known and identified epidemics. This strategy will likely
continue to be used in future outbreaks to prevent exportation of disease [41].
Temperature screens have been used during multiple epidemics to date, including dengue,
SARS, Ebola, and influenza, during both the entry and exit process. Screening for
fever at Taiwan entry points during a dengue outbreak was reported to be effective.
One research study reports that 45% of imported dengue cases were able to be
identified through airport screening [71]. During the SARS outbreak, Singapore
entry points screened 400,000 people and identified no cases, Canada entry points
screened 6.4 million people and identified no cases, and Hong Kong entry points
screened 35.6 million people identifying only two cases of SARS [41]. Fever
screening was used during the 2009–2010 influenza pandemic and, even with a
low threshold for defining fever, was found to have a sensitivity in the 4.5% range.
Exit screening done in West Africa during the Ebola outbreak identified fever in
77 out of 36,000 travelers screened. Of these, none had Ebola [25].
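A back-of-the-envelope calculation helps explain why so few cases are found. Assuming, purely for illustration, a prevalence of infection among departing travelers of 1 in 10,000 and a screen sensitivity of about 70% (neither figure is taken from the studies cited above), the expected number of true cases detected among 36,000 screened travelers would be

\[
36{,}000 \times \frac{1}{10{,}000} \times 0.7 \approx 2.5 .
\]

Because almost all fevers at the border arise from other causes, nearly all of the 77 febrile travelers flagged would be expected to be false positives for Ebola, and the positive predictive value of a flagged fever is correspondingly low.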

Active Monitoring

Active monitoring is another technique that can be used to prevent disease spread
within Ebola-naive countries such as the United States. It allows a traveler the
freedom to enter the US without quarantine while still allowing
health authorities to monitor the health status of potentially infected people. If
someone begins to develop symptoms, then measures can be taken to isolate,
diagnose, and treat the ill person. This method is best applied to those who are
reliable and at low risk for developing illness.
There has not been much experience with widespread use of active monitoring
systems with the exception of the 2014 West African Ebola outbreak. During this
outbreak, travelers from Liberia, Sierra Leone, and Guinea to the US were given
CARE (Check And Report Ebola) kits upon arrival in the US [23]. CARE kits
provided resources to travelers from Ebola-affected countries. Travelers were given
information on the signs and symptoms of Ebola, educated on the basic pathophys-
iology of Ebola, provided a thermometer with detailed use instructions, and given a
cell phone to ease the communication process. Travelers were allowed to travel
freely but were required to check in with public health officials daily. During these
check-ins, health reports were given, including the development of any new
symptoms and daily temperature readings, for 21 days. Ebola has a highly variable
incubation period; twenty-one days was the longest interval between exposure and
disease presentation to have been reported, accounting for its use in both CARE
packages and quarantine [125].
While the CDC coordinated active monitoring programs, the programs were
managed at the state level. All states eventually participated, but with varying start
dates. New York, Pennsylvania, Maryland, Virginia, New Jersey, and Georgia were
the first to initiate the program. Seventy percent of travelers from West Africa
enter through these states, making them logical starting points for the program [21].
After much legal debate and unwanted publicity, Ms. Hickox, mentioned at the
start of this section, eventually entered an active monitoring program, which restored most of
her personal freedoms while at the same time protecting public interests.

6.1.4 Conclusion

Currently available technology is considered insufficient to prevent the entry of ill
individuals into Ebola-naive countries. The general public continues to demand
protection of civil liberties that include the freedom to travel and protection of
privacy. Despite recommendations by the CDC, it is difficult to identify an ill
traveler either before a person embarks for the US or at the point of entry. Post-
entry monitoring of reliably low-risk travelers is a socially acceptable alternative to
quarantine and is considered reliable although not widely tested.
Screening technologies such as infrared screens may not be considered useful on
a daily basis but may prove of utility under certain circumstances such as an
active Ebola outbreak. As research continues, technology advances, and better
models to study patterns of disease spread are developed, new methods of point-
of-entry biosecurity are sure to emerge.

6.2 Bioterrorism

Bioterrorism is the intentional spread of disease with the goal of destabilizing an
opposing group. It is thought to have roots extending back to at least 1320 BCE,
when the Hittites used infected sheep to spread infection and destabilize their
opponents [108]. Since that time, technology has improved and, along with it, the
threat of bioterrorism has grown.

The CDC divides bioterrorism agents into three
separate categories: A, B, and C. Category A agents are those considered to
be of highest risk. Characteristics of category A pathogens are easy transmission, high
mortality rates, potential for social disruption, and the need for special public health
action. Category B agents are of concern but are considered to have a lower potential
for disease than those in category A. This category comprises pathogens that are
moderately easy to spread, have moderate morbidity and low mortality, and require
specific diagnostic and surveillance tools. Category C agents are of some concern. This
group is made up of pathogens that are easily available, easy to produce and disseminate,
and potentially have significant medical and public health implications. Emerging
infections also fall within category C. Ebola is considered a category A agent, the
highest threat level [19].

6.2.1 Ebola: Potential in Bioterrorism

Bioweapons are at least as large a threat to homeland security as are traditional
weapons. Biological weapons are attractive to potential terrorists because they are
relatively inexpensive to manufacture, easy to obtain, and easy to distribute [6].
In 1975 the Biological Weapons Convention went into effect. It has been signed
by 180 countries and prohibits the development of biological agents for the purpose
of warfare. Unfortunately, terrorists fail to abide by this convention, and it is rumored
that even some of the countries that signed the convention document continue to
engage in clandestine research into biological agents for warfare.
Characteristics of a pathogen with bioterrorism potential include consistent
disease induction and progression, high infectivity, easy transmission between
people, difficulty of diagnosis, and a high mortality rate [49]. It is also
important that the pathogen be stable during production, storage, and distribution
[6]. Lack of immunity in the targeted population and diseases that are difficult to
diagnose are also attractive to would-be terrorists. Ebola possesses many of these
characteristics.
Ebola possesses many features of an ideal bioterrorism weapon. In the early
stages, Ebola presents as an acute viral illness. By the time clinical features unique to
Ebola infection have developed, it is likely that the illness will already have been
transmitted to others. Particularly vulnerable are those caring for infected patients,
including family members and health care workers. Despite being limited to trans-
mission through body fluids, Ebola is highly contagious. Ebola has a high mortality
rate and is attractive to terrorists because there is already widespread fear associated
with Ebola infection. Reston virus, a non-human pathogen in the Ebola family, is
thought to be transmissible by the airborne route among non-human primates. There is
concern that with genetic manipulation EVD could be transformed into an airborne
illness and distributed as a bioterrorism weapon [20].
Ebola is one of the many pathogens that could potentially be converted into a
biological weapon. Preparedness plans at the local, state, and national level all
include sections applicable to Ebola. All hospitals in the nation have received
training on Ebola identification and response. Continued vigilance and repeated
training sessions are required to ensure that, should Ebola be used as a biological
weapon, it will be rapidly identified and contained.

6.2.2 Conclusion

Ebola virus could be used as a bioterrorism agent. It is deadly, it can
result in long-term infection in survivors, and its non-specific clinical presentation makes
it an attractive choice for would-be terrorists. Also, for many people, the word Ebola
creates fear out of proportion to the actual risk of disease. This visceral reaction and
exaggerated fear make Ebola a tempting agent. On the other hand, the lack of
airborne spread and the existence of an effective vaccine (even if not yet licensed) are
deterrents to its use.

7 Ebola and Environmental Degradation

It is impossible to know with certainty when the first Ebola infection occurred. Most
likely it was in a remote African jungle and those infected died without a diagnosis
other than that provided by the local traditional healer. What can be said with
certainty is that the outbreaks are occurring with greater frequency. No one knows
for certain why this is. Hypotheses tend to center around issues of environmental
degradation in association with increased population mobility. Increasing popula-
tion, global warming, and continued human encroachment into forested areas have
been put forth as potential contributing factors.

7.1 Human Population Expansion

Increasing population is theorized to be contributing to the increasing frequency of
Ebola outbreaks. Growing populations, particularly in developing countries, tend
to lead to congested living conditions and rapid disease spread, but this would not
explain how the index case in an outbreak becomes infected. Expert opinion often
lists expanding population as contributing to Ebola outbreaks, and intuitively this is
credible, but there is little in the way of direct evidence to support the theory.
Hundreds of studies have been conducted on Ebola since the 2014 outbreak,
but none directly addresses the relationship between population growth in Africa and
increasing frequency of Ebola outbreaks. It is likely that the impact of increasing
human populations in endemic areas will not be fully understood until the reservoir
of Ebola has been determined.

What we can say with certainty is that once started, Ebola spreads more quickly
than it did in the past and is killing more people. Population level research on Ebola
has yielded interesting results. For a start, risk of Ebola infection has been associated
with a higher level of education [73, 111]. Lower risk for acquisition of Ebola at the
population level has been associated with urban residence, households with no or
low-quality sanitary system, and married men in blue-collar professions in the 2014
outbreak in West Africa [73].
Other studies have found different results when examining the interplay between
population dynamics and the emergence of Ebola. For example, in contrast to the
study by Levy & Odoi, Ebola transmission has been positively correlated with
population density, and proximity to Ebola treatment centers in other investigations
[42]. Another study found that 84.6% of people who tested positive for Ebola cases
lived within a 3-km of roads connecting rural towns and densely populated
cities [75].
Basic public health principles hold that increasing population density allows
infectious disease to spread more quickly, but it is unclear what the impact is on
the emergence of Ebola. It is safe to say that there is a relationship between population
density, population distribution, and Ebola, but the exact nature of that relationship
remains elusive.

7.2 Climate Change

Climate change has been cited by mass media sources as the source of emerging
diseases such as Ebola. Elevated atmospheric temperatures have been associated with
the development of EVD, but then so have low temperatures [42]. There does appear
to be a relationship between Ebola and temperature, but the character of that
relationship is not clear. Ebola virus is sensitive to high temperatures so, intuitively,
higher temperatures would not create a more active form of the virus. What may
change is the human response to higher temperatures. When it is hot, people sweat
more, drink more, and may wear different clothing. It may be that the human
response to hot weather is responsible for the noted difference rather than changes
in viral activity. It is also possible that temperature changes correlate with other
phenomena such as rainstorms and that rain, or the response of vegetation to rain,
somehow impacts the emergence of Ebola.
Climate change, whether due to human activities or natural climatic cycles, will
change patterns of disease across the globe. How changing weather patterns may
affect the distribution and frequency of Ebola cases remains to be seen. Possibly,
once the reservoir of Ebola virus has been discovered, scientists will be able to predict
with greater certainty how climate change will impact the emergence of Ebola.

7.3 Environmental Encroachment

It is also postulated that Ebola is occurring with greater frequency due to increasing
human activities within previously untouched natural areas. At least one study has
linked deforestation to EVD outbreaks [90]. Again, there are limited studies
confirming this idea, but logic does suggest that it would be true.
Expert opinion and the mass media purport that the increasingly frequent outbreaks
of Ebola are due to environmental encroachment [85]. As roads are built, forests are
cut, and mineral resources are exploited, humans come into more intimate contact with the
forest and its inhabitants, including the reservoir for Ebola. The reservoir is unknown,
but it is probably found in African jungles. A study looking at vegetation cover,
population density and incidence of Ebola found that vegetation was protective until
the population reached 200 people per square km. At this population density,
vegetation became associated with an increased incidence of EVD [115]. There is
a relationship between environmental encroachment and the emergence of Ebola,
but until the reservoir is found it will be difficult to determine the exact nature of this
relationship.

8 Future Directions
8.1 International Collaboration

The frequency of Ebola outbreaks has been increasing. International collaboration is
essential to better understand how and why this is occurring. Traditional tribal
regions do not always follow country lines, and both official and unofficial border
crossings are common. Contact tracing, which is essential for containment of Ebola
outbreaks, requires countries to coordinate as people cross borders. Epidemiological
evaluation and experience in treating the disease also require a global rather than
country-level approach. The study of Ebola requires systematic evaluation and
intercountry coordination to most effectively predict outbreaks and limit their spread
once they do occur.
The global community would also benefit from international standards for diag-
nosis, prevention, and treatment. Luckily, a framework already exists for this collab-
oration, at least in times of epidemics with pandemic potential. The International
Health Regulations (IHR) agreement is a legally binding accord signed by 196 coun-
tries. It stipulates that these countries must act to contain the threat if a Public Health
Emergency of International Concern (PHEIC) is declared by the WHO Director-
General. A PHEIC was declared in August 2014 in response to the Ebola outbreak
in West Africa [13].

The IHR helps to ensure that an appropriate global health response will be made
once a public health disaster is well underway. Intervention at this level will help
curb progression of the disaster. Along this same line of thinking, mitigation and
preparedness efforts are needed prior to development of a public health disaster. If a
PHEIC is declared, then local measures have failed. Improved regional collaboration
is needed to help minimize the impact of Ebola in the region.

8.2 Development of Health Care Infrastructure in At-Risk Countries

Many countries at risk for outbreaks of EVD would benefit from bolstering their
public health and medical programs. Outside assistance is a starting point, but
capacity building is required for long-term solutions. In countries with weak public
health infrastructure, international efforts need to focus on programs to develop a
sustainable public health system. The challenges are considerable, particularly in
areas of chronic conflict, but progress has already been made and, with continued
support will continue into the future. A basic public health infrastructure will help
contain Ebola as well as whatever threat comes next.

8.3 Public Education

When an Ebola outbreak hits, the general public needs to be educated on how to
respond. If Ebola preparedness is part of the local education, then lives can be saved.
The public can help with surveillance efforts. This would require the population to
trust the public health community, to believe that their input is useful, and to be
trained to recognize potential Ebola cases in the community.
Public health education can also assist with limiting spread if an outbreak does
occur. This education can be provided through schools, community outreach cam-
paigns, or religious institutions. The education does not need to be complex, just
consistent, concise, true, and culturally appropriate.

9 Conclusion

Outbreaks of EVD have been occurring with increasing frequency. Thousands have
died and thousands more lives have been disrupted because of the disease. The
disease is highly fatal but, even more insidious, it exploits traditional ceremonies and
death rites as a means of spread. Poverty, at both the personal and national level, has
resulted in an infrastructure ill-equipped to deal with events such as Ebola.
Overcrowding promotes transmission, and a lack of financial incentives has delayed
vaccine development. Despite the barriers, EVD is slowly becoming better understood:
thousands of research articles have been published, and guidelines for every
aspect of the disease have been issued by the WHO, the CDC, or other government-
level organizations. Progress is being made.

References

1. Agua-Agum J, Ariyarajah A, Aylward B, Bawo L, Bilivogui P, Blake I, Yoti Z (2015)
Exposure patterns driving Ebola transmission in West Africa: a retrospective observational
study. PLoS Med. https://doi.org/10.1371/journal.pmed.1002170
2. Assessment of the Risk of Ebola Virus Transmission from Bodily Fluids and Fomites (2007) J
Infect Dis 196(S-2):S142–S147
3. Ballah Z (2016) Liberia’s 4.5 million population has only 298 medical doctors. Retrieved
December 6, 2018, from The Bush Chicken: https://www.bushchicken.com/liberias-4-5-mil
lion-population-has-only-298-medical-doctors/
4. Baron R, McCormick J, Zubeir O (1983) Ebola virus disease in southern Sudan: hospital
dissemination and intrafamilial spread. Bull World Health Organ 61(6):997–1003
5. Bausch D, Towner J, Dowell S, Kaducu F, Lukwiya M, Sanchez A et al (2007) Assessment of
the risk of Ebola virus transmission from bodily fluids and fomites. J Infect Dis 196(Suppl 2):
S142–S147
6. Beeching N, Dance D, Miller A, Spencer R (2002) Biological warfare and bioterrorism. BMJ
324:336
7. Beeching N, Fenech M, Fletcher T, Houlihan C (2018) Ebola virus infection. BMJ Best Pract.
Retrieved December 5, 2018, from https://bestpractice.bmj.com/topics/en-us/1210/pdf/1210.
pdf
8. Bonwitt J, Dawson M, Kandeh M, Ansumana R, Sahr F, Brown H, Kelly A (2018) Unintended
consequences of the Bushmeat Ban in West Africa during the 2013–2016 Ebola virus disease
epidemic. Soc Sci Med 200:166
9. Brainard J, Hopper L, Pond K, Edmunds K, Hunter P (2016) Risk factors for transmission of
Ebola or Marburg virus disease: a systematic review and meta-analysis. Int J Epidemiol
45:102–116. https://doi.org/10.1093/ije/dyv307
10. Brandt A, Oria O, Kallon M, Bazzano AN, Alessandra N (2017) Infant feeding policy and
programming during the 2014–2015 Ebola virus disease outbreak in Sierra Leone. Glob
Health: Sci Pract 5(3):507–515
11. Bray M, Mahanty S (2003) Ebola hemorrhagic fever and septic shock. J Infect Dis
188:1613–1617
12. Breman J, Heymann D, Lloyd G, McCormick J, Miatudila M, Murphy F et al (2016)
Discovery and description of Ebola Zaire virus in 1976 and relevance to the West African
epidemic during 2013–2016. J Infect Dis 214(suppl 3):S93–S101
13. Briand S, Bertherat E, Cox P, Formenty P, Kieny M-P, Myhre J et al (2014) The international
Ebola emergency. N Engl J Med 371:1180–1183
14. Brownstone S (2013) Outbreak! Watch how quickly an epidemic would spread across the
world. Retrieved from Fast Company: https://www.fastcompany.com/3023416/outbreak-
watch-how-quickly-an-epidemic-would-spread-across-the-world
15. Bukuluki P, Mpyangu C (2014) The African conception of sacrifice and its relationship with
child sacrifice. Int Lett Soc Humanist Sci 30(1):12–21
16. Bureau of Transportation Statistics (n.d.) United States Department of Transportation.
Retrieved from Border Crossing/Entry Data: https://www.bts.gov/content/border-
crossingentry-data
17. Bwaka M, Bonnet M, Calain P, Colebunders R, De Roo A, Guimard Y, Van den Enden E
(1999) Ebola hemorrhagic fever in Kikwit, Democratic Republic of the Congo: clinical
observations in 103 patients. J Infect Dis 179(Suppl 1):S1–S7
18. CDC (2015) Interim guidance for healthcare workers providing care in West African countries
affected by the Ebola outbreak: limiting heat burden while wearing personal protective
equipment (PPE). Retrieved December 6, 2018, from https://www.cdc.gov/vhf/ebola/hcp/
limiting-heat-burden.html
19. CDC (2018) Retrieved December 1, 2018, from Center for Disease Response: https://emer
gency.cdc.gov/agent/agentlist-category.asp
20. Cenciarelli O, Gabbarini V, Pietropaoli S, Malizia A, Tamburrini A, Ludovici G, Bellecci C
(2015) Viral bioterrorism: learning the lesson of Ebola virus in West Africa. Virus Res
210:318–326
21. Center for Disease Control and Prevention (2014a) CDC announces active post-arrival
monitoring for travelers from impacted countries. Retrieved from CDC: https://www.cdc.
gov/media/releases/2014/p1022-post-arrival-monitoring.html
22. Center for Disease Control and Prevention (2014b) Enhanced Ebola screening to start at five
U.S. airports and new tracking program for all people entering U.S. from Ebola-affected
countries. Retrieved April 10, 2018, from CDC: https://www.cdc.gov/media/releases/2014/
p1008-ebola-screening.html
23. Center for Disease Control and Prevention (2014c) History of quarantine. Retrieved from
CDC: https://www.cdc.gov/quarantine/historyquarantine.html
24. Center for Disease Control and Prevention (2014d) Enhanced Ebola screening to start at five
U.S. airports and new tracking program for all people entering U.S. from Ebola-affected
countries. Retrieved April 10, 2018, from CDC: https://www.cdc.gov/media/releases/2014/
p1008-ebola-screening.html
25. Center for Disease Control and Prevention (2014e) History of quarantine. Retrieved from
CDC: https://www.cdc.gov/quarantine/historyquarantine.html
26. Center for Disease control and Prevention (2015) Protecting borders: the road to zero.
Retrieved from CDC: https://www.cdc.gov/about/ebola/protecting-borders.html
27. Cherfow D, Nath A, Suffiredini A, Danner R, Reich D, Bishop R et al (2016) Severe
meningoencephalitis in a case of Ebola virus disease: a case report. Ann Intern Med 165
(4):301
28. Christie A, Davies-Wayne G, Cordier-Lassalle T, Blackley D, Laney A, Williams D et al
(2015) Possible sexual transmission of Ebola virus – Liberia, 2015. Morb Mortal Wkly Rep 64
(17):479
29. Chughtai A, Barnes M, MacIntyre C (2016) Persistence of Ebola virus in various body fluids
during convalescence: evidence and implications for disease transmission and control.
Epidemiol Infect 155(8):1652–1660
30. CIA World Factbook (n.d.) Retrieved December 6, 2018, from https://www.cia.gov/library/
publications/the-world-factbook/fields/2226.html
31. Cox W (2017) The world’s ten largest megacities. Retrieved from The Huffington Post: https://
www.huffingtonpost.com/wendell-cox/the-worlds-ten-largest-me_b_6684694.html
32. de Greslan T, Billhot M, Rousseau C, Mac Nab C, Karowski L, Cournac J, Cellarier G (2016)
Ebola virus-related encephalitis. Clin Infect Dis 63(8):1076–1078
33. De Nys H, Kingebeni P, Keita A, Butel C, Guillaume T, Villabona-Areana C-J, Peeters M
(2018) Survey of Ebola viruses in frugivorous and insectivorous bats in Guinea, Cameroon,
and the Democratic Republic of the Congo, 2015–2017. Emerg Infect Dis. https://doi.org/10.
3201/eid2412.180740
34. Dean N, Halloran M, Yang Y, Longini I (2016) Transmissibility and pathogenicity of Ebola
virus: a systematic review and meta-analysis of household secondary attack rate and asymp-
tomatic infection. Clin Infect Dis 62(10):1277–1286. https://doi.org/10.1093/cid/ciw114
35. Deen G, Broutet N, Xu W, Knust B, Sesay F, McDonald S et al (2017) Ebola RNA persistence
in semen of Ebola virus disease survivors – final report. N Engl J Med 377(15):1428–1437.
https://doi.org/10.1056/NEJMoa1511410
36. Demographia World Urban Area (2018) Demographia world urban areas: 14th annual edition.
Retrieved from http://demographia.com/db-worldua.pdf
37. Dietz P, Jambai A, Paweska J, Ksiazek T (2015) Epidemiology and risk factors for Ebola virus
disease in Sierra Leone – 23 May 2014 to 31 January 2015. Clin Infect Dis 61:1648–1654.
https://doi.org/10.1093/cid/civ568
38. Ebola (2017a) Retrieved December 1, 2018, from Center for Disease Control: https://www.
cdc.gov/vhf/ebola/history/2014-2016-outbreak/case-counts.html
39. Ebola (2017b) Retrieved December 1, 2018, from Center for Disease Control: https://www.
cdc.gov/vhf/ebola/history/2014-2016-outbreak/index.html
40. Ebola in Nigeria and Senegal: stable – for the moment (n.d.) Retrieved December 7, 2018,
from WHO: https://www.who.int/csr/disease/ebola/ebola-6-months/nigeria-senegal/en/
41. European Center for Disease Control and Prevention (2014) Infection prevention and control
measures for Ebola virus disease. Retrieved April 10, 2018, from https://ecdc.europa.eu/sites/
portal/files/media/en/publications/Publications/Ebola-outbreak-technicalreport-exit-entry-
screening-13Oct2014.pdf
42. Fang L-Q, Yang Y, Jiang J-F, Yao H-W, Kargbo D, Jia B-G et al (2016) Transmission
dynamics of Ebola virus disease and intervention effectiveness in Sierra Leone. Proc Natl
Acad Sci 112:4488–4493
43. Faye O, Andronico A, Faye O, Salje H, Boëlle P, Magassouba N et al (2015) Use of viremia to
evaluate the baseline case fatality ratio of Ebola virus disease and inform treatment studies: a
retrospective cohort study. PLoS Med 12(12):e1001908. https://doi.org/10.1371/journal.
pmed.1001908
44. Federal Aviation Administration (2017) Air traffic by the numbers. Retrieved from FAA:
https://www.faa.gov/air_traffic/by_the_numbers/
45. Feldmann H, Geisbert T (2011) Ebola haemorrhagic fever. Lancet 377:849–862
46. Gapminder (n.d.) Retrieved December 6, 2018, from www.gapminder.org
47. Glynn F, Bower H, Johnson S, Turay C, Sesay D, Mansaray S et al (2018) Variability in
intrahousehold transmission of Ebola virus, and estimation of the household secondary attack
rate. J Infect Dis 217:232–237
48. Goldstein T, Anthony S, Gbakima A, Bird B, Bangura J, Tremeau-Bravard A, Mazet J (2018)
The discovery of Bombali virus adds further support for bats as hosts of ebolaviruses. Nat
Microbiol 3:1084–1089
49. Green M, LeDuc J, Cohen D, Franz D (2018) Confronting the threat of bioterrorism: realities,
challenges, and defensive strategies. Lancet Infect Dis. https://doi.org/10.1016/S1473-3099
(18)30298-6
50. Guion P (2014) Retrieved from wired news: https://news.vice.com/en_us/article/vbn87x/fam
ilies-in-liberia-are-paying-bribes-for-false-certificates-over-ebola-deaths
51. Hewlett B, Amola R (2003) Cultural context of Ebola in northern Uganda. Emerg Infect Dis 9
(10):1242–1248
52. Hunt L, Gupta-Wright A, Simms V, Tamba F, Knott V, Tamba K et al (2015) Clinical
presentation, biochemical, and haematological parameters and their association with outcome
in patients with Ebola virus disease: an observational cohort study. Lancet Infect Dis 15
(11):1292–1299
53. IFRC (n.d.) Emergency Plan of Action (EPoA) Uganda: Ebola preparedness. Retrieved
December 6, 2018, from https://reliefweb.int/sites/reliefweb.int/files/resources/
MDRUG041do1.pdf
54. Infection Prevention and Control Recommendations for Hospitalized Patients Under Investi-
gation (PUIs) for Ebola Virus Disease (EVD) in U.S. Hospitals (2018) Retrieved December
6, 2018, from https://www.cdc.gov/vhf/ebola/clinicians/evd/infection-control.html
55. Information note: Ebola and food safety (2014) Retrieved December 6, 2018, from World
Health Organization: information note: Ebola and food safety
56. Interim advice on the sexual transmission of the Ebola virus disease (2016) Retrieved
December 6, 2018, from WHO: https://www.who.int/reproductivehealth/topics/rtis/ebola-
virus-semen/en/
57. Interim Guidance for Management of Survivors of Ebola Virus Disease in U.S. Healthcare
Settings (2018) Retrieved December 6, 2018, from Center for Disease Control: https://www.
cdc.gov/vhf/ebola/clinicians/evaluating-patients/guidance-for-management-of-survivors-
ebola.html
58. Jacobs M, Rodger A, Bell D, Bhagani S, Cropley I, Filipe A, Gifford R (2016) Late Ebola
virus relapse causing meningoencephalitis: a case report. Lancet 388(10043):498–503. https://
doi.org/10.1016/S0140-6736(16)30386-5
59. Jagadesh S, Sevalie S, Fatoma R, Sesay F, Sahr F, Faragher B, Scott J (2018) Disability among
Ebola survivors and their close contacts in Sierra Leone: a retrospective case-controlled cohort
study. Clin Infect Dis 66(1):131
60. Janvier F, Foissaud V, Cotte J, Aletti M, Savini H, Cordier P, Sagui E (2016) Monitoring of
prognostic laboratory markers in Ebola virus disease. J Infect Dis 213(6):1049
61. Jaax N, Davis K, Geisbert T, Vogel P, Jaax G, Topper M, Jahrling P (1996) Lethal
experimental infection of rhesus monkeys with Ebola-Zaire (Mayinga) virus by the oral and
conjunctival route of exposure. Arch Pathol Lab Med 120(2):140
62. Jones M, Schun A, Amman B, Sealy TK, Zaki S et al (2015) Experimental inoculation of
Egyptian Rousette bats (Rousettus aegyptiacus) with viruses of the ebolavirus and
Marburgvirus genera. Viruses:3420–3442
63. Kadanali A, Karagoz G (2015) An overview of Ebola virus disease. North Clin Istanbul
2:81–85. https://doi.org/10.14744/nci.2015.97269
64. Kamiya K, Nishiura H (2011) Some airports have a new security routine: taking your
temperature. BMC Infect Dis 11:111. Retrieved from https://bmcinfectdis.biomedcentral.
com/articles/10.1186/1471-2334-11-111
65. Kenan J (1997) The worship of God in African traditional religion a Nigerian perspective.
Retrieved December 5, 2018, from Faculty of Social Science and Humanities, and Religion:
University of Capetown: https://open.uct.ac.za/bitstream/handle/11427/17492/thesis_hum_
1997_kenan_john_sarauta.pdf?sequence=1
66. Khan A, Tshioko F, Heymann D, Le Guenno B, Nabeth P, Kerstiens B et al (1999) The
reemergence of Ebola hemorrhagic fever, Democratic Republic of the Congo, 1995, Com-
mission de Lutte Contre les Epidemies a Kikwit. J Infect Dis 179(1):S76
67. Kilmarx P, Clarke K, Dietz P, Hamel M, Husain F, McFadden J, Jambai A (2014) Ebola virus
disease in health care workers – Sierra Leone, 2014. Morb Mortal Wkly Rep 63
(49):1168–1172
68. Klibanoff E (2014) Some airports have a new security routine: taking your temperature.
Retrieved from NPR: goats and soda
69. Kortepeter M, Bausch D, Bray M (2011) Basic clinical and laboratory features of filoviral
hemorrhagic fever. J Infect Dis 204(Suppl 3):S810–S816
70. Kreuels B, Wichmann D, Emmerich P, Schmidt-Chanasit J, de Heer G, Kluge S et al (2014) A
case of severe Ebola virus infection complicated by gram-negative septicemia. N Engl J Med
371:2394–2401. https://doi.org/10.1056/NEJMoa1411677
71. Kuan M, Lin T, Chuang J, Wu H (2010) Epidemiological trends and the effect of airport fever
screening on prevention of domestic dengue fever outbreaks in Taiwan, 1998–2007. Int J
Infect Dis 14:e693–e697
72. Leroy E, Kumulungui B, Pourrut X, Roquert P, Hassanin A, Yaba P et al (2005) Fruit bats as
reservoirs of Ebola virus. Nature 438:575–576
73. Levy B, Odoi A (2018) Exploratory investigation of region level risk factors of Ebola virus
disease in West Africa. Peer J. Retrieved from https://peerj.com/articles/5888/
74. Levy Y, Lane C, Piot P, Beavogui A, Kieh M, Leigh B, Yazdanpanah Y (2018) Prevention of
Ebola virus disease through vaccination: where we are in 2018. Lancet 392(10149):787–790
75. Lu H, Qian J, Kargbo D, Zhang X, Yang F, Hu Y et al (2015) Ebola virus outbreak
investigation, Sierra Leone, September 28-November 11, 2014. Emerg Infect Dis 21(11):1921
76. Lyon G, Mehta A, Varkey J, Brantly K, Plyler L, McElroy A, Ribner B (2014) Clinical care of
two patients with Ebola virus disease in the United States. N Engl J Med 371(25):2402
77. Maganga G, Kapetshi J, Berthet N, Ilunga B, Kabange F et al (2014) Ebola virus disease in
the Democratic Republic of Congo. N Engl J Med 371:2083–2091
78. Manguvo A, Mafuvadze B (2015) The impact of traditional and religious practices on the
spread of Ebola in West Africa: time for a strategic shift. Pan Afr Med J 22(Suppl 1):9
79. Mate S, Kugelman J, Nyenswah T, Ladner J, Wiley M, Cordier-Lassalle T, Palacios G (2015)
Molecular evidence of sexual transmission of Ebola virus. N Engl J Sci 373(25):2448
80. Mattia J, Vandy M, Chang J, Platt D, Dierberg K, Bausch D, Mishra S (2016) Early clinical
sequelae of Ebola virus disease in Sierra Leone: a cross-sectional study. Lancet Infect Dis 16
(3):331–338
81. Maxmen A (2015) How the fight against Ebola tested a culture’s traditions. National Geo-
graphic. Retrieved December 5, 2018, from https://news.nationalgeographic.com/2015/01/
150130-ebola-virus-outbreak-epidemic-sierra-leone-funerals/
82. Metzger W, Vivas-Martinez S (2018) Questionable efficacy of the rVSV-ZEBOV Ebola
vaccine. Lancet 391(10125):30560–30569
83. Mobula L, MacDermott N, Hoggart C, Brantly K, Plyer W, Brown J et al (2018) Clinical
manifestations and modes of death among patients with Ebola virus disease. Am J Trop Med
Hyg 94(4):1186–1193
84. Mokgobi M (2014) Understanding traditional African healing. Afr J Phys Health Educ
Recreat Dance 20(Suppl 2):24–34
85. Moskowitz P (2014) Deforestation, development may be driving Ebola outbreaks, experts say.
AlJazeera America. Retrieved December 7, 2018, from http://america.aljazeera.com/articles/
2014/8/4/ebola-deforestationclimatechange.html
86. Muyembe-Tamfum J, Kipasa M, Kiyungu C, Colebunders R (1999) Ebola outbreak in Kikwit,
Democratic Republic of the Congo: discovery and control measures. J Infect Dis 179:S259–
S262
87. Nkoghe D, Formenty P, Leroy E, Nnegue S, Edou S, Ba J, Mve M (2005) Multiple Ebola virus
haemorrhagic fever outbreaks in Gabon, from October 2001 to April 2002. Bull Soc Pathol
Exot 98(3):224
88. Nordenstedt H, Bah E, de la Vega M, Barry M, N’Faly M, Crahay B et al (2016) Ebola virus in
breast milk in an Ebola virus-positive mother with twin babies, Guinea, 2015. Emerg Infect
Dis 22(4):759–760. https://doi.org/10.3201/eid2204.151880
89. Okware S, Omaswa F, Talisuna A, Amandua J, Amone J, Onek P et al (2015) Managing Ebola
from rural to urban slum settings: experiences from Uganda. Afr Health Sci 15(1):312–321
90. Olivero J, Fa J, Real R, Marquez A, Farfan M, Vargas M et al (2017) Recent loss of closed
forests is associated with Ebola virus disease outbreaks. Sci Rep 7:14291
91. Osemwenkha S (2000) Disease Aetiology in Traditional African Society. Afr: Riv Trimest
Stud Documentazione Dell’Istituto Ital l’Afr l’Oriente 55(4):583–590
92. Palich R, Irenge L, Barte de Sainte Fare E, Augier A, Malw D, Gala J (2017) Ebola virus RNA
detection on fomites in close proximity to confirmed Ebola patients; N’Zerekore, Guinea,
2015. PLoS One 12(5):e0177350
93. Paweska J, Storm N, Grobbelaar A, Markotter W, Kemp A, van Vuren J (2016) Experimental
Inoculation of Egyptian Fruit Bats (Rousettus aegyptiacus) with Ebola Virus. Viruses 8:e29.
https://doi.org/10.3390/v8020029
94. Prevention CF (2000) Public health screening at US ports of entry: guidelines for inspectors.
Retrieved April 10, 2018, from http://www.library.armstrong.edu/eres/docs/eres/ETHI2000-
1_ADAMS/200003adaEntryPart1.PDF
95. Prevention CF (2017) US quarantine stations. Retrieved from CDC: https://www.cdc.gov/
quarantine/quarantine-stations-us.html
96. Reiter P, Turell M, Coleman R, Miller B, Maupin G, Liz J et al (1999) Field investigations of
an outbreak of Ebola hemorrhagic fever, Kikwit, Democratic Republic of the Congo, 1995:
arthropod studies. J Infect Dis 179(Suppl 1):S148–S154
97. Rowe A, Bertolli J, Khan A, Mukunu R, Muyembe-Tamfum J, Bressler D et al (1999, Feb)
Clinical, virologic, and immunologic follow-up of convalescent Ebola hemorrhagic fever
patients and their household contacts, Kikwit, Democratic Republic of the Congo. Commis-
sion de Lutte contre les Epidémies à Kikwit. J Infect Dis 1:S28–S35
98. Sanchez R, Shoichet C, Karimi F (2014)
Retrieved from CNN: https://www.cnn.com/2014/10/31/health/us-ebola/index.html
99. Schieffelin J, Shaffer J, Goba A, Gbankie M, Gire S, Colubri A, Garry R (2014) Clinical illness
and outcomes in patients with Ebola in Sierra Leone. N Engl J Med 371(22):2092–2100
100. Selvey L, Antão C, Hall R (2015) Evaluation of border entry screening for infectious diseases
in humans. CDC: Emerg Infect Dis 21, 197(2)
101. Shah J (2015) The dead bodies of the West African Ebola epidemic: understanding the
importance of traditional burial practices. Inquiries J Soc Sci Art Humanit 7(11):1–4
102. Shantha J, Crozier I, Yeh S (2017) An update on ocular complications of Ebola virus disease.
Curr Opin Opthalmol 28(6):600–606
103. Shears P, O’Dempsey T (2015) Ebola virus disease in Africa: epidemiology and nosocomial
transmission. J Hosp Infect 90(1):1–9
104. Sissoko D, Keita M, Boubacar D, Aliabadi N, Fitter D, Dahl B et al (2017) Ebola virus
persistence in breast milk after no reported illness: a likely source of virus transmission from
mother to child. Clin Infect Dis 64(4):513–516
105. Swanepoel R, Leman P, Burt F, Zachariades N, Braack L, Ksiazek T et al (1996) Experimental
inoculation of plants and animals with Ebola virus. Emerg Infect Dis 2(4):321–325
106. Thompson E (1965) Primitive African medical Lore and Witchcraft. Bull Med Libr Assoc 53
(1):80–94
107. Towner J, Rollin P, Bausch D, Sanchez A, Crary S, Vincent M et al (2004) Rapid diagnosis of
Ebola hemorrhagic fever by reverse transcription-PCR in an outbreak setting and assessment
of patient viral load as a predictor of outcome. J Virol 78(8):4338
108. Trevisanato S (2007) The ‘Hittite plague’, an epidemic of tularemia and the first record of
biological warfare. Med Hypothesis 69(6):1371–1374
109. United Nations (2014) World’s population increasingly urban with more than half living in
urban areas. Retrieved from United Nations: http://www.un.org/en/development/desa/news/
population/world-urbanization-prospects-2014.html
110. Uyeki T, Mehta A, Davey R, Liddell A, Wolf T, Vetter P et al (2016) Clinical management of
Ebola virus disease in the United States and Europe. N Engl J Med 374(7):636
111. Valeri L, Patterson-Lomba O, Yared G, Ablorh A, Bobb J, Townes W, Harling G (2016)
Predicting subnational Ebola virus disease epidemic dynamics from sociodemographic indi-
cators. PLoS One 11:e0163544
112. Varkey J, Shantha J, Crozier I, Craft C, Lyon G, Mehta A et al (2015) Persistence of Ebola
virus in ocular fluid during convalescence. New Engl J Med 372:2423–2427. https://doi.org/
10.1056/NEJMoa1500306
113. Victory K, Coronado F, Ifono S, Soropogui T, Dahl B (2015) Ebola transmission linked to a
single traditional funeral ceremony – Kissidougou, Guinea, December, 2014–January 2015.
MMWR Morb Mortal Wkly Rep 64(14):386–388
114. Vonnahme L, Jungerman M, Gulati R, Illig P, Alvarado-Ramy F (2017) US federal travel
restrictions for persons with higher-risk exposures to communicable diseases of public health
concern. Emerg Infect Dis 23, supplement. Retrieved from https://wwwnc.cdc.gov/eid/article/
23/13/17-0386_article
115. Walsh M, Haseeb M (2015) The landscape configuration of zoonotic transmission of Ebola
virus disease in West and Central Africa: interaction between population density and vegeta-
tion cover. PeerJ. https://doi.org/10.7717/peerj.735
116. Westhoff Smith D, Hill-Batorski L, N’jai A, Eisfeld A, Neumann G, Halfmann P, Kawaoka Y
(2016) Ebola virus stability under hospital and environmental conditions. J Infect Dis 214
(Suppl 3):S142–S144
117. WHO (n.d.) Retrieved December 4, 2018, from Ground zero in Guinea: the Ebola outbreak
smoulders – undetected – for more than 3 months: http://www.who.int/csr/disease/ebola/
ebola-6-months/guinea/en/
118. WHO (2014) Interim infection prevention and control guidance for care of patients with suspected
or confirmed filovirus haemorrhagic fever in health-care settings, with focus on Ebola. Retrieved
from WHO_HIS_SDS_2014.4_eng.pdf https://apps.who.int/iris/bitstream/handle/10665/130596/
WHO_HIS_SDS_2014.4_eng.pdf;jsessionid=88823349AEAFBF8D9188605A6C7C4B51?
sequence=1
119. WHO (2016) World Health Organization. Clinical care for survivors of Ebola virus disease:
interim guidance. April 2016. Retrieved Dec 3, 2018, from http://apps.who.int/iris/bitstream/
handle/10665/204235/WHO_EVD_OHE_PED_16.1_eng.pdf;
jsessionid=4014756C59CC9C80C419B41A08E73430?sequence=1
120. WHO (2018) Ebola virus disease: FAQ: compassionate use of Ebola vaccine in the context of
the Ebola outbreak in North Kivu, Democratic Republic of the Congo. Retrieved December
6, 2018, from World Health Organization: https://www.who.int/ebola/drc-2018/faq-vaccine/en/
121. WHO Ebola Response Team (2014) Ebola virus disease in West Africa – the first 9 months of
the epidemic and forward projections. N Engl J Med 371:1481–1495
122. WHO: International Commission (1978) Ebola haemorrhagic fever in Zaire, 1976. Bull World
Health Organ 56(2):271–293
123. WHO: International Study Team (1978) Ebola haemorrhagic fever in Sudan, 1976. Bull World
Health Organ 56(2):247–270
124. World Health Organization (2014) WHO interim guidance for Ebola: exit screening at
airports, ports and land crossings. Retrieved from http://apps.who.int/iris/bitstream/handle/
10665/139691/WHO_EVD_Guidance_PoE_14.2_eng.pdf;
jsessionid=64A1B9978B17C57D62739E56CFA7841C?sequence=1
125. World Health Organization (2014b, November 6) WHO Interim guidance for Ebola: exit
screening at airports, ports and land crossings. Retrieved from http://apps.who.int/iris/
bitstream/handle/10665/139691/WHO_EVD_Guidance_PoE_14.2_eng.pdf;
jsessionid=64A1B9978B17C57D62739E56CFA7841C?sequence=1
126. World Health Organization (2018) Ebola fact sheet. Retrieved from WHO: http://www.who.
int/mediacentre/factsheets/fs103/en/
127. World Health Organization (2018) Retrieved December 1, 2018, from Ebola situation reports:
Democratic Republic of the Congo: http://www.who.int/ebola/situation-reports/drc-2018/en/
128. Yong E (2016) The CDC’s new quarantine rule could violate civil liberties. The Atlantic.
Retrieved from https://www.theatlantic.com/science/archive/2016/12/cdc-quarantine-rule-vio
late-civil-liberties/511823/
129. Zaki S, Shieh W-J, Greer P, Goldsmith C, Ferebee T, Katshitshi J et al (1999) A novel
immunohistochemical assay for the detection of Ebola virus in skin: implications for diagno-
sis, spread, and surveillance of Ebola hemorrhagic fever. J Infect Dis 179:S34–S47
Part II
Mitigation, Preparedness and Response and Recovery
Natural and Manmade Disasters: Vulnerable Populations

Jennifer Marshall, Jacqueline Wiltshire, Jennifer Delva, Temitope Bello,
and Anthony J. Masys

College of Public Health, University of South Florida, Tampa, FL, USA
e-mail: tmasys@health.usf.edu

1 Introduction

Recent decades have seen the emergence of high impact, low probability events that
have had significant influence on Global Health Security. Natural disasters, such as
hurricanes, earthquakes, heat waves, tsunamis, droughts, floods and epidemics, have
significant influence on Global Health Security because they challenge health systems
and disaster preparedness and often reveal disparities. Complex disasters emerge from
inherent interdependencies and interconnectivity within which lie vulnerability pathways
that affect populations and communities to various degrees. Within these communities
exist and emerge vulnerable populations that are greatly affected by disasters.

1.1 What Are Natural and Man-Made Disasters?

A disaster is a catastrophic, sudden event that causes significant disruption of human,
material and environmental resources [1]. There are three types of disasters: natural,
man-made, and hybrid disasters. Natural disasters are events that are beyond human
control – often referred to as “Acts of God” – such as volcanic eruptions, tornadoes or
earthquakes [1]. Man-made disasters are catastrophic events that result from human
acts and decisions. They can occur suddenly or unfold over the long term, as with
national or international conflicts. Man-made disasters include explosions (chemical and
nuclear), pollution, toxic releases and fire. Hybrid disasters are a result of both
human and natural forces, for example, floods in communities built along floodplains [1].


Fig. 1 Global and regional disaster occurrence trends from 1900 to 2016

Three examples of disasters discussed in this chapter are displacement due to
war and conflict in other countries resulting in immigration or refugee resettlement,
infectious disease outbreaks, and extreme weather events.
The Asia-Pacific region is the most disaster-prone area of the world. For example,
during the period 2005–2014, the Asia-Pacific region had over 1600 reported
disaster events resulting in approximately 500,000 fatalities and affecting over 1.4
billion people [2]. Figure 1 depicts global and regional trends in disaster
occurrence, highlighting the unmitigated growth of complex disaster risks that
create and affect the most vulnerable populations. Similarly, extreme weather
events in the Gulf of Mexico and Caribbean have had significant effects on the
region. As noted in Grid [3]:
The 2017 Atlantic hurricane season was the seventh most active since records began in 1851
and the most active since 2005. Ten hurricanes affected around 20 countries and territories,
of which six developed into category 3 storms or above. The three major hurricanes, Harvey,
Irma and Maria, displaced over 3 million people in the space of a month. They hit as the
region was still recovering from the devastation wrought by hurricane Matthew, which
displaced 2.2 million people in 2016.

Disasters such as Hurricanes Katrina, Sandy, Maria, Irma and Michael
highlighted social, physical, and economic inequities among population groups,
thereby shedding light on the impact on vulnerable populations. Social vulnerability
is defined as the characteristics of a person or group in terms of “their capacity to
anticipate, cope with, resist and recover from the impact” of a discrete and identi-
fiable event in nature or society [4]. Within the concept of social vulnerability lie
‘vulnerable populations’.
As described in Wolkin et al. [5]:
Previous research has demonstrated that socially vulnerable populations, also referred to as
at-risk populations, are more likely to be adversely affected in emergencies [6–12]. In
particular, “the nation’s poorest, sickest, most dependent and most isolated residents” face
increased exposure to “physical hazards and to the social, economic, political, and psycho-
logical impacts” of disasters [5, 13].

Following natural disasters, aside from the number of deaths and damaged infra-
structure, the ensuing complex disasters create cascading events that result in
damaged resources, supply chains and critical infrastructure (water and sanitation),
a stressed or non-existent healthcare system and disruption of an already fragile food
system. Such social dimensions as poverty, gender and vulnerable populations
emerge as key concerns and challenges in the Disaster Risk Reduction domain
[14]. With regard to vulnerable communities, Zoraster [15] argues that ‘...many
high-risk geographical areas have a disproportionately high percentage of margin-
alized populations; this same population is at a disadvantage for preparation, evac-
uation, response, and recovery’. Among the most commonly cited vulnerable groups
are women, children, the elderly, patients living with HIV/AIDS or other chronic
diseases, those with mental or physical disabilities, ethnic minorities, and socioeco-
nomically disadvantaged groups [16].
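The social vulnerability indices cited above [8, 12] are, at their core, composite scores built from census-style indicators. The minimal sketch below (in Python, using entirely hypothetical county names and indicator values rather than any published index) illustrates the general construction: each indicator is oriented so that higher values mean greater vulnerability, converted to a percentile rank across geographic units, and the ranks are averaged into a single score.

    # Illustrative composite vulnerability score; all values below are hypothetical.
    hypothetical_counties = {
        # county: (poverty_rate, pct_age_65_plus, pct_no_vehicle, pct_with_disability)
        "County A": (0.21, 0.19, 0.12, 0.26),
        "County B": (0.09, 0.14, 0.04, 0.18),
        "County C": (0.15, 0.25, 0.07, 0.23),
    }

    def percentile_ranks(values):
        # Rank each value against the others on a 0-1 scale (1 = most vulnerable).
        ordered = sorted(values)
        n = len(values)
        return [ordered.index(v) / (n - 1) if n > 1 else 0.0 for v in values]

    names = list(hypothetical_counties)
    indicators = list(zip(*hypothetical_counties.values()))      # one tuple per indicator
    ranks = [percentile_ranks(list(ind)) for ind in indicators]  # per-indicator ranks

    for i, name in enumerate(names):
        # Composite score: mean of this county's percentile ranks across indicators.
        score = sum(r[i] for r in ranks) / len(ranks)
        print(f"{name}: composite vulnerability score = {score:.2f}")

Published indices such as the social vulnerability index described by Flanagan et al. [12] use many more census variables grouped into themes, but the underlying logic of ranking and aggregating indicators is similar.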

2 Background
2.1 Who Are Vulnerable Populations?

Disasters highlight social and economic disparities among population groups
[17]. Vulnerable populations, also termed “at-risk individuals” or “special needs
populations”, refer to people who are at greater risk for poor social, physical and
psychological health outcomes after a disaster [19]. Cannon [18] described vulnera-
bility in a disaster context as a series of interrelated “disconnects” between livelihood
(strength and resilience, income and subsistence provisions), well-being (baseline
nutrition, physical, and mental health status), self-protection (income and resources
on hand), social protection, and governance (quality of social protection combined
with allocation of assets). Vulnerable populations include children, older adults,
individuals with disabilities or special health care needs, pregnant women, racial/
ethnic minorities, people with language barriers, and those living in poverty. These
populations tend to have limited social and economic resources, live in
disenfranchised or rural communities, or assisted living facilities. Specific examples
of individuals with special health care needs include those with chronic health
conditions (particularly medication or technology-dependent), infectious diseases
requiring testing and treatment (e.g. HIV, TB, Hepatitis, etc.), intellectual and
physical disabilities, and mental health conditions. Specific examples of social or
economic restriction include lack of disposable income (limited or no savings, credit,
or cash on hand), no private transportation, homelessness, incarceration, lack of
legal documentation, and language barriers.
146 J. Marshall et al.

Vulnerable populations have unique needs in times of disaster [19]. For example,
before disaster, vulnerable populations may require additional assistance for safe
evacuation to shelters. Some people may have difficulties accessing shelters, while
others may be turned away from shelters that are unprepared to accommodate them
due to their pre-existing health conditions [20]. Reports after Hurricanes Katrina and
Rita provided accounts of individuals who had to sleep in their wheelchairs and were
housed under unfavorable conditions [20]. Individuals with hearing impairment or
language barriers may have problems with comprehending evacuation orders or
instructions in shelters. The aftermath of a disaster is particularly difficult for
vulnerable populations [20]. These groups require further assistance to recover
from devastation caused by a disaster. Housing, healthcare, and employment are all
common problems that vulnerable populations face to a greater degree than the
rest of the population. This chapter will focus on three populations that are partic-
ularly vulnerable to the risks and challenges associated with disasters: pregnant
women, families with children, and the elderly.

2.2 Florida Context for Disaster Management

2.2.1 Florida Population

The overall poverty rate in Florida is 15%, and in 19 of 63 counties (mostly rural)
the poverty rate climbs to 20% or higher [21]. Families with limited or no savings,
credit, or cash on hand, people who do not have private transportation, who are
homeless, incarcerated, and undocumented residents face numerous challenges in
preparing, responding, and recovering from disasters.
The prevalence of disability among adults in Florida is 23.5%, and the prevalence of
disability among children and youth is also substantial. The widely accepted defini-
tion of children with special health care needs is those “who have or are at
increased risk for chronic physical, developmental, behavioral or emotional condi-
tions and who also require health and related services of a type or amount beyond
that required by children generally” [22]. This definition includes 19% of children ages 0–17
and 23% of U.S. households with children [23], and applies to 20% of children in
Florida [24]. People with special health care needs or disabilities may encompass
intellectual or physical disabilities, sensory impairments (hearing, vision), may be
recovering from injury (i.e. veterans), or treating chronic (e.g. cancer, asthma,
diabetes, mental health conditions) or infectious diseases (e.g. HIV, TB, Hepatitis,
etc.) that require ongoing continuous treatment. Health care may comprise increased
routine and specialty medical care, prescribed medications, and dependence on
technology (such as dialysis, oxygen, wheelchairs, etc.).
This state also has the fourth highest number of births in the nation (following
California, Texas and New York) with 223,630 births in 2017 [25]. Nearly a third
(32%) of these births are via cesarean section, and 8% of infants born in the U.S. had
low birth weight (less than 2500 g or 5 lbs 8 oz), requiring specialized postpartum
and newborn care and medical follow up [25].

Florida is a bellwether of the nation’s aging and changing demographic composition
and has the highest proportion of older adults relative to their total population
[26, 27]. Currently, 16.8% of Floridians are aged 65 and older compared to 12.4%
for the nation as a whole. In 2030, adults aged 65 and older are estimated to represent
24.1% of Florida’s total population [28].

2.2.2 Disasters in Florida

Three examples of natural and man-made disasters affecting vulnerable populations
in Florida include political or economic conditions around the world resulting in
refugee resettlement throughout the state, arbovirus outbreaks, and hurricanes.
Refugees are individuals who have been forced to leave their home country,
and who are afraid to return home, due to war, persecution, or violence [29]. There were
33,279 documented refugees who resettled in Florida during
October 2016 to September 2017; 16% (5369) were children under age 18 and 6%
(1965) were over age 60 [30], with the five largest groups coming from Cuba and
Haiti, followed by Syria, the Democratic Republic of the Congo, and Iraq.
Due to its climate and geography, Florida is particularly prone to arboviruses. The
Florida Department of Health’s Division of Disease Control and Health Protection
publishes weekly counts of endemic mosquito-borne viruses such as West Nile virus
(WNV), Eastern equine encephalitis virus (EEEV), and St. Louis encephalitis virus
(SLEV), as well as exotic viruses such as dengue virus (DENV), chikungunya virus
(CHIKV) and California encephalitis group viruses (CEV), as well as malaria (counts
available at http://www.floridahealth.gov/diseases-and-conditions/mosquito-borne-
diseases/surveillance.html). The Department also conducts prevention and health
communication activities, case investigations, laboratory testing, and referral to
medical services and follow-up [31].
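Weekly counts of this kind lend themselves to simple automated monitoring. The sketch below (Python) flags county/disease pairs whose current weekly count exceeds a multiple of their recent average; the CSV layout, column names, file name, and threshold rule are assumptions made for illustration only and do not reflect the Department’s actual data format or alerting criteria.

    import csv
    from collections import defaultdict

    # Assumed CSV layout (illustrative only): week,county,disease,cases
    # e.g. "2018-29,Hypothetical County,WNV,3"
    def flag_unusual_counts(path, current_week, factor=2.0, min_cases=2):
        # Flag (county, disease) pairs whose count for current_week exceeds
        # `factor` times their average over earlier weeks and is at least `min_cases`.
        history = defaultdict(list)   # (county, disease) -> counts from earlier weeks
        current = {}                  # (county, disease) -> count for current_week
        with open(path, newline="") as f:
            for row in csv.DictReader(f):
                key = (row["county"], row["disease"])
                cases = int(row["cases"])
                if row["week"] == current_week:
                    current[key] = current.get(key, 0) + cases
                else:
                    history[key].append(cases)
        flags = []
        for key, cases in current.items():
            past = history.get(key, [])
            baseline = sum(past) / len(past) if past else 0.0
            if cases >= min_cases and cases > factor * baseline:
                flags.append((key, cases, baseline))
        return flags

    # Example use with a hypothetical file name:
    # for (county, disease), cases, baseline in flag_unusual_counts("weekly_counts.csv", "2018-29"):
    #     print(f"{county}: {disease} {cases} cases vs. baseline {baseline:.1f}")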
Natural disasters are by no means unique to Florida; however, they are relatively
common in the state, with 120 direct hits in the past century, including 37 major
hurricanes [32]. Every coastal region of Florida’s hurricane tracking and response
system (NW, SW, SE and NE) has been impacted, and limited public transportation
and evacuation routes in this largely rural state pose a challenge for residents [33].

Example 1 Maternal and Child Health Populations, Three Disaster Scenarios


There are several important reasons why we should consider the impacts of
disasters on maternal and child health populations, which include women of
child-bearing age, pregnant women, families, and infants and young children.
The first reason has to do with fetal development. Much of the development
of the fetus’s physical structure and major organs occurs in the first 9 weeks of
pregnancy, often before a woman even knows that she is pregnant. From 10 to
20 weeks of gestation, the structures of the brain are being created as well as
neurosensory systems, which continue to be built and refined during the third
trimester of pregnancy and in early infancy [34]. One part of this neurosen-
sory development relates to the stress response systems. For example, research
has shown that maternal stress hormones impact birth outcomes as well as the
neurosensory systems of the developing fetus [35, 36]. In the case of the
developing infant, a supportive intrauterine environment (free from infectious
disease and chronic disease complications, with sufficient nutrition, and not
stressed) supports full-term birth, sufficient birth weight, lower risk of birth defects, and
optimal structural and neurological development [37, 38]. Thus, in order to
address these critical periods in neurophysiological fetal development, we
must consider systems of care that support women prior to, during, and
following pregnancy. For example, only 77% of women who gave
birth in 2017 received first-trimester prenatal care [25]. Infant and child
development is also important because the social and physical learning envi-
ronment shapes the “first thousand days” [39, 40], in which the
architecture and pathways of the brain are built and the first attachment relation-
ships are established. We know from child development research that basic
learning functions and executive functioning rely on consistent routines,
supportive and engaging environments, adequate sleep, nutrition, and primary
and specialty care to meet each child’s needs.
The maternal and child health population encompasses pregnant women
and families with infants and young children. An estimated 23% of the US
population is under age 18, with 6.1% under age 5. Women comprise
half (50.8%) of the population [21]. Given that so much of the critical period
of fetal development occurs during pregnancies that are still unknown or
unplanned, there is a valid argument that interventions for pregnant women
should also address the needs of all
women of childbearing age [41]. It has been noted that natural disasters do
not affect women and men equally [42]. The effect of disaster on women is
particularly notable. For instance, their physical, psychological, and reproduc-
tive health are often altered [43, 44]. Poverty and increased workload after
disasters threaten women’s well-being and worsen the negative health effects
of disasters on this group [13, 45]. Goodman [46] describes women as a vulner-
able population: ‘...Data on the increased injuries to women has been derived
from several natural disaster sites. Women are uniquely vulnerable. Seventy
percent of world’s poor are women. Their vulnerability is accentuated by race,
ethnicity, and age. Women often have fewer financial means and less decision-
making power. Women more frequently bear the burden of meeting needs of
family, and their health and safety directly impacts the outcome of their
children’. While pre-disaster baseline data availability is limited in many
countries, and longitudinal follow-up of children following disasters is oner-
ous, the long-term negative and traumatic effects of disasters on children are
well documented [37].

2.2.3 Refugee Families

In the case of displacement due to man-made disasters, the mental and physical
health needs of women and children must be considered. Refugee families, which
include pregnant women and mothers, fathers or other primary caregivers of chil-
dren, often face difficult social and environmental conditions prior to arriving in the
United States. As entrance into the United States can be up to a 10-year process, some
families may arrive after living for extended periods in internment camps or
experiencing other adverse experiences such as the raiding of houses and villages,
lack of job opportunities, and food insecurity. Having limited or no access to health
care and witnessing or experiencing violence, including systematic sexual violence,
can contribute to physical and emotional trauma. Many of the difficulties described
here impact all refugees; the particular economic, health, and mental health vulner-
abilities of women, children, and families are also highlighted.
Life after arrival is difficult for refugee families. This adjustment period may be
challenging or frustrating as a result of culture shock compounded by the adversity
encountered in the home country and transition to the United States. The first
3 months after arrival are often described as a “honeymoon period” due to the
support and assistance from a resettlement agency’s case manager [47]. However,
over the following months individual well-being and family functioning may appear
to deteriorate as the actual struggles of resettlement continue without a dedicated
case worker to assist [48]. Families are expected to find work to sustain their own
economic livelihood while caring for their children, even though it may take a
year or more to become proficient in English and fully acquainted with U.S. systems.
In addition, given the cultural and linguistic differences, resettled refugee families
may also develop urgent health concerns. For example, women who are refugees or
displaced from war-torn countries have higher unmet health needs which could
impact current or future pregnancies [49]. These needs can be met via pathways to
trauma-informed services and formal and informal supports so that pregnant and
childbearing-age refugee women have access to social support, health education,
adequate nutrition (folic acid and other vital nutrients), dental care, routine, obstetric
and gynecological health care, and counseling. Refugee families may also face
cultural differences regarding hygiene and views of family planning, as well as
difficulty understanding and navigating U.S. health care systems. For example, a study indicated
that Sudanese refugee women were hesitant to partake in Western perinatal traditions
such as receiving an epidural to ease pain or going to the hospital to deliver in fear
that they will have to have a C-section [50]. Limited health literacy, such as
misunderstanding a health problem as explained by a health provider or not knowing
how to access and use medications, may also result from cultural and
linguistic barriers. Chronic conditions including diabetes and high blood pressure
due to the constant stress, anxiety, and trauma of a forced migration may also be
experienced by families. Health conditions manifest in children as well. For instance,
resettled children from African, Middle Eastern, and Asian countries had a high
prevalence rate of dental caries, stool parasites, malnutrition, tuberculosis, anemia,
and elevated blood lead levels upon arrival [51, 52].
Mental health concerns and access to mental health services are essential for
refugee families. Depression, re-traumatization, lack of mental health assessments
for children and youth, and limited counseling services in various languages are
frequent concerns. In particular, the mental health of refugee children is an
area of concern due to their increased risk of psychological problems [53]. For
example, Ugurlu et al. [54] examined the mental health concerns of Syrian refugee
children between the ages of 7 and 12 in order to implement a mental health
intervention. At baseline, these children had a high risk of developing post-
traumatic stress disorder and showed symptoms for depression, trait anxiety, and
state anxiety [54]. There is also disruption in children’s education, due to lack of
early learning experiences for toddlers and preschoolers and large gaps in schooling
(sometimes several years) for older children. As refugee families may be at risk for
medical and mental health problems, psychosocial and health assessment is an
important first step in identifying unmet needs and connecting families to services.
In Florida, once approved to resettle in the United States, families are connected to
and sponsored by a resettlement agency for up to 90 days. With limited funds, a case
manager at the resettlement agency provides general support for the families by
assisting with enrolling children into school, setting up appointments with a state
refugee health manager to receive immunizations and to conduct a medical assess-
ment, and securing affordable housing with furniture. Case managers also assist in
the process of applying for jobs and for benefits such as Social
Security, Medicaid, and SNAP. Additional assistance is provided to especially
vulnerable individuals who are pregnant or have special health care needs.
While services are intended to last for up to 90 days, support may be cut short due
to financial constraints from the agency. The child welfare system must also consider
its policies for supporting families with unclear or limited citizenship status so that
children are safe but families are not unduly targeted [55]. It is recommended to
encourage refugee families to foster relationships with individuals in their area and
for the local community to create coalitions to provide support for families. Addi-
tionally, health and social services providers should employ well-trained bilingual,
culturally competent, and trauma-informed staff.

2.2.4 Zika Virus

One example of how infectious disease can impact maternal and child health
populations was the Zika virus outbreak of 2016. Public health efforts included
health education campaigns to prevent mosquito bites, environmental mosquito
control efforts, and Zika testing and epidemic investigations. Because the outbreak
emerged so forcefully and swiftly, all public health sectors had to come together to
address the issue, including: infectious disease tracking and surveillance; testing and
laboratory technology and capacity; obstetric and pediatric care through all access
venues; and public awareness and health education and prevention programs. It was
recognized early on that the Zika virus had deleterious effects on developing fetuses,
therefore attention was placed on pregnant women in Florida, with particular
emphasis on women or their partners who had traveled to Florida from Zika affected
areas and those who were living in areas of high Zika infection (e.g. South Miami)
[56]. One challenge with identifying and containing the risks of Zika to maternal and
child health populations is that half of the pregnancies were unplanned [41]. The
window of detection for Zika through blood testing is narrow and the sensitivity and
specificity of the tests varied. Additionally, the Zika virus is largely asymptomatic.
Therefore, it is plausible that many pregnant women were unaware of their preg-
nancy status and may not have yet exhibited symptoms of Zika infection. Thus,
Florida provided free Zika testing to all pregnant women in the state. Furthermore,
the subsequent effects of Zika on the developing fetus were unknown. As such, a
policy was implemented that all newborns testing positive for Zika at birth were
automatically eligible – and recommended – for assessment and early intervention
services if indicated through the Early Steps IDEA Part C funded program. This
eligibility was later expanded to any infant born to a mother who had tested positive
for Zika during her pregnancy [57].
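One way to see why universal screening of a largely asymptomatic, low-prevalence population is challenging is to work through the positive predictive value of a test. The figures in the sketch below (Python) are purely illustrative assumptions rather than Florida’s actual prevalence or test performance, but they show how even a reasonably specific test yields many false positives when infection is rare, which is part of why confirmatory testing and follow-up protocols matter.

    def positive_predictive_value(prevalence, sensitivity, specificity):
        # Share of positive results that are true positives (Bayes' rule).
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * (1 - specificity)
        return true_pos / (true_pos + false_pos)

    # Illustrative assumptions only: 0.5% prevalence, 90% sensitivity, 98% specificity.
    ppv = positive_predictive_value(prevalence=0.005, sensitivity=0.90, specificity=0.98)
    print(f"Under these assumptions, about {ppv:.0%} of positive screens are true infections.")
    # With these illustrative numbers the result is roughly 18%.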
The strengths of Florida’s systemic response to the Zika outbreak were the
coordinated response between Zika outbreak investigators, epidemiologists and
mosquito control personnel, and rapid dissemination of information to health care
providers via local health departments. Strengths also include the development of
process maps to facilitate communication and coordination among state and local
agencies, rapid organizing and response among community-based programs such as
Healthy Start Coalitions and other non-profits, public-private partnerships, and
policy changes (e.g. access to testing and early intervention services). Limitations,
driven largely by funding and capacity, were present in Zika outreach, prevention,
testing, and follow up protocols. For example, one issue that was insufficiently
addressed is that maternal and child health populations do not exist in isolation. In
fact, because Zika virus can also be transmitted sexually, efforts were made to
contain the virus among sexual partners. However, as epidemiology investigation
and health education systems (e.g. those who work in HIV reduction) are keenly
aware, the challenge in addressing such an issue is that many may not see them-
selves as at risk. Overall, risk perceptions of Zika were fairly low, and limited to
women considering pregnancy at the time [58, 59] and were responsive to media
outreach [60]. Furthermore, many infants followed up in pediatric clinics did not
receive the recommended care because families saw no apparent problems immedi-
ately after birth, though it is now known that infants exposed to Zika in utero are at
risk of brain and eye abnormalities, neural tube defects, central nervous system
dysfunction, and other sequelae [61]. Fortunately, the development of the U.S. Zika
Pregnancy and Infant Registry, updated and monitored in Florida through the Florida
Birth Defects Registry (FBDR.org), increases capacity for an early alert system for
Zika-related conditions, and facilitates monitoring of pregnancy, infant and child
outcomes.

2.2.5 Hurricanes

The third example to consider for disaster planning specific to maternal and child
populations is extreme weather events. In Florida, preparations for hurricane season
are underway every year. Some special accommodations have been made
for children with special health care needs and people with disabilities, including a
special needs registry, special needs shelters, the Disaster Supplemental Nutrition
Assistance Program, emergency childcare, breastmilk banks, and special accommo-
dations at hospital birth centers.
There are, however, gaps in ensuring that women receive uninterrupted access to
contraception, pregnancy testing, and prenatal care, particularly for people who had
limited healthcare access in normal conditions and also for those who have been
displaced or sheltered due to evacuation or following hurricane damage. In 1996,
Morrow and Enarson conducted a “gendered analysis” of issues encountered by
women (and the intersection of gender with race/ethnicity and class) in the aftermath
of Hurricane Andrew. They found that caregiving responsibilities coupled with
household and community loss, particularly among migrant workers, recent
immigrants, single mothers, and those coping with domestic violence,
resulted in a unique and profound set of stressors [62]. Furthermore, it is known that
the trauma from disruption in secure housing and basic needs can influence the
intrauterine environment throughout a pregnancy and may negatively impact fetal
development and birth outcomes (e.g. preterm birth and low birth weight). Contin-
uous healthcare is essential for monitoring and supporting a healthy pregnancy.
Continuity between prenatal, birth and postpartum care is extremely important for
both mothers’ and babies’ health, as 1 in 33 births is affected by a birth defect or
other condition, and 8% of births result in low birth weight or preterm birth, which
places the baby at increased risk for mortality and morbidity.
Pregnant women are also at risk of adverse health outcomes in the perinatal
period such as premature membrane rupture and premature labor, which can result in
babies being born preterm or with low birth weight during disasters [63]. Exposure
to a hurricane can cause fetal distress. Negative outcomes such as spontaneous
abortions, stillbirths and intrauterine growth restriction (IUGR) are also risks
[64]. Furthermore, the risks to mothers’ health in the postpartum period are largely
unrecognized. The pregnancy-related mortality rate in the U.S. continues to rise, with
2014 rates of 12.4/100,000 for White women and 40.0/100,000 for Black women.
Causes of pregnancy-related deaths from 2011 to 2014 included cardiovascular
diseases (15.2%), non-cardiovascular diseases (14.7%), infection or sepsis (12.8%),
postpartum hemorrhage (11.5%), and others [65]. Perinatal depression and anxiety
affect one in seven and an estimated one in ten women, respectively [66], and severe
maternal depression is most common within 30 days of delivery; however, episodes
occur both during and after pregnancy [67]. In addition to adverse birth outcomes,
pregnant women are more likely to experience physical and mental health problems
compared to the broader population during disasters [68]. Beyond health care,
continuous adequate nutrition, quality sleep, and sufficient shelter are vitally impor-
tant to maintaining maternal and fetal health. Due to economic impacts of disasters,
pregnant women may have decreased access to healthcare, and supply of prenatal
vitamins may also be disrupted [19]. Mothers who have been displaced from their
homes may have limited access to clean water and food [69]. Breastfeeding mothers
affected by disasters experience physical and emotional stress due to disruption of
their daily routine. For nursing mothers who end up in shelters, there is often a lack of
privacy and of a sanitary environment to support breastfeeding.
We must also consider the developmental and social needs of infants and young
children, which depend upon the availability and well-being of their caregivers. High
levels of stress during disasters increase the likelihood of postpartum depression
(PPD). In Florida, 58% of women experienced some postpartum depression symp-
toms in 2010, and moderate to severe PPD affects 11–18% of this population
nationwide [70]. Children are particularly vulnerable in disasters because
they lack the ability to care for themselves, relying on their parents, caregivers or
guardians for their basic needs such as food, shelter, emotional and economic
support [19].
Disasters result in serious physical and emotional stress for children [71]. The
extent of the stress is largely dependent on how quickly their families can recover
from the aftermath of the disaster. Stress management and social support are
extremely critical for healthy functioning in mothers and other caregivers as well
as for young children at a time when their brains are developing exponentially.
Hurricanes can lead to separation between children and their caregivers or put their
parents in positions where they are no longer able to care for them either temporarily
or permanently. Acute Stress Reaction (ASR) is a common reaction in children
following a disaster [72]. This occurs as a result of disruption in their normal daily
routine. This can be addressed by providing for immediate daily needs of the child
and developing routines that are similar to the child’s regular activities. For example,
a major daily activity in a child’s life is education. Due to the destructiveness of
hurricanes, schools may be destroyed, and education may be paused for some
time. Some children develop post-traumatic stress disorder (PTSD) [72], which is
usually diagnosable about 4 weeks after a traumatic event. Support services that
build resilience, such as temporary schools, support groups, counseling, attachment
objects, and mental health services, can buffer the immediate and transgenerational
traumatic effects of chaotic environments on children and adults [73, 74]. Housing
should be provided that accommodates the whole family, including the mother and
child’s support system: significant partners and co-parents, grandparents, other
family members, and pets.

Example 2 Aging Population


The population in the United States and much of the world is aging. It is
estimated that one-fifth of the US population will be aged 65 and older by 2030
[75]. The “oldest old” – those aged 85 and over – are the fastest growing
elderly age group [75]. By 2050, the number of older Americans will almost
double to 84 million, up from 43 million in 2012 [75]. Moreover, this aging
population will become more racially and ethnically diverse [75]. Older
Whites are estimated to increase by 46% from 2014 to 2030, while Blacks
and Hispanics are expected to increase by 90% and 137%, respectively [76].
The state of Florida is a bellwether of the nation’s aging and changing
demographic composition and has the highest proportion of older adults
relative to their total population [26, 27]. Currently, 16.8% of Floridians are
aged 65 and older compared to 12.4% for the nation as a whole. In 2030, adults
aged 65 and older are estimated to represent 24.1% of Florida’s total popula-
tion [28].
This large aging US population is especially vulnerable to disasters due to a
complex combination of physical, psychological, and social factors [26, 77].
In 2012, over 60% of older adults managed two or more chronic conditions
such as coronary heart disease, diabetes, hypertension, chronic obstructive
pulmonary disease, and arthritis [78], which are typically associated with
impaired physical mobility and functional limitations [26, 77, 79].
Alzheimer’s and other dementias are perhaps the most debilitating chronic
condition that older adults have to contend with [80]. Chronic conditions
related to the heart (i.e., cardiovascular disease, diabetes, and hypertension)
are also associated with a higher risk of dementia [80]. In 2017, one in ten
older adults (65 years and older) had Alzheimer’s dementia, which
is characterized by impaired memory, cognition, and communication skills, all
of which affect one’s ability to perform daily living activities [80]. Alzheimer’s
disease also increases significantly with age: 17% of people age 75–84, and
32% of people age 85 and older have Alzheimer’s dementia [80]. Blacks and
Hispanics are more likely to be disproportionately burdened by chronic condi-
tions including Alzheimer’s dementia [78, 80]. Older adults’ health care needs
and physical limitations have been shown to hinder their adaptability during
disasters [26, 79].
Many older adults live in poverty or have limited financial reserves, which
increases their vulnerability to disasters [81]. In 2016, approximately 14.5% of
adults ages 65 and older lived in poverty with higher rates among the “oldest
old” (18.9%) and people in relatively poor health (21.5%) [82]. Poverty rates
for older Blacks and Hispanics are more than two times higher than that of
Whites [83]. Low-income older adults also tend to have limited transportation
options [79, 84]. Research shows that older adults with low-income or trans-
portation issues are less likely to leave their home in the event of a
government-ordered evacuation [77]. Evacuation can also exacerbate health
conditions for frail older adults. Older adults are at increased risk for abuse,
neglect, and exploitation during and after a disaster [56]. Even when they
survive a disaster, older adults face more economic hardships because they are
less likely to have work opportunities or have financial support from family
members [85]. The evidence indicates that physical, psychological, and social
factors impair older adults’ ability to prepare, respond, or recover from a
disaster [26, 79, 80], which have serious implications for this large aging
population.
The health care needs and physical limitations of aging adults in disasters
are reflected in the catastrophe of Hurricane Katrina in 2005 and the accom-
panying flooding, which resulted in 1330 deaths; 75% of the victims were over age
60 and 47% were over age 75 [79, 86, 87]. An estimated
200,000 evacuees with chronic conditions lacked access to needed medica-
tions and medical care during and after Hurricane Katrina [79]. There were
few resources in place to evacuate the frail elderly, or care for them in
emergency shelters [88]. Roughly 60% of nursing homes were able to suc-
cessfully evacuate residents [89]. Survivors also experienced psychological
problems including post-traumatic stress disorder, anxiety-mood disorders,
and suicidality [90]. The elderly were more adversely impacted than any other
group during and in the first year after Hurricane Katrina [87].

2.2.6 Hurricane Irma

Hurricane Irma, which hit Florida in 2017, provides insights into the vulnerabilities
of older adults during and following a disaster. Irma ranked as the fifth-costliest
hurricane to impact the U.S., causing $50 billion in damage, mostly in Florida
[91]. There was catastrophic damage to power infrastructure, medical facilities,
communities, and the economy [91, 92]. Longer duration of power outages was
found in rural and minority communities, and among individuals with sensory,
physical and mental disabilities [91]. Of the 129 deaths caused by Irma in Florida,
Georgia, and North Carolina, 123 (95.3%) were in Florida and the median age of
victims was 63 years [92]. Fourteen of the deaths occurred in a nursing home that
lost power and air conditioning after the storm [92]. Overall, approximately 90% of
the deaths were indirectly related to the hurricane with the most common indirect
cause of death being exacerbation of a chronic medical condition [91, 92]. Medical
conditions were exacerbated due to power outages (which impacted heat and cooling
systems), loss or disruption of emergency transportation services, loss or disruption
of usual access to medical care, and induced stress or anxiety [92].
Hurricane Irma prompted one of the largest evacuations in U.S. history of over six
million people [93]. Evacuation is one of the few ways to reduce hurricane-related
morbidity and mortality [94]; however, many older adults were reluctant to evacuate
during Irma due to a host of factors including fear of dangerous environments, loss
of property, language/cultural barriers, and lack of financial resources [95]. Nursing
homes were evacuated due to power outages and the lack of air conditioning
systems. However, evacuation of nursing home residents resulted in medication
reconciliation emergencies, especially for older adults with dementia [96]. Evacuation of the
elderly population was also hindered by roadway disruptions and ineffective emer-
gency plans [95].
As evidenced by Hurricane Irma, older adults bear a disproportionate burden of
disaster-related morbidity and mortality. Irma showed that emergency plans were not
quite effective in meeting the safety and medical needs of older adults, especially
frail older adults. Although Florida passed a law requiring emergency generators at
long-term care facilities in response to the nursing home deaths [97], older Floridians
are still at risk of higher fatalities from loss or disruption of access to medical care
after a hurricane. A 2018 study of Palm Beach County, Florida showed that access to
functional hospitals with emergency care during hurricanes is disproportionately
compromised for the elderly compared with the nonelderly [98]. The 2018 hurricane season
indicated that Florida needs to be better prepared to meet the diverse needs of older
adults especially in regard to evacuations and effective sheltering [95].

3 Conclusion

This chapter highlighted the complex and unique needs of vulnerable populations
facing catastrophic consequences during and after a disaster. Globally, disaster
trends are increasing for both man-made and natural disasters. In
particular, the United States is experiencing an increase in the frequency and
intensity of natural disasters, as the 2017 hurricane season illustrated. As noted in
Benevolenza and DeRigne [99], ‘...individuals who are vulnerable
to the effects of extreme weather, namely the poor, the elderly/disabled, children,
prisoners, and substance abusers have experienced heightened levels of mental,
emotional, and bodily stress due to natural disaster exposure’. To explain the
complexities faced by these populations, case studies were included to demonstrate
the vulnerabilities and ensuing consequences for women and children and older
adults during or after disasters. This is not just a regional issue but rather a global
health security issue.

References

1. Shaluf IM (2007) Disaster types. Disaster Prev Manag: Int J 16(5):704–717. https://doi.org/10.
1108/09653560710837019
2. UN ESCAP (2015) Overview of natural disasters and their impacts in Asia and the Pacific,
1970– 2014 ESCAP technical paper information and communications technology and disaster
risk reduction division. http://www.unescap.org/sites/default/files/Technical%20paper-Over
view%20of%20natural%20hazards%20and%20their%20impacts_final_1.pdf
3. Grid (2018) Spotlight: the Atlantic hurricane season and the importance of resilience. Retrieved
from http://www.internal-displacement.org/global-report/grid2018/downloads/report/2018-
GRID-spotlight-atlantic-hurricane-season.pdf
4. Wisner B, Blaikie P, Cannon T, Davis I (2004) At risk: natural hazards, people’s vulnerability,
and disasters, 2d edn. Routledge, London
5. Wolkin et al (2015) Reducing public health risk during disasters: identifying social vulnerabil-
ities. J Homel Secur Emerg Manag 12(4):809–822. https://doi.org/10.1515/jhsem-2014-0104
6. Morrow BH (1999) Identifying and mapping community vulnerability. Disasters 23(1):1–18
7. Cutter SL, Mitchell JT, Scott MS (2000) Revealing the vulnerability of people and places: a
case study of Georgetown County, South Carolina. Ann Assoc Am Geogr 90(4):713–737
8. Cutter SL, Boruff BJ, Shirley WL (2003) Social vulnerability to environmental hazards. Soc Sci
Q 84(2):242–261
9. O’Brien G, O’Keefe P, Rose J, Wisner B (2006) Climate change and disaster management.
Disasters 30(1):64–80
10. Hutton D (2010) Vulnerability of children: more than a question of age. Radiat Prot Dosim 142
(1):54–57
11. Phillips BD, Thomas DS, Fothergill A, Blinn-Pike L (2010) Social vulnerability to disasters.
CRC Press, Boca Raton
12. Flanagan BE, Gregory EW, Hallisey EJ, Heitgerd JL, Lewis B (2011) A social vulnerability
index for disaster management. J Homel Secur Emerg Manag 8(1)
13. Enarson E (2007) Identifying and addressing social vulnerabilities. In: Waugh WAT, Tierney
JK (eds) Emergency management: principles and practice for local government. ICMA Press,
Washington, DC, pp 257–278
14. Masys AJ (2013) Human security- a view through the lens of complexity. In: Gilbert T,
Kirkilionis M, Nicolis G (eds) Proceedings of the European Conference on complex systems
2012. Springer Proceedings in Complexity, pp 325–335
15. Zoraster (2010) Vulnerable populations: Hurricane Katrina as a case study. Prehosp Disaster
Med 25(1):74–78
16. Gamble JL, Balbus J, Berger M, Buoye M, Campbell V, Chief K et al (2016) The impacts of
climate change on human health in the United States: a scientific assessment. US Global Chang
Res Program:247–286. https://doi.org/10.7930/J0Q81B0T
17. Jones EC, Gupta SN, Murphy AD, Norris FH (2011) Inequality, socioeconomic status, and
social support in post-disaster mental health in Mexico. Hum Organ 70(1):33–43. https://doi.
org/10.17730/humo.70.1.4h340201207274qj
18. Cannon T (2008) Vulnerability, “innocent” disasters and the imperative of cultural understand-
ing. Disaster Prev Manag 17(3):350–357. https://doi.org/10.1108/09653560810887275
19. Hoffman S (2009) Preparing for disaster: protecting the most vulnerable in emergencies, vol 42.
University of California, Davis, pp 1491–1544. Retrieved from https://lawreview.law.ucdavis.
edu/issues/42/5/articles/42-5_Hoffman.pdf
20. National Council on Disability (2006) The impact of hurricanes Katrina and Rita on people with
disabilities: a look back and remaining challenges. Retrieved from https://www.ncd.gov/
rawmedia_repository/e89f084e_e132_496c_a5b8_56351dfb3f10.pdf
21. US Census (2018) Quickfacts. 2018 population estimates. Retrieved from https://www.census.
gov/quickfacts/fact/table/US/INC110217
22. McPherson M, Arango P, Fox H, Lauver C, McManus M, Newacheck PW et al (1998) A new
definition of children with special health care needs. Pediatrics 102(1):137–139
23. Child and Adolescent Health Measurement Initiative (2012) Who are Children with Special
Health Care Needs (CSHCN). Data Resource Center, supported by Cooperative Agreement
1-U59-MC06980-01 from the U.S. Department of Health and Human Services, Health
Resources and Services Administration (HRSA), Maternal and Child Health Bureau
(MCHB). Retrieved from www.childhealthdata.org. Revised 4/2/12
24. Child and Adolescent Health Measurement Initiative. 2016-2017 National Survey of Children’s
Health (NSCH) data query. Data Resource Center for Child and Adolescent Health supported
by Cooperative Agreement U59MC27866 from the U.S. Department of Health and Human
Services, Health Resources and Services Administration’s Maternal and Child Health Bureau
(HRSA MCHB). Retrieved 1/19/19 from www.childhealthdata.org
25. Martin JA, Hamilton BE, Osterman MJK, Driscoll AK, Drake P (2018) Births: final data for
2017. Natl Vital Stat Rep 67(8):1–50. National Center for Health Statistics, Hyattsville
26. Horner MW, Ozguven EE, Marcelin JM, Kocatepe A (2018) Special needs hurricane shelters
and the ageing population: development of a methodology and a case study application.
Disasters 42(1):169–186
27. Hyer K, MacDonald G, Black K, Badana ANS, Murphy SP, Haley WE (2019) Preparing for
Florida’s Older adult population growth with user-friendly demographic maps. Fla Public
Health Rev 14(4):33–44
28. Pepper Institute on Aging and Public Policy (2015) Florida’s aging population: critical issues
for Florida’s future. Retrieved from https://pepperinstitute.fsu.edu/sites/g/files/imported/stor
age/original/application/22487bf23ab907d4c17b39227dccf000.pdf
29. United Nations High Commissioner for Refugees: The UN Refugee Agency. What is a refugee?
(n.d.). Retrieved from https://www.unrefugees.org/refugee-facts/what-is-a-refugee/
30. Florida Department of Children and Families (2017) Refugee services: statistics for Florida 2016.
Retrieved from http://www.dcf.state.fl.us/programs/refugee/reports/2017/Pdf/2017Pop3OriginbyAge.pdf
31. Florida Department of Health [FDOH] (2018) Surveillance and control of selected mosquito-
borne diseases in Florida 2014 guidebook. Available at http://www.floridahealth.gov/diseases-
and-conditions/mosquito-borne-diseases/_documents/2014/arboguide-2014.pdf
32. National Oceanic & Atmospheric Administration [NOAA], Hurricane Research Division (Updated
August 1 2018) Hurricane direct hits on the mainland U.S. coastline and for individual states by
Saffir/Simpson category1851–2017. Retrieved from http://www.aoml.noaa.gov/hrd/tcfaq/E19.html
33. Sadri AM, Ukkusuri SV, Murray-Tuite P, Gladwin H (2015) Hurricane evacuation route choice
of major bridges in Miami Beach, Florida. Trans Res Rec: J Trans Res Board 2532:164–173.
https://doi.org/10.3141/2532-18
34. Marshall J (2011) Infant neurosensory development: considerations for infant child care. Early
Childhood Educ J 39:175–181. https://doi.org/10.1007/s10643-011-0460-2
35. Buitelaar JK, Huizink AC, Mulder EJ, de Medina PGR, Visser GH (2003) Prenatal stress and
cognitive development and temperament in infants. Neurobiol Aging 24:S53–S60
36. Mulder EJ, De Medina PR, Huizink AC, Van den Bergh BR, Buitelaar JK, Visser GH (2002)
Prenatal maternal stress: effects on pregnancy and the (unborn) child. Early Hum Dev
70(1–2):3–14
37. Masten AS, Narayan AJ (2012) Child development in the context of disaster, war, and
terrorism: pathways of risk and resilience. Annu Rev Psychol 63:227–257
38. Rees S, Inder T (2005) Fetal and neonatal origins of altered brain development. Early Hum Dev
81(9):753–761. https://doi.org/10.1016/j.earlhumdev.2005.07.004
39. Cusick SE, Georgieff MK (2016) The role of nutrition in brain development: the golden
opportunity of the "first 1000 days". J Pediatr 175:16–21
40. Schore AN (2001) Effects of a secure attachment relationship on right brain development, affect
regulation, and infant mental health. Infant Ment Health J 22(1–2):7–66
41. Ahrens KA, Thoma ME, Copen CE, Frederiksen BN, Decker EJ, Moskosky S (2018)
Unintended pregnancy and interpregnancy interval by maternal age, national Survey of Family
Growth. Contraception 98(1):52–55. https://doi.org/10.1016/j.contraception.2018.02.013
42. Ginige K, Amaratunga D, Haigh R (2016) Mainstreaming women into disaster reduction in the
built environment. Disaster Prev Manag 25(5):611–627
43. Jacobs MB, Harville EW (2015) Long-term mental health among low-income, minority women
following exposure to multiple natural disasters in early and late adolescence compared to
adulthood. Child Youth Care Forum 44(4):511–525. https://doi.org/10.1007/s10566-015-9311-4
44. Swatzyna RJ, Pillai VK (2013) The effects of disaster on women’s reproductive health in
developing countries. Global J Health Sci 5(4):106–113
45. Xu Y, Herrman H, Tsutsumi A, Fisher J (2013) Psychological and social consequences of
losing a child in a natural or human-made disaster: A review of the evidence. Asia Pac
Psychiatry 5(4):237–248
46. Goodman AK (2016) In the aftermath of disasters: the impact on women’s health. Crit Care
Obstet Gynecol 2(6):29
47. Nawyn SJ (2010) Institutional structures of opportunity in refugee resettlement: gender, race/
ethnicity, and refugee NGOs. J Sociol Soc Welf 37(1):149–167. Retrieved from https://
scholarworks.wmich.edu/jssw/vol37/iss1/9
48. Utržan DS, Wieling EA (2018) A phenomenological study on the experience of Syrian asylum-
seekers and refugees in the United States. Fam Process 1–20. doi: https://doi.org/10.1111/famp.
12408
49. Carolan M (2010) Pregnancy health status of sub-Saharan refugee women who have resettled in
developed countries: a review of the literature. Midwifery 26(4):407–414. https://doi.org/10.
1016/j.midw.2008.11.002
50. Higginbottom GM, Safipour J, Mumtaz Z, Chiu Y, Paton P, Pillay J (2013) “I have to do what I
believe”: sudanese women’s beliefs and resistance to hegemonic practices at home and during
experiences of maternity care in Canada. BMC Pregnancy Childbirth 13(1):1. https://doi.org/
10.1186/1471-2393-13-51
51. Shah A, Suchdev P, Mitchell T, Shetty S, Warner C, Oladele A, Reines S (2014) Nutritional
status of refugee children entering DeKalb County, Georgia. J Immigr Minor Health 16(5):959.
https://doi.org/10.1007/s10903-013-9867-8
52. Yun K, Matheson J, Payton C, Scott KC, Stone BL, Song L et al (2016) Health profiles of newly
arrived refugee children in the United States, 2006–2012. Am J Public Health 106(1):128–135.
https://doi.org/10.2105/AJPH.2015.302873
53. Reed RV, Fazel M, Jones L, Panter-Brick C, Stein A (2012) Mental health of displaced and
refugee children resettled in low-income and middle-income countries: risk and protective
factors. Lancet 379(9812):250–265. https://doi.org/10.1016/S0140-6736(11)60050-0
54. Ugurlu N, Akca L, Acarturk C (2016) An art therapy intervention for symptoms of post-
traumatic stress, depression and anxiety among Syrian refugee children. Vulnerable Child
Youth Stud 11(2):89. https://doi.org/10.1080/17450128.2016.1181288
55. Pine BA, Drachman D (2005) Effective child welfare practice with immigrant and refugee
children and their families. Child Welfare 84(5):537–562. Retrieved from https://www.ncbi.
nlm.nih.gov/pubmed/16435650
56. Cubanski J, Orgera K, Damico A, Neuman T (n.d.) How many seniors are living in poverty?
National and state estimates under the official and supplemental poverty measures in 2016.
Henry J. Kaiser Family Foundation. Retrieved from https://www.kff.org/medicare/issue-brief/how-many-seniors-are-
living-in-poverty-national-and-state-estimates-under-the-official-and-supplemental-poverty-
measures-in-2016/
57. Florida Department of Health (2017) Zika incident response playbook. Retrieved from http://
www.floridahealth.gov/diseases-and-conditions/mosquito-borne-diseases/_documents/zika-
playbook.pdf
58. Thompson EL, Vamos CA, Jones J, Liggett LG, Griner SB, Logan RG, Daley EM (2018)
Perceptions of Zika virus prevention among college students in Florida. J Community Health 1–7
59. Delaney A, Mai C, Smoots A, Cragan J, Ellington S, Langlois P et al (2018) Population-based
surveillance of birth defects potentially related to Zika virus infection—15 states and US
territories, 2016. Morb Mortal Wkly Rep 67(3):91
60. Sell TK, Watson C, Meyer D, Kronk M, Ravi S, Pechta LE et al (2018) Frequency of risk-
related news media messages in 2016 coverage of Zika virus. Risk Anal 38(12):2154–2514.
https://doi.org/10.1111/risa.12961
61. Kessler RC, Galea S, Gruber MJ, Sampson NA, Ursano RJ, Wessely S (2008) Trends in mental
illness and suicidality after hurricane Katrina. Mol Psychiatry 13(4):374–384
62. Morrow B, Enarson E (1996) Hurricane Andrew through women’s eyes: issues and recom-
mendations. Int J Mass Emerg Disasters 14:5–22
63. Antipova A, Curtis A (2015) The post-disaster negative health legacy: pregnancy outcomes in
Louisiana after hurricane Andrew. Disasters 39(4):665–686. https://doi.org/10.1111/disa.12125
64. Sato M, Nakamura Y, Atogami F, Horiguchi R, Tamaki R, Yoshizawa T, Oshitani H (2016)
Immediate needs and concerns among pregnant women during and after typhoon Haiyan
(Yolanda). PLoS Curr 8. https://doi.org/10.1371/currents.dis.29e4c0c810db47d7fd8d0d1fb782892c
65. Centers for Disease Control and Prevention (CDC) Division of Reproductive Health, National
Center for Chronic Disease Prevention and Health Promotion (2018) Pregnancy mortality
surveillance system. Retrieved from https://www.cdc.gov/reproductivehealth/
maternalinfanthealth/pregnancy-mortality-surveillance-system.htm
66. Kendig S, Keats JP, Hoffman MC, Kay LB, Miller ES, Simas TAM et al (2017) Consensus
bundle on maternal mental health: perinatal depression and anxiety. J Obstet Gynecol Neonatal
Nurs 46(2):272–281
67. França UL, McManus ML (2018) Frequency, trends, and antecedents of severe maternal
depression after three million US births. PLoS One 13(2):e0192854
68. Oni O, Harville E, Xiong X, Buekens P (2015) Relationships among stress coping styles and
pregnancy complications among women exposed to hurricane Katrina. J Obstet Gynecol
Neonatal Nurs 44(2):256–267. https://doi.org/10.1111/1552-6909.12560
69. Gribble KD, McGrath M, MacLaine A, Lhotska L (2011) Supporting breastfeeding in emer-
gencies: protecting women’s reproductive rights and maternal and infant health. Disasters 35
(4):720–738. https://doi.org/10.1111/j.1467-7717.2011.01239.x
70. Florida Department of Health (2010) Pregnancy Risk Assessment Monitoring System
(PRAMS). Postpartum depression. Retrieved from http://www.floridahealth.gov/statistics-
and-data/survey-data/pregnancy-risk-assessment-monitoring-system/_documents/reports/
depression2010.pdf
71. Kousky C (2016) Impact of disasters on children. Futur Child 26(1):73–92. Retrieved from
https://files.eric.ed.gov/fulltext/EJ1101425.pdf
72. Garcia-Ortega I, Kutcher S, Abel W, Alleyne S, Baboolal N, Chehil S (2012) Mental health and
psychosocial support in disaster situations in the Caribbean: core knowledge for emergency
preparedness and response. Chapter 9, pp 73–89. Retrieved from https://www.paho.org/disasters/
index.php?option=com_docman&view=download&alias=1980-mental-health-and-psychosocial-support-in-disaster-situations-in-the-caribbean-chapter-9&category_slug=books&Itemid=1179&lang=en
73. Goodman RD, West-Olatunji CA (2008) Transgenerational trauma and resilience: improving
mental health counseling for survivors of Hurricane Katrina. J Ment Health Couns 30(2):121–
136. https://doi.org/10.17744/mehc.30.2.q52260n242204r84
74. Watamura S, Phillips DA, Morrissey T, McCartney K, Bub K (2011) Double Jeopardy: poorer
social-emotional outcomes for children in the NICHD SECCYD experiencing home and child-
care environments that confer risk. Child Dev 82:48–65. https://doi.org/10.1111/j.1467-8624.
2010.01540.x
75. Ortman JM, Velkoff VA, Hogan H (2014). An aging nation: The older population in the United
States, Current Population Reports, P25-1140. U.S. Census Bureau, Washington, DC
76. Fact Sheet: Aging in the United States. Population Reference Bureau (2016). Available at http://
www.prb.org/Publications/Media-Guides/2016/aging-unitedstatesfact-sheet.aspx
77. Douglas R, Kocatepe A, Barrett AE, Ozguven EE, Gumber C (2017) Evacuating people and
their pets: older Floridians’ need for and proximity to Pet-Friendly Shelters. J Gerontol Ser B:
gbx119. https://doi.org/10.1093/geronb/gbx119
78. Ward BW, Schiller JS, Goodman RA (2014) Multiple chronic conditions among US adults: a
2012 update. Prev Chronic Dis 11:E62. Published 2014 Apr 17. https://doi.org/10.5888/pcd11.
130389
79. Aldrich N, Benson WF (2007) Disaster preparedness and the chronic disease needs of vulner-
able older adults. Prev Chronic Dis 5(1):A27
80. Alzheimer’s Association (2018) Alzheimer’s disease facts and figures. Alzheimers Dement.
2018 14(3):367–429
Natural and Manmade Disasters: Vulnerable Populations 161

81. Al-Rousan TM, Rubenstein LM, Wallace RB (2014) Preparedness for natural disasters among
older US adults: a nationwide survey. Am J Public Health 104(3):506–511. https://doi.org/10.
2105/AJPH.2013.301559
82. Cannon T (2008) Reducing people’s vulnerability to natural hazards: communities and resil-
ience, WIDER Research Paper, No. 2008/34., ISBN 978-92-9230-080-7. The United Nations
University World Institute for Development Economics Research (UNUWIDER), Helsinki
83. Komisar HL (2012) Key issues in understanding the economic and health security of current
and future generations of seniors. Henry J. Kaiser Family Foundation. Retrieved from: https://
kaiserfamilyfoundation.files.wordpress.com/2013/01/8289.pdf
84. Gibson MJ, Hayunga M (2006) We can do better: lessons learned for protecting older persons in
disasters
85. Schröder-Butterfill E, Marianti R (2006) A framework for understanding old-age vulnerabil-
ities. Ageing Soc 26(1):9–35. https://doi.org/10.1017/S0144686X05004423
86. Knowles R, Garrison B (2006) Planning for the elderly in natural disasters. Disaster Recover J 19
(4):1–3
87. Adams V, Kaufman SR, van Hattum T, Moody S (2011) Aging disaster: mortality, vulnerabil-
ity, and long-term recovery among Katrina survivors. Med Anthropol 30(3):247–270
88. Nigg JM, Barnshaw J, Torres MR (2006) Hurricane Katrina and the flooding of New Orleans:
emergent issues in sheltering and temporary housing. Ann Am Acad Polit Soc Sc 604
(1):113–128. https://doi.org/10.1177/0002716205285889
89. Norris M (2005) Assessing nursing homes’ responses to Katrina. Accessed from: http://www.
npr.org/templates/transcript/transcript.php?storyId¼4854893 http://www.npr.org/templates/
story/story.php?storyId¼4854893&ps¼rs
90. Kessler RC, Angermeyer M, Anthony JC, DE Graaf R, Demyttenaere K, Gasquet I et al (2007)
Lifetime prevalence and age-of-onset distributions of mental disorders in the World Health
Organization’s World Mental Health Survey Initiative. World Psychiatry 6(3):168–176
91. Mitsova D, Esnard A-M, Martin JAM, Sapat A, Hamilton JB, Lai BSE, Osterman MJK,
Driscoll AK, Drake P (2018) Socioeconomic vulnerability and electric power restoration
timelines births: final data for 2017. Natl Vital Stat Rep 67(8):1–50. Hyattsville, MD: National
Center for Health Statistics
92. Issa A, Ramadugu K, Mulay P, Hamilton J, Siegel V, Harrison C, Campbell CM, Blackmore,
Bayleyegn T et al (2018) Deaths related to hurricane Irma – Florida, Georgia, and North
Carolina, September 4–October 10, 2017. MMWR Morb Mortal Wkly Rep 67(30):829–832.
https://doi.org/10.15585/mmwr.mm6730a5
93. Wong S, Shaheen S, Walker J (2018) Understanding evacuee behavior: a case study of
hurricane Irma. Transportation Sustainability Research Center, Berkeley. Retrieved from
https://escholarship.org/uc/item/9370z127
94. Ricchetti-Masterson K Horney J (2013) Social factors as modifiers of hurricane Irene Evacu-
ation Behavior in Beaufort County, NC. PLOS Curr Disasters., Edition 1. https://doi.org/10.
1371/currents.dis.620b6c2ec4408c217788bb1c091ef919
95. Schwarzenberg SJ, Georgieff MK (2018) Advocacy for improving nutrition in the first 1000
days to support childhood development and adult health. Am Acad Pediatr 144(2):e20173716.
https://doi.org/10.1542/peds.2017-3716
96. Thompson CA (2017) Nursing home evacuation turns medication reconciliation into emer-
gency. Am J Health-Syst Pharm 74(22):1836–1838. https://doi.org/10.2146/news170075
97. Spurlock WR, Rose K, Veenema TG, Sinha SK, Gray-Miceli D, Hitchman S et al (2019) American
Academy of Nursing on policy position statement: disaster preparedness for older adults. Nurs
Outlook 67(1):118–121
98. Prasad S (2018) Measuring and comparing hospital accessibility for Palm Beach County’s elderly
and nonelderly populations during a hurricane. Disaster Med Public Health Prep 12(3):296–300
99. Benevolenza MA, DeRigne L (2019) The impact of climate change and natural disasters on
vulnerable populations: a systematic review of literature. J Hum Behav Soc Environ 29(2):266–281
Global Sexual Violence

Sara Spowart

1 Introduction

1.1 Sexual Violence Defined

Sexual violence is a term that has become increasingly recognized in our society
with the Me-Too movement and greater awareness of the various forms that this
global issue takes. Our stereotypes and conceptions about sexual violence have
progressed to encompass a much larger understanding of the ways that sexual
violence is perpetrated and its impact on victims. According to the World Health
Organization definition, sexual violence is “any sexual act, attempt to obtain a sexual
act, unwanted sexual comments or advances, or acts to traffic, or otherwise directed,
against a person’s sexuality using coercion, by any person regardless of their
relationship to the victim, in any setting, including but not limited to home and
work” [45]. This definition is broader than many others and encompasses both the individual and societal impact of sexual violence. Sexual violence can range from verbal harassment to physical penetration and can occur through coercion, social pressure, intimidation or physical force. Narrower definitions also exist that include only physical force or the threat of violence [44, 45].

S. Spowart (*)
College of Public Health, University of South Florida, Tampa, FL, USA
e-mail: sspowart@health.usf.edu


1.2 Situations Sexual Violence Occurs

Sexual violence occurs in a variety of situations and circumstances throughout the world. These situations include rape within marriage and intimate partner relationships, and rape by strangers and acquaintances.
strangers and acquaintances. It also occurs with sexual harassment, which may
include unwanted sexual advances or harassment in school, work, or other situations
where there are power imbalances or the aggressor is in a position of authority.
Sexual violence also takes the form of systematic rape, sexual slavery and other
types of sexual violence in armed conflict such as forced impregnation or forced
abortions. It also occurs among people with mental or physical disabilities, those who are dependent on caretakers, and among children [45].
Sexual violence occurs worldwide and is a global issue; however, data on the issue vary and are not as reliable as other forms of health-related information. Sources of data on sexual violence include population-based surveys, police reports, and studies from nongovernmental organizations and clinical settings. Globally, rates of sexual violence are largely underestimated; some reports estimate that 80–95% of sexual violence cases are not reported to the police [45]. Although sexual violence happens to both men and women, the vast majority of survivors are female, and both men and women largely underreport. Underreporting occurs for many reasons, such as societal stigma, poor support systems, shame, fear of blame, not being believed, retaliation or being socially ostracized. In many societies the victim is blamed for the violence they have experienced and may face societal or family rejection, retaliation or even further violence such as honor killings or being forced to marry their aggressor [45].

1.3 Global Prevalence

A World Health Organization multi-country study defined sexual violence as being
physically forced to have sexual intercourse without consent, to have sexual inter-
course out of fear of one’s partner, or being forced to perform a degrading or
humiliating sexual act [10]. This multi-country study found that the younger the age at first sexual intercourse, the more likely it was to have occurred through force or coercion. The study also found that women aged 15–49 years reported differing levels of sexual
violence in intimate partner relationships. Specifically, Japan and Serbia reported
the lowest levels of intimate partner sexual violence at 6% of women experiencing
this form of sexual violence; Thailand and Tanzania reported mid-range levels at
approximately 29–30% of women; Bangladesh and Ethiopia experienced the highest
levels of intimate partner sexual violence at 50% and 59% of women
respectively [10].
There is limited data on sexual violence by non-partners globally. A majority of
this data comes from police reports, justice records, crime surveys, retrospective
investigations on child abuse and rape crisis centers. The WHO multi-country study
found that 0.3–12% of women in the study had been forced to have sexual intercourse
after age 15 by a non-partner. The perpetrator may have been a family member,
friend, acquaintance or stranger. A majority of studies find that most women know
their aggressors even though they are not intimate partners [2, 45]. In a South African study, one in five men reported raping a woman who was not an intimate partner. Non-partner sexual violence is a particular issue in humanitarian crises such as conflict, post-conflict and refugee settings, or crises related to drought, famine, natural disasters or other events that create structural instability. However, in general, for a large portion of women in the world the first experience of sexual intercourse is forced; the proportion of women reporting forced first intercourse ranges from 1% in Japan to 30% in Bangladesh. Yet more investigation is warranted, as the terminology used in these surveys may affect results. Surveys that ask about first sexual intercourse using the term ‘unwanted’ instead of ‘forced’ yield estimates of sexual violence several times higher. Therefore, the terminology used in multi-country studies may be critical for obtaining more accurate estimates of sexual violence worldwide. In addition, it is important to note that in much of the world violence against women is accepted and seen as a cultural norm, so a woman may not label her experience as sexual violence but rather as something like ‘unwanted sexual contact’ [2, 45].
Through My Eyes: Forced Sexual Initiation
I told my dad when I was 15 that I had a boyfriend. He took me into my room and raped
me. I was a virgin. It never happened again. He said he was teaching me about sex. –
Adapted recount from an adult survivor of sexual violence
Through My Eyes: Non-Partner Sexual Violence
He broke onto my property and raped me vaginally and anally first with his penis and
then with a knife. He choked me while he did it. I never saw his face. He ran away after. I
went in the house and hid in the bathroom for hours while I bled from where the knife cut
me. He was never caught. My family blamed me. I never met or knew this person. It has
ruined my life, I am a different person now. He is there in my mind whenever I close my eyes,
and it happened more than 10 years ago. –Adapted recount of an adult sexual violence
survivor
Through My Eyes: Sexual Violence by Intimate Partners
We had been together 10 years, he never hurt me once. . . I told him I didn’t want to be
with him anymore and he put a gun to my head and raped me. He kept me hostage for a week
in his apartment and raped me repeatedly. –Adapted recount of an adult sexual violence
survivor

1.4 Childhood Sexual Abuse

Childhood sexual abuse is defined as “the involvement of a child in sexual activity
that he or she does not fully comprehend, is unable to give informed consent to, or is
not developmentally prepared for, or that otherwise violates the laws or social taboos
of society, by adults or other children in positions of responsibility, trust, or power
over the victim” [42]. Child sexual abuse is prevalent on a worldwide scale.
Approximately 20% of women and 5–10% of men report having been sexually
abused as children [42]. It is a difficult subject to research, as the topic is largely taboo and reporting rates do not reflect the actual incidence.
of both child and adult sexual abuse is not reported. Additional issues with
researching childhood sexual abuse include varying definitions worldwide of child-
hood and sexual abuse and the ethics of researching these issues. Childhood sexual
abuse is widespread but the prevalence on a worldwide scale is largely unknown,
although recently there have been new studies that have covered minimally inves-
tigated regions of the world [35].
A 2004 WHO review estimated the global prevalence of childhood sexual abuse at about 27% for girls and 14% for boys, with abuse generally more common among girls than boys. However, some recent studies in Asia have found the rates for girls and boys to be very similar. In a 2009 study in Swaziland of 1242 girls and women aged 13–24 years, 33% of participants had experienced sexual violence before age 18. The most likely perpetrators were men or boys the participant knew from their neighborhood, or boyfriends and husbands, and the abuse usually occurred in the perpetrator’s home [25]. In a study comparing rates of child sexual abuse before age 15 in three Central American countries, 4.7% of children in Guatemala, 7.8% in Honduras and 6.4% in El Salvador reported experiencing sexual abuse before age 11. Similar to the Swaziland study, participants usually knew their perpetrators before they were abused [34].
Most perpetrators of childhood sexual abuse are not brought to justice and spend no time in jail for their crimes. This is true in both developed and developing countries and is a global concern. Despite the use of forensic evidence and rape kits, laws and judicial systems are more likely to protect perpetrators than victims. Children are often afraid to come forward for fear of retaliation or because the perpetrator is a father, brother, cousin, neighbor, uncle or someone else in their close circle. When children do come forward, they are often not believed or listened to, and may even be threatened with harm [35].
Through My Eyes: Childhood Sexual Abuse
I can’t stay in my room anymore. Daddy’s friend hurt me and I can’t sleep in the dark
now, I need night lights, I keep seeing his shadow. –Adapted recount from a minor child
survivor of sexual violence

1.5 Human Trafficking

Trafficking is nothing less than a modern form of slavery, an unspeakable and unforgiveable
crime against the most vulnerable members of the global society. (Sjöberg [33], p. 1)

Human trafficking is defined as “use of coercive, deceptive means or abuses of
position of power for exploitation or forced sex work and various other forms of
labour” [42]. Human trafficking is prevalent worldwide. An estimated 11.4 million
women and girls are trafficked globally [13]. Germany is the world’s largest market for the sex trafficking of women and children, and the United States is the second largest market and destination. An estimated 18,000–50,000 people are trafficked into the United States every year; the remaining trafficking victims in the US are US citizens or residents, often from vulnerable situations such as missing children, children of drug addicts, runaways, or foster children. However, all of these statistics are only estimates, as the actual numbers are unknown. Approximately 90% of individuals trafficked into the United States are female, and close to half of all trafficking victims are children [30].
The CIA provides a global overview of human trafficking activity and ranks which countries need to be watched depending on the severity and level of activity. The U.S. State Department is also involved in monitoring and addressing sex trafficking concerns. Solutions for addressing sex trafficking, according to the U.S. Department of Justice, include:
1. New sentences that reflect the severity of the crime of human trafficking and the
damage it causes to lives.
2. State and local police training and education to understand prostitution in relation
to sex trafficking and how to identify sex trafficking victims.
3. State and local police awareness training to identify human trafficking on an
everyday basis. A study conducted by Klueber [17] found that 83 of the largest
police departments in the United States were unaware of trafficking as a criminal
issue in their area. In addition to the lack of awareness, there was also little or no
training in trafficking issues and the police did not understand the various forms it
takes but saw it as an issue of organized crime [17].
4. Victim services, witness support services and victim reintegration programs
should be used while removing victims from trafficking situations and to keep
them protected so they can heal and reintegrate back into society [30].
5. Police and prosecutors need to understand the laws that can be applied to prosecute traffickers, including laws against involuntary servitude, sex trafficking by force, fraud or coercion, forced labor and seizure of documents. Other possibly relevant laws include immigration and tax laws, which should be explored for prosecution [30].
Although sex trafficking is a criminal concern, it is also largely an economics
issue. It is a way for traffickers to financially profit from the vulnerability of certain
populations, specifically impoverished females and children with little if any familial
support. Economics is a key link in understanding the connection between vulner-
able individuals and human trafficking. With large scale globalization of economic
markets worldwide, human trafficking of the most vulnerable populations continues
to rise. Demand for sex work by higher income populations, relatively easy access to
the internet and mobile phones, the influence of globalization, minimal criminal
consequences and high potential profits make this a growing worldwide industry.
The poverty that exists with the most vulnerable, particularly in rural areas world-
wide, increases the chances of both women and children being coerced, manipulated,
kidnapped or sold into human trafficking. The high market demand for sex work, the large number of vulnerable women and children available to sex traffickers worldwide, the difficulty of identifying and prosecuting traffickers, and police corruption all mean that the sex trafficking industry continues to grow
[15, 30]. Groups that are particularly at risk include chronic runaways and homeless
youth, youth with chronic truancy history, individuals with prior sexual trauma and
abuse, LGBTQ youth, undocumented immigrant youth, youth involved in child
welfare and juvenile justice, youth with severe substance abuse issues and youth
with friends or family involved in the commercial sex industry [15, 16].
Red flags exist for identifying potential victims of trafficking. These signs include branding tattoos, new or expensive accessories, an older controlling boyfriend or girlfriend, being unable to answer questions about where they have been for periods of time, being treated for multiple STDs and abortions, having physical trauma or scarring, and being unable to look others in the eye. However, these are just some potential red flags. It is not uncommon for sex trafficking victims to claim they are older than their actual age and that they are prostitutes by choice acting independently. The average age at which someone is trafficked into the industry is 12–14 years. Other potential red flags are the lack of a birth certificate, social security card or identification card. It is not uncommon, for example, for US sex trafficking victims to be US citizens by birth but have no identification, social security card or birth certificate because these items have been taken from them by their traffickers. Sometimes victims’ names are even changed or they are called by new aliases [15, 16].
The public needs to be educated to understand both the local and global levels of
trafficking that exist, red flags that exist for victims of sex trafficking, the vulnera-
bilities that traffickers prey upon, and the implications sex trafficking has for
societies and economies. The long-term costs associated with illegal activity and economies based on human trafficking may include increased levels of crime, violence, drug use and drug trafficking, reduced workforce potential due to the mental and physical health issues resulting from trafficking, and limited workforce capacity due to limited skillsets and training. In the short term, human sex trafficking may lead to
higher profits than drug trafficking and have less risk and fewer consequences in an
economically globalized world. Traffickers may earn 5–20 times the original price
they paid for buying a trafficked woman or child. At this point in time, the benefits to
the trafficker far outweigh the costs and risks. This is true worldwide. However, even
though there can be a lot of moral charge and reaction to the issue of sex trafficking,
it is predominantly an economic issue. Consumer demand creates these markets.
Educating the general public, addressing the economic vulnerabilities of the most
at-risk populations, and enforcing severe consequences so that the costs far outweigh
the benefits for traffickers are all important for decreasing or eliminating sex
trafficking [15, 30].

1.5.1 Significant Legislation in Human Trafficking

The first comprehensive federal law that addressed human trafficking was the
Trafficking Victims Protection Act (TVPA) of 2000. This act defined human
trafficking as a person induced to perform labor or a commercial sex act through
force, fraud or coercion (Trafficking Victims Protection Act of 2000, Sec. 103(8)(A–
B), [26]). This is arguably the most important anti-trafficking law ever passed. A key
feature of TVPA is that physical transportation from one location to another is not a
requirement of sex trafficking. This is critical for understanding the different forms
of human trafficking that exist and the various ways it functions. This act works to
combat human trafficking through prevention, protection and prosecution. This law
has gone through multiple revisions since 2000. These revisions have increasingly focused on acknowledging domestic trafficking and the different forms it takes. In 2003 a federal civil right of action was created for victims to sue their traffickers. In 2005 a pilot program for minor trafficking victims was created, with grant programs to help law enforcement combat human trafficking (Trafficking Victims Protection Act of 2000, [26]). In 2008, new prevention strategies concerning workers’ rights, new systems to gather data on human trafficking, an expanded T visa, and enhanced criminal sanctions to increase prosecution rates and make prosecution easier were implemented. In 2013, TVPA became part of the Violence Against Women Act (VAWA), and the State Department added emergency response provisions for disaster areas susceptible to human trafficking, measures to prevent child marriage, and measures addressing supply chains used to obtain trafficking victims. In 2017, proposed revisions focused on the demand side of human trafficking, since demand directly affects supply; however, these changes have not yet been passed (Trafficking Victims Protection Act of 2000, [26]).
A key feature of TVPA 2000 is that it addresses the subtle means of coercion used by traffickers to control victims, and thereby the complexity of the problem. Traffickers control victims through psychological coercion, trickery and seizure of documents, activities that were difficult to prosecute under preexisting involuntary servitude statutes and case law. Victims are also controlled through force and bodily harm (physical or sexual), confinement and hunger; through fraud, including deceitful employment offers or work conditions, false promises and withheld wages; and through coercion, specifically psychological coercion that includes threats to others, trickery, abuse of legal process, and the creation of a climate of fear. Another important component of TVPA is that any person under the age of 18 who performs a commercial sex act is considered a victim of sex trafficking even if force, fraud or coercion is not present. Identifying the commercial aspect of sexual exploitation is important for separating the crime of trafficking from that of sexual assault, rape or molestation (Trafficking Victims Protection Act of 2000, [26]).

1.5.2 Types of Human Trafficking

There are multiple types of sex trafficking. One of these involves “Romeo pimps.” These pimps generally target vulnerable young females, identify a victim’s potential needs and work to fulfill them in order to build trust. This trust is then used to make victims feel a false sense of love, care and loyalty. For many victims, it is the first time they have heard “I love you,” or the trafficker is the first adult figure they could rely on to fulfill a promise. ‘Romeo pimps’ deliver on the stated or observed needs or wants the victim may have; for example, if a victim needs food, a phone or clothes, a Romeo trafficker will provide those items in order to build trust. Once this sense of trust, care and love is built, the trafficker will begin to ask the victim to perform sexual acts with other people for money, or will switch and become abusive and violent, telling the victim that they owe them or that family members will be hurt if they do not comply. The trafficker may turn very abusive, and the temporary love and care the victim experienced is gone. Psychological coercion and manipulation are a big part of this. Victims of Romeo pimps may be recruited from malls, public transportation or metro centers, social media such as Facebook or Instagram, group homes, youth centers, foster care facilities, schools, shelters and detention centers. Globally, runaways and children from low-income backgrounds with little or no familial support are at high risk [15, 32].
Through My Eyes
She went back to her trafficker, he was always there for her and gave her a place to stay
so she wouldn’t be homeless. He shot her in the head last week, she was HIV positive. She’s
dead now. –Adapted recount of an adult survivor of human trafficking and sexual violence

Other forms that trafficking can take include familial trafficking and trafficking by
strangers. Familial sex trafficking is trafficking that is perpetrated by family members. Examples include an uncle drugging his undocumented immigrant 12-year-old niece and selling her to sex traffickers while she is unconscious; a mother regularly selling her 5- and 6-year-old daughters for sex in exchange for drug money; or a father sexually abusing his 15-year-old daughter and selling her for sex to strangers on a regular basis. In these situations, the ‘pimp’ or trafficker is a family member. In some situations the victim may be relocated, as in the example of the uncle selling his niece, but in other situations the victims continue to live with the family members and their income from sex work goes to their traffickers (the family members trafficking them). Some victims are abused even as infants; for example, a mother or father may sell their baby for sexual abuse in exchange for money. Familial sex trafficking can
be a matter of power and control over the victims, a way to earn extra money or fund
an alcohol or drug addiction. It is very difficult to identify and can occur in families
that are well respected. Victims in these situations may have difficulty understanding
their abuse as they have been raised to view violence, rape and sex in exchange for
money as normal. These victims may still attend school and even receive good grades.
They are often cautious of what they say to adults outside of their family and so it may
be difficult to identify signs of their abuse and exploitation [3, 32].

Through My Eyes
I was first sold for sex when I was a baby. I was abused throughout my entire childhood. I
became a drug addict by the time I was 20 and was trafficked outside of the state. The
traffickers knew I could be controlled by the drug addiction. –Adapted recount of an adult
survivor of human trafficking and sexual violence

Traffickers use power and control behaviors and a variety of coercive and abusive tactics to keep victims trapped in a cycle of violence. Physical, sexual, emotional and psychological abuse is used, as well as economic manipulation, to keep victims controlled. Traffickers also threaten to harm the victim or their family and exploit their vulnerabilities. Other methods of control, such as “Stockholm syndrome” or trauma bonding, are also used to maintain control of victims. Trauma bonding occurs in abusive situations, hostage situations, incestuous relationships and any ongoing attachment relationship where considerable pain and abuse are mixed with times of calm and relative peace [3, 32].
Sex trafficking is increasingly facilitated online, with traffickers advertising their victims through websites such as backpage.com. Minors are often posted with false ages and photos to make them appear to be adults, and victims are usually exploited in hotels. The dark net is another avenue through which sex trafficking takes place online. Children and teens are often prostituted online; however, no child or minor can consent to sex work. A child or teen prostitute is by definition a child sex trafficking victim, because a minor cannot legally or developmentally make the decision to engage in commercial sex work [3, 32].
Through My Eyes: Human Trafficking
They took me in a van up to North Carolina and kept me in a basement. I was so drugged
up I thought it was only 2 days, but I was in that basement for 3 months. They kept changing
the clocks. –Adapted recount of an adult survivor of human trafficking and sexual violence
My mother started trafficking me and my sister when we were 5 years old for drug
money. She would make us have sex with lines of men every day. I tried to kill myself and was
hospitalized when I was 14. My sister decided to become an independent prostitute when she
was 14. She is dead now. –Adapted recount of an adult survivor of human trafficking and
sexual violence

1.6 Sexual Violence with Men and Boys

Men and boys also suffer from sexual violence. Although current worldwide trends show that women and girls experience the majority of sexual violence, violence against men and boys is an underdeveloped area of research. Men and boys are even less likely to report than girls, and the vast majority of girls do not report. Men and boys also experience rape, unwanted sexual contact and harassment, in military, prison, home, work, school and street settings. Moreover, the majority of men who experience sexual abuse do not receive treatment or appropriate support. It is arguable that the issues of secrecy and shame are even more pronounced for males, which has created even larger discrepancies and variation in reporting rates than currently exist for female survivors. An important potential piece of prevention is addressing the sexual violence that occurs against young boys, as sexual abuse in early life is associated with becoming a perpetrator later in life [28, 45].
Through My Eyes
I tried to let him (an acquaintance who had become homeless) use my shower and give
him a hot meal and he raped me anally. –Adapted recount from an adult male survivor of
sexual violence
I still think about killing myself. I remember what he did to me when I was 5. They had me
stay at a hospital when I was 7 so I wouldn’t hurt myself. I still have memories of what he did
everyday. –Adapted recount from a minor male survivor of sexual violence

1.7 Sexual Violence in Schools and at Work

Sexual harassment and violence in schools and at work are global issues that affect individuals in every country of the world. Sexual harassment is a type of sexual violence. Unfortunately, there is no universal definition of sexual harassment. Under Title VII of the Civil Rights Act of 1964, the U.S. Equal Employment Opportunity Commission (EEOC) defines sexual harassment as “unwelcome sexual advances, requests for sexual favors, and other verbal or physical conduct of a sexual nature” [22]. Such conduct interferes with an individual’s ability to work, go to school or function in other aspects of their life.
Sexual violence typically occurs in environments where victims feel safe.
Work and school environments are common places where sexual harassment occurs.
In school environments, victims experience abuse by peers and teachers. Research
from the Wellesley Centers for Research on Women and Management Systems
International found that a large portion of girls experience sexual violence in the
form of sexual harassment and sexual assault while commuting to and from school
or work. The abuse also occurs on school premises in places such as school
bathrooms or dorms [20, 41].
A study of primary school age girls conducted in 40 schools in the Machinga district of Malawi identified four types of sexual harassment and abuse at school experienced by these girls: 13.5% experienced sexual touch, 7.8% experienced sexual comments, 2.3% experienced rape and 1.3% experienced unwanted or coerced sex. In addition, in 80% of the schools the teachers knew a male teacher who had pressured a student for sexual intercourse, and in 65% of the schools the teachers knew a male teacher who had impregnated a student [6].
Although the experience of these primary schools may seem bleak, the evidence shows that sexual harassment is widespread and exists in both high-income and low-income countries. By comparison, a large online study of female students in US middle and high schools found that a majority of the 1002 participants experienced sexual harassment at school [12]. Research in the European Union found that 40–50% of women in the EU experience some sort of sexual harassment at work. However, in both high-income and low-income countries the research is still relatively recent and much more needs to be done [9].
Through My Eyes
Men come up to me and ask to buy sex when I’m walking to school. I’m afraid to walk to
school alone. –Adapted recount of a minor survivor of sexual violence
I pay for my school tuition and food and shelter by giving older men sex. They don’t want
to use a condom, but I don’t worry about HIV, why worry about something that can kill you
in 10 years when you have to find some way to eat today? –Adapted recount of a minor
survivor of sexual violence

1.8 Stalking Sexual Violence

Stalking is a form of sexual violence that has historically received less recognition; however, individuals who experience stalking are also at higher risk of other forms of sexual violence or violence by the stalker or perpetrator. Stalking often occurs in the confines of an abusive intimate partner relationship, but it can also involve strangers and acquaintances. The Violence Against Women Act of 2005 defines stalking as “engaging in a course of conduct directed at a specific person that would cause a reasonable person to (A) fear for his or her safety or the safety of others; (B) suffer substantial emotional distress” [40]. Victims of stalking may experience anxiety, insomnia, depression, intrusive thoughts, lost time at work and lower levels of functioning. The vast majority of stalking cases are not reported. Stalking can occur online, through cyberstalking or repeated attempts at contact through the internet or phone, or in person through various mechanisms. Stalking can cause a victim to experience severe distress, emotional instability and intense fear for their safety or the safety of others close to them.
Through My Eyes: Stalking Sexual Violence
He hired a private investigator, which is basically a meth head he paid to follow me
around and take pictures of where I was at. He used to follow me even though there is a
restraining order and it was in violation of his probation. He would even take his ankle
bracelet monitor off so he could follow me. I know he’s going to kill me someday, I just know
it. –Adapted recount of an adult survivor of stalking

1.9 Conflict-Related Sexual Violence

It is perhaps more dangerous to be a woman than a soldier in armed conflict. –Maj. Gen.
(Ret.) Patrick Cammaert, Wilton Park Conference, May 2008.
In a number of contemporary conflicts, sexual violence has taken on particularly brutal
dimensions, sometimes as a means of pursuing military, political, social and economic
objectives.–Report of the Secretary-General Pursuant to Security Council Resolution 1820
(S/2009/362) Paragraph 6.

Conflict-related sexual violence is the least condemned war crime, according to the UN Special Rapporteur on Violence against Women Radhika Coomaraswamy [43]. Sexual violence has been silenced throughout history, and social and religious beliefs have made the topic taboo and one of shame and secrecy. Cultures of shame
and secrecy around sexual violence have helped perpetrators to continue to use
sexual violence in times of conflict. Sexual violence as a weapon only requires
physical force or threat of physical force which is a low-cost form of weaponry.
Sexual violence related to conflict is perpetrated by combatants and can be used as a
weapon of war to inflict torture, to injure, obtain information, dominate, threaten or
punish. Domestic sexual violence often increases during conflict and in post-conflict
environments [43].
Conflict-related sexual violence differs from other forms of sexual violence in that it is usually perpetrated by a stranger, whereas the majority of sexual violence overall is committed by someone the victim knows. Also, in areas where there is conflict-related sexual violence, the mechanisms in place to combat it are generally poor or non-existent. According
to the report of the Secretary-General on Conflict-Related Sexual Violence, countries of particular concern regarding sexual violence in conflict include Afghanistan, the Central African Republic, Colombia, the Democratic Republic of the Congo, Iraq, Libya, Mali, Myanmar, Somalia, South Sudan, Sudan (Darfur), the Syrian Arab Republic, Yemen, Bosnia and Herzegovina, Côte d’Ivoire, Nepal, Sri Lanka, Burundi and Nigeria [31].
The passage of Resolution 1325 by the UN Security Council on October 31, 2000 acknowledged, for the first time in history, that women and children comprise the vast majority of those negatively impacted by armed conflict, especially in terms of the impact armed conflict has on levels of sexual violence. Resolution 1325 provides the foundation for the gender and peacekeeping work of the UN Department of Peacekeeping Operations [24, 37].
Some solutions offered by the United Nations and the International Committee of the Red Cross include assuming that sexual violence is occurring in conflict situations and providing services on that assumption. These services offer an entry point for survivors who need assistance and may otherwise have no support. Providing services on the assumption that sexual violence is happening in armed conflict and humanitarian emergencies also works to create more awareness and decreases the normalization, secrecy and shame of sexual violence in communities [27].
We are still at the beginning of understanding why sexual violence occurs in conflict and post-conflict settings, and why it occurs in societies worldwide in general. Usually, perpetrators are men and are not held accountable for the sexual violence they commit. In armed conflict settings, women are typically told that sexual violence is a woman’s problem and that it is their fault because they are the ones experiencing it. To reduce and prevent sexual violence in conflict areas, it is important to prevent, protect and use early warning indicators for changes in safety and in the movement of women and children due to the closer proximity of armed groups. UN peacekeepers also need to build up their capacity in case conflict increases and to maintain peace, and to deploy women’s protection advisers in conflict-ridden areas. Sexual violence prevention needs to be incorporated into UN peacekeeping pre-deployment training and seen as a crucial part of protecting civilian populations in conflict-ridden areas. Recommendations also include involving women in the peacekeeping process and in peace negotiations during post-conflict transitions [27].
Other recommendations for intervention involve understanding whether particular patterns of rape in conflict areas constitute a practice or a strategy, as this distinction can be useful in shaping approaches to intervention. Somewhat effective interventions have also included connecting UN peacekeeping organizations with armed groups and having those groups sign agreements prohibiting sexual violence. UN peacekeeping organizations can also provide training, with booklets about sexual violence and the importance of its prohibition during armed conflict [27].
A successful example from the International Committee of the Red Cross (ICRC) involves efforts since 2012 to have 15 different armed groups sign an agreement prohibiting sexual violence. According to the ICRC, there have been no violations of the agreement since these 15 groups signed it. For armed groups that have not agreed to sign, UN peacekeeping groups and the ICRC continue to hold ongoing discussions and focus on their timing, in an attempt to assess when certain groups may be more open to discussing the issue. Some particularly patriarchal groups refuse to discuss sexual violence and are very defensive on the topic. Discussions inquiring how groups can prove there have been no sexual violence violations are significant for opening up the issue with more hostile groups. Other aspects of sexual violence in armed conflict that need to be considered are the forced recruitment of combatants and their forced participation in gang rape of civilians. In Colombia, for example, female combatants who were forcibly recruited were often raped, and some of these rapes led to pregnancy and forced abortion by the armed groups [5].
A key aspect of intervention is to have prevention measures, prohibition agreements and discussions on the issue of sexual violence in conflict-prone regions, or with armed groups, before violence even occurs. The topic of accountability needs to be incorporated into all discussions and efforts to combat this issue. In armed conflict, accountability of perpetrators is the exception rather than the rule; it has grown over the last two decades, but it remains the exception rather than the norm. Generally, if a victim comes forward they are blamed and ostracized by their families and communities. In addition, in armed conflict or humanitarian emergency situations the infrastructure to bring justice to victims is usually lacking, or law enforcement is unavailable to investigate violations. Lastly, other impediments are the lack of transportation or childcare needed to report sexual violence and survivors’ lack of funds to access justice [19].
Accountability needs to become the norm rather than the exception in conflict-
related sexual violence. Resilient criminal justice systems are needed that address the
root causes of sexual violence. Systems that hold perpetrators accountable and prevent repeat offenses are needed to reduce incidence and help survivors fully function and reintegrate into society. Interview research shows that what victims want is some assurance of non-repetition of sexual violence and the removal of those who perpetrate rape [19]. Another pervasive issue to address is stigma in relation to sexual violence, which is one of the biggest impediments to progress in conflict situations. For example, many survivors of sexual violence in the Bosnia and Herzegovina conflict are blamed and stigmatized for what happened to them. Rehabilitation efforts such as theater shows put the issue on stage and reenact the stigma survivors experienced. These performances act as a mirror for communities to see what a survivor went through, which decreases stigma, secrecy and feelings of shame surrounding the issue and increases compassion and empathy for survivors and a desire for accountability from perpetrators [19].
There is some overlap between sexual violence in conflict-related settings and sexual violence in humanitarian settings. Both situations are destabilized and see a great deal of rape, gang rape and other sexual violence, and sexual violence is often a product of conflict in humanitarian emergencies. This is an area that should be addressed by all parts of the humanitarian community, in terms of both prevention of and response to sexual violence. Like conflict-related sexual violence, sexual violence in humanitarian emergencies is widespread, and there is a pervasive tolerance for high rates of sexual assault. Tolerance and lack of accountability for all forms of sexual violence against any population group are the norm in conflict-related settings and humanitarian emergencies. The humanitarian sector has traditionally approached the issue of sexual violence through addressing the reproductive health needs of women. However, a significant amount of evidence shows that sexual violence is significantly higher in populations impacted by armed conflict. It is challenging, if not impossible, to know the number of sexual violence cases in humanitarian emergencies and conflict-affected areas; even in stable, high-income regions of the world, only a small number of sexual violence cases are reported, due to stigma, feelings of shame, social retaliation and lack of infrastructure to address these crimes [21].
There is also overlap between conflict-related sexual violence and human sex trafficking. As systems become destabilized, the vulnerability to trafficking and its prevalence and severity increase. Prevention efforts are therefore needed in conflict settings; these measures must address the vulnerabilities that increase the chance of trafficking and improve a population’s resilience. Some efforts to increase capacity include microfinance, educational opportunities, food security work, and addressing gender-based violence and the discrimination that can lead to violence. Other prevention measures for individuals who must relocate due to conflict include providing secure routes for traveling across border areas and securing appropriate documentation, such as marriage and birth registration. In addition, prevention of human trafficking in conflict settings requires cooperation between different stakeholders and partnership between state and non-state actors [36].

Sexual violence is now recognized as a threat to maintenance of international peace and
security, and as such requires a security, justice, and service response. Addressing it
requires more than the spontaneous efforts of a dedicated few, but collective and concerted
intervention at the highest political level and at the level of operations and programmes on
the ground. National ownership, leadership, and responsibility are indispensable to erad-
icating this scourge. –Zainab Hawa Bangura, Special Representative of the Secretary
General on Sexual Violence in Conflict
Through My Eyes
At least 90% if not all the women here have been raped, they are all refugees from the
Democratic Republic of Congo. But you cannot talk to them about it or mention the word
rape. It is a very shameful thing to be raped and the women will lie about it. –In reference to
a conversation regarding a group of female survivors of torture from the DRC

1.9.1 Causes and Risk Factors for Sexual Violence

Sexual violence occurs at multiple levels and is associated with many risk factors. One of the best ways to understand the levels and ways in which violence occurs is through the ecological model, which addresses violence on the individual, relationship, community and societal levels. An individual experiences violence in relation to the multiple levels that exist in their world. The individual level relates to a person’s own characteristics and behaviors, while the relationship level relates to friendships, family and other close relationships. Risk factors at these levels include gang membership, an individual’s use of alcohol or drugs, exposure to domestic violence as a child, physical or sexual abuse as a child, lower education levels, an individual’s own beliefs about and acceptance of violence and sexual violence, and views that men and women are not equal [45]. A study of a large population of men conducted in South Africa found that having raped was associated with certain factors, including higher levels of adversity and trauma in childhood, unequal views on gender equality, having been raped by a man in the past, and participation in transactional sex [14].
Community and societal factors concern the impact that one’s community and society have on an individual’s likelihood of becoming a victim or perpetrator of sexual violence, and they are a critical component of the perpetration of sexual violence worldwide. These factors include: (1) the beliefs and ideas that societies and communities hold about sexual violence; (2) traditional gender and social norms regarding male superiority and patriarchy; and (3) societies that effectively condone sexual violence by not having or not enforcing laws against it. A lack of significant consequences for sexual violence generally supports its continuation [7].
Perpetrators are less likely to go to jail for sexual violence than for other crimes. In the United States alone, out of every 1000 rapes, 995 perpetrators go free with no consequences; this includes abuse of children, adults and the elderly. Seven hundred and seventy out of every 1000 rapes are not reported. Some reasons that victims do not report include fear of retaliation, belief that the police will not help, belief that it was a personal issue, belief that it was not important enough to report, or not wanting the perpetrator to get arrested. Moreover, a majority of sexual assaults worldwide are committed by someone the victim knows as an acquaintance, friend or family member, and there is a fear that reporting could affect relationships with individuals connected to them. For example, reporting an uncle could affect relationships in the rest of the family [7].

1.9.2 Consequences of Sexual Violence Worldwide

There are significant consequences of sexual violence worldwide. Sexual violence occurs in stable and unstable settings; in high-income, middle-income and low-income countries; and in humanitarian settings, conflict settings and everyday ‘normal’ life. Sexual violence is pervasive worldwide and much more common than most realize. Although it is relatively common in both developed and developing regions, we are still learning about the physical, mental, economic and societal impacts that sexual violence has on populations. We now know that childhood sexual violence alone is connected to increased levels of adolescent and adult substance abuse, increased chances of future trafficking, poverty, trauma, poorer health, education and economic outcomes, and a continuation of violence. Sexual violence reduces the potential of societies and economies to develop and imposes a large health burden on society in terms of physical and mental health consequences. For example, the average person who is trafficked is approximately 12–14 years old, and the estimated life span of a trafficked individual once they have entered ‘the life’ is 7 years. Individuals who experience sexual violence suffer from a much higher burden of health issues and trauma. Also, some victims of sexual violence are at risk of becoming perpetrators themselves in the future, thereby continuing the cycle of violence. In addition, babies born from rape are at higher risk of malnutrition, neglect, abuse and abandonment. This is particularly true in conflict-related sexual violence [39].
Worldwide, more girls and women are killed by violence than by traffic accidents and malaria combined. The home is one of the most dangerous places for a woman in terms of violence, and sexual violence makes up a significant part of these deaths [11]. Sexual violence affects the individual, their family and friends, the community and society. There are significant physical and psychological impacts that can affect individuals acutely, chronically and fatally. Suicidal ideation, suicide, PTSD and severe depression are all potential consequences of sexual violence. Potential physical harms include trauma to the reproductive organs and anus, sexually transmitted infections, pregnancy, and unsafe or forced abortions. Mental health effects of sexual violence include PTSD and trauma, anxiety, depression, shame, fear of sex, difficulty functioning in society, an ongoing feeling of fear in one’s life after the violence has occurred, and difficulty caring for one’s children or oneself. Other societal impacts on survivors include victim blaming and rejection by spouse, family or community [21].

Through My Eyes
I didn’t know him, he raped me behind a building and choked me. I can’t work as a nurse
anymore, I am afraid to leave my house. I took a bottle of pills last week and was
hospitalized but I wasn’t trying to kill myself, I just wanted the pain to stop. My husband
thought I tried to kill myself but I didn’t, I just took the whole bottle of pills. –Recount of an
adult survivor of sexual violence

1.9.3 Actions to Address Global Sexual Violence and Change Cultural Norms

Changing Cultural Norms

An important aspect of sexual violence is the set of cultural norms, spoken or unspoken, that surround it. Early thinking on this issue concluded that patriarchal values and patriarchal societies created sexual violence. However, Sweden, a country with one of the highest rates of gender equality in the world, has surprisingly high rates of sexual violence, and this pattern is seen across other Nordic and European countries with high levels of gender equality. Recent research suggests that it is in fact the beliefs about sexual violence held by potential perpetrators and potential victims that lead to its occurrence. For example, the higher the tolerance level, or the lower the level of education and understanding of sexual violence among potential perpetrators or victims, the higher the likelihood of sexual violence in a society, even if there are high levels of gender equality. A society’s rape-supportive attitudes are a good potential predictor of its sexual violence rates. These attitudes may be subtle or overt, but they shape the way perpetrators and victims view and respond to the issue [23].

Creating Higher Levels of Awareness

Other mechanisms for changing cultural norms include creating awareness in fam-
ilies, communities and society as an important part of countering the secrecy that
sexual violence thrives in. Sexual violence tends to thrive when there is lack of
awareness, stigma, shame, or secrecy. Therefore, creating a culture that has
increased awareness of the prevalence of the issue across all population groups
and a better understanding of the reality versus the myths of rape will likely reduce its incidence. Furthermore, part of increasing awareness and changing rape-supportive cultural norms is maintaining a broad, universal definition of sexual violence
as provided by the World Health Organization. As provided at the beginning of the
chapter, according to the WHO, sexual violence is “any sexual act, attempt to obtain
a sexual act, unwanted sexual comments or advances, or acts to traffic, or otherwise
directed, against a person’s sexuality using coercion, by any person regardless of
their relationship to the victim, in any setting, including but not limited to home and
work” [45]. This broad definition provides support for large scale public awareness
campaigns and brings increased understanding of the prevalence of the issue.

Involving Men in the Process for Change

Another part of changing cultural values and understanding of sexual violence includes involving men in the process for change. Historically, rape has been considered a women’s problem, and victims are typically blamed or silenced. Accountability for perpetrators is not the norm; it is the exception in every part of the
world for every form of sexual violence. A vast majority of sexual violence perpe-
trators are men but they have not historically been involved in the process for
change. Women have largely led the movement for change in every part of the
world. Therefore, involving men in confronting sexual violence issues is a critical
missing piece. Also, engaging non-perpetrating males in conversations on how these
issues impact them, what they would like changed and what are the best approaches
and interventions to take is another important consideration. In addition to this, it is
critical to encourage more male survivors to speak out, as both male child and adult
survivors report even less often than female survivors and the vast majority never
receive the support they may need [1].

Showing Leadership

Showing leadership both locally and globally is a critical component of addressing sexual violence. Leaders at all levels of society need to publicly condemn sexual violence against all groups, including children, adolescents, adults and the elderly, and advocate for awareness of rape-supportive attitudes and for education on the importance of nonviolence and healthy sexuality [10].

Changing Laws and Enforcing Existing Laws

Another component of addressing sexual violence and changing cultural norms is changing laws and enforcing already-existing laws and consequences. Sexual violence is the least falsely reported of any crime, yet only 5 out of every 1000 perpetrators see any jail time (0.5% of all perpetrators). Sexual violence is the only
crime where the victim is blamed and even shamed for the assault. Therefore,
enforcing laws and holding perpetrators accountable is an important part of changing
cultural norms. This includes laws that prohibit intimate-partner or domestic vio-
lence, sexual violence, sexual harassment, and trafficking as well as supporting
policies that encourage equality in relationships and challenge traditional patriarchal
values [10].

Taking a Public Health Approach

Another important aspect of addressing sexual violence and changing cultural norms
is taking a public health approach to the problem. This is an approach that addresses
individual, relationship, and community/societal factors. Effective approaches for
sexual violence require a heavy focus on prevention and target entire populations.
This means taking a large-scale approach. By targeting an entire population, the
social stigma, shame and secrecy of the issue is brought to the surface and it is treated
like any other public health concern. By targeting the entire population, the issue is
no longer a ‘women’s issue’ or something to be kept in the closet and not discussed.
It becomes more normalized as an issue for discussion and consideration. In order to
target an entire population, a multi-level, cross-sectoral approach that incorporates
health, education, welfare, technology, entertainment, business and criminal justice
sectors is warranted [45]. Sector diversification helps to ensure that prevention
efforts are more effectively implemented and that survivors are better able to receive
the services and support they need.
Currently the evidence base of effective interventions to decrease sexual violence
is limited. However, shame, secrecy and lack of reporting often empower the issue
and allow it to thrive and continue from generation to generation and become an
institutionalized part of culture. Due to rape crisis centers, nongovernmental orga-
nizations, health care and medico-legal efforts in response to survivor support, there
is now more evidence on how to provide a comprehensive response to survivors than
ever before. These efforts and the resulting data include psychological support,
emergency contraception, treatment or preventive medicine for sexually transmitted
infections, treatment or preventive treatment for HIV, information and medical
access for safe abortion and forensic examination [45].
Sexual violence intervention and prevention efforts are most successful when
they combine a multi-sector approach and address multiple aspects of the issues. It is
not a simple problem and as such requires a multifactorial approach [29]. The health
system response is a critical component to addressing the response of sexual
violence on a global level. Health care providers frequently and unknowingly
come into contact with survivors of sexual violence and victims of human traffick-
ing. This is especially true in urgent care settings where the patients may only come
in and out for a short time with little or no follow-up. One of the key roles health care providers can play is the identification of survivors or victims of sexual violence and connecting them with the resources that will most help the patient. According to the
World Health Organization 2013 report Global and Regional Estimates of Violence
Against Women, one in three women worldwide who have ever had a partner
have experienced sexual violence, physical violence or both by their partner.
This violence increases the health burden and use of health care services [10].

Invest in Sexual Violence Prevention

Invest in programs and interventions that challenge traditional social norms of dominant male discourse and passive femininity that promote male violence. Identify risk factors that contribute to sexual violence, such as exposure to violence in childhood and issues with alcohol or substance abuse. Promote programs
and efforts that increase gender equality such as microfinance efforts, empowering
interventions in food production and water and sanitation for women such as village
wells or home gardens so women do not need to travel alone for long distances to
obtain household goods [10].

Invest in Research and Data Collection

Regular and ongoing national population-based surveys of sexual violence that assess its magnitude, risk factors and consequences are important for addressing sexual violence issues. Ongoing data collection and reporting on all forms of sexual violence should be required, along with support for research that addresses knowledge gaps in poorly understood areas of sexual violence, prevention and interventions for at-risk groups. There also needs to be investment in creating the capacity to do research on sexual violence in low-, middle- and high-income countries [10].

Empathy Training

Other interventions that could make an impact include general population-wide empathy training in relation to issues of gender inequality, power, control and
violence. Also, empathy training for perpetrators or individuals at risk for becoming
perpetrators, and self-compassion training for survivors of sexual violence may
make a positive impact [8].

Address Potential Vulnerabilities

Sexual violence and particularly human trafficking often occurs due to the exploi-
tation of an individual’s vulnerabilities. Some of the biggest vulnerabilities that
victims often experience include (1) the desire for love and secure attachment due
to a history of childhood abuse and unstable home lives, and/or (2) economic
vulnerabilities due to poverty, economic instability and lack of work opportunity.
Economic empowerment strategies, such as microfinance and education that empower vulnerable at-risk populations, are essential for addressing sexual violence
concerns [18].

Strengthen the Role of the Health Sector

The health care sector has a unique opportunity to address sexual violence and sex
trafficking. Incorporating training on the identification, screening and prevention of sexual violence into medical, nursing, public health, and relevant undergraduate and graduate curricula could be effective for decreasing stigma, increasing awareness and equipping practitioners to support survivors throughout their careers [10].

1.10 Organizations Working to Combat Global Sexual Violence

The movement to combat global sexual violence is more pervasive now than at any
time in recent history. It is increasingly seen as a human rights issue that needs to be
addressed and has on-going negative consequences for the health, development and
progress of any society. Organizations that combat sexual violence include the Rape,
Abuse & Incest National Network (RAINN), which is the largest anti-sexual vio-
lence organization in the United States. The organization No Means No Worldwide
is a global rape prevention program. It works to provide training, awareness and
education in regions of the world with particularly high sexual assault rates such as
Kenya and Malawi. Rape Crisis Centers that are government funded and provide
support to rape victims are critical in the fight against sexual violence.
The first Rape Crisis Centers began in the United States in the 1970s. Many of
these began as volunteer efforts, but have become much more professionalized and
active in prevention, education, training, outreach, response, crisis counseling and
forensic exams than ever before. Local rape crisis centers now exist worldwide in
spite of a general stigma against sexual violence survivors. These organizations exist
in major cities in Ghana, Egypt, Kenya, Jordan, Senegal, South Africa, Tanzania,
Argentina, Brazil, Chile, Costa Rica, Dominican Republic, Ecuador, Guatemala,
Jamaica, Mexico, Peru, Venezuela, Australia, China, India, Japan, Malaysia,
New Zealand, Singapore, South Korea, Taiwan, Thailand, Vietnam, Austria, Bel-
gium, Czech Republic, Denmark, England, Finland, France, Germany, Greece,
Iceland, Ireland, Italy, Netherlands, Norway, Russia, Scotland, Spain, Sweden,
Switzerland, Turkey and Wales [38].
However, this list of countries is not exhaustive. Arguably, there are many little-
known organizations throughout the world working to address issues related to
sexual violence. Some of these for example include grassroots organizations such
as Oasis Zimbabwe, Stichting Women’s Initiatives for Gender Justice in the Dem-
ocratic Republic of Congo, Global Change Project Inc., in the United States, Freely
in Hope in Kenya, MADRE in Colombia, Kwakha Indvodza in Swaziland to name a
few. There are an unknown number of organizations, both formal and informal that
are working worldwide to address sexual violence and all the forms it takes.
Mid-size organizations such as Project Concern International as well as institutions
such as the Middlebury Institute of International Studies work to create awareness and creative solutions to issues of global sexual violence. Larger actors such as
governments, the United Nations, the World Bank, the Asia Foundation, CARE, the World Health Organization, the United Nations Population Fund and
UNICEF play a major role in worldwide efforts to reduce and eliminate all forms of
sexual violence.

1.10.1 Critical Thinking and Reflection

Sexual violence against any population group is a human rights issue and causes an
incredible amount of damage, both visible and non-visible to individuals, relation-
ships, communities and societies. The damage to society and damage to the devel-
opment of human potential is arguably unquantifiable because its effects continue far
beyond the actual incidence of sexual violence that occurs. However, for perpetrators, sexual violence may be seen as a low-risk, low-cost solution to a perceived problem. A
trafficker may view trafficking as a way to make a profit, a perpetrator of sexual
violence may feel more empowered over another’s life as a result of the violence
they perpetrate onto more vulnerable groups. Prostitution that turns into sex traf-
ficking may have initially appeared to be a solution to extreme poverty and hopeless
circumstances, such as the situation of many women in sub-Saharan Africa that are
trafficked. Understanding how these issues may be viewed as solutions from another
perspective is potentially important for policy interventions and incidence reduction.
Understanding the role that poverty plays in sex trafficking as well as the thinking
and motivation of perpetrators is important for addressing these issues.
Historically, much of the work on this issue has focused on the impact on the
victim and the experience of the victim. However, it is worthwhile to conduct further
research and investigation into perpetrators of sexual violence, their motivations and
the benefits they receive from this violence. Even though it may be a challenging
question to investigate, to reduce the incidence and prevalence of the problem we
need to ask tough questions, such as: Why is there a demand for 5-year old sex
slaves, what is the benefit to perpetrators to cause this type of harm? What is the
motivation of an intimate partner to rape their partner, what is the benefit to them to
do this? What is the motivation of groups in armed conflict to commit gang rape to
mass numbers of women and children? What is the benefit they are deriving? Better understanding the motivation, drive and thinking of perpetrators may be an important missing piece of the puzzle in addressing the global epidemic of sexual violence in the world today. In addition, it is important to remember that although sexual violence impacts societies and communities and exists on a global scale, it is something that happens to real people. Each
incident is the lived experience of a real person, with real impacts on survivors
and their lives long after the violence has ended.

References

1. Barker G, Ricardo C, Nascimento M, World Health Organization (2007) Engaging men and
boys in changing gender-based inequity in health: evidence from programme interventions
2. Bott S, Guedes A, Goodwin MM, Mendoza JA (2012) Violence against women in Latin
America and the Caribbean: a comparative analysis of population-based data from 12 countries.
Pan American Health Organization, Washington, DC
3. Bruckmüller K, Bullock B, Bush MA, Carranza M, Chen Y, Crim BE et al (2010) Sex
trafficking: a global perspective. Lexington Books, Lanham
4. Bush GW (2002) Remarks at the White House Conference on missing, exploited, and runaway
children. Available at http://whitehouse.gov/news/releases/2002/10/20021002-4.html
5. Cohen DK (2013) Female combatants and the perpetration of violence: wartime rape in the
Sierra Leone civil war. World Polit 65(3):383–415
6. Columbia RH, Kadzamira E, Moleni C (2007) The safe schools program: student and teacher
baseline report on school-related gender-based violence in Machinga District, Malawi. United
States Agency for International Development (USAID), Washington, DC. http://www.usaid.
gov/our_work/cross-cutting_programs/wid/ed/safeschools.html
7. Department of Justice, Office of Justice Programs (2013) Bureau of Justice Statistics, female
victims of sexual violence, 1994–2010
8. Eisenberg N, Eggum ND, Di Giunta L (2010) Empathy-related responding: associations with
prosocial behavior, aggression, and intergroup relations. Soc Issues Policy Rev 4(1):143–180
9. European Commission (1998) Sexual harassment at the workplace in the European Union.
European Commission, Directorate-General for Employment IRaSA, Brussels
10. García-Moreno C, Jansen HAFM, Ellsberg M, Heise L, Watts C (2005) WHO multi-country
study on women’s health and domestic violence against women: initial results on prevalence,
health outcomes and women’s responses, vol 204. World Health Organization, Geneva, pp 1–18
11. Grown C, Gupta GR, Kes A, Projet Objectifs du millénaire (2005) Taking action: achieving
gender equality and empowering women. Earthscan, London
12. Hill C, Kearl H (2011) Crossing the line: sexual harassment at school. American Association of
University Women, Washington, DC
13. International Labour Office (2012) ILO global estimate of forced labour: results and
methodology. International Labour Office, Geneva
14. Jewkes R et al (2011) Gender inequitable masculinity and sexual entitlement in rape perpetra-
tion South Africa: findings of a cross-sectional study. PLoS One 6(12):e29590
15. Kara S (2009) Sex trafficking: inside the business of modern slavery. Columbia University
Press, New York
16. Kandathil R (2005) Global sex trafficking and the trafficking victims protection act of 2000:
legislative responses to the problem of modern slavery. Mich J Gend Law 12:87
17. Klueber SA (2003) Trafficking in human beings: law enforcement response. Doctoral disser-
tation, University of Louisville
18. Krug EG, Mercy JA, Dahlberg LL, Zwi AB (2002) The world report on violence and health.
Lancet 360(9339):1083–1088
19. Leatherman J (2011) Sexual violence and armed conflict. Polity
20. Management Systems International (MSI) (2008) Are schools safe havens for children? Exam-
ining school-related gender-based violence. United States Agency for International Develop-
ment (USAID), Washington, DC
21. Marsh M, Purdin S, Navani S (2006) Addressing sexual violence in humanitarian emergencies.
Glob Public Health 1(2):133–146
22. McLaughlin H, Uggen C, Blackstone A (2012) Sexual harassment, workplace authority, and the
paradox of power. Am Sociol Rev 77(4):625–647
23. Moyano N, Monge FS, Sierra JC (2017) Predictors of sexual aggression in adolescents: gender
dominance vs. rape supportive attitudes. Eur J Psychol Appl Leg Context 9(1):25–31

24. Reilly N (2018) How ending impunity for conflict-related sexual violence overwhelmed the UN
Women, peace, and security agenda: a discursive genealogy. Violence Against Women
24(6):631–649
25. Reza A et al (2009) Sexual violence and its health consequences for female children in
Swaziland: a cluster survey study. Lancet 373(9679):1966–1972
26. Roby JL, Vincent M (2017) Federal and state responses to domestic minor sex trafficking: the
evolution of policy. Soc Work 62(3):201–210
27. Russell W (2007) Sexual violence against men and boys. Forced Migr Rev 27:22–23
28. Russell D, Springer KW, Greenfield EA (2010) Witnessing domestic abuse in childhood as an
independent risk factor for depressive symptoms in young adulthood. Child Abuse Negl
34(6):448–453
29. Samarasekera U, Horton R (2015) Prevention of violence against women and girls: a new
chapter. Lancet 385(9977):1480–1482
30. Schauer EJ, Wheaton EM (2006) Sex trafficking into the United States: a literature review. Crim
Justice Rev 31(2):146–169
31. Secretary-General, U. N (2018) Report of the Secretary-General on conflict-related sexual
violence
32. Shelley L (2010) Human trafficking: a global perspective. Cambridge University Press,
New York
33. Sjöberg M (2016) Rape victim and perpetrator blame: effects of victim ethnicity, perpetrator
ethnicity, participant gender, and participant ethnicity
34. Speizer IS et al (2008) Dimensions of child sexual abuse before age 15 in three Central
American countries: Honduras, El Salvador, and Guatemala. Child Abuse Negl 32(4):455–462
35. Stoltenborgh M, Van Ijzendoorn MH, Euser EM, Bakermans-Kranenburg MJ (2011) A global
perspective on child sexual abuse: meta-analysis of prevalence around the world. Child Maltreat
16(2):79–101
36. United Nations Office on Drugs and Crime (2018) Countering trafficking in persons in conflict
situations. United Nations, Vienna
37. UN Security Council, Resolution 1325 (2000) Distr.: General, October 31, 2000
38. University of Minnesota (2010) Handbook of international centers for survivors of sexual
assault and harassment. Regents of the University of Minnesota, Minnesota
39. University of Pittsburgh (2005) Protecting children born of sexual violence and exploitation in conflict zones: existing practice and knowledge gaps. University of Pittsburgh, Pittsburgh
40. Violence Against Women and Department of Justice Reauthorization Act of 2005 (2013).
Article Sec.(3)(a)(24), Act No. H. R. 3402 of January 5, 2006 (PDF). Retrieved 12 February
41. Wellesley Centers for Research on Women, DTS (2003) Unsafe schools: a literature review of
school-related gender-based violence in developing countries. United States Agency for Inter-
national Development (USAID), Washington, DC
42. WHO, International Society for Prevention of Child Abuse and Neglect (2006) Preventing child
maltreatment: a guide to taking action and generating evidence. World Health Organization,
Geneva
43. Women UN (2010) Addressing conflict-related sexual violence–an analytical inventory of
peacekeeping practice. United Nations, New York
44. World Health Organization (2010) Preventing intimate partner and sexual violence against
women: taking action and generating evidence. World Health Organization, Geneva
45. World Health Organization (2012) Understanding and addressing violence against women.
Retrieved from http://apps.who.int/iris/bitstream/10665/77434/1/WHO_RHR_12.37_eng.pdf
Global Health Security and Weapons
of Mass Destruction Chapter

Chris Reynolds

1 Introduction

The global proliferation of weapons of mass destruction (WMD) presents a clear and
present danger to global health security. Unlike conventional weapons that confine
themselves to a defined and targeted area, WMDs cross international boundaries and borders and thus threaten global health security. This chapter will focus on the ease of access to WMDs, the impact biological weapons and bioterrorism play on global health security, United States global policies on public health, and the role state and non-state actors play in the global health landscape. Chemical, biological,
radiological, nuclear, and explosive weapons, also known as weapons of mass
destruction (WMD), have the potential to kill thousands of people in a single
incident [1]. This fact, in and of itself, is why a thorough understanding of the threat
is necessary. In addition, this chapter will focus on global WMD proliferation
prevention to include international efforts, treaties, and conventions. The chapter
will conclude with a discussion of ongoing research initiatives, identification of
emerging threats, and additional recommended readings.

2 Global Proliferation

There is nowhere on the planet that is truly safe from the weapons of mass
destruction (WMD) threat. Indeed, the threat to public health by the deliberate use of
these weapons impacts every living person. Whether by hostile nations or by
terrorism, these weapons present a clear and present danger to all nations. The
proliferation of WMD has risen dramatically since the fall of the former Soviet

Union and other failed nation states. WMD includes chemical, biological, radiolog-
ical, nuclear weapons and explosives. In addition, there is great concern regarding
the spread of scientific knowledge among terror groups who can produce chemical
and biological weapons with little technical expertise. Both the proliferation of
WMD and the spread of scientific knowledge to terrorists present a global WMD threat.
Dennis Blair [2] states in a DNI Threat Assessment, “Most terrorist groups that
have shown some interest, intent or capability to conduct CBRN attacks have
pursued only limited, technically simple approaches that have not yet caused large
numbers of casualties. In particular, we assess the terrorist use of biological agents
represents a growing threat as the barriers to obtaining many suitable starter cultures
are eroding and open source technical literature and basic laboratory equipment can
facilitate production” [2]. In his opening statement before the Senate Committee on
Homeland Security and Governmental Affairs, Senator Joseph Lieberman, summa-
rized his concerns surrounding global proliferation of legitimate biotechnology
research and expertise. As Senator Lieberman noted, “...while so much of a benefit
in so many ways, also creates this problem because that work can be used to create
weapons of mass bioterror” [3].

Attacks Using WMD


• 1994 (Chemical) – Sarin in Matsumoto, Japan killing 8 and hospitalizing
200
• 1995 (Chemical) – Sarin in a Tokyo subway, 12 dead, 5500 affected
• 1995 (Explosive) – Large vehicle bomb (LVB) attack at the Federal Building in Oklahoma City,
168 dead
• 1995 (Radiological) – Chechens placed a 14 Kilo package of radioactive
Cesium 137 and explosives in a Moscow park
• 2001 (Biological) – Anthrax released through mail system in the United
States, kills 5, thousands affected
• 2013 (Chemical) – The Syrian government and opposition trade accusa-
tions over a gas attack that killed some 26 people, including more than a
dozen soldiers, in the town of Khan al-Assal in northern Syria. A
U.N. investigation later finds that sarin nerve gas was used, but does not
identify a culprit
• 2017 (Chemical) – More than 90 people are killed in a suspected nerve gas
attack on the town of Khan Sheikhoun in the rebel-held Idlib province.
Victims show signs of suffocation, convulsions, foaming at the mouth and
pupil constriction. Witnesses say the attack was carried out by either
Russian or Syrian Sukhoi jets. Moscow and Damascus deny responsibility.
• 2017 (Chemical) – Assassination of Kim Jong Nam, the exiled half brother of North Korean dictator Kim Jong Un, by two women assailants in the Kuala Lumpur International Airport in February 2017. Malaysian officials later announced that toxicologists found VX nerve agent on his face.

The fall of the Soviet Union and the Warsaw Pact nations raised concern that WMD held by these nations would find their way onto the black market and perhaps be made available to terrorists. This added dimension makes proliferation even more difficult to mitigate. The potential for non-state actors, which include both domestic and international terrorists, to successfully obtain access to WMDs is a very real threat to the safety and security of all people [4]. One should realize that a nexus exists between WMD and terrorism. The driving motivation of terrorists is to inflict fear and create destruction to achieve their goals. The prospect of a terrorist faction successfully obtaining WMD poses one of the gravest risks to civilization. A successful WMD terror attack could potentially kill thousands and result in many thousands more casualties. Likewise, the social, political, and economic impacts of such an attack would threaten the
civilized world. The interconnected nature of people, economies, and international
infrastructure around the world can infuse seemingly isolated or remote events with
global consequences [5].
Significant efforts have been made in the United States and other countries to
eliminate the threat of the spread of WMD. In 2003, President George W. Bush launched the Proliferation Security Initiative (PSI), which was designed to stop the global trafficking of WMD. Unveiled in Krakow, Poland on 31 May 2003, the PSI outlined a new cooperative interdiction arrangement outside of treaties and multilateral export control regimes [6]. PSI is not a
program housed in only one agency, but instead is a set of activities with participation
by multiple U.S. agencies and other countries [7]. In its December 2002 National
Strategy to Combat Weapons of Mass Destruction (WMD) Proliferation, the Bush
Administration articulated the importance of countering proliferation once it has
occurred and managing the consequences of WMD use. In particular, interdiction of
WMD-related goods gained more prominence. U.S. policy sought to “enhance the
capabilities of our military, intelligence, technical, and law enforcement communities
to prevent the movement of WMD materials, technology, and expertise to hostile
states and terrorist organizations” [8]. President Bush’s efforts follow a long line of
previous efforts to curb the proliferation of WMD (Illustration 1).

Illustration 1 Organization for the Prohibition of Chemical Weapons [9]

The Biological Weapons Convention, signed in 1972, prohibited the development, production, stockpiling, acquisition, retention or transfer of biological weapons and was the first multilateral disarmament treaty banning an entire category of WMD. While this was a valiant attempt, the agreement
fell short as there was no built-in verification mechanism [10]. In 1993, the landmark Chemical Weapons Convention was opened for signature in Paris, with 130 countries agreeing to the elimination of chemical weapons [9]. It also established the Organization for the Prohibition of Chemical Weapons (OPCW), whose mission is to ensure that the objectives outlined in the CWC are carried out and that its provisions are implemented. This includes verification of compliance with CWC directives [43, 44]. The year 2007 marked the tenth anniversary of the
CWC, which boasted 182 member states. Between 1993 and 2007, approximately 25,000 metric tons of chemical weapons were destroyed, with
over 3000 international inspections conducted. Even with the successes of the
OPCW, the world witnessed Syria launch a chemical WMD attack on civilians,
which we will discuss later in the chapter. According to the Worldwide Threat
Assessment of the US Intelligence Community, the Mideast nations of Iraq and
Syria have already demonstrated their use of WMD on civilians. In his statement for
the record, United States Director of National Intelligence, Daniel R. Coats [11]
outlined the following assessments:

2.1 Russia

Russia has developed a ground-launched cruise missile (GLCM) that the United
States has declared is in violation of the Intermediate-Range Nuclear Forces (INF)
Treaty. Despite Russia’s ongoing development of other Treaty-compliant missiles
with intermediate ranges, Moscow probably believes that the new GLCM provides
sufficient military advantages to make it worth risking the political repercussions of
violating the INF Treaty. In 2013, a senior Russian administration official stated
publicly that the world had changed since the INF Treaty was signed in 1987. Other
Russian officials have made statements complaining that the Treaty prohibits Russia,
but not some of its neighbors, from developing and possessing ground-launched
missiles with ranges between 500 and 5500 km.

2.2 China

The Chinese People’s Liberation Army (PLA) continues to modernize its nuclear
missile force by adding more survivable road-mobile systems and enhancing its silo-
based systems. This new generation of missiles is intended to ensure the viability of
China’s strategic deterrent by providing a second-strike capability. China also has
tested a hypersonic glide vehicle. In addition, the PLA Navy continues to develop
the JL-2 submarine-launched ballistic missile (SLBM) and might produce additional
JIN-class nuclear-powered ballistic missile submarines—armed with JL-2 SLBMs—give the PLA Navy its first long-range, sea-based nuclear
capability. The Chinese have also publicized their intent to form a triad by devel-
oping a nuclear-capable next-generation bomber.

2.3 Iran and the Joint Comprehensive Plan of Action

Tehran's public statements suggest that it wants to preserve the Joint Comprehensive
Plan of Action because it views the JCPOA as a means to remove sanctions while
preserving some nuclear capabilities. Iran recognizes that the US Administration has
concerns about the deal but expects the other participants—China, the EU, France,
Germany, Russia, and the United Kingdom—to honor their commitments. Iran’s
implementation of the JCPOA has extended the amount of time Iran would need to
produce enough fissile material for a nuclear weapon from a few months to about
1 year, provided Iran continues to adhere to the deal’s major provisions. The JCPOA
has also enhanced the transparency of Iran’s nuclear activities, mainly by fostering
improved access to Iranian nuclear facilities for the IAEA and its investigative
authorities under the Additional Protocol to its Comprehensive Safeguards Agree-
ment. Iran’s ballistic missile programs give it the potential to hold targets at risk
across the region, and Tehran already has the largest inventory of ballistic missiles in
the Middle East. Tehran’s desire to deter the United States might drive it to field an
ICBM. Progress on Iran’s space program, such as the launch of the Simorgh SLV in
July 2017, could shorten a pathway to an ICBM because space launch vehicles use
similar technologies.

2.4 North Korea

North Korea’s history of exporting ballistic missile technology to several countries, including Iran and Syria, and its assistance during Syria’s construction of a nuclear
reactor—destroyed in 2007—illustrate its willingness to proliferate dangerous tech-
nologies. In 2017 North Korea, for the second straight year, conducted a large number
of ballistic missile tests, including its first ICBM tests. Pyongyang is committed to
developing a long-range, nuclear-armed missile that is capable of posing a direct threat
to the United States. It also conducted its sixth and highest yield nuclear test to date.
The assessment is that North Korea has a longstanding BW capability and biotech-
nology infrastructure that could support a BW program. We also assess that North
Korea has a CW program and probably could employ these agents by modifying
conventional munitions or with unconventional, targeted methods.

2.5 Pakistan

Pakistan continues to produce nuclear weapons and develop new types of nuclear
weapons, including short-range tactical weapons, sea-based cruise missiles,
air-launched cruise missiles, and longer-range ballistic missiles. These new types
of nuclear weapons will introduce new risks for escalation dynamics and security in
the region.

2.6 Syria

The Syrian regime used the nerve agent sarin in an attack against the opposition in
Khan Shaykhun on 4 April 2017, in what is probably the largest chemical weapons
attack since August 2013. We continue to assess that Syria has not declared all the
elements of its chemical weapons program to the Chemical Weapons Convention
(CWC) and that it has the capability to conduct further attacks. Despite the creation
of a specialized team and years of work by the Organization for the Prohibition of
Chemical Weapons (OPCW) to address gaps and inconsistencies in Syria’s decla-
ration, numerous issues remain unresolved. The OPCW-UN Joint Investigative
Mechanism (JIM) has attributed the 4 April 2017 sarin attack and three chlorine
attacks in 2014 and 2015 to the Syrian regime. Even after the attack on Khan
Shaykhun, we have continued to observe allegations that the regime has used
chemicals against the opposition [11].
The danger from hostile state and non-state actors who are trying to acquire
nuclear, chemical, radiological, and biological weapons is increasing. The Syrian
regime’s use of chemical weapons against its own citizens undermines international
norms against these heinous weapons, which may encourage more actors to pursue
and use them. ISIS has used chemical weapons in Iraq and Syria. Terrorist groups
continue to pursue WMD-related materials [12]. With respect to proliferation, it is
important to remember the line between countries and terrorist groups is not always
distinct. It is clear that some terrorist groups are supported by nation-states and vice
versa. And it is evident that some terrorist groups act as proxies for nation-states. In
addition, leading scientists working within a country might not be under the control
of national authorities, as was the case in the history of nuclear weapons proliferation
(www.fas.org/sgp/crs/nuke/RL34248.pdf).
In 2004, the United Nations Security Council passed Resolution 1540, with the
intent of keeping WMD out of the hands of non-state actors, which included nuclear,
biological, and chemical weapons, their means of delivery, and related materials.
Resolution 1540 included the following three core directives [13]:
1. All States are prohibited from providing any form of support to non-state actors
seeking to acquire weapons of mass destruction, related materials, or their means
of delivery.
2. All States must adopt and enforce laws criminalizing the possession and acqui-
sition of such items by non-state actors, as well as efforts to assist or finance their
acquisition.
3. All States must adopt and enforce domestic controls over nuclear, chemical, and
biological weapons, their means of delivery, and related materials, in order to
prevent their proliferation.

3 Biological

Biological weapons are perhaps the most insidious form of WMD. These are
weapons that contain viruses and/or bacterial pathogens or poisonous substances that
have been engineered to cause severe illness or death in human beings, animals and
vegetation. In their natural state, these pathogens or substances are not normally fatal
to living beings and must be amplified or weaponized to become a threat. In addition,
biological weapons also require a delivery mechanism. The Centers for Disease Control and Prevention (CDC) categorizes biological threats into three distinct categories based on
lethality, with each category containing specific biological agents. Category A
contains the most lethal agents that pose the greatest threat, as they are easily
disseminated and have the highest mortality rates [14]. Category A agents include
Anthrax (Bacillus anthracis), Botulism (Clostridium botulinum), Plague (Yersinia pestis), Smallpox (Variola virus), Tularemia (Francisella tularensis), and Viral Hemorrhagic Fever viruses (which include the Ebola virus) [15].
Category B agents have a low to moderate morbidity and are less threatening to
the general public. Category B agents include Bacterial, Rickettsial, and Protozoal
agents (Brucellosis, Glanders, Melioidosis, Q Fever, Psittacosis, Typhus Fever,
Cholera, and Cryptosporidiosis); Toxins (Staphylococcus Enterotoxin B,
C. Perfringens Epsilon Toxin, and Ricin Toxin); and, Viral agents (Viral Enceph-
alitides, including Venezuelan, Western, and Eastern Equine Encephalitis)
[14, 15]. The final category, Category C, contains pathogens that could be engineered for mass dissemination because they are readily available, are relatively easy to produce, and have the potential for high mortality rates [14]. These include emerging viral pathogens such as Nipah virus and Hantavirus, the latter of which causes Hantavirus Pulmonary Syndrome and Hantavirus Hemorrhagic Fever Syndrome. These pathogens have a higher mortality than Category B agents [15].
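For readers working on public health preparedness or surveillance tools, the category structure just described can be captured in a simple lookup table. The sketch below is illustrative only (Python; agent lists are abbreviated and follow this chapter's summary rather than an official CDC data product, and the helper function category_of is a hypothetical name).

    # Illustrative only: abbreviated mapping of the CDC bioterrorism agent
    # categories summarized in this chapter; not an official CDC data product.
    CDC_BIO_CATEGORIES = {
        "A": {"risk": "easily disseminated, highest mortality",
              "examples": ["Bacillus anthracis (anthrax)", "Variola virus (smallpox)",
                           "Yersinia pestis (plague)", "viral hemorrhagic fever viruses"]},
        "B": {"risk": "low to moderate morbidity",
              "examples": ["Brucellosis", "Q fever", "Ricin toxin",
                           "Venezuelan equine encephalitis"]},
        "C": {"risk": "emerging pathogens with potential for high mortality",
              "examples": ["Nipah virus", "Hantavirus"]},
    }

    def category_of(agent_name):
        """Return the CDC category letter for an agent, if it appears above."""
        for letter, info in CDC_BIO_CATEGORIES.items():
            if any(agent_name.lower() in ex.lower() for ex in info["examples"]):
                return letter
        return None

    print(category_of("Nipah virus"))   # -> C
    print(category_of("smallpox"))      # -> A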
One only need look at history to see the impact of biological weapons. The
Greeks contaminated the water wells of their enemies in 300 B.C. Later, in the
French and Indian War, the British Army distributed smallpox-infected blankets to the Indians. British General Sir Jeffrey Amherst proposed presenting local Indian tribes with the smallpox-laden blankets, which would allow colonists an easier path to colonization [16]. At a peace conference, the blankets were presented as gifts to the unsuspecting Indians. What the tribes did not realize was that the blankets came from smallpox-infected soldiers located in the area. The resulting impact
was an outbreak of smallpox in the Indian tribes of the area which was estimated to
have a case fatality rate of almost 90% [16].
In the Second World War, the most notorious biological weapons program was conducted by the Japanese Army under the leadership of Lt. Gen. Shiro Ishii. Ishii commanded the infamous Unit 731, which employed over 3000 scientists [17]. Unit 731 operated in more than six different cities, with its researchers focused on the development of deadly biological agents [18]. The biological agents researched included anthrax, plague, and typhus. Their test subjects included prisoners and innocent populations. It is estimated that over 10,000 people endured this horrible experimentation and later died as a result [16]. One of Unit 731’s most notorious attacks occurred in 1941 with the air distribution of plague-infected fleas and wheat over the town of Changde, China. Within a week, residents of the town began dying
of plague. The final death toll estimate was 400 people [19]. In the 1950s, the United
States Navy sprayed a low-pathogenicity bacterium over San Francisco Bay by boat to
assess the vulnerability of a large American coastal town to a biological attack
[17]. In the 1970s, the USSR maintained a clandestine biological weapons research
lab known as the Chief Directorate for Biological Preparation (Biopreparat), which
produced plague, tularemia, anthrax, glanders, smallpox and Venezuelan equine
encephalomyelitis [20].
Shortly after the September 11th, 2001 attacks on the World Trade Center and
Pentagon, several US government leaders received “Anthrax Letters”, which caused
panic in Washington, DC. Although this was a relative small-scale attack, it still
elicited fear throughout the United States. One of the key goals of a terrorist is to disrupt normal day-to-day life. The Anthrax Letters did just that: although they caused relatively few casualties, they caused widespread panic. Attacks do not have to create mass casualties to be successful. Indeed, the psycho-
logical damage done by launching a biological attack will have a tremendous impact
on the government and population.
Any response to an incident involving biological substances brings about a higher
level of concern and will challenge a community’s emergency response infrastruc-
ture. New or engineered pathogens can spread quickly throughout the world. In his book “The Hot Zone”, author Richard Preston gives a dramatic account of the dangers of the Ebola virus, and readers learned just how dangerous this pathogen is. An actual Ebola outbreak occurred in Reston, Virginia in 1989 at a primate quarantine facility when monkeys became infected with a hemorrhagic fever. Although the incident was contained, it required the euthanization of 450 primates by the United States Army Medical Research Institute of Infectious Diseases (USAMRIID). Fortunately for the primate lab scientists, this strain of Ebola was not harmful to humans. The outbreak revealed a new strain, Ebola-Reston, which is the only one of the five known Ebola strains not harmful to humans [21]. The world’s attention was again focused on the Ebola virus when in 2014, a
Dallas hospital nurse became infected with Ebola after treating an Ebola patient. As
the world learned of the Ebola incident, concerns from citizens and scientists over laboratory safety began to be heard. The pandemic potential of an accidental release due to insufficient biosafety presents a real danger [22].

Ebola is just one example of a biological threat—there are many others. It is important to note that each pathogen is unique and requires differing forms of
response and treatment. As noted in the Department of Homeland Security’s
(DHS) Biological Incident Annex, individual pathogens present a real threat to
public health and local plans should be written to deal with the aftermath of such
threats [23]. Given the difficulty of weaponizing and distributing biological agents in
enough quantity to create a mass casualty incident, it is unlikely that terror groups
have this capability. In the Senate Committee on Homeland Security and Govern-
mental Affairs report, the Commission concluded that the United States should be
less concerned that terrorists will become biologists and far more concerned that
biologists will become terrorists [3].
The cornerstone of international efforts to prevent biological weapons prolifera-
tion and terrorism is the 1972 Biological Weapons Convention (BWC). This treaty
bans the development, production, and acquisition of biological and toxin weapons
and the delivery systems specifically designed for their dispersal. The BWC forbids
member states (now numbering more than 160) from assisting other governments,
non-state entities, or individuals in obtaining biological weapons [3].
The future threat of biological weapons is also significant. Advances in bioengi-
neering and biotechnologies that synthesize DNA have created a new biohazard
known as “synthetic genomics.” This capability allows scientists to synthesize any
virus whose DNA has been decoded. Imagine the impact of synthesizing the
smallpox virus. Although smallpox was eradicated in 1977, samples remain frozen
in cryogenic containers [3]. Attention to global health security that includes efforts to
help prepare for and address pandemic and epidemic diseases has grown signifi-
cantly over the past few decades, driven by the ongoing threat posed by emerging
infectious diseases (EIDs), including HIV, SARS, H1N1, Ebola, and Zika [24].

4 Chemical

The threat that weapons of mass destruction pose to communities worldwide cannot be overstated. Chemical weapons present unique challenges to emergency responders
and healthcare practitioners requiring specialized decontamination procedures and
treatment. Chemical weapon use anywhere in the world poses a grave threat to the
safety and security of all worldwide. The presence and use of chemical weapons also pose a continuing threat to global security and stability. A chemical
attack is the spreading of toxic chemicals with the intent to do harm. A wide variety of
chemicals could be made, stolen, or otherwise acquired for use in an attack. Industrial
chemical plants or the vehicles used to transport chemicals could also be sabotaged
[25]. Both the concentration and toxicity of a chemical impacts the severity of an
attack. Similarly, whether the agent is released in a closed space or in the open air will
impact the persistence of chemical agents. It is important to note that chemical
weapons are banned under customary international law, the 1925 Geneva Protocol
and the 1997 Chemical Weapons Convention (CWC) [26].

The Organization for the Prohibition of Chemical Weapons (OPCW) is the implementing body for the Chemical Weapons Convention, which entered into
force on 29 April 1997. The OPCW, with its 193 Member States, oversees the
global endeavor to permanently and verifiably eliminate chemical weapons [27]. The
OPCW divides chemicals into three distinct schedules with each category listing
chemicals by the threat they pose. Schedule 1 chemicals are those that present the
highest risk, as they include those chemicals that are prohibited by the Chemical
Weapons Convention. These chemicals have little or no use for peaceful purposes in
commercial or industrial activities. Among them are chemicals that have actually
been produced, stockpiled or used as weapons, such as VX, sarin, mustard and two
biological toxins—ricin and saxitoxin (OPCW 2017).
Schedule 2 chemicals are those that present a significant risk because of their
lethal, incapacitating or other properties that could enable them to be used as a
chemical weapon. Examples include Amiton, BZ, thiodiglycol, and pinacolyl alco-
hol (OPCW 2017). Schedule 3 chemicals are similar to Schedule 1 chemicals in that
many have been stockpiled or used as weapons, but different in that they generally
are produced in large commercial quantities for purposes not prohibited by the
Convention. They may represent a risk to the object of the CWC due to their toxicity
or to their importance in producing any of the chemicals listed in Schedule 1 or
precursors listed in Schedule 2. Examples of Schedule 3 chemicals include phos-
gene, hydrogen cyanide, triethanolamine, and phosphorus trichloride (OPCW 2017).
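The schedule structure can be expressed in the same way as a small reference table, for instance when tagging chemicals of concern in an inventory or training exercise. The listing below is a sketch only (Python; the example chemicals are those named in this chapter, not the complete OPCW schedules).

    # Illustrative only: CWC schedules as summarized above (OPCW 2017); the
    # example chemicals are those named in this chapter, not the full schedules.
    CWC_SCHEDULES = {
        1: {"risk": "highest risk; little or no peaceful use",
            "examples": ["VX", "sarin", "mustard", "ricin", "saxitoxin"]},
        2: {"risk": "significant risk; precursors and incapacitants",
            "examples": ["Amiton", "BZ", "thiodiglycol", "pinacolyl alcohol"]},
        3: {"risk": "dual-use; produced in large commercial quantities",
            "examples": ["phosgene", "hydrogen cyanide",
                         "triethanolamine", "phosphorus trichloride"]},
    }

    # Example use: list the chemicals under the most tightly controlled schedule.
    print(", ".join(CWC_SCHEDULES[1]["examples"]))  # VX, sarin, mustard, ricin, saxitoxin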
The modern history of chemical weapons use can be traced back to World War I, when the
Germans used chlorine and phosgene weapons that resulted in hundreds of thou-
sands of chemical casualties. In 1915, in Ypres, Belgium, the Germans opened the
valves on more than 6000 steel cylinders, releasing 160 tons of chlorine gas on the
unsuspecting French trenches killing more than 1000 French and Algerian soldiers
and wounding more than 4000 soldiers [28]. Again in 1917, the Germans introduced
a new chemical agent—Mustard Gas—which was referred to as the “King of the
Battle Gases.” Unlike chlorine or phosgene, mustard gas is a vesicant, also referred
to as a “Blister Agent”, whose symptoms may not be realized until 2–24 h after
initial exposure [29]. Mustard gas produces large blisters on the skin and if inhaled,
can cause blistering in the lungs. Mustard gas produced more chemical casualties
than all the other agents combined, including chlorine, phosgene, and cyanogen
chloride [28].
In 1988, Iraqi leader Saddam Hussein attacked the city of Halabja, Iraq with mustard gas and nerve agents, killing an estimated 5000 civilians and injuring some 10,000 more
[30]. In 1995, an obscure Japanese religious cult, Aum Shinrikyo, launched a sarin gas attack on the Tokyo subway system, killing 12 and injuring more than 5000 other subway passengers (RAND 2005).
In 2013, the despot leader of Syria, President Bashar al-Assad, attacked his own population in Khan al-Assal near Aleppo, killing some 26 civilians and injuring more than 100 others. In 2018, he launched another chemical attack in Douma, outside of Damascus, which captured world attention after images of the bodies of children were broadcast worldwide; dozens of civilians were killed, including women and children [26]. This attack prompted United States President Donald Trump to launch punitive strikes on Syrian targets that were associated with the Syrian regime’s chemical-weapons programs [31].
The use of chemical weapons has continued into 2018 with countries attempting
to thwart the Chemical Weapons Convention by simply adjusting their formulary to
create new chemical compounds. On March 4th, 2018, former Russian double agent Sergei Skripal and his daughter Yulia Skripal were both poisoned with Novichok, a military-grade nerve agent [32]. The Organization for the Prohibition of Chemical Weapons (OPCW) confirmed the United Kingdom’s (UK) findings that Novichok was used to target the former Russian double agent and his daughter in the English city of Salisbury [32]. According to former UK
Foreign Secretary Boris Johnson, “there can be no doubt what was used and there
remains no alternative explanation about who was responsible – only Russia has the
means, motive and record” [32].
Numerous treaties and agreements have attempted to thwart the development and
deployment of chemical weapons. Attempts to regulate chemical weapons date
back to 1675 when “the first international agreement limiting the use of chemical
weapons was signed between France and Germany, prohibiting the use of poison
bullets.” [9] In 1874 the Brussels Convention on the Law and Customs of War was
signed, which prohibited the employment of poison or poisoned weapons, and the
use of arms, projectiles or material to cause unnecessary suffering [9]. In 1899, an agreement was signed as part of The Hague Peace Conference in which
countries agreed to abstain from the use of projectiles, the sole object of which is the
diffusion of asphyxiating or deleterious gases [9]. As we have read, the first half of
the twentieth century witnessed nations putting great resources into the development
of chemical weapons. During the Cold War years (1947–1991), the United States and
the Soviet Union were the two major superpowers still producing and maintaining
chemical weapons stockpiles. In 1997, the Chemical Weapons Convention (CWC), a multilateral disarmament agreement with 130 signatory nations, entered into force with the specific aim of eliminating chemical weapons stockpiles [9]. Its success was such that the OPCW was awarded the Nobel
Peace Prize in 2013.
Illustration 2 depicts the efforts of disarmament and non-proliferation in CBRN
weapons.
The danger that chemical weapons pose to the world’s population is significant,
as they have the ability to incapacitate, injure, and kill without discrimination. Their
exposure presents dire consequences to any population or person who comes into
contact with them. As we have read in this chapter, the sad history of chemical
weapons use has resulted in nations attempting to stop the production and use of
these agents. The public health impact cannot be overstated—from exposure, clean
up, and contamination to decontamination and displacement of those impacted
populations, it is critical that all public health personnel become familiar with the
threat these weapons pose.

Illustration 2 Non-proliferation legislation. (Source: Disarmament & Non-Proliferation Infographic – Kingdom of Belgium)

5 Radiological

Radiological agents occur naturally and are used in everyday life, from medical
x-rays to industrial applications. It is important to note that a radiological device is
not a nuclear weapon. The threat radiological devices pose is through low-order
detonation of explosives that spread the radiological agent in the atmosphere, ground
and water. Topical exposure or inhalation of radiologically-contaminated substances
is where the threat lies. As we have seen with biological and chemical substances,
terrorists are always seeking the materials to construct Radiological Dispersal
Devices (RDDs). Radiological events include the potential for terrorists to obtain
these materials in an attempt to create the nonnuclear release of radioactive materials
[33]. As we will read about under the Nuclear section of this chapter, the fall of the
former Soviet Union and lapse of security in former nuclear sites potentially allowed
for the sale of these materials on the black market. There are sites all around the
world where radiological materials exist, and many of these locations have virtually no
security.

RDDs are likely to be the radiological weapon of choice because of their relative
simplicity and widespread availability of RDD-adaptable radioactive materials in
medicine, scientific research, and industries, such as civil engineering, petroleum
engineering, aeronautics, and radio-thermal energy generation [34]. As stated pre-
viously, an RDD is not a nuclear weapon; its main purpose is to spread
radioactive material via a low-order explosive. Although far less catastrophic than
a nuclear detonation, an RDD attack would likely result in few immediate casualties,
but would certainly have longer term impacts on public health. The ancillary impacts
of an RDD include the potential of widespread panic, economic loss, and costly
cleanup [33].
According to the Congressional Research Service report “Dirty Bombs”: Technical
Background, Attack Prevention and Response, Issues for Congress, governments
and organizations have taken steps to prevent an RDD attack
[35]. Within the United States, the Nuclear Regulatory Commission (NRC) has
issued regulations to secure radioactive sources, which has assisted both the United
States and other countries to secure and prepare for RDD attacks. Internationally,
the International Atomic Energy Agency (IAEA) has led efforts to secure radioactive
sources. Other nations and nongovernmental organizations have acted to secure
sources as well. Key points include: [35]
• Nuclear Regulatory Commission actions have done much to instill a security
culture for U.S. licensees of radioactive sources post-9/11.
• Many programs have sought to improve the security of radioactive sources
overseas, but some incidents raise questions about security.
Even though tougher regulatory measures have been put into place, additional
steps are needed to help reduce the RDD threat. It is truly a nightmare scenario if a
terrorist detonates an RDD and spreads radioactive material across dozens of square
miles, causing panic in the target area and beyond, costing tens of billions of dollars
to remediate, costing further sums in lost wages and business, compelling the
demolition and rebuilding of contaminated buildings, forcing difficult decisions on
how to dispose of contaminated rubble and decontamination chemicals, and requir-
ing people to relocate from areas with elevated levels of radiation [35].
In a scenario involving an RDD detonation in Washington, DC., the Sandia
National Laboratories projected the impact of such an attack. Their scenario involved
the detonation of an RDD containing 1000 curies of cesium-137 chloride (about
50 grams). Their model includes exposure from radioactive material both deposited
on the surface and resuspended into the air and inhaled. The following map is based
on an atmospheric dispersion model, depicting where individuals would be projected
to have an increased risk of developing cancers due to radiation exposure over a year
or more [35].

Source: William Rhodes III, Senior Manager, International Security Systems Group, Sandia
National Laboratories, September 2010; analysis by Heather Pennington; graphics by Mona
Aragon.

One can see the increase in cancer risk over time. Depending on where individuals are at
the time of exposure, this type of attack would be projected to cause an additional
461 lifetime cancer cases and 314 lifetime cancer deaths. The projection assumes
no relocation, sheltering, or decontamination; all of these actions would occur in the
real world, significantly reducing the cancer incidence and deaths caused by the
attack [35].

6 Nuclear

Nuclear weapons and the materials that make them up present a true danger to
civilization. The Cold War years between 1947 and 1991 witnessed a dramatic rise
in both the United States and Soviet Union’s nuclear stockpiles. The conventional
wisdom of stockpiling these weapons was the concept of “mutually assured destruc-
tion,” which simply meant both the United States and Soviet Union would
completely destroy one another in a nuclear war. Towards the end of the Cold
War, other nations began acquiring nuclear weapons. These nations included
China, Pakistan, North Korea, and India. The addition of these nations added to
the complex calculus of mutual assured destruction. With the breakup of the Soviet Union
in 1991, Russia and 14 other newly independent states emerged,
which put into question the status of their 35,000 nuclear weapons spread out at
thousands of sites across a vast Eurasian landmass stretching across 11 time
zones [41].
The concern the world faces is what happens when a terrorist group obtains a
nuclear weapon. One thing is for certain: we are in an age where terrorism is almost
commonplace, with terrorists always seeking different ways to achieve mass
casualties. While it is unlikely that a terrorist group could obtain an intact nuclear
weapon, they could construct a crude device. Indeed, it is potentially within the
capabilities of a technically sophisticated terrorist group, as numerous government
studies have confirmed [36]. In 2004, a report written by Harvard University’s
Project on Managing the Atom concluded that “a capable and well-organized
terrorist group plausibly could make, deliver, and detonate at least a crude nuclear
bomb capable of incinerating the heart of any major city in the world.” [37] The
consequences of detonation of even a crude terrorist nuclear bomb would be severe,
turning the heart of modern city into a smoldering radioactive ruin and sending
reverberating economic and political aftershocks around the world [36].
Without argument, a nuclear detonation would be catastrophic and cause death
and destruction the likes of which we have never seen. A more likely scenario for a
terror group would be in the form of a “dirty bomb”, or radiological dispersal device
(RDD). This is a device that is a mix of explosives, such as dynamite, and radioactive
powder or pellets. It is important to note a dirty bomb cannot create an atomic blast.
When the bomb explodes, the blast carries radioactive material into the surrounding
area where it can cause widespread radiation exposure and sickness. People nearby
could be injured by pieces of radioactive material from the bomb. Only people who
are very close to the blast site would be exposed to enough radiation to cause
immediate serious illness. However, the radioactive dust and smoke can spread
farther away and could be dangerous to health if people breathe in the dust, eat
contaminated food, or drink contaminated water. People injured by radioactive
pieces or contaminated with radioactive dust will need medical attention [38].

7 Public Health Concerns

Triaging, treating and transporting victims of radiation, chemical, or biological
exposure require swift and effective decontamination procedures. The risk of spreading
contaminants to healthcare workers is significant, as is the risk to other workers,
bystanders, and anyone else in the contamination area or downwind. Contaminated
victims will require a special assessment of decontamination needs, which
may include rapid decontamination at the scene of the incident and/or at the hospital
[39]. The risk of secondary contamination is a significant concern that needs to be
addressed in emergency response plans.
Triage is the process of determining the priority of a victim’s treatments based on
the severity of their condition. Before treatment can begin, however, a mechanism
must be in place to determine whether victims must also be decontaminated. It is
equally important, however, to identify patients or victims who will not require
decontamination and can be quickly evacuated from the incident site. The process of
triage will determine the order of decontamination of victims. Quick observation of
victims’ signs and symptoms will help determine whether decontamination is neces-
sary. The three most important reasons for decontaminating exposed victims are:
[40]
1. To remove the contaminant from the victim’s skin and clothing, thus reducing
further agent exposure and physical effects.
2. To protect emergency responders, medical personnel, family members, or others
from secondary transfer exposures.
3. To prevent victims from spreading contamination over additional areas of
their body.
The decision to decontaminate victims must occur through the medical branch of
the Incident Command System, with the approval of the Incident Commander. The
decision points the medical branch uses are based on the following outside
indicators [39] (a simple decision sketch follows the list):
• The victim’s signs and symptoms to include airway, breathing, and circulation.
• Visual proof of contaminants on the skin or victim’s clothing.
• Whether the victim was in the contamination zone.
• Positive contamination results from the use of chemical detection paper/tape,
chemical agent monitor, Geiger counter, or other technology.
• Proof of potential chemical exposure after reviewing the material safety data
sheets (MSDS).
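
The decision logic above amounts to a simple screening rule: if any one indicator is positive, the victim is routed to decontamination; otherwise he or she may be evacuated without it. The short Python sketch below is purely illustrative; the field names are assumptions made for this example and are not drawn from any official Incident Command System protocol.

from dataclasses import dataclass

@dataclass
class VictimAssessment:
    # Illustrative field observations; the names are assumptions, not an official schema.
    abnormal_airway_breathing_circulation: bool  # signs and symptoms (ABC compromise)
    visible_contaminant: bool                    # contaminant seen on skin or clothing
    was_in_contamination_zone: bool              # victim was located in the contamination zone
    positive_detector_reading: bool              # detection paper/tape, chemical agent monitor, Geiger counter
    msds_indicates_exposure: bool                # potential exposure per material safety data sheets

def needs_decontamination(v: VictimAssessment) -> bool:
    # Route the victim to decontamination if any single indicator is positive.
    return any([
        v.abnormal_airway_breathing_circulation,
        v.visible_contaminant,
        v.was_in_contamination_zone,
        v.positive_detector_reading,
        v.msds_indicates_exposure,
    ])

# Example: an asymptomatic bystander outside the contamination zone with no positive readings
bystander = VictimAssessment(False, False, False, False, False)
print(needs_decontamination(bystander))  # False: may be evacuated without decontamination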
The ultimate goal of decontamination is to be expedient and thorough. One must
also remember that the longer it takes for victims to undergo decontamination, the
longer it will take for them to be transported and treated at a hospital. Decontami-
nation can be divided into tiers, which allows for flexibility and adaptability based on
the incident type. Additionally, each of the tiers can be conducted on the scene of the
incident or at a medical treatment facility [39].

8 Mass Patient Decontamination

Decontamination activities conducted for a large number of potentially contaminated
patients, which may exceed the typical response capacity of an organization, may
require additional resources or personnel, and require that patients be prioritized for
the decontamination process. The number of patients that constitutes mass decon-
tamination is dependent on the jurisdiction, responding agency, and capacity. Mass
decontamination may occur within any of the decontamination tiers.

9 Self-Care

Actions that a patient can perform for him/herself, including distancing him/herself
from the site of release, removing clothing, and wiping visible contamination from
skin and clothing in order to reduce his/her own contamination level immediately,
without waiting for a formal decontamination process to be set up.

10 Gross Patient Decontamination

Actions likely to be performed by or with the assistance of first responders or first
receivers in order to achieve a gross or hasty reduction in contamination, signifi-
cantly reducing contamination on skin or clothing, as soon as possible after contam-
ination has occurred.

11 Technical Patient Decontamination

Planned and systematic actions, likely to be performed under the guidance of or with
the assistance of first responders or first receivers, to achieve contamination reduc-
tion to a level that is as low as possible.

12 Summary

Weapons of mass destruction present a threat to civilization from both hostile states
and non-state actors, including terrorists. Numerous legislative efforts have
attempted to halt the proliferation of WMD, which works only for nations that intend
to follow the law. The danger lies with non-state actors, rogue nations, and
terrorists, any of whom would not think twice about using these weapons. It is important for
nations and international law enforcement agencies to keep tabs on these rogue
nations and terrorists. The threat is too great and the cost is just too high if these
weapons fall into the wrong hands. Global proliferation of weapons of mass destruc-
tion (WMD) presents a clear and present danger to global health security and, unlike
conventional weapons that confine themselves to a defined and targeted area,
WMDs cross international boundaries and borders. From the sarin attacks of 1994
and 1995 in Japan and the 2001 anthrax attacks to the Syrian chemical attacks on
innocent civilians and the VX nerve agent assassination in Malaysia, weapons of mass
destruction are a sad reality in today’s society. As we have learned in this chapter, the
driving motivation of terrorists is to strike fear and kill or injure innocent civilians to
achieve their twisted goals. WMDs pose a much larger threat than conventional
weapons and could potentially kill thousands and result in many thousands more
casualties.
WMDs include chemical agents, biological pathogens, radiological agents, and
nuclear weapons, each of which require special protective measures for responders
and decontamination for victims. Chemical agents include Lung Damaging Agents
(Chlorine (CL) and Phosgene (CG)), Blood Agents (Cyanogens), Blister Agents
(Mustard (H), Lewisite (L), and Phosgene Oxime (CX)), and Nerve Agents (Tabun
(GA), Sarin (GB), Soman (GD), and VX). Biological agents are those that contain
viruses and/or bacterial pathogens or poisonous substances that have been
engineered to cause severe illness or death in human beings, animals and vegetation.
In their natural state, these pathogens or substances are not normally fatal to living
beings and must be amplified or weaponized to become a threat. In addition,
biological weapons also require a delivery mechanism. The Centers for Disease
Control (CDC) categorizes biological threats into three distinct categories based on
lethality, with each category containing specific biological agents. Category A
contains the most lethal agents that pose the greatest threat, as they are easily
disseminated and have the highest mortality rates [14]. Category A agents include
Anthrax (Bacillus anthracis), Botulism (Clostridium botulinum), Plague (Yersinia
pestis), Smallpox (Variola virus), Tularemia (Francisella tularensis), and Viral Hem-
orrhagic Fever Viruses (which include the Ebola Virus) [15]. Category B agents
have a low to moderate morbidity and are less threatening to the general public.
Category B agents include Bacterial, Rickettsial, and Protozoal agents (Brucellosis,
Glanders, Melioidosis, Q Fever, Psittacosis, Typhus Fever, Cholera, and
Cryptosporidiosis); Toxins (Staphylococcus Enterotoxin B, C. perfringens Epsilon
Toxin, and Ricin Toxin); and, Viral agents (Viral Encephalitides, including Vene-
zuelan, Western, and Eastern Equine Encephalitis) [14, 15]. The final category is
Category C, which comprises pathogens that can be engineered for mass dissemina-
tion because they are readily available, relatively easy to produce, and have the
potential for high mortality rates [14]. These include emerging viral pathogens
such as Nipah Virus and Hantavirus, the latter causing Hantavirus Pulmonary
Syndrome and Hantavirus Hemorrhagic Fever Syndrome. These pathogens have a
higher mortality than Category B agents.

The radiological threat is that posed by the spread of radioactive materials in
the atmosphere. Radiological dispersal devices (RDDs) may be explosive-driven—a
dirty bomb—or use nonexplosive means like a crop duster airplane. Radioactive
material may be dispersed indoors to contaminate a building, though the scenario
most commonly discussed involves detonation of a dirty bomb outdoors. The
nuclear threat is the greatest of all: the concern the world faces is what
happens when a terrorist group obtains a nuclear weapon. The United States and United Nations
have worked hard to eliminate the threat of the spread of WMD, but even with the
best intentions, it is difficult to maintain enforcement with rogue states and terrorists.
It is critically important that all public health providers maintain vigilance and
become aware of the WMD threat.

References

1. GAO (2017) Testimony before the subcommittee on emergency preparedness, response, and
communications, Committee on Homeland Security, House of Representatives. DHS’s Chem-
ical, biological, radiological, and nuclear program consolidation efforts
2. Blair D (2010) Annual threat assessment of the US intelligence community for the Senate Select
Committee on intelligence. Senate Select Committee on Intelligence, Washington, DC
3. United States Senate (2008) Hearing before the committee on homeland security and govern-
mental affairs. United States Senate, Washington, DC
4. United Nations (UN) (2017) Ensuring effective interagency interoperability and coordinated
communication in case of chemical and/or biological attacks. New York
5. Department of Homeland Security (2010) Quadrennial Homeland Security Review Report.
Washington, DC
6. Bolton JR (2004) Testimony before the House International Relations Committee, “The Bush
Administration’s nonproliferation policy: successes and future challenges”
7. United States Government Accountability Office (GAO) (2012) Proliferation security initiative:
agencies have adopted policies and procedures but steps needed to meet reporting requirement
and to measure results. GAO, Washington, DC
8. White House (2002) National strategy to combat Weapons of Mass Destruction (WMD),
December 2002, p 2
9. Organization for the Prohibition of Chemical Weapons (OPCW) (2014) Origins of the chemical
weapons convention and the OPCW. The Hague, Netherlands
10. United Nations Office for Disarmament Affairs (UNODA) (2017) The contribution of the
biological weapons convention to global biosecurity. Geneva
11. Coats D (2018) Statement for the record: worldwide threat assessment of the US Intelligence
Community. Washington, DC
12. Trump D (2017) National security strategy of the United States. The White House, Washington,
DC
13. Oosthuizen G, Wilmshurst E (2004) Terrorism and weapons of mass destruction: United
Nations security council resolution 1540. Chatham House, London
14. Centers for Disease Control (CDC) (2018) Bioterrorism Agents/Diseases (by category) |
Emergency Preparedness & Response. Emergency.cdc.gov. Retrieved 8 Nov 2018, from
https://emergency.cdc.gov/agent/agentlist-category.asp

15. Bozigian-Merrick S (2011) Overview of Category A bioterrorism agents. Massachusetts Asso-
ciation of Public Health Nurses, Milton
16. Riedel S (2004) Biological warfare and bioterrorism: a historical review. Bayl Univ Med Cent
Proc 17(4):400–406
17. Cenciarelli O, Rea S, Carestia M, D’Amico F, Malizia A, Bellecci C, Gaudio P, Gucciardino A,
Fiorito R (2013) Bioweapons and bioterrorism: a review of history and biological agents. Def
S&T Tech Bull 6(2):111–129
18. Christopher G, Cieslak T, Pavlin J, Eitzen E (1997) Biological warfare. A historical perspective.
JAMA 278(5):412–417
19. Byrd GD (2005) General Ishii Shiro: his legacy is that of genius and madman. Electronic Theses
and Dissertations
20. Davis CJ (1999) Nuclear blindness: an overview of the biological weapons programs of the
former Soviet Union and Iraq. Emerg Infect Dis 5:509–512
21. Pugh K (2014) Ebola Reston: a look back at the monkey house. Inside Nova, Woodbridge
22. Inglesby T, Relman DA (2015) How likely is it that biological agents will be used deliberately
to cause widespread harm? EMBO Rep 17(2) 2016
23. Department of Homeland Security (2017) Biological Incident Annex to the Response and
Recovery Federal Interagency Operational Plans. Washington, DC
24. Michaud J, Moss K, Kates J (2017) The U.S. Government and Global Health Security. The
Henry J. Kaiser Family Foundation, Menlo Park
25. Department of Homeland Security (2004) Chemical Weapons Fact Sheet from the National
Academies. Washington, DC
26. Trapp R (2017) The use of chemical weapons in Syria: implications and consequences. One
hundred years of chemical warfare: research, deployment, consequences, pp 363–375. https://
doi.org/10.1007/978-3-319-51664-6_19
27. Organization for the Prohibition of Chemical Weapons (OPCW) (2018) About the OPCW.
Retrieved on 18 November 2018 from https://www.opcw.org/about-us
28. Haber LF (1986) The poisonous cloud: chemical warfare in the first world war. Clarendon
Press, New York, pp 31–32
29. United States Centers for Disease Control (CDC) (2005) Toxic syndrome descriptions.
Washington, DC
30. Valadbigi A, Ghobadi S (2010) The tragedy of Halabja (A pathological review on social-legal
aspects of the case from historical and international points of view)
31. Krishnadev C (2018) How will this attack on Syria be any different? The Atlantic, Boston
32. Smith-Spark L, Culen S (2018) ‘Pure’ Novichok used in Skripal attack, watchdog confirms.
Retrieved on 11 November 2018 from https://www.cnn.com/2018/04/12/europe/opcw-report-
skripal-poisoning-uk-intl/index.html
33. Bunn M, Bielefeld T (2007) Reducing nuclear and radiological terrorism threats. Harvard
University, Cambridge, MA
34. Barnett D, Parker C, Blodgett D, Wierzba R, Links JM (2006) Understanding radiologic and
nuclear terrorism as public health threats: preparedness and response perspectives. Nucl Med
Oct 47(10):1653–1661
35. Medalia J (2011) “Dirty bombs”: technical background, attack prevention and response, issues
for congress. CRS, Washington, DC
36. Bunn M, Malin M, Roth N, Tobey WH (2016) Preventing nuclear terrorism continuous
improvement or dangerous decline? Belfer Center for Science and International Affairs Harvard
Kennedy School, Cambridge, MA
37. Bunn M, Wier A (2004) Securing the bomb: an agenda for action, project on managing the
atom. Harvard University, Cambridge, MA
38. National Center for Environmental Health (NCEH) (2018) Dirty bomb or radiological dispersal
device. Centers for Disease Control, Atlanta

39. United States Department of Health and Human Services (HHS) (2014) Patient decontamina-
tion in a mass chemical exposure Incident: national planning guidance for communities.
Washington, DC
40. Lake W, Schulze P, Gougelet R, Divarco S (2013) Guidelines for mass casualty decontamina-
tion during a HAZMAT/weapon of mass destruction incident, vol I and II. U.S. Army Chem-
ical, Biological, Radiological and Nuclear School, Fort Leonard Wood
41. Allison G (2012) What happened to the Soviet superpower’s nuclear arsenal? Clues for the
Nuclear Security Summit. HKS Faculty Research Working Paper Series RWP12-038, John
F. Kennedy School of Government, Harvard University
42. Daly S, Parachini J, Rosenau W (2005) Aum Shinrikyo, Al Qaeda, and the Kinshasa reactor.
RAND Corporation, Santa Monica
43. Organization for the Prohibition of Chemical Weapons (OPCW) (2017a) The structure of the
OPCW. The Hague, Netherlands
44. Organization for the Prohibition of Chemical Weapons (OPCW) (2017b) Monitoring chemicals
with possible chemical weapons applications. The Hague, Netherlands
Antimicrobial Resistance in One Health

Marie-jo Medina, Helena Legido-Quigley, and Li Yang Hsu

Abbreviations

ACT Artemisinin-based combination therapy


AIDS Acquired immune deficiency syndrome
AMR Antimicrobial resistance
ART Antiretroviral therapy
BRICS Brazil, Russian Federation, India, China and South Africa
CDC Centers for Disease Control and Prevention
CRE Carbapenem-resistant Enterobacteriaceae
EEA European economic area
EU European Union
FAO Food and Agriculture Organization
FDA Food and Drug Administration
GARDP Global Antibiotic Research and Development Partnership
GDP Gross Domestic Product
GLASS Global Antimicrobial Resistance Surveillance System
HAI Healthcare Associated Infections
HIV Human immunodeficiency virus
IACG Interagency Coordination Group on Antimicrobial Resistance
INH Isonicotinylhydrazide (isoniazid)
LMIC Lower-middle-income countries
MDG Millennium development goals
MDR Multidrug resistance
MDR-TB Multidrug-resistant tuberculosis


MRSA Methicillin-resistant Staphylococcus aureus


OIE World Organisation for Animal Health
SDG Sustainable development goals
SSI Surgical site infections
TB Tuberculosis
UN United Nations
UTI Urinary tract infections
VRE Vancomycin-resistant Enterococcus
VRSA Vancomycin-resistant Staphylococcus aureus
WB World Bank
WHA World Health Assembly
WHO World Health Organization
WHO CIA List WHO list of critically important antimicrobials for human
medicine
XDR-TB Extensively drug-resistant tuberculosis

1 Introduction

In 1945, Sir Alexander Fleming received a Nobel Prize for his discovery of penicil-
lin. In his acceptance lecture, he warned the world about the threat of antibiotic
resistance. Using the hypothetical example of a man who imprudently used antibi-
otics for a sore throat and then fatally infected his wife with antibiotic-resistant
bacteria, he warned that treating bacteria with inadequate doses of the antibiotic
would “educate them to resist penicillin” [1].
Seventy-three years since Fleming’s warning, antimicrobial resistance (AMR)
has become one of the biggest threats to global health security [2]. Antibiotic-
resistant bacteria now infect at least 2 million people annually in the US [3], and
up to 50,000 people in the US and in Europe die each year of infections that used to
be treatable with antibiotics [2]. Globally, more than 700,000 people die each year from
drug-resistant bacteria, malaria, HIV/AIDS, and tuberculosis (TB), and by 2050 it
has been projected that up to 10 million lives may be lost each year to drug-
resistant infections [2]. AMR also translates to poorer disease outcomes and longer
hospitalizations, which correlate with an increase in global healthcare spending
[4]. Therefore, AMR is a health and economic security threat that can compromise
the achievement of the Sustainable Development Goals (SDGs) globally, and needs
to be prioritized for action by global stakeholders.
In this chapter, we attempt to improve the awareness and understanding of AMR as
a threat to global health security, in the context of a One Health framework. We begin
by defining antimicrobials and describing the emergence and transmission of AMR.
We then describe the threats posed by AMR to global health security, in the context of
healthcare-associated infections, malaria, HIV/AIDS, TB, and fungal infections; to
food security, in the context of the complex and controversial practice of industrial
farming; and to economic security, in the context of the World Bank global GDP and
poverty projections for 2050. We then present global initiatives that address the
problem of AMR through a One Health lens, followed by future considerations for
continued action against AMR and then conclude with a summary of the chapter.

1.1 Definition of Terms

In this chapter, we define three terms as follows:


1. Food-producing animals include all terrestrial and aquatic animal sources of food,
even though livestock are used as examples.
2. Market weight for pigs is about 275 lbs, or the weight when growth stops and fat
increases [5].
3. Antimicrobial stewardship means “the optimal selection, dosage, and duration of
antimicrobial treatment that results in the best clinical outcome for the treatment
or prevention of infection, with minimal toxicity to the patient and minimal
impact on subsequent resistance” [6].

1.2 Antimicrobials and Antimicrobial Resistance

Antimicrobials are therapeutic agents used in combination with the body’s immune
system to treat human and animal infections caused by pathogens such as bacteria
(antibiotics), fungi (antifungals), parasites (antiparasitics), and viruses (antivirals)
[7, 8]. Bacterial resistance to antibiotics is where the majority of the concern with
AMR currently lies. In bacteria, antibiotics are conceptualized as being either
bactericidal (i.e., kills bacteria under laboratory conditions) or bacteriostatic (i.e.,
stops the growth and reproduction of bacteria under laboratory conditions)
[7, 8]. The primary mechanisms of action of antibiotics include:
• Preventing cell wall production, which makes bacteria susceptible to their exter-
nal environment;
• Disabling DNA or RNA synthesis, which inhibits the translation of functional
and structural proteins; and
• Disrupting metabolic processes, which interrupts bacterial development [7].
Because most antibiotics are developed from modifications of compounds
produced by microbes to compete against one another, bacteria have been able to
quickly evolve or acquire natural mechanisms of resisting the effects of antibiotics,
including:
• Mechanical and biochemical changes that prevent antibiotics from acting on their
cell targets;
• Structural changes that make the targets unrecognizable to antibiotics, and
• Production of enzymes, such as beta-lactamases, that neutralize antibiotics [7, 8].
Bacteria have also developed the ability to acquire AMR from other bacteria and
from the environment. Antibiotic resistance genes are acquired through horizontal
transfer between bacteria, and by vertical transfer to progeny bacteria [7]. AMR is a
natural evolutionary process, but human actions and activities have sped up this
process so dramatically that it currently outpaces our ability to develop new antibiotics
[7, 9, 10]. Figure 1 summarizes antibiotic development and licensure throughout the
decades, indicating that, although new antibiotics are being licensed for clinical use,
no new classes of drugs have entered the market since the 1990s.
Because the ability to prevent and treat bacterial infections is the cornerstone of
medical care – including performing complex surgery, organ transplantation or
prescribing cancer chemotherapy – AMR is a serious threat to the provision of
safe and effective medical care. AMR has the potential to impact health as much as
unexpected high-fatality disease outbreaks, albeit more slowly but also more
assuredly [11]. The key bacterial pathogens of clinical concern have recently been
listed by the World Health Organization (http://www.who.int/news-room/detail/27-02-2017-who-publishes-list-of-bacteria-for-which-new-antibiotics-are-urgently-needed).
Among these, most severe infections in both hospital and community
settings are caused by Enterobacteriaceae and Staphylococcus aureus [12].
Enterobacteriaceae are enteric Gram-negative bacteria that include Escherichia
coli, Klebsiella spp. and Salmonella spp. As a group, Enterobacteriaceae are dem-
onstrating increased resistance to carbapenems (carbapenem-resistant
Enterobacteriaceae or CRE). Carbapenems are now the last line of safe and effective
broad-spectrum antibiotics used against life-threatening hospital acquired infections
(HAI) caused by Enterobacteriaceae and other Gram-negative bacteria. Colistin – an
old drug with lower efficacy and safety profiles compared to carbapenems – is used to treat
infections caused by carbapenem-resistant Gram-negative bacteria, but resistance to
this drug is now also present in many regions around the world [4]. Individually,
E. coli and K. pneumoniae are two of the most common causes of HAI [4]. E. coli is
also the most common cause of urinary tract infections (UTI), and resistance to
penicillins, cephalosporins, and fluoroquinolones complicate the treatment of
community-associated UTI. Fluoroquinolones are now ineffective in most UTI
cases worldwide [4].
S. aureus are Gram-positive commensal bacteria that are also opportunistic
pathogens. They are characterized by their virulence, adaptability in different envi-
ronmental settings, and ability to cause a range of infections that include life-
threatening diseases [13]. S. aureus is the leading cause of infections in both hospital
and community settings [13]. S. aureus can be resistant to multiple antibiotics, the
most important of which is beta-lactam resistance (methicillin-resistant S. aureus or
MRSA). Since the early 2000s, there has been an increase in MRSA infections in
community settings worldwide that is occasionally associated with high mortality
[4, 13]. People infected with MRSA are estimated to be 64% more likely to die than
those infected with antibiotic-susceptible S. aureus [4].

1.3 AMR Emergence

Fig. 1 Developing new classes of antibiotics throughout the decades (timeline of antibiotic classes introduced and the number of antibiotics licensed for clinical use in each decade, 1900–2020)

Acquired bacterial resistance occurs and spreads rapidly because of the excessive
use of antibiotics in humans and food-producing industries, as well as via
environmental pollution with antibiotics. In humans, the inappropriate use of anti-
biotics to treat viral infections and excessive or prolonged use of broad-spectrum
antibiotics for bacterial infections exacerbates the pace of development and spread of
AMR. The antibiotic selection pressure spurs the development of resistance in
existing gut bacteria. This can deplete beneficial bacteria and promote the growth
of commensal bacteria in the gut, which cause opportunistic diseases and carry
resistance genes that can then be transmitted [7, 14, 15]. Multiple factors promote
the imprudent and excessive use of antibiotics, including the lack of regulation and
oversight of prescribing, over-the-counter and internet sales in many low- and lower-
middle-income countries (LMICs) [9], demand for antibiotics by patients, and the lack of
antibiotic treatment guidelines and microbiology laboratories in hospitals in
many of these countries.
In terrestrial and aquatic food-producing industries, AMR arises when antibiotics
are used at sub-therapeutic levels to promote the rapid growth of livestock (growth
promoters) and to prevent rather than treat their diseases [9]. Frequent ingestion of
low levels of antibiotics increases the prevalence of resistant pathogenic and com-
mensal bacteria in the gut of livestock, which provides a reservoir for the transmis-
sion of resistance genes to human bacteria [16]. The biggest factor in the imprudent
use of antibiotics in food production is the intensification of industrial farming over
the decades, due to the substantial increase in demand for meat foods since the
1950s [17].
In the U.S., clinics prescribe 30–50% of antimicrobials unnecessarily [14]. Nev-
ertheless, antimicrobial use in human healthcare is still considerably less than in
food-production worldwide [17]. Even though the U.S. Food and Drug Administra-
tion (FDA) banned fluoroquinolone use in poultry in 2005 and the EU banned
growth promoters in 2006, 13,000 tons of antimicrobials were used in animals in
the U.S. in 2009, approximately 80% of all antimicrobials used in the country that
year; 8,500 tons of antimicrobial products were sold to farms in 25 countries in
Europe in 2011, mostly in Germany; and 100,000 tons of antimicrobials are cur-
rently fed to animals in China each year, mostly without supervision [17]. Because
the antibiotics used in food-producing animals are generally the same ones that are
used to treat infections in humans, animal-to-human transmission of AMR ulti-
mately results in challenges to human health in clinical settings, particularly when
the antibiotics of interest – such as colistin – are the last lines of defense against the
bacteria in hospitals [4].
In the environment, AMR emerges in waters, soils, and sediments through
contamination with human and animal wastes that contain AMR bacte-
ria [18, 19]. Waste products from farms, hospitals, households, and industries
enter receiving waters through the discharge of untreated or inadequately treated
wastewater or through runoffs from flooding. Waste products also settle on soils and
sediments during irrigation with contaminated effluent water, when using manure or
influent sludge as fertilizer, and also during flooding [18, 19]. Animals and humans
are estimated to excrete between 10% and 90% of the antibiotics they ingest, either
unchanged or as bioactive compounds which then enhance the emergence of resis-
tant bacteria in the environment [18]. AMR bacteria in the environment are
problematic because of the potential for direct and indirect transmission of AMR
genes to animals and humans.

1.4 AMR Transmission

Animal-to-human transmission of drug-resistant bacteria can occur directly, through
human contact with infected animals and ingestion of raw or improperly cooked
food, or indirectly, through contact with soil and water contaminated with livestock
manure, or inhalation of air blown from abattoirs and livestock farms
[16, 17]. Animal-to-human transmission of drug-resistant bacteria could have three
outcomes, the third of which is by far the biggest threat to public health:
1. The end of transmission, as occurs during sporadic foodborne disease outbreaks;
2. Sustained human-to-human transmission, which enables bacteria to breach the
species barrier and cause zoonotic diseases or opportunistic infections. For
example, human infection with MRSA strains that are normally found only in livestock,
without any direct contact with the infected animals; or
3. The horizontal transfer of new resistance genes from bacteria in the animal gut to
pathogenic bacteria that infect humans, which are then evolutionary selected
when humans are treated with antibiotics [16].
Human-to-human transmission of drug-resistant bacteria can occur directly through
exposure to an infected person or biological materials (e.g., saliva and blood), or
indirectly through contact with contaminated materials (e.g., catheters and syringes).
Human-to-human infection with AMR bacteria is of greatest concern when acquired
as a consequence of health care or hospitalization [20]. Figure 2 illustrates the
occurrences of animal-to-human and human-to-human transmission of AMR.

2 AMR Threat to Global Health, Food, and Economic Security

AMR is a global problem. Because globalization enables the rapid and ceaseless
movement of people, animals, plants, and other goods across the world, the trans-
mission of AMR from any species in any part of the world threatens global health,
food, and economic security [9].

2.1 Global Health Security Risk

The most alarming risk of AMR to global human health security is that, without
effective antibiotics, people are at risk for deadly infectious diseases that have
already been eradicated or that can be otherwise managed therapeutically [4]. Exam-
ples of these diseases include healthcare-associated infections, malaria, HIV/AIDS,
tuberculosis, and fungal infections.

Fig. 2 Examples of human-to-human and animal-to-human transmission of AMR. (Source: CDC Antibiotic Resistance Threats in the United States, 2013 [20])

2.1.1 Healthcare-Associated Infections AMR

As the term implies, healthcare-associated infections (HAI) are infections acquired
in healthcare settings, usually hospitals. HAIs are among the most commonly
reported adverse events associated with hospitalizations worldwide, even though
they are largely preventable [21]. At any given time, at least 7% of inpatients in high-
income countries and 10% in low-income and middle-income countries acquire
HAI, putting them at risk for unfavorable clinical outcomes and death [21]. The
most commonly reported HAI differ by World Bank (WB) country classification,
with UTI reported in high-income countries while surgical site infections (SSI)
are most common in lower-income countries [21]. Although the burden of HAI is
highest in patients in intensive care units (ICU) – with approximately 30% of all
ICU cases in hospitals in high-income countries and much higher rates in lower
income countries [21] – the clinical outcomes are invariably worse in inpatients
infected with drug-resistant HAIs than those infected with drug-susceptible ones,
regardless of the setting [4]. Aside from CRE and MRSA mentioned in the previous
section, extended-spectrum beta-lactamase-producing (ESBL) Enterobacteriaceae are
other drug-resistant pathogens commonly associated with HAI [4].

2.1.2 Malaria AMR

Malaria is endemic in 91 countries in the world, primarily in the continents of Africa,
Asia, and the Americas [22]. It is a vector-borne disease caused by the parasite
Plasmodium spp., which is transmitted by a bite from female Anopheles mosquitoes.
Infection with Plasmodium falciparum is of utmost public health concern because
it has the highest rates of associated complications and fatalities compared to infections
with other human or primate Plasmodium species [22]. The first-line treatments
against P. falciparum are artemisinin-based combination therapies (ACT) [4].
In 2016, an estimated 216 million people worldwide were infected with malaria,
5 million more than in 2015, of whom 445,000 died [23]. Also in 2016,
cases of ACT resistance were confirmed in the Greater Mekong subregion countries of
Cambodia, the Lao PDR, Myanmar, Thailand, and Vietnam [4]. This is particularly
challenging for areas along the Cambodia-Thailand border, where P. falciparum is
already resistant to all other available antimalarials. There is a real risk that
multidrug-resistant (MDR) malaria will spread from the Cambodia-Thailand border
to other countries within and beyond the region [4]. The increase in cases and the
threat of MDR malaria put at risk the World Health Assembly Global Technical
Strategy to reduce malaria cases and mortality rates by at least 40% by 2020 [23].

2.1.3 HIV/AIDS AMR

Human immunodeficiency virus (HIV) infection weakens the human immune system,
increasing susceptibility to multiple infections and cancers [24]. If left
untreated, HIV infections develop into acquired immune deficiency syndrome
(AIDS), the most advanced stage of the disease [24]. The risk for HIV infection
increases with behaviors and social conditions including unprotected sex, injection
drug use, blood transfusion, incarceration, history of sexually transmitted diseases,
and mother-to-child transmission [24]. In 2017, 1.8 million people were newly
infected with HIV, adding to the 36.9 million already living with the disease.
More than 35 million have died from complications associated with HIV since the
1980s, 940,000 in 2017 alone [24]. Only 75% of those infected are aware of their
HIV status [24].
There is no known cure for HIV, although the virus can now be suppressed
indefinitely with antiretroviral therapy (ART), allowing immune function to recover
[24]. In 2015, the World Health Organization (WHO) recommended that everyone
with HIV should be started on ART. Currently, 21.7 million people are receiving
ART, all of whom are at potential risk for treatment failure due to ART-resistant HIV
[4]. At present, 15% of those starting ART and 40% who are re-starting treatment are
infected with ART-resistant HIV [4]. Therefore, the goal to provide easier access to
ART by 2020 to achieve the global target to end the AIDS epidemic by 2030 is at
considerable risk from the spread of ART-resistant HIV [24].

2.1.4 Tuberculosis AMR

TB is caused by Mycobacterium tuberculosis complex, which is transmitted between
individuals via airborne particles, and typically involves the lungs [25]. In 2017,
10 million people worldwide were diagnosed with TB, of whom 1.6 million died.
Approximately 300,000 were co-infected with HIV [25]. Up to 87% of the new cases
were from countries that already have a high TB burden, two-thirds of which are in
Southeast Asia, the Western Pacific regions, and South Africa [25].
Although up to 25% of the world’s population is infected with TB without having
symptoms and therefore cannot transmit the disease (latent TB), these individuals have a 5–15%
risk of developing active TB within their lifetime [25]. TB infections can be
prevented and cured, but it remains one of the top ten most deadly diseases
worldwide because it can be fatal if left untreated [25]. Without proper treatment,
death is certain for approximately 45% of those infected with TB and for nearly all
TB patients with HIV [25]. Figure 3 shows that successful TB treatment is possible
with effective anti-TB drugs [26].

Fig. 3 Global TB treatment outcomes with effective anti-TB drugs. (Source: WHO Global Tuberculosis Report 2018 [26])

Isoniazid (INH) and rifampicin are the cornerstones of the current short-course,
6 to 9-month long TB therapy using first-line anti-TB drugs [25]. In 2017, there were
an estimated 558,000 new cases of multidrug-resistant TB (MDR-TB), defined by
the organism having developed a resistance to both first-line drugs [26]. Second-line,
injectable anti-TB drugs (amikacin, capreomycin, or kanamycin) successfully treat
MDR-TB in almost 55% of cases, but they are ineffective against extensively drug-
resistant TB (XDR-TB) [25]. XDR-TB involves resistance to at least four of the core anti-
TB drugs: the two first-line drugs, any fluoroquinolone, and at least one second-line injectable
[25, 26]. As of 2017, 8.5% of MDR-TB patients had XDR-TB [26]. At least
113 countries are known to have XDR-TB [26]. The spread of both MDR-TB and
XDR-TB threatens the global SDG to end the TB epidemic by 2030 [25].
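
As a rough illustration of how these definitions nest, the Python sketch below classifies an isolate from the set of drugs it resists. The drug names and the simplified rule follow the definitions cited above; they are illustrative assumptions only and not a clinical algorithm.

FIRST_LINE = {"isoniazid", "rifampicin"}
FLUOROQUINOLONES = {"levofloxacin", "moxifloxacin"}            # assumed representative examples
SECOND_LINE_INJECTABLES = {"amikacin", "capreomycin", "kanamycin"}

def classify_tb_resistance(resistant_to):
    # MDR-TB: resistant to both first-line drugs (isoniazid and rifampicin).
    if not FIRST_LINE <= resistant_to:
        return "not MDR-TB"
    # XDR-TB (simplified): MDR-TB plus resistance to a fluoroquinolone and a second-line injectable.
    is_xdr = bool(resistant_to & FLUOROQUINOLONES) and bool(resistant_to & SECOND_LINE_INJECTABLES)
    return "XDR-TB" if is_xdr else "MDR-TB"

print(classify_tb_resistance({"isoniazid"}))                                            # not MDR-TB
print(classify_tb_resistance({"isoniazid", "rifampicin"}))                              # MDR-TB
print(classify_tb_resistance({"isoniazid", "rifampicin", "moxifloxacin", "amikacin"}))  # XDR-TB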

2.1.5 Fungal AMR

Antifungal resistance is an emerging health issue, although one that poses a consid-
erably smaller public health risk compared to the drug-resistant pathogens listed
above. One example of the issue of antifungal resistance is invasive aspergillosis,
which is an opportunistic infection that is mainly caused by the fungus Aspergillus
fumigatus [27]. Immunocompromised patients such as stem-cell transplant and
organ transplant recipients are at high-risk for invasive aspergillosis and are then
at greater than 50% risk of dying from the infection [27]. Antifungals improve the
outcomes and survival rates in aspergillosis infections. The triazoles voriconazole,
posaconazole, and itraconazole are the primary antifungals used for treating asper-
gillosis because patients tolerate them better than other antifungals [27].
Triazole resistance is common in Europe, but more countries around the globe are
reporting treatment failures due to triazole drug resistance. Particularly alarming are
reports of triazole-resistant A. fumigatus carrying resistance markers associated with
fungicide use in agriculture and in the environment instead of those associated with
triazole use in hospitals, because the use of fungicides far exceeds that of triazole use
in humans [27]. A. fumigatus spores are carried in the air over long distances; hence,
even patients in non-agricultural areas or in areas where fungicides are not used may
become susceptible to drug-resistant infections [27].

2.2 Global Food Security Risk

Globalization and growing demands for meat products since the 1950s brought
about a drastic increase in meat production in developed countries [17]. The same
shift of livestock production to industrial farming is occurring in the least developed
countries where, like developed countries, farmers keep more livestock to supply
local and global markets [28]. Because there are now fewer farmers than in previous
decades [28], industrial farming is necessary to meet the increasing demands for
global food security. At present, the poultry industry is the fastest growing sector in
the global food market, largely driven by the increased consumption in China and
India [17, 29]. Although overall meat consumption worldwide has decreased, it is
estimated that by 2020, poultry production will increase by 37% in China, 28% in
Brazil, and 16% in the U.S., and by 2050, annual poultry consumption in India will
amount to 10 million tons. Overall, the demand for meat products in China and India is
expected to rise by 2022 to 80% above present levels, due to
their growing middle classes [17, 29].
Industrial farmers use large amounts of antibiotics in the feed and water of whole
herds of livestock to enable the animals to endure the harsh conditions in industrial
farms and prevent the spread of diseases among the herd. But mostly, antibiotics are
used to hasten the growth of animals in order to reduce their odds of becoming ill
before slaughter and thus, increase profit [17]. For example, pigs treated with
antibiotics eat 10–15% less feed because they reach market weight sooner
[17]. The practice of using antibiotics to promote the rapid growth of livestock is
legal in several countries, and many industrial farms still treat herd food and water
with continuous, low-dose antibiotics, such that there are now more healthy animals
treated with antibiotics than sick humans [17]. Between 2010 and 2030, the esti-
mated increase in antibiotic consumption in food animals in Brazil, Russia, India,
China, and South Africa (BRICS) is 99%, seven times more than the projected
population growth in these countries, based on current baseline trends [30]. Figure 4
illustrates the sales of antibiotics in 25 EU/EEA countries from 2001 to 2011 and
compares how much was used to treat ill humans with how much was used to feed livestock.

Fig. 4 Sales of antibiotics in 25 EU/EEA countries from 2001 to 2011 for human use and food production. (Source: Meat Atlas – Antibiotics: Breeding Superbugs [17])

AMR in the food chain is a complex issue. Although it is generally recognized
that antibiotic use in food-producing animals is the leading cause of resistant
infections in humans through the food chain, there are limited data on the prevalence
of resistance in these animals, and the available data do not conclusively show a
direct association between animal and human AMR [16]. AMR in the food chain is
also a controversial issue. Some suggest that a large part of the blame is placed on the
food industry to lessen the accountability of antibiotic vendors and prescribers in
imprudently using drugs, while others fund studies to show that the burdens on human
health of antibiotic use in animals are exaggerated [16]. The complexities and
controversial issues on food security risks associated with AMR are delineated in
the chapter by Schlundt et al. in this book.

2.3 Global Economic Security Risk

Humans infected with AMR pathogens have more severe disease outcomes and longer
hospital stays than those infected with antimicrobial-susceptible pathogens, which
correlate to an increase in global healthcare spending [4]. For HAI, the highest burden
is invariably in lower-income countries, although data are hard to come by due to a
lack of surveillance [31]. In Brazil, HAI cost USD 18 million in 1992; in Mexico, ICU
care costs at least USD 12,155 per case of HAI [21]; in Europe, HAI accounts for 16 million
extra hospital stays annually, with direct costs alone amounting to €7 billion [31]; and
in the U.S., the economic loss from HAI is at least USD 28.4 billion annually, with
loss in life years and loss in productivity costing an added USD 12.4 billion each year
[32]. For malaria, efforts to control and eliminate the disease cost the global economy
USD 2.7 billion in 2016, which is still less than the estimated USD 6.5 billion annual
global investment needed by 2020 for the technical strategy to reduce malaria
[4]. Governments in countries where malaria is endemic spent USD 800 million in
2016, while the U.S. contributed USD 1 billion. Other leading contributors to the
global malaria fund in 2016 included the U.K., France, Germany, and Japan [4]. For
HIV, second-line ART costs three times more than first-line therapy, while third-line
drugs cost 18 times as much [4]. And for TB, global healthcare spending at present is
USD 6.9 billion, USD 3.5 billion less than is needed to end TB by 2020. Most
domestic funding comes from BRICS, while high-TB-burden countries outside of
BRICS rely mainly on international funding [26].
Collectively, an estimated 10 million lives are at risk for AMR infections each
year by 2050, at a cumulative economic output loss of USD 100 trillion globally
[2]. The projected economic loss by 2050 is equal to that brought about by the global
financial crisis in 2008, except the loss from AMR will be persistent with no hope for
immediate recovery [33]. The impact on annual global GDP is a decrease estimated
at 1.1–3.8%, which translates to an additional 28.3 million people in
extreme poverty by 2050 [33]. The impact on global healthcare costs by 2050 would
be an annual increase between USD 300 billion and USD 1 trillion. Finally, current
economic gains from continuing antibiotic use in livestock are predicted to decrease
due to a 2.6–7.5% global decline in production each year until 2050 [33]. Figure 5
shows the WB-estimated global GDP by 2050 with low and high AMR prevalence,
in comparison to GDP in the years before and during the most recent global financial
crisis [33].

Fig. 5 The economic costs of AMR by 2050 compared to the 2008–2009 global financial crisis.
(Source: WB Final Report: Drug-Resistant Infections – A Threat to Our Economic Future [33])

3 Solutions to Address AMR Globally and the One Health Approach

In terms of policy, AMR is a global ‘wicked problem’ that cannot be addressed with
traditional policies because solving any of its many facets may create other,
unforeseen problems with no end in sight [34]. Thus, stopping the use of antibiotics
would be counter-productive because it makes humans and animals vulnerable to
many infectious diseases that can be treated with antibiotics [4], without any
guarantees that resistant bacteria already present in the environment will cease to
exist or to affect their health [16]. No new antibiotics have entered the market since
the 1990s [2]. Therefore, there is a need to preserve effective antibiotics for as long
as possible to enable drug discovery, while using them as needed to treat human and
animal infections [16].
In order to preserve effective antibiotics, special interest groups are implementing
isolated interventions that include local consumer advocacies toward natural and
sustainable foods on the market [16], and institutional healthcare practices on
antimicrobial stewardship [35]. For example, Namibia passed a law in 1991 that
banned the use of growth hormones and antibiotics in the beef industry. As a result,
the country now has a thriving industrial beef farming export business [36]. In 2006, the
Ministry of Health in Israel made it mandatory for medical facilities to isolate
patients with carbapenem-resistant Klebsiella and created a permanent authoritative
body to oversee implementation and perform surveillance. These actions controlled
the outbreak of resistant Klebsiella HAI and established evidence-based hospital
infection control procedures in the country [37].

Aside from localized interventions, there were also regionalized bans on the use
of antimicrobials in agriculture, which had much less success. When the
U.S. FDA banned fluoroquinolone use in poultry in 2005, it did not result in a
decrease in fluoroquinolone resistance. Nevertheless, the FDA issued a policy in
2013 for farms to voluntarily stop their routine use of antimicrobials and to consult
veterinarians before they treat animals [16]. When the EU banned the use of all
antimicrobials in animals except those needed for medicinal purposes in 2006, the
outcome was a rapid decrease in the prevalence of VRE in farm animals but not in
humans [16].
Although these isolated interventions are in some ways contributing to preserving
antibiotics, their impact is still only of slight consequence due to the enormity of the
problem of AMR [4]. Interventions with greater impact are needed because the
problem affects all of society [4]. Also, collaboration and coordination are necessary
because the problem involves multiple stakeholders, including the international food, human, animal, and economic sectors [4]. To address the wicked problem of AMR, the WHO leads several initiatives, including World Antibiotic Awareness Week, the Global AMR Surveillance System (GLASS), the Global Antibiotic Research and Development Partnership (GARDP), and the WHO/UN Interagency Coordination Group on AMR (IACG) [4]. Of greatest importance, however, is the 68th World Health Assembly (resolution WHA 68.7), which adopted the Global Action Plan on Antimicrobial Resistance in May 2015 and thereby tackles AMR using a One Health approach.
The goal of the action plan is to ensure that infectious diseases are successfully
treated and prevented for as long as possible with effective, safe, and quality-assured
medicines that are used responsibly and that are accessible to everyone [10]. In order
to achieve its goal, the plan has five strategic objectives:
1. Improve AMR awareness and understanding and promote behavioral change
through mass communication, professional development education, and training;
2. Fill knowledge gaps and strengthen evidence-based surveillance and research
into diagnostics, drug development, alternatives for animal growth promotion,
and economic outcomes;
3. Reduce incidences of AMR infection through hygiene, sanitation, and
immunization;
4. Optimize antimicrobial use in treating human and animal infections by develop-
ing a database on the antimicrobial use in both sectors, encouraging government
regulation on the distribution, use, and quality of antimicrobials, and including
evidence-based prescribing and dispensing in the standard of care; and
5. Develop an economic case for sustainable investment in AMR and increased
spending on diagnostics, new antimicrobials and vaccines, and non-therapeutic
interventions [10].
The global action plan framework takes a multi-sectoral, One Health approach for
collaboration among stakeholders that include the WHO Secretariat, the WHO
Member States, and national and international partners such as FAO, OIE, and the
World Bank [10]. Member States develop national AMR action plans based on

national needs, priorities, and governance, and in parallel with the global action plan
and the standards and guidelines of the Codex Alimentarius Commission, FAO, and
OIE [10]. The Secretariat supports, leads, and collaborates with other stakeholders to
develop, implement, and monitor the effectiveness of their action plans and assess
their investment needs, and publishes biennial progress reports on individual coun-
tries, Member States, and partners [10]. Finally, the national and international
partners coordinate with other stakeholders to avoid duplicated efforts in
implementing, monitoring, and reporting on the effectiveness of country and partner
action plans [38]. Of importance is the partnership among FAO, OIE, and WHO.
This Tripartite was established decades ago because these organizations recognize
that they are jointly responsible for addressing and minimizing the health and socio-
economic impact of zoonotic infections and other diseases of public health impor-
tance [38]. The goal of the Tripartite is to prevent, detect, control, remove, or manage
the threat to humans of disease transmission from domestic and wild animals. To
achieve their goal, the Tripartite establishes government frameworks, early warning
detections, and systems for enhancing coordination and supporting Member
States [38].
In addition to the global action plan on antimicrobials, the WHO published
guidelines on the use of medically important antimicrobials in food-producing
animals. These evidence-based guidelines were developed in consultation with
FAO and OIE and aimed to mitigate the adverse effects in human health of the use
of clinically-relevant antimicrobials in food-production. They present five recom-
mendations and two best practice statements for using the list to preserve effective
antimicrobials that are critically important to human and animal health [39]. The five
recommendations on the use of clinically-relevant antimicrobials are:
1. Overall reduction of all classes for food production;
2. Complete restriction of all classes for growth promotion;
3. Complete restriction of all classes for preventing undiagnosed infectious diseases;
4. Cessation of their use for preventing diseases diagnosed among herds of live-
stock; and
5. Cessation of the use of highest priority drugs for treating livestock diagnosed with
an infectious disease.
The two best practice statements are:
1. New antimicrobials or combination antimicrobials for human use are considered
critically important unless otherwise classified by the WHO; and
2. Clinically-relevant antimicrobials that are not already in use in food production
cannot be used in livestock and plants [39].
These guidelines were based on the 5th revision of the List of Critically Important
Antimicrobials for Human Medicine (WHO CIA List, 2017), which classifies
antimicrobials as important, highly important, or critically important for human
medicine according to specific criteria [39]. A summary of the list that includes
the classification and prioritization of the classes of drugs that are clinically-relevant
is shown in Fig. 6.

Fig. 6 Summary table of the WHO CIA list (5th Revision, 2017). (Source: The WHO Guidelines
on Use of Medically Important Antimicrobials in Food-Producing Animals [39])

4 AMR Future Considerations

The diminishing effectiveness of clinically-relevant antibiotics increases the importance of novel approaches to treating and preventing human and animal infectious
diseases [13]. The one obvious approach is to create new drugs. However, pharma-
ceutical companies are now either reducing or eliminating their antimicrobial units
[13], because they can develop other drugs that sell at far higher prices than antimicrobials and because biochemical screenings have not yielded a

product that is both viable and safe. Therefore, more investment is needed to
incentivize drug discovery. Governing bodies and financing entities need to under-
stand that the costs of prevention would be considerably less than the cure for
resistant infectious diseases or the loss in global productivity due to deaths from
treatment failures [2].
In the absence of new drugs, of potential benefit are behavioral changes that
include proper hygiene, active immunization, and altered practices in industrial
animal farming [11]. However, the more radical approach of national legislation
and international regulation may be what is necessary. It is important that global
stakeholders are committed to and have accountability for reducing the threat of
AMR. One of the current initiatives for commitment is the CDC AMR Challenge.
This year-long challenge was launched on 25 September 2018 to ensure that
antimicrobial resistance continues to be a top priority at the 2019 UN General
Assembly. Its goal is for worldwide stakeholders, including governments, private
enterprises, and non-governmental organizations to publicly and formally announce
their commitment to surveillance, infection control and prevention, antimicrobial
stewardship, environmental issues and sanitation, or advances in diagnostics, drug
development, and improved access [40]. Also in progress is the Tripartite proposal
for a global framework for development and stewardship to combat antimicrobial
resistance, whose goal is to address the gaps in governance in implementing
objectives 4 and 5 of the global action plan on antimicrobial resistance. The
proposed framework is still in its draft stages [41].

5 Conclusion

To summarize our attempt in this chapter to improve the awareness and understand-
ing of AMR as a threat to global health security: AMR is a problem that is hastened
by the imprudent use of antibiotics. Most of the blame is placed on food production
because industrial farming uses more antibiotics for animal growth promotion than
medicine uses for human diseases. Nevertheless, antibiotic vendors, prescribers, and
users are not entirely free of responsibility. An AMR problem in one country is a
problem worldwide due to the interconnectivity of the globalized market. AMR
threatens global health and economic security – it increases the number of people
that can no longer be treated for infections that were once treatable, which results in
poor health outcomes, long hospital stays, treatment failures, and increased
healthcare spending. AMR bacteria in food also threatens global food security.
AMR is a global and multi-sectoral problem that requires global, collaborative,
and coordinated solutions. The WHO-FAO-OIE Tripartite endorses a One Health
approach of involving the human, animal, and environmental sectors in devising
solutions. Through the Eighth World Health Assembly, they altogether created the
Global Action Plan on Antimicrobial Resistance. The List of Critically Important
Antimicrobials for Human Medicine was then updated to complement the global
action plan.

There are context-specific actions against AMR that need to be considered, including new drugs and human behavioral changes, but implementing the One
Health approach of the global action plan is the best action we have available. For the
plan to be successful, however, there needs to be a global commitment to its
initiatives. The CDC and the WHO are ensuring global participation by endorsing
radical approaches that include a formal, public commitment to reducing AMR, and
a framework for national legislation and international regulation.

References

1. Fleming A (1945) Penicillin – Nobel lecture


2. The Review on Antimicrobial Resistance – Chaired by Jim O’Neill (2016) Tackling drug-
resistant infections globally: final report and recommendations
3. Centers for Disease Control and Prevention (CDC). Antibiotic/antimicrobial resistance
(AR/AMR). https://www.cdc.gov/drugresistance/. Updated 10 Sept 2018. Accessed 2 Oct 2018
4. World Health Organization (WHO). Antimicrobial resistance. http://www.who.int/antimicro
bial-resistance/en/. Updated 2018. Accessed 2 Oct 2018
5. Answers. What is the ideal market weight of swine? http://www.answers.com/Q/What_is_the_
ideal_market_weight_of_swine. Updated 2018. Accessed 3 Oct 2018
6. Gerding DN (2001) The search for good antimicrobial stewardship. Jt Comm J Qual Improv 27
(8):403–404
7. HowStuffWorks. How do bacteria become resistant to antibiotics? https://science.
howstuffworks.com/environmental/life/cellular-microscopic/question561.htm. Updated 30 Jan
2001. Accessed 2 Oct 2018
8. Alliance for the Prudent Use of Antibiotics (APUA). General background: antibiotic resistance.
http://emerald.tufts.edu/med/apua/about_issue/about_antibioticres.shtml. Updated 2014.
Accessed 2 Oct 2018
9. Food and Agriculture Organization of the United Nations (FAO) (2016) The FAO action plan
on antimicrobial resistance 2016–2020: supporting the agriculture sectors in implementing the
global action plan on antimicrobial resistance to minimize the impact of antimicrobial resis-
tance. FAO, Rome
10. World Health Organization (WHO) (2015) Global action plan on antimicrobial resistance.
WHO, Geneva
11. Lindmeier C (2017) WHO news – stop using antibiotics in healthy animals to prevent the spread
of antibiotic resistance
12. World Health Organization (WHO). Health topics – news: WHO publishes list of bacteria for
which new antibiotics are urgently needed. http://www.who.int/news-room/detail/27-02-2017-
who-publishes-list-of-bacteria-for-which-new-antibiotics-are-urgently-needed). Updated
27 Feb 2017. Accessed 28 Nov 2018
13. Lowy FD (2003) Antimicrobial resistance: the example of staphylococcus aureus. J Clin Invest
111(9):1265–1273
14. Centers for Disease Control and Prevention (CDC). Antibiotic prescribing and use. https://
www.cdc.gov/antibiotic-use/. Updated 1 Oct 2018. Accessed 11 Oct 2018
15. Marshall BM, Ochieng DJ, Levy SB (2009) Commensals: underappreciated reservoir of
antibiotic resistance. Microbe Mag 4(5):231–238
16. Chang Q, Wang W, Regev-Yochay G, Lipsitch M, Hanage WP (2015) Antibiotics in agricul-
ture and the risk to human health: how worried should we be? Evol Appl 8(3):240–247

17. Antibiotics BK (2014) Breeding superbugs. In: Chemnitz C, Becheva S, Bartz D, Mundy P
(eds) Meat atlas: facts and figures about the animals we eat, 1st edn. Heinrich Boll Foundation
and Friends of the Earth Europe, Berlin, pp 26–27
18. Li S, Shi W, Liu W et al (2018) A duodecennial national synthesis of antibiotics in China’s
major rivers and seas (2005–2016). Sci Total Environ 615:906–917
19. Marathe NP, Pal C, Gaikwad SS, Jonsson V, Kristiansson E, Larsson DGJ (2017) Untreated
urban waste contaminates Indian River sediments with resistance genes to last resort antibiotics.
Water Res 124:388–397
20. Centers for Disease Control and Prevention (CDC) (2013) Antibiotic resistance threats in the
United States, 2013
21. World Health Organization (WHO). Health care-associated infections: fact sheet. http://www.
who.int/gpsc/country_work/gpsc_ccisc_fact_sheet_en.pdf?ua=1. Updated 2016. Accessed
15 Nov 2018
22. World Health Organization (WHO). Malaria: information for travellers. http://www.who.int/
malaria/travellers/en/. Updated 27 Feb 2018. Accessed 14 Nov 2018
23. World Health Organization (WHO) (2017) World malaria report: 2017. WHO, Geneva.
Licence: CC BY-NC-SA 3.0 IGO; 2017; No. 2018
24. World Health Organization (WHO). Fact sheet: HIV/AIDS. http://www.who.int/news-room/
fact-sheets/detail/hiv-aids. Updated 19 July 2018. Accessed 14 Nov 2018
25. World Health Organization (WHO). Fact sheet: tuberculosis. http://www.who.int/en/news-
room/fact-sheets/detail/tuberculosis. Updated 18 Sept 2018. Accessed 14 Nov 2018
26. World Health Organization (WHO) (2018) Global tuberculosis report 2018. WHO, Geneva. Licence: CC BY-NC-SA 3.0 IGO
27. Beer KD, Farnon EC, Jain S et al (2018) Multidrug-resistant aspergillus fumigatus carrying
mutations linked to environmental fungicide exposure – three states, 2010–2017. MMWR
Morb Mortal Wkly Rep 67(38):1064–1067
28. Chemnitz C (2014) The rise of the global market. In: Chemnitz C, Becheva S, Bartz D, Mundy
P (eds) Meat atlas: facts and figures about the animals we eat, 1st edn. Heinrich Boll Foundation
and Friends of the Earth Europe, Berlin, pp 10–11
29. Willett M (2014) Business insider: how people consume meat around the world [CHARTS]
30. Van Boeckel TP, Brower C, Gilbert M et al (2015) Global trends in antimicrobial use in food
animals. Proc Natl Acad Sci U S A 112(18):5649–5654
31. World Health Organization (WHO) (2011) Report on the burden of endemic health care-
associated infection worldwide: clean care is safer care. WHO, Geneva
32. Centers for Disease Control and Prevention (CDC). Health topics – healthcare-associated
infections (HAI). https://www.cdc.gov/policy/polaris/healthtopics/hai.html. Updated June
8, 2018. Accessed 14 Nov 2018
33. World Bank Group (2017) Final report: drug resistant infections – a threat to our economic
future
34. Hutchinson E (2017) Governing antimicrobial resistance: wickedness, competing interpreta-
tions and the quest for global norms
35. Doron S, Davidson LE (2011) Antimicrobial stewardship. Mayo Clin Proc 86(11):1113–1123
36. World Health Organization (WHO). WHO news – feature stories: Namibia’s ban on antibiotics
in healthy animals drives meat exports. http://www.who.int/news-room/feature-stories/detail/
namibia-s-ban-on-antibiotics-in-healthy-animals-drives-meat-exports. Updated 14 Nov 2017.
Accessed 3 Oct 2018
37. World Health Organization (WHO). WHO news – feature stories: facing the threat of antibiotic-
resistance: Israel’s success to prevent and control the spread of carbapenem-resistant bacteria.
http://www.who.int/news-room/feature-stories/detail/facing-the-threat-of-antibiotic-resistance-
israel-s-success-to-prevent-and-control-the-spread-of-carbapenem-resistant-bacteria. Updated
8 Nov 2017. Accessed 3 Oct 2018

38. FAO, OIE, WHO (2010) The FAO-OIE-WHO collaboration: sharing responsibilities and
coordinating global activities to address health risks at the animal-human-ecosystems interfaces
– a tripartite concept note
39. World Health Organization (WHO) (2017) WHO guidelines on use of medically important antimicrobials in food-producing animals. WHO, Geneva. Licence: CC BY-NC-SA 3.0 IGO
40. Centers for Disease Control and Prevention (CDC). The AMR challenge. https://www.cdc.gov/
drugresistance/intl-activities/amr-challenge.html. Updated 25 Sept 2018. Accessed 25 Sept
2018
41. World Health Organization (WHO) (2017) Global framework for development & stewardship
to combat antimicrobial resistance – draft roadmap
Food Security: Microbiological
and Chemical Risks

Joergen Schlundt, Moon Y. F. Tay, Hu Chengcheng, and Chen Liwei

1 Introduction

Food Security is defined by the UN Food and Agricultural Organization (FAO) to exist when “all people, at all times, have physical and economic access to sufficient,
safe and nutritious food.” Global (public) health security on the other hand is defined
by WHO (World Health Organization) as “the activities required, both proactive and
reactive, to minimize vulnerability to acute public health events that endanger the
collective health of populations.”
Food Security, as seen within a health security context, therefore relates to the
systems to be put in place to deal with acute events related to foodborne hazards, be
they chemical or microbiological in nature. However, it should be noted that systems
aimed at prevention of acute foodborne events (outbreaks) are not inherently differ-
ent from the systems dealing with the prevention of foodborne events in general,
i.e. the same systems that are used to deal with outbreaks are also used to deal with
sporadic foodborne cases, and are also – at least in principle – dealing with chronic
foodborne disease.
While surveillance systems – and especially those aimed at acute risks – are often primarily focused on microbiological hazards, chemical
hazards in general constitute a very significant part of food safety problems. This
is why systems aimed at providing data for food contamination and foodborne
disease prevention must consider both microbiological and chemical hazards.

J. Schlundt (*) · M. Y. F. Tay · H. Chengcheng · C. Liwei


School of Chemical and Biomedical Engineering, Nanyang Technological University (NTU),
Singapore, Singapore
Nanyang Technological University Food Technology Centre (NAFTEC), Nanyang
Technological University (NTU), Singapore, Singapore
e-mail: jschlundt@ntu.edu.sg


The description of food security systems in this chapter will hence include all
these areas – recognizing that in most countries there is only one system dealing with
the prevention of foodborne diseases, typically governed by a food safety authority,
in some cases several authorities. The food safety regulatory system includes
oversight of both microbiological and chemical hazards, which again both can
cause acute as well as chronic disease events.
This Chapter will include a description of existing (national and international)
surveillance systems, the existing models for food safety risk assessment as well as
examples of risk mitigation action within the most recent food safety systems
development for both chemical and microbiological hazards (physical hazards can
also be important but generally constitute a minor proportion of food safety
problems).
With food security issues relating to ‘sufficient food’, the disasters related to
famine need to be included also. While it is generally recognized that the world food
production capacity is certainly sufficient even for a global population of 10 billion,
it is more questionable if the present methods of production are sustainable, for
instance relative to water and land use, phosphorus and nitrogen flows, CO2 and NH3
contamination, etc. Thus, the most important issue related to food production
becomes that food systems must be transformed to produce more nutritious food
with a lower environmental footprint [1].
Therefore, food security within a health security context needs to also include
reflexions on sustainability of food production, both relative to assessment systems
and mitigation action. The Chapter will therefore also describe the relevant meth-
odology for quantitative sustainability assessment, as well as the concept of global
sustainability limits (boundaries) and potential action within agri- and aquaculture
production systems to improve sustainability.
The sustainability of food production is not only related to the environment but
can overlap with the safety of the food produced. Several examples of such interac-
tions will be discussed, including one important example related to both human and
animal health which is the increase in antimicrobial resistance (AMR) of foodborne
microorganisms. The increase in the number of multi-resistant bacteria (resistant to
several antimicrobials) threatens a return to the pre-antibiotic era, where a simple
scratch or a sore throat could be life-threatening. It is estimated that 700,000 die
globally every year from AMR microorganisms, and that this figure will escalate to
10 million by 2050 (the global death toll from cancer is 8 million) [2]. What fraction of this problem is caused by the use of antimicrobials in animals is still debated, but it is likely to be significant. The Chapter will describe existing (national
and international) surveillance systems as well as examples of risk mitigation action.
In relation to the future of global surveillance of communicable (including
foodborne) diseases, the present and future use of Next Generation Sequencing
(NGS) will be described. The sharing of Whole Genome Sequences (WGS) of all
microorganisms gives us a potential to develop standardized global surveillance of
all microorganisms as well as AMR, providing a basis for global, regional and
national ‘One Health’ interventions to analyze, control and minimize the problem.

2 Food Safety Monitoring and Surveillance, Risk Assessment and Risk Mitigation

Decisions about policies aimed at preventing food contamination and foodborne illness have become very important in general, both nationally and internationally.
Infectious disease surveillance systems are used both in relation to human and
animal health. These systems are typically set up to collect data on the occurrence
of diseases in humans and/or animals, thereby enabling identification of outbreaks,
tracking the spread of diseases and providing early warning for national as well as
international human and animal health institutions. Food Safety surveillance systems
are focused on either food contamination, typically referred to as food monitoring, or
foodborne disease surveillance.
Recent experience has shown that these traditional systems are not always
effective or timely in alerting officials to newly emerging foodborne or
zoonotic diseases: diseases transmitted between animals and humans [3]. Examples
include HIV/AIDS, variant Creutzfeldt-Jakob Disease, Highly Pathogenic Avian
Influenza (HPAI) such as H5N1 and H1N1 (pandemic) AI virus. Zoonotic disease
outbreaks seem to be increasing in number. Out of 175 pathogenic microbiological
species considered to be ‘emerging’, it is estimated that 132 (75%) are zoonotic, and
overall, zoonotic pathogens are twice as likely to be associated with emerging
diseases than non-zoonotic pathogens [4].
Most of the important zoonoses relate in some way to animals in the food
production chain. Therefore food becomes an important vehicle for many zoonotic
pathogens. Zoonotic diseases related to food animals can be separated into three
groups [5]. In the first group are diseases with a potential for global spread and with a
dramatic public relations potential; often these diseases have a large human reservoir
showing some level of human-human transmission, e.g. SARS, HPAI (H1N1) and
certain types of Antimicrobial Resistant (AMR) bacteria. The second group relates to
the industrialized food production chain, such as Salmonella and Campylobacter,
human pathogens that are often non-pathogenic in animals and seem to be distrib-
uted in all countries, both rich and poor. In the third group are the ‘neglected
zoonotic diseases’. They are zoonotic diseases which have been eradicated
(or drastically reduced) in affluent economies through vaccination, culling policies,
and/or introduction of better animal management practices. However, in many poor
settings these diseases are ‘neglected diseases’ and receive very little attention from
national authorities or even international organizations. This group includes Bru-
cella, bovine TB (tuberculosis), and many parasitic diseases, e.g. leishmaniasis and
cysticercosis.
In addition to the factors described above, food production and food trade are now more and more global, and thus some food-related problems are also global
food problems. On the positive side globalization has helped with some of the
important global food issues: it has helped deal with – at least to some degree –
food insecurity in the most dramatic form, i.e. famine. To the extent that we still do
have famine occurring in certain regions, it is more an outcome of (political) inability

to distribute the food that we actually produce.1 In recent decades the occurrence of
major famines has diminished significantly and abruptly as compared to earlier eras.
However, together with food, foodborne diseases now also travel the globe. If we do not stay on top of the problem, disease outbreaks might affect large parts of the global food sector, ultimately leading to significant negative health impacts as well as negative financial and socio-economic
effects. A more holistic and pro-active approach to food safety and disease surveil-
lance may help prevent future food disasters and in the process build healthy
economies.
One of the major issues related to regulatory action in food safety over the latest
decades has been the lack of cross-sectoral collaboration across the food production
chain. Major food safety events have been significantly affected by the lack of
collaboration between the animal health, the food control, and the human health
sector. This led to renewed international action, or maybe more correctly put: it led
to discussion about how this apparent lack of coordination could be mitigated,
which resulted in the creation of the (somewhat) novel concept of ‘One Health’.

2.1 The Need for One Health Surveillance: The Zoonotic Influenza Virus Examples

It was primarily the outbreaks of SARS (Severe Acute Respiratory Syndrome), zoonotic influenza, and BSE (Bovine Spongiform Encephalopathy) which alerted
the world to the need for a One Health approach. Outbreaks of viral diseases in
humans, originating in or spreading through farm animals (avian flu – influenza A
(H5N1) and swine flu – influenza A(H1N1)) have caused major global alerts in the
last decades. These influenza outbreaks spread very quickly, either in the animal
population (H5N1) or directly in the human population (H1N1), and formed a global
threat for human health. H1N1 was therefore characterized by the WHO as a
pandemic. Although the total human disease burden related to the endemic bacterial zoonoses is probably many fold higher than that of these influenza outbreaks, it
is basically these relatively few but fast spreading outbreaks that have put One
Health on the global agenda. In addition, the failure to predict, monitor and control
the spread of these diseases in animals presented regulators and politicians with a
wake-up call, and made them demand (better) cross-sectoral collaboration between
the animal and human health sectors [5].
Avian influenza (AI), caused by the influenza A virus, is one of several zoonotic
influenza diseases. Although WHO for some time – and maybe under pressure from
major pork producers – maintained that swine influenza should not be characterized
as such, most scientists (including WHO) now refer to these influenza types as per

1 This is not to underplay the very real risk facing the roughly 80 million people currently living in a state of crisis-level food insecurity and requiring urgent action.

their ‘natural’ host. Humans can be infected with avian, swine and other zoonotic
influenza viruses, such as avian influenza virus subtypes A(H5N1), A(H7N9), and A
(H9N2) and swine influenza virus subtypes A(H1N1), A(H1N2) and A(H3N2). The
majority of human cases of avian influenza A virus infections have been associated
with direct or indirect contact with infected live or dead poultry. Controlling the
disease in the animal source is critical to decrease risk to humans. Zoonotic influenza
infection in humans will continue to occur, notably from avian and other animal
sources. To minimize public health risk, surveillance in both animal and human
populations is essential.
Avian Influenza A outbreaks in birds not only impact animal production, but also
give rise to a risk in food caused by viral contamination of poultry products in the
food supply chain. Distinctions in Avian Influenza outbreaks between strains H5N1
and H7N9 indicate that early detection of the AI virus in poultry is crucial for the
effective warning and control of AI to ensure food safety. Therefore, the establish-
ment of a poultry surveillance system for food safety by early detection is urgent and
critical [6].
Global human outbreaks of swine influenza A(H1N1) are not as prevalent as
human outbreaks related to Avian influenza, but have been more dramatic in
outcome. Notably the influenza pandemic in 1918 was caused by a strain of
influenza A(H1N1). In June 2009, the WHO issued a pandemic alert concerning
the spread of an influenza A(H1N1) virus, originally characterized in April 2009 in
human patients in California and Texas, USA, and in patients from Mexico, which
were likely closer to the original jump from the porcine reservoir [7]. This strain
showed distinctive genetic characteristics, with a main mutation in the gene coding
for hemagglutinin (HA). The remarkable feature of A/(H1N1)pdm09, compared
with seasonal strains, is its high fatality rate and its higher incidence among younger
people [8].
Conventional methods usually applied for the purpose of AI diagnosis face some
practical challenges in animal production chains. To establish a comprehensive
poultry surveillance program throughout the poultry supply chain, systematic
approaches and integrated methods are needed at every stage of this chain to limit
AI outbreaks in animals and prevent AI outbreaks in humans. It should be noted that
the novel application of close to real-time characterization of influenza virus strains
using next generation sequencing is a very promising development in this area [9].

2.2 The Future of One Health Food Safety

Future achievements in food safety, public health and welfare will largely be based
on how well politicians, researchers, industry, national agencies and other stake-
holders manage to collaborate using the One Health approach. Data on occurrence
and disease burden from foodborne hazards combined with knowledge of source
attribution are crucial in assessing costs and benefits of control measures. Food
safety resources should be allocated where they contribute most to One Health

benefits. Without knowledge of the incidence and burden of disease associated with
particular pathogen/food commodity combinations, prioritization of foodborne haz-
ards for mitigation action is difficult [10].
The three most relevant international organizations in this area (WHO, FAO,
OIE) have recognized that combating zoonoses is best achieved via a One Health
approach, as stated in their seminal paper ‘A Tripartite Concept Note’ [11] (OIE:
World Organization for Animal Health). Given the impact that zoonotic diseases are
recognized to have in socio-economical terms, a One Health vision is also endorsed
by the World Bank (WB) and the United Nations Children’s Fund (UNICEF) [12].
While the groups of zoonotic diseases mentioned above are very different, they
are all most efficiently prevented by a One Health approach which considers the full
farm-to-fork chain, and is based on surveillance data covering the full food produc-
tion chain, and linked to public health human disease data in the end. Such preven-
tive and holistic approaches may reduce both the disease burden to human health and
the economic burden to developing economies, and therefore represent a significant
potential for improvement as seen in a One Health perspective.
A number of food related chemical hazards are shared by animals and people
either directly, through food or through the environment, and should therefore be covered
within the One Health framework. Corn, for example, is a shared food ingredient and
can be a source of aflatoxin poisoning for both people and animals if contaminated.
The recent melamine poisoning of pets in North America and children in China has
highlighted the need for joint One Health investigations. In the melamine poisoning
outbreaks, nephrotoxicity was observed in pets in 2007 and subsequently in Chinese
infants and children in 2008 [13]. Chemical food contamination is a major cross-
cutting issue: pesticides and other chemicals are often used in food production,
sometimes inappropriately, providing opportunities for residues at dangerous levels
in food products. The question of the exorbitant use of antimicrobials in animals will
be dealt with in Sect. 4. The subject of animals as sentinels of environmental and
ecosystem health has been discussed by the toxicology community for over 30 years
[13]. In major contamination events the entire ecosystem, including people, is often
affected by the pollution. Therefore, One Health monitoring and surveillance sys-
tems should clearly include chemical hazards.
The use of pesticides can protect crops and prevent post-harvest losses, thus
contributing to food security. The development of pesticides was fundamental to the
Green Revolution and transformation of modern agriculture. More recently evidence
of the serious impacts on the environment has emerged. Pesticide misuse and
pesticides as water pollutants are increasingly serious global challenges resulting
in heavy environmental pollution and most likely significant health risks for humans
[14]. Pesticide monitoring data from EU countries are reported to EFSA, and
typically used to evaluate the proportion of samples in which the Maximum Residue Level (MRL) is exceeded. In the period 2013–2015, for a total of 28,912 conventional and 1940 organic food samples, the MRL exceedance rates for conventional and organic food amounted to 1.2% and 0.2%, respectively [15]. It is important to note that the MRLs do not
directly reflect human health effect limits; instead they reflect the lowest level
manageable to maintain pest-killing effect under present agricultural methods.

Notably, the most significant human health risk related to pesticide use in
agriculture is pesticide poisoning. In the USA the Environmental Protection Agency
(EPA) conducts poisoning surveillance to determine whether labeling is effective.
Based on this EPA can require that interventions be instituted that involve changing
pesticide use practices, and the appropriate interventions for these cases include
enhanced education and enforcement [16]. On average, in Germany almost
200 annual cases of pesticide poisoning are hospitalized, and approx. 5% of such
poisoning cases are reported to be fatal [17]. Most likely such figures are signifi-
cantly higher in developing countries with less efficient health systems. Data from
monitoring systems focused on pesticides and other chemicals in food will in the
future also be used for risk assessment of combined exposure to multiple chemicals
(“chemical mixtures”). Typically, risk assessment of multiple chemicals is
conducted using a tiered approach for exposure assessment, hazard assessment and
risk characterization [18], an approach that clearly needs updating as new data from animal experiments show the potential for additive effects of such chemicals [19].
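As an illustration of what the lowest tier of such a combined-exposure assessment can look like in practice, the sketch below sums hazard quotients (estimated exposure divided by a health-based reference dose) into a hazard index, the simplest screening approach for chemical mixtures. The substances, exposure estimates and reference doses are hypothetical placeholders, not real monitoring data.

# Minimal sketch of a lowest-tier hazard-index screen for combined exposure to
# multiple chemicals. Substance names, exposures and reference doses (RfD) are
# hypothetical placeholders; units are mg per kg body weight per day.

substances = {
    "pesticide_A":   {"exposure": 0.0020, "rfd": 0.0100},
    "pesticide_B":   {"exposure": 0.0005, "rfd": 0.0050},
    "contaminant_C": {"exposure": 0.0001, "rfd": 0.0008},
}

hazard_quotients = {name: v["exposure"] / v["rfd"] for name, v in substances.items()}
hazard_index = sum(hazard_quotients.values())

for name, hq in hazard_quotients.items():
    print(f"{name}: HQ = {hq:.2f}")
# A hazard index at or above 1 would normally trigger a more refined, higher-tier assessment.
print(f"Hazard index = {hazard_index:.2f}")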
One Health formulates both the need for, and the benefit of, cross-sectoral
collaboration. Here we will focus on the human health risk related to hazards present
both in plants grown for food and food animals and food derived from these animals,
and typically transmitted to humans through food. Some diseases have global
epidemic – or pandemic – potential, resulting in dramatic action from international
organizations and national agricultural- and health authorities in most countries, for
instance as was the case with avian influenza. Other diseases relate to the industri-
alized food production chain and have been – in some settings – dealt with efficiently
through farm-to-fork preventive action in the animal sector, e.g. Salmonella, or in the
plant production sector, e.g. DDT.

2.3 International, Regional and National Examples of Food Safety Surveillance and Risk Assessment

This section will list a number of – in no way fully representative – examples of
existing food safety systems with a focus on national and regional surveillance
systems providing data for food contamination and foodborne disease, resulting
from microbiological as well as chemical hazards. The section will also briefly
describe examples of the now accepted methodology for science-based decision
support based on such data: risk assessment, within the risk analysis framework,
initially defined by WHO and FAO [20].
The examples described here are in no way intended to provide a full picture of
developments in this area, however they do represent some of the novel develop-
ments in the food safety area that have contributed to significantly revise food
control, food safety and foodborne disease prevention over the latest decades.

2.3.1 FAO/WHO Food Safety Expert Bodies

FAO and WHO work together to provide scientific guidance on chemical as well as
microbiological hazards and the human health risk they cause. The first FAO/WHO
Expert Committee was the Joint Expert Committee on Food Additives (JECFA)
created in 1955 to study the impact of food additives, including veterinary drugs,
chemicals and toxins on human health. As an independent group, JECFA advises the
FAO/WHO Codex Alimentarius Commission and other Codex bodies on current
and emerging issues in this area. In 1963 an additional group dealing with chemical
safety assessment was created: the Joint FAO/WHO Meeting on Pesticide Residues
(JMPR), advising the Codex Alimentarius Commission on maximum residue levels
for pesticides and environmental contaminants in food products. More recently the
problems of microbiological contaminants in food have resulted in the creation of a
Joint FAO/WHO Expert meeting on Microbiological Risk Assessment (JEMRA).
This body has since 2000, in collaboration with the Codex Committee on Food
Hygiene, initiated risk assessment work on a number of important foodborne
pathogens (e.g. Salmonella, Campylobacter, Listeria and Vibrio). All three expert
bodies operate under the generic Risk Analysis framework focusing on a formalized
and standardized risk assessment process (see Fig. 1).
The Codex Alimentarius Commission (Codex) as a subsidiary body of FAO and
WHO is one of the most important and successful multilateral institutional mecha-
nisms for regulatory harmonization and standards cooperation in the global system.
In many ways, the success of Codex as the multilateral institutional standard-setting
mechanism for food safety is the result of well-defined normative agreements, and
well segmented and sustained work on strategic market and regulatory policy issues

Fig. 1 Components of a microbiological Risk Assessment: Hazard Identification, Hazard Characterization, Exposure Assessment, and Risk Characterization

in the global food system. It is likely that the clear definition – and separation – of the
scientific advice provided by the FAO/WHO Expert groups and the management
decisions suggested by the Codex Committees has contributed significantly to this
success. Through the long-term focus on social, economic and scientific aspects of
food safety regulation, the institutional legitimacy of the Codex Alimentarius Com-
mission has grown as globalization of the agri-food industries and food systems
accelerates. Consequently, the ability of FAO, WHO and Codex to effectively
mobilize national governments, industry and civil society in support of food safety
regulatory standards harmonization reinforces the need for increased multilateral
cooperation in this area.

2.3.2 WHO/FAO International Food Safety Authorities Network (INFOSAN)

It has been recognized for some years that the increased globalization of food trade
also increases the risk of contaminated food spreading quickly around the globe. In
2004 WHO created the INFOSAN network to enable WHO to assist Member States
in managing food safety risks and ensure rapid sharing of information during food
safety emergencies, later INFOSAN became a joint Network with FAO. INFOSAN
also facilitates the sharing of experiences and tested solutions in and between
countries in order to optimize future interventions to protect the health of consumers.
National authorities of 186 Member States are part of the network [21].
INFOSAN Member States typically have an Emergency Contact Point and
several Focal Points. Members are expected to respond to requests for information
and take the initiative to share and disseminate food safety information of potential
international relevance. In the 2016/17 biennium INFOSAN was operational
during 84 food safety events. The level of engagement by the INFOSAN Secretariat
relates to the countries involved, the severity of the public health impact, and the
duration of the event. In many cases, the INFOSAN Secretariat will request infor-
mation from INFOSAN Emergency Contact Points following the receipt of infor-
mation about a food safety event of potential international concern. During complex
events involving multiple countries, the INFOSAN Secretariat actively obtains and
disseminates information to and from INFOSAN members regarding food safety
events of international concern. INFOSAN is considered to be functioning under the
umbrella of the WHO International Health Regulation (IHR) which stipulates that
any country experiencing a ‘public health event of potential international concern’
(PHEIC) must inform WHO – and thereby the world – about this event. INFOSAN
does not have regulatory oversight, as the nature of the United Nations system precludes such action.
The total number of events treated under INFOSAN pales in comparison with
systems that are part of food legislation (e.g. EU Rapid Alert System for Food and
Feed: RASFF). Nevertheless, the nature of events reported under INFOSAN can
give us an idea of global trends. Table 1 lists the events as recorded per hazard type
(biological, chemical etc.), clearly showing the biological (microbiological) events

Table 1 International food safety events acted upon by INFOSAN by hazard category, 2013–2017 [22]

Hazard                 2017 (N = 44 events)   2016 (N = 40 events)   2015 (N = 37 events)   2014 (N = 40 events)   2013 (N = 44 events)
                       n (%)                  n (%)                  n (%)                  n (%)                  n (%)
Biological             28 (64%)               30 (75%)               22 (59%)               26 (65%)               28 (64%)
Chemical               7 (16%)                5 (12%)                8 (22%)                10 (25%)               15 (34%)
Physical               1 (2%)                 3 (8%)                 3 (8%)                 1 (3%)                 –
Undeclared allergen    3 (7%)                 –                      3 (8%)                 2 (5%)                 –
Unknown                5 (11%)                2 (5%)                 1 (3%)                 1 (3%)                 1 (2%)

occur most frequently. Table 2 describes events caused by (micro)biological hazards, with Salmonella and Listeria – as expected – causing the highest number of
internationally important events. It should be noted that the recognition of events
under the INFOSAN Network in no way can be said to give a scientifically valid
estimation of the public health importance of different microbiological (or chemical)
hazards.

2.3.3 European Union

The routine monitoring of reported data related to food contamination events in all
EU Member States is available for Member States through the EU Rapid Alert
System on Food and Feed (RASFF) through the database maintained by the
European Commission (EC) [23]. This source of information, largely based on
surveillance and inspection programs driven by food safety contamination events
reported by Member States, strongly depends on the nature of national monitoring
and control programs. The filtering of notifications to be sent to the RASFF at a
national level is only partially standardized and an unknown proportion of food
incidents occurring at a national level never enters the system [24]. It is evident
that the very high number of alerts recorded (See Table 3 and Fig. 2, [23]) means that
many events will not really be treated further by most Member States in any major
way. However, the system and the relatively new set-up which enables open sharing
of data also outside regulatory agencies (RASFF Consumer Portal) clearly contrib-
utes to transparency and open risk communication across borders. It should be noted
that other regions are trying to set up mirror RASFF systems, notably there is now an
ASEAN-RASFF system open for alert sharing in the ASEAN region – although not
yet effective (http://arasff.net/). A new plan of action for A-RASFF was adopted
in October 2018 [25].
All member states within the European Union (EU) are obliged to collect data on
occurrence of zoonoses, zoonotic agents, antimicrobial resistance, animal
populations, and foodborne outbreaks, according to Directive 2003/99/EC. These
reports enable evaluation of trends and sources of zoonotic agents, antimicrobial
resistance and foodborne outbreaks within the EU [26]. It is noteworthy that these

Table 2 International food safety events, acted upon by INFOSAN, involving biological hazards, 2013–2017 [22]

Biological hazard             2017 (N = 28 events)   2016 (N = 30 events)   2015 (N = 22 events)   2014 (N = 26 events)   2013 (N = 28 events)
                              n (%)                  n (%)                  n (%)                  n (%)                  n (%)
Anisakis                      1 (4%)                 –                      –                      –                      –
Bacillus spp.                 –                      2 (7%)                 1 (5%)                 2 (8%)                 –
Brucella spp.                 –                      1 (3%)                 –                      –                      –
Campylobacter                 –                      2a (7%)                –                      –                      –
Clostridium spp.              4 (14%)                5 (17%)                4 (18%)                2 (8%)                 4 (14%)
Cronobacter sakazakii         1 (4%)                 –                      –                      –                      –
Cyclospora cayetanensis       –                      1 (3%)                 2 (9%)                 –                      –
Datura stramonium             –                      –                      –                      –                      1 (4%)
Dead lizard                   –                      –                      –                      1 (4%)                 –
Escherichia coli              4 (14%)                2 (7%)                 1 (5%)                 4 (15%)                3 (11%)
Hepatitis A virus             1 (4%)                 3 (10%)                2 (9%)                 1 (4%)                 4 (14%)
Influenza A virus (H7N9)      –                      –                      –                      –                      1 (4%)
Listeria monocytogenes        5 (17%)                2 (7%)                 3 (14%)                5 (19%)                5 (18%)
Norovirus                     1 (4%)                 2a (7%)                3 (14%)                1 (4%)                 1 (4%)
Rhizopus oryzae               –                      –                      –                      1 (4%)                 –
Salmonella enterica spp.      11 (39%)               10a (30%)              4 (18%)                6 (23%)                7 (25%)
Schmallenberg virus           –                      –                      –                      –                      1 (4%)
Shigella spp.                 –                      –                      1 (5%)                 –                      –
Staphylococcus aureus         –                      –                      –                      –                      1 (4%)
Trichinella                   –                      –                      –                      1 (4%)                 –
Unknown                       –                      1 (3%)                 1 (5%)                 –                      –
Vibrio spp.                   –                      1 (3%)                 –                      1 (4%)                 –
Yersinia pseudotuberculosis   –                      –                      –                      1 (4%)                 –

a 1 event involved Campylobacter, Norovirus and Salmonella enterica spp. in 2016

reports have been effective in directing Member State efforts in the area, for example
specific efforts to mitigate Salmonella risk in EU countries seem to have been
effective as documented in an almost 50% reduction in human Salmonella cases in
the EU over a short 5 year period (2004–2009) [27]. At the same time, the prevalence
of Salmonella in poultry decreased significantly, especially in laying hen flocks,
and this reduction is likely to be the main reason for the decline of Salmo-
nella cases in humans, since eggs are considered the most important source of human
infections in the EU. Notably some EU Member States have even succeeded in
eradicating Salmonella in egg-laying hens and thereby in nationally produced eggs
for the market. The most convincing documentation for this comes from Denmark

Table 3 Evolution of the number of notifications, by notification classification (original notifications and follow-up), from 2011 to 2017

         Alert                    Border rejection         Information for attention   Information for follow-up
Year     Original   Follow-up     Original   Follow-up     Original   Follow-up        Original   Follow-up
2011     617        2265          1820       1053          720        480              550        1126
2012     522        2312          1712       906           679        664              507        1325
2013     584        2376          1438       525           679        763              429        1493
2014     725        3280          1357       581           605        670              402        1377
2015     748        4028          1376       417           475        538              378        1222
2016     817        4659          1159       421           573        704              372        1504
2017     927        5781          1570       771           683        979              586        1586
Data are from RASFF: EU Rapid Alert System for Food and Feed [23]

Fig. 2 Evolution of the number of notifications, by notification classification (original notifications and follow-up), from 2011 to 2017. (Data are from RASFF: EU Rapid Alert System for Food and Feed [23])

where in 2015 a record low number of foodborne salmonella cases were registered
with no cases attributed to Danish eggs for the first time in the almost 30-year history
of the salmonella source account in that country [28].
The European food safety system underwent a very dramatic revision following
several food scandals in the late 1990s, most notably the ‘Mad Cows Disease’
scandal. Regulation (EC) No 178/2002 of 2002 laid down the general principles
and requirements of a European food law, establishing the European Food Safety
Authority (EFSA) and all procedures in matters of food safety for all Member States.

It is noteworthy that while food safety provisions fall under EU authority, health issues are
typically under Member State authority. This sometimes causes problems at EU
level when food (EFSA) and health (ECDC) data are collated. It should be said,
however, that the collaboration between EFSA and ECDC has improved markedly
over the latest years.
EFSA – in spite of its name – is not an Authority in the typical sense of that word; the regulatory entity in the EU system is the EU Commission. EFSA’s responsibility is
risk assessment and risk communication, which naturally includes overseeing mon-
itoring and surveillance systems in collaboration with Member States. EFSA there-
fore has initiated a number of Expert Panels. As an example of the type of work
performed by such panels the EFSA Panel on Biological Hazards (BIOHAZ) has
published two EU-wide farm-to-fork quantitative microbiological risk assessments
(QMRA), with regard to Salmonella in slaughter and breeder pigs and Campylo-
bacter in broilers. The Scientific Opinion on a QMRA of Salmonella in pigs
represented a major step forward in terms of modelling from farm to consumption
as it took into account the variability between and within EU Member States. This
QMRA model was developed to estimate the prevalence of infection and contami-
nation and the microbial load from the farm to the point of consumption (exposure)
and then estimating the probability of infection. It was also used to investigate the
effect of interventions to control Salmonella in pigs at different points of the food
chain and resulted in a hierarchy of suggested on-farm and slaughterhouse control
measures, with estimates of the reduction of human cases they would result in
[29]. To model the effect of interventions from farm to fork on the incidence of
human campylobacteriosis, a QMRA model was developed for Campylobacter in
broiler meat. Reductions to the public health risk of campylobacteriosis could be
achieved through a variety of interventions, both in primary production or at the
slaughterhouse, with different impacts. Reductions of public health risk using targets
at primary production or microbiological criteria were also estimated through
modelling using additional models.
In general, QMRA of food-borne pathogens at European level has proven a
useful and efficient tool to enable risk managers to evaluate the feasibility and the
cost-benefit ratio of introducing control measures and targets to further protect public
health of consumers [29].
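As a much-simplified illustration of the final exposure and risk-characterization steps of such a QMRA, the sketch below passes an assumed dose of Campylobacter per contaminated serving through a Beta-Poisson dose-response model to estimate a per-serving and an annual probability of illness. All parameter values, including the dose, the dose-response parameters and the number of contaminated servings, are illustrative assumptions and are not taken from the EFSA opinions.

# Much-simplified sketch of the dose-response / risk-characterization step of a
# QMRA. All numbers below are illustrative assumptions.

def beta_poisson_p_infection(dose, alpha, beta):
    """Approximate Beta-Poisson dose-response model: P(inf) = 1 - (1 + dose/beta)**(-alpha)."""
    return 1.0 - (1.0 + dose / beta) ** (-alpha)

dose_per_serving = 50.0        # assumed mean CFU ingested per contaminated serving
alpha, beta = 0.145, 7.59      # placeholder dose-response parameters
p_ill_given_inf = 0.3          # assumed probability of illness given infection
servings_per_year = 5          # assumed contaminated servings per consumer per year

p_inf = beta_poisson_p_infection(dose_per_serving, alpha, beta)
annual_risk = 1.0 - (1.0 - p_inf * p_ill_given_inf) ** servings_per_year

print(f"P(infection) per contaminated serving: {p_inf:.3f}")
print(f"Approximate annual probability of illness: {annual_risk:.3f}")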
Since its creation in 2002, EFSA has produced risk assessments for more than
4000 substances in over 1600 scientific opinions, statements and conclusions
through the work of its scientists. For individual substances, a summary of human
health and – depending on the relevant legislation and intended uses – animal health
and ecological hazard assessments has been collected and structured into EFSA’s
chemical hazards database: OpenFoodTox [30]. This database provides open
source data for substance characterization and links to EFSA’s related outputs,
background European legislation, and a summary of the critical toxicological end-
points and reference values. OpenFoodTox is a tool and source of information for
scientific advisory bodies and stakeholders with an interest in chemical risk assess-
ment. Summary data sheets for individual substances can be downloaded.

2.3.4 USA

The US Food and Drug Administration (FDA) uses risk analysis, a concept and
framework fostered by the WHO and the FAO in the mid-1990s, to ensure that
regulatory decisions about foods are science-based and transparent. The FDA’s
Center for Food Safety and Applied Nutrition (CFSAN) applies the concept of risk
analysis using tools aimed at presenting new possibilities for detecting and mitigat-
ing risks to the food supply. For example, CFSAN and NASA (the National
Aeronautics and Space Administration) are conducting a pilot project that uses
geospatial analysis to recognize patterns of contamination in crops, forecasting
high potential for contamination events in specific regions, at specific times and
under various weather conditions [31].
The US Food Safety System in general represents a case of shared government
responsibilities. Food safety and quality in the United States is governed by 30 fed-
eral laws and regulations administered by 15 federal agencies. The three main
agencies are: The Food and Drug Administration (FDA), the US Department of
Agriculture (USDA), dividing between them food and food production according to
food groups, and the Centers for Diseases Control and Prevention (CDC), mainly
responsible for investigating localized and nationwide outbreaks of foodborne
illnesses. In many cases, the food safety functions of the FDA and USDA overlap;
particularly inspection/enforcement, training, research, and rulemaking, for both
domestic and imported food. Nevertheless, the system presents a generally success-
ful example of how integration (and shared use) of data can be possible across
sectors and governmental entities.
The USA was one of the first countries – if not the first – to implement (in 1996) an
active surveillance system for foodborne diseases. The Foodborne Diseases Active
Surveillance Network (FoodNet) tracks foodborne illnesses, generating information
used to guide and monitor food safety policy and prevention efforts. FoodNet
estimates numbers of foodborne illnesses, monitors changes in incidence of specific
illnesses over time, and attributes illnesses to specific sources and settings. The
system functions as a collaborative program of the CDC, 10 state health depart-
ments, USDA and FDA. FoodNet conducts population-based active surveillance for
laboratory-confirmed infections caused by seven bacterial pathogens (Campylobac-
ter, Listeria monocytogenes, Salmonella, Shiga toxin-producing Escherichia coli
[STEC], Shigella, Vibrio, and Yersinia), two parasitic pathogens (Cyclospora and
Cryptosporidium), and one syndrome: hemolytic uremic syndrome (HUS), typically
caused by STEC. The FoodNet surveillance area includes approximately 15% of the
population of the United States of America [32].
In an effort to re-conceptualize the US strategic food safety system it has been
realized that exchanges of knowledge and information about foodborne hazards
facilitated by new communication technologies could drive improved coordination
with more efficient regulatory intervention. However, across the farm-to-table spec-
trum, many critical points are beyond the reach of rules and standards [33]. The
dominant logic of traditional approaches using control rather than management may
result in less than desirable outcomes. The US system can be said to be one of the
primary national regulatory systems promoting a farm-to-table, science-based management
framework, dating all the way back to its original description in the 'Clinton Farm-
to-Table Plan' [34].
The US system has been shown to be able to detect and respond to new
developments in the food safety landscape. An increasing number of microbial
foodborne illnesses are associated with fresh fruits and vegetables. An analysis of
foodborne outbreaks in the USA found that 12% of outbreaks and 20% of outbreak-
related illnesses were associated with produce [35]. A modern risk-based food safety
system takes a farm-to-fork preventative approach to food safety and relies on the
proactive collection and analysis of data to better understand potential hazards and
risk factors, to design and evaluate interventions, and to prioritize prevention efforts.
Such a system focuses resources at the points in the food system with the likelihood
of having greatest benefit to public health [36].
PulseNet USA, a national molecular subtyping network for foodborne disease
surveillance was initiated in the United States in 1996 as a critical early warning
system for foodborne disease outbreaks. The system was based on what was at that time
a revolutionary (relatively new) typing methodology, PFGE (Pulsed-Field Gel
Electrophoresis), enabling rapid genomic comparison between human and food/animal
foodborne disease-related isolates. The PulseNet network is now being repli-
cated in different ways in Canada, Europe, the Asia Pacific region, and Latin
America [37]. These independent networks work together in PulseNet International
allowing public health officials and laboratorians to share molecular epidemiologic
information in real-time and enabling rapid recognition and investigation of multi-
national foodborne disease outbreaks.
A new PulseNet International vision is focused on the standardised use of whole
genome sequencing (WGS) to identify and subtype food-borne bacterial pathogens
worldwide, replacing traditional methods. Focused on real-time surveillance, such
standardized subtyping will deliver sufficiently high resolution and epidemiological
concordance. Ideally, WGS data collected for surveillance purposes should be
publicly available in real time, not only for disease surveillance and outbreak
purposes but also to answer scientific questions pertaining to source attribution,
antimicrobial resistance, transmission patterns, etc. [38].

2.3.5 Denmark

The national system for food safety in Denmark has been organized (since 2007)
with a clear separation between risk assessment (hosted in a University Institute) and
risk management (hosted by the governmental food safety regulator). Thus, the adminis-
trative responsibilities (rules, control etc.) lie with the Danish Veterinary and Food
Administration while the National Food Institute, Technical University of Denmark,
is responsible for the scientific assessment of risks and the research-based assess-
ment of monitoring data. This separation enables independent scientific description
of problems and possible solutions, offering a transparent and seemingly efficient
system. A cornerstone in providing research-based scientific advice is to have people
involved who actually do research in relevant areas, i.e. University scientists. The
National Food Institute conducts research in microbiological and chemical risk
assessment, but also in food production and nutrition. The Institute thus adopts a
holistic approach to food, including knowledge about production forms as well as
the positive and negative aspects of our food. The basic research conducted by the
National Food Institute is recognized internationally, and the Institute operates a
number of EU reference laboratories as well as WHO collaborating centers [39].
The Danish Zoonosis Centre was created in 1995 to combine data on zoonotic
pathogens between the animal, food and health sectors. It is therefore the first
example of a ‘one health’ surveillance system – before the term was actually
invented (in 2008). The Centre publishes annual reports enabling science-based
policy decisions in this area. Similar zoonosis reports are now produced in a number
of other European countries. The Danish Zoonosis Report 2017 shows that Cam-
pylobacter is the most common foodborne illness in Denmark. Using integrated data,
the report shows that cattle may be a source of Campylobacter infection, leading to
changes in the new Danish Action Plan against Campylobacter 2018–2021 [40]. In
2017, the Salmonella source account, which links the number of human Salmonella
infections to specific food items and animals reservoirs, by modelling the distribu-
tion of serovars, was for the first time based on results from WGS. Domestic and
imported pork were estimated to be the sources most commonly associated with
human salmonellosis. Burden of disease studies can be used to compare the severity of
foodborne pathogens, for instance showing that even though the number of cases
with listeriosis is lower than e.g. salmonellosis, the burden of disease is high due to
the serious nature of the disease (12 deaths reported in Denmark in 2017 from
listeriosis). The burden of disease study on Norovirus estimated approximately
185,060 cases of Norovirus and 26 deaths in Denmark in 2017 [40].
The research-based risk assessment conducted by the National Food Institute can
be divided into chemical and microbiological risk assessment with the chemical part
covering both population exposure estimation and an assessment of potential effects
in humans. Risk assessment is the scientific part of risk analysis which consists of a
further two elements: risk management and risk communication. Risk assessment
includes hazard identification and characterization and exposure assessment, and
based on these aspects, the risk is characterized (See Fig. 1).
Risk assessments of chemicals are generally based on a comparison of human
exposure to a NOAEL (No Observed Adverse Effect Level) for the chemical, i.e. the
highest dose of chemical causing no adverse effects in laboratory animals. This is
done for one chemical at a time. However, humans are exposed to many different
chemicals on a daily level. In vitro studies and studies in experimental animals show
that for e.g. endocrine disrupting chemicals exposure to several chemicals can
induce effects, although the doses for the single chemicals are below or around
NOAEL. This implies that risk assessments of single chemicals in isolation most
likely underestimate the combined risk for humans. New knowledge related to risk
assessment of chemical cocktails in food suggests that we need additional data to
elucidate combined exposure to chemicals [41].
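The reasoning behind this concern can be illustrated with a minimal calculation: for each chemical, an acceptable daily intake is derived from the NOAEL with a conventional safety factor, a hazard quotient (intake divided by acceptable intake) is computed, and the quotients are summed under a simple dose-addition assumption. The substances, NOAELs and intakes below are hypothetical and chosen only to show the pattern.

```python
# Illustrative only: hypothetical substances with invented NOAELs and intakes (mg/kg bw/day).
chemicals = {
    "substance_A": {"noael": 10.0, "intake": 0.050},
    "substance_B": {"noael": 5.0,  "intake": 0.020},
    "substance_C": {"noael": 2.0,  "intake": 0.010},
}

SAFETY_FACTOR = 100  # conventional 10 x 10 factor for inter- and intra-species differences

hazard_index = 0.0
for name, c in chemicals.items():
    adi = c["noael"] / SAFETY_FACTOR   # acceptable daily intake derived from the NOAEL
    moe = c["noael"] / c["intake"]     # margin of exposure for the single chemical
    hq = c["intake"] / adi             # hazard quotient: < 1 is conventionally "acceptable"
    hazard_index += hq                 # dose-addition assumption for the mixture
    print(f"{name}: MOE = {moe:.0f}, hazard quotient = {hq:.2f}")

print(f"combined hazard index (dose addition): {hazard_index:.2f}")
# With these invented numbers each single-chemical hazard quotient is below 1,
# but the combined index exceeds 1 - the pattern the cocktail studies cited above warn about.
```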

2.3.6 Canada

The Canadian Food Inspection Agency (CFIA) uses ‘Ranked Risk Assessment’
(RRA) to prioritize chemical hazards for inclusion in monitoring programmes or
method development projects based on their relative risk. The relative risk is
calculated for a chemical by scoring toxicity and exposure in the ‘risk model scoring
system’ of the Risk Priority Compound List (RPCL). The ranking may be refined by
the data generated by the sampling and testing programs. The two principal sampling
and testing programmes are the National Chemical Residue Monitoring Program
(NCRMP) and the Food Safety Action Plan (FSAP). The NCRMP sampling plans
focus on the analysis of products for residues of veterinary drugs, pesticides,
environmental contaminants, mycotoxins, and metals. FSAP surveys focus on
emerging chemical hazards associated with specific foods or geographical regions
for which applicable maximum residue limits (MRLs) are not set. Follow-up actions
vary according to the magnitude of the health risk, all with the objective of
preventing any repeat occurrence to minimize consumer exposure to a product
representing a potential risk to human health [42].
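The general idea of such ranked risk assessment can be sketched as a simple scoring exercise, in which each chemical receives a toxicity score and an exposure score and the product is used to rank candidates for monitoring. The scores, scale and hazard names below are hypothetical and do not reproduce the actual RPCL risk model scoring system.

```python
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    toxicity_score: int   # assumed scale, e.g. 1 (low) to 5 (high)
    exposure_score: int   # assumed scale, e.g. 1 (rare/low levels) to 5 (frequent/high levels)

    @property
    def relative_risk(self) -> int:
        # Simplest possible risk model: toxicity multiplied by exposure
        return self.toxicity_score * self.exposure_score

# Hypothetical hazards for illustration only
hazards = [
    Hazard("veterinary_drug_X", toxicity_score=4, exposure_score=2),
    Hazard("mycotoxin_Y", toxicity_score=5, exposure_score=3),
    Hazard("pesticide_Z", toxicity_score=2, exposure_score=4),
]

# Rank hazards for inclusion in a monitoring programme, highest relative risk first
for h in sorted(hazards, key=lambda x: x.relative_risk, reverse=True):
    print(f"{h.name}: relative risk score = {h.relative_risk}")
```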

2.3.7 Australia and New Zealand

Food Standards Australia New Zealand (FSANZ) is an independent statutory agency
established by the Food Standards Australia New Zealand Act 1991 (FSANZ Act).
FSANZ is part of the Australian Government’s Health portfolio. FSANZ, along with
other government agencies in Australia and New Zealand, monitors the food supply
to ensure it is safe. FSANZ routinely conducts targeted surveys and Total Diet
Studies to collect analytical data on the levels of chemicals, microbiological con-
taminants and nutrients in food.
The Communicable Disease Network Australia and OzFoodNet monitor inci-
dents and outbreaks of foodborne disease which can lead to the detection of an
unsafe food product or unsafe food practice [43]. Microbial contamination may take
place at pre-farming, farming or post-farming stages of the food supply chain.
Campylobacter, Salmonella, Listeria monocytogenes, Escherichia coli O157:H7
and non-O157:H7 STEC E. coli are the most common pathogenic bacteria associ-
ated with food safety issues in the food supply chain [44]. Efficient process controls
and effective food safety management systems are vital elements to reduce microbial
contamination and improve food security.

2.3.8 The Netherlands

The Dutch National Institute for Public Health and the Environment (RIVM) collects
and collates knowledge and information from various sources, both national and
international, placing it at the disposal of policy-makers, researchers, regulatory
authorities and the general public. Each year, RIVM produces numerous reports on
all aspects of public health, nutrition and diet, health care, disaster management,
nature and the environment. The RIVM covers three domains with specific knowl-
edge and expertise: Infectious Diseases and Vaccinology (Centre for Infectious
Disease Control), Environment and Safety (including environmental incident ser-
vice), Public Health and Health Services (including food and food safety) [45].
Microorganisms may enter the food chain for instance during production or
during home preparation. Foods may also contain chemical contaminants, some of
which can be harmful to health. RIVM develops models to determine food safety,
and maintains databases of relevant information. For example, it determines the
concentration at which a chemical substance will pose a risk to health, and how
much of that substance a person can safely ingest. In the field of microbial food
safety RIVM has developed, together with international partners, a risk assessment
tool called Quantitative Microbiological Risk Assessment (QMRA). This tool con-
tains food chain models (‘farm-to-fork’) in which the prevalence and number of
pathogens are followed. The Dutch government, together with other national author-
ities in Europe, is responsible for establishing, monitoring and enforcing laws and
regulations to that end. RIVM advises the government in these matters, at the
national and international levels. The Netherlands Food and Consumer Product
Authority (NVWA) is responsible for supervision and enforcement in the
Netherlands.
Dutch research is increasingly being undertaken in the international context for
organisations such as EFSA and WHO/FAO. RIVM also researches food allergens,
seeking to identify substances which cause an allergic reaction and the quantity of
the substance which is likely to do so. Based on research findings, RIVM advises
various clients. They include the Ministry of Health, Welfare and Sport (VWS) and
other Ministries, the Netherlands Food and Consumer Product Safety Authority
(NVWA), the Board for the Authorisation of Plant Protection Products and Biocides
(Ctgb), the Veterinary Medicinal Products Unit (BD), the European Food Safety
Authority (EFSA), WHO and FAO [45]. RIVM hosts the World Health Organiza-
tion Collaborating Centres on Chemical Food Safety and of Risk Assessment
of Pathogens in Food and Water.
Pesticide risk assessment is hampered by worst-case assumptions leading to
overly pessimistic assessments. This is mostly rooted in deterministic risk assessment
models that have been used for chemicals in food for more than 50 years. In
addition, cumulative health effects of similar pesticides are often not taken into
account in these assessments. The European research project ACROPOLIS has
attempted to develop stochastic modelling in this area, something that has been
done for microbiological risk assessments for more than 20 years [46]. These models
are appropriate for both acute and chronic exposure assessments of single com-
pounds and of multiple compounds in cumulative assessment groups. The software
system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in
pesticide risk assessment at http://mcra.rivm.nl/. The emphasis is on cumulative
assessments, presenting two contrasting approaches, sample-based and compound-
based. Examples are given of model and software validation of acute and chronic
assessments, using both simulated data and comparisons; not surprisingly,
additional data on agricultural use of pesticides may give more realistic risk assess-
ments. This program is an independent research tool from the Dutch government,
developed by Wageningen University in close cooperation with RIVM.
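The difference between deterministic worst-case and probabilistic (Monte Carlo) exposure assessment, and the idea of combining compounds in a cumulative assessment group through relative potency factors, can be sketched as follows. The residue distributions, consumption figures and potency factors are invented for illustration and are unrelated to the actual MCRA implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 50_000  # simulated person-days

# Hypothetical cumulative assessment group: residues (mg/kg) on a single commodity,
# expressed relative to an index compound via relative potency factors (RPFs).
compounds = {
    #              log-space mean, log-space sd, RPF
    "compound_a": (np.log(0.05), 0.8, 1.0),
    "compound_b": (np.log(0.02), 1.0, 0.3),
    "compound_c": (np.log(0.01), 1.2, 2.0),
}

consumption = rng.lognormal(mean=np.log(0.150), sigma=0.5, size=N)  # kg commodity/day
body_weight = rng.normal(loc=70, scale=12, size=N).clip(min=30)     # kg

total_exposure = np.zeros(N)  # index-compound equivalents, mg/kg bw/day
for mu, sigma, rpf in compounds.values():
    residue = rng.lognormal(mean=mu, sigma=sigma, size=N)
    total_exposure += rpf * residue * consumption / body_weight

# Deterministic "worst case" for comparison: high residue, high consumption, low body weight
worst_case = sum(
    rpf * np.exp(mu + 2 * sigma) * 0.300 / 60.0 for mu, sigma, rpf in compounds.values()
)

print(f"P99.9 of simulated cumulative exposure: {np.percentile(total_exposure, 99.9):.4f} mg/kg bw/day")
print(f"deterministic worst-case estimate:      {worst_case:.4f} mg/kg bw/day")
```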
RIKILT-Institute of Food Safety is an independent non-profit institute conducting
research on the detection and identification of contaminants in food and feed. The
institute contributes to the monitoring of production chains, the quality of agricul-
tural products, and the knowledge of health-protecting substances in food. It carries
out legislative and policy-supporting tasks for the Dutch government and interna-
tional bodies, including the European Commission and EFSA [47].
TNO (Netherlands Organisation for Applied Scientific Research) provides
independent advice in safety assessment (including food safety) and risk manage-
ment. It develops methodology that enables manufacturers and public-sector bodies
to quickly and accurately assess the microbiological and toxicological safety of
complex products, as well as methods for predicting the allergenicity of
proteins and peptides and instruments for the early detection of public
health risks and potential food incidents. TNO is working on internationally
recognised testing methods that speed up product and policy development and
enable more decisive responses to potential food incidents. TNO has been working
for more than 20 years on investigating the effect of different foodstuffs on the
enteric environment, and thereby health, using in vitro gut models [48].

2.3.9 China

Like most other major economies, China has been updating and changing its food
safety regulatory system in major ways over the last 20 years. Food production in
China is now one of the major drivers of economic development, and the identifi-
cation of food safety as a national priority, combined with a number of major food
safety scandals, has driven modernization of the food safety legislative framework.
One of the important, new developments has been the creation of China’s National
Center for Food Safety Risk Assessment (CFSA). The National Food Safety Stan-
dards (NFSS) Framework was established, benchmarked on international best prac-
tices and on the guidance of the Codex Alimentarius Commission (CAC), with a
clear direction to base food safety standard setting on risk analysis principles and in
particular on risk assessments based on Chinese data [49].
An important component of the food safety risk analysis framework in China is
monitoring and surveillance based on the new Food Safety Law of the People's
Republic of China of 2009. At present, the system comprises four networks plus
dietary exposure monitoring. The four networks include the foodborne disease
surveillance network, the biological hazards (bacteria, virus and parasites) monitor-
ing in foods network, chemical hazards monitoring in foods network and the
microbial PFGE profile network. The system now covers all 31 provinces, major
municipalities and autonomous regions in Mainland China and is carried out for the
national food and exposure monitoring and foodborne disease surveillance and
investigation. The China National Center for Food Safety Risk Assessment has been
assigned overall responsibility for foodborne disease surveillance and dietary expo-
sure monitoring through periodic national Total Diet Studies [50].
The Chinese microbiological food safety surveillance system collects data regard-
ing food contamination by foodborne microorganisms, providing relevant data for
food safety supervision, risk assessment, and standard-setting [51].

2.3.10 Lebanon

While China is still formally considered a developing country, it should be realized
that a significant number of other developing countries are in a situation where the
regulatory food safety capacity is only now being built. Lebanon is but one of such
countries. A risk-based food safety and quality governance based on international
guidance is presently being developed in Lebanon [52].
The new Lebanese food safety law (2016) will result in the creation of a Lebanese
food safety authority (LFSA) which will be developing a food safety governance
system in Lebanon in accordance with the FAO/WHO risk analysis framework and
the World Trade Organization (WTO) Sanitary and Phytosanitary (SPS) Agreement.
Lebanese officials have used experience from regulatory and institutional food safety
governance systems developed in the USA, EU, Canada and France as relevant models.
There is a recognized need to strengthen the Lebanese infrastructure capacity at the
institutional and stakeholders level through harmonization of Risk Assessment
(RA) and Risk Management (RM) processes. It was recognized that food safety
systems in the model countries listed did not always correspond to the scientific
approach where RM and RA should be functionally and institutionally
separated [52].

3 AMR in the Context of Food Safety Surveillance and Risk Mitigation

The inventor of the way we still identify micro-organisms, Louis Pasteur, stated in
1878: “It is a terrifying thought that life is at the mercy of the multiplication of these
minute bodies; it is a consoling hope that science will not always remain powerless
before such enemies.” And indeed science did provide fantastic solutions to combat
microorganisms: we now have antimicrobials with the ability to kill even some of the
most dangerous of these “minute bodies”. But unfortunately, microorganisms have
also found ways to fight back. In 2013 the chief medical officer for England, Sally C.
Davies, wrote in her book 'The Drugs Don't Work': "We are now at a crossroads . . .
as our use of these valuable drugs is not only becoming threatened by the spectre of
resistance among the bugs they are used to treat, but also as we recognise that their
injudicious use can cause harm in its own right”.

The spread of antimicrobial-resistant bacteria poses a major threat not only to our
ability to treat and prevent specific diseases, but also to our ability to provide medical
care across a range of emergency events. Therefore, the occurrence of and rapid increase in the
level of antimicrobial resistance (AMR) in microorganisms, including human and
animal pathogens, should be considered a significant health security threat. That the
need for action to contain this global threat is both immediate and growing was
recognized by global leaders meeting at the General Assembly of the United
Nations, which in a 2016 declaration recognized resistance to antibiotics (antibiotics
are antimicrobials produced by microorganisms) as the "greatest and most urgent
global risk" [53].
Globally, more than half of all antimicrobials (up to 85%) are not used to treat
humans but to help in animal production. While the EU has banned the use of
antimicrobials as growth promoters in animals, in all other parts of the world such
use (i.e. use that is not linked to disease) likely constitutes at least half of all animal
use – the rest relating to actual treatment (or prophylaxis) in animals. Any use of
antimicrobials in animals can lead to the development of antimicrobial resistance
(AMR) in their bacteria, and all animal bacteria will potentially end up in humans –
mainly through our food but also through other routes, e.g. direct contact.
While the issue of the use of antimicrobials in animal food production systems
has been acknowledged as a potentially serious problem for at least 20 years [54], more
recent documentation of increased serious AMR stemming from animal production
is now emerging, causing serious concern [55, 56]. Likewise it has been obvious
for some time, that irresponsible use of antimicrobials as animal growth promoters
(AGP) was contributing to the problem [57], and that experience from different
national attempts to control the problem could suggest directions towards successful
mitigation [58]. One such major regulatory milestone was the EU ban on the use of
antimicrobials for animal growth promotion in 2006.
Especially worrisome is the emergence of resistance against antimicrobials that
are considered critically important in human medicine, and in multidrug resistant
(MDR) infections [59]. During recent decades the animal use of antimicrobials,
particularly as AGPs, has led to alarming levels of AMR in many countries.
Conflicts of interests, values and risks between agriculture, health and commercial
stakeholders seem to have complicated the introduction of efficient interventions to
mitigate this increasing risk [58]. In addition, the unintended economic incentive of
veterinarians profiting from their own prescriptions has most likely stimulated
antimicrobial use in animals in general; such incentives are now banned in all Scandi-
navian countries.
It should be recognized that the use of AGPs is not only relevant to land
animals. It is well documented that the exposure of fish pathogens and aquatic
bacteria to antimicrobials drives the development of drug resistance, and there
seems to be a causal relationship between the use of specific antimicrobials in
aquaculture and an increase in AMR prevalence [60]. Additionally, other studies
suggest that AMR in aquaculture environments could contribute to the AMR of
human pathogens [61].

3.1 Surveillance Systems for Antimicrobial Resistance and Antimicrobial Use

The need for antimicrobial resistance surveillance has been discussed internationally
for at least 25 years. While the French system for surveillance of AMR in certain
animal species was initiated already in 1982, the first two national, integrated
(animal/human) systems to be effectuated were: DANMAP (Danish Programme
for surveillance of antimicrobial consumption and resistance in bacteria from ani-
mals, food and humans) and US NARMS (The National Antimicrobial Resistance
Monitoring System). As the titles reveal, only the Danish system included data on
antimicrobial use (both veterinary and human). Much later (2017) experience from
the collaborative efforts of the European Antimicrobial Resistance Surveillance
System (EARSS) and the European Surveillance of Antimicrobial Consumption
program (ESAC) has clearly demonstrated that the integrated monitoring of resis-
tance, use, and costs can prove a crucial factor driving political commitment to
successful resistance containment campaigns.

3.1.1 WHO

The Global Antimicrobial Resistance Surveillance System (GLASS) data-sharing
platform was initiated in 2015 following the adoption of
the Global action plan on antimicrobial resistance by the 68th World Health Assem-
bly that year. This reflects the global consensus that AMR poses a profound threat to
human health and that enhanced global surveillance and research is needed to
strengthen the evidence base and support AMR risk mitigation. GLASS was devel-
oped to facilitate and encourage a standardized approach to AMR surveillance
globally, but unfortunately it is not integrated across disciplines and it does not
support data on human use. The first GLASS plan suggests that at a later stage it will
allow progressive incorporation of information from other surveillance systems
related to AMR in humans, such as for foodborne AMR, as well as monitoring of
antimicrobial use [62]. For some time – basically since 2000 – WHO has actually
promoted integrated surveillance – at least for foodborne pathogens – and the
WHO AGISAR group (Advisory Group on Integrated Surveillance of Antimicrobial
Resistance) has produced significant guidance over the years to that effect [63].

3.1.2 EU

The European Antimicrobial Resistance Surveillance System (EARSS) was
established in 1998 and in 2010 it was transferred to the European Centre for Disease
Prevention and Control (ECDC) as the European Antimicrobial Resistance Surveil-
lance Network (EARS-Net). It provides public access to descriptive data (maps,
graphs and tables) that are available through the ECDC Surveillance Atlas of
Infectious Diseases. More detailed analyses are presented in annual reports and
scientific publications. The objectives of EARS-Net are to collect comparable,
representative and accurate AMR data, and to encourage the implementation, maintenance
and improvement of national AMR surveillance programmes. It is noteworthy that,
when comparing the EU/EEA countries, there is a very clear trend for higher AMR
prevalence in the south and lower AMR prevalence in the north, most likely reflecting
the efficiency of risk mitigation policies in these regions (see Fig. 3).

Fig. 3 Frequency distribution of Escherichia coli isolates completely susceptible and resistant to
1–11 antimicrobials in broilers, 30 EU/EEA Member States, 2016. (From ECDC/EFSA [56])
Fortunately the European data is now also presented in an integrated report with
input/data from ECDC, EFSA and EMA (European Medicines Agency) [64], which
covers both AMR resistance and AM use in food-producing animals and humans.

3.1.3 Denmark

In 1995, Denmark was the first country to establish an integrated, systematic and
continuous monitoring program of antimicrobial drug consumption and antimicro-
bial agent resistance in animals, food, and humans, the Danish Integrated Antimi-
crobial Resistance Monitoring and Research Program (DANMAP). Monitoring of
antimicrobial drug resistance and a range of research activities related to DANMAP
have contributed to restrictions or bans of the use of antimicrobial growth promoters
(AGP) in food animals in Denmark and other European Union countries. In fact,
Danish data were instrumental in driving EU policy and legislation towards the ban
of the use of antimicrobial growth promoters in animal production [55].
In Denmark DANMAP data and analyses have been used to promote sustainable
animal production practices where high productivity is reached without inappropri-
ate use of antimicrobials. The key elements here are good animal husbandry
practices that prevent disease, combined with commercial disincentives for AM
use and a legal framework that regulates the use of antimicrobials in the animal
sector as well as takes away the opportunity of veterinarians to make a profit from
(prescribing and) selling antimicrobials. Indeed the success story of Danish pig
production is instructive here, with an annual production of 20 million pigs before
the ban of AGP and 30 million after the ban [57].

3.1.4 USA

The US National Antimicrobial Resistance Monitoring System for Enteric Bacteria
(NARMS) was established in 1996. NARMS is a collaboration among state and
local public health departments, CDC, the U.S. Food and Drug Administration
(FDA), and the U.S. Department of Agriculture (USDA). NARMS uses an inte-
grated “One Health” approach to monitor antimicrobial resistance in enteric bacteria
from humans, retail meat, and food animals. NARMS data are not only essential for
ensuring that antimicrobial drugs approved for food animals are used in ways that are
safe for human health but they also help address broader food safety priorities.
NARMS surveillance, applied research studies, and outbreak isolate testing
provide data on the emergence of drug-resistant enteric bacteria; genetic mechanisms
underlying resistance; movement of bacterial populations among humans, food, and
food animals; and sources and outcomes of resistant and susceptible infections.
NARMS surveillance focuses on two major zoonotic bacterial causes of foodborne
illness in the United States, nontyphoidal Salmonella and Campylobacter. Food
animal and retail meat surveillance also include Enterococcus and Escherichia
coli, common intestinal bacteria that can serve as reservoirs of resistance genes
and indicators of selection pressures in Gram-positive and Gram-negative bacteria,
respectively. In addition, CDC uses the NARMS human surveillance platform for
monitoring resistance in E. coli O157, Vibrio, and the nonzoonotic enteric patho-
gens, Shigella and typhoidal Salmonella. NARMS data can be used to guide and
evaluate the impact of science-based policies, regulatory actions, antimicrobial
stewardship initiatives, and other public health efforts aimed at preserving drug
effectiveness, improving patient outcomes, and preventing infections [65].

3.1.5 France

The French surveillance network for antimicrobial resistance in pathogenic bacteria
of animal origin (RESAPATH) was set up in 1982 under the name of RESABO
(BO for bovines). In 2000, it was expanded to pigs and poultry and in 2007, to other
animal species, and now resides under the French Agency for Food, Environmental
and Occupational Health Safety (ANSES). The surveillance system estimates AMR
in animal pathogens and is also part of a recent intersectorial “One Health” national
action plan against antimicrobial resistance in humans, animals and the environment
adopted in 2016.
Data from RESAPATH have documented a decline or stabilisation in resistance
for the vast majority of antimicrobials tested in animal pathogens over 2006–2014, with
the proportion of multi-resistant bacterial strains significantly reduced in all species.
These results are consistent with the large reductions in exposure of animals to
antimicrobials in France in recent years. However, resistance levels seem to have
slightly increased between 2014 and 2016 for several animal species and
antimicrobials [66].

3.1.6 Germany

In Germany, data on the consumption of antimicrobials and the spread of antimicro-
bial resistance in human and veterinary medicine is recorded in the GERMAP report
within the “One Health” approach with input from a large number of federal
institutions, including the BfR (Risk Assessment) and Robert Koch Institute (Public
Health) [67]. The National Reference Laboratory for Antibiotic Resistance within
BfR is tasked – under the framework of the Zoonoses Monitoring Directive
(2003/99/EC) – with antimicrobial resistance testing, collaboration in the analysis of
infection chains, molecular characterization of antimicrobial resistance determinants
and the conduct of inter-laboratory studies. Regarding the animal use of antimicrobials, the
report documented that antimicrobial-resistant bacteria or resistance genes can be
transferred between humans and animals and vice versa. Should the use of antimi-
crobials not finally be limited to the extent required for treatment and metaphylaxis,
it is suggested that further legal interventions into the therapeutic freedom of
veterinarians must be expected.
A novel, web-based surveillance system for hospital antimicrobial consumption
has been developed in Germany providing real-time surveillance at unit and facility
levels, accessible to all relevant stakeholders. User-defined reports are available via
an interactive database, enabling comparison of different antimicrobial use groups as
defined by the WHO, also enabling comparing the proportional use with other
countries [68].

3.1.7 United Kingdom (in This Case England)

The English Surveillance Programme for Antimicrobial Utilisation and Resistance
(ESPAUR) was established in 2013 to support Public Health England (PHE) in the
delivery of the UK Five Year Antimicrobial Resistance Strategy 2013–2018. Aston-
ishingly, the report focuses only on human use of antimicrobials and does not
integrate human and animal data, although it does in a few cases refer to problems
specifically originating in animals (e.g. ESBL) [69]. The report documents that the
estimated total numbers of human bloodstream infections caused by pathogens
resistant to one or more key antimicrobials increased by 35% from 2013 to
2017. The burden of antimicrobial resistant bloodstream infections is particularly
marked for those caused by Enterobacteriaceae, particularly E. coli, as they are the
infections with the highest incidence, comprising 84.4% of the total. The burden of
resistant infections remained unchanged for Gram-positive infections.
PHE also publishes a web-based tool intended to raise awareness of the value of
comparing practices of prescribing antimicrobials (https://fingertips.phe.org.uk/pro
file/amr-local-indicators). The tool refers to the observation that antimicrobial
prescribing practices and antimicrobial resistance are inextricably linked, as overuse
and incorrect use of antimicrobials are major drivers of resistance. The AMR local
indicators described in the tool are publicly available data intended to raise
awareness of antimicrobial prescribing and to facilitate the development of local
action plans. The data published in the tool is intended for use by healthcare staff,
academics and the public to compare the situation in their local area to the national
picture.

3.1.8 Japan

The Japanese AMR One Health Surveillance Committee, covering human health,
animals, food and the environment, publishes surveillance data on AMR and antimi-
crobial use, covering sources in the Ministry of Health as well as the Ministry of
Agriculture [70].
In Japan, the proportion of carbapenem resistance in Enterobacteriaceae such as
Escherichia coli and Klebsiella pneumoniae has remained at around 1% during the last
decade, despite its global increase in humans. The proportion of Escherichia coli
resistant to third-generation cephalosporins and fluoroquinolones, however,
has been increasing, as has that of methicillin-resistant Staphylococcus aureus
(MRSA), which accounts for approximately 50% of AMR hospital cases. In animals,
monitoring of resistant bacteria in cattle, pigs and chickens has been conducted.
Tetracycline resistance is common, although the degree of the resistance depends on
animal and bacterial species. The proportion of third generation cephalosporin- and
fluoroquinolone-resistant Escherichia coli was low and remained mostly less than
10% during the observed period (2011–2015). It should be noted that Japan imports
more than 60% of all food.

4 The Use of NGS/WGS

4.1 Next Generation Sequencing for the Surveillance of Foodborne Pathogens and Antimicrobial Resistant Microorganisms in the Food Chain

Foodborne pathogens (e.g. bacteria like Shiga toxin-producing Escherichia coli
(STEC), Salmonella enterica (S. enterica), Campylobacter spp. and Listeria
monocytogenes, viruses, fungi or parasites) and antimicrobial resistant bacteria
(e.g. gut commensal bacteria – Escherichia coli (E. coli) and foodborne bacterial
pathogens) in the food chain represent a major food safety concern in all regions.
These pathogens can also be easily spread globally via the food chain due to global
trading of animals and food products and international travel and movement of
humans [71]. In order to improve food safety management of these bacteria in the
global food chain through a “One Health” approach, the world needs surveillance
and response systems that are capable of detecting them rapidly, understanding them
and responding to them [72].
The “One Health” integrated approach involves “the collaboration of multiple
disciplines, sectors and multiple groups working locally, nationally and globally to
attain optimal health for people, animals and the environment” [73]. This framework
is considered to be the most efficient, integrated approach to tackle the foodborne
disease and AMR threats in the food chain because of the complex interrelated roles
of human, animal and the environment in the emergence/reemergence and spread of
these threats [74]. In recent years, next generation sequencing (NGS) including
whole genome sequencing (WGS) and metagenomics testing have emerged with a
great potential to revolutionize how microbiological food safety is managed. Partic-
ularly WGS has emerged as a new tool that has great potential within a One Health
context [75–79]. WGS provides the highest possible microbial subtyping resolution
available to public health authorities for the surveillance of and response to
foodborne disease and AMR threats. “When used as part of a surveillance and
response system, it has the power to increase the speed with which threats are
detected and the detail in which the threats are understood, and ultimately lead to
quicker and more targeted interventions” [72]. NGS can be used widely in several
areas to improve food safety management, which include the use of WGS and
metagenomics for foodborne disease outbreak investigation and epidemiological
surveillance, as well as AMR surveillance. Owing to its rapidly declining cost, the
application of NGS in food safety management could lead to greater food/nutrition
security, health care, animal and environmental protection, sustainable development,
consumer protection, trade facilitation and tourism, which are all within the realm of
global health security.

In order to elevate global health security to make the world a safer and more
secure place from infectious disease threats, the health security considerations were
initiated within WHO intergovernmental discussions from 2005, especially related
to the WHO International Health Regulations. A more proactive agenda was devel-
oped through the “Global Health Security Agenda (GHSA) which pursues a
multisectoral approach to strengthen both the global and national capacity to pre-
vent, detect, and respond to human and animal infectious diseases threats, whether
naturally occurring or accidentally or deliberately spread” [80, 81]. It was elaborated
by Gronvall et al. that “The objectives of the GHSA will require not only a “One
Health” approach to counter natural disease threats against humans, animals, and the
environment, but also a security focus to counter deliberate threats to human, animal,
and agricultural health and to nations’ economies” [82]. Hence, collectively,
foodborne zoonotic pathogens and antimicrobial resistant bacteria in the food
chain constitute a “One Health” problem because animal health can directly affect human
health, food safety, food security, economic stability and biodiversity, which in a
bigger picture is also a “Global Health Security” problem.
As illustrated above, NGS is already recognised as a “One Health” tool that is
capable of improving food safety management, which leads to strengthening of food
security and ultimately global health security. Be it a naturally occurring or delib-
erate threat, NGS can also be used in both scenarios to prevent, detect, and respond
to human and animal infectious disease threats. Though Global Health Security has
clear overlap with “One Health”, it also encompasses national economic, law
enforcement and security considerations [82], and NGS data can be used by food safety and public
health regulators to take regulatory action faster [83]. In the case of foodborne
disease, the ability to quickly identify and track the causative foodborne pathogens
leads to reductions in (1) adverse impact on human health (e.g. fewer illnesses and
lower death rates), (2) the number of contaminated products to be recalled, and therefore
lower economic losses, and (3) public fear when the threat is deliberate
(e.g. bioterrorism). In fact, surveillance of and response to foodborne pathogens
and AMR by WGS have already been applied routinely by several national author-
ities, including Public Health England [84], the Statens Serum Institut in Denmark
[85] and the Food and Drug Administration (FDA) in the United States
(US) [86]. These countries can leverage their existing NGS infrastructure for
food security purposes, and the additional cost to be incurred is limited to extending
the current NGS-based surveillance and response system to plants and wild animals,
and their environment.

4.2 Next Generation Sequencing Platforms

The Sanger method (“first generation” technology) was the main sequencing tech-
nology used between 1975 and 2005 for microbial WGS [87]. It produces long
(500–1000 bp), high-quality sequencing reads and has been regarded as the gold
standard for sequencing DNA. The Sanger method was used to sequence the first
bacterial genome, Haemophilus influenzae, in 1995 [88] and other bacterial genomes
over the next few years [89–93]. In 2005, the NGS (“second generation” technology
or massively parallel sequencing) era began, and this high-throughput technology
allows short sequencing reads (50–400 bp), and subsequently long sequencing reads
(1000–100,000 bp), to be generated and detected in a single machine run, without the
need for cloning. The short read technologies, such as those employed by platforms
that are provided by Illumina (e.g. MiSeq, NextSeq and HiSeq 2500) and Life
Technologies (e.g. Ion Torrent Personal Genome Machine) produce read lengths
ranging from ~100 to ~600 bp with a low per-base error rate (usually less than 1%)
[94]. They are routinely used for assembly of good quality draft bacterial genomes
that contain multiple contigs (i.e. overlapping, contiguous sequence reads; there can be up to 100 of them), and are of good coverage
(>95%) and high accuracy. To generate a fully closed good quality bacterial genome,
the longer read technologies (“third generation” technology or single-molecule
sequencing) that are incorporated into platforms by Pacific Biosystems
(e.g. PacBio RSII) and Oxford Nanopore (e.g. MinION), together with the above
mentioned short read technologies are commonly used. Though longer read tech-
nologies generate read lengths ranging from ~1000 to ~100,000 bp, the error rate is
relatively high (15–30%), and they generally provide significantly lower coverage
and are more expensive than short read technologies [94]. However, when both
technologies are used in combination to close genomes, also known as hybrid
sequencing, the short reads generate good quality contigs while the long reads can
close the gaps that are between the contigs during scaffold assembly. For more
information on the main sequencing platforms and their performance, refer to the
brief summary (see Table 2; [87]) and the details [77, 94, 95] that are described in
the above-cited excellent recent reviews.
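The trade-offs summarised above can be expressed as a small decision sketch: approximate read lengths and error rates (taken from the ranges quoted in this section) point to short reads alone for a multi-contig draft genome and to a hybrid short-plus-long strategy for a closed genome. The platform groupings and the logic are a simplification of the text, not a recommendation.

```python
from dataclasses import dataclass

@dataclass
class Technology:
    name: str
    read_length_bp: tuple   # approximate range quoted in the text above
    per_base_error: float   # approximate upper bound quoted in the text above

SHORT_READ = Technology("Illumina MiSeq / Ion Torrent PGM", (100, 600), 0.01)
LONG_READ = Technology("PacBio RSII / Oxford Nanopore MinION", (1_000, 100_000), 0.30)

def sequencing_strategy(goal: str) -> str:
    """Return a plausible strategy for a bacterial genome, following the text above."""
    if goal == "draft_genome":
        # Short reads alone give good-quality, multi-contig drafts (>95% coverage).
        return f"short reads only ({SHORT_READ.name})"
    if goal == "closed_genome":
        # Hybrid assembly: accurate short-read contigs, long reads bridge the gaps.
        return f"hybrid: {SHORT_READ.name} + {LONG_READ.name}"
    raise ValueError("goal must be 'draft_genome' or 'closed_genome'")

print(sequencing_strategy("draft_genome"))
print(sequencing_strategy("closed_genome"))
```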

4.3 Whole Genome Sequencing for Foodborne Pathogens and Antimicrobial Resistant (AMR) Microorganisms

Through WGS of bacterial isolates, both pathogen identification and characteriza-
tion, and detection of virulence factors and AMR genes can be directly obtained
from the sequence data rapidly and at a level of precision that was not previously
possible. Unlike traditional subtyping methods (e.g. serotyping, phage typing,
PCR-based detection method and pulsed-field gel electrophoresis; reviewed in detail
by [77]), WGS is not organism-specific, thus allowing multiple bacteria to be
sequenced simultaneously and enabling simpler, faster and cheaper laboratory opera-
tions when compared to conventional microbiological methods. In addition, WGS
offers the ease of standardisation and harmonisation of operating protocols for WGS
data collection, assessment of sequencing data quality, data processing and inter-
pretation. Moreover, the data from WGS provides a common standardised
language that can be deposited to online international public data repositories for
global data sharing and comparison, as well as global surveillance of foodborne
pathogens and AMR. To realize the full advantage of WGS in improving
food safety management, the analysed data must still be interpreted in the
context of appropriate food consumption history and epidemiological data. Lastly,
WGS is more effective if it is used in a One Health and Global Health Security
context where WGS data of isolates from multiple sectors that involve human,
animal and environmental health are shared and compared locally, nationally and
globally.
The birth of NGS technology has led to the rapid rise of high quality draft
genomes being deposited into online international public databases (e.g. National
Center for Biotechnology Information (NCBI), European Nucleotide Archive
(ENA) and DNA Data Bank of Japan (DDBJ); these databases sync their data
nightly) for global data sharing and downstream analysis. This is in part due to the
ability to quickly and cheaply generate draft genomes from WGS data since micro-
bial genomes are smaller and more compact in comparison to eukaryote genomes.
This has enabled microbial draft genomes to be generated routinely with a fast
turnaround time for important applications in food safety management like
foodborne disease outbreak investigation and epidemiological surveillance, and
AMR surveillance (see section 4.3.2). The traditional subtyping methods expose a
very small fraction of the entire genomic information of the foodborne pathogen and
therefore provide limited resolution for discriminating outbreak-related strains
from unrelated, sporadically circulating strains [79]. On the contrary, WGS can
theoretically reveal the entire genomic information of a microbial pathogen for the
discrimination of strains that differ by a single nucleotide through comparing of
bacterial sequences that are each millions of nucleotides in length [79].
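As a toy illustration of this principle, the sketch below computes pairwise SNP distances between a handful of aligned sequences and groups isolates within a small distance threshold into a putative outbreak cluster. The sequences are only a few nucleotides long, the isolate names are invented, and the threshold is arbitrary; real analyses compare millions of positions and use carefully validated thresholds.

```python
from itertools import combinations

# Toy aligned "genomes" (real comparisons involve millions of positions)
isolates = {
    "patient_1": "ACGTACGTACGTACGTACGT",
    "patient_2": "ACGTACGTACGTACGAACGT",     # 1 SNP from patient_1
    "retail_food": "ACGTACGTACGTACGAACGT",   # identical to patient_2
    "sporadic_case": "ACGAACGTTCGTACGTACCT", # several SNPs away from the others
}

def snp_distance(a: str, b: str) -> int:
    """Number of differing positions between two equal-length aligned sequences."""
    return sum(x != y for x, y in zip(a, b))

THRESHOLD = 2  # illustrative clustering threshold (SNPs)

# Single-linkage grouping: connect isolates whose distance is within the threshold
clusters = {name: {name} for name in isolates}
for a, b in combinations(isolates, 2):
    if snp_distance(isolates[a], isolates[b]) <= THRESHOLD:
        merged = clusters[a] | clusters[b]
        for member in merged:
            clusters[member] = merged

for cluster in {frozenset(c) for c in clusters.values()}:
    print(sorted(cluster))  # one putative outbreak cluster, one unrelated sporadic case
```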

4.3.1 Whole Genome Sequencing in Foodborne Outbreak Investigation and Epidemiological Surveillance

For outbreak investigation, the foodborne pathogen must be linked to the correct food
product, which is the infection source. The investigation begins with subtyping isolates
that are obtained from affected individuals, implicated food products and production
facilities. Isolation is important because of the legal implications associated with any
public health interventions or regulatory actions taken [77]. It is critical that the
subtyping tool used is able to identify the pathogen down to strain (clone) level
resolution rather than the species level so that the sources of co-occurring outbreaks
can be differentiated and targeted intervention strategies can be implemented. The
typing tool must be able to clearly and precisely resolve the isolates so that isolates
belonging to linked cases can be identified for inclusion in investigation [77]. Similarly,
it must also be able to differentiate concurrent, nonrelated and sporadic cases from
outbreak cases so as not to confound the investigation [77]. The latter is getting
increasingly important as foodborne pathogens can easily cross country boundaries
due to the globalisation of the food supply chain, and unrelated outbreaks may overlap
temporally and geographically.
The ability of WGS technology to resolve an outbreak source was well demon-
strated during the 2010 Haiti cholera outbreak [79], which is the most serious
recorded cholera epidemic in recent history. This outbreak is responsible for killing
at least 8000 people and sickening over 600,000 individuals [96]. This successful
NGS application was enabled through the rapid public release of sequence genomes
by researchers with Vibrio cholerae collections [97–99] and by US CDC from the
Haitian outbreak [79]. Through a joint analysis of available epidemiological data
from the Haitian outbreak and publicly available sequence data as well as isolate data
released from Nepalese authorities, strong evidence suggested a single-source intro-
duction of the outbreak strain from Nepal (Nepalese UN contingent in Haiti) into
Haiti [100]. Subsequently, several genomic-based epidemiological investigations
also demonstrated the promising use of WGS in resolving outbreak investigations
in a highly time sensitive manner and many reports on the use of WGS in outbreak
investigations and surveillance have been published and some have been reviewed
by the following excellent reviews [77, 78].
Globally, the most extensive and well-known WGS-based application for food
safety management is the GenomeTrakr Network [101]. GenomeTrakr is an inter-
national collaboration between US FDA, US CDC, United States Department of
Agriculture (USDA), NCBI, state health departments and international partners
[101]. This network aims to collect WGS data from foodborne bacterial pathogens
and upload them quickly to a publicly accessible database, NCBI. Once genomic
data of bacteria (e.g. Salmonella, Listeria, E. coli, Campylobacter, Vibrio,
Cronobacter), parasites and viruses from US surveillance efforts are available,
they are uploaded by GenomeTrakr into NCBI [102]. The NCBI Pathogen Detection
website plots phylogenetic trees to generate daily clusters that determines the closest
matches to newly submitted data [103, 104]. Genetic relatedness suggests potential
linkages between the animal, food, environment and human isolates but it is not
sufficient for regulatory action, unless it is supported by epidemiological evidence.
Apart from country-specific efforts in adopting WGS for food safety, a global effort –
known as the Global Microbial Identifier (GMI) – has been underway to suggest
the creation of a global genomic infrastructure and database that will enable this rev-
olutionary new technology to identify and characterize microorganisms from ani-
mals, food, environment and humans in a timely (minutes to hours) fashion through
utilising an international interactive system of DNA databases containing the full
genomes of all investigated microbial isolates in the world [105]. Notably, the GMI
idea represents the notion of global inclusiveness to harness benefit from this novel
technology for all mankind, society and the environment. The basis for the GMI
vision lies in the implementation of next generation DNA sequencing in microbiol-
ogy labs around the world. Since its inception in 2011, GMI has garnered increasing
support to advance the debate concerning the social, political, economic, ethical and
technological barriers to realising GMI’s vision. GMI has been organising global
meetings across the continents of Asia, the Americas, and Europe that invite
international experts and participants to speak and discuss existing and currently
trending themes relating to the use of next generation sequencing (NGS) in clinical,
public health, and food microbiology, including virology. Lastly, WHO has
published a landscape paper on “WGS for foodborne disease surveillance”
[72]. This paper is drafted by technical experts from laboratories and public health
authorities to provide guidance that is comprehensive and relevant. It summarizes
some of the benefits and challenges inherent in the implementation of WGS and
describes some of the issues developing countries may face [72]. It also provides an
evidence base for some of the approaches to be considered for WGS
implementation [72].

4.3.2 Whole Genome Sequencing in Antimicrobial Resistance Surveillance

Foodborne pathogens such as bacteria, viruses, fungi and parasites can enter the food
chain at some point from farm to fork to contaminate foods, potentially causing
human foodborne disease. While many foodborne diseases are mild and do not
require treatment, antimicrobials may be prescribed to treat severe cases. However,
with the increasing number of reported AMR foodborne pathogens, certain antimi-
crobials may no longer be effective against them and this poses a serious threat to
public health.
AMR surveillance has typically relied on the isolation of culturable indicator
microorganisms (e.g. Salmonella spp., Campylobacter spp., E. coli and Enterococcus
spp. [106]) and the phenotypic characterization of animal, food, environmental and
clinical isolates. This approach, sometimes utilised together with PCR-based
genotypic detection of AMR genes, has been and will continue to be widely used in
molecular epidemiology for AMR surveillance. However, such combinatorial
approaches are unable to provide information on the mechanisms and drivers of
AMR, and on the presence and spread of AMR genes throughout the global food
chain [76]. The use of WGS can overcome these limitations and this is evidenced by
the increasing number of publications describing various WGS applications for
AMR surveillance among isolates from animals, food, environment and humans.
One of the valuable WGS applications is the ability to predict the phenotypic AMR
profile from the WGS-based genotypic AMR profile. This application has been demon-
strated by several groups in S. enterica [107–109], E. coli [110, 111], Campylobac-
ter spp. [112], Staphylococcus aureus [113] and Mycobacterium tuberculosis
[114, 115], and a high resistance phenotype-genotype correlation (>97%) is
commonly seen.
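The concordance figures being reported can be illustrated with a small calculation in which a resistance phenotype predicted from detected AMR genes is compared with the laboratory result for each isolate, and the percentage agreement is computed. The isolates, gene content and results below are fabricated for the example.

```python
# Hypothetical isolates: AMR genes detected from WGS vs. phenotypic test result
# for a single drug class (R = resistant, S = susceptible).
isolates = [
    {"id": "iso_01", "genes": {"blaTEM-1"}, "phenotype": "R"},
    {"id": "iso_02", "genes": set(),        "phenotype": "S"},
    {"id": "iso_03", "genes": {"blaCTX-M"}, "phenotype": "R"},
    {"id": "iso_04", "genes": set(),        "phenotype": "R"},  # discordant isolate
    {"id": "iso_05", "genes": {"blaTEM-1"}, "phenotype": "R"},
]

RESISTANCE_GENES = {"blaTEM-1", "blaCTX-M"}  # illustrative marker set

def predict(genes: set) -> str:
    """Genotype-based prediction: any known resistance gene -> predicted resistant."""
    return "R" if genes & RESISTANCE_GENES else "S"

agree = sum(predict(i["genes"]) == i["phenotype"] for i in isolates)
concordance = 100 * agree / len(isolates)
print(f"phenotype-genotype concordance: {concordance:.1f}% ({agree}/{len(isolates)})")
```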
Currently, WGS-based AMR surveillance has already been adopted nationwide
by the US public health surveillance system, known as the National Antimicrobial
Resistance Monitoring System (NARMS) [116]. NARMS tracks changes in antimi-
crobial susceptibility and characterizes AMR in foodborne enteric bacteria found in
ill people (CDC), retail meats (FDA), and food animals (USDA) [76, 116]. NARMS
monitors antibiotic resistance among the following four major foodborne bacteria:
Salmonella spp., Campylobacter spp., E. coli, and Enterococcus spp. [116]. In a
recent review by NARMS, they mentioned that their WGS data alone can predict
resistance in Salmonella [107] and other bacteria [111, 112] with a high degree of
accuracy for most major drug classes [117]. They have also generated a simple and
publicly available tool, Resistome Tracker that provides visually informative dis-
plays of antibiotic resistance genes in Salmonella across the globe [118]. Similarly,
WGS-based AMR surveillance has also been conducted by the European Union
(EU), through EU harmonized antimicrobial resistance monitoring program
[119]. In their recent study, they found that horizontal transfer has played a major
role in the spread of colistin resistance among bacteria (i.e. commensal bacteria and
a major foodborne pathogen, in this study Salmonella) in Italian meat-
producing animals [119]. This was demonstrated by the presence of the same
transferable determinant of colistin-resistance on the same conjugative plasmid
found in both E. coli and major Salmonella serotypes that were isolated from the
same intensive-farming industry in Italy [119]. In general, a global WGS-based
AMR surveillance system has yet to be implemented, and most countries still depend
on phenotypic testing and PCR-based genotypic methods for AMR surveillance.
Nevertheless, many studies on WGS-based AMR surveillance of the above-mentioned
bacteria have been published, and several are described in an excellent
review [76].

4.4 Metagenomics for Foodborne Pathogens and Antimicrobial Resistant Microorganisms

Metagenomics is a powerful tool that enables direct, culture-independent analysis of
complex microbiomes (e.g. food, water, fecal, soil, or environmental samples) in one
analytical procedure (one sequencing run). It allows the genomes of difficult-to-culture
or non-culturable microorganisms to be analysed, since the entire DNA
content of a sample is sequenced regardless of its origin [76]. Metagenomic data
provide in-depth taxonomic identification (i.e. to species/strain level) and the
relative abundance of organisms present in the microbiome. Apart from characterizing
the microbiome, metagenomics has potential application in AMR surveillance
[120]. For example, it could facilitate the tracking of AMR genes and mobile
genetic elements in difficult-to-culture or non-culturable microorganisms, which
might also play a role in the transmission of AMR across the food chain, as well as
the monitoring of the abundance and diversity of AMR genes in animals, food, the
environment and humans.
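As a concrete illustration of these outputs, the Python sketch below converts hypothetical read counts assigned to taxa and to AMR gene families into relative abundances and depth-normalised gene abundances. The taxa, genes and counts are invented for demonstration, and reads-per-million is only one of several normalisations in common use.

```python
# Illustrative sketch (hypothetical taxa, genes and counts): summarising a
# metagenomic run as taxon relative abundances and depth-normalised AMR gene counts.
read_counts = {"Escherichia coli": 120_000, "Salmonella enterica": 30_000,
               "Campylobacter jejuni": 15_000, "unclassified": 35_000}
amr_gene_counts = {"tet(A)": 850, "blaTEM-1": 420, "mcr-1": 12}

total_reads = sum(read_counts.values())
taxon_abundance = {t: n / total_reads for t, n in read_counts.items()}

# Normalise AMR gene counts to sequencing depth (reads per million total reads).
amr_per_million = {g: n / total_reads * 1e6 for g, n in amr_gene_counts.items()}

for taxon, frac in sorted(taxon_abundance.items(), key=lambda kv: -kv[1]):
    print(f"{taxon:>22s}: {frac:6.1%}")
for gene, rpm in amr_per_million.items():
    print(f"{gene:>22s}: {rpm:8.1f} reads per million")
```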
Compared with the use of WGS for foodborne outbreak investigation, epidemiological
surveillance and AMR surveillance, the use of metagenomics for
the same purposes is still in its early days (some examples are discussed in a
recent review [76]). In the single largest metagenomic AMR monitoring effort in
livestock to date (9 EU countries, 181 pig and 178 poultry farms, 359 herds,
>9000 animal samples and >5000 GB of sequencing data), the pig and poultry
resistomes differed markedly in abundance and composition [120]. There was a
pronounced country-specific effect on the
resistomes, more so in pigs than in poultry [120]. Pigs were found to have higher
AMR loads, whereas poultry resistomes were more diverse. It is interesting to note
that total AMR abundance in livestock was positively associated with the overall
country-specific antimicrobial usage, and countries with comparable usage patterns
had similar resistomes. However, total functional AMR abundance was not associated
with antimicrobial usage. This suggests that some genes might not confer
AMR functionality in their natural hosts at natural expression levels, even though
the same genes can confer AMR functionality when cloned and expressed in a host
(usually E. coli) in functional metagenomic assays; this may have implications for
assessing the risk to human health posed by AMR genes versus functional AMR genes.
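The reported association between country-level antimicrobial usage and total resistome abundance is, at its core, a correlation across countries. A minimal sketch of such a test is shown below with made-up country-level values and a Spearman rank correlation; it is illustrative only and not a re-analysis of the cited study.

```python
# Illustrative sketch (hypothetical country-level values): does total resistome
# abundance track antimicrobial usage across countries?
from scipy.stats import spearmanr

# antimicrobial usage (e.g. mg active compound per population correction unit)
usage = [45.0, 120.0, 60.0, 210.0, 30.0, 95.0, 150.0, 75.0, 180.0]
# total AMR gene abundance from metagenomics (e.g. fragments per kb per million fragments)
resistome = [0.8, 1.9, 1.1, 2.6, 0.6, 1.0, 2.1, 1.2, 2.4]

rho, p_value = spearmanr(usage, resistome)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
```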

5 Food Sufficiency and Food Sustainability Assessment

Sustainable development is development that meets the needs of the present gener-
ation without compromising the ability of future generations to meet their own needs
[121]. Although it is mostly understood to relate to the environment, it should be
realized that sustainability assessments can – and should – refer to three
areas: environmental, societal and economic. Thus, sustainable development in the
food sector should focus on conservation of land, water, plant and animal genetic
resources, avoidance of environmental degradation, technical appropriateness, eco-
nomic viability and social acceptability [122]. It is estimated that around 25% of
global greenhouse gas emissions come from food systems, and agriculture is also
linked to deforestation, biodiversity loss, land degradation, water overuse and
socioeconomic impacts [123]. Although more than sufficient food is produced
annually in the world, hunger, undernourishment and micronutrient deficiencies
still exist [124]. Meanwhile, obesity and diet-related non-communicable diseases
pose major threats to human health. Therefore, the focus should be not only on
the sufficiency of food systems (which has already been achieved globally) but much
more on the sustainability and nutritional value of the food production systems we use.
Sustainability assessments in the food sector can help to gain insight into the
sustainability performance of food systems, to monitor and certify performance as
proof for customers, to support landscape planning, to advise farms on the strengths
and weaknesses of their set-up, and to serve as a basis for management improvements
or strategy development. More than 35 approaches [125–129] have been developed
for sustainability assessment on farms, farming systems, and supply chains, includ-
ing the response-inducing sustainability evaluation (RISE), farm sustainability indi-
cators (IDEA), sustainability assessment in food and agriculture systems (SAFA)
guidelines, sustainability monitoring and assessment routine (SMART), etc.
In addition, initiatives such as the Environmental Assessment of Food and Drink
Protocol and the Product Environmental Footprint (PEF) pilots (European Commission
2016) encourage the environmental life cycle-based assessment of food products
[130, 131]. The UNEP/SETAC Life Cycle Initiative [132] has also published several
documents, such as Towards a Life Cycle Sustainability Assessment, to enhance the global consensus and
relevance of existing and emerging life cycle methodologies and data management.
In the following paragraphs, the main research findings on quantitative sustainability
assessment in food production and nutritional diet studies are summarized and
discussed.

5.1 Sustainable Food Production Systems

As food production systems are among the leading drivers of environmental impacts,
it is important to assess and improve food-related supply chains as much as
possible. Over the years, a large number of Life Cycle Assessment (LCA) studies
[130, 133] have been carried out to assess agricultural and food processing systems
and to compare alternatives “from farm to table”, including consideration of food
waste management systems.
Most LCA studies [134–136] comparing organic and conventional farming show
that organic farming may be one solution for minimizing negative externalities and
reducing agriculture’s environmental impacts, mainly through the avoidance of
synthetic fertilizers and pesticides, crop diversification, and the application of
organic fertilizers. Studies of local (regional) versus long-distance (imported) food
supply chains have produced mixed results. Some have found that foods produced
locally use less energy and produce fewer GHG emissions than the same products
from long-distance sources [137, 138]. Others have shown that location is especially
important in the case of agriculturally derived products. Brodt et al. [137] found that
California-produced (long-distance) conventional and organic tomato paste and
canned diced tomatoes are almost equivalent in energy use and GHG emissions to
products produced and consumed regionally in the Great Lakes region. Long-distance
tomato production benefits from higher per-hectare yields and soil amendments
with lower carbon dioxide emissions, which substantially offset the added
energy use and GHG emissions associated with long-distance shipment of products
by rail.
LCA studies [139] have suggested that agricultural intensification leads to lower
overall environmental impacts, which implies that increasing land use efficiency is
a logical way to mitigate the pressure from urbanization. Meanwhile, developed
cities have great capacity to mitigate emissions through careful choice of sustainable
food practices, which can reduce embodied greenhouse gases, lessen the urban heat
island effect, and mitigate storm water. However, the impacts on food waste
minimization and ecological footprint reduction should be explored further. For
cities to remain food secure, with strong resilience to potential future climate,
fossil fuel, land and water resource constraints, a multifaceted approach to fresh food
production, such as local commercial peri-urban horticulture, is recommended by
Rothwell et al. [140]. Some advanced farming technologies [141], such as hydroponic
farming, could be promising for more sustainable food production, especially
in terms of land use and water consumption [142].
Aquaculture is the fastest growing food sector and is of increasing economic
importance; it provides healthy protein for humans and complements the
limited availability from overexploited fisheries. FAO has proposed Best Management
Practices (BMP) to enhance sustainable aquaculture production
[143, 144]. The goal of BMPs is to make aquaculture environmentally responsible
while also considering social and economic sustainability [145]. A recent comprehensive
review by Bohnes et al. [146] found that the species farmed and the feed
conversion rate (FCR) obtained within the system are particularly important factors
determining the environmental performance of aquaculture systems. Aquaculture
feed production is a key driver of climate change, acidification, cumulative energy
use and net primary production use, while the farming process is a key driver of
eutrophication.
It is also suggested that seafood farmers should focus on improving the general
management of aquaculture systems, with specific attention to nutrient management,
water management, and the choice of adapted, FCR-optimized aquafeed.
Some technologies such as polyculture and recirculating aquaculture
system (RAS) have a great potential to improve environmental impacts in aquacul-
ture systems. A global effort to optimize, integrate and disseminate such combined
technologies could lead to a sustainable blue revolution in aquatic systems, similar to
the green revolution for terrestrial crop production [145].

5.2 Nutritionally Sustainable Diet

FAO has defined sustainable diets as “those diets with low environmental impacts
which contribute to food and nutrition security and to healthy life for present and
future generations. Sustainable diets are protective and respectful of biodiversity
and ecosystems, culturally acceptable, accessible, economically fair and affordable;
nutritionally adequate, safe and healthy; while optimizing natural and human
resources” [147]. This definition establishes four main goals: human health and
nutrition, cultural acceptability, economic viability, and environmental protection
[148]. These goals highlight long-term health and the protection of the environment.
The Mediterranean diet is a typical model system to develop and
validate methods and indicators for sustainable diets [149].
Dietary choices have great global impacts on environmental sustainability and
human health. A recent study by Tilman and Clark [150] suggests that current diets –
with high levels of processed foods, refined sugars and fats, oils and meats – are
greatly increasing global incidences of type II diabetes, cancer and coronary heart
disease, as well as causing globally significant increases in GHG emissions and
contributing to the clearing of tropical forests, savannas and grasslands. Alternative
dietary options (Mediterranean diet, pescetarian diet, vegetarian diet, etc.) could
substantially improve both human and environmental health.
In addition, nutrition assessment methods that integrate sustainability are being
developed, such as the Combined Nutritional and Environmental Life Cycle Assessment
(CONE-LCA) framework, which evaluates and compares in parallel the environmental
and nutritional effects of foods and diets [151]. Stylianou et al. [152]
provided the first quantitative, epidemiology-based estimate of the complements and
trade-offs between the nutrition- and environment-related human health burden,
expressed in Disability Adjusted Life Years (DALYs), using the example of adding
one serving of fluid milk to the present US adult diet. Adding the serving increased
the environment-mediated health burden of the total diet (in terms of particulate
matter (PM) and global warming (GW) impacts), but at the same time other, more
beneficial, impacts on human health were gained. It will be important in the future
to further develop these types of more holistic evaluations, including one-health,
environmental and socio-economic metrics.
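The kind of trade-off described above can be illustrated with a toy calculation in which a nutrition-related health benefit and environment-mediated health damages are expressed in the same DALY currency and summed. All numbers in the sketch below are hypothetical and are not taken from Stylianou et al. [152].

```python
# Illustrative sketch (all values hypothetical): netting a nutritional health benefit
# against environment-mediated health damage for one added serving of a food,
# with everything expressed in DALYs (negative = DALYs averted, i.e. a health gain).
nutritional_benefit_dalys = -3.0e-6   # benefit from the added serving
pm_damage_dalys = 1.1e-6              # fine particulate matter pathway
ghg_damage_dalys = 0.7e-6             # climate change pathway

net_dalys_per_serving = nutritional_benefit_dalys + pm_damage_dalys + ghg_damage_dalys
servings_per_year = 365
print(f"net effect: {net_dalys_per_serving:+.2e} DALYs per serving "
      f"({net_dalys_per_serving * servings_per_year:+.2e} DALYs per person-year)")
```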

References

1. Grainger-Jones E (2018) How changing the world’s food systems can help to protect the
planet. The Conversation. https://theconversation.com/how-changing-the-worlds-food-systems-can-help-to-protect-the-planet-100619
2. O’Neill J (2016) Tackling drug-resistant infections globally: final report and recommendations
UK
3. Keusch GT, Pappaioanou M, Gonzalez MC, Scott KA, Tsai P (2009) Sustaining global
surveillance and response to emerging zoonotic diseases IOM. The National Academies
Press, Washington, DC
4. Taylor LH, Latham SM, Woolhouse ME (2001) Risk factors for human disease emergence.
Philos Trans R Soc Lond Ser B Biol Sci 356:983–989. https://doi.org/10.1098/rstb.2001.0888
5. Wielinga P, Schlundt J (2014) One health and food safety. In Confronting Emerging Zoono-
ses: the one health paradigm. Springer Verlag. doi:https://doi.org/10.1007/978-4-431-55120-
1_10
6. Shi P, Geng S, Li TT, Li YS, Feng T, Wu HN (2015) Methods to detect avian influenza virus
for food safety surveillance. J Integr Agric 14:2296–2308. https://doi.org/10.1016/s2095-3119
(15)61122-4
7. CDC (2009) Update: swine influenza A (H1N1) infections – California and Texas April 2009.
Morb Mortal Wkly Rep 58:435–437
8. Baldo V, Bertoncello C, Cocchio S, Fonzo M, Pillon P, Buja A, Baldovin T (2016) The new
pandemic influenza A/(H1N1)pdm09 virus: is it really “new”? J Prev Med Hyg 57:E19–E22
9. Lin Z et al (2014) Next-generation sequencing and bioinformatic approaches to detect and
analyze influenza virus in ferrets. J Infect Dev Ctries 8:498–509. https://doi.org/10.3855/jidc.
3749
10. Hald T, Vose D, Wegener HC, Koupeev T (2004) A Bayesian approach to quantify the
contribution of animal-food sources to human salmonellosis. Risk Anal 24:255–269. https://
doi.org/10.1111/j.0272-4332.2004.00427.x
11. FAO/OIE/WHO (2011) The FAO-OIE-WHO collaboration – a tripartite concept note. http://
www.fao.org/docrep/012/ak736e/ak736e00.pdf
12. FAO/OIE/WHO/UNICEF/WORLD-BANK (2008) Contributing to one world, one health; a
strategic framework for reducing risks of infectious diseases at the animal–human–ecosystems
interface. http://www.fao.org/docrep/011/aj137e/aj137e00.htm
13. Rumbeiha WK (2012) Toxicology and “one health”: opportunities for multidisciplinary
collaborations. J Med Toxicol: Off J Am Coll Med Toxicol 8:91–93. https://doi.org/10.
1007/s13181-012-0224-4
14. FAO/WHO (2016) Manual on development and use of FAO and WHO specifications for
pesticides. FAO Plant Production and Protection Paper 228 (FAO/WHO joint work) Rome
15. EFSA (2018a) Monitoring data on pesticide residues in food results on organic versus
conventionally produced food EFSA Supporting publication EN-1397. https://doi.org/10.
2903/sp.efsa.2018.EN-1397
16. Calvert GM, Beckman J, Prado JB, Bojes H, Mulay P, Lackovic M (2015) Acute occupational
pesticide-related illness and injury – United States, 2007–2010. MMWR Morb Mortal Wkly
Rep 62:5–10
17. Moebus S, Boedeker W (2017) Frequency and trends of hospital treated pesticide poisonings
in Germany 2000–2014 German medical science. GMS e-J 15:Doc13–Doc13. https://doi.org/
10.3205/000254
18. EFSA (2013) International frameworks dealing with human risk assessment of combined
exposure to multiple chemicals. EFSA J 11:3313. https://doi.org/10.2903/j.efsa.2013.3313
19. Solecki R et al (2017) Scientific principles for the identification of endocrine-disrupting
chemicals: a consensus statement. Arch Toxicol 91:1001–1006. https://doi.org/10.1007/
s00204-016-1866-9
20. FAO/WHO (1995) Application of risk analysis to food standards issues. http://www.who.int/
foodsafety/publications/risk-analysis/en/
21. WHO (2018) International Food Safety Authorities Network (INFOSAN). http://www.who.
int/foodsafety/areas_work/infosan/en/
22. WHO/FAO. (2018). INFOSAN activity report 2016/2017. Geneva: World Health Organiza-
tion and Food and Agriculture Organization of the United Nations. Available from: https://www.
who.int/foodsafety/publications/infosan_activity2016-17/en/ ISBN: 978-92-4-151464-4
23. EC (2018) European Commission – The rapid alert system for food and feed – 2017 Annual
Report. https://ec.europa.eu/food/sites/food/files/safety/docs/rasff_annual_report_2017.pdf
24. EFSA (2010) Establishment and maintenance of routine analysis of data from the rapid alert
system on food and feed. EFSA J 8:1449. https://doi.org/10.2903/j.efsa.2010.1449
25. ARASFF (2018) Plan of action of the ASEAN Rapid Alert System for Food and Feed
(ARASFF) 2018–2023. https://asean.org/storage/2012/05/POA-ARASFF-2018-2020.pdf
26. EFSA/ECDC (2016) The European Union summary report on trends and sources of zoonoses,
zoonotic agents and food-borne outbreaks in 2015. EFSA J 14:e04634. https://doi.org/10.
2903/j.efsa.2016.4634
27. Messens W, Vivas-Alegre L, Bashir S, Amore G, Romero-Barrios P, Hugas M (2013)
Estimating the public health impact of setting targets at the European level for the reduction
of zoonotic Salmonella in certain poultry populations. Int J Environ Res Public Health
10:4836–4850. https://doi.org/10.3390/ijerph10104836
28. Helwigh B, Christensen J, Müller L (2016) Annual report on Zoonoses in Denmark 2015.
Danish Zoonosis Centre, National Food Institute, Technical University of Denmark. http://
orbit.dtu.dk/en/publications/annual-report-on-zoonoses-in-denmark-2015(d3b81f09-e69c-
4bfc-8940-5b3f8d09040c).html
29. Romero-Barrios P, Hempen M, Messens W, Stella P, Hugas M (2013) Quantitative microbi-
ological risk assessment (QMRA) of food-borne zoonoses at the European level. Food Control
29:343–349. https://doi.org/10.1016/j.foodcont.2012.05.043
30. EFSA (2018b) Open food tox database. https://www.efsa.europa.eu/en/microstrategy/
openfoodtox
31. FDA (2011) Risk analysis at FDA food safety 2011 https://www.fda.gov/food/
foodscienceresearch/risksafetyassessment/ucm243439.htm
32. Scallan E, Mahon BE (2012) Foodborne diseases active surveillance network (FoodNet) in
2012: A Foundation for food safety in the United States. Clin Infect Dis 54:S381–S384.
https://doi.org/10.1093/cid/cis257
33. Yamaguchi T (2014) Social imaginary and dilemmas of policy practice: the food safety arena
in Japan. Food Policy 45:167–173. https://doi.org/10.1016/j.foodpol.2013.06.014
34. Anonymous (1997) Food safety from farm to table: a national food-safety initiative – a report
to the President, May 1997. Institute of Medicine (US) and National Research Council
(US) Committee to Ensure Safe Food from Production to Consumption Ensuring Safe Food:
From Production to Consumption. Washington, DC: National Academies Press (US).
35. Lynch MF, Tauxe RV, Hedberg CW (2009) The growing burden of foodborne outbreaks due
to contaminated fresh produce: risks and opportunities. Epidemiol Infect 137:307–315. https://
doi.org/10.1017/S0950268808001969
36. Chapman B, Gunter C (2018) Local food systems food safety concerns. Microbiol Spectr
6. https://doi.org/10.1128/microbiolspec.PFS-0020-2017
37. Swaminathan DB et al (2006) Building PulseNet international: an interconnected system of
laboratory networks to facilitate timely public health recognition and response to foodborne
disease outbreaks and emerging foodborne diseases. Foodborne Pathog Dis 3:36–50. https://
doi.org/10.1089/fpd.2006.3.36
38. Nadon C et al (2017) PulseNet International: vision for the implementation of whole genome
sequencing (WGS) for global food-borne disease surveillance. Euro Surveill: Bull Eur Mal
Transmissibles = Eur Commun Dis Bull 22:30544. https://doi.org/10.2807/1560-7917.ES.
2017.22.23.30544
39. Rindom S (2016) The Danish system for food safety and nutrition. Available from: https://
www.food.dtu.dk/english/service/about-the-institute/the-danish-system-for-food-safety-and-
nutrition
40. Anonymous (2018) Annual report on Zoonoses in Denmark 2017. National Food Institute,
Technical University of Denmark. http://www.food.dtu.dk/publikationer/
sygdomsfremkaldende-mikroorganismer
41. Hass U, Christiansen S, Axelstad M, Scholze M, Boberg J (2017) Combined exposure to low
doses of pesticides causes decreased birth weights in rats. Reprod Toxicol 72:97–105. https://
doi.org/10.1016/j.reprotox.2017.05.004
42. Bietlot HP, Kolakowski B (2012) Risk assessment and risk management at the Canadian Food
Inspection Agency (CFIA): a perspective on the monitoring of foods for chemical residues.
Drug Test Anal 4:50–58. https://doi.org/10.1002/dta.1352
43. FSANZ (2016) Australia, New Zealand safe food system. http://www.foodstandards.gov.au/
about/safefoodsystem/pages/default.aspx
44. Hussain MTERGMA (2017) Microbial safety of foods in the supply chain and food security.
Adv Food Technol Nutr Sci 3:22–32. https://doi.org/10.17140/AFTNSOJ-3-141
45. RIVM (2018) Food safety. https://www.rivm.nl/en/food-safety
46. Van der Voet H et al (2015) The MCRA model for probabilistic single-compound and
cumulative risk assessment of pesticides. Food Chem Toxicol 79:5–12. https://doi.org/10.
1016/j.fct.2014.10.014
47. RIKILT (2018) RIKILT-Institute of Food Safety (RIKILT), The Netherlands. http://www.
grace-fp7.eu/en/content/rikilt-institute-food-safety-rikilt-netherlands
48. Ribnicky DM et al (2014) Effects of a high fat meal matrix and protein complexation on the
bioaccessibility of blueberry anthocyanins using the TNO gastrointestinal model (TIM-1).
Food Chem 142:349–357. https://doi.org/10.1016/j.foodchem.2013.07.073
49. Zhang Z, Godefroy SB, Lyu HY, Sun BG, Fan YX (2018) Transformation of China’s food
safety standard setting system – review of 50 years of change, opportunities and challenges
ahead. Food Control 93:106–111. https://doi.org/10.1016/j.foodcont.2018.05.047
50. Wu Y-N, Chen J-S (2018) Food safety monitoring and surveillance in China: past, present and
future. Food Control 90:429–439. https://doi.org/10.1016/j.foodcont.2018.03.009
51. Pei XY et al (2015) Microbiological food safety surveillance in China. Int J Environ Res
Public Health 12:10662–10670. https://doi.org/10.3390/ijerph120910662
52. Abou Ghaida T, Spinnler HE, Soyeux Y, Hamieh T, Medawar S (2014) Risk-based food
safety and quality governance at the international law, EU, USA, Canada and France: effective
system for Lebanon as for the WTO accession. Food Control 44:267–282. https://doi.org/10.
1016/j.foodcont.2014.03.023
53. UN (2016) Resolution adopted by the General Assembly on 5 October 2016. Political
declaration of the high-level meeting of the General Assembly on antimicrobial resistance
54. WHO (2001) WHO global strategy for containment of antimicrobial resistance. http://www.
who.int/drugresistance/WHO_Global_Strategy_English.pdf
55. DANMAP (2017) DANMAP 2017 use of antimicrobial agents and occurrence of antimicro-
bial resistance in bacteria from food animals, food and humans in Denmark. Available from:
https://www.danmap.org/-/media/arkiv/projekt-sites/danmap/danmap-reports/danmap-2017/
danmap2017.pdf?la=en
56. ECDC/EFSA (2018) The European Union summary report on antimicrobial resistance in
zoonotic and indicator bacteria from humans, animals and food in 2016. Available from:
https://ecdc.europa.eu/sites/portal/files/documents/AMR-zoonotic-bacteria-humans-animals-
food-2016_Rev3.pdf
57. Aarestrup F (2012) Sustainable farming: get pigs off antibiotics. Nature 486:465. https://doi.
org/10.1038/486465a
58. Dar OA et al (2016) Exploring the evidence base for national and regional policy
interventions to combat resistance. Lancet 387:285–295
59. Kumarasamy KK et al (2010) Emergence of a new antibiotic resistance mechanism in India,
Pakistan, and the UK: a molecular, biological, and epidemiological study. Lancet Infect Dis
10:597–602. https://doi.org/10.1016/S1473-3099(10)70143-2
60. Sørum H (2006) Antimicrobial drug resistance in fish pathogens. In: Antimicrobial resistance
in bacteria of animal origin. Am Soc Microbiol, pp 213–238. https://doi.org/10.1128/
9781555817534.ch13
61. Furushita M et al (2003) Similarity of tetracycline resistance genes isolated from fish farm
bacteria to those from clinical isolates. Appl Environ Microbiol 69:5336–5342. https://doi.org/
10.1128/AEM.69.9.5336-5342.2003
62. WHO (2018b) Global antimicrobial resistance surveillance system (GLASS) report, Early
implementation 2016–2017
63. WHO (2017b) Integrated surveillance of antimicrobial resistance in foodborne bacteria:
application of a one health approach. Guidance from the WHO Advisory Group on Integrated
Surveillance of Antimicrobial Resistance (AGISAR). Available from: http://apps.who.int/iris/
bitstream/handle/10665/255747/9789241512411-eng.pdf?sequence=1
64. ECDC/EFSA/EMA (2017) ECDC/EFSA/EMA second joint report on the integrated analysis
of the consumption of antimicrobial agents and occurrence of antimicrobial resistance in
bacteria from humans and food-producing animals. Joint Interagency Antimicrobial Con-
sumption and Resistance Analysis (JIACRA) report. EFSA J 15:4872. https://doi.org/10.
2903/j.efsa.2017.4872
65. Karp BE et al (2017) National Antimicrobial Resistance Monitoring System: two decades of
advancing public health through integrated surveillance of antimicrobial resistance. Foodborne
Pathog Dis 14:545–557. https://doi.org/10.1089/fpd.2017.2283
66. ANSES (2018) French surveillance network for antimicrobial resistance in pathogenic bacteria
of animal origin 2016 Annual Report. ANSES, France. Available from: https://resapath.anses.
fr/resapath_uploadfiles/files/Documents/2016_RESAPATH%20Rapport%20Annuel_GB.pdf
67. GERMAP (2017) GerMap 2015 – consumption of antibiotics and the spread of antimicrobial
resistance in human and veterinary medicine in Germany
68. Schweickert B et al (2018) Antibiotic consumption in Germany: first data of a newly
implemented web-based tool for local and national surveillance. J Antimicrob Chemother
73:3505–3515. https://doi.org/10.1093/jac/dky345
69. PHE (2018) English Surveillance Programme for Antimicrobial Utilisation and Resistance
(ESPAUR). Public Health England (PHE). Available from: https://assets.publishing.service.
gov.uk/government/uploads/system/uploads/attachment_data/file/759975/ESPAUR_2018_
report.pdf
70. Anonymous (2017) Nippon AMR one health report (NAOR) 2017. The AMR One Health
Surveillance Committee. Available from: https://www.mhlw.go.jp/file/06-Seisakujouhou-
10900000-Kenkoukyoku/0000204347.pdf
71. Holmes AH et al (2016) Understanding the mechanisms and drivers of antimicrobial resis-
tance. Lancet (London, England) 387:176–187. https://doi.org/10.1016/s0140-6736(15)
00473-0
72. WHO (2018c) Whole genome sequencing for foodborne disease surveillance: Landscape
paper. http://www.who.int/iris/handle/10665/272430. License: CC BY-NC-SA 3.0 IGO
Switzerland
73. AVMA (2008) One health: a new professional imperative. Schaumburg, Illinois, USA
74. Robinson TP et al (2016) Antibiotic resistance is the quintessential one health issue. Trans R
Soc Trop Med Hyg 110:377–380. https://doi.org/10.1093/trstmh/trw048
75. Cao Y, Fanning S, Proos S, Jordan K, Srikumar S (2017) A review on the applications of next
generation sequencing technologies as applied to food-related microbiome studies. Front
Microbiol 8:1829–1829. https://doi.org/10.3389/fmicb.2017.01829
76. Oniciuc EA, Likotrafiti E, Alvarez-Molina A, Prieto M, Santos JA, Alvarez-Ordonez A (2018)
The present and future of whole genome sequencing (WGS) and whole metagenome sequenc-
ing (WMS) for surveillance of antimicrobial resistant microorganisms and antimicrobial
resistance genes across the food chain. Genes 9. https://doi.org/10.3390/genes9050268
77. Ronholm J, Nasheri N, Petronella N, Pagotto F (2016) Navigating microbiological food safety
in the era of whole-genome sequencing. Clin Microbiol Rev 29:837–857. https://doi.org/10.
1128/cmr.00056-16
78. Sekse C, Holst-Jensen A, Dobrindt U, Johannessen GS, Li W, Spilsberg B, Shi J (2017) High
throughput sequencing for detection of foodborne pathogens. Front Microbiol 8:2029–2029.
https://doi.org/10.3389/fmicb.2017.02029
79. Taboada EN, Graham MR, Carriço JA, Van Domselaar G (2017) Food safety in the age of next
generation sequencing, bioinformatics, and open data access. Front Microbiol 8:909–909.
https://doi.org/10.3389/fmicb.2017.00909
80. GHSA (2018) What is GHSA? https://www.ghsagenda.org/. Accessed 26 Nov 2018
81. Osterholm MT (2017) Global Health security—an unfinished journey. Emerg Infect Dis 23:
S225–S227. https://doi.org/10.3201/eid2313.171528
82. Gronvall G, Boddie C, Knutsson R, Colby M (2014) One health security: an important
component of the global health security agenda. Biosecurity Bioterrorism: Biodefense Strat-
egy Pract Sci 12:221–224. https://doi.org/10.1089/bsp.2014.0044
83. Pightling AW, Pettengill JB, Luo Y, Baugher JD, Rand H, Strain E (2018) Interpreting whole-
genome sequence analyses of foodborne bacteria for regulatory applications and outbreak
investigations. Front Microbiol 9:1482. https://doi.org/10.3389/fmicb.2018.01482
84. Inns T et al (2017) Prospective use of whole genome sequencing (WGS) detected a multi-
country outbreak of Salmonella Enteritidis. Epidemiol Infect 145:289–298. https://doi.org/10.
1017/s0950268816001941
85. Torpdahl M, Löfström C, Møller Nielsen E (2014) Whole genome sequencing. In: Annual
report on Zoonoses in Denmark 2013. Annual report on Zoonoses in Denmark. DTU Food,
Søborg, pp 16–19
86. FDA (2018d) Whole genome sequencing (WGS) program. https://www.fda.gov/Food/
FoodScienceResearch/WholeGenomeSequencingProgramWGS/default.htm. Accessed 1 Dec
2018
87. Besser J, Carleton HA, Gerner-Smidt P, Lindsey RL, Trees E (2018) Next-generation
sequencing technologies and their application to the study and control of bacterial infections.
Clin Microbiol Infect: Off Publ Eur Soc Clin Microbiol Infect Dis 24:335–341. https://doi.org/
10.1016/j.cmi.2017.10.013
88. Fleischmann RD et al (1995) Whole-genome random sequencing and assembly of
Haemophilus influenzae Rd. Science 269:496–512
89. Blattner FR et al (1997) The complete genome sequence of Escherichia coli K-12. Science
277:1453–1462
90. Cole ST et al (1998) Deciphering the biology of Mycobacterium tuberculosis from the
complete genome sequence. Nature 393:537–544. https://doi.org/10.1038/31159
91. Fraser CM et al (1998) Complete genome sequence of Treponema pallidum, the syphilis
spirochete. Science 281:375–388
92. Kunst F et al (1997) The complete genome sequence of the gram-positive bacterium Bacillus
subtilis. Nature 390:249–256. https://doi.org/10.1038/36786
93. Parkhill J et al (2001) Genome sequence of Yersinia pestis, the causative agent of plague.
Nature 413:523–527. https://doi.org/10.1038/35097083
94. Goodwin S, McPherson JD, McCombie WR (2016) Coming of age: ten years of next-
generation sequencing technologies. Nat Rev Genet 17:333–351. https://doi.org/10.1038/
nrg.2016.49
95. Loman NJ, Pallen MJ (2015) Twenty years of bacterial genome sequencing. Nat Rev
Microbiol 13:787–794. https://doi.org/10.1038/nrmicro3565
96. CDC (2014) Cholera in Haiti. https://www.cdc.gov/cholera/haiti/. Accessed 8 Dec 2018
97. Chin CS et al (2011) The origin of the Haitian cholera outbreak strain. N Engl J Med
364:33–42. https://doi.org/10.1056/NEJMoa1012928
98. Hendriksen RS et al (2011) Population genetics of vibrio cholerae from Nepal in 2010:
evidence on the origin of the Haitian outbreak. MBio 2:e00157–e00111. https://doi.org/10.
1128/mBio.00157-11
99. Reimer AR et al (2011) Comparative genomics of Vibrio cholerae from Haiti. Asia Afr Emerg
Infect Dis 17:2113–2121. https://doi.org/10.3201/eid1711.110794
100. Eppinger M et al (2014) Genomic epidemiology of the Haitian cholera outbreak: a single
introduction followed by rapid, extensive, and continued spread characterized the onset of the
epidemic. MBio 5:e01721. https://doi.org/10.1128/mBio.01721-14
101. Allard MW, Strain E, Melka D, Bunning K, Musser SM, Brown EW, Timme R (2016)
Practical value of food pathogen traceability through building a whole-genome sequencing
network and database. J Clin Microbiol 54:1975–1983. https://doi.org/10.1128/jcm.00081-16
102. FDA (2018b) GenomeTrakr Network. https://www.fda.gov/food/foodscienceresearch/
wholegenomesequencingprogramwgs/ucm363134.htm. Accessed 11 Dec 2018
103. Allard MW et al (2018) Genomics of foodborne pathogens for microbial food safety. Curr
Opin Biotechnol 49:224–229. https://doi.org/10.1016/j.copbio.2017.11.002
104. NCBI (2018) Pathogen detection. https://www.ncbi.nlm.nih.gov/pathogens/. Accessed 11 Dec
2018
105. GMI (2018) Global microbial identifier. http://www.globalmicrobialidentifier.org/. Accessed
9 Dec 2018
106. WHO (2017a) Guidance on integrated surveillance of antimicrobial resistance in foodborne
bacteria: Application of a One Health Approach. Switzerland
107. McDermott PF et al (2016) Whole-genome sequencing for detecting antimicrobial resistance
in Nontyphoidal Salmonella. Antimicrob Agents Chemother 60:5515–5520. https://doi.org/
10.1128/aac.01030-16
108. Neuert S et al (2018) Prediction of phenotypic antimicrobial resistance profiles from whole
genome sequences of non-typhoidal Salmonella enterica. Front Microbiol 9:592. https://doi.
org/10.3389/fmicb.2018.00592
109. Zankari E et al (2013) Genotyping using whole-genome sequencing is a realistic alternative to
surveillance based on phenotypic antimicrobial susceptibility testing. J Antimicrob Chemother
68:771–777. https://doi.org/10.1093/jac/dks496
110. Stoesser N et al (2013) Predicting antimicrobial susceptibilities for Escherichia coli and
Klebsiella pneumoniae isolates using whole genomic sequence data. J Antimicrob Chemother
68:2234–2244. https://doi.org/10.1093/jac/dkt180
111. Tyson GH et al (2015) WGS accurately predicts antimicrobial resistance in Escherichia coli. J
Antimicrob Chemother 70:2763–2769. https://doi.org/10.1093/jac/dkv186
112. Zhao S et al (2016) Whole-genome sequencing analysis accurately predicts antimicrobial
resistance phenotypes in campylobacter spp. Appl Environ Microbiol 82:459–466. https://doi.
org/10.1128/AEM.02873-15
113. Gordon NC et al (2014) Prediction of Staphylococcus aureus antimicrobial resistance by
whole-genome sequencing. J Clin Microbiol 52:1182–1191. https://doi.org/10.1128/JCM.
03117-13
114. Phelan J et al (2016) Mycobacterium tuberculosis whole genome sequencing and protein
structure modelling provides insights into anti-tuberculosis drug resistance. BMC Med 14:31.
https://doi.org/10.1186/s12916-016-0575-9
115. Walker TM et al (2015) Whole-genome sequencing for prediction of Mycobacterium tuber-
culosis drug susceptibility and resistance: a retrospective cohort study. Lancet Infect Dis
15:1193–1202. https://doi.org/10.1016/s1473-3099(15)00062-6
116. FDA (2018a) About NARMS. https://www.fda.gov/animalveterinary/safetyhealth/
antimicrobialresistance/nationalantimicrobialresistancemonitoringsystem/ucm059089.htm.
Accessed 10 Dec 2018
117. McDermott PF, Zhao S, Tate H (2018) Antimicrobial resistance in nontyphoidal Salmonella.
Microbiol Spectr 6. https://doi.org/10.1128/microbiolspec.ARBA-0014-2017
118. FDA (2018c) Global Salmonella Resistome Data. https://www.fda.gov/AnimalVeterinary/
SafetyHealth/AntimicrobialResistance/NationalAntimicrobialResistanceMonitoringSystem/
ucm570694.htm. Accessed 11 Dec 2018
119. Alba P et al (2018) Molecular epidemiology of mcr-encoded colistin resistance in
Enterobacteriaceae from food-producing animals in Italy revealed through the EU harmonized
antimicrobial resistance monitoring. Front Microbiol 9:1217. https://doi.org/10.3389/fmicb.
2018.01217
120. Munk P et al (2018) Abundance and diversity of the faecal resistome in slaughter pigs and
broilers in nine European countries. Nat Microbiol 3:898–908. https://doi.org/10.1038/
s41564-018-0192-9
121. WCED (1987) Our common future, report of the World Commision on Enviroment and
Development
122. FAO (2012) Sustainability assessment of food and agriculture systems (SAFA). Guidelines.
Food and Agriculture Organization of the United Nations, Rome
123. Garnett T (2016) Plating up solutions. Science 353:1202–1204
124. FAO (2017) The state of food security and nutrition in the world
125. Briquel V, Vilain L, Bourdais J-L, Girardin P, Mouchet C, Viaux P (2001) La méthode IDEA
(indicateurs de durabilité des exploitations agricoles): une démarche pédagogique Ingénieries-
EAT, pp 29–39
126. Castoldi N, Bechini L (2010) Integrated sustainability assessment of cropping systems with
agro-ecological and economic indicators in northern Italy. Eur J Agron 32:59–72
127. Grenz J, Thalmann C, Stämpfli A, Studer C, Häni F (2009) RISE–a method for assessing the
sustainability of agricultural production at farm level. Rural Dev News 1:5–9
128. Kamali FP, Borges JA, Meuwissen MP, de Boer IJ, Lansink AGO (2017) Sustainability
assessment of agricultural systems: the validity of expert opinion and robustness of a multi-
criteria analysis. Agric Syst 157:118–128
129. Schader C, Grenz J, Meier MS, Stolze M (2014) Scope and precision of sustainability
assessment approaches to food systems. Ecol Soc 19:42
130. Hauschild MZ, Rosenbaum RK, Olsen SI (2018) Life cycle assessment – theory and practice. Springer
International Publishing, Cham
131. Protocol E (2013) Environmental assessment of food and drink protocol. European Food
Sustainable Consumption and Production Round Table (SCP RT), Working Group 1
132. Valdivia S, Ugaya CM, Hildenbrand J, Traverso M, Mazijn B, Sonnemann G (2013) A UNEP/
SETAC approach towards a life cycle sustainability assessment—our contribution to Rio+ 20.
Int J Life Cycle Assess 18:1673–1685
133. Notarnicola B, Salomone R, Petti L, Renzulli PA, Roma R, Cerutti AK (2015) Life cycle
assessment in the agri-food sector. In: Life cycle assessment in the agri-food sector: system
studies, methodological issues and best practices, pp 10–68
134. Aguilera E, Guzmán G, Alonso A (2015) Greenhouse gas emissions from conventional and
organic cropping systems in Spain. I. Herbaceous crops. Agron Sustain Dev 35:713–724
135. Comm. E (2007) Council Regulation (EC) No 834/2007 of 28 June 2007 on organic produc-
tion and labelling of organic products and repealing Regulation (EEC) No 2092/91. Official
Journal of the European Union L189
136. Tricase C, Lamonaca E, Ingrao C, Bacenetti J, Giudice AL (2018) A comparative life cycle
assessment between organic and conventional barley cultivation for sustainable agriculture
pathways. J Clean Prod 172:3747–3759
137. Brodt S, Kramer KJ, Kendall A, Feenstra G (2013) Comparing environmental impacts of
regional and national-scale food supply chains: A case study of processed tomatoes. Food
Policy 42:106–114
138. Pimentel D, Williamson S, Alexander CE, Gonzalez-Pagan O, Kontak C, Mulkey SE (2008)
Reducing energy inputs in the US food system. Hum Ecol 36:459–471
139. Notarnicola B, Sala S, Anton A, McLaren SJ, Saouter E, Sonesson U (2017) The role of life
cycle assessment in supporting sustainable agri-food systems: a review of the challenges. J
Clean Prod 140:399–409
140. Rothwell A, Ridoutt B, Page G, Bellotti W (2016) Environmental performance of local food:
trade-offs and implications for climate resilience in a developed city. J Clean Prod
114:420–430
141. Goldstein B, Hauschild M, Fernández J, Birkved M (2016) Urban versus conventional
agriculture, taxonomy of resource profiles: a review. Agron Sustain Dev 36:9
142. Barbosa GL et al (2015) Comparison of land, water, and energy requirements of lettuce grown
using hydroponic vs. conventional agricultural methods. Int J Environ Res Public Health
12:6879–6891
143. Boyd CE (2003) Guidelines for aquaculture effluent management at the farm-level. Aquacul-
ture 226:101–112
144. FAO (2013) Aquaculture development: use of wild fish as feed in aquaculture. Food and
Agriculture Organization of the United Nations
145. Bosma RH, Verdegem MC (2011) Sustainable aquaculture in ponds: principles, practices and
limits. Livest Sci 139:58–68
146. Bohnes FA, Hauschild MZ, Schlundt J, Laurent A (2018) Life cycle assessments of aquacul-
ture systems: a critical review of reported findings with recommendations for policy and
system development. Rev Aquac
147. Burlingame B, Dernini S (2012) Sustainable diets and biodiversity: directions and solutions
for policy, research and action. International Scientific Symposium, Biodiversity and Sustain-
able Diets United Against Hunger, FAO Headquarters, Rome, Italy, 3–5 November 2010. In:
Sustainable Diets and Biodiversity: Directions and Solutions for Policy, Research and Action.
International Scientific Symposium, Biodiversity and Sustainable Diets United Against Hun-
ger, FAO Headquarters, Rome, Italy, 3–5 November 2010. Food and Agriculture Organization
of the United Nations (FAO)
148. Fanzo J, Cogill B, Mattei F (2012) Metrics of sustainable diets and food systems. CGIAR,
Washington, DC
149. Prosperi P, Allen T, Padilla M, Peri I, Cogill B (2014) Sustainability and food & nutrition
security: A vulnerability assessment framework for the Mediterranean region. SAGE Open
4:2158244014539169
150. Tilman D, Clark M (2014) Global diets link environmental sustainability and human health.
Nature 515:518
151. Fulgoni V, Wallace T, Stylianou K, Jolliet O (2018) Calculating intake of dietary risk
components used in the global burden of disease studies from the what we eat in America/
National Health and Nutrition Examination Surveys. Nutrients 10:1441
152. Stylianou KS, Heller MC, Fulgoni VL, Ernstoff AS, Keoleian GA, Jolliet O (2016) A life
cycle assessment framework combining nutritional and environmental health impacts of diet:
A case study on milk. Int J Life Cycle Assess 21:734–746
Part III
Exploring the Technology Landscape
for Solutions
Gaussianization of Variational Bayesian
Approximations with Correlated
Non-nested Non-negligible Posterior Mean
Random Effects Employing Non-negativity
Constraint Analogs and Analytical
Depossinization for Iteratively Fitting
Capture Point, Aedes aegypti Habitat
Non-zero Autocorrelated Prognosticators:
A Case Study in Evidential Probabilities
for Non-frequentistic Forecast
Epi-entomological Time Series Modeling
of Arboviral Infections

Angelica Huertas, Nathanael Stanley, Samuel Alao, Toni Panaou,
Benjamin G. Jacob, and Thomas Unnasch

College of Public Health, University of South Florida, Tampa, FL, USA
e-mail: bjacob1@health.usf.edu

1 Introduction

Aedes (Ae.) aegypti and Ae. albopictus are significant mosquito species, since they
are capable of transmitting dengue fever, chikungunya, Zika, and yellow fever
viruses. Ae. aegypti mosquitoes have adapted to the urban environment by utilizing
artificial containers as the preferred breeding sites, such as waste tires, trash cans,
toys, plant pots, neglected pools, and unsealed septic tanks.
Ae. aegypti disease risk may be modeled with high predictive accuracy by
employing Geographic Information Systems (GIS) and other cartographic software
(e.g., object based ENVI technology) for mapping and interpolating time series
abundances of vectors and pathogens. These reservoirs are often associated with
environmental co-factors. Predictively regressively understanding the processes by

which species colonize and adapt to human habitats is particularly important in the
case of a virulent disease-vectoring arthropod such as Ae. aegypti. Real time, ArcGIS
regressing the spatial risk of human exposure to Ae. aegypti pathogens, ideally can
have high accuracy for forecasting county abatement or district geolocations of
elevated risk without overestimating risk coverage, which may be constructed
based on epi-entomological time series data of abundance of vectors or infected
vectors.
Leandrol-Reguillo et al. [38] employed multi-criteria, spatial, statistical simulation
modeling in ArcGIS, whereby the fuzzy overlay function was adopted to create
a predictive, distribution, risk map of potential, seasonal, urban, breeding site,
eco-georeferenceable geolocations of Ae. aegypti in four districts of Barcelona,
Spain. The Overlay type in ArcGIS combined the epi-entomological, vector, arthro-
pod, seasonal, geosampled data based on set theory analysis which allowed the
exploration of the membership of each grid cell belonging to various input criteria
[e.g., fuzzy and, fuzzy or, fuzzy Product, fuzzy Sum, and fuzzy Gamma]. The
authors employed fuzzy products to determine Ae. aegypti, capture point, signature,
covariate membership to various input criteria including narrow streets, highly
dense, neighborhoods, urban vegetation and other specified, land use, land cover
(LULC) ArcGIS, anthropogenic geoclassified regions (courtyards or patios within
building blocks).
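For readers unfamiliar with the fuzzy overlay operators named above, the Python sketch below shows the underlying per-cell arithmetic for fuzzy Product, fuzzy Sum and fuzzy Gamma on a tiny hypothetical grid; it is not the authors' ArcGIS workflow, and the membership values are invented.

```python
# Illustrative sketch of fuzzy overlay arithmetic on hypothetical membership rasters.
import numpy as np

# Fuzzified criteria (memberships in [0, 1]) on a 2 x 2 grid, e.g. street narrowness,
# building density and courtyard/patio density. Values are made up.
criteria = np.array([
    [[0.9, 0.2], [0.6, 0.1]],   # criterion 1
    [[0.8, 0.3], [0.7, 0.2]],   # criterion 2
    [[0.7, 0.1], [0.9, 0.4]],   # criterion 3
])

fuzzy_product = criteria.prod(axis=0)               # conservative, AND-like combination
fuzzy_sum = 1.0 - (1.0 - criteria).prod(axis=0)     # optimistic, OR-like combination
gamma = 0.7                                          # blends the two operators
fuzzy_gamma = fuzzy_sum ** gamma * fuzzy_product ** (1.0 - gamma)

print("fuzzy product:\n", fuzzy_product)
print("fuzzy gamma (0.7):\n", fuzzy_gamma)
```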
The fuzzification decomposed the N potential, Ae. aegypti, larval/pupal habitat,
capture points into K disjoint subsets Sj containing Nj capture points, which
subsequently minimized the sum-of-squares criterion J = Σ_{j=1}^{K} Σ_{n∈Sj} ‖x_n − μ_j‖²
in a real time, ArcGIS model, where x_n was a potential, seasonal, superbreeder, prolific,
Ae. aegypti focus represented by the nth breeding site and μ_j was the geometric
centroid of the foci in Sj. In statistics, the residual sum of squares (RSS), also known
as the sum of squared residuals (SSR) or the sum of squared errors of prediction
as the sum of squared residuals (SSR) or the sum of squared errors of prediction
(SSE), is the sum of the squares of residuals (e.g., deviations forecasted from actual,
empirical, potential, seasonal, superbreeder, county abatement, capture point
covariates) [19]. In Leandrol-Reguillo et al. [38] the RSS was a measure of the
discrepancy between the geosampled Ae. aegypti habitat data and an estimation
mode. In general, the algorithm achieved a global minimum of J over the ArcGIS
assigned LULCs. In mathematical analysis, the maxima and minima (the respective
plurals of maximum and minimum) of a function, known collectively as extrema (the
plural of extremum), are the largest and smallest value of the function, either within a
given range (the local or relative extrema), or on the entire domain of a function (e.g.,
the global or absolute extrema tabulated from an empirical dataset of potential,
geosampled seasonal, superbreeder, eco-georeferenceable, Ae. aegypti, capture
point foci) [36]. The number of representations of n by k squares, allowing zeros
and distinguishing signs and order, is denotable as r_k(n) in the model. The special
case k = 2, corresponding to two squares, is often expressed simply as r_2(n) = r(n)
[19].
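When the fuzzy memberships are hardened into K disjoint subsets, minimising the criterion J above is the familiar within-cluster sum-of-squares problem. The sketch below illustrates a Lloyd-style iteration on hypothetical capture-point coordinates; it is a generic illustration of the criterion, not the cited ArcGIS implementation.

```python
# Illustrative sketch (hypothetical coordinates): minimising
# J = sum_j sum_{n in S_j} ||x_n - mu_j||^2 by alternating assignment and update.
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 10, size=(30, 2))            # hypothetical capture-point coordinates
K = 3
mu = x[rng.choice(len(x), K, replace=False)]    # initial centroids

for _ in range(20):
    # assign each capture point to its nearest centroid
    d2 = ((x[:, None, :] - mu[None, :, :]) ** 2).sum(axis=2)
    labels = d2.argmin(axis=1)
    # move each centroid to the mean of its assigned points (keep it if the subset is empty)
    mu = np.array([x[labels == j].mean(axis=0) if np.any(labels == j) else mu[j]
                   for j in range(K)])

J = sum(((x[labels == j] - mu[j]) ** 2).sum() for j in range(K))
print("within-cluster sum of squares J =", round(float(J), 2))
```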
Employing the eco-cartographically delineated, Ae. aegypti, potential, seasonal,
superbreeder, eco-georeferenceable, empirical geoclassified LULC variables,
Leandrol-Reguillo et al. [38] then generated a fuzzified building density map
which predicted that the historic center and west area of the study site had a
potentially high population density of habitats. Since small containers, vases, flower
pot dishes and patio drains were chosen as an indicator, the fuzzified, patio, LULC,
density map located the possible accumulation of these potential breeding sites. The
final model revealed that the highest density of potential, superbreeder, capture point
habitats occurred mainly in the first major development of Barcelona, characterized
by Cerdà’s courtyard block typology.
It may be assumed that an empirical, time series, geosampled, county abatement,
Ae. aegypti, larval/pupal, aquatic habitat, frequency, covariate, estimator dataset
consists of n independent observations y = (y_1, . . ., y_n)′ from a one-parameter family,
and that the model being entertained expresses the parameter θ_i associated with the
ith geosampled, potential, seasonal, superbreeder, capture point, habitat observation
as a given linear combination of p unknown parameters β = (β_1, . . ., β_p)′. As such,
the log likelihood function L(β; y) would be the sum of n terms l(θ_i; y_i) with
θ_i = x_i′β, x_i being the ith row of a full rank matrix X in the vector arthropod
probabilistic paradigm. This specification may describe some of the most widely
employed seasonal, epi-entomological covariates for critically targeting potential,
superbreeder, county abatement, capture point, larval/pupal, habitat covariates of
unknown foci.
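To make the likelihood structure concrete, the sketch below writes out L(β; y) as a sum of one-parameter terms l(θ_i; y_i) with θ_i = x_i′β, using Poisson-distributed habitat counts under a log link. The design matrix and counts are hypothetical, and the Poisson family is one plausible choice rather than the authors' specification.

```python
# Illustrative sketch (hypothetical data): the log likelihood L(beta; y) as a sum of
# one-parameter terms l(theta_i; y_i) with theta_i = x_i' beta, for Poisson counts.
import numpy as np
from scipy.special import gammaln

X = np.array([[1.0, 0.2], [1.0, 1.5], [1.0, 0.7], [1.0, 2.1]])  # full-rank design matrix
y = np.array([2, 9, 4, 15])                                      # habitat count observations

def log_likelihood(beta):
    theta = X @ beta              # linear predictor theta_i = x_i' beta
    mu = np.exp(theta)            # Poisson mean under a log link
    return float(np.sum(y * theta - mu - gammaln(y + 1)))

print(log_likelihood(np.array([0.5, 1.0])))
```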
The use of confidence regions based on the drop in log-likelihood in ArcGIS may
have wide appeal for predictive, vulnerability, epi-entomological, real time model-
ing, eco-georeferenceable, geosampled, Ae. aegypti, time series, county abatement,
frequency, geosampled covariates for determining prognosticators of seasonal
superbreeder, county abatement unknown, foci. The arguments supporting it include
invariance, good second-order properties and other criteria of a more heuristic
nature. This would include the central role played by the likelihood function in the
various inferences parsimoniously derivable from empirically regressing, real-time,
capture point, frequency estimators to determine optimal endemic covariates of
statistical significance spatiotemporally associated with potential, seasonal,
superbreeder, county abatement foci of Ae. aegypti. Unfortunately, computational
difficulties have prevented the assessment of the precision of ML estimates by means of
the log-likelihood functions in these seasonal, vector arthropod, larval/pupal, habitat,
capture point, predictive, county abatement, risk models in the literature. This is one
of the main reasons for the extensive utilization of approximations to the
log-likelihood, in the literature, for assessment of potential, superbreeder, county
abatement, eco-georeferenceable, geosampled, Ae. aegypti capture points for
targeting prolific, seasonal larval/pupal habitat, regression covariates.
Geometrically, the sum of the squared capture point distances of potential,
prolific, Ae. aegypti, breeding site, seasonal covariates geosampled in a county
abatement, intervention, epi-entomological, study site may be measured parallel to
the axis of the dependent variable (e.g., weekly prevalence of Zika or dengue in a
county abatement, epi-entomological, study site). PROC REG or R may optimally
geo-spatiotemporally quantitate datasets of eco-georeferenceable potential, seasonal,
superbreeder, Ae. aegypti, capture points geosampled in a county abatement,
epi-entomological intervention site. The corresponding capture point on the
regression surface may reveal that the smaller the differences in the immature,
frequency, geosampled, density counts, the better the model fits the
epi-entomological, data. The resulting estimator may be expressible by a simple
formula, in PROC REG or R especially in the case of a simple linear regression, in
which there is a single explanative variable in the regression equation.
Bayesian probability is an interpretation of the concept of probability, in which,
instead of frequency or propensity of some phenomenon, probability is interpreted as
reasonable expectation [21], representing a state of knowledge or as quantification of
a personal belief. The Bayesian interpretation of probability in a predictive, county
abatement, Ae. aegypti, larval/pupal, habitat, regression, forecast, vulnerability, risk
model could be seen as an extension of propositional logic that enables reasoning
with hypotheses, [i.e., the propositions whose truth or falsity about a forecasted,
potential, geosampled, seasonal superbreeder, foci explanatory covariate is uncer-
tain]. In the Bayesian view, a probability is assigned to a hypothesis, whereas under
frequentist inference, a hypothesis is typically tested without being assigned a
probability [20]. Bayesian probability belongs to the category of evidential probabilities:
to evaluate the probability of a hypothesis, the Bayesian probabilist
specifies some prior probability, which is then updated to a posterior probability in
the light of new, relevant data (evidence) [29]. The Bayesian interpretation for
simulating potential, seasonal, superbreeder, Ae. aegypti, eco-georeferenceable, cap-
ture points in an epi-entomological, intervention, county abatement study site may
hence provide an iteratable standardizable set of procedures and formulae to perform
calculations.
A Bayesian inferential, time series, Ae. aegypti paradigm would employ a subjective
maximum likelihood estimator and a predicted posterior prior in order to
determine the statistical significance of a geosampled dataset of capture point
covariates. Such a non-frequentistic, hierarchical, iterative paradigm may reveal
non-normal, noisy attributes [e.g. nonlinear, non-homoskedastic residuals] from
regressed datasets of eco-georeferenced, county abatement, Ae. aegypti regressors.
Thereafter, a forecast vulnerability model specification with a conditionally
autoregressive prior and a residual coefficient minimization criterion in SAS may
simulate potential, eco-georeferenceable, seasonal, superbreeder, prolific, Ae.
aegypti, county abatement, foci predictors while optimally quantitating leptokurtic/
platykurtic and other non-Gaussian distributions in the regression-based, probabilistic
paradigm.
We employed frequency, Bayes, time series, geosampled, covariate estimates
from a dataset of eco-georeferenced, seasonal, superbreeder, Ae. aegypti, breeding
site, capture point, foci prognosticators geosampled in an epi-entomological,
intervention site in Hillsborough County, Florida, USA, while adjusting
propagational non-normalities (e.g., pseudo-replicated data distributions,
spatial and aspatial dependent estimators) in the model output. The prior distribution
was modified into the posterior distribution that represented the uncertainty over the
sampled vector arthropod, habitat covariates. In Bayesian statistical inference, a
prior probability distribution, often simply called the prior, of an uncertain quantity
is the distribution that would express one’s beliefs about this quantity before some
evidence is taken into account [12]. In the Ae. aegypti forecast vulnerability, time
series model, the posterior distribution was derived from the Bayes formula [i.e.,
π(θ|x) = f(x|θ)π(θ) / ∫_Θ f(x|θ)π(θ)dθ], where ∫_Θ f(x|θ)π(θ)dθ was a normalization
constant, x indicated the county abatement, geosampled, capture point data, θ was
the unknown quantity (i.e., potential, seasonal, eco-georeferenceable, superbreeder,
capture point covariates), π(θ) was the prior distribution, f(x|θ) was the likelihood
function, and the posterior distribution uncertainty was π(θ|x). There are two key
advantages of Bayesian theory: (i) once the uncertainty in the posterior distribution
is expressed via a probability distribution, the statistical inference can be automated;
and (ii) available prior information is reasonably incorporated into the statistical
model [9]. Here, the posterior estimate of the potential, eco-georeferenced, seasonal,
superbreeder parameter R0 was derived employing its prior information, where the
likelihood function followed a binomial distribution with a conjugate prior as a beta
distribution of the first kind in PROC MCMC.
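Because a binomial likelihood with a beta prior is conjugate, the posterior referred to above is available in closed form, which provides a useful analytical check on what PROC MCMC samples numerically. The Python sketch below illustrates this update with hypothetical prior parameters and container counts; it is not the authors' SAS program.

```python
# Illustrative sketch (hypothetical counts): Beta(a, b) prior + binomial likelihood
# gives a Beta(a + k, b + n - k) posterior in closed form.
from scipy.stats import beta

a_prior, b_prior = 2.0, 8.0    # prior belief about the proportion of positive containers
n, k = 40, 11                  # n containers inspected, k positive for Ae. aegypti immatures

a_post, b_post = a_prior + k, b_prior + (n - k)
posterior = beta(a_post, b_post)
print(f"posterior mean = {posterior.mean():.3f}, "
      f"95% credible interval = {posterior.interval(0.95)}")
```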
PROC MCMC uses an adaptive blocked random-walk Metropolis algorithm with
a normalized proposal distribution. We employed the d-dimensional Markov chain
X (i.e., iteratable, geosampled, county abatement, Ae. aegypti, larval/pupal, habitat,
frequency, capture point, discrete integer, count values) to determine a new value
(a potential, prolific, seasonal, superbreeder, breeding site covariate) X*, which
was obtainable by proposing a jump [i.e., Y* := X* − X] from the prespecified
Lebesgue density. A Lebesgue density is usually considered as a point function in
the sense that a fixed subset of a space X is given and then the value of the density
of this set is obtained at various points of the space [10].
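A minimal random-walk Metropolis update of the kind described above proposes a jump Y* from a symmetric density, sets X* = X + Y*, and accepts the move with probability min(1, π(X*)/π(X)). The Python sketch below uses a toy Gaussian target and hypothetical tuning values; it is illustrative only and does not reproduce PROC MCMC's adaptive blocked sampler.

```python
# Illustrative sketch (toy target, hypothetical tuning): random-walk Metropolis.
import numpy as np

rng = np.random.default_rng(1)

def log_target(x):
    # unnormalised log posterior over a d-dimensional parameter (toy standard normal target)
    return -0.5 * float(np.sum(x ** 2))

d, n_iter, step = 2, 5000, 0.8
x = np.zeros(d)
samples = np.empty((n_iter, d))
for t in range(n_iter):
    proposal = x + rng.normal(scale=step, size=d)      # jump Y* from a symmetric density
    if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
        x = proposal                                   # accept the proposed state
    samples[t] = x                                     # otherwise retain the current state

print("posterior mean estimate:", samples.mean(axis=0).round(3))
```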
Our assumption was that PROC MCMC could obtain prolific, Ae. aegypti samples from the corresponding posterior distributions and save the posterior samples in an unbiased, empirical output dataset. In so doing, the Ae. aegypti, capture point, larval/pupal, aquatic habitat data that had any likelihood, prior or hyperprior could then be accommodated in a time series, forecast, vulnerability, county abatement, non-frequentist model framework in PROC MCMC, as long as these functions were programmable employing SAS DATA step functions. We also assumed that the seasonal, normalized parameters could thereafter be constructed robustly, either linearly or in any nonlinear functional form, in PROC NLMIXED.
The objectives of this research were: (1) to construct a robust, Poisson regression model framework employing multiple, geosampled, explanatory, predictor variables; and (2) to generate an autoregressive, Bayesian-oriented, uncertainty, frequency matrix employing multivariate, seasonal, geosampled, parameterizable, spectrotemporal estimators for filtering latent non-zero parameter estimators and other biased, non-normal random noise (e.g., spatial heteroskedasticity) in the residuals, for precisely forecasting immature, seasonal, Ae. aegypti, superbreeder, breeding site, capture point, geosampled covariates in a sub-meter resolution simulation model of Hillsborough County, Florida. Although this epi-entomological, forecast, vulnerability model is focused on Ae. aegypti immatures in a county abatement, intervention, study site in Florida, the iterative paradigm may be employable for other vector arthropods or nuisance mosquitoes/pests in other local and foreign intervention study sites.

2 Materials and Methods

2.1 Study Area

Hillsborough County is located on the central west coast of the Florida peninsula,
with a land area of 2642.33 km2, and has a subtropical humid climate. Figure 1
highlights the location of Hillsborough County on a map of Florida. In 2010, the
U.S. census recorded 1.2 million residents in the county, which is expected to
increase to 1.4 million by the year 2020.

2.2 Ae. aegypti Abundance in Different Containers

Ae. aegypti abundance data (Table 1) for this study was obtained from various
sources [2, 7].
Dinh and Novak [18] studied the diversity and abundance of mosquitoes in an
urban Florida swamp area in waste tires. Out of 1223 fourth instar larvae and pupae
surviving to adulthood, only 1% were of the Ae. aegypti species (12 larvae/pupa)
(Table 1). The location of the tire placement was not in an optimal environment for
Ae. aegypti, indicating this abundance is below the expected value.
To determine Ae. aegypti abundance in a single cup for this study, the range of egg counts laid under laboratory conditions (40–142) was divided by the two-cup density to obtain 20–71 eggs in one 300 mL ovitrap (Table 1).

Fig. 1 Hillsborough County Study Area



Table 1 Ae. aegypti abundance in different volume containers and their potential use as substitutes for common artificial containers

Container | Abundance | Potential substitute containers | Source
60 ml water | 9–14 eggs (1 female) | Shot glass; ashtray |
200 ml water in 360 ml cup (lined with rugose) | 74 (27–125, 95% CI) (1 female) | Lined plant container, cup with napkin inside |
200 ml water in 360 ml cup (unlined) | 5 (2–20, 95% CI) (1 female) | Beer bottle, soda can, solo cup, mug |
300 ml water | 20–71 eggs | | Abreu et al. [2]
Waste tire in swamp | 12 larvae/pupae | N/A | Dinh and Novak [18]

2.3 Temperature and Relative Humidity

We assumed that the covariates temperature and relative humidity were important to the study. This is supported by a study by Dickerson, who measured the percent hatch rate of Ae. aegypti and Ae. albopictus eggs via 25 tests employing different temperature and humidity combinations (Table 2). The tests revealed that as relative humidity increased, the percent of eggs that moved on to the larval stage also increased. Additionally, the optimal temperature for mosquito egg development was between 28 and 33 °C, whereas progression to the larval stage decreased the longer the eggs were exposed to temperatures between 32 and 35 °C. The data in Table 2 were obtained from Dickerson.

2.4 Regression Analyses

The relationship between each potential, seasonal, superbreeder, Ae. aegypti capture point and each individual, geosampled, endemic, transmission-oriented, predictive, risk-related, explanatory, time-series covariate was investigated by single-variable regression analysis in PROC GENMOD. Since the parasite prevalence data were binomial fractions, Poisson regression analyses were employed to determine the relationship between the potential, county abatement, seasonal, superbreeder, Ae. aegypti capture points and the geosampled habitat characteristics.
The Poisson process in our analyses was provided by the limit of a binomial distribution of the geosampled, explanatory, county abatement, predictor estimates using $P_p(n \mid N) = \frac{N!}{n!(N-n)!}\,p^{n}(1-p)^{N-n}$. We viewed the distribution as a function of the expected number of capture point count variables, using the sample size N for quantifying the fixed p in Eq. (2.1), which was then transformed into the equation $P_{N\nu}(n \mid N) = \frac{N!}{n!(N-n)!}\left(\frac{\nu}{N}\right)^{n}\left(1-\frac{\nu}{N}\right)^{N-n}$.

Table 2 Temperature and relative humidity effect on Ae. aegypti immature survival

Temperature (°C) | Relative Humidity (%) | Percent Hatch Rate
15 | 15 | 32.34
15 | 35 | 46.12
15 | 55 | 42.26
15 | 75 | 61.73
15 | 95 | 55.64
21 | 15 | 59.62
21 | 35 | 82.07
21 | 55 | 81.77
21 | 75 | 90.68
21 | 95 | 91.38
27 | 15 | 86.85
27 | 35 | 91.54
27 | 55 | 90.52
27 | 75 | 92.47
27 | 95 | 90.86
32 | 15 | 68.04
32 | 35 | 75.67
32 | 55 | 93.16
32 | 75 | 81.04
32 | 95 | 94.45
35 | 15 | 58.28
35 | 35 | 59.42
35 | 55 | 69.58
35 | 75 | 79.73
35 | 95 | 82.24

Based on the sample size N, the distribution approached $P_{\nu}(n) = \lim_{N\to\infty} P_p(n \mid N) = \lim_{N\to\infty}\frac{N(N-1)\cdots(N-n+1)}{N^{n}}\,\frac{\nu^{n}}{n!}\left(1-\frac{\nu}{N}\right)^{N}\left(1-\frac{\nu}{N}\right)^{-n} = 1\cdot\frac{\nu^{n}}{n!}\cdot e^{-\nu}\cdot 1 = \frac{\nu^{n}e^{-\nu}}{n!}$.
The GENMOD procedure then fit a generalized linear model (GLM) to the
sampled data by maximum likelihood estimation of the parameter vector β. The
GENMOD procedure estimated the seasonal geosampled parameters of the county
abatement, forecast, vulnerability regression model numerically through an iterative
fitting process. The dispersion parameter was then estimated by the residual deviance
and by Pearson’s chi-square divided by the degrees of freedom (d.f.). Covariances,
standard errors, and p-values were then computed for the geosampled covariate
coefficients based on the asymptotic normality derived from the ML estimation.
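A minimal PROC GENMOD specification of the Poisson GLM described above might look as follows; this is a hedged sketch, and the dataset (habitats) and covariate names (temp, humidity, immature_count) are hypothetical stand-ins for the geosampled capture point variables rather than the study's actual data.

/* Hedged sketch: Poisson GLM fit by maximum likelihood in PROC GENMOD.
   Dataset and variable names are hypothetical.                          */
proc genmod data=habitats;
   model immature_count = temp humidity / dist=poisson link=log scale=pearson;
   /* SCALE=PEARSON reports the dispersion as Pearson chi-square / d.f.   */
run;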
The regression analyses assumed independent counts (i.e., Ni) taken at multiple, potential, superbreeder, capture point, eco-georeferenced locations i = 1, 2, . . ., n. The Ae. aegypti habitat, immature counts were then described by a set of variables in PROC REG denoted by the matrix Xi, a 1 × p vector of indicator values for a geosampled, potential, seasonal, superbreeder, capture point, county abatement location i. The expected value of the epi-entomological, time series, capture point, habitat data was given by μi(Xi) = ni(Xi)exp(Xiβ), where β was the vector of non-redundant explanatory parameters in the endemic, transmission-oriented, risk model, and the Poisson rates were given by λi(Xi) = μi(Xi)/ni(Xi). In probability theory and statistics, the Poisson distribution is a discrete probability distribution that expresses the probability of a given number of events occurring in a fixed interval of time or space if these events occur with a known constant rate and independently of the time since the last event [28].

The rate parameter λi(Xi) was both the mean and the variance of the Poisson distribution for each geosampled, Ae. aegypti, capture point, larval/pupal habitat geolocation i. The dependent variable was the total, seasonal, immature density count. The Poisson regression model assumed that the empirical, vector arthropod, immature, geosampled, habitat data was equally dispersed, that is, that the conditional variance equaled the conditional mean. Partial correlations were then defined after introducing the concept of conditional distributions.
We initially restricted ourselves to the potential, seasonal, superbreeder, eco-georeferenceable, capture point, Ae. aegypti conditional distributions rendered from multivariate normal distributions. We noted an n × 1 random vector Z, which we partitioned into two random vectors X and Y, where X was an $n_1 \times 1$ vector and Y was an $n_2 \times 1$ vector, in the equation $Z = \binom{X}{Y}$. The conditional distribution properties of the regressed, habitat, explanatory, endemic, transmission-oriented covariate coefficients were then defined. Subsequently, we partitioned the mean vector and covariance matrix in a corresponding manner, that is, $\mu = \binom{\mu_1}{\mu_2}$ and $\Sigma = \begin{pmatrix}\Sigma_{11} & \Sigma_{12}\\ \Sigma_{21} & \Sigma_{22}\end{pmatrix}$. In so doing, $\mu_1$ rendered the means for the asymptotical, normalized, endemic, transmission-oriented, capture point predictor variables in the geosampled data $x_1$, and $\Sigma_{11}$ the variances and covariances for set $x_1$. The matrix $\Sigma_{12}$ then provided the covariances between the geosampled, potential, seasonal, superbreeder, Ae. aegypti explanatory regressors in set $x_1$ and set $x_2$, as did matrix $\Sigma_{21}$. Any distribution for a subset of variables from a multivariate normal, conditional on known values for another subset of variables, has a multivariate normal distribution [19].
We noted that the conditional distribution of $x_1$, given the known, capture point, potential, seasonal, superbreeder, geosampled time series values for $x_2$, was multivariate normal with mean vector $\mu_1 + \Sigma_{12}\Sigma_{22}^{-1}(x_2 - \mu_2)$ and covariance matrix $\Sigma_{11} - \Sigma_{12}\Sigma_{22}^{-1}\Sigma_{21}$. The procedure employed ML estimation to find the operationalized, potential, seasonal, superbreeder, capture point, independent variables. The data was then log-transformed before analysis to normalize the distribution and minimize standard error.
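The log-transformation can be carried out with a simple DATA step; the sketch below assumes a hypothetical input dataset (habitats) and count variable (immature_count), and adds 1 before taking logs to avoid log(0).

/* Hedged sketch: log-transforming the geosampled counts before regression.
   Dataset and variable names are hypothetical; the +1 offset guards against log(0). */
data habitats_log;
   set habitats;
   log_count = log(immature_count + 1);
run;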
In the Poisson regression, capture point, probability model, the epi-entomological data generation process was conducted in PROC REG by $g(y_i) = \frac{\exp(-\mu_i)\,\mu_i^{y_i}}{y_i!}$, where $\mu_i = e^{x_i'\beta}$. Thus, the zero-inflated model was defined as $P(y_i = 0 \mid x_i, z_i) = F_i + (1 - F_i)\exp(-\mu_i)$ and $P(y_i \mid x_i, z_i) = (1 - F_i)\frac{\exp(-\mu_i)\,\mu_i^{y_i}}{y_i!}$ for $y_i > 0$. The conditional expectation of $y_i$ for the model was given by $E(y_i \mid x_i, z_i) = \mu_i(1 - F_i)$. Note that the Ae. aegypti model exhibited overdispersion, since $V(y_i \mid x_i, z_i) > E(y_i \mid x_i, z_i)$.
In statistics, overdispersion is the presence of greater variability (statistical dispersion) in a data set than would be expected based on a given statistical model. Overdispersion is often encountered when fitting very simple parametric models such as those based on the Poisson distribution [20]. If overdispersion is a feature in a seasonal, predictive, risk-related, endemic, transmission-oriented, county abatement, vector arthropod, epi-entomological, habitat, frequency, distribution model, an alternative model with additional free parameters may provide a better fit [34]. Here, we employed a negative binomial model to quantitate over-Poissonized parameters associated with the seasonal, geosampled, regressed, Ae. aegypti, larval/pupal, habitat covariates.
The zero-inflated negative binomial (ZINB) model in PROC COUNTREG was based on the negative binomial model with quadratic variance function (p = 2). The ZINB model was obtained by specifying a negative binomial distribution for the geosampled, time series, potential, seasonal, superbreeder, Ae. aegypti, larval habitat, data generation process, referred to by $g(y_i) = \frac{\Gamma(y_i+\alpha^{-1})}{y_i!\,\Gamma(\alpha^{-1})}\left(\frac{\alpha^{-1}}{\alpha^{-1}+\mu_i}\right)^{\alpha^{-1}}\left(\frac{\mu_i}{\alpha^{-1}+\mu_i}\right)^{y_i}$. Hence, the ZINB model was definable as $P(y_i = 0 \mid x_i, z_i) = F_i + (1-F_i)(1+\alpha\mu_i)^{-\alpha^{-1}}$ and $P(y_i \mid x_i, z_i) = (1-F_i)\frac{\Gamma(y_i+\alpha^{-1})}{y_i!\,\Gamma(\alpha^{-1})}\left(\frac{\alpha^{-1}}{\alpha^{-1}+\mu_i}\right)^{\alpha^{-1}}\left(\frac{\mu_i}{\alpha^{-1}+\mu_i}\right)^{y_i}$ for $y_i > 0$. In this case, the conditional expectation and conditional variance of $y_i$ in the vulnerability, forecast-oriented, Ae. aegypti, county abatement paradigm were $E(y_i \mid x_i, z_i) = \mu_i(1-F_i)$ and $V(y_i \mid x_i, z_i) = E(y_i \mid x_i, z_i)\,[1+\mu_i(F_i+\alpha)]$, respectively.
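A hedged PROC COUNTREG sketch of a ZINB specification of this form follows; the dataset, count variable, regressors, and zero-inflation covariate are hypothetical, and the exact DIST= keyword may vary by SAS/ETS release.

/* Hedged sketch: zero-inflated negative binomial in PROC COUNTREG.
   Dataset and variable names (habitats, immature_count, temp, humidity,
   container_type) are hypothetical.                                     */
proc countreg data=habitats;
   model immature_count = temp humidity / dist=zinb;
   zeromodel immature_count ~ container_type;   /* zero-inflation component */
run;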
A mixture model with a negative binomial distribution determined statistically significant covariates of potential, seasonal, superbreeder, Ae. aegypti capture points geosampled in the Hillsborough County, epi-entomological, intervention, study site, where the mean of the Poisson distribution was itself a random variable drawn from a non-homogeneous gamma distribution. We introduced an additional free parameter into the spatiotemporal, capture point, Ae. aegypti, larval/pupal, habitat, probability distribution. The family of negative binomial distributions is a two-parameter family which admits several parameterizations for treating overdispersed data [16].
The Poisson distribution had one free parameter, which did not allow the variance to be adjusted independently of the mean in the Ae. aegypti, forecast, vulnerability model. A parameterization technique was employed in PROC REG employing two, geosampled, capture point variables p and r, with 0 < p < 1 and r > 0. Under this parameterization, the probability mass function (pmf) of the eco-georeferenced, geosampled, potential, seasonal, superbreeder, Ae. aegypti, larval/pupal habitat, asymptotical, normalized, capture point, explanatory, predictor variables with a NegBin(r, p) distribution took the following form: $f(k; r, p) = \binom{k+r-1}{k}\, p^{r}\,(1-p)^{k}$ for k = 0, 1, 2, . . ., where $\binom{k+r-1}{k} = \frac{\Gamma(k+r)}{k!\,\Gamma(r)} = (-1)^{k}\binom{-r}{k}$ and $\Gamma(r) = (r-1)!$.
The pmf is often the primary means of defining a discrete probability distribution rendered from an epi-entomological, forecast, vulnerability, capture point, vector arthropod, diagnostic model output. Such functions exist for either scalar or multivariate potential, seasonal, superbreeder, county abatement, geosampled, time series random variables whose domain is discrete.
We also employed an alternative parameterization for optimally, regressively quantitating the empirically geosampled, Ae. aegypti, habitat, time series, frequency covariates employing the mean λ, with $\lambda = r\,(p^{-1}-1)$ and $p = \frac{r}{r+\lambda}$, in PROC REG. The pmf then became $g(k) = \frac{\lambda^{k}}{k!}\cdot\frac{\Gamma(r+k)}{\Gamma(r)\,(r+\lambda)^{k}}\cdot\frac{1}{(1+\lambda/r)^{r}}$, where λ and r were the potential, seasonal, superbreeder, capture point, geosampled estimators. Under this parameterization, we were able to generate $\lim_{r\to\infty} g(k) = \frac{\lambda^{k}}{k!}\cdot\frac{1}{\exp(\lambda)}$, which we noted resembled the mass function of a Poisson-distributed random variable with Poisson rate λ in the Ae. aegypti forecast model output. In other words, the alternatively parameterized negative binomial distribution generated from the regressed, Ae. aegypti, larval/pupal, habitat, explanatory, capture point, time-series covariates converged to the Poisson distribution, and r controlled the deviation from the Poisson. This made the negative binomial habitat model suitable as a robust alternative to the Poisson framework for modeling the seasonal, geosampled, endemic, transmission-oriented, explanatory, time-series, Ae. aegypti, capture point covariates.
The negative binomial distribution of the epi-entomological, time-series, Ae. aegypti, potential, seasonal, superbreeder, breeding site, county abatement, geosampled, frequency covariates arose as a continuous mixture of Poisson distributions, where the mixing distribution of the Poisson rate was a gamma distribution. A gamma distribution is a general type of statistical distribution that is related to the beta distribution and arises naturally in processes for which the waiting times between Poisson-distributed events are relevant. The mass function of the negative binomial distribution of the geosampled, time series, endemic, explanatory, capture point, Ae. aegypti predictor variables was then written as:

$$f(k) = \int_{0}^{\infty} \mathrm{Poisson}(k \mid \lambda)\cdot\mathrm{Gamma}\big(\lambda \mid r, (1-p)/p\big)\,d\lambda
     = \int_{0}^{\infty} \frac{\lambda^{k}}{k!}\exp(-\lambda)\cdot\frac{\lambda^{r-1}\exp\big(-\lambda p/(1-p)\big)}{\Gamma(r)\,\big((1-p)/p\big)^{r}}\,d\lambda
     = \frac{p^{r}}{k!\,\Gamma(r)\,(1-p)^{r}}\int_{0}^{\infty}\lambda^{(r+k)-1}\exp\big(-\lambda/(1-p)\big)\,d\lambda
     = \frac{\Gamma(r+k)}{k!\,\Gamma(r)}\,p^{r}(1-p)^{k}.$$

Subsequently, we employed a generalizable, hierarchical, Bayesian, inferential, probability model constructed in PROC MCMC to determine which covariates were geospatially and temporally associated with capture point, larval/pupal abundance counts in an empirically geosampled, eco-georeferenceable dataset of potential, seasonal, superbreeder, Ae. aegypti, county abatement foci. A SAS calculation validated that all the larval habitat, county abatement observations were independent. The logarithm of the posterior density was calculable as $\log\big(p(\theta \mid y)\big) = \log\big(\pi(\theta)\big) + \sum_{i=1}^{n}\log\big(f(y_i \mid \theta)\big)$, where θ was a vector of the geosampled, larval habitat parameters. The term log(π(θ)) was the sum of the logs of the prior densities in the forecast, vulnerability, Ae. aegypti model.
We specified the PRIOR and HYPERPRIOR statements in PROC MCMC. The term log(f(yi | θ)) was the log likelihood specified in the MODEL statement. Subsequently, the MODEL statement specified the log likelihood for a single, geosampled, eco-georeferenceable, potential, seasonal, Ae. aegypti, superbreeder, capture point observation in the Bayesian simulated, habitat dataset.
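A skeletal PROC MCMC program with PRIOR and HYPERPRIOR statements of this kind could be arranged as follows; the Poisson likelihood, dataset name, covariate, and hyperparameter structure shown here are illustrative assumptions and not the exact model fit in the study.

/* Hedged sketch: hierarchical model with PARMS, HYPERPRIOR, PRIOR and MODEL statements.
   Dataset (habitats), covariate (temp) and the normal/gamma choices are hypothetical. */
proc mcmc data=habitats nmc=10000 nbi=1000 seed=2019 outpost=post;
   parms beta0 0 beta1 0;
   parms tau 1;
   hyperprior tau ~ gamma(shape=0.01, iscale=0.01);  /* hyperprior on the precision     */
   prior beta0 beta1 ~ normal(0, prec=tau);          /* priors conditional on tau       */
   lambda = exp(beta0 + beta1*temp);
   model immature_count ~ poisson(lambda);           /* log likelihood per observation  */
run;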

The Bayesian framework was defined by conditional probabilities constructed from the asymptotically normalized, Ae. aegypti, habitat, capture point, PROC MCMC dataset. The observation nodes in the Bayesian estimation model were denoted by a vector $x = (x_{1}, x_{2}, \ldots, x_{N})$, since the set of states of the observation node $x_{j}$ was generated from the geosampled, endemic, transmission-oriented, seasonal, habitat data, which was represented by $x_{j} \in \{1, 2, \ldots, Y_{j}\}$ in PROC MCMC. The hidden nodes were denoted by $z_{k} \in \{1, 2, \ldots, T_{k}\}$. The probability that the state of the hidden node $z_{k}$ was i, $1 \le i \le T_{k}$, was expressed as $a(k, i) := P(z_{k} = i)$. The conditional probability in the Ae. aegypti model was based on the j-th larval habitat, time-series, explanatory, epi-entomological, geosampled, capture point predictors when $x_{j}$ was present. Meanwhile, l ($1 \le l \le Y_{j}$) was based on the condition that the states of the hidden nodes were described by $Z = (Z_{1}, Z_{2}, \ldots, Z_{K})$, which was generated by $b(j, l \mid z) := P(x_{j} = l \mid z)$. We then optimally defined $a := \{a(k, i)\}$ and $b := \{b(j, l \mid z)\}$. Thereafter, we employed ω = {a, b} for representing a dataset of potential, seasonal, superbreeder, Ae. aegypti, habitat, capture points and asymptotically normalized, frequency estimators.
We constructed an asymptotic confidence interval for a class of nonstationary, Ae.
aegypti, larval/pupal habitat, potential, seasonal, superbreeder, capture point
covariates with constant mean and time-varying variances. Due to the large number
of unknown parameters, traditional approaches based on consistent estimation of the
limiting variance of sample mean through moving block or non-overlapping block
methods are not applicable for epi-entomological time series regression forecast
modelling [35].
Under a block-wise asymptotically equal cumulative variance assumption, we constructed a self-normalized confidence interval that was robust against the nonstationarity and dependence structure of the geosampled, capture point, Ae. aegypti, eco-georeferenceable, potential, seasonal, superbreeder data. We also constructed an asymptotic confidence interval for the mean difference of nonstationary processes with piecewise constant means generated from the geosampled, time series, capture point, oviposition, frequency covariates. The proposed methods were illustrated through simulations in PROC MCMC. As a result, the joint probability that the states of the stochastically/deterministically, iteratively interpolative, breeding site, vector arthropod, observation nodes were $x = (x_{1}, x_{2}, \ldots, x_{N})$ and the states of the hidden nodes were $z = (z_{1}, z_{2}, \ldots, z_{K})$ was parsimoniously quantitated by $P(x, z \mid \omega) = \prod_{k=1}^{K} a(k, z_{k})\prod_{j=1}^{N} b(j, x_{j} \mid z)$ in PROC MCMC. The marginal probability that the state of the time-series, explanatory, potential, seasonal, superbreeder, Ae. aegypti habitat nodes was represented by x was generated by formulating $P(x \mid \omega) = \sum_{z} P(x, z \mid \omega) = \sum_{z_{1}=1}^{T_{1}}\cdots\sum_{z_{K}=1}^{T_{K}}\Big[\prod_{k=1}^{K} a(k, z_{k})\prod_{j=1}^{N} b(j, x_{j} \mid z)\Big]$ in PROC MCMC, where we employed the notation $\sum_{z} := \sum_{z_{1}=1}^{T_{1}}\sum_{z_{2}=1}^{T_{2}}\cdots\sum_{z_{K}=1}^{T_{K}}$ for the summation over all states of the hidden nodes. We assumed the capture point, geosampled, time series dependent, explanatory covariates $X^{n} = \{X_{1}, X_{2}, \ldots, X_{n}\}$ were independently and identically taken from the true distribution $p_{0}(x)$. In Bayesian learning, the prior distribution φ(ω) on the parameter ω is set [21]. Predictive posterior probabilities may enable determining unobserved values in an epi-entomological model employing priors [36].
Subsequently, the posterior distribution $p(\omega \mid X^{n})$ was computed from the empirical distribution of the eco-georeferenced, potential, seasonal, superbreeder, Ae. aegypti habitat, capture point covariates by $p(\omega \mid X^{n}) = \frac{1}{Z(X^{n})}\exp\big(-nH_{n}(\omega)\big)\,\varphi(\omega)$, which generated the expressions $H_{n}(\omega) = \frac{1}{n}\sum_{i=1}^{n}\log\frac{p_{0}(X_{i})}{p(X_{i}\mid\omega)}$ and $Z(X^{n})$ (i.e., the normalization constant) in PROC MCMC. The Bayesian predictive distribution $p(x \mid X^{n})$ was provided by averaging the model over the posterior distribution as follows: $p(x \mid X^{n}) = \int p(x \mid \omega)\,p(\omega \mid X^{n})\,d\omega$. The Bayesian stochastic complexity $F(X^{n})$ was then defined by $F(X^{n}) = -\log Z(X^{n})$, which was employed as a criterion in PROC MCMC by which the model selected the hyperparameters in the prior. We let $E_{X^{n}}[\cdot]$ be the expectation over all the county abatement, larval/pupal, aquatic habitat predictors. The Bayesian stochastic complexity had the following asymptotic form: $E_{X^{n}}[F(X^{n})] \approx \lambda\log n - (m-1)\log\log n + O(1)$, where λ and m were the prolific, larval/pupal, habitat, regressed, spatiotemporal, geosampled, capture point covariates and their endemic, time-series, covariate coefficient, indicator, frequency count values, respectively.
In regular models, 2λ is equal to the number of parameters and m = 1, while in non-identifiable models, 2λ is not larger than the number of parameters and m ≥ 1. However, our Bayesian paradigm required integration over the posterior distribution, which could not be performed analytically.
The variational framework in PROC MCMC approximated the Bayesian posterior $p(Z^{n}, \omega \mid X^{n})$ based on hidden explanatory variables and the variational posterior $q(Z^{n}, \omega \mid X^{n})$, which was factorized employing $q(Z^{n}, \omega \mid X^{n}) = Q(Z^{n} \mid X^{n})\,r(\omega \mid X^{n})$, where $Q(Z^{n} \mid X^{n})$ and $r(\omega \mid X^{n})$ were posteriors based on the inconspicuous error coefficients and the quantitated, time-series, geosampled, epi-entomological, capture point data, respectively. The variational posterior $q(Z^{n}, \omega \mid X^{n})$ was subsequently chosen to minimize the functional $\bar{F}[q] = \sum_{Z^{n}}\int q(Z^{n}, \omega \mid X^{n})\log\frac{q(Z^{n}, \omega \mid X^{n})\,p_{0}(X^{n})}{p(X^{n}, Z^{n}, \omega)}\,d\omega$, which in our vector arthropod model was further defined by $\bar{F}[q] = F(X^{n}) + K\big(q(Z^{n}, \omega \mid X^{n})\,\|\,p(Z^{n}, \omega \mid X^{n})\big)$, employing the normalized parameter estimators, which were realized as the Kullback–Leibler divergence $K\big(q(Z^{n}, \omega \mid X^{n})\,\|\,p(Z^{n}, \omega \mid X^{n})\big)$.
The true Bayesian posterior was $p(Z^{n}, \omega \mid X^{n})$ and the variational posterior was $q(Z^{n}, \omega \mid X^{n})$. This led to the functional $\bar{F}[q]$ being minimized under the constraint, and the variational posteriors $r(\omega \mid X^{n})$ and $Q(Z^{n} \mid X^{n})$ were then computed by employing the equations $r(\omega \mid X^{n}) = \frac{1}{C_{r}}\,\varphi(\omega)\exp\langle\log p(X^{n}, Z^{n} \mid \omega)\rangle_{Q}$ and $Q(Z^{n} \mid X^{n}) = \frac{1}{C_{Q}}\exp\langle\log p(X^{n}, Z^{n} \mid \omega)\rangle_{r}$, where $C_{r}$ and $C_{Q}$ were the normalization constants. In probability theory, a normalizing constant is a constant by which an everywhere non-negative function must be multiplied so that the area under its graph is 1 (e.g., to make it a pmf). It was important to note that these equations rendered only necessary conditions for the functional $\bar{F}[q]$ to be minimized in the endemic, transmission-oriented, risk-related, Ae. aegypti, time-series, geosampled, county abatement empirical dataset. Subsequently, the variational posteriors were computed by an iterative, interpolative algorithm. We defined the variational stochastic complexity $\bar{F}(X^{n})$ by the minimum value of the functional $\bar{F}[q]$, that is, $\bar{F}(X^{n}) = \min_{r, Q}\bar{F}[q]$, based on the difference between $\bar{F}(X^{n})$ and the Bayesian stochastic complexity [i.e., $F(X^{n})$].
Subsequently, we generated the variational posterior for the estimation matrix. In this research, variational Bayes was seen as an extension of the EM (expectation-maximization) algorithm from maximum a posteriori (MAP) estimation of the single most probable, seasonal, superbreeder, county abatement, Ae. aegypti, capture point, parameter estimator value. The fully Bayesian estimation computed an approximation to the entire posterior distribution of the geosampled larval habitat quantitators and latent explanatory diagnostic variables. We assumed that the prior distribution [φ(ω)] of the geosampled, Ae. aegypti, capture point, larval habitat, observational parameters [ω = {a, b}] was the conjugate prior distribution. In so doing, φ(ω) was given by $\varphi(a_{k}) = \frac{\Gamma(T_{k}\phi_{0})}{\Gamma(\phi_{0})^{T_{k}}}\prod_{z_{k}=1}^{T_{k}} a(k, z_{k})^{\phi_{0}-1}$, k = 1, 2, . . ., K, employing the time-series, regressed coefficient estimates, which was subsequently quantitated by $\varphi\big(b(j \mid z)\big) = \frac{\Gamma(Y_{j}\xi_{0})}{\Gamma(\xi_{0})^{Y_{j}}}\prod_{x_{j}=1}^{Y_{j}} b(j, x_{j} \mid z)^{\xi_{0}-1}$, j = 1, 2, . . ., N. These were expressed as Dirichlet distributions with asymptotical, normalized hyperparameters, which were generated by employing $\phi_{0} > 0$ and $\xi_{0} > 0$. The Dirichlet distribution [i.e., Dir(α)] is a family of continuous, multivariate probability distributions parameterized by the vector α of positive reals; it is the multivariate generalization of the beta distribution, and the conjugate prior of the categorical distribution and multinomial distribution in Bayesian statistics [17].
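For reference, the Dirichlet density underlying these conjugate priors can be written in its standard form (a textbook statement, not a study-specific derivation):

$$f(x_{1},\ldots,x_{K};\alpha_{1},\ldots,\alpha_{K})
   = \frac{\Gamma\big(\sum_{k=1}^{K}\alpha_{k}\big)}{\prod_{k=1}^{K}\Gamma(\alpha_{k})}
     \prod_{k=1}^{K} x_{k}^{\alpha_{k}-1},
\qquad x_{k} > 0,\ \ \sum_{k=1}^{K} x_{k} = 1 .$$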
We let δ(n) be 1 when n = 0 and 0 otherwise, and then defined the potential, seasonal, superbreeder, capture point, county abatement, geosampled, Ae. aegypti larval habitats in the time series, predictive, epi-entomological, county abatement, forecast, vulnerability model and specified parameter uncertainty estimates with $\bar{n}_{(k,z_{k})} := \sum_{i=1}^{n}\big\langle\delta\big(Z_{i}^{(k)} - z_{k}\big)\big\rangle_{Q}$ in PROC MCMC. We also employed $\bar{n}_{(x_{j}\mid z)} := \sum_{i=1}^{n}\delta\big(X_{i}^{(j)} - x_{j}\big)\big\langle\prod_{k=1}^{K}\delta\big(Z_{i}^{(k)} - z_{k}\big)\big\rangle_{Q}$. In these specified equations, $X_{i}^{(j)}$ was the state of the j-th observational node, and $Z_{i}^{(k)}$ was the state of the k-th hidden node. The variational posterior distribution of the geosampled, eco-georeferenceable, Ae. aegypti, capture point parameters [i.e., ω = {a, b}] was then given by the equation $r(a_{k} \mid X^{n}) = \frac{\Gamma(n + T_{k}\phi_{0})}{\prod_{z_{k}=1}^{T_{k}}\Gamma\big(\bar{n}_{(k,z_{k})} + \phi_{0}\big)}\prod_{z_{k}=1}^{T_{k}} a(k, z_{k})^{\bar{n}_{(k,z_{k})} + \phi_{0} - 1}$, where each of the normalized coefficients was spatially regressed employing the iterative equation $\bar{n}_{zx} := \sum_{i=1}^{n}\big\langle\prod_{k=1}^{K}\delta\big(Z_{i}^{(k)} - z_{k}\big)\big\rangle_{Q}$ in PROC MCMC. It then followed that $\bar{n}_{zx} = \sum_{x_{j}=1}^{Y_{j}}\bar{n}_{(x_{j}\mid z)}$, for j = 1, . . ., N, and $\bar{n}_{(k,z_{k})} = \sum_{z_{-k}}\bar{n}_{zx}$, which subsequently denoted the sum over $z_{i}\,(i \neq k)$ in the forecast vulnerability model.
We evaluated the statistical efficiency of the MCMC sequence by employing the
following steps:
Step 1. Sample S(m) from f(S|X, Z(m–1));
Step 2. Sample P(m) from f(P|X, S(m), Z(m–1));
Step 3. Sample Z(m) from f(Z|X, P(m), S(m), Φ(m–1));
where X was the empirical county abatement, geosampled dataset of the time-series,
Ae. aegypti, larval habitat, potential, seasonal, superbreeder, capture point, covariate,
coefficients. Thereafter, Z was randomly assigned a starting value Z(1) by using a
uniform prior distribution.
Step 1 was performed by drawing from an inverse Wishart distribution. In Bayesian statistics, the inverse Wishart distribution is used as the conjugate prior for the covariance matrix of a multivariate normal distribution [21]. The pdf of the explanatory, time-series, geosampled, county abatement, larval habitat, endemic, transmission-related covariates was $\frac{|\Psi|^{m/2}\,|B|^{-(m+p+1)/2}\,e^{-\mathrm{trace}(\Psi B^{-1})/2}}{2^{mp/2}\,\Gamma_{p}(m/2)}$, where B and Ψ were p × p positive definite matrices and $\Gamma_{p}(\cdot)$ was the multivariate Gamma function. A positive definite matrix is a symmetric matrix with all positive eigenvalues [17].
The potential, superbreeder, Ae. aegypti, capture point, larval habitat, seasonal distribution rendered from the inverse of a Wishart-distributed matrix was generated from A ~ W(Σ, m) and Σ in PROC MCMC, which was of size p × p. Under these circumstances, $B = A^{-1}$ had an inverse Wishart distribution $B \sim W^{-1}(\Sigma^{-1}, m)$. We calculated the pdf [i.e., $\Gamma_{p}(\cdot)$], which in the epi-entomological, forecast-oriented, vulnerability model represented the multivariate Gamma function of the explanatory, capture point, seasonal data. A pdf, or density of a continuous random variable, is a function that describes the relative likelihood for this random variable to occur at a given point [23].
The marginal and conditional distributions from the inverse Wishart-distributed time-series matrix were then quantified employing $A \sim W^{-1}(\Psi, m)$. We partitioned the matrices A and Ψ conformably with each other using $A = \begin{pmatrix}A_{11} & A_{12}\\ A_{21} & A_{22}\end{pmatrix}$ and $\Psi = \begin{pmatrix}\Psi_{11} & \Psi_{12}\\ \Psi_{21} & \Psi_{22}\end{pmatrix}$, where $A_{ij}$ and $\Psi_{ij}$ were $p_{i} \times p_{j}$ matrices. We then determined if: (i) $A_{11}$ was independent of $A_{11}^{-1}A_{12}$ and $A_{22\cdot1}$, where $A_{22\cdot1} = A_{22} - A_{21}A_{11}^{-1}A_{12}$ was the Schur complement (i.e., a submatrix within a larger matrix) of $A_{11}$ in A; (ii) $A_{11} \sim W^{-1}(\Psi_{11}, m - p_{2})$; (iii) $A_{11}^{-1}A_{12} \mid A_{22\cdot1} \sim MN_{p_{1}\times p_{2}}\big(\Psi_{11}^{-1}\Psi_{12},\ A_{22\cdot1}\otimes\Psi_{11}^{-1}\big)$, where $MN_{p\times q}(\cdot,\cdot)$ was a matrix normal distribution generated from the dataset of regressed, spatiotemporal, larval habitat, eco-georeferenced covariate coefficients; and (iv) $A_{22\cdot1} \sim W^{-1}(\Psi_{22\cdot1}, m)$.

We then employed the conjugate distribution to make inference about the geosampled, county abatement, larval habitat covariance matrix Σ, whose prior had a $W^{-1}(\Psi, m)$ distribution. If the capture point observations (i.e., potential, seasonal, superbreeder, time series) $X = x_{1}, \ldots, x_{n}$ are independent p-variate Gaussian variables drawn from a multivariate Gaussian distribution, then the conditional distribution has a $W^{-1}(A + \Psi, n + m)$ distribution, where $A = XX^{T}$ is n times the sample covariance matrix. Because the prior and posterior distributions are of the same family, the inverse Wishart distribution was conjugate to the multivariate Gaussian generated from the explanatory, Ae. aegypti, time-series, predictive, epi-entomological, county abatement, geosampled, vulnerability-related estimators. This task was simplified by assuming that the capture point, Ae. aegypti, spatiotemporal risk analyses had covariance matrices with common eigenvectors. If covariances differ among sources, the inverse Wishart draws often produce invalid results, especially for sources that only possess small components of the mixture [29]. As such, we developed a covariance matrix S, which was decomposed into a vector of standard deviations V and a correlation matrix R in PROC MCMC using S = diag(V)·R·diag(V), where diag(V) was a matrix with diagonal elements V and zeros elsewhere. This decomposition permitted the independent sampling of V and R.
We then simulated the standard deviations, V, from an inverse Gamma distribution: $p\big(V^{2}_{k,l} \mid X, Z^{(m-1)}\big) \sim \mathrm{IG}\big(\alpha + \tfrac{n_{k}}{2},\ \beta + \tfrac{s^{2}_{k,l}\,n_{k}}{2}\big)$, where $n_{k}$ was the number of individual, potential, seasonal, superbreeder, eco-georeferenceable, geospatial capture points assigned to cluster k by $Z^{(m-1)}$ (i.e., the Ae. aegypti larval habitat sample size). In so doing, $s^{2}_{k,l}$ was the sample variance of element l in cluster k as assigned by $Z^{(m-1)}$, and α and β were constants, both set to the non-informative prior value of 1. Subsequently, we simulated the elements of the correlation matrices, R, from a hyperbolic-tangent transformed distribution employing $p\big(\tanh(R_{k,i,j}) \mid X, Z^{(m-1)}\big) \sim N\big(\tanh(\hat{R}_{k,i,j}),\ \tfrac{1}{n_{k}}\big)$, where $\hat{R}_{k,i,j}$ was the current estimate of the larval habitat, geosampled, Ae. aegypti, capture point, eco-georeferenced, county abatement correlation coefficient between the i-th and j-th elements (i ≠ j) in the regressed, time-series cluster k given $Z^{(m-1)}$, and $n_{k}$ was the same cluster sample size as above. After the geosampled values for both V and R were optimally quantitated, we reassembled the covariance matrix $S_{k}$ for each statistically significant, forecasted, potential, seasonal, superbreeder, Ae. aegypti, larval habitat, eco-geographic covariate cluster correlation coefficient, thus completing Step 1.
Step 2 required drawing simulated capture point values for the elemental means, P. We noted that the Ae. aegypti, larval habitat, explanatory, time-series data X had an approximate multivariate normal distribution; as such, Step 2 was performed by employing the vector of mean concentrations for cluster k. The multivariate normal distribution generated from the larval habitat covariates was given by the sample means calculated from X, generated from the cluster assignments $Z^{(m-1)}$ and the covariance $S_{k}$ from Step 1. If cluster k was empty at m − 1, no individual habitat parameters were assigned to k by $Z^{(m-1)}$, and the grand mean and covariance matrix of X were employed as the sample mean and covariance matrix of k.

Step 3 involved drawing Ae. aegypti larval habitat cluster assignments employing each individual, geosampled, predictive, potential, seasonal, superbreeder, larval habitat parameter. It was necessary to calculate Pr($z_{i}$ = k) for every i, k combination ($z_{i}$ was the i-th element of Z). Again, we assumed multivariate normalized distributions from the outcome of the regressed dataset of iteratively interpolative, potential, seasonal, superbreeder, capture point, eco-georeferenced, county abatement, geosampled covariate coefficients, where $Z^{(m)}$ was simulated from $\Pr\big(z_{i,(m)} = k \mid X, P^{(m)}, S^{(m)}, \Phi^{(m-1)}\big) = \frac{\phi_{k}\,f\big(X_{i}\mid P_{k}^{(m)}, S_{k}^{(m)}, z_{i}=k\big)}{\sum_{k'=1}^{K}\phi_{k'}\,f\big(X_{i}\mid P_{k'}^{(m)}, S_{k'}^{(m)}, z_{i}=k'\big)}$, where the f(·) terms on the right-hand side were multivariate normal likelihoods. Thus, the likelihood that the i-th element of X was present in the county abatement, larval habitat population k was normalized by the sum of likelihoods for all potential sources influencing the presence of immature Ae. aegypti at the epi-entomological, intervention, county abatement study site. Finally, the sample $\Phi^{(m)}$ from $f(\Phi \mid X, P^{(m)}, S^{(m)}, Z^{(m)})$ was employed to generate a dataset of asymptotically efficient estimates in the seasonal, multivariate, larval habitat, interpolative, county abatement, forecast, vulnerability model.

3 Results

We conducted a count data analysis employing a Poisson regression in PROC COUNTREG. We assumed that $y_{i}$, given the vector of geosampled, potential, seasonal, superbreeder, Ae. aegypti, larval habitat, capture point covariates $x_{i}$, was independently Poisson distributed with $P(Y_{i} = y_{i} \mid x_{i}) = \frac{e^{-\mu_{i}}\mu_{i}^{y_{i}}}{y_{i}!}$, $y_{i}$ = 0, 1, 2, . . ., and the mean parameter (that is, the mean number of county abatement, habitat sampling events) was given by $\mu_{i} = \exp(x_{i}'\beta)$, where β was a (k + 1) × 1 parameter vector. The intercept was $\beta_{0}$, and the coefficients for the k breeding site regressors were $\beta_{1}, \ldots, \beta_{k}$. Taking the exponential of $x_{i}'\beta$ ensured that the mean parameter $\mu_{i}$ was nonnegative in the vector arthropod model. The conditional mean was given by $E(y_{i} \mid x_{i}) = \mu_{i} = \exp(x_{i}'\beta)$.
In the capture point, Poisson regression, Ae. aegypti model, the logarithm of the conditional mean was linear in the parameters: $\ln[E(y_{i}\mid x_{i})] = \ln(\mu_{i}) = x_{i}'\beta$. Note that the conditional variance of the count random variable was equal to the conditional mean in the Poisson regression model: $V(y_{i}\mid x_{i}) = E(y_{i}\mid x_{i}) = \mu_{i}$. The equality of the conditional mean and variance of $y_{i}$ is known as equidispersion. The marginal effect of a geosampled, potential, seasonal, superbreeder, Ae. aegypti, larval habitat, explanatory, capture point regressor was given by $\frac{\partial E(y_{i}\mid x_{i})}{\partial x_{ji}} = \exp(x_{i}'\beta)\,\beta_{j} = E(y_{i}\mid x_{i})\,\beta_{j}$. Thus, a one-unit change in the j-th regressor led to a proportional change of $\beta_{j}$ in the conditional mean $E(y_{i}\mid x_{i})$ in the forecast vulnerability model output. The standard estimator for the Poisson model is the maximum likelihood estimator (MLE). Since the vector arthropod, county abatement, seasonal, breeding site observations were
independent, the log-likelihood function was written as $L = \sum_{i=1}^{N}\big(-\mu_{i} + y_{i}\ln\mu_{i} - \ln y_{i}!\big) = \sum_{i=1}^{N}\big(-e^{x_{i}'\beta} + y_{i}x_{i}'\beta - \ln y_{i}!\big)$. The gradient and the Hessian were, respectively, $\frac{\partial L}{\partial\beta} = \sum_{i=1}^{N}(y_{i}-\mu_{i})\,x_{i} = \sum_{i=1}^{N}\big(y_{i} - e^{x_{i}'\beta}\big)x_{i}$ and $\frac{\partial^{2}L}{\partial\beta\,\partial\beta'} = -\sum_{i=1}^{N}\mu_{i}x_{i}x_{i}' = -\sum_{i=1}^{N}e^{x_{i}'\beta}x_{i}x_{i}'$.
The Ae. aegypti, capture point model was generalized by introducing an unobserved heterogeneity term for observation i. Thus, the geosampled capture point prognosticators were assumed to differ randomly in a manner that was not fully accounted for by the observed covariates. This was formulated as $E(y_{i}\mid x_{i}, \tau_{i}) = \mu_{i}\tau_{i} = e^{x_{i}'\beta + \varepsilon_{i}}$, where the unobserved heterogeneity term $\tau_{i} = e^{\varepsilon_{i}}$ was independent of the vector of capture point regressors $x_{i}$. In so doing, the distribution of $y_{i}$ conditional on $x_{i}$ and $\tau_{i}$ was Poissonian with conditional mean and conditional variance $\mu_{i}\tau_{i}$: $f(y_{i}\mid x_{i}, \tau_{i}) = \frac{\exp(-\mu_{i}\tau_{i})(\mu_{i}\tau_{i})^{y_{i}}}{y_{i}!}$. Let $g(\tau_{i})$ be the pdf of $\tau_{i}$. Then the distribution $f(y_{i}\mid x_{i})$ (no longer conditional on $\tau_{i}$) is obtained by integrating $f(y_{i}\mid x_{i}, \tau_{i})$ with respect to $\tau_{i}$: $f(y_{i}\mid x_{i}) = \int_{0}^{\infty} f(y_{i}\mid x_{i}, \tau_{i})\,g(\tau_{i})\,d\tau_{i}$ [28].
The Poisson distribution revealed that the capture point covariate coefficients reached a maximum when $\frac{dP_{\nu}(n)}{dn} = \frac{e^{-\nu}\nu^{n}\,(\gamma - H_{n} + \ln\nu)}{n!} = 0$, where γ was the Euler-Mascheroni constant and $H_{n}$ was a harmonic number, leading to the transcendental equation $\gamma - H_{n} + \ln\nu = 0$. The Euler–Mascheroni constant (also called Euler's constant) is a mathematical constant recurring in analysis and number theory, usually denoted by the lowercase Greek letter gamma (γ), which may be defined as the limiting difference between the harmonic series and the natural logarithm [1].
The capture point, log-linear, Ae. aegypti, regression, forecast, vulnerability model also revealed that the Euler-Mascheroni constant arose in the integrals as

$$\gamma = -\int_{0}^{\infty} e^{-x}\ln x\,dx = -\int_{0}^{1}\ln\ln\frac{1}{x}\,dx = \int_{0}^{\infty}\Big(\frac{1}{1-e^{-x}} - \frac{1}{x}\Big)e^{-x}\,dx = \int_{0}^{\infty}\frac{1}{x}\Big(\frac{1}{1+x} - e^{-x}\Big)\,dx \qquad (12.1)$$

Commonly, integrals that render γ in combination with temporal, geosampled constants include $\int_{0}^{\infty} e^{-x^{2}}\ln x\,dx = -\tfrac{1}{4}\sqrt{\pi}\,(\gamma + 2\ln 2)$ and $\int_{0}^{\infty} e^{-x}(\ln x)^{2}\,dx = \gamma^{2} + \tfrac{1}{6}\pi^{2}$ [5]. Thereafter, the double integrals in the model included $\gamma = \int_{0}^{1}\int_{0}^{1}\frac{x-1}{(1-xy)\ln(xy)}\,dx\,dy$.
An interesting analog of Eq. (12.1) in the PROC REG, regression-based model was then precisely calculated as $\ln\frac{4}{\pi} = \sum_{n=1}^{\infty}(-1)^{n-1}\Big[\frac{1}{n} - \ln\frac{n+1}{n}\Big] = \int_{0}^{1}\int_{0}^{1}\frac{x-1}{(1+xy)\ln(xy)}\,dx\,dy = 0.241564\ldots$, an alternating analog of γ. This solution was also provided by incorporating Mertens' theorem [i.e., $e^{\gamma} = \lim_{n\to\infty}\frac{1}{\ln p_{n}}\prod_{i=1}^{n}\frac{1}{1-\frac{1}{p_{i}}}$], where the product was aggregated over the county abatement, geosampled, temporal values found in the epi-entomological, capture point, larval habitat, empirical datasets. Mertens' third theorem, $\lim_{n\to\infty}\ln n\prod_{p\le n}\Big(1-\frac{1}{p}\Big) = e^{-\gamma}$, is related to the density of prime numbers, where γ is the Euler–Mascheroni constant [22].
" in the(Ae. aegypti
By taking the logarithm of both sides
)
model,
# an explicit formula
P
for γ was derived employingγ ¼ lim ln 1–1 1 – ln ln x . This expression was
x!1 p≤x p

also rendered by quantitating the capture point time series, Ae. aegypti habitat data
employing Euler, and equation (12.1) by replacing ln nb ln(n + 1), in the equation
P1 [ ( )]
γ¼ k – ln 1 þ k
1 1
which then generated
k¼1 ( )
lim ½ln ðn þ 1Þ – ln n] ¼ lim ln 1 þ 1n ¼ 0. We then substituted the telescoping
n!1 n!1
Pn ( ) ( )
sum ln 1 þ k for ln(n + 1) which then generated ln 1 þ 1k ¼ ln ðk þ 1Þ – ln k.
1
k¼1
A telescoping sum is one in which subsequent terms cancel each other, leaving only
nP
–1
initial and final terms (e.g., ðai – aiþ1 Þ ¼ (a1 – a2) + (a2 – a3) + . . . + (an – 2 –
S¼i¼1
an + (an – 1 – a]
–[ 1) n) ¼ (a1 – an)) [1]. Thereafter, our product was
P
n P
n ( ) n [
P ( )]
lim 1
k – ln 1 þ 1
k ¼ lim k – ln 1 þ k .
1 1
n!1 k ¼1 k ¼1 n!1 k¼1
γ
Additionally, other series in the Ae. aegypti model included the equation (◇), where $\gamma = \sum_{n=2}^{\infty}(-1)^{n}\frac{\zeta(n)}{n} = \ln\frac{4}{\pi} + \sum_{n=1}^{\infty}(-1)^{n-1}\frac{\zeta(n+1)}{2^{n}(n+1)}$, together with $\gamma = \sum_{n=1}^{\infty}(-1)^{n}\frac{\lfloor\lg n\rfloor}{n}$, where ζ(·) was the Riemann zeta function. The Riemann zeta function ζ(s) is a function of a complex variable s that analytically continues the sum of the infinite series $\sum_{n=1}^{\infty}\frac{1}{n^{s}}$, which converges when the real part of s is greater than 1; here lg is the logarithm to base 2 and ⌊x⌋ is the floor function [28]. Nielsen earlier provided a series equivalent, $\gamma = 1 - \sum_{n=1}^{\infty}\sum_{k=2^{n-1}}^{2^{n}-1}\frac{n}{(2k+1)(2k+2)}$; thereafter, $\frac{1}{(2k+1)(2k+2)} = \frac{1}{2k+1} - \frac{1}{2k+2}$, which was then added to $0 = -\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \frac{1}{16} + \ldots$ to render Vacca's formula. We employed the related double-series representations of γ, replacing the undefined terms with k − j, and then rewrote the equation in PROC REG as a double series in order to apply Euler's series transformation to the empirical dataset of geosampled, time-series dependent, explanatory, covariate coefficient estimates.
In this research, $\binom{n}{k}$ was employed as a binomial coefficient, which was rearranged to achieve the conditionally convergent series in the county abatement, Ae. aegypti, forecast, vulnerability model. The plus and minus terms were first grouped in pairs of the empirical, geosampled, capture point covariates employing the resulting series based on the actual, observational, discrete integer, experimental, indicator values. The double series was equivalent to Catalan's integral, $\gamma = \int_{0}^{1}\frac{1}{1+x}\sum_{n=1}^{\infty}x^{2^{n}-1}\,dx$, in the model. Catalan's integrals are a special case of general formulas due to the
equation $J_{0}\big(\sqrt{z^{2}-y^{2}}\big) = \frac{1}{\pi}\int_{0}^{\pi} e^{y\cos\theta}\cos(z\sin\theta)\,d\theta$, where $J_{0}(z)$ is a Bessel function of the first kind [28]. The Bessel function is a function $Z_{n}(x)$ expressible in a robust regression model by employing the recurrence relations $Z_{n+1} + Z_{n-1} = \frac{2n}{x}Z_{n}$ and $Z_{n+1} - Z_{n-1} = -2\frac{dZ_{n}}{dx}$ [2], and more recently it has been defined through solutions in linear models using the differential equation $x^{2}\frac{d^{2}y}{dx^{2}} + x\frac{dy}{dx} + (x^{2}-n^{2})y = 0$ [6].
In the Ae. aegypti, capture point model, the Bessel function $J_{n}(z)$ was defined by the contour integral $J_{n}(z) = \frac{1}{2\pi i}\oint e^{(z/2)(t-1/t)}\,t^{-n-1}\,dt$, where the contour enclosed the origin and was traversed in a counter-clockwise direction. This function generated $J_{0}(2i\sqrt{z}) = \frac{1}{\pi}\int_{0}^{\pi} e^{(1+z)\cos\theta}\cos[(1-z)\sin\theta]\,d\theta$, with $z \equiv 1 - z_{0}$ and $y \equiv 1 + z_{0}$. In mathematics, Bessel functions are canonical solutions y(x) of Bessel's differential equation $x^{2}\frac{d^{2}y}{dx^{2}} + x\frac{dy}{dx} + (x^{2}-\alpha^{2})y = 0$ for an arbitrary real or complex number α (i.e., the order of the Bessel function); the most common and important cases are for α an integer or half-integer. Thereafter, to quantify the equivalence in the regression-based, capture point, vector arthropod, parameter estimators, we expanded 1/(1 + x) in a geometric series and multiplied the geosampled, larval habitat, potential, seasonal, superbreeder, data feature attributes by $x^{2^{n}-1}$.
We then integrated termwise as in Sondow and Zudilin. In so doing, other series for γ were included [e.g., $\gamma = \tfrac{3}{2} - \ln 2 - \sum_{m=2}^{\infty}(-1)^{m}\frac{m-1}{m}\big[\zeta(m) - 1\big]$ and $\gamma = \frac{2^{n}}{e^{2^{n}}}\sum_{m=0}^{\infty}\frac{2^{mn}}{(m+1)!}\sum_{t=0}^{m}\frac{1}{t+1} - n\ln 2 + O\Big(\frac{1}{2^{n}e^{2^{n}}}\Big)$].
m ¼0 t ¼0
A
rapid
[ ly converging limit for] γ was [ subsequently provided ] by
Pn ( ) Pn ( )
2n–1 ζð1–k Þ 2n–1 Bk
γ ¼ lim 2n – ln n þ k – nk
1
¼ lim 2n – ln n þ k 1 þ nk
1
andγ
n!1 n!1
[ k¼2
n ( ) ] [ k¼2
n (
]
P ζð1–kÞ P Bk
)
¼ lim 2n2n–1 – ln n þ k – nk
1
¼ lim 2n–1
2n – ln n þ k 1 þ nk
1
where
n!1 k¼2 n!1 k ¼2
Bk was a Bernoulli [number. Another limit ] formula was then provided by the
Γð1nÞΓðnþ1Þn1þ1=n
– nnþ1 . In mathematics, the Bernoulli numbers
2
equation γ ¼ – lim Γð2þnþ1nÞ
n!1
Bn are a sequence of rational numbers with deep connections to number theory,
whereby, values of the first few Bernoulli numbers are B0 ¼ 1, B1 ¼ 1/2, B2 ¼ 1/
6, B3 ¼ 0, B4 ¼ –1/30, B5 ¼ 0, B6 ¼ 1/42, B7 ¼ 0, B8 ¼ –1/30 [1].
In the Ae. aegypti model, the Euler–Maclaurin formula provided expressions for optimally quantitating the difference between the sum and the integral in terms of the higher derivatives $f^{(k)}$ at the end points of the interval m and n in the county abatement, forecast, vulnerability, Ae. aegypti model. The Euler-Mascheroni constant was rendered by the expressions $\gamma = -\Gamma'(1) = -\psi_{0}(1)$, where $\psi_{0}(x)$ was the digamma function, $\gamma = \lim_{s\to 1}\big[\zeta(s) - \frac{1}{s-1}\big]$, the asymmetric limit form $\gamma = \lim_{s\to 1^{+}}\sum_{n=1}^{\infty}\big(\frac{1}{n^{s}} - \frac{1}{s^{n}}\big)$, and $\gamma = \lim_{x\to\infty}\big[x - \Gamma\big(\frac{1}{x}\big)\big]$. In mathematics, the digamma function is defined as the logarithmic derivative of the gamma function, $\psi(x) = \frac{d}{dx}\ln\Gamma(x) = \frac{\Gamma'(x)}{\Gamma(x)}$, and it is the first of the polygamma functions. In the county abatement, Ae. aegypti, regression, forecast, vulnerability, capture point model, the digamma function $\psi_{0}(x)$ was related to the harmonic numbers in that $\psi(n) = H_{n-1} - \gamma$, where $H_{n}$ was the nth harmonic number and γ was the Euler-Mascheroni constant.
Unfortunately, extra-Poisson variation was detected in the variance estimates in the capture point, Ae. aegypti model. A modification of the iterated, re-weighted, least squares scheme and/or a negative binomial, non-homogeneous, regression-based framework conveniently accommodates extra-Poisson variation when constructing seasonal, log-linear models employing frequencies or prevalence rates as dependent response variables [28]. As such, we constructed a robust negative binomial regression model in SAS with non-homogeneous means and a gamma distribution by incorporating $\alpha = \frac{1}{\theta}$ (α > 0) in Eq. (12.1).
We let $g(\tau_{i})$ be the pdf of $\tau_{i}$ in the model. Then the distribution $f(y_{i}\mid x_{i})$ was no longer conditional on $\tau_{i}$; instead, it was obtained by integrating $f(y_{i}\mid x_{i}, \tau_{i})$ with respect to $\tau_{i}$: $f(y_{i}\mid x_{i}) = \int_{0}^{\infty} f(y_{i}\mid x_{i}, \tau_{i})\,g(\tau_{i})\,d\tau_{i}$. The distribution in the capture point, Ae. aegypti regression model was then $f(y_{i}\mid x_{i}) = \frac{\Gamma(y_{i}+\alpha^{-1})}{y_{i}!\,\Gamma(\alpha^{-1})}\left(\frac{\alpha^{-1}}{\alpha^{-1}+\mu_{i}}\right)^{\alpha^{-1}}\left(\frac{\mu_{i}}{\alpha^{-1}+\mu_{i}}\right)^{y_{i}}$, $y_{i}$ = 0, 1, 2, . . . The negative binomial distribution was thus derived as a gamma mixture of Poisson random variables. The conditional mean in the Ae. aegypti model was then $E(y_{i}\mid x_{i}) = \mu_{i} = e^{x_{i}'\beta}$, and the variance in the residual estimates was $V(y_{i}\mid x_{i}) = \mu_{i}\big[1 + \tfrac{1}{\theta}\mu_{i}\big] = \mu_{i}[1+\alpha\mu_{i}] > E(y_{i}\mid x_{i})$.
To further estimate the district-level models, we specified DIST=NEGBIN(P=1) in the MODEL statement. The negative binomial model NEGBIN1 was set up with p = 1, which revealed that the variance function $V(y_{i}\mid x_{i}) = \mu_{i} + \alpha\mu_{i}$ was linear in the mean of the vector arthropod model. The log-likelihood function of the NEGBIN1 model was then provided by $L = \sum_{i=1}^{N}\Big\{\sum_{j=0}^{y_{i}-1}\ln\big(j+\alpha^{-1}\exp(x_{i}'\beta)\big) - \ln(y_{i}!) - \big(y_{i}+\alpha^{-1}\exp(x_{i}'\beta)\big)\ln(1+\alpha) + y_{i}\ln(\alpha)\Big\}$. The gradients for our model were $\frac{\partial L}{\partial\beta} = \sum_{i=1}^{N}\Big(\sum_{j=0}^{y_{i}-1}\frac{\mu_{i}}{j\alpha+\mu_{i}} - \alpha^{-1}\ln(1+\alpha)\,\mu_{i}\Big)x_{i}$ and $\frac{\partial L}{\partial\alpha} = \sum_{i=1}^{N}\Big(-\sum_{j=0}^{y_{i}-1}\frac{\alpha^{-1}\mu_{i}}{j\alpha+\mu_{i}} + \alpha^{-2}\mu_{i}\ln(1+\alpha) - \frac{y_{i}+\alpha^{-1}\mu_{i}}{1+\alpha} + \frac{y_{i}}{\alpha}\Big)$.
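A hedged sketch of the two negative binomial variance specifications follows; in SAS, the DIST=NEGBIN(P=1) and DIST=NEGBIN(P=2) options are available on the MODEL statement of PROC COUNTREG, and the dataset and regressor names here are hypothetical placeholders.

/* Hedged sketch: NEGBIN1 (linear variance) versus NEGBIN2 (quadratic variance).
   Dataset and covariate names are hypothetical.                                  */
proc countreg data=habitats;
   model immature_count = temp humidity / dist=negbin(p=1);  /* V = mu + alpha*mu   */
run;

proc countreg data=habitats;
   model immature_count = temp humidity / dist=negbin(p=2);  /* V = mu + alpha*mu^2 */
run;

In such output, the t statistic reported for the dispersion parameter provides the Wald-type test of α = 0 described below.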
We referenced the negative binomial regression model with the variance function $V(y_{i}\mid x_{i}) = \mu_{i} + \alpha\mu_{i}^{2}$ as the NEGBIN2 model. To estimate this capture point model, we specified DIST=NEGBIN(P=2) in the MODEL statement. A test of the Poisson distribution was then performed by examining the hypothesis that $\alpha = \frac{1}{\theta} = 0$. A Wald test of this hypothesis was also provided, which reported t statistics for the geosampled, Ae. aegypti, larval habitat, capture point estimates in the model. Under the Wald statistical test, the ML estimate $\hat{\theta}$ of the parameter(s) of interest θ was compared with the proposed geosampled value $\theta_{0}$, under the assumption that the difference between the two was approximately normally distributed [28]. The log-likelihood function of the regression model (i.e., NEGBIN2) was then generated by $L = \sum_{i=1}^{N}\Big\{\sum_{j=0}^{y_{i}-1}\ln(j+\alpha^{-1}) - \ln(y_{i}!) - (y_{i}+\alpha^{-1})\ln\big(1+\alpha\exp(x_{i}'\beta)\big) + y_{i}\ln(\alpha) + y_{i}x_{i}'\beta\Big\}$, whose gradient was $\frac{\partial L}{\partial\beta} = \sum_{i=1}^{N}\frac{y_{i}-\mu_{i}}{1+\alpha\mu_{i}}x_{i}$. The variance in the forecast, vulnerability, epi-entomological model was then assessed by $\frac{\partial L}{\partial\alpha} = \sum_{i=1}^{N}\Big\{-\alpha^{-2}\sum_{j=0}^{y_{i}-1}\frac{1}{j+\alpha^{-1}} + \alpha^{-2}\ln(1+\alpha\mu_{i}) + \frac{y_{i}-\mu_{i}}{\alpha(1+\alpha\mu_{i})}\Big\}$. The final mean in the model was calculated as $\frac{pr}{1-p}$; the mode as $\big\lfloor\frac{p(r-1)}{1-p}\big\rfloor$ if r > 1 and 0 if r ≤ 1; the variance as $\frac{pr}{(1-p)^{2}}$; the skewness as $\frac{1+p}{\sqrt{pr}}$; the kurtosis as $\frac{6}{r} + \frac{(1-p)^{2}}{pr}$; the moment generating function as $\big(\frac{1-p}{1-pe^{t}}\big)^{r}$ for $t < -\log p$; the characteristic function as $\big(\frac{1-p}{1-pe^{it}}\big)^{r}$ with $t \in \mathbb{R}$; and the probability generating function as $\big(\frac{1-p}{1-pz}\big)^{r}$ for $|z| < \frac{1}{p}$.
An analytical solution to the integral exists when $\tau_{i}$ is assumed to follow a gamma distribution in the negative binomial distribution. When the model contains a constant term, it is necessary to assume that $E(e^{\varepsilon_{i}}) = E(\tau_{i}) = 1$ in order to identify the mean of the distribution [28]. Thus, it was assumed that $\tau_{i}$ followed a gamma(θ, θ) distribution in the larval habitat, forecast, vulnerability, capture point, Ae. aegypti model, with $E(\tau_{i}) = 1$ and $V(\tau_{i}) = 1/\theta$: $g(\tau_{i}) = \frac{\theta^{\theta}}{\Gamma(\theta)}\tau_{i}^{\theta-1}\exp(-\theta\tau_{i})$, where $\Gamma(x) = \int_{0}^{\infty} z^{x-1}\exp(-z)\,dz$ is the gamma function and θ is a positive parameter. Then, the density of $y_{i}$ given $x_{i}$ was derivable as $f(y_{i}\mid x_{i}) = \int_{0}^{\infty} f(y_{i}\mid x_{i}, \tau_{i})\,g(\tau_{i})\,d\tau_{i} = \frac{\theta^{\theta}\mu_{i}^{y_{i}}}{y_{i}!\,\Gamma(\theta)}\int_{0}^{\infty} e^{-(\mu_{i}+\theta)\tau_{i}}\tau_{i}^{\theta+y_{i}-1}\,d\tau_{i} = \frac{\theta^{\theta}\mu_{i}^{y_{i}}\,\Gamma(y_{i}+\theta)}{y_{i}!\,\Gamma(\theta)\,(\theta+\mu_{i})^{\theta+y_{i}}} = \frac{\Gamma(y_{i}+\theta)}{y_{i}!\,\Gamma(\theta)}\left(\frac{\theta}{\theta+\mu_{i}}\right)^{\theta}\left(\frac{\mu_{i}}{\theta+\mu_{i}}\right)^{y_{i}}$. Making the substitution $\alpha = \frac{1}{\theta}$ (α > 0), the negative binomial distribution was then rewritten as $f(y_{i}\mid x_{i}) = \frac{\Gamma(y_{i}+\alpha^{-1})}{y_{i}!\,\Gamma(\alpha^{-1})}\left(\frac{\alpha^{-1}}{\alpha^{-1}+\mu_{i}}\right)^{\alpha^{-1}}\left(\frac{\mu_{i}}{\alpha^{-1}+\mu_{i}}\right)^{y_{i}}$, $y_{i}$ = 0, 1, 2, . . ., in PROC REG. Thus, the negative binomial distribution was derived as a gamma mixture of potential, seasonal, superbreeder, Poissonized, Ae. aegypti, larval habitat random variables. The model had a conditional mean of $E(y_{i}\mid x_{i}) = \mu_{i} = e^{x_{i}'\beta}$ and a conditional variance of $V(y_{i}\mid x_{i}) = \mu_{i}\big[1 + \tfrac{1}{\theta}\mu_{i}\big] = \mu_{i}[1+\alpha\mu_{i}] > E(y_{i}\mid x_{i})$.
A negative binomial [NEGBIN2] model with variance function $V(y_{i}\mid x_{i}) = \mu_{i} + \alpha\mu_{i}^{2}$, which was quadratic in the mean, was then constructed. To estimate this model, we specified DIST=NEGBIN(P=2) in the MODEL statement. The Poisson distribution is a special case of the negative binomial distribution where α = 0 [28]. A test of the Poisson distribution was carried out by testing the hypothesis that $\alpha = \frac{1}{\theta} = 0$. A Wald test of this hypothesis was provided (it was the reported t statistic for the estimated α in the forecast, vulnerability, negative binomial, county abatement, Ae. aegypti capture point model). The log-likelihood function of the negative binomial regression model (NEGBIN2) was given by $L = \sum_{i=1}^{N}\Big\{\sum_{j=0}^{y_{i}-1}\ln(j+\alpha^{-1}) - \ln(y_{i}!) - (y_{i}+\alpha^{-1})\ln\big(1+\alpha\exp(x_{i}'\beta)\big) + y_{i}\ln(\alpha) + y_{i}x_{i}'\beta\Big\}$, where use of $\Gamma(y+a)/\Gamma(a) = \prod_{j=0}^{y-1}(j+a)$ permitted optimal quantitation of the geosampled, potential, seasonal, capture point, epi-entomological covariates. The gradients in the entomological, forecast, vulnerability model were $\frac{\partial L}{\partial\beta} = \sum_{i=1}^{N}\frac{y_{i}-\mu_{i}}{1+\alpha\mu_{i}}x_{i}$ and $\frac{\partial L}{\partial\alpha} = \sum_{i=1}^{N}\Big\{-\alpha^{-2}\sum_{j=0}^{y_{i}-1}\frac{1}{j+\alpha^{-1}} + \alpha^{-2}\ln(1+\alpha\mu_{i}) + \frac{y_{i}-\mu_{i}}{\alpha(1+\alpha\mu_{i})}\Big\}$.
Cameron and Trivedi [11] consider a general class of negative binomial models with mean $\mu_{i}$ and variance function $\mu_{i} + \alpha\mu_{i}^{p}$. The NEGBIN2 model, with p = 2, is the standard formulation of the negative binomial model [19]. Models with other values of p, $-\infty < p < \infty$, have the same density $f(y_{i}\mid x_{i})$ except that $\alpha^{-1}$ is replaced everywhere by $\alpha^{-1}\mu^{2-p}$. The negative binomial model NEGBIN1, which sets p = 1, has variance function $V(y_{i}\mid x_{i}) = \mu_{i} + \alpha\mu_{i}$, which is linear in the mean. To estimate this model, we specified DIST=NEGBIN(P=1) in the MODEL statement. The log-likelihood function of the NEGBIN1 regression model was given by $L = \sum_{i=1}^{N}\Big\{\sum_{j=0}^{y_{i}-1}\ln\big(j+\alpha^{-1}\exp(x_{i}'\beta)\big) - \ln(y_{i}!) - \big(y_{i}+\alpha^{-1}\exp(x_{i}'\beta)\big)\ln(1+\alpha) + y_{i}\ln(\alpha)\Big\}$. The gradients were $\frac{\partial L}{\partial\beta} = \sum_{i=1}^{N}\Big(\sum_{j=0}^{y_{i}-1}\frac{\mu_{i}}{j\alpha+\mu_{i}} - \alpha^{-1}\ln(1+\alpha)\,\mu_{i}\Big)x_{i}$ and $\frac{\partial L}{\partial\alpha} = \sum_{i=1}^{N}\Big(-\sum_{j=0}^{y_{i}-1}\frac{\alpha^{-1}\mu_{i}}{j\alpha+\mu_{i}} + \alpha^{-2}\mu_{i}\ln(1+\alpha) - \frac{y_{i}+\alpha^{-1}\mu_{i}}{1+\alpha} + \frac{y_{i}}{\alpha}\Big)$.
In the Bayes formulation, the specification of the time series, explanatory, larval habitat model estimators was completed by assigning priors to all unknown, potential, seasonal, superbreeder, capture point parameters. We employed the dataset of spatiotemporal, geosampled, explanatory, endemic, transmission-oriented, Ae. aegypti, breeding site observations, where each $x_{i}$ for i = 1, . . ., n was assumed to be distributed according to some distribution $p(x_{i}\mid\theta)$. Here, θ was a parameter that was unknown and had to be inferred from the epi-entomological, time series data. Our Bayesian procedure began by assuming that θ was distributed according to some prior distribution p(θ | α), where the parameter α was a hyperparameter. The joint probability of the geosampled, larval habitat, endemic, transmission-oriented, predictive, risk-related data was then determined by employing $p(X\mid\theta) = p(x_{1},\ldots,x_{n}\mid\theta) = \prod_{i=1}^{n}p(x_{i}\mid\theta)$, whereby p(X | θ, α) = p(X | θ) and $p(x_{i}\mid\theta, \alpha) = p(x_{i}\mid\theta)$, which were conditionally independent of the hyperparameters. Bayesian inference then determined the posterior distribution of the parameter, p(θ | X, α), for simulating the potential, superbreeder, county abatement foci covariates.
We generated the inverse-Gamma distribution, which in this research was a univariate specialization of the inverse-Wishart distribution. In probability theory and statistics, the inverse gamma distribution (i.e., the reciprocal of a gamma distribution) is a two-parameter family of continuous probability distributions on the positive real line; it is the distribution of the reciprocal of a variable distributed according to the gamma distribution. In Bayesian statistics, the inverse gamma distribution occurs as the marginal posterior distribution for the unknown variance of a normal distribution if an uninformative prior is used, and as an analytically tractable conjugate prior if an informative prior is required. An uninformative prior, or diffuse prior, expresses vague or general information about a variable [23].
We performed a theoretical comparison between maximum likelihood and the presented Bayesian algorithms for quantitating non-informative parameter values for the priors' hyperparameters in the epi-entomological, capture point, Ae. aegypti, larval habitat model. We also provided a numerical comparison using synthetic, time series, capture point data. The introduction of these novel Bayesian estimators opened the possibility of including Gamma distributions in more complex Bayesian structures (e.g., variational Bayesian mixture models) employing the seasonal, eco-georeferenceable, potentially prolific, county abatement larval habitats.
The density function of the Wishart distribution for the eco-georeferenced, empirical, geosampled, Ae. aegypti, larval habitat, capture point estimators was $\mathrm{pdf}(x;\mu,\Sigma) = \frac{1}{C_{n}(\mu)}\,|\Sigma|^{-\mu/2}\,|x|^{(\mu-n-1)/2}\exp\big(-\tfrac{1}{2}\mathrm{tr}(\Sigma^{-1}x)\big)$ with μ > n, where the trace of a square matrix A was given by $\mathrm{tr}(A) = \sum_{i} a_{ii}$, $C_{n}(\mu) = 2^{\mu n/2}\,\Gamma_{n}\big(\tfrac{\mu}{2}\big)$, and $\Gamma_{n}(z) = \pi^{n(n-1)/4}\prod_{i=1}^{n}\Gamma\big(z - \tfrac{i-1}{2}\big)$. If $V \sim IW_{n}(\mu, \Sigma)$, then $V^{-1} \sim W_{n}(\mu - n - 1, \Sigma^{-1})$. The corresponding functions had the syntax y = LOGMPDFWISHART('name'ν, μ, 'name'Σ); and, for the inverted Wishart, y = LOGMPDFIWISHART('name'ν, μ, 'name'Σ);.
For spatiotemporally, regressively quantizing the time-series, county abatement,
epi-entomological, eco-georeferenced, risk model covariates, multiple MCMC
chains were estimated for the intercept, which appeared to converge within the
first 1000 samples. The first 1000 samples were discarded to allow the model to
stabilize (i.e., known as "burn in"), and the next 10,000 samples were employed to
derive optimal capture point, larval habitat, parameter estimates based on statistical
significance.
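A minimal sketch of the burn-in handling described above is given below; the chain values are stand-in random draws rather than actual PROC MCMC output, and serve only to show how the first 1000 samples are discarded and the next 10,000 summarized.

```python
import numpy as np

# Stand-in MCMC draws for one parameter (e.g., the intercept); illustrative only.
chain = np.random.default_rng(2).normal(loc=2.5, scale=0.2, size=11_000)

burn_in = 1_000
posterior_draws = chain[burn_in:burn_in + 10_000]   # discard burn-in, keep 10,000 samples

estimate = posterior_draws.mean()
ci = np.percentile(posterior_draws, [2.5, 97.5])    # equal-tail 95% credible interval
print(estimate, ci)
```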
The pdf for the epi-entomological, forecast, vulnerability model was
$f(x)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}x^{-\alpha-1}\exp\!\left(-\frac{\beta}{x}\right)$, while the mean of the capture point model was $\frac{\beta}{\alpha-1}$ for α > 1.
The variance was $\frac{\beta^{2}}{(\alpha-1)^{2}(\alpha-2)}$ for α > 2. The skewness was $\frac{4\sqrt{\alpha-2}}{\alpha-3}$ for α > 3, while the
kurtosis was $\frac{30\alpha-66}{(\alpha-3)(\alpha-4)}$ for α > 4, and the entropy was $\alpha+\ln(\beta\Gamma(\alpha))-(1+\alpha)\Psi(\alpha)$.
The moment generating function was $\frac{2(-\beta t)^{\alpha/2}}{\Gamma(\alpha)}K_{\alpha}\!\left(\sqrt{-4\beta t}\right)$, while the characteristic
function was $\frac{2(-i\beta t)^{\alpha/2}}{\Gamma(\alpha)}K_{\alpha}\!\left(\sqrt{-4i\beta t}\right)$.
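The closed-form moments quoted above can be verified numerically; the following sketch compares them with the values returned by scipy.stats.invgamma for an assumed α and β (illustrative values only).

```python
import numpy as np
from scipy import stats

alpha, beta = 5.0, 2.0   # assumed shape and scale, purely illustrative

m, v, s, k = stats.invgamma(a=alpha, scale=beta).stats(moments='mvsk')

# Closed-form expressions quoted above (k is excess kurtosis):
mean = beta / (alpha - 1)
var = beta**2 / ((alpha - 1)**2 * (alpha - 2))
skew = 4 * np.sqrt(alpha - 2) / (alpha - 3)
kurt = (30 * alpha - 66) / ((alpha - 3) * (alpha - 4))

print(np.allclose([m, v, s, k], [mean, var, skew, kurt]))   # expected: True
```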
The iterative, simulation, seasonal, Ae. aegypti superbreeder, habitat model
revealed that, with p = 1 (i.e., univariate), α = m/2, β = Ψ/2, and x = B, the
pdf of the inverse-Wishart distribution was $p(x\mid\alpha,\beta)=\frac{\beta^{\alpha}x^{-\alpha-1}\exp(-\beta/x)}{\Gamma(\alpha)}$. The pdf of
the Gamma distribution was $f(x)=\frac{x^{k-1}e^{-x/\theta}}{\theta^{k}\Gamma(k)}$. We defined the transformation
$Y=g(X)=\frac{1}{X}$; employing the resulting transformation we found

$f_{Y}(y)=f_{X}\bigl(g^{-1}(y)\bigr)\left|\frac{d}{dy}g^{-1}(y)\right|
=\frac{1}{\theta^{k}\Gamma(k)}\left(\frac{1}{y}\right)^{k-1}\exp\!\left(\frac{-1}{\theta y}\right)\frac{1}{y^{2}}
=\frac{1}{\theta^{k}\Gamma(k)}\left(\frac{1}{y}\right)^{k+1}\exp\!\left(\frac{-1}{\theta y}\right)
=\frac{1}{\theta^{k}\Gamma(k)}\,y^{-k-1}\exp\!\left(\frac{-1}{\theta y}\right).$

Replacing k with α, θ with 1/β, and y with x resulted in the inverse-Gamma pdf
$f(x)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}x^{-\alpha-1}\exp\!\left(\frac{-\beta}{x}\right)$. The inverse
Gamma distribution's pdf was then defined over the support x > 0 employing the
equation $f(x;\alpha,\beta)=\frac{\beta^{\alpha}}{\Gamma(\alpha)}x^{-\alpha-1}\exp\!\left(-\frac{\beta}{x}\right)$ with shape parameter α and scale
parameter β.
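The reciprocal-transformation argument above can also be checked by simulation: in the sketch below, draws from a Gamma(k = α, θ = 1/β) distribution are inverted and compared against the inverse-Gamma(α, β) distribution with a Kolmogorov-Smirnov test, using assumed parameter values.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
alpha, beta = 3.0, 2.0            # assumed shape and scale, illustrative only

# Draw from Gamma(shape=alpha, scale=1/beta) and invert each draw.
gamma_draws = rng.gamma(shape=alpha, scale=1.0 / beta, size=100_000)
reciprocals = 1.0 / gamma_draws

# The reciprocals should follow InvGamma(alpha, scale=beta).
ks = stats.kstest(reciprocals, stats.invgamma(a=alpha, scale=beta).cdf)
print(ks.statistic, ks.pvalue)    # a large p-value is consistent with the derivation
```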
The regularized Gamma functions were defined in the vector arthropod, capture
point, risk model by $\frac{\gamma(a,z)}{\Gamma(a)}$ and $\frac{\Gamma(a,z)}{\Gamma(a)}$, where γ(a, z) and Γ(a, z) were incomplete
Gamma functions, and Γ(a) was a complete Gamma function.
The complete gamma function Γ(a) was thereafter generalized to the incomplete
gamma function Γ(a, x) such that Γ(a) = Γ(a, 0). This "upper" incomplete gamma
function was given by $\Gamma(a,x)\equiv\int_{x}^{\infty}t^{a-1}e^{-t}\,dt$. For a an integer n,
$\Gamma(n,x)=(n-1)!\,e^{-x}\sum_{k=0}^{n-1}\frac{x^{k}}{k!}=(n-1)!\,e^{-x}e_{n-1}(x)$, where $e_{n}(x)$ was the exponential
sum function. It was implemented as Gamma[a, z] in the Wolfram Language.
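For readers working outside the Wolfram Language, the regularized and upper incomplete gamma functions above are also available in scipy.special; the sketch below evaluates them and confirms the finite-sum identity for an integer argument (the values of a and x are arbitrary).

```python
import math
from scipy.special import gamma, gammainc, gammaincc

a, x = 4, 2.5                      # integer a so the finite-sum identity applies

# Regularized lower and upper incomplete gamma functions:
P = gammainc(a, x)                 # gamma(a, x) / Gamma(a)
Q = gammaincc(a, x)                # Gamma(a, x) / Gamma(a)
print(P + Q)                       # should print 1.0

# Finite-sum identity for integer a = n:
n = a
upper = math.factorial(n - 1) * math.exp(-x) * sum(x**k / math.factorial(k) for k in range(n))
print(upper, Q * gamma(a))         # both give the unregularized Gamma(n, x)
```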
We employed methods in PROC MCMC to calculate the multivariate Gamma
function for quantitating the geosampled, seasonal, superbreeder, Ae. aegypti, capture
point, covariate coefficients. The function was constructed employing
$\Gamma_{p}(a)=\int_{S>0}\exp(-\mathrm{trace}(S))\,|S|^{a-(p+1)/2}\,dS$, where S > 0 indicated that S was positive
definite (i.e., a symmetric matrix with all positive eigenvalues). An n × n complex
matrix A is called positive definite if

$\Re\!\left[x^{*}Ax\right]>0 \qquad (12.2)$

for all nonzero complex vectors x ∈ Cn, where x* denotes the conjugate transpose of
the vector x. In the case of a real matrix A, Eq. (12.2) reduces to $x^{T}Ax>0$, where $x^{T}$
denotes the transpose.
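A practical way to test the positive-definiteness condition in Eq. (12.2) for a real symmetric matrix is a Cholesky factorization or an eigenvalue check, as in the brief sketch below (the example matrix is hypothetical).

```python
import numpy as np

def is_positive_definite(a: np.ndarray) -> bool:
    """Check positive definiteness of a real symmetric matrix via Cholesky."""
    if not np.allclose(a, a.T):
        return False
    try:
        np.linalg.cholesky(a)      # succeeds only for positive-definite matrices
        return True
    except np.linalg.LinAlgError:
        return False

s = np.array([[2.0, 0.5],
              [0.5, 1.0]])
print(is_positive_definite(s), np.linalg.eigvalsh(s))   # True, both eigenvalues > 0
```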
The logarithm of the posterior density in the Ae. aegypti model was calculable
employing log(p(θ|y)), which revealed potential, seasonal, superbreeder, Ae. aegypti,
larval habitat model, frequency estimators and was equivalent to
$\log(\pi(\theta))+\sum_{i=1}^{n}\log(f(y_{i}\mid\theta))$ when θ was a vector of the seasonal, capture point, prolific,
eco-georeferenced, geosampled, county abatement foci. PROC MCMC employed
a random walk Metropolis algorithm to obtain Ae. aegypti, capture point posterior
samples. Formally, the blocked Metropolis algorithm was employed as a sampling
distribution algorithm in the epi-entomological model as follows. We let $w_{j}$ be the
collection of $\theta_{i}$ that were in block $z_{i}$ and let $q_{j}(\cdot\mid w_{j})$ be a symmetric multivariate
distribution centered at the geosampled capture point habitat values $w_{j}$. We let
t = 0. We chose potential, seasonal, superbreeder foci capture points for all $w_{j}^{t}$; this
was an arbitrary point if $p(w_{j}^{t}\mid y)>0$ for j = 1, . . ., d. We generated a new sample,
$w_{j,\mathrm{new}}$, using the proposal distribution $q_{j}(\cdot\mid w_{j}^{t})$. We calculated the following quantity:
$r=\min\!\left\{\dfrac{p\!\left(w_{j,\mathrm{new}}\mid w_{1}^{t+1},\dots,w_{j-1}^{t+1},w_{j+1}^{t},\dots,w_{d}^{t},y\right)}{p\!\left(w_{j}^{t}\mid w_{1}^{t+1},\dots,w_{j-1}^{t+1},w_{j+1}^{t},\dots,w_{d}^{t},y\right)},\,1\right\}$. A sample u was then drawn from the uniform
distribution U(0, 1). We set $w_{j}^{t+1}=w_{j,\mathrm{new}}$ if u < r, and $w_{j}^{t+1}=w_{j}^{t}$ otherwise.
Subsequently we set t = t + 1.
By default, PROC MCMC assumed that all the larval habitat, probable, prolific
observations in the capture point dataset were independent. The logarithm of the
posterior density was calculated as
$\log(p(\theta\mid y))=\log(\pi(\theta))+\sum_{i=1}^{n}\log(f(y_{i}\mid\theta))$, where θ was a parameter or a vector of
the geosampled, time series, potential, seasonal, superbreeder, breeding site param-
eters. The term log(π(θ)) was the sum of the log of the prior densities specified in the
PRIOR and HYPERPRIOR statements. The term log( f(yi|θ)) was the log likelihood
specified in the MODEL statement.
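A compact, self-contained sketch of this random walk Metropolis logic is shown below. It uses hypothetical count data and a toy Poisson likelihood with normal priors, none of which are the chapter's covariates, but it follows the same pattern: the log posterior is the log prior plus the summed log likelihood, a symmetric proposal is accepted with probability r, and burn-in samples are discarded.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical habitat counts and a single covariate (illustrative only).
y = np.array([2, 0, 4, 3, 1, 5, 2, 0, 3, 4])
temp = np.array([24., 22., 27., 26., 23., 29., 25., 21., 26., 28.])

def log_post(theta):
    """log p(theta | y) = log prior + sum_i log f(y_i | theta)."""
    b0, b1 = theta
    log_prior = stats.norm(0, 10).logpdf(b0) + stats.norm(0, 10).logpdf(b1)
    mu = np.exp(b0 + b1 * (temp - temp.mean()))
    log_lik = stats.poisson(mu).logpmf(y).sum()
    return log_prior + log_lik

n_iter, burn_in = 11_000, 1_000
theta = np.zeros(2)
chain = np.empty((n_iter, 2))
for t in range(n_iter):
    proposal = theta + rng.normal(scale=0.1, size=2)   # symmetric random walk
    log_r = log_post(proposal) - log_post(theta)
    if np.log(rng.uniform()) < log_r:                  # accept with probability min(1, exp(log_r))
        theta = proposal
    chain[t] = theta

posterior = chain[burn_in:]                            # discard burn-in samples
print(posterior.mean(axis=0), posterior.std(axis=0))
```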
Table 3 presents the results of the regression for the predictive time-series risk-
related, Ae. aegypti, explanatory, iterative covariates. These results provided infor-
mation for the time-series and remote-specified estimates of the prior distribution
based on the regressed coefficients in the Bayesian analysis. The values for param-
eter estimates and standard errors in Table 4 were employed as mean values and
standard errors to parameterize prior expected values for a potential, capture point,
eco-georeferenceable, superbreeder seasonal, county abatement foci. The prior
expected mean value for the error term was assumed to be zero (0), with a standard
deviation of 0.01. Initial values for the MCMC chains were generated.
To derive the improvement of fit values listed in Table 4, the posterior mean
deviance values were obtained with deviance information criterion (DIC) spatial
analytical tools. We focused on a spatial consideration of the localized DIC
measure for model selection and goodness-of-fit evaluation. We employed a
partitioning of the DIC into the local DIC, leverage and deviance residuals, to
assess the local model fit and influence of the time-series capture point simulated,
potential, seasonal, superbreeder, Ae. aegypti, larval habitat observations in the
Bayesian hierarchical, parameter estimator framework. The NLMIXED procedure
computed three kinds of residuals. Residuals are available for all generalized linear
models except multinomial models for ordinal response data, for which residuals
are not available in SAS.
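The DIC partitioning described above can be computed directly from posterior draws. The hedged sketch below does so for the same toy Poisson model, using synthetic draws in place of actual MCMC output, with DIC taken as the posterior mean deviance plus the effective number of parameters pD.

```python
import numpy as np
from scipy import stats

# Hypothetical data and stand-in posterior draws (in practice the draws would come
# from an MCMC sampler such as the Metropolis sketch above).
y = np.array([2, 0, 4, 3, 1, 5, 2, 0, 3, 4])
temp = np.array([24., 22., 27., 26., 23., 29., 25., 21., 26., 28.])
rng = np.random.default_rng(3)
posterior = rng.normal(loc=[0.9, 0.15], scale=[0.1, 0.05], size=(5_000, 2))

def deviance(theta):
    """Deviance = -2 * log likelihood for the toy Poisson count model."""
    b0, b1 = theta
    mu = np.exp(b0 + b1 * (temp - temp.mean()))
    return -2.0 * stats.poisson(mu).logpmf(y).sum()

dev_samples = np.array([deviance(th) for th in posterior])
dev_at_mean = deviance(posterior.mean(axis=0))

p_d = dev_samples.mean() - dev_at_mean   # effective number of parameters
dic = dev_samples.mean() + p_d           # DIC = posterior mean deviance + pD
print(p_d, dic)
```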

Table 3 SAS results used to estimate prior distribution of coefficients in the MCMC county
abatement frequency analysis

Posterior Intervals
Parameter   Alpha   Equal-Tail Interval        HPD Interval
Intercept   0.05    (2.1098, 2.8393)           (2.1098, 2.8386)
Volume      0.05    (0.0212, 0.0264)           (0.0211, 0.0264)
Temp        0.05    (–0.0122, 0.00845)         (–0.0125, 0.00813)
RH          0.05    (–0.0967, –0.0773)         (–0.0967, –0.0733)
PEH         0.05    (–0.00294, 0.00675)        (–0.00296, 0.00671)
Table 4 Improvement of fit of the PROC MCMC hierarchical iterative Bayesian model Ae. aegypti
superbreeder larval habitat county abatement simulation model

Posterior Correlation Matrix
Parameter   Intercept   Volume    Temp      RH        PEH
Intercept    1.000      –0.360    –0.380     0.314    –0.498
Volume      –0.360       1.000     0.007    –0.963    –0.029
Temp        –0.380       0.007     1.000     0.040    –0.431
RH           0.314      –0.963     0.040     1.000    –0.085
PEH         –0.498      –0.029    –0.431    –0.085     1.000

4 Conclusion

In conclusion we chose capture point, Ae. aegypti geosampled values of the initial
capture points μ1, μ2, σ1, σ2 and the observed immature data x̄, s, M, and investigated
the effects of these choices on the posterior distributions p(μ | x̄, s, M, μ1, μ2, σ1, σ2)
and p(σ | x̄, s, M, μ1, μ2, σ1, σ2). We began with a "prior distribution", which was based
on the relative likelihoods of the geosampled, eco-georeferenced, explanatory, time
series Ae. aegypti, eco-georeferenceable, larval habitat “Bayesianizable” endemic,
transmission-oriented, predictive, risk-related covariates. In practice, it is common to
assume a uniform distribution over the appropriate range of values for the prior
distribution [10]. Next, we calculated the likelihood of the observed distribution as a
function of the specified, potential, seasonal, superbreeder Ae. aegypti, larval habitat,
parameter estimator values, by multiplying the likelihood function by the prior
distribution, which was then normalized to obtain a unit probability over all possible
values (i.e., posterior distribution). In so doing, the mode of the distribution was then
the quantitated parameter estimates and “probability intervals”. These intervals
represented the optimal Ae. aegypti, larval habitat superbreeder covariates where
their interaction terms were the analogue of confidence intervals in the regression-
based endemic transmission-oriented time series explanatory, county abatement,
forecast vulnerability model.

References

1. Abramowitz M, Stegun IA (1972) Handbook of mathematical functions with formulas, graphs,
and mathematical tables, 9th printing. Dover Press, New York, pp 16 and 806
2. Abreu FVSD, Morais MM, Ribeiro SP, Eiras ÁE (2015) Influence of breeding site availability
on the oviposition behaviour of Aedes aegypti. Mem Inst Oswaldo Cruz 110(5):669–676
3. Agresti A (2002) Categorical data analysis, 2nd edn. Wiley, New York
5. Arfken G (1985) Bernoulli numbers, Euler-Maclaurin formula. In: Mathematical methods for
physicists, 3rd edn. Academic Press, Orlando, FL, pp 327–338
6. Asmussen SR (2003) Steady-state properties of GI/G/1. In: Applied probability and queues.
Stochastic modelling and applied probability, vol 51. Springer, New York
7. Bentley MD, Day JF (1989) Chemical ecology and behavioral aspects of mosquito oviposition.
Annu Rev Entomol 34(1989):401–421
8. Berger JO, Sellke T (1987) Testing a point null hypothesis: The irreconcilability of p-values and
evidence. J Am Stat Assoc 82(397):112–122
9. Bernardo JM, Smith AFM (1994) Bayesian theory. Wiley, Hoboken, NJ
10. Berry DA (1996) Statistics: a Bayesian perspective. Duxbury Press, Belmont, CA
11. Cameron AC, Trivedi PK (1998) Regression analysis of count data. Cambridge University
Press, New York
12. Carlin BP, Louis TA (2008) Bayesian methods for data analysis, 3rd edn. CRC Press, Boca
Raton, FL
13. Carrington LB, Seifert SN, Willits NH, Lambrechts L, Scott TW (2013) Large diurnal temper-
ature fluctuations negatively influence Aedes aegypti (diptera: Culicidae) life-history traits. J
Med Entomol 50(1):43–51
14. Couret J, Benedict MQ (2014) A meta-analysis of the factors influencing development rate
variation in Aedes aegypti (diptera: Culicidae). BMC Ecol 14(3)
15. Cowles MK, Carlin BP (1996) Markov Chain Monte Carlo convergence diagnostics: a com-
parative review. J Am Stat Assoc 91(434):883–904
16. Cox DR, Hinkley DV (1974) Theoretical statistics. Chapman and Hall, Boca Raton, FL
17. Cressie N (1993) Statistics for spatial data, Rev edn. Wiley, New York
18. Dinh ET, Novak RJ (2018) Diversity and abundance of mosquitos inhabiting waste tires in a
subtropical swamp in urban Florida. J Am Mosq Control Assoc 34(1):47–49
19. Draper NR, Smith H (1998) Applied regression analysis, 3rd edn. Wiley, Hoboken, NJ
20. Fox J (1997) Applied regression analysis, linear models, and related methods. Sage, Thousand
Oaks, CA
21. Gelman A, Carlin JB, Stern HS, Dunson DB, Vehtari A, Rubin DB (2013) Bayesian data
analysis, 3rd edn. Chapman and Hall/CRC, Boca Raton, FL
22. Gentle JE (1998) Cholesky factorization. In: Numerical linear algebra for applications in
statistics. Springer, Berlin, pp 93–95
23. Glantz SA, Slinker BK (1990) Primer of applied regression and analysis of variance. McGraw-
Hill Professional Publishing, New York
24. Gower RM, Richtarik P (2016) Randomized Quasi-newton updates are linearly convergent
matrix inversion algorithms, Available at: https://arxiv.org/abs/1602.01768. Accessed 5 Oct
2018
25. Griffith DA (2003) Spatial autocorrelation and spatial filtering. Springer, Berlin/
Heidelberg/New York
26. Griffith DA (2005) A comparison of six analytical disease mapping techniques as applied to
West Nile Virus in the coterminous United States. Int J Health Geog 4:18
27. Gu W, Novak RJ (2005) Habitat-based modeling of impacts of mosquito larval interventions on
entomological inoculation rates, incidence, and prevalence of malaria. Am J Trop Med Hyg 73
(3):546–552
28. Haight FA (1967) Handbook of the Poisson distribution. Wiley, New York
29. Hazewinkel M (2001) Bayesian approach to statistical problems. In: Encyclopedia of mathe-
matics, 1994th edn. Springer/Kluwer Academic Publishers, Dordrecht, NL
30. Hilbe JM (2011) Negative binomial regression. Cambridge University Press, Cambridge, UK
31. Jacob BG, Griffith DA, Novak RJ (2008) Decomposing malaria mosquito aquatic habitat data
into spatial autocorrelation eigenvectors in a SAS/GIS® module. Trans GIS 12:341–364
32. Jacob BG, Griffith DA, Muturi EJ, Caamano EX, Githure JI, Novak RJ (2009a) A
heteroskedastic error covariance matrix estimator using a first-order conditional autoregressive
Markov simulation for deriving asymptotical efficient estimates from ecological sampled
Anopheles arabiensis aquatic habitat covariates. Malaria J 8(1):216–225
33. Jacob BG, Gu W, Caamano EX, Novak RJ (2009b) Developing operational algorithms using
linear and non-linear square estimation in Python for the identification of Culex pipiens and
Culex restuans in a mosquito abatement district (Cook County, Illinois, USA). Geos Heal 3
(2):157–176
34. Jacob BG, Griffith D, Gunter J, Muturi EJ, Caamano E, Shililu J, Guthure J, Regens J, Novak RJ
(2009c) A spatial filtering specification for an auto- negative binomial model of Anopheles
arabiensis aquatic habitats. Trans GIS 12:515–539
35. Jacob BG, Morris JA, Caamano EX, Griffith DA, Novak RJ (2011) Geomapping generalized
eigenvalue frequency distributions for predicting prolific Aedes albopictus and Culex
quinquefasciatus habitats based on spatiotemporal field-sampled count data. Acta Trop 2:61–68
36. Jacob BG, Mendoza DM, Ponce M, Caliskan S, Moradi M, Gotuzzo E, Griffith DA, Novak RJ
(2014) Pseudo R2 probablity measures, durbin watson diagnostic statistics and einstein sum-
mations for deriving unbiased frequentistic inferences and geoparameterizing non-zero first-
order lag autocorvariate error in regressed multi-drug resistant tuberculosis time series estima-
tors. Am J Appl Math Stat 2(5):252–301
38. Leandro-Reguillo P, Panaou T, Carney R, Jacob BG (2017) Fuzzification of multi criteria proxy
geoclassifiable vegetation and landscape biosignature estimators to predict the potential inva-
sion of Aedes aegypti in Barcelona, Spain. Int J Geogr Inf Syst 4(2):12–21
Simulation and Modeling Applications in
Global Health Security

Arthur J. French

1 Introduction

Global health security (GHS) is dependent upon having an adequate and prepared
health security workforce. There are currently numerous challenges in establishing
and maintaining a health security workforce. The frequency and magnitude of
disasters have increased significantly over the past 30 years. Current and future
GHS threats, both manmade and natural, require a prepared and flexible healthcare
provider workforce ready to respond to current or emerging GHS threats [1]. Devel-
oping and maintaining GHS -specific skills in the healthcare workforce is a tremen-
dous logistical challenge. Innovative education technologies, including simulation
and digital learning, can be leveraged to achieve preparedness for GHS threats [2].
Ebola Virus Disease (EVD), Middle East Respiratory Syndrome (MERS) and
other emerging infectious diseases (EID) are also concerns for GHS. The world faces
increasing challenges related to natural and manmade disasters as the frequency and
magnitude of disasters have increased significantly over the past 30 years.
Healthcare workers protect the well-being of people around the world by preventing,
detecting, and responding to public health threats. Current and future GHS threats,
both manmade and natural, require a prepared and flexible healthcare provider
workforce ready to respond to current or emerging threats [3].
There is a global public health workforce shortage. The WHO tracks healthcare
workforce data on 186 countries, and 57 of these were found to have insufficient
healthcare provider availability and accessibility [4]. The 2013–2016 African EVD epidemic revealed
WHO organizational deficiencies in coordinating responses to public health emer-
gencies and a shifting of priorities away from global health security threats [5]. It

also revealed gaps in response capabilities to deploy adequate numbers of medical,
logistical, and public health experts [6, 7]. It has been estimated that over 1000
deployed healthcare workers would be needed each month in West Africa, and it has
been asserted that innovative means would be needed to prepare for future GHS responses [8].
The United Nations' Global Burden of Disease 2017 Study reported information
on countries' health worker capacity. The study authors estimated that only approx-
imately 50% of countries met their estimated requirement for health-care workers.
Developing countries in Africa, southeast Asia, and Oceania suffered the largest
population-based relative gaps and are the most vulnerable [9].
GHS is a critical public health workforce mission. While medicine has made great
advances in preventing and treating infectious diseases, these advances have been
offset by new EID threats including SARS, EVD, and MERS. These emerging and
re-emerging public health crises have identified gaps in public health capabilities and
generated recommendations to globally strengthen public health’s pandemic
preparedness [10].
The Global Health Security Agenda (GHSA) was launched in 2014 to establish a
world-wide coalition of 44 nations to address the re-emergence of threats from
infectious disease epidemics and pandemics, including bioterrorism. Participating
countries agreed to define and measure milestones of progress and metrics [3]. GHS
is a critical component of nations’ security interests [11].
There is an increased requirement for public health professionals to be prepared
to respond to complex humanitarian emergencies and public health emergencies of
international concern (PHEICs), creating a supply-versus-demand gap. This gap will
grow due to an insufficient public health workforce and increasing requirements,
together with the concomitant threats of epidemics and pandemics disrupting
international economies and causing political destabilization.

2 Global Health Security Workforce Competency Challenges

In 2016 the World Health Organization announced the process of developing and
launching emergency medical teams (EMTs), a critical component of the GHS
workforce concept. Over 64 countries are in the development stages of establishing
accredited teams, both international and national, to deliver emergency clinical care
to sudden-onset disasters and outbreak-affected populations. EMTs must comply
with the classification of minimum standards for EMTs developed in 2013 by the
WHO Foreign Medical Team Working Group in the Global Health Cluster. Not
infrequently, health providers have had to learn quickly that their specialty training
alone was not enough and that they lacked the operational skill sets and collaborative
leadership capacity required for the demands of resource-poor decision-making
during the complexities of public health emergencies such as epidemics. Under the
EMT concept all countries share the responsibility to ensure that cross-disciplinary,
competency-based knowledge in field-related tasks is applied to practice globally with high
standards of care [12]. An essential component to improve WHO EMT quality and
accountability of national and international EMTs is appropriate education and
training. Multiple disaster education and training programs are available; however,
most are centered on individual professional development rather than on EMTs'
operational performance. No common overarching or standardized training frame-
work exists [13]. A proposed three-step learning process approach for EMTs is:
(1) professional competence; (2) adaptation to the operational context; and
(3) team performance. Training delivery should go from theory to practice and
from individual to team training [14].
In addition to the shortages of GHS workers there are gaps in their competencies
to respond to PHEICs. There are multiple reasons for the GHS preparedness gap.
There are few formal training programs for disaster responders, particularly in
clinical training and few U.S. public health graduate programs address humanitarian
assistance [15, 16]. Most training is focused on traditional individual skills and not
on unique response-related operational competencies for PHEICs.
Developing and maintaining skills needed during a PHEIC presents a tremendous
logistical challenge. Many of these skills are unique to health security events and are
new skills and knowledge that must be adapted to dynamic situations, particularly
infectious disease outbreaks. They also must be performed in resource-constrained
environments, requiring adaptation of unfamiliar crisis standards of care by
healthcare providers [17]. An additional challenge for healthcare responders is the
deterioration of previously learned knowledge and skills. Competency and skill
retention are important components of training in infrequently used procedures.
Innovative education technologies, including simulation, can be leveraged as a
GHS healthcare force multiplier.
Experience responding to disasters alone is not sufficient. Previous experience
with international disaster relief needs to be combined with disaster medicine
training. There is a need to develop a universal tool for measuring physician skills
in international disaster relief because of the difficulties in evaluating the medical
relief activities of disaster medical teams. Skills change in accordance with
differences in demographic characteristics, previous experiences, and disaster
situations [18].
Because of the immense variation in the nature and magnitude of health security
events the boundaries of specific healthcare provider disciplines’ skills are imprecise
[19]. To prepare GHS responders to meet these challenges, various organizations
and universities have developed competencies for healthcare responders. In 2008 the
American Medical Association published a consensus-based and multidisciplinary
educational framework for disaster medicine and public health preparedness defin-
ing 19 core competencies. Healthcare responders were assigned to three levels of
proficiency: Informed worker/student, practitioner, and leader. This approximates
the ascending levels of application and synthesis of knowledge in Bloom’s taxon-
omy of learning [20]. Core disaster competencies aligned with the six levels of
proficiency established in Blooms taxonomy have been proposed by others also
[19, 21].
3 Simulation Based Training: A GHS Force Multiplier

Providing initial and sustainment training to the diverse GHS healthcare workforce
is expensive and inefficient utilizing the traditional classroom model. Greater
demands have been placed on disaster medicine educators. There is a need to
develop innovative methods to educate healthcare providers in the ever-expanding
body of disaster medicine knowledge [22].
Many of these challenges can be met with gaming and simulation technologies
that new generations of GHS responders have embraced. Simulation-based training
(SBT) has been demonstrated to provide effective learning and is increasingly
being integrated into healthcare provider education [23]. The advantages of SBT
may be summarized as follows: (1) safety: simulated lives and health can be
jeopardized to any extent required for training without harming any people;
(2) economy: simulated material and equipment can be used, misused, and expended;
(3) visibility: simulation can provide visibility in two ways, making the invisible
visible and controlling the visibility of details, allowing the learner to discern
the forest from the trees or the trees from the forest as needed; (4) time control:
simulator time can be sped up, slowed down, or stopped. It can also be completely
reversed, allowing learners to replicate specific problems, events, or operational
environments as needed [24].
SBT has been institutionalized in other high hazard professions, such as aviation,
nuclear power, and the military. GHS responders often have few opportunities for
critical skill development and sustainment. Simulation can provide just-in-time
training and refresher training, which potentially would reduce decay of critical
skills. Simulation can also reach individuals wherever they are during mobilization
for disaster relief in remote areas, including team training of individuals even before
the team is geographically assembled. GHS training must balance the use of
technology, with its probable absence in operational environments. This is analo-
gous to the U.S. Department of Defense (DOD) military services and others who
operate in remote and austere environments where technology, including tele-
mentoring and decision support tools, may not be available. DOD has invested in
computer games and simulation technology for increased training effectiveness and
patient safety, with reduced training costs through increased preparation of person-
nel before costly field training exercises and reduction in total training time [25].
Medical training has traditionally used patients to hone the skills of health pro-
fessionals, balancing the obligations to provide optimal treatment and to ensure
patients' safety. SBT can mitigate this tension by developing health professionals'
knowledge, skills, and attitudes while protecting patients, maximizing training
safety, and minimizing risks. Expanding use of SBT in GHS training can be considered
an ethical imperative [26].
David Gaba, a leader in the field of medical simulation, defined simulation as “a
technique, not a technology, to replace or amplify real experiences with guided
experiences that evoke or replicate substantial aspects of the real world in a fully
interactive manner." [27] Simulation is a person, device, or set of conditions which
attempts to present educational and evaluation problems authentically. Learners are
required to respond to the problems as he or she would under natural circumstances.
High fidelity medical simulations facilitate learning under the right conditions of:
(1) providing feedback; (2) repetitive practice; (3) curriculum integration; (4) a range of
difficulty; (5) multiple learning strategies; (6) a controlled environment; (7) individual-
ized learning; (8) defined outcomes; and (9) simulator validity. These conditions
maximize the impact of simulation-based training [23].
Simulation provides a range of educational tools. A growing number of computer
screen simulations, high fidelity manikins, and virtual reality simulators have
expanded the number of procedures and disaster conditions which can be effectively
simulated [28]. In many, but not all, instances SBT can be the best tool for achieving
learning objectives. Simulation creates "perfect practice" environments. Advanced
technology has been engineered into large-scale production of simulators capable of
modeling a wider variety of clinical scenarios [29]. Responding to changing GHS
environments requires new paradigms for training. Simulation technologies encom-
pass diverse products including computer-based virtual reality simulators, high
fidelity and static manikins, task trainers, live animals, animal products, and
human cadavers. In comparison with no intervention, technology-enhanced simula-
tion training is consistently associated with large effects for outcomes of knowledge,
skills, and behaviors [30].
The important questions for GHS educators are when and how to implement SBT
most effectively and cost efficiently. SBT has demonstrated effectiveness in training
healthcare workers for complex humanitarian emergencies and EIDs [15, 31]. Imple-
mentation science addresses the mechanisms of education delivery in healthcare.
The aim of implementation sciences is to study and seek to overcome healthcare
organizational silos and barriers, pockets of cultural inertia, professional hierarchies,
and financial disincentives that reduce efficiency and effectiveness [32]. GHS
leaders need to be knowledgeable in the fundamentals of SBT theories and modal-
ities, and innovative in SBT strategies to overcome implementation barriers.

4 Simulation Learning Theories

Adult learning theory serves as the foundation for SBT. Effective SBT of adults
requires a sound understanding of adult learning theory and experiential learning.
Mental models form the basis for operational decision-making. GHS educators can
design simulation scenarios to stress existing mental models, helping individuals to
identify areas where they need and want to learn.
Malcolm Knowles developed the andragogy conceptual framework for adult learn-
ing. The framework's assumptions about adult learners must be considered when
developing simulation programs. These assumptions are: (1) adults need to know
why they need to learn something before undertaking the effort to learn; (2) adults have
a self-concept bias towards independent and self-directed learning; (3) adults have
acquired a great deal of life experience that helps shape their self-identity; (4) adults
value learning that helps them cope with the demands of their everyday life; (5) adults
are more interested in life-centered or problem-centered approaches than subject-
centered approaches to learning; and (6) adults are more motivated to learn by internal
drives than external ones, including self-efficacy. For effective SBT there needs to be a
strong sense of a safe learning environment, where participants do not feel threatened
to make mistakes and mistakes are used as teachable-moment opportunities
rather than to embarrass the learners. This safe learning environment must be
established during the orientation to the simulation environment [33].
Simulation incorporates the evidence-based learning theories of Kolb's experi-
ential learning cycle [33, 34]. Kolb described learning as a continuous reflective
process grounded in concrete experience and a continuous adaptation of what a
person actually experiences in the world. Reflection is the ability to learn and
develop continually by applying current and past experiences to unexpected
events. Simulation is able to provide repeated and consistent experiences with
the expected or unexpected [35]. Simulation-based mastery learning is a form of
competency-based education in which all learners acquire essential skills and knowl-
edge, measured rigorously against high-end fixed achievement standards,
without restricting learning time to a uniform interval to reach the outcome. The
education goal of mastery learning is excellence for all learners.
A key concept in this conceptual framework is the formation and maintenance of
learner self-efficacy, the belief in one’s capabilities to organize and execute the
courses of action needed to manage prospective situations. Self-efficacy relates to
believing in oneself to take action [35]. Errors trigger stress and emotional responses
from learners and can be a strong stimulus for learning. Most learners can recall what
they did incorrectly because a situation in which errors occur provides an active
learning environment. Error management training and stress inoculation training
capitalize on that stress response, allowing learners to develop contingency plans,
backup strategies, and stress management [36].
Benjamin Bloom’s taxonomy of learning is a hierarchical classification of types
of learning. Depicted as a pyramid, it implies that student abilities increase the higher
on the pyramid they go, progressing from knowledge to comprehension, and then to
application and analysis [37]. Knowledge, and comprehension are the simplest
levels of learning. The ability of the learner to apply and analyze knowledge is an
indicator of competence. SBT can facilitate advancing the learner from knowledge
and comprehension to application, analysis, and even synthesis [34].
GHS preparedness is a complex and dynamic field involving the acquisition,
retention, and rapid recall of large amounts of knowledge and the ability to quickly
and precisely perform a variety of procedures. Procedural technical skills have
traditionally been acquired through an informal manner exemplified by the mantra
of “see one, do one, teach one”. Reliance on chance exposure to procedural learning
is suboptimal, especially for the GHS environment in which procedures are
unplanned, unpredictable, and often time sensitive. Using simulation for high-stress
psychomotor skills involves the following steps: (1) conceptualization, the preparation to
learn the procedure; (2) visualization of the procedure in its entirety performed by an
expert; (3) verbalization and demonstration of the procedure, with narration by the
instructor of the steps being performed, breaking the procedure down into smaller
tasks; (4) verbalization by the learner during guided practice; and (5) feedback.
Watching video playback obtained with platforms such as tablets, smart phones, or
point-of-view devices together provides a shared mental model which facilitates
reception of feedback [38].
Extensive experience does not invariably lead people to become experts
[39, 40]. Coach Vince Lombardi coined the phrase “Practice does not make perfect.
Perfect practice makes perfect.” [41] Ericsson’s theory of deliberate practice is
defined as “a regimen of effortful activities designed to optimize improvement”
and involves “training focused on improving particular tasks” [42]. Simulation
enables learners to progress from just knowing information to being able to apply
it in actual situations through deliberate practice. Some professionals develop faster
than others and continue to improve during the ensuing years. These individuals are
eventually recognized as experts. In contrast, many professionals reach a stable,
average level of performance within a relatively short timeframe and maintain this
status for the rest of their careers. Expert performance requires continued deliberate
practice for maintenance of high performance. Deliberate practice has been found to
be a key factor in maintaining expert levels as performers reach older ages. Most
expert musicians have spent over 10,000 h of practice, which is 5000 h more than less
accomplished groups of musicians. A sufficient amount of weekly deliberate practice
has been shown to allow expert pianists in their 50s and 60s to maintain their piano
performances at a level comparable to that of young experts. Meta-analyses of the
use and benefit of simulators in medical training have discussed the need for
structured training and deliberate practice with the simulations guided by the goals
of the training [43].
The learning curve theory applies to SBT. The learning curve is a logistic
(S-shaped) relationship between time spent in repetitive practice and improvement
in performance, and it reaches an inflection point where learning becomes more
effortful. A typical learning trajectory involves an initial latent phase, then rapid
learning but later, after an inflection point, diminishing returns for each unit of effort
invested. Each repetition results in an improvement in ability, but there exists within a
learning system some maximum learning achievable [44]. Environmental and emo-
tional stressors may negatively affect critical thinking and clinical skill performance.
By introducing more advanced simulation scenarios with added stressors, SBT
facilitators may induce "stress inoculation" in learners and enable them to develop
responses to overcome stress-related performance barriers in high stress
environments [40].
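As a rough illustration of the S-shaped trajectory described above, the short sketch below evaluates a logistic learning curve with an assumed maximum achievable performance, improvement rate, and inflection point; the parameter values are hypothetical and meant only to show the latent phase, rapid-learning phase, and diminishing returns.

```python
import numpy as np

def learning_curve(t, max_perf=100.0, rate=0.8, inflection=6.0):
    """Logistic (S-shaped) performance as a function of practice time t.

    max_perf   - maximum learning achievable within the system (assumed)
    rate       - steepness of improvement per unit of practice (assumed)
    inflection - practice time at which returns begin to diminish (assumed)
    """
    return max_perf / (1.0 + np.exp(-rate * (t - inflection)))

practice_hours = np.arange(0, 13)
for t, p in zip(practice_hours, learning_curve(practice_hours)):
    print(f"{t:2d} h -> performance {p:5.1f}")
```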
SBT should be based upon cognitive task analyses. The development of effective
SBT requires a collaborative team effort where three types of expertise are coordi-
nated: (1) specialty expertise focused on providing complete and accurate informa-
tion about the critical skills and knowledge to be simulated; (2) instructional design
expertise focused on the simulations and assessment methods that produce maxi-
mum learning and transfer of knowledge; (3) software development expertise for
efficient design and development of the software required to capture expertise,
presented in an engaging way, and assess student interactions with the simulator.
Traditional teaching by subject matter experts is often inadequate due to
unintentional omission of approximately 70% of the critical decisions and analysis
required to complete the task successfully. It is critical to implement a collaborative
team approach using cognitive task analysis methods, involving subject matter
experts to produce complete and accurate representations of expert cognitive behav-
ioral strategies. Cognitive task analysis-based strategies should be incorporated into
the simulation authoring environment to guide the production of simulations for
effective learning experiences [45].
Instructional design is a critical element of SBT. Simulation requires an instruc-
tional design framework to assist educators in creating appropriate simulation
learning experiences. Simulation includes vastly different educational modalities.
The framework for simulation instructional design has four domains: (1) instructional
medium; (2) simulation modality; (3) instructional method; (4) presentation. Each
simulation modality is best suited for specific competency domains and learning
outcomes. The choices within each domain are based on a matrix of simulation
relating the severity of possible outcomes to the frequency of events, with a
corresponding optimal “Zone of simulation matrix” [46].
Most learning does not take place during the simulation but rather during the
debrief which should follow. A debriefing is a discussion that occurs immediately
following the simulation experience during which educators and learners can reflect
together, analyze performance and enhance the mental models that guide behavior.
This reflection leads to permanent change [34]. Debriefing of learners after an SBT
session has been proposed as the primary learning method, with different models of
debriefing depending upon the scenarios and knowledge level of learners
[47]. Debriefing is a critical component of the simulation experience. Facilitators
guide the students through the management of the scenario by discussing the learning
objectives and providing feedback on the students' performances. Key insights from
SBT are derived during this important aspect [33].

5 Blended Learning

Blended learning is an integration of multi-media methods of instructional design
that combines face-to-face instruction with online learning and simulation. Other
words and phrases used to describe blended learning include web-based learning,
e-learning, hybrid learning, flexible learning and mixed mode learning. Combining
these alternative instructional methods with traditional instructional methods recon-
stitutes a “blended approach”. Blended learning has been advocated to improve the
preparedness training for the public health workforce [48]. A blended learning
curriculum combining a standard course manual with simulation has demon-
strated improved performance over a standard reading course [49].
Blended learning combines traditional instructor-led teaching and innovative
technology-enhanced learning methods, reducing classroom seat-contact time
[48]. Blended learning is a new term in the vocabulary of educators since the
integration of computers and learning. Educators and information technology pro-
fessionals refer to blended learning as those courses that combine face-to-face
traditional classroom style teaching with online learning. Blended learning utilizing
simulation has been shown to be effective in disaster medical education while
improving faculty efficiency over traditional education methods. The blended learn-
ing approach is well received by learners and can offer significant enhancement to a
online distance-based learning course, particularly when specific skills are required
in addition to didactic information [50, 51].
Blended learning and the “flipped classroom” model can be integrated with SBT and
effectively teach preparedness for PHEICs to public health workers. Blended learning
of online learning modules and on-site simulation has been effectively used for rapid
deployment of EVD training. Virtual reality simulation presented on laptop computers
coupled with human simulators has been proposed as a cost-effective method of hybrid
SBT to achieve competency in train-the-trainer programs [48, 51, 52].
Prototypical blended learning programs for teaching disaster medicine consist of
an e-learning platform with a classroom session involving problem-based learning
activities, tabletop exercises, and a computerized simulation. The blended approach
incorporating simulation tools successfully increased participants’ disaster medicine
competencies in performing mass casualty triage [50]. Students did have knowledge
decay 6–12 months after course completion which is consistent with existing
literature on decay of knowledge and skills following educational interventions [53].
Video-based learning also offers a promising alternative to traditional learning
methods for teaching disaster medicine core competencies. Video modules have
been shown to be as effective as traditional lecture teaching triage, decontamination,
and personal protective equipment procedures. Recognizing that increased knowl-
edge and comfort scores do not necessarily translate into task competencies, it was
recommended that a practical skills demonstration component be added to the course
for the testing of learners’ practical skills and judgment [22].
Blended learning technologies also allow opportunities for unsupervised, self-
regulated learning. Self-directed learning occurs when a learning environment is designed
to promote autonomous learning. Studies suggest that directed self-regulated learn-
ing is as effective as instructor regulated learning yet allows more learner flexibility
and access [54].

6 Modes of Simulation

There are multiple simulation technologies available to GHS educators, ranging
from part-task trainers, computer-based interactive systems, computer-enhanced
high-fidelity manikins, and virtual reality simulators [55]. Digitally-enhanced syn-
thetic manikins vary technologically from high sophistication to low sophistication.
Virtual workbenches use digitally generated images to create a medical training
environment focused on specific skills with haptic feedback. Complex algorithms
generate a virtual three-dimensional anatomy on a graphic display screen.
A partial task trainer is frequently the ideal tool if the learning objective is
confidence and/or competence in an emergency procedure. The simulator can range
from something as simple as an orange for learning injections to a virtual-reality
ultrasonography model [29].
There are a number of terms used for the digitally-enhanced manikin modality,
the most frequent of which are human patient simulators, high-fidelity simulators,
and high-fidelity patient simulators. Although there are no strict definitions, these
terms generally apply to full-body simulators with remote computerized controls,
monitor displays of vital signs, and air-compressor- and electrically-driven functions
creating normal physiological and pathophysiological processes. High-fidelity mani-
kins provide a suitable training platform for complex decision-making and integra-
tion of interventions into case-based scenarios.
Another technique for SBT utilizes screencasting. Screencasting is a type of
lecture that incorporates digital recording of computer screen actions and audio
narration. Screencasting can incorporate digital recording, narration, interactivity,
and metrics into the lecture format. Assessment questions come embedded within
the lectures to provide students immediate feedback on their understanding of the
topic. Instructors can monitor students time spent viewing the lectures and their
scores on embedded assessment questions to gain valuable pre-class assessment
data [56].
Immersive simulation is a planned educational activity where a learner or group of
learners takes part in care within a simulated environment realistic enough for the
participants to feel immersed in the surroundings, suspend disbelief, and manage the
scenario as if it were real. There are numerous methods for creating immersive
simulation. In situ simulation, where the scenario is executed in the actual work area
such as the emergency department, intensive care unit, helicopter, or field response,
can provide an equal experience and has its own advantages. Participants manage the
simulated patient in their actual environments while performing tasks with their own
equipment, allowing for increased realism, and enhanced transfer of the experience
into actual clinical care [57]. Computer-based training is logistically simpler in
most instances and as effective as more expensive forms of simulation. Successful
applications of three-dimensional computer simulations include team training, clin-
ical pathway management, and system-based simulations focusing on disaster
response exercises for hospitals and prehospital events.
Digital learning is more effective when there is learner interactivity versus passive
learning. Interactivity can be increased by using branched chain narrative virtual
patient cases that require learners to actively participate in simulated clinical man-
agement [20, 58].
A simulation is considered live if all roles are played by human beings in a real
environment with real systems. A training simulation is considered virtual if some
roles, systems, and/or the environment is represented by a computer program,
mathematically and/or graphically. A training simulation event is considered con-
structive if all the roles, systems, and environment are represented by a computer
program. A combination of live, virtual, and constructive simulations has the
potential to allow exploration of complex training scenarios alongside other envi-
ronmental factors. An advantage of the combination includes being able to foster
collaboration and trust under stressful medical situations. A disadvantage of live
virtual constructive simulations is the cost [59].

7 Simulation Fidelity

The quality of the simulation-based training experience depends on successful engage-
ment with the learner. Achieving engagement depends on a sense of immersion,
successful visuals, and simulator responsiveness. It is important for GHS educators
to understand the implications of simulation fidelity on achieving learning objectives
in order to determine what fidelity is optimal.
Simulation fidelity is how well the simulation represents reality. It is important
that simulation technologies and fidelity be appropriately utilized and aligned with
the learning objectives [60]. Simulations should have optimal fidelity to place
learners in lifelike situations that provide immediate feedback about questions,
decisions, and actions. Simulation fidelity consists of three distinct domains: engi-
neering (physical equipment), environment, and functional [61]. Engineering is the
degree to which the simulator replicates the physical characteristics of the actual
task. This is usually directly proportional to the technological complexity and
concomitant expense of the simulator. Environmental fidelity refers to visual, audi-
tory, and other sensory feedback the simulator provides. Psychological fidelity is the
degree to which learners suspend disbelief and interact with the simulator as if
in the actual situation it is emulating. Simulation fidelity is often, but not
always, related to the level of simulator technology.
Healthcare simulation fidelity can also be expressed from Uwe Laucken's social
endeavor perspective of three modes of reality: physical, semantical, and phenom-
enal. The physical mode comprises things that can be measured in standard physical
measurement dimensions, e.g., length, sound frequency, and weight. The semantical
mode represents how presented information makes sense and can be logically
interpreted. The phenomenal mode includes emotions and beliefs that learners
experience during the simulation. The semantical and phenomenal modes represent
psychological fidelity [62].
The concept of simulator fidelity usually implies the degree to which a simulator
looks, feels, and acts like a human patient. Although this can be a useful guide in
designing simulators, this definition emphasizes technological advances in physical
resemblance over principles of educational effectiveness. Studies have shown that
the degree of fidelity appears to be independent of educational effectiveness. Several
concepts associated with fidelity are more useful in explaining educational effec-
tiveness, such as transfer of learning, learner engagement, and suspension of
disbelief. It has been proposed to abandon the term fidelity in SBT and replace it with
terms reflecting the underlying primary concepts of physical resemblance and
functional task alignment, and to shift from the current emphasis on physical
resemblance to a focus on functional correspondence between the simulator and the
applied context [63].
Simulations can achieve engagement with the learner more successfully if actions
they perform in the scenario are followed by visible or audible responses. Human
physiology can be simulated by various means including physiology engines,
complex state machines (CSMs), simple state machines (SSMs), kinetic models,
and static readouts.
Moderate fidelity approaches to simulate physiology are often managed by
CSMs. CSMs are computer programs consisting of logical rules and decision tree-
based logic that responds to user activity. User actions, simulation timers, and other
events trigger different states that change the patient presentation and vital signs. The
major disadvantage of CSMs is that they do not respond well to unexpected,
complex, or combinational inputs. Undesirable inputs and program complexity are
addressed by limiting the variety of possible interventions. Recovery from user
errors once a scenario is moved down a decision tree is difficult to program. VR
and game-based simulations often use the CSM model.
Low fidelity approaches require less technology, effort, and sophistication to
author. Patient simulations can use simple state machines to great effect. SSMs
consist of three or more fixed states that alter the appearance, communication, and
physiology data of the simulated patient. They lack branching and conditional
features of the more complex CSMs yet still offer many of their benefits. They are
especially useful in adding a dynamic appearance to a simple case presentation.
For many medical simulation-based educational experiences low fidelity
approaches are often adequate if not preferable. Physiology engines are neither
necessary nor desirable in many situations because simpler methods for depicting
physiology states are often more practical. Physiology engines excel in advanced
simulations and exploratory learning where sophisticated learners want to try unex-
pected things and see accurate responses. Examples of this include anesthesia and
resuscitation simulations. CSMs are useful for interactive case scenarios in game-
based training because they provide predictable behavior, excellent responsiveness,
and can be reasonably complex while appearing to have high fidelity. SSMs are well-
suited to interactive case scenarios, lecture presentations, and many activities
because they are very easy to author yet are interactive and responsive to user
input [64].
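To make the SSM idea concrete, the following minimal sketch (with invented state names, vital signs, and triggering events) shows how a handful of fixed states and a small transition table can drive a simulated patient's presentation in response to learner actions or scenario timers.

```python
from dataclasses import dataclass

@dataclass
class PatientState:
    name: str
    heart_rate: int
    blood_pressure: str
    appearance: str

# Three fixed states, as in a simple state machine (SSM); values are hypothetical.
STATES = {
    "stable":        PatientState("stable", 80, "120/80", "alert, talking"),
    "deteriorating": PatientState("deteriorating", 125, "90/60", "pale, anxious"),
    "critical":      PatientState("critical", 150, "70/40", "unresponsive"),
}

# Fixed transitions triggered by learner actions or scenario timers (illustrative).
TRANSITIONS = {
    ("stable", "timer_5min"): "deteriorating",
    ("deteriorating", "gives_fluids"): "stable",
    ("deteriorating", "timer_5min"): "critical",
    ("critical", "starts_resuscitation"): "deteriorating",
}

def step(current: str, event: str) -> str:
    """Return the next state; unrecognized events leave the state unchanged."""
    return TRANSITIONS.get((current, event), current)

state = "stable"
for event in ["timer_5min", "gives_fluids", "timer_5min", "timer_5min"]:
    state = step(state, event)
    print(event, "->", STATES[state])
```

A physiology engine or complex state machine would replace the fixed vital-sign values with computed responses; the table-driven design above is what keeps SSMs easy to author.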
It has been proposed that the notion of high-fidelity versus low-fidelity simulation
requiring complete technological replication of reality instead be replaced by accu-
rate representation of real-world cues and stimuli. All aspects of fidelity hinge
significantly on the learners' perceived realism of the context of the learning episode
as opposed to any one particular element [65]. The balance between costs and higher
fidelity SBT depends upon the learning objectives and the learners’ level of entering
knowledge. The goal is to create an immersive environment which engages the
learner. Low fidelity SBT is often as effective as high-fidelity SBT, and the task of
the educator is to identify where the costs and learning curves meet [46].
Expensive, high fidelity simulators predict the real-world flight performance of
expert pilots. There is an optimal point beyond which one additional unit of
simulator fidelity results in a diminished rate of practical assessment of non-expert
pilot performance. The total fidelity concept may be most appropriate for the training
and assessment of expert pilots who can identify and process all the visual, aural, and
other contextual cues of real-world aviation tasks. Novice pilots can become
overwhelmed with total fidelity. This is because initially novice pilots must first
familiarize themselves with the look, shape, location, and feel of the actual devices in
the cockpit and with the memorization and execution of emergency procedures. It can
therefore be deduced that high fidelity is desired in simulation-based assessment
devices that attempt to predict expert performance in real-world situations. However,
the same may not hold true for the practical assessment of learners with skill
and experience levels falling between novice and expert. With part-task trainers, novice
learners can build confidence and procedural knowledge, while enhancing safety and
learning from mistakes [60].
There is ongoing debate on the effectiveness of training on live tissue models,
such as goats, versus synthetic training simulation models. A study done by the
United States Army showed that certain procedures were done as effectively on
synthetic training models as on live tissue models. Some skills were done better on
live tissue models, and studies are ongoing to evaluate new synthetic training models
to replace live tissue training. Low fidelity models may be equivalent or better for
novice trainees. It is difficult to predict which model might be best for training.
Structural fidelity such as physical resemblance may be higher in one model such as
a human patient simulator, and functional fidelity, such as tissue feel and forces
required, may be higher in the live tissue model [66]. While learners may prefer
live tissue training over simulation, there has been no difference in performance
between learners trained on simulators versus live tissue models [67]. There is also
the cost of the animal's life to consider, and the labor to prepare animal models, which
is greater than for synthetic models [68]. Current initiatives in development of more
realistic synthetic manikins include creating common open standards for hardware
so the manufacturers and developers can create unique equipment tools which could
be implemented on any manikin [69].
The effectiveness of simulation outweighs cost if human lives are at risk. Cost
considerations, however objective and well-developed, should not be the sole
concern. Funding decisions for simulation need to include personnel, facilities,
equipment, materials, and simulation effectiveness [70]. Each one of these needs
to be addressed through analysis, design, development, implementation, and evalu-
ation in each of those domains [24].
8 Virtual Reality

Total immersion virtual reality (VR) is a progression from a virtual workbench that
creates an all-encompassing virtual environment for the learner displayed in a three-
dimensional form. VR enables creating online virtual worlds with virtual patients
which can be used for student assessments in clinical scenarios [71]. The display can
be either on a computer screen or can be on a device worn by the learner. A 3-D
virtual learning environment can demonstrate the interconnectedness of healthcare
practices across disciplines. VR environments can utilize a technique called
machinima, which is a method for making movies in the virtual worlds. Its real-
time nature favors speed, cost saving, and flexibility over the higher quality of
pre-rendered computer animation [72]. In full-immersion virtual reality, the individual
wears eye goggles that allow him or her to experience an enclosed, interactive
environment that he or she can move through and engage elements within, as if doing
so in real life.
Scenarios requiring a physiologically responsive virtual patient require a model-
ing framework to build the virtual patient. This framework allows clinicians to
remodel the patient physiology without requiring computer programming expertise.
Virtual worlds are live, online, interactive three-dimensional environments in which
users interact using speech or text via a personalized avatar. Access requires a
modern computer and Internet connection. High fidelity environments can be pro-
hibitively expensive for use on a large scale. The graphical fidelity of Second
Life© and open simulators is sufficient to provide a realistic, immersive environ-
ment. Second Life, and the open source equivalent OpenSimulator, are low-cost,
easily accessible virtual worlds [73].
Immersive training in a virtual environment has the potential to be a powerful tool
to train GHS responders for high consequence, low frequency events such as a
terrorist attack. The flexibility of the VR environment allows, with minimal pro-
gramming effort, modifying avatars to simulate victims of unique hazards or injuries
to emphasize specific skills [74]. Virtual training can be provided at flexible times
and in modular components to test rare events. It can assess not only clinical skills,
but also organizational proficiency, a key element of disaster management. It can be
useful to cost-effectively meet various regulatory requirements and to provide
sustainment in addition to initial training [75]. Simulation can be used to provide
just-in-time training for critical care skills to responders who might not use them
frequently enough to maintain their proficiency. Virtual simulators have been developed
for GHS skills that are difficult to maintain, such as mechanical ventilation. These
skills will be essential during pandemics or emerging infectious diseases, which
often require providers to extend their normal scope of practice into critical care
skills [76].
Augmented reality is a combination of digital and physical media. Augmented
reality adds computer-generated imagery to everyday objects to provide additional
information about the object in the environment to a learner. An example of
augmented reality is Body Explorer© which uses a projector mounted above the
patient simulator to provide x-ray vision views of anatomy, physiology, and clinical
procedures. Microsoft HoloLens© is a mixed reality head-mounted device. The
see-through holographic computer allows one to view high-definition holograms
within his or her learning space. Working in the virtual environment allows learners
to experience a realistic work situation, explore information and action options, and
receive feedback from the virtual patient as they proceed. VR systems can shorten
the learning curve, decrease practice time, and improve learning outcomes [77].
Training healthcare teams in online, virtual environments with dynamic virtual
patients is an effective method of training for management of mass casualty inci-
dents. Live exercises are the accepted “gold standard” for healthcare disaster exer-
cises, but they are costly and time-consuming to organize, and may be disruptive to
local services. These drills are expensive to provide and require abundant resources,
including volunteers to play patients, moulage artists, and participation by many
healthcare and safety personnel, taking instructors and trainees away from their
normal duties for extended periods of time. Despite the cost and disruption of
organizing and running major incident exercises there is little evidence of their
effectiveness in improving the preparedness of an organization for a major incident
response. Low-cost virtual worlds have been successfully developed and tested for
preparing and training for multiagency and multisite major incident responses. Virtual
environments are more immersive than tabletop exercises, placing participants in a
realistic and stressful environment which can be based upon their local or novel
surroundings. Scenarios can be reset and rerun within seconds to maximize resource
availability and cost effectiveness, enabling participants to perform and gain an
understanding of multiple key roles. The ability to record and play back the scenarios
from the point of view of each participant enables accurate and structured debriefing,
either by expert facilitators or peers, which is difficult to achieve in a large-scale live
or tabletop exercise. Virtual worlds also have the advantage of exactly replicating the
locations and resources of a healthcare facility. Trainees do not have to be present at
the same location to play their avatar roles; drills can be conducted at any time of day
or night; a variety of patient conditions can be modeled to simulate the complexity of a
real mass casualty incident, including deaths; scenarios can be run more than once in a
short period of time, allowing trainees to learn from their mistakes; and trainee
performance during the simulation can be captured for playback and assessment after
the event [73, 78].
VR simulation has been shown to be equivalent to live simulation with standard-
ized patients in assessing mass casualty triage skills, including identifying and
treating bioterrorism diseases [75, 79–82]. In a dynamic VR learning environment,
each victim can be triaged multiple times, first in primary triage and again in a
secondary triage area. This enables users to test their knowledge in a wide range of
victim scenarios and receive immediate feedback [83]. Novice learners have dem-
onstrated improved triage and mass casualty intervention scores, speed, and self-
efficacy during an iterative, fully immersed virtual reality experience [84].
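
As an illustration of the kind of immediate, automated feedback a VR triage trainer can give, the following sketch classifies a simulated victim from a few vital-sign inputs. The category logic loosely follows the START triage algorithm, but the fields, thresholds, and feedback mechanism are illustrative assumptions rather than the logic of any trainer cited above.

from dataclasses import dataclass

@dataclass
class Victim:
    walking: bool          # can the victim walk to a treatment area?
    breathing: bool        # breathing after airway repositioning?
    resp_rate: int         # breaths per minute
    radial_pulse: bool     # palpable radial pulse (perfusion proxy)
    obeys_commands: bool   # simple mental status check

def start_triage(v):
    """Return a triage category using a simplified START-style rule set."""
    if v.walking:
        return "MINOR (green)"
    if not v.breathing:
        return "EXPECTANT (black)"
    if v.resp_rate > 30 or not v.radial_pulse or not v.obeys_commands:
        return "IMMEDIATE (red)"
    return "DELAYED (yellow)"

# Immediate feedback: compare the learner's tag with the expected tag.
victim = Victim(walking=False, breathing=True, resp_rate=36,
                radial_pulse=True, obeys_commands=True)
learner_tag = "DELAYED (yellow)"
expected = start_triage(victim)
print("Correct" if learner_tag == expected else f"Expected {expected}, got {learner_tag}")

In a dynamic environment, such a check can be run at both the primary and secondary triage points, mirroring the repeated-triage workflow described above.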
Team training with online virtual world simulation yields results comparable to those
achieved with the traditional method of simulation-based team training using
instrumented manikin-based simulation in a real physical environment. VR scenarios
can be presented to multiple trainees simultaneously, and over time, across
geographically dispersed locations, reducing costs [85].
VR provides flexible, consistent, on-demand training options. VR is an effective
training platform for rare GHS incidents where high level performance is critical, but
difficult to rehearse, such as CBRNE incidents [81]. A VR hybrid approach has been
successfully used to simulate Ebola treatment centers. By immersion in a highly
realistic Ebola treatment center projected on the wall, learners advance to accomplish
competency-based specific operational skills by using real equipment and
performing procedures on manikins. This hybrid simulation model provides
opportunities to integrate different evaluation methodologies, such as using fluorescent
liquid or powder under ultraviolet light to measure the degree of exposure, and
video cameras for replay and educational analysis [86].
VR has the ability to record and assess patient flow aspects of the simulated mass
casualty incident response, from individual patient movements to overall
nontechnical and team performance. The visual layout can be customized to match
actual hospitals to prepare local staff or site-specific responses or scenarios. The
virtual healthcare facility allows learners to recognize the need to prioritize the
availability of “downstream” beds with higher levels of care in intensive care units
versus focusing primarily on the “upstream” management of acutely presenting
patients, and actions can be recorded, replayed, and reflected upon [87]. VR simu-
lations are effective in identification of bottlenecks, crowd control issues, and
resource needs to educate key hospital decision-makers about disaster procedures
prior to a full-scale drill [88].
Individual GHS skills can be assessed in online 3-D virtual world simulations
using virtual patients. Commercial software such as Second Life©, a 3-D virtual world
accessible via the Internet, can be used to provide simulations. Learners are able to
navigate through the virtual world and manage a virtual patient via an avatar, an
online character. The inherent benefits of 3-D virtual patients can be extended to
every GHS specialty that encompasses patient contact [71].
Computer-based virtual patients are now being utilized to create high-fidelity
simulated patient interactions. The term virtual patient is used to represent a variety
of developing technologies to create interactive computer models of the patient-
provider interaction, simulating communication and information gathering, and
allowing the user to apply diagnostic reasoning. Computer-based virtual patients
can be integrated into GHS curricula in a flexible manner. The most efficient use of
these cases is for institutions to exchange, edit and reuse them, and create a pool of
cases for general use. To overcome the challenges of sharing cases across institutions,
a virtual patient working group was established by MedBiquitous, a nonprofit
organization accredited by the American National Standards Institute to develop
information technology standards for virtual patients [20, 58].
VR can also incorporate adaptive educational technologies. Learners vary greatly in
knowledge, and a one-size-fits-all approach is insufficient. Adaptive educational tech-
nologies can address learning challenges presented by complex learning performance
environments such as GHS, and can minimize the amount of time in which advanced
learners are bored and struggling learners are overwhelmed. Adaptive learning technol-
ogies include software that can partially mimic a simulation educator to meet immediate
and long-term needs of learners with minimal or no human intervention [89].
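
A minimal sketch of how such an adaptive loop might behave is given below. It assumes one simple policy, raising scenario difficulty after consecutive passing scores and lowering it after consecutive failures, and every function name, threshold, and score value is an illustrative assumption rather than a description of any particular adaptive learning product.

# A minimal sketch of an adaptive difficulty rule; all names, thresholds,
# and scores are illustrative assumptions.

def adjust_difficulty(level, history, pass_mark=0.8, floor=1, ceiling=5):
    """Return the next difficulty level based on the learner's last two scores."""
    if len(history) < 2:
        return level
    last_two = history[-2:]
    if all(score >= pass_mark for score in last_two):
        return min(level + 1, ceiling)   # keep advanced learners challenged
    if all(score < pass_mark for score in last_two):
        return max(level - 1, floor)     # avoid overwhelming struggling learners
    return level

level, history = 2, []
for score in [0.90, 0.95, 0.55, 0.50, 0.85]:   # simulated assessment results
    history.append(score)
    level = adjust_difficulty(level, history)
    print(f"score {score:.2f} -> next scenario difficulty {level}")

In practice, such rules would be driven by the richer performance data a simulation or learning management system already records.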
9 Simulation Applications in Health Security Exercises

Simulation exercises are an essential component of GHS emergency preparedness
and response. While training is essential, responders need the ability to practice what
they’ve been taught in order to build their confidence to be able to apply those skills
in the event of an emergency. The Homeland Security Exercise and Evaluation
Program (HSEEP) provides a set of guiding principles for exercise programs as well
as a common approach to exercises [90, 91]. The WHO recommends nations include
simulation exercises in monitoring and evaluation to test the actual functionality of
their required International Health Regulations (IHR) core capacity and share les-
sons and best practices with other countries and stakeholders. Protocols for national
simulation exercises include table-top exercises, skill drills, national functional
assessment exercises, or full-scale exercises, which may be combined. Simulation
exercises have been identified as a key component in the validation of core capacities
under the IHR monitoring and evaluation framework. They play a key role in
identifying the strengths and gaps in the development and implementation of
preparedness and response measures [68, 92, 93].
Functional exercises represent an important link between disaster planning and
disaster response. Although these exercises are widely performed, no standardized
method exists for their evaluation. Conducting a functional exercise requires exten-
sive preparation, and large amounts of time and money are often required. The
effectiveness of standard disaster drills as a tool for hospital preparedness is difficult
to determine due to the logistical challenges of conducting these exercises and their
effect on normal healthcare operations [94].
Simulations have been shown to have equivalent results compared with live stan-
dardized patients. Participants rated simulators equivalent with respect to realism,
disease representation, physical examination findings, and administering treatments.
Simulators also offer the advantage of less variability than live actor patients in case
presentation and progression [95]. Interactive screen-based simulation is potentially
easier and more convenient to accomplish with computerized mass casualty scenar-
ios than having live moulaged standardized patients for mass casualty incident drills.
Computerized simulations did not, however, have the psychological or functional
fidelity of standardized patient drills. The ability of the simulation to portray realistic
inputs and outputs by the participant is important [96]. To maximize the benefit of
these exercises, accurate evaluation of the actions carried out during the exercise is
necessary [97]. Computerized simulations have the ability to include built-in evaluations
and debriefing.
Most clinical personnel are unfamiliar with the complex processes, equipment,
and unwavering attention to the small but necessary details associated with
low-frequency, high-consequence GHS events. An advantage of simulation is a
re-creation of a safe and realistic environment where trainees can gain and maintain
essential skills. Simulations may include practice with the use of cognitive aids and
checklists. Simulation may also be used for testing and validating action protocols
before actual implementation in the actual context of care [98]. Internet-
distributable training programs that provide course materials and built-in evaluation
tools to train healthcare workers in high risk infectious disease responses, such as
EVD, are available. Courses include an online self-study component, a hands-on
simulation workshop, and a data driven performance assessment toolset to provide
debriefing [99]. The U.S. military has recognized that simulation training is an
integral component for EID outbreak preparedness, and created a program of
instruction outlining a formalized EVD training program using high fidelity simu-
lation replicating the work environment inside an Ebola treatment unit [100].
Multidisciplinary simulations have demonstrated gaps in systems that could
expose responders to EVD. Walk-through evaluations do not provide information
on latent threats at the same level of granularity as a full-scale simulation. Improve-
ments may not be accomplished with a “one and done” simulation experience, and
longer periods for deeper simulations may be required. This type of effort is not
possible without an investment in simulation educators, technologists, and a robust
simulation infrastructure [101].
Simulations integrating video recording of provider behavior can be used to
assess the effectiveness of PPE techniques. Ultraviolet tracers, liquids that fluoresce
under ultraviolet light, can be used to simulate provider self-contamination while
doffing PPE. The UV tracer is a useful analog of contaminated bodily fluids because
it spreads easily and its spread is decreased with the use of barrier methods. This model
can be used during simulations to measure the effectiveness of different forms of
PPE and provider competencies for appropriately donning and doffing PPE
[102]. Video recording of simulations has been used to identify PPE errors for
failure modes and effects analyses. Human factors methodologies can identify error-
prone steps, delineate the relationship between errors and self-contamination, and
suggest remediation strategies. Simulations can assist in probing processes to iden-
tify latent system hazards and human factor effects [29, 76, 103].
Simulation can be used to teach teamwork for multiple patient casualty scenarios.
Teamwork is a critical aspect in response to multiple patient casualties. Team
situational awareness, team leadership, coordination, and information exchange are
core team processes required for team performance and multiple patient casualty
scenarios. Relevant behaviors and sub-behaviors within these processes can be
observed in simulations [104].

10 Game-Based Learning Health Security Applications

The benefits of simulation are well documented; however, the significant expense in
human resources required to deliver traditional manikin or standardized patient
simulations has resulted in questions about whether such resource-intensive modalities
are needed. Game-based learning (GBL), which falls under the umbrella of simula-
tion, is poised to take on a greater role in GHS training. As the number of serious
games for healthcare training continues to grow, having schemas that organize how
educators approach their development and evaluation is essential [105].
GBL is a disruptive technology that sits at the intersection of high-fidelity
simulation, computer-based learning, and distance education [106]. Games with
the purpose of improving an individual’s knowledge, skills, and attitudes are called
serious games [107]. Similar to aviation simulators which train pilots in emergency
procedures, serious games have been used to train surgeons in situational awareness
for addressing crises and their ability to anticipate nonroutine adverse events during
surgery [108]. Serious games have also been used to train forward combat casualty care
for medical personnel working in hostile, remote environments, and to teach advanced
lifesaving procedures and trauma triage to non-trauma specialists [109–111].
Games should be used for targeted learning objectives, should align with
classroom activities, and should not confuse liking with learning [112]. Educational game
design and development is a process of defining learning and assessment goals,
determining what is needed to achieve those goals, developing an appropriate
simulation, and validating its effectiveness. The methodology often used is the
ADDIE instructional design model: analysis, design, development, implementation,
and evaluation. Portions of the methodology may require a compromise between
educators and developers because instruction and assessment requirements may
dictate selection of a particular genre or platform. Learning outcomes from simulations
also depend on how well the domain instruction is integrated into the simulation, as
explained by cognitive load theory. Cognitive load is the total amount of mental activity
imposed on working memory at a point in time. Extraneous cognitive load is load
caused by any unnecessary stimuli. These are key considerations when designing
educational games [113]. Achieving the proper balance between entertainment and
instruction in a game is essential and can be difficult to attain. Serious games are an
experiential activity, rather than a presentation requiring memorization of facts.
Learning goals and gameplay are often not disclosed to the user. Often the user
learns by playing the game, where discovery in itself may be part of the
gameplay [114].
GBL simulations have the interactivity of high-fidelity simulation with the added
benefits of convenience, scalability, and easy distribution. An important concept in
game-based learning is the concept of flow. A common colloquialism for this
concept is “being in the zone”. Flow is a mental state in which the person is fully
engaged, focused, and committed to the success of the activity. Inducing a flow state
is a key goal in game-based learning. A learner cannot consciously force themselves
into a flow state, but three conditions make flow more likely: (1) the learner must be
immersed in an activity in which the goals are clearly stated; (2) the learner must
believe they possess the skills to overcome the perceived challenge; (3) the task must
include prompt, coherent feedback allowing learners to adjust their performance to
accommodate changing demands [106].
Every learner decision in a game can be exported to a learning management
system. This tracking ability assists learners in assessing their own progress and can be
used for both formative and summative assessment. GBL utilizes the concept of chaining.
Chaining is an instructional concept attributed to B.F. Skinner. Chaining is achieved
by breaking down a complex behavior into multiple individual behaviors called
links. Each link must be mastered before moving onto the next, and each link
reinforces the behaviors mastered before it. This is also known as laddering or
scaffolding. GBL uses this to teach complex tasks through small interwoven,
interactive links [106].
The development of learning games is complex. Game development requires
expertise in GHS, education, and technology development to create the complex
design, modeling, and scoring required to make an effective game. Game develop-
ment can be costly and require lengthy development time. Frameworks for
game development have been proposed using a three-phase development process.
The first phase is preparation and design. This includes identifying and funding the
appropriate members needed to develop the technical components, i.e., game devel-
opers, GHS content subject matter experts, and users. A development schedule that
includes frequent meetings should be agreed upon. The second component in this
phase is a subject matter concepts transfer. The SMEs demonstrate how to perform
tasks, allowing game developers to ask questions and take notes and photographs for
reference. One of the key challenges in game development is difficulty conveying
expert concepts to nonexpert developers. Concepts transfer helps orient the game
developers to the information to be shared or used in the game. Content production,
including any necessary physiological modeling incorporating existing evidence,
guidelines, and expert derived algorithms is completed during this phase with
storyboards describing the flow and required functionality of the game. The second
phase is development of wireframes, which are illustrations of proposed game
components and assist in visual communication design of the structure, functional-
ity, and learner interface. The third phase is a formative evaluation. This starts with
usability testing to identify content, design, functionality, and usability problems
with the game, and should include both subject matter experts and end-users. After
usability testing is complete and all edits have been made, the final product is
delivered for beta testing [115].
Before GHS educators consider using serious games as solutions for GHS
competencies, it is important to understand what problem is being addressed by
the game and whether a proposed claim of effectiveness is indeed trustworthy. The
functionality of a serious game differs from that of a mobile health application.
Mobile health applications communicate information, whereas a game requires the
user to operate or interact with the content, with the ultimate goal to change one’s
behavior in real life, i.e. learning [108].

11 Distributed Telesimulation

Distributed simulation is a method to increase GHS access to SBT worldwide by
providing SBT over the Internet, utilizing real-time videoconferencing or asynchronous
computer-based virtual reality. It can provide just-in-time training to learners at the
right time and place [116]. Simulation by remote facilitation is as effective as
traditional locally facilitated simulation. Remote simulation can
be a viable alternative method, especially where experienced SBT facilitators are
limited [117].
SBT can be delivered electronically to remote areas far from the simulation center
faculty that cannot logistically support their own local simulation program, either
because of space, simulators, or staff. Simulation distributed from a central location
can be provided asynchronously to other facilities regardless of time zone differ-
ences, facilitating participation when learner schedules allow. Distributed simulation
can provide SBT to GHS responders who do not have adequately-resourced simu-
lation programs or the additional capacity to provide required training for new
programs [50]. The concept of distance-based simulation training is based on remote
access to the central simulation facility and its patient simulator from any number of
distant training sites located anywhere that has adequate communications, or alter-
natively having the patient simulator located at a distant site with learners and
connected to a central simulation faculty site. SBT can be conducted under the
guidance of an expert teacher physically present at the central simulation
facility [118].
High-frequency, low-fidelity distributed simulation has been used to prepare
health-care workers on EVD personal protective equipment using virtual reality in
a “fragile health system” with low literacy levels in Liberia. This project utilized
lightweight laptop computers and off-the-shelf (OTS) software to create a virtual
reality immersive and an interactive clinical environment [39]. The American
College of Surgeons Advanced Trauma Life Support course has been effectively
taught by distance learning, including the skills station components of the course.
For troubleshooting, quality control, and for assisting with the skills teaching there
should be faculty at the remote course site [119].
Three domains of successful SBT applications in resource-limited global health
settings have been described. The first is the ability to provide academic
affiliations to academically trained clinicians in remote areas. Secondly, SBT can
provide mastery-based instruction to those who otherwise would not have access to
it. Thirdly, it can develop partnerships between community-based traditional healers
and professional health-care workers [120].
Distributed simulation enables deployment of GHS “train the trainers” programs
to remote areas. This model has the advantage of overcoming cultural barriers to
instruction and can lead to providing a lasting GHS training infrastructure in the
region. There are many benefits to the model. These programs leverage limited
resources to create sustainable educational systems, which are the central part of
developing a public health system. Once a cadre of experienced educators is created
in the host country, education can proceed in an exponential fashion with proper
ongoing support. Another benefit of this model is that consultant trainers do not need to
uproot themselves from their home country, which could be disruptive to their careers
and family. Working with their host to develop curricula allows the idiosyncrasies of
local practice to be incorporated into the curriculum. It is suggested that trainers first
develop their courses alongside their sponsors, then practice administering the
course together, and finally deliver the course with the consultants present only in a
supportive role which can be done through remote simulation [94].
Distance-based distributed simulation integrates multiple innovative technology-
enhanced learning (TEL) concepts, particularly telepresence, focused on digitally
distributed simulation to remote locations. Telepresence is the experience of being
fully present at a live real world location remote from one’s own physical location.
TEL can establish telepresence by various means. The telepresence established by
distance-based distributed simulation facilitates students from different sites being
able to learn vicariously through observation of the simulation at another site.
Vicarious learning by observers and remote debriefing can increase the number of
learners per SBT session and increase remotely delivered SBT utilization. Vicarious
learning has been demonstrated to be as effective as direct learning if augmented
with observation scripts provided to the learners. Observation scripts focus learners’
attention on important objectives of the simulation and can improve peer
feedback [121].
There are logistical and pedagogical challenges to providing SBT in resource-
constrained areas. Limited technology infrastructures may limit SBT to low-fidelity
simulators. Simulation should be combined with hands-on learning, particularly for
psychomotor skills. Care must be taken to ensure that SBT is aligned with local
cultural norms and is respectful of local health-care customs and traditions, including
recognizing that the roles of different professionals may vary, particularly the roles of
non-traditional healers, and the uses of different medications and therapies [122, 123].
Remote facilitated simulation-based learning is technically feasible with
low-cost, pre-existing, and easy-access resources. This can be applied to any
simulator driven by a personal computer. A remote facilitator can control the simu-
lator and provide simulation training and debriefing to trainees. This type of
approach can enable novice facilitators to learn how experienced facilitators run
sessions and debrief remotely. These applications can enhance dissemination of SBT
and globalization of training efficiently, leading to standardization of patient care
[124]. Distributed simulation can be enhanced by integrating application sharing
software into simulations [125]. Having learners utilize GHS-related desktop or
mobile applications that are demonstrated and screen-shared by remote facilitators
can reinforce their use.
Simulation-based medical education may be difficult to adopt in developing
countries because it requires expensive equipment, adequate facilities, and experi-
enced staff members. Challenges to simulation-based training in devel-
oping countries may include difficulty in transporting high-technology equipment
and solving equipment problems that occur on site without communication with
technical assistance [126]. Distributed simulation facilitators should be well pre-
pared and utilize a checklist for simulation related activities to prepare both the
broadcast and the learner remote sites and also utilize a script for instructing local
and remote learners on the expectations of the simulation [127].
A potential barrier to expanding SBT is the availability of qualified SBT instruc-
tors. Allowing instructors to remotely observe and debrief simulation sessions can
make simulation-based instruction more convenient, thus expanding the pool of
instructors available. The debriefing instructor can be remote from the actual simu-
lation and still provide real-time feedback and discussion on the performance of
learners [128].
Instructors may face challenges in mastering the technology and achieving
telepresence with learners [129]. Remotely facilitated simulation can be optimized
through short training sessions for remote learners on the technical aspects, such as
camera and audio equipment familiarization. It has been recommended that a prior
face-to-face meeting may enable nonverbal cues to be recognized during subsequent
teleconferencing and helps to clarify visual and social cues [130].
Learners in remotely facilitated simulation-based training have reported feeling
uncomfortable without instructors present, that communication was a barrier to
learning, and that the quality of instruction was inferior. Learners may be willing
to overlook remote facilitated simulation-based training deficiencies because these
were outweighed by perceived benefits of having access to simulation-based training
versus not having access at all. Although learners may consider remotely facilitated
simulation experiences less positive than locally facilitated ones, this has not been shown
to have a measurable impact on knowledge or learning outcomes [131].
As in live SBT, an integral component of distributed simulation is debriefing,
during which learning and reflection are greatest. To be maximally effective,
debriefing should be performed by facilitators who have both experience in the
subject matter and have a strong understanding of debriefing principles. Qualified
instructors to facilitate post-simulation debriefing may be inadequate in remote
locations, resulting in suboptimal educational experiences for learners. By
connecting SBT educators from geographically distant areas, learners are exposed
to faculty who can provide content expertise and high-quality debriefing [132].
Distributed simulation has the ability to be integrated into existing telemedicine
systems and vice versa. This would accomplish dual purposes of training and
providing real-time or asynchronous mentoring and consultations with higher level
care facilities. Remote telemedicine consultations to minimally-trained providers in
resource-constrained situations has been demonstrated to be effective [133]. Cou-
pling these two technologies can be synergistic for GHS preparedness.

12 Modeling and Simulation for Global Health Security Preparedness

As computing resources become more affordable and modeling and simulation
capabilities become more advanced, modeling and simulation offer increasingly
far-reaching benefits to support GHS training and decision making. Although the
terms modeling and simulation are often used interchangeably, they are distinct
terms. Modeling is a representation of an object or phenomenon, which is used by
simulation. Models are mathematical, physical, or logical representations of a
system, entity, phenomena, or process. Models are used by simulation to predict
the future state. Simulation is a representation of the functioning of a system or
process. Through simulation, a model may be implemented with unlimited varia-
tions, producing complex scenarios. These capabilities allow analysis and under-
standing of how interactions among individual elements can affect the simulated GHS
environment.
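
To make the distinction concrete, the following sketch defines a simple SIR-style difference-equation model and then uses a simulation loop to run it under several "what if" intervention scenarios; the parameter values are illustrative assumptions, not data from any study cited here.

def sir_model(s, i, r, beta, gamma):
    """Model: advance susceptible, infected, and recovered counts by one day."""
    n = s + i + r
    new_infections = beta * s * i / n
    new_recoveries = gamma * i
    return (s - new_infections,
            i + new_infections - new_recoveries,
            r + new_recoveries)

def simulate(beta, gamma, days=120, s=99_990, i=10, r=0):
    """Simulation: repeatedly apply the model and report the epidemic peak."""
    peak = i
    for _ in range(days):
        s, i, r = sir_model(s, i, r, beta, gamma)
        peak = max(peak, i)
    return peak

# "What if" scenarios: vary the transmission rate to mimic interventions.
for label, beta in [("no intervention", 0.30),
                    ("moderate distancing", 0.20),
                    ("strict distancing", 0.12)]:
    print(f"{label:20s} peak infected ~ {simulate(beta, gamma=0.10):,.0f}")

Here the model is the single-step update rule, while the simulation is the repeated application of that rule under varied conditions.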
Modeling and simulation can provide rehearsal environments for GHS responder
personnel. Modeling and simulation can answer “what if” questions and provide an
experimentation or training environment that may not otherwise be realized.
Models and simulations are set up to locate the peaks and valleys and perfor-
mance envelopes, so systems can be optimized prior to development. Dynamic GHS
events, such as pandemics or mass casualty events, challenge GHS
decision-makers with uncertainty, complexity, and rapid change. Modeling and
simulation virtual environments allow learners to interact with the virtual environ-
ments to test hypotheses, such as analyzing acute disease epidemiology by investi-
gating a disease outbreak in a virtual community. Modeling and simulation offer
great potential to inform thinking by enabling the examination of complex
problems [134].
Modeling can be used to assist in GHS, such as mass care after a disaster.
Predicting patient load in mass gatherings is a nonlinear problem, with a nonlinear
relationship between patient presentations and multiple event characteristics. There
is a positive correlation between environmental factors and patient presentations. Modeling can be useful for
predicting patient presentation numbers not only for planned mass gatherings but for
unplanned mass security events [135].
Modeling has been used to assist hospitals in planning for surge operations during
a future influenza pandemic. Models allow for changes in the length of stay,
ventilator usage, and the attack rate of the virus. The Centers for Disease Control
and Prevention (CDC) developed FluSurge to estimate daily patient demand
and resource use during an influenza epidemic [136]. FluSurge is available as a downloadable
Excel spreadsheet online. A software tool, the Hospital Surge Evaluation Tool, was
recently developed by the U.S. Department of Health and Human Services to support
live surge capacity drills and tabletop exercises [137].
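
The core arithmetic behind such surge estimates can be shown in a few lines; the sketch below is a simplified, hypothetical analog of a FluSurge-style calculation rather than the tool's actual formulas, and every parameter value is an assumption chosen only for illustration.

# A simplified, hypothetical analog of a FluSurge-style surge estimate.
# None of these numbers or formulas are taken from the actual CDC tool.

population = 500_000        # catchment population served by the hospital
attack_rate = 0.25          # fraction of the population infected over the wave
hosp_fraction = 0.03        # fraction of the infected who need admission
wave_weeks = 8              # duration of the pandemic wave
mean_los_days = 5           # average hospital length of stay
icu_fraction = 0.15         # fraction of admissions needing ICU/ventilation

admissions_total = population * attack_rate * hosp_fraction
admissions_per_day = admissions_total / (wave_weeks * 7)

# Little's law: average census = arrival rate x average length of stay.
average_census = admissions_per_day * mean_los_days
ventilators_in_use = average_census * icu_fraction

print(f"Total admissions over the wave: {admissions_total:,.0f}")
print(f"Average daily admissions:      {admissions_per_day:,.1f}")
print(f"Average occupied beds:         {average_census:,.0f}")
print(f"Ventilators in use (approx.):  {ventilators_in_use:,.0f}")

The same relationships can be laid out as spreadsheet formulas, which is essentially the form in which FluSurge is distributed.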
Modeling has also been used to predict disaster responses for 25 major US cities
using a hybrid of geographic information systems and dynamic modeling. The
Disaster Response Simulation Model provides emergency preparedness and health
systems planners more realistic estimates for mass casualty events. It identified
longer wait and transport times needed to distribute high numbers of patients to
distant trauma centers in sudden-impact disasters, which created predictable
increases in mortality and trauma center resource utilization [137].
Modeling and simulation can be used to assess and enhance hospital surge
capacity [139]. It is recommended that emergency planners collaborate with
human factors and industrial engineers who are experts in risk analysis, modeling,
and systems analysis. Individual hospitals may assess their surge capacity and flow
constraints by using analytic tools or computational modeling to examine patient
flow analysis, staff utilization efficiencies, hospital bed demand patterns, throughput,
and wait times [140]. Different strategies to expand hospital surge capacity, such as
opening unlicensed beds, canceling elective admissions, and implementing reverse
triage can be interactively evaluated [141]. Computer simulation modeling to assess
the effect on emergency department length of stay of increasing the number of emer-
gency department beds versus altering the admitted patient departures from the ED
has been used to find the optimum patient flow interventions [142].
Modeling has been combined with statistical process control methods to visualize
opportunities to improve surge capacity during a mass casualty incident. Modeling
can allow a reproducible and reusable instrument to measure surge capacity. A
standardized simulation scenario can be used to develop baseline metrics in a
simulation repeated under various conditions to quantify the relative changes in
surge capacity [143]. Modeling has been used to incorporate human movement
trajectory data during triage drills by tracking patient movement and providing
data for optimization of use of limited spatial resources within a triage post [144].
Discrete event simulations are widely used to model a number of healthcare
settings and successfully assess patient flows. Discrete event modeling can be used
to iteratively design and test technologies and work processes, such as an
electronic patient tracking system, allowing assessment of human performance mea-
sures before they are implemented. Unexpected consequences may ensue from
technological solutions that fail to support providers' work activities, with negative
consequences for performance and potential patient safety [145]. Discrete event
modeling is applicable to GHS contingency planning,
such as deployable field hospitals and EVD treatment centers.
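
As a concrete illustration, the sketch below is a minimal discrete event simulation of patients flowing through a fixed number of treatment beds; it is not any of the published models cited in this section, and all arrival rates, treatment times, and bed counts are illustrative assumptions. It answers the kind of "what if" question discussed above, comparing added beds against faster departures for admitted patients.

import heapq
import random

def simulate_ed(n_beds, mean_treat_hours, n_patients=500,
                mean_gap_hours=0.5, seed=1):
    """Minimal discrete event simulation of ED flow; returns mean length of stay (hours)."""
    random.seed(seed)
    arrivals, t = [], 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_gap_hours)   # random inter-arrival times
        arrivals.append(t)
    bed_free = [0.0] * n_beds          # time at which each bed next becomes free
    heapq.heapify(bed_free)
    total_los = 0.0
    for arrival in arrivals:
        free_at = heapq.heappop(bed_free)   # earliest-available bed
        start = max(arrival, free_at)       # wait if no bed is free yet
        finish = start + random.expovariate(1.0 / mean_treat_hours)
        heapq.heappush(bed_free, finish)
        total_los += finish - arrival       # waiting time plus treatment time
    return total_los / n_patients

# "What if": add beds versus speed up admitted-patient departures.
print("baseline        :", round(simulate_ed(n_beds=10, mean_treat_hours=4.0), 1), "h")
print("two extra beds  :", round(simulate_ed(n_beds=12, mean_treat_hours=4.0), 1), "h")
print("faster departure:", round(simulate_ed(n_beds=10, mean_treat_hours=3.5), 1), "h")

Even a toy run like this makes the trade-off between capacity and throughput interventions visible before any real-world change is attempted.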
Before using a simulation to draw conclusions or provide recommendations, it is
necessary to build the blocks of the modeled process, flows, resources, and con-
straints, acquire accurate figures for input to the system, and validate the model
against empirical evidence. Professional versions of such software can have prohib-
itive purchasing or licensing costs and require significant internal or consulting
resources to program, test, and produce accurate estimates. Spreadsheet simulations
have been explored for simulation modeling and are nearly as effective as traditional
simulations but easier to use, understand, and implement. Spreadsheet software is
widely available at a fraction of the cost of discrete event simulation software [146].
Accurate modeling requires data, and data on GHS events can be challenging to
obtain. To fill this gap GHS planners can leverage military medical data collection
and modeling and simulation efforts. The U.S. Department of Defense has collected
data on wounds, diseases, treatments and patient outcomes in military conflicts since
2004. This data makes up part of the Joint Theater Trauma Registry and includes
similar data collected by other military branches. It has been proposed to combine this
data with nonmilitary data such as from the National Trauma Databank to develop a
modeling and simulation databank for medical disaster responses, including primary
care for illnesses that may or may not be directly related to the disaster itself.
However, rigorous long-term efforts must be made to collect disaster response
patient care data during GHS events, including the number and types of injuries
and illnesses seen by responders. Empirically-based modeling and simulation can
help GHS disaster planners be prepared for the infrequent but “Predictable surprise”
[147, 148].
13 Conclusion

SBT and modeling have the potential to be GHS workforce multipliers by reaching
out to responders who have limited access to training. SBT has demonstrated its
effectiveness and positive return on investment. Challenges with initial front-end
funding remain. An adequate Internet infrastructure is required to effectively utilize
remote distributed SBT, which may limit SBT deployment in resource-constrained
regions. New technologies continue to increase Internet coverage in developing
countries and improve bandwidth in developed countries. Instructional designers
and serious game developers are continuing to decrease the need for high-speed
networks while simultaneously mobile device manufacturers are increasing the
processing speeds and memory of mobile devices.
GHS educators need to learn how to leverage SBT technologies to reach the
public health workforce. GHS educators also will need to collaborate with software
developers and instructional designers to ensure that the goals of increasing public
health workers’ competencies are achieved by aligning SBT fidelity with learner
objectives.
SBT is a tool, not a technology, and simulation faculty and facilitators need to be
familiar with the principles of simulation education, particularly learner immersion
and debriefing skills. GHS leaders should prioritize obtaining funding support for
increased SBT deployment to address the public health workforce shortages and
capability gaps.

References

1. Green MS et al (2018) Confronting the threat of bioterrorism: realities, challenges, and
defensive strategies. Lancet
2. Gardner AK et al (2016) Using simulation for disaster preparedness. Surgery
160(3):565–570
3. Executive order – advancing the global health security agenda to achieve a world safe and
secure from infectious disease Threats. 2014
4. Global Health Workforce Alliance (2015) A universal truth: no health without a workforce.
World Health Organization, Geneva
5. Heymann DL, Takema LC, Fidler DP, Tappero JW, Thomas MJ, Frieden TR, Yach D,
Nishtar S, Kalache A, Olliaro PL, Torreele E, Gostin LO, Ndomondo-Sigonda M,
Carpenter D, Rushton S, Lillywhite L, Devkota B, Koser K, Yates R, Dhillon RS,
Rannan-Eliya RP (2015) Global health security: the wider lessons from the West African
Ebola virus disease epidemic. Lancet 385:1884–1901
6. Haussig JM et al (2017) The European Medical Corps: first public health team mission and
future perspectives. Eur Surveill 22(37):1–6
7. U.S. Department of Health and Human Services (2016) Report of the Independent Panel on
the U.S. Department of Health and Human Services (HHS) Ebola Response, Retrieved from
https://www.phe.gov/preparedness/responders/ebola/ebolaresponsereport/documents/ebola-
panel.pdf
8. Global health security agenda: getting ahead of the curve on epidemic threats. Testimony of
Dr. Rajiv Shah, USAID Administrator, before the House Committee on Foreign Affairs, in
House Committee on Foreign Affairs
9. Global Burden of Disease 2017 Sustainable Development Goals Collaborators (2018)
10. Sands P, Mundaca-Shah C, Dzau VJ (2016) The neglected dimension of global security – a
framework for countering infectious-disease crises. N Engl J Med 374(13):1281–1287
11. Nuzzo JB, Cicero AJ, Inglesby TV (2017) The importance of continued US investment to
sustain momentum toward global health security. J Am Med Assoc 318(24):2423–2424
12. Burkle FM (2016) The World Health Organization global health emergency workforce: what
role will the United States play? Disaster Med Public Health Prep 10:531–535. https://doi.org/10.1017/dmp.2016.114
13. Daily E, Padjen P, Birnbaum M (2010) A review of competencies developed for disaster
healthcare providers; limitations of current processes and applicability. Prehosp Disaster
Med 25(5):387–395
14. Amat Camacho N, Burkle FM, Ingrassia PL, Ragazzoni L, Redmond A, Norton I, von
Schreeb J (2016) Education and training of emergency medical teams: recommendations for
a global operational learning framework. PLOS Curr Disasters. https://doi.org/10.1371/
currents.dis.292033689209611ad5e4a7a3e61520d0
15. Evans DP et al (2016) Innovation in graduate education for health professionals in human-
itarian emergencies. Prehosp Disaster Med 31(5):532–538
16. Hansoti B, Kellog DS, Aberle SJ, Broccoli MC, Feden J, French A et al (2016)
Preparing emergency physicians for acute disaster response: a review of current training
opportunities in the U.S. Prehospital Disaster Med 31(6):1–5. https://doi.org/10.1017/
S1049023X16000820
17. Koenig KL, Tsai SH (2011) Crisis standards of care: refocusing health care goals during
catastrophic disasters and emergencies. J Exp Clin Med 3(4):159–165
18. Noguchi N, Inoue S, Shimanoe C, Shibayama K, Matsunaga H, Tanaka S, Ishibashi A,
Shinchi K (2016) What kinds of skills are necessary for physicians involved in international
disaster response? Prehosp Disaster Med 31(4):397–406
19. Subbarao I, Lyznicki JM, Hsu EB, Gebbie KM, Markenson D, Barzansky B, Armstrong JH,
Cassimatis EG, Coule P, Dallas C, King R, Rubinson L, Sattin R, Swienton R, Lillibridge S,
Burkle FM, Schwartz RB, James JJ (2008) A consensus-based educational framework and
competency set for the discipline of disaster medicine and public health preparedness.
Disaster Med Public Health Prep 2:57–68
20. Triola M, Feldman H, Kalet A, Zabar S, Kachur E, Gillespie C, Anderson A, Griesser C,
Lipkin M (2006) A randomized trial of teaching clinical skills using virtual and live
standardized patients. J Gen Intern Med 21(5):424–429
21. Markenson D, Dimaggio C, Redlener I (2005) Preparing health professions students
for terrorism, disaster, and public health emergencies: core competencies. Acad Med
80(6):517–526
22. Curtis HA, Trang K, Chason KW, Biddinger PD (2018) Video-based learning versus
traditional lecture for instructing emergency medicine residents in disaster medicine princi-
ples of mass triage, decontamination, and personal protective equipment. Prehosp Disaster
Med 33(1):7–12. https://doi.org/10.1017/S1049023X1700718X
23. Issenberg SB, McGaghie WC, Petrusa ER, Gordon DL, Scalese RJ (2005) Features and uses
of high-fidelity medical simulations that lead to effective learning: a BEME systematic
review. Med Teach 27(1):10–28. https://doi.org/10.1080/01421590500046924
24. Fletcher JD, Wind AP (2013) Cost considerations in using simulations for medical training.
Mil Med 178(October Supplement):37–46. https://doi.org/10.7205/MILMED-D-13-00258
25. Friedl KE, O’Neil HF (2013) Designing and using computer simulations in medical
education and training: an introduction. Mil Med 178:1–6
26. Ziv A, Wolpe PR, Small SD (2003) Simulation-based medical education: an ethical
imperative. Acad Med 78(8):783–788
27. Gaba DM (2004) The future vision of simulation in health care. Qual Saf Health Care
13(Suppl 1):i2–i10
28. Subbarao I, Bond WF, Johnson C (2006) Using innovative simulation modalities for
civilian-based, chemical, biological, radiological, nuclear, and explosive training in the
acute management of terrorist victims: a pilot study. Prehosp Disaster Med 21(4):272–275
29. Ten Eyck RP (2011) Simulation in emergency medicine training. Pediatr Emerg Care
27(4):333–344
30. Cook DB, Hatala R, Brydges R, Zendejas B, Szostek JH, Wang A, Erwin PJ, Hamstra SJ
(2011) Technology-enhanced simulation for health professions education: a systematic
review and meta-analysis. J Am Med Assoc 306(9):978–988
31. Elcin M, Onan A, Odabasi O, Saylam M, Ilhan H, Kockaya PD, Gurcuoglu I, Uckuyu Y,
Cengiz D, Nacar OA (2016) Developing a simulation-based training program for the
prehospital professionals and students on the management of middle east respiratory
syndrome. Simul Healthc 11(6):394–403. https://doi.org/10.1097/SIH.0000000000000198
32. McGaghie WC, Issenberg SB, Barsuk JH, Wayne DB (2014) A critical review of simulation-
based mastery learning with translational outcomes. Med Educ 48:375–385. https://doi.org/
10.1111/medu.12391
33. Wang EE (2011) Simulation and adult learning. Dis Mon 57:664–668. https://doi.org/10.
1016/j.disamonth.2011.08.017
34. Zigmont JJ, Kappus LJ, Sudikoff SN (2011) Theoretical foundations of learning through
simulation. Semin Perinatol 35:47–51. https://doi.org/10.1053/j.semperi.2011.01.002
35. McGaghie WC, Harris IB (2018) Learning theory foundations of simulation-based mastery
learning. Simul Healthc 13(3S):S15–S20. https://doi.org/10.1097/SIH.0000000000000279
36. Hughes PG, Crespo M, Maier T, Whitman A, Ahmed R (2016) Ten tips for maximizing the
effectiveness of emergency medicine procedure laboratories. J Am Osteopath Assoc
116:384–390. https://doi.org/10.7556/jaoa.2016.079
37. Forehand M (2005) Bloom's taxonomy: original and revised. In Orey M (ed) Emerging
perspectives on learning, teaching, and technology. https://www.d41.org/cms/lib/
IL01904672/Centricity/Domain/422/BloomsTaxonomy.pdf
38. Davis AJ, Fierro L, Guptill M, Kiemeney M, Brown L, Smith DD, Young TP (2017)
Practical application of educational theory for learning technical skills in emergency med-
icine. Ann Emerg Med 70(3):402–405. https://doi.org/10.1016/j.annemergmed.2017.04.026
39. Gale TCE, Chattterjee A, Mellor NE, Allan RJ (2016) Health worker focused distributed
simulation for improving capability of health systems in Liberia. Simul Healthc
11(2):75–81. https://doi.org/10.1097/SIH.0000000000000156
40. Wier GS, Tree R, Nusr R (2017) Training effectiveness of a wide area virtual
environment in medical simulation. Simul Healthc 12(1):28–40. https://doi.org/10.1097/
SIH/0000000000000207
41. Vince Lombardi Jr. https://www.goodreads.com/quotes/29093-practice-does-not-make-per
fect-perfect-practice-makes-perfect
42. Ericsson KA (2008) Deliberate practice and acquisition of expert performance: a general
overview. Acad Emerg Med 15:988–994
43. Ericsson KA (2008) Deliberate practice and the acquisition of expert performance: a general
overview. Acad Med 15(11):988–994
44. Pusic MV, Boutis K, McGaghie WC (2018) Role of scientific theory in simulation education
research. Simul Healthc 13(3S):S7–S14. https://doi.org/10.1097/SIH.0000000000000282
45. Munro A, Clark RE (2013) Cognitive task analysis-based design and authoring software for
simulation training. Mil Med 178(October Suppl):7–14. https://doi.org/10.7205/MILMED-
D-13-00265
46. Chiniara G, Cole G, Brisban K, Huffman D, Cragg B, Lamacchia M, Norman D, Canadian
Network for Simulation in Healthcare Guidelines Working Group (2013) Simulation in
healthcare: a taxonomy and a conceptual framework for instructional design and media
selection. Med Teach 35:e1380–e1395. https://doi.org/10.3109/1042159X.2012.733451
47. Sawyer T, Eppich W, Brett-Fleegler M, Grant V, Cheng A. More than one way to debrief: a
critical review of healthcare simulation debriefing methods. Simul Healthc 11(3). https://doi.
org/10.1097/SIH.0000000000000148
48. Moore GS, Perlow A, Judge C, Koh H (2006) Using blended learning in training the public
health workforce in emergency preparedness. Public Health Rep 121:217–221
49. Dankbaar MEWP, Roozeboom MB, Oprins EAPBP, Rutten F, van Merrienboer JJG,
van Saase JLCM, Shuit SCE (2017) Preparing residents effectively in emergency skills
training with a serious game. Simul Healthc 12(1):9–16. https://doi.org/10.1097/SIH.
0000000000000194
50. Ingrassia PL, Ragazzoni L, Tengattnini M, Carenzo L, Della Corte F (2014) Nationwide
program of education for undergraduates in the field of disaster medicine: development of
a core curriculum centered on blended learning and simulation tools. Prehosp Disaster Med
29(5):508–515. https://doi.org/10.1017/S1049023X14000831
51. Chandler T, Qureshi K, Gebbie KM, Morse SS (2008) Teaching emergency preparedness to
public health workers: use of blended learning in web-based training. Public Health Rep
123:676–680
52. Phrampus PE, O'Donnell JM, Farkas D, Abernathy D, Brownlee K, Dongilli T, Martin S
(2016) Rapid development and deployment of Ebola readiness training across an academic
health system. Simul Healthc 11(2):82–88. https://doi.org/10.1097/SIH.000000000000013
53. Miller JL, Rambeck JH, Snyder A (2014) Improving emergency preparedness
system readiness through simulation and interprofessional education. Public Health Rep
129(Suppl 4):129–135
54. Brydges R, Nair P, Ma I, Shanks D, Hatala R (2012) Directed self-regulated learning versus
instructor-regulated learning in simulation training. Med Educ 46:648–656. https://doi.org/
10.1111/j.1365-2923.2012.04268.x
55. Issenberg SB, Scalese RJ (2008) Simulation in health care education. Perspect Biol Med
51(1):31–46. Johns Hopkins University Press. https://doi.org/10.1353/pbm.2008.0004
56. Woodruff AE, Jensen M, Loeffler W, Avery L (2014) Advanced screencasting with
embedded assessments in pathophysiology and therapeutics course modules. Am J Pharm
Educ 78(6):Article 128
57. Verheul MLMI, Duckers MLA, Visser BB, Beerens RJJ, Bierens JJJLM (2018) Disaster
exercises to prepare hospitals for mass-casualty incidents: does it contribute to preparedness
or is it ritualism? Prehosp Disaster Med 33(4):387–393
58. Triola MM, Campion N, McGee JB, Albright S, Greene P, Smothers V, Ellaway R (2007)
An XML standard for virtual patients: exchanging case-based simulations in medical
education. AMIA Ann Symp Proc AMIA Symp 2007:741–745
59. Padilla JJ, Diallo SY, Armstrong RK (2018) Toward live virtual constructive simulations in
healthcare learning. Simul Healthc 13:S35–S40
60. Noble C (2002) The relationship between fidelity and learning in aviation training and
assessment. J Air Transp 7(3):33–49
61. Rehman A, Mitman R, Reynolds M (1995) A handbook of flight simulation requirements
for human factors research, Technical report no. DOT/FAA/CT-TN96/46. Crew Systems
Ergonomics Information Analysis Center, Wright-Patterson AFB
62. Diekmann P, Gaba D, Rall M (2007) Deepening the theoretical foundations of patient
simulation as social practice. Simul Healthc 2:183–193
63. Hamstra SJ, Brydges R, Hatala R, Zendejas B, Cook DB (2014) Reconsidering fidelity in
simulation-based training. Acad Med 89(4):387–392. https://doi.org/10.1097/ACM.
0000000000000130
64. Talbot TB (2013) Balancing physiology, anatomy, and immersion: how much biological
fidelity is necessary in a medical simulation? Mil Med 178:28–36. https://doi.org/10.7205/
MLMED-D-13-00212
65. Kyaw Tun J, Alinier G, Tang J (2015) Redefining simulation fidelity for healthcare
education. Simul Gaming 46(2):159–174. https://doi.org/10.1177/1046878115576103
66. Hart D, Rush R, Rule G, Clinton J, Beilman G, Anders S, Brown R, McNeil MA, Reihson T,
Chipman J, Sweet R (2017) Training and assessing critical airway, breathing, and hemor-
rhage control procedures for trauma care: live tissue versus synthetic models. Acad Emerg
Med. https://doi.org/10.1111/acem.13340
67. Savage EC, Tenn C, Vartanian O, Blackler K, Sullivan-Kwantes W, Garrett M, Blais A,
Jarmasz J, Peng H, Pannell D, Tien HC (2015) A comparison of live tissue training and high-
fidelity patient simulator: a pilot study in battlefield trauma training. J Trauma Acute Care
Surg 79(4, Suppl 1):S157–S163. https://doi.org/10.1097/TA0000000000000668
68. Hauglum SD et al (2018) Evaluation of a low-cost, high-fidelity animal model to train
graduate advanced practice nursing students in the performance of ultrasound-guided central
line catheter insertion. Simul Healthc 13(5):341–347
69. Advanced Modular Manikin. https://www.advancedmodularmanikin.com/about.html
70. Bewley WL, O’Neil HF (2013) Evaluation of medical simulations. Mil Med 178:64–75
71. Patel V, Aggarwal R, Cohen D, Taylor D, Darzi A (2013) Implementation of an interactive
virtual-world simulation for structured surgeon assessment of clinical scenarios. J Am Coll
Surg 217(2):270–279. https://doi.org/10.1016/j.jamcollsurg.2013.03.023
72. Bai X, Duncan RO, Horowitz BP, Graffeo JM, Godstein SL, Lavin J (2012) The added value
of 3D simulations in healthcare education. Int J Nurs Educ 4(2):67–72
73. Cohen D, Sevdalis N, Patel V (2013) Tactical and operational response to major incidents:
feasibility and reliability of skills assessment using novel virtual environments. Resuscita-
tion 84:992–998
74. Wilkerson W, Avstreih D, Gruppen L, Beier KP, Woolliscroft J Using immersive simulation
for training first responders for mass casualty incidents. Acad Emerg Med 15:1152–1159.
https://doi.org/10.1111/j.1553-2712.2008.00223.x
75. Heinrichs WL, Youngblood P, Harter P, Kusumoto L, Dev P (2010) Training healthcare
personnel for mass-casualty incidents in a virtual emergency department: VED II. Prehosp
Disaster Med 25(5):424–432
76. Lino JA, Gomes GC, Sousa NDSVC, Carvalho AK, Dinez MEB, Junior ABV (2016)
A critical review of mechanical ventilation virtual simulators: is it time to use them?
JMIR Med Educ 2(1):e8. https://doi.org/10.2196/mededu.5350
77. Foronda CL, Alfes CM, Dev P, Kleinheksel AJ, Nelson DA, O’Donnell JM, Samosky JT
(2017) Emerging technologies in nursing education. Nurse Educ 42(1):14–17. https://doi.
org/10.1097/NNE.000000000000295
78. Koenig K (2010) Editorial comments-training healthcare personnel for mass casualty inci-
dents in a virtual emergency department: VED II. Prehosp Disaster Med 25(5):433–434
79. Ingrassia PL et al (2015) Virtual reality and live simulation: a comparison between two
simulation tools for assessing mass casualty triage skills. Eur J Emerg Med 22:121–127
80. Farra S, Miller ET (2012) Integrative review: virtual disaster training. J Nurs Educ Pract
3(3):93–101. https://doi.org/10.5430/jnep.v3n3p93
81. Andreatta PB, Maslowski SE, Petty S, Shim W, Marsh M, Hall T, Stern S, Frankel J (2010)
Virtual reality triage training provides a viable solution for disaster-preparedness. Acad
Emerg Med 17(8):870–876. https://doi.org/10.1111/j.1553-2712.2010.00728.x
82. Kizakevich PN, Lux l, Steve Duncan S, Curry Guinn C (2003) Virtual simulated patients for
bioterrorism preparedness training. Stud Healthc Inform 94:165–167
83. Foronda CL, Shubeck K, Swoboda SM, Hudson KW, Budhathoki C, Sullivan N, Hu X
(2016) Impact of virtual simulationto tech concepts of disaster triage. Clin Simul Nurs
12(4):137–144. https://doi.org/10.1016/j.ecns.2016.02.004
84. Vincent DS, Sherstyuk A, Burgess LPA, Connolly K (2008) Teaching mass casualty
triage skills using immersive three-dimensional virtual reality. Acad Emerg Med
15(11):1160–1165. https://doi.org/10.1111/j.1553-2712.2008.00191.x
85. Youngblood P, Harter PM, Srivastava S, Moffett S, Heinrichs WL, Dev P (2008) Design,
development, and evaluation of an online virtual emergency department for training trauma
teams. Simul Healthc 3(3):146–153
Simulation and Modeling Applications in Global Health Security 337

86. Ragazzoni L, Ingrassia L, Echeverri L, Maccapani F, Berryman L, Burkle F, Della Corte F


(2015) Virtual reality simulation training for Ebola deployment. Disaster Med Public Health
Prep 9(5):543–546. https://doi.org/10.1017/dmp.2015.36
87. Pucher PH, Batrick N, Taylor D, Chaudery M, Cohen D, Darzi A (2014) Virtual-world
hospital simulation for real-world disaster response: design and validation of a virtual reality
simulator for mass casualty incident management. J Trauma Acute Care Surg 77(2):315–321
88. Hsu EB, Jenckes MW, Catlett CL, Robinson KA, Feuerstein C, Cosgrove SE, Green GB,
Bass EB (2004) Effectiveness of hospital staff mass casualty incident training methods:
a systematic literature review. Prehosp Disaster Med 19(3):191–199
89. Lineberry M, Dev P, Lane HC, Talbot TB (2018) Learner-adaptive educational technology
for simulation in healthcare: foundations and opportunities. Simul Healthc 13(3S):S21–S27.
https://doi.org/10.1097/SIH.0000000000000274
90. Federal Emergency Management Agency (2018) Homeland Security Exercise Evaluation
Program. Retrieved from: https://www.fema.gov/hseep
91. Hanson K, Hernandez L, Banaski JA (2018) Building simulation exercise capacity in Latin
America to manage public health emergencies. Health Secur 16:S98–S102
92. World Health Organization (2017) WHO Simulation Exercise Manual. World Health
Organization, Geneva
93. WHO concept note: development, monitoring and evaluation of functional core capacity for
implementing the International Health Regulations (2005) http://www.who.int/ihr/publica
tions/concept_note_201407.pdf?ua¼1
94. Weiner SG, Totten VY, Jacquet GA, Douglass K, Birnbaumer DM, Promes SB, Martin IBK
(2013) Effective teaching and feedback skills for international emergency medicine “train
the trainers” programs. J Emerg Med 45(5):718–725. https://doi.org/10.1016/j.jemermed.
2013.04.040
95. Gillett B, Peckler B, Sinert R, Onkst C, Nabors S, Issley S, Maguire C, Galwankarm S,
Arquilla B (2008) Simulation in a disaster drill: comparison of high-fidelity simulators
versus trained actors. Acad Emerg Med 15:1144–1151. https://doi.org/10.1111/j.1553-
2712.2008.00198.x
96. Claudius I, Kaji A, Santillanes G, Cicero M (2015) Comparison of computerized patients
versus live moulaged actors for a mass-casualty drill. Prehosp Disaster Med 30(5):438–432.
https://doi.org/10.1017/S1049023X15004963
97. Ingrassia PL, Prato F, Geddo A, Colombo D, Tengattini M, Calligaro S, La Mura F, Franc
JM, Della Corte F (2010) Evaluation of medical management during a mass casualty incident
exercise: an objective assessment tool to enhance direct observation. J Emerg Med
39:629–636
98. Rojo E, Oruna C, Sierra D, Garcia G, Del Moral I, Maestre JM (2016) Simulation as a tool to
facilitate practice changes in teams taking care of patients under the investigation for Ebola
virus disease in Spain. Simul Healthc 11(2):89–93. https://doi.org/10.1097/SIH.
0000000000000139
99. Menkin-Smth L, Lehman-Huskamp K, Schaefer J, Alfred M, Catchpole K, Pockrus B,
Wilson DA, Reves JG (2018) A pilot trial of online simulation training for Ebola response
training. Health Secur 16(6). https://doi.org/10.1089/hs.2018.0055
100. Delaney HM, Lucero PF, Maves RC, Lawler JV, Maddry JK, Biever KA, Murray CK (2016)
Ebola virus disease simulation case series patient with ebola virus disease in the prodromal
phase of illness (scenario 1), the “wet” gastrointestinal phase of illness (scenario 2), and the
late, critically ill phase of disease (scenario 3). Simul Healthc 11(2):106–116. https://doi.org/
10.1097/SIH.0000000000000115
101. Biddell EA, Vandersall BL, Bailes SA, Estephan SA, Ferrara LA, Nagy KM,
O’Connell JL, Patterson MD (2016) Use of simulation to gauge preparedness for Ebola at
a free-standing children’s hospital. Simul Healthc 11(2):94–99. https://doi.org/10.1097/SIH.
0000000000000134
102. Drew JL, Turner J, Mugele J, Hasty G, Duncan T, Zaiser R, Cooper D (2016) Beating the
spread: developing a simulation analog for contagious body fluids. Simul Healthc
11:100–105. https://doi.org/10.1097/SIH.0000000000000157
338 A. J. French

103. Davis MA, Landesman R, Tadmor B (2008) Initial test of emergency procedure performance
in temporary negative pressure isolation by using simulation technologies. Ann Emerg Med
51(4):420–425
104. Marlow SL, Bedwell WL, Zajac S, Reyes DL, LaMar M, Khan S, Lopreiato J, Salas E
(2018) Multiple patient casualty scenarios: a measurement tool for teamwork. Simul Healthc
13(6):394–403
105. Wang R, DeMaria S, Goldberg A, Katz D (2016) A systematic review of serious games in
training health care professionals. Simul Healthc 11(1):41–51. https://doi.org/10.1097/
SIH0000000000000118
106. Taekman JM, Shelley K (2010) Virtual environments in healthcare: immersion, disruption,
and flow. Int Anesthesiol Clin 48(3):101–121
107. Thompson D, Baranowski T, Buday R, Baranowski J, Thompson V, Jago R, Griffith J
(2010) Simul Gaming. August 1; 41(4): 587–606. https://doi.org/10.1177/
1046878108328087
108. Graafland M, Dankbaar M, Mert A, Lagro J, De Wit-Zuurendonk L, Schuit S, Schaafstal A,
Schijven M (2012) How to systematically assess serious games applied to health care. JMIR
Serious Games 2(2):e11. https://doi.org/10.2196/games.3825
108A. Graafland M, Bemelman WA, Schijven MP (2017) Game-based training improves the
surgeon’s situational awareness in the operation room: a randomized controlled trial. Surg
Endosc. https://doi.org/10.1007/s00464-017-5456-6
109. Pasquier P, Merat S, Malgras B, Petit L, Queran X, Bay C, Boutonnet M, Jault P, Ausset S,
Auroy Y, Perez JP, Tesniere A, Pons F, Mignon A (2016) A serious game for massive
training and assessment of French soldiers involved in forward combat casualty care
(3D-SC1): development and deployment. JMIR Serious Games 4(1):e5. https://doi.org/10.
2196/games.5340
110. Planchon J, Vacher A, Comblet J, Rabatel E, Darses F, Mignon A, Pasquier P (2017) Serious
game training improves performance in combat life-saving interventions. Injury 49:86–92.
https://doi.org/10.1016/j.injury.2017.10.025
111. Mohan D, Farris C, Fischhoff B, Rosengart MR, Angus DC, Yealy DM, Wallace DL,
Barnato AE (2017) Efficacy of educational video game versus traditional educational apps
at improving physician decision making in trauma triage: randomized controlled trial. Br
Med J 359:5416. https://doi.org/10.1136/bmj.j5416
112. Mayer RE (2016) What should be the role of computer games in education? Policy Insights
Behav Brain Sci. https://doi.org/10.1177/2372732215621311
113. Koenig A, Iseli M, Wainess R, Lee JJ (2013) Assessment methodology for computer-based
instructional simulations. Mil Med 178(October Suppl):47–54. https://doi.org/10.7205/
MILMED-D-13-00217
114. Giunti G, Baum A, Giunta D, Plazzotta F, Benitez S, Gomez A, Luna D, Gonzolez F, de
Quiros B (2015) Serious games: a concise overview on what they are and their potential
applications to healthcare. Medinfo 2015. https://doi.org/10.3233/978-1-61499-564-7-386
115. Olszewski AE, Wolbrink TA (2017) Serious gaming in medical education: a proposed
structured framework for game development. Simul Healthc 12(4):240–253
116. Kneebone R, Arora S, King D (2010) Distributed simulation-accessible immersive training.
Med Teach 32(1):65–70
117. Ohta K, Kurosawa H, Shiima Y, Ikeyama T, Scott J, Hayes S, Gould M, Buchanan N,
Nadkarni V, Nishisaki A (2017) The effectiveness of remote facilitation in simulation-based
pediatric resuscitation training for medical students. Pediatr Emerg Care 33(8):564–556
118. Von Lubitz DKJE, Levine H, Patricelli F, Richir S (2008) Distributed simulation-based
clinical training: going beyond the obvious. In: Kyle RR, Murray WB (eds) Clinical
simulation operations, engineering, and management. Academic, Burlington, pp 591–622
119. Ali J, Sorvari A, Camera S, Kinach M, Mohammed S, Pandya A (2013) Telemedicine as a
potential medium for teaching the Advanced Trauma Life Support (ATLS) course. J Surg
Educ 70(2):258–264. https://doi.org/10.1016/j.jsurg.2012.11.008
120. Andreatta P (2017) Healthcare simulation in resource-limited regions and global health
applications. Simul Healthc 12(3):135–138. https://doi.org/10.1097/SIH.0000000000000220
Simulation and Modeling Applications in Global Health Security 339

121. Stegmann K, Pilz F, Siebeck M, Fischer F (2012) Vicarious learning during simulations: is it
more effective than hands-on training? Med Educ 46:1001–1008. https://doi.org/10.1111/j.
1365-2923.2012.04344.x
122. Perry MF, Seto TL, Vasquez JC, Josyula S, Rule ARL, Rule DW, Kamath-Rayne BD (2018)
The influence of culture on teamwork and communication in a simulation-based resuscita-
tion training at a community hospital in Honduras. Simul Healthc 13(5):363–370. https://doi.
org/10.1097/SIH.0000000000000323
123. Pitt MB, Eppich WJ, Shane ML, Butteris SM (2017) Using simulation in global health.
Simul Healthc 12(3):177–181. https://doi.org/10.1097/SIH.0000000000000209
124. Ikeyama T, Shimizu N, Ohta KD (2012) Low-cost and ready-to-go remote-facilitated
simulation-based learning. Simul Healthc 7(1):35–39
125. Erickson D, Greer L, Belard A, Tinnel B, O’Connell J (2010) A hybrid integrated services
digital network-internet protocol solution for resident education. Telemed E-Health. https://
doi.org/10.1089/tmj.2009.0132
126. Kim H (2017) Experience of simulation-based training in a developing country. Simul
Healthc 12(3):202. https://doi.org/10.1097/SIH.0000000000000203
127. Smith-Stoner M (2009) Web-based broadcast of simulations. Nurse Educ 34(6):266–269
128. Hayden EM, Navedo DD, Gordon JA (2012) Web-conferenced simulation sessions: a
satisfaction survey of clinical simulation encounters via remote supervision. Telemed
E-Health 18(7):525–529
129. Christensen MD, Oestergaard D, Dieckmann P, Watterson LM (2018) Learners’ perceptions
during simulation-based training: an interview study comparing remote versus locally
facilitated simulation-based training. Simul Healthc 13(5):306–315. https://doi.org/10.
1097/SIH.0000000000000300
130. Cameron M, Ray R, Sabesan S (2015) Remote supervision of medical training via video-
conferencing in northern Australia: a qualitative study of the perspectives of supervisors and
trainees. Br Med J Open 5:e006444
131. Christensen MD, Rieger K, Tan S, Dieckmann P, Oestergaard D, Watterson LM (2015)
Remotely versus locally facilitated simulation-based training in management of the deteri-
orating patient by newly graduated health professionals. Simul Healthc 10(6):352–359.
https://doi.org/10.1097/SIH.0000000000000123
132. Ahmed R, Gardner AK, Atkinson SS, Gable B (2014) Teledebriefing: connecting learners to
faculty members. Clin Teach 11:270–273
133. Gerhardt R, Berry J, Mabry R, Fluornoy L, Arnold RG, Hults C, Robinson JB, Thaxton RA,
Cestero R, Heiner JD, Koller AR, Cox K, Patterson JN, Dalton WR, McKeague AL,
Gilbert G, Manemeit C, Adams BD (2014) Evaluation of contingency telemedical support
to improve casualty care at a simulated military intermediate resuscitation facility: the
EM-ANGEL study. J Spec Med Oper Med 14(1):50–57
134. Modeling and Simulation Committee (2011) A primer on modeling and simulation. National
Training and Simulation Association
135. Arbon P, Bottema M, Zeitz K, Lund A, Turris S, Anikeeva O, Steenkamp M (2018)
Nonlinear modelling for predicting patient presentation rates for mass gatherings. Prehosp
Disaster Med 33(4):362–367. https://doi.org/10.1017/S1049023X18000493
136. Zhang X, Meltzer MI, Wortley PM (2006) FluSurge- a tool to estimate demand for hospital
services during the next pandemic. Med Decis Mak 26(6):617–623. https://doi.org/10.1177/
0272989X06295359
137. Hospital Surge Evaluation Tool. Public Health Emergency website. https://www.phe.gov/
preparedness/planning/hpp/surge/pages/default.aspx
138. Carr BG, Walsh L, Williams JC, Pryor JP, Branas CC (2016) A geographic simulation model
for the treatment of trauma patients in disasters. Prehosp Disaster Med 31(4):413–421.
https://doi.org/10.1017/S1049023X1600510
139. Mumma BE, McCue JY, Li CS, Holmes JF (2014) Effects of emergency department
expansion on emergency department patient flow. Acad Emerg Med 21(5):504–509.
https://doi.org/10.1111/acem.12366
340 A. J. French

140. Kaji AH, Bair A, Okuda Y, Kobayashi L, Khare R, Vozenilek J (2008) Defining systems
expertise: effective simulation at the organizational level- implications for patient safety,
disaster surge capacity, and facilitating the systems interface. Acad Emerg Med
15(11):1098–1103
141. Toerper MF, Kelen GD, Sauer LM, Bayrum JD, Catlett C, Levin S (2017) Hospital surge
capacity: a web-based simulation tool for emergency planners. Disaster Med Public Health
Prep 12(4):513–522. https://doi.org/10.1017/dmp2017.93
142. Khare RK, Powell ES, Reinhardt G, Lucenti M (2009) Adding more beds to the emergency
department or reducing admitted patient boarding times: which has a more significant
influence on emergency department congestion? Ann Emerg Med 53(5):575–585. https://
doi.org/10.1016/j.annemergmed.2008.07.009
143. Franc JM, Ingrassia PL, Verde M, Colombo D, Della Corte F (2015) A simple graphical
method for quantification of disaster management surge capacity using computer simulation
and process-control tools. Prehosp Disaster Med 33(4):387–393
144. Ohta S, Yoda I, Takeda M, Kuroshima S, Uchida K, Kawai K, Yukioka T (2015) Evidence-
based effective triage operation during disaster: application of human-trajectory data to
triage drill sessions. Prehosp Disaster Med 30(1):102–109. https://doi.org/10.1017/
S1049023X14001381
145. Pennathur PR, Cao D, Sui Z, Lin L, Bisantz AM, Fairbanks RJ, Guarrera TK, Brown JL,
Perry SJ, Wears R (2010) Development of a simulation environment to study emergency
department information technology. Simul Healthc 5(2):103–111. https://doi.org/10.1097/
SIH.0b013e3181c82c0a
146. Klein MG, Reinhardt G (2012) Emergency department patient flow simulations using
spreadsheets. Simul Healthc 7(1):40–47
147. Hill M (2011) Disaster medicine: using modeling and simulation to determine medical
requirements for responding to natural and man-made disasters. Report no. 10-38, Naval
Health Research Center. https://apps.dtic.mil/dtic/tr/fulltext/u2/a561719.pdf
148. Embrey EP, Clerman R, Gentilman MF, Cecere F, Klenke W (2010) Community-based
medical disaster planning: a role for the department of defense and the military health
system. Mil Med 175:298–300
The Growing Role of Social Media in International Health Security: The Good, the Bad, and the Ugly

Stanislaw P. Stawicki, Michael S. Firstenberg, and Thomas J. Papadimos

1 Introduction

International health security (IHS), also referred to as "global health security" or
"public health security", consists of a broad collection of topics that are inextricably
tied to human security [1]. First outlined by the United Nations in 1994, the
definition of “health security” remains nebulous in that there is incomplete overlap
between the individual/primary domains of “health” and “security” [2]. Conse-
quently, some controversy exists regarding both the degree of overlap and its precise
context (Fig. 1). In general, more recent IHS applications have revolved around emerging
infectious diseases and the threat of bio-terrorism [3, 4]; however, the primary
domain can be defined much more broadly when one considers the potential impact
of various manmade and non-manmade events/factors on "health security" from a
global health governance perspective [5–8].
The emergence of social media (SM) as a pervasive communication platform
brought numerous benefits, as well as some potential challenges, into the "global
village" narrative. While it brings people closer together than ever before,
SM has also introduced a new and unique set of challenges, including the phenomena of
cyber-bullying, voter sentiment manipulation, and other forms of criminal activity
[9]. In this chapter, we will explore how SM can be both a positive force and a
negative influence in the context of IHS.

S. P. Stawicki (*) · M. S. Firstenberg · T. J. Papadimos


Department of Research & Innovation, St. Luke’s University Health Network,
Bethlehem, PA, USA
Department of Surgery (Cardiothoracic), Medical Center of Aurora, Aurora, CO, USA
Department of Anesthesiology, University of Toledo School of Medicine, Toledo, OH, USA


Fig. 1 International health security (IHS) represents a fairly heterogeneous content area, with
significant overlap between various modifiable and non-modifiable factors; †Climate change
includes phenomena such as volatile weather conditions and the appearance of invasive species/
diseases; *Novel and disruptive factors include, but are not limited to, technological advances such
as social media and distributed ledger technologies

2 The Basics of Social Media

The fundamental definition of SM is quite simple, and can be distilled to ". . .any
medium involving user-generated content" [10, 11]. Examples of SM include, but
are not limited to: blogs/microblogs, message boards/forums, social networks, wikis,
and other media-sharing platforms (e.g., video/photo posting sites) [10–12]. Wide-
spread adoption of SM enabled an exponential increase in the volume of, and the ability
to share, data/information across all domains of human activity, including medicine
and public health [9, 13]. However, much of this tremendous volume of data is
unfiltered and unverified (Fig. 2).
As the Internet has grown, so has the spectrum of SM platforms and related
communication tools [11]. Unfortunately, the motivations and focus of these tools
are not always clearly defined within the context of the so-called “participatory
culture" [14, 15]. While the stated purpose of the majority of large SM platforms is
altruistic, it is difficult to always promote (or enforce) the end-user’s or the
community’s best interest [16–18]. As with all exchanges of knowledge, it then
becomes important that transparency and appropriate disclosures (e.g., actual
sources of information/data or financial/non-financial conflict of interest) are made
in an honest and open manner, helping to preserve the legitimacy of the respective
SM platform [19–21]. When applied to the concepts relevant to IHS, it becomes
critically important that SM must still adhere to established scientific and ethical
principles of biomedical/health informatics, hypothesis testing, peer-review, and
external validation [22–29]. One must also recognize, especially when applied to
SM in the context of IHS, that the most popular, repeated, or “believable” messages
may not represent the most accurate or valid information, and that a healthy amount of
skepticism is warranted in the majority of situations [30–34].

Fig. 2 A word cloud reflecting the complexity of social media and its interconnectedness with
everyday human activities

3 Social Media in Public Health

Within the area of public health, SM has a number of potential real-life applications
[35]. Perhaps most importantly, the use of SM can serve as a powerful method of data
aggregation, especially during rapidly evolving events such as outbreaks or epidemics
[35, 36]. Topics of SM discussions, such as the appearance and geographic evolution
patterns of specific signs or symptoms, confirmed cases, or other observations that
may be highly associated with certain diseases, can be tracked, aggregated, and
analyzed in near-real-time fashion [37, 38]. In one example, the use of an SM messaging
application has been instrumental in helping to contain the Ebola outbreak in Nigeria
by facilitating instantaneous information sharing and critical coordination between
public health professionals [38]. In another instance, messaging service data were used
to track the prevalence of Influenza-like illness [39]. Search engines have also
recognized the value of various tools, based on their aggregate end-user data, in
detecting the epidemiology and spread of disease [40–42]. While such tools are
often used to leverage targeted marketing and sales, they can also be used for
more scientifically focused endeavors. Increased regional frequency of specific search
topics, such as simple questions including “how do I prevent the flu”, “how do I treat
the flu", and "what are the symptoms of the flu" has been shown to be quite accurate
and timely in predicting disease spread and prevalence when compared to more
traditional tools to track disease progression. A strong correlation has also been
shown between “search engine trends” and emergency department encounters involv-
ing cases of influenza [43]; however, some investigators urge caution when using such
data without the proper epidemiological context [44].
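To make the kind of correlation analysis cited above more concrete, the following minimal Python sketch compares a hypothetical weekly search-interest series with hypothetical weekly emergency department (ED) influenza encounters; all numbers and series names are invented for illustration and are not taken from the studies referenced here.

from math import sqrt

# Hypothetical weekly values: relative search interest for flu-related queries
# and emergency department (ED) encounters coded as influenza.
search_interest = [12, 15, 22, 35, 58, 71, 64, 40, 27, 18]
ed_flu_visits = [30, 34, 41, 66, 98, 120, 105, 72, 51, 38]

def pearson_r(x, y):
    # Plain Pearson correlation coefficient between two equal-length series.
    n = len(x)
    mean_x, mean_y = sum(x) / n, sum(y) / n
    cov = sum((a - mean_x) * (b - mean_y) for a, b in zip(x, y))
    sd_x = sqrt(sum((a - mean_x) ** 2 for a in x))
    sd_y = sqrt(sum((b - mean_y) ** 2 for b in y))
    return cov / (sd_x * sd_y)

print(f"Pearson r between search interest and ED flu visits: "
      f"{pearson_r(search_interest, ed_flu_visits):.2f}")

In practice, such analyses must also account for lags, seasonality, and media-driven search spikes, which is precisely the caution raised in [44].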

Traditional scientific processes, including systematic methodology and peer
review, although far from perfect, often contain a mechanism for checks and
balances to minimize the risk of dissemination of inaccurate or inappropriate content
[45, 46]. Social media platforms often will encourage open membership and largely
unrestricted exchange of ideas under the principles of “protecting and enabling free
speech” – unfortunately, short of legal consequences, there is often little objective
accountability for what is said or communicated [47–50]. In extreme circumstances,
when unregulated, on-line attacks can be personally and professionally destructive –
both in the digital and real world [9, 51]. Many established social media platforms
struggle with the balance between self- or user-censorship and cyber bullying
[9]. Such issues become even more important to balance when the consequences
have clear real-world health-care implications. Unfortunately, as evidenced by the
concerns regarding the impact of SM on the 2016 US Presidential election, various
SM sites have the potential to serve as conduits for extreme ideologies that manifest
as violent and destructive social behaviors [52–55].

4 Social Media as a Positive Modulator of IHS

It is well established that SM-based monitoring can be effective for epidemic
identification and trending [56]. During the 2014 Ebola outbreak in West Africa,
small foci of Ebola were able to be extinguished using coordinated SM campaigns,
including targeted correction of hoax messages [38]. Beyond constructive applica-
tions in the setting of outbreaks/epidemics, SM can provide added value by helping
public health investigators elucidate the structure and function of social networks in
the context of health and disease, including health security-specific behaviors at both
individual and population levels [57]. Health promotion and behavior change are
among the most pronounced early applications of SM in this domain [58].
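As a minimal sketch of what SM-based epidemic trending can look like in code, the example below flags days on which hypothetical counts of symptom-related posts rise well above their recent baseline; the counts, the seven-day window, and the z-score threshold are all assumptions chosen for illustration only.

from statistics import mean, stdev

# Hypothetical daily counts of social media posts mentioning a symptom cluster.
daily_mentions = [52, 48, 55, 60, 47, 51, 49, 58, 54, 50, 61, 140, 180, 210]

WINDOW = 7       # days of history used as the rolling baseline
THRESHOLD = 3.0  # z-score above which a day is flagged as a possible signal

for day in range(WINDOW, len(daily_mentions)):
    baseline = daily_mentions[day - WINDOW:day]
    mu, sigma = mean(baseline), stdev(baseline)
    z = (daily_mentions[day] - mu) / sigma if sigma > 0 else 0.0
    if z > THRESHOLD:
        print(f"Day {day}: {daily_mentions[day]} mentions (z = {z:.1f}) -> possible outbreak signal")

A lower threshold catches signals earlier but produces more false alarms, a trade-off that is discussed further later in the chapter.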

5 Potentially Harmful Uses of Social Media and Malignant Intentions in IHS

Whenever information is shared over SM in a non-peer-reviewed fashion or without
proper context, there is a risk of misinformation or misinterpretation of the informa-
tion being transmitted [59]. In a recent study, the daily volume of instant messages
containing “inaccurate or fake news” outnumbered “fact checking” by approxi-
mately tenfold [60]. Moreover, the average lag between the appearance of “fake
news" and the subsequent "fact checking" was 13 h, indicating that a significant
amount of exposure occurs before misinformation can be rectified [60]. One can
only imagine how much damage a highly motivated individual (or group) can inflict
in the context of IHS, especially if misinformation is carefully crafted, subtly
communicated, and strategically delivered (Table 1). Misinformation and "fake news" can be
misleading in several ways, from concealing negatives as positives (e.g., claiming that
hand-washing is "bad for you" or that vaccines have more risks than benefits) to providing
incorrect information about a disease or a condition (e.g., misstating the signs and symptoms
of a viral infection). Programmatic content moderation is one of the answers that SM
companies have resorted to, but this approach is expensive and difficult for the employees
involved [61].

Table 1 Primary types of negative behaviors involving social media, including associated
characteristics, potential for harm, and suggested remedial/corrective steps

Cyber bullying
Characteristics/potential harm: communications that send messages of an intimidating or threatening nature; risk of self-harm or harm to others, both mental and physical.
Suggested remedies: aggressive surveillance, reporting, and prompt corrective action, as well as creation of appropriate legal and regulatory frameworks.

Fake news
Characteristics/potential harm: intentional disinformation or hoaxes spread via both traditional and social media; unintended and difficult-to-predict consequences, including intentionally or unintentionally harmful or damaging behaviors, or wrongly directed action.
Suggested remedies: strict adherence to proper journalistic standards; sound editorial policies; continuous surveillance, fact-checking, and prompt intervention; appropriate legal and regulatory initiatives.

Misinformation (deliberate)
Characteristics/potential harm: directing or informing individuals to perform actions that may be deleterious to self or others; the danger lies in the random/unpredictable effects of any misinformation, which requires an active user to receive, process, and implement the information before actual harm can result.
Suggested remedies: empowering social media moderators to remove content when appropriate; continuous surveillance, fact-checking, and prompt intervention to prevent any subsequent damage; legal consequences for deliberately introducing potentially damaging misinformation.

Misinterpretation
Characteristics/potential harm: erroneous assumptions made regarding data generated or compiled from social media inputs; this may result in incorrect action plans being put into place, thus leading to potential harm.
Suggested remedies: appropriate vetting of both the source data and the analytical processes; incorporation of appropriate decision-making algorithms and cross-verification mechanisms.
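The lag between misinformation and its correction, described in the paragraph preceding Table 1, can be illustrated with a small, hypothetical Python sketch; the timestamps below are invented, and a real analysis would first need to match each piece of misinformation to its corresponding fact-check at scale.

from datetime import datetime

# Hypothetical pairs: (misinformation first observed, first matching fact-check published).
pairs = [
    (datetime(2019, 3, 1, 8, 0), datetime(2019, 3, 1, 19, 30)),
    (datetime(2019, 3, 2, 22, 15), datetime(2019, 3, 3, 14, 0)),
    (datetime(2019, 3, 4, 6, 45), datetime(2019, 3, 4, 18, 10)),
]

# Lag, in hours, between each rumor and its correction.
lags_hours = [(check - rumor).total_seconds() / 3600 for rumor, check in pairs]
print(f"Mean lag before correction: {sum(lags_hours) / len(lags_hours):.1f} hours")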
Luxton et al. described very troubling and increasingly prevalent aspects of SM
influence on population-level behaviors, including cyber-bullying, strong peer-
pressure, and elevated risk of suicide. For example, an Internet search using 12 sui-
cide-associated terms revealed that approximately half of the identified sites featured
overtly pro-suicide content and provided potentially dangerous facts about suicide
[62]. Of interest, excessive use of the Internet in itself has been associated with
increased suicide rates [63]. Whilst SM can provide platforms for open discussion on
virtually any topic, the troubling fact is that SM may also contribute to a society that
has fewer and vaguely defined boundaries (e.g., “real-life friend” versus “social
media friend”) [64, 65]. Personal space is no longer clearly demarcated, thus
presenting a risk of either intended or unintended “invasion” [66, 67]. While the dark
side of SM manipulation has been addressed in the aforementioned comments, there
is no doubt that SM can be used to make a populace more aware of public health/
mental health matters, thereby being a force for good; nevertheless, caution should always be
exercised when dealing with SM platforms.
While the association between SM activity as a surrogate for the presence of
actual disease outbreaks or epidemics can be significant, the risk of “false alarms”
continues to be substantial, thus limiting the utility of SM as an effective population-
level tracking tool [68]. It is also important to remember that any information
gathered, and subsequently processed, can be subject to further propagation, dis-
semination, and interpretation. The magnitude of potentially harmful misinformation
or misinterpretation of critical data increases significantly when the “story roots” are
grounded in either actual or “potentially real or validated” news, which then may
become distorted and ultimately championed by influential individuals without
appropriate content expertise [69, 70]. The so-called “damage control” can be
difficult under circumstances where the original message is found to be wrong or
inaccurate, but the wave of damage has already taken its toll and the point of
irreversibility has been crossed [71–74]. The controversy around the relationship
between childhood vaccines and autism represents one case where truly global
concerns were raised in a highly contentious public health matter [75–77]. Despite
numerous large-scale, well-designed, and meticulously conducted studies that have
failed to find a link between childhood vaccinations and the perceived increase in the
incidence of autism, there still exist large segments of society who believe in the
factual nature of such a correlation [78–80]. Extensive and very public discussions
on SM sites, often championed by high-profile individuals with little formal training
in the scientific method, continue to disseminate misinformation [81–83]. Unfortu-
nately, it is the trust that some place in the messengers and their SM tools, combined
with the distrust in the poorly understood “medical establishment” that continues to
fuel the propagation of potential harm [84–86]. These unsolved challenges clearly
have implications when considered in the context of IHS, and more specifically on
how to mitigate the impact of the “loudest voice in the room” not always conveying
the most accurate or important message.
There is a potential for malignant actors resorting to intentional misuse of SM to
harm others and seed disinformation [87, 88]. Following the emergence of swine
influenza in 2009, there was a proliferation of SM videos reporting various “con-
spiracy theories” about the flu virus, its allegedly “genetically manipulated” origin
by “big pharma”, as well as the “planned martial law” [89–91]. Subsequent reflec-
tions on the “swine flu panic” continued to be characterized by unsubstantiated,
highly conspiratorial, and questionable SM- and traditional news-based analyses
[90, 91]. For just the three SM videos discussed in this paragraph [89–91], the total
audience exceeded 56,000 views. What is truly concerning is that the number of
search results for “swine flu epidemic” exceeded 5700 items at the time of writing
this chapter [92], exemplifying the staggering nature of the potential for promoting
misinformation using web-based video platforms alone. It is important to recognize
that SM tools may be selected as a preferred platform by those who feel they have no
other means to express their beliefs, thoughts, and voices, while also serving as a
virtual gathering place for people who share similar beliefs [93–95], with the
potential for "moving beyond online presence" [96]. In traditional models of com-
munities, unconventional or controversial beliefs were often suppressed (for better or
for worse), as there were no effective or efficient tools to advocate for and disseminate
such ideas [97–99]. With the rapid growth and widespread availability of SM
platforms, virtually every voice and idea can have a "home" [100, 101]. While
such "homes" can be the foundation for innovation, scientific discovery, and the
exploration of new ideas, they can also be the sources of misinformation, negative
behaviors, and scientifically flawed or inaccurate information. Table 2 provides an
overview of factors that may be associated with fake news and misinformation.

Table 2 Factors important to the propagation of "fake news" and misinformation on social media
platforms

Appealing or attractive headline: a "catchy" title can attract greater viewership and dissemination of information.
Cult-like following: ideas that attract fanatical, cult-like followership may be prone to form "social media communities" that propagate biased/false information.
Disclaimer is provided: in the majority of cases, sites that contain a disclaimer should be viewed with significant skepticism.
Emotional character of the content: if the content provokes a strong emotional response, it is highly probable that it was intended to do so (and likely "packaged" with an attached agenda).
Information "appears" legitimate: when skillfully "packaged and disseminated", information may superficially appear to be legitimate.
Predictive statements or claims may be embedded: stories that provide specific claims of a future event and/or outcome are unlikely to be authentic; similarly, reports of an effective therapy warrant verification and healthy skepticism.
Reputable source/origin is claimed: FN may appear more credible if its source "appears" credible; however, claims of a reputable source must be substantiated and verified.
Propagation by high-profile individuals: a perception of credibility can be created around questionable stories, especially when high-profile individuals (actors, politicians, scientists) decide to endorse/share the information.
Too good to be true: when a SM story/report/factoid appears to be "too good to be true", it probably isn't factual.
Unusual or atypical domain name: when a story comes from a source with a questionable domain name, suspicious user name, and/or URL, great caution is advised.
(FN fake news, SM social media)
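For illustration only, several of the Table 2 factors can be expressed as crude heuristic checks over a post's text and source domain; every keyword list, threshold, and example below is an invented assumption, and no real verification or moderation system works this simply.

import re

# Invented keyword lists and domain suffixes, loosely mirroring a few Table 2 factors.
EMOTIONAL_WORDS = {"shocking", "outrage", "terrifying", "miracle"}
PREDICTIVE_PHRASES = {"will cure", "guaranteed to", "is coming"}
SUSPICIOUS_TLDS = (".xyz", ".click", ".info")

def misinformation_red_flags(text, source_domain):
    # Return a 0-4 count of red flags; a higher count suggests more caution is warranted.
    lowered = text.lower()
    words = set(re.findall(r"[a-z']+", lowered))
    flags = [
        bool(words & EMOTIONAL_WORDS),                    # emotional character of the content
        any(p in lowered for p in PREDICTIVE_PHRASES),    # predictive statements or claims
        source_domain.lower().endswith(SUSPICIOUS_TLDS),  # unusual or atypical domain name
        text.count("!") >= 3 or text.isupper(),           # sensational, "too good to be true" styling
    ]
    return sum(flags)

print(misinformation_red_flags(
    "SHOCKING: this miracle tea will cure the flu in one day!!!", "flu-truth.xyz"))  # prints 4

Even as a toy, the sketch shows the limitation: such surface features are easy to game, which is why verification and healthy skepticism, rather than automated filtering alone, are emphasized throughout this chapter.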
Armed with SM tools, “bad actors” with malignant intentions can be very
destructive. Consequently, malignant use of SM can put international security in
general – and more specifically IHS as it relates to this chapter – at risk. Beyond this,
countries that control Internet infrastructure and access, combined with countries
that allow unimpeded access to the internet, can set the stage for the “bad actor” or
“manipulated public discourse” to cause potential disruptions to society in general,
subgroups of society, as well as psychological trauma to individuals [102, 103]. The
next two paragraphs will discuss cases where malignant use of SM is controlled
centrally (e.g., government) versus peripherally (individuals, interests,
non-governmental groups).
In cases where central control of the Internet (e.g., government) exists, a message
can theoretically be created at the highest levels of government, and broad dissem-
ination of such message can be easily accomplished. Of course, this should be used
to promote health, wellness, and positive behaviors. However, if such a central author-
ity (or government) is the "bad actor" and its choice of message leads to a
“manipulated public discourse”, the intended message may involve attempting to
influence a society’s impression of a therapy or vaccine, the fidelity of a medical or
public health process, or the reliability or intentions of a third party assisting that
particular nation on a health matter [104, 105]. A corrupt central authority can use
the malignant manipulation of SM regarding an event or process to extort money or
promises or create a situational compromise from a targeted organization, institution,
or individual [106, 107]. Furthermore, a central authority may be negotiating an
external agreement with a third party for an intervention that the international
community insists on, while internally not wishing for such cooperation to occur
[108]. Therefore, a central authority has the opportunity and a robust platform to
foment unrest or opposition to an accord through SM. These focused SM messages
may involve overtly false statements, false statistics, controversial or manipulated
images, and may be directed at particularly susceptible or vulnerable subgroups
(e.g., religious, racial, gender-specific or otherwise) in that society [109, 110]. Finally,
the central authority may wish to be dismissive of specific segment(s) within the
society, and thereby propagate malignant information about particular groups [111].
In cases where central control is not involved, but laws, judicial precedents,
and/or executive orders leave the Internet or SM totally free to be accessed by all
players, individual or institutional, then the so-called peripheral control can be
attained [112–114]. Here, individuals or institutions/organizations can act freely
and target select individuals, societal subgroups, and institutions for manipulation
through “fabricated public discourse” [115]. These types of approaches can be
particularly effective in pluralistic societies, where various interest groups may
desire different, unique, and specific processes and outcomes, at times in opposition
to interests of other parties [116–118]. In the arena of peripheral control, external
actors (e.g., countries or interest groups) who wish to disrupt the fabric of a particular
society can work directly through the Internet disguised as indigenous individuals or
organizations and can wreak havoc within and between groups in the society being
targeted [119]. An inherent weakness in pluralistic societies is that entrepreneurs
provide the platforms with which “bad actors” can attack a society from the
periphery. The difficulty is that in a free society the providers of these SM platforms,
although not overtly malevolent, tolerate all users of the platform, at least until very
recently, in order to maximize profits [120, 121]. Finally, there is the possibility that
certain SM platforms that we now consider to be on the periphery of the discourse
can, under certain circumstances, transform into new central authorities and media
control hubs.
Hypothetical, but nevertheless probable “false” actions may occur through the
use of SM. For example, during an epidemic, or especially a Public Health Emer-
gency of International Concern (PHEIC), social media can be used to manipulate
populations into seeking assistance at wrong/inappropriate locations [122]. Social media-
driven rumors could cause local populations to flee toward or away from cities, or
toward or away from sanctuary, and at the very least contribute to public distress and
waste or misuse of resources [123–125]. Local populations can be manipulated to
accept or reject treatments or vaccines provided by governmental institutions or
non-governmental organizations. Finally, health-care providers could be manipu-
lated through various SM platforms into not accepting the risk of caring for those in
need, thus effectively negating the health-care provider’s obligation for provision of
care [126, 127].
Historically, conspiracy theories have been circulated in every generation, but in
today's era of multiple competing SM platforms, their circulation is a prevalent, everyday reality
[128, 129]. In fact, today, reality can be engineered by motivated individuals or
groups by using SM to “bend or alter the truth” [130].

6 Social Media and Health Security: Evolving/Miscellaneous Topics

An intriguing application of SM has been the concept of crowdsourcing and
crowdfunding. The underlying principle behind both of these concepts is to use
the benefits of a large and highly motivated audience to solve a particular problem
[131–133]. Crowdfunding relies on the willingness of a SM audience, or the
“crowd”, to become involved with solving a specific challenge by financially
supporting a mission that addresses that problem [134, 135]. Early applications of
crowdfunding were focused on raising funds to support the development of various
physical projects or as a way for small projects to gain funding when traditional
revenue streams (e.g., grant writing) were ineffective [136, 137]. Gradually,
fundraising for health-care related issues emerged, including IHS-centric causes
[1, 138]. Such approaches have been very effective in not only generating financial
support, but also getting across a message by raising social awareness. A sufficiently
powerful message can help drive the ability to raise a considerable amount of money
in a relatively short period of time. At the same time, there is also the risk of theft and
misappropriation of funds, including the furthering of criminal or otherwise illegal
schemes [139]. Recent evidence has suggested that millions of US dollars have been
raised to support dubious medical treatments, research projects, or solutions to
problems with little, if any, realistic chance of achieving a desirable goal [140]. In
summary, while SM can be very valuable and effective in mobilizing resources to
address critical needs, as with the exchange of ideas, there must be appropriate
oversight and accountability.
Another evolving area of using SM is the ability to leverage the availability of
distributed resources to provide a specific service, functionality, or data output to a
specific group of end-users [141–146]. One of the initial implementations of this
approach, well before modern SM platforms were established, was the SETI@home
initiative [147]. SETI, or the Search for Extra-Terrestrial Intelligence, requires a
tremendous amount of computational processing and power to analyze the vast
amount of data collected by radio telescopes to help provide astronomers with
candidate signal identification and planning for subsequent areas of focus
[148]. SETI@home allowed volunteers from around the world to run an application
on their personal computer that took advantage of unused processing cycles to
analyze the vast stream of SETI data [149]. Subsequent applications emerged in
the area of public health, from disaster management to outbreak tracking and
mapping [150–152].
Last, but not least, distributed computing resulted in the development of
blockchain applications such as cryptocurrencies, smart contracts, and distributed
storage solutions [153]. Cryptocurrencies that are based on blockchain technology
rely on vast distributed computational power to “mine” or create “coins” (e.g.,
currency units) by solving complex equations [154, 155]. With the incentive of
potential wealth creation, malignant actors have abused SM platforms to install
software on end-user computers, without permission, to illegally harness processing
power to “mine” for currency [153]. As the number of blockchain technology
applications increases, so does the potential for both benevolent and malevolent
use. Finally, we would be remiss not to mention the evolution of the “internet-of-
things” (IOT), or a framework where interconnected computing devices embedded
in everyday objects send and receive operating data for increased efficiency and
enhanced end-user experience [156, 157]. By extension, the incorporation of IOT as
an essential element in the design of public health infrastructure promises to trans-
form the way future cities, health systems, and medical infrastructure are designed
[158–161].
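As a purely illustrative aside on the "mining" mentioned above: in many blockchain designs, the "complex equations" amount to a brute-force search for a nonce whose hash meets a difficulty target (proof-of-work). The toy Python sketch below uses an arbitrarily low difficulty and a made-up payload string, and does not correspond to any specific cryptocurrency.

import hashlib

def mine(block_data, difficulty=4):
    # Find a nonce such that SHA-256(block_data + nonce) starts with `difficulty` zero digits.
    prefix = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(prefix):
            return nonce, digest
        nonce += 1

nonce, digest = mine("example health-data ledger entry")
print(f"nonce = {nonce}, hash = {digest}")

The search cost grows rapidly with the required number of leading zeros, which is why illicit "mining" of this kind can meaningfully drain the processing power of hijacked end-user machines.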

7 Conclusions

Continuous information sharing around the globe fosters growth and exchange of
knowledge, thus accelerating the advancement of humanity. The same appears to be
true about the use of SM in IHS-specific applications. While numerous benefits of
SM exist, the relevance of this heterogeneous group of media modalities in the
context of IHS remains incompletely understood. The incorporation of SM into
public health applications created an unprecedented opportunity for the development
of highly refined, integrated mechanisms for disease tracking and epidemiological
trend identification. At the same time, great caution is warranted, as the potential for
misinterpretation of data or an incorrect response to reported data can result in
harm. Finally, SM tools in the hands of malignant actors can lead to a significant
amount of deliberate harm through the intentional propagation of “fake news” and
misinformation. Thus, risks and benefits associated with SM use in the context of
public health and IHS must be carefully weighed and monitored to avoid any
unintended harm and other negative consequences.

References

1. Chiu Y-W et al (2009) The nature of international health security. Asia Pac J Clin Nutr 18
(4):679–683
2. United Nations Development Programme (1994) Human development report 1994. Oxford University Press,
New York
3. Scharoun K, Van Caulil K, Liberman A (2002) Bioterrorism vs. health security—crafting a
plan of preparedness. Health Care Manag 21(1):74–92
4. Heymann DL et al (2015) Global health security: the wider lessons from the west African
Ebola virus disease epidemic. Lancet 385(9980):1884–1901
5. Reiter P et al (2004) Global warming and malaria: a call for accuracy. Lancet Infect Dis 4
(6):323–324
6. Hobson C, Bacon P, Cameron R (2014) Human security and natural disasters. Routledge,
New York
7. Brown T (2011) ‘Vulnerability is universal’: considering the place of ‘security’ and ‘vulner-
ability’ within contemporary global health discourse. Soc Sci Med 72(3):319–326
8. Kay A, Williams O (2009) Global health governance: crisis, institutions and political econ-
omy. Springer, Switzerland
9. Stawicki TT et al (2018) From “pearls” to “tweets:” how social media and web-based
applications are revolutionizing medical education. Int J Acad Med 4(2):93
10. Miller D et al (2016) What is social media. In: How the world changed social media, vol 1, pp
1–8
11. Boyd DM, Ellison NB (2007) Social network sites: definition, history, and scholarship. J
Comput-Mediat Commun 13(1):210–230
12. Taprial V, Kanwar P (2012) Understanding social media. Bookboon
13. Thackeray R et al (2012) Adoption and use of social media among public health departments.
BMC Public Health 12(1):242
14. Jenkins H et al (2009) Confronting the challenges of participatory culture: media education for
the 21st century. Mit Press, Cambridge, MA
15. Hansen DL, Shneiderman B, Smith MA (2010) Analyzing social media networks with
NodeXL: insights from a connected world. Morgan Kaufmann, Burlington
16. Li H, Sakamoto Y (2014) Social impacts in social media: an examination of perceived
truthfulness and sharing of information. Comput Hum Behav 41:278–287
17. Pavlíček A (2013) Social media–the good, the bad, the ugly. IDIMT-2013, p 139
18. Wu L et al (2016) Mining misinformation in social media. In: Big data in complex and social
networks, pp 123–152
19. Wasike J (2013) Social media ethical issues: role of a librarian. Libr Hi Tech News 30(1):8–16
20. Scott PR, Jacka JM (2011) Auditing social media: a governance and risk guide. Wiley,
Hoboken
21. Goldsmith A (2015) Disgracebook policing: social media and the rise of police indiscretion.
Polic Soc 25(3):249–267
22. Goodman KW, Cushman R, Miller RA (2014) Ethics in biomedical and health informatics:
users, standards, and outcomes. In: Biomedical informatics. Springer, pp 329–353

23. Gritzalis D et al (2014) History of information: the case of privacy and security in social media.
In: Proceedings of the history of information conference
24. Potts L (2013) Social media in disaster response: how experience architects can build for
participation. Routledge, New York
25. Bricout JC, Baker PM (2010) Leveraging online social networks for people with disabilities in
emergency communications and recovery. Int J Emerg Manag 7(1):59–74
26. Imran M et al (2015) Processing social media messages in mass emergency: a survey. ACM
Comput Surv (CSUR) 47(4):67
27. Castillo C (2016) Big crisis data: social media in disasters and time-critical situations.
Cambridge University Press, Cambridge, UK
28. Korac-Boisvert N, Kouzmin A (1994) The dark side of info-age social networks in public
organizations and creeping crises. Adm Theory Prax 16:57–82
29. Bowdon MA (2014) Tweeting an ethos: emergency messaging, social media, and teaching
technical communication. Tech Commun Q 23(1):35–54
30. Flanagin AJ, Metzger MJ (2000) Perceptions of Internet information credibility. Journal Mass
Commun Q 77(3):515–540
31. Johnson TJ, Kaye BK (1998) Cruising is believing?: Comparing Internet and traditional
sources on media credibility measures. Journal Mass Commun Q 75(2):325–340
32. Flanagin AJ, Metzger MJ (2007) The role of site features, user attributes, and information
verification behaviors on the perceived credibility of web-based information. New Media Soc
9(2):319–342
33. Greer JD (2003) Evaluating the credibility of online information: a test of source and
advertising influence. Mass Commun Soc 6(1):11–28
34. Eastin MS (2001) Credibility assessments of online health information: The effects of source
expertise and knowledge of content. J Comput Mediated Commun 6(4):JCMC643
35. Dredze M (2012) How social media will change public health. IEEE Intell Syst 27(4):81–84
36. Schmidt CW (2012) Trending now: using social media to predict and track disease outbreaks.
Environ Health Perspect 120(1):a30
37. Chunara R, Andrews JR, Brownstein JS (2012) Social and news media enable estimation of
epidemiological patterns early in the 2010 Haitian cholera outbreak. Am J Trop Med Hyg 86
(1):39–45
38. Carter M (2014) How Twitter may have helped Nigeria contain Ebola. BMJ 349:g6946
39. Lampos V, De Bie T, Cristianini N (2010) Flu detector-tracking epidemics on Twitter. In: Joint
European conference on machine learning and knowledge discovery in databases. Springer
40. Chorianopoulos K, Talvis K (2016) Flutrack. org: open-source and linked data for epidemi-
ology. Health Informatics J 22(4):962–974
41. Pelat C et al (2009) More diseases tracked by using Google Trends. Emerg Infect Dis 15
(8):1327
42. Carneiro HA, Mylonakis E (2009) Google trends: a web-based tool for real-time surveillance
of disease outbreaks. Clin Infect Dis 49(10):1557–1564
43. Dugas AF et al (2012) Google Flu Trends: correlation with emergency department influenza
rates and crowding metrics. Clin Infect Dis 54(4):463–469
44. Olson DR et al (2013) Reassessing Google Flu Trends data for detection of seasonal and
pandemic influenza: a comparative epidemiological study at three geographic scales. PLoS
Comput Biol 9(10):e1003256
45. Chubin DE, Hackett EJ, Hackett EJ (1990) Peerless science: peer review and US science
policy. Suny Press, Albany
46. Wilson EB (1990) An introduction to scientific research. Courier Corporation
47. Sunstein CR (2018) # republic: divided democracy in the age of social media. Princeton
University Press, Princeton
48. Raboy M et al (2008) Broadcasting, voice, and accountability: a public interest approach to
policy, law, and regulation. University of Michigan Press, Ann Arbor

49. Kietzmann JH et al (2011) Social media? Get serious! Understanding the functional building
blocks of social media. Bus Horiz 54(3):241–251
50. Lagu T, Greysen SR (2011) Physician, monitor thyself: professionalism and accountability in
the use of social media. J Clin Ethics 22(2):187–190
51. Vaidhyanathan S (2005) The anarchist in the library: how the clash between freedom and
control is hacking the real world and crashing the system. Basic Books, New York
52. Aiken M (2017) Cyber effect: an expert in cyberpsychology explains how technology is
shaping our children, our behavior, and our values – and what we can do about it. Spiegel
& Grau
53. Shirky C (2011) The political power of social media: technology, the public sphere, and
political change. Foreign Aff 90:28–41
54. Wood W, Wong FY, Chachere JG (1991) Effects of media violence on viewers’ aggression in
unconstrained social interaction. Psychol Bull 109(3):371
55. Patton DU et al (2014) Social media as a vector for youth violence: a review of the literature.
Comput Hum Behav 35:548–553
56. Kavanaugh AL et al (2012) Social media use by government: from the routine to the critical.
Gov Inf Q 29(4):480–491
57. Valente TW (2010) Social networks and health: models, methods, and applications, vol
1. Oxford University Press, New York
58. Korda H, Itani Z (2013) Harnessing social media for health promotion and behavior change.
Health Promot Pract 14(1):15–23
59. Chen X et al (2015) Why students share misinformation on social media: motivation, gender,
and study-level differences. J Acad Librariansh 41(5):583–592
60. Shao C et al (2016) Hoaxy: a platform for tracking online misinformation. In: Proceedings of
the 25th international conference companion on world wide web. International World Wide
Web Conferences Steering Committee
61. Wilson M (2018) The hardest job in the Silicon Valley is a living nightmare. November
9, 2018. Available from: https://www.fastcompany.com/90263921/the-hardest-job-in-silicon-
valley-is-a-living-nightmare
62. Luxton DD, June JD, Fairall JM (2012) Social media and suicide: a public health perspective.
Am J Public Health 102(S2):S195–S200
63. Shah A (2010) The relationship between general population suicide rates and the Internet: a
cross-national study. Suicide Life Threat Behav 40(2):146–150
64. Meyrowitz J (1986) No sense of place: the impact of electronic media on social behavior.
Oxford University Press, New York
65. Skeels MM, Grudin J (2009) When social networks cross boundaries: a case study of
workplace use of facebook and linkedin. In: Proceedings of the ACM 2009 international
conference on supporting group work. ACM
66. Bailenson JN et al (2001) Equilibrium theory revisited: mutual gaze and personal space in
virtual environments. Presence Teleoperators Virtual Environ 10(6):583–598
67. Abril PS (2007) A (My) space of one’s own: on privacy and online social networks. New J
Tech Intell Prop 6:73
68. Culotta A (2010) Towards detecting influenza epidemics by analyzing Twitter messages. In:
Proceedings of the first workshop on social media analytics. ACM
69. Brodeur J-P, Dupont B (2008) Introductory essay: the role of knowledge and networks in
policing. In: The handbook of knowledge-based policing: current conceptions and future
directions, pp 9–36
70. Kumar S, Shah N (2018) False information on web and social media: a survey. arXiv preprint
arXiv:1804.08559
71. Aronson RH, McMurtrie J (2007) The use and misuse of high-tech evidence by prosecutors:
ethical and evidentiary issues. Fordham L Rev 76:1453
72. Alexander DE (2014) Social media in disaster risk reduction and crisis management. Sci Eng
Ethics 20(3):717–733
354 S. P. Stawicki et al.

73. Veil SR, Buehner T, Palenchar MJ (2011) A work-in-process literature review: incorporating
social media in risk and crisis communication. J Conting Crisis Manag 19(2):110–122
74. Wendling C, Radisch J, Jacobzone S (2013) The use of social media in risk and crisis
communication
75. Baker JP (2008) Mercury, vaccines, and autism: one controversy, three histories. Am J Public
Health 98(2):244–253
76. Hobson-West P (2007) ‘Trusting blindly can be the biggest risk of all’: organised resistance to
childhood vaccination in the UK. Sociol Health Illn 29(2):198–215
77. Clarke CE (2008) A question of balance: the autism-vaccine controversy in the British and
American elite press. Sci Commun 30(1):77–107
78. Link K (2005) The vaccine controversy: the history, use, and safety of vaccinations. Green-
wood Publishing Group
79. Dixon GN, Clarke CE (2013) Heightening uncertainty around certain science: media cover-
age, false balance, and the autism-vaccine controversy. Sci Commun 35(3):358–382
80. Gross L (2009) A broken trust: lessons from the vaccine–autism wars. PLoS Biol 7(5):
e1000114
81. Travers JC et al (2016) Fad, pseudoscientific, and controversial interventions. In: Early
intervention for young children with autism spectrum disorder. Springer, pp 257–293
82. Carrion ML (2014) Risk, expertise, and the good mother: a qualitative examination of maternal
vaccine refusal. Purdue University
83. Holton A et al (2012) The blame frame: Media attribution of culpability about the MMR–
autism vaccination scare. Health Commun 27(7):690–701
84. Brotherton R (2015) Suspicious minds: why we believe conspiracy theories. Bloomsbury
Publishing, London
85. Kramer RM, Cook KS (2004) Trust and distrust in organizations: dilemmas and approaches.
Russell Sage Foundation, New York
86. Holmberg C (2015) Politicization of the LOW-CARB HIGH-FAT diet in Sweden, promoted
on social media by non-conventional experts. Int J E-Polit (IJEP) 6(3):27–42
87. Baccarella CV et al (2018) Social media? It’s serious! Understanding the dark side of social
media. Eur Manag J 36(4):431–438
88. Allcott H, Gentzkow M (2017) Social media and fake news in the 2016 election. J Econ
Perspect 31(2):211–236
89. Media Y (2009) Swine flu conspiracy theories. April 28, 2009. Available from: https://www.
youtube.com/watch?v=C3fW82DbLKQ
90. Today R (2010) Swine flu, bird flu ‘never happened’: probe into H1N1 ‘false pandemic’.
September 29, 2018. Available from: https://www.youtube.com/watch?v=3haectEvDq0
91. Labs Z (2009) #211 debate: FEMA camps trains trucks busses coffins swine flu & martial law.
September 29, 2018. Available from: https://www.youtube.com/watch?v=5I0eXsBqtlU
92. Videos, G (2018) Google search: “Swine Flu Epidemic”. Cited September 29. 2018. Available
from: https://www.google.com/search?num=50&hl=en&tbm=vid&ei=UCewW47hNJCV5w
Lt7KG4Cw&q=%22swine+flu+epidemic%22&oq=%22swine+flu+epidemic%22&gs_l=psy-
ab.3..0j0i30k1.5065.5719.0.5887.2.2.0.0.0.0.77.136.2.2.0....0...1c.1.64.psy-ab..0.2.134....0.
LIBd3IoDxMA
93. Marwick AE, Boyd D (2011) I tweet honestly, I tweet passionately: twitter users, context
collapse, and the imagined audience. New Media Soc 13(1):114–133
94. Gerlitz C, Helmond A (2013) The like economy: social buttons and the data-intensive web.
New Media Soc 15(8):1348–1365
95. Lewis K, Gonzalez M, Kaufman J (2012) Social selection and peer influence in an online
social network. Proc Natl Acad Sci 109(1):68–72
96. Harlow S (2012) Social media and social movements: Facebook and an online Guatemalan
justice movement that moved offline. New Media Soc 14(2):225–243
97. Pawar M (2003) Resurrection of traditional communities in postmodern societies. Futures
35(3):253–265
The Growing Role of Social Media in International Health Security: The. . . 355

98. Cao X (2010) What speech should be outside of freedom of expression?


99. Langvardt AW (1982) Not on our shelves: a first amendment analysis of library censorship in
the public schools. Neb L Rev 61:98
100. Kaplan AM, Haenlein M (2010) Users of the world, unite! The challenges and opportunities of
Social Media. Bus Horiz 53(1):59–68
101. Clinton H (2011) Internet rights and wrongs: choices & challenges in a networked world. US
State Department
102. Starvridis J (2013) Convergence: illicit networks and national security in the age of globali-
zation. Government Printing Office
103. Svete U (2012) European e-readiness? Cyber dimension of national security policies. J Comp
Polit 5(1):38–59
104. Aliyev H (2017) Precipitating state failure: do civil wars and violent non-state actors create
failed states? Third World Q 38(9):1973–1989
105. Chen D (2017) “Supervision by Public Opinion” or by government officials? Media criticism
and central-local government relations in China. Mod China 43(6):620–645
106. Enikolopov R, Petrova M, Sonin K (2018) Social media and corruption. Am Econ J Appl Econ
10(1):150–174
107. Enikolopov R, Petrova M, Sonin K (2012) Do political blogs matter?: Corruption in state-
controlled companies, blog postings, and DDoS attacks. Centre for Economic Policy Research
108. Myers DP (1917) Violation of treaties: bad faith, nonexecution and disregard. Am J Int Law 11
(4):794–819
109. Fuchs C (2017) Social media: a critical introduction. Sage
110. Korta SM (2018) Fake news, conspiracy theories, and lies: an information laundering model
for homeland security. Naval Postgraduate School, Monterey
111. Ting CSW, Song SGZ (2017) What lies beneath the truth: a literature review on fake news,
false information and more. October 28, 2018. Available from: https://lkyspp.nus.edu.sg/docs/
default-source/ips/report_what-lies-beneath-the-truth_a-literature-review-on-fake-news-false-
information-and-more_300617.pdf
112. Fuchs C, Trottier D (2014) Theorising social media, politics and the state: an introduction. In:
Social media, politics and the state. Routledge, pp 15–50
113. Cohen E (2010) Mass surveillance and state control: the total information awareness project.
Springer, Berlin
114. Castells M (2015) Networks of outrage and hope: social movements in the Internet age. Wiley
115. Fox NJ (1998) Foucault, Foucauldians and sociology. Br J Sociol 49:415–433
116. White D (2018) Public discours: importance & strategies. October 28, 2018. Available from:
https://study.com/academy/lesson/public-discourse-importance-strategies.html
117. Wolf R (2018) Immigration, gay rights, politics, abortion, taxes, technology: crunch time at the
Supreme Court. October 28, 2018. Available from: https://www.usatoday.com/story/news/
politics/2018/04/29/immigration-gay-rights-politics-abortion-taxes-technology-supreme-
court/518748002/
118. PublicHealth.org (2018) Vaccine myths debunked. October 28, 2018. Available from: https://
www.publichealth.org/public-awareness/understanding-vaccines/vaccine-myths-debunked/
119. Richey M (2018) Contemporary Russian revisionism: understanding the Kremlin’s hybrid
warfare and the strategic and tactical deployment of disinformation. Asia Europe Journal 16
(1):101–113
120. Dwoskin E (2017) Facebook profit hits an all-time high, unaffected by recent scandals—so far.
October 28, 2018. Available from: https://www.washingtonpost.com/news/the-switch/wp/
2018/04/25/facebook-profit-hits-an-all-time-high-unaffected-by-recent-scandals-so-far/?
noredirect=on&utm_term=.ac014b07938e
121. Kass-Hout TA, Alhinnawi H (2013) Social media in public health. Br Med Bull 108(1):5–24
122. WHO (2018) Strengthening heath security by implementing the International Health Regula-
tions (2005): IHR procedures concerning Public Health Emergencies of International Concern
(PHEIC). October 28, 2018. Available from: http://www.who.int/ihr/procedures/pheic/en/
356 S. P. Stawicki et al.

123. Briggs CL (2003) Stories in the time of cholera: racial profiling during a medical nightmare.
Univ of California Press, Berkeley
124. Berlant L (2007) Slow death (sovereignty, obesity, lateral agency). Crit Inq 33(4):754–780
125. Bickerstaff K, Simmons P, Pidgeon N (2006) Situating local experience of risk: peripherality,
marginality and place identity in the UK foot and mouth disease crisis. Geoforum 37
(5):844–858
126. Papadimos TJ et al (2018) Ethics of outbreaks position statement. Part 2: family-centered care.
Crit Care Med 46(11):1856–1860
127. Papadimos TJ et al (2018) Ethics of outbreaks position statement. Part 1: therapies, treatment
limitations, and duty to treat. Crit Care Med 46(11):1842–1855
128. Andrejevic M (2013) Infoglut: how too much information is changing the way we think and
know. Routledge, New York
129. Shoemaker PJ, Reese SD (2013) Mediating the message in the 21st century: a media sociology
perspective. Routledge, London
130. Arendt H (2013) The human condition. University of Chicago Press, Chicago
131. Beaulieu T, Sarker S, Sarker S (2015) A conceptual framework for understanding
crowdfunding. CAIS 37:1
132. Kosonen M et al (2014) User motivation and knowledge sharing in idea crowdsourcing. Int J
Innov Manag 18(05):1450031
133. Abrahamson S, Ryder P, Unterberg B (2013) Crowdstorm: the future of innovation, ideas, and
problem solving. Wiley, New York
134. Lehner OM (2013) Crowdfunding social ventures: a model and research agenda. Ventur Cap
15(4):289–311
135. Murray R, Caulier-Grice J, Mulgan G (2010) The open book of social innovation. National
Endowment for Science, Technology and the Art, London
136. Hemer J (2011) A snapshot on crowdfunding. Working papers firms and region
137. Gerber EM, Hui J (2013) Crowdfunding: motivations and deterrents for participation. ACM
Trans Comput Hum Interact (TOCHI) 20(6):34
138. Leather AJ et al (2010) International health links movement expands in the United Kingdom.
Int Health 2(3):165–171
139. Semova M. Cryptocurrencies and financing of social and anti-social projects
140. Associated_Press (2018) People have raised almost $7 million for dubious medical treatments on
crowdfunding platforms, study indicates. November 7, 2018. Available from: http://mailview.
bulletinhealthcare.com/mailview.aspx?m=2018102401acc&render=y&r=2860396-ed2b#S5
141. Magoules F et al (2009) Introduction to grid computing. CRC press, Boca Raton
142. Imran M et al (2013) Extracting information nuggets from disaster-related messages in social
media. In: Iscram
143. Asur S, Huberman BA (2010) Predicting the future with social media. In: Proceedings of the
2010 IEEE/WIC/ACM international conference on web intelligence and intelligent agent
technology-volume 01. IEEE Computer Society
144. Gundecha P, Liu H (2012) Mining social media: a brief introduction. In: New directions in
informatics, optimization, logistics, and production. Informs, p 1–17
145. Tang L, Liu H (2011) Leveraging social media networks for classification. Data Min Knowl
Disc 23(3):447–478
146. Tang L, Liu H (2010) Community detection and mining in social media. In: Synthesis lectures
on data mining and knowledge discovery, vol 2, no. 1, pp 1–137
147. Linders D (2012) From e-government to we-government: defining a typology for citizen
coproduction in the age of social media. Gov Inf Q 29(4):446–454
148. Nov O, Arazy O, Anderson D (2011) Technology-mediated citizen science participation: a
motivational model. In: ICWSM
149. Lievrouw LA (2010) Social media and the production of knowledge: a return to little science?
Soc Epistemol 24(3):219–237
150. Adam NR, Shafiq B, Staffin R (2012) Spatial computing and social media in the context of
disaster management. IEEE Intell Syst 27(6):90–96
The Growing Role of Social Media in International Health Security: The. . . 357

151. Padmanabhan A et al (2014) FluMapper: a cyberGIS application for interactive analysis of


massive location-based social media. In: Concurrency and computation: practice and experi-
ence, vol 26, no. 13. pp 2253–2265
152. Ashktorab Z et al (2014) Tweedr: mining twitter to inform disaster response. In: ISCRAM
153. Stawicki SP, Firstenberg MS, Papadimos TJ (2018) What’s new in academic medicine?
Blockchain technology in health-care: bigger, better, fairer, faster, and leaner. Int J Acad
Med 4(1):1
154. Swan M (2015) Blockchain: blueprint for a new economy. O’Reilly Media
155. Shyamasundar R, Patil VT (2018) Blockchain: the revolution in trust management. Proc
Indian Natl Sci Acad 84(2):385–407
156. Gershenfeld N, Krikorian R, Cohen D (2004) The internet of things. Sci Am 291(4):76–81
157. Medaglia CM, Serbanati A (2010) An overview of privacy and security issues in the internet of
things. In: The internet of things. Springer, pp 389–395
158. Boulos MNK, Al-Shorbaji NM (2014) On the Internet of things, smart cities and the WHO
Healthy Cities. BioMed Central
159. Islam SR et al (2015) The internet of things for health care: a comprehensive survey. IEEE
Access 3:678–708
160. Höller J et al (2014) From machine-to-machine to the internet of things. Elsevier
161. Pang Z et al (2015) Design of a terminal solution for integration of in-home health care devices
and services towards the Internet-of-Things. Enterp Inf Syst 9(1):86–116
Part IV
Leadership and Partnerships
Effecting Collective Impact Through Collective Leadership on a Foundation of Generative Relationships
Leadership Lessons Learned from Global Health Threat Response: Ebola in 2014: As Seen Through the Eyes of a Former Virginia State Health Commissioner

Marissa J. Levine

College of Public Health, University of South Florida, Tampa, FL, USA
e-mail: mjlevine@health.usf.edu

“Relationships are the ‘secret sauce’ of our preparedness and response efforts,” I
commented in 2016 at an after action review and ongoing planning meeting held
with our statewide unique pathogen response partners in Virginia.
What follows is an effort to expand on the concept and definition of public health
emergency preparedness, one that leaves room for a deeper dive into what underlies
the leadership element of preparedness [1]. Through the lens of the story of Virginia’s
response to the Ebola threat of 2014 from a leadership vantage point, this chapter
will elucidate a relational focus foundational to effective organizational behavior and
disaster response. Such a focus was integral to the successful pre-event preparedness
as well as the Ebola threat event response in Virginia. In addition, this focus resulted
in an approach never before taken at the state level that will likely serve to inform
responses to complex public health urgencies and emergencies in the future.

1 Key Leadership Lesson Learned

The key leadership lesson learned from Virginia’s experience with the Ebola threat
was that leaders who value and respect the establishment (pre-event) and ongoing
maintenance of generative relationships (positive relationships where both parties
are better off) lead more effective, efficient and adaptable emergency preparedness
and response efforts. Such relationships are necessary within the leader’s organiza-
tion as well as between key people within stakeholder organizations primarily due to
the uncertainty that clouds the public health emergency preparedness, response and
recovery arena (EPRR). Those relationships become critical in finding a path toward
making sense out of the uncertainty that initially manifests in emergencies. At best
this work is complicated, but more likely we find ourselves in complex or chaotic
environments, a so-called VUCA (volatile, uncertain, complex and ambiguous)
world in which leaders must partner to determine an approach to sense making
[2]. It is hoped that this chapter will be part of a larger effort to “spread an epidemic”
of a growth mindset [3] – a ‘can-do, can grow’ mindset that also values positive
relationships - within the emergency planning, response and recovery leadership
communities. The goal of the “epidemic” is for these mindset changes to become
foundational (‘endemic’) to all leaders’ actions and decisions. Consider it “genera-
tive relationships first and foremost” before, during and after the emergency.
This relational focus in no way minimizes the importance of competency in the
emergency preparedness and response capabilities. What it does, instead, is empha-
size the need for both performance and humanity, to quote leadership guru Bob
Sutton [4]. The balance between accountability for outcomes and the health and
well-being of the people doing the work is the important work of leadership. Also, in
keeping with the groundbreaking emotional intelligence work of Daniel Goleman
[5], very often leaders assume the secret sauce in such work is technical competency
and rigid, centralized command and control when, in reality, our ability to connect to
one another as human beings is the often overlooked skill and focus required in
emergency preparedness, response and recovery. Leaders who learn this lesson and
value such relationships can lead by creating the conditions in which positive
relationships are formed, maintained and nurtured. We see the strength of positive,
caring relationships borne out during the early hours of a catastrophe when neighbor
helps neighbor and all differences are seemingly pushed aside despite the absence of
an incident commander and command structure. This could be described as a form of
non-romantic love and perhaps this form of love is the intervention most needed in
large scale global and public health emergencies [6] where agility and adaptability to
local conditions is most needed. It is possible that the full impact of such an approach
has yet to be realized.

2 Ebola Panics Virginians

During the fall of 2014, panic gripped the people of Virginia as it did elsewhere
when the 2014 Ebola epidemic in West Africa [7] became a real local issue for
people in the continental United States.
Virginia was one of five jurisdictions with an airport of entry (Dulles International
Airport – IAD) for individuals traveling from the affected region of West Africa
(Sierra Leone, Guinea and Liberia) and therefore played a significant role in the
monitoring and evaluation of travelers from that part of the world. Building on many
years of public health and emergency preparedness experience, Virginia key officials
initiated a novel approach to preparing for and responding to the Ebola threat. In
addition to the collective effort of many colleagues and partners (local, state, federal,
private sector and other non-governmental organizations (NGOs) to name a few),
actions by key leaders from multiple sectors were instrumental in setting the
direction and assuring the conditions for a successful implementation of a robust
plan. We share those insights and tease out key leadership principles in this chapter
for consideration by other leaders at all levels as they plan and respond to increas-
ingly complex global public health challenges.
Key insights that will be elucidated in this chapter:
• Positive interpersonal relationships are key:
– Between and among leaders prior to the event
– Between leaders and those they lead as a foundation for trust which is central
to an effective, efficient and adaptable response
• Positive relationships emerge from a growth mindset and emotional intelligence
• Consider incident management/health security threat response as forms of col-
lective action with resultant collective impact [8].
• Competency always matters and is grown over time with practice during which
learning takes place through generative relationships (with self and others) in the
context of a growth mindset.
• Robust and comprehensive communication and flow of information is founda-
tional and best occurs through established generative relationships
• A clearly articulated vision implemented using adaptive capabilities is necessary
• Creating organizing structures and processes that promote all of the above and
support and enhance local agility and adaptability to the response is a critical job
of leaders

3 Background

Virginia’s governmental public health system dates back to the early 20th cen-
tury when the State Board of Health and the Virginia Department of Health
(VDH) were founded in response to tuberculosis. Per Virginia law, all of the public
health regulatory oversight functions rest with the Virginia Board of Health, a body
of gubernatorial appointees [9]. However, in any given year, that body meets only
four times, for a one-day meeting each time. As a result, Virginia law also designates that
the Virginia State Health Commissioner, a gubernatorial physician appointee, shall
be “vested with all the authority of the Board when it is not in session, subject to such
rules and regulations as may be prescribed by the Board” [10]. The State Health
Commissioner is also the agency lead for VDH. Since its inception, VDH evolved
into an integrated governmental public health department with 119 local health
department offices.
Virginia’s governmental public health system, as defined in the Virginia Code,
has a regional focus via its 35 local health districts (LHDs), which by law must be led
by a physician health director [11]. LHDs’ borders are defined generally by popu-
lation size (Fig. 1).
Fig. 1 Virginia’s local health districts

That is, some health districts represent single large jurisdictions while others
(generally more rural) are composed of up to ten jurisdictions, each with its own
local health department, albeit some with limited staffing. Thirty-three of the 35 local
health districts are part of the state system and the other two are locally administered
but seamlessly contracted with VDH. For all of the fully integrated health districts,
local health district budgeting is a cooperative venture with each locality matching
the state’s share and, therefore, also partially financially supporting district opera-
tions. This funding strategy is important to ensure that, despite the fact that 33 of the
35 district personnel are state employees, they are seen as integral and integrated into
local governmental functions. This approach also provides flexibility for the local-
ities to enhance local health department activities over and above the basic contrac-
tual expectations when local differences require an augmented role. In addition,
because Virginia is not a “home-rule” state, this model proves advantageous to the
coordinated rollout of a rapid emergency response [12].
VDH has a long history of public health emergency preparedness and response
(EPR). Before 9/11/01 and the anthrax attacks that followed, VDH led infectious
disease outbreak responses and was lead or co-lead concerning environmental health
issues. These epidemiologic functions also were organized in a regional model to
assure that each unique state region has access to critical support and coordination
resources. Furthermore, the Virginia Emergency Medical System (EMS) and hospi-
tal systems already were well organized into regional councils (Fig. 2) with signif-
icant coordination and planning among localities in their defined regions and state
level coordination via contractual obligations and critical relationships. Each
regional council had a Mass Casualty Incident (MCI) plan and relevant policies
consistent with any relevant state policies and national best practices in trauma and
EMS care provision. Those response capabilities represented good examples of
cross-agency and external partner coordination; however, they were not fully
informed by standard public safety approaches to incident management until after
9/11. Also, other than required work with other state agencies, this VDH work was
siloed somewhat within the governmental public health arena and even within
respective departments within VDH. This was more the result of years of focused
activity, often driven by the siloing of funders (e.g. the federal government) who
required unique and unrelated databases and data collection. Looking ahead,
tremendous opportunities existed in moving the expectation to one of leading across
siloes and creating new cross-sector relationships.

Fig. 2 Virginia’s EMS regions and regional councils
Like all health departments, VDH began developing its formal public safety
preparedness response capabilities before the events of September 11, 2001. With
the 9/11 attack on the Pentagon in northern Virginia and the subsequent anthrax
attacks, VDH found itself in full response mode using the experience and knowledge
it acquired to date. The prior response work was complex in its own right, but the
events of 9/11 and those that followed represented a new definition of a chaotic landscape
demanding different thinking and approaches.
The Department’s subsequent preparedness efforts were informed by the
response to those events and ramped up in earnest with the initiation of the US
Centers for Disease Control and Prevention’s (CDC) first bioterrorism grants (sub-
sequently known as the Public Health Emergency Preparedness Program – PHEP) as
well as the US Department of Health and Human Services (HHS) Health Resources
and Services Administration’s (subsequently managed by the Office of the Assistant
Secretary for Preparedness and Response (ASPR)) Hospital Preparedness Program
(HPP) grants in the early 2000s. These two programs led to a rapid buildup of
staff and resources as had not been seen before in governmental public
health. These programs also laid the groundwork for the development of novel cross-
sector relationships between VDH and others particularly in the local, regional,
statewide and national public safety communities.
Virginia strategized an approach to utilize this significant and, in some cases,
unprecedented investment in governmental public health. These strategic decisions
ultimately turned out to lead to very robust system development. Those strategic
decisions built upon two key concepts.
The first concept of the EPR strategy built off the organic and distributed nature of
VDH itself. VDH was one of a small number of state agencies that were well
integrated into local government functions and thus a true part of local communities.
As an integrated, statewide public health system, investments in the local health
districts were a form of capacity building in Virginia’s communities. Lessons
learned from prior response efforts identified that epidemiologic capability was
lacking in many LHDs and there was also a need for enhanced public health
emergency planning and coordination capacity or expertise. The LHDs’ ability to
be recognized as an integral member of the local public safety collaborative was
limited by the lack of such capacity or expertise in many localities. Leaders at VDH
at the time decided to put the majority of PHEP funds toward enhancing local and
regional personnel. This resulted in the recruitment and support of an epidemiologist
and a public health emergency planner in each of the 35 LHDs along with regional
epidemiologists and regional emergency planners in each of VDH’s designated
regions. The epidemiology positions were seen as an immediate benefit particularly
in those LHDs with no prior such position. The regional epidemiologists ensured
coordinated and collaborative approaches to issues that crossed district jurisdictions
and also established a connection to VDH’s central offices to ensure coordination up
and down the command chain. The LHDs traditionally enjoyed close relationships
with their local hospital infection control preventionists. The addition of the epide-
miologists augmented these relationships significantly and this would positively
impact our Ebola planning and response efforts.
The emergency planners initially focused on development and coordination of
plans as their names implied. What would be considered an unintended consequence
of this component of the personnel enhancement was the infusion of many skilled
individuals with little to no public health experience but who brought diverse
experiences with and relationships from other key sectors including emergency
management, military operations, fire programs, strategic planning, logistics and
other administrative areas. The recruitment of such individuals immediately
enhanced the diversity of backgrounds and relationships brought to the local and
state health department offices and, at the same time, introduced public health to a
cadre of skilled individuals who were unaware of the mission and scope of govern-
mental public health work. This consequence brought huge dividends to VDH and,
undoubtedly, paved the way to the development of key positive relationships with
our public safety and local, state and national key EPR partners.
Over time, as plans were implemented and new relationships emerged, the
function of these emergency planners expanded. They became known as public
health emergency coordinators, emphasizing the critical role they played locally and
regionally to ensure coordination among key partners, particularly with their local
public safety colleagues. Local PHEP positions were a significant enhancement to
the public health emergency preparedness and response capacity not just in the LHD
but in the communities they served. All of the above personnel supported and
enhanced the role of the LHD director who was often the Emergency Support
Function (ESF)-8 [13] lead in the jurisdiction(s) they covered. In addition, over
time, all VDH personnel were considered part of the VDH response effort. As stated,
and perhaps most importantly, the addition of these local health department positions
promoted the creation and maintenance of generative relationships between and
among key LHD partners.

Fig. 3 Virginia hospital preparedness regions
The second concept was implemented with the initiation of the HPP grant
program when VDH leaders also made a critical decision that would reap benefits
down the road including during the Ebola threat. That decision was to subcontract
elements of the HPP grant to the Virginia Hospital and Healthcare Association
(VHHA), the state’s member organization for hospitals and a large number of long
term care facilities. Contracting with VHHA was the first step in developing a more
robust regional and statewide healthcare coordination system. This partnership
evolved to ensure that each of the state’s six hospital regions (Fig. 3) established a
regional healthcare coordinating center which could be activated in large scale
regional or statewide events. These centers were organized around Virginia’s level I
trauma centers and were intimately connected not only to each other and the hospital
systems they served but ultimately to regional emergency healthcare coalitions and
the statewide public health emergency coordinating system – ESF 8 – which is led
by VDH as an integral partner in the state’s emergency planning and response
community. Here too, in retrospect, the many years of close coordination and
collaboration created generative relationships through which plans could be devel-
oped, issues solved, and ideas cross-fertilized. Many of the people in the many roles
resulting from this VDH-VHHA partnership remained in their positions over a
number of years. The generative relationships that developed were maintained as a
result of the longevity and continuity of the people involved in these efforts. Even
when some of the key positions turned over, the strong bonds that had developed
resulted in a smooth, planned and coordinated transition that included an emphasis
on forming a relationship on par with the one that was terminating. This aspect of our
preparedness cannot be overstated and was consistent with best practices and models
for integrating hospitals into the community emergency preparedness and response
planning process [14].
Prior to the Ebola threat of 2014, VDH and its partners worked to continually
evolve and improve the health care and public health response system that was
described above. I often reminded those I worked with that “We were only as good
as our last response.” Personally, I frequently reminded myself of the story of Rick
Rescorla and his efforts on and before 9/11. Rick’s story is captured in the book “The
Unthinkable” by Amanda Ripley [15]. The lessons of Rick’s efforts as head of
security at Morgan Stanley Dean Witter in the World Trade Center were important
for all EPR leaders. I specifically embraced Rick’s lesson that we cannot over-exercise
or over-prepare for emergencies. The need to practice our responses so that
they become automatic in an emergency, where we might otherwise be prone to
freeze or under-respond, was made all too clear by Rick’s heroic feats on 9/11.
Rick’s approach was designed to overcome the “freeze” mindset that plagues people
who find themselves in emergency situations. Repetitive training automates our
responses at a time when our brains go to their lowest evolutionary level and prevent
actions necessary for our survival. Rick understood that emotion and behavior were
critical aspects to consider in emergency response yet such training is not necessarily
integrated into ICS/NIMS training as noted by the work of McNulty et al. [16] who
identified a new model incorporating brain science insights into crisis leadership
training systems.
Based on the above insights, we took advantage of every opportunity to exercise
our plans and learn and adjust from those activities. Thus, Virginia, as evidenced by
its high marks on evaluations based on national benchmarks,1 became a model for
public health preparedness and health care regional coalition development. Regional
coalitions and regional collaboration were considered critical success factors as
highlighted in many areas around the country [17]. Furthermore, VDH required
that all of its 35 health districts be certified as NACCHO Project Public Health
Ready (PPHR) [18]. Much of the structure and process put in place in Virginia
informed the subsequent Institute of Medicine’s Crisis Standards of Care report and
recommendations [19]. Again, this system didn’t just happen. There was an inten-
tionality of effort built on foundational relational values and beliefs (that local and
regional relationships matter) which led us in this direction.
A good example of the approach to practice and exercise was the annual effort to
immunize the community against influenza by exercising our mass dispensing
capabilities in most of our local health districts – similar to what Maryland local
health departments described in 2005 [20]. These drills involved all staff in the
districts. Special efforts were made to intentionally assure that those in the community
who might otherwise be unable to access influenza vaccines were included in these
drills. These actions were seen as a way to also improve community health and
increase community resilience. These efforts also helped us realize that activities
such as community mass dispensing could be accomplished and, with creative and
innovative approaches, often in an expedient and efficient manner. This focus on
continuous learning and exercising led, we believe, to the “can-do” attitude that was
pervasive throughout the department.

1. Virginia regularly scored among the best on evaluations by the CDC for its Strategic National
Stockpile program.
Relevant to our pre-Ebola readiness, those tests were not only prompted by
common outbreaks, other natural events and scheduled exercises, but also resulted
from the H1N1 influenza pandemic of 2009. During H1N1, Virginia, similar to
many states, realized its pandemic flu plan required adaptation to meet the actual
H1N1 threat. Virginia was able to adapt those plans and meet the H1N1 threat in
large part because of the robust relationships developed as a result of the concerted
efforts of multiple public and private sector partners prior to 2009. This conclusion
isn’t just speculation on our part but is supported by the international work of Chen
et al. [21] who noted that in a VUCA world “flexible adaptive coordination and
control mechanisms are required.” Their case study review highlighted the “critical
roles played in ensuring alignment and coordination among partners by (1) relational
and social capital, and (2) social, organizational and economic mechanism.” Valuing
the social component (generative relationships) over time results in the institution-
alization of those behaviors. Their review identified that “the higher degree of
institutionalization...the greater the likelihood of partnerships being successful.”
For us in Virginia, our many pandemic influenza planning efforts at the state and
local levels that were supported during the Bush administration allowed us to form
and maintain those generative relationships and institutionalize those critical behav-
iors. Furthermore robust communications prior, during and after a response, so
critical to the success of our efforts, could only be built on the network of those
relationships noted. H1N1 expanded the types of partners with whom we developed
such relationships and forced us to enhance our adaptability. H1N1 also humbled us
which further reinforced the mantra that being prepared required constant vigilance,
persistence, adaptability and continuous learning.
One key specific aspect in the realm of Virginia’s public health emergency
training and exercises was the focus on leadership training. That training included
the standard FEMA specific NIMS courses and also very specific public health
leadership training via a Meta-Leadership Summit [22] centered on the meta-
leadership model from the Harvard National Preparedness Leadership Initiative
[23]. NIMS training was frequently done under the auspices of our local first
responder community. This ensured that our public health personnel not only learned
the “language” of the first responders but also had the opportunity to get to know
them as people and vice versa before the event ensued. The Meta-leadership Summit
held in Blacksburg, Virginia in May of 2011 was also timely as it allowed us to take
lessons learned from a number of significant events including H1N1 and the Virginia
Tech shooting tragedy of 2007 and bring together key leaders from around the state.
The location lent itself to having both rural and urban leaders convene and learn
together. We also utilized it as an opportunity to discuss how leaders from different
parts of the state would collaborate should a statewide mass disaster strike.
A final critical activity to highlight that occurred before the Ebola threat relates to
the change in state government leadership that occurred in early 2014. When the new
McAuliffe administration took over in January of 2014, VDH leadership invited the
Governor, Secretary of Public Safety and Homeland Security, Secretary of Health
and Human Resources and the State Emergency Coordinator to an overview briefing
at the VDH Emergency Coordination Center (ECC). The ECC was a defined
physical space in the VDH central office building in which the incident management
team (IMT) could coordinate regional or statewide public health responses to be sure
that local efforts were adequately resourced and effectively aligned in their efforts.
During state level emergencies where the state Emergency Operations Center (EOC)
was activated, the VDH ECC became a site to support ESF-8 and other ESFs and
again assure adequate local and regional public health support and coordination.
During this briefing with the Governor and his staff, VDH leaders reviewed the
public health and healthcare capabilities that had been developed over the prior
decade or more of effort. In addition, the statewide videoconferencing capability was
demonstrated by connecting with VDH personnel from around the state. This
demonstration also allowed us to showcase and build awareness of the regional
healthcare coordination system that was in place. The overarching goal was to
develop positive relationships between VDH leaders and their teams with the
Governor, key cabinet members and the state emergency coordinator. In meta-
leadership terms, we were “leading up.” Given that a new state emergency coordi-
nator took the helm at the Virginia Department of Emergency Management
(VDEM), I also made it a point to develop and maintain a strong working relation-
ship with the new coordinator. This allowed VDH to demonstrate its support for the
new coordinator and maintain its status as a valued member of the Virginia Emer-
gency Response Team – the group of state agencies that were central to statewide
emergency response. Through this relationship building process, the history of the
state’s coordination during prior health emergencies was reviewed and lessons
learned shared. In meta-leadership terms this was leading “across.” These efforts
to develop important generative relationships would come to pay off significantly
later that year. We believed that such relationships are the context in which trust is
developed [24]. Trust was a commodity on which we would come to rely during the
Ebola threat and is a critical ingredient to the collective impact effort required of a
team in general and certainly during an emergency [25]. Overall, the approaches
described above were examples of key activities that led to the development of the
“secret sauce” to which I was referring in 2016.
In retrospect, what developed over the years from 2002 until mid-2014 was a
robust, nested system of preparedness and response connecting local entities to each
other as well as to state, neighboring and national resources. In addition, Virginia’s
integrated public health system provided an important local component to the
emergency response community with the addition of an epidemiologist and emer-
gency coordinator in each of the 35 health districts. Perhaps even more critical was
the stability of the specialists dedicated to public health emergency preparedness
within VDH. Although exact numbers are not available, there are countless exam-
ples of employees who have been with the VDH emergency response and prepared-
ness programs since their official inception after 9/11. All told, VDH leaders and
personnel took their emergency preparedness and response roles very seriously as
was demonstrated by top scores in national benchmarks for public health prepared-
ness [26]. By 2014 it was not a stretch to say VDH and its leadership were trusted
partners in the state’s emergency management community at the local, state and
national level.

4 Six Foundational EPR Principles for VDH Leaders

At this point it is important to highlight the values and beliefs that Virginia public
health leaders used as a guide to their work and leadership. These principles evolved
over the years as the VDH EPR system developed but were more “lived” than
articulated in a document until now:
1. All VDH EPR work required internal and external partnerships and teams. It was
critical and of the highest import to develop generative relationships within the
organization and with key external public safety, emergency response and other
key stakeholders, including the public, before the emergency.
2. Emergency preparedness and response work was best performed in a manner that
improved our day to day core public health functions.
3. The best emergency preparedness and response systems were manifestations of,
and scaled-up versions of, day-to-day approaches to our core public health work.
4. Within the agency everyone had an emergency preparedness and response role.
5. We were only as good as our last response; therefore, we utilized every opportunity to
test and exercise our public health emergency preparedness [27] and healthcare
emergency preparedness capabilities [28].
6. Leadership mattered at all levels, and leaders have a special job to assure condi-
tions in which experts in their field can do their work unfettered and in an agile,
adaptive yet coordinated manner.

5 The Ebola Outbreak of 2014

During their first year in office, all new state health officials (SHOs) have
the opportunity for an in-person orientation to the CDC at its Atlanta headquarters.
During the spring of 2014 I was one such SHO who found herself at the CDC
headquarters in Atlanta. While I was touring the CDC Emergency Operations Center
(EOC), the “great wall” at the front of the center showed a large number of screens
with various information ranging from TV stations reporting live to maps of current
global outbreaks. One such outbreak captured my attention. It was a rather large
Ebola outbreak in the West African nations of Sierra Leone, Guinea and Liberia. As
someone who thought proactively about what might be on the horizon of concerns, I
recall saying to myself that this was something we needed to follow closely in
Virginia.
The Virginia public health enterprise had again learned from H1N1 and other
emergencies about the importance of pre-event planning and scripting. Specifically,
everything that we could do before an event, we should do, even if it only resulted in a
template for action. Those included job duty sheets, incident command system
organization charts, pre-scripted media releases and pre-scripted public guidance
documents, among others. Plan development was a critical function at all levels of
the organization. In the VDH central offices, this effort came to be known as our
playbook development process and by 2014 was a robust activity that had resulted in
a number of such threat/hazard specific playbooks. The goal with the playbook was
to manage those aspects of the initial hours/days of a response that made it unique to
the specific issue by having defined steps written out and materials available in one
place – preferably electronically. The playbooks also allowed VDH, particularly its
central office IMT, to periodically exercise and make sure assigned personnel
understood their potential roles and the relevant resources before the event. In
retrospect, acknowledging that no plan can perfectly prepare us for the reality we
face, what we were most critically doing was creating a growth mindset and assuring
the conditions for a highly functional, agile and adaptable team.
During the summer of 2014 the Ebola outbreak in the three affected West African
countries worsened and, for the first time, caused widespread transmission in large
populated areas [29]. On August 1, 2014, CDC issued its first Health Advisory via
the Health Alert Network (HAN) providing guidelines for evaluation of US patients
suspected of having Ebola Virus Disease (EVD). By August 8, 2014, the World
Health Organization (WHO) declared the Ebola situation a Public Health Emergency
of International Concern [30]. While WHO and CDC were preparing their global
and national response plans, Virginia also was gearing up. I issued the first Ebola-
specific “Dear Colleague” letter to all health care providers summarizing the CDC
guidance and tailoring it to Virginia. This communique was delivered via a time-
tested system to assure that information of public health urgency was disseminated in a
timely manner to all clinicians and key partners throughout all parts of the state. Our local
health districts were also instrumental in further disseminating this information so all
providers had the opportunity to receive these important messages. Additionally, key
partners such as the Medical Society of Virginia and VHHA further disseminated
such messages to their members. We valued the relationship with our clinical
partners and the use of these “Dear Colleague” letters was one way to initially
develop and then maintain those trusted relationships.
Additionally, Virginia’s state public health lab, the Division of Consolidated
Laboratory Services (DCLS) in the Virginia Department of General Services, was a
key partner in all public health responses. DCLS became one of the 56 state/local
labs to receive CDC training to perform Ebola reverse transcription polymerase
chain reaction (RT-PCR) testing for EVD. This proved critical as in September 2014
Virginia was working to implement a system to screen and follow all travelers
returning from Ebola-affected countries in West Africa. Virginia played a special
role given that IAD, located in northern Virginia, was one of the five airports of entry
for such travelers after the federal government outlined its plan for mitigating the risk
of EVD disease transmission via travel.
Fig. 4 VDH central office IMT

The VDH Office of Emergency Preparedness and Response led the effort to develop
an Ebola-specific playbook and to define incident command structures at the state
level. LHDs did the same at the local level, resulting in alignment and coordination
within the Virginia governmental public health system. As with any new threat, the
IMTs at the VDH central office and local districts reviewed their plans and discussed
a path forward. When the case of Thomas Eric Duncan at Texas Health Presbyterian
Hospital in Dallas came to light, uncertainties abounded and media interest was high – we
were in a VUCA world. The fever pitch of concern took root quickly in Virginia as it
did elsewhere. Specifically, this required us to revisit our hospital preparedness plans
for such a threat and to rethink our approach to traveler screening given this new
heightened concern. VDH’s Office of Epidemiology took a lead role in the traveler
monitoring program which required a herculean task. The VDH central office IMT
was activated using the all-hazard framework already in place (Fig. 4).
Given that leadership was required at many levels of the organization, initiating
the IMT was critical to assure coordination and alignment of the effort. So VDH was
now required to address actions on multiple fronts – internally with VDH staff,
across with our emergency response partners, up with the Governor’s office as well
as far and wide with state and federal legislators and the public via the media, social
media and other outlets – and to do so in a thoughtful, transparent yet expedient way.
Remember, this was all being done in addition to the usual day to day public health
work with no new resources or funding.
It became clear that we were facing a monumental effort and entering uncharted
territory. The potential impact was staggering and mind-boggling, all the while
blanketed by a constant panic revisited and heightened regularly by the media.
To give some insight, consider what we had to address:
• Creating an entirely new traveler monitoring system that needed to be coordi-
nated with US Customs and Border Protection (CBP), Dulles Airport authorities, CDC
Quarantine station, local authorities, regional partners and other key state leaders
and agencies. Questions to consider included, “What would we do with persons
under investigation (PUI) for EVD if they didn’t warrant hospital treatment or
didn’t live in northern Virginia but needed to be detained?” or “How do we
maintain confidentiality and who is it that really needs to know when there is such
a traveler identified?” and “What about other ports of entry?”
• Redefining healthcare preparedness and response given the newly emerging
concerns about adequacy of personal protective equipment (PPE) and practices,
potential contamination of emergency departments and EMS vehicles and need
for a tiered system of care and triage with fully functional communication and
triage capabilities. Logistics, coordination, data systems, housing potential, infec-
tion control, privacy considerations, and personal and responder safety were some
of the newly emerging concerns and considerations.
• Reevaluation of the healthcare waste management process given the potential for
Category A biohazard waste and limited processor capabilities in our state and
nationwide.
• The need to define the roles of public safety first responders and provide adequate
PPE and other appropriate mitigation training.
• Consider potential impacts on the correctional facility community and their need
to be prepared in case they were impacted by a traveler who was a PUI.
• How do we obtain key stakeholder input on our plans and, once we have adequate
plans, how do we assure that they are disseminated widely, answering the
concerns and questions of various decision makers? And how do we adjust
plans as new information or other situational awareness is brought forward?
• How do we respond to public inquiries and communicate answers to their
concerns?
• How do we deal with the impacts on behavioral health for patients, travelers and
responders as well as that of the larger community?
• How do we assure an ethical and legal approach to all these efforts?
• How do we continue to do the day to day public health work given these new
responsibilities?

6 The Virginia Approach to the Ebola Threat

VDH and its partners developed the following approaches to the threat. As men-
tioned above, VDH first organized at the LHD and central office levels to be sure our
personnel were working collaboratively and in alignment. All this was done using
the NIMS framework which was ingrained into our planning and response. As with
other major responses, incident action plans were developed and regularly scheduled planning
meetings were held to be sure we clarified our objectives for the time period and
that concerns and feedback were heard. All of this led to a greater likelihood of
communicating one common message externally. There were many lessons in
history emphasizing why communicating in one voice was so important; however,
for me the lessons from a smallpox outbreak in Muncie, Indiana, in 1893 were ones
that I learned early in my EPR experience; they highlighted what happens when public
health and private physicians are at odds and communicating very different
messages to the public [31]. VDH had the good fortune to have physician leaders at
local and state levels. It was incumbent on us to communicate in one voice and to
ensure our clinical partners had the information they needed to do so as well.
From a leadership perspective, it was also clear during this response, as with so
many others, that we needed physician and non-physician leaders at all levels of the
organization. As long as the leaders were well connected to the larger effort by
providing input and receiving feedback, we could be assured their decisions would
be made in concert with the larger framework of the Ebola response. Importantly we
could also be assured that senior leadership situational awareness would be enhanced
through the same process. All leaders were expected to be active members of the
IMT and other critical meetings, including teleconferences, and to speak frankly
about what they were seeing and the decisions that were needed particularly if they
involved support or resources greater than that available within their immediate
command.
The VDH Ebola IMT identified critical planning and operational branches and
divisions that represented our key areas of effort:

6.1 Planning

Initial efforts focused on assuring adequate plans that considered the unique nature
of VDH’s multiple geographic regions. This effort built off our five very experienced
regional emergency coordinators and five regional epidemiologists. VDH also
developed a situation unit to be sure information coming to central office was
managed and information heading out was coordinated.

6.2 Operations

In the operations areas there were a multitude of branches that included Community
Health Services (CHS – representing the local health districts), epidemiologic
response, fatality management, behavioral health, hospital and healthcare opera-
tions, Medical Countermeasure Distribution/Strategic National Stockpile (SNS)
coordination, environmental and EMS.

6.3 Finance and Logistics

Administrative preparedness was a key lesson learned from the H1N1 pandemic
influenza response. VDH had experience with the necessary administrative pro-
cesses that needed to be streamlined in an emergency and had built such
requirements into its procurement processes such that contracted vendors had
specific requirements to meet during emergency situations. This also allowed us to
approach the Governor and cabinet with realistic estimates of resource needs
(as much as was estimable) to inform them about potential resource shortages
allowing them to put emergency procedures in place as needed. Most importantly,
our hard working administrative staff could now identify their roles as being integral
to the EPR efforts. This, in my opinion, enhanced the esprit de corps necessary for a
highly functioning team.
The above highlighted what was predominantly the VDH central office leader-
ship perspective of the response. This was foundational to addressing health threats in the
Commonwealth. However, this health threat was much more complex and
concerning than most, and VDH’s response alone would not be adequate to respond
to the Ebola threat.
To explain what emerged from that VDH foundational effort, I will use the key
leadership principles we followed to highlight the additional approaches taken to
respond to the Ebola threat.

7 Preparing for Ebola Through the Lens of Our Key Principles

Principle 1 – Establish and maintain generative relationships


As described, the Ebola threat was rapidly evolving during the fall of 2014. What
became clear early on was that this effort impacted many stakeholders and would
require many partners, some of whom we had yet to identify. We needed to take
inventory of those partners with whom we were already working closely and then
define those whose skills and expertise would be required. We also needed to build
on the trust already established with our partners and work to maintain it throughout
the response. We could not assume trust would be maintained on its own.
To deal with this emerging threat that required activity on so many fronts, the first
order of business was to get organized internally at VDH. Despite an absence of a
gubernatorial emergency declaration, a declaration by the State Health Commis-
sioner and agency head indicating that VDH leaders were organizing in incident
command indicated to all that the issue was serious and required their attention.
Centrally an Ebola playbook was developed and an Ebola specific IMT was acti-
vated. These were usual first steps to make sure all at the agency knew we meant
business and that the relationships necessary for effective planning, communication,
operations and support were in place. But this would not be enough.

Within the agency, it was clear that this would be an operation unlike many
others. With IAD identified as one of the five entry point airports for travelers from
the affected West African countries, we now had to build a monitoring system and be
prepared for invoking isolation and quarantine orders. Although individual offices at
the VDH headquarters would take lead roles in these efforts, developing and
implementing these processes would require collaboration up, down and across the
agency.
External to the agency it was clear that we also had to be sure state government
was effectively mobilized. Here is where the relationship with the state’s emergency
coordinator was critical. Through ongoing one-on-one conversations in the early
days of the Ebola threat, our discussions focused on the best way to organize state
agencies. We concluded that we should recommend a unified command structure but
without full activation of the state’s emergency operations center (EOC) leaving that
option for a threshold event, such as the first actual EVD case or cases. The state had
never organized in unified command for a health issue and certainly had never done
so without the state EOC being at full activation.
This recommendation required the support of the Governor and cabinet secretar-
ies as well as buy in from the participating agencies. Building on pre-existing
positive relationships, much communication was initiated to begin to obtain the
needed support.
This process started with discussion with key agency leads, then a briefing to the
Governor and cabinet and subsequently many other agency leads. The trust that
existed prior paved the way to having such opportunities and for delivering a
message that was well received.
With the blessings of all required parties, the state Ebola Unified Command was
launched with VDH, VDEM and the Virginia State Police sharing the unified
command post – collective leadership for collective impact to respond to the Ebola
threat.
Unified command then significantly amplified the situational awareness and
capabilities needed by VDH to adapt to this threat. And in the early days where
panic and media hype ruled it was critical that we were organized in a way to process
both the events external as well as those within Virginia and to develop a unified
response and communication strategy.
While the partnerships and response organizations were evolving we also had to
respond to stakeholders, most importantly, the public. The advantage of unified
command as it relates to communication is that we can utilize multiple resources and
organize them in a joint information center (JIC). How intensely we would require
the JIC was unclear in the early days of the threat but it at least allowed us to
establish those key relationships so that we could call on others as needed. VDH also
had experience setting up public call centers and engaging the state’s 211 system as a
partner. Each of these strategies again was implemented particularly with the idea of
not only assuring quality information for the public and their questions but allowing
our local health districts to perform their local public health work without being
overwhelmed answering calls that could be handled by others. Of course, calls that
needed to be directed to the local health departments were forwarded by this call
center thereby maintaining and not undermining key local relationships.

Other key stakeholders also required updates and insights into the planning and
response unfolding. Key among those were state and local legislators. Efforts to
communicate with state legislators and to craft uniform messages that could be
provided by VDH’s local leaders also were critical. Although the legislative assem-
bly was not in session at the time, legislative committees periodically met in
Richmond. We took those meetings as opportunities to brief key legislators. Those
briefings were disseminated widely and served also to demonstrate to the public the
efforts of state government to manage the Ebola threat while concurrently providing
education.
In addition, coordination with our National Capital Region (NCR) public health
colleagues in MD and DC as well as with other neighboring states, was critical given
the unique aspects of that geography and the need to coordinate our responses to
monitoring and managing IAD travelers. Fortunately, we spent many years promot-
ing and maintaining generative relationships with each other. However, this com-
ponent of the response was even more complex given the role of various federal
agencies each of whom had unique tasks and responsibilities with lack of clarity of
coordination at the federal level. Here, too, pre-established relationships with CDC,
ASPR, HHS, and Department of Homeland Security (DHS) among others weighed
heavily in our favor and allowed us to have frank conversations with people we
knew well.
And there also was the need to keep state executive branch leadership informed
over and above the initial briefing. For this we discussed the best approach with the
Governor’s Chief of Staff and decided to hold briefing calls. This at first would be a
daily update call and later became a weekly call on the morning before the Gover-
nor’s cabinet meeting so that the Governor and cabinet could be fully apprised of the
current Ebola response status. These calls also assured us that any high level
concerns were vocalized early and could then be addressed. Holding these calls
was reassuring, enhanced situational awareness and, perhaps most importantly,
maintained the trust needed to allow the responding agencies to do their work
unencumbered. From my perspective, leading up in this way allowed me to buffer
my team from the politics of the issue so they could do the tactical work needed. I
can also say, in retrospect, that the trust that existed and was maintained, in part,
limited the Governor from feeling the need to be directly involved in day to day
decision making concerning the threat, unlike what was seen in other states at the
time.
Principle 2 – EPR should improve core public health activities
Principle 3 – Emergency response is scalar building on our day to day response
structures and processes
Much of what was done in the Ebola response was built upon the prior work and
experience outlined previously. The novel efforts, such as the extensive traveler
monitoring process, were developed in a manner intended to lay the groundwork for
innovations and improvements in our day to day public health work. For example,
the team developing the monitoring process explored novel ways to carry out the
monitoring particularly to ensure the safety of the monitor while protecting the
privacy of those being monitored. We had a pressing need to develop these systems
and processes but always had an eye out for how it might allow us to do our core
public health functions better.
Internally our Office of Epidemiology took the lead on developing the Ebola
specific monitoring and quarantine/isolation process. These efforts built upon the
structure already in existence at VDH. Given our local health district integrated
structure, one of our first decisions turned out to be easier than initially thought.
Should the required traveler monitoring program be operated centrally from Rich-
mond or decentralized in our local health districts with central office support? Given
the local capabilities and the importance of the local relationships that would be
needed for an effective monitoring program, we chose the latter. We didn’t know at
the time, but this decision also turned out to affect the isolation/quarantine process
decisions.
As VDH learned from its public safety partners and implemented the NIMS to
inform our public health EPR, we realized, consistent with principle 3, that many of
the systems developed for our foundational public health work could form the basis
for emergency surge. An important example was our response to a case of active
tuberculosis. We utilized those practices and protocols as a basis for a large scale
infectious disease outbreak response. As we did so, it became clear that legal and
administrative preparedness were also important. Virginia state government was
proactive in the wake of 9/11 and put resources into enhancing our legal readiness.
Importantly, as a result of our relationship with state legislators and executive branch
leadership, Virginia’s isolation and quarantine laws were updated in 2007 to assure
the legal framework was in place. However, an involuntary order of quarantine had
never been issued in recent memory, so we had to be sure that the process utilized for
a PUI for EVD was clear, consistent with the law and communicated to all parties.
This effort alone impacted an incredible number of stakeholders. In addition to VDH
and its state partners (and informing the Governor and cabinet), an involuntary order
of quarantine might require law enforcement support, local government for housing
or social services if needed, local EMS and hospital providers, the Attorney Gen-
eral’s office for legal support, possibly translation services, federal partners and the
always critical ongoing communication needs.
With unified command in place, we were able to take our newly revised plans and
run them past critical healthcare and public safety partners and more efficiently craft
the necessary communication messages.
But an actual patient with EVD would raise the level of concern and elevate the
urgency for action and the need for assurance – a surge in its truest sense. Here too
we could draw on our core public health activities and build from them. It was very
common for VDH leaders at all levels to provide public messages. For some urgent
and emergent public health issues, the Commissioner or Governor needed to be front
and center. EVD would be one of those cases. To prepare for that we actually
practiced the messaging through mock press conferences and worked with the
Governor’s staff to be sure he was aware of what we recommended in that eventu-
ality. By doing this we met a number of needs. One was the need to be prepared
specifically for Ebola, another was maintaining the trust of the Governor and a third
was maintaining our competency in this area which would help us for other core
public health work in the future.
One other important arena that the early events of the Ebola threat emphasized
was the need to enhance our hospital and health care capabilities. The events in
Texas concerning their first EVD patient and the subsequent nosocomial infections
that resulted proved that our plans were not good enough. This was not easy because
we prided ourselves on the healthcare coordination system we developed together.
However, as I said previously, we were only as good as our last response. And we
knew we needed to heed that warning in Virginia. Our hospitals were already
starting to gear up, review protocols and assure adequate resources. We had a
solid foundation of balancing humanity and performance upon which we could
improve these specific capabilities.
One of the first steps we took was partnering with VHHA and convening their
board of hospital leaders for frank discussions about the role of Virginia’s hospitals
in managing a PUI for EVD as well as the systems we needed to augment or develop
outright. These were not easy discussions but because of the relationships already in
place, these difficult conversations could be had. In fact, we had long been prepared for an
opportunity to address the “C suite” about our robust healthcare preparedness system
but to date had not had their attention, given the myriad of pressing matters such
individuals needed to address on a daily basis. But Ebola got their attention and so
did we.
The leadership board was convened and we had our first opportunity to not only
broach the discussion about the hospital tiered system necessary for EVD manage-
ment [32] but also to fully brief them on the statewide and regional hospital/
healthcare coordinating system that had evolved and how that system would be
implemented to deal with the Ebola threat. As Commissioner, I approached that
meeting as one where we would lay out the facts and the needs and let the leadership
deliberate on how they would best meet that need. I actually left the meeting so they
could deliberate alone. What emerged from those hospital and healthcare leaders
working together was a Virginia specific approach based on state and federal
recommendations. The VHHA leadership and its member organizations agreed
that the state’s two largest teaching hospitals would become the primary Treatment
Centers for EVD patients, a larger number (seven) of smaller facilities would be key
Assessment Hospitals, but all hospitals would need to have some capability to
manage PUIs who “just showed up.”
This newly evolving system would require not only health care leaders in the
hospital and medical worlds, but EMS, public health, other first responders and local
community leaders working together to assure a seamless system of care. For the
Treatment Centers themselves, facilities needed to be reconfigured and the donning
and doffing process for personal protective equipment needed to be revisited through
enhanced training and exercising.
All of the above were in play during those last few months of 2014, not knowing
when they would be needed. But with travelers already arriving at IAD, the
system would be tested often despite not having an actual patient with EVD.

Principle 4 – EPR is everyone’s job


Principle 6 – Be a leader and create leaders everywhere
EPR during the early days of the Ebola threat meant that everyone at VDH played
a role. If you weren’t personally involved in the Ebola preparations or response then
you were likely taking on additional core public health work to assure that the day to
day public health work continued. All of this occurred in an environment of
uncertainty – uncertainty concerning how significant this would become and uncer-
tainty in terms of how long this would last.
This is the world of governmental public health and it required a level of
teamwork not often seen in other sectors. Given that financial rewards were limited,
the passion of the people who came to work for VDH was a driving force. This is an
important lesson for leaders in governmental public health. You may not be able to
provide enhanced compensation to your teams but you can lead them forward
toward the vision of your mission with focused precision aligning their efforts to
maximize their collective impact. Providing your teams with concrete evidence of
their impact both individually and collectively and then thanking them personally for
their effort provides a tangible benefit most appreciate. These are examples of the
high quality interpersonal relationships that promote learning and, concurrently,
provide psychological safety [33]. Psychological safety allows individuals to feel
comfortable taking risks and experimenting. Without such experimentation/risk-
taking, we could never find solutions to complex, novel problems. The job of the
leader is to work intentionally to create such conditions. Additionally, role modelling
these behaviors provides important lessons for all leaders in the organization who
can then do the same at every organizational level.
In addition to psychological safety, there had to be an effort to assure that our own
personnel’s families would be safe and cared for during an emergency response.
Pre-defining which roles were required in the emergency helped personnel define the
potential impact on their own families and plan accordingly. Also assuring that our
own non-content expert personnel received information and education about the
threat was critical to establishing trust and identifying concerns and potential mis-
understandings. With such an effort and acknowledgement by managers, we could
anticipate those potential concerns and integrate into our administrative readiness
those aspects of the response necessary to support our own personnel and those for
whom they cared. Safety first was our mantra.
Just expecting that the work will be done given these new demands is not enough,
however. Siloed work units in government often live with the mindset that they must
do with what they have. This however will just result in an ineffective response and a
stressed workforce. To counter that, relationships between executive leaders and
action officers must be robust such that a trusting environment encourages the kind
of communication that allows the work unit leader to be able to ask for help for
his/her teams. When the agency is in an “all hands on deck” mode, as we were with
the Ebola threat, then all agency resources are available for all work units. The
appropriate prioritization and resource allocation are critical roles of the senior
leadership team who also need to maintain their visibility. Consistent with
“Management by Walking Around” as popularized by Peters and Waterman [34], being physically present
provides that visibility while also working to assure robust communication and
situational awareness. Very few things match the value of the leader being physically
present by walking to where the organization’s personnel are working, yet during
times of crisis and emergency response, this activity often is overlooked. These
seemingly insignificant actions may provide much needed support and connection in
a very human way and therefore represent a critical investment of time and effort by
leadership.
But it is not about “the” leader. It is about creating leaders everywhere. This is
most consistent with the concept of collectivistic leadership, as described by
Yammarino et al. [35], who define this type of leadership as “involving multiple
individuals assuming . . . leadership over time in both formal and informal relation-
ships.” Relationships are again key and foundational to creating the conditions in
which leaders emerge. Emergent leadership is critical to planning for and responding
to the types of health security threats we are likely to continue to experience in public
health. Emergent leadership is necessary given the need to answer the question,
“Who is in charge of what?” rather than the question, “Who is in charge?” This form
of leadership is particularly critical in the early moments of an emergency. McNulty
et al. confirm this in their case review of the 2013 Boston Marathon bombing
response [36]. Think collective impact through collective leadership built on top
of a solid foundation of generative relationships. This was our challenge for Ebola.
Concisely, this is also our collective challenge for public health in the future.
Principle 5 – Exercise/adjust, exercise/adjust, exercise/adjust to build competency
and develop robust relationships
Once the plans were in place and communicated, VDH and our partners exercised
what we could. At the senior leadership level this meant meetings of the IMT, review
of the playbook, testing an order of quarantine and/or isolation, working with
hospital and EMS partners to review actual and hypothetical PUI scenarios and
continually reevaluating the plans as they had been written. During one of the most
significant snow storms in Richmond during the 2014–2015 winter season, our plans
received a robust test. At that time a traveler recently returned from the affected West
African countries became ill with symptoms consistent with EVD. This individual
was in the monitoring program; however, this situation tested our EMS and hospital
response, the public health support and the communication between all key response
stakeholders. The weather added a significant component to the response, but the
strong relationships, prior planning and testing and the coordinating infrastructure in
place all resulted in a robust response. Internal and external communications worked
well and all outcomes were positive. In all, approximately 15 PUIs were transported
and evaluated during the threat period. These types of tests of the system provided
the important feedback needed to evaluate and revise our plans. We were able to
demonstrate that the local capabilities were in place and the state-level unified
command was meeting the support required of such local response efforts.
In addition to planned and unplanned local tests, the unified command also met to
identify gaps in situational awareness and planning. These meetings, done both in
person and virtually, highlighted the value of bringing diverse agencies and organi-
zations together. For example, in the early days of the response it was clear we had
more work to do with the management of Category A waste. Ebola-tainted waste is
so classified under the U.S. Department of Transportation’s (DOT’s) Hazardous Mate-
rials Regulations (HMR; 49 C.F.R., Parts 171–180). Management of such waste was
primarily the purview of the Virginia Department of Environmental Quality (DEQ).
DEQ needed to secure a transport/final disposition vendor and review the adequacy
of the vendor’s plans and capabilities. However, this was also around the time of the
first EVD case in NYC. We learned that the waste production from the care of one
patient was significant. We also learned that the waste had to be shipped to Texas for
final processing. Ground transportation of the waste to Texas went right through
Virginia on Interstate 81. This therefore required us to augment our plans to include
building assurance that there were no missteps with that waste in Virginia. So now
we had DEQ and the State Police involved to assure that when a shipment crossed
through Virginia we were aware and could monitor that transport to be sure we knew
that it had safely crossed our jurisdiction.
Similarly, through unified command we identified additional issues with marine
ports. It was not unusual for ships bound for the US from Liberia or other West
African countries to use Norfolk as their US port of entry. Thanks to inclusion of the
US Coast Guard, Virginia Port Authority and Tidewater military base liaisons in our
statewide unified command, we were able to review and enhance Ebola specific
aspects of their plans, connect all the relevant, federal, state and local authorities, test
the plans via tabletop exercises and remain apprised of the situations all were
encountering in that part of Virginia. As Commissioner it was clear that without
this organizational approach we would have had a fragmented response and perhaps
remained unaware of other key aspects of the Ebola threat. Instead we were able to
enhance our pre-event relationships with these entities and better assure the public
that if a regional threat arose we were ready to handle it successfully.
Positive relationships became the conduit for all that needed to happen in
Virginia. A growth mindset by the leaders fosters an environment where positive
relationships are valued. Therefore I believe leadership had a critical role in shaping
our direction. This was our version of collective impact and, as Kania emphasizes,
“several mindset shifts are necessary for collective impact partners” [37].
Collective action and its impact were present at all levels throughout Virginia
during the Ebola threat. As a result, as you would expect, collective leadership
fostered the emergence of leaders throughout the Commonwealth. A long history of
robust, generative relationships got us to where we needed to be in 2014 and then
valuing and maintaining such relationships, I believe, got us through the Ebola
threat.

8 Summary and Recommendations

It is imperative that I conclude by acknowledging the tremendous effort of our
local health districts and their partners in the central offices of VDH, particularly our
Offices of Epidemiology and Emergency Preparedness and Response – all leaders in
their own right. Given that IAD was one of the entry airports in the country, most of
the traveler monitoring burden fell on our northern health districts. In all, over 2200
people were monitored during the Ebola threat period. The decision to build that
monitoring system on top of the distributed, integrated statewide governmental
health department structure was a key reason our monitoring was successful. In
fact, because of the local relationships and trust previously built between partners
through exercises implementing involuntary quarantine during our pandemic flu
planning and other activities, traveler monitoring went relatively smoothly and,
most importantly, no involuntary orders of quarantine needed to be issued. This
result was not just lucky but also likely had much to do with the fact that local public
health workers were the ones interacting with the monitored travelers. Literally,
neighbors caring for neighbors. The sense of community could not be overstated,
hence the value of making sure the central office resources were not the primary
provider of these processes and service, but rather were in support of the local staff
doing such work with their neighbors. Valuing the critical role of maintaining such
positive relationships by helping improve local performance through capacity build-
ing of the local teams was the right leadership decision. Our decisions balanced
humanity and performance and ensured that all decisions factored in the impact on
generative relationships.
This chapter provided a view through the leadership lens. It does not, however,
adequately describe the effort of a large number of individuals and teams that set up
and implemented complex processes or plans for potential scenarios all necessary to
carry out this complex response. The dedication of these individuals is humbling and
inspiring. They are often the folks behind the scenes who are invisible to the public
but work so hard to minimize the impact or prevent outright the effects of serious
public health threats. Leaders who value their work, buffer their staff to allow them
to continue their work and attribute the accolades to them are more likely to get the
expected performance of their teams. Most importantly, such actions improve the
health and well-being of the people doing the work which can then be transmitted to
the people who need to benefit from such work. This is how we build resilient and
healthy organizations and, most importantly, this is a path toward resilient and
thriving communities.
Leaders planning for and executing responses to global health security threats
often expect that competency and rigid command and control structures are the founda-
tional success factors in an otherwise vague, uncertain, complex and ambiguous
situation. Competency and organization matter; however, this chapter has attempted
to show that positive, i.e. generative, relationships are the foundational glue from
which competency can launch and other leaders emerge to best prepare for, respond to and
adapt to the health threats we face. This “secret sauce” was at the root of the key
principles from which the health officials proceeded to plan and respond to the 2014
Ebola threat in the Commonwealth of Virginia, USA and helped assure that we
created the conditions for collective impact through collective leadership.
Valuing generative relationships is not always a given in the leadership realm.
Fortunately, the skills necessary to promote and maintain such relationships can be
learned. The required prerequisite, however, is that such relationships be valued by
leaders in the emergency preparedness and response community. Sharing our
Virginia public health experience with the 2014 Ebola threat is our way to spread the
epidemic of a growth mindset that values generative relationships. This is one “bug”
I hope you catch.

References

1. Nelson C, Lurie N, Wasserman J, Zakowski S (2007) Conceptualizing and defining public


health emergency preparedness. Am J Public Health 97(S1):s9–s11
2. Brougham G (2015) The Cynefin MiniBook. C4Media
3. Dweck CS (2006) Mindset: the new psychology of success. Random House, New York
4. Sutton RI (2010) Good boss, bad boss. Grand Central Publishing, New York
5. Goleman D (1995) Emotional intelligence: why it can matter more than IQ. Bantam Books,
New York
6. Levine MJ, Cooney M (2018) Love as a Public Health Intervention. J Public Health Manag
Pract 24(1):87–89. https://doi.org/10.1097/PHH.0000000000000736
7. CDC Ebola Website (2018) 2014–2016 Ebola outbreak distribution in West Africa. https://
www.cdc.gov/vhf/ebola/history/2014-2016-outbreak/distribution-map.html. Accessed 30 June
2018
8. Kania J, Kramer M (2011) Collective impact. Stanford Social Innovation Review. Winter
9. Code of Virginia (2018a) https://law.lis.virginia.gov/vacode/title32.1/chapter1/. Accessed
27 Oct 2018
10. Code of Virginia (2018b) https://law.lis.virginia.gov/vacode/title32.1/chapter1/section32.1-20/.
Accessed 27 Oct 2018
11. Code of Virginia (2018c) https://law.lis.virginia.gov/vacode/title32.1/chapter1/section32.1-30/.
Accessed 27 Oct 2018
12. Paeglow CE, Hall K, Wahl PM, Levine M, King B, Remley K, Dennis G, Daniel G (2011)
Evaluation of the Virginia Department of Health’s H1N1 Flu Mitigation Efforts. October,
American Public Health Association annual meeting, Washington, DC (poster)
13. Long B (2017) National incident management system third edition. Federal Emergency Man-
agement Agency. https://www.fema.gov/media-library-data/1508151197225-ced8c60378c393
6adb92c1a3ee6f6564/FINAL_NIMS_2017.pdf. Accessed 2 Nov 2018
14. Braun BI, Wineman NV, Finn NL, Barbera JA, Schmaltz SP, Loeb JM (2006) Integrating
hospitals into community emergency preparedness planning. Ann Intern Med 144:799–811
15. Ripley A (2008) The unthinkable. Crown Publishers, New York
16. McNulty EJ, Dorn BC, Serino R, Goralnick E, Grimes JO, Flynn LB, Pillay SS, Marcus LM
(2018a) Integrating brain science into crisis leadership development. J Leadersh Stud 11
(4):7–20. https://doi.org/10.1002/jls.21548
17. Stoto M (2008) Regionalization in local public health systems: variation in rationale, imple-
mentation, and impact on public health preparedness. Public Health Rep 123:441–449
18. National Association of County and City Health Officials (2018) Project public health ready.
https://www.naccho.org/programs/public-health-preparedness/pphr. Accessed 1 Nov 2018
19. Hanfling D, Bruce M, Altevogt BM, Viswanathan K, Gostin LO (eds) (2012) Crisis standards of
care: a systems framework for catastrophic disaster response. National Academies Press,
Washington, DC
20. Phillips FB, Williamson JP (2005) Local health department applies incident management
system for successful mass influenza clinics. J Public Health Manag Pract 11(4):269–273
21. Chen J, Chen THY, Vertinsky I, Yumagulova L, Park C (2013) Public private partnerships for
the development of disaster resilient communities. J Conting Crisis Manag 21(3):130–143.
https://doi.org/10.1111/1468-5973.12021

22. Sobelson RK, Young AC, Marcus LJ, Dorn BC, Neslund VS, McNulty EJ (2013) The meta-
leadership summit for preparedness initiative: an innovative model to advance public health
preparedness and response. Biosecurity Bioterrorism: Biodefense Strategy, Pract Sci 11
(3):251–261. https://doi.org/10.1089/bsp.2013.0056
23. Harvard National Preparedness Leadership Initiative (2018) Meta-leadership. https://npli.sph.
harvard.edu/meta-leadership-2/. Accessed 30 Oct 2018
24. Covey SMR (2006) The speed of trust. Free Press, New York
25. Lencioni P (2002) The five dysfunctions of a team. Jossey-Bass, San Francisco
26. University of Kentucky (2018) National health security preparedness Index. https://nhspi.org/.
Accessed 18 Oct 2018
27. Centers for Disease Control and Prevention (CDC) (2018) Public health emergency prepared-
ness and response capabilities. U.S. Department of Health and Human Services, Atlanta. https://
www.cdc.gov/phpr/readiness/00_docs/DSLR_capabilities_July.pdf. Accessed 22 Oct 2018
28. Assistant Secretary for Preparedness and Response (ASPR) (2017) Healthcare preparedness
and response capabilities. U.S. Department of Health and Human Services. https://www.
phe.gov/Preparedness/planning/hpp/reports/Documents/2017-2022-healthcare-pr-capablities.pdf.
Accessed 22 Oct 2018
29. Millman AJ, Chamany S, Guthartz S, Thihalolipavan S, Porter M, Schroeder A, Vora NM,
Varma JK, Starr D (2015) Active monitoring of travelers arriving from ebola-affected countries
— New York City, October 2014–April 2015. MMWR 65(3):1–11
30. World Health Organization (2014) Statement of the 1st meeting of the IHR emergency committee.
http://www.who.int/mediacentre/news/statements/2014/ebola-20140808/en/. Accessed 1 Oct
2018
31. Eidson WA (1990) Confusion, controversy, and quarantine: the Muncie smallpox epidemic of
1893. Indiana Mag Hist 85(4):374–398
32. Bell BP, Damon IK, Jernigan DB, Kenyon TA, Nichol ST, O’Connor JP, Tappero JW (2016)
Overview, control strategies and lessons learned in the CDC response to the 2014–2016 Ebola
epidemic. MMWR 65(3):4–11
33. Carmeli A, Brueller D, Dutton JE (2008) Learning behaviors in the workplace: the role of high-
quality interpersonal relationships and psychological safety. Syst Res Behav Sci 26:81–98.
https://doi.org/10.1002/sres.932
34. Peters TJ, Waterman RH (1982) In search of excellence. HarperCollins Publishers, New York
35. Yammarino F, Salas E, Serban A, Shirreffs K, Shuffler M (2012) Collectivistic leadership
approaches: putting the “we” in leadership science and practice. Ind Organ Psychol 5
(4):382–402. https://doi.org/10.1111/j.1754-9434.2012.01467.x
36. McNulty EJ, Dorn BC, Goralnick E, Serino R, Grimes JO, Flynn LB, Cheers M, Marcus LJ
(2018b) Swarm intelligence: establishing behavioral norms for the emergence of collective
leadership. J Leadersh Educ 17(2):19–41. https://doi.org/10.12806/V17/I2/R2
37. Kania J, Hanleybrown F, Splansky Juster J (2014) Essential mindset shifts for collective impact.
Stanford Social Innovation Review, Fall
Global Health Security Innovation

James Stikeleather and Anthony J. Masys

1 Introduction

Recent infectious disease outbreaks have demonstrated that a local threat can rapidly
become a global crisis that jeopardizes the health, economy, and safety of persons
everywhere. For example, the outbreak of severe acute respiratory syndrome in 2003
highlighted vulnerabilities within the global public health community about defi-
ciencies in rapidly detecting and responding to cross border outbreaks, thereby
emphasizing that the world was still ill-prepared for global public health
emergencies.
‘Unexpected events often audit our resilience … everything that was left unprepared becomes
a complex problem, and every weakness comes rushing to the forefront’ [1]

Challenges in managing the complex landscape of emerging microbial threats,
such as cholera, Middle East respiratory syndrome coronavirus (MERS-CoV), Zika
in 2015, and Ebola (2014–2018), are manifest. With such recent ‘disaster events’, health
security has emerged as a non-traditional security issue affecting national security
and global security. The transborder nature of such disease outbreaks as Ebola,
H1N1, H5N1, MERS, has emphasized the critical importance of understanding the
complexity and resident vulnerabilities that pervade societies with regards to infec-
tious disease threats. These outbreaks, with their transborder character, have resulted
in significant global societal and economic impacts. For example, the Ebola epi-
demic of 2014 affected the West African countries of Liberia, Sierra Leone, and
Guinea highlighting the risks of emerging pathogens on global populations. More
than 28,000 Ebola cases were reported from the three countries during the epidemic,
and >11,000 persons died. These countries are among the least developed in the

world, and their weak infrastructures and underfunded health systems were further
compromised by the epidemic.
The scope of the impact of pandemics on society is significant. As described in
[2:1], the deadliest pandemics in recorded human history include the Black Death
pandemic (bubonic/pneumonic plague; 25–40 million deaths) in the fourteenth
century; 1918 influenza pandemic (50 million deaths), and the HIV/AIDS pandemic
(35 million deaths so far). More recently ‘. . .cholera has repeatedly reemerged over
more than two centuries in association with global travel, changing seasons, war,
natural disasters and conditions that lead to inadequate sanitation, poverty and social
disruption’ [2:2].
The global health security domain is not limited to infectious diseases but has
seen the global implication of non-communicable diseases on societies realized in
healthcare costs associated with rates of heart disease, diabetes, obesity, hypertension,
and other non-communicable diseases [3:1886]. In fact, non-communicable diseases
are the leading cause of death and disability worldwide. The ‘wicked nature’ of
communicable and non-communicable diseases, drug resistant diseases, emerging
and re-emerging diseases highlight gaps in Global Health Security pertaining to
preparedness, detection and response. The question becomes: how do we innovate in
real-time to manage the global health security landscape?

2 Complex Problem Framing

The global health security landscape has an inherent ‘wickedness’ [4] to it, emerging
from a dynamic complexity. Such ‘wickedness’ or messes are seemingly intractable
and are characterized as value-laden, ambiguous and unstable, resisting being
tamed by classical problem solving. Actions and interventions associated with this
complex problem space can have highly unpredictable and unintended conse-
quences [5]. Helbing [6] argues that we have ‘. . .created pathways along which
dangerous and damaging events can spread rapidly and globally’. The World
Economic Forum Global Risks [7] places the spread of infectious diseases within
the top 10 risks in terms of impact. The risks that pervade our hyper connected world
highlight the ‘. . .fragility and vulnerabilities that lie within the social/technological/
economic/political/ecological interdependent systems’ [8]. This certainly challenges
linear, reductionist approaches to problem framing and solution navigation. Woods
[9:316] asks the question:
How do people detect that problems are emerging or changing when information is subtle,
fragmented, incomplete or distributed across different groups . . . Many studies have shown
how decision makers in evolving situations can get stuck in a single problem frame and miss
or misinterpret new information that should force re-evaluation and revision of the situation
assessment. . . .

Recognizing the wickedness of global health security landscape requires a frame-


work and approach to understanding the dynamics, inherent conflicts and possible
outcomes resulting from a particular framing of the problem space.

3 Discussion

Crisis management has dominated the Global Health Security domain. Sagan
[10:13] argues that ‘Things that have never happened before, happen all the time’.
This resonates with the problem space of Global Health Security whereby we are
continually ‘surprised’ by epidemic events.
The ‘perfect storm’ associated with the Ebola outbreak revealed that the gover-
nance of global health security is not working. Ebola is just one of many recently
emerged zoonotic diseases that have impacted global health security. As described in
Belay et al. [11:S65] 75% of emerging infectious organisms pathogenic to humans
are zoonotic in origin and have exhibited cross border transmission resulting in high
morbidity and mortality rates. In addition to the high costs in lives, the World Bank
estimated that six major zoonotic disease epidemics during 1997–2009 resulted in an
economic loss of >$80 billion.
The success in eradicating and controlling of infectious diseases is manifest.
However much of the dynamic complexity arises from the nature of pathogenic
microorganisms that can undergo rapid genetic changes, leading to new phenotypic
properties that take advantage of changing host and environmental opportunities.
The novel MERS coronavirus in Saudi Arabia and the H7N9 avian influenza virus in
Eastern China have pandemic potential with high mortality rates. Events like the
Ebola crisis have brought the issue of global health security back on the political and
the technical agenda, recognizing its societal, political and economic dimensions
[12:3].
Moore and Westley [13] argue that ‘complex challenges demand complex solu-
tions. By their very nature, these problems are difficult to define’. As described in
Masys [5, 14–16] the complexity lens is a key approach to examine and develop new
strategic possibilities regarding global health security problem framing that lever-
ages new ways of thinking and innovation. As described in Masys [5], in this
complex problem landscape associated with global health security, systems thinking
provides a worldview that informs one’s understanding and can be used as an
approach in problem solving. ‘Systems thinking’ as discussed in Senge [17] empha-
sizes interconnectedness, causal complexity and the relation of parts to the whole,
thereby challenging traditional linear thinking and simple causal explanations. Senge
[17] describes systems thinking as ‘a discipline for seeing wholes. . .a framework for
seeing interrelationships rather than things, for seeing patterns of change rather than
static snapshots’. Systems thinking purports that, although events and objects may
appear distinct and separate in space and time, they are all interconnected.
As described in Westley, Zimmerman and Patton [18] ‘. . .when we think of a
problem as complex, rather than complicated, the inquiry process unfolds in a
radically different way. The starting assumption is that of transformation, where
true novelty is possible. The status quo needs to be understood, but it is not perceived
as a constraint. Resources are assumed to consist of not only what is already part of
the system, but that which can be understood or created in relationship to each
other’.

3.1 Cynefin Framework

The Cynefin Framework [19–21] is best construed as a sense making tool. In this
way it helps us understand that the systems we are engaged in (Global Health
Security context) have an inherent dynamic complexity.
The framework (Fig. 1) has five domains divided amongst two overarching
characterizations: ‘ordered’ and ‘unordered’. The ordered characterization comprises
the ‘simple’ and ‘complicated’ domains whereas the unordered characterization
comprises ‘complex’ and ‘chaos’. The central domain is that of ‘disorder’.
When viewing the global health security landscape, the domain that best supports
the problem framing is the complex domain. In this ‘un-ordered’ domain, there are
cause/effect relationships but their non-linear nature and the multiplicity of agents
defy conventional analysis. Considering the outbreaks of H1N1, H5N1, SARS,
Ebola, the complex disaster aetiology arises from the intersection of the pathogens
with society. When applied to GHS domain, Cynefin points to the complexity that
characterizes the problem space thereby leading to the requirement for innovation
and creativity as a solution space.
We can’t solve problems by using the same kind of thinking we used when we created them.
Albert Einstein

Fig. 1 Cynefin Framework (Kurtz and Snowden [19]. https://doi.org/10.1147/sj.423.0462)
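
Since Fig. 1 itself is not reproduced here, a brief sketch may help make the domain distinctions concrete. The mapping below (in Python) pairs each Cynefin domain with the decision mode commonly associated with Snowden's framework; those decision modes and the outbreak example are illustrative assumptions added here, not content taken from this chapter.

# A minimal sketch (assumption: the decision modes commonly associated with
# Snowden's Cynefin framework; they are not quoted from this chapter).
CYNEFIN_DOMAINS = {
    "simple":      "sense -> categorize -> respond (best practice)",
    "complicated": "sense -> analyze -> respond (good practice, experts)",
    "complex":     "probe -> sense -> respond (emergent practice)",
    "chaotic":     "act -> sense -> respond (novel practice)",
    "disorder":    "first determine which of the other domains applies",
}

def approach_for(domain: str) -> str:
    """Return the typical decision mode for a Cynefin domain (illustrative only)."""
    return CYNEFIN_DOMAINS[domain]

# Per the chapter's framing, an emerging outbreak such as Ebola sits in the
# complex domain, so the response mode is probe -> sense -> respond.
print(approach_for("complex"))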

4 Global Health Security and Innovation

Managing a wicked problem space such as global health security requires creating
conditions and opportunities for innovation.
Innovation is an “essentially contested concept”. Gallie [22] originally introduced
the term to categorize the sorts of abstract, qualitative, and evaluative concepts, such
as beauty, fairness, security and social justice, about which there are genuine disputes
that appear intractable because the various uses and criteria of the concept are in conflict.
Everyone agrees to the concept because it is the only term used, but they cannot
agree on its definition, as it is used differently, with different interpretations, each time it is
used. For example, a group of people can look at a painting on a wall, agree that it is
a painting, further agree that it is a watercolor and not an oil, but might argue
vehemently as to whether it was art or not. Innovation, in the vernacular, is the same
thing, something that like beauty is in the eye of the beholder.
The Innovation Group at Dell, Inc. developed a set of attributes of
innovations to address this issue; these were useful in making management and
investment decisions around potential innovative projects. They are:
• Innovation and invention are different. Invention is the process of taking capital1
and creating knowledge. Innovation is the process of taking that knowledge and
turning it back into capital by developing a value proposition for stakeholders2
using that new knowledge.
• Innovations are new ideas as explained in Everett Rogers’ Diffusion of Innova-
tions [25] and defined as a “new idea, creative thoughts, new imaginations in form
of device or method”,3 to which the idea of a value proposition is added. An
innovation does not have to be a completely new idea but can be an old idea
applied in a new way or in a new area. For example, using lessons learned and
principles developed in manufacturing and applying them to healthcare.
• An innovation is generally forward thinking, addressing opportunities or prob-
lems not yet fully formed or expected in the future. They often surface
unrecognized needs or wants. Addressing existing problems in traditional ways,
or using Cynefin terminology, addressing simple (with knowns) or complicated
(with known unknowns) systems, is the domain of Total Quality Management
(TQM), Six Sigma, and other analytic approaches. Innovation applies in poten-
tially changing or replacing those types of systems but is more commonly applied
in complex environments (unknown unknowns as a system “probe”) or chaotic
environments (unknowable as an action).

1 Capital is used here to mean any resource applied which could include funds, people, intellectual
property, time, facilities, and even social capital such as reputation and favor exchange.
2 Any person, group, or “thing” (such as the environment and Dell’s innovations around the
circular economy) that has an interest or concern (can affect or be affected by) the innovation.
The concepts are based upon Freeman’s Stakeholder Theory [23] and Samantha Miles’ definition
and taxonomy [24].
3 Merriam-webster.com

• An innovation must be feasible, that is it can be made manifest. For example,


when the TV series Star Trek4 was first aired in 1966 it exhibited two “innova-
tions”, the communicator and the transporter. While farfetched in 1966, the
communicator was feasible and today most of us carry one around. In fact,
Capt. Kirk was limited to using his to ask Scotty to “beam me up”, Spock to
inquire on the ship’s computer, Sulu as to where the Klingons were, and McCoy if
the red shirt had died yet. Our communicators not only allow us to communicate with
each other, but have access to the knowledge of humanity, can tell us where we
are and how to get to where we want to get to, and a myriad of other things. In
fact, we have the entire non-weapons capabilities of the Enterprise in our hands.
This is an example of Amara’s Law5 in action. However, no transporters (tech-
nically teleporters) exist nor are they likely (barring a fundamental change in the
physics of the universe). They transform mass into information and energy to be
reconstructed as mass somewhere else. Einstein’s famous equation E = mc² suggests
this is not very feasible. The bomb that was dropped on Hiroshima in WWII only
converted seven tenths of a gram (less than a dollar bill) of material into energy.
Imagine what converting a 90,718.5-gram (200 lb) person to energy would be like?
(A worked version of this arithmetic appears in the sketch after this list.)
• An innovation must be viable. In the computer industry there is this concept of
SMOP (Small Matter of Programming), which is short hand for many things can
be done once if you put in the effort. But once is not viable or sustainable, you
must be able to deliver an innovation with repeatability, reliability, and regularity.
If you cannot consistently deliver your innovation and its value proposition to its
stakeholders, then it really isn’t an innovation but a special event (or one-off). Part
of viability/sustainability is that the value delivered by the innovation must
eventually exceed the resources consumed in its creation and other potential uses
of those resources (opportunity costs).
• An innovation must be valuable, and it is surprising how often this is missed.
Someone or something somewhere must be informed and willing to surrender
resources in order to acquire the benefits (value) of the innovation. This was often
ignored in the dot com era of the late 1990s and in the financial crisis (lack of
transparency and information in the “innovative” new financial instruments).

4 Science fiction has an admirable track record in anticipating innovations including television,
satellites and “communicators”. At Dell, the innovation team was expected to read science fiction.
5 Roy Charles Amara was an American researcher, scientist, futurist and president of the Institute for
the Future, best known for coining Amara’s law on the effect of technology, often applied in
foresight and innovation planning: “We tend to overestimate the effect of a technology in the
short run and underestimate the effect in the long run.”
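
As promised in the feasibility bullet above, here is a worked version of that arithmetic. It is a sketch only: the 0.7 g figure and the 200 lb (about 90,718.5 g) person are taken from the text, and the calculation simply applies E = mc².

# Worked version of the E = mc^2 arithmetic from the feasibility bullet above.
C = 299_792_458.0  # speed of light, m/s

def mass_to_energy_joules(mass_kg: float) -> float:
    """Energy (in joules) released if a rest mass were converted entirely to energy."""
    return mass_kg * C ** 2

hiroshima_converted_kg = 0.0007   # ~0.7 g actually converted (figure from the text)
person_kg = 90.7185               # ~200 lb person, the chapter's example

e_bomb = mass_to_energy_joules(hiroshima_converted_kg)
e_person = mass_to_energy_joules(person_kg)

print(f"~0.7 g converted:  {e_bomb:.2e} J")    # roughly 6e13 J
print(f"200 lb converted: {e_person:.2e} J")   # roughly 8e18 J
print(f"Ratio: about {e_person / e_bomb:,.0f} times the Hiroshima conversion")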

5 Forms of Innovation: A Taxonomy

Innovation is often talked about in terms of new products or services. Many times,
products and services are the least valuable forms of innovation and often emerge
from the other forms discussed here. In her book Thinking in Systems, Donella
Meadows [26] addresses places to intervene in a system, what she calls leverage
points.6 Any enterprise (for profit, not for profit, NGO, government, charity, or
even a flash mob) is a system, and systems principles (such as Cynefin) apply
to it. The forms (taxonomy) of innovation observed in enterprises track well with
Meadows’ leverage points.
Global Health Security is a nebulous concept that requires a strategic intervention
approach pertaining to innovation. With that in mind, such innovation must be
considered across an enterprise framework to include: products and services, oper-
ations, organizational design and management, and business models (Fig. 2).

Fig. 2 Points of leverage in intervening in a system (Meadows [27], Thinking in Systems: A
Primer, Chelsea Green Publishing)

6 “Folks who do systems analysis have a great belief in ‘leverage points.’ These are places within a
complex system (a corporation, an economy, a living body, a city, an ecosystem) where a small shift
in one thing can produce big changes in everything.” [28]
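
Because Fig. 2 is reproduced here only as a caption, the list below sketches Meadows' leverage points as they are commonly summarized from her essay, ordered from shallowest to deepest leverage. The wording is a paraphrase supplied for orientation, an assumption rather than a reproduction of the figure.

# Meadows' leverage points as commonly summarized (shallowest to deepest).
# This paraphrased list is supplied for orientation; Fig. 2 is the chapter's source.
LEVERAGE_POINTS = [
    "constants, parameters, numbers (subsidies, taxes, standards)",
    "sizes of buffers and stabilizing stocks",
    "structure of material stocks and flows",
    "lengths of delays relative to the rate of system change",
    "strength of balancing (negative) feedback loops",
    "gain around reinforcing (positive) feedback loops",
    "structure of information flows (who does and does not have access)",
    "rules of the system (incentives, punishments, constraints)",
    "power to add, change, or self-organize system structure",
    "goals of the system",
    "the mindset or paradigm out of which the system arises",
    "the power to transcend paradigms",
]

for depth, point in enumerate(LEVERAGE_POINTS, start=1):
    print(f"{depth:2d}. {point}")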

5.1 The Products and Services

Products and services innovation is the most common and best understood form of
innovation. It involves creating a new value proposition for stakeholders. It is
sometimes incremental innovation where changes in the operating model result in
tangible improvements in price, quality, time, flexibility, availability (getting it –
inventory, delivery), accessibility (getting to it – time, space, financing), function-
ality, capability, ease of use or other parameters of an existing product or service
type. In some cases, the operating model changes are simply part of a continuous
improvement process, sometimes they are the result of an innovation in the operating
model. This continuous, sustaining, incremental innovation is generally part and
parcel of everyday business operations and competition and a result of continuous
improvement initiatives such as total quality management (TQM) or six sigma.
Sometimes products and services offer breakthrough innovation wherein an
existing stakeholder value proposition is enhanced or changed via an existing
product or service but changes in the operating, organizational, managerial or
business models resulted in a new stakeholder experience with that product or
service. Likewise, a breakthrough innovation can occur with a completely new
configuration of products and services based upon the existing higher models but
also generating a new stakeholder value experience. Break through innovation is
most often both a cause and an effect of industry (enterprises dedicated to a particular
value proposition) or market (stakeholders with similar value needs and wants)
consolidation.
Disruptive product or service innovations, those which totally reinvent or create
new industries or markets are usually the result of innovations at the higher levels of
the system. For example, the iPod is often considered a disruptive product innova-
tion, but the reality is that it was a totally new service-based business model, iTunes,
that provided the success for the iPod and the eventual disruption of the media
industry and the future creation of the smart phone market (and associated industry
disruption) via its foundation as an App Store.
From a systems perspective, products and services are the lowest level elements
of the system, the most easily changed and, though critical and important, have the
least impact on altering the system’s (enterprise’s) efficient and effective perfor-
mance of its mission and achievement of its purpose in delivering value to its
stakeholders.

6 The Operating Model

The operating model is the specific collection of process, organization, location,


information, suppliers, technology, and control systems (management) that enables
an enterprise to assemble resources and transform them into a value proposition for
and delivered to its stakeholders. In systems terms, it is the operating model that
delivers the behavior (and success or failure) of the system (enterprise). It is
composed of processes and procedures using information flows derived from
operations to reinforce its behavior (do more), to balance its behavior (do less or do
something different), or to delay and potentially stop the value generation process.

Fig. 3 Porter's value chain model
An operating model describes the way an enterprise does business today. It is the
enterprise's value chain, such as Porter's Value Chain Model [29], made manifest.
Innovation at the operating model level involves additions, changes and deletions to the
information and resource flows across the available capabilities and the associated
control structures (Fig. 3).

6.1 The Organizational and Managerial Models

The organizational and managerial models provide the form of the enterprise,
including the structure of the system (enterprise) (what the different parts of the
operating model are and how they are put together), clarity and specificity of the
goal (mission, purpose, objectives, constraints) of the system (enterprise), and the rules
the operational level of the system (the operating model of the enterprise) must follow,
including the rules for changing all of the above.
The organizational model is not just the structure of the organization, but the
climates within it and the overall culture. An organizational model defines how
activities such as task allocation, coordination and supervision are directed toward
the achievement of enterprise goals. The organizational model can enable, facilitate
and accelerate the value production of the operating model, or it can severely inhibit
it. Innovation is possible in both culture and structure resulting in changes to value
delivered to stakeholders.

“A management model is the choices made by a company’s top executives


regarding how they define objectives, motivate effort, coordinate activities and
allocate resources; in other words, how they define the work of management”
[30]. It is the management model that defines the policy and process followed by
the operational model as well as how exceptional conditions are to be handled. The
managerial model can enable, facilitate and accelerate the value production of the
operating model, or it can severely inhibit it. “Over the past 100 years, management
innovation, more than any other kind of innovation, has allowed companies to cross
new performance thresholds.” [31].
Innovation at the managerial model, such as Holacracy [32], dynamic teams and
project methods such as agile and lean [33], as well as on-demand and virtual
organizations [34], has resulted in innovations in the delivery of value to
stakeholders.7

6.2 The Business Model

Last, and most significant is the enterprise business model. A business model
describes the rationale of why an organization exists and how an organization
creates, delivers, and captures value, in economic, social, cultural or other contexts.
The process of business model construction and modification is a form of business
model innovation and is part of business strategy. While the term business is used
here, it applies to any enterprise – public, private, for profit, not for profit, NGO,
charity or otherwise that assembles and transforms resources (money, people, time,
knowledge, etc.) into value for groups of stakeholders.
There are three areas of innovation available when dealing with an enterprise’s
business model. First is developing the paradigm of the business.8 This includes its
vision, mission, and values in the context of the environment within which it
operates. Here is a framework for building a business model.
First is establishing the context of the enterprise, how the enterprise wants to
participate in that context, its intent on how to accomplish that participation and how
to translate that intent into a vision, mission, values, goals, and strategies (affirming and
constraining) to enable it to take actions to accomplish the outcomes the enterprise
seeks (Fig. 4).
These are the (leverage) points of innovation opportunity that can have the most
significant impact on the overall system. In Meadows’ terms it is a point where the

7
Keep in mind that managers, employees, executives are all stakeholders as well as customers,
suppliers, and funders.
8
Keep in mind the term business is being used as short hand to describe an assemblage of resources
and capabilities to sustainably deliver a value proposition to a group of stakeholders.

paradigm of the enterprise is recognized (and could potentially be transcended to
something entirely different); it documents the mindset or mental model of the
enterprise that engages its stakeholders and drives its activities; and it establishes the
outcomes or goals that define the purpose and design of the enterprise.

Fig. 4 Strategic planning model developed by the Dell Innovation Team. (Unpublished,
Stikeleather 2014)
Second, in more detail, a framework like the Business Model Canvas of
Osterwalder and Pigneur [35] can be used to organize stakeholders, resources and
processes into the value creation process. This becomes the foundation for
constructing the products and services (value propositions), the operating model
on how to construct them, the organizational model for implementing the operating
model, and lastly the managerial model for day-to-day execution and variance
control. A Business Model Canvas representation of a business model also provides
insights as to where an innovator would want to design, conduct and analyze their
MVP/MVS/MVVP experiments described in the next section (Fig. 5).
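To make the canvas discussion concrete, a minimal sketch follows; it assumes Python, and the class and example entries are invented for illustration only (the nine block names are the standard canvas labels, but nothing here is drawn from Osterwalder and Pigneur's own tooling). Recording hypotheses per block in this way makes it easier to see which blocks the MVP/MVS/MVVP experiments of the next section should probe first.

from dataclasses import dataclass, field
from typing import List

@dataclass
class BusinessModelCanvas:
    # Minimal sketch of the nine Business Model Canvas blocks; each entry is a
    # hypothesis to be tested, not an established fact about the enterprise.
    customer_segments: List[str] = field(default_factory=list)
    value_propositions: List[str] = field(default_factory=list)
    channels: List[str] = field(default_factory=list)
    customer_relationships: List[str] = field(default_factory=list)
    revenue_streams: List[str] = field(default_factory=list)
    key_resources: List[str] = field(default_factory=list)
    key_activities: List[str] = field(default_factory=list)
    key_partners: List[str] = field(default_factory=list)
    cost_structure: List[str] = field(default_factory=list)

# Hypothetical example: a community disease-surveillance service.
canvas = BusinessModelCanvas(
    customer_segments=["ministries of health", "district clinics"],
    value_propositions=["earlier outbreak detection"],
    channels=["mobile reporting application"],
    key_partners=["NGO field teams"],
)
print(canvas.value_propositions)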
The third area where the business model offers opportunity for innovation is in
the actual design of the enterprise and its relationship with its ecosystem (stake-
holders) and environment. It is this designing process, taking advantage of design
patterns, system archetypes, and tools such as design research and design thinking
that enables opportunities to innovate the managerial, organizational, and opera-
tional models of the enterprise for more effective and efficient delivery of value to its
stakeholders (Fig. 6).
How to apply these interventions and the resulting models is handled in more detail
later in the Discussion Section.

Fig. 5 Business model canvas of Osterwalder and Pigneur [35]

Fig. 6 Business model canvas of Osterwalder and Pigneur with externalities [35]

7 Where Does Innovation Come from?

Innovation is the result of failure. Try something, fail fast, fail furiously, learn, and
then try again until you fail into success. Of course, most organizations are not
prepared to do this; in fact, most organizational structure, policy, and procedure is
designed to ensure failure does not take place. However, innovative enterprises are
different. They adopt principles that have come to be called LEAN. Lean thinking
and practice help organizations become more innovative and effective, which in turn
allows them to become more economically sustainable. Today, lean is considered a
new, more effective approach to doing work, no matter what the work is, the sector
or the size of the organization. One part of lean focuses on eliminating waste and
increasing efficiency (often under the term lean six sigma), which, while important,
happens after the introduction of the effective innovation.
Earlier, one of the attributes of an innovation was that it was a new idea. Before it
is an idea, it is really a hypothesis.
It’s important to understand that the words “idea” and “hypotheses” mean two very different
things. For most innovators the word “idea” conjures up an insight that immediately requires
a plan to bring it to fruition. In contrast, a hypothesis means we have an educated guess that
requires experimentation and data to validate or invalidate.9

These hypotheses span the gamut from who the stakeholder(s) are, to what the
business model, operating model, value proposition (product/service features), pric-
ing, distribution channel, demand creation (customer acquisition, activation, reten-
tion, etc.) and any other element of the stakeholder experience should be (Fig. 7).

Fig. 7 Developing a new product or service, minimal viable approach. (Olsen, D. (2015). The lean
product playbook: how to innovate with minimum viable products and rapid customer feedback.
Wiley. Retrieved from http://ezproxy.lib.usf.edu/login?url=http://search.ebscohost.com/login.
aspx?direct=true&db=cat00847a&AN=usflc.033803999&site=eds-live)

9
Why ‘build, Measure, Learn’ Isn’t Just Throwing Things . . . (n.d.). Retrieved from https://
medium.com/startup-grind/why-build-measure-learn-isn-t-just-throwing-thi

Fig. 8 Innovation portfolio model developed by the Dell Innovation Team. (Unpublished,
Stikeleather 2014)

These experiments are equivalent to the “probe” step in addressing a complex


system and the “action” step in addressing a chaotic system as described by the
Cynefin Framework. Taking these experiments to “market” and observing their
(likely) failure provides the insights to refine the hypothesis (improve the idea)
and then experiment some more. This is the foundation for the concept “fail fast,
fail furiously, learn, and then try again until you fail into success” as the source of
innovation. Any potential innovation is simply a collection of untested hypotheses in
the beginning (Fig. 8).
Scientists and engineers never test their ideas (hypotheses) at full scale or in
totality. If things go wrong (fail), it is hard to determine what part(s) were the sources
of the failure. Instead they create small scale or partial tests of their ideas. When
talking about delivering value to a stakeholder or stakeholders, this is referred to as
building a minimal viable product or service. The goal of an MVP or MVS is to
provide enough value (a Minimum Viable Value Proposition – worth the stake-
holder's effort to engage) that the stakeholders can legitimately evaluate some
element of the offering, and that the innovator can legitimately evaluate the needed
infrastructure and ecosystem of resources and capabilities and project the resource
consumption required to deliver the value to the stakeholders, doing so with the least
possible consumption of resources but enough to have an informative experiment.
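A minimal sketch of that build-measure-learn loop is shown below; it assumes Python, and the Hypothesis fields, the run_experiment stub and the thresholds are invented for illustration rather than taken from the sources cited in this chapter. The point is only that each MVP/MVS cycle turns an untested hypothesis into data before further resources are committed.

import random
from dataclasses import dataclass

@dataclass
class Hypothesis:
    statement: str    # the educated guess being tested
    metric: str       # what the experiment will measure
    threshold: float  # the evidence needed to keep the hypothesis alive

def run_experiment(h: Hypothesis) -> float:
    # Stand-in for fielding an MVP/MVS and measuring stakeholder response;
    # here it simply returns a simulated observation.
    return random.uniform(0.0, 1.0)

def build_measure_learn(h: Hypothesis, max_cycles: int = 5) -> bool:
    # Probe with a minimal offering, measure, learn, then revise or stop.
    for cycle in range(1, max_cycles + 1):
        observed = run_experiment(h)
        if observed >= h.threshold:
            print(f"cycle {cycle}: validated ({h.metric} = {observed:.2f})")
            return True
        # "Fail fast": record the learning and revise the hypothesis before retrying.
        print(f"cycle {cycle}: not yet ({h.metric} = {observed:.2f}); revising hypothesis")
        h.threshold *= 0.95  # e.g. narrow the target segment or the value claim
    return False

build_measure_learn(Hypothesis(
    statement="Community health workers will report cases weekly via SMS",
    metric="weekly reporting rate",
    threshold=0.6,
))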
What the innovator is trying to establish with this series of experiments, tests,
insights, and revisions to the idea (hypothesis) is the total value propo-
sition to all stakeholders, including themselves (see below; not necessarily an
exhaustive list).
Keep in mind that these MVP/MVS/MVVP experiments are meant to gain insight
for all of the models referenced in the Innovation Taxonomy, not just the offering to
the stakeholders.

8 How Do You Enable, Facilitate and Accelerate It?

Innovation is an emergent activity. It can be planned, though anticipated is perhaps a


better term, through foresight and future study in order to concentrate the enterprise’s
attention on high probability or high reward opportunities. Having the right types of
people is important. They have to be found, but in the right environment people can
acquire and develop the necessary skills even if they do not have them to begin with.
That is the real key: having the right environment of processes, leadership,
rewards and a common language for what is meant by innovation and how to go
about it.

8.1 Right People

There is a misconception about how innovation occurs. It is often thought that


innovation comes from the brilliant lone wolf idea guy when in reality innovation
is a team sport. There are always at least three personas associated with innovation.
Perhaps they can all reside in one person, but more than likely they are spread across
a whole team of people.

First are the idea generators, who see the future through foresight and future study,
who see the jobs that stakeholders are trying to accomplish and the value that can
accrue to them if the job is accomplished. Idea generators see the environment, the
opportunities and the challenges, and construct a strategy and team for dealing with
them. The classic example is Steve Jobs, who saw the potential of a personal
computer for individuals who were not nerds capable of building one for themselves.
But the best idea is worthless unless it can be made real. This is the realm
of the idea manifestors. They take the collection of value propositions from the idea
generators and apply technologies, processes, resources and organizations in such a way
that a product or service becomes real and can be delivered to the stakeholders who
will benefit from it. Manifesting is not just producing a product or service but the
business model, management model, organizational model, and operating model that
produce the product or service. Continuing the Apple analogy, the manifestor in
chief was the Woz.
Even the best product or service cannot deliver value unless the stakeholders
whom it could benefit, or from whom it could take contributions, know about it. This
is where the idea communicators bring their skills to bear. It is one thing for the idea
originator and manifestor to understand the value of a product or service (or the
system that creates and delivers it). But oftentimes they understand it in ways
different from how other stakeholders would. It is the job of the idea communicator to
translate the values, features, functions, capabilities and benefits of the offering into
concepts and terminology stakeholders can understand and use to make informed
decisions about the offering. To finish the Apple analogy, this would have been
Regis McKenna.10
All innovations are born from the parenting of these three personas. These
personas also have personalities, things that enable innovation to thrive around
them. First, they do not quite fit the existing models (business, management,
organizational, operating) as they think and act out of the box. In terms of the
emergent organization of the enterprise, they are likely outsiders because they see
things differently or have different interests. They are probably infected with the
5-year-old's disease, constantly asking why things are the way they are and why they are
done the way they are done. They are constantly networking inside and outside the
organization, generally outside of organization and industry boundaries. They are
also keen observers, the source of the infinite number of "whys", and are always
trying things out or experimenting.

8.2 Right Process

Scott Anthony in The Little Black Book of Innovation [36] talks about the process of
innovation as it emerges in organizations and some of the commonalities in

10
https://en.wikipedia.org/wiki/Regis_McKenna

innovative organizations. He talks about discovering opportunities, effectively the


discussion on foresight and future study. He talks about blueprinting ideas such as
Minimal Viable Products, Minimal Viable Services, and Minimally Viable Value
Propositions that can then allow the testing of the ideas (hypothesis) with the least
risk and maximal learning opportunities. Lastly, he talks about moving forward,
which is progressive executions of these ideas until they succeed as products,
services, value propositions, and the enterprises and ecosystems necessary for
them to exist.
A mental model based upon a systems approach to innovation [37], with the right
processes, linked with the right leadership behavior discussed below, establishes and
nurtures the environment wherein innovation grows and thrives. For example, have
policies and processes that encourage and reward failure where valuable learning
takes place, yet recognize the eventual goal of success; policies and
processes that support and reward creativity rather than standardization and auto-
mation, yet recognize efficiency and efficacy; and policies and processes that
provide for and encourage "idle" time, where creativity and divergent thinking
take place, yet keep focus on the goal and the urgency to create value for stakeholders.
In other words, an environment of leadership rather than management and
supervision.

8.3 Leadership Role Modeling

Ram Charan in his book The Game Changer [38] talks about how one does not lead so
much as one demonstrates leadership. There are five key elements to leading
innovation activities:
• Shaping Context. The role of an innovation leader is not to tell people what to do
or how to do it. Instead it is to communicate the desired outcome, such as making a
wicked problem less troublesome or making things a little better for the "victims" of
a wicked problem. It is also important that the leader establish the limits on the inputs
and processes, the constraints on the innovation around its characteristics (new,
forward thinking, feasible, viable, and valuable), so that the innovators may
measure their progress. It is not only telling the innovators, but demonstrating it
to them with the leader's own decisions and actions.
• Actively participating. Innovation is not a steering committee exercise. The
innovation leader is one of the innovators, actively engaged in the innovation
process and working with the team.
• Engaged review. Innovation reviews should never be status dumps. They should
be truly informative and engage debates on issues, opportunities to improve,
changes in constraints, etc., involving the leader and an innovation team peer.
• Process breaking. A typical managerial role is to enforce process, standards and
limit variability. An innovation leader's role is the exact opposite. Seeking
innovation is a de facto admission that current policy and process is not effective
or efficient in addressing the problem or opportunity at hand. However, organi-
zations are designed around "make no mistakes", having no excess capacity, ensuring
conformance and other dictums that are the antithesis of innovation. The role of
the leader is to cut through all those "good for production efficiency and risk
reduction" rules, policies and processes to enable incurring risk, increasing variabil-
ity, and learning from failure.
• Being available. Innovation and addressing wicked problems is an ego bruising,
if not busting, exercise in ongoing futility, with success around some distant
corner. The leader must motivate the team through this (see next section) but
importantly, must be always available to help the team celebrate success, deal
with failure, and recognize and appreciate the difference.

8.4 Right Rewards and Incentives

Daniel Pink, in his work Drive: The Surprising Truth About What Motivates Us11 [39],
examines motivation, which is important when considering innovation. Innovation requires creativity; it
requires perseverance to address wicked problems, which are never solved, only made
better or, hopefully, less worse. Innovation work is not algorithmic, in which the same
thing is done repeatedly, but is instead heuristic, requiring the innovators to come up
with new ways to do things continuously with no real instructions to follow other
than their mental models. Pink's research, and that of others, examined traditional
rewards and incentives when we want to motivate people to be creative, problem
solving and innovative.
Pink shows that extrinsic rewards can be effective for algorithmic tasks—those
that depend on following an existing formula to its logical conclusion. But for more
right-brain undertakings—those that demand flexible problem-solving, inventive-
ness, or conceptual understanding such as addressing wicked problems—contingent
rewards can be counterproductive causing loss of intrinsic interest for the activity
and the ability to persevere in it. Instead, the secret to motivating innovation isn’t our
traditional biological drives or our reward-and-punishment drive, but our third
drive—our deep-seated desire to direct our own lives, to extend and expand our
abilities, and to live a life of purpose.
For fostering innovation in the addressing of wicked problems, a new approach to
motivation has three essential elements:
• Autonomy, the desire to direct our own lives;
• Mastery, the urge to get better and better at something that matters; and
• Purpose, the yearning to do what we do in the service of something larger than
ourselves.

11 A short but comprehensive video of "Drive" can be found here - https://www.youtube.com/watch?v=u6XAPnuFjJc

8.5 Common Language

A common language is really just a shared mental model of what innovation is, how
to go about it, and why you are going about it. It is a collection of ideas, such as the
business model canvas and "lean" approaches to testing ideas, the recognition that failure is
good if something is learned, and the idea of stakeholders and how the enterprise can create
value for them. In the case of innovation, it is a systems thinking approach to value
creation across business models, management models, organization models and operat-
ing models as well as the end products and services.
It is keeping in mind and using all of the characteristics of innovation (forward
thinking, new (or new use of) ideas, feasible, viable and valuable). It is also under-
standing that innovation is not what the enterprise does, but rather what the
stakeholders adopt.
Using a common set of mental models for managing innovation makes sense
of a complex, uncertain and highly risky set of phenomena as encountered in the
complex and chaotic environments described by Cynefin. Because innovation is
messy, with many false starts, dead ends, leaps of faith and iterative failures to
success, it is the mental models that keep everything on track. This idea is described
in detail by Tidd, Bessant and Pavitt [14]. Likewise, a common language about
innovation, in particular one that recognizes all the potential stakeholders identified in
futures studies, helps avoid overly focusing on technological opportunities and immediate
value creation, broadening attention to the less obvious social concerns, expectations and
pressures that make up the wicked problems the innovation is attempting to address.
Mental models are difficult to change and are one of the biggest barriers to
innovation, as they determine the level of assimilation new ideas find and the level
and quality of effort that will be put into them. The importance of shared mental
models among a team cannot be overstated, as demonstrated by Prasad Kaipa's
article "Steve Jobs and the Art of Mental Model Innovation".12
It is a common language or shared mental model that keeps innovation on track
when dealing with a wicked problem that involves complex systems of disruptive
and discontinuous events and networks of actors and sources.
As discussed earlier, a sensemaking framework such as the Cynefin Framework
helps us to understand the role of evidence-based practices in working across the
simple, complicated and complex domains and sheds light on the perils of
oversimplification. In this way, Cynefin promotes reflective practices [40] that
help to shape effective intervention strategies and to avoid the pitfalls of unintended conse-
quences. Managing uncertainty within the global health security landscape requires
one to explore the possibility/plausibility space to enable innovation. Foresight
thereby becomes a key tool to support global health security innovation.

12
https://iveybusinessjournal.com/publication/steve-jobs-and-the-art-of-mental-model-innovation/

9 Using Foresight and Futures Study to Support Innovation

Prediction is very difficult, especially if it's about the future. – Niels Bohr

The concept of using foresight and studying potential futures is a critical


component of innovation as discussed previously.
Foresight is the act of looking to and thinking about the future. Foresight enables
you to get better at predicting, creating, and leading the future. There is a tiny but
growing community of professional futurists, people who think and talk about
various aspects of the future full-time. However, anyone can learn efficient and
effective tools and methods for envisioning and planning to reach a desirable future.
The foresight and futures study area is very broad, and goes by a variety of
different names:
• Futures research,
• Futures analysis
• Futurism
• Futuristics
• Futurology
For many, these have rather negative connotations of, respectively, sloppy or very
superficial work done at the behest of commercial marketing activities, or of exces-
sively empiricist and overly prediction-oriented academic work in specialized fields
such as predicting the future compute power of chips. The subject of this discussion has
nothing whatsoever to do with stock market “futures trading” or speculation. Instead,
futurists use the plural of “futures” specifically because the master concept of the
foresight and futures field is that of the existence of many potential alternative
futures, rather than simply a single future.
Ray Amara,13 a former president of the Institute for the Future, once suggested
[41] that there are three fundamental premises upon which the futures field rests:
• The future is not predetermined. This is actually a direct consequence of under-
standing that the real world we deal with is either complex or chaotic as described
in the Cynefin framework where new futures emerge whose cause can only be
seen retrospectively (complex) or are just the consequence of totally random
events (chaotic). Therefore, there cannot be any single predetermined future or
prediction, but one can construct many potential alternative futures that can be
anticipated.
• The future is not predictable. This is actually an extension to the previous
premise. Even if the future were predetermined, we could never collect enough

13
Amara is best known for Amara’s Law – We tend to overestimate the effect of a technology in the
short run and underestimate the effect in the long run.

information about it to a degree of accuracy necessary to construct a complete


model of how it would develop. The errors introduced by not having precise
information would cause the model to deviate from what is happening in the
system we are modeling.
• Future outcomes can be influenced by our choices in the present. We can
influence the shape of the future that does happen through the choices we make
regarding our actions (or inaction) in the present (inaction is also a choice). This is
the reason anticipating possible futures is important, because from them we can
derive current actions that might result in the manifestation of a preferred future.
This ability to take responsibility for our futures is a foundation for creating and
managing innovation in all its forms. The actual future which eventuates, and in
which we will ultimately live and experience as the present then, will be governed by
our actions or inaction now, along with the choices we have made among many
alternative potential futures anticipated. We make those choices by altering the
course of the present by introducing innovations into our operating models, organi-
zation models, management models, business models, the value we create through
products and services, and the stakeholder communities we seek to deliver that
value.

9.1 Anticipated Futures

While it is possible, and fun, to generate an infinite number of potential alternative


futures, Henchey [42] proposed four distinguishable classes, later expanded by
Voros [43] to five, to help facilitate focusing on those with the highest probability
of impact upon the forecaster.
• Possible futures. These are those which "might happen", no matter how ridicu-
lous, unlikely or outside of the box of normal ideas or expectations. These are
generally reliant on the existence of some future knowledge or capability we do not
yet possess and are not sure where it might come from (but we can imagine it) in
order to come about. Possible futures are imaginative.
• Plausible futures. These could happen. They do not rely upon knowledge or
capabilities that do not yet exist, or if they do, there is a reasonably clear path on
how that knowledge or capability might emerge. Care needs to be taken here as
this is determined by our current understanding of physical laws, processes,
causation, systems of human interaction, regulation, and other areas where we
may not be fully informed. Because of this, plausible futures should be researched
to confirm or expand our understanding of the current state of knowledge and
capabilities. These are a subset of the possible futures.
• Probable futures. These are fundamentally continuations of current trends, a
business as usual linear extension of the present, and are therefore considered
likely to happen. These are usually the extent of most strategic planners’ consid-
erations. This is dangerous as few trends are continuous over long periods of time,

and they may fade away suddenly or be usurped suddenly by a new one
unexpectedly (unless identified in a possible or plausible future). Simply reading
trends, which is what many marketers and strategists do, is not foresight. The
resulting probable futures are a much smaller subset of possible futures that may
impact the enterprise and its stakeholders.
• Preferable futures. These represent what we want to have happen. While the
previous futures are basically cognitively and generally objectively derived, these
involve value judgements and are more subjective. This set of futures can
intersect any of the above futures and is not part of a continuous subset
progression as the others are.
• Wild card futures. These are low probability futures, or perhaps highly specific
futures (their probability reduced for all of the complexity and chaos reasons
discussed earlier), but they have an additional characteristic: should they occur, they
would have a very significant, outsized impact on our enterprise and/or its
stakeholders. These futures are sometimes referred to as black swan events
from their description and analysis by Taleb [44] (Fig. 9).

Fig. 9 This image was adapted by Joseph Voros [43] from the work of Hancock and Bezold [45]
The value of foresight activities is directly related to the depth of work done [63,
64]. The least useful is the “reading trends” approach discussed above. This is the
shallowest and most superficial level of futures thinking; it is also by far the most
widespread, well-known and popular. It is usually highly media-oriented and is
found in television reports, in newspaper magazine articles, popular books, and
“sound bites” by “experts”. It only reveals a very small segment of the potential
futures and is often preoccupied with technology. There is little insight or innovation
value found at this level.
A more serious level is often concerned with how organizations and society
might, or ought to, respond to challenges lying in the nearer-term future. This is
where strategic thinking should take place and, in the public sector, often touches
upon the “big-picture” and “wicked” problems as described earlier. This is referred

to as problem-oriented futures work and is where the majority of effort should and
does take place in innovation.
Slaughter goes on to describe “critical futures studies” which is more related to
Social Systems Theory [36] dealing with how we create the problems in the first
place through our worldviews and unquestioned assumptions. He also describes
“epistemological futures work” where futures work integrates with philosophy,
epistemology, ontology, cosmology, macro history, the study of time, the nature
and influence of consciousness on the human endeavor and other areas of human
society. Both of these levels are usually beyond the interest and utility of everyday
practitioners of innovation.
This layering of futures thinking has been used by futures researchers to develop
analytical methods to get to the issues beneath all the many wicked problems and
themes which tend to capture and divert our attention and keep us from developing
innovations that improve the situations we are trying to address. The layering also
supports the different approaches to foresight including:
• Pragmatic foresight. This generally looks for opportunities within known prob-
lem spaces. This is what is most common in business, where it seeks new
markets, new challenges, innovation, is highly entrepreneurial, and looks at the
future as an ecological competitive space within which one needs to adapt in
order to gain advantage. Innovation here tends to focus on products and services,
operating models, and organizational models.
• Progressive foresight. Whereas pragmatic foresight works on innovation among
participants within a known ecology or system, progressive foresight looks for
ways in which the system or ecology itself can be innovated, transformed and
create more value for participants or create new participants. Innovation here
looks like what the business press calls “digital transformation” and creates new
ecosystems (online retail) and new participants (Amazon). Innovation here tends
to center on management models and business models.
• Civilizational foresight. This focuses on potential change to multiple ecosystems
in an ecosphere. It is called civilizational as it is often looking at preferable futures
of a complete society and what can be done to bring one about. In business this
tends to happen outside the enterprise with emerging concepts of corporate social
responsibility [55], corporate shared value [47], and the rise of new legal struc-
tures such as benefit corporations [48] as examples of resulting innovations.
You can address foresight at four levels:
• Personal – improving our individual, relationship, and family navigation of the
future.
• Organizational – improving our teams', companies', and institutions' abilities to
create the futures they desire.
• Global – improving our societies' ability to cooperate, compete and adapt.
• Universal – which includes science, complexity studies, and models for universal
change.

The point of this section is not to teach foresight and future studies but to increase
awareness of the discipline, its usefulness in understanding and addressing wicked
problems through innovation. Being able to anticipate the potential jobs to be done in
the future by the enterprise to provide value to its stakeholders, to understand how to
potentially reduce the pains of the stakeholders in accomplishing those jobs as well
as increasing the value of the job accomplishments themselves, is the basis of
successful innovation. There are many helpful organizations, most non-profit think
tanks, that supply education and input for a foresight process. Some of the leading
ones are:
• The Institute for the Future – “Making the Future with Foresight - IFTF is
celebrating its 50th anniversary as the world’s leading non-profit strategic futures
organization.” http://www.iftf.org/home/
• The Future Society – “Our mission is to help build a future that preserves
humanity while harnessing the upsides of technology.” http://www.
thefuturesociety.org/
• World Future Society – “community of future-minded citizens charting a new
course for humanity” https://www.worldfuture.org/
• And smaller, but no less interesting ones such as futurist.com https://www.futurist.com/think-tank/ and the Da Vinci Institute https://www.davinciinstitute.com/
• As well as ones that focus on areas of interest such as Foresight Institute
“focusing on molecular machine nanotechnology, cybersecurity, artificial intelli-
gence” https://foresight.org/
There are also university programs:
• Futures Studies and Foresight Education at Finland University - https://www.
finlanduniversity.com/service/futures-studies-and-foresight-education/
• University of Houston’s Graduate Program in Foresight - http://www.uh.edu/
technology/departments/hdcs/graduate/fore/
• With a fairly complete list available here: https://www.accelerating.org/
gradprograms.html
And there are many very good texts, papers, and sites. A great starting point
would be Andy Hines and Peter Bishop’s Thinking about the Future: Guidelines for
Strategic Foresight [49].

9.2 A Note of Caution

Because it involves all the elements of sensemaking [50] and systems thinking such
as analysis, synthesis, induction, deduction, abduction, observation and creativity,
foresight is subject to many of the weaknesses of human intellectual processes. This
is compounded by the fact there is no test for correctness other than time itself.

Consequently, self-awareness and reflection are critical in foresight and futures
research.
Futures researchers must be aware of which thinking system they are using
[51, 52] and not jump to conclusions based upon their existing, comfortable mental
models but entertain new mental models [53].
Good foresight and futures work are based upon what is, as opposed to what
should be, incorporating the potential (likelihood) of irrationality on the part of the
stakeholders, as well as in the futures researchers' own decision making [54].
Lastly, foresight and futures researchers are encouraged to regularly assess how
bias14 and logical fallacies15 may be entering their research before applying their
conclusions to the construction of future innovation planning scenarios.

9.3 Intervening in the Systems of Health Security

Intervening and innovating in the systems associated with global health security
requires the following:
• Recognizing systems and systems thinking
• Mental models and problem framing
• Divergent thinking
• Design and experimentation (real time and foresight)
Returning to Weick and Sutcliffe [1] ‘. . .unexpected events often audit our
resilience, everything that was left unprepared becomes a complex problem, and
every weakness comes rushing to the forefront’: the complex threat landscape
associated with global health security matters requires a mindset that embraces
innovation and creativity. The application of systems thinking, Cynefin and Fore-
sight as toolsets to support innovation presents a method of inquiry that unfolds in a
radically different way from traditional linear approaches.
There are a few key concepts to keep in mind when intervening in a system.

14
“Cognitive biases make our judgments irrational. We have evolved to use shortcuts in our
thinking, which are often useful, but a cognitive bias means there’s a kind of misfiring going on
causing us to lose objectivity. This website has been designed to help you identify some of the most
common biases stuffing up your thinking.” https://yourbias.is/
15
“A logical fallacy is a flaw in reasoning. Logical fallacies are like tricks or illusions of thought,
and they’re often very sneakily used by politicians and the media to fool people. Don’t be fooled!
This website has been designed to help you identify and call out dodgy logic wherever it may raise
its ugly, incoherent head.” https://yourlogicalfallacyis.com/

9.4 Recognize You Are Dealing with a System

Throughout this chapter we have used the term system assuming a common under-
standing. It is important to be working from a common understanding of what is
meant by a system. An excellent introduction to what a system is and how to think
about it can be found at the Systems Thinker web site.16 It is beyond the scope of this
chapter to address all the concepts and issues of the systems paradigm, general systems
theory, systems thinking, sensemaking, complexity, systems innovation and other
concepts discussed here in more depth. However, there is an excellent collection of
articles, instruction and videos for learning a disciplined approach to these topics at
Complexity Labs.17 Another good source for learning systems thinking is the Waters
Foundation.18 An excellent review of available resources on the topic is provided by
Monat and Gannon [55], which the reader is encouraged to peruse.
By understanding what a system is, it becomes easier to identify what system
(or system of another system) you are dealing with at a moment in time, where its
boundaries are, and how it is behaving. This is the critical first step in building a
mental model or map of the landscape [50] that you want to understand, to intervene
in, and to change its behavior. This is called sensemaking and it is “a key leadership
capability for the complex and dynamic world we live in today" that enables leaders
to “explore the wider system, create a map of that system, and act in the system to
learn from it” [56]. It is the process of “the making of sense” [57] of an environment
by “structuring the unknown” [58] by observing and structuring the inputs (signals,
information, resources, etc.) to the system and the consequential behaviors and
outputs as the inputs vary “to comprehend, understand, explain, attribute, extrapo-
late, and predict” [59] in order to prepare to intervene and measure the effects on an
intervention in the system (Fig. 10a).
Systems thinking, which approaches problems by asking how various elements within
a system, whether an ecosystem, an organization, or something more
dispersed such as a health care system, influence one another, uses an iceberg
model to describe the process. Rather than reacting to individual problems that
arise, a systems thinker will ask about relationships to other activities within the
system, look for patterns over time, and seek root causes.
For more details on the iceberg model,19 the reader is encouraged to review
Senge, Kleiner, Roberts, Ross, and Smith, “ The Fifth Discipline Fieldbook” [60]
which is an applied perspective of the background and theory of systems thinking
presented in “The Fifth Discipline” [61].
Such an approach is crucial for “wicked problems” [4] such as health security. A
wicked problem is a problem that is difficult or impossible to solve for as many as
four reasons: incomplete or contradictory knowledge, the number of people and

16
https://thesystemsthinker.com/introduction-to-systems-thinking/
17
http://complexitylabs.io/ (disclosure one of the authors is a member)
18
https://www.watersfoundation.org/
19
A quick introduction is available here: https://nwei.org/iceberg/
opinions involved, the large economic burden, and the interconnected nature of these
problems with other problems. Almost all social, political, cultural, and organiza-
tional problems are "wicked."

Fig. 10a Adapted from systems thinking in schools, Waters Foundation, www.watersfoundation.org
According to Rittel, there are ten characteristics of wicked problems, which are
crucial to understand before attempting any intervention:
• Wicked problems have no definitive formulation. The problem of poverty in
Texas is grossly similar but discretely different from poverty in Nairobi, so no
practical characteristics describe “poverty.”
• It’s hard, maybe impossible, to measure or claim success with wicked problems
because they bleed into one another, unlike the boundaries of traditional design
problems that can be articulated or defined.
• Solutions to wicked problems can be only good or bad, not true or false. There is
no idealized end state to arrive at, and so approaches to wicked problems should
be tractable ways to improve a situation rather than solve it.

• There is no template to follow when tackling a wicked problem, although history


may provide a guide. Teams that approach wicked problems must literally make
things up as they go along.
• There is always more than one explanation for a wicked problem, with the
appropriateness of the explanation depending greatly on the individual perspec-
tive of the designer.
• Every wicked problem is a symptom of another problem. The interconnected
quality of socio-economic political systems illustrates how, for example, a change
in education will cause new behavior in nutrition.
• No mitigation strategy for a wicked problem has a definitive scientific test
because humans invented wicked problems and science exists to understand
natural phenomena.
• Offering a “solution” to a wicked problem frequently is a “one shot” design effort
because a significant intervention changes the design space enough to minimize
the ability for trial and error.
• Every wicked problem is unique.
• Designers attempting to address a wicked problem must be fully responsible for
their actions.
Again, it is not the purpose of this chapter to teach wicked problems, but it is
important to understand the consequences of intervening in a system hosting a
wicked problem. Jon Kolko [62] has written an excellent introduction with a
summary available on the Stanford Social Innovation Review site.20

9.5 Mental Models and Systems Models

The point of systems thinking is to be able to describe the system of interest in a way
that a shared mental model among all the many stakeholders can be established for
productive communication. The process of building that model, systems thinking
and sensemaking, are themselves productive communication efforts.
It is beyond the scope of this chapter to describe systems models. Being able to
discuss a system quickly, efficiently and completely with stakeholders is important
for developing courses of intervention, agreeing to intervene, and deciding upon the
“success” or “failure” of the intervention.21 There are three key22 ways of describing
systems:

20
https://ssir.org/articles/entry/wicked_problems_problems_worth_solving
21
Keep in mind that with wicked problems, interventions either make things better or worse, there is
no solution (success) with a wicked problem.
22
There are number of other tools that are used to map out events or how things are connected.
Network mapping, social network analyses, and process mapping involve a range of tools to
illustrate and analyze connections between people, organizations, or processes in both qualitative
and quantitative ways.

• DSRP - Distinctions, Systems, Relationships, Perspectives23 [63]


• CLD - Causal Loop Diagrams24 [64]
• SD - Systems Dynamics25 [26]
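As a small illustration of the second of these approaches, the sketch below (Python; the variable names and link polarities are invented purely for demonstration) represents a causal loop diagram as a set of signed links and determines whether a closed loop is reinforcing or balancing by multiplying the polarities around it.

# Links are (cause, effect) -> polarity: +1 means the effect moves in the same
# direction as the cause, -1 means it moves in the opposite direction.
links = {
    ("reported cases", "public concern"): +1,
    ("public concern", "vaccination uptake"): +1,
    ("vaccination uptake", "reported cases"): -1,
}

def loop_polarity(loop_vars):
    # Multiply the polarities around a closed loop: a positive product marks a
    # reinforcing loop, a negative product a balancing loop.
    sign = 1
    for i, cause in enumerate(loop_vars):
        effect = loop_vars[(i + 1) % len(loop_vars)]
        sign *= links[(cause, effect)]
    return "reinforcing" if sign > 0 else "balancing"

print(loop_polarity(["reported cases", "public concern", "vaccination uptake"]))
# -> balancing: more cases raise concern, concern raises vaccination uptake,
#    and uptake in turn lowers cases.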
Being able to take what has been learned about the system under consideration
with a tangible representation that can be shared with others is critical in establishing
a shared mental model as to how to engage with that system among all the
stakeholders. Mental models are important as they are how we reason. From the
Princeton Mental Models and Reasoning site:
The theory of mental models rests on simple principles, and it extends in a natural way to
inferring probabilities, to decision making, and to recursive reasoning about other people’s
reasoning. We can summarize the theory in terms of its principal predictions, which have all
been corroborated experimentally. According to the model theory, everyday reasoning
depends on the simulation of events in mental models (e.g., [65]). The principal assumptions
of the theory are:
1. Each model represents a possibility. Its structure corresponds to the structure of the
world, but it has symbols for negation, probability, believability, and so on. Models that
are kinematic or dynamic unfold in time to represent sequences of events.
2. Models are iconic insofar as possible, that is, their parts and relations correspond to those
of the situations that they represent. They underlie visual images, but they also represent
abstractions, and so they can represent the extensions of all sorts of relations. They can
also be supplemented by symbolic elements to represent, for example, negation.
3. Models explain deduction, induction, and explanation. In a valid deduction, the conclu-
sion holds for all models of the premises. In an induction, knowledge eliminates models
of possibilities, and so the conclusion goes beyond the information given. In an abduc-
tion, knowledge introduces new concepts in order to yield an explanation.
4. The theory gives a ‘dual process’ account of reasoning. System 1 constructs initial
models of premises and is restricted in computational power, i.e., it cannot carry out
recursive inferences. System 2 can follow up the consequences of consequences recur-
sively, and therefore search for counterexamples, where a counterexample is a model of
the premises in which the conclusion does not hold.
5. The greater the number of alternative models needed, the harder it is: we take longer and
are more likely to err, especially by overlooking a possibility. In the simulation of a
sequence of events, the later in the sequence that a critical event occurs, the longer it will
take us to make the inference about it.
6. The principle of truth: mental models represent only what is true, and accordingly they
predict the occurrence of systematic and compelling fallacies if inferences depend on
what is false. An analogous principle applies to the representation of what is possible

23
An excellent free interactive instruction on DSRP using the Plectica tool is available here: https://
cabrera-research-lab.tahoe.appsembler.com/
24
An excellent introduction to Causal Loop Diagrams is available on the Systems and Us website at
https://systemsandus.com/2012/08/15/learn-to-read-clds/
25
An absolutely amazing, comprehensive and easy introduction to systems dynamic modeling (and
other forms) is available from an open source free model simulation tool Insight Maker (https://
insightmaker.com/) whose tutorials explain the different modeling approaches very well. An
accompanying ebook is available (http://beyondconnectingthedots.com/). Another gentle interac-
tive introduction to systems dynamics is available here: https://kumu.io/stw/insight-maker using the
mental modeling tool kumu https://kumu.io/

rather than impossible, to what is permissible rather than impermissible, and to other
similar contrasts.
7. The meanings of terms such as ‘if’ can be modulated by content and knowledge. For
example, our geographical knowledge modulates the disjunction: Jay is in Stockholm or
he is in Sweden. Unlike most disjunctions, this one yields a definite conclusion: Jay is in
Sweden.
The theory accounts for the informality of arguments in science and daily life, whereas
logic is notoriously of little help in analyzing them.

Shared mental models are critical for collaboration and agreement before inter-
vening in a system.
There are two models that an innovator in Health Security always needs to be
cognizant of: the systems model that describes the wicked problem being addressed,
so that interventions can be designed, executed and monitored; and the systems that
describe the enterprise, as discussed above in Forms of Innovation, so that you
understand how the two systems (and the larger system they are both part of) will
interact.
One last point about adopting a systems thinking approach and using systems and
mental models for innovation is the ability to leverage systems archetypes
[66]. Sometimes also called design patterns, systems archetypes provide building
blocks for your models based upon commonly found structures in systems. A
systems archetype is a well-defined structure, exhibits a distinct behavior over
time, frequently occurs across multiple disciplines of science, and has well-defined
strategies for dealing with the implications of the structure (a minimal simulation of
the first of these archetypes is sketched after the list below). There are 15 (12 negative,
3 positive) recurring archetypes common to human systems such as Health
Security. They are:
• Limits to growth
• Tragedy of the commons
• Escalation
• Eroding goals
• Addiction
• Seeking the wrong goal
• Exponential successful
• Race to the bottom
• Rule breaking
• Shifting the burden
• Fixes that backfire
• Growth Paradox
And:
• Intensity to action
• Regenerative relationships
• Status quo disruption
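To give one concrete sense of "a distinct behavior over time", here is a minimal, hypothetical simulation (in Python, with arbitrary parameter values of our own choosing) of the first archetype above, limits to growth: a reinforcing growth loop coupled to a balancing loop driven by a finite carrying capacity.

def limits_to_growth(initial=1.0, growth_rate=0.2, capacity=100.0, steps=60):
    # Reinforcing loop (growth) constrained by a balancing loop (capacity);
    # this reduces to simple logistic growth and returns the trajectory.
    level = initial
    trajectory = [level]
    for _ in range(steps):
        reinforcing = growth_rate * level                      # growth engine
        balancing = growth_rate * level * (level / capacity)   # limiting factor
        level += reinforcing - balancing
        trajectory.append(level)
    return trajectory

run = limits_to_growth()
print(f"start: {run[0]:.1f}, midpoint: {run[30]:.1f}, end: {run[-1]:.1f}")
# Early steps look exponential; later steps flatten as the balancing loop takes
# over -- the characteristic S-shaped behavior of this archetype.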

There is an excellent series on systems thinking by Leyla Acaroglu on Systems


Thinkers in her Disruptive Design26 blogs. These 15 archetypes are described in
one.27 Gene Bellinger has developed individual short videos that explain these and
many more archetypes that are available on YouTube.28
These models become the testbed for using the different types of innovation
described in the Forms of Innovation section. They also provide the framework for
applying Meadows’ 12 points of Intervention shown in Fig. 2 and discussed in her
book [26] which she later expanded upon [67] (Fig. 10b).

9.6 Care in Intervention

The three biggest issues in addressing wicked problems are unintended conse-
quences, side effects and mismeasurements.
Mismeasurement, either measuring wrongly or measuring the wrong thing, is the
easiest to control. Before intervening in a system, several measurement caveats
should be kept in mind:
• Goodhart’s Law29 - “When a measure becomes a target, it ceases to be a good
measure.” In other words, when we set one specific goal, people will tend to
optimize for that objective regardless of the consequences.
• McNamara’s Fallacy30 - The first step is to measure whatever can be easily
measured. The second step is to disregard that which can’t be easily measured
or to give it an arbitrary quantitative value. The third step is to presume that what
can’t be measured easily really isn’t important. The fourth step is to say that what
can’t be easily measured really doesn’t exist. In other words, in pursuit of an easy
measurement, context and complexity are lost.
• The Flaw of Averages [68] - plans based on average assumptions are wrong, on
average (a small numerical illustration follows this list).
• Pursuit of Statistical Significance [69] - statistical significance should not be
equated with substantive real world significance and that empirical researchers
should convey more information about the magnitude of relationships and
effects.31

26
https://medium.com/disruptive-design
27
https://medium.com/disruptive-design/tools-for-systems-thinkers-the-12-recurring-systems-
archetypes-2e2c8ae8fc99
28
https://www.youtube.com/user/systemswiki/search?query¼archetype Gene is also a major con-
tributor to the Insight Maker project.
29
https://towardsdatascience.com/unintended-consequences-and-goodharts-law-68d60a94705c
30
https://www.logicallyfallacious.com/tools/lp/Bo/LogicalFallacies/237/McNamara-Fallacy
31
This is an area where a good systems model can prevent mistakes

Fig. 10b Adapted from https://thesystemsthinker.com/wp-content/uploads/pdfs/240111pk.pdf

• Survivorship Bias – when failure becomes invisible, the difference between


failure and success may also become invisible. This leads to assuming you should
focus on the successful if you wish to become successful.
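As promised above, here is a small numerical illustration of the flaw of averages (a Python sketch with invented numbers): a stock plan evaluated at the average demand looks almost costless, while the average cost over the actual, fluctuating demand is clearly not, because shortfall costs are nonlinear.

import random

random.seed(1)

STOCK = 100  # doses planned, equal to the average expected demand

def shortfall_cost(demand, stock=STOCK):
    # Unmet demand carries a cost; surplus is simply unused (a nonlinear cost curve).
    return max(demand - stock, 0) * 10.0

# Demand fluctuates around an average of 100 doses.
demands = [random.gauss(100, 30) for _ in range(10_000)]
average_demand = sum(demands) / len(demands)

cost_at_average = shortfall_cost(average_demand)                     # plan judged on the average input
average_of_costs = sum(map(shortfall_cost, demands)) / len(demands)  # expected cost over real variation

print(f"cost assuming average demand:    {cost_at_average:.1f}")
print(f"average cost over actual demand: {average_of_costs:.1f}")
# The first figure is far smaller than the second: a plan built on the average
# input misstates the expected outcome (Jensen's inequality in action).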
One of the key reasons for modeling systems before intervening in them is to be
able to see all the interrelationships of the components of the system, and the
system’s relationships with other systems. One characteristic of wicked systems is
their complexity and a small change in one part of the system may cause massive

changes in another (sometimes called a butterfly effect32 [70]). These become


unintended consequences.
Also, by having a systems model (shared mental model) it becomes possible to
anticipate the potential propagation of effects across the system, anticipating side effects.
Lastly, there is a tendency when intervening in a system to do so in a way that
makes it more efficient. This sometimes has an inverse effect on the efficacy of the
system. More importantly, efficiency tends to reduce the resilience of a system in
responding to unanticipated demands. This has been described by Nassim Taleb [71].

10 Applying Innovation to Health Security

10.1 Build Shared Systems and Mental Models

We talk about having everyone on the same page, and the best way to do that is
literally. Putting up a model that everyone can agree to, or at least discuss, ensures that
when interventions take place, everyone will know what to do and why to do it.
When you have that great idea for improving things (remember, one never solves
wicked problems), a systems model lets you see what its potential is, both for making
things better or worse. A systems model will also help identify all the interrelation-
ships from which unintended consequences and unintended degrees of side effects
occur.
Systems models also help align the enterprise (the value-creation system) with its “customers” (the value-consuming system) within the context of their shared environment (the systems of all stakeholders).
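
One lightweight way to make such a shared model inspectable is to capture the believed cause-and-effect links as a directed graph and ask what lies downstream of a proposed intervention. The sketch below is a hypothetical example rather than a method prescribed here: the nodes and edges are invented placeholders, and it assumes the open-source networkx package is available.

import networkx as nx  # assumes the networkx package is installed

# A shared mental model captured as believed cause -> effect links.
# Every node and edge here is an invented placeholder for illustration.
model = nx.DiGraph()
model.add_edges_from([
    ("clinic user fees", "clinic attendance"),
    ("clinic attendance", "early case detection"),
    ("early case detection", "outbreak size"),
    ("clinic attendance", "clinic workload"),
    ("clinic workload", "staff burnout"),
    ("staff burnout", "reporting quality"),
    ("reporting quality", "early case detection"),
])

intervention = "clinic user fees"
downstream = nx.descendants(model, intervention)
print(f"Changing '{intervention}' can plausibly touch: {sorted(downstream)}")
# Walking this downstream set with stakeholders is a cheap way to surface
# candidate side effects before anyone intervenes in the real system.

The value is not in the code but in the conversation it forces: every edge is a claim someone can challenge, which is what makes the model shared.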
Begin to have meetings around diagrams, models and pictures instead of reports and spreadsheets. Have people draw their ideas in the context of the system/problem being addressed. The Systems Thinker site has a good series of articles33 on how to get your organization to adopt systems thinking.
This addresses the need to have a common language of innovation as discussed earlier.
There are many ways to develop systems thinking within the organization. There are numerous courses online; the ones at Complexity Labs are good starting points. There are also numerous sites with loads of material; Systems Thinker and Systems & Us are two examples. The Waters Foundation provides a number of courses, materials and step-by-step guides to systems thinking. The Fifth Discipline and the associated Field Guide also provide good starting points.

32 https://www.wisegeek.com/what-is-the-butterfly-effect.htm
33 https://thesystemsthinker.com/introducing-systems-thinking-into-your-organization/

10.2 Convergent and Divergent Thinking and People

As managers we are the masters of convergent thinking. We are analytical and rational. We are quantitatively evaluative and focused, so we address one thing at a time. We work within our perceived constraints to achieve our perceived objectives. We are successful because we pay attention to specifics and their details. And our organizations are reflections of ourselves.
A great deal of value, especially efficiency, emerges from this mode of focused convergent thinking. But when one is deep in the examination of a bark beetle in the bark of a pine tree, one likely misses the other trees, plants, animals, geology, weather – the elements of the ecosystem – that enable that beetle to be there. Seeing that wider ecosystem is the value of systems thinking.
Systems thinking is divergent. It is creative (because except for simple systems it
is impossible to enumerate every element and every relationship, and each element
and relationship can be its own system). It is intuitive, qualitative, subjective and
speculative about all the possibilities that could emerge as the system pursues its
purpose. Systems thinking is holistic and deals with conceptual abstractions as much as, if not more than, specific details, because a complex system is forever displaying emergent behavior.
Divergent thinking requires divergent people (more points of view) and creativity. There are many ways to acquire and develop those people.
Meet People’s Needs. Recognize that questioning orthodoxy and convention —
the key to creativity — begins with questioning the way people are expected to work.
How well are their core needs — physical, emotional, mental, and spiritual — being
met in the workplace? The more people are preoccupied by unmet needs, the less
energy and engagement they bring to their work. Begin by asking employees, one at
a time, what they need to perform at their best. Next, define what success looks like
and hold people accountable to specific metrics, but as much as possible, let them
design their days as they see fit to achieve those outcomes.
Teach Creativity Systematically. It isn’t magical and it can be developed. There
are five well-defined, widely accepted stages of creative thinking: first insight,
saturation, incubation, illumination, and verification. They don’t always unfold
predictably, but they do provide a roadmap for enlisting the whole brain, moving
back and forth between analytic, deductive left hemisphere thinking, and more
pattern-seeking, big-picture, right hemisphere thinking. The best description of the stages we’ve come across is in Betty Edwards’ book Drawing on the Artist Within. The best understanding of the role of the right hemisphere, and how to cultivate it, is in Edwards’ first book, Drawing on the Right Side of the Brain.
Nurture Passion. The quickest way to kill creativity is to put people in roles that
don’t excite their imagination. This begins at an early age. Kids who are encouraged
to follow their passion develop better discipline, deeper knowledge, and are more
persevering and more resilient in the face of setbacks. Look for small ways to give
employees, at every level, the opportunity and encouragement to follow their
interests and express their unique talents.

Make the Work Matter. Human beings are meaning-making animals. Money pays
the bills but it’s a thin source of meaning. We feel better about ourselves when we’re making a positive contribution to something beyond ourselves. To feel truly
motivated, we have to believe what we’re doing really matters. When leaders can
define a compelling mission that transcends each individual’s self-interest, it’s a
source of fuel not just for higher performance, but also for thinking more creatively
about how to overcome obstacles and generate new solutions.
Provide the Time. Creative thinking requires relatively open-ended, uninterrupted
time, free of pressure for immediate answers and instant solutions. Time is a scarce,
overburdened commodity in organizations that live by the ethic of “more, bigger,
faster.” Ironically, the best way to ensure that innovation gets attention is to schedule
sacrosanct time for it, on a regular basis.
Value Renewal. Human beings are not meant to operate continuously the way
computers do. We’re designed to expend energy for relatively short periods of time
— no more than 90 min — and then recover. The third stage of the creative process,
incubation, occurs when we step away from a problem we’re trying to solve and let
our unconscious work on it. It’s effective to go on a walk, or listen to music, or quiet
the mind by meditating, or even take a drive. Movement — especially exercise that
raises the heart rate — is another powerful way to induce the sort of shift in
consciousness in which creative breakthroughs spontaneously arise.

10.3 Fail Fast, Fail Furiously, Learn, Try Again

When putting together your system interventions, go small. When dealing with a complex system, the guidance of the Cynefin Framework is to “probe” (what we earlier called a hypothesis test), then sensemake from the behavior the system demonstrates in response, then respond by adjusting your hypothesis (probe), and to do this over and over again until you see the improvement in the system you seek. Large, analysis-driven, too-big-to-fail projects will fail for all the reasons discussed above about taking care in intervening in a system.
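
Read as a loop, that guidance looks roughly like the sketch below. It is a deliberately small, hypothetical harness: the probe, the observed signal and the adjustment rule are all invented stand-ins, and the point is the shape of the cycle rather than any particular logic.

import random

# A minimal probe -> sense -> respond loop in the spirit of the Cynefin
# guidance for complex systems. The "system" being probed, the signal and
# the adjustment rule are invented stand-ins for illustration.

random.seed(7)

def run_probe(intensity):
    """Stand-in for a small, safe-to-fail experiment returning an observed signal."""
    hidden_response = 0.8 * intensity - 0.3 * intensity ** 2   # unknown to the team
    return hidden_response + random.gauss(0, 0.05)

intensity, step, best = 0.2, 0.2, float("-inf")
for probe in range(1, 9):
    signal = run_probe(intensity)          # probe: try something small
    improved = signal > best               # sense: did the system respond better?
    print(f"probe {probe}: intensity={intensity:.2f} signal={signal:.3f} improved={improved}")
    if improved:
        best = signal
        intensity += step                  # respond: push a little further
    else:
        step *= 0.5                        # respond: back off and probe more gently
        intensity = max(0.0, intensity - step)
# Each cycle is small enough to fail safely and informative enough to shape
# the next hypothesis; no single probe is bet-the-system.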

10.4 Synthesis Based Decision Making

Fig. 11 Adapted from Edward de Bono’s six thinking hats [74, 75]

The fallacy of evidence-based decision making is the assumption that you really have all the evidence needed when making the decision. You may have fallen victim to either Goodhart or McNamara. The evidence may get tweaked and tuned (drop the outliers) to improve the stats, when in reality insight and discovery often lie with the outliers. Simple unintentional color selection in data representation can skew perception. Information separated by pages, or even by fonts and font sizes, can misrepresent relationships34 [72].
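
The point about outliers is easy to see with a handful of numbers. The toy data below are invented for illustration; the summary statistics tell a comfortable story once the inconvenient points have been "cleaned" away, and the story worth investigating disappears with them.

from statistics import mean, stdev

# Invented toy data: response times (in days) for ten outbreak alerts.
# Two responses went badly wrong; those are exactly the cases worth studying.
response_days = [2, 3, 2, 4, 3, 2, 3, 4, 21, 18]

trimmed = [d for d in response_days if d < 10]   # "drop the outliers" to improve the stats

print(f"All data:         mean={mean(response_days):.1f} days, sd={stdev(response_days):.1f}")
print(f"Outliers removed: mean={mean(trimmed):.1f} days, sd={stdev(trimmed):.1f}")
# The trimmed figures look reassuring, but the insight (why two responses
# took roughly three weeks) has been thrown away with the outliers.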
Developing a process to support synthesis-based decision making helps innovation emerge in an enterprise. One example of this is the process developed by Edward de Bono [74, 75] called the “six thinking hats”. There are six metaphorical hats and each defines a certain type of thinking. By going through all six forms of thinking, a team synthesizes across all the dimensions of the decision to be made – including knowns (simple system elements),35 known unknowns (complicated system elements), unknown unknowns (complex system elements), and unknowables (chaotic system elements) (Fig. 11).
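
As a rough operational sketch, the six hats can be treated as a checklist that forces every mode of thinking to be captured before a decision is synthesized. The hat meanings below follow de Bono's scheme, but the prompts, the example decision and the team notes are invented placeholders.

# A rough checklist sketch of de Bono's six thinking hats. The prompts,
# the example decision and the notes are invented placeholders.

HATS = {
    "White":  "What facts and data do we actually have, and what is missing?",
    "Red":    "What are our gut feelings and intuitions about this?",
    "Black":  "What could go wrong? Where are the risks and weaknesses?",
    "Yellow": "What value and benefits could this create?",
    "Green":  "What alternatives or creative variations haven't we considered?",
    "Blue":   "What process are we following, and what do we decide next?",
}

def run_hat_session(decision, notes):
    """Print a synthesis record; every hat must be addressed before deciding."""
    print(f"Decision under discussion: {decision}\n")
    for hat, prompt in HATS.items():
        answer = notes.get(hat, "-- not yet discussed --")
        print(f"[{hat} hat] {prompt}\n    team notes: {answer}\n")

run_hat_session(
    "Pilot a community-based fever surveillance app in two districts",
    {
        "White": "Baseline reporting delay is nine days; smartphone coverage is unknown.",
        "Black": "Risk of alert fatigue and data-quality problems; privacy concerns.",
        "Green": "Could piggyback on the existing agricultural extension network.",
    },
)
# Hats left at "not yet discussed" make it obvious which perspectives the
# group skipped before it converged on an answer.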
As described earlier, Heyman et al. [3:1888] argue that “the world is ill-prepared” to handle any “sustained and threatening public-health emergency”.
The Global Health Security landscape is characterized by its inherent VUCA
(volatility, uncertainty, complexity and ambiguity). These features, along with the global impacts of disease, make the requirement for innovation all the more urgent. Working across the innovation space pertaining to global health security
means drawing upon the tools and approaches resident within the innovation thought
leadership space. This chapter introduces the salient innovation approaches and
mindset required to address Global Health Security issues.

34 Sometimes with deadly consequences, as Edward Tufte [73] concluded as part of the Challenger Space Shuttle accident investigation.
35 Parentheticals are from the Cynefin Framework.

11 Conclusion

H1N1, SARS, H5N1, HIV/AIDS and Ebola are part of a long list of emerging lethal diseases that compromise and challenge the notion of global health security. They will not be the last new and lethal pathogens to emerge. Black swans and perfect storms are inevitable. The question arises: are we ready?
This chapter explored the foundations of the innovation space as it applies to
global health security. The wicked nature of global health security points to how
innovation and complexity framing go hand in hand in dealing with such global
issues.

References

1. Weick KE, Sutcliffe KM (2007) Managing the unexpected: resilient performance in an age of uncertainty, 2nd edn. Wiley, California
2. Morens DM, Fauci AS (2013) Emerging infectious diseases: threats to human health and global
stability. PLoS Pathog 9(7):1–3
3. Heyman et al (2015) Global health security: the wider lessons from the west African Ebola virus disease epidemic. Lancet 385, 9 May 2015. www.thelancet.com
4. Rittel HWJ, Webber MM (1973) Dilemmas in a general theory of planning. Policy Sci 4(2):155
5. Masys AJ (2016) Disaster forensics: understanding root cause and complex causality. Springer
Publishing
6. Helbing D (2013) Globally networked risks and how to respond. Nature 497:51–59
7. World Economic Forum Global Risks (2018) https://www.weforum.org/reports/the-global-
risks-report-2018
8. Masys AJ, Ray-Bennett N, Shiroshita H, Jackson P (2014) High impact/low frequency extreme
events: enabling reflection and resilience in a hyper-connected World. 4th International Con-
ference on Building Resilience, Building Resilience 2014, 8–10 September 2014, Salford
Quays, United Kingdom. Procedia Economics and Finance 18:772–779
9. Woods DD (2006) Essential characteristics of resilience. In: Hollnagel E, Woods DD, Leveson
N (eds) Resilience engineering: concepts and precepts. Ashgate Publishing, Aldershot, Hamp-
shire, pp 21–34
10. Sagan SD (1993) The limits of safety organizations, accidents, and nuclear weapons. Princeton
University Press
11. Belay ED, Kile JC, Hall AJ, Barton-Behravesh C, Parsons MB, Sayler S, Walker H (2017) Zoonotic disease programs for enhancing global health security. Emerg Infect Dis 23(Suppl), December 2017. www.cdc.gov/eid
12. Kickbusch I (2016) Governing the global health security domain. Global Health Programme
Working paper No. 12. Graduate Institute of International and Development Studies
13. Moore M, Westley F (2011) Surmountable chasms: networks and social innovation for resilient
systems. Ecol Soc 16(1):5
14. Masys AJ (2014) Disaster Management: Enabling Resilience. Springer Publishing
15. Masys AJ (2015) Applications of systems thinking and soft operations research in managing
complexity. Springer
16. Masys AJ (ed) (2018) Security by design. Springer
17. Senge PM (1990) The fifth discipline: the art and practice of the learning organization. Century
Business, London

18. Westley F, Zimmerman B, Patton M (2007) Getting to maybe: how the world is changed.
Random House of Canada, Toronto
19. Kurtz CF, Snowden DJ (2003) The new dynamics of strategy: sense-making in a complex and
complicated world. IBM Syst J 42(3):462–483
20. Snowden D (2003) Complex knowledge. Building the knowledge economy: issues, applica-
tions, case studies, 805
21. Snowden DJ, Boone ME (2007) A leader’s framework for decision making (cover story). Harv
Bus Rev 85(11):68–76
22. Gallie WB (1955) Essentially contested concepts. Paper presented at the Proceedings of the
Aristotelian society
23. Freeman RE, Harrison JS, Wicks AC, Parmar BL, De Colle S (2010) Stakeholder theory: the
state of the art. Cambridge University Press
24. Miles S (2015) Stakeholder theory classification: a theoretical and empirical evaluation of
definitions. J Bus Ethics. https://doi.org/10.1007/s10551-015-2741-y
25. Rogers EM (2010) Diffusion of innovations. Simon and Schuster
26. Meadows DH, Wright D (2009) Thinking in systems: a primer. Earthscan, London
27. Meadows DH (2008) Thinking in systems: a primer. Chelsea Green Publishing
28. Meadows D (1999) Leverage points: places to intervene in a system. The Sustainability Institute
29. Porter ME (1996) What is strategy? Harv Bus Rev, November–December 1996
30. Birkinshaw J, Goddard J (2009) What is your management model? MIT Sloan Manag Rev 50(2):81
31. Hamel G (2006) The why, what, and how of management innovation. Harv Bus Rev 84(2):72
32. Robertson BJ (2015) Holacracy: the new management system for a rapidly changing world.
Henry Holt and Company
33. Mason-Jones R, Naylor B, Towill DR (2000) Lean, agile or leagile? Matching your supply
chain to the marketplace. Int J Prod Res 38(17):4061–4070
34. Gilson LL, Maynard MT, Jones Young NC, Vartiainen M, Hakonen M (2015) Virtual teams
research: 10 years, 10 themes, and 10 opportunities. J Manag 41(5):1313–1337
35. Osterwalder A, Pigneur Y (2010) Business model generation: a handbook for visionaries, game
changers, and challengers. Wiley, New Jersey
36. Anthony SD (2017) The little black book of innovation, with a new preface: how it works, how
to do it. Harvard Business Review Press
37. Berkhout F, Green K (2002) Managing innovation for sustainability: the challenge of integra-
tion and scale. Int J Innov Manag 6(03):227–232
38. Lafley AG, Charan R (2010) The game changer: how every leader can drive everyday
innovation. Profile Books
39. Pink DH (2011) Drive: the surprising truth about what motivates us. Penguin
40. Schön DA (1983) The reflective practitioner: how professionals think in action. Ashgate
Publishing Ltd, London
41. Amara R (1981) The futures field: searching for definitions and boundaries. Futurist
15(1):25–29
42. Henchey N (1978) Making sense of futures studies. Alternatives (7):24–29
43. Voros J (2003) A generic foresight process framework. Foresight 5(3):10–21
44. Taleb NN (2007) The black swan: the impact of the highly improbable. Random House
45. Hancock T, Bezold C (1994) Possible futures, preferable futures. Healthc Forum J 37(2):23–29
46. Bausch KC (2001) The emerging consensus in social systems theory. Kluwer Academic/Plenum Publishers, New York
47. Porter ME, Kramer MR (2011) Creating shared value. Harv Bus Rev 89(1/2):62–77
48. Robson R (2015) A new look at benefit corporations: game theory and game changer. Am Bus
Law J 52(3):501–555. https://doi.org/10.1111/ablj.12051
49. Hines A, Bishop PJ (2006) Thinking about the future: guidelines for strategic foresight. Social
Technologies, Washington, DC
50. Iveroth E, Hallencreutz J (2015) Effective organizational change: leading through sensemaking.
Routledge
