
HPLS (2018) 40:8

https://doi.org/10.1007/s40656-017-0176-8

ORIGINAL PAPER

From lighthouse to hothouse: hospital hygiene, antibiotics and the evolution of infectious disease, 1950–1990

Christoph Gradmann1

Received: 2 February 2017 / Accepted: 20 November 2017
© Springer International Publishing AG, part of Springer Nature 2017

Abstract Upon entering clinical medicine in the 1940s, antibiotic therapy seemed
to complete a transformation of hospitals that originated in the late nineteenth
century. Former death sinks had become harbingers of therapeutic progress. Yet this
triumph was short-lived. The arrival of pathologies caused by resistant bacteria, and
of nosocomial infections whose spread was helped by antibiotic therapies, seemed
to be intimately related to modern anti-infective therapy. The place where such
problems culminated were hospitals, which increasingly appeared as dangerous
environments where attempts to combat infectious diseases had instead created
hothouses of disease evolution. This paper will focus on one aspect of that history: how it forced clinical medicine, and hospital hygiene in particular, to pay attention to a dimension of infectious disease it had previously largely ignored. The
evolution of infectious disease—previously a matter of mostly theoretical interest—
came to be useful in explaining many of the phenomena observed. This did not turn
hospital hygienists into geneticists, though it did give them an awareness that the
evolution of infectious disease in a broad sense was something that did matter to
them. The paper advances its argument by looking at three phases: The growing
awareness of the hospital as a dangerous environment in the 1950s, comprehensive
attempts at improving antibiotic therapy and hospital hygiene that followed from the
1960s and lastly the framing of such challenges as risk factors from the 1970s. In
conclusion, I will argue that hospital hygiene, being inspired in particular by epi-
demiology and risk factor analysis, discussed its own specific version of disease
emergence and therefore contributed to the 1980s debates around such topics. Being
loosely connected to more specialized studies, it consisted of a re-interpretation of
infectious disease centred around the temporality of such phenomena as they were
encountered in the day-to-day dealings of clinical wards.

✉ Christoph Gradmann
christoph.gradmann@medisin.uio.no
1 Section for Medical Anthropology and History, Institute of Health and Society, University of Oslo, Oslo, Norway

Keywords Hospital infections · Antibiotic resistance · Disease evolution · Epidemiology · Risk factors · Medical microbiology

1 Introduction

When medical bacteriology attempted to bring laboratory science to work by the
bedside in the late nineteenth century, it did so in a place that became intimately
connected to its own history and the progress it aspired to: the hospital. At the
beginning of the nineteenth century, this institution would commonly be considered
unhealthier than its environment, and it had few effective medicines to offer
(Bynum 1994, pp. 1–24). The therapeutic promises, which the laboratory revolution
harboured, then transformed the hospitals’ prestige into that of a harbinger of
therapeutic progress. A well-studied progressivist gospel of medical bacteriology
being the discipline that brought laboratory science to the bedside played an
important role in this case, but there is some indication that progress was not just a
promise.1 For instance, the arrival of the diphtheria antitoxin therapy in the early
1890s indicated a fundamental change. It was hospital medicine rather than resident
practices that could apply the rocket science therapy of the day (Weindling 1992).
We can think of related phenomena such as antisepsis or, somewhat earlier, the
moving of surgery into hospitals, and we realize that a century later, what used to be
death sinks for poor patients became perceived as healthy and hygienic environ-
ments where modern medicine would be practiced.
As can be read from a multitude of disinfection practices from carbolic acid to
doctors’ white coats, therapeutic modernity and meticulous hygiene were closely
connected. From the history of late nineteenth century surgery, we can learn how the
notion of asepsis became important when bacteriology started to look at surgical
theatres (Schlich 2012, 2013). When the medical bacteriologist Leonard Colebrook
introduced sulphonamide therapy to treat puerperal sepsis in the 1930s, he combined
the application of Prontosil, the sulpha drug in question, with that of Dettol, a
disinfectant which would make sure that with regard to infectious challenges, the
hospital could be considered a safer place than its outside environment. In addition,
phage typing, then a novelty in clinical medicine, was employed to control for any
cross-infections.2 As Irvine Loudon has told us in his history of puerperal fever, the
application of modern medicines and meticulous hygiene converged in the notion of
the modern hospital being an environment that would be as much a healthy
environment as a harbinger of therapeutic modernity (Loudon 1987, 2000). As
Flurin Condrau and Rob Kirk summarized it, ''Hospitals were conceptualised as
inherently healthy institutions and therefore clean locations, almost perhaps vessels
for the emerging scientific medicine'' (Condrau and Kirk 2011, p. 389).

1 For an overview, see Gradmann (2017, pp. 380–383). For a recent analysis of bacteriology's impact on clinical medicine, see Wall (2013).
2 Concerning typing, see Colebrook and Kenny (1936). Colebrook collaborated with Fred Griffith, a pioneer of research into bacterial typing (Ayliffe and English 2003, p. 136; Méthot 2016).
This notion of the healthy hospital seems to have run aground towards the end of
the twentieth century, when the therapeutic promises of antibiotic therapies gave
way to dystopian visions. As Stuart Levy commented in 1992, ‘‘Antibiotic usage has
stimulated evolutionary changes that are unparalleled in recorded biologic history’’
(Levy 1992, p. 80). As a result, hospitals could now be portrayed as inherently
dangerous environments where particularly dangerous microbes were lurking on the
inmates: ‘‘You will be safer staying home than going to a hospital’’ (Koshland 1992)
was how the biochemist Daniel Koshland introduced an issue of the journal Science
under the headline ''The Microbial Wars''. Its contributions summed up a
situation in which the lighthouses of therapeutic modernity were increasingly seen
as life threatening environments. In the run-up to the British general election of
2005, media attention for the ‘‘hospital superbug’’ methicillin-resistant Staphylo-
coccus aureus (MRSA) became the lens through which the public viewed hospitals
as dangerous environments (Washer and Joffe 2006).
The professionals in charge of the hygienic challenges that the modern hospital
poses were hospital hygienists, and it is first and foremost their story that this paper
tells. By looking at published material on both the UK and US, this paper will
investigate how these professionals, who had previously taken pride in safeguarding
the hospital’s status as a healthy environment, tackled a development that presented
that same institution as being beset by infectious risks. Presumably, hospital hygiene
differed quite a bit in the US and the UK. For instance, it is commonly agreed that
the field originated in the UK, though it became more international as it grew.3 Yet,
it is not the local trajectories that interest us here, but the fairly uniform set of
challenges that had to be faced. These could be hospital infections caused by
common and usually non-pathogenic microbes in immunocompromised patients,
so-called non-classical pathogens; infections resulting from medical technology,
such as catheters, that brought bacteria to places previously inaccessible to them; or
infections caused by resistant bacteria like MRSA, which seemed to have made their
home specifically in hospital environments (McKenna 2010; Ayliffe and English 2003).
Technically speaking, the changes in the understanding of infectious disease that this
paper discusses relate to what can be described as a belated reception, by clinicians
studying hospital infections, of specialized research in the epidemiology and
evolutionary biology of infectious disease. Arguably, variation had been studied by
medical microbiology, especially after the rigidity of classical Kochian bacteriology
softened after 1890. Evolution, however, remained a specialty of the basic
biological sciences and of epidemiology for most of the twentieth century.4 It only
began to be meaningful for clinical medicine from the late 1950s.5 Just think of
phages, which had enjoyed several decades of a career in biochemistry before
becoming central to the diagnosis of antibiotic resistance in the 1950s (Gradmann
2013, p. 560). Only gradually did infection medicine cease to be what Joshua
Lederberg despairingly called one of ''the last refuges of the concept of special
creationism''6—a field that, despite having long studied bacterial variation, saw
microbial species as rather static and that for all practical purposes considered the
evolutionary biology of infection a subject that could be left to basic science.

3 Ayliffe and English (2003).
4 Amsterdamska (1987, p. 682) argues that interwar bacteriology challenged the 'Cohn-Koch dogma' through work on bacterial variation but showed little interest in questions of inheritance. Mendelsohn (2002) makes the more radical point that there probably never was such a thing as the Cohn-Koch orthodoxy. Instead there was a place for variation and evolution in modern bacteriology, for instance in work on virulence. His accent, however, as in the case of Amsterdamska, is strongly on studies of variation.
Inspired by a host of observations that we will go into later in this paper, infection,
having previously been framed as rather static, now came to be seen as something that
changed under the eye of clinicians. Next to acquiring some familiarity with disease
ecology and evolutionary biology, risk factor analysis also came to play an
important role. It was employed when clinicians attempted to statistically predict
which patients were more likely than others to develop hospital infections.7
These sound like very different sources of inspiration, yet from the perspective of
hospital hygiene, microbiology and risk factor analysis were connected by a
pervasive interest in understanding and controlling the spread of hospital infections.
More generally speaking, what I will observe is that through the study of hospital
infections, temporality became an essential element in understanding infections.8
This had not been so in previous times. Looking at how the biology of infectious
disease was framed by early twentieth century medicine, we realize that medicine’s
thinking in those days was contradictory in interesting ways: In principle, physicians
considered the biology of infection as something that evolves. Even so, it was
assumed to do so along the slow paths of Darwinian evolution, and thus for all
practical purposes its biological history did not matter.9 In Robert Koch's work, to
cite a prominent example, bacteria are treated as static in their properties, despite the
fact that Koch was a disciple of Ferdinand Julius Cohn, a Darwinist
microbiologist (Gradmann 2009). Historical change was nonetheless projected in
Koch's day. Yet it did not relate to the biology of disease; instead it was
supposed to result from human technological intervention aimed at the conquest of
infectious disease. Such change was expected to be swift and radical, resulting in
the aspired-to goal of eradicating infectious disease. This progressivist technological
optimism was a key feature of twentieth century medical bacteriology and it fueled
the prestige of antibiotics.10 Still, as this paper argues, it was through studying the
application of antibiotics and of other means of technological control of infectious
disease, such as immunosuppressive therapies in intensive care medicine, that the
temporality of the conditions studied came to matter in infection medicine. An
institution where the awareness of this incipient crisis was built up was the hospital.
From the mid-twentieth century, its image changed from a lighthouse of therapeutic
modernity to a hothouse of disease evolution.

5 Méthot (2016) is a recent paper that offers an introduction to such work.
6 Quoted in Méthot and Alizon (2014, p. 781).
7 Introductions to the use of risk factors in clinical medicine are provided in Marks (1997) and Rothstein (2003).
8 My approach is inspired by Reinhart Koselleck's work. This historian developed an anthropology of how time is experienced in history. In historical situations of perceived stability, experience dominates and the present is seen in continuity with the past. In periods of accelerated change, however, it is expectation that dominates, while the present is seen as discontinuous with the past (Koselleck 1979). For an introduction see Bouton (2016).
9 In her classical analysis of two camps in nineteenth-century bacteriology, organized around notions of unitarianism and specificity in microbiology, Mazumdar (1995) emphasized how the dominant school, with its emphasis on specificity, would downplay the variability of bacteria.
It would be tempting to cover that history at book length, but this paper will settle
for a more modest aim. It will look at three historical situations in the history of
infection control in hospitals. The first of those is located in the 1950s, the era when
hospitals began to be seen as dangerous environments. Inspiration drawn from
epidemiology and disease ecology provided a means for medical professionals to
comprehend the changes they were observing. The second historical situation
consists of attempts at the control of such challenges a few years later. These could
be framed as a moral challenge, to be met with restraint in the application of
medicines and a re-enforcement of what was seen as traditional hygiene. They
could, however, also be embraced, and in this case they offered an exciting opportunity
for pharmacological innovation. The third historical situation takes us into the
1970/1980s, a time when previous approaches had not worked well. Now, risk
factor analysis provided a framework to see hospitals as an essentially dangerous
environment characterized by novel infectious challenges. From this vantage point,
hospital hygienists could again reconnect to basic biologic sciences that were
developing a strong interest in disease emergence and evolution in those years.
While not being connected in their methods, both risk factor analysis and work on
the quick spread of drug resistance through horizontal gene transfer facilitated a
view of hospitals as environments characterized by rapid and discontinuous change.
What is attempted in this paper is not a comprehensive history of hospital
infections. Instead, the example of such conditions is employed to discuss a
framework for the history of infectious disease in the second half of the twentieth
century, in which temporality figures centrally. As to the material consulted, the
analysis is based on the few available historical overviews,11 and on an extensive
search of the journal literature that started from what practitioners of hospital hygiene
considered classical pieces of text.12

10 See Bud (2007) for an analysis that emphasizes this.
11 Ayliffe and English (2003) provides a wealth of detail for those who want more. A useful interpretation of the British situation in the 1950s is Condrau and Kirk (2011). Some insight into the US history can be found in actors' accounts, including Brachman (1981), Dixon (2011) and Hughes (1987).
12 The collection by Suter and Vincent (1993), which focused on journals from the field of hospital hygiene and infection medicine, served as a starting point.

2 Hospitals in danger

It is by now commonplace that the arrival of antibiotics and sulpha-drugs during
World War II coincided with the appearance of resistant strains among the microbes
targeted (Bud 2007; Lesch 2007). To cite the most prominent example, resistant
Staphylococcus aureus had been shown to exist in penicillin research laboratories
during the war. It then made its appearance in clinical wards soon after (Bud 1993,
pp. 116–122; McKenna 2010; Podolsky 2015, pp. 10–42). Moreover, as contemporaries
soon realized, such strains were in fact not found everywhere that
antibiotics were in use. Instead, as investigations of an epidemic of resistant
Staphylococcus in particular made clear, they were specialized inhabitants of hospital
wards. Here, as Kathryn Hillier has shown in the example of maternity wards,
resistance could spread via cross-infection between patients and health personnel
(Hillier 2006). Attention to antibiotic resistance thus became closely linked to the
challenges of hospital hygiene. We should be aware that antibiotics, besides their
therapeutic application, had quickly become popular as efficiency enhancers that
would help in reducing laborious hygienic practices (Bud 2007, pp. 97–115). As
Wesley Spink, a clinician from Minneapolis, noted in 1953, the proliferation of these
medicines created rather novel challenges. While antibiotics had been a ‘‘revolu-
tionary advancement’’ in therapy, their widespread application had brought forth
some bewildering consequences. ‘‘The clinician’’, he noted, ‘‘is often perplexed by
those infectious processes that fail to respond favourably to the antibiotics;
paradoxically, the antibiotics have brought forward new clinical problems.’’13 Their
effectiveness and easy availability resulted in their ‘‘indiscriminate use’’, and a
‘‘false sense of security’’ prevailed. By treating one infection, doctors could easily
create openings for another:
The clinicians may properly start treating an infection due to a susceptible
strain or strains [… and] end treating a new bacterial infection in the patient as
a result of penicillin therapy (Spink 1955, p. 586).

Reviewing little over a decade of antibiotic therapy, Spink concluded that an initial
feeling of optimism over what had been accomplished had been succeeded by waves
of increasing pessimism (Spink 1955, p. 589).
Where Spink voiced bewilderment and pessimism, others were more outspoken. In
1955, Spink’s British colleague E.J. Lowbury bluntly stated that ‘‘the problem of
hospital infection has been complicated rather than solved by chemotherapy’’
(Lowbury 1955). Leonard Colebrook, doyen of British hospital hygiene, pointed to
a marked increase of post-operative complications that coincided with the rising
prevalence of resistant Staphylococcus. For Colebrook, it was not so much
resistance itself as its spread via cross-infection in hospitals that was at the core of
the problem. The best answer to this challenge seemed to be a (re)affirmation of
classical rules of hygiene. Rather than merely accept these new hazards and
''put our trust in antibiotics'' (Colebrook 1955, p. 888), Colebrook argued for a
rigorous control of the ''menace of cross-infection'' (Colebrook 1955, p. 886).
Rather than bringing in more antibiotics, he thought, cleanliness and isolation were
the appropriate answers. Doctors should ‘‘strive continuously to block the many
channels by which pathogens are transmitted in our hospitals so that the hazards of
cross infection are progressively diminished.’’ (Colebrook 1955, p. 888).

13 Spink (1955, p. 585), as well as the following two quotes. The paper was based on a presentation delivered in 1953.

There are several features in the debates as they evolved in the 1950s that deserve
our attention. As previously mentioned, hospitals had been seen as harbingers of
therapeutic progress. While antibiotics had added to that prestige, they had also
turned these same places into spaces where doctors were now fighting a novel
challenge, namely resistant microbes. Part of that situation was that the ways in
which hospitals could be distinguished from the outside world had changed; having
been considered healthier environments, they began to be seen as the opposite, as
dangerous places where the problematic consequences of the antibiotic
revolution surfaced most visibly. When William Kirby, a bacteriologist from the
University of Washington, published a paper on urinary tract infections in 1956, he
described what he experienced as a paradox. Under laboratory conditions,
cultures of the bacterial species responsible for these pathologies would not easily
acquire resistance; resistant strains would nonetheless be frequently found
among hospital inmates who suffered from such infections. If we follow Kirby, the
explanation of this paradoxical situation was to be found in the therapies practiced
in the hospital. By knocking out sensitive strains, the application of antibiotics had
created a situation which favoured both resistant strains of the original bacteria and
colonization by different microbes that had been non-sensitive to the applied
medicines in the first place but which would not normally infect patients: ''[…],
sensitive strains have been eliminated by antibiotics and have been replaced by
naturally resistant species, such as Pseudomonas or Proteus'' (Kirby et al. 1956).
In another paper from 1956, Walsh McDermott deemed it necessary to inform his
readers that the prevalent challenge of staphylococcal infection should not be
misinterpreted to be caused by that bacterium’s increasing prevalence or
pathogenicity. After all, staphylococci were ubiquitous bacteria of a usually low
pathogenicity. As he contended, there was little reason to believe that staphylococci
were more widespread in 1956 than they had been before. Infection rates among
those patients presenting to a hospital had been stable: ''We are dealing not really
with invaders but with original settlers’’ (McDermott 1956, p. 58). If problems
inside the hospital escalated, of which McDermott was in no doubt, the causes were
to be found in the very place and its inhabitants:
[…] today we are seeing, with greater frequency, the actual development of
staphylococcal disease in patients after they have come to our hospitals. […] It
is not so much that our patients in hospitals are being subjected to any greater
challenge by staphylococci as it is that they are less well equipped to meet the
same old challenge. […] First, our modern therapies are permitting the
survival of people who are less able […] to cope with bacterial infections. […]
Second, certain of our commonly used treatments […] are known to create
circumstances that, at times, facilitate the development of infection.14
What surfaces in McDermott’s paper is an epidemiological conceptualization of the
problem of hospital infection caused by resistant bacteria and common microbes.

14 From the context it is clear that, when writing about hospitals ''permitting the survival of people who are less able […] to cope with bacterial infections'', McDermott was not thinking of racial differences but of the patients' age or the therapies they had undergone (McDermott 1956, p. 64).

Rather than being framed as events in individual patients, hospital infections were
taken to reflect challenges in hospital environments and among their inhabitants,
seen as a population.
In 1959, David Rogers, a Professor of Medicine at Vanderbilt University,
published a paper that put much of what had been discussed thus far into a larger
perspective. As a specialist in infection medicine (Altman 1994), he had had the
opportunity to study a remarkable set of evidence. Investigations of death by
infectious disease at his hospital had been executed by the ‘‘same competent
bacteriologist’’ (Rogers 1959, p. 677) over many decades. From this set of cause-of-
death investigations, Rogers had chosen to compare such events in 1938–1940 with
those in 1957–1958. The result was a remarkable insight into the impact that the
introduction of antibiotics had had on hospitals and their inmates. There had been a
decrease in the overall number of deaths. However, this decrease ‘‘had not been as
dramatic as one would expect'' (Rogers 1959, p. 682) and in fact, despite the
introduction of sulphas and antibiotics, infections were still a ''significant medical
problem’’ (Rogers 1959, p. 682). Rogers attributed some of this to the rising average
age of the patient population, but what was more interesting was that not all
infectious causes had diminished at the same pace. While classics such as
pneumonia or tuberculosis had practically disappeared, other infectious pathologies
increased. Fungal, enterobacterial and staphylococcal infections were more
prevalent in 1958, in both relative and absolute numbers. Even more surprising
were the sources of such infections. Before World War II, patients usually presented
with infections they had acquired before admission, usually outside of the hospital.
In 1958, however, what dominated were conditions that had arisen after hospitalization.
Fatal infections in 1958 were also different ones, and ‘‘contrary to popular belief’’
(Rogers 1959, p. 682), not dominated by resistant Staphylococcus. Yet, they shared
two common features:
[…], disease was produced by microbes, ordinarily of low invasiveness, that
commonly form part of the normal human flora. Infections arose in patients
already compromised by serious disease who were often receiving therapies
known to affect host resistance to infection (Rogers 1959, p. 682).
What the latter referred to were mostly antibiotic therapies. Rogers employed,
perhaps in a rather simple fashion, an ecological understanding of infectious disease
causation. He emphasized that eradicating one bacterial species would usually
create inroads for others to colonize—one could speak of the creation of an
ecological niche—more so if this space was a hospital full of patients whose
immune defences had been weakened by antibiotic therapies. What the antibiotic
revolution had resulted in was therefore not the eradication of infectious disease, but
rather ''dramatic shifts in the nature [of] life threatening microbial infection'' (Rogers
1959, p. 683). Like many of his contemporaries, Rogers, when looking for a larger
framework to relate his observations to, employed views put forward by René Jules
Dubos. In his widely read book The Mirage of Health, Dubos advanced a radical
critique of the technological optimism that lay behind dreams of disease eradication


based on the application of antibiotics.15 ‘‘The belief that disease can be conquered
through the use of drugs is comparable to the naı̈ve cowboy philosophy that
permeates the Wild West thriller’’ (R. Dubos 1959, p. 132), he maintained. He
encouraged doctors to learn from the history of epidemics and from evolutionary
biologists such as Theobald Smith, who he thought deserved a lot more attention in
medicine than he had received. He encouraged his readers to look at examples of
failed disease control via eradication: ‘‘The difficulties that may follow antibacterial
therapy are in fact similar in essence to those encountered in any attempt to control
predators in nature'' (Dubos 1959, p. 71). Much as the introduction of myxomatosis
viruses into Australian rabbits had only served to select those resistant to such
pathogens, mass antibiotic therapy would result in disease evolution rather than
disappearance. In a 1958 paper entitled ''The Evolution of Infectious Diseases in
History'', and using a metaphor borrowed from the epidemiologist William Farr, Dubos
compared infectious diseases to weeds and medicine to gardening. Doctors were ''like
gardeners whose work never ends, students of disease must always be on the
lookout for new problems of infection'' (Dubos 1958, p. 450). For hospitals, this
meant that while they had been intended as hygienic environments, mass antibiotic
therapy had turned them into hothouses of disease evolution, where uncommon and
dangerous infections thrived on a uniquely vulnerable population. Rogers concluded
with an explicit reference to Dubos:
‘‘The administration of antibiotics appears to prevent colonization by certain
bacteria and, in essence, selects the potentially infecting agent; it does not
protect certain susceptible hospitalized patients from acquiring microbial
infection. … The present observations reinforce the thesis recently elaborated
by Dubos. … The nature of underlying disease appears in part to determine the
risk of acquiring an intra-hospital infection. Antimicrobial drugs appear to
determine the particular microbes that can establish fatal disease under such
circumstances'' (Rogers 1959, pp. 682–683).
Rogers presented himself as a faithful follower of Dubos. Yet, it is important to see
that his emphasis was quite original. Dubos focussed on large-scale and long-term
phenomena such as recurrent influenza pandemics or the decline of tuberculosis
mortality over a century, i.e. classics of the field of epidemiology. Dubos and other
experts in disease ecology such as Frank Macfarlane Burnet identified antibiotics as
a factor in such histories. The latter did so by stating that such medicines were
‘‘merely new factors introduced into the environment within which the microor-
ganisms of infection must struggle to survive’’ (Macfarlane Burnet 1953 (1940),
p. 261). Reading Dubos, Rogers studied a different set of phenomena, which were
both short-term and highly localized. What this indicates is not that he was
changing fields from clinical medicine to epidemiology. In fact, we have no
indication that he ever developed a more extended expertise in that science. Instead,
he drew inspiration from the popularization of disease ecology that he found in
Dubos’ writings. The main reason for this presumably was that for him, much like

15 R. Dubos (1959); cf. Moberg (1999).

for other clinicians we have discussed, infectious disease was beginning to be seen
as something that evolved under their eyes.

3 Tact, coupled with expert knowledge

It was one thing to envision the modern hospital as an endangered institution, and to
identify antibiotic therapies as a major factor causing this. Finding a way to deal
with the situation was an altogether different matter. Despite David Rogers’s use of
the term ‘‘risk’’ in the closing lines of his paper, there was little indication in the
early 1960s that hospital hygiene was being framed as an ensemble of quantifiable
risk factors in the way it is today. Instead, we find attempts to regain control of the
hospital in a comprehensive fashion. We can approach these through a clinical
experiment, undertaken at the Hammersmith Hospital, London, in 1960 by Mary
Barber, a leading expert in clinical microbiology and a pioneer of antibiotic
policies.16 The question she asked was whether a development in which hospital
environments had favoured the selection of resistant strains could be reversed
(Barber et al. 1960). What Barber and her team attempted was to adapt classical
hospital hygiene to a novel challenge. Isolation, cleanliness and disinfection, which
had already been recommended by Colebrook, were combined with restraint in the
use of antibiotics whenever possible. For instance, the latter was achieved by
omitting the rather popular prophylactic use of antibiotics ahead of surgery
whenever permissible, and by establishing a clear indication based on
microbiological testing before employing them in treatment. Simultaneously, attempts were
made to rigorously control cross-infection with resistant bacteria by isolating
patients in whom such bacteria had been found.17 When it came to antibiotic
therapy, the team tried to avoid classical antibiotics in widespread use like
penicillin, instead relying on combination therapies employing several more recent
preparations. While the authors were aware that ‘‘the introduction of universal
double chemotherapy is a potentially dangerous policy’’ (Barber et al. 1960, p. 16),
since it potentially created more resistance, it is important to see the difference
between this awareness and more recent calls for the rational use of antibiotics. A
2017 perspective is informed by an overwhelming awareness of the ‘‘empty
pipeline''—after all, the end of antibiotics has by now been proclaimed for decades
(Podolsky et al. 2015, pp. 140–188; Gradmann 2016, pp. 156–158). In 1960,
however, that did not even exist as a possibility, so instead Barber and her team
found it necessary to answer the question, ‘‘Provided new antibiotics continue to
appear, does drug-resistance matter?’’ (Barber et al. 1960, p. 16). Their answer was
yes, since the same mechanism that selected for resistance would in all likelihood also
select for more virulent strains, thereby adding to clinical complications.
The principles for preventing hospital cross-infection enunciated long before
the introduction of antibiotics remain the same to-day, but the ubiquity of the
staphylococcus makes their strict application difficult in relation to this
microbe. (Barber et al. 1960, p. 11)

16 On Barber see Tansey (2000, p. 5). According to Graham Ayliffe (ibid., p. 36), her work at
Hammersmith Hospital resulted in the first antibiotic hospital policy in the UK.
17 Health personnel had been ruled out as carriers of resistant strains (Barber et al. 1960, p. 16).
With its appeal to traditional hygienic practices, such as cleanliness, isolation or the
washing of hands, Barber framed the arrival of antibiotic resistance in hospitals as a
novel challenge that could nevertheless be met by reinforcing traditional values and
practices. Such values and practices were claimed to have been neglected after the
arrival of antibiotics. While resistant Staphylococcus, the prime focus of the paper,
is clearly framed through the epidemiology of clinical wards, the text is completely
void of risk factor epidemiology. In fact, the word risk itself is not even used.
Instead the readers are presented with an argument about overall prevalence rates
and their reduction through the introduction or enforcement of hygienic practices,
such as the control of cross-infection. Such practices are explicitly presented as
being traditional, serving to frame the crisis of hospital hygiene as a moral
challenge rather than a technological one.
In a situation where the crisis of antibiotic resistance in hospitals could be, and
indeed was, met with pharmacological innovation from the side of interested
industries, this was nonetheless a minority opinion. Instead of simply relying on
methicillin and other second-generation antibiotics, therapeutic rationalists like
Barber argued for a more targeted use of such antibiotics.18 In relation to hospitals,
this meant most of all restraining their application, and the reinforcement of
hygienic practices. During the 1960s, hospital hygiene was gradually gaining
importance and building a corpus of knowledge, all in response to the rising
infectious challenges of those years (Ayliffe and English 2003,
pp. 186–199). With the 1950s epidemic of resistant staph (phage 80/81)
miraculously declining, gram-negative bacteria that attacked patients with
weakened immune systems moved to centre stage. Part and parcel of this rational
approach was to increasingly base hospital hygiene on epidemiological studies of
entire hospitals. In relation to antibiotic use and its consequences, large and
influential epidemiological studies included those produced by Maxwell Finland at the
Boston City Hospital, in addition to a large multi-hospital study in West Midlands
hospitals in the UK (Ayliffe and English 2003, pp. 186–200).
To sum up: against the rising tide of pharmacological innovation, hospital
infection specialists were attempting to bring principles of public health hygiene,
such as population-based approaches and restraint in the use of medicines,
to bear in hospitals.19 Besides the growth of epidemiological knowledge, a new
professional position, the infection control nurse, which was proposed in the early
1960s, stands for an approach to hospital infections that relied as much on public-
health inspired epidemiology as it did on a moral appeal to comprehensive hygiene.
For its inventors in the UK, this nurse, responsible for enforcing hygiene and
surveilling infection, was as much a technical as a moral project. With ''tact,
coupled with expert knowledge of aseptic techniques'' (Gardner et al. 1962) as the
requisite combination of skills, the nurse was intended to complement the medical
18 Podolsky (2010); on second-generation antibiotics: Bud (2007, pp. 116–139), Greenwood (2008,
pp. 119–136).
19 See Dixon (2011). The authors name the CDC as a starting point for the US.

microbiologists installed in many hospitals as a result of the 1950s Staph 80/81
epidemic.20 The nurse's task was both to inform and to take action: ''to keep the
responsible members of staff informed of the incidence of sepsis, to advise
preventive measures, and to check their efficacy’’ (Gardner et al. 1962). This was
based on the experience that hospital epidemiologists, while intended as a cavalry of
infection control, often found themselves engaged in rear-guard battles:
So complex are the problems of infections control that, in our experience, the
unassisted control of infection officer can at best attempt to achieve a more or
less complete retrospective record of seriously ill infected patients, while his
preventive measures tend to be based on his knowledge of published reports
rather than on the requirements of his hospital (Gardner et al. 1962).
Empowering nurses in clinical wards was about both skills and authority, as a
practitioner of hospital hygiene remembered about the days when infection control
nurses were introduced in the US in the late 1950s:
Hand washing and use of masks, gowns and gloves were enforced by nurses,
who were given authority to enforce the rules developed by a committee. […]
This unit became a focus of education for all members of the professional staff
of the hospital and exerted a beneficial influence on clinical practice
throughout the hospital. (Wise et al. 1989, pp. 1011/1012)

4 Infectious risks

The 1960s were clearly a decade of moves towards establishing hospital
hygiene as a discipline in its own right. In those days, the focus of that discipline
would usually be on individual hospitals, where a hospital hygienist or
epidemiologist, aided by a nurse, would attempt to stem the tide of resistant bacteria and
infections by non-classical microbes.21 As has been mentioned, there were now
more comprehensive investigations into the infectious disease epidemiology of
hospitals, in addition to the first textbooks of that new discipline.22 When it comes
to antibiotics, there was one obvious general feature: viewing the hospital as a place
characterized by a very peculiar epidemiology that favoured accelerated disease
evolution, in which a restrained and rational use of antibiotics would help to check
the dangerous consequences of their mass application. A 1968 paper, criticizing
antibiotic prophylaxis ahead of surgery, makes a typical statement:
As yet, prophylactic antibiotics have not been shown conclusively to reduce
the frequency of postoperative infections. Despite this, they are still widely
used, and recently the semisynthetic penicillins and cephalotin have been

20 Hillier (2006); according to Ayliffe and English (2003, p. 193), Colebrook proposed the position of an
infection control nurse in 1955. Wise et al. (1989) report its invention for the US in the late 1950s.
21 For a graphic description, see Wise et al. (1989).
22 One in the UK, one in the US; see Eickhoff (1991) for references and a sketch of the situation around
1970.

employed for this purpose with increasing frequency. The danger that this
practice will favor and select organisms resistant to these valuable antibiotics
is real (Thoburn et al. 1968, p. 9).
There is also some indication that, after having served to inspire clinicians, disease
ecologists began to see hospitals as environments worthy of their attention. The
1972 edition of Macfarlane Burnet's Natural History of Infectious Disease contained
a new chapter on ‘‘Hospital infections and iatrogenic disease’’ (Burnet and White
1972, pp. 186–192). It pictures hospitals as environments, ‘‘where ecology has been
ousted by technology’’, as places where measures directed against infectious disease
favoured its evolution. Burnet named the stock examples of resistant
Staphylococcus, of opportunistic infections in immunocompromised patients, of commensals
turning into pathogens in clinical wards and of the increasing presence of fungal
infections.
Thus, the 1970s should have been the breakthrough decade of hospital hygiene.
Instead, it became a decade of crisis where earlier comprehensive approaches were
dropped in favour of an emphasis on risk-factor epidemiology. In the context of this
paper, this is interesting because risk factors, while not directly addressing the
evolving biology of infection, are still tools to project a change likely to occur in the
future.
In fact, the crisis of hospital hygiene and the rise of risk factor analysis were
intimately connected. When Theodore Eickhoff, a hospital hygiene specialist from
the CDC, reviewed the course the field had taken, he concluded that
''The 1970s have not been an enormous success [for hospital hygiene]'' (Eickhoff
1991). The reasons lay not in a lack of challenges, but in the approaches by which that
decade had aimed at their control, such as hospital-wide surveillance and comprehensive
hygienic practices like the washing of walls. Putting these into
practice had been costly, though arguments for the efficacy of such
measures turned out to be hard to come by. As the American Hospital Association stated in
1974: ‘‘The occurrence of nosocomial infection has not been related to levels of
microbial contamination [of hospital environments]. Routine environmental
microbiologic sampling programs done with no specific epidemiologic goal in mind are
unnecessary and economically unjustifiable.’’23 Robert Haley retrospectively
commented on the 1970s that much of what was done in surveillance was quickly
revealed as expensive and scientifically dubious:
A hospital’s crude overall nosocomial infection rate was considered to be too
time-consuming to collect because of the need to do continuous, comprehensive
surveillance, unlikely to be accurate, and thus misleading to interpret, and
unusable for inter hospital comparisons because of the lack of a suitable risk
index of infection of all types (Haley et al. 1985).
Haley's comment also pointed in the direction of what solutions could be envisioned
in a situation in which hygienic problems in hospitals persisted, but where funding
for hospital hygiene was scarce—the 1970s were, after all, also witnessing the first

23 Quoted in Maki et al. (1982).

major economic crisis after World War II in both Europe and the US. The obvious
answer was to depart from holistic approaches, and instead address single relevant
phenomena for which risk factors could be calculated. As Jonathan Freeman and
John McGowan stated in a 1978 paper, up to that point there had been no
statistical investigation into the question of which patients were more
likely to suffer from hospital infections and who should therefore receive specific
attention from hospital hygiene.
No formal numerical study has been made of the host risk factors for
nosocomial infection, nor of the effect of the
differential distribution of the factors determining susceptibility to infection in
the patient populations that have been surveyed, although the importance of
such factors is universally recognised (Freeman and McGowan 1978, p. 811).
If we follow this view, what hospital hygiene's study of nosocomial infections mostly
needed was risk factor analysis, a statistical tool developed in the 1950s.24
With a timely eye on cost effectiveness, Freeman and McGowan proposed that
prospective risk was what one should focus on: ‘‘… risk factors that are discernible
prior to the onset of nosocomial infection are potentially the most useful in the
identification of patients at high risk for whom increased attempts at prevention of
infection might be indicated'' (Freeman and McGowan 1978, p. 816). If we follow
the authors, the problem was that the level of skills in the professional community
was not up to the challenge. Without the proper tools of risk-factor analysis,
differences in the prevalence rates between hospitals were taken at face value,
instead of asking whether they reflected, for example, differing compositions
of patient populations, medical interventions or hygienic standards.
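The kind of calculation that risk factor analysis rests on can be sketched in a few lines. The exposure chosen here (an indwelling catheter) and all figures are hypothetical, not drawn from Freeman and McGowan or any study cited in this paper; the sketch only illustrates the underlying arithmetic of comparing infection rates between patients with and without a suspected risk factor.

```python
# Hypothetical illustration of the basic arithmetic behind risk factor
# analysis: a 2x2 comparison of an exposure (e.g. an indwelling catheter)
# against an outcome (nosocomial infection). All figures are invented.

def relative_risk(exposed_cases, exposed_total,
                  unexposed_cases, unexposed_total):
    """Ratio of the infection rate among exposed patients to the rate
    among unexposed patients."""
    risk_exposed = exposed_cases / exposed_total
    risk_unexposed = unexposed_cases / unexposed_total
    return risk_exposed / risk_unexposed

# Say 30 of 200 catheterized patients acquired an infection,
# against 10 of 400 patients without a catheter:
rr = relative_risk(30, 200, 10, 400)
print(f"relative risk: {rr:.1f}")  # 0.15 vs. 0.025 -> 6.0
```

A relative risk well above 1 flags the exposed group as the kind of "patients at high risk" on whom, in Freeman and McGowan's argument, preventive effort should be concentrated; conversely, comparing hospitals' crude overall rates without any such stratification is precisely the face-value reading criticized above.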
In a 1973 paper, two authors complained that studies of the infective risks posed by
total parenteral nutrition, by then an increasingly widespread practice in intensive
care units, were completely lacking. Relevant suspected
factors such as the nutritive fluids or catheters used, the therapies employed and of
course the patients themselves had not been studied with respect to their relative
contribution, particularly in relation to fungal septicaemia, a typical complication in
such interventions (Goldmann and Maki 1973).
Caught between a shortage of funds and the unproven efficacy of many of their
previous recommendations, hospital hygienists increasingly turned to risk factor
analysis towards the end of the 1970s. In an
early study of that type, the investigators classified patients by the severity of the
underlying disease upon admission (Britt et al. 1978). Comparing this to the rates of
hospital infections some time down the road helped to statistically identify patients
who were more likely to acquire hospital infections, and would therefore be the
most appropriate targets of protective measures.25
24 On risk and risk factor analysis: Aronowitz (2015), Rothstein (2003), Schlich and Tröhler (2006),
Timmermann (2012).
25 This first and fairly simple system was followed by increasingly sophisticated classifications of
patients that used pathophysiological markers as risk factors. In parallel, textbooks from the period could
contain chapters that attempted to educate hospital hygienists in risk factor epidemiology (Wenzel 1987,
pp. 581–584).

5 From ecology to emergency

In the 1980s, the discussions we have been following reached an interesting point.
What had been perceived as a wave of infectious challenges in hospital medicine
was increasingly framed in a novel way, namely as an assemblage of many
calculable risk factors. For example, such factors could help to classify patients in
abstract terms, identifying those who were likely to develop hospital infections in the
future, and were therefore attractive targets for interventions. Of course, being a
statistical tool, risk factor analysis does not address disease evolution on a biological
level. Yet, by making predictions about future events, it suits an epidemiological
gaze at clinical wards that focuses on arising infectious challenges. Beyond the
technicalities involved, which are not the core matters of this paper, it is important
to take a look at some of the wider implications: Where earlier approaches had
appealed to traditions, the risk factor approach had given up on the notion of the
hospital as a healthy environment. Instead, clinical wards were envisioned as
inherently dangerous places in which an efficient hospital hygiene would keep
such threats at an acceptable level, rather than aiming at their full control.
Epidemiological studies of hospital environments had found a new form, addressing
them through the many single risk factors that characterized such environments. Risk factor
analysis was connected in important ways to the emergence of evidence-based
medicine. Among many things, evidence-based medicine was a path upon which
epidemiology travelled from public health to clinical medicine. Hospital hygiene,
being strongly inspired by public health, was part of that movement. ''I wanted to do
public health, but I wanted to do it in a medical school'',26 remembered David
Sackett, one of the spokespersons for evidence-based medicine. As Scott Podolsky in
particular has shown, critical studies of antibiotic use were massively inspired by
public health and evidence-based medicine.27 In 1987, Richard Wenzel, author of
one of the first textbooks in the field, wrote, ‘‘The hospital in which immune-
compromised patients find themselves for prolonged periods of time must be viewed
as a hostile environment and as yet another important risk factor for infection.
Routine hospital procedures represent important infectious risks’’ (Wenzel 1987,
p. 561).
While risk factors provided tools to frame clinical wards as dangerous
environments, discussions about the biology of unusual infections were beginning
to take on a decidedly gloomy note from about 1970. The most important reason for
this was that views about microbial evolution developed in a way that seemed to
matter for clinical medicine. Genetic analysis of bacteria, adapted to medical
purposes from nascent gene technology, pointed to phenomena that could account
for rapid disease evolution. The most important example of this is the study of what
was called episomes, or plasmids in molecular biology, from around 1960 (Brock
1990, pp. 106–108; Grote 2008). These denoted snippets of genetic information that
could be traded between different species, and which could thus account for a swift

26 Quoted in Daly (2005, p. 55); see also Timmermans and Berg (2003).
27 Scott H. Podolsky (2010) argues this using the example of why the randomized controlled clinical
trial, a core tool of evidence-based medicine, was embraced by critical antibiotics researchers.

acquisition of genetic traits, by helping to avoid the slow path of mutation, selection
and hereditary transmission between generations. If such snippets coded for
antibiotic resistance, this property could now be propagated swiftly between
bacterial species. The phenomenon therefore explained the appearance of drug
resistance in bacterial species that had never been targeted by antibiotic therapies.
Starting from E.S. Anderson's work on resistance transfer in Salmonella
typhimurium—a widespread phenomenon in animal husbandry—the threat that antibacterial
resistance now posed seemed to increase massively (Bud 2007, pp. 176–178). It
was this context that caused the New England Journal of Medicine to introduce a
new notion into the debate in 1966, namely that of a future without antibiotics: ''It
appears that unless drastic measures are taken very soon, physicians may find
themselves back in the preantibiotic Middle Ages in the treatment of infectious
diseases’’ (Anon 1966). A popular textbook summarizing work of that type was
Stanley Falkow’s Infectious Multiple Drug Resistance from 1975. It seemed as if
drug resistance itself had become infectious. As Falkow explained, this discovery
resonated with the experiences of a whole generation of ‘‘students, clinicians and
research scientists who have virtually been forced to become familiarized with
infectious multiple drug resistance’’ (Falkow 1975, preface).
Understanding horizontal gene transfer opened up perspectives into not just the
spread of resistance but into a generally faster evolution of infectious diseases. By
the 1970s, the necessary genomic tools for study were in place (Wright 1994,
pp. 71–74). Over the two decades that followed, it also became clear that other traits
of bacteria, such as virulence or the ability to produce toxins, could be spread
through horizontal transfer. As summarized by Stanley Falkow, disease evolution
‘‘can occur in big genetic jumps, rather than through slow, adaptive evolution; HIV
is the case in point’’ (Falkow 2006, p. 113), thereby indicating the larger
significance of such observations. David Greenwood retrospectively summarized
the effect of the discovery of horizontal gene transfer on antibiotics research as
setting ‘‘the cat amongst the pigeons’’ (Greenwood 2008, p. 406)—opening
uncomfortable insights into a quick evolution of the biology of infectious disease,
and stifling optimism regarding the future of antibiotics: ‘‘Suddenly, the ingenuity of
microbes, to which we microbiologists were fond of paying lip service, seemed to
have been grossly underestimated and people began to wonder if the astonishing
progress that had been made in antibacterial therapy was as secure as it had been
assumed’’ (Greenwood 2008, p. 407).
While all this did not turn hospital hygienists into evolutionary biologists, we can
name three consequences for our analysis. One is that the notion of hospital
environments as hothouses of disease evolution was reinforced. Where classical
medical bacteriology could envision infection without much interest in its evolution,
late twentieth century infection medicine, inspired by clinical epidemiology and—
albeit more remotely—evolutionary biology, came to grapple with essential traits of
such pathologies that were under quick development. The history of drug sensitivity
testing is a good example of this. What used to be laboratory experiments suited to
studying bacteria below the species level in the interwar years became the basis of
tests for the detection of resistant strains in clinical settings from the 1950s. Even
so, the interest that antibiotics researchers developed in bacterial genetics seems to
have been quite limited. After all, the disc test served to identify resistant
strains that had been selected by the application of antibiotics. The study of the
mutations that might have produced such strains could be left to basic science
(Gradmann 2013, pp. 560/561). Secondly, hospital environments were characterized by frequent and
surprising entanglements of biology and technology. With the rising importance of
intensive care units, such entanglements became a defining feature of hospital
medicine. Thirdly, the classical distinction between pathogens and saprophytes, or
good guys and bad guys, as the bacteriologist and historian of infection, Zinsser
(1935), called them, was becoming blurred. Immunocompromised patients would
fall ill from microbes that would not attack healthy individuals, or diagnostic
technology would bring microbes to places they had been unable to reach before,
where they would turn pathogenic. In other words, pathogenicity was moving from
being a matter of causes to being a matter of circumstance—the fearful bacterial
invaders that had dominated medical thinking since the late nineteenth century gave way
to pathogens that were better understood as tenants gone rogue.28
In his memoirs, Richard Wenzel recalls an example from 1978, in which much of
what has been discussed so far comes together. The critical care unit of the University
of Virginia's hospital experienced an epidemic of an otherwise inconspicuous bacterium,
Serratia marcescens, which suddenly caused a series of violent bloodstream
infections, to which 10 of 17 affected patients succumbed.29 The origin of the
infection remained a mystery for quite some time, not least since, as Wenzel put it,
‘‘there was not a full appreciation that any organism can cause disease in the right
host, especially if it enters the bloodstream of patients’’ (Wenzel 2005, p. 70).
Searching for a pathogen was, if we follow Wenzel, not in itself sufficient to
understand what had happened. The answer was instead found in a combination of
factors that are typical for modern hospital infections: the presence of patients with
weakened immunities in a large critical care unit and the use of advanced diagnostic
technology that can be integrated into a patient’s bloodstream. Such a piece of
equipment, a so-called transducer, turned out to be an inroad for infection in the
given case. The lesson to be drawn was that it was advisable to see a hospital
environment as a place characterized by the presence of unforeseen and sudden
events. Novel pathologies would often result from previous attempts to tackle
similar challenges. In the case of the Serratia infection, ‘‘a ‘solution’ led to an
unexpected problem’’ (Wenzel 2005, p. 78). As summed up by Wenzel, it was
‘‘routine human behaviour in the ICU that had led to an unexpected epidemic. What
many had expected to be a sure-fire cure, in fact a triumph of technology, actually
caused the problem’’ (Wenzel 2005, p. 74).
In the larger picture, Wenzel’s story illustrates an important feature of the history
of infectious diseases in the late twentieth century. Technological optimism in

28 Casadevall and Pirofski (2003) discussed this from an immunological perspective. In reply, Méthot
and Alizon (2014) proposed an ecological explanation, in which virulence, customarily regarded as a
trait of a microbe, is seen as context-dependent. Looking at the debates this paper follows, the
question of diverging rates of hospital infection was resolved by focusing on circumstance rather than
cause: the rates of infection that differed between hospitals became comparable when risk was calculated
per device and intervention (Hospital Infections Programme 1991).
29 Serratia became a classic in hospital infections: Hejazi and Falkiner (1997).

medical microbiology, which had been reinforced by the arrival of antibiotics, gave
way to notions in which entanglements of biology and technology are an important
source of novel pathologies. When William Schaffner reviewed the impact of
technology on priority setting in infection control for an international conference in
1984, he opened his text with a sceptical look at medical technology ‘‘creating
unanticipated ecologic niches which enable microorganisms to produce nosocomial
infections’’ (Schaffner 1984, p. 1). At the top of his list were urinary tract infections,
for which the use of catheters had created inroads. The arrival of HIV/AIDS in
public consciousness in the 1980s followed on from stories like the ones that have
just been told and, given the hygienic challenges that caring for AIDS patients
posed, harboured similar lessons.
The world in which hospital physicians and their patients were beginning to live
around 1980 may well be characterized by a neologism of those years, namely that
of emerging infections. Technically speaking, the term was related to the
evolutionary biology of infectious disease, which was intensely researched in those
days. Even so, it had initially been used in a broader sense, denoting the
(re)emergence of epidemic infectious disease at large.30
relation to the control of such conditions, in addition to highlighting infectious
challenges as phenomena that were continuously evolving. The years around 1990
became what one author called ‘‘the golden age of genetics and the dark age of
infectious disease'' (Tibayrenc 2001), a time when unravelling the genetic
mechanisms of disease evolution facilitated uncomfortable insights into the
ubiquity of microbes and the swiftness of their evolution, rather than providing
perspectives for their control. At a 1991 conference on ''Hospital Infections—
Towards the Year 2000'', a speaker proposed seeing abrupt changes in antibiotic
resistance as a defining feature of the times that lay ahead (McGowan 1991). It was
the (re)creation of a ‘‘world on alert’’ (Weir and Mykhalovskiy 2010) in relation to
infectious threats—and hospitals were environments in which such developments
were most distinctly felt. Michel Tibayrenc, the editor of the newly founded journal
Infection, Genetics and Evolution, felt that within the time frame of his own career,
a sea-change in the relations of medicine and infectious disease had occurred:
When I became an MD 26 years ago, the specialty of infectious diseases was
often the choice of medical students who wanted to avoid problems and to
practice easy medicine with few failures. The WHO took pride in its role in
eradicating smallpox. Most infectious diseases could be efficiently controlled
by antibiotics and vaccination. But this is no longer true (Tibayrenc 2001,
p. 1).
Of course, interest in the evolutionary biology of infectious disease was not novel in
itself. Still, it became more widespread and it changed in tone. If we look beyond
hospital hygiene, and compare popularizations of such knowledge from the 1950s to
the 1990s, we realize that its tone went from ecology to emergency. Looking at the
30 As introductions: King (2004), Snowden (2008); see also Ewald (1994). It seems that the origins of
the term lie in epidemiology rather than genetics, in this case denoting an increase in the incidence of
human infections. For a contemporary discussion see Grmek (1993); for a recent historical analysis,
Méthot and Fantini (2014).

future that lay in store for mankind, the hopes of disease control were but a fading
memory for Michel Tibayrenc. Instead, he warned of a return of infectious disease
in a grandiose fashion: ‘‘Nobody can rule out the possibility of a historical
epidemics, such as the Justinian plague, and it is a false dream to believe that
preventive measures are already available’’ (Tibayrenc 2001, p. 1).
In Joshua Lederberg’s famous report on emerging infections from 1992, the
perspective on the history of diseases was quite different from Dubos’ earlier
framing.31 Whereas the latter had envisioned a possible co-existence of men and
microbes along ecological principles, Lederberg argued that mankind was
threatened with extermination. He put it like this in 1997: ‘‘If we were to rely
strictly on biologic selection to respond to the selective factors of infectious disease,
the population would fluctuate from billions down to perhaps millions before slowly
rising again’’ (Lederberg 1997, p. 419). An important factor in this siren song was
that it combined attention to ‘‘a host of apparently ‘new’ infectious diseases… that
affect more and more people every year’’ (Lederberg et al. 1992, p. 26), with the
gloomy notion of an antibiotics development pipeline run dry. On another occasion,
Lederberg reminded his readers of the ‘‘dangerous legacy of ‘miracle drugs’’’ that
had resulted in ‘‘an extraordinary complacency on part of the broader culture’’
(Lederberg 1988, p. 346) when it came to infectious threats. It was about time that
mankind realized that it was in a ‘‘well-chronicled race’’ (Lederberg 1988, p. 352)
between human technology and bacterial mutation. A popular text of those days,
Stuart Levy’s The Antibiotic Paradox from 1992, warned its readers that ‘‘a time
will come when antibiotics as a mode of therapy will only be a fact of historic
interest’’ (Levy 1992, p. 183). The notion of a post-antibiotic age now became
popular, and it has remained with us ever since. And indeed, the scarcity of novel
antibiotics coincided with an environment in which antibiotic therapy itself had
become a major risk factor in treating infectious disease, as a 1984 paper in the
Journal of Hospital Infection concluded (Chavigny and Fischer 1984, pp. 60–61).

6 Conclusions

We might expect the story we have been following to have at least one happy
ending, namely for hospital hygienists themselves, who might have been expected
to grow into a powerful discipline, ruling over surgical theatres, intensive care units
and their inhabitants. However, the reverse seems to be the case. When Richard
Dixon reviewed recent developments at the decennial conference of his field in
1991, his tone was gloomy. In relation to antimicrobial use and resistance, the
‘‘boat had been missed’’, as he put it, mostly because attempts had been made to
stop an unstoppable process, namely the spread of resistance. Worse, the previous
decade had seen little progress when it came to establishing infection control
methods of proven efficacy (Dixon 1991). Hospital hygiene, as its practitioners
commonly lament, came under strong cost-cutting pressure and seems to have
remained under it ever since. As a more recent handbook concludes, ‘‘Infection control is a low

31 Lederberg et al. (1992); see Morens and Fauci (2012).

priority whenever health programs are subject to severe budgetary constraints’’
(Wenzel et al. 2002, p. 1). In a lecture on the cost-effectiveness of infection control
in German hospitals published in 1989, the author drew some lessons for the future
of his field, concluding ‘‘that nosocomial infection control is expensive, but cost-
effective’’ and adding, probably to the relief of hospital administrators, that there
was still potential for increased efficiency to be exploited, since ‘‘many infection
control procedures could be provided at a lower price but with the same effect’’
(Daschner 1989, p. 336). Closer to the focus of this paper, the frustration of hospital
hygienists, seen historically, represents the outcome of the field’s embrace of risk
factor analysis and, as a result, of evidence-based medicine. The problem was not so
much the dubious efficacy of the comprehensive hygiene that had been advocated
earlier. Arguably, the increasing focus on efficiency was even more detrimental: it
constrained the field to the management of hospital infections at acceptable costs,
rather than to their full control. Only when there was too much infection did
hospital hygiene seem necessary. Instead of being the cavalry in the fight against
infectious disease, hospital hygiene ended up as the fire brigade.32
We can relate the fate of hospital hygiene as a discipline to the subject of this
text, hospital infections and the role they played in studies of the evolution of
infectious disease. From the 1950s to the 1990s, we see a rising interest in disease
evolution that seemed to explain more and more relevant phenomena in clinical
medicine. In the 1950s, a piece of vintage biology such as Theobald Smith’s
epidemiological work was propelled into clinical medicine by its popularizer
René Dubos.33 Much of what had been considered a success in the control of
infection was now revealed to be biologically naı̈ve and, as Dubos contended, stuck
in nineteenth century thinking about the conquest of infectious disease. A decade
later, plasmids acquired significance, indicating that rapid disease evolution was
taking place on clinical wards. Finally, around 1990, infection medicine was
confronted with the re-emergence of infections of various types. In parallel, the
notion of the hospital changed: from being envisioned as a lighthouse of cleanliness
and hygiene, it moved to being an environment more dangerous than the
surrounding world; what used to be seen as an institution for the control of
infectious disease came to be perceived as a place that also facilitated the evolution
of such diseases. Once hospitals had become intrinsically dangerous places, risk
factor analysis came to dominate modern hospital hygiene, aiming at the
management of such dangers rather than at their control.
It is important to take notice of the peculiarities of the story we have studied. It is
not one about clinicians who became self-educated evolutionary biologists. Instead,

32 This can be interpreted in different directions. Though outside the scope of this paper, it seems to
represent the outcome of a struggle concerning social and professional hierarchies in hospitals—between,
for instance, surgeons and microbiologists, or doctors and nurses. Of the initial subscribers (in 1980) to
the journal Infection Control and Hospital Epidemiology, 90% seem to have been non-physicians
(Wenzel 2009).
33 A search in PubMed (30.1.2017), the main bibliographic database for medicine, delivers hardly any
hits (25) for the index term ‘‘Theobald Smith’’, most of which are obituaries or scholarly contributions by
historians. A search for René Jules Dubos, again as an index term, delivers 225 returns, including quite a
few from the period under study here.

by picking up some evolutionary biology, experimental genetics and a good deal of
statistics, they taught themselves to see hospitals through an epidemiologist’s
glasses and study infections as phenomena that evolve quickly over time. While
inspired by basic science, such attention was nonetheless very different, mostly
through its focus on defined localities and short-term developments. When
Lederberg wrote in 1988 on ‘‘Pandemic as a Natural Evolutionary Phenomenon’’, he
told the story of plasmids against a background of long-term disease evolution to
which it added new facets. Lederberg’s story unfolds in the localities studied by
epidemiologists: the field and the laboratory. Hospitals are not mentioned. When he
writes about ‘‘infection as a mortal race’’ (Lederberg 1988, p. 349) between a
microbe and a human host or its immune system, patients in whom that race occurs
are also not mentioned. When talking about technology’s impact on disease
evolution, Lederberg resorts to examples from classical ecology such as plant
monocultures, pandemics and the like. Intensive care units were not on his radar.
The significance of the studies on hospital hygiene that we have followed lies in
showing an entirely different arena of thinking about disease evolution. Hospital
hygiene was inspired by basic science, but cannot simply be reduced to it. We can
sum it up by setting it against broader changes in thinking about infectious
diseases as they occurred during the twentieth century. Understanding infections has
gone from a world view dominated by experience to one that privileges—albeit
gloomy—expectation: For early twentieth-century physicians, conditions like
syphilis, tuberculosis or typhoid could be framed as traditional challenges. As in
the case of cholera, they could travel; they could also, as syphilis had done in the
fifteenth century, suddenly appear but mostly they had been around for a long time.
A certain degree of variability in the characteristics of infections was
conceded. Knowing the history of epidemics provided insight into the panorama of
infectious disease that late nineteenth century medical bacteriology acted upon.
However, the notion of an evolving biology of infectious disease in those days was a
matter for specialized biology. In line with a Darwinian concept of evolution, and
against the backdrop of a slowly evolving natural history of infections, it was man-
made technology that brought about historical change.
In the late 1950s, René Dubos could accuse his contemporaries of indulging in ‘‘a
mode of thinking which often appears naı̈ve in the light of modern biology’’ (Dubos
1959, p. 61). Arguably, the study of hospital infections did quite a bit to amend this,
and to familiarize clinicians with phenomena from evolutionary biology that they
had previously considered unworldly. As a result, temporality has become an
essential feature of understanding infections. At the end of the twentieth century,
medical microbiology attempted to develop tools to identify and monitor ongoing
and rapid changes in the biology of infectious disease. As Richard Wenzel put it,
clinical microbiology has become a matter of ‘‘stalking microbes’’ (Wenzel 2005),
of knowing the present and anticipating future developments rather than responding
to long-established challenges. Late nineteenth century medical microbiology set
out to control the natural history of infectious disease. Yet that history in the
twentieth century came to be dominated by the rapidly evolving biology of
infections rather than by the means of their control. The history of hospital hygiene and infection
control, we realize, is but a small chapter in the larger history of the demise of

technological optimism. The natural history of infectious disease, formerly a
specialty of epidemiology and evolutionary biology, has arrived at the bedside.

Acknowledgements This paper is a much revised version of a presentation given at the conference
‘‘Making Microbes Complex: Parasites, Epidemics and the Origins of Disease Ecology’’, held at QMUL,
July 7–8, 2016. I wish to thank the conference organizers Mark Honigsbaum (QMUL) and Pierre-Olivier Méthot
(Université Laval, Québec) for inviting me. Both of them also gave valuable comments on the paper in its
various stages of completion. Mathias Grote (HU Berlin) supplied a much appreciated critical reading of
the manuscript.

References
Amsterdamska, O. (1987). Medical and biological constraints: Early research on variation in bacteriology.
Social Studies of Science, 17, 657–687.
Anon. (1966). Infectious drug resistance. New England Journal of Medicine, 275, 277.
Aronowitz, R. A. (2015). Risky medicine: Our quest to cure fear and uncertainty. Chicago: Chicago
University Press.
Ayliffe, G. A. J., & English, M. P. (2003). Hospital Infection from miasmas to MRSA. Cambridge:
Cambridge University Press.
Barber, M., Dutton, A. A. C., Beard, M. A., Elmes, P. C., & Williams, R. (1960). Reversal of antibiotic
resistance in hospital staphylococcal infection. British Medical Journal, 5165, 11–17.
Bouton, C. (2016). The critical theory of history: Rethinking the philosophy of history of Koselleck’s
work. History and Theory, 55(2), 163–184.
Brachman, P. S. (1981). Nosocomial infection control: An overview. Reviews of Infectious Diseases, 3(4),
640–648.
Britt, M., Schleupner, C. J., & Matsumiya, S. (1978). Severity of underlying disease as a predictor in
nosocomial infection: Utility in the control of nosocomial infection. Journal of the American
Medical Association, 239, 1047–1051.
Brock, T. D. (1990). The emergence of bacterial genetics. Cold Spring Harbor: Cold Spring Harbor
Laboratory Press.
Bud, R. (1993). The uses of life. A history of biotechnology. Cambridge: Cambridge University Press.
Bud, R. (2007). Penicillin: Triumph and tragedy. Oxford: Oxford University Press.
Burnet, F. M., & White, D. O. (1972). Natural history of infectious disease (4th ed.). London: Cambridge
University Press.
Bynum, W. F. (1994). Science and the practice of medicine in the nineteenth century. Cambridge:
Cambridge University Press.
Casadevall, A., & Pirofski, L.-A. (2003). The damage-response framework of microbial pathogenesis.
Nature Reviews/Microbiology, 1, 17–24.
Chavigny, K. H., & Fischer, J. (1984). Competing risk factors associated with nosocomial infection in
two university hospitals. Journal of Hospital Infection, 5(suppl A), 57–62.
Colebrook, L. (1955). Infection acquired in hospital. The Lancet, 266, 885–891.
Colebrook, L., & Kenny, M. (1936). Treatment of human puerperal infections, and of experimental
infections in mice, with prontosil. The Lancet, 227(5884), 1279–1281.
Condrau, F., & Kirk, R. (2011). Negotiating hospital infections: The debate between ecological balance
and eradication strategies in British hospitals, 1947–1969. Dynamis, 31, 385–405.
Daly, J. (2005). Evidence-based medicine and the search for a science of clinical care. New York:
Milbank Memorial Fund.
Daschner, F. (1989). Cost-effectiveness in hospital infection control—Lessons for the 1990s. Journal of
Hospital Infection, 13(4), 325–336.
Dixon, R. E. (1991). Historical perspective: The landmark conference in 1980. The American Journal of
Medicine, 91(suppl 3b), 6–7.
Dixon, R. E. (2011). Control of health-care-associated infections, 1961–2011. Morbidity and Mortality
Weekly Report, 60(suppl), 58–63.
Dubos, R. J. (1958). The evolution of infectious diseases in the course of history. Canadian Medical
Association Journal, 79(6), 445–451.

Dubos, R. (1959). Mirage of health: Utopias, progress, and biological change. London: Allen & Unwin.
Eickhoff, T. C. (1991). Historical perspective: The landmark conference in 1970. The American Journal
of Medicine, 91(suppl 3b), 3–5.
Ewald, P. W. (1994). Evolution of infectious disease. Oxford: Oxford University Press.
Falkow, S. (1975). Infectious multiple drug resistance. London: Pion.
Falkow, S. (2006). The ecology of pathogenesis. In Forum on Microbial Threats, Board on Global Health
(Ed.), Ending the war metaphor: The changing agenda for unraveling the host-microbe relationship.
Workshop summary (pp. 102–115). Washington, DC: National Academies Press.
Freeman, J., & McGowan, J. (1978). Risk factors for nosocomial infection. The Journal of Infectious
Diseases, 138(6), 811–819.
Gardner, A. M. N., Stamp, M., Bowgen, J. A., & Moore, B. (1962). The infection control sister: A new
member of the control of infection team in general hospitals. The Lancet, 280(7258), 710–711.
Goldmann, D. A., & Maki, D. G. (1973). Infection control in total parenteral nutrition. Journal of the
American Medical Association, 223(12), 1360–1364.
Gradmann, C. (2009). Laboratory disease: Robert Koch’s medical bacteriology. Baltimore: Johns
Hopkins University Press.
Gradmann, C. (2013). Sensitive matters: The World Health Organisation and antibiotics resistance
testing, 1945–1975. Social History of Medicine, 26, 555–574.
Gradmann, C. (2016). Re-inventing infectious disease: Antibiotic resistance and drug development at the
Bayer company 1945–1980. Medical History, 59, 155–180.
Gradmann, C. (2017). Medical bacteriology: Microbes and disease, 1870–2000. In M. Jackson (Ed.), The
Routledge history of disease (pp. 378–401). London: Routledge.
Greenwood, D. (2008). Antimicrobial drugs. Chronicle of a twentieth century triumph. Oxford: Oxford
University Press.
Grmek, M. D. (1993). Le concept de maladie émergente. History and Philosophy of the Life Sciences,
15(3), 281–296.
Grote, M. (2008). Hybridizing bacteria, crossing methods, cross-checking arguments: The transition from
episomes to plasmids (1961–1969). History and Philosophy of the Life Sciences, 30, 407–430.
Haley, R. W., Culver, D. H., White, J. W., Morgan, W. M., Emori, T. G., Munn, V. P., et al. (1985). The
efficacy of infection surveillance and control programs in preventing nosocomial infections in US
hospitals. American Journal of Epidemiology, 121(2), 182–205.
Hejazi, A., & Falkiner, F. R. (1997). Serratia marcescens. Journal of Medical Microbiology, 46, 903–912.
Hillier, K. (2006). Babies and bacteria: Phage typing, bacteriologists, and the birth of infection control.
Bulletin of the History of Medicine, 80, 733–761.
Hospital Infections Program, C. (1991). Nosocomial infection rates for interhospital comparison:
Limitations and possible solutions. Infection Control and Hospital Epidemiology, 12(10), 609–621.
Hughes, J. M. (1987). Nosocomial infection control in the United States. Infection Control, 8(11),
450–453.
King, N. B. (2004). The scale politics of emerging diseases. Osiris, 19, 62–76.
Kirby, W. M. M., Douglas, M. D., Corpron, M. D., & Tanner, D. C. (1956). Urinary tract infections
caused by antibiotic-resistant coliform bacteria. The Journal of the American Medical Association,
162(1), 1–4.
Koselleck, R. (1979). ‘Erfahrungsraum’ und ‘Erwartungshorizont’—zwei historische Kategorien. In R.
Koselleck (Ed.), Vergangene Zukunft. Zur Semantik geschichtlicher Zeiten (pp. 349–375). Frankfurt:
Suhrkamp.
Koshland, D. E. (1992). The microbial wars. Science, 257, 1021.
Lederberg, J. (1988). Pandemic as a natural evolutionary phenomenon. Social Research, 55(3), 343–359.
Lederberg, J. (1997). Infectious disease as an evolutionary paradigm. Emerging Infectious Diseases, 3(4),
417–423.
Lederberg, J., Shope, R. E., & Oaks, S. C. (Eds.). (1992). Emerging infections. Microbial threats to health
in the United States. Washington, DC: National Academic Press.
Lesch, J. E. (2007). The first miracle drugs: How the sulfa drugs transformed medicine. Oxford: Oxford
University Press.
Levy, S. B. (1992). The antibiotic paradox. How miracle drugs are destroying the miracle. New York,
London: Plenum Press.
Loudon, I. (1987). Puerperal fever, the streptococcus, and the sulphonamides, 1911–1945. British
Medical Journal, 295, 485–490.
Loudon, I. (2000). The tragedy of childbed fever. Oxford: Oxford University Press.

Lowbury, E. J. L. (1955). Cross-infection of wounds with antibiotic-resistant organisms. British Medical
Journal, 1(4920), 985–990.
Macfarlane Burnet, F. (1953, 1940). The natural history of infectious disease. Cambridge: Cambridge
University Press.
Maki, D. G., Alvarado, C. J., Hassemer, C. A., & Zilz, M. A. (1982). Relation of the inanimate hospital
environment to endemic hospital infection. New England Journal of Medicine, 307, 1562–1566.
Marks, H. M. (1997). The progress of experiment. Science and therapeutic reform in the United States,
1900–1990. Cambridge: Cambridge University Press.
Mazumdar, P. M. H. (1995). Species and specificity: An interpretation of the history of immunology.
Cambridge: Cambridge University Press.
McDermott, W. (1956). The problem of staphylococcal infections. Annals of the New York Academy of
Sciences, 65, 58–66.
McGowan, J. E. (1991). Abrupt changes in antibiotic resistance. Journal of Hospital Infection, 18,
202–210.
McKenna, M. (2010). Superbug: The fatal menace of MRSA. New York: Free Press.
Mendelsohn, A. (2002). ‘Like all that lives’: Biology, medicine and bacteria in the age of Pasteur and
Koch. History and Philosophy of the Life Sciences, 24, 3–36.
Méthot, P.-O. (2016). Bacterial transformation and the origins of epidemics in the Interwar period: The
epidemiological significance of Fred Griffith’s ‘transforming experiment’. Journal of the History of
Biology, 49, 1–48.
Méthot, P.-O., & Alizon, S. (2014). What is a pathogen? Towards a process view of host–parasite
interaction. Virulence, 5(8), 775–785.
Méthot, P.-O., & Fantini, B. (2014). Medicine and ecology: Historical and critical perspectives on the
concept of ‘‘Emerging Disease’’. International Archive of the History of Science, 64, 213–230.
Moberg, C. L. (1999). René Dubos: A harbinger of microbial resistance to antibiotics. Perspectives in
Biology and Medicine, 42, 559–580.
Morens, D. M., & Fauci, A. S. (2012). Emerging infectious diseases in 2012: 20 years after the Institute
of Medicine report. mBio, 3(6), 1–4.
Podolsky, S. H. (2010). Antibiotics and the social history of the controlled clinical trial. Journal for the
History of Medicine and Allied Sciences, 65, 327–367.
Podolsky, S. H. (2015). The antibiotic era: Reform, resistance, and the pursuit of rational therapeutics.
Baltimore: JHUP.
Podolsky, S. H., Bud, R., Gradmann, C., Hobaek, B., Kirchhelle, C., Mitvedt, T., et al. (2015). History
teaches us that confronting antibiotic resistance requires stronger global collective action.
Journal of Law, Medicine & Ethics, 43(3), 27–32.
Rogers, D. E. (1959). The changing pattern of life-threatening microbial disease. The New England
Journal of Medicine, 261(14), 677–683.
Rogers, D. E., 68, leading medical educator, dies. (1994, 12.6.). The New York Times.
Rothstein, W. G. (2003). Public health and the risk factor: A history of an uneven medical revolution.
Rochester, NY: University of Rochester Press.
Schaffner, W. (1984). Priorities in infection control: The impact of new technology. Journal of Hospital
Infection, 5(Supplement A), 1–5.
Schlich, T. (2012). Asepsis and bacteriology: A realignment of surgery and laboratory science. Medical
History, 56(3), 308–334.
Schlich, T. (2013). Negotiating technologies in surgery: The controversy about surgical gloves in the
1890s. Bulletin of the History of Medicine, 87(2), 170–197.
Schlich, T., & Tröhler, U. (Eds.). (2006). The risks of medical innovation: risk perception and assessment
in historical context. London, NY: Routledge.
Snowden, F. M. (2008). Emerging and reemerging diseases: A historical perspective. Immunological
Reviews, 225(1), 9–26.
Spink, W. W. (1955). Clinical problems relating to the management of infections with antibiotics. Journal
of the American Medical Association, 152(7), 585–590.
Suter, P., & Vincent, J. L. (Eds.). (1993). Milestones in hospital infections. Thorpe, Egham: Medical
Action Communication Ltd.
Tansey, E. M. (Ed.). (2000). Post penicillin antibiotics: From acceptance to resistance. A witness
seminar, held at the Wellcome Institute for the History of Medicine, London, 12 May 1998
(Wellcome witnesses to twentieth century medicine). London.

123
From lighthouse to hothouse Page 25 of 25 8

Thoburn, R., Fekety, R. F., Leighton, E. C., & Melvin, V. B. (1968). Infections acquired by hospitalized
patients. Archives of Internal Medicine, 121, 1–10.
Tibayrenc, M. (2001). The golden age of genetics and the dark age of infectious diseases. Infection,
Genetics and Evolution, 1, 1–2.
Timmermann, C. (2012). Appropriating risk factors: The reception of an American approach to chronic
disease in the two German States, 1950–1990. Social History of Medicine, 25(1), 157–174.
Timmermanns, S., & Berg, M. (2003). The gold standard: The challenge of evidence-based medicine and
the standardization in health care. Philadelphia: Temple University Press.
Wall, R. (2013). Bacteria in Britain, 1880–1939. London: Pickering & Chatto.
Washer, P., & Joffe, H. (2006). The ‘hospital superbug’: Social representations of MRSA. Social Science
and Medicine, 63, 2141–2152.
Weindling, P. (1992). Scientific elites and laboratory organisation in fin de siècle Paris and Berlin: The
Pasteur Institute and Robert Koch’s Institute for Infectious Diseases compared. In A. Cunningham &
P. Williams (Eds.), The laboratory revolution in medicine (pp. 170–188). Cambridge: Cambridge
University Press.
Weir, L., & Mykhalovskiy, E. (2010). Global public health vigilance: Creating a world on alert.
Hoboken: Taylor & Francis.
Wenzel, R. P. (1987). Prevention and control of nosocomial infections. Baltimore: Williams & Wilkins.
Wenzel, R. P. (2005). Stalking microbes. A relentless pursuit of infection control. Bloomington, IN:
AuthorHouse.
Wenzel, R. P. (2009). Thirty years later. Infection Control and Hospital Epidemiology, 31(1), 1–3.
Wenzel, R. P., Brewer, T., & Butzler, J. P. (2002). Infection control in the hospital. Boston: International
Society for Infectious Diseases.
Wise, R. I., Ossman, E. A., & Littlefield, D. R. (1989). Personal reflections on nosocomial staphylococcal
infections and the development of hospital surveillance. Reviews of Infectious Diseases, 11(6),
1005–1019.
Wright, S. (1994). Molecular politics: Developing American and British regulatory policy for genetic
engineering, 1972–1982. Chicago: University of Chicago Press.
Zinsser, H. (1935). Rats, lice and history: Being a study in biography, which, after 12 preliminary
chapters indispensable for the preparation of the lay reader, deals with the life history of typhus
fever. London: Routledge.
