Milgram's Obedience To Authority: Its Origins, Controversies, and Replications
Harry Perlstadt
Michigan State University
ABSTRACT
Milgram's study of obedience to authority has been the center of a debate over research
ethics in the social and behavioral sciences since it was first published fifty years ago.
Most learn about the study and its ethical problems through required tutorials for the
responsible conduct of research. But these tutorials do not describe the origins of the
study and the subsequent controversy surrounding it. This paper presents a detailed
historical account of both. Written from a social behavioral science rather than a
philosophical perspective, the paper will shed a different light on both Milgram and his
chief critic Baumrind. For example, Milgram attempted to follow the somewhat
ambiguous 1953 APA ethical standards and Baumrind realized that revising research
ethics was a normative process. Replications include a 1966 study that involved hospital
nurses following doctors' orders to administer a drug and a 2009 truncated replication that
was carried out with IRB approval. A real-world incident took place at a McDonald's in
2004 in which a supervisor followed the orders of someone claiming to be a policeman
investigating a stolen purse. Future research should explore how this type of behavior
occurs and how a person can act altruistically to prevent bullying and hazing.
Contact:
Harry Perlstadt
Department of Sociology
Michigan State University
East Lansing, MI 48824 USA
Email perlstad@msu.edu
H Perlstadt Obedience to Authority: Origins Controversies Theoretical & Applied Ethics 2013, 2(2), 53-77
Introduction
Milgram's obedience to authority study warrants reconsideration from a historical and ethical perspective. Stanley Milgram (1963) first published
his findings in the Journal of Abnormal and Social Psychology, reporting that 65 percent of
naive participants were persuaded to administer increasingly painful electrical shocks up to and
including a lethal dose to a person after that person gave incorrect answers in a memory learning
study. The next year the American Psychologist, the official journal of the American
Psychological Association, published Baumrind's (1964) critique, followed by Milgram's
(1964a) rebuttal. This ignited a controversy over research methods and ethics, specifically the
deception of participants (Elms, 1982; Clarke, 1999). Many institutional review boards (IRBs)
require researchers and students to take tutorials on human research protections that include case studies such as Milgram's.
The tutorials summarize a study in two or three paragraphs and then identify one or two
key ethical criteria which the study failed to meet. This is a form of casuistry or argument by
cases to identify an ethical, moral, or legal problem and then judge it against similar cases.
Herrera (2001) has expressed misgivings about the way IRBs perceive and possibly apply the
Milgram experiment and has called for a more in-depth analysis of issues raised beyond
deception, given that empirical research on Milgram's experiment found no significant harmful
effects of deception (Fisher & Fryberg, 1994; Korn, 1997). But as Lindemann Nelson (2004)
pointed out, this narrative approach to ethics often fails to raise more complex issues that should
be examined, and while the summary may be true, it may not be full or accurate. Chambers
(1999) went further, arguing that moral narratives be considered as fictions, like Aesop's Fables,
which are told in such a way as to persuade the audience to take a specific position or learn a
preferred lesson.
Moral entrepreneurs take on the task of promoting a particular moral position and then enforcing it. Becker (1963) argued that some moral
entrepreneurs crusade for a rule that would right what they perceive to be a social evil, but once
the new rules are in place, other moral entrepreneurs selectively enforce some or all of the rules
and identify outsiders and deviants. This suggests that moral entrepreneurs, including those who campaigned to reform research ethics, both create rules and selectively enforce them.
Sumner (1906) held that proper ways of acting start as folkways and some evolve into
customs. With the addition of a philosophy of welfare specifying right and wrong behaviors for
both the individual and society, customs are transformed into social mores. Over time, social
mores are often codified and some are enacted into law. Milgram and Baumrind were
protagonists sparking a transition within psychology and behavioral science from customary
research practices to research codes of ethics. To some degree, examining the origins and
controversies surrounding the Milgram experiment in a social and historical context implies
an ethical relativism in which values and standards change over time. In this paper I attempt to present a
more comprehensive account or narrative concerning the controversy and enable us to better
understand Milgram’s actions and impact on social scientific research ethics. I will leave the
The study began in the fall of 1960 as a project in Milgram's “Psychology of Small Groups”
class at Yale (Blass, 2004, pp. 67ff). He wanted them to have some hands-on experience in a
social psychology laboratory with a one-way mirror that allowed others to observe the
interactions. The class developed several possible experiments, and the students voted to do what
would become the pilot for the obedience to authority study. Twenty Yale undergraduates
served as subjects for the class experiment, which was conducted in five different sessions in late
November and early December 1960. The class experiment revealed that the student participants
readily accepted the reality of the situation. They believed they were giving extremely painful
electric shocks to another person, but they refused to receive a sample shock. The student
participants would obey commands, even when these went against accepted standards of
behavior. They exhibited signs of nervousness and anxiety―one pulled on his hair, another
wiped his sweating palms and shook his head in dismay as he administered the shock. Milgram
observed that during the class experiment the student participants were often reluctant to look at
the Learner and turned their heads to avoid seeing the painful consequences of their actions.
Milgram used the class experiment as the basis for an application to the National Science
Foundation (NSF) that he submitted in early 1961. Milgram knew from the class project that
participants experienced considerable tension and stress, and he therefore included a special section in the proposal on protecting participants' well-being. This was a decade before the
Department of Health, Education, and Welfare issued guidelines in 1971 that were codified into
Federal Regulations 45 CFR 46 in 1974. In order to ensure the well-being of the research
participant before leaving the laboratory, Milgram planned to make every effort to set the
participant at ease by debriefing them after the experiment (Blass, 2004, pp. 70–71).
The NSF grant proposal included a description of the pilot research and its findings,
accompanied by photographs depicting some of the participants' behavior (Blass, 2004, p. 69).
Out of concern for the research participants, the NSF decided to make a site visit, which was not
always a part of the grant review process (Blass, 2004, p. 71). In April 1961 a site visit team of
three, consisting of Henry Riecken (head of the NSF Office of Social Sciences and co-author of
When Prophecy Fails (Festinger, Riecken & Schachter, 1956)), social psychologist Richard
Christie, co-editor of Studies in the Scope and Method of the Authoritarian Personality (Christie
& Jahoda, 1954), and sociologist James Coleman, co-author of Union Democracy (Lipset, Trow
& Coleman, 1956), met with Milgram and others at Yale. Milgram was able to convince the
committee that he would adequately debrief the participants and make sure they were
comfortable with the decisions they had made and the actions they had taken during the
experiment. Yale's general counsel assured the committee that Yale would be legally responsible
for any negative effects the participants might suffer. In essence, the proposal underwent the
equivalent of a human subjects review and was approved by the university's legal counsel.
Although later vilified, Milgram was ahead of his time, preceding both the Declaration
of Helsinki in 1964, which called for a clearly formulated review of proposed biomedical
research by an independent research ethics committee, and the 1966 statement by US surgeon
general William Stewart, which required prior review by institutional associates to ensure an
independent determination of the protection of the rights and welfare of the participants in order
to receive a Public Health Service (PHS) grant. In his statement, Stewart specifically included the
behavioral and social sciences, although no mention was made of Milgram's experiment,
possibly because it was funded by the NSF and not the PHS.
The NSF review committee commented on the scientific content of the proposal, writing
that it was clear that Milgram neither had, nor planned to have, an elaborate a priori theory. This
lack of theory would form the basis for the Journal of Personality to reject the manuscript. Its
editor, Edward E. Jones, dismissed the whole study as "a kind of triumph of social engineering"
(Blass, 2004, p. 114). It appears that Milgram was more interested in documenting what happens
and how often it happens rather than using theory and previous studies to explain why it happens.
For Milgram, "obedience was an obvious and observable social fact that is a direct, ubiquitous
and indispensable feature of social life" (1963, p. 372). It did not require a lot of theory or an
extensive literature review. In a few sentences Milgram quickly cites a series of relevant studies
by, among others, Weber, Binet, and Adorno, Frenkel-Brunswik, Levinson, and Sanford without
bothering to comment on them. For example, Weber's (1947) legal-rational authority rests in a
particular office or role that allows an individual to give certain orders and expect a high
probability of having them carried out. Weber's assumption that obedience is probabilistic could
lecture on the learning of discipline. Binet and Henri (1894) briefly showed 150 French
schoolchildren of different ages a straight line on a blackboard. In the first or memory condition
they covered the blackboard and presented a second blackboard containing that line mixed in
with others of different lengths and asked the child to identify the line from the first blackboard.
In the second or direct comparison condition, the first blackboard was left uncovered and the
child could look at both when making his decision. When the child believed he had found the
designated line, the experimenter asked in a soft and quiet tone, without raising his voice,
without making gestures, and without insisting, "Are you really sure? Is it not the next line?"
This simple suggestion induced most grammar school and middle grade children (89 percent and 80 percent) and more than half (54 percent) of superior grade children to change
their answer in the memory condition, while in the direct comparison condition about three-
fourths of the elementary and middle grade children (74 percent and 73 percent) but just less
than half (48 percent) of the upper grade children changed theirs. In what appears to be a
straightforward adaptation from the Binet and Henri study, Milgram told his subjects "to please
continue," and "the experiment requires you to continue," when they wanted to quit.
Finally, Milgram cites The Authoritarian Personality (Adorno et al., 1950) but does not
discuss several key concepts and findings that relate to his experiments. The study, sponsored by
the American Jewish Committee to explore fascism and anti-Semitism, found that authoritarians
disliked weakness of any kind and tended to believe in strict discipline. This was thought to stem from a harsh, punitive upbringing.
Milgram assumed the reader knew the link between The Authoritarian Personality study and the
Holocaust, and the conclusion that authoritarian personalities submit to higher authority while dominating those below them. Milgram eventually ran twenty-four experimental conditions
(Perry, 2013, pp. 304–310). Most involved the proximity between the teacher (naive subject) and
Learner (a research confederate), the relationship between the teacher and the experimenter
(authority figure), or group dynamics with a naive teacher/subject and one or more teacher confederates. In each condition a confederate played
the role of Learner (Milgram, 1974). Unlike the class project, the participants in the study were
recruited from the general adult population via ads in local newspapers that read, "We Will Pay
You $4.00 for One Hour of Your Time. Persons Needed for a Study of Memory" (Slater, 2004).
The original set of participants was recruited from the New Haven area with the
experiments taking place on the Yale University campus. Subjects were between the ages of
twenty and fifty, and high school and college students were not recruited. Later, a second set of
participants was recruited from the Bridgeport area with the experiment taking place in a
rundown office building in the downtown shopping district rather than on the Yale campus.
The mechanism for administering the shocks had thirty levels or settings ranging from 15
volts marked slight shock to 450 volts marked XXX. Milgram devised a set of four "prods" that
the experimenter gave to participants who asked whether they should continue to administer
shocks: (1) "please continue," (2) "the experiment requires you to continue," (3) "it is absolutely
essential that you continue," and (4) "you have no other choice, you must go on" (Milgram, 1974,
p. 21). These prods were made in sequence, and if the participant refused to obey after prod 4,
the experiment was halted. Each session ended with a debriefing that all participants received. They were told that the Learner had not, in fact, received dangerous
electric shocks, and all had a friendly reconciliation with the unharmed Learner. Milgram held
extended discussions with the participants, but the content of the debriefings, or dehoaxing as he
termed it, differed according to the participant's behavior. Obedient participants were assured
that their behavior was entirely normal and that their feelings of conflict or tension were shared
by other participants. In his debriefings with defiant participants who refused to continue to administer shocks, Milgram supported their decision to disobey the experimenter.
Harris (1988, pp. 195–197) and Perry (2013, p. 78) noted that Milgram was following the
APA 1953 Ethical Standards of Psychologists. Harris wrote that after the Baumrind/Milgram
exchange, psychologists formed sides on whether deception was to be avoided or could be used
if it did no lasting harm to subjects. In order to understand the controversy that arose, it is
important to see exactly what the APA said about ethical standards in research:
Principle 4.31-1. Only when a problem is significant and can be investigated in no other
way is the psychologist justified in exposing research subjects to emotional stress. He
must seriously consider the possible harmful after-effects and should be prepared to
remove them as soon as permitted by the design of the experiment. Where the danger of
serious after-effects exists, research should be conducted only when the subjects or their
responsible agents are fully informed of this possibility and volunteer nevertheless.
In Harris's view, the 1953 APA standards did not recommend the routine revealing of
deception to subjects. Harris (1988, p. 195) pointed out that in 1963 Milgram was one of the few
who described his efforts to mitigate possible harmful after-effects in his section on “Interview
and Dehoaxing,” and that Milgram (1964b) was the first to use the term debrief in a published
psychology paper when he replaced dehoaxing with debriefing. Previously, debriefing was a military term for questioning personnel after a mission.
Perry concluded from her examination of Milgram’s research files at Yale and interviews
with former participants and Milgram’s students nearly fifty years later that he did not explicitly
inform them during the debriefing that no shocks were ever administered. She suspected that
Milgram did not end the deception during the debriefings because he did not want potential
subjects to hear about the real purpose of the research until all trials were completed. Subjects
first learned the whole truth when they received a written report and a ten-item follow-up questionnaire in July 1963.
Milgram had submitted his manuscript reporting Experiment 1 in December 1961 to the Journal of Abnormal and Social Psychology. It was
summarily rejected, but in July 1962 the journal changed its mind and accepted the paper for
publication. The article, "Behavioral Study of Obedience," would appear fifteen months later, in
October 1963. Within days of publication (see Blass, 2004, p. 21), the study was reported in the
New York Times (26 October 1963) and newspapers around the world, culminating in a scathing
editorial in the St. Louis Post-Dispatch (2 November 1963). The editorial chastised both
Milgram and Yale University, included a vivid description of the suffering among participants,
and concluded that the experiment was nothing but "open-eyed torture" (Blass, 2004, p. 212). On
16 November 1963 the Post-Dispatch published Milgram's rebuttal letter, and he was on the
Milgram's (1963) initial publication reported only on Experiment 1, involving the remote
condition in which the Learner was in an adjacent room and the subjects received no feedback
from the Learner until the Learner pounded on the wall at 300 volts. At that point five
teacher/subjects refused to administer further shocks. After that the Learner did not respond to
the questions or make any other attempt to communicate with the teacher/subject. Overall,
twenty-six (65.0 percent) of the participants in Experiment 1 obeyed the orders to the end and
administered all thirty shocks, including the one marked "XXX" at 450 volts.
Milgram published other articles on the experiments (1964b, 1964c, 1965a, 1965b), and
produced a film Obedience (1965c). The first comprehensive account of eighteen of the twenty-
four experimental conditions was finally published ten years later in 1974 as Obedience to
Authority: An Experimental View. In addition to the remote condition, Milgram brought the
Learner psychologically and physically closer to the teacher. In Experiment 2, the voice
feedback condition, the Learner was in the adjacent room but the teacher heard prerecorded
complaints that increased as the voltage increased. Under this condition, 62.5 percent
administered all thirty shocks, compared to the 65 percent in the remote condition. In Experiment
3, the proximity condition, the Learner was seated a few feet away from the teacher, who could
both hear and observe the Learner's reactions to the shocks. Now only 40.0 percent administered
all thirty shocks. Finally, in Experiment 4, the touch-proximity condition, the Learner sat near
the teacher and would only receive a shock when his hand rested on the shock plate. Beginning
at the 150-volt level (the tenth shock), the experimenter would order the teacher to physically
force the Learner's hand onto the plate. In this condition almost a third (30.0 percent) administered all thirty shocks.
Milgram carried out additional trials to discover under what conditions subjects would defy the experimenter and reduce the shocks. When the
experimenter gave his orders by telephone rather than face to face, several participants
administered lower shocks than were required and either never informed the experimenter or
specifically assured the experimenter that they were raising the shock level when in reality they
repeatedly used the lowest shock on the board. In Experiment 11, participants were free to
administer any shock level for any wrong answer. All but two participants administered shocks
at or below level 10 at 120 volts. One participant went to level 25 at 375 volts, and one
administered the maximum level 30 at 450 volts. Milgram argued that this condition
demonstrated that what leads to shocking the Learner at the highest level is not the result of
autonomously generated aggression but rather the transformation of behavior that comes about through obedience to authority.
In July 1963, six weeks after the end of the experiment, Milgram sent 856 participants a
detailed five-page report on the experiment, including its procedures, its rationale, some of the
main findings, and the causes of the tension they may have experienced. He also sent a ten-item
multiple-choice questionnaire asking them to reflect back on their experiences (Milgram, 1974,
p. 195; Blass, 2004, pp. 124–126). With two follow-ups he achieved a 92 percent return rate for
the mailed survey. The survey revealed that 83.7 percent endorsed the statement that they were
very glad or glad to have been in the experiments, while only 1.3 percent were sorry or very
sorry. Overall, about 64 percent of 653 participants reported not being bothered at all, 30 percent
reported being bothered a little, and 7 percent reported being bothered quite a bit.
Milgram (1974, p. 172) found one significant difference between participants who were
defiant and those who had complied and continued to administer shocks. A little less than half
(47.9 percent) of the obedient participants claimed that during the experiment they fully believed
the Learner was getting painful shocks, compared with almost two-thirds (62.5 percent) of the
defiant participants, who may have more honestly acknowledged their concerns about inflicting pain on the Learner.
Baumrind's Critique
In their tutorial module on History & Ethics, Bankert, Cohen, Cooper, Davis, and Hicks (2009)
identified the key ethical problems in the Milgram experiment as the deception of subjects and
the unanticipated psychological harms resulting from their participation. Subjects could have
suffered harm during the actual conduct of the experiment and/or experienced short- or long-
term harm as a result of their participation. A Harvard (2002) tutorial mentioned that some
thought the participants had been harmed, if not through the stress of the experiment itself, then
through the "inflicted insight" into their own personalities. The IRB Member Handbook (Amdur,
2003) similarly claimed that many of the participants experienced extreme psychological distress
after understanding the level of cruelty of their actions, and the CITI tutorial (Bankert et al., 2009)
stated that after being "debriefed" some of the subjects experienced serious emotional crises.
Finally, Cave and Holm (2003) asserted that a harm is a harm and that any harm, however
temporary, still needs to be justified. In addition, they pointed out that Milgram's participants
These statements directly or indirectly reflect the arguments of Baumrind (1964) in her
critique titled "Some Thoughts on Ethics of Research: After Reading Milgram's 'Behavioral
Study of Obedience.'" From her opening paragraphs it is clear that Baumrind, a clinical and
developmental psychologist, had a strong moral compass and was ahead of the curve on research ethics. Published the same month that the
World Medical Assembly approved the Declaration of Helsinki, which established ethical
principles for biomedical research for the international medical profession, Baumrind (1964, p.
421) wrote that in psychological research the experimenter is often required to balance his or her
career and scientific interests against the interests of his or her prospective subjects. The
experimenter is to do the least possible harm to the subjects, but if the experimental conditions
cause the subjects pain or loss of dignity or offer the subject nothing of value, then the
experimenter must consider the reasons why the subject volunteered, including the possibility that the subject placed undue trust in the experimenter.
Here Baumrind was expanding on the 1953 APA ethical standards, which only called for subjects or their responsible agents to be fully informed when serious after-effects were possible.
She is concerned with the motivation for participating in such experiments including coercion of
students in a class. Except for the pilot study, none of Milgram’s subjects were high school or
college students.
Two years later, Baumrind (1966) would publish "Effects of Authoritative Parental
Control on Child Behavior." She had observed thirty-two white middle-class preschoolers and
identified three parenting styles based on measures of parental responsiveness (fulfilling needs)
and demandingness (behavioral control). Permissive parents were high on responsiveness and
low on demandingness. Permissive parenting was based on Freudian theory and popularized by
Benjamin Spock (1946) in The Common Sense Book of Baby and Child Care. Authoritative
parents, whom she favored, were high on both responsiveness and demandingness. They attempt
to direct the child's activity, but in a rational manner, sharing the reasons for rules and soliciting
the child's objections. Authoritarian parents, by contrast, were low on responsiveness and high on
demandingness. Most communication was from parent to child, with low child-to-parent
communication. Children raised by authoritarian parents were obedient, conformist, and norm
abiding. Authoritarian parenting was linked to the behaviorist school of J. B. Watson (1928),
who dedicated his book Psychological Care of Infant and Child to "the first mother who brings
up a happy child." Watson had previously conducted the infamous "Little Albert" experiment, in
which he conditioned an eleven-month-old boy to fear a white laboratory rat the boy had been
playing with, by repeatedly making a loud banging sound whenever the rat was presented to him.
Baumrind argued that Milgram's subjects were put through an emotionally disturbing
experience, that the experiment could effect an alteration in their self-image or ability to trust
adult authorities in the future, and that his subjects were entrapped by a trusted individual into
committing acts that they would consider unworthy (Baumrind, 1964, p. 422; Blass, 2004, pp.
123–124). During Experiments 1 through 4, Milgram asked participants during the debriefing
how nervous or tense they felt at the point of maximum tension on a fourteen-point scale. More
than 75 percent of 137 participants reported they felt moderately or extremely tense and nervous
(Milgram, 1974, pp. 41–42). His 1963 follow-up questionnaire asked whether during the
experiment participants were extremely upset, somewhat nervous, relatively calm, or completely
calm. Several months after the experiment, only 10 percent of 658 participants recalled being
Based on Milgram's slides and film, one can readily accept Baumrind's arguments that
during the experiment the participants experienced stress ranging from moderate to high and that
they found themselves in a social situation in which they were encouraged by the experimenter
to commit acts to which they would normally object. But Alan Elms (1972, pp. 150–151), who
assisted Milgram and interviewed a substantial sample of the participants after the conclusion of
the experiment, noted that her comments about ability to trust adults in the future suggested she
viewed the subjects as children or minors, whereas all of Milgram's subjects were competent male adults.
Further, she asserted that their self-image could be altered as a result of participating in
the experiment and that the subject's personal responsibility for his or her actions is not erased
because the experimenter reveals to him or her the means which he used to stimulate these
actions. Milgram (1964a, p. 849) did not cite the APA 1953 standards but claimed that the
debriefing and assessment procedures were carried out as a matter of course and not in response
to any observation of special risk. He also summarized the post-experiment self-report survey,
the validity of which Baumrind (1985) questioned, calling for behavioral evidence to determine whether participants had actually been harmed.
Taking on the role of moral entrepreneur, Baumrind worked behind the scenes
developing and presenting her argument against deception and other ethical issues at meetings,
and, after serving as a resource person to the APA drafting committee, publishing two pieces
criticizing the proposed revisions (Baumrind, 1971, 1972). She believed that the experimenter
should be morally obliged to bring his or her behavior into conformity with clearly stated
principles of conduct, but acknowledged that the diversity of opinions within the APA made it
impossible to issue a definitive statement of normative ethics. She knew of no ethical system that
condones deceit, lie telling, and breaking of contracts, and asserted that deceptive instructions
prevent informed consent. The APA revised its ethical principles in 1973.
Milgram’s Obedience to Authority was published in 1974, the same year that Congress
passed the National Research Act in response to a series of revelations about unethical
biomedical and behavioral research in the US. These revelations culminated in July 1972, when
an Associated Press story reported on the Tuskegee Syphilis Experiment, in which rural African American men
who had previously contracted syphilis were followed for forty years, but were never told they
had syphilis and never received proper treatment. The Act created the National Commission for
the Protection of Human Subjects of Biomedical and Behavioral Research to identify basic
ethical principles and to develop guidelines for the conduct of biomedical and behavioral
research. The Commission asked experts and specialists to submit background papers that would
help it explore and understand the issues before it. Baumrind (1978) wrote one of four papers on
informed consent published in an appendix to the Commission's Belmont Report.
In her background paper she presented her case against the use of deception in social and
behavioral research. She mentioned Milgram several times but at one point descended into invective:
In the practice of their profession, these scientists use deceitful practices openly, publish
their procedures without apology and indeed with prideful exhibition of ingenuity (e.g.,
Milgram, 1963), teach their students to copy their example and reward them when they
do, and vigorously defend their procedures when attacked. (Baumrind, 1978, p. 23-10).
She reinforced her position by citing Ring, Wallston and Corey (1970), who conducted a
modified replication in which a substantial proportion of 57 female psychology college students
were completely obedient in a Milgram-style remote condition. In their
discussion section, Ring et al. (1970) claimed that many of the compliant subjects reported they
might experience difficulty in trusting adult authorities in the future. But Baumrind failed to
mention that none of the obedient subjects who were informed of the deception during a
debriefing resented having been deceived or thought it involved anything unethical or should be
discontinued. Just like Milgram’s subjects, properly debriefed subjects in the Ring et al
experiment were upset by the procedure itself but valued the experiment and their participation
in it. However, one negative finding was that about half of the twenty subjects interviewed said
It was the Belmont Report's (1979) principle of respect for persons and the subsequent
requirements for education of both investigators and IRB members in research ethics that turned
the spotlight on, among other things, informed consent and deception, and made the Milgram
human research protections defines minimal risk as “the probability and magnitude of harm or
discomfort anticipated in the research that are not greater in and of themselves than those
ordinarily encountered in daily life” (DHHS, 2009). In an excerpt from a debriefing interview,
one of Milgram’s male participants who had recently been discharged from the US Army was, in
his own words, “really sweating bullets,” and “hysterically laughing.” He described himself as
“an emotional wreck,” and “a basket case” during the experiment. After he left the lab he
realized that “somebody could get me to do that stuff” (Blass, 2004, pp. 114–115).
In his rebuttal published five months after Baumrind's critique, Milgram argued:
It is true that after a reasonable number of subjects had been exposed to the procedures, it
became evident that some would go to the end of the shock board, and some would
experience stress. That point, it seems to me, is the first legitimate juncture at which one
could even start to wonder whether or not to abandon the study. But momentary
excitement is not the same as harm. As the experiment progressed there was no indication
of injurious effects in the subjects; and as the subjects themselves strongly endorsed the
experiment, the judgment I made was to continue the investigation. (1964a, p. 849)
Milgram's use of the phrase "momentary excitement" appears to understate what might better be
called a deeply distressing experience, as some participants sought to break off the experiment
but were persuaded to continue. Milgram also claimed that "relatively few participants
experienced greater tension than a nail-biting patron at a good Hitchcock thriller" (Blass, 2004,
p. 115). Without mentioning Milgram, Baumrind (1971) noted that investigators are
insufficiently sensitive to subtle ethical and psychological consequences and thus unknowingly
inflict harm. Perhaps Milgram's labeling the stress he observed as “momentary excitement” or
Milgram challenged Baumrind's contention that after the experiment, the subject cannot
justify his behavior and must bear the full brunt of his actions. Without citing The Authoritarian
Personality, he argued that the same mechanism that allowed the subject to obey and continue to
administer the shocks would carry over and allow him to rationalize his actions afterward. The
subject simply obeyed the person in authority. Further, he thought that the people who hear of
the experiment and find the idea of shocking the victim to be repugnant will say "people will not
do it," and that "if they do it, they will not be able to live with themselves afterwards" (Milgram,
1964a, p 850). But he overlooked the possibility that, like his subject who had been recently
discharged from the army, people who learn about the experiment for the first time might realize
presentation by Milgram in which he showed slides of his experiment. It was obvious that
several of the subjects were under high stress, but he also had a slide showing the teacher and
Learner in a friendly pose after the experiment. What astonished me were the highly emotional
displays of moral indignation by several people in the audience during the question-and-answer
session. I interpreted this as an overidentification with the passively obedient subject who
willingly continued to administer shocks. They realized that if they were subjects, they too might
Fred Strodtbeck (1973), who was the lead social psychologist in the 1954 Wichita Jury
Study where real juries were audiotaped without their knowledge, thought that the subjects in
Milgram's remote condition, in which the Learner was in an adjacent room and only objected
once by pounding on the wall, were in a situation similar to that of bomber pilots far removed
from their targets, which could include civilian populations. Strodtbeck believed that while
Milgram's experiments could discover ways in which a person can resist authority on behalf of a
victim, it nevertheless revealed a capacity for evil in the subjects. It is this fear of the evil within
us, which we do not want to confront, that triggers the strong emotional reactions to Milgram's experiments.
Baumrind also mentioned evil. In her critique of the proposed drafts by the APA
committee on ethical standards, she presented her own principles of ethical conduct in the
treatment of subjects. Her second principle is “Although for everything there is a season and a
time for every purpose under heaven, for a given season and time an action can be judged as
evil,” and she continues “…There is a time and place for deceit, inflicting pain, and putting other
people down. But the psychologist's laboratory is not the place, and this is certainly not the
time” (Baumrind, 1971, p. 890). Without ever mentioning Milgram, she moved on to her third
principle: “Scientific ends, however laudable these may be, do not by themselves justify the use
…” (Baumrind, 1971, p. 890).
Or, as philosopher Alasdair MacIntyre (1982, p. 178) put it, “moral harm is inflicted on
someone when some course of action produces in that person a greater propensity to commit
wrong.” He also thought that if one classified a social scientific experiment as involving the
doing of a wrong to someone, then one could argue that this experiment ought not to be done.
laboratory setting, because, first, the subject is prone to behave in an obedient manner and follow
the experimenter's suggestions, and second, the baseline for such behavior is probably higher in
the laboratory than in most other settings. In his rebuttal, Milgram (1964a) thought that the real
task is to learn more about the general problem of destructive behavior and that such inquiry will
stimulate insight and yield knowledge that can be applied to a variety of situations.
Milgram had the experimenter don a gray technician's coat, a visible symbol
of scientific expertise and authority. But in Experiment 7, after giving the initial instructions, the
experimenter left the room and gave his orders by telephone rather than in person. Milgram
found that the subjects were more likely to disobey and administered lower shocks than were called for.
Hofling, Brotzman, Dalrymple, Graves, and Pierce (1966) tested the use of the telephone to give
medical orders. Would nurses obey what appears to be a reasonable request from an authoritative
source that cannot be seen or verified? Using the pseudonym Dr. Smith, the experimenter called
twenty-two nurses on night duty at a nursing station in a psychiatric hospital and ordered them to
administer 20 milligrams of Astroten, a fictional drug that was not on the hospital's approved
drug list. A bottle labeled Astroten had been placed in the drug cabinet, and the label clearly
stated that 10 milligrams was the maximum daily dose. Dr. Smith told the nurses that he would
Not one of the nurses questioned the telephone request, and all but one of the twenty-two
nurses entered the request on the patient's chart and went to the patient's room to administer the
drug, where a senior nurse stopped them. Nurses were expected to follow doctors' orders, and did
admitted they had actually done this, but with a different drug.
A reprehensible real-world incident occurred early one evening in April 2004 when a
female employee about stealing a purse from a customer (abc Primetime, 2005; Wolfson, 2005).
In response, the supervisor asked an employee fitting this description to come back to her office.
The caller then told the supervisor to have the young woman, who happened to be eighteen years
old, empty her pockets and surrender her car keys and cell phone. Both complied. The caller then
demanded that the supervisor have the young woman remove all her clothes to see if she was
hiding anything. When the young woman did so, the supervisor gave her a small apron to cover
herself with.
The supervisor then told the caller that she had to leave the room to check on the
restaurant. The caller, however, demanded that another employee be brought in to watch the
young woman until the police arrived. The supervisor sent in a male employee, who refused to
follow the caller's orders to have the young woman drop her apron and do certain things. He
walked out but did not hang up the phone. The supervisor returned to the room, and the caller
persuaded her to have her fiancé come over to watch the young woman. When the fiancé arrived,
the supervisor left the room. For the next two hours, the fiancé obeyed the caller's commands,
ordering the young woman to drop her apron and do jumping jacks. During this time, the
supervisor came back into the room periodically, but the fiancé had the young woman put the
apron back on and told her to say nothing. The young woman reported that she begged the
Eventually the caller told the fiancé to give the phone back to the supervisor, who was
asked to bring in someone else. An older maintenance man was brought in, but he refused to
comply with the caller's wishes to have the young woman drop the apron. He told the supervisor
that something was not right, and then left. At that point the supervisor finally called the store
manager at home, although the caller claimed the store manager was talking to him on another
line. But the store manager said she had been sleeping. The supervisor finally realized that this
The two male employees who were on the phone refused to comply with the orders given
by the caller, who claimed to be a policeman. The fast-food company's employee manual stated:
But none of the employees that ABC Primetime (2005) spoke with at the McDonald's
recalled ever seeing the warning. Sociologist Ester Reiter (1996), who was a participant
observer working in a fast-food restaurant, found that loyalty and obedience to authority are the
The police investigation eventually uncovered a string of hoax phone calls to at least sixty-
eight fast-food restaurants in thirty-two states over a ten-year period (Wolfson, 2005; Zimbardo,
2007, pp. 278ff). Philip Zimbardo, who had conducted the Stanford Prison study in which
students playing guards and prisoners had become so antagonistic that the experiment was called
off after less than a week, speculated that the caller was "very skilled in human
psychology—he (the caller) may have even read about Milgram" (Wolfson, 2005, p. 7).
Replications
Given the daunting ethical issues surrounding Obedience to Authority and the belief
that no IRB would approve such a study (Miller, 2009), it may be somewhat surprising to learn
that the Milgram experiments have been replicated in various forms since the late 1960s.
Although the replications used somewhat different methods of recruiting participants, and some
used increasingly loud beeps instead of shocks, the level of obedience was, in most cases, as high
as or even higher than what Milgram had originally found for similar experimental conditions.
More recently, after a careful reading of Milgram, psychologist Jerry M. Burger (2009)
discovered that the 150-volt level (the tenth of thirty shocks) was a critical moment in the
experiments (also see Packer, 2008). At that point the Learner vehemently protested and
demanded that the study be ended, and in response nearly every teacher/subject paused and most
indicated either verbally or nonverbally their reluctance to continue. Burger's proposal for a
partial replication of the Milgram experiments was approved by the IRB at Santa Clara, a Jesuit
Catholic university in California. Burger took several steps to ensure that the participants were
treated in a humane and ethical manner. He used a two-step screening process to exclude
individuals who might have a negative reaction to the experience. More than half of the initial
sample of volunteers was dropped from the study, which seems rather high. Miller (2009)
speculated that as a result of the screening, Burger was less likely to observe the higher levels of
emotion or stress that Milgram found as obedience continued past the 150-volt mark.
Participants were told at least three times that they could withdraw from the study at any
time and still receive their $50 for participating. The experimenter running the study was a
clinical psychologist who was instructed to end the study immediately if he saw any signs of
excessive stress. Participants were debriefed and assured of the Learner's well-being within
Overall, Burger found that 70 percent of his subjects continued past the 150-volt level,
compared with Milgram's 82.5 percent of participants under similar conditions. He recognized
the possibility that situational variables could overpower individual differences in this setting.
Specifically, participants who were high in empathic concern expressed a reluctance to continue
the procedure earlier than others, but this reluctance did not result in a greater likelihood of
actually refusing to continue; that is, situational variables can overcome feelings of reluctance in
these conditions.
Conclusion
The purpose of the Milgram experiment and its replications was to contribute scientific
knowledge of the conditions under which people are willing to continue to obey the orders of
an authority figure. Milgram's experiment and its several replications encouraged participants to
perform what they and others see as dehumanizing acts. His findings challenge the belief that
only those from the sadistic fringe of society would shock the Learner at the most severe levels.
Milgram's participants consisted of ordinary people drawn from working, managerial, and
professional classes, and, under certain conditions, more than half administered the highest
shock.
The key ethical problems in the Milgram experiment are the deception of subjects and the
unanticipated psychological harms that resulted from their participation (Bankert et al., 2009).
Deception was an accepted practice in psychological research that increased during the 1950s and 1960s (Nicks,
Korn, & Mainieri, 1997). The controversy over the Milgram experiment stimulated a re-
examination of research ethics, and the American Psychological Association revised its ethical
principles (APA, 1973). Perhaps following Baumrind, Principle 3 stated that if the participant was not given a full
disclosure of “all features of the research that might be expected to influence willingness to
participate,” then the investigator was required “to protect the welfare and dignity of the research
participant.” Principle 4 stated that if deception was used (perhaps borrowing the term from
Milgram), “the investigator is required to ensure the participant’s understanding of the reasons
for this action,” and, perhaps following Baumrind, “to restore the quality of the relationship with
the investigator,” which Milgram claimed he did. Principle 7, apparently addressing the Milgram
experiment, held that “a research procedure could not be used if it is likely to cause serious and
lasting harm to participants.” Finally, Principle 8, perhaps following Milgram’s procedure, stated
that after the data are collected, the researcher is required “to provide the participant with a full
clarification of the nature of the study and to remove any misconceptions that may have arisen.”
The Milgram experiment remains controversial since it is still difficult to separate the
actual content and findings of the study from the evolving guidelines governing informed
consent in social and behavioral science research. Tutorials on the responsible conduct of research still reflect Baumrind’s
position, give a truncated account of the experiment, and present Milgram as a negative exemplar
of research ethics. In these tutorials, Milgram is essentially guilty as charged. However, this is
an ex post facto position, since he operated within the ethical guidelines that existed in the early
1960s. In fact he was one of the first to publish his debriefing procedures and attempted to
document whether or not his subjects experienced harmful after-effects. That some refuse to
accept that he and others (see Fisher & Fryberg, 1994; Korn, 1997) failed to find debilitating
long-term harms may reflect a type of cognitive dissonance between their ethical values and the
empirical findings.
It would seem prudent for us to further understand how this type of behavior occurs, and
more importantly how a person can refuse to cooperate with an authority figure in a group
situation and intervene on behalf of a victim, that is, act altruistically. Given the problems of
bullying by an aggressive individual who, in some cases, seeks to exclude the victim from the
group, and hazing by a group of an individual who wants to join it, future research should focus
on exploring Milgram’s group effect conditions, in which the naive teacher/subject is joined by
two teacher confederates who either defy the experimenter or blindly follow the experimenter’s orders.
References
ABC Primetime. (2005). Original report: Restaurant shift turns into nightmare, available at
http://abcnews.go.com/Primetime/story?id=1297922&page=1.
APA (1953). Ethical standards of psychologists. Washington, DC: American Psychological
Association. Available at
http://babel.hathitrust.org/cgi/pt?id=mdp.39015002186909;view=1up;seq=137
APA (1973). Ethical principles in the conduct of research with human participants. Washington,
DC: American Psychological Association.
Adorno, T. W., Frenkel-Brunswik, E., & Levinson, D. J. (1950). The authoritarian personality.
New York: Harper & Brothers.
Amdur, R. J. (2003). The Institutional Review Board Member Handbook. Boston: Jones &
Bartlett.
Bankert, E., Cohen, J. M., Cooper, J. A., Davis, B. D., & Hicks, L. (2009). History & Ethics
Module. CITI Collaborative Institutional Training Initiative. Available at
http://webcache.googleusercontent.com/search?q=cache:l1sDiHVUMtcJ:www.health.gov
.fj/attachments/download/23/Collaborative%2520institutional%2520training%2520initiat
ive-%2520module%25201.pdf+&hl=en&gl=us.
Baumrind, D. (1964). Some thoughts on ethics of research: After reading Milgram's "Behavioral
Study of Obedience." American Psychologist, 19, 421–423.
Baumrind, D. (1966). Effects of authoritative parental control on child behavior. Child
Development, 37, 887–907.
Baumrind, D. (1971). Principles of ethical conduct in the treatment of subjects: Reaction to the
draft report of the Committee on Ethical Standards in Psychological Research. American
Psychologist, 26, 887–896.
Baumrind, D. (1972). Reactions to the May 1972 draft report of the Ad Hoc Committee on
Ethical Standards in Psychological Research. American Psychologist, 27, 1082–1086.
Baumrind, D. (1978). Nature and definition of informed consent in research involving deception.
In The National Commission for the Protection of Human Subjects of Biomedical and
Behavioral Research, The Belmont Report: Ethical principles and guidelines for the
protection of human subjects of research, Appendix Vol. II, Paper 23 (pp. 23-1–23-71)
(DHEW Publication No. OS 78-0014). Washington, DC.
Baumrind, D. (1985). Research using intentional deception: Ethical issues revisited. American
Psychologist, 40, 164–174.
Becker, H. S. (1963). Outsiders: Studies in the sociology of deviance. New York: Free Press.
Belmont Report. (1979). The Belmont Report: Ethical principles and guidelines for the
protection of human subjects of research. Washington, DC: The National Commission for
the Protection of Human Subjects of Biomedical and Behavioral Research.
Binet, A., & Henri, V. (1894). De la suggestibilité naturelle chez les enfants. Revue
Philosophique de la France et de l'Étranger, 38, 337–347. Translated by S. Nicolas, T.
Collins, Y. Gounden, & H. L. Roediger III (2011). Natural suggestibility in children.
Consciousness and Cognition, 20, 394–398.
Blass, T. (1999). The Milgram paradigm after 35 years: Some things we now know about
obedience to authority. Journal of Applied Social Psychology, 29, 955–978.
Blass, T. (2004). The man who shocked the world: The life and legacy of Stanley Milgram. New
York: Basic Books.
Burger, J. M. (2009). Replicating Milgram: Would people still obey today? American
Psychologist, 64, 1–11.
Cave, E. & Holm, S. (2003). Milgram and Tuskegee―Paradigm research projects in bioethics.
Health Care Analysis, 11, 27–40.
Chambers, T. (1999). The fiction of bioethics: Cases as literary texts (reflective bioethics). New
York: Routledge.
Christie, R., & Jahoda, M. (1954). Studies in the scope and method of “The Authoritarian
Personality”: Continuities in social research. Glencoe, IL: Free Press.
Clarke, S. (1999). Justifying deception in social science research. Journal of Applied Philosophy,
16, 151–166.
DHHS (2009). Public Welfare Protection of Human Subjects, 45 CFR 46. Department of
Health and Human Services. Available at
http://www.hhs.gov/ohrp/humansubjects/guidance/45cfr46.html
Durkheim, E. (1961). On the learning of discipline (J. R. Pitts, Trans.). In T. Parsons, E. Shils, K.
D. Naegele, & J. R. Pitts (Eds.), Theories of society: Foundations of modern sociological
theory (pp. 860–865). New York: The Free Press of Glencoe. [Original work published
1925 as L’Education morale [Moral education]. Paris: Felix Alcan, pp. 147–164]
Elms, A. C. (1972). Social psychology and social relevance. Boston: Little, Brown.
Elms, A. C. (1982). Keeping deception honest: Justifying conditions for social scientific research
stratagems. In T. L. Beauchamp et al. (Eds.), Ethical issues in social science research
(pp. 232–245). Baltimore: Johns Hopkins University Press.
Festinger, L., Riecken, H. W., & Schachter, S. (1956). When prophecy fails. New York: Harper
and Row.
Fisher, C. B., & Fryberg, D. (1994). Participant partners: College students weigh the costs and
benefits of deceptive research. American Psychologist, 49, 1–11.
Harris, B. (1988). Key words: A history of debriefing in social psychology. In J. G. Morawski
(Ed.), The rise of experimentation in American psychology (pp. 188–212). New Haven:
Yale University Press.
Harvard. (2002). Kennedy School of Government, Introduction to protecting human participants
in research, A program designed for faculty, staff, and students involved in research using
human participants. PowerPoint slides at www.ksg.harvard.edu/research/
KSGHumanParticipantsnov2002.ppt.
Herrera, C. D. (2001). Ethics, deception, and “those Milgram experiments.” Journal of Applied
Philosophy, 18, 245–256.
Hofling, C. K., Brotzman, E., Dalrymple, S., Graves, N., & Pierce, C. M. (1966). An
experimental study of nurse-physician relationships. Journal of Nervous and Mental
Disease, 141, 171–180.
Korn, J. H. (1997). Illusions of reality: A history of deception in social psychology. Albany: State
University of New York Press.
Lindemann Nelson, H. (2004). Narrative ethics and literature. In H. T. Engelhardt & G. George
(Eds.), Handbook of bioethics: Philosophy and Medicine 78 (pp. 163–181). Dordrecht:
Kluwer.
Lipset, S. M., Trow, M. A., & Coleman, J. S. (1956). Union democracy: The internal politics of
the International Typographical Union. Glencoe, IL: Free Press.
MacIntyre, A. (1982). Risk, harm and benefit assessments as instruments of moral evaluation. In
T. Beauchamp, R. R. Faden, R. J. Wallace, & L. Walters (Eds.), Ethical issues in social
science research (pp.175-182 ). Baltimore: Johns Hopkins University Press.
McDonald's Response. (2005). http://abcnews.go.com/Primetime/story?id=1301447.
Milgram, S. (1963). Behavioral study of obedience. Journal of Abnormal and Social Psychology,
67, 371–378.
Milgram, S. (1964a). Issues in the study of obedience: A reply to Baumrind. American
Psychologist, 19, 848–852.
Milgram, S. (1964b). Technique and first findings of a laboratory study of obedience to
authority. Yale Science Magazine, 39(14), 9–11.
Milgram, S. (1964c). Group pressure and action against a person. Journal of Abnormal and
Social Psychology, 69, 137–143.
Milgram, S. (1965a). Some conditions of obedience and disobedience to authority. Human
Relations, 18, 57–75.
Milgram, S. (1965b). Liberating effects of group pressure. Journal of Personality and Social
Psychology, 1, 127–134.
Milgram, S. (1965c). Obedience, produced by Stanley Milgram. Available from Penn State
Audio Visual Services, University Park PA: 814–6314.
Milgram, S. (1974). Obedience to authority: An experimental view. New York: Harper and Row.
Miller, A. G. (2009). Reflections on “Replicating Milgram” (Burger, 2009). American
Psychologist, 64, 20–27.
Nicks, S. D., Korn, J. H., & Mainieri, T. (1997). The rise and fall of deception in social
psychology and personality research, 1921–1994. Ethics & Behavior, 7, 69–77.
Packer, D. J. (2008). Identifying systematic disobedience in Milgram's obedience experiments.
Perspectives on Psychological Science, 3, 301–304.
Perry, G. (2013). Behind the shock machine: The untold story of the notorious Milgram
psychology experiments. New York: New Press.
Reiter, E. (1996). Making fast food: From the frying pan into the fryer. Montreal: McGill-
Queen's University Press.
Ring, K., Wallston, K., & Corey, M. (1970). Mode of debriefing as a factor affecting subjective
reaction to a Milgram-type obedience experiment: An ethical inquiry. Representative
Research in Social Psychology, 1, 67–88. Reprinted in J. Katz, with A. M. Capron & E. S.
Glass, Experimentation with human beings (pp. 395–400). New York: Russell Sage
Foundation, 1972.
Slater, L. (2004). Opening Skinner's Box: Great psychological experiments of the twentieth
century. New York: Norton.
Spock, B. (1946). The common sense book of baby and child care. New York: Duell, Sloan and
Pearce.
Strodtbeck, F. L. (1973). Bales 20 years later: A review essay. American Journal of Sociology,
79(2), 459–465.
Sumner, W. G. (1906). Folkways: A study of the sociological importance of usages, manners,
customs, mores, and morals. Boston: Ginn and Company.