
Article

Control and resistance in the psychology of lying

Theory & Psychology
22(2) 196–212
© The Author(s) 2012
Reprints and permission: sagepub.co.uk/journalsPermissions.nav
DOI: 10.1177/0959354311427487
tap.sagepub.com

Maarten Derksen
University of Groningen

Abstract
Psychology’s obsession with control, with manipulating the experimental situation and the behavior
of participants, has often been criticized. Mainstream, experimental psychology, it is said, abuses
its power in the laboratory to artificially create docile participants who fit its experimental regime.
I argue that this criticism accords too much control to the experimenter. Using the psychology
of lying and lie detection as an example, I show that the psychologist does not exert full control
in the laboratory, but meets resistance. In the psychological laboratory, lying and lie detection are
constructed on a technological model in which both the psychologist and the liar are operators of
devices, locked in battle. The critical focus, I conclude, should be on the technologies and counter-
technologies at work in the laboratory, and on the limitations of this model.

Keywords
control, experimental deception, lie detection, lying, polygraphy, resistance

Among the problems in social interaction that have inspired technological solutions is
lying. American society in particular has been fascinated by the promise of uncovering
lies by technical means. Nowhere is the polygraph, the machine commonly called “lie
detector,” more in use than in the United States (Alder, 2007). The terrorist attacks on
the World Trade Center have led to a further boost of the science and technology of lie
detection (Littlefield, 2009). Ample funds have flowed to behavioral scientists who
study lying and how to detect it. Among the many “human factors behavioral sciences
projects” of the U.S. Department of Homeland Security are now several that focus on
the detection of deception (U.S. Department of Homeland Security, n.d.). The U.S.
Transportation Security Administration paid leading lie detection researcher Paul
Ekman $1 million to teach TSA inspectors his interview technique (T. Frank, 2007).
The historian Ken Alder (2002) sees the attempt to “replace personal discretion with
objective measures, and political conflict with science” as typical of the “American
strain of the Enlightenment project” (p. 106). Yet the ideal of replacing the fuzzy art of
social interaction with science and technology has a wider appeal. European research-
ers of lying and lie detection have recently formed E-PRODD, the European Consortium
of Psychological Research on Deception Detection. One of their stated aims is to advise
practitioners—police, security personnel, etc.—on scientifically grounded methods to
detect deception.
Whether such technologies are effective is not the subject of this paper. I want,
instead, to analyze the psychology of lying and lie detection to throw new light on an
aspect of psychology that has been much discussed: its focus on control. Critics have
decried psychology’s obsession with manipulating the experimental situation and the
participants’ behavior in it. Mainstream, experimental psychology, it is said, abuses its
power in the laboratory to artificially create docile participants who fit its experimental
regime. This scientific machinery may churn out a steady stream of results, but their
value outside the laboratory is minimal: there, psychology meets resistance from the
people whose behavior it purports to explain and predict. As soon as psychology loses
control over its subjects, their behavior ceases to conform to the theories developed in
the lab. Although there is much truth in this criticism and it is backed up by many studies,
I believe it unduly separates control and resistance. The case of the psychology of lying
and lie detection shows that in the act of establishing control, psychologists also articulate
its opposite (resistance), and define what escapes their control. The liar and the psy-
chologist are locked in a battle of wills, employing strategies and counter-strategies. In
their laboratories, I shall conclude, psychologists do not wield absolute control; rather,
the laboratory is a place where control is articulated, distributed, and managed.

Control
Experimental psychology’s obsession with control goes back to the days of introspec-
tive psychology. According to Deborah Coon (1993), Wundt and his American follow-
ers in particular held a “technoscientific ideal” that drove them to establish control over
each element of the experiment, including the “observer,” whose introspective reports
furnished the data. The experiment was to be a mechanized, standardized process with
interchangeable parts. The observer functioned either as a passive recorder of mental
events stimulated by standardized instruments, or as a rigorously trained, quasi-
mechanical introspective observer, able to report the contents of consciousness without
subjective interference.1
Control was no less central to behaviorism. Indeed, in his history of that movement,
appropriately titled Control, John Mills (1998) argues that one of its defining features
was the belief that participants’ behavior could be manipulated by the scientist. Mills
also claims that the focus on control has survived behaviorism’s demise: the foundation
of experimental psychology’s methodology is still that control is “removed from
the individual subject and assigned to the investigator” (p. 18). Operationism remains
the guiding principle in experimental psychological research. Theoretical concepts are
defined in terms of the experimental operations required to produce certain effects, thus
shifting control from the experiencing individual to the experimenter. Social psychology
in particular appears to operate on the principle that to produce valid knowledge it is
necessary that the experimenter has full control and the participant is docile. Stam,
Lubek, and Radtke (1998) have criticized social psychology for relying on experiments
that are made up of nothing but set and staged responses, in which the participant is
reduced to “an appendage to the experiment-as-machine” (p. 171). They contend that
Milgram’s obedience study, one of psychology’s most famous experiments and often
lauded as a call for resistance against tyranny, is above all a demonstration of the power
of psychology. The obedience that it purports to measure—whether or not the partici-
pant obeys the instruction to administer an electric shock to someone else—obscures
“the obedience required by the entire apparatus which limits individual responses to
those required by the apparatus in a context thoroughly artificial and staffed by actors”
(p. 171). Milgram could only perform his experiments because the authority of experi-
mental psychology as such was not questioned.
A focus on manipulation and control might at first sight seem to confirm psycholo-
gy’s claim to be a natural science. What makes the natural sciences so successful, after
all, is their ability to manipulate natural processes in the laboratory, to control and
predict exactly the outcome of experiments, and to create entirely new materials and
phenomena (see Schlich, 2007). Some sociologists and philosophers of science have
argued, however, that this is a mistaken view of science. The experimental control that
social scientists, including psychologists, try to emulate is a myth: empirical studies of
scientific practice show that the extraordinary success and fecundity of the natural sci-
ences is not the result of complete control over nature, but rather due to the opposite.
According to Bruno Latour (2000), the objectivity of the natural sciences resides in their
ability to let “objects object to the utterances we make about them” (p. 115). A laboratory
is not a place that enables mastery and domination, but one in which objects are encour-
aged to resist and surprise the experimenter.2 The natural sciences do not study dumb,
passive stuff, but undisciplined, recalcitrant objects. Drawing on the work of Isabelle
Stengers (1997) and Vinciane Despret (2004), Latour criticizes the social sciences for
their misguided obsession with control, and, like them, puts forward Milgram’s research
as a paradigmatic case.
Whereas the psychological laboratory is often identified as a place where the psy-
chologist wields absolute control and manipulates the participants, it is also pointed
out that outside the lab the power of the psychologist meets resistance. One reason why
social psychology is a form of history, according to Gergen in his classic 1973 paper,
is that people, upon learning of the theories that scientists hold about their behavior,
may decide to act differently. “As a general surmise, sophistication as to psychological
principles liberates one from their behavioral implications” (Gergen, 1973, p. 313). If
you have read about the bystander effect, you may be more prone to step out of the
crowd and help the victim. Theories that lend themselves to social control are particu-
larly vulnerable to resistance. Paradoxically, the stronger the theory is, the better it
predicts behavior, the wider its use will be to control people’s behavior, and the more
people will resent it and choose to resist its predictions (Gergen, 1973, p. 314). Given
the high value accorded to personal freedom and individuality in our society, the best
psychological theories tend to be resisted the most, according to Gergen.


Ian Hacking has argued that it is typical of all the human sciences, including
psychiatry, that they meet with resistance. These disciplines cannot help but interact
with and change their objects of study, a phenomenon he calls the “looping effect” (see,
e.g., Hacking, 1986, 1995, 2006). “Kinds of people who are medicalised, normalised,
administered, increasingly try to take back control from the experts and institutions,
sometimes by creating new experts, new institutions” (Hacking, 2007, p. 311). The
classic case is homosexuality, which, through the resistance of the people the label is
applied to, has become a different thing than it was in the 19th century: no longer a
perversion suffered in secret, but a sexual identity that is publicly celebrated—in some
cultures at least. Something similar may be happening with Asperger’s syndrome,
which is increasingly claimed by people so labeled to be not a disease or disability, but
a different, and in some senses superior, way of being.
The psychology of lying and lie detection invites a closer look at these twin issues of
control and resistance. On the one hand, the recent surge in interest in the field is fed to
a significant extent by the wish to control access to a country, a venue, or a building. On
the other hand, it ascribes control to the participant, the liar, which becomes apparent in
his or her resistance to the machinations of the psychologist. The combination of lying
and lie detection ties control and resistance together, a dynamic that is not captured when
psychology is criticized for the docility of its participants.

Lying as a technology
A recurring feature of discussions of lies and lie detection is a paragraph or chapter on the
definition of a lie (see, e.g., Ekman, 2001; Vrij, 2008). Lying is intentionally conveying a
falsehood; on this the authors agree. A lie is deliberate. Calling someone a liar implies
ascribing control. Thus, pathological liars are not in fact liars: they are “untruthful but
cannot control their behavior” (Ekman, 2001, p. 26). Additional requirements may be that
the recipient did not ask to be deceived, as when one attends a play, and that the lie has
not been announced beforehand (Ekman, 2001, p. 27).
Not only do psychologists define lies as deliberate, they also emphasize the control
that is required to construct, perform, and maintain a lie. Liars must consciously man-
age all aspects of their deception. When we lie, we attempt to manipulate the beliefs of
others: their image of us, or their ideas about this or that aspect of the world. Lying,
however, is difficult, as it requires us to control what we say and how we say it. A lie
has to be carefully constructed and performed. Once the lie is told, our actions and
further pronouncements must be consistent with it, or at least seem so to the person or
persons whom we have lied to. The lie obliges us to remain vigilant and control our-
selves so that the illusion is maintained. It requires considerable skill: the liar “needs to
know what he can do with his body; he needs to be aware of his own actions through
both internal and external feedback; and he needs knowledge of how to program his
actions” (Ekman & Friesen, 1969, p. 93). Lying, from the psychologist’s perspective, is
a technology. “The study of deceit … provides an opportunity to witness an extraordi-
nary internal struggle between voluntary and involuntary parts of our life, to learn how
well we can deliberately control the outward signs of our inner life” (Ekman, 2001,
p. 349). The liar operates her body like a device, manipulating it to manipulate others.
She is a guilty mind controlling her body as a tool to create deception. In calling this
perspective on lying “technological,” I mean to say no more than that it conceives of
deception in terms of the manipulation of a tool by a user. Technology, however, is a
difficult and contested concept—as this special issue also makes clear. My point is not
that lying really is a technology, but that it is constructed in psychology along the lines
of this particular idea of technology. And, as I will show, the psychology of lying and
lie detection itself articulates the limitations of this approach to lying.

Lie detection: Registration, resistance, and control
Although there is some research into lying as an interesting psychological phenome-
non per se, it is overshadowed by the overwhelming number of studies that consider
lying from the perspective of lie detection. It is here that the technological perspective
comes into its own. If lying is a technology, lie detection is the counter-technology
designed to neutralize it. The term “lie detector,” which was originally applied to the
polygraph, has also become common to indicate the experimental participant whose
task it is to identify deception. It is even used to refer to “humans” in general: in a
sense, we are all lie detectors (DePaulo, Zuckerman, & Rosenthal, 1980). This is,
moreover, very much an applied field, which tries to find better ways of detecting lies.
Indeed, to the extent that psychologists study lying and lie detection per se, the point is
often to demonstrate how much room for improvement there is. Such studies show that
people are dismal lie detectors, although we believe ourselves to be quite good at rec-
ognizing when we are being lied to. Liars fidget and look away, we think, but in reality
this is not the case. The real and the perceived characteristics of liars show little over-
lap (Ekman, 2001; Vrij, 2008).3
Efforts to develop a better technology of lie detection are based on the premise that
the liar can never fully control all the relevant aspects of his or her behavior. Something
will show: the stress involved in constructing and maintaining a lie, the secret joy of
deception (“duping delight”; Ekman, 2001), the feelings of guilt, the fear of discovery,
or the actual affect behind the feigned emotion. An alternative to this “leakage” para-
digm is being developed by Aldert Vrij and colleagues. Their approach is based on the
idea that lying involves a high “cognitive load” (in other words it is difficult). Increasing
this load, for example by asking a suspect to recount the events backwards, increases the
behavioral differences between liars and truth-tellers (Vrij et al., 2008). Lie detection
then works by manipulating the amount of control required to produce a lie.
Whatever the approach taken, the question is which variables are reliable indicators
of deception, as well as hard or impossible to control. The lie detector proper, the clas-
sic polygraph, is based on the idea that increased autonomic nervous system activity,
measurable as sweaty palms and higher blood pressure and heart rate, is a reliable
indicator of lying, and beyond conscious control. Yet it is a widely acknowledged
problem with the polygraph that it can to some extent be resisted by the testee. There
is a website devoted to this resistance—Antipolygraph.org—which publishes a book
(Maschke & Scalabrini, 2000) that denounces the polygraph and teaches how to beat
it. “Countermeasures” are a subject of much discussion and research in the polygraph
community (e.g., Honts & Amato, 2002). Biting one’s tongue, pressing down one’s
toes, or counting back from 200 at strategic points in the test will erase the difference
between truthful and untruthful responses. Thus, while polygraphists look for “coun-
termeasure detectors” and “counter-countermeasures” (Honts & Amato, 2002, p. 260),
others have proposed techniques that are harder to resist. A variant of the Implicit
Association Test (IAT), for instance, was recently put forward as an accurate and
fake-resistant lie detector (Sartori, Agosta, Zogmaister, Ferrara, & Castiello, 2008). In
response, Verschuere, Prati, and Houwer (2009) have presented evidence that a sub-
stantial percentage of properly instructed participants were able to beat this test. Even
prior experience with the IAT, without explicit instruction, helps to fake innocence.
Given that one can practice the IAT on the web, Verschuere et al. note, its use in a
forensic context is dubious.
An even stronger claim of irresistibility is made for a technique called “brain-
fingerprinting.” This proprietary technology is 100% accurate, according to its inven-
tor and owner, Lawrence Farwell. The precise details have not been made public, but EEG
measurement of the P300 event-related potential is the basis of this technique. The
P300 occurs when information is recognized, and can therefore be used to detect guilty
knowledge. “Brain Fingerprinting testing can determine quickly and accurately whether
or not specific knowledge is present or absent in an individual” (Brain Fingerprinting
Laboratories, n.d.a). An important part of the allure of brain-fingerprinting and other
brain-based lie-detection technology is that the brain is seen as outside conscious
control (Littlefield, 2009). After all, in the technological paradigm of lying and lie
detection, the brain is itself the controller, not the controlled. Thus, a direct look at the
brain will show it in the act of deceiving.
Alas, it appears brain-fingerprinting can be fooled as well, because the brain can in
fact be controlled. Rosenfeld et al. found that the brain can be taught to react the same
way to irrelevant stimuli as to relevant “guilty knowledge” (Rosenfeld, Soskins, Bosh,
& Ryan, 2004). Participants who had been instructed to (for example) wiggle their toes
when an irrelevant stimulus appeared on the screen produced P300s that were indistin-
guishable from their responses to relevant stimuli: the participant then recognizes both
“guilty knowledge” and irrelevant questions, thus triggering a P300 in the brain regard-
less of the question. When dealing with domestic crime this may not be a problem—
“[A]ssuming that the domestic scientific community is reasonably free of criminals,
the domestic forensic situation may be safe for criminals lacking the intelligence and
resourcefulness to make use of published papers such as the present one” (Rosenfeld
et al., 2004, p. 218). However, Rosenfeld et al. (2004) warn, foreign terrorists, backed
by less scrupulous scientists, may get access to this counter-technology.
Whereas lying is a technology of control in this paradigm, lie detection, as the term
implies, is presented as a technology of registration. What is registered is lack of con-
trol, the breakdown of the liar’s technology, and if the instrument is tuned to the right
channel and sufficiently sensitive, it will pick up the telltale signals. The instrument
may be a machine, such as the polygraph with its rotating drums, or a human being. Lie
researcher Paul Ekman has started a small company (the Paul Ekman Group, http://
www.paulekman.com/) which among other things produces the “micro expression
training tool” (METT). The METT teaches how to read the brief expressions that flash
across the face when people try to conceal or repress what they feel. The technique is
vividly portrayed in the television series Lie to Me, for which Ekman acts as a scientific
consultant (Grazer, Nevins, & Baum, 2009–2011).
In the actual practice of lie detection, however, registration as such may be of minor
importance, being only one element in a carefully controlled spectacle. The polygraph,
for instance, owes its success not so much to the accuracy of its detection, but to its
perceived infallibility, which leads many suspects to confess. A crucial development in
the history of the polygraph occurred when Leonarde Keeler devised a theatrical context
for its use (Alder, 2007). A typical element was his card trick: the participant was hooked
up to the polygraph, and asked to pick a card from a small deck. Next, the participant
had to deny having picked each card that was turned over. Keeler then made a show of
checking the traces produced by the polygraph, and announced where the participant
had lied. He was invariably right: Keeler usually marked the deck.4 If suspects do not
confess after this show of strength, at least their fear of being caught has been heightened,
and their self-control thereby diminished.
In lie detection, even fake machinery may be effective, as long as it is presented with
the proper theater. A controversy is currently raging about the “voice stress analyzer”
produced by Nemesysco, which has shown promising results in pilot programs con-
ducted by several British city councils. Although the technology has been claimed to be
bogus (Eriksson & Lacerda, 2008),5 it appears to have some effect in combating benefit
fraud—people falsely claiming unemployment or housing benefit although they do not
qualify for it. The councils at least are quite content with the device. Critics put the
effect down to the aura of infallibility that lie detectors have: when claimants are told
that they will be subjected to the lie detector, around a third decide not to apply for
benefit. The power of the lie detector is a self-fulfilling prophecy, a lie come true. When
law-enforcement agencies ask his advice about lie detection devices, the American
psychologist Mitchell Sommers offers to build them an impressive fake machine for
half the price of a “real” lie detector. His offer has not been taken up yet (“Lie detectors,”
2008). As Geoff Bunn (2007) has put it, lie detection is a “spectacular science,” in
which the laboratory and the theater are one.

Demarcating the lie is a matter of control
To study lying and lie detection as psychologists define them requires experimental
control, often including deception. Ekman and Friesen’s (1974) classic study of
“leakage” can serve as an example of the way the lie and its discovery are constructed
to fit their definition. Ekman and Friesen studied how well people could hide the fact
that they were watching a shocking rather than a pleasant video. They first produced
an unequivocal reality, in the form of a video clip containing medical horror scenes
(amputations, treatment of severe burns). Next, they made sure that the liars would be
motivated and really try to lie convincingly by selecting student nurses for this task, and
instructing them beforehand that the ability “to control the expression of your feelings”
(Ekman, 2001, p. 55) is an important skill in nursing. To minimize differences in the
facility with which the content of the lies was constructed, the experimenters provided
the participants with several suggestions. The actual “deceptive interaction” involved
an interviewer, who couldn’t see the film and asked the participants questions such as
whether they would show the film to a young child. False instructions were given to the
interviewer to prevent her from knowing beforehand whether or not the participant
would attempt to lie. She was also left in the dark about the research hypothesis. Hidden
cameras recorded the interaction—participants were only told afterward that they had
been filmed. The actual lie detectors were students following a psychology course, who
were kept “naïve” in the same way as the interviewer. The judgment task for these
observers consisted of a “binary choice, honest or deceptive” (Ekman & Friesen, 1974,
p. 291), with regard to video clips showing either only the face or only the body of the
participants.
Deception and other forms of experimental control are standard elements of research
in psychology, social psychology in particular (Korn, 1997). What is noteworthy here
is that they shape the actions and constrain the options of the participants in the study
in such a way that what takes place exactly fits the psychologists’ idea of lying and lie
detection. Reality is as close to unequivocal as one might hope to get: very few would
describe an amputation as a pleasant sight, and someone who does so is most probably
lying. The lie is intentional, and the liars are motivated to control their verbal and
non-verbal behavior. Next, lie detection takes the form of a detached judgment. Notably,
it is the task not of the interviewer but of the observers; there can be no interaction
involved.6 Finally, corresponding to the unequivocal nature of reality and what was told
about it, observers are restricted to a binary honest–deceptive judgment. Although this
is the most common, other set-ups exist, as well as many variations on this classic
model, but they all have in common that the psychologist manages the experimental
situation in such a way that lying and lie detection take place according to the parame-
ters of the paradigm. The laboratory becomes an arena in which liars and lie detectors,
technology and counter-technology, do battle. Psychologists control the space in which
lying and lie detection can appear and be studied in terms of control and resistance. The
limits of this space, however, are also visible in their work.

Limits of the lie
There is some debate as to whether one can actually study real lies in the laboratory. In
the psychological definition of lying, unambiguous intentionality and falsehood are
essential, but psychologists note that both can be difficult to determine, or manipulate in
research. In a critical analysis of neuroimaging studies of deception, Sip, Roepstorff,
McGregor, and Frith (2008) emphasize the deliberate nature of deception, and the diffi-
culties of studying it experimentally. Since lying is by definition intentional but partici-
pants are instructed to lie in the typical deception experiment, what participants produce
in such studies are not in fact lies at all (see also Wolpe, Foster, & Langleben, 2005, p.
42). Control remains with the experimenter, perhaps best exemplified in a study by Kraut
(1978): “Behind the interviewer, in the candidate’s line of sight, was a pair of signal
lights that instructed the candidate to either lie or tell the truth on an answer” (p. 382).
Sip et al. (2008) call for the development of experimental paradigms in which “subjects
have a real choice as to whether and when to lie” (p. 52). Several such set-ups in
fact exist, but the majority of deception studies still employ lies produced on demand.
As Wolpe et al. (2005) conclude, “[T]he ability to detect simple deceptions in laboratory
settings may not translate into a usable technology in less controlled situations” (p. 42).
Conversely, studying deception outside the laboratory meets with the problem that the
experimenter loses control over the truth. When studying “naturalistic occurrences of
deception” (Ekman & Friesen, 1969, p. 103), it is often impossible to determine when
lying occurs. A similar problem hampers field studies of lie detectors: to know whether
a machine is accurate, one needs to compare its decisions to “ground truth.” But as
scientists, philosophers, and people working in forensic contexts (among others) are well
aware, establishing the truth is far from straightforward. That, after all, is precisely why
lie detectors are needed (Meijer, Verschuere, Merckelbach, & Crombez, 2008, p. 424).
Confessions and the outcomes of court cases are deemed insufficiently robust as criteria
(Ekman, 2001, p. 209).
To put the liar’s control to the test, it is imperative that the lie matters. The more that is
at stake, the stronger the intent of the participants, the more control they attempt to
exert, and thus the more realistic the lies that they produce. However, psychologists
often remark on the difficulty of creating “high stakes” in their research. M.G. Frank
and Ekman created an “ecologically valid high-stake scenario” by telling the partici-
pants they could keep the $50 they had “stolen” if they were able to convince the inter-
viewer they didn’t have it. If they failed, they had to hand it back, didn’t get their
participation fee, and would have to “sit on a cold, metal chair inside a cramped, dark-
ened room labeled ominously XXX, where they would have to endure anywhere from
10 to 40 randomly sequenced, 110-decibel startling blasts of white noise over the course
of 1 hr” (M.G. Frank & Ekman, 1997, p. 1431). (Participants were given a brief taste of
this experience, but it was not actually meted out as punishment.) Vrij, however,
remarks that apart from the ethical concerns this study raises, one may wonder whether
$50, and a punishment that participants will probably doubt is real, in fact constitute a high stake.
“[I]t may not be possible to introduce truly high-stakes settings in laboratory experi-
ments,” he concludes (Vrij, 2008, p. 53).
Another problem is that the unambiguous distinction between truth and falsehood that
the psychological study of lying depends on is often difficult to draw. Appelbaum (2007)
notes that lie detectors would have problems in cases

in which the status of the claim by the person being evaluated cannot be neatly categorized as
true or false—as, for example, when a statement is partially true, when the evaluee is uncertain
of the right answer, or when he or she considers the option of lying but ultimately decides to tell
the truth. (p. 461)

Moreover, infamously, people may hold widely divergent views of reality. Screening sex
offenders for recidivism with lie detector tests, for example, is made difficult by the fact
that they are prone to “cognitive distortion,” labeling what everyone else considers to be
sexual abuse in innocent terms such as “cuddling.” Asking convicted sex offenders,
hooked up to a polygraph, whether they have committed sexual abuse while on parole is
therefore no use. Asking about concrete behavior might solve this problem, but that is not
possible with screening, since the questioner doesn’t know what, if anything, happened
(Meijer et al., 2008).


Self-deception poses problems for the aspects of both intent and falsehood. Telling a
falsehood one believes to be true is not lying, according to the definition, but things get
murky when the false belief is somehow one’s own creation. Self-deception is a popular
puzzle for philosophers, but a problem for lie researchers. Vrij simply rules the phenom-
enon out of bounds, on a priori grounds. A lie, according to the definition he favors,
necessarily involves two people, one leading another to believe something that he or she
believes to be untrue. Therefore, self-deception falls outside the definition and can be
ignored (Vrij, 2008, pp. 15–16). Others are less sanguine. According to Ekman and
O’Sullivan (2006), along the dimension of intent, “distortions of reality” (p. 673) run
from denial, repression, and dissociation (the least intentional), via self-deceptions, posi-
tive illusions, self-aggrandizement, and non-conscious cognitions, to white lies, compli-
ments, and courtesies; malingering, factitious disorders, and some dissociative disorders;
and, finally, “the deliberate, high stakes lies in which we have been most interested and
have studied in greatest detail” (p. 674). Unfortunately, liars who succeed in deceiving
themselves are “undetectable” (Ekman, 2001, p. 140). That suggests that, although
psychologists’ definition of lies is only applicable to a restricted range of untruths, at
least they can be distinguished and the relevant ones isolated. Solomon, however, has
warned that deception and self-deception cannot be held apart that easily, that they often
go together, or lead to each other. “To fool ourselves, we must either fool or exclude
others; and to successfully fool others, we best fool ourselves” (Solomon, 1993, p. 42).
The model of the cynical, bare-faced lie is only applicable to a very restricted range of
behaviors; usually lying and self-deception form a “tangled web.”7
Partly as a result of such problems in applying their version of lying, psychologists’
attitude toward lie detection is deeply ambivalent. One of the fiercest critics of the forensic
use of the polygraph was John Larson, one of its inventors. He saw it as a scientific
instrument for the diagnosis of criminal minds, and rejected the theatrical embedding
that Leonarde Keeler created (Alder, 2007). Psychologists have continued to put science
first in their dealings with lie detectors. They have remained skeptical about the poly-
graph and continue to critically test new technology to determine how accurate and
resistance-proof it is.8 Even those who are most optimistic about lie detection are careful
to emphasize its limitations and the art required to successfully employ the technology.
Choosing the right questions to ask and interpreting the results require skills that cannot
be mechanized. Algorithms do exist that analyze the measurements of a polygraph and
decide whether or not the participant passed the test, but operators often turn this feature
off so that they can continue questioning, in the hope of producing a confession (Alder,
2007, p. 129). In any case, lies cannot be detected per se: what is measured are signs of
stress, emotion, or recognition; there is no algorithm that connects these securely to
deception. Even the technology touted as 100% accurate requires “je ne sais quoi” to
operate: “Every science involves skill, judgment, or ‘art’ on the part of its practitioners,
and the science of Brain Fingerprinting® testing is no exception” (Brain Fingerprinting
Laboratories, n.d.b).
Finally, psychologists are well aware that lie detection, whatever the technology
employed, is only advisable in limited circumstances. Lies are not always morally repre-
hensible (a commonsense misconception, according to Vrij, 2008), and telling the truth
or exposing deceit is often stupid or callous. No one outside the Axis powers would have
preferred the Allies to be honest about where the D-Day landing would take place
(Ekman, 2001, p. 349). Moreover, lies act as a social lubricant, to almost everyone’s
satisfaction. Vrij (2008) discusses the “two-pronged nature of lying” (p. 19), explaining
that as well as the bad, morally reprehensible, self-serving kind, there are other-oriented
and social lies that are justifiable and even essential to social interaction. Whereas being
able to detect the bad, selfish lies would “benefit individuals or the society as a whole”
(p. 35), the ones that act as a social lubricant are best left alone. In theory and practice,
lie research restricts itself mostly to forensics and security. This limitation usually
remains implicit, allowing the suggestion that the results are applicable to, say, marital
infidelity or business negotiations. An exception is an article titled “How People Really
Detect Lies,” in which Park, Levine, McCornack, Morrison, and Ferrara (2002) make the
simple, but fundamental point that “humans” are not in fact “lie detectors” at all. Instead
of trying to detect deception from verbal and nonverbal behaviors, “people often rely on
information from third parties, the consistency of statements with prior knowledge, the
consistency of messages with physical evidence, or confessions when rendering
judgments about the veracity of others’ messages” (p. 144). In other words, in daily life
people do not employ a technology of lie detection, but engage in everyday epistemic
processes to determine the truth. One could add that their actions at the same time have
a moral character. Calling someone a liar, or even suspending judgment awaiting further
information, is a morally highly significant act. What is a simple button push in the
laboratory is a highly charged breach of the social order outside.
It is worth emphasizing that it is often the lie researchers themselves who point out
the limitations of their idea of lying, their research paradigm, and the lie-detection tech-
nology that it produces. Psychologists themselves draw the boundaries and articulate
what lies beyond them. Granted, often they do so only sketchily. Erving Goffman’s The
Presentation of Self in Everyday Life (1956) is sometimes briefly referred to as an expo-
sition of what lies between bare-faced lies and straightforward truths (DePaulo, Wetzel,
Sternglanz, & Wilson, 2003; Ekman, 2001, p. 28 note; Kraut, 1978; Vrij, 2008, p. 18).
In Goffman’s analysis, social interaction is always a matter of performance, and thus to
varying degrees one of misrepresentation. A bare-faced lie, “defined as one for which
there can be unquestionable evidence that the teller knew he lied and willfully did so”
(Goffman, 1956, p. 40), is only one of many techniques that people employ to create a
favorable impression of the situation and their role in it. Tellingly, the paragraph on
misrepresentation is preceded by one on calculated spontaneity: in Goffman’s analysis,
being honest, sincere, and natural is no less an accomplishment than deception. Very
little of Goffman’s rich analysis of the performance of self in everyday life finds its way
into the texts of lie researchers, but it is there nonetheless.
In other words, the field not only produces its proper objects and techniques, but also
creates a description of what is beyond its reach. Ekman’s Telling Lies (2001), his com-
prehensive overview of the state of the art of deception research, is a typical example:
Ekman spends as many pages explaining what falls outside the purview of the psychol-
ogy of lying and lie detection as he does enthusiastically promoting its astonishing
findings and technologies. As he puts it early on in the book: “My message to those
interested in catching political or criminal lies is not to ignore behavioral clues but to be
more cautious, more aware of the limitations and the opportunities” (p. 22).


Yet in the words of Ken Alder (2007), “the dream of certainty dies hard” (p. 44).
Sometimes the lure of total transparency, of a world in which truth is produced by
machinery that operates without friction, is too great to be resisted. Ekman, despite his
frequent warnings about the difficulties of lie detection, now appears to consider his own
techniques so effective that their spread must be controlled. “I don’t advocate teaching
everyone how to read micro-expressions because very often you don’t want to know. …
I would not want to live in a world where there was no way for me to hide my emotions”
(Bond & Ekman, 2007, p. 55). Like x-ray specs, reading micro-expressions and emblem-
atic slips gives you a direct view of the naked truth, and Ekman can only hope people will
use his technology for the good. There are no restrictions, however, on buying the online
interactive course at mettonline.com.

Lie detection: Control and resistance
At first sight, the psychology of lying and lie detection might seem to confirm the
denunciation by Stengers, Despret, and Latour of psychology’s misguided quest for
total control. Here, surely, is a good example of the common equation of science with
mastery over the object. Indeed, in the psychology of lie detection the truth appears
when the participant loses control over the body and the psychologist gains dominance,
exactly as Despret had noted in her study of the psychology of emotions: “The practice
of lie detection is just one of many translations of this almost-absolute confidence in
emotion’s authenticity: a body that is stirred cannot lie” (Despret, 2004, pp. 48–49).
The premise that underlies most research on lying and lie detection is, in the words of
Ken Alder (2002), “that while a human being may tell a conscious lie, that person’s
body will ‘honestly’ betray his or her awareness of this falsehood” (p. 2).
Yet, the psychology of lie detection also shows the limitations of this view of the
discipline as obsessed with control. Indeed, it produces these limits itself, and sketches
what lies beyond. First, although the psychology of lie detection revolves around con-
trol, the psychologist is not alone in control: the liar too is a controller. With the authen-
tic body comes a deceptive consciousness, intent on manipulating the beliefs of others.
The primary given of the field, spelled out in the first pages of any monograph, is that
lies are produced deliberately. Opposite the psychologist is a participant who wants to
control the situation and his or her own body just as much as the psychologist does.
Mechanism is paired with intentionality. In the psychological version of lying and lie
detection, control is distributed between the liar and the psychologist, rather than
monopolized by the latter.9
Of course, the object of the research on lie detection is to redistribute power away
from the deceiving participant. An ideal lie detector would nullify the participant’s efforts
at manipulating his or her body and the beliefs of others, by picking up the signals that a
lying body inevitably gives off. Any control that the liar exerts over his or her body
becomes irrelevant. Nevertheless, psychologists are quite aware that their powers are
limited. Claims of high accuracy rates are met with skepticism.10 To some extent, the
problem is sought in the failings of the detector, whether human or machine: not tuned in
to the right channels, not sensitive enough. But it is also acknowledged that the partici-
pant can actively resist the lie detector. Even when the psychologist has complete control
over the circumstances in which the participant is tested, “countermeasures” remain a
problem. The psychology of lying and lie detection is a play of control and resistance
between psychologist and participant. In the arena of the laboratory, the liar is a far from
docile subject.
Granted that the psychologist does not monopolize control and that the participant’s
resistance is a prominent feature of the psychology of lying and lie detection, the fact
remains that the field is about control. However, the limitations of this approach are a
constant subject of discussion. Lie detection, whatever the material technology involved,
is said to require experience, skill, and intuition. It is to some extent an art that is not
fully formalizable in a manual of operation. Moreover, psychologists acknowledge that,
strictly speaking, there is no such thing as lie detection. There is no “single nonverbal,
verbal, or physiological response … uniquely associated with deception” (Vrij, 2008,
p. 4); no Pinocchio’s nose. What is detected are signs of the stress and mental effort
involved in telling a lie that could be due to other causes. Liars who know this are much
more at ease when tested and thus better able to resist the lie detector. Only when the lie
consists in masking an emotion, and “leakage” of that emotion occurs (via micro-
expressions, for example), can deception be said to be directly detectable. Finally, lie
researchers are aware that their version of lying is but one of the many permutations of
truthfulness and purpose that are possible. The technological model of lying and lie
detection is geared toward its forensic application; beyond that it does not apply. Outside
its limits—determined by unambiguous intent and falsehood—lies a wide landscape of
communicative acts that fail to meet the criteria but are not exactly truthful either. And
even when there are clear lies to detect, doing so may be morally reprehensible.

Conclusion
The relevance of this analysis extends beyond the psychology of deception and lie
detection. The controller/controlled duality pervades many subjects in current psy-
chology. Examples include automaticity theory (Bargh, 2006), mental control (Najmi,
Riemann, & Wegner, 2009), emotional self-regulation (Gross, 2007), and the limited-
resource model of self-control (Baumeister, Muraven, & Tice, 2000). Such types of
research construct a social world that consists of controllers and machines. The distri-
bution of control between psychologist and participant varies: it either may lie largely
with the psychologist, as in automaticity theory, or may be shared with the partici-
pant, as in the work on self-control. Neither has a monopoly. In the lab, the psycholo-
gist may always have the “advantage” (Scheibe, 1988), but even automaticity theorists
accord the participant a measure of resistance (Derksen & Beaulieu, 2011). The psy-
chology of lying and lie detection exemplifies a “dialectic of control” (Morawski,
1988, p. 86) that is becoming increasingly prominent in recent psychology.
By focusing on the psychologist’s control of the experimental situation and the
corresponding “docility” of the participant, critics risk affirming the self-image of
psychologists. Reading psychological experiments against the grain, as I have tried to
do here, reveals the struggle that is inherent in studying people, but seldom finds its way
into experimental reports. The little world of the lab is not one of complete experimental
control, but a playing field of opposing forces. Psychological participants already show
the “recalcitrance” that Latour admires in the objects of the natural sciences (Latour,
2000, p. 116). In its study of lying and lie detection, psychology engages with its
subjects in a battle of measures and countermeasures, control and resistance. The psy-
chologist’s advantage lies in the fact that the arena is structured by binary oppositions
that psychology is particularly comfortable with: the machine and its operator, self and
other, truth and lie. It is not an environment that is alien to the subject and renders him
or her powerless, however. Whatever the conceptual difficulties with such dualisms,
and there are of course many, people are quite capable of playing this game: of ruptur-
ing the tie between self and other, engaging in manipulative verbal and non-verbal
communication, and telling a bare-faced lie. The critical focus should not be on the
purported docility of the participant, but on the technologies and counter-technologies
brought into play in the psychologists’ laboratories, and on the limitations of the control
perspective itself.

Funding
This research received no specific grant from any funding agency in the public, commercial, or
not-for-profit sectors.

Notes
1. See also Benschop & Draaisma (2000) for this second type of observer.
2. For the opposite viewpoint, that laboratories really are spaces of control, see, for example,
Schlich (2007).
3. Vrij (1998, pp. 47–48) raises the interesting question of what explains this discrepancy. His
best explanation is that we just do not know how we ourselves behave when we lie.
4. Nowadays polygraphers use a similar test with numbers instead of cards (Maschke &
Scalabrini, 2000, p. 90).
5. Nemesysco threatened Eriksson and Lacerda with legal action over this claim, and managed
to have the article removed from the journal’s online archive. Nemesysco also sells a “love
detector” that takes the guesswork out of dating. According to Eriksson and Lacerda, it is
identical to the lie detector, but with differently labeled variables.
6. The approach called Interpersonal Deception Theory tries to overcome this limitation and
develop an interactional perspective on lying and lie detection. See, for instance, Burgoon,
Buller, & Floyd (2001), who found that interaction makes it easier to deceive, because the liar
can improve his or her performance using feedback.
7. Baumeister (1993) notes that sometimes self-deception may become a self-fulfilling prophecy,
thus turning a lie into truth.
8. See for overviews Ekman (2001) and Vrij (2008).
9. According to Alder, the fact that the lie detector tries to overcome the participant’s resistance,
and thus accords him or her control, was the reason it was rejected by psychologists, such as
John B. Watson, in the 1920s. In their paradigm, psychology was a natural science that stud-
ied objects rather than willful subjects (Alder, 2007, p. 53).
10. See for instance Vrij’s criticism of Ekman’s claims for his techniques (Vrij, 2008, chap. 6).

References
Alder, K. (2002). A social history of untruth: Lie detection and trust in twentieth-century America.
Representations, 80, 1–33. doi: 10.1525/rep.2002.80.1.1
Alder, K. (2007). The lie detectors: The history of an American obsession. New York, NY: Free Press.


Appelbaum, P.S. (2007). Law & psychiatry: The new lie detectors: Neuroscience, deception, and
the courts. Psychiatric Services, 58, 460–462.
Bargh, J.A. (Ed.). (2006). Social psychology and the unconscious: The automaticity of higher
mental processes. New York, NY: Psychology Press.
Baumeister, R. (1993). Lying to yourself. In M. Lewis & C. Saarni (Eds.), Lying and deception in
everyday life (pp. 166–183). New York, NY: Guilford.
Baumeister, R., Muraven, M., & Tice, D. (2000). Ego depletion: A resource model of volition,
self-regulation, and controlled processing. Social Cognition, 18, 130–150.
Benschop, R., & Draaisma, D. (2000). In pursuit of precision: The calibration of minds and
machines in late nineteenth-century psychology. Annals of Science, 57, 1–25.
Bond, M., & Ekman, P. (2007). How to spot a fibber. New Scientist, 195(2621), 54–56.
Brain Fingerprinting Laboratories. (n.d.a). Counterterrorism applications. Retrieved from http://
www.brainwavescience.com/counterterrorism.php
Brain Fingerprinting Laboratories. (n.d.b). Scope and science. Retrieved from http://www.
brainwavescience.com/ScopeandScienceofBF.php
Bunn, G.C. (2007). Spectacular science: The lie detector’s ambivalent powers. History of
Psychology, 10, 156–178.
Burgoon, J., Buller, D., & Floyd, K. (2001). Does participation affect deception success? A test of
the interactivity principle. Human Communication Research, 27, 503–534.
Coon, D. (1993). Standardizing the subject: Experimental psychologists, introspection and the
quest for a technoscientific ideal. Technology and Culture, 34, 757–783.
DePaulo, B., Wetzel, C., Sternglanz, R., & Wilson, M. (2003). Verbal and nonverbal dynamics of
privacy, secrecy, and deceit. Journal of Social Issues, 59, 391–410.
DePaulo, B., Zuckerman, M., & Rosenthal, R. (1980). Humans as lie detectors. Journal of
Communication, 30, 129–139.
Derksen, M., & Beaulieu, A. (2011). Social technology. In I.C. Jarvie & J. Zamora-Bonilla (Eds.),
The Sage handbook of the philosophy of social sciences (pp. 703–720). London, UK: Sage.
Despret, V. (2004). Our emotional makeup: Ethnopsychology and selfhood. New York, NY: Other
Press.
Ekman, P. (2001). Telling lies: Clues to deceit in the marketplace, politics, and marriage (3rd ed.).
New York, NY: W.W. Norton.
Ekman, P., & Friesen, W. (1969). Nonverbal leakage and clues to deception. Psychiatry, 32, 88–106.
Ekman, P., & Friesen, W. (1974). Detecting deception from the body or face. Journal of Personality
and Social Psychology, 29, 288–298.
Ekman, P., & O’Sullivan, M. (2006). From flawed self-assessment to blatant whoppers: The utility
of voluntary and involuntary behavior in detecting deception. Behavioral Sciences & the Law,
24, 673–686.
Eriksson, A., & Lacerda, F. (2008). Charlatanry in forensic speech science: A problem to be taken
seriously. International Journal of Speech Language and the Law, 14, 169–193.
Frank, M.G., & Ekman, P. (1997). The ability to detect deceit generalizes across different types of
high-stake lies. Journal of Personality and Social Psychology, 72, 1429–1439.
Frank, T. (2007, September 26). Airport security arsenal adds behavior detection. USA Today.
Retrieved from http://www.usatoday.com/travel/flights/2007-09-25-behavior-detection_N.htm
Gergen, K.J. (1973). Social psychology as history. Journal of Personality and Social Psychology,
26, 309–320.
Goffman, E. (1956). The presentation of self in everyday life. Edinburgh, UK: University of
Edinburgh, Social Sciences Research Centre.
Grazer, B., Nevins, D., & Baum, S. (Executive Producers). (2009–2011). Lie to me. 20th Century
Fox Television.


Gross, J.J. (Ed.). (2007). Handbook of emotion regulation. New York, NY: Guilford.
Hacking, I. (1986). Making up people. In T.C. Heller, M. Sosna, & D.E. Wellbery (Eds.),
Reconstructing individualism: Autonomy, individuality, and the self in Western thought
(pp. 222–236). Stanford, CA: Stanford University Press.
Hacking, I. (1995). The looping effects of human kinds. In D. Sperber, D. Premack, & A. Premack
(Eds.), Causal cognition: A multidisciplinary debate (pp. 351–383). Oxford, UK: Clarendon.
Hacking, I. (2006, August 17). Making up people: Clinical classifications. London Review of
Books, 28(16), 23–26.
Hacking, I. (2007). Kinds of people: Moving targets. Proceedings of the British Academy, 151,
285–318.
Honts, C., & Amato, S. (2002). Countermeasures. In M. Kleiner (Ed.), Handbook of polygraph
testing (pp. 251–264). San Diego, CA: Academic Press.
Korn, J.H. (1997). Illusions of reality: A history of deception in social psychology. Albany: State
University of New York Press.
Kraut, R.E. (1978). Verbal and nonverbal cues in the perception of lying. Journal of Personality
and Social Psychology, 36, 380–391.
Latour, B. (2000). When things strike back: A possible contribution of “science studies” to the
social sciences. British Journal of Sociology, 51, 107–123.
Lie detectors: Whose pants on fire? (2008, May 8). The Economist. Retrieved from http://www.
economist.com/science/displaystory.cfm?story_id=11326202
Littlefield, M. (2009). Constructing the organ of deceit: The rhetoric of fMRI and brain finger-
printing in post-9/11 America. Science, Technology, & Human Values, 34, 365–392.
Maschke, G.W., & Scalabrini, G.J. (2000). The lie behind the lie detector. Retrieved from http://
antipolygraph.org/pubs.shtml.
Meijer, E., Verschuere, B., Merckelbach, H., & Crombez, G. (2008). Sex offender management using
the polygraph: A critical review. International Journal of Law and Psychiatry, 31, 423–429.
Mills, J.A. (1998). Control: A history of behavioral psychology. New York, NY: New York
University Press.
Morawski, J.G. (1988). Impossible experiments and practical consequences. In J.G. Morawski
(Ed.), The rise of experimentation in American psychology (pp. 72–93). New Haven, CT: Yale
University Press.
Najmi, S., Riemann, B., & Wegner, D. (2009). Managing unwanted intrusive thoughts in
obsessive-compulsive disorder: Relative effectiveness of suppression, focused distraction,
and acceptance. Behaviour Research and Therapy, 47, 494–503.
Park, H.S., Levine, T.R., McCornack, S.A., Morrison, K., & Ferrara, M. (2002). How people really
detect lies. Communication Monographs, 69, 144–157. doi:10.1080/714041710
Rosenfeld, J., Soskins, M., Bosh, G., & Ryan, A. (2004). Simple, effective countermeasures to
P300-based tests of detection of concealed information. Psychophysiology, 41, 205–219.
Sartori, G., Agosta, S., Zogmaister, C., Ferrara, S. D., & Castiello, U. (2008). How to accurately
detect autobiographical events. Psychological Science, 19, 772–780.
Scheibe, K. (1988). Metamorphoses in the psychologist’s advantage. In J.G. Morawski (Ed.), The
rise of experimentation in American psychology (pp. 53–71). New Haven, CT: Yale University
Press.
Schlich, T. (2007). Surgery, science, and modernity: Operating rooms and laboratories as spaces
of control. History of Science, 45, 231–256.
Sip, K., Roepstorff, A., McGregor, W., & Frith, C. (2008). Detecting deception: The scope and
limits. Trends in Cognitive Sciences, 12, 48–53.
Solomon, R.C. (1993). What a tangled web: Deception and self-deception in philosophy. In
M. Lewis & C. Saarni (Eds.), Lying and deception in everyday life (pp. 30–58). New York,
NY: Guilford.


Stam, H.J., Lubek, I., & Radtke, H. (1998). Repopulating social psychology texts: Disembodied
“subjects” and embodied subjectivity. In B. Bayer & J. Shotter (Eds.), Reconstructing the
subject: Bodies, practices and technologies (pp. 153–186). London, UK: Sage.
Stengers, I. (1997). Power and invention: Situating science (P. Bains, Trans.). Minneapolis:
University of Minnesota Press.
U.S. Department of Homeland Security. (n.d.). Human factors/Behavioral sciences projects.
Retrieved from http://www.dhs.gov/files/programs/gc_1218480185439.shtm
Verschuere, B., Prati, V., & Houwer, J. D. (2009). Cheating the lie detector: Faking in the autobio-
graphical implicit association test. Psychological Science, 20, 410–413.
Vrij, A. (1998). De psychologie van de leugenaar: Liegen en voorgelogen worden op het werk, in
de rechtszaal en thuis [The psychology of the liar: Lying and being lied to at work, in court,
and at home]. Lisse, The Netherlands: Swets & Zeitlinger.
Vrij, A. (2008). Detecting lies and deceit: Pitfalls and opportunities (2nd ed.). Chichester, UK:
Wiley.
Vrij, A., Mann, S., Fisher, R., Leal, S., Milne, R., & Bull, R. (2008). Increasing cognitive load
to facilitate lie detection: The benefit of recalling an event in reverse order. Law and Human
Behavior, 32, 253–265.
Wolpe, P.R., Foster, K.R., & Langleben, D.D. (2005). Emerging neurotechnologies for lie-detec-
tion: Promises and perils. American Journal of Bioethics, 5(2), 39–49.

Maarten Derksen is Assistant Professor of Theory and History of Psychology at the University of
Groningen, The Netherlands. He has written on the popularization and demarcation of psychology,
on the history of clinical psychology in Great Britain, and on the concept of culture in evolutionary
psychology. Boundaries—between nature and culture, psychology and common sense, people and
machines—are a recurrent theme in all his work, including that on social technology. Address:
Faculty of Behavioural and Social Sciences, University of Groningen, Grote Kruisstraat 2/1, 9712
TS Groningen, The Netherlands. Email: m.derksen@rug.nl
