
Wisecrackers: A Theory-Grounded Investigation of Phishing and Pretext Social Engineering Threats to Information Security

Michael Workman
College of Business, Florida Institute of Technology, Melbourne, FL. E-mail: workmanfit@yahoo.com; workmanm@fit.edu

The collection and dissemination of information about people by businesses and governments is ubiquitous. One of the main threats to people's privacy comes from human carelessness with this information, yet little empirical research has studied behaviors associated with information carelessness and the ways that people exploit this vulnerability. The studies that have investigated this important question have not been grounded in theory. In particular, the extant literature reveals little about social engineering threats and the reasons why people may or may not fall victim. Synthesizing theory from the marketing literature to explain consumer behavior, an empirical field study was conducted to see if factors that account for successful marketing campaigns may also account for successful social engineering attacks.

Received July 23, 2007; revised October 9, 2007; accepted October 9, 2007
© 2007 Wiley Periodicals, Inc. Published online 21 December 2007 in Wiley InterScience (www.interscience.wiley.com). DOI: 10.1002/asi.20779
JOURNAL OF THE AMERICAN SOCIETY FOR INFORMATION SCIENCE AND TECHNOLOGY, 59(4):662-674, 2008

People have become increasingly aware of the pervasive threats to information security, and there are a variety of solutions now available for addressing the problem of information insecurity, such as improving technologies, including the application of advanced cryptography, or techniques, such as performing risk analyses and risk mitigation (Bresz, 2004; Sasse, Brostoff, & Weirich, 2004). There have also been important suggestions from the information systems (IS) security literature that include augmenting security procedures as a solution (cf. Debar & Viinikka, 2006), addressing situational factors such as reducing workload so that security professionals have time to implement the recommended procedures (Albrechtsen, 2007), improving the quality of policies (von Solms & von Solms, 2004), improving the alignment between an organization's security goals and its practices (Leach, 2003), and gaining improvements from software developers regarding security implementations during the software development cycle (Jones & Rastogi, 2004). Yet despite all these recommendations, people often fail to take basic security precautions, a failure that results in billions of dollars annually in individual and corporate losses and even in crimes (Calluzzo & Cante, 2004; Pahnila, Siponen, & Mahmood, 2007; Shreve, 2004). "Knowing better, but not doing better" is thus one of the key scholarly and practical issues that have not been fully addressed.

One area of particular concern involves threats from social engineering. Social engineering consists of techniques used to manipulate people into performing actions or divulging confidential information (Mitnick & Simon, 2002). Social engineers often attempt to persuade potential victims with appeals to strong emotions such as excitement or fear, whereas others utilize ways to establish interpersonal relationships or create a feeling of trust and commitment (Gao & Kim, 2007). For example, they may promise that a valuable prize or interest from a bank transfer will be given if the victim complies with a request for information. The emotional aspect of the interaction is distracting and serves to interfere with the victim's ability to analyze carefully the content of the message. The social engineer's illicitly gotten information may then be used to gain unauthorized access to computer systems to invade a person's privacy, commit fraud or industrial espionage, or to damage assets (Dodge, Carver, & Ferguson, 2007).

However, not all successful social engineering episodes result from duplicity; some people willingly give up sensitive information despite their awareness of the pervasive threats (Calluzzo & Cante, 2004; Straub & Nance, 1990). For example, although people generally state that they are concerned about information security and privacy, and even claim that they are willing to pay a fee to protect their personal information, in many cases they are willing to trade off privacy for convenience, or even bargain the release of very personal information in exchange for relatively small rewards (Acquisti & Grossklags, 2003; Leyden, 2004).

A review of the social psychology, management, and security literature revealed no theoretical framework specifically grounding the study of social engineering security threats, although the extant literature does suggest a significant
number of countermeasures. Beyond the technological and procedural recommendations to address the problem, the research into the underlying behavioral causes of information insecurity has suggested interventions consisting primarily of raising security awareness (Calluzzo & Cante, 2004; Dodge et al., 2007), rewarding and punishing behaviors (cf. Straub & Welke, 1998), providing instruction on situational ethics and responsible conduct (cf. Harrington, 1996; Hsu & Kuo, 2003; Kurland, 1995), and using training on specific security techniques (Straub & Nance, 1990). However, these have either not addressed the issue of social engineering directly, or their investigations were not grounded in theory to explain under which conditions social engineering interventions should work and why.

Social engineers have discussed the use of techniques used in marketing campaigns to persuade or gain a victim's compliance (e.g., Gao & Kim, 2007). The elaboration likelihood model (Petty & Cacioppo, 1986) has been utilized in marketing research to explain how people are persuaded to make purchases (Petty, Cacioppo, & Schumann, 1983), and has been employed in techniques by telemarketers (Schumann, Hathcote, & West, 1991). Although social engineering does not seek to sell a product or service, it does seek to persuade people to provide sensitive information in a similar fashion (Gao & Kim, 2007; Mitnick & Simon, 2002). Hence, using the elaboration likelihood model (ELM) as a framework and drawing from the social psychology literature on commitment, trust, and fear, we devised a field experiment to test its applicability in explaining social engineering threats, and to determine whether the defenses generally suggested against succumbing to marketing ploys, or the interventions suggested for other kinds of information security threats, might also be applied to social engineering countermeasures and interventions.

Theory and Hypotheses

Previous Research and Background

Although Langenderfer and Shimp (2001) and others have developed theoretical frameworks for some social engineering approaches, to our knowledge, the theory has not been empirically tested to ascertain what factors actually do account for a variety of successful social engineering attacks or effective concomitant countermeasures or interventions. However, linkages have been empirically tested between theory related to health-related threats and security threats (cf. Bresz, 2004; Gao & Kim, 2007; Pahnila et al., 2007; Siponen, 2000). As an example, Aldoory and Van Dyke (2006) tied the situational theory of publics (Grunig, 1978) and the health-belief model (Rosenstock, 1974) to issues related to bio-terrorism attacks. The situational theory of publics asserts that a populace may be segmented based on the activeness or passiveness of communication behavior (Aldoory & Van Dyke, 2006). The factors purported by this theory are problem recognition, the level of active involvement, and constraint recognition (Grunig, 1978).

Problem recognition reflects the extent to which an individual recognizes a problem as relevant to him or her; that is, how likely a threat is perceived to impact the person. The level of active involvement results from a perception of how emotionally the problem is felt, such as the perceived severity of damage to the person posed by the threat. Constraint recognition reflects the extent to which people perceive their behaviors as limited by factors beyond their own control. According to Grunig (1997), if these three factors accurately depict external conditions, then the environment must change before a person will respond, but if they are merely perceived (internal), they may be changed by persuasive communication; hence, persuasion is a key element in whether and how people respond to messages about a threat (Petty & Cacioppo, 1986).

Working from these premises, we reviewed the social psychology, management, and security literature to determine what had been studied relative to social engineering and information security. Three factors rose to the surface: trust (cf. Wang & Emurian, 2005), fear (cf. Straub & Welke, 1998), and commitment (Theoharidou, Kokolakis, Karyda, & Kiountouzis, 2005). Also, Mitnick and Simon's (2002) seminal work on social engineering outlined an attack architecture drawing from Petty and Cacioppo's (1986) elaboration likelihood model (ELM). From these, we synthesized theory from the literature to ground our investigation.

Social engineering often involves duplicity, such as using ruses or lures, to intercept sensitive information such as personal identification numbers, social security numbers, birth dates, and other personal information that can be used for identity theft and other frauds, or to gain illicit access to computers by stealing login information and passwords (Gao & Kim, 2007). Some social engineering techniques cause people to spend time and effort navigating through a labyrinth of links on a Web site, each time providing incrementally more information, to obtain some "free software" (which sometimes contains malicious programs, including "key-logger" Trojan horse programs that steal information) or to obtain some prize, or to allow a perpetrator the ability to use the intended victim's bank account while performing an alleged international money transfer in exchange for the promise of some reward; or the perpetrator may promise a valuable item in exchange for what appears to be a small financial transaction to cover shipping costs (Miller, 2005; Mitnick & Simon, 2002).

In most social engineering cases, the attacker refrains from coming into physical contact with the intended victim and relies instead on e-mail or telephone calls to perpetrate the attack (Mitnick & Simon, 2002) because it is in this setting that a peripheral route of persuasion can be best utilized (Cacioppo, Petty, Kao, & Rodriguez, 1986). The ELM distinguishes "central" from "peripheral" routes of persuasion, where a central route encourages an elaborative analysis of a message's content, and a peripheral one is a form of persuasion that does not encourage elaboration (i.e., extensive cognitive analysis) of the message content. Rather, it solicits acceptance of a message based on some adjunct
element, such as the perceived credibility, likeability, or attractiveness of the message sender, or "a catchy" phrase or slogan (Miller, 2005). For example, celebrities are frequently used to sell products with which they have no obvious or special expertise, and consumers often purchase these products because "they think they know" and like or identify with the celebrity (Cacioppo et al., 1986). Peripheral route persuasion is an important element in social engineering scams because it offers a type of shield for the attacker (Mitnick & Simon, 2002).

Cialdini (2001) identified six factors associated with peripheral route persuasion: (a) reciprocation, (b) consistency, (c) social proof, (d) likeability, (e) authority, and (f) scarcity (Table 1). Reciprocation theoretically involves normative commitment, in which people form implied obligations to others (Allen & Meyer, 1990). For example, when something of value is offered, such as a free sample, people feel obligated to return the favor by making a purchase that they had not originally intended. Consistency theoretically involves continuance commitment, where people become psychologically vested in a decision they have made (Petty, Briñol, & Tormala, 2002). A manifestation of consistency is observed in the economic concept of "loss aversion and sunk costs," where some people continue to spend money on losing ventures (Staw, 1981). Social proof theoretically involves affective commitment, where people model the behaviors of their peer group, role models, or important others, or behave in a certain way because it is generally "fashionable" (Asch, 1946). Likeability can be used to gain trust, where people trust and comply with requests from others whom they find attractive or perceive as credible and having special expertise or abilities, such as sports figures or actors they like (Giles & Wiemann, 1987; Horai, Naccari, & Fatoullah, 1974). Authority can be used to engender fear, where people obey commands to avoid a negative consequence such as losing a privilege or something of value, punishment, humiliation, or condemnation (Milgram, 1983). Scarcity is based on the principle of reactance, where people respond to perceived shortages by placing greater psychological value on perceived scarce items (Brehm, 1966).

TABLE 1. Peripheral route persuasion and research model.

Factor                    Construct
Normative commitment      Reciprocation as obligation
Continuance commitment    Cognitive investment and perceptual consistency
Affective commitment      Social "proof" as behavioral modeling and conformance
Trust                     Likeability and credibility
Fear                      Obedience to authority and acquiescence to threat of punishment or negative consequences
Reactance                 Scarcity and impulsivity

The social engineering attack architecture seeks to get the attention of the potential victim, hold his or her interest, arouse an emotion such as fear or desire, and then obtain some form of action (Gao & Kim, 2007). To accomplish this, most social engineering attacks appeal to multiple factors; for example, they may combine techniques that utilize authority, normative commitment, and scarcity in one scenario. Social engineering attack scenarios can take on many forms, but the two most common and fastest growing are pretexting and phishing (Dodge et al., 2007; Miller, 2005; Mitnick & Simon, 2002). In a pretext, an imposter creates a setting designed to influence an intended victim to release sensitive information, pay money, or perform actions that compromise the confidentiality of information.

In a highly publicized example in 2006, it was alleged that Hewlett Packard's Chief Executive Officer had authorized private investigators to use pretext to acquire the social security number of a journalist, and then under false pretense pose as employees of the journalist's telephone provider as a means of obtaining phone records, so that she could determine who had been leaking sensitive information to the press. In one particularly damaging case, identity thieves posed as legitimate business operators to gather thousands of social security numbers and other sensitive information about consumers from Choice Point, a subsidiary of Equifax. Other well-documented pretext frauds have also included people who have claimed to represent charitable foundations and make appeals, such as for donations to the Phuket tsunami relief fund, or to represent the Florida Highway Patrol or the United Christian Support Fund.

Phishing is a ruse designed to gain sensitive information from an intended victim by way of e-mail, Web pages, or letters that appear to be from genuine businesses, which command the potential victim to supply information to prevent an account from being closed, or as part of a promotion or give-away called a gimmie. A typical example is an e-mail that appears to come from a bank and contains authentic-looking company logos, requesting that potential victims click on a link contained in the e-mail and update financial information on a Web site that appears to be the legitimate institution's. In one recently publicized attack, victims received letters in the mail that appeared to be on letterhead from their bank, which instructed the victims to call a phone number supplied in the letter. An interactive voice response system then requested social security numbers, personal identification numbers, and verification information, including dates of birth and mothers' maiden names, which were then used to withdraw money from accounts and open new ones.

Theory: Commitment, Reciprocation, Consistency, and Social Proof

The commitment concept has enjoyed a long history of research in organizational settings. Commitment has generally been defined as an attitude that leads one to perseverant action, and it is a fairly stable personal characteristic (McCaul, Hinsz, & McCaul, 1995; Mowday, Steers, & Porter, 1979; Sagie, 1998). Allen and Meyer (1990) delineated types of commitments based on the situation and the target, such as a
brand, a person, or an organization. Normative commitment comes from a reciprocal exchange with a target, where someone will expend effort and perform actions because it is customary or obligatory (Beck & Wilson, 2000).

Social norms form around the "credibility of commitment actions" during an exchange, such that when a giver offers something valued by a receiver, the receiver experiences cognitive dissonance (Festinger & Carlsmith, 1959) until he or she provides something to the giver that is likewise valued (Bergman, 2006; Gundlach, Achrol, & Mentzer, 1995; Harmon-Jones & Mills, 1999). In this theory of social exchange, people decide fairness and develop a comparison against which they evaluate the "give-and-take." The extent of exchange varies according to the value or weighting they place on what is exchanged, and they consider whether the exchange is fair and reasonable (Kelley & Thibaut, 1978). In terms of ELM and commitment theory, people tend to honor a commitment to maintain this type of implied social contract even in some cases where there is imbalance (Theoharidou et al., 2005). For instance, if an incentive is given to someone in exchange for a promise, people who are highly committed tend to continue to live up to their end of the bargain even after the incentive is later withdrawn (Cacioppo et al., 1986; Cialdini, 2001).

Relative to information security, people sometimes will divulge sensitive or private information to those to whom they feel obligated (Bergman, 2006; Leyden, 2004; Theoharidou et al., 2005). In some forms of social engineering attacks, the perpetrator relies on this reciprocal commitment norm by offering something to the intended victim, and then requests or entices them to perform an action; for example, the perpetrator may offer a sum of money if the intended victim will allow the perpetrator to "park funds" in the intended victim's bank account while performing an international money transfer, or the perpetrator may promise a valuable item in exchange for what appears to be a small financial transaction to cover shipping costs (Mitnick & Simon, 2002; Panko, 2004). Consistent with these theories, when social engineers utilize peripheral route persuasion geared toward reciprocation, people who are more normatively committed are more susceptible to social engineering schemes than those who are not. Stated formally,

Hypothesis 1: People who are higher in normative commitment will succumb to social engineering more frequently than those who are lower in normative commitment.

Consistency theories, such as Festinger and Carlsmith's (1959) cognitive dissonance theory, suggest that people are motivated to maintain congruence between attitudes and behaviors, and as such, they often commit to actions through to conclusion, even despite disconfirming evidence, to retain a feeling of consonance (Brehm, 1966; Harmon-Jones & Mills, 1999). Thus, whereas normative commitment extends from a reciprocal exchange with a target, continuance commitment is the product of investment perceptions (time, effort, or money), where "commitment is viewed as a tendency to engage in consistent lines of activity based on the individual's recognition of the costs (or lost side-bets) associated with discontinuing an activity" (Allen & Meyer, 1990, pp. 2-3). With continuance commitment, people become psychologically vested in a decision they have made and maintain consistency in behaviors related to it (Petty et al., 2002), such as in the case where people "pour good money after bad" in the economic concept of "loss aversion and sunk costs" by continuing to spend money on losing ventures despite new evidence suggesting that the decision might be wrong (Arkes & Blumer, 1985; Staw, 1981). According to Cialdini (2001), there is evidence that some people have a tendency to think that persistent effort will eventually "pay off." This type of commitment explains why "some people will pay escalating costs to try to win a cheap teddy bear at a carnival" (Brill & Molton, 2006, p. 33).

Some research (e.g., Milne, Sheeran, & Orbell, 2000) using protection motivation theory (Rogers, 1975) in health-related threats has included psychological cost-benefit models, and extending from this line of reasoning to the information security and social engineering arena, people may be influenced in such a way as to continue investing in a risky proposition to gain something they value (Josephs, Larrick, Steele, & Nisbett, 1992; Pechmann, Zhao, Goldberg, & Reibling, 1993). Relative to information security and social engineering, in most cases the threat is designed so that the level of effort the victim invests will be lower than the purported benefits, and thus the decision to disregard precautions toward the threat emerges from the reasoning that the costs or risks are outweighed by the perceived possible benefits (Charbaji & Jannoun, 2005; Pechmann et al., 1993; Wilson, 2004). This cost-benefit assessment is a favorable or unfavorable affective and cognitive evaluation of a risk/reward that generally influences behavior, which may or may not lead to continuance commitment (Ajzen, 2002), where risk may be defined as a lack of predictability about an outcome or the consequences of an action, and reward as a monetary or nonmonetary item that has some intrinsic value to the recipient (Charbaji & Jannoun, 2005).

It is important to realize that people maintain different cost-benefit assessments, and hence continuance commitment attitudes, independently of the perceived business value or sensitivity of the information assets, particularly when it comes to self-interests (International Federation of Accountants, 2006). For example, although people generally state that they are concerned about information security and privacy, and even claim that they are willing to pay a fee to protect their personal information, in many cases they are willing to trade off privacy for convenience (Acquisti & Grossklags, 2003). This factor likely carries over to cost-benefit assessment and continuance commitment toward information security, and to whether people are willing to take precautions against social engineering threats or whether they continue to commit to their behavioral and psychological investments (Charbaji & Jannoun, 2005).

From the perspective of social engineering and information security, then, continuance commitment is the result of
a positive cost-benefit association in reference to the advantages that the perceived rewards may yield compared to the cost of taking precautions or the opportunity costs associated with a social engineering threat (Thomas, 2004). On the other hand, if the cost of taking precautions is perceived as small relative to a threat, or if the value proposition from the threat delivers only a small incremental degree of perceived value, people may take precautions against the threat (Pechmann et al., 1993). When people perceive that the benefits outweigh the cost of taking precautions, they exhibit continuance commitment in which they are more likely to yield to social engineering threats to gain the perceived possible rewards, and vice versa (Hsu & Kuo, 2003). Stated formally, we hypothesize that

Hypothesis 2: People who are higher in continuance commitment will succumb to social engineering more frequently than those who are lower in continuance commitment.

People form their self-concepts based, in part, on their relationships with or membership in certain social circles, which may be referred to as social identity (Tajfel & Turner, 1986), and this leads to affective commitment, which is a form of psychological attachment to others whom they like and with whom they identify (Allen & Meyer, 1990). Affective commitment causes people to expend effort and perform actions in exchange for the satisfaction that is derived from the emotional ties with the target (a brand, person, group, or organization) (Beck & Wilson, 2000). For example, people model the behavior of valued peer groups or important others in order to be associated with the restricted social group or "clique" (Asch, 1946). Many marketing campaigns take advantage of this tendency by persuading people to perform actions (i.e., make purchases) to maintain a relationship or an association with a certain fashionable group or famous individual (Cacioppo et al., 1986). In the context of this study, it is important to recognize that people vary in their degrees of social attachment and social identity (Asch, 1946; Beck & Wilson, 2000; Tajfel & Turner, 1986). That is to say, some people easily identify with others whereas others do not, and some people are more malleable than others in their affinity with people with whom they have no personal association or contact, such as in the case of identification with a famous person (Fennis, Pruyn, & Maasland, 2005).

In terms of information security and social engineering, people sometimes will divulge sensitive or private information to those to whom they feel committed even if these others do not have "a need to know." People do this because they have an attachment to or emotional bond with the target and believe that the relationship is paramount to the possible information security threat (Bergman, 2006; Theoharidou et al., 2005). Consistent with these theories, when social engineers utilize peripheral route persuasion geared toward social proof (persuasion that involves an affinity or identification with a social entity or important other), people who are more affectively committed are more susceptible to social engineering schemes that utilize social proof than those who are not. Stated more formally,

Hypothesis 3: People who are higher in affective commitment will succumb to social engineering more frequently than those who are lower in affective commitment.

Likeability and Trust

The concept of "liking," from an ELM point of view, draws from characteristics such as physical attractiveness, charisma, charm, or general popularity (Cialdini, 2001; Gass & Seiter, 1999). These features, in the context of peripheral route persuasion, tend to lead people to comply with requests in order to be liked by those they like (Cacioppo et al., 1986; Horai et al., 1974). In most social engineering cases, the attacker avoids coming into physical contact with the intended victim and instead relies on e-mail, postal letters, Web sites, or telephone calls to perpetrate the attack, and thus it is difficult for the social engineer to telegraph characteristics such as charm or charisma (Dotterweich & Collins, 2006). Therefore, the social engineer tries to get the potential victim to like the perpetrator and to gain his or her trust by establishing a friendly rapport (Gendall, 2005). They do this in what is sometimes referred to as a confidence scheme by preying on one's loneliness, making appeals to one's need for friendship, creating a sense of similarity with the potential victim, or feigning ties with or even pretending to be a likeable famous individual (Mitnick & Simon, 2002).

Asch (1946) and others (cf. Casciaro & Lobo, 2005; Guadagno & Cialdini, 2002; Horai et al., 1974; Komito, 1994) have identified that people usually "trust those they like," and conversely, they usually "like those they trust." Moreover, people trust those they perceive as credible, such as those having a special expertise or ability, even if the perceived ability or expertise is not related to the particular item the "expert" represents (Cacioppo et al., 1986; Gass & Seiter, 1999; Horai et al., 1974). For example, people may be persuaded by a professional basketball player or a famous actor they like to purchase batteries or a breakfast cereal (Giles & Wiemann, 1987). Hence, the basis for liking, from an information security and social engineering perspective, is trust (Charbaji & Jannoun, 2006; Guadagno & Cialdini, 2002; Wang & Emurian, 2005; Yakov, Shankar, Sultan, & Urban, 2005).

Some research (e.g., Walczuch & Lundgren, 2004) has defined various forms or types of trust; nevertheless, some people have a greater propensity to trust generally than do others (Charbaji & Jannoun, 2006; Chen & Barnes, 2007; Krishnan & Martin, 2006; Stephenson, 2008). Under-trust in one setting may result in foregone beneficial opportunities, paranoia, and unnecessary tensions, but over-trust leads to ineffective monitoring, fraud, reduced efficiency, and incompetence (Stephenson, 2008). Over-trust may "limit the cognitive efforts of [people] when they consider their broader environment . . . and the cognitive comfort that trust brings about also limits variety of thought and action and
attentiveness to detail" (Krishnan & Martin, 2006, pp. 894-895), making one more susceptible to peripheral route persuasion.

A substantial body of literature describes the nature of "online trust" and its important role in conducting online business transactions, but it also hints that a trusting nature can lead one into the potential of falling victim to social engineering ploys (Guadagno & Cialdini, 2002; Wakefield & Whitten, 2006; Yakov et al., 2005). Consider, for example, that online retailers often utilize techniques on their Web sites that people associate with familiar brick-and-mortar facilities, such as images of a building, trusted logos, or famous people (Walczuch & Lundgren, 2004). Social engineers regularly employ these same techniques in phony Web sites or e-mail. They also utilize close-distance personal writing styles that attempt to establish a rapport with the potential victim to prey on his or her loneliness or need for friendship, and they strive to create a feeling of similarity with potential victims to gain their trust (Mitnick & Simon, 2002). Consequently, we hypothesize that

Hypothesis 4: People who are more trusting will succumb to social engineering more frequently than those who are less trusting.

Fear, Authority, and Scarcity

Telemarketers and debt collectors frequently utilize fear tactics and authoritative commands to gain a person's compliance (FTC, 2003). Borrowing from these techniques, social engineers have used authority and fear tactics to elicit information or actions from potential victims (Gao & Kim, 2007; Mitnick & Simon, 2002; Panko, 2004). A common phishing technique, for example, is to broadcast an e-mail containing an urgent subject line to get the potential victim's attention. Samples of actual phishing e-mail subject lines have included "Alert From Chase Card Services," "Your eBay account will be Suspended!" "Please Update Your Bank of America Profile: Personal Information Error," and "Urgent! Invalid information added to your PayPal account." Contained within the e-mails are dire warnings along with official-looking logos and formatting, followed by instructions to browse to a Web page and enter information or to call a particular phone number. A common pretext is to telephone members of an elderly association and impersonate a government official, then, using pressure and fear tactics, instruct the potential victims to enroll in an unnecessary or even fraudulent insurance program (Rusch, 1999).

Milgram's (1983) landmark work on obedience to authority offered provocative evidence of the extent to which people will submit to the commands of an authority figure. Obedience creates actions in deference to those who have perceived coercive power (Weatherly, Miller, & McDonald, 1999), such as one who can terminate a bank account or some valued privilege. Authority therefore can be used to engender fear, where people obey commands to avoid a negative consequence such as losing a privilege or something of value, punishment, humiliation, or condemnation (Milgram, 1983). Social engineers prey on the impulses of those who respond to fear from an authoritative command (Cacioppo et al., 1986; Cialdini, 2001; FTC, 2003; Miller, 2005; Mitnick & Simon, 2002).

However, people vary in their obedience and the degrees to which they will comply with commands (Helm & Morelli, 1985). Factors such as deference to authority play an important role in obedience and persuasion, even when people are exposed to those who possess highly authoritarian characteristics (Blass, 2000). Moreover, when people feel threatened or coerced, they sometimes strive to repel the coercion by establishing a psychological defense mechanism of resistance (Donelson, 1973). This reactance may be triggered by events that impede a perceived freedom of choice, and by any perceived social influences that make an individual feel a force pushing him or her to act. It may, at times, motivate efforts to restore the lost freedom to prevent the loss of other freedoms (Brehm & Cole, 1966).

When fear appeals are made, people respond based on the magnitude of the perceived severity of a depicted threat, the probability people perceive of that event's occurrence, and the efficacy of the recommended response (Rogers, 1975). If the portrayed threat is not believed, or if the event is thought not to be severe, or if the recommended solution is not believed to be adequate to deal with the threat, people may resist (Severin & Tankard, 1997). Therefore, in some cases, people will readily comply with someone who "seems" authoritative, whereas others remain skeptical and resist (Brehm, 1966; Donelson, 1973). When social engineers use authority to project fear or a threat, those who respond more readily and obediently to authority are more likely to comply with these requests than people who are more skeptical and remain defiant (Weatherly et al., 1999). Therefore, we hypothesize that

Hypothesis 5: People who are more obedient to authority will succumb to social engineering more frequently than those who are less obedient to authority.

Fear has an additional element that impacts how people may or may not respond to social engineering threats. Similar to how authority may trigger reactance, scarcity may engender a reactive hoarding impulse (Melamed, Szor, Barak, & Elizur, 1998; Plomin, DeFries, & McClearn, 2001) in which people may react quickly, and at times illogically, to perceived shortages (Brehm, 1966). Social engineers often try to gather information or elicit an action based on the premise that the potential victim will run out of time or the opportunity to capitalize on gaining some scarce item (Lynn, 1992; Rutte, Wilke, & Messick, 1987). Reactance theory (Brehm, 1966) posits that people may change their views and behavior when they think their ability to act might be curtailed by an external constraining force, such as a shortage in the supply of a valued item. In essence, people react when they perceive that their freedom is restricted or threatened regarding an important matter on which
they think themselves capable of making a choice among different alternatives (Pennebaker & Sanders, 1976). Thus, when some people feel their freedom to act and make choices is threatened, they experience a dissonance that motivates a reactance to the perceived constraining threat (Rusch, 1999).

Yet people differ in their perceptions of perceived threats and in whether they react or resist (Dowd & Seibel, 1990; Lindsey, 2005), particularly if the threat involves a perceived shortage rather than a direct personal threat (Dowd & Seibel, 1990). If social engineers use a technique that strives to gather information or elicit an action based on the premise that the potential victim may run out of time or the opportunity to capitalize on gaining some scarce item, then people who react more readily to the scarcity threat will more often become victims of this type of social engineering threat than those who tend to resist such threats (Guadagno & Cialdini, 2002; Pennington & Hastie, 1986). We therefore hypothesize that

Hypothesis 6: People who are more reactant will succumb to social engineering more frequently than those who are more resistant.

Method

We chose a field study employing two data-gathering techniques: a questionnaire containing antecedent factors, and an observation of behaviors related to the dependent variables (described below). Field studies, having strong ecological validity, are thought to be a good choice when the study is highly relevant to real-world events and the researchers wish to generalize to the world of practice.

Sample and Data Collection

Our field study was conducted at a large services organization involved in the insurance and financial industries in the United States; it is a government-regulated entity that has had serious security breaches in the past. In the public interest, they encouraged us to study the problem and acceded to our requirement that participation would be anonymous and that the data gathered would be held in strict confidence. Prior to engaging in the study, we received approval from the institutional human-subjects review board at Florida State University, and we had guidance and assistance from the company's human resources representatives and the corporate attorneys.

The company monitors data and communication as a standard practice and requires employees to sign an employment agreement that includes their consent to monitoring when they are hired. Laws in the United States and the European Union support the right of corporations to inspect and monitor work and workers, which arises from needs related to business emergencies and a corporation's rights to protect its interests (Harvey, 2007; Keck, 2005; Losey, 1998; Borrull & Oppenheim, 2004), and this is now a common organizational practice (D'Urso, 2006). Companies can monitor employees, and, while it is advisable to take overt action to notify employees of the practice, it is not required by law (Scholz, 1997).

The data collection consisted of two parts, a questionnaire and objective observations. For the questionnaire, we drew from Allen and Meyer's (1990) items for the commitments, Gendall (2005) for trust, and Lindsey (2005) and Weatherly et al. (1999) for obedience to authority and reactance/resistance. We also gathered self-report items for the dependent variables to determine their correlation with the objective observations of social engineering behaviors. Eight hundred fifty participants were randomly selected from the company directory and 612 responded; however, of those 612, 24 questionnaires were incomplete (perhaps the result of a loss of network connection, as most contained duplicate message authentication codes with completed questionnaires) and were thus discarded, which yielded a 69% response rate at a 3.5% sampling confidence with a standard error of estimate of 0.05, indicating a high level of sampling validity (Salant & Dillman, 1994).

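As a quick arithmetic check of the figures just reported (this is not part of the original analysis), the usable sample and response rate follow directly from the counts given in the text. The sketch below also shows how a conventional 95% sampling-error bound for a proportion would be computed; the roughly 4% it yields is a conservative approximation (p = .5, no finite-population correction) and is not meant to reproduce the reported 3.5% figure, which presumably reflects the authors' own assumptions.

    import math

    # Counts reported in the Sample and Data Collection section.
    selected = 850                     # randomly selected from the company directory
    returned = 612                     # questionnaires returned
    incomplete = 24                    # discarded as incomplete
    usable = returned - incomplete     # 588 usable responses

    response_rate = usable / selected
    print(f"usable n = {usable}, response rate = {response_rate:.1%}")   # ~69%

    # Conservative 95% margin of error for a proportion (p = 0.5, no
    # finite-population correction, since total head count is not reported).
    margin = 1.96 * math.sqrt(0.5 * 0.5 / usable)
    print(f"approximate sampling error = +/-{margin:.1%}")
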
Procedures

Corporate executive sponsors facilitated entrée once the researchers had signed a nondisclosure and a confidentiality agreement. The researchers were provided with a company directory of the population under study, including locations and e-mail addresses. The corporate sponsors sent each participant a message, with an acknowledgement flag set, in which they were informed, using the cover story, that the researchers were interested in studying employee perceptions about telemarketing, and were asked for their cooperation. They were assured of the confidentiality of their responses in the message. The researchers then contacted participants via e-mail and attachment with a cover letter, once again using the cover story and ensuring the confidentiality of the respondents, along with an announcement of the URL of the online data collection instrument. In addition, each participant received an authentication password. When participants took the questionnaire, the authentication password was used to produce a message digest to keep track of who had completed the questionnaire and to ensure that the questionnaire was taken only once by each participant.

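The digest step described above can be illustrated with a minimal sketch. The paper does not name the hash function or the exact scheme, only that the authentication password was used to produce a message digest so that completion could be tracked and repeat submissions blocked without tying stored responses to an identity; the salted SHA-256 scheme, the salt value, and the function names below are therefore assumptions for illustration only.

    import hashlib

    STUDY_SALT = "telemarketing-survey"   # assumed value, not from the paper

    def completion_digest(auth_password: str) -> str:
        """One-way digest of a participant's authentication password."""
        return hashlib.sha256((STUDY_SALT + auth_password).encode("utf-8")).hexdigest()

    completed = set()   # digests of passwords already used to submit

    def record_submission(auth_password: str) -> bool:
        """Record a submission; return False if this password was already used."""
        digest = completion_digest(auth_password)
        if digest in completed:
            return False              # each questionnaire may be taken only once
        completed.add(digest)
        return True
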
The researchers collected phishing messages they had received over the course of many months, as well as those provided by an information security consultant who specialized in social engineering. Pretext scenarios were constructed with the assistance of the information security consultant, analysts from a market research firm, and members of the company's security department who had collected successful and thwarted incidents at the company. The researchers, the information security consultant, and the analysts from the market research firm reviewed the materials to ensure that the range of persuasive message characteristics was adequately covered. Student-actors were then enlisted from the university's college of visual arts and theatre to perpetrate the phishing and pretext ruses. Telemarketers from the market research firm rehearsed the student-actors on the delivery of the persuasive messages, and the information security consultant coached them on specific pretexts. Phishing attacks were conducted using e-mails containing the various techniques to get participants to click on Web page URLs and enter personal or company-confidential information, or to open e-mail attachments containing an executable program that created a network connection and reported back to the researchers that the file had been opened. Pretexts were done with telephone calls to participants in which the actors pretended to be various officials, internal employees, employees of trading partners, customers, utility companies, and financial institutions, and solicited confidential information using the study's range of persuasive techniques.

Measures

As indicated, the factors were collected as self-report items using an online questionnaire. To examine the construct validity of the self-report independent variables (IVs) on social engineering security behavior, we ran a Varimax-rotated principal components analysis on the relevant items. If items for the six IVs discriminate the constructs as intended, they will load highly on the posited two outcome measures and not cross-load. The loadings did indeed cleanly discriminate between the measures. Whereas we tested the various factors via the direct effects of the six IVs on behavioral social engineering responses, the analysis offers strong evidence that attempts to capture in-practice assessments of these threats at a macro level will likely find that the factors do account for the relationships with the social engineering outcomes.

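The construct-validity check just described can be sketched as follows. This is not the authors' code; it is a minimal illustration, assuming the item responses are available as a respondents-by-items matrix and that six components (one per independent variable) are extracted, of a principal components analysis followed by a Varimax rotation of the loadings.

    import numpy as np
    from sklearn.decomposition import PCA

    def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
        """Standard iterative Varimax rotation of an (items x factors) loading matrix."""
        p, k = loadings.shape
        rotation = np.eye(k)
        total = 0.0
        for _ in range(max_iter):
            rotated = loadings @ rotation
            # SVD step of the usual Varimax criterion update
            u, s, vt = np.linalg.svd(
                loadings.T @ (rotated ** 3
                              - (gamma / p) * rotated @ np.diag((rotated ** 2).sum(axis=0)))
            )
            rotation = u @ vt
            new_total = s.sum()
            if total != 0 and new_total < total * (1 + tol):
                break
            total = new_total
        return loadings @ rotation

    def rotated_loadings(items: np.ndarray, n_components: int = 6) -> np.ndarray:
        """items: (n_respondents x n_items) matrix of questionnaire scores (illustrative)."""
        pca = PCA(n_components=n_components)
        pca.fit(items)
        # Component loadings = eigenvector * sqrt(eigenvalue)
        loadings = pca.components_.T * np.sqrt(pca.explained_variance_)
        # Each item should load strongly on its intended construct and show
        # no sizeable cross-loadings.
        return varimax(loadings)
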
Phishing e-mails and pretext attacks were carried out over a period of 6 months, in which the coadjutors used social engineering to try to gain confidential information from each participant two times each week. An example was to send an e-mail asking participants to click on a link to update personal and confidential information. The frequency of violations was collected and examined for each participant.

TABLE 2. Descriptive statistics, scale reliabilities, and intercorrelations of study variables. Variables: 1. Normative commitment; 2. Continuance commitment; 3. Affective commitment; 4. Trust; 5. Obedience; 6. Reactance; 7. Employee age; 8. Employee gender; 9. Employee education; 10. Previous victimization; 11. Subjective behaviors; 12. Objective behaviors.
Note. N = 588. All correlations greater than r = .14 are significant at p < .001; correlations greater than .12 are significant at p < .01.

Results

Before the hypotheses were tested, we needed to take a preliminary step to determine whether the objective measures of security behaviors (phishing and pretext) were relatively independent, or whether they were indicators of an underlying security behavior construct. As a result, we looked at the frequency with which the person responded to pretexts and phishing attempts or opened "potentially" destructive e-mail attachments. See the descriptive statistics in Table 2.

To determine whether these behaviors represented separate constructs or constituted a single variable (i.e., social engineering security behavior), we conducted an exploratory factor analysis. This analysis showed that the behaviors loaded on a single factor. Further, the reliability of the items in the factor was high (α = .88; Stevens, 1989). As a result, we standardized each of the observed items and then reverse coded the Z-score for the frequency with which the person positively responded to an adverse social engineering behavior. We then created a single measure referred to as observable social engineering security behavior. Thus, objective behavior represented the objective measures of employee social engineering security behavior, whereas the variable subjective represented the self-reported social engineering behavior.

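A compact sketch of the composite-building steps just described (standardize the observed frequencies, reverse code, check internal consistency, and average into a single index) is given below. The direction of the reverse coding and the variable names are illustrative assumptions; only the steps themselves are taken from the text, which reports α = .88 for the items.

    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """Cronbach's alpha for an (n_people x n_items) score matrix."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    def objective_behavior_index(freqs: np.ndarray) -> np.ndarray:
        """freqs: (n_people x n_probes) counts of responses to the phishing/pretext probes."""
        z = (freqs - freqs.mean(axis=0)) / freqs.std(axis=0, ddof=1)  # standardize each item
        z = -z   # reverse code so higher scores mean fewer insecure responses (assumed direction)
        print(f"alpha = {cronbach_alpha(z):.2f}")                     # the paper reports .88
        return z.mean(axis=1)   # single observed security-behavior measure per person
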
Each of the hypotheses was then tested using both the self-reported social engineering behaviors and the objective social engineering behavior measures. This allowed for a more robust test of the hypotheses than relying upon subjective or objective measures alone. In the current study, the self-report and objective measures were very highly correlated (r = .89). Even with this high degree of correlation, however, we cannot assume that the measures are completely interchangeable (Bommer, Johnson, Rich, Podsakoff, & MacKenzie, 1995). In fact, nearly 20% of the variance between these two measures remains unexplained. As a result, we left the objective and subjective measures separate so that the hypotheses could be tested against each criterion variable. To test the hypotheses contained in this research, an ordinary least squares regression was conducted whereby the variables of interest and the controls were regressed on the self-report and objective measures of social engineering behavior. The results of the regression models are reported in Table 3.

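The regression setup just described can be sketched as follows, assuming a DataFrame with one row per participant; the column names are illustrative, not the study's actual variable names. Standardizing the predictors and each criterion before fitting makes the fitted coefficients directly comparable to the standardized coefficients reported in Table 3, and the adjusted R-squared is read from the fitted model.

    import pandas as pd
    import statsmodels.api as sm

    PREDICTORS = ["normative_commitment", "continuance_commitment", "affective_commitment",
                  "trust", "obedience", "reactance",
                  "age", "gender", "education", "previous_victimization"]

    def standardized_ols(df: pd.DataFrame, criterion: str):
        """Regress a standardized criterion on standardized predictors (one model per criterion)."""
        cols = PREDICTORS + [criterion]
        z = (df[cols] - df[cols].mean()) / df[cols].std(ddof=1)
        X = sm.add_constant(z[PREDICTORS])
        return sm.OLS(z[criterion], X).fit()

    # Usage, assuming such a DataFrame existed:
    # subjective_fit = standardized_ols(df, "subjective_behavior")
    # objective_fit = standardized_ols(df, "objective_behavior")
    # subjective_fit.params        # standardized coefficients (cf. Table 3)
    # subjective_fit.rsquared_adj  # model adjusted R-squared
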
To provide a more controlled assessment of the six hypotheses of interest, we included a number of control variables: the employee demographics of age, gender, and education, and whether someone had previously fallen victim to a social engineering ploy. The regression results provide a very consistent story across the two social engineering security behavior measures. The first hypothesis stated that people who were higher in normative commitment would succumb to social engineering more frequently than those who were lower in normative commitment. This hypothesis was supported for both self-report (b = .13, p < .001) and objective (b = .11, p < .01) social engineering security behaviors. Likewise, hypothesis 2 proposed that if people were higher in continuance commitment, they would succumb to social engineering more frequently than those who were lower in continuance commitment. Again, this was supported for both self-report (b = .17, p < .001) and objectively observed (b = .20, p < .001) social engineering security behaviors.

Hypothesis 3 stated that people who were higher in affective commitment would fall victim to social engineering more frequently than those who were lower in affective commitment. As hypothesized, this was supported for both self-report (b = .16, p < .001) and objective (b = .20, p < .001) social engineering security behaviors. Hypothesis 4 stated that people who were more trusting would succumb to social engineering more frequently than those who were less trusting. Again, as hypothesized, this was supported for both self-report (b = .24, p < .001) and objectively collected (b = .16, p < .01) social engineering security behaviors.

Because there are indications that people comply with authority figures when requests are made, hypothesis 5 proposed that people who are more obedient to authority would succumb to social engineering more frequently than those who are less obedient to authority. This was supported for both self-report (b = .10, p < .01) and objectively collected (b = .11, p < .01) social engineering security behaviors. Hypothesis 6 presented an interesting case. It proposed that people who were more reactant would yield to social engineering more frequently than those who were more resistant. Although the relationships were positive, they were not significant for either the self-report (b = .06, p > .05) or objective (b = .06, p > .05) measures.

TABLE 3. Ordinary least squares regression results for self-reported and objective observations of security behaviors.

Variable                    Subjective Security Behaviors    Objective Security Behaviors
Normative commitment        .13***                           .11**
Continuance commitment      .17***                           .20***
Affective commitment        .16***                           .20***
Trust                       .24***                           .16**
Obedience                   .10**                            .11**
Reactance                   .04                              .06
Employee age                -.13***                          -.10*
Employee gender             .01                              .01
Employee education          -.07*                            -.12**
Previous victimization      .03                              .01
Model adjusted R²           .46                              .40

Note. N = 588. All variables are standardized regression coefficients.

Discussion

In terms of information security defenses, most of the research has investigated either available security technologies or the management of security infrastructure, such as conducting risk analyses for the application of technological defenses. However, the defense problem has a behavioral grounding, and the significance of people's failure to take precautions against information security threats has been largely ignored, especially concerning social engineering threats. Failing to take information security precautions is a significant issue. Carelessness with information and failure to take available precautions contribute to the loss of information and even to crimes such as corporate espionage and identity theft, of which the U.S. Department of Justice (2004) estimates that 1 in 3 people will become victims at some point in their lifetime. Social engineering is a major avenue for information security breaches, yet other than anecdotal materials, there has been little to help managers address the problem because the relationships among personal factors and social engineering outcomes have not been thoroughly investigated or explained.

Social engineering takes many different forms, although et al., 1979; Sagie, 1998) managers need to be able to assist
the techniques mainly rely on peripheral route persuasion. employees in recognizing and discriminating appropriate
As such, the ELM has offered a promising framework for targets of their commitments; that is, commitment to company
understanding the ways in which social engineers gather means withholding commitments from potential threats.
sensitive information or get unwitting victims to comply Next, a common security countermeasure is to compartmen-
with their requests for actions. Our investigation has talize roles and allocate information on a “need to know”
attempted to bridge the theory that explains how people are basis so that sensitive information is not inadvertently
persuaded through peripheral routes with the social engi- leaked. However, this technique runs counter to many orga-
neering outcomes using an empirical field study in which we investigated whether the factors that account for how people are persuaded in marketing campaigns to make purchases may also apply to social engineering attempts to get people to give up confidential information.

Specifically, we found that people who are high in normative commitment feel obligated to reciprocate social engineering gestures and favors, such as receiving free software or gift certificates, by giving up company e-mail addresses, employee identification numbers, financial and insurance data, and other confidential and sensitive information. Likewise, people who are high in continuance commitment tend to provide information in response to escalating requests. We found that people high in continuance commitment will even give up increasingly sensitive information as part of an online game just to try to win the game. High affective commitment was also found to contribute to successful social engineering; these individuals tend to provide information because they want to be part of a socially desirable group, or to be accepted. Thus, all three of Allen and Meyer's (1990) types of commitment were found to be salient in social engineering attacks.

Online trust has been studied in relation to whether people will conduct "e-business," and that work has provided some provocative indicators that this factor may also lead to social engineering susceptibility. Social engineers often apply techniques that try to cultivate trust by establishing a friendly rapport or by referring to likable famous individuals. Consistent with this, we found that people who are trusting were more likely to fall victim to social engineering than those who are distrusting. This creates a double bind when juxtaposed with the need for online trust in e-business.

Whereas some people are more persuaded by trust and friendly rapport, others are more responsive to authority figures. We tested authoritative commands and fear tactics in relation to people's levels of obedience to authority and reaction to scarcity. Higher degrees of obedience to authority were an important factor in whether people responded to these types of social engineering attacks; however, we found no support for reactance to scarcity. In a marketing sense, perhaps people will strive to own something in dwindling supply, but it does not appear that reactance to scarcity, such as a "time is running out" technique, accounts for whether people succumb to such social engineering ploys.

Based on these findings, several recommendations are made for managers. Because commitment is a fairly stable personal characteristic and is instrumental in effectively functioning organizations (McCaul et al., 1995; Mowday et al., 1979), managers may look to organizational theories and interventions that strive for organic structuring and open communications. People need to be inculcated with a sense of ethical conduct and responsibility, and must be trustworthy. Trust, however, is a two-way street: trustworthy employees expect to be trusted. As such, training is seen as an important component in dealing with social engineering. Perhaps people do not connect their general willingness to protect sensitive information with the duplicity that may occur. Training should mitigate this by first making employees aware and second by developing new coping behaviors. This is particularly important in relation to online trust. Technologies such as authentication and watermarking are used as countermeasures, and training can assist people in knowing what to look for before trusting. Finally, in dealing with the issue of obedience to authority, corporate security policies should be established that address the classification of information and the circumstances under which sensitive information can be divulged. These policies should also include the processes and accountability for reporting suspected incidents, so that people who are obedient to authority have a clear delineation of the lines of authority and the roles and responsibilities of the actors. Regular, ongoing security awareness programs should then be conducted to prevent complacency.

Some specific limitations are worth noting. First, there is always some discrepancy between what people report about their behaviors and what they actually do; we therefore utilized observational measures to address this limitation. The high correlation and the consistent results indicate good congruence between our self-report subjective measures and those we observed. Outside of that issue, however, our participants were those who responded to the requests and filled out the questionnaire posted online. Even though our response rate and sampling confidence were good, there is clearly a complication in that those who did not respond may have made important contributions to our investigation and to our findings. Next, the ways people cope with threats are largely socially influenced and could be modally sensitive; for instance, the influences may differ in chatrooms versus e-mail.

Our study involved only people in the United States. It would be interesting to research an international population to determine if the framework can be generalized across social and cultural contexts. Also along these lines, additional research is needed in an organizational and group-work context. Finally, some frameworks (e.g., Langenderfer & Shimp, 2001) have suggested that demographics are important variables in susceptibility to various factors. Because our study focused on hypothesis testing rather than model
building, we used age, gender, and education as control variables. Future research may consider empirically modeling demographic influences on the dependent measures to deepen our understanding of the social engineering phenomenon.

References

Acquisti, A., & Grossklags, J. (2003, May). Losses, gains, and hyperbolic discounting: An experimental approach to information security attitudes and behavior. Paper presented at the 2nd Annual Workshop on Economics and Information Security (WEIS'03), Berkeley, CA.
Ajzen, I. (2002). Perceived behavioral control, self-efficacy, locus of control, and the theory of planned behavior. Journal of Applied Social Psychology, 32, 665–683.
Albrechtsen, E. (2007). A qualitative study of users' view on information security. Computers and Security, 26(4), 276–289.
Aldoory, L., & Van Dyke, M.A. (2006). The roles of perceived shared involvement and information overload in understanding how audiences make meaning of news about bioterrorism. Journalism & Mass Communication Quarterly, 83, 346–361.
Allen, N.J., & Meyer, J.P. (1990). The measurement and antecedents of affective, continuance and normative commitment to the organization. Journal of Occupational Psychology, 63, 1–18.
Arkes, H., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35, 124–140.
Asch, S.E. (1946). Forming impressions of personality. Journal of Abnormal and Social Psychology, 41, 258–290.
Beck, K., & Wilson, C. (2000). Development of affective organizational commitment: A cross-sectional examination of change with tenure. Journal of Vocational Behavior, 56, 114–136.
Bergman, M.E. (2006). The relationship between affective and normative commitment: Review and research agenda. Journal of Organizational Behavior, 27, 645–663.
Blass, T. (2000). Invited response to review of Obedience to authority: Current perspectives on the Milgram paradigm. British Journal of Educational Psychology, 70, 624–625.
Bommer, W.H., Johnson, J.L., Rich, G.A., Podsakoff, P.M., & MacKenzie, S.B. (1995). On the interchangeability of objective and subjective measures of employee performance: A meta-analysis. Personnel Psychology, 48, 587–605.
Borrull, A.L., & Oppenheim, C. (2004). Legal aspects of the web. In B. Cronin (Ed.), Annual review of information science and technology (Vol. 38, pp. 483–548). Medford, NJ: Information Today.
Brehm, J.W. (1966). A theory of psychological reactance. New York: Academic Press.
Brehm, J.W., & Cole, N.H. (1966). Effects of a favor that reduces freedom. Journal of Personality and Social Psychology, 3, 420–426.
Bresz, F.P. (2004, July–August). People—Often the weakest link in security, but one of the best places to start. Journal of Health Care Compliance, 57–60.
Brill, D.W., & Molton, P. (2006). Escalation of economic costs, sunk costs, and opportunity costs: A psychological investment perspective. International Journal of Human Decisioning, 12, 29–38.
Cacioppo, J.T., Petty, R.E., Kao, C.F., & Rodriguez, R. (1986). Central and peripheral routes to persuasion: An individual difference perspective. Journal of Personality and Social Psychology, 51, 1032–1043.
Calluzzo, V.J., & Cante, C.J. (2004). Ethics in information technology and software use. Journal of Business Ethics, 51, 301–312.
Casciaro, T., & Lobo, M.S. (2005). Competent jerks, lovable fools, and the formation of social networks. Harvard Business Review, 83, 92–99.
Charbaji, A., & Jannoun, S.E.L. (2005). Individuality, willingness to take risk, and use of a personal e-card. Journal of Managerial Psychology, 20, 51–58.
Chen, Y.-H., & Barnes, S. (2007). Initial trust and online buyer behavior. Industrial Management and Data Systems, 107, 21–36.
Cialdini, R.B. (2001). Influence: Science and practice. Boston: Allyn & Bacon.
Debar, H., & Viinikka, J. (2006). Security information management as an outsourced service. Information Management & Computer Security, 14(5), 417–435.
Department of Justice. (2004). Violation of 18 U.S.C. § 1030(a)(5)(B): Gaining unauthorized access. Cybercrime Report, 17, 188–219.
Dodge, R.C., Carver, C., & Ferguson, A.J. (2007). Phishing for user security awareness. Computers & Security, 26, 73–80.
Donelson, E. (1973). Personality: A scientific approach. Pacific Palisades, CA: Goodyear Publishing.
Dotterweich, D.P., & Collins, K.S. (2006). The practicality of Super Bowl advertising for new products and companies. Journal of Promotion Management, 11, 19–31.
Dowd, E.T., & Seibel, C.A. (1990). A cognitive theory of resistance and reactance: Implications for treatment. Journal of Mental Health Counseling, 12, 458–469.
D'Urso, S.C. (2006). Who's watching us at work? Toward a structural-perceptual model of electronic monitoring and surveillance in organizations. Communication Theory, 16, 281–303.
Fennis, B.M., Das, E., & Pruyn, A.T.H. (2006). Interpersonal communication and compliance: The disrupt-then-reframe technique in dyadic influence settings. Communication Research, 33, 136–151.
Festinger, L., & Carlsmith, J.M. (1959). Cognitive consequences of forced compliance. Journal of Abnormal and Social Psychology, 58, 203–210.
FTC. (2003). Complying with the telemarketing sales rule. Retrieved January 29, 2007, from http://www.ftc.gov/bcp/conline/pubs/buspubs/tsrcomp.htm
Gao, W., & Kim, J. (2007). Robbing the cradle is like taking candy from a baby. Paper presented at the Annual Conference of the Security Policy Institute, Amsterdam, the Netherlands.
Gass, R.H., & Seiter, J.S. (1999). Persuasion, social influence, and compliance gaining. Needham Heights, MA: Allyn & Bacon.
Gendall, P. (2005). Can you judge a questionnaire by its cover? The effect of questionnaire cover design on mail survey response. International Journal of Public Opinion Research, 17, 346–361.
Giles, H., & Wiemann, J.M. (1987). Language, social comparison and power. In C.R. Berger & S.H. Chaffee (Eds.), The handbook of communication science (pp. 350–384). Newbury Park, CA: Sage.
Grunig, J.E. (1997). A situational theory of publics: Conceptual history, recent challenges and new research. In D. Moss, T. MacManus, & D. Vercic (Eds.), Public relations research: An international perspective (pp. 3–48). London: International Thomson Business Press.
Guadagno, R.E., & Cialdini, R.B. (2002). On-line persuasion: An examination of differences in computer-mediated interpersonal influence. Group Dynamics: Theory, Research and Practice, 6, 38–51.
Gundlach, G.T., Achrol, R.S., & Mentzer, J.T. (1995). The structure of commitment in exchange. Journal of Marketing, 59, 78–92.
Harmon-Jones, E., & Mills, J. (1999). Cognitive dissonance: Progress on a pivotal theory in social psychology. Washington, DC: American Psychological Association.
Harrington, S.J. (1996). The effect of codes of ethics and personal denial of responsibility on computer abuse judgments and intentions. MIS Quarterly, 20, 257–258.
Harvey, C. (2007, April). The boss has new technology to spy on you. Datamation, 1–5.
Helm, C., & Morelli, M. (1985). Obedience to authority in a laboratory setting: Generalizability and context dependency. Political Studies, 14, 610–627.
Horai, J., Naccari, N., & Fatoullah, E. (1974). The effects of expertise and physical attractiveness upon opinion agreement and liking. Sociometry, 37, 601–606.
Hsu, M.-H., & Kuo, F.-Y. (2003). An investigation of volitional control in information ethics. Behavior and Information Technology, 22, 53–62.
International Federation of Accountants. (2006). Intellectual assets and value creation: Implications for corporate reporting. Paris, France. Retrieved November 12, 2007, from http://www.oecd.org/dataoecd/2/40/37811196.pdf
Jones, R.L., & Rastogi, A. (2004). Secure coding: Building security into the software development life cycle. Information Systems Security, 13, 29–39.
Josephs, R.A., Larrick, R.P., Steele, M., & Nisbett, R.E. (1992). Protecting the self from the negative consequences of risky decisions. Journal of Personality and Social Psychology, 62, 26–37.
Keck, R. (2005). Disruptive technologies and the evolution of the law. Legal Briefs, 23, 22–49.
Kelley, H.H., & Thibaut, J. (1978). Interpersonal relations: A theory of interdependence. New York: Wiley.
Kim, Y.H., & Kim, D.J. (2005, January). A study of online transaction self-efficacy, consumer trust, and uncertainty reduction in electronic commerce transactions. Paper presented at the 38th Annual Hawaii International Conference on System Sciences (HICSS), Big Island, HI.
Komito, L. (1994). Communities of practice and communities of trust: Global culture and information technology. Journal of Anthropology, 4, 33–45.
Krishnan, R., & Martin, X. (2006). When does trust matter to alliance performance? The Academy of Management Journal, 49, 894–917.
Langenderfer, J., & Shimp, T.A. (2001). Consumer vulnerability to scams, swindles, and fraud: A new theory of visceral influences on persuasion. Psychology and Marketing, 18, 763–783.
Leach, J. (2003). Improving user security behavior. Computers and Security, 22(8), 685–692.
Leyden, J. (2004, February 6). Clueless office workers help spread computer viruses. The Register, 17–21.
Lindsey, L.L.M. (2005). Anticipated guilt as behavioral motivation: An examination of appeals to help unknown others through bone marrow donation. Human Communication Research, 31, 453–481.
Losey, R.C. (1998). The electronic communications privacy act: United States Code. Orlando, FL: The Information Law Web. Retrieved November 12, 2007, from http://floridalawfirm.com/privacy.html
Lynn, M. (1992). Scarcity's enhancement of desirability. Basic and Applied Social Psychology, 13, 67–78.
McCaul, H.S., Hinsz, V.B., & McCaul, K.D. (1995). Assessing organizational commitment: An employee's global attitude toward the organization. Journal of Applied Behavioral Science, 31, 80–90.
Melamed, Y., Szor, H., Barak, Y., & Elizur, A. (1998). Hoarding: What does it mean? Comprehensive Psychiatry, 39, 400–402.
Milgram, S. (1983). Obedience to authority: An experimental view. New York: Harper-Collins.
Miller, K. (2005). Communication theories: Perspectives, processes, and contexts. New York: McGraw-Hill.
Milne, S., Sheeran, P., & Orbell, S. (2000). Prediction and intervention in health-related behaviour: A meta-analytic review of protection motivation theory. Journal of Applied Social Psychology, 30, 106–143.
Mitnick, K., & Simon, W.L. (2002). The art of deception: Controlling the human element of security. New York: Wiley.
Mowday, R.T., Steers, R.T., & Porter, L.W. (1979). The measurement of organizational commitment. Journal of Vocational Behavior, 14, 224–247.
Pahnila, S., Siponen, M.T., & Mahmood, A. (2007, January). Employees' behavior towards IS security policy compliance. Paper presented at the 40th Hawaii International Conference on System Sciences (HICSS), Big Island, HI.
Panko, R.R. (2004). Corporate computer and network security. Upper Saddle River, NJ: Pearson/Prentice-Hall.
Pechmann, C., Zhao, G., Goldberg, M., & Reibling, E.T. (2003). What to convey in antismoking advertisements for adolescents: The use of protection motivation theory to identify effective message themes. Journal of Marketing, 67, 1–18.
Pennebaker, J.W., & Sanders, D.Y. (1976). American graffiti: Effects of authority and reactance arousal. Personality and Social Psychology Bulletin, 2, 264–267.
Pennington, N., & Hastie, R. (1986). Evidence evaluation in complex decision making. Journal of Personality and Social Psychology, 51, 242–258.
Petty, R.E., Briñol, P., & Tormala, Z.L. (2002). Thought confidence as a determinant of persuasion: The self-validation hypothesis. Journal of Personality and Social Psychology, 82, 722–741.
Petty, R.E., & Cacioppo, J.T. (1986). Communication and persuasion: Central and peripheral routes to attitude change. New York: Springer-Verlag.
Petty, R.E., Cacioppo, J.T., & Schumann, D.W. (1983). Central and peripheral routes to advertising effectiveness: The moderating role of involvement. Journal of Consumer Research, 10, 135–146.
Plomin, R., DeFries, J.C., McClearn, G.E., & McGuffin, P. (2001). Behavioral genetics (4th ed.). New York: Worth Publishers.
Rogers, R.W. (1975). A protection motivation theory of fear appeals and attitude change. Journal of Psychology, 91, 93–114.
Rosenstock, I.M. (1974). Historical origins of the health belief model. Health Education Monographs, 2, 328–335.
Rusch, J.J. (1999). The social engineering of Internet fraud. Report of the U.S. Department of Justice. Paper presented at the INET'99 Conference. Retrieved February 6, 2007, from http://www.isoc.org/inet99/proceedings/3g/3g_2.htm
Rutte, C.G., Wilke, H.A.M., & Messick, D.M. (1987). Scarcity or abundance caused by people or the environment as determinants of behavior in the resource dilemma. Journal of Experimental Social Psychology, 23, 208–216.
Sagie, A. (1998). Employee absenteeism, organizational commitment, and job satisfaction: Another look. Journal of Vocational Behavior, 52, 156–171.
Salant, P., & Dillman, D.A. (1994). How to conduct your own survey. New York: John Wiley & Sons.
Sasse, M.A., Brostoff, S., & Weirich, D. (2004). Transforming the "weakest link": A human/computer interaction approach to usable and effective security. BT Technology Journal, 19, 122–131.
Scholz, J.T. (1997). Enforcement policy and corporate misconduct: The changing perspective of deterrence theory. Law and Contemporary Problems, 60, 153–268.
Schumann, D.W., Hathcote, J.M., & West, S. (1991, September). Corporate advertising in America: A review of published studies on use, measurement, and effectiveness. Journal of Advertising, 35–55.
Severin, W.J., & Tankard, J.W. (1997). Communication theories: Origins, methods, and uses in the mass media. New York: Addison-Wesley.
Shreve, M. (2004, September). The office now a major place for identity theft. Crain's, 1–4.
Siponen, M.T. (2005). Analysis of modern IS security development approaches: Towards the next generation of social and adaptable ISS methods. Information and Organization, 15, 339–375.
Siponen, M., & Iivari, J. (2006). Six design theories for IS security policies and guidelines. Journal of the Association for Information Systems, 7(7), 445–472.
Siponen, M.T. (2000). A conceptual foundation for organizational information security awareness. Information Management & Computer Security, 8, 31–41.
Staw, B.M. (1981). The escalation of commitment to a course of action. The Academy of Management Review, 6, 577–587.
Stephenson, K. (2008). The quantum theory of trust: Power, networks, and the secret life of organizations. Englewood Cliffs, NJ: Prentice-Hall.
Stevens, J. (1989). Intermediate statistics: A modern approach. Hillsdale, NJ: Erlbaum.
Straub, D.W., & Nance, W.D. (1990). Discovering and disciplining computer abuse in organizations: A field study. MIS Quarterly, 14, 45–62.
Straub, D.W., & Welke, R.J. (1998). Coping with systems risk: Security planning models for management decision-making. MIS Quarterly, 22, 441–469.
Tajfel, H., & Turner, J.C. (1986). The social identity theory of inter-group behavior. In S. Worchel & L.W. Austin (Eds.), Psychology of intergroup relations. Chicago: Nelson-Hall.
Theoharidou, M., Kokolakis, S., Karyda, M., & Kiountouzis, E. (2005). The insider threat to information systems and the effectiveness of ISO17799. Computers and Security, 24, 472–484.
Thomas, T.M. (2004). Network security first-step. Indianapolis, IN: Cisco Press.
von Solms, B., & von Solms, R. (2004). The 10 deadly sins of information security management. Computers & Security, 23, 371–376.
Wakefield, R.L., & Whitten, D. (2006). Examining user perceptions of third-party organization credibility and trust in an e-retailer. Journal of Organizational and End User Computing, 18, 1–19.
Walczuch, R., & Lundgren, H. (2004). Psychological antecedents of institution-based consumer trust in e-retailing. Information & Management, 42, 159–177.
Wang, Y.D., & Emurian, H.H. (2005). An overview of online trust: Concepts, elements, and implications. Computers in Human Behavior, 21, 105–125.
Weatherly, J.N., Miller, K., & McDonald, T.W. (1999). Social influence as stimulus control. Behavior and Social Issues, 9, 25–46.
Wilson, R. (2004, January). Understanding the offender/environment dynamic for computer crimes: Assessing the feasibility of applying criminology theory to the IS security context. Paper presented at the 37th Hawaii International Conference on System Sciences (HICSS), Big Island, HI.
Yakov, B., Shankar, V., Sultan, F., & Urban, G.L. (2005). Are the drivers and role of online trust the same for all web sites and consumers? A large-scale exploratory empirical study. Journal of Marketing, 69, 133–152.