

perspective
Value-Sensitive Design
Batya Friedman, Colby College and The Mina Institute

Values emerge from the tools that we build and how we choose to use them. Yet, in most of the current practice in designing computer technology and the related infrastructure of cyberspace, little is said about values.

After all, values—especially moral values—can be controversial. They can also seemingly conflict with economic goals and can be difficult to articulate clearly and to translate into meaningful processes and designs. What then is there to be said about accounting for human values in system design? How should we say it?

Over the past decade, my colleagues and I have addressed these questions as we have taken an active stance toward creating computer technologies that—from an ethical position—we can and want to live with [2, 4–7]. I have also sought to develop a framework for understanding how specific values play out in the design of computer systems, including how these values can be undermined or promoted by the technology. In this article, I discuss projects (conducted in collaboration with Helen Nissenbaum) that concern two values: user autonomy and freedom from bias. I conclude with some reflections on the importance of value-sensitive design, justifiable limitations on its implementation, and ways in which it complements economic mandates.

User Autonomy¹
Consider a recent workstation design that comes from a leading computer hardware and networking company in the United States (see [20] for a detailed discussion). The workstation was designed to support speech input and multimedia, and thus included a built-in microphone. Nothing strange here. Except that the microphone automatically recorded audio information whenever the workstation was on. Now, imagine that you are in the middle of a video-conferencing session and a visitor comes into your office, or the phone rings. The only way to ensure audio privacy is to turn off the application, which is a cumbersome solution. Alternatively, a simple solution existed in the design process (ultimately vetoed by the design team): to install a hardware on/off switch on the microphone at the cost of 25 cents.

This example illustrates how hardware design can either hinder or help the user’s ability to control the technology. More generally, it speaks to the value of user autonomy. By this term we refer to individuals who are self-determining, who are able to decide, plan, and act in ways that they believe will help them to achieve their goals and promote their values. People value autonomy because it is fundamental to human flourishing and self-development [8, 9].

¹User autonomy was the topic of a workshop Nissenbaum and I organized at CHI ’96 [6]. Participants in the workshop included Bay-Wei Chang, Ise Henin, David Kirsh, Pekka Lehtio, Nicole Parrot, and Mark Rosenstein. They contributed to the ideas developed here.

How can designs promote user autonomy? From the previous example, there might seem to be a simple answer: If autonomous individuals need to have freedom to choose means and ends, then it could be said that whenever possible and at all levels, designers should provide users the greatest possible control over computing power. On closer scrutiny, however, there is a more complex picture. After all, think of a text editor that helps a non-technically minded user to create consistent and well-formatted documents. Such users will have little interest in explicitly controlling lower levels of operation of the editor even though they will appreciate control over higher level functions. They will have little interest, say, in controlling how the editor executes a search and replace operation or embeds formatting commands in the document, and more interest in controlling the efficient and effective formatting of the document. In this case, achieving the higher order desires and goals, such as efficiently producing a good-looking document, will enhance autonomy, whereas excessive control over all levels of operation of the editor may actually interfere with user autonomy by obstructing users’ ability to achieve desired goals.

In other words, autonomy is protected when users are given control over the right things at the right time. Of course, the hard work of design is to decide these whats and whens.

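To make the distinction concrete, consider a minimal sketch (in Python; the class and the markup codes are hypothetical, invented here for illustration rather than drawn from any particular editor). The interface gives the user control at the level of her goals, namely what is a heading and what is body text, while the low-level markup stays an internal detail:

    class FormattedDocument:
        """A hypothetical editor interface: users control goal-level
        choices (headings, paragraphs) while low-level formatting
        codes remain a private implementation detail."""

        def __init__(self):
            self._runs = []  # internal (style, text) pairs

        def add_heading(self, text):
            self._runs.append(("heading", text))

        def add_paragraph(self, text):
            self._runs.append(("body", text))

        def render(self):
            # The markup language itself is never exposed; surfacing
            # it would burden rather than empower most users.
            codes = {"heading": "\\h1{%s}", "body": "\\p{%s}"}
            return "\n".join(codes[style] % text for style, text in self._runs)

    doc = FormattedDocument()
    doc.add_heading("Quarterly Report")
    doc.add_paragraph("Sales rose modestly this quarter.")
    print(doc.render())  # emits markup the user never had to manage

The design question, on this view, is not how much control to expose but which level of control serves the user's actual ends.
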
Accordingly, Nissenbaum and I have begun to identify aspects of systems that can promote or undermine user autonomy. The four that I discuss here are system capability, system complexity, misrepresentation of the system, and system fluidity.

System Capability
Recall the introductory example of a workstation microphone with no hardware on/off switch. Here (assuming no software fixes), users lack the capability to easily interrupt a videoconference for a private office conversation. Similarly, users sometimes need but are denied access to low-level manipulations from within an operating system. Or consider a state-of-the-art 3-D rendering system designed by a leading software company to support artists with the animation process. The system designers assumed users would work from hand-drawn or computer-generated sketches. But when some users preferred to work from video images, it was discovered that there was no way to connect video input to the system. All three cases illustrate that user autonomy can be undermined when the computer system does not provide the user with the necessary technological capability to realize his or her goals.

System Complexity
In some instances, systems may supply users with the necessary capability to realize their goals, but such realization becomes difficult because of complexity. We can see this problem clearly with the recent proliferation of features in many word processing programs. Although more features mean more capability, more features often increase a program’s complexity and thereby decrease its usability, especially for the novice user. Granted, a designer can rightfully expect users to spend time learning. But the question is, How much time? Moreover, other problems of system complexity arise from a mismatch between the abilities of the user—for example, skill level, memory, attention span, computational ability, and physical ability—and those required to use the system efficiently.

Misrepresentation of the System
Users can experience a loss of autonomy when provided with false or inaccurate information about the computer system. Imagine, for example, the package copy for a news filter that states “this news filter is as good as the best personal assistant.” Given the state of the field, such hyperbole will mislead a user who believes the package copy and, thus, develops inaccurate expectations of the software agent’s ability to realize the user’s goals. Or consider the following vignette related by a colleague in her remarks in a panel at CHI ’95. My colleague had recently visited a MOO—one that users visit with the mutually understood goal of meeting and interacting with others.


While there, she “walked” into a bar and started to chat with the bartender. Several minutes into the conversation, she had a realization: the bartender was a bot—not a person taking part in on-line conversation! So much for thinking she had met a person she might like to know better. There is also the loss of time, the feeling of deception, and the residual doubt “will the next encounter seemingly with a person actually be with someone’s code?” Such experiences can undermine the quality of interactions in an electronic community and ultimately affect the user’s original goals for participation.

System Fluidity
With the proliferation of electronic information, automated filters—such as mail agents and news filtering software—have become increasingly common. Now, let’s assume that initially a filter does a good job of discarding or never even retrieving unwanted information. Over time, however, the user’s goals can change. A single woman, for example, might have little interest in company e-mail on maternity leave and child care, but 3 years later and expecting a baby, the same messages might be wanted. But because the filter does not provide the woman with information about what was previously discarded, she has no way of assessing what she is currently missing. Indeed, she would have no hint that the mail agent even needs to be reprogrammed or retrained. The point here is that users’ goals often change over time. Thus to support user autonomy, systems need to take such change into account and provide ready mechanisms for users to review and fine-tune their systems [11, 12].

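As a concrete sketch of such a mechanism (Python; the names and rules are invented for illustration, not a description of any actual product), consider a filter that quarantines rather than silently deletes, so the user can see what she is missing and retune the rules as her goals change:

    from dataclasses import dataclass, field

    @dataclass
    class Message:
        subject: str
        body: str

    @dataclass
    class ReviewableFilter:
        blocked_keywords: set
        quarantine: list = field(default_factory=list)

        def accept(self, msg):
            text = (msg.subject + " " + msg.body).lower()
            if any(kw in text for kw in self.blocked_keywords):
                self.quarantine.append(msg)  # record, never silently discard
                return False
            return True

        def review(self):
            # A ready mechanism for seeing what is currently filtered out.
            return [m.subject for m in self.quarantine]

        def unblock(self, keyword):
            # Support changing goals: retune the filter over time.
            self.blocked_keywords.discard(keyword)

    f = ReviewableFilter(blocked_keywords={"child care"})
    f.accept(Message("Company child care openings", "Sign-ups start Monday."))
    print(f.review())        # ['Company child care openings']
    f.unblock("child care")  # three years later, goals have changed

The quarantine exists precisely for the review step: the design keeps the user's changing goals, not the filter's past configuration, in control.
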

Bias in Computer Systems²
In its most general sense, “bias” means simply “slant.” Given this undifferentiated usage, bias can describe both moral and nonmoral circumstances. In our work, however, Nissenbaum and I have been focusing on computer technologies with biases as a source of moral concern; thus, we use the term bias in a more restricted sense. We say that a computer technology is biased if it systematically and unfairly discriminates against certain individuals or groups of individuals in favor of others. A technology discriminates unfairly if it denies an opportunity or a good, or if it assigns an undesirable outcome to an individual or group of individuals on grounds that are unreasonable or inappropriate.

How then does bias become embedded in computer systems? In our study of 17 existing systems, we have identified three overarching ways: preexisting bias, technical bias, and emergent bias.

²Much of the material discussed here has been adapted from several sources [3, 5].

Preexisting Bias
Preexisting bias has its roots in social institutions, practices, and attitudes. When computer technologies embody biases that exist independently of, and usually before, the creation of the technology, we say that the technology embodies preexisting bias. Preexisting biases may originate in society at large, in subcultures, or in formal or informal organizations and institutions. They can also reflect the personal biases of individuals who have significant input into the design of the technology, such as the client or the system designer. This type of bias can enter a technology either through the explicit and conscious efforts of individuals or institutions, or implicitly and unconsciously, even despite the best of intentions. Consider, for example, software that a colleague purchased for his school-age daughter.


He writes, “Well, of course, the first thing that happens is this: you get to choose which one of three basic adventurers you want to be—a male thief, a male magician, or a male warrior. Nice choice for a young girl, huh?” [3, p. 49]

More formally, Huff and Cooper [10] conducted a study showing that software designers sometimes unknowingly design software that is more aligned with males than with females. In the study, subjects were asked to propose designs for software to teach seventh graders the correct use of commas. One group of subjects was asked to design the software for seventh-grade boys, the second group to design for seventh-grade girls, and the third group to design for seventh-graders, gender unspecified. Huff and Cooper reported that along a number of dimensions the designs proposed by subjects in the gender-unspecified group closely resembled the designs proposed by subjects who designed for boys and were significantly different from the designs proposed by subjects who designed for girls. The study illustrates how preexisting biases, in the form of expectations about which software will appeal to each gender, coupled with the implicit assumption that the generic user of software is likely to be male, can influence design and give rise to bias in software.

Whereas gender bias is deeply embedded in Western society, other biases can serve individual or corporate interests. The Sabre and Apollo computerized airline reservation systems that have been charged with intentional bias [17] are one example. The limited size of display screens often means that all possible flights that match a traveler’s request cannot be shown simultaneously on the screen. The flights that are shown on the initial screen (or the part of the file that is visible before the user scrolls) are more likely to be selected by travel agents and their clients [19]. In the Sabre and Apollo systems, because of details in the algorithm certain airlines regularly appear on the first screen and thus have systematic advantages over those airlines whose flights do not.

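The actual ranking rules at issue were proprietary, but a minimal sketch (Python, with invented airline names, fares, and weights) can illustrate the general mechanism: a small, technically plausible detail buried in a sort key, combined with a screen that shows only the first few results, yields a systematic first-screen advantage.

    from dataclasses import dataclass

    SCREEN_SIZE = 2  # only the first results fit on the initial screen

    @dataclass
    class Flight:
        airline: str
        fare: int
        stops: int

    def sort_key(flight, host):
        # The buried detail: stops on the host carrier are penalized
        # half as much as stops on any other carrier.
        weight = 1 if flight.airline == host else 2
        return (flight.stops * weight, flight.fare)

    def first_screen(flights, host):
        ranked = sorted(flights, key=lambda f: sort_key(f, host))
        return [f.airline for f in ranked[:SCREEN_SIZE]]

    flights = [
        Flight("HostAir", fare=330, stops=1),
        Flight("RivalAir", fare=300, stops=1),  # cheaper, same route
        Flight("RivalAir", fare=280, stops=2),
    ]
    print(first_screen(flights, "HostAir"))  # ['HostAir', 'RivalAir']

Because first-screen flights are the ones most likely to be booked [19], the host carrier's more expensive flight wins bookings it would lose under a neutral ranking.
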
Technical Bias
In contrast to preexisting bias, technical bias arises from the resolution of issues in the technical design. Sources of technical bias can be found in several aspects of the design process, including

• Limitations of computer tools such as hardware, software, and peripherals;
• The process of ascribing social meaning to algorithms developed out of context;
• Imperfections in pseudo-random number generation (a small sketch of this item follows the list); and
• The attempt to make human constructs amenable to computers—when, for example, we quantify the qualitative, make discrete the continuous, or formalize the nonformal.

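One concrete instance of the third item is the familiar modulo idiom for mapping a random source onto a range. A minimal sketch (Python; the tiny 3-bit source is chosen only to make the arithmetic visible):

    from collections import Counter

    raw_values = range(8)  # an ideal, perfectly uniform 3-bit source: 0..7
    counts = Counter(v % 3 for v in raw_values)
    print(counts)  # Counter({0: 3, 1: 3, 2: 2})

Eight equally likely raw values cannot divide evenly among three outcomes, so outcomes 0 and 1 each occur with probability 3/8 while outcome 2 occurs with probability only 2/8. If such a draw selected, say, whose tax return is audited or who is called for an interview, the resulting unfairness would be purely technical in origin.
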

The use of computer punch card tallying systems for national and local elections provides a case in point. Undereducated groups are more likely to not understand how the computerized system works and, thereby, to invalidate their own votes by either not voting for a position, or by voting for more than one person per position [1]. When this occurs, features of the interface create unfair difficulties for undereducated voters and thus pose serious problems for fair elections (see also [16]).

Graphical user interfaces (GUIs) provide another example of technical bias. Before the invention of GUIs, many visually impaired individuals were active members of the computing community and many supported themselves as computer professionals. GUIs then appeared and rapidly became a standard. Key to our discussion, the development of GUIs was motivated not by preexisting social bias against visually impaired computer users, but rather by new technical developments in graphics-oriented hardware and displays coupled with the belief in the old adage “a picture is worth a thousand words.” Although the new GUI standard improves access to the technology for many, it has effectively shut out visually impaired computer users. On a more positive note, to counter this technical bias the computer industry has invested substantially in attempting to design technical means that would allow visually impaired users access to the information buried in the graphical user interface.


Emergent Bias
Although it is almost always possible to identify preexisting bias and technical bias in a design at the time of creation or implementation, emergent bias arises only in a context of use by real users. This bias typically emerges some time after a design is completed, as a result of a change in societal knowledge, user population, or cultural values. For example, much of the educational software developed in the United States embeds learning activities in a game environment that rewards competitive and individual playing strategies. When such software is used by students with a cultural background that eschews competition and instead promotes cooperative endeavors, such students can be placed at a disadvantage in the learning process. Or consider the National Resident Match Program (NRMP), a computerized matching system for assigning medical residents to hospital residency programs, that was designed and implemented in the 1950s. The initial algorithm used by the NRMP placed couples (where both members of the couple were residents seeking a match) at a disadvantage in the matching process compared with their single peers. However, this bias emerged only in the late 1970s and early 1980s when increasing numbers of women entered medical school and a growing number of couples sought residencies [14, 15]. To the credit of those who oversee the NRMP, recent revisions in the algorithm place couples more equitably in matches.

Design Methods to Minimize Bias
As the computing community develops a better understanding of bias in system design, we can correspondingly develop techniques to avoid or minimize it. Certainly an initial step involves identifying or “diagnosing” bias, ideally during the earliest stages of the design phase, when negotiating the system’s specifications with the client and constructing the first prototypes. Then comes the task of remedying. Some current designers, for instance, address the problem of handedness by allowing the user to toggle between a right-handed or a left-handed configuration for user input and screen display. Elsewhere, members of the Archimedes Project at Stanford University are developing an approach to designing for people with physical disabilities [13]. More broadly, the HCI community has considerable experience in designing usable systems for both color-blind and color-sighted individuals by the simple means of redundant information: Whatever pertinent information is communicated through color is also communicated in some other form. The more success we have with specific designs, the more readily we can develop more systematic approaches to both process and technique. For example, in their attempt to systematize attention to bias in design, NYNEX has begun to embed methods for minimizing bias into their certified design processes for ISO 9001 registration [21]. Eventually, the standards themselves should require such value-sensitive processes.

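A minimal sketch of the redundancy principle (Python; the names and channels are invented for illustration, in the same spirit as the handedness toggle above): a status indicator that carries the same information as color, text, and symbol, so that no single perceptual channel is required to use the system.

    from enum import Enum

    class Status(Enum):
        # Redundant encoding: color, label, and symbol all carry
        # the same information.
        OK = ("green", "OK", "+")
        WARNING = ("yellow", "Warning", "!")
        ERROR = ("red", "Error", "x")

    def render(status):
        color, label, symbol = status.value
        return {"color": color, "text": label, "symbol": symbol}

    print(render(Status.ERROR))
    # {'color': 'red', 'text': 'Error', 'symbol': 'x'}

Whatever the color communicates, the text and the symbol communicate as well; a color-blind user loses nothing.
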
Conclusion
Although computer technology is expensive to develop, it is comparatively inexpensive to produce and disseminate, and thus the values embedded in any given implementation are likely to be widespread, pervasive, and systematic. Moreover, unlike with people with whom we can disagree about values, we cannot easily negotiate with the technology. Although inattention to moral values in any enterprise is disturbing, it is particularly so in the design of computer technology. Thus, I have suggested that in the design of computer technology we value human values as understood from an ethical standpoint.

This is not to say that one or more such values, such as autonomy or freedom from bias, necessarily override others. For example, when systems are employed in contexts that seriously affect human welfare, such as in an air traffic control system, we may need to restrict autonomy to protect against a user with malicious intentions or well-intentioned users guided by poor judgment. Neither is it always clear how to balance competing values. Take, for example, the issue of standardization in design. On the one hand, some forms of standardization will restrict how users can control technology. On the other hand, standardization can free users from the burden of relearning how to work with the technology when they switch among stations or systems. In these situations, standardization provides users with not lesser but greater control over the technology.

Certainly, more thinking is needed on how to balance human values. For example, it may be that reconciling the ideals of standardization and autonomy depends on identifying the appropriate level of user control. It will also be necessary to consider the relationships between moral values and economic goals. Part of the difficulty here is that economic goals are themselves human values, and at times even moral ones, as when people strive through economic means to attain autonomy, that is, the freedom and responsibility to attain necessary goods for themselves and their families. But personal and corporate economic goals also have a long history of undermining moral values: when, for example, a drive for profits runs roughshod over the development of safe products and the fair treatment of workers and customers. In such cases, it should go without saying that the economic needs to give way to the moral.

Moral values can also support economic goals. Yet this point often gets lost in bottom-line cost–benefit analyses. For example, protecting user autonomy means giving users control at the appropriate level over their machines. This protection can translate into marketable features, such as privacy and security. Minimizing bias in a design also likely leads to a larger market share because such systems are typically accessible to a greater diversity of users (e.g., users who are color-sighted or color-blind, right-handed or left-handed, male or female). Moreover, even when a greater market share is not directly anticipated, there is the goodwill of customers who associate a company or its product with moral purposes. Such an association is difficult to quantify economically, no doubt; but so too are the economic benefits of advertising. It is also worth noting that retrofitting a design is vastly more costly than building things right the first time.

Which brings us to what it means to build things “right.” Right by what criteria? Some currently accepted ones in system design include reliability, efficiency, and correctness. Yet there is a growing consensus that we need also to include criteria that embody or at least help foster core human values [18, 21]. Thus, in future work we need to continue to conceptualize values carefully and then study them in both laboratory settings and organizations. Moreover, we will need to examine our own design practices from this perspective. By such means, designs could be judged poor and designers negligent. As with the traditional criteria of reliability, efficiency, and correctness, we do not require perfection in value-sensitive design, but a commitment. And progress.

About the Author
Batya Friedman is Associate Professor of Computer Science at Colby College. She received both her B.A. and her Ph.D. from the University of California at Berkeley. Her areas of specialization are human-computer interaction and the human relationship to technology. She has written numerous research articles in addition to designing educational software and consulting on human values in system design. Currently she is editing a book titled Human Values and the Design of Computer Technology, to be published shortly by the Center for the Study of Language and Information, Stanford University. Author’s address: Department of Mathematics and Computer Science, Colby College, Waterville, ME 04901; e-mail: b_friedm@colby.edu.


References

1. Dagger, R. Annals of democracy. The New Yorker (November 7, 1988): 40–46, 51–52, 54, 56, 61–68, 97–100, 102–108.

2. Friedman, B. (ed.). Human values and the design of computer technology. Center for the Study of Language and Information, Stanford University, Stanford, Calif., in press.

3. Friedman, B., Brok, E., Roth, S. K., and Thomas, J. Minimizing bias in computer systems: CHI ’95 workshop. SIGCHI Bulletin 28, 1 (1996): 48–51.

4. Friedman, B. and Kahn, P. H., Jr. Human agency and responsible computing: Implications for computer system design. Journal of Systems Software 17 (1992): 7–14.

5. Friedman, B. and Nissenbaum, H. Bias in computer systems. ACM Transactions on Information Systems 14, 3 (1996): 1–18.

6. Friedman, B. and Nissenbaum, H. User autonomy: Who should control what and when? In Conference Companion of the Conference on Human Factors in Computing Systems, CHI ’96. Association for Computing Machinery, New York, April 1996, p. 433.

7. Friedman, B. and Winograd, T. (eds.). Computing and social responsibility: A collection of course syllabi. Computer Professionals for Social Responsibility, Palo Alto, Calif., 1990.

8. Gewirth, A. Reason and morality. University of Chicago Press, Chicago, 1978.

9. Hill, T. E., Jr. Autonomy and self-respect. Cambridge University Press, United Kingdom, 1991.

10. Huff, C. and Cooper, J. Sex bias in educational software: The effect of designers’ stereotypes on the software they design. Journal of Applied Social Psychology 17 (1987): 519–532.

11. Laurel, B. Interface agents: Metaphors with character. In B. Laurel (ed.), The art of human-computer interface design. Addison-Wesley, Reading, Mass., 1990, pp. 355–365.

12. Malone, T. W., Lai, K. Y., and Fry, C. Experiments with Oval: A radically tailorable tool for cooperative work. In Proceedings of the ACM Conference on Computer Supported Cooperative Work (CSCW ’92), Toronto, Ontario, November 1992.

13. Perry, J., Macken, E., Scott, N., and McKinley. Disability, inability and cyberspace. In B. Friedman (ed.), Human values and the design of computer technology. Center for the Study of Language and Information, Stanford University, Stanford, Calif., in press.

14. Roth, A. E. The evolution of the labor market for medical interns and residents: A case study in game theory. Journal of Political Economy 92 (1984): 991–1016.

15. Roth, A. E. New physicians: A natural experiment in market organization. Science 250 (1990): 1524–1528.

16. Roth, S. K. The unconsidered ballot: How design effects voting behavior. Visible Language 28 (1994): 48–67.

17. Shifrin, C. A. Justice will weigh suit challenging airlines’ computer reservations. Aviation Week & Space Technology (March 1985), p. 105.

18. Shneiderman, B. and Rose, A. Social impact statements: Engaging public participation in information technology design. In B. Friedman (ed.), Human values and the design of computer technology. Center for the Study of Language and Information, Stanford University, Stanford, Calif., in press.

19. Taib, I. M. Loophole allows bias in displays on computer reservations systems. Aviation Week & Space Technology (February 1990), p. 137.

20. Tang, J. C. Eliminating a hardware switch: Weighing economics and values in a design decision. In B. Friedman (ed.), Human values and the design of computer technology. Center for the Study of Language and Information, Stanford University, Stanford, Calif., in press.

21. Thomas, J. C. Steps toward universal access within a communications company. In B. Friedman (ed.), Human values and the design of computer technology. Center for the Study of Language and Information, Stanford University, Stanford, Calif., in press.

