
Security in Practice: Examining the Collaborative Management of Personal Sensitive

Information in Childcares and Medical Centers

Laurian Claire Vega

Dissertation Proposal submitted to the faculty of the Virginia Polytechnic Institute and State
University in partial fulfillment of the requirements for the degree of

Doctor of Philosophy
in the
Department of Computer Science

Steve Harrison
Dennis Kafura
D. Scott McCrickard
Enid Nicole Montague
Deborah Tatar

March 8th, 2010


Blacksburg, Virginia

Keywords: Security, Trust, Privacy, Usable Security, Computer Supported Collaborative Work
Security in Practice: Examining the Collaborative Management of Personal Sensitive
Information in Childcares and Medical Centers

Laurian C. Vega

ABSTRACT

Traditionally, security was conceptualized as rules, locks, and passwords. More recently,
security research has explored how it is part of a larger socio-technical system that involves
people working with technology and their environments to create safe praxis. When
examining security as one part of a larger socio-technical system, issues of trust, privacy, and
negotiation appear: trust, because people work together through a set of agreed rules;
privacy, because people work with information that is sensitive; and, negotiation, because the
policies that mediate groups encounter breakdowns or instances where rules are not clearly
defined thus requiring system changes.

For my dissertation I will focus on one type of socio-technical system in two domains:
childcare centers and medical centers. In childcare centers workers manage the enrolled
children and also the enrolled children's personal information. In medical centers workers
manage the patients' health along with the patients' health information. Building on the
prior work of observations and interviews, my approach to exploring this topic further will
be mirrored, month-long ethnographies in each setting. The goal will be to observe
breakdowns in the explicit and implicit policies that these groups work within. The data
being collected will be observation notes, pictures, and transcriptions of activities. An
Activity Theory framework will then be used to analyze the breakdowns that are observed.
The outcomes from this work will be thick descriptions and categorizations
of security system breakdowns. A goal of this work is to build on the literature in the areas
of usable security, human-computer interaction, and trust.

In this research proposal I introduce the work to be completed. Following the introduction I
cover related work along with an explanation of Activity Theory and its accompanying
framework. After the review, the research study and methodology are covered for the
proposed work. A timeline, targeted publication venues, along with supporting materials are
covered in the appendix.
 
1.  INTRODUCTION ............................................................................................................. 1 
1.1.  NEED .................................................................................................................................................... 1 
1.2.  RESEARCH QUESTIONS...................................................................................................................... 3 
1.3.  APPROACH ........................................................................................................................................... 3 
1.4.  ASSUMPTIONS ...................................................................................................................................... 5 
1.5.  BENEFIT ............................................................................................................................................... 5 
1.6.  ALTERNATIVE CHOICES OF STUDY................................................................................................. 5 
1.7.  BACKGROUND ..................................................................................................................................... 6 
1.8.  OUTLINE .............................................................................................................................................. 6 
2.  REVIEW OF THE LITERATURE ...................................................................................7 
2.1.  USABLE SECURITY .............................................................................................................................. 7 
2.1.1.  History of Usable Security ............................................................................................................... 8 
2.1.2.  The Social Side of Security ............................................................................................................. 10 
2.2.  SOCIAL NORMS & CONTEXTUAL INTEGRITY ............................................................................. 12 
2.2.1.  Description of Social Norms........................................................................................................... 12 
2.2.2.  Trust as a Social Norm ................................................................................................................. 12 
2.2.3.  Privacy as a Social Norm............................................................................................................... 16 
2.3.  WORK IN SIMILAR PLACES .............................................................................................................. 21 
2.3.1.  Abstract Places & Spaces.............................................................................................................. 21 
2.3.2.  Childcares...................................................................................................................................... 21 
2.3.3.  Medical Centers and Hospitals ...................................................................................................... 22 
2.4.  EXPLICIT AND IMPLICIT POLICIES ................................................................................................ 23 
2.4.1.  Local and Global Process............................................................................................................... 24 
2.4.2.  Breakdowns ................................................................................................................................... 27 
3.  ACTIVITY THEORY AS THEORETICAL FRAMEWORK .........................................28 
3.1.  INTRODUCTION & INITIAL DESCRIPTION OF ACTIVITY THEORY ......................................... 28 
3.1.1.  Background Description of Activity Theory .................................................................................... 29 
3.1.2.  Activity Theory & Explicit and Implicit Policies ........................................................................... 32 
3.1.3.  Activity Theory & Breakdowns..................................................................................................... 33 
3.2.  SECURITY AS AN ACTIVITY ............................................................................................................. 34 
3.2.1.  Descriptions of the Activities .......................................................................................................... 34 
3.2.2.  Step 1: Modeling the Situation....................................................................................................... 36 
3.2.3.  Step 2: Produce an Activity System of the Situation Being Investigated ......................................... 37 
3.2.4.  Step 3: Decompose the Situation's Activity System .......................................................... 37 
3.2.5.  Step 4: Generating Research Questions........................................................................................... 39 
3.2.6.  Step 5 & 6: Conducting and Interpreting Findings from a Research Study..................................... 39 
3.3.  ALTERNATIVE FRAMEWORKS ........................................................................................................ 39 
3.3.1.  Communities of Practice ................................................................................................................. 39 
3.3.2.  Distributed Cognition .................................................................................................................... 40 
3.3.3.  Actor-Network Theory .................................................................................................................. 40 
3.3.4.  Value-Centered Design.................................................................................................................. 41 
3.3.5.  Common Information Spaces.......................................................................................................... 42 
3.3.6.  Design Tensions............................................................................................................................. 42 
3.3.7.  Macroergonomics ............................................................................................................................ 43 
4.  PROPOSED RESEARCH ................................................................................................43 
4.1.  RATIONALE FOR QUALITATIVE METHODS.................................................................................... 43 
4.2.  GROUNDED THEORY & PROPOSED OBSERVATIONS ................................................................ 45 
4.3.  RESEARCH SPACE.............................................................................................................................. 46 

4.3.1.  Childcares...................................................................................................................................... 46 
4.3.2.  Medical Practices ........................................................................................................................... 49 
4.3.3.  Comparing & Contrasting Childcares and Medical Practices ......................................................... 51 
4.4.  PILOT STUDY OBSERVATION & INTERVIEWS ............................................................................. 52 
4.4.1.  Pilot Study Methods & Demographics........................................................................................... 52 
4.4.2.  First Round of Coding by Study..................................................................................................... 54 
4.4.3.  Second Round of Coding & Analysis ............................................................................................ 56 
4.5.  STUDY OVERVIEW & RESEARCH QUESTIONS ............................................................................ 64 
4.5.1.  Length of Proposed Observations .................................................................................................... 65 
4.5.2.  Conceptual Model .......................................................................................................................... 66 
4.6.  PROPOSED PRODUCT ....................................................................................................................... 67 
4.6.1.  Stopping Point ............................................................................................................................... 67 
4.6.2.  Sample Problem Scenario ............................................................................................................... 68 
4.6.3.  Sample Near-Future Scenario........................................................................................................ 69 
4.7.  TIMELINE ........................................................................................................................................... 69 
5.  REFERENCES.................................................................................................................70 
APPENDIX..............................................................................................................................78 
5.1.  DEMOGRAPHIC INFORMATION ABOUT PARTICIPANTS IN PILOT STUDIES .......................... 78 
5.2.  PROPOSED TIMELINE FOR DISSERTATION .................................................................................. 81 
5.3.  MATERIALS USED IN PILOT STUDIES .................................................................82
5.3.1.  Informed Consent for Interviewing Childcare Directors & Parents .................................................. 82 
5.3.2.  Interview Protocol for Interviewing Childcare Directors.................................................................... 85 
5.3.3.  Informed Consent for Interviewing Medical Center Directors ........................................................... 89 
5.3.4.  Interview Protocol for Interviewing Medical Center Directors ........................................................... 92 
5.3.5.  Informed Consent for Observations in Childcares ............................................................................ 95 
5.3.6.  Interview Protocol for Interviewing Parents...................................................................................... 98 

"Any society that would give up a little liberty
to gain a little security will deserve neither and lose both."
Benjamin Franklin

1. Introduction
This study seeks to understand how work practices are affected by the breakdowns
that occur within explicit and implicit policies in settings that work with sensitive personal
information. The purpose of this study is an examination of early pilot work comprising
interviews and observations, along with two month-long observations at a childcare and a
medical practice in the New River Valley of Southwest Virginia. It is anticipated that the
results from this study would provide insight into producing electronic and physical security
mechanisms that support collaborative work practice. This work uses the qualitative method
of ethnography through the application of interviews and observations. The participants in
this study were selected to provide insight into two similar realms of sensitive personal
information practice. They were selected from the local area surrounding Virginia Tech as
representatives of the surrounding businesses.
1.1. Need
Traditionally, electronic and physical security was conceptualized as rules, locks, and
passwords. More recently, security research has explored how security is part of a larger
socio-technical system (Mamykina et al. 2007) that involves people working with technology
and their environments to create safe praxis. When examining security as one part of this
system, or as a supporting mechanism, issues of trust (Flechais et al. 2005), privacy (Bellotti
et al. 1993; Adams et al. 2005b), and negotiation (Dourish et al. 2006) start to appear: trust,
because people are working together through an established set of agreed rules to work
effectively; privacy, because people are working with information or details that are sensitive;
and, negotiation, because the rules or standards that groups are working within encounter
breakdowns or instances where rules are not clearly defined (Kobayashi et al. 2005) thus
requiring changes to be mediated.

For my dissertation I will focus on one type of socio-technical system (Carayon 2006) in two
instantiations. In my work the term ‘socio-technical system’ means that people and
technology are interacting to co-produce outcomes. The type of socio-technical system that I
will be studying is one in which groups coordinate and manage their clients’ personal
information through physical and technological mechanisms. The two instantiations that I
will explore are childcare centers and medical centers. In childcare centers workers manage
the enrolled children and also the enrolled children's personal information. In medical centers
workers manage the patients’ health along with the patients’ health information. The use of
two areas allows for generalization across similar work while exploring different dimensions
(e.g. routines, legislation). More details are discussed in Section 4.3.3 Comparing &
Contrasting Childcares and Medical Practices.

The personal information in these centers is sensitive. Sensitive, in this connotation, means
that the information is private and in need of securing; as one operationalization, it means
that there are limitations surrounding who should access the personal information. In the
context of the information-rich environments I will be studying, a child's or a patient's
information includes practical pieces of information that should be secured from a legal
point of view, like social security numbers and consent forms, but it also contains social
information within it, like a parent's divorce or a medical diagnosis.

Through the lens of security, the fewer people who have access to this information the easier the
information is to secure (Flechais et al. 2005). This is an example of a policy that might exist
in a setting where sensitive information is being managed. Security can be conceptualized
into explicit and implicit policies surrounding the physical and electronic forms of the
personal information (Dourish et al. 2004). These policies restrict who can, should, and do
access, modify, and delete the child’s and patient’s information.
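
As a concrete illustration of such an explicit policy, the sketch below encodes a small role-based access check in Python. The roles, actions, and permissions are hypothetical examples invented for illustration; they are not drawn from the centers studied here, and real policies would also carry the implicit, socially negotiated exceptions discussed throughout this proposal.

    # A minimal, hypothetical sketch of an explicit role-based access policy
    # for client records. Role names and permissions are illustrative only.
    PERMISSIONS = {
        "director":     {"read", "modify", "delete"},
        "teacher":      {"read"},
        "receptionist": {"read", "modify"},
        "parent":       set(),  # parents request records through staff
    }

    def is_allowed(role: str, action: str) -> bool:
        """Return True only if the explicit policy grants `action` to `role`."""
        return action in PERMISSIONS.get(role, set())

    assert is_allowed("teacher", "read")        # a teacher may read a file
    assert not is_allowed("teacher", "delete")  # but may not delete it

The brittleness of such a table is part of the point: it can express who can access a record, but not who should in the moment, nor the exceptions that social systems handle routinely.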

However, the study of security in everyday work practice has been an area with a dearth
of research. Prior work has examined how plans, work, and documentation are mediated by
artifacts (Bardram 1997; Kientz et al. 2009a), how models and frameworks can be used to
design for privacy (Whitten et al. 1999), and the effects of deploying technology probes
aimed at exploring security and privacy (Bylund et al. 2008). Little work has examined how
current work practices function to maintain and secure sensitive private information. There
has been even less work that has taken a holistic approach to studying both the technological
and social means to manage sensitive personal information, as I am proposing to do. This is
an increasing need given that the work of childcares and medical practices is being moved
into a paperless workplace (Jha et al. 2006). It is an assumption that new electronic systems
will manage security with traditional ideas of what electronic security means, thus reducing
the need for social means of managing sensitive information.

The problem with security policies is that they “often only secure in principle. They are
seldom secure in practice” (Bellotti et al. 1993). Practice is what happens in the moment; it is
the activity; it is what is actually done. There is a tension that exists between work practice
and security. There has been a plethora of research that has demonstrated that when security
policies or mechanisms are not robust enough to support work practice, security breaks
down (i.e. creating work-arounds such as writing passwords on post-it notes, or as was
observed in the pilot studies - shouting passwords) (Adams et al. 1999; Dourish et al. 2004;
Adams et al. 2005a). When a breakdown occurs, though, in a social system, workers do not
stop doing work. They create special cases or methods that allow them to continue - bends
in the policies. In this sense, social systems are intrinsically flexible. When we start to think
about electronic systems, the reverse is true: electronic systems are not as flexible. It is for
this reason that there exists a need to study and understand the holistic practice of how
socio-technical systems manage security practice.

Security is increasingly being automated with technology. The use of programmable policies
and role-based ontologies has allowed technology to create the appearance that information
and data are more secure. This increase in the use of automated security carries with it an
implied assumption that these technological measures are more effective than the previously
used social ones. My argument, however, is that the increased use of technology may not
increase the security of sensitive personal information unless social measures are accounted
for. In Figure 1 three different levels of overlap between social and technical mechanisms are
depicted. The square box encompassing the circles represents the entire space of
sensitive personal information that is to be managed in a workplace. The diagram on the left
depicts the idealized current state of use in regards to technical and social mechanisms.
There is some overlap between measures that are both socially and technically accounted for,
but there is much that is handled by only social or technical mechanisms. The second figure
shows the increasing use of technology also encasing more of the realm of how sensitive
personal information is being managed. In this depiction, there is still overlap between the
social and technical, along with some social mechanisms (e.g., whispering) that are used.
What I am proposing is that there is a need to optimize and use both social and technical
mechanisms to create secure practice – as depicted in the right-hand diagram.

Figure 1. Three scenarios depicting levels of overlap between sociological and technical
security mechanisms in the protection of sensitive personal information.
The larger encompassing goal of this work is to determine what is the optimal use of
technical, social, and socio-technical mechanisms to support the best management of
sensitive personal information. The concept has been called ‘joint optimization’ in the socio-
technical system theory literature (Baskerville et al. 2000). Socio-technical systems theory
recognizes that there are social and technical aspects of a work system that produce
social/psychological outcomes along with physical products. The ambition of the socio-
technical designer is to optimize both the social and technical mechanisms for the best
outcome. My central argument is that a larger overlap between the social and the technical
will result in dynamic, flexible, and reliable security mechanisms. To
understand how to design this system there is a need to holistically study current socio-
technical systems to understand how to optimize the increasing use of technology without
reducing the use and complexity of social mechanisms.
1.2. Research Questions
Responding to the need for research in this area, my research question is: how do socio-
technical systems that use sensitive personal information manage work-practice breakdowns
surrounding the implicit and explicit rules of process? I have further broken this down into
three sub-questions:
What are the implicit and explicit rules surrounding how medical practices and
childcares handle sensitive personal information?
What breakdowns happen when the explicit and implicit rules are not followed?
How are breakdowns accounted for, negotiated, and managed in socio-technical systems
where sensitive personal information exists?
1.3. Approach
I will be focusing on breakdowns in the practice of explicit and implicit policies. Implicit and
explicit policies are the guarantees that the system will support the work and access to
resources. Dourish et al. (2004) goes as far as to emphasize the importance of both explicit
and implicit guarantees in their definition of security: security is “the ability of users to
express and rely upon a set of guarantees that the system may make, explicitly or implicitly,
about its treatment of user data and other resources” (p392, my emphasis). By focusing on
the breakdowns that occur with explicit and implicit policies I will be able to study how
socio-technical systems manage system perturbations. This management of perturbations is
an intrinsic strength of human-mediated security systems, and one characteristic that
technical systems fail to properly incorporate (Flechais et al. 2005).

The framework I will be using to conceptualize system breakdowns is Activity Theory. A
breakdown is a place where the artifacts of use do not support the objective of the activity
(Bødker 1996). An extreme example would be if a computer that held all of the electronic
billing information stopped working. In this case, the computer, or tool, becomes the object of
the action – embodying the goal of making the computer work again. By focusing on
breakdowns I will be detailing places in the activity system where change has occurred, or
places where problem situations were noted but the activity system did not require changing.
After applying Activity Theory to conceptualize breakdowns, I will create categories of
breakdowns that will provide insight into guidelines for the design of future technology in
these workplaces. These design implications will be incorporated into a set of near-future
scenarios that will depict the positive and negative consequences of their use.

Pending approval from the Virginia Tech Institutional Review Board, qualitative methods will
be used. Month-long active participant observations will be employed. For these I will be
volunteering half days of work at both a childcare and a medical center. Pilot work employed
interviews with parents, childcare directors, and medical directors along with two to three
observations at four different area childcares. The pilot data and resulting data from the
month-long observations will form the basis for the overall findings of this study. Daily
observation logs will be kept along with audio recordings and appropriate pictures of
representative artifacts. Live coding will be used to identify breakdowns as they occur. Key
parts of the audio recordings will be transcribed verbatim. Breakdowns will then be coded to
produce an emergent understanding of how the socio-technical system employs security in
practice. The use of observations and interviews from key stakeholders should provide a
complete story.

Table 1. Research Problems, Goals, and Approaches

Problem 1: Prior work has highlighted that security is put into practice through explicit and
implicit policies. However, little work exists on what policies actually exist in the
collaborative management of sensitive information.
Goal 1: To detail what workers believe to be the explicit and implicit policies of their
current work practice.
Approach 1: To interview relevant stakeholders in the management of personal sensitive
information.

Problem 2: Given an understanding of explicit and implicit policies, there has been little
work to understand how these practices are then actualized in the collaborative management
of sensitive personal information.
Goal 2: To compare what workers believe to be their practices with what work is actually
done.
Approach 2: To observe and take part in the work of managing personal sensitive
information.

Problem 3: Breakdowns occur in all systems of interaction. However, there is little
understanding of how breakdowns affect security practices and policy adaptations.
Goal 3: To create a framework for how socio-technical systems manage perturbations and
breakdowns in security policies.
Approach 3: With the use of Activity Theory, to analyze data from interviews and
observations for reactions and adaptations in response to perturbations and breakdowns in
security policies.

1.4. Assumptions
This work represents some key assumptions about the field of electronic record keeping and
socio-technical systems. The first is the assumption that electronic record keeping is a
growing field; childcares and medical centers will eventually be paperless environments. This
assumption subsequently implies that security practices surrounding files and the management
of personal information will also adapt.

The second assumption is that by being an active participant in the work practice I will be
able to partake in and observe security policies. More importantly, it assumes that within a
month's time I will be able to notice breakdowns in implicit policies.
1.5. Benefit
This work will benefit the security community, the usable security community, and the trust
sub-community of human-computer interaction by detailing a deep understanding of how communities manage
explicit and implicit policies. Resulting from this work will emerge properties that will help
the design community to create technology and tools to support secure work practice.

A second benefit from this work will be the conceptualization of security as more than rules.
The application of the activity theory framework provides a lens for examining how groups
internalize and externalize the constructs of security, trust, and privacy. Activity theory
literature on breakdowns will provide additional methods of analysis to the security
literature. Additionally, there has been a dearth of research studying how groups manage and
coordinate security and put these constructs into practice. This work will add to that body of
literature and understanding.

A last benefit of this work will be the usefulness of its outcomes. Two outcomes are a
set of categories of breakdowns that occur and also a set of near future scenarios depicting
consequences of the application of design implications. These two outcomes will provide
insight into future work on electronic health records, paperless work systems, and sensor-
based privacy research.
1.6. Alternative Choices of Study
Three choices have been made to help make this work competitive. The first is the location
of study. It was critical to select areas where privacy was being managed in practice. The
settings of childcares and medical practices were selected because they are similar in their
goal of managing and respecting sensitive personal information. Childcares were selected
because they are a setting where access and manipulation are more easily granted; privacy
concerns may be less pertinent since the information is about children and is not life-critical.
Medical practices were selected because of the daily use of information and because there is
a life-critical aspect of making sure that the information is correct. By studying both areas my
desire is to better understand the broader area of managing personal information. Areas that
were not used were employee files, criminal files, student files, and client files.

The second choice for this study was selecting activity theory as the framework for
conceptualizing the socio-technical system. Other framework choices have been detailed in
the section titled “Alternative Frameworks” on page 39.

The third choice of study involves the length and number of settings to study. The time span
of a month in each setting for observations was selected to allow for the communities to
start adopting me as an insider into their practices. While the study could extend into
additional months, such an extended timeframe is beyond the scope of my dissertation. Other
lengths of study that were considered were a two week observation in two childcares and
two medical practices, or one week observations in four childcares and four medical
practices.
1.7. Background
This work stems from the Usable Security Team in the Computer Science Department at
Virginia Tech, comprising Steve Harrison, Dennis Kafura, Denis Gracanin, and Francis Quek.
This team is exploring how practices affect security in the realms of healthcare and the
Virginia Tech Smart Home1 through the application of policies and a practice-based
ontology.

My prior work has examined how users assess online web information. This resulted in a
novel evaluation method of user trust. I additionally worked on a review of literature on trust
in eHealth websites. This work on the theory of trust produced a valuable base to examine
how trust is managed in collaborative practice.
1.8. Outline
The remainder of this document outlines the proposed work. It starts with the
work from fields that I will be building on, primarily from usable security, human-computer
interaction, social norms, and some work on similar realms of study. Activity theory and its
framework are then explored to provide a method for understanding the process of
externalization and internalization of cultural norms along with understanding how
communities manage and respond to breakdowns. The work that I am then proposing to
accomplish for my dissertation is outlined along with the methodology and a brief
explanation of prior work. The appendix includes supporting research materials such as
interview and observation protocols with informed consents.

1 The webpage for the Lumanhaus: http://www.solar.arch.vt.edu/

Table 2. Dissertation Phase, Study, & Output

Phase: Pilot Studies
Study 1: Open-ended questionnaire of childcare directors, 12 participants.
Output: 12 transcripts, enrollment and policy forms; richer understanding of the
information space; service providers' conception of security.
Study 2: Open-ended questionnaire of medical center directors, 13 participants.
Output: 13 transcripts, enrollment and policy forms; richer understanding of the
information space; service providers' conception of security.
Study 3: Open-ended questionnaire of parents, 21 participants.
Output: 21 transcripts; richer understanding of the information space; service consumers'
conception of security.
Study 4: Follow-up open-ended questionnaire of childcare directors & 2-3 two-hour
observations, 4 participants.
Output: 4 transcripts & 10 observation logs + notes + artifacts + pictures; richer
understanding of the information space; service providers' conception of security and ideas
of security in practice.

Phase: Phase 1, Analyzing Pilot Data
Study: Analysis of data from pilot studies.
Output: RQ1: What are the implicit and explicit rules surrounding how medical practices
and childcares handle sensitive personal information? List of codes detailing what workers
believe to be the explicit and implicit policies of their current work practice.

Phase: Phase 2, Ethnographies in Childcares and Medical Centers
Study: Study 5: Month-long active observer ethnography in a childcare and a medical
center, 2 participants.
Output: RQ2: What breakdowns happen when the explicit and implicit rules are not
followed? Approximately 40 observation logs, audio recordings, and numerous artifacts and
pictures.

Phase: Phase 3, Analysis of Data with Activity Theory
Study: Analysis of data from all studies.
Output: RQ3: How are breakdowns accounted for, negotiated, and managed in socio-
technical systems working with sensitive personal information? Framework for how socio-
technical systems manage perturbations and breakdowns in security policies; rich
descriptions of breakdowns with design implications explicated in near-future scenarios.

2. Review of the Literature


2.1. Usable Security
There has been a movement within the security sub-domain of computer science that
recognizes that security has two parts: the computers and humans. Security research
traditionally focused on making rules to protect data, with the assumption that the user of
the security was ‘the enemy’ (Adams et al. 1999). Users, after all, were the ones who were
writing down their passwords on post-it notes taped to monitors and holding open RFID
badge doors for the person behind them. This recent research area, called Usable Security,
has pushed back at this anti-user assumption: what if users are not the enemy? Instead, what
if the problem is that security is infringing on work practices? Usable Security’s proposal is
that if security is hard to manage it is not because the users are at fault; it is because the
security mechanisms are incongruent with the user’s primary task. To understand the user’s
task involves understanding the user’s needs, practices, and values. Additionally, with
computing systems becoming increasingly ubiquitous, they will become more involved and
integrated into everyday work; computers are no longer solitary machines under the
computer desk. This means that security can no longer be something that is a secondary task
and isolated from the user. Security management has to integrate into the tasks of everyday
work.
2.1.1. History of Usable Security
The recognition that information security systems need to be usable is not entirely a recent topic.
Back in 1973 Saltzer and Schroeder recognized psychological acceptability as one of the key
considerations for computer security (Saltzer et al. 1973). They explain that for information
systems to be secure the interface needs to be designed for “ease of use, so that users
routinely and automatically apply protection mechanisms correctly.” Indeed the call to
secure private information can be traced back to 1890 in Warren and Brandeis’s call for the
“Right to Privacy” (Warren et al. 1890). However, it was not until the last decade that the
question of how to incorporate the human side of security truly emerged in the
literature.

Some position papers and reports were published that projected how the field of security
should change. For example, in 2003 the Computing Research Association put together a
report of what they saw as the four challenges of technology as it moves into the future in
regards to security and trust. They argued that the infrastructure of technology should move
beyond simply creating security policies to create policies that are adaptive to the
environments: “security is concerned with letting the right people access the right
information and services in a trusted environment” (Spafford et al. 2003). Of the four
challenges that the CRA proposed, two challenges directly related to usable security: (1) new
technological tools that have large societal impact, like electronic healthcare records and
electronic voting systems, need to be designed so that people not only trust them but so that
they are also secure; and (2) technologies of the future need to be built upon policies that
users can trust and understand. In both of these challenges, trust implies that the user not
only is willing to use the technology, but that the user will incorporate it into their practices.
This article is critical because it highlights that the perceived future of security lies in
embedding security into technologies that are usable and policies that are adaptive to user
needs.

Additional cues to how the field was changing were starting to appear in years following the
CRA Report. Conferences like SOUPS2 (Symposium on Usable Privacy and Security)
emerged and conferences like UbiComp3, and IFIPTM4 (International Conference on Trust

2 http://cups.cs.cmu.edu/soups/2010/
3 http://www.ubicomp2010.org/home
4 http://www.ifip-tm2010.org/doku.php

Management) started to publish papers on the topic of usable security. A key paper was
published in 1999 by Whitten and Tygar proposing that users may not be the cause of security
problems; the real culprit is that computer interfaces have not adapted to discourage security
breaches (1999). A central tenet of their argument was that traditional definitions of
usability are not specific enough to be used for security software: “Since usable security
requires user interface design priorities that are not the same as those of general consumer
software, it likewise requires usability evaluation methods that are appropriate to testing
whether those priorities have been sufficiently achieved” (Whitten et al. 1999). To deal with
the lack of specificity they defined security to be usable if (1) users are aware of security
tasks that need to be performed, (2) users can successfully perform security tasks, (3) when
users do commit errors, they are not dangerous errors, and (4) the user’s comfort level is
high enough to persist in using the security software. Given this definition, the researchers
tested a security interface that met what was at that time a high usability standard. They
found that only one third of their participants were able to complete their tasks, even against
relatively low standards of usable security.

Six years later Garfinkel and Miller published a secondary study based on the model of
usable security published by Whitten and Tygar (1999) at the first year of the SOUPS
Conference (Garfinkel et al. 2005). The researchers proposed that it was not the usability of
the system that caused security errors, but it was a problem caused by the back-end of the
system not seamlessly supporting security needs. They found that the use of a more usable
interface plus a better back-end encryption system did enable fewer security errors to be made,
but it still did not fix all the problems. They found that there were sociological problems of
making trusting decisions that would impact security outcomes. For example, a user might
get an email from a friend with a different unverified email address – do they trust them or
not? They concluded that issues of trust are a problem that perhaps technology cannot fully
address.

Hitchings published a paper that summarized and outlined the work up to that point,
arguing that there is a human side to security. He explained that along with the
technical side of security there are human issues: the personnel having individual objectives,
culture, and attitudes; the organizational issues such as policies; and, the context or
environmental issues such as outside competitors or customers. These issues all will affect
the practice of security. He concludes that, “The methods for implementing security are
outdated and a new methodology is required that takes into account the people problem”
(Hitchings 1995).

Another foundational article was published in 1999 in the Communications of the ACM, a
magazine published by the largest computing organization, which built off the work of
Hitchings and raised the visibility of the emerging field of usable security. This article by
Adams and Sasse (1999) provided a wide purview of how password security has to work
with user practice to create usable solutions to privacy issues. Through the surveying of 139
users and 30 interviews, the researchers were able to derive four factors that affect how
people use passwords, the last two highlighting how work practices are important: (1)
multiple passwords, (2) password content, (3) the perceived compatibility with a user’s work
practices, and (4) the user’s perceptions of organizational security and information
sensitivity. Adams and Sasse explain that the lack of communication with the security along
with a wrong mental model of how security works both impacted the user behavior: “users
perceive their {insecure} behavior to be caused by a mechanism designed to increase security.”
By highlighting the incongruent practices of current security measures with user password
behaviors this paper provided a clear demonstration of how security as a field needed to
evolve.

These papers highlighted how usable security emerged as a field that recognizes the balanced
conjunction of people and technology in responding to security needs. Not only did the field of
security move beyond trying to solidify more rigid and rigorous rules to explore the human
side, but it also started to utilize methods from the social science fields (i.e. interviews and
observations).
2.1.2. The Social Side of Security
Part of recognizing the human side of security is recognizing that systems of interaction are
socio-technical. ‘Socio-technical’ means that people, technology, and the surrounding
context are all parts of a system of interaction that frame the practice of work (Flechais et al.
2005). In the recent work of usable security, researchers have published on how security is
and should be designed as one part in the larger socio-technical system. The papers in this
section extend beyond stating that security needs to be more usable but also consider what
aspects of the context and user are important for security.

The first piece of work by Flechais, Riegelsberger, and Sasse explored how and why
examining security as one part of the larger socio-technical system can provide insight into
how to create better systems. They emphasize the importance of defining security by its
perceived vulnerability: “a computer is secure if you can depend on it and its software to
behave as you expect… Dependability is therefore determined by the degree to which this
socio-technical system behaves in a way it’s expected to.” (Flechais et al. 2005). By
highlighting the perceived dependability, the authors emphasized the user’s part in security.
The researchers went on to explain how removing the human aspect of security could be
dangerous. Humans are flexible, have intuition, and evolve. Computer systems, on the other
hand, are rigid and less able to adapt to unstructured events. When used in combination it is
possible to create a reliable and flexible system of security.

Adams and Blandford (2005a) explored the clinical domain primarily to determine
the factors that affected information and technology usage. Their secondary purpose was to
explore the security and privacy issues that affected use – without the ‘bias’ of having a
structured research question, a process that I am adopting. They demonstrate that the
culture of the work setting impacted the types of security problems identified and also the
resulting solutions. Their key findings were (1) that communities that did not adopt new
security procedures circumvented security policies, (2) security mechanisms were sometimes
repurposed by people within the community to enforce power relations, and (3) when IT
personnel engaged with the community they were more likely to create solutions that
benefited the whole socio-technical system.

A few additional theoretical points from this paper are important to mention. Adams and
Blandford make the argument that security, privacy, and trust have all been designed for the
individual with little consideration for how to design for the group. This assertion supports
my exploration of how communities manage these constructs in practice - instead of
focusing on the prototypical behind-a-desk user. Second, Adams and Blandford use
Communities of Practice as their theoretical underpinning to understand how groups
interact. The theory of Communities of Practice is discussed in more detail in 3.3.1
Communities of Practice. The emphasis on communities and work practice in this research
are models for my own work.

In 2006 Dourish and Anderson published a position paper that directly examined security as
a social process (Dourish et al. 2006). They argue that work within usable security has fallen
traditionally within three areas – none of which consider security and privacy in the realm of
social and cultural context: (1) empirical examinations of security practice, (2) empirical
studies of the usability of current security systems, and (3) novel systems that have been
designed to better support end-user security. From these areas of work on usable security, a
“narrow” view of security and privacy emerged, with the terms at times being used
synonymously. Dourish and Anderson argue that models of security and privacy have to be
valued as “practical, ad hoc, in-the-moment accomplishments of social actors, achieved
through their concerted actions in actual settings” (pp. 328). They discuss new models of
security and privacy to show how the study of work practice can and should be valued in this
research. This call for a contextual and holistic analysis of security practices demonstrates the
need for my proposed work.

To better understand the relationship between the social and technical side of security, work
from Carayon (Carayon 2006) has applied the dimensions of a socio-technical model to
network security concerns and healthcare. These dimensions vary by definition, but include
aspects of the social system, the technical system, organizational structure and policies,
contextual and environmental factors, and the current task. By using a socio-technical
framework, other aspects of security and healthcare can be brought to the forefront. For
example, in the realm of security, humans may represent the weakest link in a completely
secure technical system. However, the social aspects of humans may also account for flaws or
gaps in current technical security systems. This is a similar argument presented by others,
e.g., (Flechais et al. 2005). From the realm of healthcare, the increasing use of outpatient care
is enabled through the patients co-producing their care through technological and human-
based support systems.

There has been work outside of the sphere of research that has examined security concerns
in the medical arena. One key example comes from the Agency for Healthcare Research and
Quality who have put together at toolkit specifically for examining security in the medical
setting (2010a). This toolkit includes a set of scenarios that have depicted different
circumstances that security and privacy concerns may arise within. These scenarios are then
disseminated to states and towns nationwide to encourage discussion. The AHRQ also has
put together a document of business practices relating to security and privacy. These
practices are divided into technological means of addressing security concerns, but also the
physical and people oriented means of addressing security. Example electronic business
practices are the use of keycards and digital signatures. Examples from the ‘paper
environment’ are organizational policies and procedures. This toolkit is a demonstration of a
growing trend in the recognition that both people and technology can play a part in the
security and privacy of patient information. However, these documents are also an indicator
that there is confusion about how to balance and manage the security of a socio-technical
system in practice.

2.2. Social Norms & Contextual Integrity
2.2.1. Description of Social Norms
Social norms reflect the way that groups understand how to act and react in situations with
particular contexts. The concept of social norms encompasses the ways that groups of
individuals think and feel, and their views of the world, the group, and others (Smith et al. 2000).
One way that groups manage social norms is through the creation of laws, policies, and other
governing rules. These rules allow groups to enforce standards of behavior. However,
norms can also work without having to be explicit. Social norms can be learned, internalized,
and adopted as standards. In this fashion, social norms also work as a way to regulate
behavior: people behave in a certain way not because they feel they have to, but because they
want to.

For a norm to influence a behavior it must be triggered from some environmental cue. For
example, seeing someone else display a behavior stemming from a social norm, like taking
off a hat during the broadcast of the national anthem, could trigger another person to also
display that same social norm. Social norms can be established through systems of rewards
and punishment and then activated through environmental cues. One form of a reward that
would encourage compliance with a social norm is social acceptance. Social acceptance is a
strong motivator to be accepted among your family and peers. At the most basic level a
social norm could be translated into eating all of your vegetables before getting dessert. At
the most esoteric level, the person who adheres to social norms is able to feel part of the
community and reaps the benefits of that status. By not adhering to social norms, in
particular, ones that have been instantiated into laws, punishment is a community response.
For social norms to affect behavior, personal attitudes have to be in congruence with the
social norm and other social norms have to be less readily accessible (Smith et al. 2000).

Examples of social norms in the settings I will be studying are privacy and trust, which are
discussed in the sections below. One concept related to social norms, and used to explain
privacy, is contextual integrity. The concept of contextual integrity focuses on the social
norms that regulate how personal information is shared, or on the “norms of information
flow” (Nissenbaum 2004). A central tenet of the argument is that there are no situations
where information flow is not regulated by social norms: “Almost everything—things that
we do, events that occur, transactions that take place—happens in a context not only of
place but of politics, convention, and cultural expectation” (Nissenbaum 2004, pp.137). By
behaving in a way that respects the contextual integrity of the situation, social norms
regarding how personal information has been managed are adhered to.
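
To make the notion of norms of information flow concrete, below is a loose sketch in Python. Representing a norm as a tuple of sender, receiver, information type, and context is an assumption made for illustration; the roles and norms listed are invented, and this is not Nissenbaum's formal model.

    # Hypothetical sketch: a flow of personal information respects contextual
    # integrity here only if some context-relative norm sanctions that flow.
    from typing import NamedTuple

    class Flow(NamedTuple):
        sender: str
        receiver: str
        info_type: str
        context: str

    # Illustrative norms of information flow (invented, not from pilot data).
    NORMS = {
        Flow("parent", "director", "medical_history", "childcare"),
        Flow("physician", "nurse", "diagnosis", "medical_center"),
    }

    def respects_contextual_integrity(flow: Flow) -> bool:
        return flow in NORMS

    # The same diagnosis flowing to the front desk has no sanctioning norm.
    print(respects_contextual_integrity(
        Flow("physician", "receptionist", "diagnosis", "medical_center")))  # False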
2.2.2. Trust as a Social Norm
One social norm that I will be focusing on in my work is trust. Trust is a multidimensional
construct that impacts interpersonal relationships and the individual’s relationship with a
larger socio-technical community. For my dissertation I will be studying how trust impacts
security within the socio-technical systems of a childcare and medical practice. Trust is an
important construct in these environments because of the value of interpersonal
relationships. For instance, the patient trusts that their team of healthcare professionals is
going to manage their wellness appropriately. Health care professionals also trust each other
to work cooperatively and to effectively communicate pertinent patient information to each
other. It is the interpersonal relationships and the trust within them that are the building
blocks for the socio-technical system to function.

Conceptualizations of the construct of trust customarily define trust based on ‘facets’ and
‘antecedents.’ Facets of trust are the constructs that compose or make up user trust.
Examples taken from a body of reviewed literature include 'well-intentioned' (Theng et al.
2005), 'truthfulness' (Rains et al. 2009), and 'integrity' (Zahedi et al. 2008). Antecedents of
trust, on the other hand, are the external factors that impact or cause variation in a user’s
trust. Examples include 'relevance', 'authority', or 'disposition' (Kelton et al. 2008).
For my work I have conceptualized trust as a social norm that regulates the behavior of the
group. Similar to the work presented by Kandori (Kandori 1992), trust is a social
phenomenon that is within the group, as well as between individuals.

In the section below I detail how different relevant areas have used the construct of trust.
2.2.2.1. Trust Research from Relevant Areas
Trust is a construct that has been used in the realms of sociology, psychology, political
sciences, decision making, business, computer science, and others for decades. Given its
wide use and exploration, the definitions of trust vary by discipline and subject of study
within that discipline (Montague et al. 2009). For example, in a study of how the construct of
trust has been explored in regards to user trust in ehealth websites, in forty-nine papers
twelve different definitions of trust were cited (Manuscript submitted to JAMIA in
December 2009, Vega et al. 2009c). Even within such a focused area of trust application,
the variation in definitions of trust demonstrates how dispersed research on trust can be. In
this section I explore how relevant domains have defined trust and how those definitions
impact my proposed work.

Medical practices and childcare providers are businesses. This means that while they
function to serve a community purpose, they also need to be economically viable and
sustainable. The field of economics and business management has explored how trust is used
by businesses to help mitigate risk. For example, one proposition is that a system with high
trust between personnel is likely to be less expensive than a system with low trust between
personnel (Handy 1995). Flechais et al. explain this through two factors: (1)
when both parties trust each other and do not break that trust they lower their expenses
through not having to create additional countermeasures, and (2) trust enables social capital,
which creates a feeling of shared responsibility thus decreasing selfishness and carelessness
(Flechais et al. 2005).

Similar work has stemmed from the field of eCommerce on how potential customers trust
ecommerce websites. eCommerce research focuses on creating online websites and
situations where customers feel comfortable to pay for a service. Example work in this area
has demonstrated how the use of pictures has affected trust in websites (Riegelsberger et al.
2002), the use of social presence increases trust (Hassanein et al. 2004), and how the
presence of a third-party seal can verify trust assessments (Head et al. 2002). Perhaps one of
the most cited papers on the topic of trust and eCommerce is the work Mayer, Davis, and
Schoorman (Mayer et al. 1995). They present a process-based model where external
interaction iteratively increases or decreases their perceived trust in another entity. The
trustee is evaluated based on their ability, benevolence, and integrity – antecedents of trust.
Combined with the trustor’s propensity to trust, a user will establish a level of trust. Then,
depending on the amount of perceived risk and the risk-taking relationship, some outcome
will manifest that will affect future beliefs in ability, benevolence, and integrity.

In related work that I have been conducting (Vega et al. 2010), I have been creating a
method to assess how micro-interactions in a website can affect trust assessment. Given that
interactions are dynamic, trust is also a dynamic, unfolding, and a deeply contextual
phenomenon that must be evaluated as such. Post-interaction questionnaires alone cannot
address the evolution of user trust over time. The central argument of the work is that there
is a need to measure trust iteratively and in situ. This measurement of trust can provide a
deeper insight into the construct of trust and the design elements that influence it. I
proposed a method called the TIME (Trust Incremental Measurement Evaluation) Method.
The TIME Method places an emphasis on repetition, taking repeated measures of trust
across the pages of a website with many users. Subsequent analysis forms the basis for
identifying relationships between specific design elements and increases or decreases in user
trust. From this work I started to understand the deeply contextual nature of trust, and the
impact it had on work.

Given that people in medical and childcare centers communicate across different mediums
(e.g., face-to-face, email, text message), research on how users trust each other through these
media is also of value. A large proportion of research evaluating trust through different
communication methods involves the use of a game similar to the prisoner’s dilemma.
The prisoner’s dilemma is a game that forces people to trust each other if both players are to
receive the optimal payoff. The game functions by having participants work together (or
sometimes with a confederate), with each player having a set amount of goods. When both
participants contribute goods to a job, they both receive an equal payout. If one player
betrays the other, however, the betrayer has the opportunity to make more money.
Researchers can thereby measure trust by whether or not the players betray each other.
Some problems with this method are the assumptions that participants are rational and that
the game elicits enough of a feeling of initial trust to establish a relationship not worth
betraying. An extended analysis of the prisoner’s dilemma game can be found in
(Riegelsberger et al. 2003). The work examining communication methods has individuals or
teams work through different communication technologies (e.g., video chat vs. telephone).
How the teams then perform provides an analysis of how different methods of
communication impact trust. While the participants in my study will not be playing the
prisoner’s dilemma, this analysis highlights the effects of betrayal as an antecedent to trust.
Additionally, awareness of one’s collaborators affects interpersonal communication and also
trust (Rocco 1998).
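
As a concrete illustration of the incentive structure these games rely on, the sketch below encodes a standard prisoner's-dilemma payoff matrix. The specific payoff values are conventional illustrative choices, not those of any particular study cited here.

```python
# A minimal sketch of the incentive structure in a prisoner's-dilemma-style
# trust game. The payoff values below are illustrative assumptions only.

# (my_move, partner_move) -> (my_payoff, partner_payoff)
PAYOFFS = {
    ("cooperate", "cooperate"): (3, 3),  # mutual trust: both receive an equal payout
    ("cooperate", "betray"):    (0, 5),  # betrayed: the defector profits
    ("betray",    "cooperate"): (5, 0),  # betraying: tempting individual gain
    ("betray",    "betray"):    (1, 1),  # mutual distrust: both do poorly
}

def play_round(move_a: str, move_b: str) -> tuple[int, int]:
    """Return the payoffs for one round given each player's move."""
    return PAYOFFS[(move_a, move_b)]

# Researchers infer trust from behavior: cooperation signals trust,
# betrayal signals its absence.
print(play_round("cooperate", "cooperate"))  # (3, 3)
print(play_round("cooperate", "betray"))     # (0, 5)
```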

Literature on trust in computer-mediated communication has focused on the effects of
medium and on the socio-emotional cues that allow a person to detect trustworthiness
(Warkentin et al. 2010). Participants in these studies need to assess the context and the
people with whom they are working in order to rely on them for work and also to establish
interpersonal relationships. For this reason, the study of trust can look at the effects of lying
and deception on the formation of trust. Example studies are of online dating profiles
(Toma 2010; Toma et al. 2010), texting (Birnholtz et al. 2010), and instant messaging
(Hancock et al. 2009). While the
management of sensitive personal information is currently limited to paper-based methods,
as it expands to other formats, considering trust and trustworthiness in different mediums
could be a useful avenue of research.

Within the area of software design, there has been work on how to understand whether or
not the software is trustworthy. With software being part of the socio-technical system of
medical centers and childcares, how it is trusted could impact the trust in the system. For
example, Amoroso et al. (1991) provide a comprehensive list of analytical tools, or
requirements, to evaluate whether software is trustworthy. In fact, their definition of trust is
“the degree of confidence that exists that the software will be acceptable for one’s needs”
(pp. 199). They propose that to evaluate whether a piece of software is trustworthy or not,
the organization must examine the practices of its design. An example requirement would
be to make sure that the software development process used version control software that
logged who wrote and checked in individual pieces of code. While
this method of evaluating the trustworthiness of software could help organizations in
assessing software to purchase, it obfuscates the end user’s evaluation of the software. A
piece of software may be made through a method that is trustworthy, but because it does not
fit into the end user’s practice it may be rejected and circumvented. However, this work does
emphasize the need to understand the cultural-historical background of artifact creation in
order for it to be a trusted part of the socio-technical system.

The construct of trust in the realm of security has been traditionally defined by its ability to
foster transactions between networked components. For example, if my router wants to
send a packet to another router, should it be trusted? One example definition of trust from
security is that trust is a “system or component… whose failure can break the security
policy” (Anderson 2008). Security researchers use their definition of trust to create formulas
that calculate the risk of transactions, and use these formulas to define the trust in the
system. Additionally, from a security perspective things are “trustworthy” when all
vulnerabilities have been removed from the system – usually through technical security
measures (Flechais et al. 2005). This definition emphasizes the need for policies to make the
facets and antecedents of trust known and measurable. The problem with this conception of
trust is that trust is flexible and dependent on context. Rules and formulas are useful for
normal occurrences, but need to be adaptive or they will be circumvented.
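
The cited works do not commit to a single formula, but one common instance of this approach computes trust from a component's transaction history, in the spirit of beta-reputation models. The sketch below is an illustrative example of such a rigid rule, not a formula taken from the security literature cited above.

```python
# A minimal sketch of a transaction-history trust formula, in the spirit of
# beta-reputation models; an illustrative instance of the approach, not a
# formula from the cited works.

def trust_score(successes: int, failures: int) -> float:
    """Expected probability that the next transaction succeeds, using the
    mean of a Beta(successes + 1, failures + 1) distribution."""
    return (successes + 1) / (successes + failures + 2)

def accept_transaction(successes: int, failures: int,
                       threshold: float = 0.8) -> bool:
    """A rigid policy: transact only with components whose computed trust
    exceeds a fixed threshold -- exactly the kind of inflexible,
    context-free rule the critique above targets."""
    return trust_score(successes, failures) >= threshold

# A component with 9 successful and 1 failed transaction:
print(trust_score(9, 1))         # ~0.833
print(accept_transaction(9, 1))  # True
```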

The research on usable security, however, defines trust based on how it is part of a larger socio-
technical system. For example, the work of Flechais, Riegelsberger, and Sasse explains how,
instead of mitigating vulnerability to create a secure and trustworthy system, the socio-
technical system recognizes the vulnerabilities and trusts that they will not be exploited
(Flechais et al. 2005). They define a model of trust between two actors in the socio-technical
system. As shown in Figure 2, there are internal and contextual factors that influence the
trust between two people. Internal factors are internal to the trustor and trustee. They are a
trustor’s motivation, made up of sub-factors such as the propensity to take risks; internalized
norms; and ability. Contextual factors surrounding and interacting with the trustor and
trustee are temporal, social, and institutional. This model explicates how security is
dependent on trust. For instance, if a user is trusted to manage new organizational assets,
their internalized cultural norms will inhibit them from breaking that trust. Alternatively,
using only security policies and sanctions to assure that users behave properly does not
encourage the same behavior. This paper highlights important aspects of trust and how
they can be used to elicit security behaviors from users in ways quite different from using
organizational policies alone.

Figure 2. Flechais et al.'s model of interpersonal trust from the realm of usable
security.
2.2.3. Privacy as a Social Norm
The personal information in childcares and medical centers is sensitive. One
operationalization of ‘sensitive’ is that the personal information has limitations surrounding
who should access it. I have adopted the definition from Nissenbaum, that the
terms ‘sensitive’ and ‘confidential’ in reference to personal information specify special
information that denotes a need for privacy (Nissenbaum 2004). In the context of the
information rich environments I will be studying, a child or a patient’s information includes
practical pieces of information that should be secured from a legal point of view, like social
security numbers and consent forms, but it also contains social information, like a
parent’s divorce or a medical diagnosis. Security mechanisms are the tools that will allow
information to be made private.

The ownership of personal information is an important consideration in terms of privacy.
Who owns the file being stored at the childcare and medical center? The file contains
information about the patient; it is their personal information. At the same time the
information is written on the medical center’s forms, annotated by staff, and used regularly
by the medical center. Personal conceptions of who owns the file or other instantiations of
the personal information are going to affect the management of its privacy. This is especially
true given the lack of an over-arching government regulation specifying rules about who
owns that information. For instance, if the director believes that they are merely stewards of
the personal information they are going to restrict inter-office access to personal information
based on the patient’s request. In contrast, if the director believes that the personal
information is the company’s property, the personnel may share personal information
regardless of what the customer desires.

In the papers reviewed in this section, I explore how privacy has been previously conceptualized.
2.2.3.1. Models of Privacy
In understanding how privacy is a larger societal norm, Adams and Blandford theoretically
argued for why there are issues with the security of personal information. They explain that
security mechanisms embody the cultural idea of personal identity. From the idea of
personal identity stems personal information which then needs to be made private. They
explain that: “the problem with many definitions of personal information is that they
concentrate on the data itself rather than how it is perceived” (original emphasis, Adams et
al. 2005a). Information is only private because there is a public space, and the delineation
between those two spaces is going to vary depending on an individual’s relationship to the
greater society or contextual situation (Harper et al. 1992). In tandem with this idea of
boundaries is the concept of social identity – the public persona that a user has in a public
space. The social identity is created through the divulgence of personal information that is
then interpreted by others. Therefore, users have a need to conceptualize the personal
information that they place in the public sphere (Bylund et al. 2008). The interplay between
the personal and public identities is a dynamic negotiation between being hidden and being
seen.

Palen & Dourish suggest a similar framework by building off of the work of Altman to
understand how the interconnections between technology and society create boundary
situations where privacy is a concern (Palen et al. 2003). Altman proposes that privacy is a
dialectic and dynamic boundary regulation process that is not rule-bound. Instead, there is a
spectrum where people’s accessibility is more open or closed. To build off of Altman’s work,
Palen & Dourish propose genres of disclosure. Genres of disclosure are socially constructed
patterns of interpersonal privacy management “that reflect, reproduce and engender social
expectations, guide the interpretability of action, and evolve as both technologies and social
practices change” (pp. 133). In practice, this means that there are social boundaries of where
technology will encroach on norms of privacy. Norms will then be established for how to
manage accessibility to private information in these new situations.

While this model of social boundaries is thin, Dourish and Anderson suggest an alternative
model of navigating information sharing called “cultures of secrecy.” A culture of secrecy is
one in which groups ascribe meaning to certain pieces of information, thereby making them
private. Because secrets instigate intimacy, the sharing of secrets is a site of
negotiation where people discuss, navigate, and, at times, redefine the social norms of what
should be shared to whom and under what circumstances. These norms are then used to
enculture new members and to mark group membership so that new members learn what
information is to be shared and discussed and outsiders are quickly identified. This concept
is similar to a concept I have called a “community of trust” and is one discussed in the
results section of my proposal document.

An additional model of privacy is the economic exchange model. In this model, the user of a
service is willing to give up some of their private information in order for them to gain that
service. An example trade is the use of a frequent-buyer card at the grocery store: the consumer
provides personal information about goods purchased in order to receive product discounts.
A problem with this model is that there is an implicit assumption that the user is rational -
similar to an assumption made about participants in the prisoner’s dilemma game (Dourish
et al. 2006). As Dourish and Anderson explain, studies of users in actual practice show that
users do not always behave in a rational economic manner. A second problem is that this
model does not reflect the social nature of privacy. “Privacy is not simply a way that
information is managed but how social relations are managed” (pp. 327). Given these
problems with the economic model of privacy, I will be adopting a more nuanced model of
privacy that accounts for it as a cultural norm, as explained in Section 3.1.2 (Activity Theory
& Explicit and Implicit Policies).

Agre has also distinguished between two models of privacy (Agre 1994). He first asserts that
privacy is a social construct. As new technologies are developed with novel methods of
tracking people, their work, and their information, people think culturally about how their
privacy is being managed. He argues that different models of privacy and data management
highlight different privacy concerns. One model that he examines is the surveillance model.
The surveillance model embodies the ideas of watching, territorial spaces, and
big-brother-style centralized files. While this model highlights the possible negative aspects
of aggregating information, it also highlights the associated fears. A different model
proposed by Agre is the capture model. Agre explains that through the use of linguistic
metaphors and the focus on local practice and activity reconstruction, the capture model
highlights how the capturing of information enables communities to coordinate their work.
The capture model focuses on creating ontologies of activities. While I am adopting neither
of these models, they both highlight important aspects of how to conceptualize the space of
social privacy.

Nissenbaum conceptualizes privacy from how societies have made privacy norms into laws
(Nissenbaum 2004). Three principles embody privacy: (1) the need to protect privacy
against government intrusion – similar to Agre’s concept of the surveillance model (Agre
1994); (2) determining the spectrum of what is sensitive, intimate, and confidential – similar
to the economic exchange model discussed above; and (3) determining the spaces and
boundaries of where there is a public and private sphere – as discussed in (Bylund et al.
2008). The problem with these three principles, as argued by Nissenbaum, is that they fail to
take into account the larger guiding principle of contextual integrity. The concept of
contextual integrity is guided by the social norms of appropriateness, distribution, and
information flow, and by how these change.
2.2.3.2. Designing for Privacy
Bellotti and Sellen (1993) considered how information technology facilitates novel forms of
data manipulation and presentation. Two problems can arise in these new areas. The first
problem is that these new information systems present novel vulnerabilities with untested
protection mechanisms. They quote Mullender (1993) in saying that protection mechanisms
“are often only secure in principle. They are rarely secure in practice” (their emphasis). The
second class of problems, which are the ones that they focus on, are the design issues that
disrupt and cause breakdowns in social behavior. When people work together and
communicate through mediating computer systems, they can display more information than
may be desired, or they may see information about a collaborator that they did not intend to
see. In this sense they define privacy “to be a personal notion shaped by
culturally determined expectations and perceptions about one’s environment.” Privacy is
shaped by the cultural norms of the situation, which determine what is acceptable.
(Cultural norms are covered in Section E.3.) In order to design for privacy, Bellotti and
Sellen propose a 2x4 design framework that considers feedback and control against the
aspects of capture, construction, accessibility, and purposes. Adams and Blandford
criticize this work by saying there is a large assumption that designers can know a priori all
of the aspects that a user would identify as a privacy invasion (Adams et al. 2005a).

In similar work, researchers explored how to design file-sharing software that encourages
participation while protecting privacy. The outcome was four usability heuristics on how
users understand P2P file-sharing software (Good et al. 2003). While similar to the original
definition of usable security outlined by Whitten and Tygar (Whitten et al. 1999), the
framework was conceptualized to be specific to privacy. P2P software is ‘safe’ and ‘usable’ if
users (1) know what is being shared, (2) know how to share and not share, (3) do not share
information that is overly sensitive or ‘dangerous’, and (4) are confident that the system is
not misusing their information (i.e., that it is trustworthy). Good and Krekelberg did
multiple small studies examining how users inadvertently share personal information
through the P2P file-sharing network KaZaA. While this study assumed that all users are
rational and do not desire to share their personal information unnecessarily, they found that
over a 12-hour period 156 KaZaA users shared their email inboxes. In user testing, KaZaA
failed all four of the design aspects for respecting privacy in P2P software outlined above.

The work of Hong et al. has proposed a model of privacy that can be applied to
understanding the design space of future technology. They argue that privacy is a
“heterogeneous, fluid, and malleable notion with a range of needs and trust levels” (Hong et
al. 2004, pp.92). They additionally argue that security mechanisms that attempt to regulate
privacy are “by no means sufficient.” This is because security is focused on protecting
against adversaries, instead of on understanding the contextual needs of individuals. For
their model they propose examining various aspects of the social and organizational
context, along with looking at the technological needs, by asking questions of the current
design space. They specify that this method is different from those presented by Bellotti and
Sellen (1993), along with others covered in this review, because their work is practical and
concrete. Example questions that the designer is to ask herself are, “Who is the user of this
system?”, “How is personal information collected?”, and “How much information is
shared?”.

Another side of designing for privacy considers how to use sensors to detect the context of
the situation that people are in and thereby activate appropriate privacy settings. An example
would be if a person was looking at their medical files on a public computer and a sensor
detected an approaching unknown person. By detecting that someone else is present, the
computer would enact a privacy protocol to gray the screen until the unknown person
passed. Sample scenarios can be seen in the work of Ackerman, Darrell, and Weitzner
(Ackerman et al. 2001). They explore four usage scenarios of people interacting and the
dimensions of privacy that should be considered (e.g., user-controlled protocol enactment,
secondary usage of the data, and user knowledge of sensors).
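
A minimal sketch of the screen-graying scenario above might look like the following event loop. The sensor, identity, and display functions are hypothetical stubs standing in for real hardware and APIs.

```python
# A minimal sketch of the proximity-triggered privacy scenario above.
# read_proximity_sensor, is_known_person, and set_screen_opacity are
# hypothetical stubs, not a real sensor or display API.
import random
import time

def read_proximity_sensor():
    """Hypothetical sensor stub: returns an ID when someone approaches."""
    return random.choice([None, "unknown-42", "staff-7"])

def is_known_person(person_id: str) -> bool:
    """Hypothetical identity check against enrolled users."""
    return person_id.startswith("staff-")

def set_screen_opacity(level: float) -> None:
    """Stand-in for a real display call."""
    print(f"screen opacity -> {level}")

def privacy_loop(rounds: int = 5) -> None:
    """Gray the screen while an unknown person is nearby; restore otherwise."""
    for _ in range(rounds):
        person = read_proximity_sensor()
        if person is not None and not is_known_person(person):
            set_screen_opacity(0.1)   # unknown person present: hide content
        else:
            set_screen_opacity(1.0)   # alone or with a known person: show content
        time.sleep(0.5)               # poll twice per second

privacy_loop()
```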
2.2.3.3. Software Designed for Privacy
In the fields of usable security and human-computer interaction there has been work to
create software that helps users think about their privacy concerns. One example of this kind
of privacy software visualizes parts of privacy statements at pertinent moments (Egelman et
al. 2009). Another instantiation of this work involves the use of “privacy critics”. The privacy
critics software uses pop-up windows to prompt users to think about the implications of
divulging personal information (Ackerman et al. 1999).

Technology probes have been used in the work of Bylund, Höök, and Pommeranz (2008) to
explore the effects of displaying personal information in unusual places or with unique
methods. They were also used to display personal information through novel information
and communication technologies. This work was fueled by the idea of exploring how people
balance their personal and social identities. Bylund et al. found that while privacy concerns
were high prior to the probe being deployed, participants were generally more engaged in the
“creative playfulness” that took over after the deployment. This work highlights that privacy
concerns and software should not primarily focus on how to help users hide their
information, but should think more about how disclosure plays a part in social identity.

The work of Heckle and Lutters examined one software-based mechanism to provide
privacy for hospital workers (Heckle et al. 2007). Their argument is based on the application
of HIPAA in the workplace, which enforces that passwords can no longer be shared. To
balance the need for security and privacy, they suggest the use of single sign-on
authentication. Single sign-on authentication involves one user logging into a particular
system and then the network manages logging them in and out of related systems as they
work. They found that participants were particularly hesitant about having their email
available through single sign-on authentication, that participants were likely to share
workstations even with a different person logged in, and that participants were concerned
about surveillance and trusting management.
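
As a rough illustration of the mechanism Heckle and Lutters describe, single sign-on can be sketched as a broker that authenticates the user once and then opens and closes dependent sessions on their behalf. The class, user names, and systems below are illustrative assumptions, not their studied system.

```python
# A minimal sketch of single sign-on as described above: the user
# authenticates once, and a broker manages sessions on related systems.
# All names and systems here are illustrative only.

class SignOnBroker:
    def __init__(self, credentials: dict[str, str]):
        self._credentials = credentials           # username -> password (toy store)
        self._sessions: dict[str, set[str]] = {}  # username -> open systems

    def sign_on(self, user: str, password: str) -> bool:
        """One authentication event for the whole workstation session."""
        if self._credentials.get(user) != password:
            return False
        self._sessions[user] = set()
        return True

    def open_system(self, user: str, system: str) -> bool:
        """The broker, not the user, logs into each related system."""
        if user not in self._sessions:
            return False
        self._sessions[user].add(system)
        return True

    def sign_off(self, user: str) -> None:
        """Closing the master session closes every dependent session."""
        self._sessions.pop(user, None)

broker = SignOnBroker({"rwilson": "s3cret"})
broker.sign_on("rwilson", "s3cret")
broker.open_system("rwilson", "radiology-viewer")
broker.open_system("rwilson", "email")  # the kind of access participants resisted
broker.sign_off("rwilson")
```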
2.2.3.4. Evaluations of User Privacy
With privacy statements becoming ubiquitous, the work of Arcand et al. (2007) explored
how modified privacy statements could affect participants’ feelings of control and trust in
e-commerce websites. They found that privacy statements did make participants feel more
in control of how their personal information was to be managed. However, whether the
policy was opt-in or opt-out caused feelings of control and trust to increase and decrease,
respectively.

Harper et al. conducted ethnographic work to examine the sociological implications for two
labs using active badges to track location information. They argue that it is important to
consider the sociology of the workplace to understand how the information will be used and
how the workers will regard its use. In particular, they found that the role of the person in
the organization had a large impact on their use and attitude towards active badges and the
information that was being tracked. The responsibilities that the person’s role also “includes
what they feel a sense of obligation to do and a tacit, informally managed awareness of the
need to co-operate with fellow workers” (Harper et al. 1992, pp.349). The application of
social norms into the roles that people embody in the workplaces I will be studying is a
valuable consideration for the proposed work.
2.3. Work in Similar Places
Researchers have been interested in the areas of child information management and medical
information management within the field of human-computer interaction. Both of these
areas require high degrees of coordination for the personnel to work effectively with one
another. They are also both information-rich areas that work with sensitive data that needs
securing. They are particular places with imbued meaning and context. In this section I
explore how others have worked within childcares and medical centers.
2.3.1. Abstract Places & Spaces
A space is a three-dimensional structure with certain physical qualities such as relational
orientation, proximity, partitioning, and presence. What makes a space into a location that
has meaning, or a place, is the socially co-constructed values and behaviors that the space
affords. For example, when conceptualizing a door as an element of a space, it functions as a
partition and a divider of two spaces. When viewed as a place, the door affords privacy and
individuation (Harrison et al. 1996). Places are created by people, events, and loci (Harrison
et al. 2008).

The concepts of place and space are of particular importance when considering childcares
and medical centers. At a basic level this is because they are more than the physical space.
They are places of caring, business, and healing. The door to the office at a medical center is
imbued with the meaning of entering a room where you wait for your turn with the doctor.
In terms of security, the office of the manager is a place where the patient’s files are stored,
managed, and accessed with appropriate permission.

Harrison & Dourish have considered issues of privacy in terms of place and space (Harrison
et al. 1996). They explain that the concept of privacy is place-centric. By this they mean that
privacy concerns are going to vary by the meaning that has been ascribed to a physical
location, such as the bedroom versus the living room. This distinction is more than just a
delineation between public and private spheres: “places are constructed not only out of
spaces, but also by the people present, and the events occurring in them” (Harrison et al.
2008, pp.101). The places have meanings that have been previously and are continually
socially constructed. For instance, what is permitted for discussion in the living room of one
family may be different from that of another, and may be different again depending on the
context of the situation, e.g., a party vs. working on homework.

Understanding and studying places that have already constructed their conceptions of
privacy, such as those that are being proposed for study, is critical when considering the
design space for future technological interventions.
2.3.2. Childcares
There has been a considerable dearth of research examining the realm of childcare facilities
and childcare providers as information managers. However, there has been research from
Kientz and Abowd on how to design a technological solution for the amount of information
that has to be stored and managed about children. This work started as a qualitative inquiry
using interviews that examined how a network of caregivers managed the record-keeping
process. This network of people involved childcare workers, parents, extended family, and
doctors. Their central argument is that record keeping is an important process of
documenting when children reach developmental milestones. This practice, however, is
tedious (Kientz et al. 2007). Important findings from the study were that doctors were the
most trusted source of information about a child’s development. Doctors also supply
information during visits but parents usually have their hands full and forget. This
relationship is key to understanding the conceptions that parents have about authoritative
information. The second related finding indicates that parents had concerns about the
privacy of information managed by secondary caregivers, like childcares or nannies. From
this research a series of design requirements were elicited (e.g., Take advantage of existing
motivations).

Additional studies have demonstrated prototypes for a system that supports the needs
elicited in the prior study (Kientz et al. 2007; Kientz et al. 2009b). This includes providing a
multimedia and developmental milestone repository with interfaces to display how the child
is progressing. This milestone repository could be software that is deployed on a home
computer. Another choice is to create an all-inclusive device that allows for recording
pictures and videos of the child showing milestone progress. This prototyped system would
include toys with sensors built in to record when children play with them in such a way that
a milestone has been met. While this work is in the same area, and documents some of the
same needs (i.e., mass amounts of information, data recording, etc.), it does not focus on the
security and privacy of practice. Additionally, this work has focused on how parents manage
the documentation. Instead, I am focusing on how childcares manage documentation, with
parents’ involvement only impacting privacy concerns.
2.3.3. Medical Centers and Hospitals
There has been ethnographic work to examine how communities work together to manage a
patient’s information. Reddy and Dourish (2002) explore how nurses,
doctors, and other medical staff in an intensive care unit coordinate through the use of
rhythms. They argue that the management of the people and their meta-information is so
tightly coupled that they are inseparable from the work being done: “Information is not a
separate focus of concern, but is woven seamlessly into the work of the unit.” To
understand how workers could manage such integrated systems, Reddy and Dourish
explored the idea of rhythms of work – the repeated temporal patterns of increases or
decreases in work activity and information management. This lens facilitated understanding
how medical workers would transfer, communicate, and manage the work that needed to be
completed. For instance, they were able to see that workers oriented towards when
information was and was not available.

Jakob Bardram is perhaps one of the most prominent voices in the field of HCI, medical systems, and
usable security. In one of his earliest papers he analyzed how medical work systems use plans
to mediate their work through the lens of Activity Theory (Bardram 1997). Bardram
proposed that medical work systems have an intrinsic need to plan, and, in turn, to crystallize
these plans into representational and mediating artifacts. However, medical work systems are
still deeply contextual. He explains this by segregating plans, as a conceptual unit, into their
different uses. For example, he explores plans as socially constructed and used artifacts along
with exploring them as records and means of dividing work. This conceptualization of plans
is useful for understanding how they are represented in Activity Theory. For my research I
will need to determine if and how security is a type of plan, and how important these plans
are for managing actual security work. (More details on this work can be found in Section
3.1.2 Activity Theory & Explicit and Implicit Policies.)

One of Bardram’s most substantial pieces of work has been his ABC Framework (Bardram 2005),
where he proposes to support the activities of medical teams with “simple, light-weight,
versatile mechanisms” (Bardram 2005, pp.166). He argues that the medical setting is one that
is valuable for studying given the pervasive use of technology, but that it also involves a
setting that is different from the typical workplace. The medical setting is one that involves
ad hoc collaboration, mobility, interruptions, and an extreme focus on communication. To
examine this space he proposes to study the following dimensions of activities: time and
space – managing work in different physical locations and at different times; task – managing
the collection of services; and, users – managing real-time and asynchronous collaboration
with awareness information. In later work Bardram outlines the principles that underlie the
ABC approach: activity-centered resource aggregation, activity suspension and resumption,
activity roaming, activity sharing, and activity awareness (Bardram 2009b). Bardram’s work
on the ABC framework has been applied to context aware computing (Bardram 2009a). The
idea is to link ongoing user-created activities to location-based systems of activity (e.g.,
pulling up prior work on a patient as the clinician approaches the patient’s bedside).

Work has been conducted to examine how the medical setting has managed the security of
private information (Adams et al. 2005a), how articulation work supports collaboration in a
medical setting (Bossen 2002), how to manage the mobility of medical collaboration
(Bardram et al. 2005), and the use and creation of multiple surfaces to help collaboration and
management – with a specific focus on supporting medical work (Bardram et al. 2009).
2.4. Explicit and Implicit Policies
As explained in the work of Adams and Blandford (2005a), policies are traditionally
authoritarian in nature – being passed down from top-down decision makers. Security
policies, as one type of organizational policy, are traditionally created and implemented with
a focus on enforcing procedures in work practice. Examples from medical practices and
childcares are the use of policy books and HIPAA compliance forms. In these artifacts,
guidelines for how the business will manage the personal sensitive information are outlined.
Parents and patients are not given power over how their information will be managed.

One area of usable security has focused on the use of explicit rules for managing online
privacy. Example concerns include the secondary use of online browsing or financial
information or securing access to sensitive personal information – like medical information.
Apart from the implementation of the P3P project, there has been research into additional
methods that take account of the situation to determine privacy. For example, the work of
Ackerman (2004) has suggested the use of privacy labels so that users can be notified of
possible data capture, use, and reuse as they interact with web-based technologies. Another
example is the work of Cranor et al. called Privacy Bird (Cranor et al. 2002). Privacy Bird
reads the P3P information in a website and then displays a visual indicator of how closely
the privacy standards for that website match personal settings (e.g., green indicator means a
good match).
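
The core comparison behind such an indicator can be sketched as matching a site's declared practices against personal settings. The simplified policy fields below are assumptions for illustration, not the actual P3P vocabulary or Privacy Bird's implementation.

```python
# An illustrative sketch of the matching step behind a Privacy Bird-style
# indicator: compare a site's declared data practices against personal
# settings and pick a color. The policy fields are simplified assumptions.

def indicator(site_policy: dict[str, bool], preferences: dict[str, bool]) -> str:
    """Green if every practice the user forbids is absent; red otherwise."""
    for practice, allowed in preferences.items():
        if not allowed and site_policy.get(practice, False):
            return "red"   # the site performs a practice the user forbids
    return "green"         # good match between policy and personal settings

my_prefs = {"share_with_third_parties": False, "telemarketing": False}
site_a = {"share_with_third_parties": False, "telemarketing": False}
site_b = {"share_with_third_parties": True,  "telemarketing": False}

print(indicator(site_a, my_prefs))  # green: policy matches personal settings
print(indicator(site_b, my_prefs))  # red: mismatch on third-party sharing
```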

Another set of work from usable security that makes rules explicit is the use of semantic
ontologies for defining access rights. An example ontology comes from the work of
Chowdhury et al. (2007). They define project roles – project leader, supervisors, members,
and visitors; privileges for each of the roles – administrator, read/write, visibility, and final
approval; and project resources for project roles – membership details, deliverables, and
documents. Specific people are then assigned to roles with predefined privileges and access
to resources. This allows privacy and access to information and resources to be restricted by
how the ontology is defined.
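
To make the structure of such an ontology concrete, the sketch below maps the roles, privileges, and resources named above into a static access table. This is my simplification for illustration, not Chowdhury et al.'s formalism, and the person names are hypothetical.

```python
# A minimal sketch of role-based access in the spirit of the ontology
# described above: roles carry predefined privileges over project resources,
# and people are assigned to roles. A simplification for illustration only.

PRIVILEGES = {
    # role: {resource: privilege}
    "project_leader": {"membership_details": "administrator",
                       "deliverables": "final_approval",
                       "documents": "read/write"},
    "member":         {"deliverables": "read/write",
                       "documents": "read/write"},
    "visitor":        {"documents": "visibility"},
}

# Hypothetical people assigned to the predefined roles.
ASSIGNMENTS = {"alice": "project_leader", "bob": "member", "carol": "visitor"}

def privilege_for(person: str, resource: str) -> str | None:
    """Access follows from the role assignment, never from the individual."""
    role = ASSIGNMENTS.get(person)
    return PRIVILEGES.get(role, {}).get(resource)

print(privilege_for("alice", "deliverables"))  # final_approval
print(privilege_for("carol", "deliverables"))  # None: not visible to visitors
```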

In conjunction with explicit policies, there are implicit cultural norms that will regulate
behavior and expectations surrounding information use. Social norms that are of particular
salience to security and information management are privacy and trust, as discussed in the
previous sections. In particular, Palen and Dourish (2003) state in their work that the use of
explicit rule-based privacy management is problematic, as seen in electronic file systems and
the use of semantic ontologies. This is because privacy is a dynamic interpersonal construct
that changes depending on the situation and context.
2.4.1. Local and Global Process
In both childcares and medical practices, the businesses are given a set of mandated explicit
practices to follow by government agencies. Parts of these regulations cover
how to manage personal sensitive information (e.g., HIPAA). Business processes, such as
those passed down by the government in childcares and medical practices, are often treated
as centrally-owned, explicit, and mandated work (Suchman 1995; Dourish 2001). However,
various ethnographic studies have shown that workers formulate their own objectives for
business processes and work out strategies and methods for attaining them (Gerson et al.
1986; Sachs 1995; Schmidt et al. 1996).

In prior work that I have done, we called these locally-owned processes (Moran et al. 2009).
They are local in that they are adapted to local work situations and needs, even when
workers are following centrally defined and mandated organizational processes, which are
called central processes. Local processes are owned by the workers involved and responsible
for doing the work. A number of studies show that people adapt central processes to local
situations and needs, e.g., (Suchman 1983; Suchman 1995; Fischer et al. 2001; Hardstone et
al. 2006). However, little is known about how workers actually go about evolving local
process representations over time, and manage discrepancies between global and local
processes. Given that the work of managing sensitive personal information in childcares and
medical practices involves evolving local processes, this is a space where breakdowns may
occur.

An example from medical practices would be the HIPAA stipulation that the medical
information to be protected includes health information, personal information, and financial
information (see http://www.hhs.gov/ocr/privacy/hipaa/understanding/coveredentities/usesanddisclosuresfortpo.html). Financial information is usually completely managed
through electronic methods, but health records are not always managed through an
electronic system. This leeway in how to manage the HIPAA regulation facilitates medical
practices in determining how they will coordinate and manage the privacy of their patients in
local ways. For instance, one subject in the pilot study had an electronic file of the patient’s
health information, a physical file of the patient’s health information, and a separate
electronic file of the patient’s financial information. Another participant had one file for all
of the patient’s information. An example from the observations of the childcares showed
that one childcare had created a checklist for evaluating classrooms for compliance with the
Department of Social Services licensing requirements. The original list is three pages long,
but the childcare created a condensed list of how they interpreted the requirements and used
a checklist instead of the globally-owned process.

Figure 3. Example of locally-owned process of managing DSS requirements.


Two studies were done to examine how workers manage iterations in locally-owned
processes. The first study consisted of interviews with 14 participants discussing 42 local
processes. From the interviews on locally-owned processes we found four ways in which
process documentation was used and updated: (1) as explicit and complete how-to
information for a complex activity, (2) as a means of maintaining status information for a
complex activity, (3) as informational documentation used during an activity, and (4) as a
final product of a process.

The second study involved watching 27 participants pass on locally-owned processes in a lab
study with four conditions. The goal was to study the cognitive capacity of groups to
collaboratively manage coordination of locally-owned processes. The different conditions
with research questions were:
• Stable-Static Group (the situation and instructions are the same for each participant). How
much variation is there in coping with and fixing discrepancies?
• Drifting-Static Group (subsequent situations have higher deltas, but the instructions are the
same). What limits do people have for coping with discrepancies?
• Stable-Evolving Group (the situation stays the same, but evolved instructions are passed from
one session to the next). Will the instructions become optimized to the specific situation,
that is, will the deltas approach zero?
• Drifting-Evolving Group (subsequent situations have higher deltas, but evolved instructions
are passed from one session to the next). Will the instructions evolve to keep up with
drifting situations?

Figure 4. The quadrants depict deltas for the four groups in a study about global and
local processes: the height of a bar represents the number of deltas a participant
encountered during that session. Arrows show how updated instructions were passed
between sessions in the two evolving-instructions groups.

Of 27 participants, 23 followed the practice of evolving LPRs (locally-owned process
representations). Figure 4 depicts the number of deltas encountered during these 23
sessions. We focused our analysis on how well
participants evolved the instructions, measured by the number of discrepancies they fixed. A
fix removes a discrepancy from the instructions, and thus is a positive change. The grouping
of sessions also enabled us to examine the effects of evolving instructions on the number of
deltas, in both stable and drifting situations. Evolved instructions had the predicted effect:
deltas for sessions using evolved instructions were lower than the corresponding static-
instruction sessions. Regardless of delta, all participants could cope with all discrepancies to
correctly complete the disclosure process, even if they did not fix all of them. As delta
increased, the number of fixes also increased, but the percentage of discrepancies fixed
remained constant, averaging 41%. These results may seem obvious, but we think it is
important to empirically demonstrate that most people can follow an effective practice of
evolving locally-owned process representations. For example, participants might not have
been able to cope or to revise instructions in a way that benefits succeeding participants.

The lab study also revealed how people follow the practice of evolving locally-owned process
representations. First, participants did more than fix discrepancies. Participants had no way
to distinguish our planted discrepancies from any other parts of the instructions that they
felt could be improved. We use the term enhancements to label any changes participants made
to the instructions that were not fixes of planted discrepancies. While we reserved the term
fix for changes in the LPR that were positive remedies to planted discrepancies,
enhancements could have negative effects: they could actually introduce discrepancies into
the instructions (25% of enhancements were incorrect, thus increasing the deltas for the
following participants) or could lead to verbose, confusing instructions (one participant in
the Stable-Evolving group spoke about her instructions, which had been revised by three
others: “Some of the notes … were lengthy and confusing….”). Overall, participants made
six enhancements per session on average. We found that the number of enhancements did
not depend on delta and did not vary across the different session groups. This is surprising,
because intuitively one might expect fewer enhancements in sessions with more
discrepancies to fix or with instructions that had already been enhanced.

Participants fixed 41% of discrepancies on average. The most common reason for this
(revealed in interviews) was that participants simply did not notice they had coped with a
discrepancy (10 participants). Other reasons included: feeling like a novice in the disclosure
task (5 participants); being confused by the instructions, which could have reduced their
ability to document discrepancies (2); forgetting to fix a discrepancy (1); having limited time
to make revisions (1); and believing that some discrepancies were so minor they were “not
worth recording” (1). These results indicate that iterations over several sessions may be
needed to optimize instructions, since some discrepancies will often be overlooked in each
session.

This work on locally and globally-owned processes demonstrates how groups coordinate and
collaborate to work through the management of explicit policies. While the focus has not
been on security, this work shows that teams will adapt global explicit policies to meet local
needs.
2.4.2. Breakdowns
2.4.2.1. Models of Breakdowns
Breakdowns are natural and occur in all systems of interaction and at different levels. They
have been a key focus in HCI because studying user errors can improve software designs.
Work on breakdowns stems from the original work examining critical incidents in
psychology. In Flanagan’s foundational paper on the
critical incident technique he defines the method’s purpose as “collecting direct observations
of human behavior in such a way as to facilitate their potential usefulness in solving practical
problems and developing broad psychological principles” (Flanagan 1954, pp.327).
Breakdowns are a type of critical incident. They are different in that breakdown analysis
focuses on examining places where the system has not functioned correctly.

One of the earliest works on errors in the interface comes from 1981, when they were termed
‘action slips’ (Norman 1981), building off of the psychological literature of Freud. Action
slips are the result of breakdowns in the mental processing of external stimuli. Using the
terminology of the research, the cognitive user is examining a stimulus that ‘activates’ the
slip, the brain’s ‘schemas’ then enact the slip, and the cognitive capacity of the user is
overloaded while multi-tasking. In this work, the metaphor of the user as a computer embodies
the notions of inflexible stimulus-response pairing to environmental conditions. Once the
cause of the breakdown is deduced and the user’s mental schema fixed, action slips will no
longer occur. This work on cognitive errors still continues, with a focus on deducing
frameworks for understanding the psychological functions of mental errors (see (Reason
1990) for a detailed analysis).

A response to this conception of a user as a computer produced the work on situated action.
In this work breakdowns are less of a focus. Instead, situated action focuses on how people
coordinate their emergent activities to constantly respond in situ to the presented contextual
contingencies. The basic unit of analysis in situated action is the relationship between the
actor and the environment (Nardi 1996). Thus, breakdowns occur when the user is forced to
move from the seamless activity at hand, to focus on the operations of conducting the
activity. For example, Winograd and Flores explain that breakdowns occur when an object that
functions ready-to-hand no longer functions in that manner (Winograd et al. 1986). A hammer
can work as an extension of a hand to knock in a nail. It is only when the hammer no longer
functions to knock in the nail that a breakdown has occurred – sometimes called moving to
present-at-hand. The concept of an external object functioning as an extension of the actor is
similar to the concept of functional organs. ‘Functional organ’ is the term applied to an
external concept or artifact that mediates the actor’s ability to function seamlessly with the
context (Kaptelinin 1995). An intrinsic problem with the situated action standpoint on
breakdowns is that it focuses on the individual user and fails to account for how
communities of people may encounter breakdowns. Breakdowns are also conceptualized as
problems with the artifacts or tools of use, instead of also examining conflicts in motivation,
rules, or other external contingencies.

For my work I will be employing the Activity Theory framework to understand how
breakdowns emerge and are responded to. This is covered in Section 3.1.3 Activity Theory
& Breakdowns.
2.4.2.2. Studies Using Breakdowns
Studying breakdowns provides places where intervention can help. In terms of software
design, breakdowns have been used to study particular situations where current technology
does not meet the needs of the users. For example, in work on activity awareness,
breakdowns were studied to understand places where collaborative software did not support
group tasks (Convertino et al. 2004; Humphries et al. 2004). Breakdowns were first found
from field studies of the use of groupware software. These breakdowns were categorized by
factors such as tool use, tasks, situation, and group dynamics. A confederate and a
participant then replicated the breakdowns induced in the field in a lab setting
(Convertino et al. 2004).

The study of breakdowns is not only prevalent but also diverse in its use. Breakdowns
have been used to study how communication across different technologies may be
affected (Hancock et al. 2009), for understanding a user’s mental model (Sheeran 2000), and
for understanding the problems in design processes (Guindon et al. 1988). In medical work
literature they have been used to understand how teams coordinate their workflows
(Kobayashi et al. 2005) and work trajectories (Ren et al. 2007).

3. Activity Theory as Theoretical Framework


3.1. Introduction & Initial Description of Activity Theory
Activity Theory is a constructivist theoretical framework that strives to understand the
relationship between consciousness and activity (Nardi 1995). In the realm of HCI, Activity
Theory is of primary influence through its emphasis on the role that artifacts, like computers,
play in mediating experience. It is through the artifact that humans engage and interact with
external artifacts and other humans. At a basic level an artifact may be language or gesture;
at a concrete level, an artifact may be a hammer or a keyboard.

Activity Theory can be used as a descriptive framework for understanding every day security
practices. While other work in HCI has focused on the cognitive capacity for what a user is
able to account for (e.g., GOMS), Activity Theory “extends the concept of consciousness
past an idealistic, mentalistic construct in which only cognitive resources and attention
‘inside the head’ are at issue, to a situated phenomenon in which one’s material and social
context are crucial” (Nardi 1995, pp.13). This focus on context and artifact is different from
the traditional cognitivist concentration on information and its representations. Instead, by
studying how people act and work in situ, Activity Theory facilitates understanding the
dynamics of interaction. The analysis and application of Activity Theory in the remaining
part of this chapter use Engeström’s adaptation of Activity Theory, which extends the analysis
from the individual and object to also consider the individual situated in a community.
3.1.1. Background Description of Activity Theory
Activity Theory is not a theory in the sense that there is a set of rigid principles that guide its
use. Instead, Activity Theory is a set of disparate concepts about how individuals and
communities act and interact. It stems from Marx and Engels, but is highly influenced by
Vygotsky (Roth et al. 2007), Leont’ev (Leont'ev 1981; Russian original 1947), and Luria.
Kuutti has explicated key aspects of Activity Theory and I will be summarizing them in this
chapter in reference to the realm of how people manage sensitive personal information
(Kuutti 1995).

The first important principle of Activity Theory is that the activity is the central unit of
analysis. This analysis is at a higher level than the actions or operations of an individual. An
activity encompasses an individual acting in a particular context towards a goal. This focus
on the context is important for my work. It places the emphasis not on the actions that
people are doing to keep sensitive personal information secure, but instead on the context of
the surrounding activity that allows the actions to take place. This is not to diminish an
understanding at these levels, but emphasizes seeing the encompassing activity as central to
understanding the other aspects.

Activities are dynamic and unfolding. This means that when I observe or take part in an
activity, there is a history to the context, situation, and artifacts that I am observing. At a
deeper level, the artifacts that I will be observing and working with are instantiations of prior
activities. For example, there are forms about who can access information stored in a child’s
file that parents are requested to fill out during enrollment. The creation of this form has a
particular history about the need that it embodies along with historical information about
edits and uses of it.

The artifacts of an activity, such as the form in the last example, are mediators. Mediator, in
this connotation, means that the artifacts work to connect and facilitate manipulation. For
example, the director lets patients know of HIPAA policies through the mediation of a form
explaining the regulations. The artifacts can function as instruments of action. And, because
artifacts are created, they have limitations. Knowing the limitations of an artifact, however,
may be particular to the activity at hand (e.g., a piece of paper makes a great paper airplane,
but is not a great material for clothing).

Activities have a particular structure, as seen in the figure below. For an individual, the
the activity is a set of actions towards an object to create an outcome. It is the object of an
activity that distinguishes it from other activities; the same actors and artifacts may be used
but with different object(ives). For example, a director and the HIPAA explanation form
may be used for the objective of notifying patients of their rights, but also for demonstrating
HIPAA compliance to a licensing agent. An object can be something tangible, like a
completed form, or it can be a common idea (sometimes called an objective). The only
qualification of an object is that it can be used by the actor for manipulation. The relationship
between the actor and the object is mediated by the tool, and the tool embodies that
historical context of that relationship. As Kuutti explains, “The tool is at the same time both
enabling and limiting: it empowers the subject in the transformation process with the
historically collected experience and skill ‘crystallized’ to it, but it also restricts the interaction
to be from a perspective of that particular tool or instrument” (Kuutti 1995, pp.27). The
relationship between the subject and object is represented in Figure 5 by a red line. The
mediation between the subject and object is represented by the black lines connecting them.
The transformation process of actualizing the object into an outcome is represented by the
arrow.

Figure 5. The form of an activity at the individual level, sometimes called Mediation
Model (Mwanza 2001).
Figure 5, however, only captures the individual’s conception of the activity. The full
conception of an activity involves considering the actor in her environment. Part of the
environment involves the community that the actor is a part of. For example, a director is
one actor in a community of administrative staff and additionally one actor in an even larger
community of medical personnel. Communities are defined by sharing the same object. This
is depicted in Figure 6 with the red lines between community-subject and community-object.
(Figure 6 presents a simplistic view of activity. There are actually mediating relationships
between all parts of the activity.) Considering the individual as part of the community is a
foundational part of Engeström’s contribution to the further development of Activity
Theory (Engeström 1987).

Figure 6. An Activity from the community and individual perspective, sometimes
called the Activity Triangle Model (Engeström 1987; Mwanza 2001).
The relationship between the community and the subject is mediated by rules. These rules
are the focus of my dissertation study. Rules may be explicit or implicit, as long as they are
norms that regulate the activity. An explicit rule may be something that has been crystallized,
for example, into a policy book that parents sign. An implicit rule may be something that
everyone understands but does not discuss, for example, a social norm that states that other
workers do not touch the medical director’s computer – even though it is in a commonly
shared space.

The relationship between the community and the object is the division of labor. Division of
labor refers to the explicit and implicit organization of the community. For example, it is the
job of the director to manage the patients’ files for the next day, whereas it may be the
administrative assistant’s job to print out the schedule for the next day and to check that it
does not have patient information on it.

Activities have levels. At the most abstract level is an activity, which has a motive. At a lower
level is an action, which has a goal. At the lowest level is an operation, which has conditions.
The operations of an activity are the most basic level. An example is learning to type. When
first learning to type, the operation of moving fingers to keys is the base. Once fingers can
reliably strike keys, the keystrokes can be combined into the actions of writing. The actions
of writing can then be collated into the activity of conveying meaning. As the typist practices,
the operations become rote and no longer demand intense concentration; focus can instead
be placed on managing the actions, and so forth.

Figure 7. Levels of activity.

Activity Theory recognizes an external state and an actor’s internal state. Tools, such as a
computer or a patient’s records, are external. Concepts and heuristics, such as not accessing a
child’s record without permission from the director, are internal (Kaptelinin 1995). Concepts
and heuristics are internalized through the zone of proximal development. The zone of
proximal development is the amount of progress that can be made by an individual based on
their surrounding contextual learning tools. For instance, a student with a good teacher may
have a larger zone of proximal development than a student with a poor teacher. The process
of internalization within the zone of proximal development can be characterized as “from
inter-subjective mental actions to intra-subjective ones” (original emphasis, Kaptelinin 1995). It is
in this way that actors within childcares and medical practices can internalize the external
social norms and policies of their workplaces. Externalization, on the other hand, is the
materialization of internal processes into external actions that can be reacted to and adjusted.
3.1.2. Activity Theory & Explicit and Implicit Policies
Bardram explains how artifacts can work as mediators of work in a medical setting (Bardram
1997). As explained in the previous section, the rules component of Engeström’s activity
structure is made up of explicit and implicit policies that guide the individual’s and
community’s behaviors. One method of studying the explicit policies is to examine the
artifacts that have crystallized the rules. A very clear example is the use of policy books and
HIPAA explanation forms that parents and patients sign when enrolling.

Bardram explains how these crystallized artifacts that embody rules are at once planned and
contextual. This is important when considering the possible implementation of electronic
workflow systems in medical settings. Medical work systems are, on the one hand, deeply
invested in the use of plans, schedules, and their crystallized instantiations in charts and
whiteboard tables; at the same time, they remain rife with in situ contextual interpretation of
the patient’s needs. These two seemingly contradictory methods of working are explained
through the lens of activity theory.

Medical personnel are capable of making plans and then making these plans into objects
through anticipatory reflection. Bardram argues that the operations, actions, and activity are
all guided by anticipation:

The anticipatory reflection guides the activity by making an afferent synthesis
between a perception of the environmental state of the activity, and memory
(i.e. the cumulated experience of the person). This afferent synthesis forms
an anticipation of the future state as a result of the activity about to be
performed. When the activity is performed there is a feedback mechanism
which compares the result of the activity with the prediction, and any
incongruence (i.e. a breakdown) gives rise to a learning situation (i.e. the
experience of the person is expanded). (Bardram 1997)

As this quote explains, a person’s perception and memory together facilitate mentally
constructing a future state. This future state is then compared against the real activity. When
there is a discrepancy between the future state and the real activity, a breakdown has
occurred. To mitigate future breakdowns the individual will learn, adapt, and revise any
artifacts that crystallized prior constructions of anticipated future states. This feedback cycle
is how medical work systems are at once planned and deeply contextual.

Understanding how norms become crystallized into artifacts, and how these artifacts then
guide behavior, is important for my work. Activity Theory provides a framework for
understanding how such policies can be both formal and impromptu.
3.1.3. Activity Theory & Breakdowns
Given my focus on breakdowns in the explicit and implicit policies currently used in
medical and childcare practices, there is a need to understand the theoretical underpinnings
of how communities handle them. Activity Theory has a model of breakdowns that stems
from Marxist theory (Nardi 1998). When an activity system has a perturbation, it indicates
a need for growth; the activity system needs to change. A ‘contradiction’ within
and between elements of an activity, or even between levels of an activity, can cause a
breakdown (Kuutti 1995). The breakdown cycle begins when a conflict or tension within the
system causes a perturbation.

Bardram provides an example of a breakdown from the medical field between the objectives
of a nurse and a doctor. In the example the nurse is trying to help the patient get out of
bed (Bardram 2009b). However, what she does not know is that the doctor has prescribed
intravenous medication that requires the patient to stay in bed. The lack of coordination
between the objective of the nurse and the objective of the doctor created a situation where
the two objectives came into conflict. Bardram explains that there are many times when the
overall objective of the activity (i.e., healing the patient) becomes lost or drifts from the
related individual activities.

The process of responding to a breakdown has been explained further by splitting it
into phases (Raeithel 1996; Nardi 1998; Nardi 2007). The organization moves through
phases where the system works and does not need change, called coordination; phases
where there are minor disruptions to the system, called cooperation; and, last, phases where
major breakdowns have occurred that require the system to realign, called co-construction.

Co-construction has been further explained by Nardi as when the collective group, or
organization, needs to redefine their object(ive): “Co-construction necessarily involves
intensive learning as new practices are being devised, which itself requires learning, and the
new practices are intended to be learned by others” (Nardi 2007). Groups adapt to the
new object through learning at both the personal and community level. They do this
through object construction and object instantiation. Object construction is the process of
figuring out what the object should be, of defining it and motivating the activity (e.g., the
process of figuring out how to get all the forms updated on time). Object instantiation is the
work that goes into materializing the object, of incorporating it into the activity (e.g.,
creating new forms or sending out emails). Through both of these processes, communities
work to create change in response to a conflict or breakdown in the activity system. By
using this framework I will be able to tease apart not only that a breakdown has occurred,
but how the community responded and instantiated a change. I will also be able to
distinguish between cases where a breakdown has been recognized but not responded to,
and cases where a breakdown has been recognized and a change instantiated, but the change
does not meet the needs of the current activity system.

In her work using video data to examine focus shifts and breakdowns in the use of computer
applications, Bødker suggests asking the following questions to identify the cause of a
breakdown within an Activity Theory framework (Bødker 1996): (1) For an individual, what
is the purpose of the activity and actions? (2) When there is more than one person
participating in the activity, do the tools, purposes, and objects conflict between the
individuals and groups?
3.2. Security as an Activity
To conceptualize security as an activity, I have followed the method outlined by Mwanza.
She details how to determine the parts of the activity system as described by Engeström
(Mwanza 2001). The steps are (1) model the situation being examined, (2) produce an
activity system of the situation, (3) decompose the situation’s activity system, (4) generate
research questions, (5) conduct a detailed investigation, and (6) interpret findings. In this
section I will detail the products of steps 1-4. Two activities, each highlighting a different
social norm from the pilot data, will be examined: (1) a teacher asking for access to
retrieve information from a child’s file, depicting privacy concerns, and (2) an alternative
example of a staff member scanning a log of information, depicting the community
of trust in managing personal information. The purpose of this section is to provide details
of the design space and to create a lens for the proposed work section of this proposal.
3.2.1. Descriptions of the Activities
Before transforming the data into an activity framework, the pilot study data from the two
examples is presented. Both examples are taken from the observations conducted in the
Fall of 2009 at childcares.
3.2.1.1. Example 1: Accessing a Child’s File
The directors at the studied childcares function as moderators and guardians of the children’s
information. They mediate and negotiate the privacy of the information by regulating access
to the physical information. This means that there is a policy in the childcare stating that the
director is the one who regulates the staff’s ability to access a child’s file. In the following
quote the Director explains that a teacher has to ask her for a file; then, if the Director
determines that the requestor’s need to access the information is valid, the Director will
retrieve the information for her: “… Aside from that all teachers and staff have to come to
the administration first before they go into a file, they can't go in even to their own file until
a director pulls it out for them because even for staff records, pay is confidential.”

While observing the same director quoted above, Stacy Branham, a Ph.D. student who
was working with me for the semester, observed a teacher ask for access to the files
for the students in her room. Here is a snippet from Stacy’s observation journal. Her
interpretations are in curly brackets and the observed actions are in angle brackets.

A teacher appears in the doorway and walks into the room and approaches
the corner of Director’s desk: “Hey” "If I want my kids' middle names, are
they gonna be in here or in the file?" <points to black box> "they'd be in
their file" The teacher then says, in a much softer voice, “can I dig?” The
informality of this word, the tone of her voice makes this seem like a
common practice. The Director responds with a carefully-laid sentence and a
slight sternness in her voice--her eyebrows raised, eyes widened, and a
scolding manner in the slow, undulating movements of her head and voice as
she speaks: “I’ll have to dig for you.” I can only see the side of the teacher’s
face, but she looks shocked in that her face pauses, gaze held with the
Director’s, and there’s a space created that seems to beg of a verbal
response. The Director fills the silence with a soft, earnest “I’m sorry” that
seems to answer the teacher’s unspoken question: “I can’t, really?” It
convinces me that this is usually not the situation the teacher finds herself in.
The teacher explains that she wants to make a name book with their full
names and asks the Director {as if to justify her request, perhaps to implicitly
ask the Director to reconsider} if she needs her to make a reminder note {as
if to play with the notion that it’s really just a small request and to suggest a
heavyweight documentation of it in rebellion}. The director finishes with “I
was gonna say, do you need them now or can I get back to you later?”
{Perhaps Claudine is making space for the teacher to come back later to
service her own request, when I’m not there}. The teacher responds meekly,
an unexpected wrench put in her planned activity: “latter’s fine.”

In this example the teacher comes into the Director’s office, where the files are located,
and attempts to access the files herself to get the information she wants. Instead, the
Director intercepts her actions by restating the community policy that the Director must
retrieve the files for her.
3.2.1.2. Example 2: Scanning Publicly Displayed Private Information
Personal information about children in a childcare is ubiquitous. Apart from the child’s file,
there are logs of activity and sub-records kept in classrooms. The front desk can also store
personal sensitive information about the children. It is one of my hypotheses
that what guides access to this publicly displayed private information is a community of trust
among the childcare personnel. In this snippet the director talks about the balance
between confidentiality and people trusting one another. She explains that a director is
always in the office with the files, but even when a director is not present, staff
trust each other enough to not violate confidentiality:

Since there's a director always there, teachers are bound by confidentiality, it's
in their agreement, it's in our handbook, any violation of confidentiality is
immediate grounds of termination. We try to use a lot of trust. Pretty much,
there's nothing--more often than not there's not anything that they can't see.
Um, there are cases of children that you know we've had suspicions of abuse
or different information, but we kinda want at the same time for them to be
privy to that so that you know if there is a future concern…

During an observation session, Monika Akbar, another Ph.D. student who worked with me
during Fall 2009, observed someone examining a journal, or as Monika calls it, a
notebook, that an assistant director keeps at the front desk. This journal holds notes of
things to do, occasional updates to children’s sensitive information, and records of the
comings and goings of activities in the childcare. Here is a snippet from Monika’s
observation journal:

{The assistant director} goes to restroom and leaves someone at the front
desk. I am assuming that someone is a teacher since I saw her with a group
of children earlier who went out with the children and after a while came
back without them. The notebooks I saw open earlier that {the assistant
director} was working on are still open. This teacher is looking at the
notebooks and other files. She is not brushing through the data, she has her
full concentration. She is an older woman, in her 50s, with an apron which
makes me think that she might also work as a kitchen staff. She is reading the
files without any kind of hesitation. It looks like she doesn’t care if someone
sees her reading those files.

In this example it is apparent that the community works in such a fashion as to facilitate
sharing of sensitive information in order to work effectively.
3.2.2. Step 1: Modeling the Situation
To model the situation Mwanza said to complete the identification of the activity of interest,
the object of interest, the subjects in this activity, the tools that are mediating the activity, the
rules that are regulating the activity, the division of labor that is mediating the activity, the
community that the activity is taking place within, and the desired outcome of the activity –
essentially the different parts of Engstrom’s Activity Triangle (Mwanza 2001). For both
activities, these parts have been listed in Table 3.

Table 3. Deconstructing the sample activities using Mwanza's Activity Theory framework.

Activity of Interest
  Activity 1: Teacher accessing a child's files
  Activity 2: Cook accessing the public journal of private information
Object(ive)
  Activity 1: To acquire the middle names of the children in her classroom
  Activity 2: To become up-to-date on the latest comings and goings of the childcare
Subjects
  Activity 1: Teacher
  Activity 2: Cook
Tools
  Activity 1: Children's files, filing cabinet, director's office
  Activity 2: Journal, front office
Rules
  Activity 1: Rules of privacy – asking before entering the filing cabinet; talking to the director when entering the room
  Activity 2: Community of trust – sharing responsibility for managing the child's care
Division of Labor
  Activity 1: Managing the business, entertaining and watching children
  Activity 2: Cooking, watching children, managing the business
Community
  Activity 1: Childcare personnel, parents, licensors
  Activity 2: Childcare personnel, parents
Outcome
  Activity 1: Not having access to the child's files
  Activity 2: Updated on where she is needed and moves to that place
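
To make the decomposition concrete, the structure of Table 3 can also be expressed as a
simple record type. The following is a minimal sketch in Python; the class and field names
are my own shorthand for Mwanza's categories, not notation from her framework, and the
encoding of Activity 1 simply restates the table above.

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class Activity:
        """One activity decomposed into Mwanza's categories (Mwanza 2001)."""
        activity_of_interest: str
        objective: str                # the object(ive) distinguishing the activity
        subjects: List[str]           # actors carrying out the activity
        tools: List[str]              # mediators between subject and object
        rules: List[str]              # mediators between community and subject
        division_of_labor: List[str]  # mediators between community and object
        community: List[str]
        outcome: str

    # Activity 1 from Table 3, encoded as data
    activity_1 = Activity(
        activity_of_interest="Teacher accessing a child's files",
        objective="To acquire the middle names of the children in her classroom",
        subjects=["Teacher"],
        tools=["Children's files", "filing cabinet", "director's office"],
        rules=["Asking before entering the filing cabinet",
               "Talking to the director when entering the room"],
        division_of_labor=["Managing the business",
                           "Entertaining and watching children"],
        community=["Childcare personnel", "parents", "licensors"],
        outcome="Not having access to the child's files",
    )

Representing an activity this way has no analytic power of its own; it only makes explicit
that every activity in the study is described by the same eight slots, which is what allows
the activities to be compared and decomposed in Steps 3 and 4.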

3.2.3. Step 2: Produce an Activity System of the Situation Being Investigated
From the table I am able to visually construct the Activity Theory diagrams that depict the
activities. These can be seen in Figure 8 and Figure 9.

Figure 8. Activity Theory framework applied to the activity of a teacher attempting to
access a child's files.

Figure 9. Activity Theory framework applied to the activity of the cook examining
publicly displayed private information.
3.2.4. Step 3: Decompose the Situation's Activity System
In the previous two steps I have walked through two sub-activities of the larger activity of
making personal sensitive information secure. To understand the larger activity, it is
necessary to understand the smaller sub-activities that make it up. In fact, there are times
when sub-activities come into conflict, resulting in a breakdown.

To understand the larger activity of how childcares and medical centers manage sensitive
personal information, Mwanza says to decompose the larger activity into the categories of
Actors, Mediators, and Object, where Tools, Rules, and Division of Labor function as
Mediators (Mwanza 2001). Dividing the larger activity in this fashion facilitates seeing holes
in the activity, or places where questions have unknown answers – covered in Step 4.
These categories, representing what is known from the pilot study data, are listed in
Table 4. This list will expand after the observations are complete.

This table is conceptually similar to understanding a design space through the use of
morphological boxes. Morphological boxes are used to conceptualize and graphically depict
combinations of parts from different categories. They can be used to understand combinations
that do not work and to conceptualize the scope of the design space. It is in this latter use
– detailing the aspects of the activities in regard to securing sensitive information – that
this table is intended.

Table 4. Lists of known actors, mediators, and object(ive)s from pilot data.

Actors (Subjects & Communities): Director; Administrative Staff; Teachers; Nurses; Doctors; Childcare Owners; Licensors; Parents; Patients; Extended Family Members; Insurance Companies; Health Department

Mediators – Tools: Patient's & child's file; Front desk journal; Teacher journals; Scrap notes; To-do lists; Post-it annotations; Backpack mail; Ambient information (i.e., fliers & notice boards); HIPAA forms & policy notebooks; Bus & playground checklists; Electronic file; Billing file

Mediators – Rules: Community of trust; Director's office functioning as a sacred space; Consulting the director before accessing files; Having information ready-at-hand; Keeping information redundant to protect against information atrophy; Need-to-know policy; Parents as outsiders; Policing unknown persons; Licensing & government regulations; Honoring patient/child/student privacy; Protecting against future lawful action

Mediators – Division of Labor: Healing; Documenting; Planning; Billing; Teaching; Passing inspection; Paying; Communicating

Object(ive)s: To serve a community purpose; To be a business; To keep patients and children safe and healthy; To meet the standards set down by governing agencies; To ensure life's work

3.2.5. Step 4: Generating Research Questions


Stage four involves decomposing the activity of securely managing sensitive personal
information into research questions that will inform what I look for in my observations.
Mwanza explains that a research question can be formed by combining an actor, a mediator,
and an object(ive). For example, ‘how do licensing and government regulations affect the
way that doctors serve a community purpose?’ While many such questions can be created
from Table 4, the encompassing question that will be explored through different parts of
Table 4 is ‘how are breakdowns reflected in changes made to rules in regards to the
collaborative management of sensitive personal information?’ This question and its
sub-questions will be explored in the next chapter.
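
As an illustration of this combination heuristic, the short sketch below enumerates
candidate question stems as the cross product of actors, mediators, and object(ive)s drawn
from Table 4. It is only a schematic aid, assuming abbreviated example lists; the generated
stems would still require a researcher's judgment to become genuine research questions.

    from itertools import product

    # Abbreviated, illustrative subsets of the Table 4 lists
    actors = ["doctors", "teachers", "parents"]
    mediators = ["licensing and government regulations",
                 "the need-to-know policy",
                 "the front desk journal"]
    objectives = ["serve a community purpose",
                  "keep patients and children safe and healthy"]

    # One question stem per (actor, mediator, objective) combination
    for actor, mediator, objective in product(actors, mediators, objectives):
        print(f"How does {mediator} affect the way that {actor} {objective}?")

Run over the full Table 4 lists, this enumeration plays the same role as a morphological
box: it exposes the scope of the question space, including combinations that are
uninteresting or do not apply.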
3.2.6. Step 5 & 6: Conducting and Interpreting Findings from a Research Study
Steps 5 and 6 are what I am proposing to do for my dissertation work and will be covered in
the subsequent chapter. Roughly, these steps involve attempting to answer the questions
posed in Step 4 and then framing the observed activities to highlight each question’s
resolution. I am proposing to frame the activities, and then to build from them a set of
near-future scenarios that outline emerging design implications.
3.3. Alternative Frameworks
3.3.1. Communities of Practice
Communities of Practice is a framework for understanding how groups of individuals work
together (Lave et al. 1991; Wenger 1998). A community may be an organization like a
medical practice or a childcare, but it might also be more fluid. A central tenet is that
groups of people come together centered on an activity that engages them to communicate
and interact. The three dimensions of a Community of Practice are (1) what it is about – the
continual negotiation of what the group is engaging in, (2) how it functions – engagement
with the social rules that bind the group, and (3) what capability it produces – the shared
resources that have been co-produced and are accessible (Smith 2003).

The strength of the Communities of Practice framework is the understanding it provides of
how people move from novices to experts within groups, called Legitimate Peripheral
Participation (LPP). LPP is the idea that newcomers to a Community of Practice are
apprenticed within a work practice. An apprentice advances by becoming a fully-fledged
member, with the possibility of becoming a master or leader. Progress is partially determined
by the amount of access a newcomer has to the masters.

Adams and Blandford used the Communities of Practice framework to examine security use
in clinical settings (Adams et al. 2005a). The premise of their work is that hospitals are
hierarchical organizations with power structures that influence the adoption, use, and
application of security measures. They found that people within the community used
authentication systems to exclude others from participating fully.

The Communities of Practice framework does describe how groups interact. However, its
focus is on learning and the insider-outsider perspective. If insider-outsider power issues
arise in the settings I will be studying, Activity Theory will account for them through
mediators such as rules or objects. As for learning, if it is an objective of a participant in
my study, it will be conceptualized through the Activity Theory objective and outcome.
3.3.2. Distributed Cognition
Distributed Cognition is an analytical framework for understanding the cognitive activity of
group-based problem solving (Perry et al. 1997). Distributed Cognition focuses on actors
working together through objects to solve problems within a particular context. It details
representational states and how information flows between these representations, facilitating
the analysis of all the pertinent factors affecting the group’s task. This is usually done
through the creation of diagrams detailing representational state changes. It has been used to
study the workplace, in particular settings of group navigation such as the airplane cockpit
(Hutchins et al. 1996).

At the base of the Distributed Cognition framework is a cognitivist conception of human
cognition as computation. This is perhaps the largest difference between Activity Theory and
Distributed Cognition. Humans are thought to receive input through sensory stimulation,
and the brain creates representations from these stimulations. These representations are
then processed, resulting in the human producing some output. Activity Theory, on the other
hand, stipulates that humans are not equivalent to computers (Nardi 1996). Another difference
is that Activity Theory holds that knowledge creation is not akin to a program producing
processed data, but instead the validation of intermediate mediators.

There are many similarities between Distributed Cognition and Activity Theory. Both focus
on the use of tools to mediate goal-directed behavior, hold that people are social and work
together, and take as their object of study people working with these tools (Nardi 1998). In a
paper submitted by Baumer and Tomlinson to alt.chi, not yet accepted, the authors discuss
the differences between Distributed Cognition and Activity Theory by analyzing the same
video clip through the lens of both frameworks (Baumer et al. 2010). They found that
Activity Theory is harder to learn the nuances of, but provides more structure for easier
application. A second difference between the two theories is their temporality: Distributed
Cognition tends to focus more narrowly on discrete actions, whereas Activity Theory tends
to take a broad scope, focusing on activities that can last minutes to months.

Given that my research focus is not on how information flows, but instead on how security
breakdowns are managed, Distributed Cognition was not selected as the analytical
framework for my work.
3.3.3. Actor-Network Theory
Actor-Network Theory (ANT) is a framework for conceptualizing how different objects,
people, and groups interact to form innovations. For example, Song and Zahedi
conceptualize how web designers, web technology, health information, and health experts
are intertwined to create and design a health infomediary (Song et al. 2007). In ANT, the
actors and the tools have equal, symmetric power to impact the innovations. Tools, though,
are socially and historically emergent artifacts that represent innovations from prior
networks of interaction (Latour 1987). In this way, interaction is a socially constructed
phenomenon that depends on contextual factors.

Trust has been accounted for through the use of ANT in the work of Song and Zahedi
(Song et al. 2007). In this conceptualization, different parts of the network take on the
inherent risk of a situation to protect the user. For instance, a health infomediary takes on
the risk of collecting information from different sources to aggregate them into one ‘trusted’
location.

Engeström and Escalante apply both Activity Theory and ANT to the use and downfall of
an electronic system named Postal Buddy (Engeström et al. 1996). The authors explain that
ANT enables the study of activity networks, a subject that Activity Theory is only beginning
to account for. In particular, the ability of a piece of technology to have agency in the
relationships within the network makes sense when thinking of avatars or human-like
robots and interfaces. When ANT was used to conceptualize the downfall of the Postal Buddy
system, the conclusions drawn were insufficient (i.e., the creation of an insider/outsider
spreading network of actors). Activity Theory was then applied and demonstrated that the
leaders of the Postal Buddy company were overly enamored with a concept that instead
annoyed users and postal workers. Activity Theory allows multiple objectives and motivations
to be accounted for; it facilitated understanding the post office as a central place for the
activities of use. In this work ANT was used to show the larger network of people and
objects, but motivations, objectives, and rules proved to be the valuable constructs for
understanding the breakdowns that resulted in the company’s failure.

In my proposed work, the people (i.e., directors, doctors, parents) and the tools (i.e., files,
journals, electronic record systems) can be conceptualized as a large network with security as
the intermediary. This framework, however, does not account for the different factors that
can impact and result in breakdowns in the system. For these reasons, ANT was not selected
as the framework for the proposed research.
3.3.4. Value-Centered Design
Value-Centered Design is a “theoretically grounded approach to the design of technology
that accounts for human values in a principled and comprehensive manner throughout the
design process” (Friedman et al. 2002, pp.1). The focus is to understand the values in the
design environment and then to incorporate them into the resulting technology. There are
three phases to Value-Centered Design, although there is some debate as to their order – see
(Dantec et al. 2009) for an alternative suggestion. The phases are conceptual investigations,
which involve philosophical analysis of the constructs that the values encompass; empirical
investigations, which involve research into the observable and measurable practices
surrounding the technology’s use in reference to values; and technical investigations, which
involve understanding current technological inventions in reference to values and the use of
technology probes designed for specific values.

The work of Value-Centered Design focuses on the creation of new technology. My work,
instead, focuses on understanding the values that impact the security of particular
workplaces. Value-Centered Design may be used as a second step of my larger research goal
to design and create future technologies.
3.3.5. Common Information Spaces
Similar to Bardram’s ABC Framework (detailed in Section 2.3.3) is the work on Common
Information Spaces (Bannon et al. 1997; Bossen 2002). Common Information Spaces (CIS)
is a conceptual framework that proposes to support collaboration in information-rich work
situations. The interrelationship between actors and artifacts is of prime importance,
because it is through the artifacts that collaboration is possible. The parameters of CIS are
debated, but Bossen presents a set of parameters built from ethnographic work in medical
centers: (1) the degree of distribution, (2) the multiplicity of webs of significance, (3) the level
of required articulation work, (4) the multiplicity and intensity of means of communication,
(5) the web of artifacts, (6) immaterial mechanisms for interaction, and (7) the need for
precision and promptness of interpretation (Bossen 2002).

This framework is useful in that it allows for outlining the artifacts of coordination.
However, it does not provide a rich enough framework to explain the needs and goals of the
articulation work that goes into creating information spaces. Given my focus on security,
privacy, and trust, a framework that provides a rich vocabulary for understanding
community and user goals is necessary.
3.3.6. Design Tensions
While Value-Centered Design focuses on listing, understanding, and then mitigating the
values of a design space, Design Tensions are proposed with an alternative goal (Dimitriadis
et al. 2007; Tatar 2007). The Design Tensions framework proposes to evaluate the
conflicting tensions in a design space and to balance them through compromise.
There are four possible parts to constructing the framework: (1) the vision of what is versus
what ought to be, (2) the approach that is taken in terms of the project’s drivers and values,
(3) the tensions that exist in the design space, and (4) the as-created situations that represent
the consequences of the new possible realities. Design tensions are managed through trade-
offs, insight, and reformulation of the vision, approach, or other components (Tatar 2007).

For example, Tatar presents her design work on the NetCalc project, which teaches rates of
change and variation on handheld devices (Tatar 2007). The situation as it stands is
that children do not have access to the technologies, like NetCalc, that could help them learn
mathematics. What ought to be is the availability of resources that could make learning
complex concepts easier. Example tensions that exist in this design space are planned action
vs. conversation, using a private display vs. coordinating the classroom activity with multiple
displays, and workflow handling vs. worksheet handling. Examining all of the tensions in the
design space allowed for an understanding of design consequences.

The design space of managing sensitive personal information presents different design
tensions. At the most basic level, a tension exists between managing current work and the
need to keep information secure. However, the application of the Design Tensions
framework is useful when considering designing future technologies. At this point, the
workspace and the tensions that exist within it are not fully conceptualized. The application
of Activity Theory to this space will help draw out initial design tensions that this framework
can then build upon for future work.

3.3.7. Macroergonomics
Macroergonomics is a framework for examining the systems level of a socio-technical
system. It is derived from the literature on socio-technical systems and is used in the fields of
human factors and ergonomics. Hendrick, a primary founder of Macroergonomics,
distinguishes it from other types of ergonomic work by noting that many company-wide
ergonomic solutions can fail without addressing the larger system of the organization
(Hendrick 2002). The sub-systems of Macroergonomics are the technology sub-system, the
personnel sub-system, and the external environment (e.g., educational, political, cultural, and
legal factors) (Haro et al. 2008). There are five guiding principles: flexibility,
joint optimization, joint design, systems harmonization, and continuous improvement. The
overall goal of Macroergonomics is to “optimize the work system design in terms of its
socio-technical system characteristics, and then carry the characteristics of the overall work
system design down through to the design of individual jobs and human-machine and
human software interfaces to ensure a fully harmonized system” (Hendrick 2002, pp.3).

While Macroergonomics does examine values and work systems, its focus is on optimizing
the socio-technical system for production. My focus, instead, is on studying how the current
systems work to manage sensitive personal information, to better understand the
consequences of incorporating more technologies into the work environment and practice.
It is not my goal to optimize the output of the system, although some suggestions for
technological interventions may be a side product of the findings from the studies.

4. Proposed Research
The purpose of this research is to holistically examine and understand the social and
technical mechanisms that support the collaborative management of sensitive information
in practice. I believe that a better understanding of the practices in the particular
workplaces of childcares and medical practices will allow security researchers to design
systems that are optimized for the best use of explicit and implicit privacy and
communication policies. In seeking to understand these practices I am proposing to answer
the question ‘how do socio-technical systems that use sensitive personal information manage
work-practice breakdowns surrounding the implicit and explicit rules of process?’ I have
further broken this down into three sub-questions:
(1) What are the implicit and explicit rules surrounding how medical practices and
childcares handle sensitive personal information?
(2) What breakdowns happen when the explicit and implicit rules are not followed?
(3) How are breakdowns accounted for, negotiated, and managed in socio-technical systems
where sensitive personal information exists?
These research questions will be discussed further in this chapter. Additionally, this chapter
includes an overview of the methods used, a description of pilot studies and initial analysis,
and proposed work for my dissertation.
4.1. Rationale for Qualitative Methods
This research is approached through the lens of social constructivism. Compared to
behaviorism and cognitivism, constructivism takes the theoretical stance that the world is
interpreted; the world is subjective. Interpretations are based on how groups, or society,
convey meaning, and then on how the learner internalizes that meaning. Constructivism
challenges the post-positivist assumption that reality can be reduced to its reproducible parts
(Bloomberg et al. 2008). For constructivists, this means that the content and context of
situations vastly impact how a person learns and thereby understands the world
around them (Ertmer et al. 1993). Researchers with a constructivist orientation focus
primarily on the context of the situation and work practice.

In terms of evaluating the information practices of childcares and medical practices, I am
proposing to use an ethnographic approach. Ethnography is one instantiation of a
constructivist approach to research and a qualitative method of study. Bloomberg and Volpe
define ethnography by the work of the researcher: “the researcher studies a cultural or
social group in its natural setting, closely examining customs and ways of life, with the aim of
describing and interpreting cultural patterns of behavior, values, and practices” (Bloomberg
et al. 2008). Another characteristic of qualitative research that ethnography embodies is that
the researcher is active and involved in the space being studied; the researcher becomes an
active observer, sometimes taking part in the activities being studied. A second
characteristic is that the research focus may be defined at first, but it remains open to
exploring other important facets of the space as they emerge.

Qualitative research accepts that the researcher is going to interpret the world around her,
but that her interpretation is going to differ from another researcher’s interpretation – a
phenomenon sometimes called the double hermeneutic, which developed from the work of
Giddens. The first part of the double hermeneutic is that “self-interpretation and their
relations to the context of those studies must be understood in order to understand why
people act as they do” (Flyvbjerg 2001). The second part is the researcher’s own
self-interpretation. This dualism is important for ethnography because the context
determines, and is determined by, the researcher’s understanding of herself and her values.
Since context is the focus of qualitative work, the researcher needs to recognize and manage
subjective and objective interpretations in a way that is valuable.

To manage this intrinsic problem with qualitative methods, researchers have crafted models
and methods to elicit objective-like observations, along with training and insight into how to
recognize assumptions, subjective assessments, and other biasing factors. By interacting with
the participants’ reality and work in a meaningful way, the researcher is able to engage with
the participants’ interpretations of their world. The primary rule of this interaction is to
recognize that the context is of the highest value; what is being conveyed will never happen
again. To use an analogy of Lucy Suchman’s, the act of canoeing and navigating a set of
rapids will not be the same between the first and second attempt, because various
characteristics of the setting, such as the water level or boulder placement, will differ, along
with the canoe paddler’s prior experience (Suchman 1987).

Research through qualitative methods such as interviews and observations allows
researchers to capture the voice of the participants through the use of rich descriptions.
Rich descriptions are the data derived from the use of qualitative methods. The researcher
tries to sensitively convey the context by transcribing what participants say in the setting and
the actions within which it is said. These are sometimes combined with pictures, video, and
audio from the setting to create a detailed picture of what occurred. Rich descriptions are
then organized to answer the open research questions posed at the start of the project.

4.2. Grounded Theory & Proposed Observations
Grounded Theory is a theory of process that enables the researcher to understand the
participants’ views of their world. This understanding is accomplished through collecting
qualitative data, such as artifacts from the environment, interviews, and observations. The
data is then aggregated in such a way as to reveal interrelationships and categories of
information. The first primary characteristic of Grounded Theory is that categories and
interrelationships go through a series of refinements and iterations as more data is collected
and interpreted. The second primary characteristic is that the researcher does theoretical
sampling from different groups to provide perspective on the similarities and differences
between groups (Creswell 2008).

There are some nuances to Grounded Theory. Grounded Theory stems from the work of
Glaser and Strauss (Glaser et al. 1967). In the more than forty years of its use, theoretical
differences have developed between what Glaser and what Strauss propose Grounded
Theory to be. Glaser proposes that the theory should be emergent and iterative. Strauss, on
the other hand, proposes that after an initial stage of theory creation, the theory should then
be applied and validated against some criteria (Strauss et al. 1997). For my research, I will be
using the method detailed by Charmaz, who was a student of both Glaser and Strauss
(Charmaz 2006). Charmaz’s method is more similar to Glaser’s, in that there are iterations in
developing emerging themes and little focus on a validation phase of theory creation.

The process of Grounded Theory, as proposed by Charmaz, involves phases of gathering
more data and then coding and refining (Charmaz 2006). Figure 10 depicts the different
phases of using grounded theory. The first phase is collecting data and starting to create
codes that answer the initial research question. The researcher(s) then create memos
detailing and describing the codes using the collected data. These memos are refined through
phases of collecting and coding more data. The phases of collecting and coding more data
can be focused to collect data about a particular sub-question. Eventually enough data and
memo writing will start to create a comprehensive understanding of the concepts being
studied. This understanding facilitates the creation of a draft of the theory. However, even at
this stage, more data collection and refinement can be done.
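
The cycle in Figure 10 can also be summarized procedurally. The sketch below is not part
of Charmaz's method; it is my own schematic restatement of the collect-code-memo-refine
loop, and every function named here is a hypothetical placeholder rather than a real
library call.

    def grounded_theory(initial_question, collect, code, write_memos,
                        refine_focus, saturated):
        """Iterate collection, coding, and memoing until themes saturate
        (a schematic of the cycle in Figure 10)."""
        data, codes, memos = [], [], []
        focus = initial_question
        while not saturated(codes, memos):
            data += collect(focus)            # gather observations/interviews
            codes = code(data, prior=codes)   # refine codes against new data
            memos = write_memos(codes, data)  # describe and link the codes
            focus = refine_focus(memos)       # narrow toward a sub-question
        return codes, memos                   # material for drafting theory

The point of the restatement is only that the exit condition is saturation of codes and
memos rather than a fixed number of passes; even after a draft theory exists, another
iteration of the loop remains possible.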

For my research I have conducted four pilot studies that have started to answer the question
of how people collaboratively manage sensitive personal information. I did this by
interviewing childcare directors and medical office directors about general work practices.
Two additional studies were then conducted to provide more focused and advanced research
questions. One pilot study interviewed parents about their sensitive information
management concerns. Another completed follow-up interviews and initial observations of
childcares to focus the initial findings from the summer interviews. Each of these studies
was individually coded for relevant themes while remaining aware of prior codes and
results. Reports with detailed memos were produced. Initial codes from the data were then
collated and used to produce a first draft of the explicit and implicit policies affecting the
collaborative management of sensitive personal information. The future work of this project
is the in-depth observations that will further refine the initial theory and illuminate
associated breakdowns.

Figure 10. Iterative refinement of codes and data collection to create grounded
theory.
4.3. Research Space
For my dissertation work I am proposing to study the socio-technical security mechanisms
that are used in childcares and medical practices. Before discussing this proposed work, the
physical locations are described in this section to provide a deep understanding of the
settings.
4.3.1. Childcares
Childcare centers are places where a care provider cares for children during workday hours.
Children are dropped off at childcares five days a week for approximately 50 weeks of the
year. Childcares are state licensed and go through a regular process of passing inspections.
They can house anywhere from one child up to many hundreds of children. According to
the U.S. Department of Education, National Center for Education Statistics, thirty-six
percent of all children under six who are not yet in kindergarten participate in childcare
centers (2005).

There are many people who participate in running a childcare. Owners are the financial
backers of the location and may also be involved in the day-to-day responsibilities of
running the childcare. At the level of owners there is sometimes a board of
trustees/directors who provide guidance. In the day-to-day functions, there is a director
who oversees the staff and child management. The director generally manages the files of
both the staff and the children as well. Associated with the director can be office staff such
as an assistant director, a front-desk manager, and a receptionist. In the classrooms are
teachers. There can be a head teacher who is in charge of the curriculum and direction of
the classroom. Teachers sometimes participate in a parent-teacher association. Last,
childcares sometimes have volunteers who move in and out of rooms or clean the
workplace.

The family unit is of primary concern for childcares. This can involve the traditional family
unit of mother, father, and children living under one roof; however, many families do not fit
this model. A child may be associated with a custody arrangement under which a parent is
not allowed to see the child, and the childcare has to enforce this policy. Secondary
caregivers like grandparents may also participate in the responsibilities of picking up and
taking care of children after childcare. Last, there are foster care arrangements that childcares
have to be made aware of for inspection of the child through the Department of Social
Services.

There are many third-party services that may become involved in the licensing and
management of a childcare. The largest of these is the Department of Social Services (DSS),
a state-run organization that polices the management of childcares through a licensing
process. The DSS’s reports are made available online for Virginia and detail inspections and
reports of abuse. Other third-party services are additional accreditation services like the
NAEYC Accreditation Institution (http://www.naeyc.org/), public school systems for early
intervention, doctors who provide information about immunizations and sometimes
physical and psychological care, fire marshals, the USDA, web camera services, web
designers, accountants, and encompassing organizations that house the childcare, like the
Virginia Tech Research Department or Carilion Healthcare.

Figure 11. Sample floor plan for a childcare.
Childcares are typically physically constructed in a pattern. This pattern has been
reconstructed in Figure 11. There is a front entrance that will usually have a bell. Upon
entering the childcare, the visitor is usually first greeted by the director or a receptionist, as
shown in Figure 12. The director’s office is the central location where the children’s files
are stored – usually in numerous filing cabinets. Sometimes there is a separate side room just
for the files. The children are usually located in rooms that are organized by age. Attached to
each room can be an entrance to a general play area or to a smaller play area; for example,
children under two years of age may have individual play areas separate from the older
children.

Figure 12. Front desk of a childcare center. A monitor and picture frame are in one corner;
a desk and one-way mirror are behind, in the other corner (Branham et al. 2009).

Some childcares have web camera systems. These systems can be made available for parents
to view their children. They can also be used as a surveillance system to watch teachers
internally. Cameras are shown in the corners of each room in Figure 11.

Most of the people who work in childcare are female. The environments are warm,
welcoming, and nurturing. The receptionist’s desk in Figure 12 is covered with Halloween
decorations, has a digital picture frame showing pictures of children in the center, and has
other pamphlets about community events.

Good childcares generally have long waitlists. These waitlists can be expansive, taking as
long as four years to be reached on. Most childcares have a policy stating that if a family
already has a child enrolled in the childcare, a new baby will be added to the front of the
list. There are also stricter state restrictions on the number of babies per teacher allowed in
infant rooms. For these reasons, the waitlist for the placement of children under two years
old is usually longer.
4.3.2. Medical Practices
Half of my study focuses on small medical practices. A small medical practice independently
operates its own patient record system and schedule. The doctors are usually the owners,
and the doctor(s) serve as the final authority for decisions.

Almost all Americans will filter through the American healthcare system. Most have health
insurance through private or public insurance companies, but in 2006, 15.8% of people did
not have any form of insurance. Health insurance, in general, helps cover the expense of
healthcare. In 2005 there were 1,169 million visits to physicians’ offices and hospital
outpatient departments, an average of 3.94 visits per person (Cutler 2008).

Figure 13. Ambient displays of HIPAA regulations in a medical office.

Doctors, rather than centers, are licensed to make sure that they comply with state
(http://www.vahealthprovider.com/) and federal regulations. The largest and most
prevalent federal policy is the Health Insurance Portability and Accountability Act (HIPAA)
of 1996, which outlines privacy and security rules for managing a patient’s health and
financial information (http://www.hhs.gov/ocr/privacy/). Figure 13 shows how one
medical practice informs its patients of HIPAA regulations. HIPAA is a broad yet influential
policy that regulates processes for accessing information and also procedures for managing
day-to-day aspects of information use. The Department of Health and Human Services
explains on their website (2010b):

“The HIPAA Privacy Rule provides federal protections for personal health
information held by covered entities and gives patients an array of rights with
respect to that information. At the same time, the Privacy Rule is balanced so
that it permits the disclosure of personal health information needed for
patient care and other important purposes. The Security Rule specifies a
series of administrative, physical, and technical safeguards for covered
entities to use to assure the confidentiality, integrity, and availability of
electronic protected health information.”

Similar to childcares, the stakeholders for medical practices can be divided up into three
groups. Office employees include everyone who works at the practice, including front desk
workers (receptionists, office assistants, office managers, file managers and schedulers),
doctors (including medical doctors, dentists, ophthalmologists, opticians and chiropractors),
support staff (nurses, dental assistants and therapists) and owners. Patients include the
person being seen and the patient's parents/families/guardians who can legally have access
to patients' information. Third party stakeholders that exchange information with medical
practices include insurance companies, hospitals, pharmacies, referring doctors' offices (both
referring and being referred to), laboratories, collection agencies, credit agencies (e.g. "Care
Credit"), software companies, Medicare/Medicaid and lawyers.

Front desk workers, primarily female, are usually the first people one encounters when
entering a medical practice. These workers include receptionists, office managers, office
assistants, file managers and schedulers. Front desk workers are generally responsible for
scheduling appointments, printing the daily schedule for the doctors, handing forms to
patients to sign and fill out, filing patient forms, filing and “pulling” patients’ charts, filing
insurance claims, billing patients and keeping up with billing and insurance information.
Front desk workers and doctors have the most access to patients’ information. In some
cases, office managers have more access because they deal with medical, schedule and billing
information, whereas doctors in my pilot studies reported having little contact with billing
information. In addition to interacting with formal medical, schedule and billing
information, occasionally, front desk workers reported informally communicating
information about patients to doctors, since they are the first people in the office to have
contact with patients.

Medical offices are laid out in a similar fashion to childcares. There is usually a central
entrance where the patient is greeted with a waiting room/lobby with access to a
receptionist. The receptionist will note that the patient has arrived, and the patient will wait
to be seen. The patient will then be guided back to a private room to consult with a nurse
and then a doctor. The patient’s files are usually stored around or behind the medical
director’s desk.

Figure 14. A medical office’s files located on the walls surrounding the director’s office.
Files are organized by the patient’s name, their last visit, and other relevant information as
shown with the different color codes on the side. Patients can have electronic and paper
files. There are sometimes separate files for billing and health information as well.

4.3.3. Comparing & Contrasting Childcares and Medical Practices


Childcares and medical practices both handle sensitive personal information about their
clients. These two instances of workplaces that handle sensitive personal information were
selected to provide an overview of general practices. Table 5 highlights some of the
pertinent aspects that are similar and different across these locations. Perhaps the most
salient difference between childcares and medical practices is that medical practices house
information that may be more life critical (e.g., a transcript from a surgery, sudden weight
gain/loss, and treatment schedules). While children in childcares can have allergies that may
be life critical, the aim of keeping information on children is to keep them physically safe –
not to diagnose them or to check on their health.

The largest similarity between childcares and medical practices is that they are businesses,
requiring them to generate revenue. However, they also serve a community purpose. Both of
these factors drive their information storage and documentation. For instance, childcares
keep incident reports of when a child falls or bites another child. These incident reports are
kept for two reasons. The first is so that the childcare can protect its business against
litigation down the road. The second is so that the reports can be collated to find places or
routine activities that are resulting in children not being safe – serving the community
purpose of caring for children. Medical directors describe the same behavior when noting
meta-information about a patient in their file (e.g., a recent divorce) and when noting the
type of penicillin prescribed so that the community as a whole does not develop resistance.
Privacy concerns that run contrary to these needs are going to create a tension in what is
reflected in the documentation.

Table 5. Pertinent characteristics that distinguish childcares and medical practices.

                        | Medical Practices | Childcare Centers
File Storage            | Large wall-spanning file cabinets | Usually only one file cabinet
File Forms              | Electronic billing patient file; physical file of medical history; electronic copy of information in physical file | Physical file in director's office; sub-file in teacher room; electronic copy of information in physical file
New Customer            | Daily/Weekly | Monthly
Daily Participation     | Different patients daily | Same children daily
Communication Methods   | Email, inter-office memos, documenting in files, face-to-face, staff meetings, phone | Email, inter-office memos, documenting in files, face-to-face, texting, staff meetings, phone
Age of Files            | Indefinite | Indefinite
Policing Privacy Agency | U.S. Department of Health & Human Services through HIPAA | Virginia's Department of Social Services through bi-annual licensing; additional accreditation through NYCEA, and hosting institutions such as Virginia Tech

4.4. Pilot Study Observation & Interviews


4.4.1. Pilot Study Methods & Demographics
Four pilot studies were run to start collecting data that would detail and describe the
information space and practices of managing sensitive personal information. These studies
were conducted in the Summer and Fall of 2009 with Tom Dehart, Laura Agnich,
Edgardo Vega, Zalia Shams, Stacy Branham, and Monika Akbar. The Usable Security Team,
comprised of Denis Gracanin, Steve Harrison, Dennis Kafura, and Francis Quek, also
provided valuable feedback and funding at this early stage in the work. Two projects were
conducted within Dennis Kafura’s course on Usable Security.

Table 6. Basic dimensions of the four pilot studies.

                | Childcare Directors                       | Medical Office Directors + Doctors | Parents
When            | Summer + Fall 2009                        | Summer 2009                        | Fall 2009
Number + Gender | Summer = 11 women, 1 man; Fall = 4 women  | 9 women, 4 men                     | 18 women, 3 men
Method          | Interviews: 30-60 minutes; Observation    | Interviews: 30 minutes             | Interviews: 30 minutes
Location        | Place of work                             | Place of work                      | Place of convenience (e.g., coffee shop)

Basic dimensions of these studies can be seen in Table 6. In total, the work comprises 46
interviewed participants and 14 observations. The summer studies involved interviews with
the directors from childcares and medical practices within their workplaces, with guided
tours. Four childcare directors were then selected for the second set of studies as the
representatives of the strongest information practices (i.e., least DSS violations, clear
practices on how information is managed and regulated). Childcare directors participated in
interviews that lasted approximately 45 minutes. Two to four observation sessions lasting 2-
3 hours each were then conducted following the interviews. Twenty-one parents participated
in 30-minute interviews in regard to their beliefs about sensitive private information
management. All interviews and observations were audio recorded and transcribed.
Interview protocols for the four different types of interviews are attached in the appendix.
Interviews were utilized instead of standardized surveys to generate rich responses in regard
to aspects of the participants' work and practice that may affect the security of information.

All participants were from the Southwest area of Virginia. This area is rural, yet
technologically impacted by the proximity to the University. Waitlists exist for the best
childcares and medical practices. The number of children being managed in the participating
childcares ranged from 26 to 200 (sd = 63.7), with children ranging in age from six weeks to
sixth grade. The staff size at these facilities ranged from 9 to 45 (sd = 10.3). The participants
had an average of 12.6 years of experience as childcare directors. The medical practices that
participated varied in size, staffing 2 to 50 individuals. The average size was 13, including
doctors and administrators. One participant was a practicing chiropractor and owner of nine
practices in Southwest Virginia; his staff size (37) and patients served weekly (2,000) reflect
all nine locations as well as the staff of the corporate location. Another participant was a
doctor at a local surgical center and also the chief doctor of the location with administrative
responsibilities. He reported personally seeing forty patients and employing fifty individuals,
but was unable to recall how many patients were seen at the center. Apart from these two
cases, the approximate number of patients seen weekly ranged from 55 to 400, with an
average of 121.8. All participant data is in Section 5.1 Demographic Information About
Participants in Pilot Studies.

All directors were recruited through a comprehensive list of all area businesses; the response
rates were 55% for childcares and 26% for medical practices (not including the local
hospitals). Parents were recruited through listservs, flyers, and company newsletters. The
only incentives to participate were offered to parents, who were paid ten dollars.
4.4.2. First Round of Coding by Study
Grounded theory was used to analyze the body of data. All data from the studies were coded
by at least two researchers using an agreed set of codes. All codes from the pilot studies are
included in Table 7. Some codes, such as policies and security, were central to all four
studies. Other codes were particular to the study at hand, such as light-bulb moments. The
use of these codes was a first pass of coding, with each study helping to inform the next
about relevant issues in this information space. See (Branham et al. 2009; Vega et al. 2009a;
Vega et al. 2009b; Vega et al. 2009d) for more details on each study.

Table 7. Codes from the four pilot studies. The Yes/No columns indicate whether the
code was used in the childcare interviews (CI), medical interviews (MI), parent
interviews (PI), and childcare observation/interviews (CO).

Code | Description | CI | MI | PI | CO
Access of Information | References to who can and cannot access information | Yes | Yes | No | Yes
Age of Information | How long the file and information is kept at the childcare | Yes | Yes | Yes | No
Ambient Information | Citing the use or the lack of use of ambient information as a communication channel | Yes | No | Yes | No
Attitudes About Sharing Sensitive Information | Particular attitudes that reflect beliefs about sharing sensitive personal information | No | No | Yes | No
Audit Trails | Any audit trails that are used to determine who and when information is being accessed | Yes | No | No | No
Authority | Issues of power or hierarchy in reference to how information is being communicated or managed | Yes | No | No | Yes
Communication | Methods used to disseminate information around the childcare | Yes | Yes | Yes | Yes
Critical Incidents & Breakdowns | Incidents where the participant indicates that the current method does not reflect the need | Yes | Yes | Yes | Yes
Degrees of Information Sharing | When certain information may not be shared with other people | No | Yes | No | No
Influence | Particular people or organizations that influence the childcare's policies | Yes | No | No | No
Leaving a Childcare | Reasons that a participant gives for people leaving a childcare | No | No | Yes | No
Light-Bulb Moments | Particular moments in the interview when the participant would recognize a particular concern that they will go back and address | No | No | Yes | No
Location | The physical location of the files and other external sensitive personal information | Yes | Yes | No | Yes
Ownership | Disagreement about who owns the files and the information in the files | Yes | Yes | Yes | No
Physical v. Electronic | References made to wanting to keep one piece of information in a physical or electronic form | Yes | Yes | No | Yes
Policies | Explicit or implicit policies that govern how information is being managed | Yes | Yes | Yes | Yes
Reports | The use of reports or forms being sent home as a method of disseminating information | No | No | Yes | No
Security | Practices that are used to keep information and children/patients safe | Yes | Yes | Yes | Yes
Tangibility | Having information in a physical form | No | Yes | No | No
Technology Gap | The difference between someone who feels comfortable using technology and someone who feels uncomfortable using technology, usually older | No | Yes | No | No
Trust | The participant reflecting on the amount of trust s/he has in their childcare | No | No | Yes | No
Web Camera | Any use or lack of use in reference to web cameras | Yes | No | No | No
When Information is Managed | Particular times when information may be accessed, annotated, or deleted | Yes | Yes | No | Yes
Pre-Enrolling Practices | Practices that parents may go through before enrolling their child in a childcare | No | No | Yes | No
Improvisation | The by-the-moment negotiation of policies and practices to manage work | No | No | No | Yes
Asymmetry Between Parents and Directors | When a participant talks about parents or childcare workers in a sense that reflects that one is the enemy | No | No | No | Yes

4.4.3. Second Round of Coding & Analysis


After all of the pilot studies were conducted, a second round of coding was conducted to
develop a higher level of comprehension of the similarities and particularities of childcares
and medical practices. Part of the method of grounded theory is continuously reexamining
data to validate, refine, and create new conceptions of the data. The result was a set of
examples of breakdowns and a set of policies in relation to information security. Sample
breakdowns will be detailed in the final product of this proposal. This section includes an
initial analysis of information security policies. During the observations the conception of
these policies will be refined further with possible associated breakdowns.
4.4.3.1. Human-Mediated Access Management
The central nucleus of information being stored and managed about a patient or a child is
located in their file. The centers in our studies kept the files in expansive filing shelves, like
those in the medical practice shown in Figure 14, or in filing cabinets. The location of these
files is of particular importance. The files can appear to be the central part of the director's
workspace. Indeed, accessing, searching, and managing the files is a large part of the role of
the director. However, the role of director also extends to mediating the access and use of
the files by others in the center. We found three factors that affected how access to a file was
mediated: (1) issues of ownership, (2) place-based norms, and (3) role-based monitoring.

Ownership
One theme that impacted access and monitoring of the information in a file was the
director's mental schema about the ownership of the information in the file. The
information in the file is intrinsically personal and sensitive; it is information about the
patient or the child. At the same time, the information is collated on the center's forms,
annotated by staff, and used regularly by the center. Personal conceptions of who owns the
file, or other instantiations of the personal information, are going to affect the management
of its privacy. This is especially true given the lack of an over-arching US government
regulation specifying rules about who owns that information. For instance, if directors
believe that they are merely stewards of the personal information, they are more likely to
restrict inter-office access to personal information based on the patient's request. In
contrast, if directors believe that the personal information is their property, they are more
likely to share that information regardless of what the customer desires, as long as the
behavior is compliant with government regulations.

In probing this topic we asked directors and parents who they believed to be the owner of
the information. The responses indicate a range of conceptions, varying from the medical
practice and childcare center owning the files, to the parents or patients owning the
information, to ideas of joint ownership. For example, one medical director explained how
both parties have ownership: "the information in a file belongs to a patient, but the file itself
belongs to the doctor… So people think when they get an X-ray they've bought an X-ray but
no, the information on the X-ray belongs to the patient but the actual film itself belongs to
the doctor." Alternatively, other medical directors emphasized that they would not provide
the patient access to their file. Instead, they make copies of anything that the patient knows
to ask for. One medical center director explained, "... but if they want a copy I'd give them a
copy. Whatever they want. It's their records." The idea of copying only part of the file was
reflected in the comments of another medical director who said, "No, we don't give them
the whole file. There's a limit to a one page thing, but it's basically treatment that we've
done."

Childcares had similar issues of ownership in regard to the child's file. A primary difference
is that most childcare centers talked about parents entering the director's office, reading the
file, and making copies of pertinent information (11/12 participants). This kind of openness
may be more common in childcares, where discussion about a child's progress is
encouraged. However, while this practice appears to provide more open access to the child's
own personal information, the fact that the director is present while the parent examines the
file and its contents emphasizes that access to one's own information is still a mediated
process.

When asked what information a parent or patient might not be able to see in the file,
directors generally said that the patient could see anything. However, there were three
medical directors and one childcare director who said that the patient or parent could not
have access to the file. The reason cited by one of these participants was that the patient
would not understand the annotations. The three remaining participants reported that the
centers keep information about the patient or the child that they would not want them to
see. For example, one director said, "We have a website they can look at. We don't usually
let them see the files because it has notes possibly even about them and things like that."
Two additional participants mentioned that they might take notes on sticky pads,
temporarily attach the sensitive information to a file, and remove it later. One participant
said that she keeps a separate file of highly sensitive information about particular children
(i.e., detailing abuse information) that she would not allow access to: "it's not that we're
hiding it necessarily, but we're just not making it 100% visible, if that makes sense."

Parents also displayed shifting ideas about the ownership of a child's files. One third of the
participants believed that they were owners of the files, whereas another one third believed
that the childcare was the owner of the file. The remaining participants had complex ideas
about ownership. For example, two parents believed that they owned the information, but
that the childcare owned the physical file. Another two participants initially responded that
the childcare owned the file, but then after a couple of minutes returned to the topic and
said that they were the owners of the files. Perhaps most relevant in the parents' responses is
that all participants reflected that they had not spent time thinking about this question.
Unlike childcares and medical practices, which had explicit policies regulating access, parents
expressed a lack of knowledge about even what was in their child's file. After discussion of
this question all participants reflected that the process of their child's information
management was not transparent.

Place-based Norms
The physicality of the workplaces in childcares and medical practices impacted security
practices. As Harrison and Dourish explain in their work on place and space (Harrison et al.
1996), the meaning that is ascribed to a physical location, or space, denotes communal
norms and understanding, or place. We found that where information was physically located
and how it was accessed were governed by place-based norms in the childcares and medical
practices.

In medical practices, many of the rules that were responsible for privacy concerns were
governed by HIPAA. However, how particular medical practices put the HIPAA policies
into practice varied by office. One instantiation of this variation is in the form that the
physical files would take. For instance, one medical practice kept a physical file of the
patient's health information separate from both the electronic and physical forms of the
payment information. They believed that keeping this information separate in different
physical locations helped keep the patient's information private ("Due to the HIPAA thing,
you know, you're not allowed to keep patient's financial information there in their chart.")
Another medical director talked about how he discloses sensitive personal information to a
patient in different physical locations, and how the application of HIPAA appears to be
overly restrictive:

HIPAA as I understand it was developed to protect information that is sent
over the internet… Now HIPAA in my opinion, and I don't mind if this is
recorded, I think it's a stupid thing… I wouldn't go out in the waiting room
and say, you know, "hey Ms. Jones your syphilis test is negative," so to me
it's an ethical thing and not a legal issue… now my understanding of HIPAA
interpretations, you're not even allowed to say the patient's name in the
office. But what a load of crap, all that. If I got an 80 year old lady, she
wants a hug. I'm not gonna ignore her, you know, "205, you're up!" That's
just, that's a little ridiculous.

The participant is expressing that HIPAA regulations are not only restrictive in some cases,
but that external policies affect how he manages the disclosure of personal information in
different locations. The implication of what he is explaining is that different locations afford
different disclosure policies.

In both childcares and medical practices we found that the office of the director functioned
as a venerated space. The patient's or child's physical files were usually located surrounding
the desk of the director, as shown in Figure 14, or were directly behind the desk of the
director. The director in that office mediated access to the files. There were subtle cues,
such as the placement of furniture and computers, to indicate that access was inhibited,
similar to what was described in (Dourish 2004). Other workers, when needing information
or wanting to ask the director something, would stand in the doorway to pass on their
request. After explaining their purpose the director would then encourage them into the
room. One medical director also explained that she used a screen saver to hide electronic
sensitive personal information from passing personnel when she was away from her desk.

During our observations and guided tours, we were able to examine more nuanced forms of
place-based norms. For example, one medical director tucked highly sensitive personal
information towards the back of a file so that it was less visible. Another childcare director
used the corner of her desk as a place where semi-sensitive information was kept. This
location denoted a place for communicating particular types of information with particular
people. For example, a black box of emergency contact information about children was
stored there.

Role-based Monitoring
The role of the director was found to mediate the information seeker's goal in a way that is
flexible, negotiated, and determined in a case-by-case fashion to best balance the need for
information for work with the need to keep information private. This made the director a
central hub not only for information access, but also for establishing the explicit and implicit
social norms of how the center manages sensitive private information. This is similar to the
concepts addressed in the work of Harper et al. on workplace concerns about the collection
of location-based information (Harper et al. 1992).

In medical practices, participants made specific references to how it was their job to modify
or limit the amount of sensitive information that was shared with the following people and
secondary parties (7 out of 13 participants): patients and their family members, insurance
companies, pharmacies, other staff members, referring doctors, other doctors in the
practice, and hospitals. One medical director mentioned that she even had a method of
validating any information that was placed in the electronic filing system to monitor and
verify that it was correct. She validated entries because she wanted to make sure that the
information was correct, given that it was associated with her login information: "most of
the time I'm logged in in the front because I'm the only one up there, but occasionally
someone else will come up and they'll just do it, and I usually check to make sure just
because it is on my login." This quote emphasizes that not only does she see her job
responsibility as managing the electronic system of patient files, but that she monitors access
and updates made by others in the system.

Childcare directors mentioned similar issues of monitoring, accessing, and distributing
information in the workplace. In reference to their responsibility to monitor access to the
child's file, one director said, "When a teacher comes in and wants access to a file they have
to come through me first and they have to tell me their reason basically, you know, why do
you need to go in there?" This director is explaining how she monitors access to the files in
a way that goes beyond simply checking access rights to information. She is additionally
checking the teacher's goal, which extends into managing information privacy. In our
observations, we were able to watch a teacher attempt to access a child's files. The following
is a snippet from one observer's field notes:

A teacher… approaches the corner of Director's desk: "Hey" "If I want my
kids' middle names, are they gonna be in here or in the file?" <points to
black box> … The teacher then says, in a much softer voice, "can I dig?"
The informality of this word, the tone of her voice makes this seem like a
common practice. The Director responds with a carefully-laid sentence and a
slight sternness in her voice--her eyebrows raised, eyes widened, and a
scolding manner in the slow, undulating movements of her head and voice as
she speaks: "I'll have to dig for you." I can only see the side of the teacher's
face, but she looks shocked in that her face pauses, gaze held with the
Director's, and there's a space created that seems to beg of a verbal response.
The Director fills the silence with a soft, earnest "I'm sorry."

This snippet highlights that the director enforces access policies in a way that establishes her
as a point of authority. Other methods of monitoring that were observed included the use
of one-way mirrors, an internal system of web-cameras that only the director and
administrative staff could see, and one participant having a link on her computer to let her
see the contents and use of the other computer in the center.
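
One way to read this practice is as an access check with an extra, human-judged argument:
not just who is asking, but why. The sketch below is a minimal illustration of that contrast in
Python; the roles, actions, and approval rule are hypothetical stand-ins of my own, not any
center's actual system.

    # Conventional access control answers only: does this role have the right?
    ACL = {
        "teacher": {"read_child_file"},
        "director": {"read_child_file", "write_child_file"},
    }

    def acl_allows(role, action):
        return action in ACL.get(role, set())

    # The observed practice adds a mediating step: the requester states a
    # purpose, and a human (the director) judges it case by case.
    def mediated_access(role, action, purpose, director_approves):
        return acl_allows(role, action) and director_approves(role, purpose)

    # Illustrative judgment: the director approves roster-related purposes,
    # and may still choose to "dig" through the file herself.
    approve = lambda role, purpose: "roster" in purpose
    print(mediated_access("teacher", "read_child_file",
                          "middle names for the class roster", approve))  # True
    print(mediated_access("teacher", "read_child_file",
                          "just curious", approve))                       # False

The purpose argument, and the human judging it, have no counterpart in a plain access-
control list; that judgment is precisely the work the director performs socially.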
4.4.3.2. Information Duplication
Information was duplicated in the working environments of childcares and medical
practices. From a security perspective, having only one instance to protect is the simplest
case. When information becomes dispersed to better support individual practice, it is
traditionally conceived as becoming more difficult to manage due to numerous access
points. However, we found that duplication was itself a form of security in two senses: (1)
information was redundantly duplicated so that if it went missing in one location it would
still be available to the workplace as a whole, and (2) information was replicated so that it
would be on hand for impromptu activity.

Information Redundancy
Eleven of the thirteen medical centers used electronic record systems; only one childcare
center used an electronic record system. All participants used computers in some fashion
(e.g., submitting insurance claims or word processing tasks). All participants also kept paper
copies of patient or child files, even if all the information was replicated in an electronic
form. (See (Sellen et al. 2001) for more on the myth of the paperless workplace.) Not only
were there electronic and physical copies of sensitive personal information, but there were
also multiple copies of files for different purposes. The three reasons participants gave for
keeping redundant information were (1) to serve a community purpose, (2) to protect
information from being lost, and (3) to use appropriate information based on contextual
needs.

Medical centers and childcares heal us when we are sick and care for our children when we
are at work. This is a duty that for many is undervalued and, especially for childcares,
underpaid. When asked how long they kept their files, all participants said that they have
kept some files indefinitely. One participant said that he is the third generation of his
medical practice and had inherited files from the early 1930s. These files are kept because
they serve as a representation of their job ("this is my life work, I don't want to throw too
much of it away"), but also because the centers never know when they will be required to
provide that information again in the future. For example, one medical director explained
how she had to find old records to identify bodies: "The problem is, and someone wouldn't
think about why it's so important, but it's like the Virginia Tech massacre we had 3 patients
who we had to identify the bodies." The fact that these records serve a purpose outside of
the office means that the value that is attributed to their use and their security reflects a
community need.

Medical centers and childcares kept redundant information to protect information from
being lost. This is because medical centers and childcares do more than serve a community
purpose to care and heal; they also function as businesses. The information about patients
and children is what allows them to operate, e.g., schedules, patient histories, emergency
contact information, etc. Keeping that information safe and well managed can be translated
into keeping a prosperous business. For example, all fourteen participants who used an
electronic system kept a backup of their files in multiple locations. One medical director
explains:

"…we actually have a series of backups. We have a local tape backup and we
have an off site backup which actually backs up over the internet at my house
at night... And then at my home we actually have two hard drives and my
wife goes to the safety deposit box and swaps them out regularly. So if
somebody's mad enough to burn this office down and my home down, we'll
still have a record in a safe deposit box."

In this setting, security has moved beyond the conception of keeping sensitive personal
information secure for the purposes of patient care to also encompass the idea that security
keeps the business safe. This practice is managed through keeping redundant copies in 'safe'
locations to protect information from being lost.
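
The rotation this director describes, nightly copies to several independent locations with the
oldest copies cycled out, can be captured in a short script. The following is a minimal sketch
only, with hypothetical directory names standing in for the practice's actual tape, off-site,
and safe-deposit arrangements:

    import shutil
    from datetime import date
    from pathlib import Path

    # Hypothetical stand-ins for the real locations: the working files,
    # an on-site archive, and an off-site mirror (e.g., a synced drive).
    RECORDS = Path("records")
    DESTINATIONS = [Path("backup_onsite"), Path("backup_offsite")]
    KEEP = 14  # retain two weeks of nightly snapshots

    def nightly_backup():
        stamp = date.today().isoformat()
        for dest in DESTINATIONS:
            dest.mkdir(exist_ok=True)
            snapshot = dest / ("records-" + stamp)
            if not snapshot.exists():
                shutil.copytree(RECORDS, snapshot)  # full redundant copy
            # Prune the oldest snapshots so each archive stays bounded.
            for old in sorted(dest.glob("records-*"))[:-KEEP]:
                shutil.rmtree(old)

    if __name__ == "__main__":
        nightly_backup()

The design choice mirrors the director's reasoning: each destination is an independent
failure domain, so losing any one location (fire, theft, disk failure) leaves the record intact
elsewhere.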

Perhaps the most compelling reason for keeping redundant information is that information
can be repurposed for multiple needs and within different settings. The same demographic
information is going to be kept in the patient's manila file, but it is also going to be stored in
their electronic record for billing, and possibly in an electronic health record. In childcares,
the parent's contact information and child's name are going to be kept in the various
electronic and physical files, but they are also going to be kept in the classroom for
emergency purposes, on the bus to make sure that the appropriate children are on board, in
the billing ledger, on old waitlists, in rolodexes for administrative assistants, and more.
Information in these settings is retooled and repurposed based on the need. One medical
director explains:

"We have an electronic medical record here – so it's all eventually entered in.
The information is taken down by a nurse interviewer preoperatively on a
pre-op visit.... And then eventually that all gets put into the electronic
medical record... but of course we transfer a lot of that information onto the
anesthesia record which is entered in real time into the electronic medical
record"

In this simple example, the information is transferred from the patient to the nurse. The
nurse transfers the information to a pre-op visit form, which is then translated into an
electronic medical record. The information in the electronic record is repurposed for the
anesthesia record, which is then entered back into the electronic medical record. In another
example, a medical director explains that she may communicate with patients through email.
The emails, though, are then printed out and kept in the patient's file. This allows her to take
advantage of electronic communication, but still keep that information for understanding
the patient's medical history ("Usually she'll write back something and then she'll print it out
and put it in the chart.")

Information On Hand
Medical practices and childcares are places that are regularly in flux. The personnel in these
places are responding to sick children, different patient’s needs, new customers, and other
information intensive activities such as licensing. While these situations cannot be entirely
planned for, medical centers and childcares attempt to be ready for any situation by having
information on hand. For example, childcares have emergency contact sheets in their
classroom that have has allergy information visible for substitute teachers. Chefs, additionally
have a child’s allergy information visible in the kitchen. Teachers also keep communication
logs in the classrooms of relevant activities for when teachers are switched out of rooms.
Additionally, teachers will display logs of the day’s activities on the outside of the classrooms
so that as parents enter they can become knowledgeable about their child’s daily activities.

There is a tension between needing to keep information private and secure while also
needing it available for work. In our observations of childcares we were able to witness an
inspection by a Department of Social Services licensor. The licensor explained this tension
to the observer:

“… things have to be kept confidential and locked, per se, but the staff still
need to be able to have access to it even if {the director is} not here and for
emergency contact information. So, sometimes they will produce their own
emergency contact form for their classroom… and that way {the manila
folder} can remain locked but people still have access to the information
needed"

The licensor is explaining how the nucleus of the sensitive information is kept in a locked
filing cabinet. However, relevant sensitive information needs to be in the environment to
protect the safety of the children. To balance this need the childcare keeps a separate form in
the classroom with the child’s information.
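
The classroom form is, in effect, a derived view of the locked master file: only the safety-
critical fields travel into the room. A minimal sketch of that derivation follows; the records
and field names are hypothetical illustrations, not data from any participating center.

    # The locked master file (hypothetical records for illustration).
    MASTER_FILE = [
        {"child": "A. Smith", "room": "Toddler 1", "allergies": "peanuts",
         "emergency_contact": "540-555-0102", "billing_notes": "private"},
        {"child": "B. Jones", "room": "Toddler 1", "allergies": "none",
         "emergency_contact": "540-555-0103", "billing_notes": "private"},
    ]

    # Only safety-critical fields travel to the classroom; billing notes
    # and other sensitive fields stay in the locked cabinet.
    CLASSROOM_FIELDS = ["child", "allergies", "emergency_contact"]

    def classroom_sheet(room):
        return [{name: record[name] for name in CLASSROOM_FIELDS}
                for record in MASTER_FILE if record["room"] == room]

    for row in classroom_sheet("Toddler 1"):
        print(row)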

There are two other reasons why sensitive information may be kept on hand. The first
reason is anticipating extreme circumstances. For example, one medical director takes hard
copies of the next day's files home with her "just in case anything were to happen." Another
childcare director keeps a copy of all of the children's emergency health insurance
information in a separate folder in case of a medical emergency. The second reason is that
some files may become too large. One medical director said that after a patient's file
becomes too large, she starts a new file and rubber-bands the old one to the new one in
storage. When a file becomes this large, a doctor may need only part of the medical record
for certain purposes. For example, one medical director explained that she creates a separate
packet of information for surgeries ("I think she does just take certain information, because
we have a packet that we make up with all the information.") What information is going to
be kept in what space or form, and who has access to those instances, is determined by the
function of the information and also the context surrounding the information use.

4.4.3.3. Community of Trust
The other themes presented in this work reflect a need to restrict access to information
present in diverse locations and used in various contexts. If the focus of work is always on
keeping sensitive personal information safe, then it will be hard to progress past assessing
access policies. To manage security while still functioning as a workplace, we suggest that
childcares and medical centers establish communities of trust. Prior research has
demonstrated that trust is a factor that decreases expenditures (i.e., time on security
concerns) by mitigating the need for countermeasures, but also by creating feelings of
shared responsibility, thus decreasing selfishness and carelessness (Flechais et al. 2005).

Figure 15. An example of the type of information placed in the environment to elicit
reciprocal feelings of sharing. Identifying information has been blurred.
We found different representations of communities of trust in our studies. One example of
this theme in the medical setting is a literal mom-and-pop practice that worked together to
create a sense that their patients were part of the family. Similarly, the childcares in our
study focused on creating an away-from-home feel by displaying children's artwork, teacher
pictures with likes and dislikes (shown in Figure 15), window-box flower arrangements, and
other homey features. The feeling from these spaces is that they are places of sharing and
caring. This sense of community influenced the practice of security. For example, seeing
that a teacher posts so much personal information elicits a reciprocal feeling of "now I can
share my information as well." One childcare director explains her feelings on balancing
trust and privacy:

"… teachers are bound by confidentiality, it's in their agreement, it's in our
handbook, any violation of confidentiality is immediate grounds of
termination. We try to use a lot of trust… more often than not there's not
anything that they can't see. Um, there are cases of children that you know
we've had suspicions of abuse or different information, but we kinda want at
the same time for {the staff} to be privy."

While there are explicit policies about who can access what, there is also a need to trust co-
workers, especially in situations of abuse or neglect, to be discreet and manage privacy issues
socially. One medical director specifically mentioned that her co-workers were trained on
HIPAA regulations, and this allows her to trust their actions.

One aspect of security that we asked about was the use of passwords. Computers, when
used for accessing patient information, were generally in the director's space or the doctor's
office. Of those medical centers that used electronic systems, only seven (29%) had
individual passwords. This finding is similar to the work of Heckle & Lutters, who found
that people who shared workstations shared passwords (Heckle et al. 2007). However,
Heckle & Lutters approached the reasons why people shared passwords as a need to share
workstations. In our study, when asked why, a director said, "They can access anything.
That's their job." This statement emphasizes that to be able to do the work required for the
job, levels of security have to be normalized for the workplace to function. In our study, the
lack of a password to protect files enables people to get work done as a community. Why
would they need a password, since it is their job?

In one observation of a childcare, we were able to observe someone examining a notebook
that an assistant director keeps at the front desk. This notebook keeps track of ongoing
work, sometimes the need to update children's sensitive information, but also the comings
and goings of activities in the childcare:

The notebooks I saw open earlier that {the administrative assistant} was
working on are still open. This teacher is looking at the notebooks and other
files. She is not brushing through the data, she has her full concentration. She
is an older woman, in her 50s, with an apron which makes me think that she
might also work as a kitchen staff. She is reading the files without any kind of
hesitation. It looks like she doesn’t care if someone sees her reading those
files.

In this example the staff of the childcare are sharing the communal responsibilities. One way
to manage this is through keeping track of work that needs to be done – such as through the
use of notebooks. Even though these artifacts contain sensitive personal information (e.g.,
late payment), they are left open to the staff so that they can work as a coherent team.
Another example comes from the locking of physical filing cabinets. It is the official policy
that filing cabinets containing files should be locked when the director is absent: “{files are}
all kept in here in a cabinet that's locked when I’m not here and the door is locked as well.”
The use of a key was, however, never observed.

These examples are not work-around security practices. They are, instead, examples of how
communities establish and negotiate what needs to be made secure. It is a demonstration of
contextual integrity (Nissenbaum 2004) playing its role in facilitating communities of people
trusting one another in situ.
4.5. Study Overview & Research Questions
The work that I am proposing will build off of the pilot studies discussed in the prior
section. The goal of this work, though, differs slightly from the prior work. The proposed
work will focus on the breakdowns that can be observed and how those breakdowns affect
the explicit and implicit policies for managing sensitive personal information.

Focusing on the breakdowns that occur in a socio-technical system serves two purposes.
One, perturbations in a system can create situations where new policies and work practices
are created. With the growing use of technology in these workplaces, the question will be
whether breakdowns in the system rely on social mechanisms, technical mechanisms, or a
mixture of both to create new policies and solutions to existing problems. The second
reason why it is important to examine breakdowns is that breakdowns can occur in
situations of stress in the system. Technological solutions, however, typically do not
distinguish between stressful and non-stressful situations. For example, a door logging
system will still log that the mother's badge was used to enter the childcare, even though it
was the father, who does not have legal rights to the child, who entered. Understanding
where social or technical mechanisms fail, and how the system responds, will highlight the
strengths and weaknesses of current social and technical practices. Rich descriptions derived
from observations detailing the strengths and weaknesses of social and technical security
mechanisms are a product that would be of value to the security research community. For
this reason my guiding research question is, "How do socio-technical systems that use
sensitive personal information manage work-practice breakdowns surrounding the implicit
and explicit rules of practice?"
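
The badge example can be made concrete by sketching what a typical door-logging system
records. The point of the sketch, whose field names and identifiers are hypothetical rather
than drawn from any real product, is that the log captures the credential presented, not the
person holding it:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class DoorEvent:
        badge_id: str   # the credential presented, not the person holding it
        door: str
        timestamp: datetime

    LOG = []

    def log_entry(badge_id, door):
        # The system records only that this badge was used at this door.
        # Whether the mother or the father presented it is invisible here
        # and must be caught, if at all, by social mechanisms.
        LOG.append(DoorEvent(badge_id, door, datetime.now()))

    log_entry("badge-0417", "front door")  # logged the same either way
    print(LOG[-1])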

I have further broken this down into three sub-questions to provide incremental answers to
the larger research question. The first question asks, "What are the implicit and explicit rules
surrounding how medical practices and childcares handle sensitive personal information?"
Answering this question involves creating a comprehensive list of the explicit and implicit
policies for how sensitive personal information is handled. A list of policy themes has been
detailed in Section 4.4.3 Second Round of Coding & Analysis. These themes will be revisited
and more will be created after the observations have been conducted.

The second and third questions are interrelated. The second question asks about what
breakdowns happen. The third question asks about how the system responds to those
breakdowns. To answer these questions a series of observation logs with transcribed audio
snippets will be used. Relevant artifacts, pictures, and diagrams will be created to describe
activities and context of breakdowns. Undergraduate students will work with me during the
summer and fall to transcribe and start coding relevant parts of the large amount of data that
will be recorded. The themes discussed in Section 4.4.3 will be used as codes along with a
general theme of breakdowns. The result will be rich descriptions of breakdowns and their
responses in terms of the social and technical mechanisms.
4.5.1. Length of Proposed Observations
To understand how breakdowns affect the collaborative management of sensitive personal
information I am proposing to do month-long observations in both a childcare and a
medical practice. A month was selected for two reasons. First, I worked in two medical
practices in Northern Virginia before college as a receptionist and clerk. I found the first
week was a trial period where the practice was seeing what my capabilities were. The
following week involved learning routine processes. It was only in the subsequent weeks
that the routines of work began to feel natural. A month will provide me 10 observation
sessions where my presence will be more normal and I can start to observe practice as an
active observer.

The second reason a month was selected is the routine nature of daily work at childcares
and medical practices. In particular, the research of Reddy and Dourish showed that there
are chaotic and stable periods of time that are predictable (Reddy et al. 2002). In this work
they show that work is responsive and situational, but that there are regular daily rhythms to
the work. These rhythms were observed in my personal experience but also in the early
observations done in the pilot studies. Even within only four observations in one childcare,
critical aspects of security and privacy practice were repeated in the day-to-day routines that
were observed. The repeated rhythms of this work mean that a month of observations
should elicit enough of the same behavior for a reasonable comprehension of the context
and practices within which breakdowns are observed.

If during the last day of observation a critical breakdown occurs that greatly impacts my
understandings of the types of breakdowns in these environments, then observations will
continue for a short time until that breakdown is understood.
4.5.2. Conceptual Model
For my dissertation I will be applying aspects of a conceptual model that has been developed
by the Usable Security Team at Virginia Tech (Kafura et al. 2009). This conceptual
framework is diagrammed in Figure 16. The diagram shows that relationships exist between
place, reciprocity, and meaning. These factors reciprocally determine (Germana 2001) one
another to define the design space for usable security. Background information about parts
of this diagram can be found in other sections of the proposal (e.g., place, privacy, trust
negotiation). Reciprocity, though, has not been covered in my review, as it is a proposed
future direction of the usable security group for examining the other factors.

Figure 16. The conceptual framework of privacy, security and place (Kafura et al.
2009).

4.6. Proposed Product
The proposed product from this research is a set of explicit and implicit policies that affect
the collaborative management of sensitive personal information. Additionally, the month-
long observations will produce a set of observed breakdowns that occurred in reference to
the explicit and implicit policies. These breakdowns will be categorized to create breakdown
types that will be exemplified in a set of problem scenarios.

Scenario-Based Design is a design method that details how to move from the problem space
to the design space. The first step in this process is to study and conceptualize the current
design space in a set of problem scenarios. As explained by Rosson and Carroll in their
foundational HCI book, Usability Engineering, a problem scenario "tells a story of current
practice" (Rosson et al. 2002, p. 64). Problem scenarios are not merely depictions of
problems within the current domain. They are, instead, descriptions of the work practices
that currently exist. Both the positive and negative aspects of current methods can be
highlighted in a scenario.

The final product of this work will be a complementary set of problem and near-future
scenarios that depict the negative and positive implications of implementing possible design
solutions to the problem scenarios. Near-future scenarios are used frequently in ubicomp
work to depict the use of technology in realistic settings. For example, Little et al. (Little et
al. 2009) used them to show collaborative shopping in a home setting. Friedewald et al.
(Friedewald et al. 2007) also use them in a multitude of settings focusing on individuals.
Near-future scenarios are similar to design scenarios from Scenario-Based Design. The
largest difference is that near-future scenarios can depict technology that is possible, but not
yet realized.
4.6.1. Stopping Point
When creating scenarios, the question is how many scenarios are enough to fully capture the
design space. Rosson and Carroll, two of the more prevalent researchers in Scenario-Based
Design, explain that Scenario-Based Design is a method of inquiry where that question is
hard to answer. The goal is not to describe every aspect of the design space, but to instead
raise rich discussion on points of interest in the design space. Instead of a firm stopping
rule, Rosson and Carroll suggest having one scenario for every stakeholder, having multiple
scenarios depicting different aspects of complex tasks, and, if there is a need for an
exhaustive list, trying to decompose the space using another method (Rosson et al. 2002).

For my work I will create an exhaustive list of all breakdowns that were observed. These
breakdowns will then be categorized to attempt to generate breakdown schemas. Problem
scenarios will be used to explain representative aspects of the breakdown schemas. Given
the qualitative nature of my work, there may be breakdowns that occur as one-of-a-kind
instances or that do not fit into a schema. I propose to create scenarios that richly depict the
breakdown schemas and how they impact different areas of the conceptual framework
shown in Figure 16.

The stopping point for the creation of representative problem and near-future scenarios is
when they adequately represent and discuss rich aspects of the problem space. At most, this
is a one-to-one mapping of breakdowns to scenarios. The goal, though, is to abstract
breakdowns, resulting in a many-to-one breakdown-to-scenario product.
4.6.2. Sample Problem Scenario
This section provides a sample problem scenario that is based on observed events at a
childcare. For the purpose of telling a story, small details have been added to complete the
narrative.

Problem Scenario: Stephanie interrupts Jane, the director of a childcare that
houses 200 children: "Hey, do you have Celia's new cell phone number?"
Jane puts down the folder she was looking for and thinks. "No" she
concludes, and returns back to her work. A couple of minutes later Samuel
calls Jane, asking if she knows where Stephanie is. Jane does not know, but
asks what Samuel needs to see if she can help. Samuel is the bus driver who
picks up children every day from school. He has a list of children who are
supposed to be on the bus that is updated daily before he leaves. One child,
who is Celia's child, was not at the school to be picked up. Samuel explains
that he called Stephanie because the number he has for Celia is not working.
Jane now does not know where Stephanie is in the center, nor Celia's new
cell phone number that has not been updated, nor where the child is that
Samuel is waiting to pick up. This situation is resolved not by any of the
three unknowns becoming known, but instead by happenstance. About 5
minutes after Jane tells Samuel that she will call him back, Celia enters the
daycare with her missing child. Stephanie eventually reappears, explaining
that she was needed for a small but manageable emergency in another room.

In this scenario there are a few breakdowns that can be understood with the Activity Theory
Framework. The first is in relation to the division of labor. Labor has been divided in the
childcare so that Stephanie is in charge of updating information about the staff, or, in this
scenario, Celia. Jane, on the other hand, is in charge of updating information about the
children. In this case, there is an unclear delineation of who was in charge of updating
Celia's information. The second problem in relation to the division of labor is that there are
times when administrative staff, like Stephanie and Jane, have to step into other roles, such
as that of a temporary teacher. At these times the responsibilities of the administrative staff
are put aside and not managed by anyone else in the childcare. Therefore, figuring out how
to contact Celia was not properly managed given two pressing needs at one time.

The second breakdown that occurred in this scenario is in relation to missing information.
There is a policy in this childcare that when a child is missing or does not appear at daycare
by a certain time, a five-dollar charge is added to the parent's bill. In this situation, though,
there was not a policy about how to handle missing information in order to assess whether
the child is simply absent or is missing for a nefarious reason. The lack of rules for handling
the lack of correct personal information caused a breakdown.

The last breakdown is in relation to a conflict in objectives. The bus driver has a limited
amount of information about each child in case the van is stolen. The childcare does not
want a large amount of information about each child available on the bus. For this reason
they keep only the name of the child, with one contact name and phone number. The
conflicting objective lies in the situation – to find the missing child. In this particular
scenario the information at hand does not support the objective, thus creating a breakdown.
4.6.3. Sample Near-Future Scenario
In the previous section a problem scenario was provided with an analysis of what caused the
breakdowns. In this section a scenario is provided to discuss one possible socio-technical
solution.

It is unclear why social mechanisms did not support the social system and prevent these
breakdowns. For example, this childcare has radios and a PA system that Jane could have
used to find Stephanie. Jane also could have started calling Celia's husband and other people
listed on the emergency contact list to try to find Celia and ask about her daughter.
However, from the observation notes, it appears as though the childcare did know that Celia
had a new cell phone number, but did not actually have a copy of the new number in its
records. One possible solution is to enable Celia to update her own information in an
electronic database of staff/parent information. If this database were then also made
available to the bus driver, he would have access to Celia's up-to-date information instead of
only her name and contact number. Additionally, if the electronic database on the bus were
password protected, this would protect the privacy of the information.

Near-Future Scenario: When Samuel drives up to the school and notices that
Celia's daughter is not there to be picked up, he turns to his dashboard
computer. He had entered his password upon entering the car. As he drives
up to each location, the computer in the dashboard displays the pictures of
the children who are supposed to be available. Samuel clicks on Celia's
daughter's picture. Celia's name and phone number appear. Samuel clicks on
Celia's phone number to dial, but finds that the number is disconnected. He
then clicks on the 'get more information' button. Samuel dials Celia's
husband and finds that Celia is on the way to drop her daughter off at the
childcare. Samuel heads back to the childcare to drop the children off and
lets Jane know to charge Celia five dollars for not calling to update.
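
The 'get more information' button in this scenario amounts to a role-limited lookup with an
escalation step. The following minimal sketch uses the scenario's characters but otherwise
hypothetical fields and data:

    # Parent/staff records (hypothetical data for illustration).
    RECORDS = {
        "celia": {"child": "daughter", "phone": "540-555-0100",
                  "emergency": ["husband: 540-555-0101"],
                  "billing": "kept off the bus entirely"},
    }

    # Each role sees only the fields its work requires; the driver's default
    # view is minimal, matching the childcare's objective of limiting what
    # is exposed if the van is stolen.
    ROLE_FIELDS = {
        "bus_driver": ["child", "phone"],
        "bus_driver_escalated": ["child", "phone", "emergency"],
        "director": ["child", "phone", "emergency", "billing"],
    }

    def lookup(role, person):
        record = RECORDS[person]
        return {name: record[name] for name in ROLE_FIELDS[role]}

    # Samuel's 'get more information' button is an escalation from the
    # minimal view to one that also shows emergency contacts.
    print(lookup("bus_driver", "celia"))
    print(lookup("bus_driver_escalated", "celia"))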
4.7. Timeline
My estimated timeline would enable me to do my final defense in March-May 2011. Tables
by semester are in Section 8.2. I would run my observations in April and August-September
2010. Data analysis by me and undergraduate students would take place through the Fall
semester. December and early January would be spent writing my research defense. Between
two and four months later I would do my final defense.

5. References
1. (2005) "Percentage of Children Ages 0-6, Not Yet in Kindergarten by Type of Care
Arrangement and Child and Family Characteristics, 1995, 2001, and 2005." ChildStats.gov.
2. (2010a). The Health Information Security and Privacy Collaboration Toolkit, Agency
for Healthcare Research and Quality.
3. (2010b). "Understanding Health Information Privacy." Retrieved February 8, 2010,
from http://www.hhs.gov/ocr/privacy/hipaa/understanding/index.html.
4. Ackerman, Mark, Trevor Darrell and Daniel Weitzner (2001). "Privacy in Context."
Human-Computer Interaction 16(2): 167-176.
5. Ackerman, Mark S. (2004). "Privacy in pervasive environments: next generation
labeling protocols." Personal Ubiquitous Comput. 8(6): 430-439.
6. Ackerman, Mark S. and Lorrie Cranor (1999). Privacy Critics: UI Components to
Safeguard Users' Privacy. CHI '99 Extended Abstracts on Human Factors in Computing Systems,
Pittsburgh, Pennsylvania, ACM.
7. Adams, Anne and Ann Blandford (2005a). "Bridging the Gap Between
Organizational and User Perspectives of Security in the Clinical Domain." International
Journal of Human-Computer Studies 63(1-2): 175-202.
8. Adams, Anne, Ann Blandford, Dawn Budd and Neil Bailey (2005b). "Organizational
communication and awareness: a novel solution for health informatics." Health Informatics
Journal 11(3): 163-178.
9. Adams, Anne and Martina Angela Sasse (1999). Users Are Not the Enemy.
Communications of the ACM. 42: 40-46.
10. Agre, Philip. E. (1994). "Surveillance and Capture." Information Society 10(2): 101-
127.
11. Amoroso, E., T. Nguyen, J. Weiss, J. Watson, P. Lapiska and T. Starr (1991). Toward
an approach to measuring software trust. Research in Security and Privacy, 1991. Proceedings., 1991
IEEE Computer Society Symposium on.
12. Anderson, Ross J. (2008). Security Engineering: A Guide to Building Dependable Distributed
Systems, Wiley.
13. Arcand, Manon, Jacques Nantel, Mathieu Arles-Dufour and Anne Vincent (2007).
"The Impact of Reading a Web Site's Privacy Statement on Perceived Control Over Privacy
and Perceived Trust." Online Information Review 31(5): 661-681.
14. Bannon, Liam and Susanne Bødker (1997). Constructing common information
spaces. Proceedings of the fifth conference on European Conference on Computer-Supported Cooperative
Work, Lancaster, UK, Kluwer Academic Publishers.
15. Bardram, Jakob (2005). Activity-Based Support for Mobility and Collaboration in
Ubiquitous Computing. Ubiquitous Mobile Information and Collaboration Systems: 166-180.
16. Bardram, Jakob (2009a). A Novel Approach for Creating Activity-Aware
Applications in a Hospital Environment. Human-Computer Interaction – INTERACT 2009:
731-744.
17. Bardram, Jakob E. (1997). Plans as Situated Action: An Activity Theory Approach to
Workflow Systems. ECSCW '97, Copenhagen, Denmark, Kluwer Academic Publishers.
18. Bardram, Jakob E. (2009b). "Activity-based computing for medical work in
hospitals." ACM Trans. Comput.-Hum. Interact. 16(2): 1-36.

19. Bardram, Jakob E. and Claus Bossen (2005). "Mobility Work: The Spatial Dimension
of Collaboration at a Hospital." Computer Supported Cooperative Work (CSCW) 14(2): 131-
160.
20. Bardram, Jakob E., Jonathan Bunde-Pedersen, Afsaneh Doryab and Steffen
Sørensen (2009). CLINICAL SURFACES --- Activity-Based Computing for Distributed
Multi-Display Environments in Hospitals. Proceedings of the 12th IFIP TC 13 International
Conference on Human-Computer Interaction: Part II, Uppsala, Sweden, Springer-Verlag.
21. Baskerville, Richard, Jan Stage and Janice I. DeGross (2000). Organizational and Social
Perspectives on Information Technology, Springer.
22. Baumer, Eric P. S. and Bill Tomlinson (2010). Comparing Theoretically-Informed
Methods for Video Analysis: Beyond “Kicking the Tires”. Manuscript in submission to The
ACM Conference on Human Factors in Computing Systems (CHI), Atlanta, Georgia, ACM.
23. Bellotti, Victoria and Abigail Sellen (1993). Design for Privacy in Ubiquitous
Computing Environments. Proceedings of the Third Conference on European Conference on Computer-
Supported Cooperative Work, Kluwer Academic Publishers.
24. Birnholtz, Jeremy, Jamie Guillory, Jeff Hancock and Natalya Bazarova (2010). "on
my way": Deceptive Texting and Interpersonal Awareness Narratives. ACM Conference on
Computer Supported Cooperative Work (CSCW'10), Savannah, Georgia, USA, ACM.
25. Bloomberg, Linda Dale and Marie F. Volpe (2008). Completing Your Qualitative
Dissertation: A Roadmap From Beginning to End, Sage Publications, Inc.
26. Bødker, Susanne (1996). Applying Activity Theory to Video Analysis: How to Make
Sense of Video Data in Human-Computer Interaction. Context and Consciousness: Activity
Theory and Human-Computer Interaction. B. A. Nardi. Cambridge, Massachusetts, MIT
Press: 147-174.
28. Bossen, Claus (2002). The parameters of common information spaces: the
heterogeneity of cooperative work at a hospital ward. Proceedings of the 2002 ACM conference on
Computer supported cooperative work, New Orleans, Louisiana, USA, ACM.
29. Branham, Stacy, Monika Akbar and Laurian Vega (2009). Process and Situated
Practice: The Unofficial Rules of Childcare Information Management. Blacksburg, Virginia,
Virginia Tech.
30. Bylund, Markus, Kristina Höök and Alina Pommeranz (2008). Pieces of Identity.
Proceedings of the 5th Nordic Conference on Human-Computer Interaction: Building Bridges, Lund,
Sweden, ACM.
31. Carayon, P. (2006). "Human factors of complex sociotechnical systems." Applied
ergonomics 37(4): 525-535.
32. Charmaz, Kathy (2006). Constructing Grounded Theory: A Practical Guide through
Qualitative Analysis, Sage Publications Ltd.
33. Chowdhury, M. M. R., J. Noll and J. M. Gomez (2007). Enabling Access Control and
Privacy through Ontology. Innovations in Information Technology, 2007. IIT '07. 4th International
Conference on.
34. Convertino, Gregorio, Dennis C. Neale, Laurian Hobby, John M. Carroll and Mary
Beth Rosson (2004). A laboratory method for studying activity awareness. Proceedings of the
third Nordic conference on Human-computer interaction. Tampere, Finland, ACM Press.
35. Cranor, Lorrie Faith, Manjula Arjula and Praveen Guduru (2002). Use of a P3P user
agent by early adopters. Proceedings of the 2002 ACM workshop on Privacy in the Electronic Society.
Washington, DC, ACM.
36. Creswell, John W. (2008). Research Design: Qualitative, Quantitative, and Mixed Methods
Approaches, Sage Publications, Inc.

37. Cutler, David M. (2008) "The American Healthcare System." Medical Solutions.
38. Dantec, Christopher A. Le, Erika Shehan Poole and Susan P. Wyche (2009). Values
as Lived Experience: Evolving Value Sensitive Design in Support of Value Discovery.
Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA,
USA, ACM.
39. Dimitriadis, Yannis, Juan Ignacio Asensio-Pérez, Davinia Hernández-Leo, Jeremy
Roschelle, John Brecht, Deborah Tatar, S. Raj Chaudhury, Chris DiGiano and Charles M.
Patton (2007). From socially-mediated to technology-mediated coordination: a study of
design tensions using group scribbles. Proceedings of the 8th International Conference on
Computer Supported Collaborative Learning. New Brunswick, New Jersey, USA, International
Society of the Learning Sciences.
40. Dourish, Paul (2001). Process descriptions as organisational accounting devices: the
dual use of workflow technologies. Proceedings of the 2001 International ACM SIGGROUP
Conference on Supporting Group Work. Boulder, Colorado, USA, ACM: 52-60.
41. Dourish, Paul (2004). "What we talk about when we talk about context." Personal
Ubiquitous Comput. 8(1): 19-30.
42. Dourish, Paul and Ken Anderson (2006). "Collective Information Practice:
Exploring Privacy and Security as Social and Cultural Phenomena." Human-Computer
Interaction 21(3): 319-342.
43. Dourish, Paul, Rebecca E. Grinter, Jessica Delgado de la Flor and Melissa Joseph (2004).
"Security in the Wild: User Strategies for Managing Security as an Everyday, Practical
Problem." Personal Ubiquitous Computing 8(6): 391-401.
44. Egelman, Serge, Janice Tsai, Lorrie Faith Cranor and Alessandro Acquisti (2009).
Timing is Everything?: The Effects of Timing and Placement of Online Privacy Indicators.
Proceedings of the 27th International Conference on Human Factors in Computing Systems, Boston, MA,
USA, ACM.
45. Engeström, Yrjö (1987). Learning by Expanding: An Activity - Theoretical Approach to
Developmental Research. Helsinki, Orienta-konsultit.
46. Engeström, Yrjö and Virginia Escalante (1996). "Mundane tool or object of
affection? The rise and fall of the Postal Buddy." Context and consciousness: Activity theory
and human-computer interaction: 325-373.
47. Ertmer, Peggy A. and Timothy J. Newby (1993). "Behaviorism, Cognitivism,
Constructivism: Comparing Critical Features from an Instructional Design Perspective."
Performance Improvement Quarterly 6(4): 50-72.
48. Fischer, Gerhard and Jonathan Ostwald (2001). "Knowledge Management:
Problems, Promises, Realities, and Challenges." IEEE Intelligent Systems 16(1): 60-72.
49. Flanagan, John C. (1954). "The critical incident technique." Psychological Bulletin
51(4): 327-358.
50. Flechais, Ivan, Jens Riegelsberger and M. Angela Sasse (2005). Divide and Conquer:
The Role of Trust and Assurance in the Design of Secure Socio-Technical Systems.
Proceedings of the 2005 Workshop on New Security Paradigms, Lake Arrowhead, California, ACM.
51. Flyvbjerg, Bent (2001). Making Social Science Matter, Cambridge University Press.
52. Friedewald, Michael, Elena Vildjiounaite, Yves Punie and David Wright (2007).
"Privacy, identity and security in ambient intelligence: A scenario analysis." Telematics and
Informatics 24(1): 15-29.
53. Friedman, Batya, Peter H. Kahn, Jr. and Alan Borning (2002). Value Sensitive
Design: Theory and Methods. University of Washington, University of Washington,
Computer Science and Engineering Department.

54. Garfinkel, Simson L. and Robert C. Miller (2005). Johnny 2: A User Test of Key
Continuity Management with S/MIME and Outlook Express. Proceedings of the 2005
Symposium on Usable Privacy and Security. Pittsburgh, Pennsylvania, ACM.
55. Germana, Joseph (2001). "Emergent Properties: A Fundamental Postulate of
Classical Systems Theory in Schematic Representation." Systems Research and Behavioral
Science 18(6): 565-567.
56. Gerson, Elihu M. and Susan Leigh Star (1986). Analyzing due process in the
workplace. ACM Trans. Inf. Syst. 4: 257-270.
57. Glaser, Barney G. and Anselm Strauss (1967). The Discovery of Grounded Theory:
Strategies for Qualitative Research, Aldine Transaction.
58. Good, Nathaniel S. and Aaron Krekelberg (2003). Usability and Privacy: A Study of
Kazaa P2P File-Sharing. Proceedings of the SIGCHI Conference on Human Factors in Computing
Systems. Ft. Lauderdale, Florida, USA, ACM.
59. Guindon, R. and B. Curtis (1988). Control of cognitive processes during software
design: what tools are needed? Proceedings of the SIGCHI conference on Human factors in computing
systems. Washington, D.C., United States, ACM.
60. Hancock, Jeff, Jeremy Birnholtz, Natalya Bazarova, Jamie Guillory, Josh Perlin and
Barrett Amos (2009). Butler lies: awareness, deception and design. Proceedings of the 27th
international conference on Human factors in computing systems, Boston, MA, USA, ACM.
61. Handy, Charles (1995). "Trust and the Virtual Organization." Harvard Business
Review 73(3): 40-50.
62. Hardstone, Gillian, Luciana d'Adderio and Robin Williams (2006). Standardization,
Trust and Dependability. Trust in Technology: A Socio-technical Perspective. K. Clark, G.
Hardstone, M. Rouncefield and I. Sommerville. Dordrecht, The Netherlands, Springer. 36:
69-103.
63. Haro, Elizabet and Brian M. Kleiner (2008). "Macroergonomics as an organizing
process for systems safety." Applied ergonomics 39(4): 450-458.
64. Harper, R. H. R., M. G. Lamming and W. M. Newman (1992). "Locating systems at
work: implications for the development of active badge applications." Interacting with
Computers 4(3): 343-363.
65. Harrison, Steve and Paul Dourish (1996). Re-place-ing space: the roles of place and
space in collaborative systems. Proceedings of the 1996 ACM conference on Computer supported
cooperative work. Boston, Massachusetts, United States, ACM.
66. Harrison, Steve and Deborah Tatar (2008). "Places: People, Events, Loci --- the
Relation of Semantic Frames in the Construction of Place." Comput. Supported Coop.
Work 17(2-3): 135-135.
67. Hassanein, Khaled S. and Milena M Head (2004). Building Trust Through Socially
Rich Web Interfaces. the Second Annual Conference on Privacy, Security, and Trust (PST'2004),
Fredericton, Canada.
68. Head, Milena M and Khaled S. Hassanein (2002). "Trust in e-Commerce: Evaluating
the Impact of Third-Party Seals." Quarterly Journal of Electronic Commerce 3(3): 307-325.
69. Heckle, Rosa R. and Wayne G. Lutters (2007). Privacy implications for single sign-on
authentication in a hospital environment. Proceedings of the 3rd symposium on Usable privacy and
security. Pittsburgh, Pennsylvania, ACM: 173-174.
70. Hendrick, Hal W. (2002). An Overview of Macroergonomics. Macroergonomics: Theory,
Methods, and Applications. H. W. Hendrick and B. M. Kleiner. Mahwah, NJ, Lawrence
Erlbaum Associates, Inc.

71. Hitchings, Jean (1995). "Deficiencies of the Traditional Approach to Information
Security and the Requirements for a New Methodology." Computers & Security 14(5): 377-
383.
72. Hong, Jason I., Jennifer D. Ng, Scott Lederer and James A. Landay (2004). Privacy
risk models for designing privacy-sensitive ubiquitous computing systems. Proceedings of the
5th conference on Designing interactive systems: processes, practices, methods, and techniques. Cambridge,
MA, USA, ACM: 91-100.
73. Humphries, William D., Dennis C. Neale, D. Scott McCrickard and John M. Carroll
(2004). Laboratory Simulation Methods for Studying Complex Collaborative Tasks.
Proceedings of the Human Factors and Ergonomics Society 48th Annual Meeting (HFES '04. New
Orleans, LA: 2451-2455.
74. Hutchins, Edwin and Tove Klausen (1996). Distributed cognition in an airline
cockpit. Cognition and Communication at Work. Y. Engeström and D. Middleton, Cambridge
University Press.
75. Jha, A. K., T. G. Ferris, K. Donelan, C. DesRoches, A. Shields, S. Rosenbaum and
D. Blumenthal (2006). "How common are electronic health records in the United States? A
summary of the evidence." Health Affairs 25(6): w496-w507.
76. Kafura, Dennis, Denis Gracanin, Steve Harrison and Francis Quek (2009).
SecurePlace: Usable Privacy Protection Leveraging Place, Reciprocity, and Meaning.
77. Kandori, Michihiro (1992). "Social Norms and Community Enforcement." The
Review of Economic Studies 59(1): 63-80.
78. Kaptelinin, Victor (1995). Activity Theory: Implications for Human-Computer
Interaction. Context and Consciousness: Activity Theory and Human-Computer Interaction. B. A.
Nardi. Cambridge, Massachusetts, The MIT Press: 103-116.
79. Kelton, Kari, Kenneth R. Fleischmann and William A. Wallace (2008). "Trust in
digital information." Journal of the American Society for Information Science and
Technology 59(3): 363-374.
80. Kientz, Julie A. and Gregory D. Abowd (2009a). KidCam: Toward an Effective
Technology for the Capture of Children's Moments of Interest. Proceedings of the 7th
International Conference on Pervasive Computing, Nara, Japan, Springer-Verlag.
81. Kientz, Julie A., Rosa I. Arriaga and Gregory D. Abowd (2009b). Baby steps:
evaluation of a system to support record-keeping for parents of young children. Proceedings of
the 27th international conference on Human factors in computing systems, Boston, MA, USA, ACM.
82. Kientz, Julie A., Rosa I. Arriaga, Marshini Chetty, Gillian R. Hayes, Jahmeilah
Richardson, Shwetak N. Patel and Gregory D. Abowd (2007). Grow and know:
understanding record-keeping needs for tracking the development of young children.
Proceedings of the SIGCHI conference on Human factors in computing systems, San Jose, California,
USA, ACM.
83. Kobayashi, Marina, Susan R. Fussell, Yan Xiao and F. Jacob Seagull (2005). Work
coordination, workflow, and workarounds in a medical context. Conference on Human Factors in
Computing Systems (CHI'05), Portland, OR, USA, ACM Press, New York, New York.
84. Kuutti, Kari (1995). Activity theory as a potential framework for human-computer
interaction research. Context and Consciousness: Activity Theory and Human-Computer Interaction. B.
A. Nardi. Cambridge, MA, Massachusetts Institute of Technology: 17-44.
85. Latour, Bruno (1987). Science In Action. Cambridge, Massachusetts, Harvard
University Press.
86. Lave, Jean and Etienne Wenger (1991). Situated Learning: Legitimate Peripheral
Participation, Cambridge University Press.

87. Leont'ev, Alexei Nikolaevich (1981 (Russian original 1947)). Problems of the development
of mind. Moscow, Progress Press.
88. Little, Linda, Elizabeth Sillence and Pam Briggs (2009). Ubiquitous systems and the
family: thoughts about the networked home. Proceedings of the 5th Symposium on Usable Privacy
and Security. Mountain View, California, ACM.
89. Mamykina, Lena and Elizabeth D. Mynatt (2007). Investigating and supporting
health management practices of individuals with diabetes. Proceedings of the 1st ACM
SIGMOBILE international workshop on Systems and networking support for healthcare and assisted living
environments, San Juan, Puerto Rico, ACM.
90. Mayer, Roger C., James H. Davis and David Schoorman (1995). "An Integrative
Model of Organizational Trust." The Academy of Management Review 20(3): 709-734.
91. Montague, Enid N. H., Brian M. Kleiner and Woodrow W. Winchester III (2009).
"Empirically understanding trust in medical technology." International Journal of Industrial
Ergonomics 39(4): 628-634.
92. Moran, Thomas P., Tara L. Matthews, Laurian Vega, Barton Smith, James Lin and
Stephen Dill (2009). Ownership and Evolution of Local Process Representations. Proceedings
of the 12th IFIP TC 13 International Conference on Human-Computer Interaction: Part I. Uppsala,
Sweden, Springer-Verlag.
93. Mullender, Sape (1993). Protection. Distributed Systems, Addison Wesley Publishing
Company.
94. Mwanza, Daisy (2001). Where Theory meets Practice: A Case for an Activity Theory
based Methodology to guide Computer System Design. INTERACT'2001: Eighth IFIP TC
13 International Conference on Human-Computer Interaction, Tokyo, Japan, IOS Press (Oxford,
UK).
95. Nardi, Bonnie A. (2007). Placeless Organizations: Collaborating for Transformation.
Mind, Culture, and Activity. 14: 5-22.
96. Nardi, Bonnie A. (1995). Context and Consciousness: Activity Theory and Human-Computer
Interaction. Cambridge, Massachusetts, The MIT Press.
97. Nardi, Bonnie A. (1996). Studying Context: A Comparison of Activity Theory,
Situated Action Models, and Distributed Cognition. Context and Consciousness: Activity Theory
and Human-Computer Interaction. B. A. Nardi. Cambridge, Massachusetts, MIT Press: 69-102.
98. Nardi, Bonnie A. (1998). Concepts of Cognition and Consciousness: Four Voices.
ACM SIGDOC Asterisk Journal of Computer Documentation.
99. Nissenbaum, Helen (2004). "Privacy as Contextual Integrity." Washington Law
Review 79(1).
100. Norman, Donald A. (1981). "Categorization of Action Slips." Psychological Review
88(1): 1-15.
101. Palen, Leysia and Paul Dourish (2003). Unpacking "privacy" for a networked world.
Proceedings of the SIGCHI conference on Human factors in computing systems, Ft. Lauderdale, Florida,
USA, ACM.
102. Perry, John, Elizabeth Macken, Neil Scott and Janice L. McKinley (1997). Disability,
Inability, and Cyberspace. Human Values and the Design of Computer Technology. B. Friedman.
Cambridge, Cambridge University Press: 65-89.
103. Raeithel, A. (1996). From Coordinatedness to Coordination Via Cooperation and
Co-construction. Workshop on Work and Learning in Transition, San Diego.
104. Rains, S. A. and C. D. Karmikel (2009). "Health information-seeking and perceptions
of website credibility: Examining Web-use orientation, message characteristics, and structural
features of websites." Computers in Human Behavior 25(2): 544-553.

105. Reason, James (1990). Human Error. Cambridge, Cambridge University Press.
106. Reddy, Madhu and Paul Dourish (2002). A Finger on the Pulse: Temporal Rhythms
and Information Seeking in Medical Work. Proceedings of the 2002 ACM Conference on Computer
Supported Cooperative Work, New Orleans, Louisiana, USA, ACM.
107. Ren, Yuqing, Sara Kiesler, Susan R. Fussell and Peter Scupelli (2007). Trajectories in
Multiple Group Coordination: A Field Study of Hospital Operating Suites. Proceedings of the
40th Hawaii International Conference on System Sciences.
108. Riegelsberger, Jens, M. Angela Sasse and John D. McCarthy (2002). Eye-Catcher or
Blind Spot? The Effect of Photographs of Faces on eCommerce Sites. Proceedings of the IFIP
Conference on Towards The Knowledge Society: E-Commerce, E-Business, E-Government, Kluwer, BV
Deventer, The Netherlands, The Netherlands.
109. Riegelsberger, Jens, M. Angela Sasse and John D. McCarthy (2003). "The
Researcher's Dilemma: Evaluating Trust in Computer-Mediated Communication."
International Journal of Human-Computer Studies 58(6): 759-781.
110. Rocco, Elena (1998). Trust breaks down in electronic contexts but can be repaired
by some initial face-to-face contact. Proceedings of the SIGCHI conference on Human factors in
computing systems. Los Angeles, California, United States, ACM Press/Addison-Wesley
Publishing Co.: 496-502.
111. Rosson, Mary Beth and John M. Carroll (2002). Usability Engineering: Scenario-Based
Development of Human-Computer Interaction. San Francisco, Morgan Kaufmann Publishers.
112. Roth, Wolff-Michael and Yew-Jin Lee (2007). ""Vygotsky's Neglected Legacy":
Cultural-Historical Activity Theory." Review of Educational Research 77(2): 186-232.
113. Sachs, Patricia (1995). Transforming work: collaboration, learning, and design.
Communications of the ACM, ACM. 38: 36-44.
114. Saltzer, Jerome H. and Michael D. Schroeder (1973). The Protection of Information
in Computer Systems. Fourth ACM Symposium on Operating System Principles.
115. Schmidt, Kjeld and Carla Simone (1996). "Coordination Mechanisms: Towards a
Conceptual Foundation of CSCW Systems Design." Computer Supported Cooperative
Work (CSCW) 5: 155-200.
116. Sellen, Abigail J. and Richard H. R. Harper (2001). The Myth of the Paperless Office, The
MIT Press.
117. Sheeran, Louise (2000). Users' models of the internet. CHI '00 extended abstracts on
Human factors in computing systems. The Hague, The Netherlands, ACM.
118. Smith, Eliot R. and Diane M. Mackie (2000). Social Psychology. Kendallville, IN, Taylor
& Francis.
119. Smith, Mark K. (2003) "Communities of practice." The Encyclopedia of Informal
Education. http://www.infed.org/biblio/communities_of_practice.htm.
120. Song, Jaeki and Fatemeh Mariam Zahedi (2007). "Trust in health infomediaries."
Decis. Support Syst. 43(2): 390-407.
121. Spafford, Eugene H., Richard A. DeMillo, Andrew Bernat, Steve Crocker, David
Farber, Virgil Gligor, Sy Goodman, Anita Jones, Susan Landau, Peter Neumann, David
Peterson, Douglas Tygar and William Wulf (2003). Four Grand Challenges in Trustworthy
Computing, Computer Research Association.
122. Strauss, Anselm C. and Juliet Corbin (1997). Grounded Theory in Practice, Sage
Publications, Inc.
123. Suchman, Lucy (1987). Plans and Situated Action. New York, NY, Cambridge
University Press.

124. Suchman, Lucy (1995). Making work visible. Communications of the ACM. New York,
NY, USA, ACM. 38: 56-ff.
125. Suchman, Lucy A. (1983). "Office procedure as practical action: models of work and
system design." ACM Trans. Inf. Syst. 1(4): 320-328.
126. Tatar, Deborah (2007). "The Design Tensions Framework." Human-Computer
Interaction 22(4): 413-451.
127. Theng, Yin-Leng and Eng-Soon Soh (2005). An Asian study of healthcare Web
portals: Implications for healthcare digital libraries. 8th International Conference on Asian Digital
Libraries, ICADL 2005, Dec 12-15 2005, Heidelberg, D-69121, Germany, Springer Verlag.
128. Toma, Catalina L. (2010). Perceptions of Trustworthiness Online: The Role of Visual
and Textual Information. ACM Conference on Computer Supported Cooperative Work (CSCW'10).
Savannah, Georgia, USA, ACM.
129. Toma, Catalina L. and Jeff Hancock (2010). Reading between the Lines: Linguistic
Cues to Deception in Online Dating Profiles. ACM Conference on Computer Supported
Cooperative Work (CSCW'10). Savannah, Georgia, USA, ACM: 5-8.
130. Vega, Edgardo, Zalia Shams and Laurian Vega (2009a). Childcare Study: The
Parent's view. Blacksburg, Virginia, Virginia Tech.
131. Vega, Laurian and Laura Agnich (2009b). Medical Practices Study Report.
Blacksburg, Virginia, Virginia Tech.
132. Vega, Laurian C., Enid N. H. Montague and Tom Dehart (2009c). Trust Between
Humans and Health Websites: A Review of the Literature and Derived Outcomes from
Empirical Studies. Journal of the American Medical Informatics Association.
133. Vega, Laurian and Tom Dehart (2009d). Childcare Study Report. Blacksburg,
Virginia, Virginia Tech.
134. Vega, Laurian, Yeong-tay Sun, D. Scott McCrickard and Steve Harrison (2010).
TIME: A Method of Detecting the Dynamic Variances of Trust. Under review at Workshop on
Information Credibility on the Web (WICOW'10) as part of World Wide Web (WWW'10). Raleigh,
NC, USA, ACM.
135. Warkentin, Darcy, Michael Woodworth, Jeff Hancock and Nicole Cormier (2010).
Warrants and Deception in Computer Mediated Communication. ACM Conference on
Computer Supported Cooperative Work (CSCW'10). Savannah, Georgia, USA, ACM: 9-13.
136. Warren, Samuel D. and Louis D. Brandeis (1890). "The Right to Privacy." Harvard Law
Review 4(5): 193-220.
137. Wenger, Etienne (1998) "Communities of practice: learning as a social system." The
Systems Thinker 9.
138. Whitten, Alma and J. D. Tygar (1999). Why Johnny Can’t Encrypt: A Usability
Evaluation of PGP 5.0. Proceedings of the 8th USENIX Security Symposium.
139. Winograd, Terry and Fernando Flores (1986). Understanding Computers and Cognition: A
New Foundation for Design. Norwood, New Jersey, Ablex Publishing Corporation.
140. Zahedi, Fatemeh and Jaeki Song (2008). "Dynamics of trust revision: Using health
infomediaries." Journal of Management Information Systems 24(4): 225-248.

Appendix
5.1. Demographic Information About Participants in Pilot Studies

Table 8. The demographic information for the childcare directors who participated
in pilot study interviews.
Participant | Date Collected | Interview Duration | Title | Gender | Parent | Experience (years) | Staff Size | Enrollment Size
P1 16-Jun 30:50 Director Female No 16 30 200
P2 17-Jun 30:00 Director Female No 2.5 10 26
P3 17-Jun 40:58 Director Female No 17 45 200
P4 18-Jun 17:36 Asst. Director Female Yes 20 15 43
P5 19-Jun 33:33 Director Female Yes 10 11 40
P6 19-Jun 51:01 Director Female No 15 22 100
P7 22-Jun 31:42 Director Female Yes 0.67 24 86
P8 23-Jun 29:21 Director Female Yes 4 23 145
P9 23-Jun 13:19 Director Female Yes 18 9 40
P10 26-Jun 44:15 Director Female Yes 22 24 27
P11 26-Jun 18:27 Owner Male No 14 13 61
P12 2-Jul 34:10 Asst. Director Female Yes 12 15 50

Table 9. The demographic information for the medical directors who participated in
pilot study interviews.
Participant | Date Collected | Interview Duration | Title | Gender | Experience at Current Location (years) | Staff Size | Approx. Patients Served Weekly
P1 30-Jun 19:39 Office Manager Female 35 7 200
P2 30-Jun 29:03 Office Manager Female 16 8 120
P3 30-Jun 13:07 Office Manager Female 5 6 65
P4 6-Jul 34:10 Owner/Doctor Male 12 37* 2000*
P5 8-Jul 29:57 Office Manager Female 16 18 400
P6 10-Jul 21:09 Office Assistant Female 10 12 70
P7 10-Jul 25:25 Office Assistant Female 4 9 65
P8 13-Jul 26:22 Office Manager Female 7 6 120
P9 14-Jul 16:15 Office Manager Female 18 3 70
P10 15-Jul 26:33 Owner/Optician Male 17 2 55
P11 21-Jul 22:28 Office Manager Female 7 7 75
P12 29-Jul 41:10 Doctor Male 35 50 40**
P13 31-Jul 20:03 Owner/Doctor Male 2 4 100
*Approximate number of patients seen weekly by the doctor, not the entire surgical center.
**Numbers reflect a total of 9 locations and corporate headquarters.

Table 10. Relevant demographic information about childcare centers that
participated in pilot study observations. Participant number corresponds with Table
8.
P1 P3 P4 P6
Hours of operation 7:15 am – 4:45 pm 6 am – 6 pm 7:30 am – 5:30 pm 7 am – 5:45 pm

Enrollment Size 200 200 43 100


Age 6 wk – 12 yr 6 wk – 12 yr 15 mo – 5 yr 6 wk – 12 yr
Title Director Director Director Director
Gender (director) Female Female Female Female

Staff Size 30 45 15 22
School Van Yes Yes Yes Yes
Number of observations 3 4 2 3
Hours of observation 9 9 5 9

Table 11. Demographic information about parents who participated in pilot study
interviews.

Participant | Sex | # of Children | Child(ren) Age(s) (months) | Child(ren) Age(s) | Partner | Current Childcare

P00 F 2 30 & 156 2.5 & 13 years Y Blacksburg Daycare


P01 M 1 36 3 years Y Honeytree Early
P02 F 1 30 2 years 6 months Y Rainbow Riders
P03 F 1 9 9 months Y Rainbow Riders

P04 F 1 32 2 years 8 months Y Rainbow Riders


P05 F 1 48 4 years Y Early Challenges
P06 F 1 16 1 year 4 months Y The Learning Ladder

P07 M 1 30 2 years 6 months N Rainbow Riders


P08 F 1 42 3 years 6 months Y Children's Nest
P09 F 1 36 3 years Y Rainbow Riders

P10 M 2 36 & 84 3 & 7 years Y Children's Nest


P11 F 1 10 10.5 months Y Blacksburg Daycare
P12 F 1 30 2 years 6 months N Kids Connections

P13 F 1 36 3 Y Rainbow Riders


P14 F 2 24 & 48 2&4 Y Bright Beginnings
P15 F 2 72 & 108 6 & 9 years Y Adventure Club

P16 F 1 24 & 60 2 & 5 years Y Noah's Arch


P17 F 1 18 1 year 4 months Y Rainbow Riders
P18 F 2 15 & 72 1 year 3 months & 6 years Y Blacksburg Daycare
P19 F 2 23 1 year 11 months Y Blacksburg Daycare

P20 F 1 72 6 years Y Adventure Club

Participant | Previous Childcare(s) | Time at Current Childcare (months) | Total Time in Childcare (months) | Date of Interview | Interview Duration (minutes)

P00 N/A 30 30 7/14/2009 16.17


P01 N/A 18 18 9/14/2009 31.05
P02 Mount Tabor 1 9 9/14/2009 20.48

P03 The Learning Ladder 1 9 9/15/2009 15.00


P04 Early Challenges 1 20 9/16/2009 27.03
P05 N/A 1 12 9/17/2009 18.35

P06 N/A 8 8 9/17/2009 32.67


P07 N/A 12 18 9/21/2009 17.98
P08 N/A 41 39 9/22/2009 22.82

P09 Noah's Ark, Angels Nest 17 29 9/22/2009 21.20


P10 N/A 17 29 9/25/2009 20.48
P11 Kids Haven 1 2 9/21/2009 20.83

P12 ABC Daycare 24 29 9/22/2009 25.13


P13 Blacksburg Daycare 1 29 9/23/2009 27.83
P14 N/A 18 & 17 18 & 17 9/24/2009 23.13

P15 N/A 24 24 9/24/2009 22.35


P16 N/A 15 15 9/24/2009 27.83
P17 N/A 1 1 9/24/2009 28.97

P18 Turtle Tots (oldest) 2 & 48 2 & 62 9/29/2009 37.72


P19 N/A 1 4 9/29/2009 29.13
P20 Noah's Arch Preschool 13 30 9/30/2009 26.92

5.2. Proposed Timeline for Dissertation

Table 12. Dissertation Timeline. These tables span from the beginning of 2010 until the
end of 2011, depicting conferences, publications, dissertation work, and funding.

5.3. Materials Used in Pilot Studies
5.3.1. Informed Consent for Interviewing Childcare Directors & Parents

Field Study of In-use Information Security and Interfaces within Childcare
Investigators: Steve Harrison, Dr. Dennis G. Kafura, Laurian Vega, Tom DeHart, Edgardo
Vega, Zalia Shams

I. Purpose of this Research


The purpose of this research is to start investigating how personal information is accessed,
documented, and stored in medical-like settings. The goal is to develop an understanding
from the interviews that will better inform the design and study of personal health records.

II. Procedures
At the start of the interview we will ask for demographic information about you and the
childcare you work for or are associated with, for our record-keeping purposes. Next, we will
ask questions to get you thinking about the kinds of information stored in the childcare,
who can access that information, how it is accessed, whether there is an audit trail, and how
it is documented. We will then work to fill in a table that diagrams all of the people,
information, and access rights. This interview can be interrupted for job-related tasks.
Without interruption, we anticipate this interview will take between 30 and 45 minutes. At
the end of the interview you will be thanked for your time.

III. Risks
The only known risk in this study is legal: we do not wish to see any information that is
confidential or illegal to share. However, all information and audio will be stored on a
password-protected computer. Files will be named only with your participant number, which
is written at the top right of this form.

IV. Benefits
The long-term benefit of this study is that the information you provide will give us insight
into how personal information is managed in medical-like situations. This will guide us in the
design of electronic personal information management and help reduce time, cost, and
paperwork in medical and childcare settings.

However, no promise or guarantee of benefits has been made to encourage you to
participate in this research. If, at a later date, you are interested in the results of this study,
you can contact the researchers listed below for a summary.

V. Extent of Anonymity and Confidentiality


At the beginning of the study you will be assigned a random participant number to ensure
your confidentiality. You can see this number at the top of the informed consent. There are
a few places your identification will be stored. One is on our copy of this form. The second

will be in a digital file that associates your name with your participant number. The third is
in the audio recording of the interview. The last is on notes that we may take during the
interview. These files will all be digitized and secured under a password, and the informed
consents will be stored in a locked file cabinet under the protection of the investigators. At
no time will the researchers release identifying information to anyone other than individuals
working on the project without your written consent.

During the interview we will be taking handwritten notes and filling out a table, as well as
making an audio recording of what you are saying. These recordings and notes will be stored
on an external drive and also locked in a file cabinet of the experimenters. All data will be
destroyed within 5 years of the experiment by either shredding the documents or wiping the
hard drive.

It is possible that the Institutional Review Board (IRB) may view this study’s collected data
for auditing purposes. The IRB is responsible for the oversight of the protection of human
subjects involved in research.

Unless the participant reveals information regarding abuse to him/herself and/or others, this
confidentiality will be maintained under all circumstances.

VI. Compensation
You will be compensated $10.00 for participating in this study.

VII. Freedom to Withdraw


You are free to withdraw from the study at any time without penalty. You are free not to
answer any questions or respond to any experimental situation, without penalty.

There may be circumstances under which the investigator may determine that you should
not continue as a subject. You will be thanked for your time.

VIII. Subject’s Responsibilities


I voluntarily agree to participate in this study. I have the following responsibilities: to let the
experimenter know if I am feeling overly frustrated and need to take a break; to let the
experimenter know that I need to leave the study.

IX. Subject’s Permission


I have read the Consent Form and conditions of this project. I have had all of my questions
answered. I hereby acknowledge the above and give my voluntary consent:

______________________________________________ Date:________________
Subject Signature

Should I have any pertinent questions about this research or its conduct, research subjects'
rights, or whom to contact in the event of a research-related injury to the subject, I may
contact:

Laurian C. Vega, Steve Harrison (Faculty Advisor), Dr. Dennis Kafura (Faculty Advisor),
Tom DeHart, Edgardo Vega, Zalia Shams
Computer Science Department
2202 Kraft Drive
Blacksburg, VA. 24060
{lhobby, srh, kafura}@cs.vt.edu, tdehart@gmail.com
Phone: 540-231-7409

Dr. Ryder
Department Head, Computer Science Department
2202 Kraft Drive
Blacksburg, VA 24060
Ryder@cs.vt.edu
Phone: 540.231.8452

David M. Moore
Chair, Virginia Tech Institutional Review
Board for the Protection of Human Subjects
Office of Research Compliance
2000 Kraft Drive, Suite 2000 (0497)
Blacksburg, VA. 24060
540.231.4991
MooreD@vt.edu

5.3.2. Interview Protocol for Interviewing Childcare Directors

Section 1: Demographic and Background Information


Goal: To collect background information about the person and childcare.

Name: ________________________________________________________________

Gender: M/F

Are you a parent? Yes/No

If so, how old is/are your child/children: ____________________________

Childcare Facility:

_____________________________________________________________

Title: ______________________________________________________________

Number of children currently enrolled at this location:

______________________________

Number of children you personally oversee: __________________________

Number of people whom you work with at this location: __________________________

Age range of children whom you’re responsible for:

_____________________________________

How long have you been working at this particular location: ___________________

How long have you been working in Childcare Services:

_____________________________

Section 2: Information Security


Goal: To look at the various dimensions of information that is documented, stored, and communicated. This
section is to get the participant thinking about information they interact with and collect.

Time and Variety:


• What is the information that you handle as part of your duties at this facility daily?
Weekly? Monthly?
• What is routine and what is event-driven?

Types:

• What are some of the types of information you interact with and how is it handled?
o Health?
 Immunization?
o Age?
o Behavior?
o Incidents? (e.g. illness, physical altercations between children, etc)
o Parents?

Do you or your facility have general policies about who can access information in your
facility?
• How is it communicated to staff? To parents? To oversight agencies?
• Is there a situation that illustrates how this works?

How is access to information about children or other people managed?

Where are files stored? (Take picture)

How is information exchanged with:


• People who work here?
• Staff and individual parents?
• Groups of parents?
• Government and accreditation agencies?

Has your access to information changed?


• For instance, when you started working here, were you only allowed to access a
minimal amount of information?

If applicable: What effect does access to the information of other children have on your
behavior or practices as a parent?

Maybe: Information Verification


• What information verification procedures are followed?
o For instance, do you verify the child’s or parent’s social security number?
• What information does your childcare provide to prove that they are certified?
• Maybe: When someone joins your childcare, what precautions are taken?

Section 3: Interfaces for information


Goal: To itemize all the stakeholders in information access and to understand how information and
documented information flow.

See attached table: Interview Form: Field Study of In-use Information Security and
Interfaces within Childcare

Section 4: Web Camera


Goal: The goal of this section is to probe how security is managed and access is granted.

Do you use web cameras in your childcare?

If yes:
• Who runs the web camera access system?
o Maybe, if 3rd party: Do you have a copy of the contract?
o What made you pick this company?
o What privacy policies are there? Who enforces them?
• Can I see what you can see through the web camera system?
o How is this view different or similar to what others can see?
• Who can access the system?
o What is the procedure for others to access the web camera system?
• How do you know when someone has accessed the system?
• What would happen or what has happened when there is suspicious activity in the
web camera system?
• Is what is seen through the web-cameras ever recorded?
o What happens with those recordings?
o Who can access those recordings?

5.3.3. Informed Consent for Interviewing Medical Center Directors

Field Study of In-use Information Security and Interfaces within General Medical Practices
Investigators: Steve Harrison, Dr. Dennis G. Kafura, Dr. Francis Quek, Laurian Vega, Laura
Agnich

I. Purpose of this Research


The purpose of this research is to start investigating how personal information is accessed,
documented, and stored in medical settings. The goal is to develop an understanding from
the interviews that will better inform the design and study of personal health records.

II. Procedures
At the start of the interview we will ask for demographic information about you and the
medical practice you work for or are associated with, for our record-keeping purposes. Next,
we will ask questions to get you thinking about the kinds of information stored in the
medical practice, who can access that information, how it is accessed, whether there is an
audit trail, and how it is documented. We will then work to fill in a table that diagrams all of
the people, information, and access rights. This interview can be interrupted for job-related
tasks. Without interruption, we anticipate this interview will take between 30 and 45
minutes. At the end of the interview you will be thanked for your time.

III. Risks
The only known risk in this study is legal: we do not wish to see any information that is
confidential or illegal to share. However, all information and audio will be stored on a
password-protected computer. Files will be named only with your participant number, which
is written at the top right of this form.

IV. Benefits
The long-term benefit of this study is that the information you provide will give us insight
into how personal information is managed in medical situations. This will guide us in the
design of electronic personal information management for personal health records. It will
help reduce time, cost, and paperwork in the medical setting.

However, no promise or guarantee of benefits has been made to encourage you to
participate in this research. If, at a later date, you are interested in the results of this study,
you can contact the researchers listed below for a summary.

V. Extent of Anonymity and Confidentiality


At the beginning of the study you will be assigned a random participant number to ensure
your confidentiality. You can see this number at the top of the informed consent. There are
a few places your identification will be stored. One is on our copy of this form. The second
will be in a digital file that associates your name with your participant number. The third is
in the audio recording of the interview. The last is on notes that we may take during the
interview. These files will all be digitized and secured under a password, and the informed

consents will be stored in a locked file cabinet under the protection of the investigators. At
no time will the researchers release identifying information to anyone other than individuals
working on the project without your written consent.

During the interview we will be taking handwritten notes and filling out a table, as well as
making an audio recording of what you are saying. These recordings and notes will be stored
on an external drive and also locked in a file cabinet of the experimenters. All data will be
destroyed within 5 years of the experiment by either shredding the documents or wiping
the hard drive.

It is possible that the Institutional Review Board (IRB) may view this study’s collected data
for auditing purposes. The IRB is responsible for the oversight of the protection of human
subjects involved in research.

Unless the participant reveals information regarding abuse to him/herself and/or others, this
confidentiality will be maintained under all circumstances.

VI. Compensation
There is no formal compensation for participating in this study.

VII. Freedom to Withdraw


You are free to withdraw from the study at any time without penalty. You are free not to
answer any questions or respond to any experimental situation, without penalty.

There may be circumstances under which the investigator may determine that you should
not continue as a subject. You will be thanked for your time.

VIII. Subject’s Responsibilities


I voluntarily agree to participate in this study. I have the following responsibilities: to let the
experimenter know if I am feeling overly frustrated and need to take a break; to let the
experimenter know that I need to leave the study.

IX. Subject’s Permission


I have read the Consent Form and conditions of this project. I have had all of my questions
answered. I hereby acknowledge the above and give my voluntary consent:

______________________________________________ Date:________________
Subject Signature

Should I have any pertinent questions about this research or its conduct, research subjects'
rights, or whom to contact in the event of a research-related injury to the subject, I may
contact:

Laurian C. Vega, Laura Agnich, Steve Harrison (Faculty Advisor), Dr. Dennis Kafura
(Faculty Advisor), Dr. Francis Quek
Computer Science Department
2202 Kraft Drive
Blacksburg, VA. 24060
{lhobby, srh, kafura}@cs.vt.edu, tdehart@gmail.com
Phone: 540-231-7409

Dr. Ryder
Department Head, Computer Science Department
2202 Kraft Drive
Blacksburg, VA 24060
Ryder@cs.vt.edu
Phone: 540.231.8452

David M. Moore
Chair, Virginia Tech Institutional Review
Board for the Protection of Human Subjects
Office of Research Compliance
2000 Kraft Drive, Suite 2000 (0497)
Blacksburg, VA. 24060
540.231.4991
MooreD@vt.edu

5.3.4. Interview Protocol for Interviewing Medical Center Directors
Section 1: Demographic and Background Information
Goal: To collect background information about the person and medical practice.

Name: ________________________________________________________________

Gender: M/F

Health Care Facility:

_____________________________________________________________

Title: ______________________________________________________________

Number of patients currently served at this location: ______________________________

Number of patients you personally have contact with: __________________________

Number of people whom you work with at this location: __________________________

Number of doctors/physicians practicing at this location: __________________________

How long have you been working at this particular location: ___________________

How long have you been working in Healthcare Services: __________________________

Section 2: Information Security


Goal: To look at the various dimensions of information that is documented, stored, and communicated. This
section is to get the participant thinking about information they interact with and collect.

Time and Variety:


• What is the information that you handle as part of your duties at this facility daily?
Weekly? Monthly?
• What is routine and what is event-driven?

Types:
• What are some of the types of information you interact with and how is it handled?
o Patient personal information (identity, SSN, contact,...)
o Patient financial (payment method, account information, ...)
o Patient medical record (history, illnesses, lab results, ...)
o Patient medications (prescriptions, ...)
o Insurance information
o Schedule information (doctor daily schedule, patient scheduling,...)
o Referrals (to other specialists)
• What kind of information might you or others be hesitant to write down?

Do you or your facility have general policies about who can access information in your
facility?
• How is it communicated to staff? To patients? To oversight agencies?
• Is there a situation that illustrates how this works?

Access
• How many people have access to all the files?
• How is access to information about patients or other people managed?

Has there ever been a time when someone had access to information that they shouldn’t
have, or accessed information they shouldn’t have? What happened?

How long do you keep files stored?

Where are files stored? (Take picture)

Has your access to information changed?


• For instance, when you started working here, were you only allowed to access a
minimal amount of information?

Ask if time:
• What effect does access to the information of other patients have on your behavior
or practices as a patient?
• What information verification procedures are followed?
o For instance, do you verify the patient’s social security number?

Section 3: Interfaces for information


Goal: To itemize all the stakeholders in information access and to understand how information and
documented information flow.

See attached table: Interview Form: Field Study of In-use Information Security and
Interfaces within Healthcare

Section 4: Electronic Health Information Systems


Goal: The goal of this section is to probe what information is maintained in electronic form, how security is
managed and access is granted.

Do you use an electronic/computerized system for the maintenance of patient information?


If yes:
• Who provides the electronic/computerized system?
o What made you pick this company?
o What privacy policies are there? Who enforces them?
• What kinds of information are stored in the system?
• Who can access the system?
o What is the procedure for others to access the system?
• How do you know when someone has accessed the system?
• Who can add, modify, delete information to/from the system?
• Is the system only accessible on premises, or can it be accessed remotely?

• What would happen or what has happened when there is suspicious activity in the
system?

5.3.5. Informed Consent for Observations in Childcares

Observation of In-use Information Security and Interfaces within Childcares
Investigators: Steve Harrison, Dr. Dennis G. Kafura, Laurian Vega, Stacy Branham, Monika
Akbar

I. Purpose of this Research


The purpose of this research is to start investigating how personal information is accessed,
documented, and stored in medical-like settings. The goal is to develop an understanding
from the observations that will better inform the design and study of personal health records.

II. Procedures
This is an observation study. We hope to ‘shadow’ you and observe you at your regular
everyday job for approximately 3 hours. During this time we will try not to be overly
obtrusive. During any calm moments we may ask a couple of questions to clarify
something we have seen or heard for our notes. Otherwise, we will be silent. At the end of
the observation you will be thanked for your time.

III. Risks
The only known risk in this study is legal: we do not wish to see any information that is
confidential or illegal to share. However, all information and audio will be stored on a
password-protected computer. Files will be named only with your participant number, which
is written at the top right of this form.

IV. Benefits
The long-term benefit of this study is that the information you provide will give us insight
into how personal information is managed in medical-like situations. This will guide us in the
design of electronic personal information management and help reduce time, cost, and
paperwork in medical and childcare settings.

However, no promise or guarantee of benefits has been made to encourage you to
participate in this research. If, at a later date, you are interested in the results of this study,
you can contact the researchers listed below for a summary.

V. Extent of Anonymity and Confidentiality


At the beginning of the study you will be assigned a random participant number to ensure
your confidentiality. You can see this number at the top of the informed consent. There are
a few places your identification will be stored. One is on our copy of this form. The second
will be in a digital file that associates your name with your participant number. The third is
in the audio recording of the observation. The last is on notes that we may take during the
observation. These files will all be digitized and secured under a password, and the informed
consents will be stored in a locked file cabinet under the protection of the investigators. At
no time will the researchers release identifying information to anyone other than individuals
working on the project without your written consent.

During the observation we will be taking handwritten notes, as well as making an audio
recording of what you are saying and doing. These recordings and notes will be stored on an
external drive and also locked in a file cabinet of the experimenters. All data will be destroyed
within 5 years of the experiment by either shredding the documents or wiping the hard drive.

It is possible that the Institutional Review Board (IRB) may view this study’s collected data
for auditing purposes. The IRB is responsible for the oversight of the protection of human
subjects involved in research.

Unless the participant reveals information regarding abuse to him/herself and/or others, this
confidentiality will be maintained under all circumstances.

VI. Compensation
No monetary compensation will be provided for participation in this study.

VII. Freedom to Withdraw


You are free to withdraw from the study at any time without penalty. You are free not to
answer any questions or respond to any experimental situation, without penalty.

There may be circumstances under which the investigator may determine that you should
not continue as a subject. You will be thanked for your time.

VIII. Subject’s Responsibilities


I voluntarily agree to participate in this study. I have the following responsibilities: to let the
experimenter know if I am feeling overly frustrated and need to take a break; to let the
experimenter know that I need to leave the study.

IX. Subject’s Permission


I have read the Consent Form and conditions of this project. I have had all of my questions
answered. I hereby acknowledge the above and give my voluntary consent:

______________________________________________ Date:________________
Subject Signature

Should I have any pertinent questions about this research or its conduct, research subjects'
rights, or whom to contact in the event of a research-related injury to the subject, I may
contact:

Laurian C. Vega, Steve Harrison (Faculty Advisor), Dr. Dennis Kafura (Faculty Advisor),
Stacy Branham, Monika Akbar
Computer Science Department
2202 Kraft Drive
Blacksburg, VA. 24060

{lhobby, srh, kafura}@cs.vt.edu, tdehart@gmail.com
Phone: 540-231-7409

Dr. Ryder
Department Head, Computer Science Department
2202 Kraft Drive
Blacksburg, VA 24060
Ryder@cs.vt.edu
Phone: 540.231.8452

David M. Moore
Chair, Virginia Tech Institutional Review
Board for the Protection of Human Subjects
Office of Research Compliance
2000 Kraft Drive, Suite 2000 (0497)
Blacksburg, VA. 24060
540.231.4991
MooreD@vt.edu

5.3.6. Interview Protocol for Interviewing Parents

Section 1: Demographic and Background Information


Goal: To collect background information about the parent and their child or children.

Name: ________________________________________________________________
Gender: M/F
Number of children in a childcare program: _____
Age(s) of child(ren): ____________________________
Childcare Facility in which your child/children are enrolled: _____________________
How long has/have your child/children been enrolled in a childcare program:
________________________________________________________________________
How long has/have your child/children been enrolled in the current childcare program:
________________________________________________________________________

Section 2: Information Security


Goal: To look at how information is shared between the childcare providers and the parents that are
enrolling their children into such programs.

In order to enroll your child/children into childcare, what kind of information did you have
to provide about yourself?

What kind of information did you have to provide about your child/children? Forms?
Immunizations?

Maybe: Was there any particular information that you had to provide to the childcare facility
that you felt was unnecessary or sensitive? How was this information handled? Any
differently?

What kind of information does the childcare facility provide to you about your
child/children (e.g., daily reports, webcam feeds, etc.)? Daily? Weekly? Yearly?

Is there any sort of information that you might have gathered about other children that are
enrolled in your child’s/children’s program (e.g., illnesses/conditions, behaviors)?

Are you aware of who can access your child’s information? Who do you think can access
your child’s information?

Do you know that centers keep your files forever?

Who do you think is the owner of the information in your child’s file?

If you aren’t comfortable with any policies, do you feel you would be able to bring them up
and have them changed?

How much do you trust your childcare with your child’s information?

What reassurances have you been given that the information about your child has been
protected? Do you know who has access to information about your child?

Maybe (intrusive): Has there ever been an incident between your child and another at the
childcare (e.g. physical altercations)?
If so, what kind of information was given to you about the incident (e.g. name of the other
child, what exactly happened between the two children, etc)?

What are the different ways that you may be contacted by the childcare (e.g., phone, face-to-
face, email, etc.)?

5.3.7. Collected Document Summary Form
This document is to be used to document any artifacts collected during observations. It is
adapted from (Bloomberg and Volpe 2008).

Name of Document: _______________________________________________________


Type of Document: ________________________________________________________
Document Number: _______________________________________________________
Date Received: ___________________________________________________________
Date of Document: ________________________________________________________
Event or Contact with which Document is Associated:
________________________________________________________________________

☐ Descriptive
☐ Evaluative
☐ Other _______________________

Page # Key Words/Concepts Comments:


Relationship to Research Question

Brief Summary of Contents:


________________________________________________________________________
________________________________________________________________________
Significance or Purpose of Document:
________________________________________________________________________
________________________________________________________________________
Is there anything Contradictory About the Document?
☐ Yes
☐ No
Salient Questions/Issues to Consider:
________________________________________________________________________
________________________________________________________________________
Additional Comments/Reflections/Issues:
________________________________________________________________________
________________________________________________________________________
________________________________________________________________________

