Journal of Social Work 8(2): 135–148
Copyright © 2008 Sage Publications: Los Angeles, London, New Delhi and Singapore
www.sagepublications.com
DOI: 10.1177/1468017307088495

Accountability, Evidence,
and the Use of Information
Systems in Social Service
Programs
TERRY E. CARRILIO
San Diego State University, USA

Abstract
• Summary: As social work engages with the ideology of evidence-based
practice it becomes important to accurately document service activities
and outcomes. This often proves problematic, as utilization of systems to
collect data for evaluation is fraught with ideological, epistemological,
and skill-based difficulties. This article describes a ‘multiple case study’
consisting of: 1) a multi-agency evaluation with inconsistent implementation
of a data collection system; and 2) a follow-up cross-sectional study of
social workers’ use of computers and data systems.
• Findings: Four components related to practitioner utilization of data
systems were identified: skills and experience with using computers,
perceived ease of use, utility of the data, and attitudes about data. The
latter may point to underlying epistemological and ontological issues
regarding evidence-based practice in direct service settings.
• Applications: It is important to understand the interacting personal,
professional, and organizational factors that influence social workers’
use of information systems. The findings suggest that improving worker
skill and comfort with data systems as well as maintaining an open
dialogue about how data will be used may be key components of efforts
to improve practitioner utilization of data systems.

Keywords: accountability, computerized information systems, evidence-based practice

Introduction
Evidence-based practice is an important recent trend in social work practice
and education, consistent with similar trends in other clinically based
professions such as medicine and psychology (Gambrill, 1999; Pollio, 2006).

While there is not yet one agreed-upon definition of evidence-based practice, there is general agreement that evidence-based practice represents a systematic
effort to identify what is known and what is currently agreed upon as best
practice, and apply careful judgment as to its appropriate application (Howard
et al., 2003; Pollio, 2006). There continues to be debate about the appropriateness of the emerging model and the methodologies that produce ‘evidence’
(Gray and McDonald, 2006; Qureschi, 2004; Webb, 2001). Social work educators
and researchers are embracing the move from the profession’s traditional
authority base (Gambrill, 2003; Howard et al., 2003) more rapidly than practitioners[1] (Barrett, 1999; Pollio, 2006). At the same time, policy-makers are
interested in supporting programs that can demonstrate effectiveness and
efficacy, and this often leads to pressure on practitioners to provide data about
what they are doing (Hernandez, 2000; Kettner et al., 1999). Evidence-based
practice offers a way for practitioners to respond to the powerful pressures of
funders and policy-makers[2] (Howard et al., 2003).
In an environment of heightened accountability and expectations that
intervention outcomes will be measured (Horsch, 1996; Lewis et al., 2001; Poole
et al., 2000; Scheirer, 2000), the practitioner is commonly instructed by administrators and researchers to collect data about what he or she is doing. External
pressures on practice come from funding sources, politicians, and legal statutes.
This external pressure encourages the use of practices that can be demonstrated
scientifically (Howard et al., 2003). This can result in a form of reductionism in
which programs that ‘work’ are narrowly defined and interventions are
presented technically and without sufficient regard for the context of the
specific service environment in which they are being implemented. Improvements in the technology and software used to collect data offer interesting
possibilities for developing systems that could be used to provide deeper
conceptual and operational understanding of interventions and improve their
quality (Fitzgerald and Murphy, 1994; Mutschler and Hasenfeld, 1986).
Although these data management systems could assist practitioners in documenting what they do so that it can accurately be evaluated and studied, social
service organizations often show only a fitful commitment to data systems and may even ignore the data that have been collected (Barrett, 1999).
There is often a sense on the part of practitioners that the researchers and
administrators are missing the point. Likewise, evaluators, administrators, and
policy-makers are regularly confounded by the practitioners’ apparent resistance to collecting important data about practice and outcomes. What is often
missing is a sense of collaboration and dialogue. Some of these tensions may
represent very real differences about what constitutes meaning in the practice
of social services (Houston, 2005). It is possible that practitioners, recognizing
the multiple systems involved in ‘causing’ any observed behavior, and recognizing the importance of values and experienced meaning to social work
practice, are not comfortable with the concept of prediction promulgated by
positivist and empirical models of understanding practice outcomes.

Reframing practice to incorporate a focus on collecting client level process and outcome data[3] requires change at the personal and organizational levels.
Organizational support and commitment over time are vital to shifting
consciousness and successfully implementing information systems (Carrilio
et al., 2003; Ogborne et al., 1998). Sophisticated information systems are
developed for the purposes of measuring practice activities and client outcomes
(Fitzgerald and Murphy, 1994; Mutschler and Hasenfeld, 1986; Ogborne, 1998;
Rossi et al., 1999). However, these systems are often defeated or underutilized
by practitioners, even when they are well developed, user friendly, and well
supported (Carrilio et al., 2003). This article examines some of the factors
involved in the underutilization of data and data collection systems by social
service practitioners. A ‘multiple case study’ approach (Yin, 1993, 1994) is
utilized.
Some authors suggest that information systems may be perceived by
practitioners as coming from ‘on high’ (Fitch, 2005; Pollio, 2006). Although these
information systems carry great potential for helping practitioners to better
manage and understand their practice (Mutschler and Hasenfeld, 1986), they
are often seen as imposed by those in charge, and are not embraced as a self-reflective tool (Fitzgerald and Murphy, 1994; Sluyter, 1998). The benefits of
information systems to evaluators and administrators are obvious, but
frequently there are significant barriers to the active use of these systems by
practitioners. The literature reports a range of possible reasons, both organizational and personal, for underutilization of client information systems,
including variable reporting needs and expectations (Kettner et al., 1999), and
organizational commitment and contextual considerations (Barrett, 1999;
Dorsey, 2002; Ogborne, 1998; Rapp and Poertner, 1992). At the personal level,
data system utilization is influenced by a sense that the wrong questions are being asked, paradigm differences between what clinicians find useful and what researchers find useful (Ames, 1999; Fitch, 2005; Pollio, 2006), concerns about
skills and expertise in data use (Mutschler, 1992), and ideological resistance to
the collection of certain data.
Introducing data systems into practice settings is complex (Lederer and
Mendelow, 1988). Client tracking systems can be useful both organizationally
and at the individual practitioner level. For the practitioner, though, the attachment of these systems to management needs (Fitch, 2005) creates a sense of
distance from the data and concern about its possible uses. The idea of looking
at intervention quality, consistency, and effectiveness, a key element of self-reflexive and evidence-based practice, can be anxiety provoking, especially if a
foundation of trust has not been laid. An organization needs data to maintain
accountability and credibility, and researchers require data to evaluate
programs. Practitioners need data to understand what they are doing and how
they can improve practice (Fitch, 2005; Mutschler and Hasenfeld, 1986;
Schoech, 1999), as well as to test theoretical concepts related to interventions.
Such openness to examination requires a supportive organizational context
(Hasenfeld and Patti, 1992; Monnickendam and Eaglstein, 1993; Sluyter, 1998).
Practitioners need to recognize the relationship between what they are being
asked to document, and underlying theories of change and program goals
(Hernandez, 2000; Scheirer, 2000; Rossi et al., 1999). Information system developers must recognize that there may not be agreement about what constitutes
evidence, leading to disagreements about what data are needed, and how these
data are to be understood (Clapp et al., 1998; Webb, 2001). It is incumbent upon
the developers to make it as easy as possible for direct practitioners to
document their activities in a way that is non-intrusive and is not overly burdensome (Barrett, 1999; Carrilio, 2005; Hodges and Hernandez, 1999; Schoech,
1995).
A number of researchers have identified key elements in the acceptance and
utilization of information systems by practitioners. Despont-Gros and her
colleagues have proposed five dimensions associated with utilization of information systems: user characteristics, information system characteristics, contextual and organizational characteristics, process characteristics, and system
impacts (Despont-Gros et al., 2005). In Monnickendam’s study of computer
system utilization, process variables, system attributes, and user variables were
strong predictors of utilization, while organizational context and worker attitudes were weak predictors of utilization (Monnickendam, 1999). Using data
systems in practice settings can very well represent a change of culture for
established practitioners (Carrilio et al., 2003), and may represent a non-acceptance of the currently popular trend toward evidence-based practice
among researchers and educators (Pollio, 2006). Even when data collection
systems are developed, following ‘best practice’ guidelines in the literature,
there may be more dialogue needed in order to assure the successful use of
data by practitioners (Carrilio, 2005; Lederer and Mendelow, 1988; Rossi et al.,
1999). In the analysis that follows, researchers on a large multi-agency evaluation project identified a problem with information system utilization by
practitioners and subsequently looked more deeply to identify some of the
possible reasons.

Method
Using a ‘multiple case study approach’ (Yin, 1994), the current analysis
considers two ‘case’ situations. For this article, the ‘cases’ for analysis were
identified as follows: 1) Case 1: observations regarding the utilization of information systems in a multi-agency evaluation. It should be noted that these
observations were made during the course of a much larger study. The ‘case’
here represents the phenomenon of underutilization and differential data
reporting unexpectedly identified during the project, and an initial effort to
understand it; 2) Case 2: the follow-up survey of community practitioners, treated as a cross-sectional instance of information regarding practitioner utilization of data and data systems.

Each of the ‘cases’ represents a unit that can be analyzed separately yet can
perhaps offer more by being examined jointly. A ‘multiple case’ study (Yin,
1994) is an approach that utilizes several case studies in order to enhance
external validity and representativeness. Generally, a case study framework is
used when the phenomenon being investigated is closely intertwined with its
context (Yin, 1993). In looking at how practitioners utilize information systems
and the resulting data, the phenomenon is embedded in personal and organizational factors that are difficult to differentiate. Additionally, the funding and
political context within which programs operate[4] makes it difficult to sort out
the relative contributions of individual and organizational factors to the overall
operation and evaluation of a given program. A ‘case’ represents an analysis of
a phenomenon or process at a particular point in time. A multiple case study
approach permits a cross-sectional analysis of related phenomena at multiple
points in time, and offers a way to embed quantitative data in its qualitative
context.

Case #1: Inconsistent Utilization of an Information System in a 17-Agency Evaluation
The author and colleagues were asked to evaluate a complex, multi-agency
family support project in California.[5] The program initiative was an evolution
of two previous initiatives, both of which had been extensively evaluated.
Vulnerable families were offered up to two years of weekly home visits, and
center-based groups, including parenting classes, child development classes, and
skills development. Services were delivered by a multi-disciplinary team which
included mental health clinicians, substance abuse specialists, child development specialists, public health nurses, social workers, and trained home visitors.
There was a community and systems component, in that the funding agency was
interested in increasing community focus on prevention and family support
activities. The evaluation included process and outcomes data at the client, staff,
organizational, and community level. The program results of this multi-agency
project and the two antecedent projects have been previously reported
(Carrilio, 2003, 2005a, 2005b; Carrilio et al., 2003; Landsverk et al., 2001).
In order to understand client level data, a number of simple, standardized
measures were administered at baseline and at six-month intervals. Additionally, a client information system was used to track services received. The information system was developed utilizing experience and feedback from several
projects (Carrilio, 2005a, 2005b). Direct practice staff, supervisors, and program
directors received intensive initial training and on-going support and training.
The system introduction, staff training, and maintenance plan were carefully
examined, and were found to have been consistent with elements that had been
identified in the literature as prerequisites for system success: 1) clearly identified need for data (Carrilio et al., 1985; Kettner et al., 1999; Lewis et al., 2001);
2) clarity of the content of the output (Kettner et al., 1999); 3) evaluation of
current capacities and gaps (Lewis et al., 2001); 4) use of data collection tools
that meet information needs (Kettner et al., 1999; Lewis et al., 2001; Schoech,
1995); 5) process plan that clarifies accountability and procedures (Kettner et
al., 1999; Lewis et al., 2001).
Despite considerable support, and apparent fiscal and organizational
benefits to collecting client level data, we found variable utilization at the
different agency sites, including some sites with extremely low levels of
information system utilization (Carrilio, 2005a, 2005b; Carrilio et al., 2003).
Interestingly, the information system was used differentially, in that workers
were more likely to complete process data (who did what, with whom, how
much, when, where, and for how long), but tended to neglect outcome measures.
Organizational context, administrative and practitioner attitudes, and organizational supports were identified as key components of information system
adoption. Attitudes, skills, and experience with information systems initially emerged as potential determinants of use.
The variable use of information systems, in addition to limiting the availability of outcome data for use in the evaluation, seemed counterintuitive, since
most of the agencies were seeking additional funding when the project ended
and could have used evidence that might have demonstrated the effectiveness
of the intervention. In order to gain a better understanding of this phenomenon,
a survey of staff at each of the 17 agencies was conducted, and a review of
random records from each agency site was conducted. The researchers
concluded that factors associated with the implementation context, such as
organizational support and leadership, interacted with worker skills and attitudes to create differential implementation of the data collection system
(Carrilio et al., 2003). These observations about the differential use of the data
system were not originally conceptualized as part of the study, but emerged as
representing an interesting, and potentially important, factor in program implementation, operation, and quality management. The team was intrigued and
decided to refine the survey instrument to try to obtain a more complete
understanding of the phenomenon we had observed.

Case #2: The Survey of Community Social Workers
In the earlier case situation, we had identified organizational context characteristics, system process characteristics, and attributes of the system as having had
some influence on utilization. We then turned to worker characteristics and
attitudes to better understand the factors that influence practitioner utilization
of information systems. In the follow-up study we hypothesized that workers’
perceptions of data usefulness, perceived ease of use of the data system, and
worker attitudes would influence their use of data and information systems in
practice. A 44-item survey, containing questions about how social workers use
data, how comfortable they feel with data, and how they feel that data are used
in their organization, was mailed to field instructors working in social service
agencies throughout the community.[6] There were 245 responses. A factor
analysis was conducted and four key indices were identified: 1) ease of use; 2)
helpfulness of the available data; 3) attitudes about using data in practice settings; 4) skills and experience with computers and data systems.
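
To make the mechanics of this step concrete, the following is a minimal sketch in Python of an exploratory factor analysis of this kind. The article’s 44 actual survey items are not reproduced here, so the responses below are simulated and all variable names are invented for illustration:

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Simulate 245 respondents answering 44 items driven by 4 latent factors
# (stand-in data; the real survey items are not reproduced in the article).
rng = np.random.default_rng(0)
latent = rng.normal(size=(245, 4))
loadings = rng.normal(scale=0.6, size=(4, 44))
responses = latent @ loadings + rng.normal(scale=0.5, size=(245, 44))

# Extract four factors with a varimax rotation
fa = FactorAnalysis(n_components=4, rotation="varimax").fit(responses)

# Assign each item to the factor on which it loads most heavily;
# each resulting group of items becomes one index
assignments = np.abs(fa.components_).argmax(axis=0)
for factor in range(4):
    n_items = int((assignments == factor).sum())
    print(f"factor {factor}: {n_items} items grouped into one index")
```
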
The respondents were predominantly female (79.2%), and on average had worked in the field of social work for 15 years. Direct
service staff represented 46 percent of the sample, and 54 percent were
supervisors or administrators. About 20 percent of the respondents felt that
their computer skills were not strong. Generally, the respondents felt that
the information systems at their agencies were user friendly and useful,
although 23.7 percent of the respondents did not find data helpful in their
daily work.
The data were analyzed in three stages: 1) a factor analysis; 2) a correlational analysis; 3) a multiple regression analysis.

Factor Analysis Consistent with the literature and with the earlier case situation, four indices were identified: 1) skill and experience with computers (α = .79); 2) ease of use (α = .86); 3) data are useful (α = .88); and 4) attitudes toward data (α = .74). The alpha scores indicate that the items contained in the indices are related, and are measuring the dimensions fairly reliably.

Correlational Analysis A series of correlational analyses were conducted. The
single demographic factor that seemed to correlate with information system
utilization was the respondent’s role in the organization (r = .30). Although this
is a fairly weak relationship, the role in the organization was also correlated with
skill and experience using computers (r = .50), and attitude towards data (r =
.35). The skill and experience index was fairly well correlated to information
system utilization (r = .62), followed by the usefulness of data (r = .49) and ease
of use (r = .37). The index of attitudes toward data was weakly correlated (r = .25) with
information system utilization. All of the correlations were significant at a .001
level.
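
As a sketch of this stage, the pairwise correlations could be computed as follows. The data frame below is simulated, and the column names are stand-ins for the study’s indices rather than its actual variables:

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated respondents (invented data; the study's raw data are not reproduced here)
rng = np.random.default_rng(1)
n = 245
skill = rng.normal(size=n)
df = pd.DataFrame({
    "skill_experience": skill,
    "data_useful": 0.4 * skill + rng.normal(size=n),
    "ease_of_use": rng.normal(size=n),
    "attitudes": rng.normal(size=n),
    "utilization": 0.6 * skill + rng.normal(scale=0.8, size=n),
})

# Pearson correlation of each index with information system utilization
for col in ["skill_experience", "data_useful", "ease_of_use", "attitudes"]:
    r, p = stats.pearsonr(df[col], df["utilization"])
    print(f"{col:18s} r = {r:.2f}  p = {p:.3g}")
```
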

Multiple Regression Analysis An initial multiple regression entering all four indices and the respondent’s role indicated that all of the variables together
explained 50 percent of the variance in utilization of information systems. A
subsequent stepwise regression indicated that the majority of the variance
(R2 = .39) was explained by the skill and experience with computers index. The
‘data are useful’ index increased the explained variance to .46, and adding the
ease of use variable increased the explained variance to .49. Combined, atti-
tudes towards data and role in the organization only accounted for 1 percent of
the variance.
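
The stepwise result can be illustrated with a hierarchical ordinary least squares regression that enters the predictors in the order reported above and tracks the growth in explained variance. The data below are simulated, with effect sizes only loosely patterned on the reported ones; nothing in this sketch reproduces the study’s actual data:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated respondents (invented data loosely echoing the reported pattern)
rng = np.random.default_rng(2)
n = 245
skill = rng.normal(size=n)
useful = 0.4 * skill + rng.normal(size=n)
ease = rng.normal(size=n)
util = 0.55 * skill + 0.25 * useful + 0.15 * ease + rng.normal(scale=0.7, size=n)
df = pd.DataFrame({"skill_experience": skill, "data_useful": useful,
                   "ease_of_use": ease, "utilization": util})

# Enter predictors one at a time and report cumulative R^2
entered = []
for var in ["skill_experience", "data_useful", "ease_of_use"]:
    entered.append(var)
    fit = sm.OLS(df["utilization"], sm.add_constant(df[entered])).fit()
    print(f"+ {var:16s} cumulative R^2 = {fit.rsquared:.2f}")
```
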

Discussion
The community social worker survey and the original findings from a multi-agency evaluation indicate that about 20 percent of the practitioners who
responded did not feel comfortable using computers, and a similar number indicated negative attitudinal factors associated with evaluating practice and utilizing data. The literature would also indicate that leadership and organizational
context can strongly influence attitudes and the availability of supports for practitioners learning how to use data systems (Clapp et al., 1998; Ogborne, 1998;
Sluyter, 1998). The importance of these findings lies in the ways that worker
characteristics and organizational characteristics interact and influence the
degree to which staff are open to and able to use data in practice environments.
On the positive side, the data indicate that the respondents to the survey were
generally open to using data in practice. However, the 20 percent who demonstrated negative attitudes and disinterest in using information systems did so
consistently.
The findings indicate that, contrary to initial expectations, worker attitudes
were less important in determining utilization than were skill and experience
or a sense that the data being produced were helpful. The first case, describing
the 17-agency evaluation and initial study of information system underutilization, had pointed up the possible influence of leadership, worker attitudes, and
organizational context. Overall, the results of this multiple case study support
the findings of Monnickendam (1999) and Despont-Gros and her colleagues
(2005) in identifying the following as key components in worker utilization of
information systems:
1) Organizational context and supports
2) Administrator and practitioner attitudes towards data
3) Ease of use of the information system
4) Helpfulness (utility) of data produced by the information system
5) Skills and experience of users
As in the Monnickendam study (1999), the results of our follow-up survey of community social workers support the idea that system characteristics and utility of the product, along with worker skill and comfort with the system, lead to increased utilization. In the first case situation there was
a fair amount of information about organizational context, and, therefore, it is
not surprising that this emerged as an important element of utilization. Both
Despont-Gros and Monnickendam had also identified context as a factor in
utilization. The current findings support the previous findings regarding the key
dimensions associated with utilization of information systems. In order to better
understand how the identified factors interact, future research will need to
consider structural (PATH) modeling.
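
To make that suggestion concrete, path (structural) models can be estimated as a system of regressions. The sketch below fits one hypothetical model on simulated data: role influencing skill and experience, which in turn influences utilization, with ease of use as a second direct predictor. The model specification and variable names are invented for illustration, not a model tested in the study:

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Simulated data for a hypothetical path model (invented; not the study's data)
rng = np.random.default_rng(3)
n = 245
role = rng.integers(0, 2, size=n).astype(float)  # 0 = direct service, 1 = supervisor/admin
skill = 0.5 * role + rng.normal(size=n)
ease = rng.normal(size=n)
util = 0.6 * skill + 0.2 * ease + rng.normal(scale=0.7, size=n)
df = pd.DataFrame({"role": role, "skill": skill, "ease": ease, "util": util})

# Estimate each structural equation with OLS
to_skill = sm.OLS(df["skill"], sm.add_constant(df[["role"]])).fit()
to_util = sm.OLS(df["util"], sm.add_constant(df[["skill", "ease"]])).fit()

print("role  -> skill:", round(to_skill.params["role"], 2))
print("skill -> util :", round(to_util.params["skill"], 2))
print("ease  -> util :", round(to_util.params["ease"], 2))
# Indirect effect of role on utilization via skill (product of path coefficients)
print("role  -> util (indirect):", round(to_skill.params["role"] * to_util.params["skill"], 2))
```
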
The question of generalizability and external validity of results can be
addressed by a combination of sampling, use of multivariate statistics, and use
of cross-sectional designs (Bernard, 2000; Rubin and Babbie, 2005). In the analysis described here, the conceptualization of different cross-sectional
studies as ‘cases’ (Yin, 1993, 1994), while not a replication per se, does allow
instances of what is posed as the same phenomenon to be examined similarly.
The sample for this analysis, which actually consists of staff at the 17 organizations and 245 community social workers, was not pre-planned to be representative. Yet because multiple agencies in diverse geographic areas and with very
different auspices and funding were included in the case studies, it can be
argued that the sample is likely to be fairly representative of practitioners. One
concern might be whether these practitioners represent California, the US, or
all social service practitioners. For this reason, next steps might well include
utilizing the survey and a case analysis elsewhere in the US and in other
countries.
Ultimately, however, this additional investment would not be worth the
effort unless the question of ‘so what?’ is addressed. While many of the results
described here are not novel or surprising, what is new is the insight that comes
from supporting anecdotal and practice wisdom conclusions with more systematic data. The fact that the results show the obvious, that there needs to be
dialogue among the practitioners, the people in charge, and the researchers, is
a verification, using somewhat more positivist measures, of meaning and knowledge that already exists. However, in an era of positivism and competition for scarce dollars, where there are demands to demonstrate that what we ‘know’ is, in fact, of value, finding ways to demonstrate these insights empirically can be an important survival tool.[7] Analyses such as this one are not conceptualized as breaking
new ground, but rather, as providing social service organizations and practitioners with tools that allow them to look at their work and their challenges
more systematically. Although some of these findings seem time-worn, they
have not found their way into practice; all too often there is a mutual finger-pointing approach to dealing with the challenges of data collection in social
service agencies. The current article offers some simple empirical data to move
the discussion from a personal to a professional level, and provides support for
the need to focus first and foremost on dialogue and meaning in order to build
a framework of trust within which it is possible to study practice quality and
effectiveness.

Conclusion
In order to effectively manage and evaluate program and client level interventions it is important to be able to develop an organizational context that
supports self-reflective practice. To the extent that organizations provide
support and training on utilizing data for practice, direct practitioners are more
likely to embrace an approach to practice consistent with evidence-based principles. Conversely, as practitioners feel put out, inconvenienced, or threatened by these systems, they are less likely to effectively incorporate a self-reflective approach to their practice.
The findings of the case situations described here indicate that administrators and educators should take care to avoid mechanistic, overly positivist
approaches and focus on developing and supporting skill development, safety,
and trust as prerequisites to asking practitioners to embrace a self-reflective,
evidence-based approach.
Evidence-based practice is dependent upon such a reflective approach,
which, in turn, can be greatly enhanced by developing good practice level information about exactly what is happening, and what planned-for changes in
clients might be taking place. The multiple case study presented here indicates
that there are contextual and instrumental steps that can facilitate the use of
data in direct practice: 1) create an organizational context that is supportive and
honestly encourages critical examination of practice for the purposes of
improving quality and effectiveness; 2) introduce an information system that is
easy to use and provides easily accessible information that the practitioner finds
helpful in his/her daily work; 3) offer a program of staff training and on-going
support to help practitioners maintain a sense of efficacy in using the data
system; 4) maintain the data collection system so that it keeps up with technology, changing needs, changing external expectations, and responses to input
from the users.
Sadly, while these conclusions should be both obvious and a regular part of
quality practice in social service organizations, this is not always the case. In the
25 agencies associated with this analysis, there was tremendous variance in
organizational commitment to effective practice and to developing a
rapprochement between the direct practice paradigm and the paradigms of
researchers, administrators, and policy-makers. Serious consideration needs to
be given to an on-going dialogue regarding evidence-based practice such that
practitioners can ‘own’ it. The findings of the two cases presented here illustrate
some prerequisites for accomplishing one of the subtasks (data collection)
associated with evidence-based practice. However, the enterprise needs to be
approached holistically and grounded in a respectful dialogue about the key
elements of practice and what constitutes evidence.

Notes
The data utilized in this study are available at [http://www-rohan.sdsu.edu/~socwrk1/].

1. The term ‘practitioner’ is used here to refer to individuals performing direct service
activities in social service settings. The term encompasses clinical and case
management activities that are often referred to as ‘direct practice’. Practitioners
include both Bachelors level workers and Masters level workers.
2. The two case instances reviewed in this article occurred in the US, where the complex
public–private funding structures and a tendency towards residual social service
organization strongly influence practice. Short-term, fashion-of-the-day funding,
along with underlying ambivalence about institutionalizing programs for the
vulnerable, lead to reduced continuity and a tendency for programs to lose their core
in an effort to please those with the power to de-fund them. The term ‘funders’ can
include government agencies, foundations, and private philanthropy. ‘Policy-makers’
refers to elected representatives, political appointees, and government bureaucracies with
authority to offer grants and contracts. Recipients can be non-profit agencies,
government agencies, and for-profit service-providing agencies.
3. Evaluation data can be collected at a variety of levels. The reference here is to client
level data, which describes what happens to individual clients and groups of clients. It
is also possible to look at data from a programmatic, organizational, or community
perspective. The term ‘process data’ refers to data that simply records what is done.
The term ‘outcome data’ refers to efforts to look at the impact and effectiveness of
an intervention.
4. See note 1. In social service programs in the US, it can reasonably be said that ‘form
follows finance’, meaning that program models, professional practice, and even client
need may be subsumed under the ideology and structure of the funding process.
5. There were eight independent organizations studied during the earlier evaluations,
and 17 in the case described here. Of these 25 organizations, the majority were non-profits, although some were public agencies, such as county child welfare, health, or
educational offices. It is important to note that in the US funding context described
here, public and private organizations can compete for public funds from a
governmental entity that supersedes the competing agency. They can also compete
for private foundation and philanthropic dollars. For a description of the multi-layered, pluralistic approach to the social services and social welfare in the United
States, see Karger and Stoesz (2006).
6. The field instructors were used because they represented a wide range of experiences
and agency types. Participation was both anonymous and voluntary.
7. This sense of needing to struggle to survive in the human services may be
exaggerated in the US because of over 25 years of devolving public responsibility and
‘privatization’ of the kinds of public social welfare services that in many developed
countries are taken for granted.

References
Ames, N. (1999) ‘Social Work Recording: A New Look at an Old Issue’, Journal of
Social Work Education 35(2): 227–37.
Barrett, S. (1999) ‘Information Systems: An Exploration of the Factors Influencing
Effective Use’, Journal of Research on Computing in Education 32(1): 4–17.
Bernard, H.R. (2000) Social Research Methods. Los Angeles, CA: SAGE.
Carrilio, T. (2003) ‘Learning from Experience? A Review of Three California
Initiatives Addressing the Needs of Vulnerable Families’, Social Policy Journal 2
(2/3): 5–25.
Carrilio, T. (2005a) ‘Management Information Systems: Why are They Underutilized in
the Social Services?’, Administration in Social Work 29(2): 43–61.
Carrilio, T.E. (2005b) ‘Looking Inside the “Black Box”: A Methodology for Measuring
Program Implementation and Informing Social Services Policy Decisions’, Social
Policy Journal 4(3/4): 1–17.
Carrilio, T., Kasser, J. and Moretto, A. (1985) ‘Management Information Systems: Who is
in Charge?’, Social Casework: The Journal of Contemporary Social Work 66(7):
417–23.
Carrilio, T., Packard, T. and Clapp, J. (2003) ‘Nothing In – Nothing Out: Barriers to Data
Based Program Planning’, Administration in Social Work 27(4): 61–75.
Clapp, J.D., Burke, C. and Stanger, L. (1998) ‘The Institutional Environment, Strategic
Response and Program Adaptation: A Case Study’, Journal of Applied Social
Sciences 22(1): 87–95.
Despont-Gros, C., Mueller, H. and Lovis, C. (2005) ‘Evaluating User Interactions with
Clinical Information Systems: A Model Based on Human-Computer Interaction
Models’, Journal of Biomedical Informatics 38: 244–55.
Dorsey, D. (2002) ‘Information Technology’, in J. Hedge and E. Pulakos (eds)
Implementing Organizational Interventions, pp. 110–32. San Francisco, CA:
Jossey-Bass.
Fitch, D. (2005) ‘The Diffusion of Information Technology in the Human Services:
Implications for Social Work Education’, Journal of Teaching in Social Work 25(1/2):
191–204.
Fitzgerald, B. and Murphy, C. (1994) ‘Introducing Executive Information Systems into
Organizations: Separating Fact from Fallacy’, Journal of Information Technology 9:
288–96.
Gambrill, E. (1999) ‘Evidence-based Practice: An Alternative to Authority-based
Practice’, Families in Society 80(4): 341–50.
Gambrill, E. (2003) ‘Evidence-based Practice: Sea Change or the Emperor’s New
Clothes?’, Journal of Social Work Education 39(1): 2–23.
Gray, M. and McDonald, C. (2006) ‘Pursuing Good Practice?’, Journal of Social Work
6(1): 7–20.
Hasenfeld, Y. and Patti, R. (1992) ‘The Utilization of Research in Administrative
Practice’, in A. Grasso and I. Epstein (eds) Research Utilization in the Social Services,
pp. 221–39. New York: The Haworth Press.
Hernandez, M. (2000) ‘Using Logic Models and Program Theory to Build Outcome
Accountability’, Education and Treatment of Children 23(1): 24–40.
Hodges, S. and Hernandez, M. (1999) ‘How Organizational Culture Influences Outcome
Information Utilization’, Evaluation and Program Planning 22(2): 183–97.
Horsch, K. (1996) ‘Results-based Accountability Systems: Opportunities and
Challenges’, The Evaluation Exchange 2(1): 2–3, website of the Harvard Family
Research Project, available online at: [http://www.gse.harvard.edu/hfrp/eval/issue3/
theory1.html], accessed January 2007.
Howard, M., McMillen, C. and Pollio, D. (2003) ‘Teaching Evidence-Based Practice:
Toward a New Paradigm for Social Work Education’, Research on Social Work
Practice 13(2): 234–59.
Karger, H. and Stoesz, D. (2006) American Social Welfare Policy: A Pluralist Approach,
5th edn. Boston, MA: Allyn & Bacon.
Kettner, P., Moroney, R. and Martin, L. (1999) ‘Building a Management Information
System’, in Designing and Managing Programs: An Effectiveness Based Approach,
2nd edn, pp. 139–69. Los Angeles, CA: SAGE.
Landsverk, J., Carrilio, T., Connelly, C. and Ganger, W. (2001) ‘Healthy Families San
Diego: Final Technical Report’, submitted to the California Department of Social
Services, The Wellness Foundation and The Stuart Foundation, Children’s Hospital
Child and Adolescent Services Research Center, San Diego.
Lederer, A. and Mendelow, A. (1988) ‘Convincing Top Management of the Strategic
Potential of Information Systems’, MIS Quarterly December: 525–34.
Lewis, J., Lewis, M., Packard, T. and Souflee, F. (2001) ‘Designing and Using Information
Systems’, in Management of Human Service Programs, pp. 209–34. Pacific Grove, CA:
Brooks/Cole.
Monnickendam, M. (1999) ‘Computer Systems that Work: A Review of Variables
Associated with System Use’, Journal of Social Service Research 26(2): 71–94.
Monnickendam, M. and Eaglstein, A.S. (1993) ‘Computer Acceptance by Social
Workers: Some Unexpected Research Findings’, Computers in Human Services
9(3/4): 409–24.
Mutschler, E. (1992) ‘Computers in Agency Settings’, in A. Grasso and I. Epstein (eds)
Research Utilization in the Social Services, pp. 325–44. New York: The Haworth
Press.
Mutschler, E. and Hasenfeld, Y. (1986) ‘Integrated Information Systems for Social Work
Practice’, Social Work Sept/Oct: 345–9.
Ogborne, A., Braun, K. and Rush, B. (1998) ‘Developing an Integrated Information
System for Specialized Addiction Treatment Agencies’, Journal of Behavioral Health
Services & Research 25(1): 100–09.
Pollio, D. (2006) ‘The Art of Evidence-Based Practice’, Research on Social Work
Practice 16(2): 224–32.
Poole, D., Nelson, J., Carnahan, S., Chepenick, N.G. and Tubiak, C. (2000) ‘Evaluating
Performance Measurement Systems in Nonprofit Agencies: The Program
Accountability Quality Scale (PAQS)’, American Journal of Evaluation 21(1): 15–26.
Qureschi, H. (2004) ‘Evidence in Policy and Practice: What Kinds of Research
Designs?’, Journal of Social Work 4(1): 7–23.
Rossi, P., Freeman, H. and Lipsey, M. (1999) Evaluation: A Systematic Approach, 6th
edn. Los Angeles, CA: SAGE.
Rubin, A. and Babbie, E. (2005) Research Methods for Social Work. Belmont, CA:
Wadsworth, Brooks/Cole.
Schoech, R. (1995) ‘Information Systems’, in R. Edwards (ed.) Encyclopedia of Social
Work, 19th edn, pp. 1470–9. Washington, DC: NASW Press.
Schoech, R. (1999) Human Services Technology: Understanding, Designing, and
Implementing Computer and Internet Applications in the Social Services. New York:
The Haworth Press.
Scheirer, M.A. (2000) ‘Getting More “Bang” for Your Performance Measures Buck’,
American Journal of Evaluation 21(2): 139–49.
Sluyter, G. (1998) Improving Organizational Performance (#74 in the SAGE Human
Services Guides Series). Los Angeles, CA: SAGE.
Webb, S. (2001) ‘Some Considerations on the Validity of Evidence-Based Practice in
Social Work’, British Journal of Social Work 31: 57–79.
Yin, R. (1993) Applications of Case Study Research. Los Angeles, CA: SAGE.
Yin, R. (1994) Case Study Research: Design and Methods, 2nd edn. Los Angeles, CA:
SAGE.

TERRY CARRILIO is an Assistant Professor of Social Work at San Diego State University. She has worked in a variety of social work practice areas, ranging
from direct clinical practice to administration, for over 30 years. Most recently she
has focused on program development and evaluation in family support and
prevention programs. Her book, Home Visiting: A Case Management Strategy for
Supporting Vulnerable Families, was published by the University of South Carolina
Press in the summer of 2007. Current interests involve implementing theoretically,
clinically, and empirically sound prevention programs and international social
development. Address: School of Social Work, San Diego State University, 5500
Campanile Drive, San Diego, CA 92182–4119, USA. [email: tbear10009@aol.com]
