8(2): 135–148
Copyright © 2008 Sage Publications: Los Angeles, London, New Delhi and Singapore
www.sagepublications.com

Accountability, Evidence, and the Use of Information Systems in Social Service Programs

Terry E. Carrilio
San Diego State University, USA
Abstract
• Summary: As social work engages with the ideology of evidence-based
practice it becomes important to accurately document service activities
and outcomes. This often proves problematic, as utilization of systems to
collect data for evaluation is fraught with ideological, epistemological,
and skill-based difficulties. This article describes a ‘multiple case study’
consisting of: 1) a multi-agency evaluation with inconsistent implementation
of a data collection system; and 2) a follow-up cross-sectional study of
social workers’ use of computers and data systems.
• Findings: Four components related to practitioner utilization of data
systems were identified: skills and experience with using computers,
perceived ease of use, utility of the data, and attitudes about data. The
latter may point to underlying epistemological and ontological issues
regarding evidence-based practice in direct service settings.
• Applications: It is important to understand the interacting personal,
professional, and organizational factors that influence social workers’
use of information systems. The findings suggest that improving worker
skill and comfort with data systems as well as maintaining an open
dialogue about how data will be used may be key components of efforts
to improve practitioner utilization of data systems.
Introduction
Evidence-based practice is an important recent trend in social work practice
and education, consistent with similar trends in other clinically based
professions such as medicine and psychology (Gambrill, 1999; Pollio, 2006).
Downloaded from jsw.sagepub.com at The University of Iowa Libraries on May 27, 2015
(Hasenfeld and Patti, 1992; Monnickendam and Eaglstein, 1993; Sluyter, 1998). Practitioners need to recognize the relationship between what they are being asked to document and the underlying theories of change and program goals (Hernandez, 2000; Scheirer, 2000; Rossi et al., 2000). Information system developers must recognize that there may not be agreement about what constitutes evidence, leading to disagreements about what data are needed and how these data are to be understood (Clapp et al., 1998; Webb, 2001). It is incumbent upon the developers to make it as easy as possible for direct practitioners to document their activities in a way that is non-intrusive and not overly burdensome (Barrett, 1999; Carrilio, 2005; Hodges and Hernandez, 1999; Schoech, 1995).
A number of researchers have identified key elements in the acceptance and utilization of information systems by practitioners. Despont-Gros and her colleagues have proposed five dimensions associated with utilization of information systems: user characteristics, information system characteristics, contextual and organizational characteristics, process characteristics, and system impacts (Despont-Gros et al., 2005). In Monnickendam’s study of computer system utilization, process variables, system attributes, and user variables were strong predictors of utilization, while organizational context and worker attitudes were weak predictors (Monnickendam, 1999). Using data systems in practice settings can very well represent a change of culture for established practitioners (Carrilio et al., 2003), and may represent a non-acceptance of the currently popular trend toward evidence-based practice among researchers and educators (Pollio, 2006). Even when data collection systems are developed following ‘best practice’ guidelines in the literature, more dialogue may be needed in order to assure the successful use of data by practitioners (Carrilio, 2005; Lederer and Mendelow, 1988; Rossi et al., 1999). In the analysis that follows, researchers on a large multi-agency evaluation project identified a problem with information system utilization by practitioners and subsequently looked more deeply to identify some of the possible reasons.
Method
Using a ‘multiple case study’ approach (Yin, 1994), the current analysis considers two ‘case’ situations, identified as follows: 1) Case 1 comprises observations regarding the utilization of information systems in a multi-agency evaluation. It should be noted that these observations were made during the course of a much larger study. The ‘case’ here represents the phenomenon of underutilization and differential data reporting unexpectedly identified during the project, and an initial effort to understand it. 2) Case 2 treats the follow-up survey of community practitioners as a cross-sectional instance of information regarding practitioner utilization of data and data systems.
Each of the ‘cases’ represents a unit that can be analyzed separately yet can perhaps offer more by being examined jointly. A ‘multiple case’ study (Yin, 1994) is an approach that utilizes several case studies in order to enhance external validity and representativeness. Generally, a case study framework is used when the phenomenon being investigated is closely intertwined with its context (Yin, 1993). In looking at how practitioners utilize information systems and the resulting data, the phenomenon is embedded in personal and organizational factors that are difficult to differentiate. Additionally, the funding and political context within which programs operate⁴ makes it difficult to sort out the relative contributions of individual and organizational factors to the overall operation and evaluation of a given program. A ‘case’ represents an analysis of a phenomenon or process at a particular point in time. A multiple case study approach permits a cross-sectional analysis of related phenomena at multiple points in time, and offers a way to embed quantitative data in its qualitative context.
that meet information needs (Kettner et al., 1999; Lewis et al., 2001; Schoech,
1995); 5) process plan that clarifies accountability and procedures (Kettner et
al., 1999; Lewis et al., 2001).
Despite considerable support, and apparent fiscal and organizational benefits to collecting client level data, we found variable utilization at the different agency sites, including some sites with extremely low levels of information system utilization (Carrilio, 2005a, 2005b; Carrilio et al., 2003). Interestingly, the information system was used differentially, in that workers were more likely to complete process data (who did what, with whom, how much, when, where, and for how long), but tended to neglect outcome measures. Organizational context, administrative and practitioner attitudes, and organizational supports were identified as key components of information system adoption. The impact of attitudes, skills, and experience with information systems initially emerged as potential determinants of use.
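The differential completion pattern described above (process fields completed, outcome measures neglected) is the kind of signal a simple audit of sampled records can surface. A minimal sketch using simulated records (the site count, field categories, and completion rates are invented for illustration, not the study's figures):

```python
import numpy as np

# Simulated record audit: each sampled client record is flagged for whether
# its process fields and its outcome measures were completed. All values
# below are illustrative assumptions, not the study's data.
rng = np.random.default_rng(2)
sites = np.repeat(np.arange(3), 50)        # 3 hypothetical sites, 50 records each
process_done = rng.random(150) < 0.85      # process data: mostly completed
outcome_done = rng.random(150) < np.where(sites == 2, 0.15, 0.45)  # outcomes neglected

# Per-site completion rates make differential use visible at a glance.
for site in np.unique(sites):
    mask = sites == site
    print(f"site {site}: process {process_done[mask].mean():.0%}, "
          f"outcome {outcome_done[mask].mean():.0%}")
```

Comparing the two rates per site, rather than pooling across the project, is what exposes the site-level variability the evaluation team encountered.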
The variable use of information systems, in addition to limiting the availability of outcome data for use in the evaluation, seemed counterintuitive, since most of the agencies were seeking additional funding when the project ended and could have used evidence demonstrating the effectiveness of the intervention. In order to gain a better understanding of this phenomenon, a survey of staff at each of the 17 agencies and a review of random records from each agency site were conducted. The researchers concluded that factors associated with the implementation context, such as organizational support and leadership, interacted with worker skills and attitudes to create differential implementation of the data collection system (Carrilio et al., 2003). These observations about the differential use of the data system were not originally conceptualized as part of the study, but emerged as an interesting, and potentially important, factor in program implementation, operation, and quality management. The team was intrigued and decided to refine the survey instrument to try to obtain a more complete understanding of the phenomenon we had observed.
Factor Analysis
Consistent with the literature and with the earlier case situation, four indices were identified: 1) skill and experience with computers (α = .79); 2) ease of use (α = .86); 3) data are useful (α = .88); and 4) attitudes toward data (α = .74). The alpha scores indicate that the items contained in the indices are related and are measuring the dimensions fairly reliably.
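The alpha scores reported here are Cronbach's alpha coefficients. As a sketch of what such a reliability coefficient measures, a minimal implementation over a simulated four-item index (the data are invented, not the survey's):

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of items in the index
    item_vars = items.var(axis=0, ddof=1)      # per-item variances
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Simulated four-item index: a shared latent trait plus item noise produces
# the inter-item correlation that alpha summarizes.
rng = np.random.default_rng(0)
trait = rng.normal(size=(200, 1))
responses = trait + rng.normal(scale=0.5, size=(200, 4))
alpha = cronbach_alpha(responses)
```

An alpha near 1 indicates that the items move together and can reasonably be summed into a single index; values in the .74–.88 range, as reported above, are conventionally treated as acceptable to good reliability.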
Discussion
The community social worker survey and the original findings from the multi-agency evaluation indicate that about 20 percent of the practitioners who responded did not feel comfortable using computers, and a similar number indicated negative attitudinal factors associated with evaluating practice and utilizing data. The literature also indicates that leadership and organizational context can strongly influence attitudes and the availability of supports for practitioners learning how to use data systems (Clapp et al., 1998; Ogborne, 1998; Sluyter, 1998). The importance of these findings lies in the ways that worker characteristics and organizational characteristics interact to influence the degree to which staff are open to and able to use data in practice environments. On the positive side, the data indicate that the respondents to the survey were generally open to using data in practice. However, the 20 percent who demonstrated negative attitudes and disinterest in using information systems did so consistently.
The findings indicate that, contrary to initial expectations, worker attitudes were less important in determining utilization than were skill and experience or a sense that the data being produced were helpful. The first case, describing the 17-agency evaluation and initial study of information system underutilization, had pointed up the possible influence of leadership, worker attitudes, and organizational context. Overall, the results of this multiple case study support the findings of Monnickendam (1999) and Despont-Gros and her colleagues (2005) in identifying the following as key components in worker utilization of information systems:
1) Organizational context and supports
2) Administrator and practitioner attitudes towards data
3) Ease of use of the information system
4) Helpfulness (utility) of data produced by the information system
5) Skills and experience of users
As in the Monnickendam study (1999), the results of our follow-up survey of community social workers support the idea that system characteristics and the utility of the product, along with worker skill and comfort with the system, lead to increased utilization. In the first case situation there was a fair amount of information about organizational context, so it is not surprising that this emerged as an important element of utilization. Both Despont-Gros and Monnickendam had also identified context as a factor in utilization. The current findings support the previous findings regarding the key dimensions associated with utilization of information systems. In order to better understand how the identified factors interact, future research will need to consider structural (path) modeling.
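Such a structural (path) model amounts to a set of simultaneous regression equations, one per endogenous variable. A hedged sketch on simulated data (the causal ordering, skill feeding perceived ease of use feeding utilization, is an assumption for illustration, not a model estimated from the study's data):

```python
import numpy as np

# Hypothetical causal chain, assumed purely for illustration:
#   skill -> perceived ease of use -> utilization, plus a direct skill effect.
rng = np.random.default_rng(1)
n = 500
skill = rng.normal(size=n)
ease = 0.6 * skill + rng.normal(scale=0.8, size=n)
use = 0.5 * ease + 0.2 * skill + rng.normal(scale=0.7, size=n)

def standardize(x: np.ndarray) -> np.ndarray:
    return (x - x.mean()) / x.std(ddof=1)

skill, ease, use = map(standardize, (skill, ease, use))

# Path analysis as one least-squares regression per endogenous variable.
a = np.linalg.lstsq(skill[:, None], ease, rcond=None)[0][0]                 # skill -> ease
b, c = np.linalg.lstsq(np.column_stack([ease, skill]), use, rcond=None)[0]  # ease -> use, skill -> use

indirect = a * b        # skill's effect on utilization mediated by ease of use
total = indirect + c    # mediated plus direct effect
```

The product a * b estimates the mediated path; comparing it with the direct coefficient c is how path modeling would begin to separate the interacting factors that a cross-sectional survey can only correlate.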
The question of generalizability and external validity of results can be
addressed by a combination of sampling, use of multivariate statistics, and use
Conclusion
In order to effectively manage and evaluate program and client level interventions it is important to be able to develop an organizational context that supports self-reflective practice. To the extent that organizations provide support and training on utilizing data for practice, direct practitioners are more likely to embrace an approach to practice consistent with evidence-based principles. Conversely, as practitioners feel put out, inconvenienced, or threatened
Notes
The data utilized in this study are available at [http://www-rohan.sdsu.edu/~socwrk1/].
1. The term ‘practitioner’ is used here to refer to individuals performing direct service
activities in social service settings. The term encompasses clinical and case
management activities that are often referred to as ‘direct practice’. Practitioners
include both Bachelors level workers and Masters level workers.
2. The two case instances reviewed in this article occurred in the US, where the complex
public–private funding structures and a tendency towards residual social service
organization strongly influence practice. Short-term, fashion-of-the-day funding, along with underlying ambivalence about institutionalizing programs for the vulnerable, leads to reduced continuity and a tendency for programs to lose their core
in an effort to please those with the power to de-fund them. The term ‘funders’ can
include government agencies, foundations, and private philanthropy. ‘Policy-makers’
refers to elected representatives, political appointees, and government bureaucracies with authority to offer grants and contracts. Recipients can be non-profit agencies,
government agencies, and for-profit service-providing agencies.
3. Evaluation data can be collected at a variety of levels. The reference here is to client
level data, which describes what happens to individual clients and groups of clients. It
is also possible to look at data from a programmatic, organizational, or community
perspective. The term ‘process data’ refers to data that simply records what is done.
The term ‘outcome data’ refers to efforts to look at the impact and effectiveness of
an intervention.
4. See note 1. In social service programs in the US, it can reasonably be said that ‘form
follows finance’, meaning that program models, professional practice, and even client
need may be subsumed under the ideology and structure of the funding process.
5. There were eight independent organizations studied during the earlier evaluations,
and 17 in the case described here. Of these 25 organizations, the majority were non-
profits, although some were public agencies, such as county child welfare, health, or
educational offices. It is important to note that in the US funding context described
here, public and private organizations can compete for public funds from a
governmental entity that supersedes the competing agency. They can also compete
for private foundation and philanthropic dollars. For a description of the multi-
layered, pluralistic approach to the social services and social welfare in the United
States, see Karger and Stoesz (2006).
6. The field instructors were used because they represented a wide range of experiences
and agency types. Participation was both anonymous and voluntary.
7. This sense of needing to struggle to survive in the human services may be
exaggerated in the US because of over 25 years of devolving public responsibility and
‘privatization’ of the kinds of public social welfare services that in many developed
countries are taken for granted.
References
Ames, N. (1999) ‘Social Work Recording: A New Look at an Old Issue’, Journal of
Social Work Education 35(2): 227–37.
Barrett, S. (1999) ‘Information Systems: An Exploration of the Factors Influencing
Effective Use’, Journal of Research on Computing in Education 32(1): 4–17.
Bernard, H.R. (2000) Social Research Methods. Los Angeles, CA: SAGE.
Carrilio, T. (2003) ‘Learning from Experience? A Review of Three California
Initiatives Addressing the Needs of Vulnerable Families’, Social Policy Journal 2
(2/3): 5–25.
Carrilio, T. (2005a) ‘Management Information Systems: Why are They Underutilized in
the Social Services?’, Administration in Social Work 29(2): 43–61.
Carrilio, T.E. (2005b) ‘Looking Inside the “Black Box”: A Methodology for Measuring
Program Implementation and Informing Social Services Policy Decisions’, Social
Policy Journal 4(3/4): 1–17.
Carrilio, T., Kasser, J. and Moretto, A. (1985) ‘Management Information Systems: Who is
in Charge?’, Social Casework: The Journal of Contemporary Social Work 66(7):
417–23.
Carrilio, T., Packard, T. and Clapp, J. (2003) ‘Nothing In – Nothing Out: Barriers to Data
Based Program Planning’, Administration in Social Work 27(4): 61–75.
Clapp, J.D., Burke, C. and Stanger, L. (1998) ‘The Institutional Environment, Strategic
Response and Program Adaptation: A Case Study’, Journal of Applied Social
Sciences 22(1): 87–95.
Despont-Gros, C., Mueller, H. and Lovis, C. (2005) ‘Evaluating User Interactions with
Clinical Information Systems: A Model Based on Human-Computer Interaction
Models’, Journal of Biomedical Informatics 38: 244–55.
Dorsey, D. (2002) ‘Information Technology’, in J. Hedge and E. Pulkalos (eds)
Implementing Organizational Interventions, pp. 110–32. San Francisco, CA:
Jossey-Bass.
Fitch, D. (2005) ‘The Diffusion of Information Technology in the Human Services:
Implications for Social Work Education’, Journal of Teaching in Social Work 25(1/2):
191–204.
Fitzgerald, B. and Murphy, C. (1994) ‘Introducing Executive Information Systems into
Organizations: Separating Fact from Fallacy’, Journal of Information Technology 9:
288–96.
Gambrill, E. (1999) ‘Evidence-based Practice: An Alternative to Authority-based
Practice’, Families in Society 80(4): 341–50.
Gambrill, E. (2003) ‘Evidence-based Practice: Sea Change or the Emperor’s New
Clothes?’, Journal of Social Work Education 39(1): 2–23.
Gray, M. and McDonald, C. (2006) ‘Pursuing Good Practice?’, Journal of Social Work
6(1): 7–20.
Hasenfeld, Y. and Patti, R. (1992) ‘The Utilization of Research in Administrative
Practice’, in A. Grasso and I. Epstein (eds) Research Utilization in the Social Services,
pp. 221–39. New York: The Haworth Press.
Hernandez, M. (2000) ‘Using Logic Models and Program Theory to Build Outcome
Accountability’, Education and Treatment of Children 23(1): 24–40.
Hodges, S. and Hernandez, M. (1999) ‘How Organizational Culture Influences Outcome
Information Utilization’, Evaluation and Program Planning 22(2): 183–97.
Horsch, K. (1996) ‘Results-based Accountability Systems: Opportunities and
Challenges’, The Evaluation Exchange 2(1): 2–3, website of the Harvard Family
Research Project, available online at: [http://www.gse.harvard.edu/hfrp/eval/issue3/
theory1.html], accessed January 2007.
Howard, M., McMillen, C. and Pollio, D. (2003) ‘Teaching Evidence-Based Practice:
Toward a New Paradigm for Social Work Education’, Research on Social Work
Practice 13(2): 234–59.
Karger, H. and Stoesz, D. (2006) American Social Welfare Policy: A Pluralist Approach,
5th edn. Boston, MA: Allyn & Bacon.
Kettner, P., Moroney, R. and Martin, L. (1999) ‘Building a Management Information
System’, in Designing and Managing Programs: An Effectiveness Based Approach,
2nd edn, pp. 139–69. Los Angeles, CA: SAGE.
Landsverk, J., Carrilio, T., Connelly, C. and Ganger, W. (2001) ‘Healthy Families San
Diego: Final Technical Report’, submitted to the California Department of Social
Services, The Wellness Foundation and The Stuart Foundation, Children’s Hospital
Child and Adolescent Services Research Center, San Diego.
Lederer, A. and Mendelow, A. (1988) ‘Convincing Top Management of the Strategic
Potential of Information Systems’, MIS Quarterly December: 525–34.
Lewis, J., Lewis, M., Packard, T. and Souflee, F. (2001) ‘Designing and Using Information
Systems’, in Management of Human Service Programs, pp. 209–34. Pacific Grove, CA:
Brooks/Cole.
Monnickendam, M. (1999) ‘Computer Systems that Work: A Review of Variables
Associated with System Use’, Journal of Social Service Research 26(2): 71–94.
Monnickendam, M. and Eaglstein, A.S. (1993) ‘Computer Acceptance by Social
Workers: Some Unexpected Research Findings’, Computers in Human Services
9(3/4): 409–24.
Mutschler, E. (1992) ‘Computers in Agency Settings’, in A. Grasso and I. Epstein (eds)
Research Utilization in the Social Services, pp. 325–44. New York: The Haworth
Press.
Mutschler, E. and Hasenfeld, Y. (1986) ‘Integrated Information Systems for Social Work
Practice’, Social Work Sept/Oct: 345–9.
Ogborne, A., Braun, K. and Rush, B. (1998) ‘Developing an Integrated Information
System for Specialized Addiction Treatment Agencies’, Journal of Behavioral Health
Services & Research 25(1): 100–09.
Pollio, D. (2006) ‘The Art of Evidence-Based Practice’, Research on Social Work
Practice 16(2): 224–32.
Poole, D., Nelson, J., Carnahan, S., Chepenick, N.G. and Tubiak, C. (2000) ‘Evaluating
Performance Measurement Systems in Nonprofit Agencies: The Program
Accountability Quality Scale (PAQS)’, American Journal of Evaluation 21(1): 15–26.
Qureshi, H. (2004) ‘Evidence in Policy and Practice: What Kinds of Research
Designs?’, Journal of Social Work 4(1): 7–23.
Rossi, P., Freeman, H. and Lipsey, M. (1999) Evaluation: A Systematic Approach, 6th
edn. Los Angeles, CA: SAGE.
Rubin, A. and Babbie, E. (2005) Research Methods for Social Work. Belmont, CA:
Wadsworth, Brooks/Cole.
Schoech, R. (1995) ‘Information Systems’, in R. Edwards (ed.) Encyclopedia of Social
Work, 19th edn, pp. 1470–9. Washington, DC: NASW Press.
Schoech, R. (1999) Human Services Technology: Understanding, Designing, and
Implementing Computer and Internet Applications in the Social Services. New York:
The Haworth Press.
Scheirer, M.A. (2000) ‘Getting More “Bang” for Your Performance Measures Buck’,
American Journal of Evaluation 21(2): 139–49.
Sluyter, G. (1998) Improving Organizational Performance (#74 in the SAGE Human
Services Guides Series). Los Angeles, CA: SAGE.
Webb, S. (2001) ‘Some Considerations on the Validity of Evidence-Based Practice in
Social Work’, British Journal of Social Work 31: 57–79.
Yin, R. (1993) Applications of Case Study Research. Los Angeles, CA: SAGE.
Yin, R. (1994) Case Study Research: Design and Methods, 2nd edn. Los Angeles, CA:
SAGE.