
IES PRACTICE GUIDE WHAT WORKS CLEARINGHOUSE

Structuring Out-of-School Time


to Improve Academic Achievement

NCEE 2009-012
U.S. DEPARTMENT OF EDUCATION
The Institute of Education Sciences (IES) publishes practice guides in education
to bring the best available evidence and expertise to bear on the types of challenges
that cannot currently be addressed by a single intervention or program. Authors of
practice guides seldom conduct the types of systematic literature searches that are
the backbone of a meta-analysis, although they take advantage of such work when
it is already published. Instead, authors use their expertise to identify the most im-
portant research with respect to their recommendations and conduct a search of
recent publications to ensure that the research supporting the recommendations
is up-to-date.

Unique to IES-sponsored practice guides is that they are subjected to rigorous exter-
nal peer review through the same office that is responsible for independent reviews
of other IES publications. A critical task for peer reviewers of a practice guide is to
determine whether the evidence cited in support of particular recommendations is
up-to-date and that studies of similar or better quality that point in a different di-
rection have not been ignored. Because practice guides depend on the expertise of
their authors and their group decisionmaking, the content of a practice guide is not
and should not be viewed as a set of recommendations that in every case depends
on and flows inevitably from scientific research.

The goal of this practice guide is to formulate specific and coherent evidence-based
recommendations for use by educators using out-of-school time programming to
address the challenge of improving student academic achievement. The guide pro-
vides practical, clear information on critical topics related to out-of-school time and
is based on the best available evidence as judged by the panel. Recommendations
presented in this guide should not be construed to imply that no further research
is warranted on the effectiveness of particular strategies for out-of-school time.
IES PRACTICE GUIDE

Structuring Out-of-School Time


to Improve Academic Achievement
July 2009

Panel
Megan Beckett (Chair)
RAND

Geoffrey Borman
University of Wisconsin–Madison

Jeffrey Capizzano
Teaching Strategies, Inc.

Danette Parsley
Mid-continent Research for Education and Learning (McREL)

Steven Ross
The Johns Hopkins University

Allen Schirm
Mathematica Policy Research, Inc.

Jessica Taylor
Florida Department of Education

Staff
Samina Sattar
Virginia Knechtel
Elizabeth Potamites
Mathematica Policy Research, Inc.

NCEE 2009-012
U.S. DEPARTMENT OF EDUCATION
This report was prepared for the National Center for Education Evaluation and Regional
Assistance, Institute of Education Sciences under Contract ED-07-CO-0062 by the
What Works Clearinghouse, a project of Mathematica Policy Research, Inc.

Disclaimer
The opinions and positions expressed in this practice guide are the authors’ and do
not necessarily represent the opinions and positions of the Institute of Education Sci-
ences or the U.S. Department of Education. This practice guide should be reviewed
and applied according to the specific needs of the educators and education agency
using it, and with full realization that it represents the judgments of the review
panel regarding what constitutes sensible practice, based on the research that was
available at the time of publication. This practice guide should be used as a tool
to assist in decisionmaking rather than as a “cookbook.” Any references within the
document to specific education products are illustrative and do not imply endorse-
ment of these products to the exclusion of other products that are not referenced.

U.S. Department of Education


Arne Duncan
Secretary

Institute of Education Sciences


John Q. Easton
Director

National Center for Education Evaluation and Regional Assistance


Phoebe Cottingham
Commissioner

July 2009

This report is in the public domain. Although permission to reprint this publication
is not necessary, the citation should be:

Beckett, M., Borman, G., Capizzano, J., Parsley, D., Ross, S., Schirm, A., & Taylor, J.
(2009). Structuring out-of-school time to improve academic achievement: A practice
guide (NCEE #2009-012). Washington, DC: National Center for Education Evaluation
and Regional Assistance, Institute of Education Sciences, U.S. Department of Education.
Retrieved from http://ies.ed.gov/ncee/wwc/publications/practiceguides.

What Works Clearinghouse Practice Guide citations begin with the panel chair, fol-
lowed by the names of the panelists listed in alphabetical order.

This report is available on the IES website at http://ies.ed.gov/ncee and http://ies.ed.gov/ncee/wwc/publications/practiceguides.

Alternate Formats
On request, this publication can be made available in alternate formats, such as
Braille, large print, audiotape, or computer diskette. For more information, call the
Alternate Format Center at 202–205–8113.
Structuring Out-of-School Time to Improve Academic Achievement

Contents
Introduction
The What Works Clearinghouse standards and their relevance to this guide
Overview
Scope of the practice guide
Status of the research
Summary of the recommendations
Checklist for carrying out the recommendations
Recommendation 1. Align the OST program academically with the school day
Recommendation 2. Maximize student participation and attendance
Recommendation 3. Adapt instruction to individual and small group needs
Recommendation 4. Provide engaging learning experiences
Recommendation 5. Assess program performance and use the results to improve the quality of the program
Appendix A. Postscript from the Institute of Education Sciences
Appendix B. About the authors
Appendix C. Disclosure of potential conflicts of interest
Appendix D. Technical information on the studies
References

List of tables
Table 1. Institute of Education Sciences levels of evidence for practice guides
Table 2. Recommendations and corresponding levels of evidence
Table D1. Studies of OST programs that met WWC standards with or without reservations
Table D2. Studies and corresponding recommendations
Table D3. Studies of programs cited in recommendation 1 that met WWC standards with or without reservations
Table D4. Studies of programs cited in recommendation 2 that met WWC standards with or without reservations
Table D5. Studies of programs cited in recommendation 3 that met WWC standards with or without reservations
Table D6. Studies of programs cited in recommendation 4 that met WWC standards with or without reservations
Table D7. Studies of programs cited in recommendation 5 that met WWC standards with or without reservations

List of exhibits
Exhibit 1. Sample logbook
Introduction

This guide is intended to help educators, out-of-school time (OST) program providers, and school and district administrators structure academically focused out-of-school time programs. OST is an opportunity to supplement learning from the school day and provide targeted assistance to students whose needs extend beyond what they can receive in the classroom. With an increasing focus on school accountability and student performance, OST can play a meaningful role in improving academic achievement and closing the gap between low- and high-performing students. Although OST programs operate nationwide, disagreement about which aspects of these programs are beneficial for student achievement remains. This practice guide includes concrete recommendations for structuring an effective academically oriented OST program, and it illustrates the quality of the evidence that supports these recommendations. The guide also acknowledges possible implementation challenges and suggests solutions for circumventing the roadblocks.

A panel of experts in OST programs and research methods developed the recommendations in this guide and determined the level of evidence for each recommendation. The evidence considered in developing this guide ranges from rigorous evaluations of OST programs to expert analyses of practices and strategies in OST. In looking for effective practices, the panel paid particular attention to high-quality experimental and quasi-experimental studies, such as those meeting the criteria of the What Works Clearinghouse (WWC),1 and to patterns of practices that are replicated across programs.

As with all WWC practice guides, the recommendations in this guide are derived from and supported by rigorous evidence, when possible. The research base for this guide was identified through a comprehensive search for studies evaluating academically oriented OST interventions and practices. An initial search for research on OST programs conducted in the United States in the past 20 years (1988–2008) yielded more than 1,000 studies. Of these, 130 studies examined school-based OST programs that serve elementary and middle school students and were eligible for further review. These studies were reviewed by the WWC to determine whether they were consistent with WWC standards. Of the 130 studies, 22 met WWC standards or met the standards with reservations. These 22 studies of 18 different OST programs represent the strongest evidence of the effectiveness of OST programs.2

In keeping with the WWC standards for determining levels of evidence, the panel relied on the following definitions (see Table 1):

A strong rating refers to consistent and generalizable evidence that an intervention strategy or program improves outcomes.3

A moderate rating refers either to evidence from studies that allow strong causal conclusions but cannot be generalized with assurance to the population on which a recommendation is focused (perhaps because the findings have not been widely replicated) or to evidence from studies that are generalizable but have more causal ambiguity than that offered by experimental designs (e.g., statistical models of correlational data or group comparison designs for which equivalence of the groups at pretest is uncertain).

A low rating refers to expert opinion based on reasonable extrapolations from research and theory on other topics and evidence from studies that do not meet the standards for moderate or strong evidence.

It is important for the reader to remember that the level of evidence rating is not a judgment by the panel on how effective each of these recommended practices will be when implemented, nor is it a judgment of what prior research has to say about their effectiveness. The level of evidence ratings reflect the panel's judgment of the quality of the existing literature to support a causal claim that when these practices have been implemented in the past, positive effects on student academic outcomes were observed. They do not reflect judgments of the relative strength of these positive effects or the relative importance of the individual recommendations. Thus, a low level of evidence rating does not indicate that the recommendation is any less important than other recommendations with a strong or moderate rating. Rather, it suggests that the panel cannot point to a body of research that demonstrates its effect on student achievement. In some cases, this simply means that the recommended practices would be difficult to study in a rigorous, experimental fashion; in other cases, it means that researchers have not yet studied this practice, or that there is ambiguous evidence of effectiveness.4

Citations in the text refer to studies of programs that have implemented various practices. Not all of these programs contribute to the level of evidence rating: although some of these programs have had rigorous evaluations of their impacts, others have not. Furthermore, some of the programs that have been rigorously evaluated have found positive effects on academic achievement; others have not.5

The What Works Clearinghouse standards and their relevance to this guide

In terms of the levels of evidence indicated in Table 1, the panel relied on WWC Evidence Standards to assess the quality of evidence supporting educational programs and practices. WWC addresses evidence for the causal validity of instructional programs and practices according to WWC standards. Information about these standards is available at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf. The technical quality of each study is rated and placed into one of three categories:

• Meets Evidence Standards for randomized controlled trials and regression discontinuity studies that provide the strongest evidence of causal validity.

• Meets Evidence Standards with Reservations for all quasi-experimental studies with no design flaws and randomized controlled trials that have problems with randomization, attrition, or disruption.

• Does Not Meet Evidence Screens for studies that do not provide strong evidence of causal validity.

Following the recommendations and suggestions for carrying out the recommendations, Appendix D presents more information on the research evidence that supports each recommendation.

1. http://www.whatworks.ed.gov/.
2. See Table D2 for a summary of which studies are relevant to each recommendation.
3. Following WWC guidelines, improved outcomes are indicated by either a positive statistically significant effect or a positive, substantively important effect size (i.e., greater than 0.25). See the WWC guidelines at http://ies.ed.gov/ncee/wwc/pdf/wwc_version1_standards.pdf.
4. For more information, see the WWC Frequently Asked Questions page for practice guides, http://ies.ed.gov/ncee/wwc/references/idocviewer/Doc.aspx?docId=15&tocId=3.
5. Table D1 summarizes the details and effectiveness of studies consulted for the evidence rating of this guide.

We appreciate the efforts of Samina Sattar, Virginia Knechtel, Liz Potamites, and Claire Smither, MPR staff members who participated in the panel meetings, characterized the research findings, and drafted the guide. We also appreciate the help of the many WWC reviewers who contributed their time and expertise to the review process. We would like to thank Kristin Hallgren, Scott Cody, Shannon Monahan, and Mark Dynarski for their oversight and guidance during the development of the practice guide, and for helpful feedback and reviews of its earlier versions.

Megan Beckett
Geoffrey Borman
Jeffrey Capizzano
Danette Parsley
Steven Ross
Allen Schirm
Jessica Taylor


Table 1. Institute of Education Sciences levels of evidence for practice guides

Strong
In general, characterization of the evidence for a recommendation as strong requires both studies with high internal validity (i.e., studies whose designs can support causal conclusions) and studies with high external validity (i.e., studies that in total include enough of the range of participants and settings on which the recommendation is focused to support the conclusion that the results can be generalized to those participants and settings). Strong evidence for this practice guide is operationalized as
• A systematic review of research that generally meets WWC standards (see http://ies.ed.gov/ncee/wwc/) and supports the effectiveness of a program, practice, or approach with no contradictory evidence of similar quality; OR
• Several well-designed, randomized controlled trials or well-designed quasi-experiments that generally meet WWC standards and support the effectiveness of a program, practice, or approach, with no contradictory evidence of similar quality; OR
• One large, well-designed, randomized controlled, multisite trial that meets WWC standards and supports the effectiveness of a program, practice, or approach, with no contradictory evidence of similar quality; OR
• For assessments, evidence of reliability and validity that meets the Standards for Educational and Psychological Testing.a

Moderate
In general, characterization of the evidence for a recommendation as moderate requires studies with high internal validity but moderate external validity or studies with high external validity but moderate internal validity. In other words, moderate evidence is derived from studies that support strong causal conclusions but generalization is uncertain or studies that support the generality of a relationship but the causality is uncertain. Moderate evidence for this practice guide is operationalized as
• Experiments or quasi-experiments generally meeting WWC standards and supporting the effectiveness of a program, practice, or approach with small sample sizes and/or other conditions of implementation or analysis that limit generalizability and no contrary evidence; OR
• Comparison group studies that do not demonstrate equivalence of groups at pretest and, therefore, do not meet WWC standards but that (1) consistently show enhanced outcomes for participants experiencing a particular program, practice, or approach and (2) have no major flaws related to internal validity other than lack of demonstrated equivalence at pretest (e.g., only one teacher or one class per condition, unequal amounts of instructional time, highly biased outcome measures); OR
• Correlational research with strong statistical controls for selection bias and for discerning influence of endogenous factors and no contrary evidence; OR
• For assessments, evidence of reliability that meets the Standards for Educational and Psychological Testingb but with evidence of validity from samples not adequately representative of the population on which the recommendation is focused.

Low
In general, characterization of the evidence for a recommendation as low means that the recommendation is based on expert opinion derived from strong findings or theories in related areas and/or expert opinion buttressed by direct evidence that does not rise to the moderate or strong level. Low evidence is operationalized as evidence not meeting the standards for the moderate or high level.

a. American Educational Research Association, American Psychological Association, and National Council on Measurement in Education (1999).
b. Ibid.
Structuring out-of-school time to improve academic achievement

Overview

Over the past three decades, changing labor force patterns in the United States have significantly increased the need for child care for school-age children. In 2000, nearly half of school-age children with working mothers spent time in non-parental supervised settings when they were not in school, including before- and after-school programs, family child care homes, and the homes of relatives.6 Commonly known as out-of-school time (OST), this period outside of the school day when children are not with their parents has received extensive policy attention, focused on both the risks of negative influences during this time and the potential benefits the time holds for the positive development of school-age children.

Although many OST settings are designed primarily to provide a safe place for children to be outside of the traditional school day while parents work, there is now a broader movement toward using OST to bridge the gap between high- and low-achieving students and to give students more time to learn if they need it.7 Academically oriented out-of-school programs and services are promising because students spend twice as much of their waking hours outside of the classroom as in it,8 and OST periods, especially summer breaks, are the times when the achievement gap widens.9

OST programs offer a promising approach to enhancing students' academic skills and to closing the achievement gap. In recognition of this promise, funding for OST programs and services has grown in the past few years. As part of the No Child Left Behind (NCLB) legislation, districts are required to spend 5 percent to 20 percent of all Title I funds on supplemental educational services (SES).10 Further, some states expanded funding for OST programs, even in the face of reduced budgets. For example, in 2002, California voters approved the addition of approximately a half billion dollars annually to existing state after-school programs.11

Similarly, the number of OST programs, including after-school, weekend, and summer programs and SES, has been increasing. In 1995, a U.S. Census Bureau study of child care arrangements showed that 5.6 percent of children ages 5 to 14 received care in a before- or after-school program according to parents in the sample.12 In 2005, 20 percent of K–8 students participated in a before- or after-school program.13

6. Capizzano, Tout, and Adams (2000).
7. Halpern (1999).
8. Hofferth and Sandberg (2001).
9. Heyns (1978); Alexander, Entwisle, and Olson (2007a, 2007b); Downey, von Hippel, and Broh (2004); Cooper et al. (1996).
10. Supplemental educational services (SES) are tutoring or other academic support services offered outside the regular school day, at no charge to students or their families, by public or private providers that have been approved by the state. Districts are required to offer SES to low-income students in schools that have fallen short of adequate yearly progress (AYP) standards for a third time (after missing AYP for two consecutive years). Students and their parents are permitted to choose among state-approved SES providers, which come in all varieties, including national for-profit firms, local nonprofits, faith-based organizations, institutions of higher education, and local school districts (which are permitted to become approved providers unless they are themselves identified for improvement under No Child Left Behind [NCLB]).
11. Administration for Children & Families (n.d.). State afterschool profiles: California. Washington, DC: U.S. Department of Health and Human Services. Retrieved May 29, 2009, from http://nccic.org/afterschool/ca.html.
12. Smith (2000).
13. Carver and Iruka (2006).

Although it is generally assumed that OST programs can provide students with positive, academically enriching experiences, it is not necessarily known how to structure programs to effectively improve student academic outcomes. Although many studies lacking comparison groups suggest that OST programs can benefit students academically,14 those with more rigorous evaluation designs raise questions about these findings. For example, findings from the national evaluation of the 21st Century Community Learning Centers (CCLC) program, which is the largest after-school program in the United States, show that, on average, students participating in the programs had no improvement in academic achievement.15

The evaluation found that 21st CCLC programs were not consistently focused on academics and often placed more emphasis on sports or extracurricular activities because they thought those activities were more popular with students and would encourage participation in the program.16 Students in 21st CCLC programs may not have been spending enough time engaged in academic content to produce measurable gains in achievement. Simply adding time to students' days may not benefit them academically; that time may need to be carefully orchestrated to facilitate learning and retention of academic material. Additionally, the average amount of total instructional time received by students in a typical OST program may be too low to generate meaningful academic effects.17

The findings from the evaluation of Enhanced Academic Instruction in After-School Programs, sponsored by the Institute of Education Sciences (IES), provide some evidence for what works in OST instruction.18 The elementary school programs delivered school-day math and reading curricula adapted to after-school settings.19 Students, who received an average of 57 hours of enhanced math instruction (more than the 30–40 hours SES students might receive20), had modest but statistically significant improvements in math achievement after one year compared with students in a regular after-school program.21 No differences were found between students who received enhanced reading instruction and those in a regular after-school program. This first year of findings provides some indication that instruction in OST can improve student achievement when delivered in a structured, focused format with adequate dosage.

14. Center for Applied Linguistics (1994); Fashola (1998); Ferreira (2004); Sheldon and Hopkins (2008).
15. U.S. Department of Education (2003).
16. James-Burdumy, Dynarski, and Deke (2007).
17. Kane (2004).
18. Black et al. (2008).
19. Ibid.
20. Ross et al. (2008).
21. Black et al. (2008).
Scope of the practice guide

The purpose of this practice guide is to provide recommendations for organizing and delivering school-based OST programs to improve the academic achievement of student participants. School-based programs include those that are administered by a school or school district, as well as programs that are contracted by the school or school district and provided by other organizations. The panel has limited the scope of this practice guide to programs that (1) serve elementary and middle school students, (2) are organized by or conducted in partnership with a school or school district, and (3) aim to improve academic outcomes.

The structure and objectives of OST programs vary. Some exist as a place where students can be safe and occupied for the time from school dismissal until parents are able to pick them up. Other programs are designed to provide social and cultural enrichment opportunities or to promote healthy outcomes such as grade promotion and reduction in risky behaviors. This guide targets programs whose primary goal is to provide academic instruction to improve participants' achievement.22 The guide targets the following types of programs:

• after-school or weekend programs

• summer school

• SES

Teachers, principals, district administrators, and other staff who seek guidance for structuring these types of OST programs to improve academic achievement can benefit from this guide. State education agencies may find the recommendations useful for assessing the quality of prospective OST programs such as SES providers or 21st CCLC programs. Other types of programs, such as before-school or non–school-based programs also may benefit from the recommendations in this guide. However, these programs were not the focus of the panel's discussions.

The panel assumed that the basic structure of an academically focused OST program would include the following components:

• a place to meet—often this is the school, but it also may be a community center or other facility

• regular hours of operation

• transportation (if necessary)

• administrative and instructional staff

• instructional materials or curricula

Staffing needs will vary by program but typically include a program director who supervises the operation of the program (or program sites if there are multiple locations) and manages OST instructors. The OST instructors are the front-line staff members who interact with students. Additionally, either the OST program or a school may employ a coordinator who is responsible for maintaining the relationships among the school, the program site, and other partners (see recommendation 1). School districts that have contracted with multiple OST programs may employ a district coordinator to monitor all programs across schools.

The evidence base in this guide for the effectiveness of OST programs and services on student achievement is based on what is known about after-school, summer school, and SES programs targeting elementary and middle school students in low-income, high-needs communities. Although many of the recommendations may look similar to those for high school programs, the objectives of OST programs may differ in high school, and the action steps and roadblocks will be different because of the wider range of activities, transportation issues, and instructional needs of high school students compared with those of younger students. Similarly, although the recommendations may be applicable to other populations, the literature used in this guide largely looks at students in urban, low-income, and low-achieving schools. Table D1 in Appendix D includes details on the populations studied by the literature mentioned in this guide.

The recommendations in this guide can be used singly or in combination. The panel recommends that readers consider implementing all recommendations, but any one recommendation can be implemented independently. For example, recommendations 1 and 2 will be most useful to school administrators who want to address low student attendance and ensure that students who need academic help will benefit from the OST program. Recommendations 3 and 4 will be useful for teachers who are struggling with addressing the academic needs of their students. Recommendation 5 is intended for school, district, and program administrators who want to ensure that the out-of-school programming offered to students is of high quality and that students are benefiting from those services.

In writing the guide, the panel chose not to address the following four areas:

First, the guide does not address whether to provide services after school, before school, on weekends, or during the summer months. The panel does not believe that there is a consensus in the literature about how the timing of a program affects student outcomes. The panel assumes that administrators will make that decision based on their available resources and the needs and preferences of the communities they serve. They may decide that one or a combination of these types of programs will suit their objectives.

Second, the guide does not address specific instructional practices or teaching strategies that are unique to reading, math, or other content areas. The research base that would be required to support content-specific recommendations is beyond the scope of this guide. For reference, the What Works Clearinghouse (WWC) has published a number of practice guides and intervention reports devoted to content-specific areas such as adolescent literacy, beginning reading, and elementary and middle school math.23

Third, the guide does not provide details on the costs of organizing or operating OST programs or of implementing the recommendations of the panel. The panel recognizes that cost is a huge element in the decisions that programs make about the services they provide, but also that costs can vary considerably by factors such as the type of program, days of operation, and geographical area.24 Some of the roadblocks found at the end of each recommendation provide suggestions for ways to minimize costs, but the panel directs readers to recent reports by Public/Private Ventures (P/PV), RAND, and others for more information on the costs of OST programs.25

Finally, the guide does not address behavioral management in the OST context. The panel acknowledges that in the OST arena, as in the school-day classroom, programs and practices can impact student behavior. In OST, this includes the sites participating in the national evaluation of 21st CCLC, which were found to have an adverse effect on student behavioral outcomes.26 Although the panel recognizes that creating a safe and orderly environment is a necessary condition for students to learn, interventions that address behavioral management issues were judged to be out of the scope of this guide and recommendations that are targeted to academic improvement. The panel directs readers to other publications that may be helpful in this area, including the WWC practice guide, Reducing Behavior Problems in the Elementary School Classroom.27

22. In a meta-analysis of 73 after-school programs, Durlak and Weissberg (2007) found that "the presence of an academic component" was the largest predictor of a program producing significant academic improvements.
23. Institute of Education Sciences. (n.d.). What Works Clearinghouse: Practice Guides. Washington, DC: U.S. Department of Education. Retrieved May 29, 2009, from http://ies.ed.gov/ncee/wwc/publications/practiceguides.
24. Grossman et al. (2009).
25. Grossman et al. (2009); Beckett (2008).

Status of the research

Overall, the panel believes that the existing research on OST practices is not at a level to provide conclusive evidence of best practices. Studies of OST programs tend to examine combined effects of a variety of practices and procedures on student achievement, making it difficult to
educators and OST providers alike with more definitive information about effective practices.

In offering these recommendations, the panel reminds readers that the evidence base in support of these recommendations, when available, is based on experiences in OST programs. The relatively small amount of literature on academically focused programs limited the depth of information on effective instructional practices in the OST context. To account for this, the panel also incorporated relevant literature in the broader education field and used its expert judgment to identify practices that strengthened the OST learning environment.

Summary of the recommendations

This practice guide offers five recommendations to improve the ability of OST programs to benefit students academically (see Table 2). Recommendations 1 and 2 address how to design an OST program by considering its relationship with schools
determine the specific practices contrib- and the components that can maximize
uting to achievement gains.28 Likewise, the appeal of the program. Recommenda-
the panel encountered varying impacts tions 3 and 4 focus on the delivery of aca-
across OST programs with ostensibly simi- demic instruction, and how it can be used
lar practices. Many studied interventions purposely to improve student engage-
are practiced on a small scale, necessitat- ment and performance. Recommendation
ing a small sample size and often making 5 addresses evaluation of OST programs,
it difficult to find an appropriate compari- which is essential for maintaining high
son group. Low levels of participation or standards of quality as well as continuous
attendance, even in large-scale programs, improvement of the program design and
also make it difficult to interpret evalua- instruction.
tion results. The panel believes that the
OST field would benefit from additional, Recommendations 1 and 2 (Design). OST
rigorous research on OST programs that programs should include design features
serve a large number of students and have that ultimately strengthen academic prog-
achieved high levels of participation. Im- ress while fulfilling the needs of parents
proving the research base will provide and students. Recommendation 1 empha-
sizes the importance for OST programs to
26. James-Burdumy, Dynarski, and Deke (2008). connect with school and classroom activi-
27. Epstein et al. (2008). ties to achieve a shared mission of improv-
28. In these cases, the panel members exercised
ing academic performance. Further, the
their expert judgment to identify practices likely panel recognizes that sometimes students
to produce achievement gains for students.

(9)
Scope of the practice guide

most likely to benefit from a strong aca- engaging. The panel believes it is particu-
demic program may be especially unlikely larly important to engage students when
to enroll in or attend OST programs regu- they may be fatigued after a long day of
larly. Thus, the panel recommends that school; on Saturdays; during the summer
OST programs focus on recruiting and months; or when they are drawn to partici-
retaining targeted students so that they pate in other, nonacademic activities. To
receive the dosage necessary to realize avoid the pitfalls of other programs that
academic benefits (recommendation 2). failed to demonstrate positive academic ef-
fects, the panel suggests that all activities
Recommendations 3 and 4 (Instruction). To have a specific learning objective.
maximize the educational benefits for stu-
dents, OST programs should deliver aca- Recommendation 5 (Evaluation). Finally,
demic instruction in a way that responds program improvement depends on the
to each student’s needs and engages them articulation of goals and expectations, ef-
in learning. Recommendation 3 presents fective management, and the performance
strategies for the structuring of instruc- and experience of staff. Recommenda-
tional practices and program content to ad- tion 5 presents strategies for schools and
dress the needs of students and effectively districts to use to identify programs that
improve academic outcomes. The recom- are most likely to result in academic im-
mendation provides suggestions for orga- provement and to monitor existing pro-
nizing instructional time in the classroom grams to ensure that the highest-quality
and for facilitating individualized teaching services are being provided to students.
by assessing student needs. Recommenda- The earlier recommendations in this guide
tion 4 encourages OST programs to capital- should provide programs with a solid
ize on programming flexibility by offering starting point from which to evaluate an
activities that students may find especially OST program.

Table 2. Recommendations and corresponding levels of evidence

Recommendation (Level of evidence)

Design
1. Align the OST program academically with the school day. (Low)
2. Maximize student participation and attendance. (Low)

Instruction
3. Adapt instruction to individual and small group needs. (Moderate)
4. Provide engaging learning experiences. (Low)

Evaluation
5. Assess program performance and use the results to improve the quality of the program. (Low)
Checklist for carrying out the recommendations

Recommendation 1. Align the OST program academically with the school day.

☐ Use OST program coordinators to develop relationships and maintain ongoing communication between schools and the OST program.
☐ Designate a school staff person to coordinate communication with OST programs and help them support school needs.
☐ Connect OST instruction to school instruction by identifying school-based goals and learning objectives.
☐ Coordinate with the school to identify staff for OST programs.

Recommendation 2. Maximize student participation and attendance.

☐ Design program features to meet the needs and preferences of students and parents.
☐ Promote awareness of the OST program within schools and to parents.
☐ Use attendance data to identify students facing difficulties in attending the program.

Recommendation 3. Adapt instruction to individual and small group needs.

☐ Use formal and informal assessment data to inform academic instruction.
☐ Use one-on-one tutoring if possible; otherwise, break students into small groups.
☐ Provide professional development and ongoing instructional support to all instructors.

Recommendation 4. Provide engaging learning experiences.

☐ Make learning relevant by incorporating practical examples and connecting instruction to student interests and experiences.
☐ Make learning active through opportunities for collaborative learning and hands-on academic activities.
☐ Build adult-student relationships among OST program participants.

Recommendation 5. Assess program performance and use the results to improve the quality of the program.

☐ Develop an evaluation plan.
☐ Collect program and student performance data.
☐ Analyze the data and use findings for program improvement.
☐ Conduct a summative evaluation.
Recommendation 1. Align the OST program academically with the school day

The panel believes that academic alignment with the school day is necessary for OST programs to improve academic performance. OST programs and schools have the shared mission of helping students achieve success, and collaboration between the two can be mutually beneficial. Although alignment requires additional effort from staff, teachers and principals in schools with existing OST programs have voiced support for this sort of collaboration.29

An OST program coordinator can ensure alignment through regular communication with school staff, and schools can help by designating a school-based coordinator to work with the OST coordinator. This sort of cooperation helps OST programs evaluate their students' needs and provide the most effective instruction and services. Both the program and the school-based coordinators should work to align the instructional activities of the OST program with state and local content standards, the school curriculum, and district- and/or school-based learning initiatives.30

Level of evidence: Low

The level of evidence for this recommendation is low. There is no direct evidence that practices outlined in this recommendation contribute to improved academic outcomes. Although it was common for programs to include some components of the panel's recommendations, none tested the effectiveness of this recommendation individually, only in combination with the other components of OST programs.

In the panel's opinion, collaboration can improve academic outcomes, and in the studies reviewed for this guide, two independent evaluators recommended that collaboration between schools and OST programs be strengthened if possible.31 However, we acknowledge that more research is required to demonstrate the effects of stronger alignment.

Brief summary of evidence to support the recommendation

Fifteen OST programs endeavored to collaborate with school-based staff or initiatives,32 but, in general, these efforts were not core components of the programs. Three programs also expressed difficulty or reluctance to coordinate more fully.33 Of the 11 programs with studies that met WWC standards with or without reservations,34 three programs documented practices that closely corresponded to the panel's recommendations.35 In two of these, coordination between schoolteachers and OST instructors was frequent and structured.36 Content and skills taught during OST were intentionally designed to support students during their school-day instruction. One program showed positive academic effects,37 and the other did not.38 The purpose of the third program, which was a summer school, was to help students achieve proficiency on state examinations they had not mastered during the school year. The curriculum was designed by the district with the express purpose of helping students meet state standards and was closely linked to that goal, but by nature of its being a summer school program, coordination with individual teachers was limited. The evaluation of the program found significant and persistent effects on both math and reading for 3rd graders but not for 6th graders.39

The remaining eight programs included some components similar to this recommendation, but studies indicated that the degree of coordination was lower or not enough information was provided to determine the level of alignment between programs.40 Of these, one showed positive effects,41 two showed mixed effects,42 and five programs showed no detectable academic effects.43 Although more of these programs failed to demonstrate effectiveness, Table D2 shows that there were seven effective programs that did not attempt coordination with schools. Given the absence of a clear pattern of effectiveness based on the level of coordination with schools and the small sample of programs with high levels of coordination, the panel decided that the level of evidence was low.

29. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002).
30. Borman and Dowling (2006); Langberg et al. (2006); Roderick, Jacob, and Bryk (2004); Borman (1997).
31. Schacter and Jo (2005) and Center for Applied Linguistics (1994) suggested the use of more collaboration when appropriate.
32. Challenging Horizons Program (CHP)—Langberg et al. (2006); Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Los Angeles's Better Educated Students for Tomorrow (L.A.'s BEST)—Goldschmidt, Huang, and Chinen (2007); Youth Services—Child Care, Academic Assistance, Recreation, and Enrichment (YS-CARE)—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); Nurturing Development Partnerships (NDP)—Udell (2003); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008); Title I supplementary education—Borman (1997); The After-School Corporation (TASC)—Reisner et al. (2004); Project Adelante—Center for Applied Linguistics (1994); After-school tutoring—Leslie (1998).
33. In James-Burdumy et al. (2005), the authors noted that 21st CCLC programs struggled to effectively coordinate homework help with the school; and in U.S. Department of Education (2003), 21st CCLC programs were found to be supportive but not "integrated" (p. 39) with the school. In Project Adelante (Center for Applied Linguistics 1994), program directors recommended closer coordination with school-day staff to collect data and share information on student progress but were concerned about the appropriateness of the school-day curriculum for the students that their program served. Similarly, Morris, Shaw, and Perney (1990) expressed reluctance to align their after-school tutoring program, Howard Street Tutoring, to the school curriculum given that their students' classroom instruction was often beyond the students' current reading levels.
34. CHP—Langberg et al. (2006); Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
35. CHP—Langberg et al. (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Leap Frog—McKinney (1995).
36. CHP—Langberg et al. (2006); Leap Frog—McKinney (1995).
37. CHP—Langberg et al. (2006).
38. Leap Frog—McKinney (1995).
39. Chicago Summer Bridge—Jacob and Lefgren (2004).
40. Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
41. Early Risers—August et al. (2001).
42. Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006).
43. L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003); SES—Ross et al. (2008); SES—McKay et al. (2008); SES—Muñoz, Potter, and Ross (2008).
How to carry out this recommendation

1. Use OST program coordinators to develop relationships and maintain ongoing communication between schools and the OST program.

An OST program coordinator can play a critical role in ensuring that instructional components of an OST program are aligned with the school day. Coordinators should work directly with teachers and administrators from the school to obtain information that can be used to guide instruction in the OST program. This can be accomplished through regular communication with key school staff, and also through participation in school meetings and committees. For example, coordinators can attend staff meetings, participate in common planning periods, serve on school leadership teams, and participate in parent-teacher organizations. OST coordinators also can promote the OST program to staff and families by posting information about the program on bulletin boards or holding OST events during the school day to expose other students to the OST program.

When possible, the OST coordinator should be housed within the school, spending time during daily school hours to be visible to both students and teachers. The OST coordinator can use these types of opportunities to maintain an open relationship with teachers, principals, and counselors, advocating for and gathering data about OST students as necessary.44

The OST coordinator can take key steps to facilitate regular communication between OST instructors and classroom teachers that will reinforce and complement the school curriculum.45 Regular communication can help identify the needs and strengths of individual students and those strategies that are most effective in raising achievement. Some examples of steps follow:

• The OST coordinator can develop a logbook that students carry back and forth daily to share information about OST activities with classroom teachers.46 The logbook can contain information from classroom teachers about homework assignments, concepts the student is struggling with during the school day, or the instructional strategies that are most effective with the student.47 The coordinator should work closely with school staff in developing this logbook. For a sample logbook, see Exhibit 1.48

• The OST coordinator can arrange for OST instructors to periodically attend common planning periods with classroom teachers to align programming or collaborate with effective teachers to identify best practices and materials for meeting curriculum goals and raising student achievement.49

• The OST coordinator can collaborate with school-based staff to identify relevant professional development that OST instructors can attend with schoolteachers to align instructional strategies and to provide funding when possible.

44. Center for Applied Linguistics (1994).
45. Leslie (1998); U.S. Department of Education (2003).
46. Morris, Shaw, and Perney (1990).
47. Langberg et al. (2006).
48. For other examples of logbooks, see SEDL National Center for Quality Afterschool (n.d.) and Region VII After School Programs (n.d.).
49. Bott (2006) described the use of this strategy in the Gardner Extended Services School; U.S. Department of Education (2003).
Exhibit 1. Sample logbook

Information from the Classroom Teacher

Today's Date: ____
Student attended class? ☐ Yes ☐ No ☐ Tardy

For each subject (Subject 1,ᵃ Subject 2, Subject 3, Subject 4), record:
• Topics covered in class today
• Today's homework assignment
• Areas in which the student needs additional help
• Instructional strategies that were useful
• Behavior or discipline issues
• Other comments

Information from the OST Instructor

Today's Date: ____
Student attended OST program? ☐ Yes ☐ No ☐ Tardy

For each subject (Subject 1, Subject 2, Subject 3, Subject 4), record:
• Percentage of homework completed during OST
• Homework items that were challenging for the student
• Topics covered during OST instruction
• Instructional strategies or activities that were useful
• Behavior or discipline issues
• Other comments

a. This form can be used for single subject classrooms or elementary classrooms.
2. Designate a school staff person to coordinate communication with OST programs and help them support school needs.

Schools can designate a staff member (the school-based coordinator) to work with the OST program coordinator (or with OST coordinators from multiple OST programs, if relevant). The panel believes that when OST programs are well aligned to school-day goals and instruction, they can support the school in raising student achievement. A school-based coordinator can serve as a first point of contact in the school for OST programs and can help ensure that OST instruction is well aligned with school goals. This person will play an important role in program-school relations, but it is not imperative that the position is full time, and an existing teacher or counselor may be suited to this role. Key functions of the school-based staff designee include

• Developing a set of standard operating procedures for distributing student data to OST program staff. Relevant data can include results from district- or state-wide standardized testing, student progress reports from teachers, or even brief informal comments from teachers on student strengths and weaknesses.

• Preparing relevant information on school operations, academic standards, improvement plans, and curricula.

• Observing and communicating with OST staff to identify opportunities for greater coordination with the school day.

It also can be beneficial for school districts to designate a key staff person to work with OST coordinators. The district designee can provide OST coordinators with information on standards and curricula. The district designee also can play a key role in coordinating and supervising the activities of multiple SES and other OST providers.

3. Connect OST instruction to school instruction by identifying school-based goals and learning objectives.

Information gathered from the school and the district can be used to prioritize efforts to raise academic achievement and support student learning during the school day. OST programs should align activities, instruction, and any formal curriculum with the state and local standards, as well as the content and curriculum of the school day.50 State and local curriculum standards are often available online, but school and district officials also should direct OST staff to relevant resources such as school improvement plans or other specific school- or district-based objectives. The OST program need not repeat classroom instruction, but it can use different methods to support and reinforce what students learn in school.

OST programs can help students develop skills that support classroom instruction, such as learning how to plan, take notes, develop an outline, or study for an upcoming test. For example, OST providers could explicitly teach a skill and require students to practice that skill using an assignment from the school day. They should follow up by checking assignments and conferencing with the classroom teacher. When promoting the use of skills from OST during the school day, instructors should be careful to coordinate with the classroom teacher first to ensure that the relevant skill will align with classroom instruction and will not disrupt the teacher's routine. This can be useful for helping students develop note-taking, planning, and studying skills that will help them achieve success with the school-day curriculum.51

50. Borman and Dowling (2006); Langberg et al. (2006); Roderick, Jacob, and Bryk (2004); Borman (1997); Udell (2003).
51. Langberg et al. (2006).
Field trips or cultural activities that are part of the OST program should be explicitly linked to school content and state standards. The panel believes that these activities need to connect to something the students are learning in school to help them see how what they learn in school relates to their real-life experiences. The result can maximize the gains from both the OST program and the school day and make academic content more relevant to students' lives (see recommendation 4 for information on connecting engaging instruction to academic content).

4. Coordinate with the school to identify staff for OST programs.

OST programs have several roles for which effective classroom teachers are well suited. The panel recommends that programs evaluate how classroom teachers can be useful to their programs and hire them when appropriate to meet program goals. For example, teachers can serve as OST coordinators, particularly for summer programs in which teachers might not face conflicting demands on their time from their regular teaching schedules.52 When funding is available to hire effective teachers from the school to serve as OST instructors, these teachers can use their experience and knowledge of instructional methods to maximize academic gains for participating students.53 Finally, teachers can use their experience to advise and mentor less-experienced OST instructors or volunteers, especially when budgets are tight or sufficient numbers of experienced teachers are not available.54

Although little is known about the methods or characteristics that define effective teachers, researchers have discovered that some teachers are much better than others at helping students achieve significant achievement gains during the school day.55 For direct instruction or supervisory roles, the panel recommends hiring classroom teachers who demonstrate success during the school day, and the school can support these efforts. To identify effective teachers to employ as the OST coordinator or as an OST instructor, OST programs can seek out award-winning teachers or work with administrators to identify effective teachers.56

Potential roadblocks and solutions

Roadblock 1.1. The principal does not have time to coordinate with OST staff.

Suggested Approach. OST programs support the school by providing additional academic assistance to students. To maximize their effectiveness, the panel suggests that the principal designate someone at the school to be responsible for communications regarding OST programming. The OST program can help by clearly communicating the benefits of collaboration to the principal.

Roadblock 1.2. The OST program does not have enough money to hire a program coordinator, or the school cannot afford to hire a school-based coordinator.

Suggested Approach. Depending on size and scope of the program, a full-time program coordinator might not be necessary. The panel believes that programs can be successful with a part-time coordinator, particularly when the program is small or works with only a few students. What is important is that both the program and the school designate a person who is responsible for managing the coordination between program and school staff and that this person's role in maintaining communication is clearly established. Another possibility is for a volunteer to take on the coordinator role.

Roadblock 1.3. It is hard to get high-quality teachers to work after school because of their commitments and responsibilities during the school day.

Suggested Approach. OST programs can consider devising flexible staff schedules that allow teachers to work for shorter periods (e.g., tutoring small groups of students for one hour). Programs also can consider involving busy but qualified teachers in important support roles, such as coaching less-experienced instructors or providing instructional training.

Roadblock 1.4. The OST instructor has some concerns about aligning instruction with the school day, or the classroom teacher has some concerns about the OST instruction.

Suggested Approach. Concerns about alignment may signal a need for greater communication between the two programs. Use the OST coordinator and the school-based coordinator to communicate concerns and ensure that students receive the instruction that will benefit them the most. This may mean sharing information about a particular student's progress, for example.

Roadblock 1.5. Privacy concerns prohibit the transfer of data from school to OST program.

Suggested Approach. OST programs can establish a secure system for transferring data that meets the security concerns of the school and can ensure that only a limited number of staff members have access to the data. Programs should get formal written consent from parents and students for their data to be released. OST staff also can schedule meetings with teachers and staff to gather informal information on student performance in school.

52. In 21st CCLC programs, about 67 percent of coordinators had experience as classroom teachers, and 34 percent were currently school-day teachers (U.S. Department of Education 2003, p. 36).
53. In Chicago Summer Bridge (Jacob and Lefgren 2004), 21st CCLC (U.S. Department of Education 2003), and Enhanced Academic Instruction (Black et al. 2008), for example, schoolteachers frequently served as OST instructors.
54. In KindergARTen (Borman, Goertz, and Dowling 2008), Teach Baltimore (Borman and Dowling 2006), NDP (Udell 2003), and Howard Street Tutoring (Morris, Shaw, and Perney 1990), schoolteachers served as advisors to less-experienced instructors or volunteers.
55. See, for example, Hanushek (1992).
56. Roderick, Jacob, and Bryk (2004). Researchers have demonstrated that principals are good at identifying their best and worst teachers, but they are not as good at distinguishing among those who fall in between (Jacob and Lefgren 2008).
Recommendation 2. Level of evidence: Low
Maximize student The panel judged the level of evidence
participation and supporting this recommendation to be low,
attendance because there is no conclusive evidence
that following the action steps in this rec-
To attract and retain participants, OST programs should determine which factors prevent students from participating in the program and work with schools and parents to ensure that the program is addressing those factors. Parents are critical to this process because they are co-decisionmakers about students' participation in OST programs, and children generally value their parents' judgment about which programs may be beneficial to them.57 Important factors include location, transportation, timing, length, program offerings, and frequency of services. Researchers have found that student participation is affected by issues of access and convenience, as well as by the adequacy and attractiveness of the services and features provided in the program.58

The challenge for OST program organizers is to design program features to minimize the barriers to participation, especially for the students most in need of program services and most likely to benefit from them. OST programs also can increase awareness and acceptance by promoting the program among school staff and families. Greater awareness is likely to facilitate communication with schools and parents regarding students who struggle to attend and could help in actively persuading students to participate in program activities.

…recommendation will lead to higher attendance or increased academic achievement. Given the voluntary nature of most OST programs and other barriers discussed in this recommendation, regular attendance appears to be a difficult goal for many programs to reach. Some programs have devoted considerable resources and seem to have made efforts to implement the action steps described in this recommendation and still have trouble getting students to attend regularly (see Table D2). However, given that attendance is a precursor to an OST program's promoting student learning, the panel believes it is particularly important for programs to enhance their efforts to get students in the door.

Brief summary of evidence to support the recommendation

The importance of emphasizing participation has been pointed out by many experts in OST.59 Although it seems logical that students need to attend to receive the benefits of a program, there is no rigorous evidence demonstrating that the steps recommended here will lead to increased participation, and limited evidence that academic achievement is increased through more exposure to OST programs. A meta-analysis of 53 OST programs by Lauer et al. (2004) found larger effect sizes in both math and reading for programs that consisted of at least 45 hours of programming.60 The panel believes that if a program is aligned academically with the

57. Duffett et al. (2004).
58. Ibid.
59. Cooper et al. (2000); Granger and Kane (2004); Lauver, Little, and Weiss (2004).
60. The meta-analysis also found that, on average, programs with very high durations (more than 100 hours for math and 210 hours for reading) did not have effects significantly different from zero.
2. Maximize student participation and attendance

school day (recommendation 1), adapts instruction to individuals and groups (recommendation 3), and provides engaging learning experiences (recommendation 4), greater exposure to that program will yield higher academic achievement.

Since a student's actual program attendance, as a percentage of hours of programming offered, may be correlated with unobservable characteristics such as motivation or family circumstances, it is not advisable to draw causal conclusions from most studies on the relationship between attendance and outcomes. Four evaluations met WWC standards with or without reservations for their impact studies and also looked at the possible relation between level of program attendance and academic achievement.61 Only one found a positive correlation between higher attendance and greater program effects.62

The other evidence for this recommendation is less direct. Fourteen programs used practices similar to the action steps recommended by the panel (such as using teachers to recruit students, locating within schools, offering snacks, and including enrichment activities) and also had evaluations that met WWC standards with or without reservations.63 Of the 14 programs, 6 reported some information on attendance rates.64 Even when programs report attendance, the panel cannot isolate which, if any, components of the program affected attendance, forcing the panel to use its judgment regarding which practices contributed to increased attendance and improved academic outcomes. In terms of overall academic effects, 7 of the 14 showed positive effects,65 2 showed mixed effects,66 and 5 others showed no effects.67 Despite the lack of consistent evidence linking the panel's suggestions to increased academic achievement, the panel believes these recommendations, faithfully implemented and taking into consideration the unique constraints and student populations of each program, can increase student attendance and, therefore, contribute to achievement gains.

Only one study provided direct evidence on increasing attendance. Black et al. (2008) randomly assigned students to either a less-structured, business-as-usual after-school program or an enhanced math or reading program that included monitoring and incentive systems to increase attendance. Students in the enhanced program attended significantly more days than did control students.

61. Teach Baltimore—Borman and Dowling (2006); Early Risers—August et al. (2001); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); 21st CCLC—U.S. Department of Education (2003).
62. Teach Baltimore—Borman and Dowling (2006).
63. CHP—Langberg et al. (2006); Fast ForWord—Slattery (2003); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Start Making a Reader Today (SMART)—Baker, Gersten, and Keating (2000); Summer Reading Day Camp—Schacter and Jo (2005); KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003).
64. Early Risers—August et al. (2001); KindergARTen—Borman, Goetz, and Dowling (2008); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); 21st CCLC—U.S. Department of Education (2003).
65. CHP—Langberg et al. (2006); Fast ForWord—Slattery (2003); Howard Street Tutoring—Morris, Shaw, and Perney (1990); SMART—Baker, Gersten, and Keating (2000); Summer Reading Day Camp—Schacter and Jo (2005); KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001).
66. Teach Baltimore—Borman and Dowling (2006); Enhanced Academic Instruction—Black et al. (2008).
67. L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003).
How to carry out this recommendation

1. Design program features to meet the needs and preferences of students and parents.

The panel recommends that the OST program gather information about parent preferences with a survey or seek the advice of school staff in identifying the needs of parents and students. The survey could be distributed to parents through the school and could ask a short series of questions, such as

• Do you prefer an after-school/summer program that is located at your child's school or one that is located at a community center?

• Would you be able to transport your child from school to an after-school program?

• What times would you like the program to start and end?

• Which academic subjects does your child need additional support in?

• How many days per week would you send your child to an after-school/summer program?

The program could consider a similar short survey to gauge the preferences of students, especially when serving the higher grade levels in which students are more likely to choose a program on their own.

The responses should guide how the program organizes and provides its services. This includes working with schools and districts to ensure that design features make the program accessible. For example, the OST program should consider a location that is well situated and easy to get to, whether it is a school, community center, or place of worship.68 Parents often prefer the use of school facilities for services, which eliminates the need to move from school to another location after school.69 If the program is not located at the school, or if the program is serving students from multiple schools, schools and districts should ensure that transportation to and from the program is readily available and affordable (or provided at no cost), and that adult supervision is provided while transporting students.70 The panel believes that OST programs also should try to operate during hours that are convenient for families, particularly for working parents.

Program features should reflect the content that students and parents want, both in academic and nonacademic areas. Parents are likely to be looking for academic or homework help that reflects the areas in which their children need additional help.71 Since academically oriented OST programs may be competing with other recreational activities in the same time period,72 programs should offer enrichment and recreational activities in addition to academic instruction. These activities can include theater, music, arts and crafts, sports, board games, fitness, and martial arts, but they should reflect student interests.73 To satisfy students' nutritional needs, after-school programs should incorporate a snack time into the schedule, whereas summer programs

68. August et al. (2001); Langberg et al. (2006); Chaplin and Capizzano (2006); Reisner et al. (2004).
69. U.S. Department of Education (2009).
70. August et al. (2001); Langberg et al. (2006); Morris, Shaw, and Perney (1990); Center for Applied Linguistics (1994).
71. Duffett et al. (2004).
72. Carver and Iruka (2006); Brown (2002); Cooper et al. (2000).
73. Bissell et al. (2002); Schacter and Jo (2005); August et al. (2001); Langberg et al. (2006); Borman and Dowling (2006); Goldschmidt, Huang, and Chinen (2007).
should consider including lunch or another appropriate meal.74

2. Promote awareness of the OST program within schools and to parents.

It is the opinion of the panel that OST programs, whether organized by schools, districts, or private providers, should consistently inform parents, teachers, and other school staff about programs and their benefits. Programs can use various methods of promotion, including websites, flyers distributed at parent meetings, notices on school bulletin boards or in school newsletters, and word of mouth. Information should include program location, hours of operation, and contact numbers so that parents can raise questions or concerns, and the information should be available in multiple languages when appropriate.

Schools can join OST providers in promoting participation in the programs. For example, teachers and administrators can identify and recruit students who might benefit from OST program services. Teachers can provide referrals or informational materials during parent-teacher meetings or give the program a list of students in need of academic assistance.75 Teachers or school administrators also can remind students at the end of the school day about attending the after-school program and, if needed, escort students directly to the program.

3. Use attendance data to identify students facing difficulties in attending the program.

Program coordinators should systematically collect OST program attendance data and use the data to identify students with recurring absences or low attendance. OST staff can follow up with school staff to see if the problem extends to the school day. OST staff also could coordinate with school staff to contact parents to determine the reason for the absences.76 Programs can consider using reward incentives, positive reinforcement, or special privileges to encourage regular attendance. For example, incentives can be in the form of monthly prizes or a point system that rewards students with points for good attendance or behavior that can be redeemed for special benefits such as field trips, school supplies, small prizes, or books.77

Potential roadblocks and solutions

Roadblock 2.1. As students get older, additional options for after-school or summer recreational activities become available and increase the competition for students' after-school time.78

Suggested Approach. An OST coordinator should be aware of other extracurricular activities when making scheduling or timing decisions so that it is easier for students to participate in both academic and recreational programs and, thereby, minimize competition. OST programs also can make participation in sports practice or other activities a privilege contingent on attendance in the academic portion of the OST program.79

Roadblock 2.2. Students may have unavoidable circumstances that prevent them from attending OST programs, such as taking care of a sibling after school.

74. Bissell et al. (2002); Schacter and Jo (2005); Morris, Shaw, and Perney (1990); Borman and Dowling (2006).
75. Langberg et al. (2006); Goldschmidt, Huang, and Chinen (2007); Baker, Gersten, and Keating (2000); McKinney (1995).
76. August et al. (2001); Black et al. (2008); Center for Applied Linguistics (1994).
77. Black et al. (2008); Udell (2003); U.S. Department of Education (2005); Langberg et al. (2006); Brown (2002).
78. Studies have found a drop-off in student participation after 3rd grade (Grossman et al. 2002) and also between elementary and middle school (U.S. Department of Education 2003; Grossman et al. 2002; Reisner et al. 2002).
79. U.S. Department of Education (2003).
Suggested Approach. If a large portion of an OST program's target group is facing these types of barriers, the program should consider devising options for students who cannot be onsite. Programs might consider providing tutoring for these students in the home or implementing a system that can provide the instruction in an online or electronic format.

Roadblock 2.3. The number of slots in OST programs is limited.

Suggested Approach. If the demand for OST programming is greater than the capacity of the program, organizers should consider ways to expand their capacity or partner with other programs that have available space. Organizers may need to seek additional funding sources, such as foundations or other federal grant programs, for which they might consider hiring an external evaluator to demonstrate the value of the program model.

Roadblock 2.4. Communicating with families involved in the program may be difficult.

Suggested Approach. Schools and OST providers should use multiple methods to communicate with parents. Some parents may move or change phone numbers frequently, which can make phone and mail communication difficult. Work with the school to provide information through school staff or teachers via flyers sent home with students. Recruitment efforts should extend beyond the school site to locations that families frequent. These may include grocery stores, laundromats, and community and faith-based centers. Schools should consider the common languages spoken in the area and provide translated materials or hire bilingual staff.
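For programs that keep attendance records electronically, the flagging step described in action step 3 of this recommendation (identifying students with recurring absences or low attendance) can be automated. The sketch below is purely illustrative: the record format, the function name, the 60 percent attendance threshold, and the three-consecutive-absence rule are assumptions chosen for the example, not values taken from this guide.

```python
# Illustrative sketch: flag students whose OST attendance record
# suggests they need follow-up. The thresholds below (60% overall
# attendance, 3 consecutive absences) are hypothetical defaults.

def flag_for_followup(records, min_rate=0.6, max_consecutive_absences=3):
    """records maps student -> list of True/False (attended each session)."""
    flagged = {}
    for student, sessions in records.items():
        if not sessions:
            continue
        rate = sum(sessions) / len(sessions)
        # Find the longest run of consecutive absences.
        run = longest = 0
        for attended in sessions:
            run = 0 if attended else run + 1
            longest = max(longest, run)
        if rate < min_rate or longest >= max_consecutive_absences:
            flagged[student] = {"rate": round(rate, 2),
                                "longest_absence_run": longest}
    return flagged

records = {
    "A": [True, True, False, True, True],           # attends regularly
    "B": [True, False, False, False, True, True],   # 3 straight absences
    "C": [False, True, False, False, True, False],  # low overall rate
}
print(flag_for_followup(records))
```

A program would tune the thresholds to its own schedule, and the flags are only a starting point for the follow-up the guide describes (checking with school staff and contacting parents), not a substitute for it.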
Recommendation 3. Adapt instruction to individual and small group needs

OST is an opportunity to supplement learning from the school day and to provide targeted assistance to students whose needs extend beyond what they can receive in the classroom. Since OST programs are shorter than the school day, instruction must be focused and targeted. The panel believes that closely aligning the content and pacing of instruction with student needs will result in better student performance.80 Determining the right level of difficulty and pace and the most appropriate skills to teach is critical to effectively individualizing instruction, but challenging in practice. To provide targeted help to a student, instructors need to assess and document students' academic progress. Based on this assessment, students should be provided with instruction that accommodates their level of development and rate of learning.81 The same workbook or activity could be frustrating for some students and not challenging enough to be educational for others. If instructors are unfamiliar with ways to incorporate assessment or individualization into instructional time, they should be provided with the tools and support that will maximize their effectiveness.

Level of evidence: Moderate

The panel judged the level of evidence supporting this recommendation to be moderate. Learning environments that are adaptive to individual and small group needs are widely believed to be effective in fostering achievement.82 Within the context of OST, the literature is not definitive, and, in general, positive effects cannot be directly attributed to the use of the strategies outlined in this recommendation. However, looking more closely at the actual implementation of practices related to this recommendation, there is a pattern of more positive academic effects associated with programs that more closely correspond to this recommendation. Therefore, the panel believes that OST programs can be more successful if they attempt to understand the academic needs of the students they serve and adapt their programs to those needs.

Brief summary of evidence to support the recommendation

Of the 15 programs related to this recommendation with evaluations that met WWC standards with or without reservations,83 5 were judged to be in close correspondence with more than one aspect of this recommendation.84 Four of these were found to have positive effects on academic

80. Slavin (2006); Bloom (1984).
81. Ibid.
82. See Slavin (2006) for a review of research on individualized instruction in general and Lauer et al. (2004) for reviews of OST programs in particular.
83. KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Enhanced Academic Instruction—Black et al. (2008); Chicago Summer Bridge—Jacob and Lefgren (2004); Leap Frog—McKinney (1995); NDP—Udell (2003); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy et al. (2005); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
84. Early Risers—August et al. (2001); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006); Enhanced Academic Instruction—Black et al. (2008).
achievement,85 and one had mixed but potentially encouraging effects.86 Of the remaining 10 programs,87 6 still showed positive or mixed effects on academics.88

Of the four programs with lower levels of relevance and without detectable effects on academic achievement, two deserve special mention because they are the two major sources of federal funding for academically focused OST programs: 21st CCLC and SES, which are mandated as part of NCLB.89 The national study of 21st CCLC programs found no positive academic effects for either elementary or middle school students.90 However, implementation varied widely: although some programs reported tutoring in small groups with fewer than 10 students, most programs did not provide direct or adaptive instruction that was geared to the individual needs of all their students. Although SES is a newer program relative to 21st CCLC and less often studied, it appears to be implemented typically as one-on-one or small group tutoring (again implementation varies widely), and the results from the states that have attempted to evaluate the effects of SES, as mandated by law, do not show significant impacts on state assessments.91

In summary, the evidence demonstrates positive effects associated with a total of eight programs that adapted instruction to individual and small groups to some degree92 and mixed effects in three other programs;93 however, because adapting instruction always was a component of a multicomponent intervention and because adapting instruction did not consistently demonstrate significant positive effects across every study reviewed,94 the panel acknowledges that the level of evidence is moderate.

How to carry out this recommendation

1. Use formal and informal assessment data to inform academic instruction.

OST programs should utilize the results of assessments administered to students during the school day—combined with input

85. Howard Street Tutoring—Morris, Shaw, and Perney (1990); Early Risers—August et al. (2001); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006).
86. Enhanced Academic Instruction—Black et al. (2008).
87. KindergARTen—Borman, Goetz, and Dowling (2008); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Leap Frog—McKinney (1995); NDP—Udell (2003); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy et al. (2005); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
88. KindergARTen—Borman, Goetz, and Dowling (2008); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004).
89. Zimmer et al. (2007).
90. U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy, Dynarski, and Deke (2007).
91. Tennessee—Ross et al. (2008); Kentucky—Muñoz, Potter, and Ross (2008); Virginia—McKay et al. (2008).
92. KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006).
93. Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004).
94. Leap Frog—McKinney (1995); NDP—Udell (2003); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy, Dynarski, and Deke (2007); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
from classroom teachers—to individualize instruction (see recommendation 1 for how to establish relationships with school staff). General assessment can measure a student's content knowledge, appropriate difficulty level, mastery of a topic, or skills that require emphasis during instruction. The information gathered from assessments should be used to adapt the content, pace, and approach in instructing the student, whether in a one-on-one setting or in small groups.95

If additional information about student progress is needed, OST instructors should incorporate formal and informal assessments into tutoring and homework assistance time.96 The types of tools instructors can use to assess students' abilities and needs vary widely and should be determined based on the OST program's goals, the students involved, or other unique experiences. For example:

• When students enter a program for the first time, a test can be administered to measure their baseline abilities.97

• During a lesson, instructors can use basic techniques such as effective questioning98 and observation to gauge a student's comfort with the material. For example, reading instructors can use a running record to evaluate reading behavior and keep track of mistakes that a student makes while reading.99 The record can be used to identify trouble spots and monitor student progress over time.

• Pre- and post-lesson exercises can provide useful information on progress and can point the instructor to the areas in which additional support is required.100

These are just some examples of common assessment techniques; instructors should choose the tools they are comfortable with and those that gather the information they can use to adapt instruction in the most efficient way.

2. Use one-on-one tutoring if possible; otherwise, break students into small groups.

Ideally, OST programs should use one-on-one tutoring to provide academic instruction to students.101 The panel believes that a one-to-one ratio enables the most individualized attention for students and facilitates the continuous assessment of student progress and academic needs.102 If resources are limited and do not allow for one-on-one tutoring, the panel recommends that students be broken into small groups of roughly three to nine

95. Black et al. (2008); Morris, Shaw, and Perney (1990); Meier and Invernizzi (2001); Borman and Dowling (2006); Borman, Goetz, and Dowling (2008); Courtney et al. (2008); Johnson and Johnson (1999).
96. August et al. (2001); Black et al. (2008); Udell (2003); Courtney et al. (2008).
97. New York City Board of Education (1991); Courtney et al. (2008).
98. Effective questioning is the process of framing questions in a way that will help the teacher evaluate the student's learning process while deepening the student's understanding of a concept.
99. In a running record, the instructor has a copy of the book the student is reading and follows along as the student reads. When the student makes a mistake, the instructor makes a note in his or her own copy. In subsequent readings of the same book, the instructor can identify which mistakes are repeated and which ones the student has learned to avoid (Iverson and Tunmer 1993).
100. Black et al. (2008); Courtney et al. (2008).
101. Morris, Shaw, and Perney (1990); Meier and Invernizzi (2001); McKinney (1995); Ross et al. (2008); Baker, Gersten, and Keating (2000). A meta-analysis of OST strategies found that one-on-one reading tutoring programs had larger average effect sizes than did reading programs that taught in small or large groups (Lauer et al. 2004).
102. Slavin (2006).
students,103 at least when there is the opportunity for students to work independently.104 In addition to giving the students an opportunity to learn teamwork skills and enhance their relationships with other students (see recommendation 4), dividing the class into smaller groups allows students to work at their own pace.105 Students who are more advanced might not feel frustrated by having to wait for other students to catch up, and students who struggle with the material would not feel the pressure of holding up the class.106 Students can be grouped based on their skill level for that topic (indicated by assessment data) or by grade.107 Instructors also can create heterogeneous groups, in which students assist their peers in a cooperative learning format.108

3. Provide professional development and ongoing instructional support to all instructors.

The organizer of the OST program (i.e., the OST coordinator or the program director) should have primary responsibility for training instructors to implement the program properly. However, schools and districts already may have high-quality training and professional development resources. In this case, it may be to the school's benefit to be involved in the training of OST instructors, many of whom may be their own classroom teachers or paraprofessional staff, to ensure quality instruction and promote alignment with the school curriculum. The panel recommends that schools discuss training options with the OST program and consider involving OST program instructors in training and professional development courses at the school and district levels.

The level and intensity of OST instructor training should not overburden instructors with unnecessary training but should ensure fidelity to the program. Further, the intensity of OST instructor training will depend on the background and experience of the instructors, including whether they are experienced and credentialed teachers, graduate students in education, or volunteers with minimal teaching experience.109 To facilitate targeted training, OST programs should vary the intensity of training by dividing teachers into separate tracks based on their prior experience.110 Inexperienced instructors should be observed and coached in the initial stages of teaching to monitor quality and to identify the need for additional training.111 OST programs can use experienced teachers as trainers and to serve as resources for less experienced instructors.112 Training sessions should at least cover using assessment data in instruction, individualizing instruction and lesson planning, aligning programming with the goals of the

103. Black et al. (2008), Langberg et al. (2006), and U.S. Department of Education (2003) provide examples of small group sizes.
104. Although the literature does not suggest conclusively that there are large differences in outcomes between one-on-one tutoring and small group instruction in out-of-school time, there is more evidence for the effectiveness of one-on-one tutoring in raising student achievement (Slavin 2006; Lauer et al. 2004; Bloom 1984).
105. Schacter and Jo (2005); Black et al. (2008); Langberg et al. (2006); Udell (2003).
106. In a survey of students in the Chicago Summer Bridge program (Stone et al. 2005), researchers reported that more advanced students were frustrated by the slow pace of the whole class and did not think that they were learning what they needed. Other students appreciated that the instructor would wait for them to understand the material before moving the class on to another topic.
107. Schacter and Jo (2005); Black et al. (2008); Langberg et al. (2006); U.S. Department of Education (2003); Brown (2002).
108. August et al. (2001); Johnson and Johnson (1999).
109. See recommendation 1 for a discussion of hiring staff for OST programs.
110. Arbreton et al. (2008).
111. Morris, Shaw, and Perney (1990); Udell (2003); Sheldon and Hopkins (2008); Wasik and Slavin (1993); Tucker (1995).
112. Udell (2003).
school, and using monitoring and evaluation procedures.113

OST instructors should be given manuals to be used as an ongoing and useful reference.114 All instructors also should receive ongoing support and professional development that is tailored to the needs of the instructors and their students or that targets areas in which instructors are weak or need additional guidance.115

Potential roadblocks and solutions

Roadblock 3.1. The program cannot afford one-on-one or small group tutoring.

Suggested Approach. If not enough instructors are available to give individual attention to each student, consider incorporating volunteers or pooling instructors with other OST providers. Volunteers can help lower the student-staff ratio and can be used strategically to maximize the use of experienced teachers. High school students or other volunteers can be recruited to lead groups of students in recreational or athletic activities while experienced instructors teach smaller groups or tutor individual students. The program also could consider partnering with another provider that has sufficient staff resources and alternate staff schedules or combine programs.

Roadblock 3.2. OST instructors are not familiar with their students because they do not teach them during the school day.

Suggested Approach. OST instructors have less time to get to know their students, especially academically, but the students' classroom teachers have much of the information they may need. Instructors should take advantage of relationships with the school (see recommendation 1) to communicate with the classroom teachers about students' academic and social needs. OST instructors also may have more opportunities to communicate with parents when they pick up their children and should take advantage of that time to become more familiar with their students.

Roadblock 3.3. Useful assessment data are not currently available to individualize instruction.

Suggested Approach. If the student assessment data available are not adequate for evaluating the needs of the student, the OST program should administer its own formal and/or informal assessments. The program or instructor should choose assessments that will provide the most information on the students' achievement level with the least burden on the students or instructors.

113. Black et al. (2008); Udell (2003); Chaplin and Capizzano (2006); Arbreton et al. (2008); Courtney et al. (2008); Wasik and Slavin (1993).
114. August et al. (2001); Black et al. (2008); Langberg et al. (2006); Udell (2003); Chaplin and Capizzano (2006); Courtney et al. (2008).
115. August et al. (2001); Black et al. (2008); Sheldon and Hopkins (2008).
Recommendation 4. Provide engaging learning experiences

The panel recommends that OST activities be interactive, hands on, learner directed, and related to the real world, while remaining grounded in academic learning goals.116 High-quality instruction is important, but producing achievement gains in OST is particularly challenging. Both students and teachers suffer from fatigue after a long school day or year. OST programs are typically voluntary and must compete with nonacademically oriented activities to attract students and effect learning gains. The panel believes that instructors must be particularly engaging to overcome student fatigue and distractions from nonacademically oriented activities. Although all of the practices outlined in this recommendation may not be relevant to all programs, the panel believes that OST programs have unique flexibility to provide engaging opportunities for student learning and that these recommendations can be useful in both one-on-one and group settings.

Student engagement in school and classroom instruction is correlated with improved academic outcomes,117 and disengagement is correlated with poor academic performance.118 Student choice, cooperative learning experiences, and hands-on and real-world activities, as well as supportive relationships between staff and students, have been linked to student engagement, persistence with learning activities, and connection to the school.119 Evidence suggests that many of the activities discussed in this recommendation (e.g., games, recreation, or field trips) are ineffective when they occur independently of the academic component of the program.120 However, the panel believes that by making the connection between engaging activities and academic learning explicit, OST programs can produce greater academic achievement gains.

Level of evidence: Low

The panel judged the level of evidence for this recommendation to be low. Studies of the types of activities covered in this recommendation have demonstrated that they are effective in laboratory and school-day settings. However, the evidence for whether these practices are effective is mixed in the OST context. Although many programs identified making programming engaging as a key program goal, very few demonstrated consistently positive effects, and none linked positive effects directly to the use of the strategies outlined in this recommendation.

116. Capizzano et al. (2007); Arbreton et al. (2008); Borman and Dowling (2006).
117. Connell, Spencer, and Aber (1994); Marks (2000); Wellborn and Connell (1990); Connell and Wellborn (1991).
118. Finn, Pannozzo, and Voelkl (1995); Finn and Rock (1997).
119. See, for example, Newmann (1991); Helme and Clark (2001); Blumenfeld and Meece (1988); Battistich et al. (1997); Klem and Connell (2004); Turner (1995); Perry (1998); Skinner and Belmont (1993); Connell and Wellborn (1991). This conclusion is drawn from the information in Fredricks, Blumenfeld, and Paris (2004), which reviews the available research on engagement. Other relevant sources include Cordova and Lepper (1996); Guthrie et al. (1999); and Guthrie, Wigfield, and VonSecker (2000). Students in Chicago Summer Bridge reported having more motivation when they perceived that their teachers were concerned with their well-being (Stone et al. 2005).
120. U.S. Department of Education (2003).
4. Provide engaging learning experiences

Brief summary of evidence to support the recommendation

Several studies, conducted outside of the OST arena, have examined the effectiveness of different strategies to increase student motivation, engagement, and academic success.121 In the OST context, however, the evidence supporting the use of engaging activities has been mixed. Five programs documented practices highly aligned with those recommended by the panel; these either made a deliberate effort to integrate engaging practices with academic content or intentionally developed relationships between students and OST staff with the objective of engaging students with school and learning.122 Of these, three programs demonstrated positive academic effects,123 and two showed mixed effects.124 Six other programs had practices similar to the panel’s recommendations, but they either did not provide enough descriptive evidence to determine whether they were highly aligned with the panel’s recommendations or contained some evidence that the strategies were used in a way that was inconsistent with the panel’s recommendations.125 One of these showed positive effects,126 one showed mixed effects,127 and four showed no effects.128 Although the evidence is somewhat mixed, on average, those with positive or mixed effects seemed to be more closely aligned with the strategies recommended by the panel.

How to carry out this recommendation

1. Make learning relevant by incorporating practical examples and connecting instruction to student interests and experiences.

The panel recommends providing instruction using tools or materials that students can relate to. OST staff should identify the academic concept being taught and then find practical examples and relevant material to support that learning objective. When possible, programs should consider integrating academic content using an overarching program theme or final project to reinforce different learning activities and make learning more meaningful.129 Because OST programs operate during nontraditional hours (i.e., after school, weekends, or summers), they are positioned to provide different types of activities than do schools. One example is field trips, which can help develop students’ background knowledge and connect the real world to the in-class curriculum.130 OST programs also can invite guest speakers to demonstrate how academic content relates to their career experiences to help students find practical meaning in the academic concepts they learn in school.131 Working or retired practitioners may be interested in serving as guest speakers and/or mentors.132

The panel believes that OST instructional strategies should capitalize on student interests and make students want to engage in the instructional material. OST staff should first develop a clear understanding of students’ interests. Brief conversations with students and/or teachers or quick surveys are simple mechanisms that programs can use to gather information about student interests.133 Instruction can then build off existing student interests and incorporate examples from sports, current events, or other community-specific interests. OST instructors can connect reading materials or concepts introduced in class to students’ everyday life experiences.134 OST programs also can personalize instructional content and materials to student interests and provide students with choices to maximize student learning.135 Instructional strategies can range from the simple (e.g., personalizing reading instruction by beginning with the letters of a student’s name to teach letter-sound identification) to the elaborate (e.g., modifying a course text to make it relevant to students’ reading levels, experiences, and social contexts).

2. Make learning active through opportunities for collaborative learning and hands-on academic activities.

The panel recommends that OST instruction encourage students to think actively about and interact with academic content. One example is collaborative learning: OST programs can encourage interaction among peers by pairing struggling students with more advanced partners to help them grasp difficult concepts.136 OST instructors also can break students into groups to work together to solve a problem or to rotate through learning stations, but effective group exercises can be less formal and as simple as having a group of three students complete a math problem together.137 OST programs also can use role-playing activities to make experiences real and meaningful for students.138

Hands-on activities also can be helpful in reinforcing academic content.139 OST staff, particularly those working with younger students, should provide opportunities for students to use “exploration, creativity, discovery and play.”140 Games, projects, manipulatives, and computers can provide practice and enrichment on content objectives.141 Hands-on science and math projects or exploratory learning activities make academic subjects interesting for students.142 Learning can be active without involving physical activity or elaborate lessons, however. OST programs should not underestimate the engaging nature of a dynamic instructor who uses active questioning and participation to motivate and engage students in direct instruction.143

3. Build adult-student relationships among OST program participants.

Positive and supportive relationships with adults can help students feel connected to the OST program and invested in the academic material they cover.144 Likewise, as instructors get to know their students better, they can pinpoint their interests, relate academic content to their context and interests, and encourage them to have high expectations for their achievement.145 To do this, OST programs can hire staff with backgrounds and interests that complement those of their students and can serve as positive role models and, thus, motivate students toward success.146 OST programs can use relationship-building activities to help staff get to know students and become invested in their outcomes.147 Programs can support relationships between students and staff members by assigning one staff member to a group of students to move with them across OST activities or be with them as much as possible. As staff members spend more time with students, the panel believes they should endeavor to accumulate knowledge about student interests and invest in supportive relationships,148 helping students feel cared for and connected to both the OST program and academic learning more generally.

OST programs also can support the development of other meaningful relationships in students’ lives. For example, mentors can help support relationships between students and their parents or teachers by coordinating and mediating meetings to discuss developmental needs or by creating settings such as field trips in which mentees have the opportunity to interact socially with adults.149

Potential roadblocks and solutions

Roadblock 4.1. The OST staff does not have experience leading cooperative learning activities (or any of the other strategies outlined above).

Suggested Approach. Unless teachers have experience using cooperative learning, implementing these types of activities can be difficult or ineffective. The panel recommends that OST programs hire teachers with these skills. OST programs also can gather best practices from teachers at other sites and during the school day. Finally, OST programs should train staff to use these instructional strategies if they do not already have the appropriate skills.

Roadblock 4.2. The academic content is too specific for one instructor to teach different subjects to the same students.

Suggested Approach. Particularly as students get older, it might not be feasible for the same staff member to instruct the same students across multiple subjects. Another method for developing positive relationships between students and OST instructors is establishing a “home-room” instructor whom students can engage with informally at a particular time during the program to develop a more meaningful and caring relationship.

Roadblock 4.3. It’s too expensive to hire qualified staff to facilitate engaging academic activities.

Suggested Approach. To reduce costs, instead of hiring additional staff for the engaging activities designed to supplement academic content, OST programs can capitalize on existing staff with specialized knowledge and experiences,150 use volunteers, or share enrichment staff with other OST programs.151

Roadblock 4.4. We don’t have the resources to pay for field trips and materials for hands-on activities.

Suggested Approach. Field trips and activities do not have to be elaborate. If resources are not available for field trips, walks outdoors to explore the surrounding neighborhoods or parks or virtual field trips can be viable options.152

121. Cordova and Lepper (1996); Anderson (1998); Guthrie et al. (1999); Guthrie, Wigfield, and VonSecker (2000); Battistich et al. (1997); Klem and Connell (2004); Helme and Clark (2001); Blumenfeld and Meece (1988); Connell and Wellborn (1991).
122. Early Risers—August et al. (2001); Developmental Mentoring—Karcher, Davis, and Powell (2002); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); KindergARTen—Borman, Goetz, and Dowling (2008).
123. Early Risers—August et al. (2001); Developmental Mentoring—Karcher, Davis, and Powell (2002); KindergARTen—Borman, Goetz, and Dowling (2008).
124. Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006).
125. Summer Reading Day Camp—Schacter and Jo (2005); Chicago Summer Bridge—Jacob and Lefgren (2004); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003).
126. Summer Reading Day Camp—Schacter and Jo (2005).
127. Chicago Summer Bridge—Jacob and Lefgren (2004).
128. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003).
129. Center for Applied Linguistics (1994); Reisner et al. (2004); Karcher, Davis, and Powell (2002); Capizzano et al. (2007); Borman, Goetz, and Dowling (2008).
130. Borman and Dowling (2006); Borman, Goetz, and Dowling (2008).
131. Capizzano et al. (2007); Chaplin and Capizzano (2006).
132. Ferreira (2001); Center for Applied Linguistics (1994).
133. Cordova and Lepper (1996).
134. Schacter and Jo (2005); Black et al. (2008); Guthrie et al. (2000).
135. Cordova and Lepper (1996).
136. Stone et al. (2005).
137. Roberts and Nowakowski (2004). Ames (1992) describes the literature on how the learning environment affects student motivation.
138. Durlak and Weissberg (2007); Karcher (2005); James-Burdumy, Dynarski, and Deke (2007).
139. Anderson (1998).
140. Schacter and Jo (2005), p. 160.
141. Black et al. (2008); Cordova and Lepper (1996); Capizzano et al. (2007); Nears (2008); Arbreton et al. (2008); Borman and Dowling (2006).
142. Anderson (1998); Borman and Dowling (2006); Capizzano et al. (2007); Arbreton et al. (2008); Goldschmidt, Huang, and Chinen (2007). Guthrie et al. (1999) found that the Concept-Oriented Reading Instruction (CORI) teaching framework, which includes the use of real-world observation and hands-on activities to personalize reading experiences, increased student motivation to learn and improved text comprehension and conceptual learning.
143. Stone et al. (2005).
144. Klem and Connell (2004); August et al. (2001); Karcher, Davis, and Powell (2002); Stone et al. (2005); Goldschmidt, Huang, and Chinen (2007); Udell (2003).
145. Udell (2003).
146. Center for Applied Linguistics (1994); Carter, Straits, and Hall (2007).
147. Goldschmidt, Huang, and Chinen (2007); Udell (2003).
148. Arbreton et al. (2008).
149. Karcher, Davis, and Powell (2002); Karcher (2005).
150. Borman and Dowling (2006).
151. Bissell et al. (2002); Arbreton et al. (2008).
152. For information on virtual field trips, see Manzo (2009).
Recommendation 5. Assess program performance and use the results to improve the quality of the program

OST program organizers should be aware of two types of performance assessments: (1) formative evaluations determine how a program has progressed in realizing its model of implementation and which program aspects are working well, and (2) summative evaluations determine how effective a program has been in achieving its goals, which usually include the improvement of student outcomes. Both types of evaluations are important in the life cycle of a program and especially instrumental in any program improvement effort.153

For efficient program management, the panel believes that it is important for OST providers to have internal mechanisms to monitor staff performance and collect data related to program implementation. However, the panel believes that schools (or districts154) have a unique responsibility to conduct independent evaluations of program implementation and its impacts on students. The findings from these evaluations can be used to spot problems and develop potential solutions, identify conditions in which the program is most effective, or make comparisons with the performance of other programs. The findings can be especially valuable in making long-term decisions about which strategies and programs should be continued or replicated in other areas.

Level of evidence: Low

The panel judged the level of evidence supporting this recommendation to be low. Although the panel believes that monitoring and improving performance is important to ensure that the program is carrying out its intended objectives and adapting to changing needs and feedback, no direct evidence suggests that monitoring leads to increased academic achievement in OST programs. More research would be needed to isolate whether it is the monitoring itself or other components that lead to positive academic effects in those programs that do some form of monitoring.

Brief summary of evidence to support the recommendation

OST programs could consider using different kinds of assessments. The panel found seven programs relevant to this recommendation with studies that met WWC standards with or without reservations.155 Of these seven, five included elements of fidelity monitoring in order to ensure that program implementation followed the program model,156 one used fidelity monitoring and ongoing external evaluations,157 and one program used surveys of key stakeholders to gauge satisfaction with the program.158 Of the six that used fidelity monitoring, all were considered to be doing so in a way that was largely consistent with the panel’s recommendation, and five had either positive159 or partially positive effects on academic achievement.160 The program that also used external evaluators and the program with stakeholder surveys were both inconclusive with nonsignificant effects.161 The one program that used stakeholder surveys was considered to have a lower level of consistency with this recommendation because it was unclear how it used the surveys to improve the program.

How to carry out this recommendation

1. Develop an evaluation plan.

An evaluation plan should present the evaluation objectives and research questions, as well as details for the data collection and analysis processes. This includes a decision about which type of evaluation is appropriate; formative evaluations are more relevant for new programs, whereas summative evaluations should be undertaken with more established programs162 that have demonstrated successful and consistent implementation for at least three years.163 The plan should contain information regarding the outcomes that will be used in the evaluation, the data that will be collected to measure those outcomes, and how data will be gathered. It also should outline the timeline for carrying out various components of the plan and describe how results will be disseminated and used. Since the process plays a defining role in the evaluation, the panel recommends that the school involve all stakeholders in the development of the plan, including teachers, parents, OST program administrators, and OST staff.

2. Collect program and student performance data.

Program implementation data, student outcome data, and feedback from other stakeholders regarding satisfaction with the program should be gathered.164 Program activity should be monitored as closely as possible; the more detail available about implementation, the easier it will be to identify specific areas for improvement. The panel believes that the school OST coordinator is the natural person to have a lead role in program monitoring, as that person will have the best understanding of both the school’s goals and the program’s operation. Some suggestions for collecting monitoring and outcome data follow:

• Observations of implementation. Schools should observe OST instruction and student management,165 recreational time, and the day-to-day operation of the program (e.g., transportation, parent interaction, collaboration with schoolteachers). Create standardized tools for observers to use in assessing program quality.166 In designing or choosing an observation tool, the school should decide which aspects of the program are important for effective implementation, such as curriculum content, instructional delivery, and staff-student interaction.167 Each component should have its own checklist of items that breaks down the actions that represent good practice.168 Providers also should conduct their own ongoing monitoring to ensure proper implementation.

• Student outcomes. The panel recommends that the school and OST provider share responsibility for collecting student outcome data. The OST provider can track student attendance in the program, student performance on exercises or assessments conducted in the program, and student behavior such as engagement during instructional time or disciplinary incidents.169 The school will have access to course grades and records from the school day, as well as scores on state or district assessments. Recommendation 1 emphasized the program coordinator’s role as a facilitator in communicating between the school and the OST program regarding student academic needs and performance data. If needed, this role can be extended at the school level to recording data for evaluation purposes, including maintaining a central database of information collected from the OST program, district, and school staff regarding each student’s progress.

• Stakeholder satisfaction. The evaluation should provide an opportunity for stakeholders such as the principal, classroom teachers, parents, and students to offer input on how the program is meeting their needs, such as whether the OST curriculum is complementing classroom teaching or whether students are getting enough individual attention from instructors.170 At the basic level, the school could contact parents by phone to have an informal conversation about their child’s experience with the OST program. Another option is to invite a small group of parents or teachers for a focus group discussion about program performance. Mailing surveys is another alternative, since this option makes it possible to reach many respondents in a limited time frame. However, the panel cautions that even the most complex surveys have limitations in terms of biased results and should be interpreted carefully. For example, surveys can be useful for identifying problematic trends in service delivery or areas for improvement, but they should not be interpreted as conclusive evidence of program effectiveness.

3. Analyze the data and use findings for program improvement.

The panel recommends that the school carefully analyze the data it has collected on implementation, student outcomes, and satisfaction and place findings in the context of how to improve the program. The school should look for inconsistencies between what OST providers proposed to do and how the program is actually implemented. They also should identify patterns in the data that suggest problem areas, such as irregular attendance on certain days of the week. Districts can use a larger dataset to look for patterns across schools or across OST providers.

To encourage program growth, the panel recommends that the school share its evaluation results with the OST program. This also will allow suggestions for improvement or strategies to address areas of concern to be discussed collaboratively. For example, if observations of instruction indicate that teachers are not adequately covering content, targeted professional development may be necessary. A formative evaluation that indicates that the program is not improving might suggest moving in a different direction from the current program model or even ending the program.

4. Conduct a summative evaluation.

Once formative evaluations conclude that the program is being implemented as designed, a summative or impact evaluation is appropriate. If three years or more have elapsed, a summative evaluation might still be useful even without consistent implementation. The summative evaluation should be conducted in the most rigorous way possible given the resources available to the school (or district) and the level of cooperation from the OST provider. A rigorous evaluation is one of the best measurements of a program’s success at raising student achievement, and positive results could boost participation in and funding for the program. An external evaluator can lend credibility to the evaluation and its findings.171 A randomized experiment is ideal but not always practical for every OST program. The school should consider the next best evaluation design that will provide evidence of effectiveness.172 For some larger districts, it might be feasible to conduct a district-wide, multischool evaluation, but for others the panel encourages participation in site-specific, independent evaluations. If a qualified evaluator is not available, the school should be cautious about concluding program effectiveness from the data collected for the evaluation. It could at least consider consulting with an experienced evaluator about evaluation plans and methods.

Potential roadblocks and solutions

Roadblock 5.1. Staff time and expertise are not sufficient for collecting and analyzing data.

Suggested Approach. If staff time and resources are already too constrained to apply to evaluation efforts, explore the availability of pro bono help from local colleges or universities or the parent-teacher association. These institutions and organizations may be able to provide observers or data collection and analysis services. Doctoral students might even be interested in conducting a full evaluation for their dissertation.

Roadblock 5.2. The school and the program do not want to duplicate monitoring efforts.

Suggested Approach. Both schools and OST providers need to be simultaneously collecting information on the program. The type of information each collects does not have to be the same, and any overlapping responsibilities can be discussed in the evaluation planning stage. For example, providers may monitor attendance and collect data from their own instructors on assessments administered to students in OST, but schools will want to collect their own achievement data on the same students, including state assessments from the district. Both schools and OST providers should conduct their own observations of instruction in the program.

153. A meta-analysis of summer programs suggests that programs that undergo monitoring of instruction and performance tend to produce larger effects than programs that do not (Cooper et al. 2000).
154. Throughout this recommendation, we refer to the school as the driving force behind the evaluation. However, it also is likely that districts will play this role, possibly because a program has wider coverage than just one school. Districts also may support schools in carrying out certain functions such as data collection or analysis, but that is a decision to be made collaboratively.
155. Early Risers—August et al. (2001); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002).
156. Early Risers—August et al. (2001); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008).
157. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007).
158. YS-CARE—Bissell et al. (2002).
159. Early Risers—August et al. (2001); CHP—Langberg et al. (2006).
160. Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008).
161. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002).
162. This assumes that outcome data are available and the sample size is large enough to have adequate statistical power to measure impacts.
163. Fullan (2001) suggests that school reform interventions should not be expected to demonstrate significant positive change in the first two to three years of implementation.
164. A report by Ross, Potter, and Harmon (2006) provides useful guidance for states in evaluating SES providers, much of which can be applied to schools evaluating OST programs or providers.
165. Black et al. (2008); Stone et al. (2005); Langberg et al. (2006); Borman and Dowling (2006).
166. Sheldon and Hopkins (2008). For more resources on program quality assessment tools, see Yohalem et al. (2009) and publications by the Bureau of Public School Options, Florida Department of Education (2007, n.d.).
167. August et al. (2001).
168. Bureau of Public School Options (2007).
169. August et al. (2001); Borman and Dowling (2006).
170. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002).
171. Goldschmidt, Huang, and Chinen (2007).
172. See the What Works Clearinghouse (WWC) Procedures and Standards Handbook for a guide to quality research methods: http://ies.ed.gov/ncee/wwc/references/idocviewer/doc.aspx?docid=19&tocid=1.
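The pattern analysis described in step 3 of this recommendation, such as spotting irregular attendance on certain days of the week, can be sketched in a few lines of code. The sketch below is illustrative only and is not part of the panel's guidance: the record layout, the function names, and the 75 percent threshold are assumptions chosen for the example.

```python
from collections import defaultdict
from datetime import date

# Hypothetical attendance records: (student_id, session_date, attended).
# A real program would load these from its own attendance system.
records = [
    ("s01", date(2009, 3, 2), True),   # Monday
    ("s02", date(2009, 3, 2), True),
    ("s01", date(2009, 3, 6), False),  # Friday
    ("s02", date(2009, 3, 6), False),
    ("s01", date(2009, 3, 9), True),   # Monday
    ("s02", date(2009, 3, 9), True),
    ("s01", date(2009, 3, 13), True),  # Friday
    ("s02", date(2009, 3, 13), False),
]

WEEKDAYS = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

def attendance_by_weekday(records):
    """Return {weekday name: average attendance rate} for the records given."""
    present = defaultdict(int)
    total = defaultdict(int)
    for _student, day, attended in records:
        name = WEEKDAYS[day.weekday()]
        total[name] += 1
        present[name] += int(attended)
    return {name: present[name] / total[name] for name in total}

def flag_low_days(rates, threshold=0.75):
    """Weekdays whose average attendance falls below the threshold (0.75 is arbitrary)."""
    return sorted(name for name, rate in rates.items() if rate < threshold)

rates = attendance_by_weekday(records)
print(rates)            # in this sample, Mondays average 1.0 and Fridays 0.25
print(flag_low_days(rates))
```

A school or district could adapt the record layout to data exported from its own systems; the same grouping logic extends to comparing patterns across sites or providers, as suggested for district-level analysis.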
Appendix A.
Postscript from the Institute of Education Sciences

What is a practice guide?

The health care professions have embraced a mechanism for assembling and communicating evidence-based advice to practitioners about care for specific clinical conditions. Variously called practice guidelines, treatment protocols, critical pathways, best practice guides, or simply practice guides, these documents are systematically developed recommendations about the course of care for frequently encountered problems, ranging from physical conditions, such as foot ulcers, to psychosocial conditions, such as adolescent development.173

173. Field and Lohr (1990).

Practice guides are similar to the products of typical expert consensus panels in reflecting the views of those serving on the panel and the social decisions that come into play as the positions of individual panel members are forged into statements that all panel members are willing to endorse. Practice guides, however, are generated under three constraints that do not typically apply to consensus panels. The first is that a practice guide consists of a list of discrete recommendations that are actionable. The second is that those recommendations taken together are intended to be a coherent approach to a multifaceted problem. The third, which is most important, is that each recommendation is explicitly connected to the level of evidence supporting it, with the level represented by a grade (e.g., strong, moderate, low).

The levels of evidence, or grades, are usually constructed around the value of particular types of studies for drawing causal conclusions about what works. Thus, one typically finds that a strong level of evidence is drawn from a body of randomized controlled trials, the moderate level from well-designed studies that do not involve randomization, and the low level from the opinions of respected authorities (see Table 1). Levels of evidence also can be constructed around the value of particular types of studies for other goals, such as the reliability and validity of assessments.

Practice guides also can be distinguished from systematic reviews or meta-analyses such as WWC intervention reviews or statistical meta-analyses, which employ statistical methods to summarize the results of studies obtained from a rule-based search of the literature. Authors of practice guides seldom conduct the types of systematic literature searches that are the backbone of a meta-analysis, although they take advantage of such work when it is already published. Instead, authors use their expertise to identify the most important research with respect to their recommendations, augmented by a search of recent publications to ensure that the research citations are up-to-date. Furthermore, the characterization of the quality and direction of the evidence underlying a recommendation in a practice guide relies less on a tight set of rules and statistical algorithms and more on the judgment of the authors than would be the case in a high-quality meta-analysis. Another distinction is that a practice guide, because it aims for a comprehensive and coherent approach, operates with more numerous and more contextualized statements of what works than does a typical meta-analysis.

Thus, practice guides sit somewhere between consensus reports and meta-analyses in the degree to which systematic processes are used for locating relevant research and characterizing its meaning. Practice guides are more like consensus panel reports than meta-analyses in the breadth and complexity of the topic that is addressed. Practice guides are different
from both consensus reports and meta-analyses in providing advice at the level of specific action steps along a pathway that represents a more-or-less coherent and comprehensive approach to a multifaceted problem.

Practice guides in education at the Institute of Education Sciences

IES publishes practice guides in education to bring the best available evidence and expertise to bear on the types of challenges that cannot be addressed currently by single interventions or programs. Although IES has taken advantage of the history of practice guides in health care to provide models of how to proceed in education, education is different from health care in ways that may require that practice guides in education have somewhat different designs. Even within health care, for which practice guides now number in the thousands, there is no single template in use. Rather, one finds descriptions of general design features that permit substantial variation in the realization of practice guides across subspecialties and panels of experts.174 Accordingly, the templates for IES practice guides may vary across practice guides and change over time and with experience.

174. American Psychological Association (2002).

The steps involved in producing an IES-sponsored practice guide are first to select a topic, which is informed by formal surveys of practitioners and requests. Next, a panel chair is recruited who has a national reputation and up-to-date expertise in the topic. Third, the chair, working in collaboration with IES, selects a small number of panelists to co-author the practice guide. These are people the chair believes can work well together and have the requisite expertise to be a convincing source of recommendations. IES recommends that at least one of the panelists be a practitioner with experience relevant to the topic being addressed. The chair and the panelists are provided with a general template for a practice guide along the lines of the information provided in this appendix. They also are provided with examples of practice guides. The practice guide panel works under a short deadline of six to nine months to produce a draft document. The expert panel members interact with and receive feedback from staff at IES during the development of the practice guide, but they understand that they are the authors and, thus, responsible for the final product.

One unique feature of IES-sponsored practice guides is that they are subjected to rigorous external peer review through the same office that is responsible for the independent review of other IES publications. A critical task of the peer reviewers of a practice guide is to determine whether the evidence cited in support of particular recommendations is up-to-date and that studies of similar or better quality that point in a different direction have not been ignored. Peer reviewers also are asked to evaluate whether the evidence grade assigned to particular recommendations by the practice guide authors is appropriate. A practice guide is revised as necessary to meet the concerns of external peer reviewers and to gain the approval of the standards and review staff at IES. The process of external peer review is carried out independent of the office and staff within IES that instigated the practice guide.

Because practice guides depend on the expertise of their authors and their group decisionmaking, the content of a practice guide is not and should not be viewed as a set of recommendations that in every case depends on and flows inevitably from scientific research. It is not only possible but also likely that two teams of recognized experts working independently to produce a practice guide on the same topic would generate products that differ in important respects. Thus, consumers of practice
guides need to understand that they are, in effect, getting the advice of consultants. These consultants should, on average, provide substantially better advice than an individual school district might obtain on its own because the authors are national authorities who have to reach agreement among themselves, justify their recommendations in terms of supporting evidence, and undergo rigorous independent peer review of their product.

Institute of Education Sciences
Appendix B.
About the authors

Panel

Megan Beckett, Ph.D., is a RAND sociologist who has worked on a range of topics related to families and children, aging, and demography. Dr. Beckett is a recognized expert on after-school programs and policies and has co-authored a number of important studies and overviews of the OST field. She has been invited to speak on the subject of after-school care with the Netherlands' minister for social affairs and employment and the Netherlands' secretary general for the Ministry of Social Affairs, the California congressional delegation, the California League of Women Voters, the child policy deputy for Los Angeles County, and Minnesota legislature members. She also has been cited in Education Week and the Los Angeles Times, has published several opinion pieces on after-school programs, and is a member of the IES-sponsored Study of Enhanced Academic Instruction in After-School Programs Technical Working Group.

Geoffrey Borman, Ph.D., is a professor of education at the University of Wisconsin–Madison and a senior researcher with the Consortium for Policy Research in Education. Dr. Borman's main substantive research interests revolve around social stratification and the ways in which educational policies and practices can help address and overcome inequality. His primary methodological interests include the synthesis of research evidence, the design of quasi-experimental and experimental studies of educational innovations, and the specification of school-effects models. He has been appointed as a methodological expert on a number of national research and development projects and was recently named to the 15-member Urban Education Research Task Force established to advise the U.S. Department of Education on issues affecting urban education. Among his various awards and honors, Dr. Borman most recently received the 2008 American Educational Research Association (AERA) Palmer O. Johnson Award and was recognized for his contributions to education research by selection as an AERA Fellow.

Jeffrey Capizzano is vice president of public policy and research at Teaching Strategies, Inc. Mr. Capizzano brings to the panel extensive research and evaluation experience in the areas of OST, youth development, and summer learning programs and is skilled in a variety of quantitative and qualitative data collection and analysis techniques. Mr. Capizzano was involved in an experimental evaluation of an accelerated learning summer program, Building Educated Leaders for Life (BELL), and is currently a member of the BELL 2008–10 Evaluation Advisory Board. Mr. Capizzano also has published a chapter in the Handbook of Applied Developmental Science that investigates the role of federal and state governments in child and family policy. Mr. Capizzano was formerly a research associate in the Urban Institute's Center on Labor, Human Services, and Population.

Danette Parsley, M.A., is the senior director of system and school improvement in Mid-continent Research for Education and Learning's (McREL) Field Services division and leads McREL's fieldwork in system improvement and OST. Ms. Parsley currently oversees several projects focused on OST programs, including a U.S. Department of Education–funded High-Quality Supplemental Educational Services and After-School Partnerships demonstration grant. She has extensive experience providing professional development and technical assistance to OST providers, schools, districts, and state departments of education and in developing products and tools to assist schools and districts engaging in continuous improvement. Ms. Parsley holds an M.A. in educational psychology, research, and evaluation methodology from the
University of Colorado at Denver. Before joining McREL, Ms. Parsley was a classroom teacher and adolescent counselor.

Steven M. Ross, Ph.D., is currently a senior research scientist and professor at the Center for Research and Reform in Education at Johns Hopkins University. Dr. Ross is the author of six textbooks and more than 120 journal articles in the areas of educational technology and instructional design, at-risk learners, educational reform, computer-based instruction, and individualized instruction. He is the editor of the research section of the Educational Technology Research and Development journal and a member of the editorial board for two other professional journals. Dr. Ross recently held the Lillian and Morrie Moss Chair of Excellence in Urban Education at the University of Memphis. He has testified on school restructuring research before the U.S. House of Representatives Subcommittee on Early Childhood, Youth, and Families and is a technical advisor and researcher on current federal and state initiatives regarding the evaluation of technology usage, supplemental educational services (SES), charter schools, Reading First, and the Comprehensive School Reform program.

Allen Schirm, Ph.D., is vice president and director of human services research at Mathematica Policy Research, Inc., and is a nationally recognized expert in program evaluation. In addition to his substantive knowledge of education programs targeting disadvantaged youth, Dr. Schirm has extensive experience in statistical methods and data analysis. He directed the random assignment evaluation of the Quantum Opportunity Program (QOP) demonstration, a multiservice program for high school–aged at-risk students, which included intensive mentoring and after-school academic support. Dr. Schirm also directed the National Evaluation of Upward Bound, a program that provides extensive educational support (including academic instruction, tutoring, and other services) for disadvantaged students preparing for college. He has served on four expert panels convened by the National Academy of Sciences' Committee on National Statistics and is a fellow of the American Statistical Association.

Jessica Taylor, M.S., is program director at the Bureau of School Improvement at the Florida Department of Education (FDOE), where she oversees a U.S. Department of Education pilot program for differentiated accountability in school improvement. Ms. Taylor has worked extensively on the implementation of SES in Florida, which provides services to more than 70,000 students statewide. Ms. Taylor has created several FDOE publications regarding parent outreach and program monitoring and has provided training to more than 270 SES providers. Additional work at FDOE includes overseeing parental involvement, public school choice, and magnet schools. Ms. Taylor holds an M.A. in science education from Florida State University. Prior to joining FDOE, Ms. Taylor worked in various K–12, nonformal science education programs.

Staff

Samina Sattar is a research analyst at Mathematica Policy Research, Inc. She has experience in qualitative and quantitative analysis and has provided data collection and implementation support for multisite evaluations of math curricula, intensive teacher induction programs, and teacher pay-for-performance incentives. She served as a school district liaison to collect student records and achievement data for two studies funded by the U.S. Department of Education. In this role, she participated in discussions with school districts about their data tracking and storage systems and their ability to respond to data requests. Ms. Sattar also has worked with school officials, publishers, and curriculum trainers to monitor program implementation and to coordinate onsite training of more than 1,000 teachers in elementary schools across the country. As part of a project to
provide analytic and technical assistance to the U.S. Department of Education, Institute of Education Sciences, Ms. Sattar contributed to white papers on evaluation methods with the goal of improving the quality of education research.

Virginia Knechtel is a research analyst at Mathematica Policy Research, Inc., and a former special education teacher. She has worked directly with OST programs for middle and high school students. She has researched existing principal-development pipelines in a large school district and developed two new programs to recruit and develop exceptionally qualified principal candidates to the district. Ms. Knechtel has experience working with quantitative data for the evaluation of a teacher incentive program funded by the U.S. Department of Education. She has worked with several teams to recruit districts and schools to participate in a U.S. Department of Education study of select alternative routes to teacher certification.

Elizabeth Potamites, Ph.D., received her doctoral degree in economics from New York University and is a researcher at Mathematica Policy Research, Inc. Her thesis was a quantitative exploration of why Black women work more than equivalent White women using data from the U.S. Census and the National Longitudinal Study of Mature Women. During her time at Mathematica, Dr. Potamites has had the opportunity to work on a variety of topics, including developing value-added models for ranking public schools as part of the teacher incentive program funded by the U.S. Department of Education. She is a WWC certified reviewer.
Appendix C.
Disclosure of potential conflicts of interest

Practice guide panels are composed of individuals who are nationally recognized experts on the topics about which they are rendering recommendations. IES expects that such experts will be involved professionally in a variety of matters that relate to their work as a panel. Panel members are asked to disclose their professional involvements and to institute deliberative processes that encourage critical examination of the views of panel members as they relate to the content of the practice guide. The potential influence of panel members' professional engagements is further muted by the requirement that they ground their recommendations in evidence that is documented in the practice guide. In addition, the practice guide undergoes independent external peer review prior to publication, with particular focus on whether the evidence related to the recommendations in the practice guide has been appropriately presented.

There were no professional engagements or commitments reported by the panel members that were identified as potential conflicts of interest.
Appendix D.
Technical information on the studies

A search for research on out-of-school time (OST) programs in the United States from 1988 to 2008 resulted in more than 1,000 studies. Of these, 130 studies examined academically focused school-based OST programs that serve elementary and middle school students; these studies were reviewed according to What Works Clearinghouse (WWC) standards because they had a comparison group and were deemed more likely to meet standards. Studies that included evidence of initial similarity between their treatment group and comparison group, or that attempted to account for the possible bias introduced by self-selection into the treatment group, were considered more likely to meet standards. Twenty-two studies of 18 programs met WWC evidence standards with or without reservations.

Table D1 contains descriptive information about each of these programs, and Table D2 summarizes their relevance to each recommendation. Although several studies contained practices that were similar to those recommended by the panel, in many cases, the studies did not describe them in sufficient detail to determine the degree to which the practices corresponded to those recommended by the panel. In other cases, there was evidence in the studies that the practices, although similar, were much less aligned with the panel's recommendations or missed key parts. For example, several studies implemented engaging activities but failed to link them to an academic goal.

It was challenging to determine the level of evidence for each recommendation because programs' actual practices coincide with the panel's recommendations to varying degrees, and many large OST programs have not demonstrated success in improving academic outcomes. Studies of some OST programs found positive academic effects; other studies found no effects or mixed effects. Furthermore, OST programs necessarily contain multiple components that are related to parts of the recommendations in this guide, and the effects of a program cannot usually be causally attributed to any particular component of that program. This creates a challenge for the guide, which aims to recommend specific practices rather than programs. To assess the importance of different components and their relevance to each recommendation, the panel reviewed implementation reports of programs with evaluations that met WWC evidence standards (see the Introduction for a discussion of the WWC standards and their relevance to this guide). The level of evidence for each recommendation was determined by considering the number of programs that were related to each recommendation, the degree to which the programs implemented the recommendation, and the programs' impacts on academic achievement.

In almost all cases, the information cited about each program comes from a single study; however, there were two cases in which different studies measured the effectiveness of a particular program (21st Century Community Learning Centers [21st CCLC] and supplemental educational services [SES]), and the panel relied on these to gather additional information on practices and impacts. In three other cases, more qualitative studies were consulted to gather additional descriptive evidence about a program with an impact study that met standards (Los Angeles's Better Educated Students for Tomorrow [L.A.'s BEST], Chicago Summer Bridge, and Developmental Mentoring). Finally, when the panel judged it useful, recommendations were supplemented by evidence from outside the scope of OST or with evidence from studies that did not meet WWC standards.
Table D1. Studies of OST programs that met WWC standards with or without reservations

Each entry lists: program and brief citation; program type; grades studied (a); program length; sample size (analysis); sample characteristics; study design; and academic assessment measure.

Positive Academic Effects

KindergARTen, Borman, Goetz, & Dowling (2008). Summer. Grade K. 6 hrs/day, 5 days/week, 6 weeks. Sample: 98 (b). Urban, low-income, low-performing schools, largely non-White. RCT (c). DIBELS (d), word lists, DRA (e), dictation.

Summer Reading Day Camp, Schacter & Jo (2005). Summer. Grade 1. 9 hrs/day, 5 days/week, 7 weeks. Sample: 118. Urban, low-income, largely non-White. RCT. Gates-MacGinitie (f) and SAT 9 Decoding and Comprehension.

Early Risers, August et al. (2001). Summer and school-year mentoring. Grades K–2. 4 days/week, 6 weeks in summer; once a week during school year, 2 years. Sample: 201. Semi-rural, low to low-middle income, Caucasian. RCT. Academic Competence (g).

Developmental Mentoring, Karcher, Davis, & Powell (2002). Summer and school-year mentoring. Grade 5. 1 Saturday/month, September–May; 8 hrs/day, 6 days/week for 2 weeks in the summer. Sample: 26. Low-performing school, largely non-White. RCT. WRAT-3 (h) Spelling.

Experience Corps/Book Buddies, Meier & Invernizzi (2001). After-school tutoring. Grade 1. 45 min/session, ~8 sessions/month, 5–6 months. Sample: 56. Urban, low-income, at-risk students. RCT. PALS (i).

Start Making a Reader Today (SMART), Baker, Gersten, & Keating (2000). After-school tutoring. Grades 1–2. 30 min/day, 2 days/week, 2 years. Sample: 84. Low-performing schools, at-risk students, largely non-White. RCT. Woodcock Reading Mastery, Expressive One Word Picture Vocabulary, and three researcher-developed measures.

Howard Street Tutoring, Morris, Shaw, & Perney (1990). After-school tutoring. Grades 2–3. 1 hr/day, 2 days/week, October–May. Sample: 60. Urban, low-income, low-achieving. RCT. Word Recognition, Spelling, and Basal Passage Reading.

Fast ForWord, Slattery (2003). After-school computer tutoring. Grades 3–5. 100 min/day, 5 days/week, ~6 weeks. Sample: 60. Urban, low-income. RCT. Yopp-Singer Test of Phoneme Segmentation, Qualitative Reading Inventory II.

Challenging Horizons Program (CHP), Langberg et al. (2006). After school. Grades 6–7. 2 hrs/day, 4 days/week, one semester only. Sample: 48. At-risk with behavior problems, largely non-White. RCT. Teacher ratings of academic progress.

Mixed Academic Effects

Teach Baltimore, Borman & Dowling (2006). Summer. Grades K–2. 6 hrs/day, 5 days/week, 7 weeks/summer for up to 3 summers. Sample: 686. Urban, low-income, largely non-White. RCT/QED (j). CTBS (k) Reading Comprehension and Vocabulary.

Chicago Summer Bridge, Jacob & Lefgren (2004). Summer. Grades 3 and 6. 3 hrs/day, 6 weeks. Sample: ~5,000. Urban, at-risk students, largely non-White. RDD (l). ITBS (m) Math and Reading.

Enhanced Academic Instruction, Black et al. (2008). After school. Grades 2–5. ~3 hrs/week, between 70 and 120 days during first year. Sample: 1,828 Reading; 1,961 Math. Low-income, largely non-White. RCT. SAT 10 Math and Reading (n).

No Detectable Academic Effects

SES, McKay et al. (2008); Ross et al. (2008); Muñoz, Potter, & Ross (2008). After-school tutoring. Grades K–12 (o). Length varies by program. Sample varies by state. Low-performing schools. QED. State achievement tests.

Leap Frog, McKinney (1995). After-school tutoring. Grades 1–2. 2 hrs/day, 2 days/week, 9 months. Sample: 44 Reading; 47 Math. Rural, low-achieving, largely non-White. RCT. SAT 8 Reading and Math.

Nurturing Development Partnerships (NDP), Udell (2003). After-school tutoring. Grade 2. ~1 hr/day, 2 days/week, 10 weeks. Sample: 27. Low-income, at-risk students, largely non-White. QED. Woodcock-Johnson–Revised.

L.A.'s BEST, Goldschmidt, Huang, & Chinen (2007). After school. Grades K–6. 5 days/week. Sample: 5,662. Urban, low-income, high ELL. QED. Normal Curve Equivalent scores of CTBS (p), SAT 9 (q), and CAT 6 (r).

21st CCLC, U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy et al. (2005). After school. Grades K–8. 2.5–3 hrs/day, 4–6 days/week, 9 months. Sample: Elementary, 1,748; Middle, 4,068. Low-performing schools, low-income. RCT/QED (s). Student records, SAT 9.

Youth Services—Child Care, Academic Assistance, Recreation, and Enrichment (YS—Care), Bissell et al. (2002). After school. Grades 1–5. 4 hrs/day. Sample: 660 Reading; 672 Math. Urban, low-income, low-achieving. QED. SAT 9 Reading and Math.
a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
b. The sample size in the KindergARTen evaluation varied by outcome. The size of the treatment group was either 72 or 73 students for four of the five literacy outcomes tested and was 60 students for the Developmental Reading Assessment (DRA). The size of the control group varied from 23 to 27 students across the five outcomes.
c. RCT stands for randomized controlled trial.
d. Center on Teaching & Learning. (n.d.). DIBELS data system. Eugene, OR: University of Oregon. Retrieved May 29, 2009, from https://dibels.uoregon.edu/.
e. Beaver, J., & Carter, M. (n.d.). Developmental reading assessment (2nd ed.). Lebanon, IN: Pearson. Retrieved May 29, 2009, from http://www.pearsonschool.com/.
f. Gates-MacGinitie Reading Tests. (2000). GMRT validity and reliability statistics. Itasca, IL: Riverside Publishing.
g. Academic competence is a composite variable based on the Woodcock-Johnson–Revised and parent and teacher ratings. Woodcock, R. W., & Johnson,
M. B. (1990). Woodcock-Johnson Psychoeducational Battery—Revised: Tests of Achievement. Allen, TX: DML Teaching Resources.
h. Jastak, S., & Wilkinson, G. (1994). Wide Range Achievement Test (3rd ed.). San Antonio, TX: The Psychological Corporation.
i. Curry School of Education. (n.d.). Phonological Awareness Literacy Screening. Charlottesville, VA: University of Virginia. Retrieved May 29, 2009, from http://pals.virginia.edu/.
j. QED stands for quasi-experimental design.
k. McGraw-Hill. (n.d.). Comprehensive Test of Basic Skills. Monterey, CA: MacMillan/McGraw-Hill. Retrieved May 29, 2009, from http://www.ctb.com/.
l. RDD stands for regression discontinuity design. In this case, students who were right above the cutoff for mandatory summer school were com-
pared to students who were right below the cutoff and, therefore, had to attend summer school. Since all tests have some margin of error in judging
students’ academic abilities, students near the cutoff could be very similar in actual abilities. Furthermore, the test score pre–summer school and
other baseline characteristics are controlled for when looking at differences in students the next year.
m. College of Education. (n.d.). Iowa Test of Basic Skills. Iowa City, IA: The University of Iowa. Retrieved May 29, 2009, from http://www.education.uiowa.
edu/itp/itbs/.
n. The Stanford Achievement Test, 10th ed., abbreviated battery for either math or reading was given to students at the beginning and end of the school year.
For 2nd and 3rd graders, the Dynamic Indicators of Basic Early Literacy Skills also was used. Harcourt Assessment. (n.d.). Stanford Achievement Test
Series, Tenth Edition–Abbreviated Battery, Areas of Assessment. San Antonio, TX: Author. Retrieved September 9, 2007, from http://harcourtassessment.
com/HAIWEB/Cultures/en-us/Productdetail.htm?Pid=SAT10A&Mode=summary&Leaf=SAT10A_2; Harcourt Assessment. (2003). Stanford Achievement
Test Series, Tenth Edition–Spring Multilevel Norms Book. San Antonio, TX: Author.; Harcourt Assessment. (2004). Stanford Achievement Test Series, Tenth
Edition–Technical Data Report. San Antonio, TX: Author.
o. Grades included in SES could be any from kindergarten to 12th grade. Typically, grades included in state or district evaluations are those that are
more often tested statewide with tests that are comparable across grade levels, usually 3rd through 8th.
p. McGraw-Hill. (n.d.). Comprehensive Test of Basic Skills. Monterey, CA: MacMillan/McGraw-Hill. Retrieved May 29, 2009, from http://www.ctb.com/.
q. Pearson. (n.d.). The Stanford Achievement Test, 10th ed. San Antonio, TX: Author. Retrieved May 29, 2009, from http://pearsonassess.com.
r. McGraw-Hill. (n.d.). California Achievement Test. Monterey, CA: MacMillan/McGraw-Hill. Retrieved May 29, 2009, from http://www.ctb.com/.
s. Results for elementary students are from an RCT study, whereas the middle school study is a QED.
Table D2. Studies and corresponding recommendations

Each row lists the program and brief citation, program type, grades studied (b), and the relevance of the program to each of the five recommendations (a): 1. Align, 2. Attendance, 3. Individualize, 4. Engage, 5. Assess.
Positive Academic Effects
KindergARTen, Summer K √ √ √√
Borman, Goetz,
& Dowling (2008)
Summer Reading Day Summer 1 √ √ √
Camp, Schacter & Jo
(2005)
Early Risers, August Summer and K–2 √ √ √√ √√ √√
et al. (2001) school-year
mentoring
Developmental Summer and 5 √√
Mentoring, Karcher, school-year
Davis, & Powell (2002) mentoring
Experience Corps/Book After-school 1 √
Buddies, Meier tutoring
& Invernizzi (2001)
SMART, Baker, Gersten, After-school 1–2 √ √
& Keating (2000) tutoring
Howard Street After-school 2–3 √ √√
Tutoring, Morris, tutoring
Shaw, & Perney (1990)
Fast ForWord, Slattery After-school 3–5 √ √√
(2003) computer
tutoring
CHP, Langberg et al. After school 6–7 √√ √ √√ √√
(2006)
Mixed Academic Effects
Teach Baltimore, Summer K–2 √ √ √ √√ √√
Borman & Dowling (2006)
Chicago Summer Bridge, Summer 3, 6 √√ √ √ √√
Jacob & Lefgren (2004)
Enhanced Academic After school 2–5 √ √√ √√ √√ √√
Instruction, Black et al.
(2008)
No Detectable Academic Effects
SES, McKay et al. (2008); After-school K-12c √ √
Ross et al. (2008); Muñoz, tutoring
Potter, & Ross (2008)
Leap Frog, McKinney After-school 1–2 √√ √ √
(1995) tutoring
NDP, Udell (2003) After–school 2 √ √ √ √
tutoring
L.A.’s BEST, Goldschmidt, After school K–6 √ √ √ √√
Huang, & Chinen (2007)
21st CCLC, U.S. After school K–8 √ √ √ √
Department of Education
(2003); Dynarski et al.
(2004); James-Burdumy
et al. (2005)
YS-CARE, Bissell et al. After school 1–5 √ √ √ √
(2002)
a. Indicates relevance of recommendation to program operation. “√√” indicates high relevance; “√” indicates lower relevance.
b. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
c. Grades included in SES could be any from kindergarten to 12th grade. Typically, grades included in state or district evaluations are those that are
more often tested statewide with tests that are comparable across grade levels, usually 3rd through 8th.
In the remainder of this appendix, we summarize the evidence relevant to each of the panel's recommendations. First, we briefly describe the level of evidence supporting each recommendation. Next, we provide a brief description of the characteristics and results of the studies that support each recommendation, the practices described in those studies, and the degree to which they align with the panel's recommendations.

In general, each description first describes the relevant studies that produced positive academic effects. Next, we review the studies that showed mixed effects (positive results in only some subgroups or, in the case of one study that used a mixed design, positive results from the quasi-experimental portion of the study but not the experimental portion). The studies that showed no positive effects on academic outcomes are described next, followed by supplemental evidence from studies that either were not eligible for WWC review or did not meet WWC standards.

Recommendation 1. Align the OST program academically with the school day

Level of evidence: Low

strengthened if possible.175 However, we fully acknowledge that more research is required to explore whether there is a connection between collaboration and positive academic outcomes in practice.

Summary of evidence

Fifteen OST programs endeavored to collaborate with school-based staff or initiatives,176 but, in general, these efforts were not core components of the programs, and three programs expressed difficulty or reluctance to coordinate more fully.177 Of the 11 programs with studies that met WWC standards with or without reservations,178

175. Schacter and Jo (2005). Center for Applied Linguistics (1994) suggested the use of more collaboration when appropriate.

176. CHP—Langberg et al. (2006); Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008); Title I supplementary education—Borman (1997); TASC—Reisner et al. (2004); Project Adelante—Center for Applied Linguistics (1994); After school tutoring—Leslie (1998).
177. In James-Burdumy et al. (2005), the authors
noted that 21st CCLC programs struggled to effec-
The level of evidence for this recommen- tively coordinate homework help with the school;
dation is low. There is no direct evidence and in U.S. Department of Education (2003), 21st
that practices outlined in this recommen- CCLC programs were found to be supportive but
dation contribute to improved academic not “integrated” (p. 39) with the school. In Project
Adelante (Center for Applied Linguistics 1994),
outcomes. Although it was common for program directors recommended closer coordina-
programs to include some components of tion with school-day staff to collect data and share
the panel’s recommendations, none tested information on student progress but were con-
the effectiveness of this recommendation cerned about the appropriateness of the school-
day curriculum for the students that their program
individually, only in combination with the
served. Similarly, Morris, Shaw, and Perney (1990)
other components of OST programs. In the expressed reluctance to align their after-school tu-
panel’s opinion, collaboration can improve toring to the school curriculum given that they ob-
academic outcomes and in the studies re- served that their students’ classroom instruction
viewed for this guide, two independent was often above their current reading levels.
evaluators recommended that collabora- 178. CHP—Langberg et al. (2006); Early Ris-
ers—August et al. (2001); Enhanced Academic
tion between in-school time and OST be Instruction—Black et al. (2008); Teach Balti-
more—Borman and Dowling (2006); Chicago
Summer Bridge—Jacob and Lefgren (2004); L.A.’s

( 49 )
Appendix D. Technical information on the studies

only 3 documented practices that closely showed positive effects,184 two showed
corresponded to the panel’s recommenda- mixed effects,185 and five showed no de-
tions: CHP, Leap Frog, and Chicago Summer tectable academic effects.186 Although more
Bridge (see Table D3).179 In CHP and Leap of these programs failed to demonstrate
Frog, coordination between schoolteach- effectiveness, Table D2 shows that there
ers and OST instructors was frequent and were seven effective programs that did not
structured.180 Content and skills taught attempt coordination with schools. Given
during OST were intentionally designed to the absence of a clear pattern of effective-
support students during their school-day ness based on the level of coordination with
instruction. CHP showed positive effects schools and the small sample of programs
and Leap Frog did not.181 The purpose of with high levels of coordination, the panel
Chicago Summer Bridge was to help stu- decided that the level of evidence was low.
dents achieve proficiency on state exami-
nations that they had not mastered during
the school year. Because it was a summer Positive evidence for alignment
school program, coordination with individ- between in-school time and OST
ual teachers was limited, but the curricu-
lum was designed by the district with the Two small to medium randomized con-
express purpose of helping students meet trolled trials of after-school programs, CHP
state standards and closely linked to that and Early Risers, attempted to coordinate
goal. It had significant persistent effects on with the schools that their students at-
both math and reading for 3rd graders but tended and showed positive effects,187 but
not for 6th graders.182 only in CHP did its coordination practices
seem to be both similar to the specific action
The remaining eight programs included steps recommended by the panel and an
some components similar to the action important part of the overall program.188
steps for this recommendation, but the
nine studies of the programs indicated that Langberg et al. (2006) used a randomized
the degree of coordination was much less, controlled trial to study small group CHP,
or not enough information was provided which provided after-school academic
to determine the level of alignment be- remediation and study skills training for
tween programs.183 Of these, one program 6th and 7th graders with a combination of
learning and behavior problems. The pro-
BEST—Goldschmidt, Huang, and Chinen (2007); gram showed no statistically significant
YS-CARE—Bissell et al. (2002); 21st CCLC—U.S.
Department of Education (2003); Leap Frog—
McKinney (1995); NDP—Udell (2003); SES—McKay (2003); SES—McKay et al. (2008); SES—Ross et al.
et al. (2008); SES—Ross et al. (2008); SES—Muñoz, (2008); SES—Muñoz, Potter, and Ross (2008).
Potter, and Ross (2008). 184. Early Risers—August et al. (2001).
179. CHP—Langberg et al. (2006); Chicago Sum- 185. Enhanced Academic Instruction—Black et
mer Bridge—Jacob and Lefgren (2004); Leap al. (2008); Teach Baltimore—Borman and Dowl-
Frog—McKinney (1995). ing (2006).
180. CHP—Langberg et al. (2006); Leap Frog— 186. L.A.’s BEST—Goldschmidt, Huang, and
McKinney (1995). Chinen (2007); YS-CARE—Bissell et al. (2002);
181. Ibid. 21st CCLC—U.S. Department of Education (2003);
182. Jacob and Lefgren (2004). NDP—Udell (2003); SES—McKay et al. (2008);
SES—Ross et al (2008); SES—Muñoz, Potter, and
183. Early Risers—August et al. (2001); Enhanced Ross (2008).
Academic Instruction—Black et al. (2008); Teach
Baltimore—Borman and Dowling (2006); L.A.’s 187. Langberg et al. (2006) had a sample size
BEST—Goldschmidt, Huang, and Chinen (2007); of 48 students; August et al. (2001) had 201
YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. students.
Department of Education (2003); NDP—Udell 188. Langberg et al. (2006).

( 50 )
Appendix D. Technical information on the studies

Table D3. Studies of programs cited in recommendation 1 that met WWC standards with or without reservations

Columns: Program, Brief Citation | Type | Grades Studied (a) | Level of Relevance (b) | Strategies (OST coordinator maintains relations with school; school staff communicate with OST program; OST instruction is coordinated with school goals; school helped identify staff for OST programs)

Positive Academic Effects
Early Risers, August et al. (2001) | Summer and school-year mentoring | K–2 | √ | To support classroom teachers
CHP, Langberg et al. (2006) | After school | 6–7 | √√ | Weekly contact; classroom teacher identified student problem areas; district determined academic content

Mixed Academic Effects
Teach Baltimore, Borman & Dowling (2006) | Summer | K–2 | √ | Reported alignment with school-year curricula
Chicago Summer Bridge, Jacob & Lefgren (2004) | Summer | 3, 6 | √√ | District designed curricula, intended to help students pass state tests; principals selected teachers for summer program
Enhanced Academic Instruction, Black et al. (2008) | After school | 2–5 | √ | Part-time OST coordinator at each site; majority of OST instructors teach at school

No Detectable Academic Effects
SES, McKay et al. (2008); Ross et al. (2008); Muñoz, Potter, & Ross (2008) | After-school tutoring | K–12c | √ | Varies
Leap Frog, McKinney (1995) | After-school tutoring | 1–2 | √√ | Weekly written comments sent to teachers; teachers sent student assignments
NDP, Udell (2003) | After-school tutoring | 2 | √ | Reported alignment with school-day learning
L.A.'s BEST, Goldschmidt, Huang, & Chinen (2007) | After school | K–6 | √ | Informal; principal buy-in required before program starts
21st CCLC, U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy et al. (2005) | After school | K–8 | √ | Varies
YS-CARE, Bissell et al. (2002) | After school | 1–5 | √ | Reported alignment with school-day curricula

a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
b. Indicates relevance of recommendation to program operation. "√√" indicates high relevance; "√" indicates lower relevance.
c. Grades included in SES could be any from kindergarten to 12th grade. Typically, grades included in state or district evaluations are those that are more often tested statewide with tests that are comparable across grade levels, usually 3rd through 8th.

The program showed no statistically significant effects on teacher ratings of academic progress because of the small sample size, but the effect size (calculated using Hedges's g189) was positive at 0.45.190 In order to acknowledge meaningful effects regardless of sample size, the panel followed WWC guidelines and considered a positive statistically significant effect or an effect size greater than 0.25 as an indicator of improved outcomes.

CHP was implemented as an alternative to the district-run after-school program in Columbia, South Carolina.191 Sixth- and 7th-grade students were recruited based on poor performance on state exams. The first group was randomly assigned to participate in CHP the first semester, and the second group began participation in the second semester of the same year. The program staff undertook considerable efforts to maintain communication with, and support the services of, the school day to improve students' behavior, organization, and academic achievement. Districts determined the content of the program, which ensured that it was aligned with the district curriculum. Students in the program received extra privileges for correctly recording their homework assignments and keeping their school materials organized, encouraging the development of skills for school success. Students in various groups were required to demonstrate the use of learned skills at school, and interventions targeted problem areas identified by teachers. Each staff member in the OST program was assigned to maintain weekly contact with two teachers regarding upcoming assignments and classroom behavior.192

A study of Early Risers, a two-year tutoring and mentoring program that recruits children with early onset aggressive behavior at the end of kindergarten, showed positive academic effects but utilized different methods of collaboration than those discussed in this recommendation.193 Program staff met regularly with classroom teachers to review student progress and discuss instructional practices, but they served more as a resource to teachers than vice versa. The study demonstrated significantly increased academic competence for the treatment group relative to a comparison group, but behavioral effects were not significant.194

Mixed evidence for alignment between in-school time and OST

Three other larger studies of programs with varying degrees of school coordination had mixed yet potentially encouraging findings.195 The first study with mixed effects was a regression discontinuity study of Chicago Summer Bridge, a mandatory summer school program for students in grades 3 and 6 who failed to meet standards for promotion on state tests.196

189. See Appendix B of the WWC Procedures and Standards Handbook.
190. Langberg et al. (2006).
191. Ibid.
192. Ibid.
193. August et al. (2001).
194. Ibid. Academic competence is a composite measure constructed by the authors using the Woodcock-Johnson Tests of Achievement–Revised (the broad reading and applied problems domains) and several components of the teacher and parent ratings. The teacher ratings included in the composite were the Learning Problems scale from the Behavioral Assessment System for Children–Teacher Rating Scale, the Cognitive Competence scale from the Teacher's Scale of Child's Actual Competence and Social Acceptance, and the Concentration Problems scale from the Teacher Observation of Classroom Adaptation–Revised. The parent rating was the Concentration Problems scale from the Parent Observation of Classroom Adaptation used to assess behavior at home.
195. Borman and Dowling (2006); Jacob and Lefgren (2004); Black et al. (2008).
196. Jacob and Lefgren (2004). The standards for promotion also apply to 8th graders in Chicago, but only 3rd and 6th graders were included in the impact study.
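The screening rule the panel applies above (a positive statistically significant effect, or a Hedges's g above 0.25) can be illustrated with a short sketch. The computation below is the standard Hedges's g, a pooled-standard-deviation mean difference with a small-sample correction; the WWC Procedures and Standards Handbook cited in footnote 189 remains the authoritative definition, and the function names here are illustrative only.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference (Hedges's g) with a small-sample correction."""
    pooled_sd = math.sqrt(
        ((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2)
    )
    correction = 1.0 - 3.0 / (4.0 * (n_t + n_c) - 9.0)  # small-sample adjustment
    return correction * (mean_t - mean_c) / pooled_sd

def indicates_improvement(effect_size, statistically_significant):
    """Panel's screen: a significant positive effect, or an effect size above 0.25."""
    return (statistically_significant and effect_size > 0) or effect_size > 0.25
```

Under this screen, the CHP result discussed above (g = 0.45, not statistically significant) still counts as an indicator of improved outcomes because it exceeds the 0.25 benchmark.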

The study compared the achievement gains of students with June test scores just above and just below the cutoff for mandatory summer school in order to potentially be promoted to the next grade.197 It showed positive effects on student achievement in both reading and math among the districts' lowest-performing 3rd-grade students, but not among 6th graders. In this program, coordination was fairly strong, as the summer program was specifically designed to help students achieve the academic standards they had failed to meet during the school year. Instructors used a centrally designed curriculum distributed by the district with lesson plans included, which emphasized the basic skills needed to pass the test at the end of the summer and be promoted.198 Summer instructors were selected by principals.199

In two additional studies, alignment with schools was less strong or not described in the same detail. The second of the three studies with mixed effects was of Enhanced Academic Instruction in After-School Programs, which separately evaluated math and reading programs designed to improve academic instruction in after-school programs for children in grades 2 through 5.200 Students were randomly assigned to receive the regular after-school offering (the comparison group) or one of the structured enhanced curricula (Harcourt School Publishers' Mathletics at centers in the math study or Success for All Foundation's Adventure Island program at the reading centers). Managing sites were provided with funding for a part-time district coordinator who was familiar with the district and students served and who coordinated several management tasks associated with the implementation of the program. Three-quarters of the program instructors were classroom teachers from the school day in the school in which the program was located. In some cases, the program staff worked with classroom teachers and principals to identify eligible students. The curricula were tied to academic standards, but some participating instructors reported inconsistencies between the strategies and vocabulary used for instruction during the school day and those used during the enhanced OST program.

The enhanced program was characterized by increased instructional time and specific strategies regarding staffing, support for staff, and student attendance efforts that distinguished it from the regular program. The evaluation found a significant impact on math scores with an effect size of 0.06 and no significant impacts on reading scores.201 Program impacts were measured relative to students who participated in a regularly offered after-school program that typically consisted of homework help or unstructured direct instruction. This study is not a general evaluation of the impacts of an after-school program; instead, it is a test of a particular program type—structured curricula in math or reading versus more typical after-school programming. Although the available evidence suggests that coordination with schools was a component of both the reading and math programs, the study does not suggest that these were a major component of either program.

197. Students took the state exam again at the end of the summer program. If they failed to meet standards a second time, they would be retained in the same grade level. Between the spring of 1997 and the spring of 1999, 43 percent of 3rd graders and 31 percent of 6th graders failed in either reading or math the first time in June. Of those who failed the first time, 48 percent of 3rd graders and 39 percent of 6th graders were ultimately retained (Jacob and Lefgren 2004).
198. Jacob and Lefgren (2004); supplementary information from Stone et al. (2005); Roderick, Jacob, and Bryk (2004).
199. Supplementary information from Roderick, Jacob, and Bryk (2004).
200. Black et al. (2008).
201. Based on classroom teacher surveys, the program was not associated with increased incidents of behavior problems or reduced homework completion during the school day.
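The regression discontinuity comparison used in the Chicago Summer Bridge study, contrasting students whose June scores fell just below the promotion cutoff (and were therefore assigned to summer school) with students just above it, can be sketched as a difference in mean outcomes within a narrow score bandwidth. This is a simplified illustration under assumed inputs; the published study's estimation strategy is more elaborate, and the function name and bandwidth here are hypothetical.

```python
def rd_contrast(june_scores, later_outcomes, cutoff, bandwidth):
    """Mean outcome for students just below the cutoff (assigned to summer
    school) minus the mean for students just above it (promoted directly)."""
    below = [y for x, y in zip(june_scores, later_outcomes)
             if cutoff - bandwidth <= x < cutoff]
    above = [y for x, y in zip(june_scores, later_outcomes)
             if cutoff <= x <= cutoff + bandwidth]
    return sum(below) / len(below) - sum(above) / len(above)
```

A positive contrast for students near the cutoff is the pattern the study found for the district's 3rd graders but not its 6th graders.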

In general, program effects cannot be attributed to specific program components, and, in this case in particular, school coordination efforts appear to be an equally minor component of both the math and reading studies.

Finally, a large study evaluated Teach Baltimore, a seven-week summer program designed to counter the "summer slide" effect202 using undergraduate students as instructors for students in high-poverty schools, beginning in kindergarten and 1st grade and continuing over three summers.203 The curricula were focused primarily on reading instruction and designed to align with what students were learning in their school-year classrooms. The evaluators noted that this likely helped convey to teachers and principals the potential benefits of the program, but there was not enough detail in the study to determine the extent of alignment with the school year. Instructors also maintained communication with their students during the school year.

However, the randomized controlled trial found no positive effects on reading in analyses that included all students assigned to the treatment group. The authors suggest that this result could be partially explained by the poor attendance of participating students, and they also include a quasi-experimental analysis of the average effect on students who attended more frequently.204 Frequent attendance was defined as having above average attendance (above 39 percent) for at least two out of the three summers. A total of 202 treatment students (46 percent of the original treatment group) were compared to members of the control group who were weighted by their predicted probability of being a frequent attendee if they had been in the treatment group.205 The analysis found statistically significant effects of the program for frequent attendees on their total reading scores from the Comprehensive Test of Basic Skills, 4th ed., and its two subtests: Vocabulary and Reading Comprehension.206

Inconclusive and cautionary evidence for alignment between in-school time and OST

Evaluations of six other after-school programs showed no significant positive academic effects, despite efforts by programs to coordinate to some degree with the in-school instruction.207 However, the amount of coordination in practice in all but one of these programs appears to be relatively low. The exception is Leap Frog, an after-school tutoring program in which teachers and tutors have formalized mechanisms for frequent communication. The other five programs are the NDP, YS-CARE, L.A.'s BEST, 21st CCLC, and SES.

202. "Summer slide" refers to the effect, broadly recognized in the research, by which the achievement gap for low-income and minority students grows during the summer and stays constant while school is in session. See, for example, Heyns (1978); Alexander and Entwisle (1996); Cooper et al. (1996).
203. Borman and Dowling (2006).
204. Ibid.
205. Frequent attendance was predicted based on a model that included a baseline measure of a student's regular school-year attendance, whether a student moved during the course of the study, and the site that a control student would have attended if he or she had been in the treatment group. It is not known whether the frequent attendees were similar to the weighted sample of control group students in terms of their initial reading levels or other characteristics.
206. Effect sizes were 0.30 for total reading, 0.32 for vocabulary, and 0.28 for comprehension.
207. Udell (2003); McKinney (1995); Bissell et al. (2002); Goldschmidt, Huang, and Chinen (2007); U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy et al. (2005); Ross et al. (2008); Muñoz, Potter, and Ross (2008).
no academic effects.208 In the program, CARE program objectives included rein-


university student-volunteers tutored for forcing the school curriculum, the authors
one-hour sessions twice a week using work did not describe the specific activities un-
sent by the students’ classroom teacher. dertaken to promote collaboration, nor did
Teachers and tutors communicated using they report on the actual level of coopera-
a folder, which the tutee carried from tion implemented except for one survey
school to the OST program. Teachers sent question.214 Sixty-four percent of surveyed
the tutors assignments or content areas staff reported that getting cooperation
to focus on, and tutors were required to from schools was “not too difficult.”215
send weekly comments to the classroom
teacher about the tutoring sessions.209 In a quasi-experimental study of L.A.’s
BEST, the authors compared treated stu-
A small quasi-experimental study of the dents (those attending a minimum of four
NDP program, which provided one-on-two days per month) to two matched com-
or one-on-one reading tutoring by under- parison groups (those who opted not to
graduate tutors with 2nd-grade students participate in the same school and those
after school, found no effects on reading in similar schools that did not offer a pro-
compared to a comparison group that re- gram) and found no consistent positive
ceived small group activities not related to effects.216 The program was intended to
literacy.210 Although the author reported raise academic achievement and reduce
that tutors used techniques that supple- crime among students in high-crime, high-
mented the school-day instruction, no ad- poverty neighborhoods in Los Angeles
ditional details were offered and coopera- County. Another study used qualitative
tion with the school was not described as methods to gauge the implementation of
a key component of the program.211 L.A.’s BEST at six representative sites, iden-
tified by the main program office.217 The
A large evaluation of the YS-CARE after- researchers conducted observations, inter-
school program, which targeted Califor- views, and focus groups of key stakehold-
nia students with parents transitioning off ers. Most principals reported cooperative
welfare, found positive but insignificant working relationships with OST staff.218
findings in the first year.212 The analysis Although participating staff made some
used matched pairs to measure the effec- efforts to collaborate with the school day,
tiveness of the program.213 Although YS- including attending teacher meetings to
promote the program, requiring principal
208. McKinney (1995) had 47 students with pre- support before a program was initiated at
and posttests in math and 44 in reading. a school, and communicating regularly
209. Ibid. with teachers, these efforts were informal
210. Udell (2003) had 27 students in the analysis
sample. The following subtests of the Woodcock
the treatment and comparison groups were ini-
Johnson–Revised were used to measure reading
tially equivalent in this way.
achievement: Word Attack, Letter-Word Identifi-
cation, and Passage Comprehension. 214. Bissell et al. (2002).
211. Ibid. 215. Ibid.
212. Bissell et al. (2002). The program reached 216. Despite the matching attempts, the second
567 students in grades 1 to 5 at 28 schools. group (students at non–L.A.’s BEST schools) was
not comparable in baseline achievement levels
213. There were no baseline differences in the
in math, so the focus is on the analysis with the
test scores of the treatment and comparison
comparison group from nonparticipating stu-
groups at the outset of the study, but there were
dents at the same school (Goldschmidt, Huang,
some differences in student ethnicity. Also, some
and Chinen 2007).
students in the comparison group reported that
they did not attend because of responsibilities for 217. Huang et al. (2007).
caring for siblings, and it was not shown whether 218. Ibid.

( 55 )
Appendix D. Technical information on the studies

Although participating staff made some efforts to collaborate with the school day, including attending teacher meetings to promote the program, requiring principal support before a program was initiated at a school, and communicating regularly with teachers, these efforts were informal and not necessarily implemented consistently across sites.219

A national evaluation of the 21st CCLC after-school program showed no consistent positive academic effects among elementary or middle school students. Elementary students were randomly assigned to either treatment or comparison groups in 26 centers across 12 districts. Participating elementary students reported feeling safer than the comparison group students reported feeling, but they also experienced increased incidents of negative behavior during the school day.220 A matched-pair, quasi-experimental analysis of the program's impacts on middle school students in 61 centers in 32 districts found similar results for behavior, although participating students did not report feeling safer than control students reported feeling.221

The 21st CCLC has awarded grants to support after-school programs with an academic component (including homework help) since 1998. The specifics of the program vary considerably by site. Although, on average, the programs were not found to be effective at improving academic achievement, the study reported that some OST coordinators would acquire lists of failing students from classroom teachers to identify which OST participants needed extra attention. Centers reported communicating regularly with host schools in setting the curricula, goals, and objectives and in providing feedback on students. Many principals of host schools also were actively involved in the planning of the program.222

Two quasi-experimental studies of SES in various cities in Tennessee and in Louisville, Kentucky, found overall null effects on achievement growth using matched pairs to measure the effects of individual providers.223 In the study of SES provision in Tennessee, 36 percent of principals reported that SES providers communicated frequently with them, but in a similar study in Louisville, Kentucky, none of the surveyed principals reported frequent communication. Similarly, 34 percent of surveyed teachers in Tennessee reported frequent contact with SES providers, compared to 7 percent in Louisville.224 Both studies involved relatively small samples of principals (50 and 19, respectively) and teachers (128 and 56, respectively).

219. Ibid.
220. James-Burdumy et al. (2005).
221. Dynarski et al. (2004).
222. U.S. Department of Education (2003).
223. Ross et al. (2008); Muñoz, Potter, and Ross (2008).
224. Furthermore, in Louisville, 72 percent of teachers reported having no contact at all with SES providers, compared to 21 percent in Tennessee. Nevertheless, when teachers in Louisville were asked whether providers "adapted the tutoring services to this school's curriculum," 7 percent strongly agreed and 34 percent agreed. In Tennessee, 30 percent of teachers strongly agreed and 40 percent agreed. Very similar responses were given when asked if providers "integrated the tutoring services with classroom learning activities" (43 percent agreed or strongly agreed in Louisville and 68 percent did so in Tennessee) or "aligned their services with state and local standards" (41 percent agreed or strongly agreed in Louisville and 71 percent did so in Tennessee). A similar survey was done in Virginia but included parents, SES providers, and SES coordinators in participating school divisions (McKay et al. 2008): 34 percent of SES coordinators (N = 41) agreed or strongly agreed that providers "adapted the tutoring services to this school's curriculum"; 24 percent agreed or strongly agreed that providers "integrated the tutoring services with classroom learning activities"; and 66 percent agreed or strongly agreed that providers "aligned their services with state and local standards." Of the 111 representatives from 16 SES providers, 28 percent reported frequent communication with teachers.

Supplemental evidence from other sources

to meet WWC standards.225 A study of the level of coordination between classroom instruction and Title I supplementary instruction found that curricular congruence (e.g., both the classroom and the supplemental instructor used the same curricular materials or content or taught at the same instructional level) was correlated with higher reading achievement in 1st-grade classrooms.226

The evaluation of The After-School Corporation (TASC) showed consistently positive effects for math, but not for reading.227 Eighty-six percent of principals reported that the “after-school program solicits input from the principal and teachers on skills in which students need help and incorporates these topics into after-school activities.”228 Fifty-one percent of principals reported that OST instructors “coordinate homework assistance with classroom teachers,” and the same percentage reported that the program coordinator “serves on a school planning team.”229 Almost as frequently, however, principals cited the following areas as in need of attention: “coordination/integration with the school curriculum” (41 percent) and “coordination with the school” (31 percent).230

In Project Adelante, a summer program and Saturday academy for Hispanic students, the project coordinator sent weekly progress reports to participating students’ school counselors, coordinated a parent night and invited classroom teachers, and received data on English as a Second Language (ESL) students and academic achievement scores from the district.231 In a study of an after-school tutoring program for rural middle school students, tutors met with classroom teachers prior to each tutoring session to discuss the students’ homework and problem areas in class.232 Finally, in a newsletter documenting OST programs’ attempts to create connections with the school day, authors cite strategies including sharing common planning periods with school-day teachers, attending workshops with teachers, asking teachers to help develop programs and deliver instruction, and sharing homework completion logs with classroom teachers.233

Recommendation 2. Maximize student participation and attendance

Level of evidence: Low

The panel judged the level of evidence supporting this recommendation to be low because there is no conclusive evidence that following the action steps in this recommendation will lead to higher attendance or increased academic achievement. Given the voluntary nature of most OST programs and other barriers discussed in this recommendation, regular attendance appears a difficult goal for many programs to reach. Some programs have devoted considerable resources and seem to have made efforts to implement the action steps described in this recommendation and still have trouble getting students to attend regularly (see Table D4).

225. Borman (1997); Reisner et al. (2004); Center for Applied Linguistics (1994); Leslie (1998); Bouffard, Little, and Weiss (2006).

226. Borman (1997). This study was ineligible for review because the supplementary instruction took place during the school day. Analysis was limited to students in classrooms with two or more students receiving Title I supplementary instruction and at least two students not receiving it.

227. Reisner et al. (2004) could not be rated by WWC reviewers because it was not clear if the treatment and control groups were equivalent in achievement levels before the program began.

228. Reisner et al. (2004).

229. Ibid.

230. Ibid.

231. Center for Applied Linguistics (1994). This study was not eligible for review because it did not contain a comparison group.

232. Leslie (1998). This study was not eligible for review because it did not contain a comparison group.

233. Bouffard, Little, and Weiss (2006). This newsletter was not eligible for review because it was not a study of the effectiveness of these strategies.


Table D4. Studies of programs cited in recommendation 2 that met WWC standards with or without reservations

Each entry lists: program and brief citation; program type; grade (note a); average attendance rate (note b); recruiting/targeting of students; whether snacks or meals were provided; whether the program was located in the students’ school or provided transportation; enrichment and other activities (note c); whether attendance was monitored; and any incentive system.

Positive Academic Effects

- KindergARTen, Borman, Goetz, & Dowling (2008). Summer; grade K; attendance: 72% among students who attended at least 1 day (55% overall); recruiting: teachers, principals; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural.
- Early Risers, August et al. (2001). Summer and school-year mentoring; grades K–2; attendance: 50% averaged across different program components (note d); recruiting: teachers; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural, interpersonal; attendance monitored: yes.
- Summer Reading Day Camp, Schacter & Jo (2005). Summer; grade 1; recruiting: target schools; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural.
- SMART, Baker, Gersten, & Keating (2000). After-school tutoring; grades 1–2; recruiting: teachers; in school/transportation: yes.
- Howard Street Tutoring, Morris, Shaw, & Perney (1990). After-school tutoring; grades 2–3; recruiting: teachers; snacks/meals: yes; in school/transportation: yes; activities: recreation.
- Fast ForWord, Slattery (2003). After-school computer tutoring; grades 3–5; recruiting: students reading below grade level; snacks/meals: yes; in school/transportation: yes.
- CHP, Langberg et al. (2006). After school; grades 6–7; recruiting: teachers, parents, student testing; snacks/meals: yes; in school/transportation: yes; activities: recreation; attendance monitored: yes.

Mixed Academic Effects

- Teach Baltimore, Borman & Dowling (2006). Summer; grades K–2; attendance: 39%, includes yearly no-shows; recruiting: target schools; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural; attendance monitored: yes.
- Enhanced Academic Instruction, Black et al. (2008). After school; grades 2–5; attendance: 77% math, 73% reading; recruiting: school staff; snacks/meals: yes; in school/transportation: yes; activities: recreation; attendance monitored: yes; incentive system: yes.

No Detectable Academic Effects

- L.A.’s BEST, Goldschmidt, Huang, & Chinen (2007). After school; grades K–6; attendance: 66% of participants were active (note e); recruiting: target schools, teachers; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural; attendance monitored: yes.
- 21st CCLC for elementary school students, Dynarski et al. (2004); U.S. Department of Education (2003). After school; grades K–8; attendance: 44% among students who attended at least 1 day (note f); recruiting: teachers; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural, interpersonal; attendance monitored: yes; incentive system: yes.
- Leap Frog, McKinney (1995). After-school tutoring; grades 1–2; recruiting: teachers, principals; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural.
- YS-CARE, Bissell et al. (2002). After school; grades 1–5; snacks/meals: yes; in school/transportation: yes; activities: recreation, cultural, interpersonal.
- NDP, Udell (2003). After-school tutoring; grade 2; recruiting: school staff; snacks/meals: yes; in school/transportation: yes; attendance monitored: yes; incentive system: yes.

a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
b. Entries without an attendance rate did not report any information on attendance rates.
c. Examples of recreational activities include soccer, karate, and dance. Cultural activities include music, arts, and field trips. Interpersonal activities include conflict management and relationship building.
d. August et al. (2001) calculated summer school attendance and family program attendance as the percentage of days attended out of total days offered over the course of two years. They took a simple average of these two attendance rates with a third “rate,” which they used to measure the “percentage” of FLEX contact time each family received. FLEX contact time was meant to provide preventative case management as needed. Actual contact time ranged from 20 minutes to more than 5 hours total, excluding the initial 2-hour home visit to all participants. A full dosage of FLEX time was capped at 2 hours (based on the assumption of two more 1-hour contacts); 22 percent of families received the full dosage of FLEX time.
e. Active participation is defined as attending the program more than 36 days a year. This participation rate is from our calculations using the information in Table 3 in Goldschmidt, Huang, and Chinen (2007) from 1994 to 1997, the years included in their impact study. Their sample was restricted to schools with more than 20 students with program attendance information.
f. Estimated based on information in Table II.1 in Dynarski et al. (2004). Sample size is 980 students who attended an elementary school center for at least one day.
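Note d describes how August et al. (2001) combined three component “rates” into a single attendance figure: a simple average of the summer-school rate, the family-program rate, and FLEX contact time expressed as a share of the 2-hour full dosage. The sketch below illustrates only that arithmetic; the input values are invented, not taken from the study.

```python
# Sketch of the attendance "rate" arithmetic described in note d: a simple
# average of the summer-school rate, the family-program rate, and FLEX
# contact time expressed as a share of the 2-hour full dosage (capped at 1).
# The example inputs are invented.

def flex_share(contact_hours, full_dosage_hours=2.0):
    """FLEX contact time as a share of the full-dosage cap (at most 1.0)."""
    return min(contact_hours / full_dosage_hours, 1.0)

def combined_rate(summer_rate, family_rate, flex_contact_hours):
    """Simple average of the three component 'rates'."""
    return (summer_rate + family_rate + flex_share(flex_contact_hours)) / 3

# e.g., 80% summer attendance, 50% family-program attendance, 1 hour of FLEX
print(round(combined_rate(0.80, 0.50, 1.0), 3))  # (0.8 + 0.5 + 0.5) / 3 = 0.6
```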


However, given that actual attendance is a precursor to an OST program’s promoting student learning, the panel believes it is particularly important for programs to enhance their efforts to get students in the door.

Summary of evidence

The importance of emphasizing participation has been pointed out by many experts in OST.234 Although it seems logical that students need to attend to receive the benefits of a program, no rigorous evidence demonstrates that the steps recommended here will lead to increased participation, and limited evidence indicates that academic achievement is increased through more exposure to OST programs. A meta-analysis of 53 OST studies by Lauer et al. (2004) found larger effect sizes in both math and reading for interventions that consisted of at least 45 hours of programming.235 The panel believes that if a program is aligned academically with the school day (recommendation 1), adapts instruction to individuals and groups (recommendation 3), and provides engaging learning experiences (recommendation 4), greater exposure to that program will yield higher academic achievement.

Since a student’s actual program attendance, as a percentage of hours of programming offered, is likely to be correlated with unobservable characteristics such as motivation or family circumstances, it is not advisable to draw causal conclusions from most studies on the relationship between attendance and outcomes. Four evaluations of programs met WWC standards with or without reservations for their academic impact studies and also looked at the possible relation between amount of program attendance and academic achievement.236 Only one found a positive correlation between higher attendance and greater program effects.237

The other evidence that influences this recommendation is less direct. Fourteen programs had studies that discussed practices similar to the action steps recommended by the panel (such as using teachers to recruit students, locating within schools, offering snacks, and including enrichment activities) and also met WWC standards with or without reservations.238 Of the 14, only 6 reported some information on attendance rates.239 Even when programs reported attendance, it was not possible to isolate which, if any, components of the program affected attendance, forcing the panel to use its judgment regarding which practices contributed to increased attendance and improved academic outcomes. In terms of overall academic effects, 7 of the 14 programs showed positive

234. Cooper et al. (2000); Granger and Kane (2004); Lauver, Little, and Weiss (2004).

235. The meta-analysis also found that, on average, programs with high durations (more than 100 hours for math and 210 hours for reading) did not have effects significantly different from zero.

236. Teach Baltimore—Borman and Dowling (2006); Early Risers—August et al. (2001); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); 21st CCLC—U.S. Department of Education (2003).

237. Teach Baltimore—Borman and Dowling (2006).

238. CHP—Langberg et al. (2006); Fast ForWord—Slattery (2003); Howard Street Tutoring—Morris, Shaw, and Perney (1990); SMART—Baker, Gersten, and Keating (2000); Summer Reading Day Camp—Schacter and Jo (2005); KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003).

239. Early Risers—August et al. (2001); KindergARTen—Borman, Goetz, and Dowling (2008); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); 21st CCLC—U.S. Department of Education (2003).


effects,240 2 showed mixed effects,241 and 5 others showed no effects.242 Despite the lack of consistent evidence linking the panel’s suggestions to increased academic achievement, the panel believes these recommendations, faithfully implemented and taking into consideration the unique constraints and student populations of each program, can increase student attendance and, therefore, contribute to achievement gains.

Only one study provided direct evidence on increasing attendance. Black et al. (2008) randomly assigned students to either a less-structured business-as-usual after-school program or an enhanced math or reading program that included monitoring and incentive systems to increase attendance. Students in the enhanced program attended significantly more days than did control students.243

Four studies on the effects of attendance on achievement

As discussed in recommendation 1, a randomized controlled trial of Teach Baltimore, a summer school program, found no positive effects on reading for all participants, but the quasi-experimental evaluation found positive effects on students who attended frequently.244 Although it is encouraging that there were effects among the frequent attendees, the fact that the average attendance rate across all three summers was only 39 percent and that only 46 percent of the original treatment group could be classified as frequent attendees is an important cautionary note, as the Teach Baltimore program does include some elements that the panel recommends to encourage attendance. For example, staff in the program served lunch to all program participants during the seven-week summer program. After lunch, there was time for sports and enrichment activities such as music and drama and weekly field trips to museums and cultural events.

It also is important to note that three studies did not find higher impacts on academic achievement for higher attendees. The Early Risers program did find positive academic impacts overall, but students who were classified as receiving less than half of the “program dosage” improved at the same rate as those who received half or more.245 The other two programs, 21st CCLC and L.A.’s BEST, failed to find academic impacts either overall or for higher attendees. The evaluations of both L.A.’s BEST and Early Risers used regression approaches to control for observable differences between high attendees and low attendees. However, unobserved

240. CHP—Langberg et al. (2006); Fast ForWord—Slattery (2003); Howard Street Tutoring—Morris, Shaw, and Perney (1990); SMART—Baker, Gersten, and Keating (2000); Summer Reading Day Camp—Schacter and Jo (2005); KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001).

241. Teach Baltimore—Borman and Dowling (2006); Enhanced Academic Instruction—Black et al. (2008).

242. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); Leap Frog—McKinney (1995); NDP—Udell (2003).

243. Students in their math treatment group attended 12 more days over the course of the year than students in the regular program, who attended 61 days on average. In their reading study, treatment students attended 7 more days than students in the regular program, who attended 64 days. Both of these differences are statistically significant at the 5 percent level and represent a 20 percent increase and an 11 percent increase in days attended, respectively.

244. Borman and Dowling (2006). Frequent attendance was defined as having above-average attendance (above 39 percent) for at least two out of the three summers. The average attendance rate for the program includes students who did not participate for an entire summer.

245. August et al. (2001).


characteristics may be driving the null results, and their analysis should not be interpreted as causal.

In the second- and third-year reports from the national study of 21st CCLC, fixed effects analysis was used to evaluate whether more frequent attendance had a positive impact on a wide variety of outcomes.246 This analysis uses two years of data on the same students; therefore, time-invariant unobservable characteristics cannot be driving the results.247 It tests whether the changes in an individual student’s attendance are related to changes in outcomes; therefore, it depends on there being sufficient variation in both attendance and outcomes.248 Only small, insignificant marginal effects were found on year-end grades for both the elementary249 and middle school samples.250,251

Goldschmidt, Huang, and Chinen (2007) found no positive effects for active participation in an L.A.’s BEST after-school program on achievement scores or growth in either reading or math, measured at the beginning of middle school after students had spent up to five years participating in L.A.’s BEST in elementary school.252,253 In hierarchical linear models similar to those used to estimate the overall treatment effects, the authors also controlled for exposure to L.A.’s BEST (years of program attendance) and intensity of exposure (natural log of daily attendance rate in years attended). The estimated coefficients on these variables in all four models (levels and growth in both math and reading) were always small and never significantly different from zero.

It appears that the authors also examined the correlations of academic outcomes with either program exposure or intensity using only program participants. The magnitude of effects is not reported, and, overall, the results appear mixed. Both exposure and intensity are reported to be insignificant for reading achievement level at the beginning of middle school but marginally significant for reading growth.254 Exposure is not found to have a significant effect on either math levels or growth, and intensity is reported to be significantly related to math levels but not to math growth.

Indirect evidence on recruitment and ways to encourage attendance

Other studies provided supplementary examples of how to promote attendance

246. Dynarski et al. (2004); James-Burdumy et al. (2005).

247. However, time-varying characteristics still could be responsible. The first-year report of 21st CCLC (U.S. Department of Education 2003) considered using statistical techniques with their middle school sample to account for the possibility that unobservable factors could influence both attendance and academic outcomes, but the authors could not find a suitable instrumental variable that was related to attendance but not independently related to academic outcomes. Mother’s employment status was not correlated with program attendance.

248. James-Burdumy et al. (2005), p. 80.

249. Ibid., p. 83.

250. Dynarski et al. (2004), p. 131.

251. Nonlinear models were used to calculate the marginal effect of attending the program for 10 more days on both an average student with a low level of attendance (10 days a year) and a student who attended for 30 days. Neither case showed significant improvement in year-end grades in any subject, whether in elementary or middle school.

252. Active participation was defined as attending at least 36 days a year. From 1994 to 1997, 66 percent of students who ever attended an L.A.’s BEST program during the course of a school year attended at least 36 days that year. The average number of days attended of all students from 1994 to 1997 was 72.

253. They ran the same models using a second comparison group consisting of matched students at non–L.A.’s BEST schools. However, despite the matching attempts, the second group was not comparable in baseline achievement levels in math, so we focus on the analysis with the first group.

254. p < .10.


at OST programs. Examples that support the panel’s recommendations appear in 14 programs that had studies that met WWC standards with or without reservations.255 Seven of those programs had positive effects on academics,256 two had mixed effects,257 and five had no effect on academic achievement.258 In general, this information should be interpreted cautiously: there is no direct evidence that the practices used by these programs were directly linked to improvements in either attendance or academic achievement, and many of the practices were observed both in programs that produced positive academic effects and in those that did not.

Table D4 compares several of the panel’s suggestions for promoting attendance in OST programs to the components discussed by evaluators in studies of the 14 programs that provided evidence for this recommendation. Of the 9 studies that found positive or mixed academic treatment effects, 6 used teachers, principals, or school staff to recruit students.259 Two programs targeted schools with particular characteristics,260 and one randomly chose among all students reading below grade level.261 All but one262 provided snacks or meals, and all either took place in the students’ school or provided transportation to students.263 All but two264 included some recreational component for participating students. Three also provided cultural components,265 and one provided interpersonal development activities for students.266 Four of the programs monitored attendance,267 and one had an incentive system to promote attendance.268

In studies of the five programs that demonstrated no treatment effects on academic achievement, researchers highlighted many of the same components.269 All but one program270 relied on teachers, principals, and school staff to identify students for participation. All provided a snack and were either located in the students’ schools or provided transportation to the program site.271 Four of the programs provided recreational and cultural activities,272 and two provided interpersonal activities.273

255. Langberg et al. (2006); Slattery (2003); Morris, Shaw, and Perney (1990); Baker, Gersten, and Keating (2000); Schacter and Jo (2005); Borman, Goetz, and Dowling (2008); August et al. (2001); Black et al. (2008); Borman and Dowling (2006); Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002); U.S. Department of Education (2003); McKinney (1995); Udell (2003).

256. Langberg et al. (2006); Slattery (2003); Morris, Shaw, and Perney (1990); Baker, Gersten, and Keating (2000); Schacter and Jo (2005); Borman, Goetz, and Dowling (2008); August et al. (2001).

257. Black et al. (2008); Borman and Dowling (2006).

258. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002); U.S. Department of Education (2003); McKinney (1995); Udell (2003).

259. Langberg et al. (2006); Morris, Shaw, and Perney (1990); Baker, Gersten, and Keating (2000); Borman, Goetz, and Dowling (2008); August et al. (2001); Black et al. (2008).

260. Schacter and Jo (2005); Borman and Dowling (2006).

261. Slattery (2003).

262. Baker, Gersten, and Keating (2000).

263. Langberg et al. (2006); Morris, Shaw, and Perney (1990); Baker, Gersten, and Keating (2000); Schacter and Jo (2005); August et al. (2001); Black et al. (2008); Borman and Dowling (2006); Borman, Goetz, and Dowling (2008); Slattery (2003).

264. Slattery (2003); Baker, Gersten, and Keating (2000).

265. Schacter and Jo (2005); August et al. (2001); Borman and Dowling (2006).

266. August et al. (2001).

267. Langberg et al. (2006); August et al. (2001); Black et al. (2008); Borman and Dowling (2006).

268. Black et al. (2008).

269. Treatment effects refer to the overall academic program impact, not to whether the impact varied with level of attendance.

270. Bissell et al. (2002).

271. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002); U.S. Department of Education (2003); McKinney (1995); Udell (2003).

272. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002); U.S. Department of Education (2003); McKinney (1995).

273. Bissell et al. (2002); U.S. Department of Education (2003).
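The cautions above about attributing effects to attendance rest on two ideas: unobserved characteristics such as motivation can drive both attendance and outcomes, and the student fixed-effects approach used in the 21st CCLC reports removes any such time-invariant confound by looking at within-student changes. A toy simulation can make this concrete; all numbers below are invented for illustration and do not come from any cited study.

```python
# Toy illustration (invented data) of the selection problem discussed above:
# "motivation" raises both attendance and grades, so a naive cross-sectional
# slope of grades on attendance overstates the true per-day effect, while a
# within-student (fixed effects / first-difference) estimate based on
# year-to-year changes removes the time-invariant confound.
import numpy as np

rng = np.random.default_rng(0)
n = 2000
motivation = rng.normal(size=n)                      # unobserved, fixed over time
base = 30 + 10 * motivation                          # motivated students attend more
att = np.clip(base[:, None] + rng.normal(scale=5, size=(n, 2)), 0, 60)
true_beta = 0.02                                     # true effect of one day attended
grades = true_beta * att + motivation[:, None] + rng.normal(scale=0.3, size=(n, 2))

# Naive cross-sectional estimate (year 1 only): confounded by motivation.
naive = np.polyfit(att[:, 0], grades[:, 0], 1)[0]

# Fixed-effects (first-difference) estimate: motivation cancels out.
d_att = att[:, 1] - att[:, 0]
d_gr = grades[:, 1] - grades[:, 0]
fe = np.sum(d_att * d_gr) / np.sum(d_att ** 2)

print(round(naive, 3), round(fe, 3))  # naive is inflated; fe is close to 0.02
```

As the text notes, this protection applies only to time-invariant characteristics; time-varying confounds would still bias even the fixed-effects estimate.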


Three studies indicated that program staff monitored attendance,274 and two documented providing incentives to increase attendance.275

Overall, only 6 of the 14 programs provide some information on attendance rates, and the information they provide is not all directly comparable.276 Together, the studies suggest that, given the evidence base, it is not possible to attribute causal effects on attendance to the strategies that the panel recommended. Furthermore, there is little evidence in the studies to suggest that the programs that show positive academic effects are doing something fundamentally different to increase program attendance. Nevertheless, the studies that report attendance rates highlight how difficult it is for programs to achieve full participation, implying the need for a recommendation from the panel on encouraging participation.

Supplemental evidence from other sources

Four additional studies that did not meet WWC standards looked at interventions that incorporated some parts of the action steps from this recommendation. The Building Education Leaders for Life (BELL) program, which was a summer program for students in grades 1–7, was purposefully located on school campuses.277 The BELL organizers reported that obtaining and maintaining the school space was one of the most challenging aspects of running the program, yet each summer they were located in schools. The after-school tutoring projects run by TASC were similarly located in the schools attended by participants.278 The evaluation of the TASC program found that projects also targeted the students most likely to benefit from the program, encouraged high student attendance, and achieved a median attendance rate of 85 percent.

Project Adelante provided supervised transportation during the summer to the college campus where tutoring, counseling, and mentoring services were provided to Latino middle and high school students.279 Students in the target population were recruited from the regular school day by school staff. Program staff also were actively involved in promoting student attendance, first by requiring parents to sign a contract stating that they would support attendance, and then by following up with students who missed sessions.

The Broward County Boys and Girls Club, although not a school-based program, also promoted program attendance by providing incentives to students in the form of points that could be redeemed for field trips or events or used toward the purchase of books and supplies.280

274. Goldschmidt, Huang, and Chinen (2007); U.S. Department of Education (2003); Udell (2003).

275. U.S. Department of Education (2003); Udell (2003).

276. Borman, Goetz, and Dowling (2008); August et al. (2001); Black et al. (2008); Borman and Dowling (2006); Goldschmidt, Huang, and Chinen (2007); U.S. Department of Education (2003).

277. Capizzano et al. (2007). The randomized controlled trial evaluation of BELL by Chaplin and Capizzano (2006) did not meet WWC standards because it suffered from attrition greater than 20 percent, had not measured initial achievement levels, and could not demonstrate that the treatment and control groups were equivalent initially. Furthermore, the posttest achievement tests for students in the control group were administered at a different date than for the treatment group and are, therefore, not directly comparable.

278. Reisner et al. (2004). This study could not be rated by WWC reviewers because it was not clear if the treatment and control groups were equivalent in achievement levels before the program began.

279. Center for Applied Linguistics (1994). This study did not meet standards because it was a qualitative evaluation of the program.

280. Brown (2002). This study was ineligible for review because it did not use a comparison group in the analysis.


In addition to these studies of interventions, a national survey by Public Agenda of a random sample of students and parents of school-age children explored what students do when they are not in school, and what students and parents want from out-of-school time.281 The survey found that student participation is affected by access to and convenience of programs, as well as by the attractiveness of the services provided.

Recommendation 3. Adapt instruction to individual and small group needs

Level of evidence: Moderate

The panel judged the level of evidence supporting this recommendation to be moderate. Learning environments that are adaptive to individual and small group needs are widely believed to be effective in fostering achievement.282 Within the context of OST, however, the literature is not definitive, and, in general, positive effects cannot be directly attributed to the use of the strategies outlined in this recommendation. Looking more closely at the actual implementation of practices related to this recommendation, however, there does appear to be a pattern of positive academic effects associated with programs that more closely correspond to this recommendation. Therefore, the panel believes OST programs can be more successful if they attempt to understand the academic needs of the students they serve and adapt their programs to those needs.

Summary of evidence

Of the 15 programs related to this recommendation with evaluations that met WWC standards with or without reservations,283 5 were judged to be in close correspondence with more than one aspect of this recommendation (see Table D5).284 Of these, 4 had evaluations that found positive effects on academic achievement,285 and one had mixed but potentially encouraging effects.286 Of the other 10 programs,287 6 still showed positive or mixed effects on academics.288

281. Duffett et al. (2004).

282. See Slavin (2006) in general and Lauer et al. (2004) for OST programs in particular.

283. KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Enhanced Academic Instruction—Black et al. (2008); Chicago Summer Bridge—Jacob and Lefgren (2004); Leap Frog—McKinney (1995); NDP—Udell (2003); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy et al. (2005); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).

284. Early Risers—August et al. (2001); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006); Enhanced Academic Instruction—Black et al. (2008).

285. Howard Street Tutoring—Morris, Shaw, and Perney (1990); Early Risers—August et al. (2001); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006).

286. Enhanced Academic Instruction—Black et al. (2008).

287. KindergARTen—Borman, Goetz, and Dowling (2008); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy et al. (2005); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008); Leap Frog—McKinney (1995); NDP—Udell (2003).

288. KindergARTen—Borman, Goetz, and Dowling (2008); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004).


Table D5. Studies of programs cited in recommendation 3 that met WWC standards with or without reservations

Each entry lists: program and brief citation; program type; grades studied (note a); level of relevance (note b); whether assessments were used to inform teaching; one-on-one or small-group format; professional development; and type of staff used.

Positive Academic Effects

- KindergARTen, Borman, Goetz, & Dowling (2008). Summer; grade K; relevance: √; assessments: yes; professional development: initial and ongoing; staff: certified teachers and college student interns.
- Summer Reading Day Camp, Schacter & Jo (2005). Summer; grade 1; relevance: √; small group for some activities; staff: certified teachers.
- Early Risers, August et al. (2001). Summer and school-year mentoring; grades K–2; relevance: √√; assessments: yes; small group; professional development: initial and ongoing.
- Experience Corps/Book Buddies, Meier & Invernizzi (2001). After-school tutoring; grade 1; relevance: √; assessments: yes; one-on-one; staff: volunteers.
- SMART, Baker, Gersten, & Keating (2000). After-school tutoring; grades 1–2; relevance: √; one-on-one; staff: volunteers.
- Howard Street Tutoring, Morris, Shaw, & Perney (1990). After-school tutoring; grades 2–3; relevance: √√; assessments: yes; one-on-one; professional development: ongoing; staff: volunteers.
- Fast ForWord, Slattery (2003). After-school computer tutoring; grades 3–5; relevance: √√; assessments: yes; one-on-one computer work; staff: certified teachers.
- CHP, Langberg et al. (2006). After school; grades 6–7; relevance: √√; small group for some activities; professional development: initial; staff: certified teachers and CHP specialists.

Mixed Academic Effects

- Teach Baltimore, Borman & Dowling (2006). Summer; grades K–2; relevance: √; assessments: yes; small group; professional development: initial; staff: college students.
- Chicago Summer Bridge, Jacob & Lefgren (2004). Summer; grades 3 and 6; relevance: √; staff: certified teachers.
- Enhanced Academic Instruction, Black et al. (2008). After school; grades 2–5; relevance: √√; assessments: yes; small group; professional development: initial and ongoing; staff: certified teachers.

No Detectable Academic Effects

- SES, McKay et al. (2008); Ross et al. (2008); Muñoz, Potter, & Ross (2008). After-school tutoring; grades K–12 (note c); relevance: √; one-on-one; staff: varies.
- Leap Frog, McKinney (1995). After-school tutoring; grades 1–2; relevance: √; one-on-one; professional development: initial; staff: college students and volunteers.
- NDP, Udell (2003). After-school tutoring; grade 2; relevance: √; assessments: yes; one-on-one; professional development: initial and ongoing; staff: college students.
- 21st CCLC, U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy et al. (2005). After school; grades K–8; relevance: √; varies from small to large groups; staff: varies.

a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
b. Indicates relevance of recommendation to program operation. “√√” indicates high relevance; “√” indicates lower relevance.
c. Grades included in SES could be any from kindergarten to 12th grade. Typically, grades included in state or district evaluations are those that are more often tested statewide with tests that are comparable across grade levels, usually 3rd through 8th.


Of the four programs with lower levels of relevance and without detectable effects on academic achievement, two deserve special mention because they are the two major sources of federal funding for academically focused OST programs: 21st CCLC and SES, which are mandated as part of NCLB.289 The national study of 21st CCLC programs found no positive academic effects for either elementary or middle school students.290 However, implementation varied widely, and although some programs reported tutoring in small groups with fewer than 10 students, most programs did not seem to be providing direct or adaptive instruction that was geared to the individual needs of all their students. Although SES is a newer program relative to 21st CCLC and less often studied, it appears to be implemented typically as one-on-one or small-group tutoring (again, implementation varies widely), and the results from the states that have attempted to evaluate the effects of SES, as mandated by law, do not show significant impacts on state assessments.291

In summary, the evidence demonstrates positive effects associated with a total of eight programs that adapted instruction to individuals and small groups to some degree292 and mixed effects in three other programs;293 however, because adapting instruction was always a component of a multicomponent intervention and because adapting instruction did not consistently demonstrate significant positive effects across every study reviewed,294 the panel is cautious and acknowledges that the level of evidence is moderate.

Positive evidence for individualizing instruction in OST programs

The eight randomized controlled studies of OST programs that included individualized instruction and showed positive effects looked at after-school study skills programs, mentoring programs, summer programs, and literacy programs (either one-on-one with a tutor or with a computer), or some combination.295 The grades studied were elementary in all programs except one,296 and six of the eight elementary school programs were for grades K–3. The sample sizes for all of these studies were relatively small; all but two of the eight had fewer than 100 students in their total analysis sample.297

The four programs judged to be in higher correspondence include two after-school tutoring programs (Howard Street Tutoring

289. Zimmer et al. (2007).
290. U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy, Dynarski, and Deke (2007).
291. Tennessee—Ross et al. (2008); Kentucky—Muñoz, Potter, and Ross (2008); Virginia—McKay et al. (2008).
292. KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006).
293. Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004).
294. Leap Frog—McKinney (1995); NDP—Udell (2003); 21st CCLC—U.S. Department of Education (2003); 21st CCLC—Dynarski et al. (2004); 21st CCLC—James-Burdumy, Dynarski, and Deke (2007); SES—McKay et al. (2008); SES—Ross et al. (2008); SES—Muñoz, Potter, and Ross (2008).
295. KindergARTen—Borman, Goetz, and Dowling (2008); Early Risers—August et al. (2001); Summer Reading Day Camp—Schacter and Jo (2005); Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000); Howard Street Tutoring—Morris, Shaw, and Perney (1990); Fast ForWord—Slattery (2003); CHP—Langberg et al. (2006).
296. Langberg et al. (2006).
297. Schacter and Jo (2005) had 118 students in their first follow-up of participants in a summer reading day camp, and August et al. (2001) had 201 students in their summer program with school-year mentoring.


and Fast ForWord), an after-school study skills program (CHP), and a summer program with mentoring. August et al. (2001) studied Early Risers, a summer school and school-year mentoring program for 1st and 2nd graders. Program staff members monitored the academic, behavioral, and social progress of each student. They assessed students after each session by rating aspects of their performance such as engagement and task completion. Staff were trained to do the student assessment and monitoring required by Early Risers through preprogram training, manuals, and ongoing support. The study demonstrated significantly increased academic competence for the treatment group relative to a comparison group.

Langberg et al. (2006) studied CHP, which provided academic remediation in small groups for 6th and 7th graders with a combination of learning and behavior problems. CHP limited class sizes to 12 students per instructor but also broke classes into smaller groups of 3 students for skill development exercises. The staff members were provided with training and manuals to guide their instruction. The program showed no statistically significant effects on teacher ratings of academic progress because of the small sample size, but the effect size (calculated using Hedges's g,298 a difference-in-differences method) was positive at 0.45.299 In order to acknowledge meaningful effects regardless of sample size, the panel followed WWC guidelines and considered a positive statistically significant effect or an effect size greater than 0.25 as an indicator of positive effects.

In the Howard Street tutoring program, instructors constantly adapted the pace and content of instruction with their assessment of the student's current skill level, advancing or reviewing depending on each student's development.300 Furthermore, all volunteer tutors were given continued support via ongoing monitoring by the reading specialist supervisor, who also planned each student's lessons individually. The Morris, Shaw, and Perney (1990) evaluation included five literacy measures. After adjusting for multiple comparisons, only on reading Basal Passages did the treatment group do significantly better than the control group on the posttest, but effect sizes were greater than 0.25 on all five posttest measures.

Finally, among the higher correspondence programs, Slattery (2003) studied the Fast ForWord computer program in an after-school context that was chosen by the school so that students in need of extra help would not miss any instruction during the school day. Fast ForWord is a computer program specifically designed to monitor individual performance on a daily basis and to modify the activities suggested to each student accordingly. Slattery (2003) found large, positive, significant effect sizes for both literacy measures that she used.

The main reason why four of these programs were judged to be in lower correspondence was a lack of detail in the evaluations describing these types of practices. For instance, in two cases, the programs used one-on-one tutoring, but there is no indication whether tutors were trained to do formative assessments of their students and to use these assessments to guide their instruction.301 Similarly, in two summer programs with small groups, it simply was not known how adaptive the programs were in their actual implementation.302

298. See Appendix B of the WWC Procedures and Standards Handbook, http://www.ies.ed.gov/ncee/wwc/references/idocviewer/Doc.aspx?docId=19&tocId=8.
299. Langberg et al. (2006).
300. Morris, Shaw, and Perney (1990).
301. Experience Corps—Meier and Invernizzi (2001); SMART—Baker, Gersten, and Keating (2000).
302. Schacter and Jo (2005); Borman, Goetz, and Dowling (2008).
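The 0.25 benchmark discussed above refers to Hedges's g, the pooled-standard-deviation-standardized mean difference with a small-sample correction. The sketch below illustrates the general computation as given in the WWC handbook; the numbers are invented for illustration and are not taken from any of the studies cited here.

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Standardized mean difference with Hedges's small-sample correction."""
    # Pooled standard deviation across the treatment and control groups
    s_pooled = math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                         / (n_t + n_c - 2))
    # Small-sample correction factor (omega in the WWC handbook)
    omega = 1 - 3 / (4 * (n_t + n_c) - 9)
    return omega * (mean_t - mean_c) / s_pooled

# Invented example: a half-standard-deviation raw difference between two
# groups of 20 students shrinks slightly after the correction.
print(round(hedges_g(105.0, 100.0, 10.0, 10.0, 20, 20), 2))
```

This illustrates why the panel's 0.25 threshold matters for small studies: an effect near 0.5 in a 40-student sample may fail a significance test yet still count as substantively important under WWC guidelines.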


Nevertheless, they may have been, and it is notable that all the programs detailed in this section did find positive effects on some measures of academic achievement.

The Meier and Invernizzi (2001) study of Experience Corps and the Baker, Gersten, and Keating (2000) evaluation of SMART were both studies of after-school one-on-one tutoring programs in reading with 1st graders. In the Experience Corps Book Buddies program, tutoring sessions were designed to adapt to the tutee's individual learning pace and needs and to evolve with the progress in their learning.303 After just one semester of tutoring (approximately 40 sessions lasting 45 minutes each), the first treatment group significantly outperformed the control group on early literacy measures.304 The SMART program in Oregon is a two-year, one-on-one tutoring program in reading. The tutors were volunteers and were given minimal initial training and usually received very little or no ongoing support. The authors reported positive academic effects on various reading measures, but after applying WWC standards for multiple comparisons, these effects were no longer statistically significant.305 However, their effect sizes were larger than 0.25; therefore, the impacts are considered substantively important following WWC guidelines.

Schacter and Jo (2005) studied a seven-week reading summer camp for 1st graders. Exercises during reading time were designed to address specific skill sets; for certain exercises, students were matched according to their skill levels for paired and small group reading instruction. The study found large positive effects on decoding and reading comprehension in follow-up test scores from September.

Borman, Goetz, and Dowling (2008) randomly assigned exiting kindergarten students to participate in either a six-week summer program or a no-treatment control group. The program found positive effects on some literacy skills.306 Class sizes were limited to 10, and the pacing of the morning literacy block was determined by student needs. College student interns, who worked with certified teachers, participated in weekly professional development workshops with the teachers and other experts. They also had a four-week training program on curricula/instruction, assessment, classroom management, parent involvement, and team building before the summer camp began.

Inconclusive evidence for one-on-one tutoring in OST programs

Two other small-sample studies on after-school one-on-one or one-on-two programs

303. Meier and Invernizzi (2001).
304. In the next semester, the groups switched places and the control group received 40 sessions of tutoring. By June, control group students had caught up with those in the original treatment group. The authors point out that although it is encouraging that students could benefit from the tutoring during either semester of the first year, it is curious that students in the first treatment group were not able to build on the tools that they received from the extra tutoring and outpace the original controls. They concluded that the question is outside the scope of their study, but they do point out that even though both groups made progress, at the end of their first year at school, even with 40 sessions of one-on-one tutoring, both groups were essentially only at a pre-primer level.
305. Baker, Gersten, and Keating (2000).
306. Borman, Goetz, and Dowling (2008) had five measures of early literacy: letter naming, phoneme segmentation, word list, developmental reading assessment (DRA), and dictation. There were no significant differences found between the treatment and control groups on letter naming, phoneme segmentation, or dictation. After controlling for multiple comparisons following WWC guidelines, the average pre-to-post difference for word lists was significantly greater for the treatment group compared with the control group. Also, both word lists and DRA had estimated effect sizes greater than 0.25 using Hedges's g (for technical details on both computations, see http://ies.ed.gov/ncee/wwc/references/iDocViewer/Doc.aspx?docId=20&tocId=6).


found no effects but were judged to have lower levels of correspondence.307 McKinney (1995) conducted a randomized controlled trial of Leap Frog, an after-school program in which university students provided one-on-one tutoring and homework assistance to 1st and 2nd graders, but since teachers from the school assigned topics and assignments for the tutors to cover, this program could be only as adaptive as the teachers were. Udell (2003) used a quasi-experimental design to study the NDP program, which provided one-on-two reading tutoring to 2nd-grade students after school. Student assessment and progress monitoring were emphasized in the program, and tutors were trained to assess their students' strengths and needs and to determine appropriate educational objectives and lessons based on those assessments. The tutors also were trained in effective instructional and motivational strategies, and they were provided with ongoing coaching from an experienced schoolteacher. However, this was a relatively short program consisting of fewer than 20 hours of literacy tutoring.

Mixed evidence on individualizing instruction in OST programs

The next three programs are larger studies of two summer school programs (Teach Baltimore and Chicago Summer Bridge) and one after-school program (Enhanced Academic Instruction). Only the Enhanced Academic Instruction program was considered highly relevant to this recommendation.

Borman and Dowling (2006) failed to find overall effects of the Teach Baltimore program on reading despite attempts to individualize instruction, but they did find positive academic effects when comparing high attendees to students whom they judged more likely to have been high attendees (see discussion in this appendix for recommendation 2). This could be interpreted as a somewhat cautionary note of support for some elements of the program. It is not possible to disentangle which components of the Teach Baltimore program are responsible for its success with high attendees. Whether this should be attributed to the adaptive nature of the instruction, the engaging elements of the instruction, the longevity of the program (available to students for three summers in a row after 1st, 2nd, and 3rd grades), interactions of various components, or something else entirely is not known.

The Teach Baltimore summer program used a curriculum that was designed to be adaptive to varying levels of achievement and provided instructors with materials they could use to supplement core instruction based on students' progress. The program also provided three weeks of training to its instructors (undergraduate students) prior to the program, and the maximum class size was held to eight students.308

Jacob and Lefgren (2004) studied Chicago's Summer Bridge program and found effects in math and reading for 3rd graders but not for 6th graders. The program was designed to provide remedial instruction in small classes of around 16 students.309 In a qualitative study of twenty-six 6th graders who participated in Summer Bridge, some students reported receiving individualized instruction, and a few reported frustration with the overall slow pace of the classes and their resulting inability to learn according to their needs.310 Despite the relatively small class sizes compared with in-school instruction, the program was narrowly focused on pacing and keeping

307. McKinney (1995); Udell (2003).
308. Borman and Dowling (2006).
309. Roderick, Jacob, and Bryk (2004).
310. Stone et al. (2005). Of the twenty-six 6th graders interviewed, 10 were classified as having positive experiences, 13 with neutral experiences, and 3 with negative experiences. Only the ones with overall negative experiences mentioned frustration with the pacing.


up with a specific, highly structured, centrally dictated curriculum and may not actually have been adequately adaptive to student needs.

Black et al. (2008) studied an after-school program of Enhanced Academic Instruction in math and reading (separate treatment groups) and found small positive effects for the math curriculum but not for the reading curriculum. Both curricula were designed to assess students before and after lessons, to provide instruction in small groups (an average of nine students could be broken down into smaller groupings) separated by grade or skill, and to allow time for students to work independently with self-monitoring and supervision from an instructor. Instructors were provided with training, manuals, and multiple sources of ongoing support to enable effective implementation of the program. However, the comparison group also received after-school programming with a low teacher-to-student ratio.311 Nevertheless, a definite service contrast between the treatment group and the control group was the use of formative diagnostic testing to influence the direction of the instruction, since after-school programs were specifically chosen to host the Enhanced Academic Instruction program only if they did not use diagnostic tests to guide their programming in their regular offerings.

Cautionary evidence on individualizing instruction in out-of-school time programs

This section discusses the 21st CCLC programs and SES, neither of which has demonstrated positive academic effects.

A large number of 21st CCLC programs were evaluated in the national study.312 As expected, the implementation of the program was not consistent across all the programs. Although an academic component was mandated by law, in practice, many programs put a strong emphasis on recreation because it was more attractive to students.313 Programs that did focus on academics often provided tutoring in small groups of 5 to 7 students or test preparation classes in groups of 7 to 10. They also may have made an effort to monitor student progress in the program and at school and to align instruction with input from the schoolteacher. Many programs required students to participate in the academic session before any recreational activities, but the academic session could have been as simple as unstructured time to complete homework with minimal follow-up from a staff member.

The literature on SES, an intervention that is typically but not always provided one-on-one or in small groups, is not encouraging. Zimmer et al. (2007) found some positive effects in a large national study that does not meet WWC standards because the treatment and comparison groups were not equivalent initially.314 Three quasi-experimental studies that met WWC standards with reservations found no academic effects; the authors of two of these studies suggest that the dosages of SES provided in the locales they studied (cities in Tennessee and in Louisville, Kentucky)

311. In the math study, the treatment group ratio was 1:9 compared to 1:11 in the regular after-school program. In the reading study, the treatment group ratio was 1:9 compared to 1:14 for the regular group.
312. U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy, Dynarski, and Deke (2007).
313. U.S. Department of Education (2003).
314. As mandated by law, if all students eligible for SES cannot receive services, schools must prioritize lower-performing students to receive services first. Zimmer et al. (2007), in their study of SES in nine large, urban districts, did find that at baseline, the eligible students who received SES had significantly lower achievement levels than their comparison group peers, eligible students who did not receive SES.


were insufficient.315 However, the dosages (30–40 extra hours a year) are comparable to at least one of the programs with positive effects mentioned earlier, the Experience Corps Book Buddies program.316 The authors also suggest that the lack of results could stem from a lack of sensitivity of standardized state tests to small changes in achievement levels.317

Supplemental evidence from other sources

Three studies were gleaned for illustrations of practices relevant for individualizing instruction even though they did not pass WWC screens for this guide.318 One study looked at the Early Start to Emancipation Preparation (ESTEP) tutoring program, which was organized at community colleges instead of by a school or district.319 The program incorporated frequent assessment of students, including diagnostics, weekly progress tests, and other periodic checks. It also emphasized instructing students at their own pace and trained tutors accordingly. The Project CARES (Career Awareness Resources for Exceptional Students) program also did diagnostic needs assessments of students and included student-paced instruction and computer-assisted adaptive teaching.320 The study of the Boys and Girls Club of Broward County program described the instructors grouping students by achievement levels whenever possible during academic instruction.321

Three programs with explicit staff development practices provided concrete examples of implementation strategies even though the evaluations of these programs either did not pass WWC screens322 or did not meet WWC standards.323 Evaluations of Communities Organizing Resources to Advance Learning (CORAL) have been purely qualitative or lacked a comparison group.324 The evaluations of the other two programs, Building Educated Leaders for Life (BELL) and Partnership Education Program (PEP), either did not measure initial achievement levels (BELL)325 or had treatment and comparison groups that were not equivalent

315. Tennessee—Ross et al. (2008); Louisville, Kentucky—Muñoz, Potter, and Ross (2008). The third study, McKay et al. (2008), did not provide any information on dosage.
316. The Experience Corps Book Buddies program in the Bronx, New York (Meier and Invernizzi 2001) found positive effects of forty 45-minute tutoring sessions (30 hours total) over the course of a semester. The Howard Street Tutoring Program (Morris, Shaw, and Perney 1990) met with students twice a week for one hour. The program lasted from early October until late May; potentially, the tutees had as much as 54 hours of extra instructional time over the course of the school year (2 hours a week for 32 weeks, subtracting 2 weeks for the winter holidays and 1 each for spring break and for the first and last weeks of the eight-month time span). Unfortunately, data on actual extra tutoring time are not provided.
317. Tennessee—Ross et al. (2008); Louisville, Kentucky—Muñoz, Potter, and Ross (2008).
318. Courtney et al. (2008); New York City Board of Education (1991); Brown (2002).
319. Courtney et al. (2008). This study was out of scope for the practice guide because of age range.
320. New York City Board of Education (1991). The study did not pass screens because it lacked a comparison group and seemed to be targeting high school students.
321. Brown (2002). The study did not have a comparison group, nor was it school based.
322. Arbreton et al. (2008); Sheldon and Hopkins (2008).
323. Chaplin and Capizzano (2006); Tucker (1995).
324. Arbreton et al. (2008) and Sheldon and Hopkins (2008) did not pass WWC screens for this reason.
325. Chaplin and Capizzano's (2006) randomized controlled trial did not meet standards because it suffered from high attrition and had not measured initial achievement levels, so it could not demonstrate that the treatment and control groups were equivalent at baseline. Furthermore, the posttest achievement tests for students in the control group were administered on a different date than those for the treatment group and were, therefore, not directly comparable.


before the program began (PEP).326 Hence, neither study met WWC standards.

In the BELL program, instructors were provided with training and manuals to prepare them for teaching in the summer program.327 In PEP, there was a strong emphasis on observing after-school instruction of students and monitoring for quality, as well as coaching instructors in the proper instructional practices.328 CORAL, an after-school literacy program for grades K–8, provided individualized training and coaching to its instructors by grouping them by experience and prior training and tailoring the training to the needs of the instructors.

Recommendation 4. Provide engaging learning experiences

Level of Evidence: Low

The panel judged the level of evidence for this recommendation to be low. Studies of the types of activities covered in this recommendation have demonstrated that they are effective in laboratory and school-day settings. However, the evidence for whether these practices are effective is mixed in the OST context. Although many programs identified making instruction engaging as a key program goal, very few demonstrated consistently positive effects, and none linked positive effects directly to the use of the strategies outlined in this recommendation. The panel believes programs can be more successful if they implement engaging learning strategies that are integrated into the programs' academic components.

Summary of evidence

Several studies, conducted outside of the OST arena, have examined the effectiveness of different strategies to increase student motivation, engagement, and academic success.329 In the OST context, however, the evidence supporting the use of engaging activities has been mixed (see Table D6). Five programs documented practices highly aligned with those recommended by the panel; these either made a deliberate effort to integrate engaging practices with academic content or intentionally developed relationships between students and OST staff with the objective of engaging students with school and learning.330 Of these, three programs demonstrated positive academic effects,331 and two showed mixed effects.332 Six other programs had practices similar to the panel's recommendations, but they either did not provide enough descriptive evidence to determine whether they were highly aligned with the panel's recommendations or contained some evidence that the strategies were used in a way that was inconsistent with the panel's recommendations.333 One of these showed positive effects,334 one

326. Tucker (1995).
327. Capizzano et al. (2007).
328. Tucker (1995).
329. Cordova and Lepper (1996); Anderson (1998); Guthrie et al. (1999); Guthrie, Wigfield, and VonSecker (2000); Battistich et al. (1997); Klem and Connell (2004); Helme and Clark (2001); Blumenfeld and Meece (1988); Connell and Wellborn (1991).
330. Early Risers—August et al. (2001); Developmental Mentoring—Karcher, Davis, and Powell (2002); Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006); KindergARTen—Borman, Goetz, and Dowling (2008).
331. Early Risers—August et al. (2001); Developmental Mentoring—Karcher, Davis, and Powell (2002); KindergARTen—Borman, Goetz, and Dowling (2008).
332. Enhanced Academic Instruction—Black et al. (2008); Teach Baltimore—Borman and Dowling (2006).
333. Summer Reading Day Camp—Schacter and Jo (2005); Chicago Summer Bridge—Jacob and Lefgren (2004); L.A.'s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003).
334. Summer Reading Day Camp—Schacter and Jo (2005).


Table D6. Studies of programs cited in recommendation 4 that met WWC standards with or without reservations

| Program, Brief Citation | Program Type | Grades Studied (a) | Level of Relevance (b) | Integrated Engaging Components with Academic Components | Examples of Activities Used to Make Learning Relevant | Examples of Active Learning Experiences | Fostered Positive Adult-Student Relationships to Increase Engagement in Learning |
|---|---|---|---|---|---|---|---|
| Positive Academic Effects | | | | | | | |
| KindergARTen, Borman, Goetz, & Dowling (2008) | Summer | K | √√ | Yes | Themes, field trips | Collaborative learning | |
| Summer Reading Day Camp, Schacter & Jo (2005) | Summer | 1 | √ | | Personalize learning to student experiences | Hands-on, exploratory learning | |
| Early Risers, August et al. (2001) | Summer and school-year mentoring | K–2 | √√ | | | Participation, role-play | Teachers, parents |
| Developmental Mentoring, Karcher, Davis, & Powell (2002) | Summer and school-year mentoring | 5 | √√ | | Themes, personalize learning to student experiences | | Teachers, parents |
| Mixed Academic Effects | | | | | | | |
| Teach Baltimore, Borman & Dowling (2006) | Summer | K–2 | √√ | Yes | Field trips | Hands-on, games | |
| Chicago Summer Bridge, Jacob & Lefgren (2004) | Summer | 3, 6 | √ | | | Interactive questioning, collaborative learning | |
| Enhanced Academic Instruction, Black et al. (2008) | After school | 2–5 | √√ | Yes | Personalize learning to student experience | Hands-on, games, computers | |
| No Detectable Academic Effects | | | | | | | |
| NDP, Udell (2003) | After-school tutoring | 2 | √ | | | | Program staff |
| L.A.'s BEST, Goldschmidt, Huang, & Chinen (2007) | After school | K–6 | √ | | | Recreational activities—non-linked | |
| 21st CCLC, U.S. Department of Education (2003); Dynarski et al. (2004); James-Burdumy et al. (2005) | After school | K–8 | √ | No | | Engaging programming | |
| YS-CARE, Bissell et al. (2002) | After school | 1–5 | √ | | Choice | Enrichment activities | |
a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.
b. Indicates relevance of recommendation to program operation. “√√” indicates high relevance; “√” indicates lower relevance.


showed mixed effects,335 and four showed no effects.336 Although the evidence is somewhat mixed, on average, those with positive or mixed effects seemed to be more closely aligned with the strategies recommended by the panel.

Evidence from non-OST studies

Engagement is a multidimensional concept that can refer to behavior (ranging from following classroom rules to actively participating in class and the school community),337 emotions (affective responses related to learning, including excitement, boredom, and identification with school),338 and cognition (effort and motivation to learn).339 Many studies have shown that engagement is correlated with improved academic outcomes for students340 and that behavioral disengagement is associated with poor academic performance.341 Activities that include choice and cooperation between staff and students have been linked theoretically to student engagement and connection to school.342 Likewise, teacher support and caring are associated with increased engagement.343 There is some evidence that autonomy and choice encourage students to persist and engage more with learning activities.344 Other researchers have argued that collaboration, “opportunities for fun,” “novel tasks that have personal meaning,” hands-on activities, and tasks that build on individual talents can increase engagement.345

In a study using student records and survey data, Klem and Connell (2004) looked at how student engagement and levels of teacher support for students correlated with future academic achievement in six elementary and three middle schools in an urban district. They found that more highly engaged students had higher achievement than did those who were less engaged (based on self- and teacher reports of engagement). Additionally, the authors found that when students perceived their teachers to be supportive, the students tended to be more engaged in school.346

The panel also considered two specific experimental studies to supplement the evidence on this recommendation. The first examined adding personalized components (such as the student’s name or birthday), choice, and embellishments designed to pique student interest in a computer game designed to teach order of operations. When the game included these components, students demonstrated increased engagement in and learning from the tasks.347 In the second study, students in six classrooms at two schools were randomly assigned to either a control condition that received the traditional science textbook or one of three treatment conditions. One treatment group received interesting texts about the animals

335. Chicago Summer Bridge—Jacob and Lefgren (2004).
336. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002); 21st CCLC—U.S. Department of Education (2003); NDP—Udell (2003).
337. See Finn (1993); Finn, Pannozzo, and Voelkl (1995) for more on behavioral engagement.
338. See Finn (1989), Connell and Wellborn (1991), and Skinner and Belmont (1993) for more on emotional engagement.
339. See Connell and Wellborn (1991) and Newmann et al. (1992) for more on cognitive engagement.
340. Connell, Spencer, and Aber (1994); Marks (2000); Wellborn and Connell (1990); Connell and Wellborn (1991).
341. Finn, Pannozzo, and Voelkl (1995); Finn and Rock (1997).
342. Newmann et al. (1992).
343. Battistich et al. (1997); Klem and Connell (2004).
344. Turner (1995); Perry (1998).
345. See, for example, Newmann (1991); Helme and Clark (2001); Blumenfeld and Meece (1988); Guthrie, Wigfield, and VonSecker (2000); Connell and Wellborn (1991). Most of the information in this paragraph is drawn from Fredricks, Blumenfeld, and Paris (2004), which reviews the available research on student engagement in school environments.
346. Klem and Connell (2004).
347. Cordova and Lepper (1996).

( 75 )

students were studying, another was given time to observe animals, and the third received both the interesting texts and the observation opportunities. All groups were instructed by the researcher and rotated to work with her in small groups in a separate classroom over the course of a one-week intervention.348 The researcher reported increased conceptual knowledge among students receiving the interesting texts and the combination of observation and interesting texts, relative to the control group, but not among the students in the observation-alone group. This study provides some evidence for the use of engaging instructional materials and opportunities for real-world interactions, but it is limited by a small sample, short duration, and possible contamination of the groups when students returned to the school-day classroom and had opportunities to discuss their experiences in small groups.349

Finally, two correlational studies provide additional support for the incorporation of science observation and engaging instruction into classroom learning. Concept-Oriented Reading Instruction (CORI) integrates science content and engaging instruction (including real-world, hands-on activities and collaboration) into reading instruction.350 Guthrie et al. (1999) found that students instructed using these methods showed greater concept learning and comprehension gains than did students in traditional instruction classrooms. In another study, the same strategies produced gains in motivation and engagement for participating students.351

Positive OST evidence for engaging instruction

Four programs with engaging instruction showed positive effects in randomized controlled trials.352 Three of the four evaluations discussed practices that were closely related to this recommendation,353 and the fourth study did not contain enough detail to judge the level of integration of engaging practices in the academic instruction.354 Two of the closely related programs that showed positive effects worked intentionally to engage and connect students with learning by developing positive adult and peer relationships through mentoring and outreach to parents, as recommended by the panel in the third action step for this recommendation.355

Early Risers is a two-year tutoring and mentoring program that recruits children with early onset aggressive behavior at the end of kindergarten.356 The study was conducted in two locations, each with 10 schools randomly assigned to treatment and control groups, and found consistently positive academic effects. The program includes a six-week summer component and additional support for students, families, and teachers during the school year. Students were paired with peer mentors to collaborate daily on specific tasks designed to develop social skills but which also might have contributed to the development of relationships and positive engagement within a school environment

348. Anderson (1998) acknowledges the possibility of experimenter effects unintentionally influencing the results of the study.
349. Ibid.
350. Guthrie et al. (1999).
351. Guthrie, Wigfield, and VonSecker (2000).
352. August et al. (2001) evaluated a summer program with year-round mentoring; Karcher, Davis, and Powell (2002) and Karcher (2005) studied a developmental mentoring program; Borman, Goetz, and Dowling (2008) and Schacter and Jo (2005) evaluated different summer programs.
353. August et al. (2001); Karcher, Davis, and Powell (2002); Borman, Goetz, and Dowling (2008).
354. Schacter and Jo (2005).
355. August et al. (2001); Karcher, Davis, and Powell (2002); Karcher (2005).
356. August et al. (2001).

( 76 )

that the panel believes will promote academic success. The program also provides teachers and parents with strategies to manage children’s behavior, ultimately intending to develop stronger relationships with other adults. Fidelity-monitoring checklists measured whether summer staff members encouraged student participation or used role-play, both components of the engaging instruction recommended by the panel.357

Two studies looked at the effectiveness of a Saturday and summer school program that used developmental mentoring to increase connectedness and academic achievement.358 The first study demonstrated small positive academic effects, and the second study showed that the program increased students’ connectedness on multiple dimensions. In Karcher et al. (2002), 30 average or above average achieving 5th-grade students from three schools with high dropout rates were randomly assigned to either a no-treatment comparison group or to the program, which consisted of monthly Saturday programming and a two-week summer session with high school mentors. The program included academic enrichment and social connectedness components. Participating students integrated math, science, writing, and computer skills into a final project over the summer. Students also completed writing projects that incorporated social and cultural experiences, future careers, and negotiation and cooperation. The program encouraged parents to attend programming and field trips. A second study of a similar program produced greater connectedness to school and learning but did not measure whether there were academic achievement gains.359

The other closely related program documented an intentional effort to integrate field trips and activities with the academic goals of the program.360 Borman, Goetz, and Dowling (2008) randomly assigned exiting kindergarten students to participate in either a six-week summer program or a no-treatment control group. The program found positive effects on some literacy skills.361 The program, intended to “build a love of learning,” included weekly field trips and theme-based afternoon activities designed to build background and content knowledge for the literacy instruction conducted each morning.362 Poetry and songs, thematic learning, and collaborative learning activities were regular program components.

Finally, Schacter and Jo (2005) studied a seven-week summer reading day camp in which students exiting 1st grade in three high-poverty schools with large minority populations in south Los Angeles were instructed using writing activities that incorporated everyday life experiences and creative writing opportunities.363 Play and exploration were encouraged. Students were randomly assigned either to participate in the camp (maximum enrollment was limited to 72 students because of the amount of funding) or to serve as a control group. The study found large positive effects on

357. Ibid.
358. Karcher, Davis, and Powell (2002); Karcher (2005).
359. Karcher (2005).
360. Borman, Goetz, and Dowling (2008).
361. Borman, Goetz, and Dowling (2008) had five measures of early literacy: letter naming, phoneme segmentation, word list, developmental reading instruction (DRA), and dictation. There were no significant differences found between the treatment and control groups on letter naming, phoneme segmentation, or dictation. After controlling for multiple comparisons following WWC guidelines, the average pre-to-post difference for word lists was significantly greater for the treatment group compared to the control group. Also, both word lists and DRA had estimated effect sizes greater than 0.25 using Hedges’s g (for technical details on both computations, see http://ies.ed.gov/ncee/wwc/references/iDocViewer/Doc.aspx?docId=20&tocId=6).
362. Borman, Goetz, and Dowling (2008), p. 2.
363. Schacter and Jo (2005).

( 77 )
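Footnote 361 above reports effect sizes computed with Hedges’s g, the standardized mean difference with a small-sample correction used in WWC effect-size computations. As an illustration only, a minimal sketch of that computation (the summary statistics below are hypothetical, not taken from Borman, Goetz, and Dowling (2008)):

```python
from math import sqrt

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Hedges's g: standardized mean difference between a treatment
    and a control group, with a small-sample correction factor."""
    # Pooled within-group standard deviation
    sd_pooled = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2)
                     / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sd_pooled            # uncorrected Cohen's d
    correction = 1 - 3 / (4 * (n_t + n_c) - 9)   # small-sample correction
    return d * correction

# Hypothetical groups: a half-standard-deviation difference with
# 40 students per condition yields g just under 0.5.
g = hedges_g(mean_t=55.0, mean_c=50.0, sd_t=10.0, sd_c=10.0, n_t=40, n_c=40)
print(round(g, 3))
```

Under the WWC convention cited in the footnote, an effect of this size would exceed the 0.25 threshold regardless of statistical significance.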

decoding and reading comprehension in follow-up test scores from September.

Mixed support for engaging instruction in OST

Three large studies provide mixed support for the use of engaging instructional practices in OST.364 In two, engaging instructional techniques were intentionally integrated with the academic content of the program.365 In the third, supplemental information from qualitative interviews with students suggests that engaging instruction in the OST offering might be more common than during school, but this was not represented as a core objective of the program.366

Chicago Summer Bridge showed positive effects on student achievement among the district’s lowest-performing 3rd-grade students, but not among 6th graders.367 Participating students reported improved relationships with teachers and indicated that they received more encouragement to achieve from summer instructors than from their teachers during the school year. Student interviews also suggested that some instructors were able to keep students engaged by using interactive questioning to gauge student understanding and collaborative pairing to support the needs of both struggling and more advanced students; however, none of these components was presented as a major or consistent component of Chicago Summer Bridge.368

A randomized controlled trial of the Teach Baltimore program showed no positive effects on reading, but a quasi-experimental component showed positive significant effects for students who attended the summer program more frequently. In the program, students participated in hands-on math and science learning projects and educational games in the afternoon and also learned new skills and knowledge through field trips to museums, which were integrated with classroom activities. Students also incorporated experiences beyond the classroom and neighborhood into the learning.369

A study of Enhanced Academic Instruction in After-School Programs randomly assigned students to receive either a structured enhanced curriculum or a regular after-school day offering. The instructional materials and program were intentionally designed to be engaging and related to academic standards, which led to the inclusion of games, hands-on activities, and computers, and program staff members were evaluated on instructional techniques, including whether they made efforts to link academic concepts to student experiences. The program demonstrated small math effects; however, a parallel intervention for reading, which included many similar features and also was intended to engage students with academic content, showed no effects. This suggests caution in interpreting the study findings as evidence for the effectiveness of engaging instruction.370

Inconclusive evidence for engaging instruction in OST

Four programs at least partially implemented the panel’s recommendations and showed no effects, but studies of each either provide evidence that engaging activities were not clearly linked to academic instruction or do not contain enough detail for the panel to judge their alignment with

364. Borman and Dowling (2006); Black et al. (2008); Jacob and Lefgren (2004).
365. Borman and Dowling (2006); Black et al. (2008).
366. Stone et al. (2005).
367. Jacob and Lefgren (2004).
368. Stone et al. (2005); Roderick et al. (2004).
369. Borman and Dowling (2006).
370. Black et al. (2008).

( 78 )

the recommended practices.371 A quasi-experimental study of the NDP program, which provided one-on-two or one-on-one reading tutoring from undergraduate tutors to 2nd-grade students after school, found no effects compared to a comparison group that received small-group activities not related to literacy.372 The study sample was small (27 students in the treatment and control groups combined), and the treatment duration was short (twice a week for 10 weeks). In the program, tutors used opportunities for creative expression to supplement reading instruction. Based on qualitative reflections from tutors, the researcher observed that as the tutors developed relationships with their students, they were better able to engage them in the instruction. However, these relationships were brief and not described as a goal of the program. There is no information in the study to determine whether fostering engagement with school or learning was an intentional or widespread component of the program.373

Like NDP, studies of L.A.’s BEST and YS-CARE do not provide enough detail to determine the level of integration between academic and engaging components of the program.374 A quasi-experimental study of L.A.’s BEST found no consistent positive academic effects.375 Although some engaging activities were not strongly tied to academic programming (recreational activities included cooking, holiday activities, and crafts, for example376), promoting student engagement was identified as a program goal in supplementary studies of L.A.’s BEST,377 and academic activities were intended to promote a “love of learning” and build relationships between students and staff.378 Reports on L.A.’s BEST do not contain enough details about the actual implementation to judge the degree to which engaging activities were explicitly linked to academic instruction.379 Similarly, an evaluation of YS-CARE found positive but nonsignificant findings in the first year.380 The program included diverse enrichment programs intended to broaden students’ knowledge base, among other objectives, and the enrichment staff traveled from site to site, but the specifics of these practices were not outlined in sufficient detail to determine the degree to which engaging practices were implemented in the program.381

Finally, an evaluation of the 21st CCLC after-school program showed no consistent positive academic effects among elementary or middle school students but contains details that suggest that engaging practices were not fully integrated with the academic components of the programs. Participating middle school teachers reported improved relationships with students. Middle school site observations by the evaluation team revealed that student engagement was often higher during recreational and cultural activities, which were often intentionally disconnected from academic components, than during academic classes. In most programs, students were allowed to select activities to increase program appeal, but choice often was limited to the recreation and cultural components. Although all programs offered some sort of academic assistance, it often was limited to homework help sessions during which students worked independently

371. Udell (2003); Bissell et al. (2002); Goldschmidt, Huang, and Chinen (2007); U.S. Department of Education (2003).
372. Udell (2003).
373. Ibid.
374. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002).
375. Goldschmidt, Huang, and Chinen (2007).
376. Ibid.
377. Huang et al. (2007).
378. Goldschmidt, Huang, and Chinen (2007), p. 9.
379. Huang et al. (2007); Goldschmidt, Huang, and Chinen (2007).
380. Bissell et al. (2002).
381. Ibid.

( 79 )

with little direct support from instructors. In elementary school, homework sessions were similarly structured and sometimes chaotic. In short, although many programs strove to provide engaging experiences for participants, these experiences often were disconnected from the academic components of the centers.382

Supplemental evidence from other sources

Other studies provided examples of engaging practices, but they either did not meet the OST protocol for inclusion or failed to meet WWC standards with or without reservations. A number of meta-analyses have reviewed the effectiveness and characteristics of different OST programs.383 One found that programs that used social and real-world examples had a positive impact on math.384 Another reviewed 73 reports with control groups on after-school programs that sought to improve personal and social skills. The authors found that the strongest predictor of increased academic achievement was the presence of an academic component in the program.385 The authors also found that programs that were active or included opportunities such as role-play and hands-on practice were among those that produced changes in student outcomes.

In Project Adelante, a summer program and Saturday academy for recent Latino immigrants, promoting positive cultural identity was incorporated across courses and enrichment activities. All programming was aligned with a yearly theme. Students worked with mentors and instructors from similar backgrounds who served as role models of successful Latino professionals to improve student aspirations and expose them to real-world careers. The program coordinated parent nights that included staff from the students’ schools, as well as meetings with parents to help them address concerns about the situations their children faced in school and to help them effectively manage conflicts that emerged as their children adapted to a new culture.386

The BELL program includes considerable academic instruction, described by the study authors as “culturally relevant”387 and arranged around specific themes, field trips, and guest speakers to supplement academic content.388 Members of the panel reported observing particularly dynamic instructional settings.

The TASC program showed consistently positive effects for math, but not for reading.389 Roughly three-quarters of the site coordinators reported using a theme to link different program activities across years 2 to 4 of the project and that students participated in group work.390 Although homework help was reported as the activity with the most intense emphasis across TASC programs, recreational reading, math

382. U.S. Department of Education (2003).
383. Meta-analyses are ineligible for review under current WWC standards.
384. Lauer et al. (2004).
385. Durlak et al. (2007).
386. Center for Applied Linguistics (1994). This study failed to pass WWC screens because it did not contain a comparison group.
387. Chaplin and Capizzano (2006), p. 5.
388. Chaplin and Capizzano (2006). This randomized controlled trial did not meet standards because it suffered from high attrition, did not measure initial achievement levels, and could not demonstrate that the treatment and control groups were equivalent initially. Furthermore, the posttest achievement tests for students in the control group were administered on a different date than those for the treatment group and were, therefore, not directly comparable.
389. Reisner et al. (2004). This study found positive and significant math gains for participants, but no effects for language arts. Because baseline equivalence of the treatment and control groups was not reported, WWC reviewers were unable to assign a rating to the study.
390. Ibid.

( 80 )

games, and word games were among the top 5 out of 12 activities measured.391

A qualitative study of the Philadelphia Beacon Centers combined surveys of students and staff with observations of program offerings to identify practices that engaged students and observers. The study found that the use of cooperative learning and choice was associated with student enjoyment and desire to participate in activities.392 A study of the Voyager TimeWarp model, implemented in 13 middle and elementary schools for low-achieving students, includes collaborative learning as a key component.393 In another small program, professional women with careers in the physical sciences mentored minority girls in classroom settings designed to give them hands-on experience with science and provide them with role models in the field.394 Related field trips showed the classroom lessons in action.

When implementation data suggested that the CORAL program (designed primarily for elementary students in low-performing schools in California) was poorly implemented on average and consisted mostly of homework help and enrichment activities that were disconnected from academic content, the program tried to improve program quality and student engagement.395 As the program implemented changes, specifically enhancing the academic component of the program, most CORAL students reported having adults they could talk to and who cared about them, and that they liked literacy activities at the centers. Teachers reported efforts to make literacy activities fun, including making connections between the stories they read and the students’ real-life experiences and engaging the students in conversations about the material. In some locations, enrichment staff rotated from site to site.396

Recommendation 5. Assess program performance and use the results to improve the quality of the program

Level of evidence: Low

The panel judged the level of evidence supporting this recommendation to be low. Although the panel believes that monitoring and improving performance are important to ensure that the program is carrying out its intended objectives and adapting to changing needs and feedback, no direct evidence suggests that monitoring leads to increased academic achievement in OST programs. More research would be needed to isolate whether it is the monitoring itself or other components that lead to positive academic effects in those programs that do some form of monitoring.

Summary of evidence

OST programs could consider using different kinds of assessments. Seven programs relevant to this recommendation had studies that met WWC standards with or without reservations (see Table D7).397 Of these seven, five included elements of

391. Ibid.
392. Grossman, Campbell, and Raley (2007). This study failed to pass WWC screens because it did not attempt to measure program impacts on academic achievement.
393. Roberts and Nowakowski (2004). This study failed to pass WWC screens because it did not have a comparison group.
394. Ferreira (2001). This study was not eligible for WWC review because it did not have a comparison group.
395. Arbreton et al. (2008). This study was not eligible for review because it did not include a comparison group.
396. Ibid.
397. Early Risers—August et al. (2001); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008); L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002).

( 81 )

Table D7. Studies of programs cited in recommendation 5 that met WWC standards with or without reservations

Brief Citation | Program Type | Grades Studied a | Strategies Used for Evaluating Program Performance

Positive Academic Effects

Early Risers, August et al. (2001) | Summer and school-year mentoring | K–2
Fidelity monitoring: Fidelity technicians observed instruction for content presentation, style of delivery, and staff comportment through unannounced visits. Checklists were used to assess the alignment of staff behavior along three dimensions: (a) elicits participation and shows enthusiasm, (b) uses skills training or reinforces skill use, and (c) uses behavior management techniques.
Tracking student participation: Staff recorded attendance and material covered in logbooks, and level of engagement and homework completion in rating forms.

CHP, Langberg et al. (2006) | After school | 6–7
Fidelity monitoring: Group leaders kept a folder for each child in which they recorded progress and behavior each time they met with a child. Staff supervisors reviewed the folders and identified problems during weekly supervision sessions.

Mixed Academic Effects

Teach Baltimore, Borman & Dowling (2006) | Summer | K–2
Fidelity monitoring: Instruction was observed to make sure it was being delivered as prescribed.
Tracking student participation: Student attendance also was monitored.

Chicago Summer Bridge, Jacob & Lefgren (2004) | Summer | 3, 6
Fidelity monitoring: Monitors visit the classroom regularly to observe instructors and make sure that they are teaching within the curriculum and pace guidelines.

Enhanced Academic Instruction, Black et al. (2008) | After school | 2–5
Fidelity monitoring: Local district coordinators were responsible for observing instruction. Representatives from the curriculum publishers also visited sites twice a year to observe instruction.

No Detectable Academic Effects

L.A.’s BEST, Goldschmidt, Huang, & Chinen (2007) | After school | K–6
Fidelity monitoring: The program’s office of evaluation has an internal evaluation team that regularly meets with field staff and administrators to share thoughts on program operation.
Ongoing external evaluations: External evaluations are performed to measure the effect of the program on short- and long-term outcomes. Site-specific results are shared with site coordinators and used to evaluate individual site performance.
Surveys of key stakeholders: Evaluations gather feedback from program staff, school-day teachers, parents, and students.

YS-CARE, Bissell et al. (2002) | After school | 1–5
Surveys of key stakeholders: Evaluation surveys were administered to students, parents, school-day teachers, and YS-CARE staff to gauge satisfaction with the program.

a. For summer programs, refers to the grade participating students exited prior to the commencement of the summer program.

( 82 )

fidelity monitoring in order to ensure that program implementation followed the program model,398 one used fidelity monitoring and ongoing external evaluations,399 and one program used surveys of key stakeholders to gauge satisfaction with the program.400 Of the six that used fidelity monitoring, all were considered to be doing so in a way that was largely consistent with the panel’s recommendation, and five had either positive401 or partially positive effects on academic achievement.402 The program that also used external evaluators and the program with stakeholder surveys were both inconclusive, with small, nonsignificant effects.403 Only the program that used just stakeholder surveys was considered to have a lower level of consistency with this recommendation because it was unclear how it used the surveys to improve the program.

Two studies that monitored program implementation and had positive academic effects

One randomized controlled trial of a summer tutoring and school-year mentoring program (Early Risers) for exiting kindergarten students with early onset aggressive behavior showed positive significant effects on academic competence.404 Program monitoring was structured to ensure continuing fidelity with the model and included instructor logs of student attendance, their progress toward instructional objectives, and any changes in the implementation of the program curriculum. Monitors conducted periodic unscheduled visits to observe staff compliance with program components, including checks on student engagement. Monitors used checklists to evaluate staff compliance with program standards. August et al. (2001) reported that the monitoring results indicated that the Early Risers program was implemented as intended.405

Langberg et al. (2006) also used a randomized controlled trial to study CHP, which provided academic remediation and study skills training in small groups after school for 6th and 7th graders with a combination of learning and behavior problems. The program showed no statistically significant effects on teacher ratings of academic progress because of the small sample size, but the effect size was 0.45, large enough to be considered positive evidence by WWC guidelines.406 To monitor fidelity to the CHP model, group leaders were required to keep a folder on each CHP participant. The group leaders recorded the percentages of binder, book bag, and locker organization in the folder each time they met with a student. These folders were reviewed by the staff supervisors, and problems were addressed during weekly supervision. Also, staff supervisors were onsite every program day and observed counselors leading groups. Supervisors then addressed any program-adherence issues with counselors during the weekly group supervision.407

398. Early Risers—August et al. (2001); CHP—Langberg et al. (2006); Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008).
399. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007).
400. YS-CARE—Bissell et al. (2002).
401. Early Risers—August et al. (2001); CHP—Langberg et al. (2006).
402. Teach Baltimore—Borman and Dowling (2006); Chicago Summer Bridge—Jacob and Lefgren (2004); Enhanced Academic Instruction—Black et al. (2008).
403. L.A.’s BEST—Goldschmidt, Huang, and Chinen (2007); YS-CARE—Bissell et al. (2002).
404. August et al. (2001).
405. Ibid.
406. Langberg et al. (2006).
407. Ibid.

( 83 )
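The 0.25 threshold and the CHP effect size of 0.45 discussed above can be put on a more intuitive scale with the WWC’s improvement index, which converts an effect size (Hedges’s g) into the expected percentile-rank gain of an average control-group student. A minimal sketch of that conversion (the formula follows the WWC convention; only the 0.45 input comes from this page):

```python
from statistics import NormalDist

def improvement_index(g):
    """WWC improvement index: expected percentile-rank gain for an
    average control-group student, given effect size g (Hedges's g)."""
    # Percentile of the average treated student in the control
    # distribution, minus the 50th percentile baseline.
    return (NormalDist().cdf(g) - 0.5) * 100

# CHP's reported effect size of 0.45 translates to roughly a
# 17-percentile-point expected gain.
print(round(improvement_index(0.45)))
```

An index computed this way is descriptive only; it does not change the study’s statistical significance, which for CHP was limited by the small sample.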

Mixed evidence for monitoring program implementation

Three programs had mixed academic effects but contained elements of fidelity monitoring. Whether they found positive academic effects depended on the analysis methods used and subgroups studied (Teach Baltimore),408 the student grade level (Chicago Summer Bridge),409 or the subject content area (Enhanced Academic Instruction in After-School Programs).410 In the Teach Baltimore program, monitoring of instruction and student attendance was designed to ensure fidelity to the program model.411 In Chicago Summer Bridge, monitors provided frequent oversight to ensure that instructors were implementing the desired curriculum and pacing, as well as to provide technical support.412 In the Enhanced Academic Instruction in After-School Math and Reading programs, biannual visits were conducted by curriculum publishers and independent evaluators to monitor instruction and pacing, as well as program implementation, and to provide feedback and follow-up training. Weekly phone calls with district coordinators were used to problem-solve challenges associated with implementation of the model and to pass on implementation concerns that had been raised in the weekly meetings with the OST instructors.413

Inconclusive evidence for fidelity or other types of monitoring

Two programs, L.A.'s BEST and YS-CARE, were evaluated by quasi-experimental studies that found positive but small and nonsignificant effects on academic achievement.414 L.A.'s BEST internal evaluators regularly monitored staff to gather and share information on best practices. The program also commissioned external evaluators to gather feedback from program stakeholders and measure program effects. The data from these evaluations were shared with site coordinators to inform program improvements and monitor site performance.415 The YS-CARE program administered surveys to program stakeholders (students, parents, in-school teachers, and program staff) to gauge their impressions of the program.416

Supplemental evidence from other sources

For this recommendation, the panel supplemented the studies that met WWC standards with sources that discussed formative and summative evaluations of OST programs and program quality assessment. A meta-analysis of summer programs provides some evidence that programs that conduct implementation monitoring may have larger effects compared to programs that do not.417 Although this finding is not fully generalizable, it provides some support for this recommendation, which the panel believes should be an important component of a successful OST program.

A qualitative study of the CORAL program provided a few examples of how to carry out implementation monitoring.418 The program created tools for all observers of program instruction to use in rating the quality of teaching. The tools were used not only as a basis for providing targeted feedback for instructors, but also as a mechanism for measuring improvement over time in instruction and other aspects of the program.

Other sources provided guidance for how and when to conduct more formal performance evaluations. Fullan (2001) suggests that two to three years are necessary for programs to begin to see representative positive effects, based on his research of school reform interventions.419 Ross, Potter, and Harmon (2006) provide guidance on the components of an effective evaluation and monitoring system for SES programs; the panel believes that this guidance is applicable to other academically focused OST programs.420 Many publicly available resources also offer tools for assessing and monitoring program quality, either through observations or surveys.421

408. Borman and Dowling (2006) found no effects on reading in an intent-to-treat randomized controlled trial design that included all students but did find significant positive effects on reading using a quasi-experimental design that looked at the effects of the program only on the high attendees. See this appendix for recommendation 1 for more detail.
409. Jacob and Lefgren (2004) found significant positive effects in both math and reading for 3rd graders but not for 6th graders.
410. Black et al. (2008) found that their math curriculum had small but significant positive effects on student achievement for 2nd through 5th graders, but their reading curriculum did not.
411. Borman and Dowling (2006).
412. Stone et al. (2005). Summer Bridge showed positive achievement effects for 3rd graders but not for 6th graders in Jacob and Lefgren (2004).
413. Black et al. (2008).
414. Goldschmidt, Huang, and Chinen (2007); Bissell et al. (2002). Neither study reported sufficient detail for effect sizes to be computed.
415. Goldschmidt, Huang, and Chinen (2007).
416. Bissell et al. (2002).
417. Cooper et al. (2000). This meta-analysis showed positive effects for fidelity monitoring using fixed effects analysis, but not random effects. Therefore, the authors caution that their findings should not be generalized beyond programs similar to those included in the meta-analysis. Furthermore, they point out, "These associations may be due to the impact that surveillance can have on the rigor with which programs are delivered. Or, they may simply indicate that the extra care that monitoring of programs implies is associated with other evaluation features, for example, care in data collection, related to the evaluation's likelihood of uncovering an effect" (p. 97).
418. Sheldon and Hopkins (2008).
419. Fullan (2001) is a research synthesis to which WWC standards are not applicable.
420. Ross, Potter, and Harmon (2006) does not look at the effectiveness of any programs.
421. For just two examples of what is available, see Yohalem et al. (2009) or the Bureau of Public School Options, Florida Department of Education (2007, n.d.).
References

Alexander, K. L., & Entwisle, D. R. (1996). Schools and children at risk. In A. Booth & J. F. Dunn (Eds.), Family-school links: How do they affect educational outcomes? (pp. 67–114). Mahwah, NJ: Erlbaum.
Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007a). Lasting consequences of the summer learning gap. American Sociological Review, 72(2), 167–180.
Alexander, K. L., Entwisle, D. R., & Olson, L. S. (2007b). Summer learning and its implications: Insights from the beginning school study. New Directions for Youth Development, 114, 11–32.
American Educational Research Association, American Psychological Association, and National Council on Measurement in Education. (1999). Standards for educational and psychological testing. Washington, DC: AERA Publications.
American Psychological Association. (2002). Criteria for practice guideline development and evaluation. American Psychologist, 57(12), 1048–1051.
Ames, C. (1992). Classrooms: Goals, structures, and student motivation. Journal of Educational Psychology, 84(3), 261–271.
Anderson, E. (1998). Motivational and cognitive influences on conceptual knowledge acquisition: The combination of science observation and interesting texts. Unpublished doctoral dissertation, University of Maryland, College Park.
Arbreton, A., Sheldon, J., Bradshaw, M., Goldsmith, J., Jucovy, L., & Pepper, S. (2008). Advancing achievement: Findings from an independent evaluation of a major after-school initiative. Philadelphia, PA: Public/Private Ventures.
August, G. J., Realmuto, G. M., Hektner, J. M., & Bloomquist, M. L. (2001). An integrated components preventive intervention for aggressive elementary school children: The Early Risers program. Journal of Consulting and Clinical Psychology, 69(4), 61.
Baker, S., Gersten, R., & Keating, T. (2000). When less may be more: A 2-year longitudinal evaluation of a volunteer tutoring program requiring minimal training. Reading Research Quarterly, 35(4), 494–519.
Battistich, V., Solomon, D., Watson, M., & Schaps, E. (1997). Caring school communities. Educational Psychologist, 32, 137–151.
Beckett, M. (2008). Current generation youth programs: What works, what doesn't, and at what cost? Santa Monica, CA: RAND Education.
Bissell, J. S., Dugan, C., Ford-Johnson, A., & Jones, P. (2002). Evaluation of the YS-CARE after school program for California Work Opportunity and Responsibility to Kids (CalWORKS). Irvine, CA: University of California, Irvine, Department of Education.
Black, A. R., Doolittle, F., Zhu, P., Unterman, R., & Grossman, J. B. (2008). The evaluation of enhanced academic instruction in after-school programs: Findings after the first year of implementation (NCEE 2008-4021). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance.
Bloom, B. S. (1984). The 2 sigma problem: The search for methods of group instruction as effective as one-to-one tutoring. Educational Researcher, 13(6), 4–16.
Blumenfeld, P. C., & Meece, J. L. (1998). Task factors, teacher behavior, and students' involvement and use of learning strategies in science. Elementary School Journal, 88, 235–250.
Borman, G. D. (1997). A holistic model of the organization of categorical program students' total educational opportunities. Unpublished doctoral dissertation, University of Chicago, Chicago.
Borman, G. D., & Dowling, N. M. (2006). Longitudinal achievement effects of multiyear summer school: Evidence from the Teach Baltimore randomized field trial. Educational Evaluation and Policy Analysis, 28(1), 25–48.
Borman, G., Goetz, W., & Dowling, M. (2008). Halting the summer achievement slide: A randomized field trial of the KindergARTen summer camp. Journal of Education for Students Placed At Risk, 13(4), 1–20.
Bott, J. (2006). Linking school and after school: Strategies for success. The Evaluation Exchange, 12(1 & 2), 2–6.
Bouffard, S., Little, P., & Weiss, H. (2006). Building and evaluating out-of-school time connections. The Evaluation Exchange, 12(1 & 2), 2–6.
Brown, A. (2002). The Boys and Girls Club of Broward County after-school program evaluation report. Broward County, FL: The School Board of Broward County's Office of Research and Evaluation.
Bureau of Public School Options. (2007). Self-assessment checklist for supplemental educational services. Tallahassee, FL: Florida Department of Education.
Bureau of Public School Options. (n.d.). Parent questionnaire: Supplemental education services. Tallahassee, FL: Florida Department of Education. Retrieved February 4, 2009, from http://www.fldoe.org/flbpso/nclbchoice/ses/ses_districtinfo.asp.
Capizzano, J., Bischoff, K., Woodroffe, N., & Chaplin, D. (2007). Ingredients of a successful summer learning program: A case study of the Building Educated Leaders for Life (BELL) accelerated learning summer program. Washington, DC: Mathematica Policy Research, Inc.
Capizzano, J., Tout, K., & Adams, G. (2000). Child care patterns of school-age children with employed mothers. Occasional Paper Number 41. Washington, DC: Urban Institute.
Carter, S., Straits, K. J. E., & Hall, M. (2007). Project venture: Evaluation of an experiential culturally based approach to substance abuse prevention with American Indian youth. Gallup, NM: National Indian Youth Leadership Project.
Carver, P. R., & Iruka, I. U. (2006). After-school programs and activities: 2005 (NCES 2006-076). Washington, DC: U.S. Department of Education, National Center for Education Statistics.
Center for Applied Linguistics. (1994). Project Adelante: Moving onward to a better education. Washington, DC: Author.
Chaplin, D., & Capizzano, J. (2006). Impacts of a summer learning program: A random assignment study of Building Educated Leaders for Life (BELL). Washington, DC: Urban Institute.
Connell, J. P., Spencer, M. B., & Aber, J. L. (1994). Educational risk and resilience in African-American youth: Context, self, action, and outcomes in school. Child Development, 65, 493–506.
Connell, J. P., & Wellborn, J. G. (1991). Competence, autonomy, and relatedness: A motivational analysis of self-system processes. Minnesota Symposium on Child Psychology, 23, 43–77.
Cooper, H., Charlton, K., Valentine, J. C., & Muhlenbruck, L. (2000). Making the most of summer school: A meta-analytic and narrative review. Monographs of the Society for Research in Child Development, 1, 1–118.
Cooper, H., Nye, B., Charlton, K., Lindsay, J., & Greathouse, S. (1996). The effects of summer vacation on achievement test scores: A narrative and meta-analytic review. Review of Educational Research, 66(3), 227–268.
Cordova, D. I., & Lepper, M. R. (1996). Intrinsic motivation and the process of learning: Beneficial effects of contextualization, personalization, and choice. Journal of Educational Psychology, 88(4), 715.
Courtney, M., Zinn, A., Zielewski, E. H., Bess, R., Malm, K., Stagner, M., & Pergamit, M. (2008). Evaluation of the early start to emancipation preparation—tutoring program, Los Angeles County. Washington, DC: Urban Institute.
Downey, D. B., von Hippel, P. T., & Broh, B. A. (2004). Are schools the great equalizer? Cognitive inequality during the summer months and the school year. American Sociological Review, 69(5), 613–635.
Duffett, A., Johnson, J., Farkas, S., Kung, S., & Ott, A. (2004). All work and no play? Listening to what kids and parents really want from out-of-school time. New York: Public Agenda.
Durlak, J. A., & Weissberg, R. P. (2007). The impact of after-school programs that promote personal and social skills (No. 20). Chicago, IL: Collaborative for Academic, Social, and Emotional Learning.
Dynarski, M., Moore, M., James-Burdumy, S., Rosenberg, L., Deke, J., & Mansfield, W. (2004). When schools stay open late: The national evaluation of the 21st Century Community Learning Centers program. New findings. Princeton, NJ: Mathematica Policy Research, Inc.
Epstein, M., Atkins, M., Cullinan, D., Kutash, K., & Weaver, R. (2008). Reducing behavior problems in the elementary school classroom: A practice guide (NCEE 2008-012). Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Retrieved April 7, 2009, from http://ies.ed.gov/ncee/wwc.
Fashola, O. S. (1998). Review of extended-day and after-school programs and their effectiveness (Report no. 24). Research Replication.
Ferreira, M. (2001). Building communities through role models, mentors, and hands-on-science. The School Community Journal, 11(2), 27–37.
Field, M. J., & Lohr, K. N. (Eds.). (1990). Clinical practice guidelines: Directions for a new program. Washington, DC: National Academy Press.
Finn, J. D. (1989). Withdrawing from school. Review of Educational Research, 59, 117–142.
Finn, J. D. (1993). School engagement & students at risk. Washington, DC: National Center for Education Statistics.
Finn, J., Pannozzo, G., & Voelkl, K. (1995). Disruptive and inattentive-withdrawn behavior and achievement among fourth graders. Elementary School Journal, 95, 421–454.
Finn, J. D., & Rock, D. A. (1997). Academic success among students at risk for school failure. Journal of Applied Psychology, 82, 221–234.
Fredricks, J. A., Blumenfeld, P. C., & Paris, A. H. (2004). School engagement: Potential of the concept, state of the evidence. Review of Educational Research, 74(1), 59–109.
Fullan, M. (2001). The new meaning of educational change (3rd ed.). New York: Teachers College Press.
Goldschmidt, P., Huang, D., & Chinen, M. (2007). The long-term effects of after-school programming on educational adjustment and juvenile crime: A study of the L.A.'s BEST after-school program. Los Angeles, CA: National Center for Research on Evaluation.
Granger, R. C., & Kane, T. J. (2004). Improving the quality of after-school programs. Education Week, 23(23).
Grossman, J., Campbell, M., & Raley, B. (2007). Quality time after school: What instructors can do to enhance learning. Philadelphia, PA: Public/Private Ventures.
Grossman, J., Lind, C., Hayes, C., McMaken, J., & Gersick, A. (2009). The cost of quality out-of-school-time programs. Philadelphia, PA: Public/Private Ventures.
Grossman, J., Price, M., Fellerath, V., Jucovy, L., Kotloff, L., Raley, R., & Walker, K. (2002). Multiple choices after school: Findings from the Extended-Service Schools Initiative. Philadelphia, PA: Public/Private Ventures.
Guthrie, J. T., Anderson, E., Alao, S., & Rinehart, J. (1999). Influences of concept-oriented reading instruction on strategy use and conceptual learning from text. Elementary School Journal, 99(4), 343–366.
Guthrie, J., Wigfield, A., & VonSecker, C. (2000). Effects of integrated instruction on motivation and strategy use in reading. Journal of Educational Psychology, 92(2), 331–341.
Halpern, R. (1999). After-school programs for low-income children: Promise and challenges. The Future of Children, 9(2), 81–95.
Hanushek, E. (1992). The trade-off between child quantity and quality. Journal of Political Economy, 100(1), 84–117.
Helme, S., & Clark, D. (2001). Identifying cognitive engagement in the mathematics classroom. Mathematics Education Research Journal, 13, 133–153.
Heyns, B. (1978). Summer learning and the effects of schooling. New York: Academic Press.
Hofferth, S. L., & Sandberg, J. F. (2001). How American children spend their time. Journal of Marriage and Family, 63(2), 295–308.
Huang, D., Miyoshi, J., La Torre, D., Marshall, A., Perez, P., & Peterson, C. (2007). Exploring the intellectual, social and organizational capitals at L.A.'s BEST (CSE Technical Report 714). Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing (CRESST), Center for the Study of Evaluation, University of California, Los Angeles. Available at http://www.cse.ucla.edu/products/reports/R714.pdf.
Iverson, G., & Tunmer, W. (1993). Phonological processing skills and the reading recovery program. Journal of Educational Psychology, 85(1), 112–126.
Jacob, B. A., & Lefgren, L. (2004). Remedial education and student achievement: A regression-discontinuity approach. The Review of Economics and Statistics, 86(1), 226–244.
Jacob, B. A., & Lefgren, L. (2008). Can principals identify effective teachers? Evidence on subjective performance evaluation in education. Journal of Labor Economics, 26(1), 101–136.
James-Burdumy, S., Dynarski, M., & Deke, J. (2007). When elementary schools stay open late: Results from the national evaluation of the 21st Century Community Learning Centers program. Educational Evaluation and Policy Analysis, 4, 296–318.
James-Burdumy, S., Dynarski, M., & Deke, J. (2008). After-school program effects on behavior: Results from the 21st Century Community Learning Centers program national evaluation. Economic Inquiry, 46(1), 13–18.
James-Burdumy, S., Dynarski, M., Moore, M., Deke, J., Mansfield, W., & Pistorino, C. (2005). When schools stay open late: The national evaluation of the 21st Century Community Learning Centers program: Final report. Washington, DC: U.S. Department of Education, Institute of Education Sciences, National Center for Education Evaluation and Regional Assistance. Available at http://www.ies.ed.gov/ncee.
Johnson, D. W., & Johnson, R. T. (1999). Learning together and alone (5th ed.). Needham Heights, MA: Allyn and Bacon.
Kane, T. (2004). The impact of after-school programs: Interpreting the results of four recent evaluations. New York: William T. Grant Foundation.
Karcher, M. J. (2005). The effects of developmental mentoring and high school mentors' attendance on their younger mentees' self-esteem, social skills, and connectedness. Psychology in the Schools, 42(1), 65–77.
Karcher, M. J., Davis, C., & Powell, B. (2002). The effects of developmental mentoring on connectedness and academic achievement. The School Community Journal, 12(2), 35–50.
Klem, A. M., & Connell, J. P. (2004). Relationships matter: Linking teacher support to student engagement and achievement. Journal of School Health, 74, 262–273.
Langberg, J. M., Smith, B. H., Bogle, K. E., Schmidt, J. D., Cole, W. R., & Pender, C. A. S. (2006). A pilot evaluation of small group Challenging Horizons Program (CHP): A randomized trial. Journal of Applied School Psychology, 23(1), 31–58.
Lauer, P. A., Akiba, M., Wilkerson, S. B., Apthorp, H. S., Snow, D., Martin-Glenn, M., et al. (2004). The effectiveness of out-of-school-time strategies in assisting low-achieving students in reading and mathematics: A research synthesis, updated. Aurora, CO: McREL.
Lauver, S., Little, P. D. M., & Weiss, H. (2004). Moving beyond the barriers: Attracting and sustaining youth participation in out-of-school time programs. Cambridge, MA: Harvard Family Research Project.
Leslie, A. V. L. (1998). The effects of an after-school tutorial program on the reading and mathematics achievement, failure rate, and discipline referral rate of students in a rural middle school. Unpublished doctoral dissertation, University of Georgia.
Manzo, K. K. (2009). Virtual field trips open doors for multimedia lessons. Education Week, 28(21), 9.
Marks, H. M. (2000). Student engagement in instructional activity: Patterns in the elementary, middle, and high school years. American Educational Research Journal, 37, 153–184.
McKay, D., Paek, J., Harrison, L., Qian, H., Zoblotsky, T., Ross, S. M., Fedde, F., & Ford, J. (2008). Supplemental educational services in the state of Virginia: 2006–2007. Memphis, TN: Center for Research in Educational Policy.
McKinney, A. D. (1995). The effects of an after-school tutorial and enrichment program on the academic achievement and self-concept of below grade level first and second grade students. Unpublished doctoral dissertation, University of Mississippi.
Meier, J., & Invernizzi, M. (2001). Book Buddies in the Bronx: Testing a model for America Reads and National Service. Journal of Education for Students Placed At Risk, 6(4), 319–333.
Morris, D., Shaw, B., & Perney, J. (1990). Helping low readers in grades 2 and 3: An after-school volunteer tutoring program. Elementary School Journal, 91(2), 133–150.
Muñoz, M. A., Potter, A. P., & Ross, S. M. (2008). Supplemental educational services as a consequence of the NCLB legislation: Evaluating its impact on student achievement in a large urban district. Journal of Education for Students Placed At Risk, 13(1), 1–25.
Nears, K. (2008). The achievement gap: Effects of a resilience-based after school program on indicators of academic achievement. Dissertation Abstracts International Section A: Humanities and Social Sciences, 68(8-A), 3265.
New York City Board of Education. (1991). Career awareness resources for exceptional students (project CARES) 1990–91. Final evaluation profile. Brooklyn, NY: Office of Research, Evaluation, and Assessment.
Newmann, F. (1991). Student engagement in academic work: Expanding the perspective on secondary school effectiveness. In J. R. Bliss & W. A. Firestone (Eds.), Rethinking effective schools: Research and practice (pp. 58–76). Englewood Cliffs, NJ: Prentice Hall.
Newmann, F. M., Wehlage, G. G., & Lamborn, S. D. (1992). The significance and sources of student engagement. In F. M. Newmann (Ed.), Student engagement and achievement in American secondary schools (pp. 11–39). New York: Teachers College Press.
Perry, N. (1998). Young children's self-regulated learning and contexts that support it. Journal of Educational Psychology, 90(4), 715–729.
Region VII After School Programs. (n.d.). Initial classroom connection form. Retrieved March 30, 2009, from http://www.tcoe.org/AfterSchool/Resources/ProgramOperation/ClassroomConnectionForm.pdf.
Reisner, E. R., Russell, C. A., Welsh, M. E., Birmingham, J., & White, R. N. (2002). Supporting quality and scale in after-school services to urban youth: Evaluation of program implementation and student engagement in TASC After-School Program's third year. Washington, DC: Policy Studies Associates.
Reisner, E. R., White, R. N., Russell, C. A., & Birmingham, J. (2004). Building quality, scale, and effectiveness in after-school programs: Summary report of the TASC evaluation. Washington, DC: Policy Studies Associates.
Roberts, G., & Nowakowski, J. (2004). Addressing the summer learning loss: An evaluation of the Voyager summer reading intervention program. In G. D. Borman & M. Boulay (Eds.), Summer learning: Research, policies, and programs (pp. 165–181). Mahwah, NJ: Erlbaum.
Roderick, M., Jacob, B. A., & Bryk, A. S. (2004). Summer in the city: Achievement gains in Chicago's Summer Bridge program. In G. D. Borman & M. Boulay (Eds.), Summer learning: Research, policies, and programs (pp. 73–102). Mahwah, NJ: Erlbaum.
Ross, S. M., Potter, A., & Harmon, J. (2006). Evaluating supplemental educational service providers: Suggested strategies for states (2nd ed.). Lincoln, IL: Center on Innovation and Improvement, Center for Research in Educational Policy, American Institutes for Research.
Ross, S. M., Potter, A., Paek, J., McKay, D., Sanders, W., & Ashton, J. (2008). Implementation and outcomes of supplemental educational services: The Tennessee statewide evaluation study. Journal of Education for Students Placed At Risk, 13(1), 26–58.
Schacter, J., & Jo, B. (2005). Learning when school is not in session: A reading summer day-camp intervention to improve the achievement of exiting first-grade students who are economically disadvantaged. Journal of Research in Reading, 28(2), 158–169.
SEDL National Center for Quality Afterschool. (n.d.). Afterschool training toolkit. Retrieved March 30, 2009, from http://www.sedl.org/afterschool/toolkits.
Sheldon, J., & Hopkins, L. (2008). Supporting success: Why and how to improve quality in after-school programs. Philadelphia, PA: Public/Private Ventures.
Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology, 85(4), 571–581.
Slattery, C. A. (2003). The impact of a computer-based training system on strengthening phonemic awareness and increasing reading ability level. Dissertation Abstracts International, 64(09A).
Slavin, R. E. (2006). Educational psychology: Theory and practice (8th ed.). Boston: Pearson Education.
Smith, K. (2000). Who's minding the kids? Child care arrangements: Fall 1998. Current Population Reports, 70.
Stone, S. I., Engel, M., Nagaoka, J., & Roderick, M. (2005). Getting it the second time around: Student classroom experience in Chicago's Summer Bridge program. Teachers College Record, 107(5), 935–957.
Tucker, C. M. (1995). A parent, community, public schools, and university involved partnership education program to examine and boost academic achievement and adaptive functioning skills of African-American students. Journal of Research and Development in Education, 28(3), 174–259.
Turner, J. C. (1995). The influence of classroom contexts on young children's motivation for literacy. Reading Research Quarterly, 30, 410–441.
Udell, C. L. (2003). Early literacy tutoring after school: An exploration of the impact of a cross-age tutoring program (Nurturing Development Partnerships) on elementary school students' reading development. Unpublished doctoral dissertation, University of Maryland, Baltimore County.
U.S. Department of Education. (2003). When schools stay open late: The national evaluation of the 21st Century Learning Centers program, first year findings. Washington, DC: Author.
U.S. Department of Education. (2005). Case studies of supplemental services under the No Child Left Behind Act: Findings from 2003–04. Washington, DC: Author.
U.S. Department of Education. (2009). SES non-regulatory guidance. Washington, DC: Author.
Wasik, B. A., & Slavin, R. E. (1993). Preventing early reading failure with one-to-one tutoring: A review of five programs. Reading Research Quarterly, 28(2), 178–200.
Wellborn, J. G., & Connell, J. P. (1987). Manual for the Rochester assessment package for schools. Rochester, NY: University of Rochester.
Yohalem, N., & Wilson-Ahlstrom, A., with Fischer, S., & Shinn, M. (2009). Measuring youth program quality: A guide to assessment tools (2nd ed.). Washington, DC: The Forum for Youth Investment.
Zimmer, R., Gill, B., Razquin, P., Booker, K., Lockwood, J. R., III, & U.S. Department of Education. (2007). State and local implementation of the "No Child Left Behind Act." Volume I—Title I school choice, supplemental educational services. Washington, DC: U.S. Department of Education.