College and Career Readiness Assessment: Validation of the Key Cognitive Strategies Framework
Allison R. Lombardi, David T. Conley, Mary A. Seburn and Andrew M. Downs
Assessment for Effective Intervention 2013 38: 163 originally published online 28 June 2012
DOI: 10.1177/1534508412448668
Published by: Hammill Institute on Disabilities and SAGE Publications (http://www.sagepublications.com)
© Hammill Institute on Disabilities 2012
General Article
Abstract
In this study, the authors examined the psychometric properties of the key cognitive strategies (KCS) within the
CollegeCareerReady™ School Diagnostic, a self-report measure of critical thinking skills intended for high school students.
Using a cross-validation approach, an exploratory factor analysis was conducted with a randomly selected portion of
the sample (n = 516) and resulted in five reliable factors: (a) problem formulation, (b) research, (c) interpretation,
(d) communication, and (e) precision/accuracy. A confirmatory factor analysis was conducted with the remaining sample
(n = 808). Goodness-of-fit indices indicated acceptable model fit. The five-factor solution is consistent with earlier validity
studies of the KCS framework. Implications for use by high school personnel in evaluation of instructional programs and
as a value-added assessment are discussed.
Keywords
cognition, critical thinking, college readiness, factor analysis, validity
College and career readiness has emerged as a major focal point in educational accountability systems. Most recently, knowledge and skills associated with college and career readiness have become the underlying goal of the Common Core State Standards (CCSS; National Governor's Association [NGA] & Council of Chief State School Officers [CCSSO], 2010) and a subsequent initiative led by the Race to the Top Assessment Program (U.S. Department of Education, 2010). Not only were these policy initiatives designed to address the knowledge and skills students need to be successful in college and careers (NGA & CCSSO, 2010), but they also seek to reduce the 30% to 60% of underprepared high school graduates in need of remedial higher education (National Center for Education Statistics, 2004). Remediation needs are significantly higher among aspiring first-generation college students, suggesting that assessing college and career readiness in such students is particularly important (Chen, 2005; Choy, 2001; Venezia, Kirst, & Antonia, 2003).

The current and well-accepted indicators of college and career readiness (e.g., grade point average, college admission exam scores) show some evidence of predicting college student grade point average (Camara & Echternacht, 2000; Cimetta, D'Agostino, & Levin, 2010; Coelen & Berger, 2006; McGee, 2003; Noble & Camara, 2003); however, other evidence suggests that these measures are misaligned with the knowledge and skills pertinent for success in college environments (Achieve, Inc., 2007; Brown & Conley, 2007; Brown & Niemi, 2007; Conley, 2003). Given the recent focus on college and career readiness as highlighted by the CCSS and the continued demand for remedial higher education courses, it is especially crucial for high school personnel to assess their students on the knowledge and skills that are not measured by grade point average or college admission exams. Adequate assessment of such skills may help educators improve instructional programming so that college and career readiness is emphasized and, in turn, remedial higher education needs are reduced.

College and Career Readiness Definitions

College readiness differs from college eligibility; in addition to satisfying high school graduation requirements, college-ready students are able to succeed in a credit-bearing

1 University of Connecticut, Storrs, CT, USA
2 University of Oregon, Eugene, OR, USA
3 Educational Policy Improvement Center, Eugene, OR, USA
4 University of Portland, Portland, OR, USA

Corresponding Author:
Allison R. Lombardi, Neag School of Education, University of Connecticut, 249 Glenbrook Road, Unit 2064, Storrs, CT 06269, USA.
Email: allison.lombardi@uconn.edu
problems before beginning an assignment, search and organize information to make a case for a solution, consider varying opinions on the topic, compile and communicate their solution, and review their own work for precision and accuracy (Conley, 2011a; NGA & CCSSO, 2010).

Based on these identified behaviors and skills, the KCS were defined as five sequential constructs: (a) problem formulation, (b) research, (c) interpretation, (d) communication, and (e) precision/accuracy. Together, they represent the thinking skills or habits of mind of successful college students (Conley, 2007; Costa & Kallick, 2000; Ritchhart, 2002), as well as the skills that college instructors expect students to have mastered on entrance to college across academic disciplines (Conley, 2003). Table 1 shows detailed definitions of the five KCS.

The KCS were developed from three theoretical frames: (a) dispositional-based theory of intelligence, (b) cognitive learning theory, and (c) competency theory. A dispositional view is rooted in the belief that intelligence is malleable and, through increasing efforts, can grow incrementally (Bransford et al., 2000; Costa & Kallick, 2000). The second conceptual frame derives from cognitive learning theory, in which people construct new knowledge based on what they already know and believe, and retention is heightened by meaningful learning experiences (Perkins, 1992). Competency theory provides the final element of the conceptual frame. Guided by the expert–novice literature (Baxter & Glaser, 1997), this theory suggests that novices (students) benefit from models of how experts approach problem solving, especially if they receive coaching in using similar models (Bransford et al., 2000). Within competency theory research, developmental models of learning note the typical progression as a learner advances from novice to competent to expert, and describe the types of experiences that lead to change (Boston, 2003). These three theoretical frames underpin the five-part KCS model.

Table 1. The Five Key Cognitive Strategies and Operational Definitions.

Problem formulation: The student demonstrates clarity about the nature of the problem and identifies potential outcomes. The student develops strategies for exploring all components of the problem. The student may revisit and revise the problem statement as a result of thinking about potential methods to solve the problem.

Research: The student explores a full range of available resources and collection techniques, or generates original data. The student makes judgments about the sources of information or quality of the data, and determines the usefulness of the information or data collected. The student may revisit and revise information collection methods as greater understanding of the problem is achieved throughout this process.

Interpretation: The student identifies and considers the most relevant information or findings, and develops insights. To make connections and draw conclusions, the student uses structures and strategies, which contribute to the framework for communicating a solution. The student reflects on the quality of the conclusions drawn and may revisit and revise previous steps in the process.

Communication: The student organizes information and insights into a structured line of reasoning and constructs a coherent and complete final version through a process that includes drafting, incorporating feedback, reflecting, and revising.

Precision/accuracy: The student is appropriately precise and accurate at all stages of the process by determining and using language, terms, expressions, rules, terminology, and conventions appropriate to the subject area and problem.

Source: Adapted from Conley (2007a). Copyright 2007 by the Educational Policy Improvement Center.

Measuring the KCS

Conley (2003) found that a nationwide sample of college faculty, regardless of selectivity of institution and across multiple disciplines, reached near universal agreement that most students arrive unprepared for the intellectual demands and expectations of postsecondary environments. Other researchers have analyzed high school transcripts and found that rigorous academic preparation, as represented by the titles of high school courses taken, is the most significant predictor of persistence to college graduation (Adelman, 1999; Bedsworth, Colby, & Doctor, 2006). A different approach is to analyze the content of college courses and then determine what should be occurring in high school courses to align with what will be encountered in college courses. This backward mapping strategy informed the initial iteration of the KCS framework (Conley, 2003).

Purpose of the Present Study

The purpose of this study was to examine the reliability and internal validity of the KCS within the CCRSD, a self-report instrument intended to measure the degree to which schools provide college and career readiness opportunities for their students. To do this, we examined (a) the internal consistency of the measure and (b) the extent to which data
from the measure fit the proposed KCS model. These study objectives informed conclusions on whether the KCS could be validated as a self-report measure within a larger measure of college and career readiness. The larger measure, the CCRSD, is a tool intended for school personnel in evaluating their instructional programs to ensure consistency with the criteria of the CCSS and ultimately provide more postgraduation opportunities to the youth they serve. Because cognitive thinking skills are not readily observable in students, a self-report instrument may be a useful tool for educators in determining instructional supports and refinements that emphasize these skills. Particularly, we were interested in validating the instrument for aspiring first-generation college students, a population that has shown a disproportionate need for remedial higher education (Chen, 2005; Venezia et al., 2003).

Method

Sample

Participants were students (N = 1,324) across 10 high schools in Illinois, Indiana, Michigan, and Wyoming that had agreed to pilot test the CCRSD in fall 2010. A purposive sample of high schools was selected because they had high enrollment rates of aspiring first-generation college students and the schools reported they were implementing college and career readiness programs. For the most part, the students were evenly distributed across grades: 27% were in 9th grade, 24% were in 10th grade, 26% were in 11th grade, and 23% were in 12th grade. Of the students, 53% reported neither parent had a college degree, 27% reported one parent had a college degree, and 20% reported both parents had college degrees. Race/ethnicity of the students was as follows: African American (48%), White (22%), Hispanic/Latino (20%), mixed race (6%), Asian American (<1%), American Indian/Alaskan Native (<1%), and unknown (2%). There were slightly more female (54%) than male (46%) students. A majority of the students qualified for free/reduced meal service (66%). Approximately 15% were students with disabilities with an individualized education program, and 10% were classified as English learners.

Measure

The KCS dimension of the CCRSD contains 64 items with response options ranging from 1 = not at all like me to 5 = very much like me, with a "don't know" (DK) option. Students are asked, "Please indicate how much each statement describes you" and rate the items accordingly. Because the items are based on exemplary practices (see Conley et al., 2010), the intent is for students to self-rate their own behaviors using exemplary behaviors as a reference point. If students do not believe certain items describe their behaviors or indicate they do not know, they are less aware of successful college and career readiness practices and behaviors.

Items were written for five subscales that represent the constructs: (a) Problem Formulation, (b) Research, (c) Interpretation, (d) Communication, and (e) Precision/Accuracy. We hypothesized that cognitive thinking skills associated with college and career readiness comprised these five subscales. Before administration, the items were pilot tested for readability on two samples during 2009 and 2010. At that time, participants were solicited for qualitative feedback on items as they responded to the survey. These data were analyzed and used to inform item revision.

Procedures

School personnel selected student participants so that there were approximately 100 students per grade. Selected students were offered the opportunity to take the CCRSD during a designated 50-min class period. School personnel were advised to select student participants from core academic courses (e.g., English/language arts, mathematics, science, social studies). Of the participating classes, 72% were core academic courses: English/language arts (25%), mathematics (18%), natural sciences (16%), and social sciences (13%). The remaining 28% of courses were "other," which included career/technical, arts, foreign languages, physical education, and health. In all schools, the resulting participant sample was compared with the overall school population, and no significant demographic-based differences were found.

The CCRSD was administered online. Participants completed an online consent form prior to the start of the survey. If participants responded "no" to the consent form, they were unable to proceed with the survey and were redirected to the end page. Student participation was voluntary and students received no compensation. School personnel received no compensation and were provided with aggregated data reports, which they could access and interact with online.

Analytic Approach

To meet our study objectives, we examined the psychometric properties of the instrument. For validity, we took a cross-validation approach by randomly splitting the sample so that student responses were subject to exploratory factor analyses (EFA) and confirmatory factor analyses (CFA). For the EFA, 40% of the sample was randomly selected, and the remaining 60% of responses were saved for the CFA. The reason for splitting the sample accordingly (as opposed to 50% for each method) was to ensure the CFA
sample was large enough to meet more stringent sample size requirements (Kline, 1998). The CFA was used to determine whether the factor structure obtained in the EFA could be confirmed on student responses from the remainder of the sample. Structural equation modeling methods (Kline, 1998) were used to estimate the CFA models. In addition to the EFA and CFA, we examined internal consistency (Cronbach's α) of the scores for the full instrument and within factors, and determined a priori that acceptable reliability included values of .70 or greater, while values of .80 or greater were preferable (Nunnally, 1975). Two types of software were used for the analyses: PASW 18.0 (SPSS, Inc., 2010) and Mplus 6.1 (Muthen & Muthen, 2010).

Table 2. Descriptive Statistics and Reliability by Subscale and Dimension.

Subscale                    Item n    M      SD     α
Problem formulation         12        3.74   0.83   .88
Research                    10        3.71   0.88   .88
Interpretation              10        3.56   0.89   .88
Communication                9        3.66   0.97   .90
Precision/accuracy          14        3.81   0.89   .93
Key cognitive strategies    55        3.70   0.76   .96
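The split and reliability criteria described above can be sketched in code. The following is a minimal Python illustration, not the authors' actual analysis (which used PASW 18.0 and Mplus 6.1): it performs the 40/60 random split of respondents into EFA and CFA samples and computes Cronbach's α for a set of items, checked against the a priori .70/.80 criteria. The simulated response data and all names are hypothetical.

```python
import random
import statistics

def cronbach_alpha(scores):
    """Cronbach's alpha for a respondents-by-items matrix of ratings."""
    k = len(scores[0])  # number of items in the (sub)scale
    item_vars = [statistics.variance([row[i] for row in scores]) for i in range(k)]
    total_var = statistics.variance([sum(row) for row in scores])
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)

def split_sample(respondents, efa_share=0.40, seed=1):
    """Randomly assign efa_share of respondents to the EFA; reserve the rest for the CFA."""
    shuffled = list(respondents)
    random.Random(seed).shuffle(shuffled)
    cut = round(len(shuffled) * efa_share)
    return shuffled[:cut], shuffled[cut:]

# Simulated responses: 100 students rating 10 items on the 1-5 scale,
# driven by a common "trait" per student so that items correlate.
rng = random.Random(0)
data = []
for _ in range(100):
    trait = rng.gauss(3.5, 0.8)
    data.append([min(5, max(1, round(trait + rng.gauss(0, 0.7)))) for _ in range(10)])

efa_sample, cfa_sample = split_sample(data)  # 40 vs. 60 respondents
alpha = cronbach_alpha(data)
verdict = ("preferable" if alpha >= .80
           else "acceptable" if alpha >= .70
           else "inadequate")
print(f"EFA n = {len(efa_sample)}, CFA n = {len(cfa_sample)}, "
      f"alpha = {alpha:.2f} ({verdict})")
```

Against these criteria, the subscale α values reported in Table 2 (.88 to .96) would all fall in the preferable range.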
Prior evidence shows these students are more dependent on educators for college and career preparation (Pascarella et al., 2004) and that programs targeted toward college access positively affect them (Gandara & Bial, 2001; McDonough, 2004; Plank & Jordan, 2001; Stanton-Salazar, 2001; Venezia et al., 2003). Assessing students on the KCS is the first step to integrating these skills into instruction. Potentially, if the KCS are integrated into instruction, remedial higher education needs may decrease.

Limitations

Although the present study shows promising validity evidence, there are several limitations to consider. Of primary concern is our sample, of which the majority (68%) comprised African American (48%) and Hispanic/Latino students (20%). White students comprised 22%, and students of other races comprised the remaining 10% of the sample, suggesting an underrepresentation of Asian American, Pacific Islander, and American Indian students. Aspiring first-generation college students were of particular interest in this study, and more than half of the sample (53%) comprised this population. Due to these sample characteristics, the extent to which our findings generalize across high schools is somewhat limited. Moreover, there is a potential for respondent bias because this is a self-report instrument. Future research studies are needed to establish the predictive validity of the KCS dimension to determine whether students who exhibit high awareness and understanding of the KCS also have high achievement.

Implications for Practice

In light of the importance of college and career readiness as specified by the CCSS and the Race to the Top Assessment Program, it is increasingly crucial to measure the knowledge and skills associated with postsecondary success. School personnel (administrators, teachers, counselors, and other student support personnel) may assess their students with the KCS dimension to better understand how they can adjust instruction and programming within their classrooms and schools to encourage and teach the KCS. In addition to student surveys, there are teacher, counselor, and administrator versions available so that student scores may be compared with school personnel to gain a greater sense of the perceptions and discrepancies in college-readiness instruction and programs. Within the larger CCRSD online system, these instruments are tied to a resource database with actionable steps. The system is longitudinal, allowing students and school personnel to track their responses over time, monitor progress, and adjust instruction accordingly.

There is potential for the CCRSD to be used as a value-added assessment. With versions available for students, teachers, and other school personnel, and with the possibility of longitudinal tracking, school personnel can get a better sense for the value added of their programs to student learning and achievement. In addition, use of the CCRSD coupled with a performance assessment (such as C-PAS) may allow for a comparison of student self-ratings and teacher scores on the KCS. This system is not meant to replace current and well-known academic performance measures; the KCS are meant to add more meaning and clarification in integrating the instruction of thinking skills alongside the content that is taught and measured in high school courses. The CCRSD online system allows all participants to use a data-driven decision framework to better understand how they can optimally spend their high school years in preparation for the future.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect to the research, authorship, and/or publication of this article.

Funding

The authors received no financial support for the research, authorship, and/or publication of this article.

Note

1. The college-readiness model is described by Conley (2010, p. 31). Copyright 2010 by D. T. Conley. The model dimensions described in the book have been relabeled as model keys. Names of two keys have been relabeled: Academic behaviors are now key learning skills and techniques, and contextual awareness and skills are now key transition knowledge and skills.

References

Achieve, Inc. (2007). Aligned expectations? A closer look at college admissions and placement tests. Washington, DC: Author.

ACT, Inc. (2010). College readiness standards for EXPLORE, PLAN, and ACT. Iowa City, IA: Author.

Adelman, C. (1999). Answers in the tool box: Academic intensity, attendance patterns, and bachelor's degree attainment. Washington, DC: U.S. Department of Education.

Allen, D. (1999). Desire to finish college: An empirical link between motivation and persistence. Research in Higher Education, 40, 461–485.

Baldwin, M., Seburn, M., & Conley, D. T. (2011). External validity of the College-Readiness Performance Assessment System (C-PAS). Paper presented at the 2011 annual conference of the American Educational Research Association, New Orleans, LA.

Baxter, G. P., & Glaser, R. (1997). An approach to analyzing the cognitive complexity of science performance assessments. Los Angeles, CA: National Center for Research on Evaluation, Standards, and Student Testing and Center for the Study of Evaluation.

Bedsworth, W., Colby, S., & Doctor, J. (2006). Reclaiming the American dream. Boston, MA: Bridgespan.
Boekaerts, M. (1999). Self-regulated learning: Where we are today. International Journal of Educational Research, 31, 445–457.

Boston, C. (2003). Cognitive science and assessment. College Park, MD: Office of Educational Research and Improvement.

Bransford, J. D., Brown, A. L., & Cocking, R. R. (Eds.). (2000). How people learn: Brain, mind, experience, and school. Washington, DC: National Academy of Sciences, U.S. Department of Education.

Brown, R. S., & Conley, D. T. (2007). Comparing state high school assessments to standards for success in entry-level university courses. Educational Assessment, 12, 137–160.

Brown, R. S., & Niemi, D. N. (2007). Investigating the alignment of high school and community college assessments in California (National Center Report No. 07-3). The National Center for Public Policy and Higher Education.

Browne, M. W., & Cudeck, R. (1993). Alternative ways of assessing model fit. In K. A. Bollen & J. S. Long (Eds.), Testing structural equation models (pp. 136–162). Newbury Park, CA: SAGE.

Camara, W. J., & Echternacht, G. (2000). The SAT and high school grades: Utility in predicting success in college. College Entrance Examination Board, Office of Research and Development (Report CB-RN-10). New York, NY: College Board.

Chen, X. (2005). First generation students in postsecondary education: A look at their college transcripts (NCES 2005-171). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Choy, S. P. (2001). Students whose parents did not go to college: Postsecondary access, persistence, and attainment (NCES 2001-126). U.S. Department of Education, National Center for Education Statistics. Washington, DC: U.S. Government Printing Office.

Cimetta, A. D., D'Agostino, J. J., & Levin, J. R. (2010). Can high school achievement tests serve to select college students? Educational Measurement: Issues and Practice, 29, 3–12.

Coelen, S. P., & Berger, J. B. (2006). First steps: An evaluation of the success of Connecticut students beyond high school. Quincy, MA: Nellie Mae Education Foundation.

Conley, D. T. (2003). Understanding university success. Eugene: Center for Educational Policy Research, University of Oregon.

Conley, D. T. (2005). College knowledge: What it really takes for students to succeed and what we can do to get them ready. San Francisco, CA: Jossey-Bass.

Conley, D. T. (2007a). Redefining college readiness (pp. 8–9). Eugene, OR: Center for Educational Policy Research.

Conley, D. T. (2007b). Toward a comprehensive conception of college readiness. Eugene, OR: Educational Policy Improvement Center.

Conley, D. T. (2010). College and career ready: Helping all students succeed beyond high school. San Francisco, CA: Jossey-Bass.

Conley, D. T. (2011a). Building on the common core. Educational Leadership, 68, 16–20.

Conley, D. T. (2011b). Pathways to postsecondary and career readiness. Invited speaker at College and Career Readiness Regional Workshop, Wellington, NZ.

Conley, D. T., Lombardi, A., Seburn, M., & McGaughy, C. (2009). Formative assessment for college readiness on five key cognitive strategies associated with postsecondary success. Paper presented at the 2009 annual conference of the American Educational Research Association, San Diego, CA.

Conley, D. T., McGaughy, C., Kirtner, J., Van Der Valk, A., & Martinez-Wenzl, M. T. (2010). College readiness practices at 38 high schools and the development of the CollegeCareerReady School Diagnostic tool. Paper presented at the 2010 annual conference of the American Educational Research Association, Denver, CO.

Costa, A., & Kallick, B. (2000). Discovering & exploring habits of mind. A developmental series (Book 1). Alexandria, VA: Association for Supervision and Curriculum Development.

Gandara, P., & Bial, D. (2001). Paving the way to higher education: K-12 intervention programs for underrepresented youth. Washington, DC: U.S. Government Printing Office, U.S. Department of Education, National Center for Education Statistics.

Gore, P. (2006). Academic self-efficacy as a predictor of college outcomes: Two incremental validity studies. Journal of Career Assessment, 14, 92–115.

Hu, L. T., & Bentler, P. M. (1995). Evaluating model fit. In R. H. Hoyle (Ed.), Structural equation modeling: Concepts, issues, and applications (pp. 76–99). Thousand Oaks, CA: SAGE.

Kline, R. (1998). Principles and practice of structural equation modeling. New York, NY: Guilford.

Kuh, G. D. (2005). Student engagement in the first year of college. In L. M. Upcraft, J. N. Gardner, & B. O. Barefoot (Eds.), Challenging and supporting the first-year student (pp. 86–107). San Francisco, CA: Jossey-Bass.

Lombardi, A. R., Seburn, M., & Conley, D. T. (2011a). Development and initial validation of a measure of academic behaviors associated with college and career readiness. Journal of Career Assessment, 19, 375–391.

Lombardi, A. R., Seburn, M., & Conley, D. T. (2011b). Treatment of nonresponse items on scale validation: What "don't know" responses indicate about college readiness. Paper presented at the 2011 annual conference of the American Educational Research Association, New Orleans, LA.

MacCallum, R. C., Brown, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130–149.

McDonough, P. M. (2004). Counseling matters: Knowledge, assistance, and organizational commitment to college preparation. In W. G. Tierney (Ed.), Nine propositions relating to the effectiveness of college preparation programs (pp. 69–88). New York, NY: State University of New York Press.

McGee, D. (2003). The relationship between WASL scores and performance in the first year of university. Seattle: Office of Educational Assessment, University of Washington.

Muthen, L. K., & Muthen, B. O. (2010). Mplus version 6.0 user's guide. Los Angeles, CA: Muthen & Muthen.

National Center for Education Statistics. (2004). The condition of education 2004: Remediation and degree completion. Washington, DC: U.S. Department of Education.

National Governor's Association & Council of Chief State School Officers. (2010). Common Core State Standards initiative. Retrieved from http://www.corestandards.org/

National Research Council. (2002). Learning and understanding: Improving advanced study of mathematics and science in U.S. high schools. Washington, DC: National Academy Press.

Noble, J. P., & Camara, W. J. (2003). Issues in college admissions testing. In J. E. Wall & G. R. Walz (Eds.), Measuring up: Assessment issues for teachers, counselors, and administrators (pp. 283–296). Greensboro, NC: ERIC Counseling and Student Services Clearinghouse.

Nunnally, J. C. (1975). Psychometric theory: 25 years ago and now. Educational Researcher, 4, 7–21.

Pascarella, E. T., Pierson, C. T., Wolniak, G. C., & Terenzini, P. T. (2004). First-generation college students: Additional evidence on college experiences and outcomes. Journal of Higher Education, 75, 249–284.

Perkins, D. (1992). Smart schools: Better thinking and learning for every child. New York, NY: Free Press.

Pintrich, P. R. (2004). A conceptual framework for assessing motivation and self-regulated learning in college students. Educational Psychology Review, 16, 385–407.

Plank, S. B., & Jordan, W. J. (2001). Effects of information, guidance, and actions on postsecondary destinations: A study of talent loss. American Educational Research Journal, 38, 947–979.

Ritchhart, R. (2002). Intellectual character: What it is, why it matters, and how to get it. San Francisco, CA: Jossey-Bass.

SPSS, Inc. (2010). PASW 18.0 for Windows. Chicago, IL: IBM.

Stanton-Salazar, R. D. (2001). Manufacturing hope and despair: The school and kin support networks of U.S.-Mexican youth. New York, NY: Teachers College Press.

Tinto, V. (2007). Research and practice of student retention: What's next? Journal of College Student Retention, 8, 1–19.

Torres, J. B., & Solberg, V. S. (2001). Role of self-efficacy, stress, social integration, and family support in Latino college student persistence and health. Journal of Vocational Behavior, 59, 53–63.

U.S. Department of Education. (2010). Race to the Top Assessment Program. Retrieved from http://www2.ed.gov/programs/racetothetop-assessment/index.html

Venezia, A., Kirst, M. W., & Antonia, A. L. (2003). Betraying the college dream: How disconnected K-12 and postsecondary education systems undermine student aspirations. Stanford, CA: Stanford Institute for Higher Education Research.

Wiley, A., Wyatt, J., & Camara, W. J. (2010). The development of a multidimensional college readiness index. New York, NY: College Board.

Wolters, C. A. (1998). Self-regulated learning and college students' regulation of motivation. Journal of Educational Psychology, 90, 224–235.

Zajacova, A., Lynch, S. M., & Espenshade, T. J. (2005). Self-efficacy, stress, and academic success in college. Research in Higher Education, 46, 677–706.