
Using Assessments for

Program and Unit Review


Dr. Debra Colley, Niagara University
Dr. Kim Boyd, Oral Roberts University
How are assessments tied to the unit's assessment system?

How are data from assessments used to inform program and unit changes?
Standard 2: Assessment System and Unit Evaluation

The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the performance of candidates, the unit, and its programs.
2a. Assessment System

Unacceptable: The unit has not involved its professional community in the development of its assessment system. The unit's assessment system is limited in its capacity to monitor candidate performance, unit operations, and programs. The assessment system does not reflect professional, state, and institutional standards. Decisions about continuation in and completion of programs are based on a single or few assessments. The unit has not examined bias in its assessments, nor made an effort to establish fairness, accuracy, and consistency of its assessment procedures and unit operations.

Acceptable: The unit has an assessment system that reflects the conceptual framework and professional and state standards and is regularly evaluated by its professional community. The unit's system includes comprehensive and integrated assessment and evaluation measures to monitor candidate performance and manage and improve the unit's operations and programs. Decisions about candidate performance are based on multiple assessments at admission into programs, appropriate transition points, and program completion. The unit has taken effective steps to eliminate bias in assessments and is working to establish the fairness, accuracy, and consistency of its assessment procedures and unit operations.

Target: The unit, with the involvement of its professional community, is regularly evaluating the capacity and effectiveness of its assessment system, which reflects the conceptual framework and incorporates candidate proficiencies outlined in professional and state standards. The unit regularly examines the validity and utility of the data produced through assessments and makes modifications to keep abreast of changes in assessment technology and in professional standards. Decisions about candidate performance are based on multiple assessments made at multiple points before program completion and in practice after completion of programs. Data show a strong relationship of performance assessments to candidate success throughout their programs and later in classrooms or schools. The unit conducts thorough studies to establish fairness, accuracy, and consistency of its assessment procedures and unit operations. It also makes changes in its practices consistent with the results of these studies.
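The rubric language above about decisions based on multiple assessments at admission, transition points, and program completion can be pictured in code. The sketch below is not part of the presentation; the transition points, assessment names, and cut scores are hypothetical placeholders a unit would replace with its own.

```python
# Hypothetical transition points, assessments, and cut scores; a real unit
# would draw these from its conceptual framework and state standards.
TRANSITION_POINTS = {
    "admission": {"gpa": 3.0, "basic_skills_test": 220},
    "pre_student_teaching": {"gpa": 3.0, "dispositions_rubric": 2.5},
    "program_completion": {"student_teaching_eval": 3.0, "work_sample_rubric": 2.5},
}

def meets_transition_point(point: str, scores: dict) -> bool:
    """A candidate advances only if every required assessment meets its cut score."""
    required = TRANSITION_POINTS[point]
    return all(scores.get(name, 0) >= cut for name, cut in required.items())

candidate = {"gpa": 3.4, "basic_skills_test": 236}
print(meets_transition_point("admission", candidate))  # True
```

The point of the sketch is simply that no single score drives the decision; each gate checks several measures at once.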
2b. Data Collection, Analysis, and Evaluation

Unacceptable: The unit does not regularly and comprehensively gather, aggregate, summarize, and analyze assessment and evaluation information on the unit's operations, its programs, or candidates. The unit cannot disaggregate candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs. The unit does not maintain a record of formal candidate complaints or document the resolution of complaints. The unit does not use appropriate information technologies to maintain its assessment system. The unit does not use multiple assessments from internal and external sources to collect data on applicant qualifications, candidate proficiencies, graduates, unit operations, and program quality.

Acceptable: The unit maintains an assessment system that provides regular and comprehensive information on applicant qualifications, candidate proficiencies, competence of graduates, unit operations, and program quality. Using multiple assessments from internal and external sources, the unit collects data from applicants, candidates, recent graduates, faculty, and other members of the professional community. Candidate assessment data are regularly and systematically collected, compiled, aggregated, summarized, and analyzed to improve candidate performance, program quality, and unit operations. The unit disaggregates candidate assessment data when candidates are in alternate route, off-campus, and distance learning programs. The unit maintains records of formal candidate complaints and documentation of their resolution. The unit maintains its assessment system through the use of information technologies appropriate to the size of the unit and institution.

Target: The unit's assessment system provides regular and comprehensive data on program quality, unit operations, and candidate performance at each stage of its programs, extending into the first years of completers' practice. Assessment data from candidates, graduates, faculty, and other members of the professional community are based on multiple assessments from both internal and external sources that are systematically collected as candidates progress through programs. These data are disaggregated by program when candidates are in alternate route, off-campus, and distance learning programs. These data are regularly and systematically compiled, aggregated, summarized, analyzed, and reported publicly for the purpose of improving candidate performance, program quality, and unit operations. The unit has a system for effectively maintaining records of formal candidate complaints and their resolution. The unit is developing and testing different information technologies to improve its assessment system.
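The acceptable and target language above turns on two data operations: aggregating candidate assessment data for the unit as a whole and disaggregating it for alternate route, off-campus, and distance learning programs. A minimal sketch of both follows, using invented column names and scores; pandas is assumed to be available.

```python
import pandas as pd

# Hypothetical candidate assessment records (program and delivery mode invented).
records = pd.DataFrame({
    "candidate_id": [1, 2, 3, 4, 5, 6],
    "program": ["Childhood Ed"] * 3 + ["TESOL"] * 3,
    "delivery": ["on-campus", "distance", "alternate route",
                 "on-campus", "on-campus", "distance"],
    "work_sample_score": [3.4, 3.1, 2.8, 3.6, 3.2, 2.9],
})

# Aggregate for the unit, then disaggregate by program and delivery mode.
unit_summary = records["work_sample_score"].agg(["mean", "count"])
by_group = (records.groupby(["program", "delivery"])["work_sample_score"]
                   .agg(["mean", "count"]))

print(unit_summary)
print(by_group)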
2c. Use of Data for Program Improvement

Unacceptable: The unit makes limited or no use of data collected, including candidate and graduate performance information, to evaluate the efficacy of its courses, programs, and clinical experiences. The unit fails to make changes in its courses, programs, and clinical experiences when evaluations indicate that modifications would strengthen candidate preparation to meet professional, state, and institutional standards. Faculty do not have access to candidate assessment data and/or data systems. Candidates and faculty are not regularly provided formative feedback based on the unit's performance assessments.

Acceptable: The unit regularly and systematically uses data, including candidate and graduate performance information, to evaluate the efficacy of its courses, programs, and clinical experiences. The unit analyzes program evaluation and performance assessment data to initiate changes in programs and unit operations. Faculty have access to candidate assessment data and/or data systems. Candidate assessment data are regularly shared with candidates and faculty to help them reflect on and improve their performance and programs.

Target: The unit has fully developed evaluations and continuously searches for stronger relationships in the evaluations, revising both the underlying data systems and analytic techniques as necessary. The unit not only makes changes based on the data, but also systematically studies the effects of any changes to assure that programs are strengthened without adverse consequences. Candidates and faculty review data on their performance regularly and develop plans for improvement based on the data.
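One concrete form the "data to initiate changes" step above can take is a periodic scan that flags programs whose assessment results fall below a target. The sketch below is illustrative only; the programs, scores, and cut point are invented.

```python
import statistics

# Hypothetical mean rubric scores for a recent cohort, by program.
scores_by_program = {
    "Childhood Ed": [3.4, 3.1, 3.3, 2.9],
    "Adolescence Ed": [2.6, 2.8, 2.7, 2.5],
}
TARGET_MEAN = 3.0  # invented cut point for triggering a program review

flagged = [program for program, scores in scores_by_program.items()
           if statistics.mean(scores) < TARGET_MEAN]
print(flagged)  # ['Adolescence Ed']
```

Flagging is only the trigger; per the target rubric, the unit would then study the effects of whatever change it makes.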
What evidence do you collect from P-12 students that indicates candidates have a positive effect on student learning?
• Teacher Work Samples

• Standardized Achievement Tests

• A state system that links student learning to candidates and/or alumni
What assessments do you
use?
• Unit level
– State reports
– Student Teaching Performance Evaluations
– TWS
– Cooperating Teacher Program Evaluations
• Program level
– Look at required assessments
– Program specific Student Teaching Evaluations
– Advanced level Internship Evaluations
– Advanced candidate projects with rubrics
• Follow-up Surveys
– Exit Interviews
– Alumni Surveys
– Employer Surveys
What assessments do you use?

Diagram: assessments are drawn from the institution, unit, program, and community (P-12) levels.
Key assessments

Institution and unit:
• Institutional research
• NSSE
• Follow-up surveys (alumni and employers)
• Tracking program completers
• FQPD
• Faculty evaluations

Program:
• State assessments
• Dispositional assessment
• Field experience and clinical practice
• Diversity assessment
• Course embedded assessment
• Comprehensive exams or portfolio defense
The common ground

Diagram linking unit performance, unit performance indicators, candidate performance indicators, program quality, and student learning.
Alignment and use of
assessments

• Core assessments

• Delineation of components to be measured

• Alignment of assessments

• Clustering of unit and program assessments

• Continuous improvement
NU’s conceptual framework – 4 major
components

1. Mission

2. Dispositions

3. Theoretical dimensions

4. Strategic goals
Alignment to assessment system

• Each component of the conceptual framework is aligned with the assessment system.

• Data are clustered to measure our success and link assessments.
Clustering unit and program
assessments (streams of data)

1. Program area data stream

2. Field experience and partnership data stream

3. Diversity data stream

4. Faculty qualifications, performance, and development data stream

5. Alumni and employer follow-up surveys

6. Advancing research data stream

7. National Survey of Student Engagement (NSSE)


Alignment matrix: each conceptual framework component is marked (X) against the data streams that provide evidence for it (Program Area, Field Experience and Partnership, Diversity, FQPD, Alumni & Employer Follow-Up, Advancing Research, NSSE).

Mission: 1.1 Service (Vincentian)
Dispositions: 2.1 Responsibility; 2.2 Relationships; 2.3 Critical thinking
Theoretical Dimensions: 3.1 Constructivism; 3.2 Process-product; 3.3 Reflective practice
Strategic Goals: 4.1 Diversity; 4.2 Faculty; 4.3 High standards; 4.4 Success of graduates; 4.5 Partnerships
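The matrix above is essentially a lookup from conceptual framework components to data streams. A minimal sketch of that lookup follows; the two example pairings are illustrative, not NU's actual matrix.

```python
# Illustrative component-to-stream pairings; NU's actual matrix marks each
# component against its own subset of the seven data streams.
ALIGNMENT = {
    "4.1 Diversity": {"Program Area", "Field Experience and Partnership",
                      "Diversity", "Alumni & Employer Follow-Up"},
    "4.3 High standards": {"Program Area", "FQPD",
                           "Alumni & Employer Follow-Up", "NSSE"},
}

def streams_for(component: str) -> set:
    """Data streams that supply evidence for a conceptual framework component."""
    return ALIGNMENT.get(component, set())

print(sorted(streams_for("4.1 Diversity")))
```

Keeping the matrix in a structure like this makes it easy to confirm that every component is covered by at least one stream and that no stream goes unused.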
Linking assessments …

• Timelines
– Annual (When, where)

• Transparency
– What, how

• Roles and responsibilities
– Who

• Meaningful information
– Why
Meaningful information for…

• Planning, analysis, and advocacy

• Measuring progress
• Conceptual framework
• Unit standards, strategic plan goals, federal and state standards

• Improving and refining assessment system


• Assessments
• Transition points and quality assurance
Meaningful information for…

• Ensuring success of graduates and impact on student learning
• Alignment, currency, and rigor of curriculum
• Field experiences and clinical placements

• Enhancing faculty development
• Teaching effectiveness, scholarship, service

• Improving structures and operations
• Expanding partnerships with P-12 and community
Culture of continuous improvement

• Assessment is purposeful and well-planned (the assessment system).

• Assessments are aligned to the conceptual framework.

• Data from multiple sources are used for decision-making, planning, and continuous improvement at multiple levels (beyond accreditation).

• It is the way to do business, and it will make a difference.
Assessment and use of data have led
to (90% of faculty reporting):

Chart: Collaboration, 96%; Structure and process, 81%; Involvement and leadership, 81%.
Niagara University, College of Education (2009)
Questions and comments
