COE Data Analysis Process Tool (DAPT)
Division:
Program:
Academic Year:
To ensure that interpretations of data are valid and consistent, quality assurance requires measures that are cumulative, relevant, and actionable.
Step 1: Complete the chart below based on the key assessment data you analyzed.
The data are cumulative (successive administrations; 3 cycles of data).
CYCLE 1: Term__________
Title of Key Assessment | Analysis of Findings—Strengths | Analysis of Findings—Areas for Growth
CYCLE 2: Term__________
Title of Key Assessment | Analysis of Findings—Strengths | Analysis of Findings—Areas for Growth
CYCLE 3: Term__________
Title of Key Assessment | Analysis of Findings—Strengths | Analysis of Findings—Areas for Growth
Step 2: Base your responses to the questions below (from the CUC Advanced Level Programs Phase-in Plan 5) on the data from each of the assessments you listed.
Section 1. The data are relevant: there is a clear link between what is being measured and what your division intends to measure.
1.1 What do assessment results indicate about the extent to which course curriculum and experiences
prepared candidates for what was assessed?
1.2 To what extent is the assessment clear and explicit about the expectations for candidate
proficiencies in relation to standards?
1.3 How effectively were expectations conveyed in the assessment’s narrative description provided in
advance of the assessment?
Section 2. The data are actionable: there is a clear link between the measure and the action taken as a result of the measure.
2.1 What do the assessment results reveal about underlying patterns of strength and areas of growth
within programs or about populations who could be served more effectively?
2.2 What do the assessment results you wrote about in 2.1 tell you about potential interventions for
students, as well as assessment, course, and/or program-level improvements?
Student Interventions
Explain:
Action Steps:
Assessment-level Improvements
Explain:
Action Steps:
Revised 3.2.2023
Course-level Improvements
Explain:
Action Steps:
Program-level Improvements
Explain:
Action Steps:
2.3 What are the clear standards of comparison for the interpretation of this measure? (Measures can
be compared across programs, against peers, against established “best practices,” against established
goals, against national or state norms, or over time.)