
Extension Education Theoretical Framework
With Criterion-Referenced Assessment Tools

Extension Manual EM-02-2013

Introduction
A guiding principle of Extension is that our educational programming is based on the use of knowledge generated through research-based, scientific inquiry. To produce individual, family, business or community change, educators need to use the science of change and of human, adult and organizational development and empowerment. Additional bodies of knowledge are needed so that science is infused in content and process.

While adult and youth education program planning frameworks exist, Extension does not have a unified theory that incorporates multiple disciplines and provides assessment tools to study the effectiveness of materials and educational delivery.

This manual was created specifically to enable Extension educators to incorporate a theoretical framework in program design. The manual can:

1. Stimulate thought and dialogue that will further enrich the use of the framework;

2. Guide educators and specialists who will design, test and evaluate Extension educational programming; and

3. Provide common ground for both individuals and teams from multiple disciplines in varying positions who will design, test and evaluate educational programs and materials.

Use of Theory

Theory is important. It provides uniformity and “becomes a predictor of facts” and “stimulates and guides scientific inquiry and analyzes and explains its findings” (Boone, Safrit, & Jones, 2002, p. 65). Application of theories has many benefits:

1. Increases the likelihood that intended outcomes will be achieved;

2. Provides the rationale for how the program is strategically structured and delivered;

3. Offers the basis for assessment of the program’s degree of success in achieving intended outcomes;

4. Enables program planners to combine their experience and insight with evidence-based explanations of behavior change;

5. Contains key concepts and variables that define how the concepts will be measured for evaluation of a theory-driven program;

For more information on this and other topics visit the University of Maryland Extension website at www.extension.umd.edu
6. Provides a rationale for what educational program designers do or did, and with what result.

Finally, while theories inform practice, practice can also inform theory. When defensible assessments are done, Extension’s programs can inform and build new theory.

Theories for Extension Education

As educational program planners, we need theory to understand how to increase the likelihood that desired outcomes will be achieved. Often, no one theory is adequate to guide the creation, delivery and measurement of educational programs.

Selection of theories depends on an assessment of the situation, identification of the targeted population, an understanding of the behavior to be addressed or change to be made, and determination of outcomes that are strategic, measurable, achievable, relevant and timely. Finally, the level of program—individual, family or other group, community or policy—will guide the appropriate programming theories.

For this guide, we are focused on theories that involve people and behavior (rather than communities or policies) and have chosen the following individually-focused theories:

• social cognitive theory,
• stages of change or readiness,
• theory of planned behavior,
• communications,
• adult or youth development,
• empowerment, and
• evaluation and action research.

Each is briefly presented in Table 1. Other theories could be analyzed and presented using the model as an example.

Testing Theories for “Goodness of Fit”

“A useful theory makes assumptions about a behavior … problem, target population or environment that are:

• Logical;
• Consistent with everyday observations;
• Similar to those used in previous successful programs; and
• Supported by past research in the same or related area” (1, p. 5).

Applying Theories to Educational Programs

After analyzing theories for goodness of fit, Extension educational program planners can use guidelines for designing information dissemination and educational programming focused on change-making. These guidelines can be used for multiple media in both group and individual learning. A set of guidelines is shown in Table 2.

Assessing New and Existing Programs, Curricula and Materials

Criterion-referenced assessment tools can be used both for assessing existing educational programs, curricula and materials and for developing new ones. In this guide, we provide three individual but complementary tools for assessing educational programs, curricula and educational materials, both existing and under development. Criterion-referenced assessment tools permit multiple individuals to make judgments using common criteria with common definitions. All three tools are located in the Appendices.

Other Theories in Program Development, Delivery and Evaluation
This guide is intended to stimulate thought and dialogue and to help youth and adult education
program developers. It can be used in program development, at the end of pilot testing a
program, and during a full-scale program implementation stage. We understand that
continuous improvement may require that this guide be modified as others test our ideas and
theories.
Theoretical Frameworks

Table 1 consists of a set of eight theories chosen by the authors based on their research and practice at the local, state, and national levels with educational programs designed to produce behavioral change. These eight theories are a base from which to start; other theories could be added and used.
Each theory is presented with a list of authors, a description, suggestions for application,
comments by the developers, and questions for users. Complete citations are found in the
references section.

Table 1. Individual-Level Theories: Overview and Use

Theory: Social Cognitive Theory
Authors: Bandura (5)

Description: Self-efficacy is the “do-ability” factor—the measure of the ability to take the desired action. Do-ability is affected by perception of control. Control includes sufficient competence and confidence to act. Self-confidence is key to taking action.

Application: Build confidence and sense of control based on where the consumer is at the start of the program. If a behavior is complex, smaller, easier tasks should be used to create small successes. Progress should be recognized, rewarded and reinforced as tasks or subtasks are completed.

Comments: Ties to theory of planned behavior and stages of change or readiness.

Questions: How do we determine the self-efficacy of our targeted learners? How can we build confidence, competence and control into our educational design? Are there complex tasks that must be done to act on the tasks? How can we divide complex tasks into subtasks? What kinds of recognition, rewards and reinforcement can we build into the design? Should we include measures of self-efficacy in our baseline and outcomes assessment? How can we tie our self-efficacy components to stages of readiness?

Theory: Stages of Change or Readiness for Change
Authors: Prochaska, DiClemente, & Norcross (6)

Description: Demonstrates that people go through five stages as they adopt a new behavior or belief: 1. Pre-contemplation; 2. Contemplation; 3. Preparation; 4. Action; 5. Maintenance.

Application: To help an adult learn, begin where they are. Use this theory to establish stage status in relation to the desired behavior. The education and materials should be designed for each stage to accommodate learners in each.

Comments: Ties to social cognitive theory and theory of planned behavior.

Questions: How can we determine the stages of change of our targeted learners? How can we build stages of change into our educational design? Should we include measures of stages of readiness in our baseline and/or outcomes assessment?

Theory: Theory of Planned Behavior
Authors: Ajzen (7)

Description: A causal explanation of how behavioral intention determines behavior, and how attitude and perceived control influence intent. Behavioral intention is the most important determinant of behavior.

Application: Help individuals believe they can do the behavior. Have attitudes shaped by what is required of the behavior and the likely outcome(s). Recognize the importance of the support of key people in their lives who can help with behavior change. Help individuals believe they have control over the behavior.

Comments: Ties to social cognitive theory.

Questions: What attitudes do consumers hold about a topic? To what extent do consumers perceive they can make good decisions? Who influences their thinking? Should behavioral intent be measured as a baseline before and after the program?

Theory: Communication
Authors: Bettman (8), McGuire (9), Rogers (10)

Description: Draws on multiple behavioral and social theories, including consumer behavior and social marketing. Can use an ecological perspective with multilevel communication. The focus is on the elements of: 1) who, 2) says what, 3) in which channels, 4) to whom, and 5) with what effect. Examines the interaction of audience and media for influence on knowledge, opinions, attitudes and behaviors of audiences. Communication processes are central to encouraging or discouraging behavior. Through framing, audiences are influenced in both what and how to think. Diffusion of information and adoption of innovation occur through a social process in categories of: innovators, early adopters, early majority, late majority, and laggards.

Application: Use to design targeted audience campaigns and messages. Consider multi-level strategies while selecting media as channels of delivery. Explore how the method of delivery, the content and its presentation, and the educator affect effectiveness. Identify the stage of adoption to determine the categories of adoption behavior among the targeted audiences, and incorporate those in the early stages as peer leaders. Frame messages to guide learners in what to think about and how to think.

Comments: Ties to social cognitive theory, theories of change, youth and adult education, empowerment and transformative theory. The stages of social cognitive theory and of diffusion and adoption of innovation can be complementary: the first focuses on the individual; the second on groups of individuals who influence one another.

Questions: What components of social cognitive theory, theories of change and adoption of innovation, and youth and adult education are relevant to the intended content and delivery of the educational materials? How will we know the extent to which our content, channels of delivery and the educator affected learning outcomes? Is there a rank order of messages that need to be sent, received and understood? Under what conditions could the way we frame the messages affect the critical thinking of targeted audiences? Do we have sufficient experience to design a communications plan for informing targeted audiences of our educational programming and its content? Within the targeted audience, who are the early adopters who influence others? Who is the best person to bring the message?

Theory: Adult Education
Authors: Boyle (11), Bruner (12), Franz (13), Kirsch, Jungeblut, Jenkins, & Kolstad (14), Knowles (15), Merriam, Caffarella, & Baumgartner (16), Mezirow (17), Norris (18)

Description: Adults are focused on solving and managing their problems. They want to be actively involved in problem solving. Adults learn well through dialogue and other learning styles. Transformative adult education empowers individuals by altering points of view as a result of critical thinking and reflection. Adults vary in their abilities to think about the concrete world (what they perceive to be real) and the abstract world (what they conclude and how they test consequences of decisions and actions: if A, then B). Adults vary in their general literacy: the knowledge and skills needed to locate, understand and use information—oral, written and numerical. Adults vary in their specific literacy levels: financial, health, technology, etc.

Application: Educators need to know how adults define their problems, what adults want to know, and why they want to know it, in order to design effective education. Educators need to know both the general and specific literacy levels of their targeted audience of learners in designing and delivering adult education. Principle: Start with the targeted population. Ground learning experiences in critical and reflective thinking.

Comments: Useful for establishing needs and the situation. Can be used to design learning after identifying the self-efficacy of targeted adults and their stage of readiness. When the science and art of adult education is applied to educational programming, the likelihood that learning and application of learning will occur increases. Programs, curricula and educational materials should be created and critiqued using assessment tools that address literacy levels of the targeted audience(s).

Questions: How can we determine the problems associated with our Extension programs that our targeted learners want solved? How can we design educational programs to address those problems? To what extent can we, and should we, engage our targeted learners in designing our educational materials and programming? What is the desired balance between facts and how-to application components of the educational program? Is dialogue integral to the learning environment? To what extent do we incorporate critical and reflective thinking? Is our programming sensitive to varying levels of literacy among adults? Does it use principles of plain language and clear communication? What are the different learning styles of the learners?

Theory: Youth Development
Authors: Benson, Scales, Hamilton, & Sesma (19), Bronfenbrenner (20), Damon (21), Erikson (22), Jones & Bouffard (23), Lerner & Benson (24), Pittman (25), Pittman & Irby (26)

Description: Children find their psychosocial identity through a series of developmental stages that continue across the life cycle. Success in each stage is dependent upon the successful completion of the previous stage. Children internally structure their identity through the experiences they have with others. Children’s growth and development occur within systems that help or hinder their development. Children with strong social and emotional learning skills will: a) have better relationships with other children and adults; b) do better in school; and c) exhibit better mental health. Positive youth development focuses on asset development and draws on existing assets of families and communities.

Application: Educational programming should be constructed for the age and stage of the children. Incorporating experiential learning will help children develop their own identity. Efforts should be undertaken to address the systems in which a child lives and learns. Educational programming that focuses on the whole child will likely result in better outcomes. Understanding the internal and external assets of each child, their family and the environments in which they exist, and focusing on those as a strength base, should result in positive youth development outcomes.

Comments: 4-H youth programming focuses on positive youth development. Critical elements essential to the healthy development of young people include:
1. Youth feel physically and emotionally safe.
2. Youth experience belonging and ownership.
3. Youth develop self-worth.
4. Youth discover self.
5. Youth develop quality relationships with peers and adults.
6. Youth discuss conflicting values and form their own.
7. Youth feel the pride and accountability that comes with mastery.
8. Youth expand their capacity to enjoy life and know that success is possible.

Questions: Have we considered a youth audience for our programming? Is our youth programming information-focused or developmentally-focused? Is our youth development programming incorporating these theories? Are there other appropriate theories to be included? How do we best design our programs to meet the needs of youth who are at different developmental stages (and in different environments)?

Theory: Empowerment
Authors: Fetterman (27), Freire (28), Kar, Pascual, & Chickering (29), Varkey, Kureshi, & Lesnick (30), Zimmerman (31), Zimmerman & Warschausky (32)

Description: A process by which individuals gain perceived autonomy and confidence to achieve control over problems and issues of concern to them through appropriate solutions. Includes actions, activities, and structures that may be empowering, and outcomes that represent a level of empowerment. Varies across contexts and by individuals. Can exist on individual, community and organizational levels.

Application: Use to develop programs that leave individuals, groups, communities and/or organizations with sufficient ability and confidence that they can address issues and/or solve problems themselves.

Comments: Ties to other theories. Can be used to decrease dependence on the provider of information and increase ability to learn and act with confidence. Is tied to learner-centered program planning, design, implementation and evaluation. Moves beyond measuring knowledge and skills to measuring sense of control and confidence.

Questions: Which learning methods will enable learners to identify issues and problems and find solutions? What is the desired balance between providing information and increasing confidence and a sense of control? What is the rationale for moving beyond information dissemination to empowerment programming?

Theory: Evaluation
Authors: Patton (33), Lewin (34)

Description: Program developers use theories and methodologies to determine: 1) the need for a program (assessment); 2) how well the process for planning, implementing, judging results and communicating is working (formative); and 3) how to assess results (summative). Fundamental is the Theory of Action, the explanation of how to produce desired results, which asks: Did the implemented program lead to the desired outcomes? User-focused theory engages the educators in conceptualization through findings.

Application: Educators need to know how the theory of evaluation fits into their programming. Applying evaluation theory will strengthen program planning, implementation and outcomes. Applying user-focused theory requires that: the statement of theory is understood; participants are comfortable with the process; participants understand how the theory supports their actions; goodness of fit is tested; and application of theory remains a strong focus.

Comments: At least one member of an educational programming team should have evaluation expertise or access to evaluation expertise. Costs associated with evaluation include personnel time and money, and should be considered as program planning begins. Funding may be needed to adequately plan for and execute the evaluation plan. A plan for internal and external communication of findings is needed. Materials evaluation is a component of formative evaluation, along with needs assessment (situation analysis) and outcome evaluation.

Questions: Who are the intended users of the evaluation information? What questions do we need to ask before, during and after our programs? How will we collect and analyze the data? Who will be involved in the evaluation plan and process? When should we create the plan of evaluation and when should it be implemented? Are there key dates requiring key evaluation actions? What is the desired balance between formative and summative evaluation elements of the plan?

Key constructs (concepts) of theories can be combined into a model that frames the design
and measurement of educational programming impact. Key guidelines from these theories are:
1) Involve the targeted population to understand their readiness to learn and what
they want to learn under what circumstances.
2) Identify the level of confidence, competence and sense of control before and after
programs to determine the extent to which the program resulted in change.
3) Create messages and deliver via channels that fit the needs and situations of the
targeted population.
4) Design learning experiences so participants increase their ability to think critically
and to reflect on what they learned.
5) Assure that evaluation of need, process and outcome is effectively conducted
and reported.

After establishing the theoretical framework for the Extension education programming, another step is needed: designing for the literacy and numeracy of a range of adults. Following literacy-focused design guidelines will strengthen the likelihood that the intended outcome(s) will be achieved. For learners with a wide range of literacy levels, program planners need to address both methods and materials. Table 2 lists five major guidelines, with sub-guidelines, for low-literacy adults.
Table 2. Guidelines for methods and materials for low-literacy adults (based on Doak, Doak, & Root, 2007, p. 22).

1. SET SMART* GOALS & OBJECTIVE(S)

1.1 Limit the goal(s) and objective(s) to the current needs of the majority of the target population.
1.2 Use a planning sheet to write down the goals, objective(s) and key points.

2. TO ENCOURAGE CHANGE, FOCUS ON BEHAVIORS AND SKILLS


2.1 Emphasize behaviors and skills rather than facts.
2.2 Sequence concepts to build breadth and depth of understanding and skills.
3. PRESENT CONTEXT FIRST (BEFORE GIVING NEW INFORMATION)
3.1 State the purpose or use for new content information before presenting it.
3.2 Relate new information to the context of adult lives.

4. PARTITION COMPLEX INSTRUCTIONS


4.1 Break instructions into easy-to-understand parts.
4.2 Provide opportunities for small successes.

5. MAKE IT INTERACTIVE
5.1 Consider including an interaction after each key topic. The learner must write,
tell, show, demonstrate, select or solve a problem.

*Specific, Measurable, Attainable, Realistic, Time-bound

Assessment Tools for Judging Programs, Curriculum & Educational Materials

After applying Table 2 guidelines to program design, Extension program planners can also use
an assessment to judge existing educational programs. Three tools are included in this guide.
They provide criteria for judging programs, curricula and educational materials.

Extension Program Assessment Tool (PAT). This tool was created by McCoy and Braun (35) for the University of Maryland Extension (UME) for judging the stage of program development along a continuum from informational, to developing, to signature, to evidence-based. We define a program as a comprehensive set of learning experiences that could include multiple curricula and materials. The tool can be used for both existing and new programs. PAT uses checklists to assess the program; instructions are included with the instrument in Appendix A.

Extension Curriculum Assessment Tool (CAT). This assessment tool was created to provide a standardized set of criteria for evaluating existing educational curricula and for creating new ones. CAT was originally created for the new Maryland Health Smart Health Insurance Literacy Initiative educational curriculum and has since been modified into a generic multi-disciplinary curriculum assessment. CAT evaluates curricula on a four-point scale (Effective, Good, Fair, and Ineffective). CAT is located in Appendix B.

CAT is based on a Journal of Extension article (36) which provided a rationale for curricula
review. Finkbeiner and Braun (37) added additional items and converted the new items and
those from the article into an assessment tool.

Extension Materials Assessment Tool (MAT). The purpose of the materials assessment tool
(38) is to provide a standardized set of criteria to judge educational materials used in programs
and in curricula. Our tool is a modification of SAM — the Suitability Assessment of Materials
(2). It can be used both for critiquing existing educational materials and creating new materials.
This tool was originally created for assessing materials used with low-literacy audiences. It can
be used to judge the extent to which the materials can be understood and used by audiences
with varying levels of prose, document and quantitative literacy (14).

References Used for Extension Education Theoretical Framework Guide

1. National Cancer Institute (2005). Theory at a glance: A guide for health promotion
practice (2nd ed.). NIH Publication No. 05-3986: U.S. Department of Health and Human
Services. Available at: http://www.cancer.gov/cancertopics/cancerlibrary/theory.pdf.

2. Doak, C.C., Doak, L.G., & Root, J.H. (2007). Teaching patients with low literacy skills.
(2nd ed.). Philadelphia: J.B. Lippincott Company.

3. Glanz, K., Lewis, F. M., & Rimer, B. K. (Eds.) (1997). Health behavior and health
education. (2nd ed). San Francisco, CA: Jossey-Bass Publishers.

4. Doran, G. T. (1981). There’s a S.M.A.R.T. way to write management’s goals and objectives. Management Review, 70, 35-36.

5. Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory.
Englewood Cliffs, NJ: Prentice-Hall, Inc.

6. Prochaska, J.O., DiClemente, C.C., & Norcross, J.C. (1992). In search of how people
change: Applications to the addictive behaviors. American Psychologist, 47, 1102-1114.

7. Ajzen, I. (1985). From intentions to actions: A theory of planned behavior. In J. Kuhl & J.
Beckmann (Eds.), Action-control: From cognition to behavior (pp. 11-39). Heidelberg:
Springer.

8. Bettman, J. R. (1979). An information processing theory of consumer choice. Reading, MA: Addison-Wesley.

9. McGuire, W. J. (1984). Public communication as a strategy for inducing health-promoting behavioral change. Preventive Medicine, 13, 299-313.

10. Rogers, E. M. (1983). Diffusion of innovations. New York: Free Press.

11. Boyle, P.G. (1981). Planning better programs. New York: McGraw-Hill.

12. Bruner, J.S. (1966). Toward a theory of instruction. Cambridge, MA: The Belknap Press
of Harvard University.

13. Franz, N. (February, 2007). Adult education theories: Informing Cooperative Extension's
transformation. Journal of Extension, 45. Available at:
http://www.joe.org/joe/2007february/a1.php

14. Kirsch, I.S., Jungeblut, A., Jenkins, L., & Kolstad, A. (2002). Adult literacy in America: A
first look at the findings of the national adult literacy survey. (NCES1993-275). U.S.
Department of Education Office of Educational Research and Improvement.
Washington, DC: National Center for Education Statistics.

15. Knowles, M.S. (1973). The adult learner: A neglected species (2nd ed.). Houston: Gulf
Publishing Company.

16. Merriam, S.B., Caffarella, R.S., & Baumgartner, L.M. (2007). Learning in adulthood: A
comprehensive guide (3rd ed.). San Francisco: Jossey-Bass.

17. Mezirow, J. (Ed.) (2000). Learning as transformation: Critical perspectives on a theory in progress. San Francisco, CA: The Jossey-Bass Higher and Adult Education Series.

18. Norris, J.A. (2003). From telling to teaching: A dialogue approach to adult learning. Myrtle Beach, SC: Learning by Dialogue.

19. Benson, P. L., Scales, P. C., Hamilton, S. F., & Sesma, A. (2006). Positive youth
development: Theory, research, and applications. In W. Damon & R. Lerner (Eds.),
Handbook of child psychology (vol. 1, pp. 894–941). Hoboken, NJ: Wiley.

20. Bronfenbrenner, U. (1979). The ecology of human development: Experiments by nature and design. Cambridge, MA: Harvard University Press.

21. Damon, W. (2004). What is positive youth development? Annals of the American
Academy of Political and Social Science, 591, 13–24.

22. Erikson, E.H. (1959). Identity and the life cycle. New York: International Universities
Press.

23. Jones, S.M., & Bouffard, S.M. (2012). Social and emotional learning in schools: From
programs to strategies. Social Policy Report, 26(4).

24. Lerner, R.M., & Benson, P.L. (Eds.). (2003). Developmental assets and asset-building
communities: Implications for research, policy, and practice. New York: Kluwer
Academic.

25. Pittman, K. (1991, June). Promoting youth development: Strengthening the role of youth
serving and community organizations. Washington D.C.: Academy for Educational
Development.

26. Pittman, K., & Irby, M. (1995). An advocate's guide to youth development. Boston:
Academy for Educational Development.

27. Fetterman, D. (2001). Foundations of empowerment evaluation. Thousand Oaks, CA: Sage Publications.

28. Freire, P. (1972). Pedagogy of the oppressed. New York: Herder and Herder.

29. Kar, S. B., Pascual, C. A., & Chickering, K. L. (1999). Empowerment of women for
health promotion: A meta-analysis. Social Science & Medicine, 49, 1431-1460.

30. Varkey, P., Kureshi, S., & Lesnick, T. (2010). Empowerment of women and its
association with the health of the community. Journal of Women’s Health, 19, 71-76.

31. Zimmerman, M. A. (1995). Psychological empowerment: Issues and illustrations. American Journal of Community Psychology, 23, 581-599.

32. Zimmerman, M. A., & Warschausky, S. (1998). Empowerment theory for rehabilitation research: Conceptual and methodological issues. Rehabilitation Psychology, 43, 3-16.

33. Patton, M.Q. (1997). Utilization-focused evaluation: The new century text. Thousand
Oaks, CA: Sage.

34. Lewin, K. (1946). Action research and minority problems. Journal of Social Issues, 2(4),
34-46.

35. McCoy, T. & Braun, B. (February, 2013). University of Maryland Extension program
assessment tool. College Park, MD: University of Maryland Extension.

36. Coleman, G., Byrd-Bredbenner, C., Baker, S., & Bowen, E. (2011). Best practices for extension curricula review. Journal of Extension, 49, Article 2TOT1. Available at: http://www.joe.org/joe/2011april/tt1.php

37. Finkbeiner, N. & Braun, B. (October, 2012). University of Maryland Extension curriculum
assessment tool. College Park, MD: University of Maryland Extension.

38. Finkbeiner, N. & Braun, B. (November, 2012). University of Maryland Extension materials assessment tool. College Park, MD: University of Maryland Extension.
39. Center for Medicare and Medicaid Services (2010). Toolkit for making written material
clear and effective: Part 7- Using readability formulas: A cautionary note. Available at:
http://www.cms.gov/Outreach-and-
Education/Outreach/WrittenMaterialsToolkit/ToolkitPart07.html

40. Centers for Disease Control and Prevention (2009). Simply put: A guide for creating
easy-to-understand materials. Available at:
www.cdc.gov/healthliteracy/pdf/simply_put.pdf

Appendix A
Extension Program Assessment Tool (PAT)©

University of Maryland Extension Program Assessment Tool

Rating levels: Informational, Developing, Signature, Evidence-Based

Needs Assessment: Fit with UME Mission (Program Design)

Informational:
 Represents an emerging public issue or need that could be addressed by a UME program.
 Based on some evidence of the issue and/or need.
 Included in at least one Individual Educational Plan.
 Not yet included in TEP.
 Minimal or no specific UME funding or other resources dedicated to addressing the emerging issue or need through a formal UME program.

Developing:
 Represents a developing public issue or need that can be addressed by UME.
 Based on substantive evidence of the public issue or need AND the capacity of UME to make an impact.
 Included in multiple IEPs.
 Included in at least one TEP for development.
 Start-up UME funding or other resources committed to addressing the issue or need through a formal program.

Signature:
 Represents a priority of UME based on identified public issues and/or needs of the people of the state.
 Provides sufficient evidence of impact to justify commitment of resources to conduct program.
 Defines the distinctiveness of UME from other organizations in addressing the public issue and/or particular need of the people of the state.
 Included in multiple IEPs across multiple disciplines.
 Identified as a signature program in at least one TEP.
 Sufficient internal and/or external resources to make an impact.
 Program is recognized outside of UME among public decision-makers and the people of the state and the national Extension System.

Evidence-Based:
Includes all of the signature program characteristics plus:
 Rigorous scientific evidence of impact.
 Adequate and sustained funding and other resources from UME and others.
 On occasion, replication by other state Extension systems or by external groups.

Educational Program: Meets Critical Clientele Needs (Program Development)

Informational:
 Exchange of information to answer questions and address concerns.
 Information is transferred to client for immediate use.
 Information is research-based.
 Contact time with client is usually one hour or less and one time and may be face-to-face and/or through different types of media.
 May involve key partners or stakeholders.

Developing:
 Exchange of information is for immediate use and could lead to change over time in an individual’s knowledge, attitude, skills, and aspirations (KASA).
 Information and methods of teaching/learning are research and theory-based.

Signature:
 Exchange of information leads to documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA).
 Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities.
 Information and methods of teaching/learning are research and theory-based.
 Contact time with client is more than two hours, for youth 6 or more hours, extended over a period of time, medium-to-long duration, and uses multiple methods of contact, including face-to-face and different types of media.
 Involves key partners and stakeholders.

Evidence-Based:
 Exchange of information leads to scientifically-rigorous, documented change in an individual’s knowledge, attitude, skills, and aspirations (KASA) over time.
 Exchange of information is used to aid in the solution of a public issue or need of individuals, families, and communities.
 Information and methods of teaching/learning are research and theory-based.
 Contact time with client is of a medium-to-long duration and uses multiple methods of contact, including face-to-face and different types of media.
 Involves key partners and stakeholders.
 Uses program strategies that have been scientifically tested and proven successful for public issues and needs of people.

Curriculum

Informational:
 Knowledge-based educational materials are used but no tested curriculum for change over time.

Developing:
 Program curriculum under development is based on the UME Extension Curriculum Assessment Tool (CAT) and, when appropriate, the Materials Assessment Tool (MAT).
 Program curriculum changes have been made based on the UME Extension CAT and, when appropriate, the MAT.
 Curriculum has been pilot-tested using appropriate testing methods.
 If curriculum is adapted from another source, it is subjected to the CAT and, if appropriate, to the MAT, and pilot tested for appropriateness in state and modified as needed.

Signature:
 Program curriculum developed using the UME Curricula Assessment Tool (CAT) review guidelines.
 Program curriculum adapted from another state has been peer reviewed using the UME Extension CAT and, when appropriate, MAT, and modified to meet Maryland needs.
 Curriculum has been both internally and externally peer-reviewed.
 Curriculum has been published with a UME signature-program endorsement.
 Curriculum is available to other states to use and adapt.

Evidence-Based:
 Program curriculum developed using the UME Curricula Assessment Tool (CAT) review guidelines.
 Program curriculum adapted from another state has been peer reviewed using UME CAT and, when appropriate, the MAT.
 Curriculum produces evidence-based results.

Research & Scholarship (Program Development & Delivery): Research Base

Informational:
 Uses research-based information.

Developing:
 Theory and research-based information is explicitly explained and incorporated into the development of program.

Signature:
 Theory and research-based information are used to explain impact measures and outcomes.
 Provides information that can be used to build additional program strategies and research questions.

Evidence-Based:
 Theory, research-based information, and empirical evidence are explicitly integrated in explanation of program impacts on intended outcomes.
 Program research results provide evidence to build additional theoretical models.
 Program research results provide evidence that allows for further research study funds to be generated.

Program Scholarly Outputs

Informational:
 Program activities cited in CVs and annual faculty reports for merit review.
 Conference and professional association posters.
 Conference and professional association workshops and presentations based on preliminary data.
 Contributions to eXtension Communities of Practice (COP).
 UME peer-reviewed Extension Briefs and/or Factsheets.

Developing:
 Program activities cited in CVs and annual faculty reports for merit review.

Signature:
 Program impacts cited in CV and annual faculty reports for merit reviews.
 Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews.
 Program results presentations at professional association meetings, workshops, panels, and other types of delivery methods, both refereed and non-refereed.
 Invited presentations and articles about program results.
 Contributions to eXtension Communities of Practice (COP).
 Refereed articles in subject-based journals.
 UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula.

Evidence-Based:
 Program scholarship findings cited in CV and annual faculty reports for merit reviews.
 Program scholarship findings used in promotion and tenure packages for decisions about Senior or Principal Agent advancement and for merit reviews.
 Evaluation results add to a national evidence-based database.
 Invited presentations and articles about program results are issued from other states, regions, and countries.
 Primary authorships in eXtension Communities of Practice (COP).
 Journal editorial board memberships.
 Refereed articles in highly-acclaimed journals.
 UME peer-reviewed Extension Briefs, Factsheets, Bulletins, Manuals, and Curricula.
 Books or book chapters.

Program Evaluation: Evaluation Use

Informational:
 Data collected and evaluated to determine participant knowledge gain and satisfaction level with the interaction experience.
 Evaluation results are used to communicate reach of Educator’s work.

Developing:
 Data collected and evaluated to determine participants’ short-term KASA outcomes and clientele satisfaction level with the interaction experience.
 Evaluation results used to determine program effectiveness and to communicate effectiveness of Educator’s work to meet clientele needs.

Signature:
 Data collected and evaluated to determine medium-term outcomes achieved that benefit clientele and/or the community.
 Evaluation results used to communicate UME’s value in addressing societal, economic, and environmental needs.
 Evaluation results used to communicate the effectiveness of Educator’s work to meet clientele needs in Maryland.

Evidence-Based:
 Data is collected and evaluated to determine long-term outcomes achieved that benefit clientele.
 Evaluation results are used to communicate UME’s impact on compelling societal, economic, and environmental issues in Maryland.
 Evaluation results are used to communicate state and national impacts on compelling societal, economic, and environmental issues.

Program Evaluation: Evaluation Methods

Informational:
 End-of-session instruments used to determine client satisfaction.
 No IRB approval required if client satisfaction will not be published.

Developing:
 Basic logic model developed.
 End-of-session instruments used for program improvement.
 Paired or unmatched pretests and posttests or other quantitative assessments for KASA changes.
 Qualitative methods incorporated where appropriate (structured observations, interviews).
 IRB approved.

Signature:
 Logic model is fully developed.
 End-of-session instruments used for program improvement.
 Paired or unmatched pretests and posttests for assessment of KASA changes.
 Qualitative methods incorporated where appropriate (structured observations, interviews).
 Follow-up survey research used to assess medium-term outcomes.
 Control and comparison groups used where appropriate.
 Findings are used to improve programs.
 Findings are peer reviewed and published when appropriate.
 IRB approved.

Evidence-Based:
 Logic model is fully developed and tested for utility over time.
 Results of evaluations have been subject to critical peer review.
 Empirical evidence exists about program effectiveness.
 Program results grounded in rigorous evaluations using experimental or quasi-experimental studies with randomized control groups.
 Program can be replicated by other states with confidence in program effectiveness.
 Findings are published in peer-reviewed journals and other publications.
 IRB approved.

Adoption & Replication (Program Dissemination)

Informational:
 Potential for adoption and replication unknown.

Developing:
 Has potential to become a program that can be replicated by Extension or others in state.

Signature:
 Recognized by respected agencies and organizations as an effective program.
 Adopted by other organizations or Extension services.

Evidence-Based:
 Program is promoted and adopted nationally as an empirically-tested program with identified short-, medium-, and long-term outcomes.
 Program materials (curriculum, protocols, evaluation instruments) exist that make adoption and replication possible.

Marketing & Communication (Program Dissemination)

Informational:
 No formal marketing plan, but program is advertised at the local level through flyers, newspaper articles, newsletters, or word-of-mouth.

Developing:
 No formal marketing plan, but advertising has extended beyond the local community.

Signature:
 Formal marketing plan in place and evaluated for effectiveness.

Evidence-Based:
 Effective components of a formal marketing plan are used.
Public Value (Program Dissemination)

Informational:
 Program value is evident to the individual participants using information.

Developing:
 Program value is evident to the individual participants using information and participating in the program.

Signature:
 Program’s value is evident to individuals, families, and the community-at-large.

Evidence-Based:
 Program’s value is evident to individuals, families, and the community-at-large.
 Program’s public value is determined by people or agencies outside of UME using this assessment tool or one used by an agency with a standardized tool and/or a process for judging value.

Sustainability (Organizational Commitment)

Informational:
 Minimum resources are required to initiate elements of a program.
 Internal resources used to launch the program.

Developing:
 Short-term resources committed from Impact Teams to assist developing program into signature program.
 Short-term external funding secured to assist in developing program.
 Potential partners identified.

Signature:
 Medium-term resources committed to supporting the program from the UME budget pending evidence of potential for impact.
 External funders may be involved in on-going support of the program.
 Partners involved in program when appropriate.

Evidence-Based:
 Long-term funding in UME budget due to evidence of impact.
 External, long-term funding or partners secured to maintain programming.
 National partners involved in program when appropriate.

Permission to use this Program Assessment Tool for non-profit educational purposes is granted with use of the following citation:

McCoy, T., Braun, B., & Finkbeiner, N. (March, 2013). University of Maryland Extension program assessment tool. College Park, MD:
University of Maryland Extension.

Appendix B
Extension Curriculum Assessment Tool (CAT)©

For each rating category, information is presented that clearly identifies how to assess the curriculum according to the four-point scale. Instructions for using the assessment tool are as follows:

1. Read the description in each cell.
2. Determine how the curriculum or material you are assessing rates on the particular category.
3. Insert your own comments as appropriate to clarify how or why you arrived at the particular rating.

Extension Curriculum Assessment Tool©

Reviewed By: Date:

Rating scale: Effective = 4 points; Good = 3 points; Fair = 2 points; Ineffective = 1 point.

Content

Theoretical Foundation:
Effective: The curriculum is based on current education and behavioral change theory and research. The theoretical underpinnings of the curriculum are described.
Good: All content except one or two pieces is based on current education and behavioral change theory and research. The theoretical underpinnings of the curriculum are mostly described.
Fair: More than one or two pieces of the curriculum are not based on current education and behavioral change theory and research. The theoretical underpinnings of the curriculum are not described in detail.
Ineffective: The curriculum is not based on current education and behavioral change theory and research. The theoretical underpinnings of the curriculum are not described.

Research-based Content:
Effective: The content of the curriculum is research-based, accurate, and current.
Good: The content of the curriculum is mainly effective; all but one of the key components of effective curriculum (research-based, accurate, and current) are addressed.
Fair: The content of the curriculum is missing more than one of the key components of effective curriculum (research-based, accurate, and current).
Ineffective: The content is not research-based, accurate, or current.

Balanced Viewpoint:
Effective: The curriculum presents a balanced view of the topic, recognizing any aspects that are not yet clearly understood or open to debate.
Good: All content except one or two pieces presents a balanced view of the topic, recognizing any aspects that are not yet clearly understood or open to debate.
Fair: More than one or two pieces of the curriculum do not present a balanced view of the topic, failing to recognize any aspects that are not yet clearly understood or open to debate.
Ineffective: The curriculum presents a one-sided view of the topic, failing to recognize any aspects that are not yet clearly understood or open to debate.

Learning Objectives:
Effective: Includes clear, measurable learning and behavioral objectives. Objectives are clearly linked to theoretical underpinnings.
Good: All content except one or two pieces is tied to clear, measurable learning and behavioral objectives. Objectives are mostly linked to theoretical underpinnings.
Fair: More than one or two pieces of the curriculum are not tied to clear, measurable learning and behavioral objectives. Objectives are poorly linked to theoretical underpinnings.
Ineffective: Does not include clear, measurable learning and behavioral objectives.

Audience

Target Audience:
Effective: Identifies the intended audience and is tailored to this audience.
Good: All but one or two components of the curriculum are tailored to the intended audience.
Fair: More than one or two components of the curriculum are not tailored to the intended audience.
Ineffective: Does not clearly identify the intended audience.

Audience Input/Outcomes:
Effective: Builds on the strengths/assets, needs, and interests of learners. Audience input was used to guide development of materials.
Good: All content except one or two pieces of the curriculum builds on the strengths/assets, needs, and interests of learners OR was guided by audience input.
Fair: More than one or two components of the curriculum do not build on the strengths/assets, needs, and interests of learners OR were not guided by audience input.
Ineffective: Does not build on the strengths/assets, needs, and interests of learners. Audience input was not used to guide development of materials.

Audience Involvement:
Effective: Actively engages the audience in the learning process and promotes behavior change.
Good: All content except one or two pieces actively engages the audience in the learning process and promotes behavior change.
Fair: More than one or two pieces of the curriculum do not actively engage the audience or do not promote behavior change.
Ineffective: Does not actively engage the audience in the learning process and does not promote behavior change.

Reflection of Diversity:
Effective: Reflects the diversity, including literacy, of the intended audience. Includes multilingual handouts and educational reinforcements when appropriate.
Good: All content except for one or two pieces reflects the diversity, including literacy, of the intended audience. Includes multilingual handouts and educational reinforcements when appropriate.
Fair: More than one or two pieces do not reflect the diversity, including literacy, of the intended audience OR the curriculum does not include multilingual handouts and educational reinforcements.
Ineffective: Does not reflect the diversity, including literacy, of the intended audience. Does not include multilingual handouts and educational reinforcements when appropriate.
Respect for Diversity:
Effective: Ideas and principles included in the curriculum respect all aspects of diversity, including literacy.
Good: All content, except for one or two ideas and principles included in the curriculum, respects all aspects of diversity, including literacy.
Fair: More than one or two ideas and principles included in the curriculum do not respect all aspects of diversity, including literacy.
Ineffective: Ideas and principles included in the curriculum do NOT respect all aspects of diversity, including literacy.
Readability

Grammar:
Effective: Reflects standards of written English and correct grammar, spelling, punctuation, and mechanics.
Good: One to two grammatical, spelling, punctuation, or mechanical errors.
Fair: More than two grammatical, spelling, punctuation, or mechanical errors.
Ineffective: Not comprehensible.

Tone and Reading Level:
Effective: All terminology is clear, correctly used, and spelled correctly throughout content. Correct abbreviations are used throughout. The curriculum is written at grade 6 or lower if intended for the general public.
Good: Terminology is somewhat clear and correctly used and spelled throughout most of content. Spelling mistakes are minor. Correct abbreviations are mostly used throughout content. The curriculum is written at grade 6 or lower if intended for the general public.
Fair: Terminology is frequently incorrectly used or is not clear and has misspellings. Abbreviations are incorrect. The curriculum, if for the general public, is written at a higher level than grade 6.
Ineffective: Not comprehensible, and the curriculum is not written at the grade 6 level if for the general public.
Organization:
Effective: Is logically and sequentially organized.
Good: All content except one or two pieces displays logical and sequential organization.
Fair: More than one or two pieces of the content are not logically and sequentially organized.
Ineffective: Is not clearly organized.
Style of Material:
Effective: Content displays evidence of understanding of principles of literacy and plain language (format, font, visuals, color, text construction, depth, detail, complexity).
Good: All content except one or two pieces displays evidence of understanding of principles of literacy and plain language (format, font, visuals, color, text construction, depth, detail, complexity).
Fair: More than one or two pieces of the content do not display evidence of understanding of principles of literacy and plain language (format, font, visuals, color, text construction, depth, detail, complexity).
Ineffective: Content does not display evidence of understanding of principles of literacy and plain language (format, font, visuals, color, text construction, depth, detail, complexity).
Permission to use this Curriculum Assessment Tool for non-profit educational purposes is granted with use of the following citation:
Finkbeiner, N., & Braun, B. (February, 2013). University of Maryland Extension curriculum assessment tool. College Park, MD: University
of Maryland Extension.

Extension Curriculum Assessment Tool - SCORING SHEET

4 points for effective rating
3 points for good rating
2 points for fair rating
1 point for ineffective rating
N/A if the factor does not apply to this material

FACTOR TO BE RATED SCORE COMMENTS


1. CONTENT
(a) Program grounded in theory

(b) Content grounded in research

(c) Viewpoint is balanced

(d) Learning objectives included

2. AUDIENCE
(a) Identifies target audience

(b) Audience input utilized

(c) Audience involved, engaged

(d) Diversity is reflected.

(e) Diversity is respected

3. READABILITY
(a) Accurate spelling/grammar

(b) Appropriate vocabulary and reading level

(c) Logical organization

(d) Material reflects principles of plain language and literacy

4. UTILITY
(a) Lesson Implementation and Preparation

(b) Appropriate references

(c) Easy to understand instructions

(d) Strong validity and reliability established

For more information on this and other topics visit the University of Maryland Extension website at www.extension.umd.edu 2
Appendix B
Extension Curriculum Assessment Tool (CAT)©
(e) Practical activities

(f) Relevant resources included

(g) Strong citation for program being reviewed

(h) Logic model included

(i) Describes process for implementing curriculum

5. EVALUATION
(a) Audience-tested instruments

(b) Psychometrically-sound instruments

(c) Evaluation methods linked to learning objectives

(d) Pre-test, post-test methods

Total score:

Permission to use this curriculum assessment tool scoring sheet is granted with the following citation:
Finkbeiner, N. & Braun, B. (March, 2013). University of Maryland Extension curriculum assessment scoring tool. College Park, MD: University of Maryland Extension.
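The scoring-sheet arithmetic above can be sketched in a few lines of Python. This is an illustration only and not part of the official tool; the function and factor names are hypothetical, and N/A factors are simply excluded from the total rather than counted as zero:

```python
# Illustrative sketch: totaling CAT factor ratings (4 = effective ... 1 = ineffective).
# Factors marked N/A (None) are excluded from the total.

def cat_total(factor_scores):
    """factor_scores: dict mapping factor name -> 1-4, or None for N/A."""
    rated = [v for v in factor_scores.values() if v is not None]
    for v in rated:
        if v not in (1, 2, 3, 4):
            raise ValueError("scores must be 1-4 or None (N/A)")
    return sum(rated)

scores = {"theory": 4, "research": 3, "balance": 3, "logic model": None}
print(cat_total(scores))  # -> 10
```

A reviewer comparing two curricula this way should also note how many factors were rated N/A, since two totals are only comparable when they cover the same factors.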

Appendix C
Extension Materials Assessment Tool (MAT)©

Suitability Assessment of Materials Evaluation Criteria [1]

SAM, the Suitability Assessment of Materials instrument, offers a systematic method to objectively assess the suitability of health information materials for a particular audience in a short time. SAM permits rating of materials on factors that affect readability (the relative difficulty of decoding the words) and comprehension (the relative difficulty of understanding the meaning). Six areas are assessed by SAM: 1) Content, 2) Literacy Demand, 3) Graphics, 4) Layout and Type, 5) Learning Stimulation and Motivation, and 6) Cultural Appropriateness. [2]

Materials are rated on a three-point scale (Superior, Adequate, and Not Suitable) using the objective criteria detailed below.

Extension Materials Assessment Tool©
Suitability Assessment of Materials Evaluation Criteria [3]

Rating scale: Superior = 2 points; Adequate = 1 point; Not Suitable = 0 points.
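In the SAM literature, the factor scores are conventionally summed and expressed as a percent of the maximum possible, with factors marked N/A excluded from that maximum. The sketch below illustrates only that arithmetic; the function and factor names are hypothetical, and the percent convention follows the Doak, Doak, and Root source rather than anything stated in this manual:

```python
# Illustrative sketch: a SAM-style percent score.
# Each factor is 2 (superior), 1 (adequate), or 0 (not suitable);
# factors rated N/A (None) are dropped from the maximum possible score.

def sam_percent(scores):
    """scores: dict mapping factor name -> 2, 1, 0, or None for N/A."""
    rated = {k: v for k, v in scores.items() if v is not None}
    max_possible = 2 * len(rated)
    if max_possible == 0:
        raise ValueError("no rated factors")
    return 100 * sum(rated.values()) / max_possible

ratings = {"purpose": 2, "content topics": 1, "scope": 1, "summary": None}
print(round(sam_percent(ratings), 1))  # 4 of 6 possible points -> 66.7
```

Excluding N/A factors from the denominator, rather than scoring them as zero, keeps a short material from being penalized for criteria that simply do not apply to it.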
Content
Content

Purpose:
It is important that readers readily understand the intended purpose of the instruction for them. If they don’t clearly perceive the purpose, they may not pay attention or may miss the main point.
Superior: Purpose is explicitly stated in title, cover illustration, or introduction.
Adequate: Purpose is not explicit. It is implied, or multiple purposes are stated.
Not suitable: No purpose is stated in the title, cover illustration, or introduction.

Content Topics:
Since adults usually want to solve their immediate problem rather than learn a series of facts (that may only imply a solution), the content of greatest interest and use is likely to be behavior information (explicit instructions, specific actions and recommendations) to help solve their problem.
Superior: Thrust of the material is application of knowledge/skills aimed at desirable reader behavior rather than non-behavior facts. Instructions are explicit and require specific actions from readers.
Adequate: At least 40% of content topics focus on desirable behaviors or actions. Some explicit directions or instructions are presented.
Not suitable: Nearly all topics are focused on non-behavior facts. No explicit instructions for behavior change are offered.

[1] Adapted from: Doak, C.C., Doak, L.G., & Root, J.H. (1996). Assessing suitability of materials. In Teaching patients with low literacy skills (2nd Ed.) (pp. 41-60). Philadelphia: J.B. Lippincott Company.

[2] Smith, S. (2008). SAM: Suitability Assessment of Materials for evaluation of health-related information for adults. Available at: http://aspiruslibrary.org/literacy/SAM.pdf

[3] Adapted from: Doak, C.C., Doak, L.G., & Root, J.H. (1996). Assessing suitability of materials. In Teaching patients with low literacy skills (2nd Ed.) (pp. 41-60). Philadelphia: J.B. Lippincott Company.

Scope:
Scope is limited to purpose or objective(s). Depending on the type of material, a limited number of “main points” are presented (for example, a flyer should address one to two main points, while a lengthier form/brochure should address no more than four main points). Scope is also limited to what the reader can reasonably learn in the time allowed.
Superior: Scope is limited to essential information directly related to the purpose. The appropriate number of main points are presented. Experience shows it can be learned in time allowed.
Adequate: Scope is expanded beyond the purpose; no more than 40% is nonessential information. The number of main points slightly exceeds the recommended amount. Key points can be learned in time allowed.
Not suitable: Scope is far out of proportion to the purpose and time allowed. Too many main points are presented.

Summary and Review:
A review offers the readers a chance to see or hear the key points of the instruction in other words, examples, or visuals. Reviews are important; readers often miss the key points upon first exposure.
Superior: A summary is included and retells the key messages in different words and examples.
Adequate: Some key ideas are reviewed.
Not suitable: No summary or review is included.

Literacy Demand

Reading Grade Level: An explanation is included in an appendix to this document. A calculated grade level will not be done for this project.
Writing Style:
Conversational style and active voice lead to easy-to-understand text. Example: “Consider the needs of your family when choosing a health insurance plan.” Passive voice is less effective. Example: “Consumers should be advised to select a plan that best meets the needs of their families.” Embedded information, the long or multiple phrases included in a sentence, slows down the reading process and generally makes comprehension more difficult, as shown in this sentence.
Superior: Both factors: (1) Mostly conversational style and active voice. (2) Simple sentences are used extensively; few sentences contain embedded information.
Adequate: (1) About 50% of the text uses conversational style and active voice. (2) Less than half the sentences have embedded information.
Not suitable: (1) Passive voice throughout. (2) Over half the sentences have extensive embedded information.

Vocabulary:
Common, explicit words are used (for example, doctor vs. physician). The instruction uses few or no words that express general terms such as categories (for example, legumes vs. beans), concepts (for example, normal range vs. 15 to 70), and value judgments (for example, excessive pain vs. pain that lasts more than 5 minutes). Imagery words are used because these are words people can “see” (for example, whole wheat bread vs. dietary fiber; a runny nose vs. excess mucus).
Superior: All three factors: (1) Common words are used nearly all of the time. (2) Technical, concept, category, value judgment (CCVJ) words are explained by examples. (3) Imagery words are used as appropriate for content.
Adequate: (1) Common words are frequently used. (2) Technical and CCVJ words are sometimes explained by examples. (3) Some jargon or math symbols are included.
Not suitable: Two or more factors: (1) Uncommon words are frequently used in lieu of common words. (2) No examples are given for technical and CCVJ words. (3) Extensive jargon.

In sentence construction, context is given before new information:

We learn new facts/behaviors more quickly when told the context first. Good example: “In order to get the most health care coverage for your insurance dollar (the context first), you should compare policies and premiums” (the new information).

Superior: Consistently provides context before presenting new information.
Adequate: Provides context before new information about 50% of the time.
Not suitable: Context is provided last or no context is provided.

Learning enhancement by advance organizers (road signs):

Headers or topic captions should be used to tell very briefly what’s coming up next. These “road signs” make the text look less formidable and also prepare the reader’s thought process to expect the announced topic.

Superior: Nearly all topics are preceded by an advance organizer (a statement that tells what is coming next).
Adequate: About 50% of the topics are preceded by advance organizers.
Not suitable: Few or no advance organizers are used.

Graphics

Cover graphic:

People do judge a booklet by its cover. The cover image often is the deciding factor in a reader’s attitude toward, and interest in, the instruction.

Superior: The cover graphic (1) is friendly; (2) attracts attention; (3) clearly portrays the purpose of the material for the intended audience.
Adequate: The cover graphic has one or two of the superior criteria.
Not suitable: The cover graphic has none of the superior criteria.

Type of illustrations:

Simple line drawings can promote realism without including distracting details. (Photographs often include unwanted details.) Visuals are accepted and remembered better when they portray what is familiar and easily recognized. Viewers may not recognize the meaning of scientific textbook drawings or abstract art/symbols.

Superior (2): Both factors: (1) Simple, adult-appropriate line drawings/sketches are used. (2) Illustrations are likely to be familiar to the viewers.
Adequate (1): One of the superior factors is missing.
Not suitable (0): Neither factor is present.

Relevance of illustrations:

Nonessential details such as room background, elaborate borders, and unneeded color can distract the viewer. The viewer’s eyes may be “captured” by these details. The illustrations should tell the key points visually.

Superior: Illustrations present key messages visually so the reader/viewer can grasp the key ideas from the illustrations alone. No distractions.
Adequate: (1) Illustrations include some distractions. (2) Insufficient use of illustrations.
Not suitable: One factor: (1) Confusing or technical illustrations (non-behavior related). (2) No illustrations, or an overload of illustrations.

Graphics: Lists, tables, graphs, charts, geometric forms:

Many readers do not understand the author’s purpose for lists, charts, and graphs. Explanations and directions are essential.

Superior: Step-by-step directions, with an example, are provided that will build comprehension and self-efficacy.
Adequate: “How-to” directions are too brief for the reader to understand and use the graphic without additional counseling.
Not suitable: Graphics are presented without explanation.

Captions are used to “announce”/explain graphics:

Captions can quickly tell the reader what the graphic is about and where to focus within the graphic. A graphic without a caption is usually an inferior instruction and represents a missed learning opportunity.

Superior: Explanatory captions are used with all or nearly all illustrations and graphics.
Adequate: Brief captions are used for some illustrations and graphics.
Not suitable: Captions are not used.

Layout and Typography

Layout:

Layout has a substantial influence on the suitability of materials.

Superior: At least five of the eight factors (listed below) are present.
Adequate: At least three of the superior factors are present.
Not suitable: (1) Two or fewer of the superior factors are present. (2) Looks uninviting or discouragingly hard to read.

Superior factors include:

(1) Illustrations are on the same page adjacent to the related text.
(2) Layout and sequence of information are consistent, making it easy for the reader to predict the flow of information.
(3) Visual cuing devices (shading, boxes, arrows) are used to direct attention to specific points or key content.
(4) Adequate white space is used to reduce the appearance of clutter.
(5) Use of color supports and is not distracting to the message. Viewers need not learn color codes to understand and use the message.
(6) Line length is 30-50 characters and spaces.
(7) There is high contrast between type and paper.
(8) Paper has a non-gloss or low-gloss surface.

Typography:

Type size and fonts can make text easy or difficult for readers at all skill levels. For example, type in ALL CAPS slows everybody’s reading comprehension. Also, when too many (six or more) type fonts and sizes are used on a page, the appearance becomes confusing and the focus uncertain.

Superior: All four factors (listed below) are present.
Adequate: Two of the superior factors are present.
Not suitable: One or none of the superior factors is present, or six or more styles and sizes are used on a page.

Superior factors include:

(1) Text type is in uppercase and lowercase serif (best) or sans serif.
(2) Type size is at least 12 point.
(3) Typographic cues (bold, size, color) emphasize key points.
(4) No ALL CAPS for long headers or running text.

Subheadings or “Chunking”:

Few people can remember more than seven independent items. For adults with low literacy skills, the limit may be three- to five-item lists. Longer lists need to be partitioned into smaller “chunks.”

Superior: (1) Lists are grouped under descriptive subheadings or “chunks.” (2) No more than five items are presented without a subheading.
Adequate: No more than seven items are presented without a subheading.
Not suitable: More than seven items are presented without a subheading.

Learning Stimulation and Motivation

Interaction included in text and/or graphic:

When the reader responds to the instruction – that is, does something to reply to a problem or question – chemical changes take place in the brain that enhance retention in long-term memory. Readers/viewers should be asked to solve problems, to make choices, to demonstrate, etc.

Superior: Problems or questions are presented for reader responses.
Adequate: Question-and-answer format is used to discuss problems and solutions (passive interaction).
Not suitable: No interactive learning stimulation is provided.

Desired behavior patterns are modeled, shown in specific terms:

People often learn more readily by observation and by doing it themselves rather than by reading or being told. They also learn more readily when specific, familiar instances are used rather than the abstract or general.

Superior: Instruction models specific behaviors or skills (for example, compare costs of crop insurance plans; identify restrictions for who is insured for what period of time).
Adequate: Information is a mix of technical and common language that the reader may not easily interpret in terms of daily living (for example, “deductible,” which has different meanings for health insurance and other insurance policies).
Not suitable: Information is presented in nonspecific or category terms such as the food groups.

Motivation:

People are more motivated to learn when they believe the tasks/behaviors are doable by them.

Superior: Complex topics are subdivided into small parts so that readers may experience small successes in understanding or problem solving, leading to self-efficacy.
Adequate: Some topics are subdivided to improve the readers’ self-efficacy.
Not suitable: No partitioning is provided to create opportunities for small successes.

Cultural Appropriateness

Cultural Match: Logic, Language, Experience (LLE):

A valid measure of the cultural appropriateness of an instruction is how well its logic, language, and experience (LLE), inherent in the instruction, match the LLE of the intended audience. For example, a pamphlet regarding health insurance options is a poor cultural match if it fails to provide information about both public and private insurance plans to an economically diverse audience.

Superior: Central concepts/ideas of the material appear to be culturally similar to the LLE of the target culture.
Adequate: Significant match in LLE for 50% of the central concepts.
Not suitable: Clearly a cultural mismatch in LLE.

Cultural image and examples:

To be accepted, an instruction must present cultural images and examples in realistic and positive ways.

Superior: Images and examples present the culture in positive ways.
Adequate: Neutral presentation of cultural images or foods.
Not suitable: Negative image, such as exaggerated or caricatured cultural characteristics, actions, or examples.

Initially prepared by Nicole Finkbeiner, GRA, under the direction of Dr. Bonnie Braun,
University of Maryland School of Public Health-07/09. Customized by Nicole Finkbeiner for use
with University of Maryland Extension –12/12.
Permission to use this assessment tool for non-profit educational purposes is granted with use of the following
citation:

Finkbeiner, N. & Braun, B. (March, 2013). University of Maryland Extension materials assessment tool. College
Park, MD: University of Maryland Extension.

Extension Materials Assessment Tool - SAM SCORING SHEET
2 points for superior rating
1 point for adequate rating
0 points for not suitable rating
N/A if the factor does not apply to this material

FACTOR TO BE RATED SCORE COMMENTS


1. CONTENT
(a) Purpose is evident

(b) Content about behaviors

(c) Scope is limited

(d) Summary or review included

2. LITERACY DEMAND
(a) Reading grade level

(b) Writing style, active voice

(c) Vocabulary uses common words

(d) Context is given first

(e) Learning aids via “road signs”

3. GRAPHICS
(a) Cover graphics shows purpose

(b) Type of graphics or illustrations

(c) Relevance of illustrations

(d) Lists, tables, etc. explained

(e) Captions used for graphics

4. LAYOUT & TYPOGRAPHY


(a) Layout factors

- Illustrations are on the same page adjacent to the related text. Y / N
- Layout & sequence of information are consistent, making it easy for the reader to predict the flow of information. Y / N

- Visual cuing devices (shading, boxes, or arrows) are used to direct attention to specific points or key content. Y / N
- Adequate white space is used to reduce appearance of clutter. Y / N
- Use of color supports & is not distracting to the message. Viewers need not learn color codes to understand & use the message. Y / N
- Line length is 30-50 characters & spaces. Y / N
- There is a high contrast between type & paper. Y / N
- Paper has non-gloss or low-gloss surface. Y / N
(b) Typography

- Text type is in uppercase & lowercase serif (best) or sans-serif. Y / N
- Type size is at least 12 point. Y / N
- Typographic cues (bold, size, color) emphasize key points. Y / N
- No ALL CAPS for headers or running text. Y / N
(c) Subheads (“chunking”) used

5. LEARNING STIMULATION,
MOTIVATION
(a) Interaction used

(b) Behaviors are modeled & specific

(c) Motivation – self-efficacy

6. CULTURAL APPROPRIATENESS
(a) Match in logic, language, experience

(b) Cultural image and examples

Total SAM score:

Total possible score:

Percent score: %
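The scoring arithmetic above (2 points for superior, 1 for adequate, 0 for not suitable, with N/A factors excluded from the total possible score) can be sketched in a few lines. This is a minimal illustration; the function name and the list encoding of ratings are assumptions, not part of the tool itself:

```python
def sam_percent(ratings):
    """Compute a SAM percent score from a list of factor ratings.

    Each rating is 2 (superior), 1 (adequate), 0 (not suitable),
    or None (N/A). N/A factors are excluded from the possible total.
    """
    scored = [r for r in ratings if r is not None]
    total = sum(scored)                # total SAM score
    possible = 2 * len(scored)        # total possible score
    return round(100 * total / possible, 1)
```

For example, ratings of [2, 1, 0, None, 2] give a total of 5 out of a possible 8, or 62.5%.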

Permission to use this materials assessment tool scoring sheet is granted with the following citation: Finkbeiner, N. & Braun, B.
(March, 2013). University of Maryland Extension materials assessment tool scoring tool. College Park, MD: University of Maryland Extension.

Readability levels

Ensuring that educational materials are written at a reading level suitable for the targeted
audience requires an assessment of reading level. Many reading level assessments require
you to manually calculate the reading level (39). For example, the Fry formula, which can be
used with short documents, requires that you:

o Randomly choose three samples from your document of 100 words apiece
o Count the number of sentences in each of the three passages, estimating the
fraction of the last sentence to the nearest 1/10th
o Count the number of syllables in the 100-word passages
o Find the average number of sentences and the average number of syllables for the
three samples, by dividing the total of all three samples by three
o Compare to the Fry chart to assess grade level
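The counting and averaging steps above can be sketched in code. This is a rough illustration under stated assumptions: the function names are made up, the vowel-group syllable counter is only a heuristic, and the final grade level still comes from reading the Fry chart by hand:

```python
import re

def count_syllables(word):
    """Rough heuristic: count groups of consecutive vowels;
    every word has at least one syllable."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def fry_counts(sample):
    """Return (sentences, syllables) for one ~100-word sample.

    Sentences are counted as whole numbers here; the Fry method
    also asks you to estimate the fraction of a final, incomplete
    sentence to the nearest 1/10th.
    """
    words = re.findall(r"[A-Za-z']+", sample)
    sentences = len(re.findall(r"[.!?]+", sample))
    syllables = sum(count_syllables(w) for w in words)
    return sentences, syllables

def fry_averages(samples):
    """Average sentence and syllable counts across the three samples."""
    counts = [fry_counts(s) for s in samples]
    avg_sentences = sum(c[0] for c in counts) / len(counts)
    avg_syllables = sum(c[1] for c in counts) / len(counts)
    return avg_sentences, avg_syllables
```

The two averages are then located on the Fry chart to read off the estimated grade level.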

Computer formulas are much easier to use than hand-calculated formulas, but they pose
problems of their own (40):

o The material often needs to be “prepared” before using a computer-based
formula
 If not already in computer format, you would have to “key it into” the
computer-based method
 Punctuation marks that occur in the middle of sentences need to be
removed (for example, the period in a number such as 84.5% implies to
the computer that the sentence is ending)
o Problems with measurement and unreliability in computer programs
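The “preparation” step described above, removing mid-sentence periods such as decimal points so a computer formula does not miscount sentences, can be sketched as follows (the function name is an illustrative assumption):

```python
import re

def prepare_text(text):
    """Strip periods that sit between two digits (e.g., the decimal
    point in 84.5) so they are not miscounted as sentence endings."""
    return re.sub(r"(?<=\d)\.(?=\d)", "", text)
```

Sentence-ending periods are left untouched; only digit-to-digit periods are removed.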

Prepared by Nicole Finkbeiner, MS, for Dr. Bonnie Braun


University of Maryland Extension
12/12.

Acknowledgements:

The authors want to thank the University of Maryland Health Smart Team and the Health
Insurance Literacy Initiative team for early testing of the assessment tools; Extension
educators who participated in the pre-test of the Smart Choice Health Insurance© curriculum;
and two external reviewers: Dr. Michael Lambur and Dr. Debra Davis.

Suggested Citation:

Braun, B., McCoy, T., and Finkbeiner, N. (2014). Extension education theoretical framework
with criterion-referenced assessment tools. College Park, MD: University of Maryland
Extension.

Bonnie Braun, PhD, Teresa McCoy, MPA, & Nicole Finkbeiner, MS.
This publication, Extension Manual EM-02-2013, is part of a series of publications of the University of Maryland Extension and the Department of Family Science, School of
Public Health. The information presented has met UME peer review standards, including internal and external technical review. For more information on related publications
and programs, visit: extension.umd.edu/insure. Please visit http://extension.umd.edu/ to find out more about Extension programs in Maryland.

Issued in furtherance of Cooperative Extension work, acts of May 8 and June 30, 1914, in cooperation with the U.S. Department of Agriculture, University of Maryland,
College Park, and local governments. Cheng-i Wei, Director of University of Maryland Extension. The University of Maryland is equal opportunity. The University’s policies,
programs, and activities are in conformance with pertinent Federal and State laws and regulations on nondiscrimination regarding race, color, religion, age, national origin,
gender, sexual orientation, marital or parental status, or disability. Inquiries regarding compliance with Title VI of the Civil Rights Act of 1964, as amended; Title IX of the
Educational Amendments; Section 504 of the Rehabilitation Act of 1973; and the Americans With Disabilities Act of 1990; or related legal requirements should be directed to
the Director of Human Resources Management, Office of the Dean, College of Agriculture and Natural Resources, Symons Hall, College Park, MD 20742.
