Quality Assurance: Monitoring and Evaluation to Inform Practice and Leadership

[Figure: The Education Transformation Framework wheel, highlighting Quality Assurance: Monitoring and Evaluation to Inform Practice and Leadership among the framework's components: Establishing a Vision; Leadership and Policy; Curriculum and Assessment; Teacher and Leader Capacity; 21st Century Pedagogy; Developing a Learning Community; Designing Physical Learning Environments; Efficient and Effective Schools; Organizational Capacity, Strategic Planning and Quality Assurance; Partnerships and Capacity Building; Personalized Learning; Inclusion, Accessibility and Sustainability.]
Introduction

This paper examines one of ten critical components of effective transformation in schools and education systems. Each paper is produced by an expert author, who presents a global perspective on their topic through current thinking and evidence from research and practice, as well as showcase examples. Together, the papers document the contributions of 'anytime, anywhere' approaches to K-12 learning and explore the potential of new technology for transforming learning outcomes for students and their communities.

Quality Assurance: Monitoring and Evaluation to Inform Practice and Leadership

This paper provides monitoring and evaluation guides and examples for leaders. Monitoring and evaluation is used by governments worldwide to improve school systems and educational results – and it can play an integral role in holistic education transformation.

Education leaders at all levels can benefit from applying the planning, monitoring and evaluation cycle and outcomes-based planning and evaluation to education transformation initiatives. Monitoring and evaluation can help educational transformation programs define and measure quality indicators and measures of the education transformation process, gauge progress toward desired educational outcomes, increase stakeholder participation, and empower school leaders and teachers to build and sustain transformation in schools.

As each educational system is unique, evaluators should be prepared to vary their evaluation approach based on program purpose and context. Technology is playing an increasingly important role in increasing data access, and serves as a tool for school leaders and teachers to inform instruction and improve student outcomes in education transformation initiatives.

What is the Education Transformation Framework?

The Microsoft Education Transformation Framework helps fast track system-wide transformation by summarizing decades of quality research. It includes a library of supporting materials for ten components of transformation, each underpinned by an executive summary and an academic whitepaper detailing global evidence. This provides a short-cut to best practice, speeding up transformation and avoiding the mistakes of the past. Microsoft also offers technology architectures and collaborative workshops to suit your needs.

About the author

Dr. Tom Clark, President, TA Consulting

Dr. Tom Clark, President of TA Consulting, provides evaluation, research, and strategic services for education clients. He has led evaluations of a wide range of online, blended and distance learning programs, from district-wide and state-wide programs to complex multi-state and postgraduate programs. He also provides strategic consultation, developing policy analysis and whitepapers for clients. Dr. Clark is an accomplished author, having co-written one of the first American textbooks on postsecondary distance learning, and authored one of the seminal works in K-12 online learning.
The first steps to success

How can I help ensure our transformation is a success?

When you're looking to ensure a successful and sustainable education transformation initiative, monitoring and evaluation for quality assurance (M&E) plays an important role. According to James & Miller, "the M&E process should be an integral component of any planned ICT in education program and should be factored into planning before a project starts."1 Furthermore, planning for M&E is considered one of the ten critical components needed to bring about educational transformation.2

What does monitoring and evaluation achieve?

M&E can help kickstart education programs, by:
• Developing clear, attainable outcomes and goals for education transformation, and flexible strategies for achieving them
• Promoting high levels of engagement by local school stakeholders
• Promoting ongoing communication about roles, expectations, progress, and performance
• Documenting program success for educational stakeholders and funders.

M&E can keep education programs on track, by:
• Monitoring program implementation and progress toward desired outcomes
• Helping programs identify and remedy implementation problems early on
• Helping sustain effective program implementation over time
• Helping staff, teachers and partners learn from their experiences, allowing them to make more informed decisions, be accountable, and reposition their efforts.3

1 James, T., & Miller, J. (2005). Developing a monitoring and evaluation plan for ICT. In Wagner, D. A., Day, B., Jones, T., Kozma, R. B., Miller, J., & Unwin, T. (Eds.), Monitoring and evaluation of ICT in education projects (pp. 32-42). Washington, DC: World Bank.
2 Cavanaugh, C., McCarthy, A., & East, M. (2014). An innovation framework for holistic school transformation: ten critical conversations for the 21st century. Redmond, WA: Microsoft World Public Sector.
3 United Nations Development Programme. (2009). Handbook on planning, monitoring and evaluating for development results. New York.



Management
Strategies
Results-Based Management What exactly do you mean? Poor implementation is Logical Framework Approach that affect its operations and success
is critical”.4 This realization led to the
scheme for ICT in education includes
1) input indicators, 2) process indicators,
Together with planning, M&E creates the Planning is addresses in more depth usually a direct result of A first step in building an effective development of the logic model, which 3) outcomes, 4) assessment instruments,
Planning, Management and Evaluation in the first and second white papers on education transformation program is
cycle. A number of management Vision and Enabling Transformation the ‘policy lever approach,’ defining desired results or outcomes
provides a theory of action for how the
program is intended to work. Some
and 5) a monitoring and evaluation plan.6
The plan is the most critical part, as it
approaches incorporate this cycle. with Strategic Planning. but poor alignment is (the measurable changes in people or agencies refer to logic models as logical includes a schedule for monitoring the
Results-Based Management (RBM) is one organizations) that will help address
of the most widely known, used by many
Monitoring may be defined as often blamed. critical needs or problems that fall
frameworks or “logframes”.5 indicators, administering the assessments
and undertaking evaluation activities.
“an ongoing process by which The example logic model for
international development agencies. within your organizational mission.
stakeholders obtain regular feedback Evaluation may be formative, educational transformation shows a
In management approaches like RMB, Thinking about your program needs
on the progress being made towards providing feedback for improvement, situation (what the program is intended
stakeholders create a vision, define
desired results, plan the project, monitor
goals and objectives.” Monitoring is or summative, assessing merit or
and desired results should precede
planning of program, monitoring or to address), inputs (existing resources),
A Logical Framework
implementation, then evaluate whether
an important source of information
for program evaluation.
worth. It may be internal, conducted evaluation activities. processes (program activities and (Logframe) lays out the
by program staff such as ‘M&E Officers’ services), outputs (who and how many
desired results were achieved and make
improvements or changes as necessary. Evaluation is defined by Patton as in development programs, or external, Here is an example logic model for an are served), and outcomes (short,
different types of events
Here is an illustrated guide to the cycle, “the systematic collection of conducted by outside evaluators who education transformation initiative that
is tackling low educational attainment
medium, and long-range changes in that take place as a project
provide third party validation or examine people and organizations attributed
as used by the United Nations: information to make judgments,
improve program effectiveness and/ questions of special interest. due to limited educational access: Once at least in part to the program). is implemented: Inputs,
or generate knowledge to inform
you have desired outcomes that address
the problem, it’s time to consider Monitoring and evaluation schemes Processes, Outputs
decisions about future programs.”
how the program will achieve them. typically follow such an approach, and Outcomes.
“Identifying the theory underlying the whether or not they have an explicit
program and the contextual factors logic model or logframe. A typical M&E

Results-based Situation
management (RBM) is a Inputs Activities Outputs

Evaluation Setting Planning management strategy that What fuels the program: What the program does: Who the program serves:
the vision
uses feedback loops to • Funding • School leader training • Students
• Staff Partners • Tech infrastructure building • Teachers
achieve strategic goals. • Facilities • Teacher PD & Peer Mentoring • Leaders
Managing
Defining • Web & Content Development • School and Districts
the results
and using
map and R&M
evaluation
Stakeholder framework Outcomes
participation
Short-term Medium-term Long-term

Improvements in: Improvements in: Improvements in:


Implementing Planning for • Teacher knowledge & skills • School leader and teacher practices • Teacher retention
and using monitoring and • Student attitudes • Student learning • Educational attainment
monitoring evaluation • School technology access • School Policies • School effectiveness
• School innovation level

Monitoring

Source: United Nations Development Program (2009), p.10. 4 Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (2010). Handbook of practical program evaluation. 3rd Ed. San Francisco: Jossey-Bass.
5 World Bank. (2004). Monitoring & evaluation: some tools, methods & approaches. Washington, DC.
6 Rodríguez, P., Nussbaum, M., López, X., & Sepúlveda, M. (2010). A Monitoring and evaluation scheme for an ICT-supported education program in schools.
Educational Technology & Society, 13 (2), 166–179.
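The logic model just described lends itself to a simple structured representation. The sketch below is a hypothetical Python illustration of the paper's example logic model; the class and field names are illustrative choices of this sketch, not part of any standard M&E toolkit.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    """A minimal logic model: situation, inputs, activities, outputs, outcomes."""
    situation: str
    inputs: list        # existing resources
    activities: list    # program activities and services
    outputs: list       # who (and how many) are served
    outcomes: dict      # changes, keyed by time horizon

# Populated from the paper's example logic model for an initiative
# tackling low educational attainment due to limited access.
model = LogicModel(
    situation="Low educational attainment due to limited educational access",
    inputs=["Funding", "Staff", "Partners", "Facilities"],
    activities=["School leader training", "Tech infrastructure building",
                "Teacher PD & peer mentoring", "Web & content development"],
    outputs=["Students", "Teachers", "Leaders", "Schools and districts"],
    outcomes={
        "short-term": ["Teacher knowledge & skills", "Student attitudes",
                       "School technology access"],
        "medium-term": ["School leader and teacher practices",
                        "Student learning", "School policies",
                        "School innovation level"],
        "long-term": ["Teacher retention", "Educational attainment",
                      "School effectiveness"],
    },
)

# An M&E plan would then attach indicators, assessment instruments and a
# monitoring schedule to each element; here we only check completeness.
assert all([model.situation, model.inputs, model.activities,
            model.outputs, model.outcomes])
```

In practice, each element of such a structure would be paired with the input and process indicators, assessment instruments, and monitoring schedule that the M&E scheme described above requires.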

What to watch out for

Challenges to effective monitoring and evaluation

While outcomes-based models and results-based management can be valuable tools, how they are implemented impacts their effectiveness as methods for managing education transformation. MANGO, a UK charity that works in international development, identifies a number of ways in which logic models/logframes and results-based management may fail to be used effectively:7
• Planners may assume that complex social issues can be reduced to a simple overarching problem, but often they cannot. Also, some important goals cannot be easily measured.
• Results and impact may be beyond the agency's control, take years to achieve, and be influenced by many factors besides the program.
• There may be multiple viewpoints about problems and competing interests, not a single view or interest easily expressed as an outcome.
• Logframes may focus the project on the agency's actions rather than the local program and the people served, and tend to exclude local people from planning, especially marginalized people.
• Initial plans are never completely accurate, and circumstances and priorities may change, but logframes may reduce flexibility to make changes later to fit project realities.
• Logframes are often not used by field staff after initial planning, because they do not fit how the project actually works on the ground.

Lessons learned and effective practices

To make monitoring and evaluation effective in education transformation:
• Desired program outcomes should be developed with local school leader and teacher input, be realistic, and, where possible, within the control of the program.
• Strategies and activities selected to attain desired outcomes should be flexible and open to revision as needed by empowered local program managers and school leaders, to reflect the evolving program in practice.
• Monitoring and evaluation in local schools should be participatory, to build local buy-in and capacity to sustain an effective program.
• Planning for monitoring and evaluation should start early on, for example by identifying key data indicators and how data will be gathered.
• Finally, the program can benefit from distributing information on these measures throughout the program's implementation, rather than just at the end.8

According to Kozma & Wagner's analysis of prior international and national monitoring and evaluation studies of ICT use, some of the most effective practices for evaluators are as follows:8
• Program evaluations should concentrate on [outcome] measures of student and teacher learning. The … most sensitive are those custom-designed for the program … Include measures of learning that are likely to result from the systematic use of ICT.
• Evaluators need to document and measure baseline inputs to the program.
• Evaluators need to acknowledge and describe the educational, technological, social, and economic factors that enable and constrain… the program.
• Direct measures of M&E indicators are the most reliable sources of information. They also tend to be the most costly. The reliability of [indirect] measures can be increased by obtaining information from multiple sources.

Role of evaluation in government

According to Wholey, "evaluation is used in government to increase transparency, strengthen accountability, and improve performance," where performance management systems "establish outcome-oriented goals and performance targets, monitor progress, stimulate performance improvements, and communicate results to higher policy levels and the public."9

In the U.S., the Government Performance and Results Act of 1993 (GPRA) extended performance evaluation to all federal agencies, while No Child Left Behind (2001) extended it to public schools across the nation. Since 2008, the Obama administration has focused on effective use of performance information in evaluation, appointing the first Chief Performance Officer and developing a federal performance portal (www.performance.gov) that outlines cross-agency goals, such as improving STEM education through the collaborative efforts of 16 federal offices.

Boyle examined governmental performance evaluation in Australia, Canada, Ireland, and the U.S., finding marked differences in practice, but a shared emphasis on monitoring outcomes and outputs, rather than activities. He found that performance indicators developed by U.S. agencies were more likely to meet SMART criteria (Specific, Measurable, Achievable, Relevant, Time-Bound) for effective indicator design (Doran, 1981), but also that many of these indicators were "aspirational" in nature – that is, beyond the direct control of the agency to achieve.10

Chelimsky notes the ongoing cultural clash between the evaluation and political worlds, and limits on evaluator independence within governmental evaluation, while citing the ongoing role of evaluation as a trustworthy, dependable tool for preserving public accountability.11

7 MANGO. (2014). What is wrong with results-based management? Oxford, UK.
8 Kozma, R. B., & Wagner, D. A. (2005). Core indicators for monitoring and evaluation studies in ICTs for education. In Wagner, D. A., Day, B., Jones, T., Kozma, R. B., Miller, J., & Unwin, T. (Eds.), Monitoring and evaluation of ICT in education projects (pp. 21-31). Washington, DC: World Bank.
9 Wholey, J. S. (2010). Use of evaluation in government: the politics of evaluation. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 651-667). San Francisco: Jossey-Bass. (pp. 652-654.)
10 Boyle, R. (2009). Performance reporting: insights from international practice. Washington, DC: IBM Center for the Business of Government.
11 Chelimsky, E. (2008). Clash of cultures: improving the "fit" between evaluative independence and the political requirements of a democratic society. American Journal of Evaluation, 29(4), 400-415.
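The SMART criteria that Boyle applied can be read as a mechanical checklist for indicator design. The sketch below is purely illustrative; the field names and the example indicator are hypothetical, not taken from Boyle (2009) or Doran (1981).

```python
# Hypothetical screening of a performance indicator against SMART criteria
# (Specific, Measurable, Achievable, Relevant, Time-bound). Field names
# are illustrative, not drawn from the cited sources.

SMART_FIELDS = ("specific", "measurable", "achievable", "relevant", "time_bound")

def smart_gaps(indicator: dict) -> list:
    """Return the SMART criteria an indicator fails to satisfy."""
    return [f for f in SMART_FIELDS if not indicator.get(f)]

# An "aspirational" indicator, in Boyle's sense: a desired societal
# outcome that is beyond the agency's direct control to achieve.
aspirational = {
    "name": "All students ready for college and career",
    "specific": True,
    "measurable": True,
    "achievable": False,   # outside the agency's direct control
    "relevant": True,
    "time_bound": False,   # no target date attached
}

print(smart_gaps(aspirational))  # -> ['achievable', 'time_bound']
```

A screen like this would flag Boyle's "aspirational" indicators early, before they are baked into a monitoring plan.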
Evaluation Strategies

Evaluation theory models

In the evaluation literature, the term "theory" primarily refers to models or approaches which specify how an evaluation should be carried out. To date, a limited amount of evaluation theory is based on empirical study of what works in evaluation. Until such empirical theory is developed, according to Alkin, "we must rely on the prescriptive models generated by knowledgeable members of the evaluation community to guide practice."12

Alkin sees differences between the models as mainly falling into three camps: a relative emphasis on either methods, value, or use (see table). Evaluators tend to follow a model that makes sense to them intellectually, but should be prepared to vary their approach based on the purpose of the evaluation and program context.

What's the difference between methods-, use- and value-focused approaches?

Smith, Mitton, Cornelissen, Gibson & Peacock have developed the following key differentiations:13
• A methods-focused evaluation might examine program effectiveness in terms of whether test scores improved as a result of program participation, by conducting a controlled research experiment or quasi-experiment, and rendering an expert judgment on whether the program caused the desired outcomes.
• A use-focused evaluation might look at what the program manager needs to know to improve the program or demonstrate progress, gathering multiple sources of evidence and recommending improvements that the program manager may decide to implement.
• A value-focused evaluation might explore a program's impact on equity or social justice issues. Here the public is the ultimate judge. It might also actively empower local staff and stakeholders to ensure education transformation fits local needs and interests.

How do we use the best of each approach?

An evaluation of holistic school reform requires elements of all three approaches:
• Use-focused approaches may be the best primary emphasis for a program evaluation of education transformation at the system level.14
• Methods-focused research studies may be embedded to determine the causal impact of key program features. In this case, staff training is needed on fidelity of implementation of key features, and the importance of adhering to research protocols.
• Value-focused evaluation components should be included that actively involve local school staff and stakeholders – to connect program theory and practice, ensure equitable participation for marginalized people, and build local capacity for sustainability.

Differences in the approaches of methods-focused, use-focused and value-focused evaluation theories:

Key questions
• Methods-focused: What is the program's causal impact on desired outcomes?
• Use-focused: What do decision makers need to know to improve program usefulness?
• Value-focused: How do program processes affect the relative standing of different groups?

Evaluation focus
• Methods-focused: intended objectives or outcomes; program theory; summative evaluation
• Use-focused: process; intended use; organizational learning and capacity building; formative evaluation
• Value-focused: process; unintended outcomes; power relationships; equity and social justice

Who primarily judges the program's benefits?
• Methods-focused: the evaluator
• Use-focused: the decision maker
• Value-focused: the public/society

Common methodologies
• Methods-focused (post-positivist): controlled experiments where possible; outcome measurement; quantitative data
• Use-focused (pragmatic): multiple sources of evidence and methods; interviews/focus groups; quantitative and qualitative data
• Value-focused (constructivist): critical or participatory methods; action research; qualitative data

12 Alkin, M. C. (2012). Comparing evaluation points of view. In M. C. Alkin (Ed.), Evaluation roots: A wider perspective of theorists' views and influences (pp. 3-10). New York: Sage Publications.
13 Hall, W. (2013). Development and implementation of a priority setting and resource allocation evaluation tool for achieving high performance (Master's thesis, University of British Columbia).
14 Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage Publications.

Seeing Success
Conditions that promote • T
 eacher collaboration that focuses
on peer support and the sharing
implementation factors most
positively linked to student success.
Using Monitoring education outcomes such as student
proficiency, dropout and graduation
Schools succeed when
educational transformation and Evaluation to improve
of teaching practices Schools with these factors in rates, achievement gaps between they foster a culture of
Several recent studies have identified school outcomes
key indicators of success in educational
• P
 rofessional development that place were more likely to report
improvements in student outcomes,
student subgroups, and readiness for
postsecondary education and careers.
visionary innovation,
involves the active and direct Most emerging nations have established
transformation. Together, these engagement of teachers, particularly such as decreased disciplinary rates national education management Use of measures of individual student teacher collaboration and
research studies provide evidence that
over time, well-planned technology-
in practicing and researching new and increased test scores.17 information systems, but data gathering
is often unsystematic and data use in
learning growth over time, already in
place in some nations, can improve the
professional development.
teaching methods.15 Cavanaugh, Hargis, Soto & Kamali
enhanced initiatives can help schools school improvement is limited. Nayyar- accuracy of outcomes data.
and education systems achieve propose a holistic framework for large- Online assessments and analytical tools
Based on his experiences working with the Stone describes the efforts of Pakistan’s
education transformation. scale mobile/cloud learning programs High-stakes student testing is moving have the potential to remove some of
Canadian province of Ontario and review education ministry to use national data to
that groups elements with the potential online in nations including Denmark, a these barriers to effective use of formative
SRI International and in-country of ITL research, Fullan believes that whole identify low-performing schools and train
to transform education under three trend that is likely to spread. In the U. S., evaluation to inform instruction.
evaluators in seven nations (Australia, system reform can be accomplished in district officials in development of action
pillars: learning environment, curriculum/ two assessment consortia, PARCC and Revenaugh (in press) described data-
England, Finland, Indonesia, Mexico, reasonably short periods of time.16 He plans for school improvement.19 In the
content, and pedagogy/leadership.18 SBAC, have developed online student driven personalized learning in a blended
Russia, and Senegal) jointly conducted suggests that education systems establish future, school-level data systems linked
assessment systems for state-level use. school, where student mastery data from
a study of innovative teaching and a small number of ambitious goals and The Project Red table below shows the to central systems might give local school
Their testing regimes are aligned with online formative assessments within
learning or ITL.15 In each nation, they a set of coherent, integrated actions to nine key factors identified by Project RED leaders the ability to directly access secure
the new Common Core State Standards, course content is used by teachers identify
surveyed teachers and school leaders pursue them, thereby fostering: as most predictive of education success data on student achievement.
which define the knowledge and skills learning deficits and dynamically group
in 24 schools, half identified through aligned with this holistic framework.
• C
 ollaborative, focused school It is already commonplace in developed students need to be prepared upon students with similar learning needs
prior research as innovative and half Together they illustrate the importance
culture or “collective capacity” nations for school leaders to engage in graduation for college and career in for on-site learning activities. Dawson
as comparison schools. They also of each pillar for successful educational
as the foundation data-driven decision making for school the 21st century. Like the international describes collaborative use of an action
conducted case studies of a sample of transformation. These nine factors also
improvement. However, large-scale data PISA test, both require higher order research tool by teachers in a statewide
the innovative schools in each nation. • A
 new role for the principal as correspond well to the findings of ITL
use by teachers is more limited. Initiatives thinking and student performance tasks. 1:1 computing project to improve practice.
They found that across nations, lead learner and supporter researchers and Fullan about conditions
in two U. S. states used embedded Online testing is pushing U. S. schools Teachers can also find a growing number
certain practices or conditions were • L ead teachers as supportive, promoting education transformation.
professional development time, to upgrade technology infrastructures of formative assessment tools online. 22 For
more likely to be in place in schools collaborative peer mentors Key indicators of conditions that foster professional learning communities, and just as they are moving to more rigorous example, the PARCC and SBAC consortia
where innovative teaching practices • A
 doption of concrete, innovative education transformation – supportive data coaches to train teachers in using standards and assessments. both offer optional formative assessments
occurred, including: teaching practices. school culture, participatory principal annual summative testing data to inform aligned to new standards.
• A
 school culture that offers a leadership, collaborative peer mentoring, instruction, with some success. 20
Using formative data Student activity in online coursework
common vision of innovation as Project Red researchers surveyed engaged professional learning, and
can provide a wealth of data for use in
well as consistent support that a purposive sample of 997 schools adoption of new innovative teaching to increase student success monitoring and evaluation. For example,
encourages new types of teaching across the United States that had 1:1 methods – need to be agreed upon Project Red surveyed 997 A focus is emerging on using ongoing patterns of student logins and ‘clicks” in
computing or large-scale technology and tracked as part of the monitoring
integration initiatives in place. and evaluation of related national and U.S. schools to identify the formative assessment data to inform the online system can predict whether
classroom instruction. This is more students will successfully complete an
They identified nine key international initiatives. nine key implementation useful for classroom teachers than online learning activity. 23 Data mining
factors most positively access to summative assessment data, within online simulations can be used
as it allows them to address student to formatively assess student scientific
Project Red (2013) key factors linked to student success. learning gaps early on. 20 However, inquiry skills. 24 Those engaged in
Swan and Mazur found that pre-service monitoring and evaluation can use these
During the monitoring and evaluation teachers often lacked the skills and new kinds of online educational data to
Quality indicators and Pillar 1 Pillar 2 Pillar 3 process, evaluators and decision- time needed to design, implement and recommend formative improvements
measures for educational
transformation
Learning Curriculum Pedagogy and makers can use available data at the analyze formative assessments and use to learning systems and policies that
national and local levels to analyze key
environment and content leadership them to personalize instruction. 21 support student success.

Nine factors most strongly 3. Students collaborate 1. Technology 2. Leaders provide time
linked to educational online daily integrated into every for teacher professional 15 Shear, L., Gallagher, L., & Patel, D. (2011). Innovative Teaching and Learning Research. Menlo Park: SRI International.
success in transformation
16. Fullan, M. (2011). Whole system reform for innovative teaching and learning. In Microsoft-ITL Research (Ed.), Innovative Teaching and Learning Research (pp. 30-39).
17. Greaves, T. W., Hayes, J., Wilson, L., Gielniak, M., & Peterson, E. L. (2013). Revolutionizing education through technology: the Project RED roadmap for transformation. Eugene, OR: International Society for Technology in Education.
18. Cavanaugh, C., Hargis, J., Munns, S., & Kamali, T. (2012). iCelebrate teaching and learning: Sharing the iPad experience. Journal of Teaching and Learning with Technology, 1(2), 1-12.
19. Nayyar-Stone, R. (2014). Using national education management information systems to make local service improvements: the case of Pakistan. PREM Notes, Number 30. Washington, DC: The World Bank.
20. McCann, C., & Kabaker, J. C. (2013). Promoting data in the classroom: innovative state models and missed opportunities. Washington, DC: New America Foundation.
21. Swan, G., & Mazur, J. (2011). Examining data driven decision making via formative assessment. Contemporary Issues in Technology and Teacher Education, 11(2), 205-222.
22. Dawson, K. (2012). Using action research projects to examine teacher technology integration practices. Journal of Digital Learning in Teacher Education, 28(3), 117-124.
23. Liu, F., & Cavanaugh, C. (2011). High enrollment course success factors in virtual school: factors influencing student academic achievement. International Journal on E-Learning, 10(4), 393-418.
24. Gobert, J. D., Sao Pedro, M., Razuiddin, J., & Baker, R. S. (2013). From log file to assessment metrics: measuring students' science inquiry skills using educational data mining. The Journal of the Learning Sciences, 22, 521-563.

Key implementation factors in intervention class development and 1:1 initiatives (rank ordered by predictive strength):

1. Technology is integrated into every intervention class
2. Teacher professional development and collaboration at least monthly
3. Students use technology daily for online collaboration
4. Technology is integrated into core curricula at least weekly
5. Online formative assessments performed at least weekly
6. 1:1 computer ratios, or close to 1:1
7. Virtual field trips at least monthly
8. Students use search engines daily
9. Principals are trained in teacher buy-in, best practices, and technology-transformed learning

Source: Cavanaugh, Hargis, Soto & Kamali, 2013; Greaves, Hayes, Wilson, Gielniak, & Peterson, 2013
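Rank-ordered implementation factors like these can feed directly into a school's own monitoring routine. The sketch below is illustrative only, not part of the cited studies: the factor wording is paraphrased from the Greaves et al. (2013) list, and the inverse-rank weighting is an assumption made for demonstration.

```python
# Illustrative sketch: scoring a school against rank-ordered implementation
# factors. Factor wording paraphrases Greaves et al. (2013); the inverse-rank
# weighting scheme is an assumption, not part of the source study.

FACTORS = [  # rank 1 = most predictive
    "Technology is integrated into every intervention class",
    "Teacher professional development and collaboration at least monthly",
    "Students use technology daily for online collaboration",
    "Technology is integrated into core curricula at least weekly",
    "Online formative assessments performed at least weekly",
    "1:1 computer ratios, or close to 1:1",
    "Virtual field trips at least monthly",
    "Students use search engines daily",
    "Principals are trained in teacher buy-in and best practices",
]

def implementation_score(met: set[int]) -> float:
    """Weighted share of factors met, weighting rank 1 (most predictive) highest."""
    weights = [1 / rank for rank in range(1, len(FACTORS) + 1)]
    achieved = sum(w for rank, w in enumerate(weights, start=1) if rank in met)
    return round(achieved / sum(weights), 2)

def gaps(met: set[int]) -> list[str]:
    """Unmet factors, most predictive first -- where to focus next."""
    return [f for rank, f in enumerate(FACTORS, start=1) if rank not in met]

# Example: a school currently meeting factors 4, 5, 6 and 8
print(implementation_score({4, 5, 6, 8}))
print(gaps({4, 5, 6, 8})[0])  # highest-leverage gap to address
```

Ordering the gap report by predictive rank mirrors the point of rank-ordering the factors in the first place: leaders see the highest-leverage unmet condition first.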

Conclusion

Monitoring and evaluation are key tools for education leaders as they seek to bring about educational transformation at the national, regional and local levels. Together with planning, they are part of well-established program management cycles and theories of program action that center on desired outcomes: meaningful changes in the lives of people and in organizations.

There has been considerable research on the conditions that promote educational transformation. Key indicators of educational transformation need to be agreed upon and used in program planning, monitoring and evaluation. Nations differ widely in their use of technology-enabled educational data to inform leadership and practice in schools, but the trend is toward building local capacity: for school leaders to use summative assessment data to inform school improvement, and for teachers to use formative assessment data to inform instruction.

Student assessment is moving online and is increasingly aligned to new, more rigorous learning standards that require 21st century skills. Better access to national and local summative assessment data, and to new types of online formative assessment data, can help those engaged in monitoring and evaluation document student outcomes and study "what works" in educational transformation initiatives.

While monitoring and evaluation can be misused, when used effectively they can be inclusive and empowering: engaging stakeholders, informing school leadership and teaching practice, and documenting progress toward educational transformation and student success.

Evaluators of education transformation initiatives may want to consider combining different types of evaluation: use-focused evaluation that informs program managers, methods-focused evaluation that helps demonstrate causal impact, and values-focused evaluation that helps ensure equitable access and empowers local school staff to carry out transformation.

References

Alkin, M. C. (2012). Comparing evaluation points of view. In M. C. Alkin (Ed.), Evaluation roots: A wider perspective of theorists' views and influences (pp. 3-10). New York: Sage Publications.

Barbour, M., & Clark, T. (in press). Online, blended and distance education in schools: summary thoughts. In T. Clark & M. K. Barbour (Eds.), Online, blended and distance education in schools. Sterling, VA: Stylus Publications.

Boyle, R. (2009). Performance reporting: insights from international practice. Washington, DC: IBM Center for the Business of Government.

Cavanaugh, C. (in press). Barriers to online learning in developing nations: the case of Nepal. In T. Clark & M. K. Barbour (Eds.), Online, blended and distance education in schools. Sterling, VA: Stylus Publications.

Cavanaugh, C., Hargis, J., Munns, S., & Kamali, T. (2012). iCelebrate teaching and learning: Sharing the iPad experience. Journal of Teaching and Learning with Technology, 1(2), 1-12.

Cavanaugh, C., McCarthy, A., & East, M. (2014). An innovation framework for holistic school transformation: ten critical conversations for the 21st century. Redmond, WA: Microsoft World Public Sector.

Chelimsky, E. (2008). Clash of cultures: improving the "fit" between evaluative independence and the political requirements of a democratic society. American Journal of Evaluation, 29(4), 400-415.

Dawson, K. (2012). Using action research projects to examine teacher technology integration practices. Journal of Digital Learning in Teacher Education, 28(3), 117-124.

Doran, G. T. (1981). There's a S.M.A.R.T. way to write management's goals and objectives. Management Review, 70(11), 35-36.

Fullan, M. (2011). Whole system reform for innovative teaching and learning. In Microsoft-ITL Research (Ed.), Innovative Teaching and Learning Research (pp. 30-39).

Gobert, J. D., Sao Pedro, M., Razuiddin, J., & Baker, R. S. (2013). From log file to assessment metrics: measuring students' science inquiry skills using educational data mining. The Journal of the Learning Sciences, 22, 521-563.

Greaves, T. W., Hayes, J., Wilson, L., Gielniak, M., & Peterson, E. L. (2013). Revolutionizing education through technology: the Project RED roadmap for transformation. Eugene, OR: International Society for Technology in Education.

Hall, W. (2013). Development and implementation of a priority setting and resource allocation evaluation tool for achieving high performance (Master's thesis, University of British Columbia).

James, T., & Miller, J. (2005). Developing a monitoring and evaluation plan for ICT. In D. A. Wagner, B. Day, T. Jones, R. B. Kozma, J. Miller, & T. Unwin (Eds.), Monitoring and evaluation of ICT in education projects (pp. 32-42). Washington, DC: World Bank.

Kozma, R. B., & Wagner, D. A. (2005). Core indicators for monitoring and evaluation studies in ICTs for education. In D. A. Wagner, B. Day, T. Jones, R. B. Kozma, J. Miller, & T. Unwin (Eds.), Monitoring and evaluation of ICT in education projects (pp. 21-31). Washington, DC: World Bank.

Liu, F., & Cavanaugh, C. (2011). High enrollment course success factors in virtual school: factors influencing student academic achievement. International Journal on E-Learning, 10(4), 393-418.

MANGO. (2014). What is wrong with results-based management? Oxford, UK: Author.

McCann, C., & Kabaker, J. C. (2013). Promoting data in the classroom: innovative state models and missed opportunities. Washington, DC: New America Foundation.

Nayyar-Stone, R. (2014). Using national education management information systems to make local service improvements: the case of Pakistan. PREM Notes, Number 30. Washington, DC: The World Bank.

Patton, M. Q. (1997). Utilization-focused evaluation (1st ed.). Thousand Oaks, CA: Sage Publications.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks, CA: Sage Publications.

Pelgrum, W. (2010). Indicators on ICT in primary and secondary education: results of an EU study. In F. Scheuermann & F. Pedro (Eds.), Assessing the effects of ICT in education (pp. 165-188). Luxembourg: European Commission/OECD.

Revenaugh, M. (in press). Building blended charters: a case study of the Nexus Academies, U.S. In T. Clark & M. K. Barbour (Eds.), Online, blended and distance education in schools. Sterling, VA: Stylus Publications.

Rodríguez, P., Nussbaum, M., López, X., & Sepúlveda, M. (2010). A monitoring and evaluation scheme for an ICT-supported education program in schools. Educational Technology & Society, 13(2), 166-179.

Shear, L., Gallagher, L., & Patel, D. (2011). Innovative Teaching and Learning Research. Menlo Park, CA: SRI International.

Smith, N., Mitton, C., Cornelissen, E., Gibson, J., & Peacock, S. (2012). Using evaluation theory in priority setting and resource allocation. Journal of Health Organization and Management, 26(5), 655-671.

Stufflebeam, D. L., & Shinkfield, A. J. (2007). Evaluation theory, models, and applications. San Francisco: Jossey-Bass.

Swan, G., & Mazur, J. (2011). Examining data driven decision making via formative assessment. Contemporary Issues in Technology and Teacher Education, 11(2), 205-222.

United Nations Development Programme. (2009). Handbook on planning, monitoring and evaluating for development results. New York: Author.

Wholey, J. S. (2010). Use of evaluation in government: the politics of evaluation. In J. S. Wholey, H. P. Hatry, & K. E. Newcomer (Eds.), Handbook of practical program evaluation (3rd ed., pp. 651-667). San Francisco: Jossey-Bass.

Wholey, J. S., Hatry, H. P., & Newcomer, K. E. (Eds.). (2010). Handbook of practical program evaluation (3rd ed.). San Francisco: Jossey-Bass.

Winthrop, R., & Smith, M. S. (2012). A new face of education: Bringing technology into the classroom in the developing world. Washington, DC: Brookings Institution.

World Bank. (2004). Monitoring & evaluation: some tools, methods & approaches. Washington, DC: Author.

Technologies schools can use to support change

Microsoft Surface devices equipped with Windows 10, Office 365 Education, OneNote and Moodle are being used to organize and monitor teaching and learning.

Online assessments are being managed with Windows Intune for cloud-based mobile device management, as well as on cross-platform mobile devices equipped with Office.

Microsoft SharePoint is being used to manage ePortfolio services and resources.

Developing your own change strategy

Guiding questions for strategic planning, organizational capacity and quality assurance:

• What are the current global trends relating to technology, education, and educational technology?
• How can a school measure its success, and what system of metrics should it employ?
• How does the management of a school relate to its ability to implement innovative practices?
• Does the vision reflect overarching expectations and philosophies?
• How ready is the community for change?
• What personalized training and professional development requirements should be considered?
• What benchmarking needs to be implemented to evaluate the vision before and after implementation?
• What are the Key Performance Indicators?
• What process will ensure quality assurance across content, professional development, leadership and academic results?
• What percentage of the education budget relates to 21st century learning?
• Are we settling for incremental improvements when we could be introducing innovation that will fundamentally transform learning?
• Are we targeting change in too few or too many areas?
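The benchmarking and KPI questions above can be made concrete with a small pre/post comparison: baseline each indicator before the vision is implemented, re-measure afterward, and report the change. The sketch below is hypothetical; the KPI names and figures are invented for illustration.

```python
# Hypothetical sketch of pre/post KPI benchmarking. KPI names and values
# are invented for illustration, not drawn from any cited study.

baseline = {  # measured before the vision is implemented
    "students_with_1to1_device_pct": 40,
    "teachers_using_formative_data_pct": 25,
    "budget_on_21st_century_learning_pct": 8,
}

followup = {  # measured after implementation
    "students_with_1to1_device_pct": 85,
    "teachers_using_formative_data_pct": 60,
    "budget_on_21st_century_learning_pct": 15,
}

def benchmark(before: dict, after: dict) -> dict:
    """Change in each KPI, in percentage points."""
    return {kpi: after[kpi] - before[kpi] for kpi in before}

for kpi, delta in benchmark(baseline, followup).items():
    print(f"{kpi}: {delta:+d} points")
```

Agreeing on the KPI list before implementation, as the guiding questions suggest, is what makes the pre/post comparison meaningful: the same indicators are measured the same way at both points.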

Interested in taking the next step on
your transformation journey?
Visit microsoft.com/education/leaders

© 2015 Microsoft Corporation. All rights reserved. Microsoft, Bing, Excel, Lync, Office, OneNote, PowerPoint, Skype,
Word, Windows and the Windows logo are either trademarks or registered trademarks of Microsoft in the United States
and/or other countries. Other product names may be trademarks of their respective owners. 18261-1115
