SCUP Collection

Employing Accreditation to Strengthen Planning and Drive Improvement
Introduction to Accreditation, by Belle S. Wheelan, PhD
The articles selected for this collection include both a retrospective and an
introduction to the field for readers new to the topic. The remaining articles
showcase how institutions intentionally employ accreditation at multiple levels
to strengthen integrated planning and propel ongoing quality improvement.
Introduction to Accreditation

Belle S. Wheelan, PhD
The president of a regional accrediting body shares her answers to frequently asked
questions about accreditation.
The length of time between reaffirmation visits varies among the accrediting
organizations and ranges from seven to ten years.
Each accrediting body has a different set of standards, but each assesses the
quality of faculty and administration, curriculum, learning resources, and
institutional governance as well as the institution’s financial stability.
This article first appeared in Planning for Higher Education, Issue V46N1, October–December 2017
In the end, accreditation is about signifying that our institutions have “a purpose
appropriate to higher education” and the “resources, programs, and services
sufficient to accomplish and sustain that purpose” (Southern Association of
Colleges and Schools Commission on Colleges 2017, 2) for the betterment of our
students, communities, and country.
References

Southern Association of Colleges and Schools Commission on Colleges. 2017. Welcome from the President. Accessed November 14, 2017: www.sacscoc.org/president.asp.

U.S. Department of Education. 2016. Federal Student Aid Annual Report FY 2016. Washington, DC: U.S. Department of Education. Accessed November 14, 2017: https://studentaid.ed.gov/sa/sites/default/files/FY_2016_Annual_Report_508.pdf.
Feature Article

Connecting the Dots: Accountability, Assessment, Analytics, and Accreditation

Linda L. Baer, PhD
Calls for accountability, outcomes assessment,
evidence of institutional performance, and
student success must be answered by integrated
planning and decision-making across higher
education.
There is now a renewed sense of urgency to improve accountability,
transparency, and performance in higher education—the result of a perfect
storm of state budget challenges, the ongoing transition from a manufacturing
to a knowledge economy, and the inability to appropriately articulate the
value of a postsecondary education. Stakeholders are demanding more from
higher education, searching for an overall return on this investment from the
student, state, and federal perspectives. People are now asking, is college worth
it? (Barone 2017; Dossani 2017). These challenges cannot be met with simple
changes. Institutions must strive to develop analytics or actionable intelligence
in all institutional areas—particularly those related to learning (Baer and
Campbell 2012). Strategic planning and decision-making in this ever-changing
environment are critical.
This article first appeared in Planning for Higher Education, Issue V46N1, October–December 2017
In the mid-1980s, states began to require colleges and universities to report on performance, and in the 1990s, Congress developed the Student Right-to-Know Act, which mandated significant new disclosure of information on graduation rates and school safety. Policy makers expressed further concerns regarding higher education outcomes as related to cost. In 1983, A Nation at Risk was published, which warned of declining learning standards in both primary and secondary schools (National Commission on Excellence in Education 1983). In 1986, concerned governors published a report titled Time for Results that extended the call to examine the quality of learning to the collegiate level (National Governors Association 1986). These efforts resulted in an important nationwide movement that led to the development of systematic research on student learning outcomes. Driven by the federal government, regional accreditors began to increase their emphasis on quality and improved graduation rates (Burke 2004; Carey 2007).

Ewell (2014, 1) noted,

For many years, judgements about “quality” in higher education were determined almost solely by institutional reputation, productivity, and factors such as fiscal, physical, and human resources. Regional accreditors, charged with examining the adequacy of public and independent institutions alike, looked mostly at the overall level of institutional resources and at internal shared-governance processes. Over the past three decades, however, interest on the part of external stakeholders in the actual academic performance of colleges and universities has steadily risen.

Ewell determined that there are a number of reasons for this. One is the growing emphasis on accountability, particularly as it relates to student learning outcomes. There is increased competition in higher education, and the environment is putting a premium on visible evidence of academic performance. In addition, the ongoing fiscal constraints under which most colleges and universities operate demand strong evidence-based academic management practices as much as fiscal discipline.

In 2006, the Secretary of Education’s Commission on the Future of Higher Education offered a strong indictment of American higher education (U.S. Department of Education 2006). The commission focused on costs that were too high, graduation rates that were too low, especially among low-income and minority students, and learning outcomes that remained a mystery. Overall, higher education responded as it had to previous calls for more accountability by developing strong defenses against the criticism and ultimately indicating that it was already accountable in many ways. In addition, leaders argued that higher education institutions are so diverse and unique that no single form of accountability could be used to assess all fairly (Carey 2007).

The No Child Left Behind legislation imposed unprecedented federal requirements on the K–12 system to use regularly administered standardized tests to document annual improvements in all student ethnic and socioeconomic subpopulations, and many thought higher education was next. However, since higher education still didn’t have standards by which to measure learning outcomes, the proposals focused on graduation rates.

Indeed, many panels, commissions, acts, and proposals were created to move toward a stronger sense of accountability. States tried to build accountability systems that mattered, and during the 1990s, the number of state-level accountability systems grew. Yet, in determining metrics, states often used information they already had, such as graduation rates, which resulted in mountains of data without context or meaning.

Mark Warner, chair of the National Governors Association in 2005, worked on a governors’ compact on high school graduation rates. He stated,

Clearly better data alone will not increase graduation rates or decrease dropout rates, but without better data states cannot adequately understand the nature of the challenge they confront. Knowing the scope of the problem, why students are leaving, and what their educational and personal needs are can help leaders target resources more effectively in support of those young people who are at-risk or who have already dropped out (National Governors Association 2005, 9).
• “The Data Quality Campaign, State Longitudinal Data Systems, and Common
Education Data Standards, all closely related initiatives to improve the
availability and quality of educational data.”
• “Common Core State Standards for College and Career Readiness, an initiative
to establish common learning objectives for K–12 education and assess
student achievement throughout elementary and secondary education in
order to promote more widespread attainment.”
• “What data, collected with what definitions and procedures, and what
combinations of data into metrics, will be credible and widely accepted?”
Figure 1 The Accountability Triangle

The Accountability Triangle (figure 1) provides insight into the expectations of multiple stakeholders and gives context to the environment in which the quest to improve accountability, assessment, analytics, and accreditation resides (Burke 2004).
Figure 2 Taxonomy of Terms Commonly Used in Connection with Student Learning Outcomes

Units of Analysis | Ways of Looking at Performance | Ways of Looking at Outcomes | Ways to Review Performance
Institution | Efficiency, Productivity, Effectiveness | Behaviors (Employment, Further Education, Career Mobility, Income) | Evaluation
Program | Output, Productivity | Satisfaction | Measurement, Indicator, Assessment
Student | Outcome | Learning (Knowledge, Skill, Ability, Attitude/Disposition), Attainment, Development | Evidence of Achievement (Examinations, Performances, Student Work)

Source: Ewell 2001, p. 8
The assessment movement, as Ewell (2009, abstract 1) characterized it, emerged in the mid-1980s from “the demand by policymakers for better and more transparent information about student and institutional performance, the press by accreditors on institutions to collect and use student learning outcomes data, and the availability of more and better assessment instruments and approaches.”

As noted by Schray (n.d., p. 6), “Many proponents of greater public accountability in higher education and accreditation argue that the most important evidence of quality is performance, especially the achievement of student learning outcomes. This has led to a number of national and state efforts to identify a broad range of performance indicators or measures including access, productivity and efficiency, student learning, degree completion, and economic returns from postsecondary education. Many of these performance measures and indicators are represented in ‘Measuring Up: The National Report Card on Higher Education.’”

Historically, higher education has relied on input measures such as student enrollment, number of faculty, investment in new buildings, and research grants and contracts received. A report by the Hechinger Institute on Education and the Media (n.d., p. 2) presented a similar conclusion: “Examinations of the quality of higher education usually focus on statistics representing the number of books in the library, the size of the endowment, test scores of incoming freshmen, graduation rates and the like.”

These metrics are lagging metrics since they report past activity and often have little direct bearing on student learning. Recently, more emphasis on outcomes-based performance measures has led to outcomes-based metrics such as student retention and persistence rates, graduation rates, rates of placement in jobs, and post-graduate income levels. However, these are still “after-the-fact” measures that are being used because of the lack of standardized learning outcomes data. Today, the use of analytics tools is bringing higher education more ability to monitor at-risk student behavior, factors related to persistence, and what interventions are working for which students. As outcomes-based education expands, more standards for learning are becoming available. The list of metrics now includes student success, student access and diversity, meeting workforce needs, and research and innovation that benefit the academic community and society (Miller 2016).
Over time, there have been many attempts to develop performance measures. National and state efforts have identified several measures including access, productivity and efficiency, degree completion, and economic returns from postsecondary education. For example, Measuring Up 2008: The National Report Card on Higher Education, authored by the National Center for Public Policy and Higher Education (2008, p. 4), focused on six measures that apply to sets of institutions, an entire community or state, or a set of communities. These measures by implication integrate social and economic conditions into the performance evaluation of postsecondary education. The key indicators were selected because they are broad gauges for understanding success in key performance areas:

• “Preparation for college: How well are high school students prepared to enroll in higher education and succeed in college-level courses?”

• “Participation: Do young people and working-age adults have access to opportunities for education and training beyond high school?”

• “Affordability: How difficult is it to pay for college when family income, the cost of attending college, and student financial aid are taken into account?”

• “Completion: Do students persist in and complete certificate and degree programs in college?”

• “Benefits: How do college-educated and trained residents contribute to the economic and civic well-being of each state?”

• “Learning: How do college-educated residents perform on a variety of measures of knowledge and skills?”

Another source of information about the aggregate performance of colleges and universities is Complete College America (CCA). Established in 2009, CCA “is a national nonprofit with a single mission: to work with states and consortia to significantly increase the number of Americans with quality career certificates or college degrees and to close attainment gaps for traditionally underrepresented populations” (Complete College America, n.d., 1). Thirty-four states currently participate in CCA, which advocates structural changes to an institution’s approach to student course-taking behavior. The CCA model promotes more intentionality in course taking, course scheduling, and developing pathways for student success.

CCA has identified six institutional “game changers” that appear to contribute to student success:

Through research, advocacy, and technical assistance, we help states put in place the six GAME CHANGERS that will help all students succeed in college: 15 to Finish, Math Pathways, Co-requisite Support, Momentum Year, Academic Maps with Proactive Advising, A Better Deal for Returning Adults. (Complete College America, n.d., 4)

While the “15 to Finish” approach is intended to move students to completion in a more timely manner, it is important to understand that full-time does not work for all students. A more comprehensive metric includes time to degree and on-path progress to completion for part-time students. “Time to degree is a major concern for students, one that colleges often do not take seriously enough. Research shows that students who can take more classes on a focused path to a degree, should, because it helps them succeed at higher rates. Whether it’s 15 in a term, 30 in a year, or just one more class,” said Dr. Davis Jenkins, Civitas Learning advisor and senior research scholar at the Community College Research Center (Civitas Learning 2017, 5).

How can higher education improve the impact of performance measurement as related to student success? By developing and using outcomes measures for student learning.
The work in CBE (competency-based education) is paving the way for more foundational definitions and
standards that institutions and academic programs can use to develop the
infrastructure for improved learning. Research indicates that students are more
active, engaged, and motivated when involved in coursework that is challenging
but within their capacity to master. CBE accomplishes this by linking progress to
mastery (EDUCAUSE Learning Initiative 2014).
Learning Analytics
Learning science, smart technology, and the pressure for more accountability
have created a perfect storm for the development of a learning analytics
environment. In fact, in its annual Horizon Report that focuses on current
and future trends in higher education, the New Media Consortium (NMC) noted that one trend it has been
following is learning analytics (Adams Becker et al. 2017).
Accreditation

Figure 3 Accreditation Functions
Ewell and Steen (n.d., 10) noted, “Accreditors were first mandated to look at
learning outcomes as a condition of recognition in Department of Education
rules established in 1989, but these directives were not very specific.”
Accreditors are now being asked not just whether they examine student
learning outcomes in light of an institution’s mission, but also why they don’t
establish and enforce common standards of learning that all must meet (Ewell
and Steen, n.d.). The answer is that there are still no standards of learning
outcomes, so institutions must continue to use the measures they do have:
graduation rates, persistence rates, and, in some cases, employment statistics.
Connecting the Dots
While these dynamics have been in play for some time, analytics now provide
a strong platform for reporting, monitoring, and evaluating progress as well
as acting on outcomes to improve student success. The science of learning, the
development of powerful data systems, and the advancement of predictive
student solution platforms have come together to enhance our ability to assess,
account, and accredit actual student learning and institutional performance.
The competency-based learning model is a precursor to the ability to establish
what students need to demonstrate in terms of mastery of competencies and
skills, how they will demonstrate that mastery, and what can be done in the
learning environment to support progress toward timely mastery.
Learning management systems are integral in this effort, providing the behind-the-scenes platform in a student’s learning experience and serving as the course hub and connector for management and administration, communication and discussion, creation and storage of materials, and assessment of subject mastery (Lang and Pirani 2014). These systems enable the enhancement of learner information in real time for faculty, students, and advisors.

Figure 4 connects the components of accountability, assessment, accreditation, and analytics. The model begins with the analytics dimension, which provides a data platform from which decision makers can access information, develop insight, and review data on what is working in the institution. Learning analytics specifically provide invaluable information on student behavior and learning, including insights into what works to support student learning. The assessment dimension provides the foundation for improvement and is based on the strength of the analytics environment. Stronger institutional evidence results when cross-functional units agree on what is assessed, how it is assessed, and what can be accomplished from standardizing outcomes metrics. At this point, agreed-upon performance-based measures of accountability are strengthened beyond after-the-fact graduation, persistence, and employment rates. The institution can move from data overload to targeted measures that can make a difference to individual students. Further, adaptive assessment can assist students in their current learning environment. The institution can move from fragmented student success measures to a fully integrated set of student success efforts spanning the student life cycle. This integrated approach provides the institution with strong evidence to meet the multiple demands of stakeholders and accrediting bodies.

Figure 4 Connecting the Dots: A Model for Integrated Decision-Making
[Diagram of four linked dimensions: Accountability (performance-based expectations; multiple stakeholder expectations), Accreditation (quality assurance; continuous improvement; gatekeeper), Assessment (standard performance metrics; learning metrics), and Analytics (data; insights; action tools for decision-making).]

As stakeholders demand more of education, a coordinated, aligned approach to assessment, accountability, analytics, and accreditation will result in stronger, more sustainable outcomes. Data must move from reportability to action. A critical component is moving from data to insight. As Kamal (2012, 2) wrote, developing metrics is easy; developing insights is hard: “In contrast to this abundant data, insights are relatively rare. Insights here are defined as actionable, data-driven findings that create business value. They are entirely different beasts from raw data. Delivering them requires different people, technology, and skills—specifically including deep domain knowledge. And they’re hard to build.”

Why is this important? In order to connect the dots of assessment, accountability, analytics, and accreditation, insight makers are required. These are people across functional areas of the institution who can go beyond the numbers to understand the implications, impact, and inspiration behind the numbers. They can lead collaborative conversations that use statistics, reporting, and visualization tools to help maximize data alignment across the institution’s accountability agenda. Good data are fundamental, but analysis for impact is crucial for real change. This is true for each of the dimensions, whether assessment, accountability, analytics, or accreditation.
• Review what metrics, data, and indicators are being used for which part of the
accountability and accreditation requirements.
• Pay attention to the tools, applications, and services that are available to
support analytics and decision making based on data.
References

AASCU Government Relations. 2017. Top 10 Higher Education State Policy Issues for 2017. Policy Matters, January. Accessed December 5, 2017: www.aascu.org/policy/publications/policy-matters/Top10Issues2017.pdf.

Adams Becker, S., M. Cummins, A. Davis, A. Freeman, C. Hall Giesinger, and V. Ananthanarayanan. 2017. NMC Horizon Report: 2017 Higher Education Edition. Austin, TX: The New Media Consortium. Accessed December 5, 2017: http://cdn.nmc.org/media/2017-nmc-horizon-report-he-EN.pdf.

Baer, L., and J. Campbell. 2012. From Metrics to Analytics, Reporting to Action: Analytics’ Role in Changing the Learning Environment. In Game Changers: Education and Information Technologies, ed. D. G. Oblinger, 53–65. Washington, DC: EDUCAUSE.

Barone, M. 2017. Is College Worth It? Increasing Numbers Say No. Washington Examiner, June 8. Accessed December 5, 2017: www.washingtonexaminer.com/is-college-worth-it-increasing-numbers-say-no/article/2625304.

Barrett, B. 2017. How Much Would It Cost for For-Profit Colleges to Pass Gainful Employment? New America, June 15. Accessed December 5, 2017: www.newamerica.org/education-policy/edcentral/how-much-would-it-cost-profit-colleges-pass-gainful-employment/.

Burke, J. C., ed. 2004. Achieving Accountability in Higher Education: Balancing Public, Academic, and Market Demands. San Francisco: Jossey-Bass.

Carey, K. 2007. Truth Without Action: The Myth of Higher Education Accountability. Change 39 (5): 24–29.

Civitas Learning. 2017. New Data Reveal Key Opportunities to Improve Part-Time Student Success. News release, October 11. Accessed December 5, 2017: www.civitaslearning.com/press/community-insights-report-key-opportunities-improve-part-time-student-success/.

Competency-Based Education Network. 2017. Quality Framework for Competency-Based Education Programs. Accessed December 5, 2017: www.cbenetwork.org/sites/457/uploaded/files/CBE_Quality_Framework.pdf.

Complete College America. n.d. About. Accessed December 5, 2017: http://completecollege.org/about/.

Dossani, R. 2017. Is College Worth the Expense? Yes, It Is. The Rand Blog, May 22. Accessed December 5, 2017: www.rand.org/blog/2017/05/is-college-worth-the-expense-yes-it-is.html.

EDUCAUSE Learning Initiative. 2014. 7 Things You Should Know About Competency-Based Education. February 11. Accessed December 5, 2017: https://library.educause.edu/~/media/files/library/2014/2/eli7105-pdf.

Ewell, P. T. 2001. Accreditation and Student Learning Outcomes: A Proposed Point of Departure. CHEA Occasional Paper, September. Washington, DC: Council for Higher Education Accreditation. Accessed December 5, 2017: www.chea.org/userfiles/Occasional%20Papers/EwellSLO_Sept2001.pdf.

———. 2009. Assessment, Accountability, and Improvement: Revisiting the Tension. Occasional Paper #1. Urbana, IL: National Institute for Learning Outcomes Assessment. Accessed December 5, 2017: www.learningoutcomeassessment.org/occasionalpaperone.htm.

———. 2014. The Growing Interest in Academic Quality. Trusteeship, January/February. Accessed December 5, 2017: www.agb.org/trusteeship/2014/1/growing-interest-academic-quality.

Ewell, P., and L. A. Steen. n.d. The Four As: Accountability, Accreditation, Assessment, and Articulation. Mathematical Association of America. Accessed December 5, 2017: www.maa.org/the-four-as-accountability-accreditation-assessment-and-articulation.

Hechinger Institute on Education and the Media. n.d. Beyond the Rankings: Measuring Learning in Higher Education. An Overview for Journalists and Educators. New York: Hechinger Institute on Education and the Media. Accessed December 19, 2017: http://hechinger.tc.columbia.edu/primers/TeaglePrimer_092106.pdf.

Johnson, L., S. Adams Becker, M. Cummins, V. Estrada, A. Freeman, and C. Hall. 2016. NMC Horizon Report: 2016 Higher Education Edition. Austin, TX: The New Media Consortium. Accessed December 19, 2017: http://cdn.nmc.org/media/2016-nmc-horizon-report-he-EN.pdf.

Johnson, L., S. Adams Becker, V. Estrada, and A. Freeman. 2015. NMC Horizon Report: 2015 Higher Education Edition. Austin, TX: The New Media Consortium. Accessed December 5, 2017: http://cdn.nmc.org/media/2015-nmc-horizon-report-HE-EN.pdf.

Kamal, I. 2012. Metrics Are Easy, Insight Is Hard. Harvard Business Review, September 24. Accessed December 5, 2017: https://hbr.org/2012/09/metrics-are-easy-insights-are-hard.

Kauffman, S. 2017. Higher Learning Commission Blue Ribbon Panel Tasked to Set Agenda for New Initiatives and Innovation in College and University Accreditation. News release, July 19. Accessed December 7, 2017: www.prweb.com/releases/2017/7/prweb14521524.htm.

Kelly, R. 2016. 7 Things Higher Education Innovators Want You to Know. Campus Technology, March 14. Accessed December 5, 2017: http://campustechnology.com/Articles/2016/03/14/7-Things-Higher-Education-Innovators-Want-You-To-Know.aspx?p=1.

Lang, L., and J. A. Pirani. 2014. The Learning Management System Evolution. ECAR Research Bulletin, May 20. Accessed December 5, 2017: https://library.educause.edu/~/media/files/library/2014/5/erb1405-pdf.pdf.

Lingenfelter, P. E. 2003. Educational Accountability: Setting Standards, Improving Performance. Change 35 (2): 18–23.

———. 2016. “Proof,” Policy, and Practice: Understanding the Role of Evidence in Improving Education. Sterling, VA: Stylus Publishing.

Long, P., and G. Siemens. 2011. Penetrating the Fog: Analytics in Learning and Education. EDUCAUSE Review, September/October, 31–40. Accessed December 5, 2017: https://er.educause.edu/~/media/files/article-downloads/erm1151.pdf.

MacTaggart, T. 2017. The 21st Century Presidency: A Call to Enterprise Leadership. Washington, DC: Association of Governing Boards. Accessed December 5, 2017: www.agb.org/sites/default/files/report_2017_21st_century_presidency.pdf.

Mayotte, B. 2015. What the New Gainful Employment Rule Means for College Students. U.S. News & World Report, July 8. Accessed December 5, 2017: www.usnews.com/education/blogs/student-loan-ranger/2015/07/08/what-the-new-gainful-employment-rule-means-for-college-students.

Miller, T. 2016. Higher Education Outcomes-Based Funding Models and Academic Quality. Lumina Issue Papers, March. Accessed December 19, 2017: www.luminafoundation.org/files/resources/ensuring-quality-1.pdf.

Millichap, N., and G. Dobbin. 2017. 7 Recommendations for Student Success Initiatives. EDUCAUSE Review, October 11. Accessed December 5, 2017: https://er.educause.edu/blogs/2017/10/7-recommendations-for-student-success-initiatives.

National Center for Public Policy and Higher Education. 2008. Measuring Up 2008: The National Report Card on Higher Education. San Jose, CA: National Center for Public Policy and Higher Education. Accessed December 5, 2017: http://measuringup2008.highereducation.org/print/NCPPHEMUNationalRpt.pdf.

National Commission on Excellence in Education. 1983. A Nation at Risk: The Imperative for Education Reform. Washington, DC: National Commission on Excellence in Education. Accessed December 5, 2017: https://www2.ed.gov/pubs/NatAtRisk/index.html.

National Governors Association. 1986. Time for Results: The Governors’ 1991 Report on Education. Washington, DC: National Governors Association.

———. 2005. Governors Sign Compact on High School Graduation Rate at Annual Meeting. News release, July 16. Accessed December 5, 2017: www.nga.org/cms/home/news-room/news-releases/page_2005/col2-content/main-content-list/governors-sign-compact-on-high-s.html.

Roscorla, T. 2014. How Analytics Can Help Colleges Graduate More Students. Center for Digital Education: Converge, July 15. Accessed December 19, 2017: www.centerdigitaled.com/news/How-Analytics-Can-Help-Colleges-Graduate-More-Students.html. [Updated link: https://www.govtech.com/education/Can-Analytics-Help-Colleges-Graduate-More-Students.html.]

Schray, V. n.d. Assuring Quality in Higher Education: Key Issues and Questions for Changing Accreditation in the United States. Issue Paper, Secretary of Education’s Commission on the Future of Higher Education. Accessed December 5, 2017: https://www2.ed.gov/about/bdscomm/list/hiedfuture/reports/schray.pdf.

Shacklock, X. 2016. From Bricks to Clicks: The Potential of Data and Analytics in Higher Education. London: Higher Education Commission. Accessed December 19, 2017: www.policyconnect.org.uk/hec/sites/site_hec/files/report/419/fieldreportdownload/frombrickstoclicks-hecreportforweb.pdf.

Suskie, L. 2015. Five Dimensions of Quality: A Common Sense Guide to Accreditation and Accountability. San Francisco: Jossey-Bass.

U.S. Department of Education. 2006. A Test of Leadership: Charting the Future of U.S. Higher Education. Washington, DC: U.S. Department of Education. Accessed December 5, 2017: https://www2.ed.gov/about/bdscomm/list/hiedfuture/reports/pre-pub-report.pdf.
Author Biography

Linda Baer, PhD, is a senior consultant with Civitas Learning. She has served over 30
years in numerous executive-level positions in higher education, including senior program
officer, postsecondary success for the Bill & Melinda Gates Foundation, senior vice chancellor
for academic and student affairs in the Minnesota State College and University System, senior
vice president and interim president at Bemidji State University, and interim vice president
for academic affairs at Minnesota State University, Mankato. Her ongoing focus is to inspire
leaders to innovate, integrate, and implement solutions to improve student success and
transform institutions for the future. She presents nationally on academic innovations, educational transformation,
the development of alliances and partnerships, the campus of the future, shared leadership, and building
organizational capacity in analytics. Recent publications have been on smart change, shared leadership, successful
partnerships, innovations/transformation in higher education, and analytics as a tool to improve student success.
Feature Article

Reflections on Two Decades of Quality Assurance and Accreditation in Developing Economies

Fred M. Hayward, PhD
In our increasingly mobile world, quality assurance and accreditation across the globe, and particularly in developing countries, have a number of implications for higher education as a whole.
Introduction
This article first appeared in Planning for Higher Education, Issue V46N1, October–December 2017
have been found wanting. In a few countries, self-assessments are the only
requirement, though in most cases they are part of the accreditation process
in which the self-assessments are then reviewed by peer reviewers who also
carry out site visits and make recommendations about whether to accredit the
institution based on published standards. For the most part, in recent years
accreditation has become the standard mode of quality assurance in higher
education in developing areas.
In Africa there was only one quality assurance organization in 1985, but by 2006
there were 11 (Hayward 2006), and today almost every Sub-Saharan African
state has a quality assurance agency with 23 in place in 2014 (Salmi 2015). In Asia
prior to 1985 there were three quality assurance agencies. By 2004, there were
15 quality assurance organizations in 13 countries, 13 of which were involved in
accreditation and two in audits (Lenn 2004). Now almost every Asian country
has some sort of accreditation process in place.1 Thus by 2017, the process of accreditation had become almost universal. As we will see, processes around the world are remarkably similar with respect to goals, methods, and expectations. At the same time, some of the processes are more demanding than others, with understandably mixed results. Nonetheless, what I want to emphasize is the general acceptance of the importance of quality assurance, the need for quality improvement around the world, and the general similarity of the approaches to the process.

A major challenge to the consensus on the importance of accreditation has been raised by the rapid growth of student demand for higher education, the rapid expansion of private higher education to meet that demand, and the limited ability of many systems to review all institutions for quality in a timely fashion. In Afghanistan, for example, the number of private higher education institutions increased from zero in 2005 to more than 125 in 2017, leading to serious concerns about their quality.2 This rapid growth has been replicated around the world, especially in developing economies, although because many of the new institutions were of low quality, they were eventually closed, as happened in Afghanistan, South Africa, Ethiopia, Kenya, and the Philippines.3
However, it has become increasingly difficult for accrediting and quality
assurance agencies to keep up with the growth of private higher education
institutions due to the cost of reviews, the lack of professional staff, and
sometimes political and other interference in the process.
1 In 2015, only Myanmar was without a formal quality assurance agency. See Salmi (2015).
2 As a result, Afghan president Hamid Karzai, through a Presidential Decree in 2012, requested a major
quality review of all private higher education institutions.
3 For several examples and discussion, see Salmi (2015).
With the growing awareness of the importance of high quality has come the
recognition that higher education generally is underfunded. Unfortunately,
that knowledge has not encouraged the increased financial resources so
important to quality higher education in the developing world. In Afghanistan,
for example, funding in the regular budget dropped from $521 per capita in
2008 to only $345 per capita in 2016—a decline of more than one-third.4 Similar
scenarios have occurred in many developing economies, especially in Africa.
The funding problem has been exacerbated by a decline in external funding
from the World Bank and other agencies.5
4 Ministry of Higher Education data 2016. This does not include the development budget, which varied
widely over the years with only a small portion actually made available.
5 Overall external funding for higher education in Africa averaged $103 million annually from 1990 to
1994, then dropped to $30.8 million a year from 1995 to 1999. It rose to only $36.6 million between
2000 and 2004 in contrast to much higher levels of funding for primary and secondary education.
Ninety percent of all education funding in later years went to primary and secondary education. See
Hayward and Ncayiyana (2014).
Breaking Down Cultural Barriers to Accreditation

One of the perceived obstacles to accreditation in much of the world was the perception that it was not in accord with cultural norms. In both Afghanistan and Madagascar, the initial reactions were “That is not Afghan” or “That is not Malagasy.” In the Afghan case it was suggested that Afghans would not criticize each other, would not say negative things about a person or an institution. Because of that, the argument went, how could Afghanistan institute effective peer review and site visits in which people were expected to be critical? That was followed by long discussions. In the end it was recognized that Afghans did make such judgments in admitting students, and they could do the same for programs and institutions. Eventually it was agreed that the process should begin. This became a new part of the Afghan academic culture—one involving much more assessment and review. It also fostered greater interaction with students.

In Madagascar the opponents were largely leaders of private, for-profit institutions who said, “Let the customer decide about quality.” They were quickly overruled by other leaders of private institutions who saw accreditation as a way to gain legitimacy in a competitive market—and it was. Indeed, the best of the private institutions were the first to apply for accreditation when it started, way ahead of the public institutions, which stalled.

Developing Innovative Techniques of Assessment

In the workshops leading up to peer reviews in Afghanistan, we talked specifically about the unwillingness of people to be publicly critical of or make judgments about others and the need to differentiate levels of quality or non-compliance despite this. While workshop participants agreed that people were reluctant to be critical, they thought that in this situation it would not be a problem. Following that discussion, the quality assurance process was established, and peer reviewers were trained. After the training, 10 peer reviewers were sent off to the first two institutions that were being used as pilots for the process. The 10 of them came back having completed their reviews, and almost all of the evaluations that used a five-part scale (unacceptable, poor, good, very good, excellent) were marked “excellent.” We asked how that could be, and the reviewers were sheepish about it. They admitted it was the difficulty of saying anything bad about others. We realized we needed to give them some solid benchmarks to help them differentiate. We suggested using the best universities they could think of for “excellent,” a new but promising institution for “good,” and a fraudulent institution as “unacceptable.” That seemed to help somewhat. However, as the discussion continued, the most useful description was one drawn from survey research—the idea of a ladder. At the bottom of the five-rung ladder was “unacceptable,” farther up was “good,” then “very good,” with “excellent” on the top rung. This worked very well. When the peer reviewers went out two days later, their rankings covered a wide range of levels—rankings those who knew the institutions well found to be right on target.

Recognizing the International Importance of Quality Assurance and Accreditation

New awareness of the importance and implications of accreditation internationally helped its growth as it became clear that graduates going on for graduate studies in many parts of the world needed to be from accredited institutions to be considered for admission. In addition, the growing understanding that there were some “international standards” or expectations helped encourage many reluctant institutions to review their own standards. This was especially important in countries such as Madagascar, which as an island nation had been isolated from most of its African neighbors. This was also the case in Afghanistan, which had been isolated by war. What was remarkable to watch was how important the idea of “state of the art” or international standards and expectations became to many higher education leaders and faculty members. The Internet became a major tool in identifying what was state of the art. In many cases, as in Mauritius, people came away empowered after reviewing programs from what they saw as outstanding institutions such as Oxford, Harvard, or Berkeley and seeing the similarities with their own programs.
Protecting the Public
Fostering Greater Focus on Teaching
Growing concerns among the public, government officials, and parents about
employment have had a significant impact on higher education in developing
countries. While unemployment numbers in many developing economies are
unknown, we know they are high. We have seen some of the effects in student
demonstrations around the world, with the Arab Spring a major example of
the contagion that can follow. Students are worried about their futures. The
old expectation of government jobs, an expectation largely created during
the colonial period when universities were seen as the path to government
employment, is long gone. Nonetheless, the notion of a government obligation
to provide employment is still widespread. That is putting pressure on both
governments and higher education institutions to ensure there is a link between
higher education programs and employer needs.
Businesses and other employers are also putting pressure on higher education
institutions to improve the relevance of educational programs. Not long ago
50 percent of firms in Egypt identified the low level of skilled labor as a major
problem. There were 200,000 vacancies; yet, many times that number of
people in those fields were unemployed. They did not meet the qualifications
of employers, mostly because of the poor quality of education they received
(Ghafar 2016). Many of these people were in engineering but had never been in
an engineering laboratory.
Creating New Awareness of the Ongoing Responsibility of Faculty Members for Continuous Improvement

Though it has grown slowly, there is a new awareness among faculty members in developing economies of their responsibility for continuous improvement—an understanding, especially among younger faculty members, that the “state of the art” is constantly changing. The idea of meeting “international standards” was totally rejected as impossible in 2008 when I suggested it in Pakistan at a meeting on quality assurance. Now it is a term used regularly by those working on quality assurance there. What international standards represent is not something specific, but the idea that there is a state of the art.

Professionalizing Quality Assurance

One bright spot in quality assurance is the professionalization of quality assurance personnel, from peer reviewers to senior staff. The last decade has seen a major improvement in staff training. Much of that can be attributed to organizations such as the World Bank and the Carnegie Corporation and a number of donor countries including the United States, Germany, the United Kingdom, and several others. In addition, a number of nongovernmental organizations offer excellent training programs. This has greatly improved the quality assurance process and helped increase its legitimacy and level of public trust.
As I have tried to show in the preceding pages, there have been a variety of important, and in some cases unexpected, benefits resulting from the development of quality assurance and accreditation in developing economies. The process of establishing accreditation and quality assurance mechanisms has helped set new expectations for quality. The process has fostered the first external reviews ever of many institutions in places as diverse as Ghana, Pakistan, Afghanistan, and Madagascar. It has greatly helped governments and the public deal with the mushrooming growth of fraudulent and low-quality providers, many of which were set up for political or financial reasons with little or no interest in higher education outcomes. In some cases, such as in Afghanistan and South Africa, the process has helped strengthen efforts to enhance the autonomy of higher education institutions and free them from political interference. It has also helped shake some of the older, self-satisfied institutions out of their complacency about their own quality by showing them the results of comparisons with some newer universities that had moved ahead of them, as was seen in South Africa where several vaunted institutions were put on probation or lost program accreditation. The quality assurance process has served to strengthen links throughout the higher education system in several countries, including through a peer review process that has broadened faculty member knowledge of other institutions. Accreditation has served to create strong incentives for quality improvement and in the process helped make the public aware of the importance of high quality and increased pressure on governments to ensure it. In a number of cases, such as in Pakistan, it has increased public confidence in higher education.

There is an amazingly high level of agreement on the importance of quality assurance and accreditation, the methods that should be used to carry it out appropriately, and the value of an autonomous and open process.
On the other hand, there are many challenges that remain. Among them is the
cost of accreditation in most developing economies, which, while not a major
problem for the economies of countries such as South Africa, is a major problem
for those such as Afghanistan where higher education budgets are already
under serious strain. Another challenge is capacity building. While there have
been great strides in professionalizing quality assurance and accreditation,
ongoing capacity building, especially of a changing corps of peer reviewers,
remains difficult and costly, with growing resistance to demands for the regular
upgrading of skills among some faculty members. Finding enough experienced
faculty members who themselves embody the excellence expected has been
difficult in those countries with too few faculty members with PhDs. This is
almost universally a problem in the developing world where as many as 90
percent of faculty members do not have PhDs. Even in South Africa, which has
a large number of higher education institutions, the accrediting agency has
occasionally had problems finding enough high-quality peer reviewers who
meet the requirements.
One area in which there has not been much success is in improving the quality
of graduate education, which in much of Africa and parts of South Asia is far
below the level required to produce the number of high-quality new master’s
degrees and PhDs needed. In general, accreditors have not focused on this
problem. Daniel Ncayiyana and I have looked at the problems of graduate
education in Sub-Saharan Africa over the years and found that it remains at
unacceptably low levels with a few exceptions. In a 2014 study we concluded,
References

eNews Channel Africa. 2015. SA Student Dropout Rate High. May 19. Accessed January 3, 2018: www.enca.com/south-africa/student-dropout-rate-high.

Ghafar, A. A. 2016. Educated but Unemployed: The Challenge Facing Egypt’s Youth. Washington, DC: Brookings Institution. Accessed January 3, 2018: www.brookings.edu/wp-content/uploads/2016/07/en_youth_in_egypt-2.pdf.

Greenberg, M. 2014. It’s Time for a New Definition of Accreditation. Chronicle of Higher Education, January 27. Accessed January 3, 2018: www.chronicle.com/article/Its-Time-for-a-New-Definition/144207.

Hayward, F. M. 2001. Glossary of Key Terms in Quality Assurance and Accreditation. Council for Higher Education Accreditation (CHEA). Accessed September 18, 2012: www.chea.org/international/inter_glossary01.html.

———. 2006. Quality Assurance and Accreditation of Higher Education in Africa. Paper presented at the Conference on Higher Education Reform in Francophone Africa: Understanding the Keys of Success, June 13–15, Ouagadougou, Burkina Faso. Accessed January 3, 2018: http://siteresources.worldbank.org/EDUCATION/resources/278200-1121703274255/1439264-1137083592502/QA_accreditation_HE_Africa.pdf.

Hayward, F. M., and D. J. Ncayiyana. 2014. Confronting the Challenges of Graduate Education in Sub-Saharan Africa and Prospects for the Future. International Journal of African Higher Education 1 (1): 173–216. Accessed January 3, 2018: https://ejournals.bc.edu/ojs/index.php/ijahe/article/view/5647/4979.

Langa, P., G. Wangenge-Ouma, J. Jungblut, and N. Cloete. 2016. South Africa and the Illusion of Free Higher Education. University World News, no. 402, February 26. Accessed January 3, 2018: www.universityworldnews.com/article.php?story=20160223145336908.

Author Biography

Fred M. Hayward, PhD, is a specialist on higher education with more than 25 years of experience as an educator, scholar, senior administrator, and higher education consultant. He has a PhD and master’s degree from Princeton University and a BA from the University of California, Riverside. He has taught at the University of Ghana, Fourah Bay College in Sierra Leone, and the University of Wisconsin-Madison, where he was professor of political science, department chair, and dean of international programs. He was executive vice president of the Council on Higher Education Accreditation and senior associate for the American Council on Education for more than 10 years. He has been a higher education consultant for the World Bank, Carnegie Corporation, Ford Foundation, Academy for Educational Development (AED), USAID, and several universities and ministries of education, focusing on higher education.
This article first appeared in Planning for Higher Education, Issue V48N1, October–December 2019
During Natalicio’s first decade in office, the university established a clear mission of serving the El Paso region by emphasizing the importance of ensuring access to students from the region who are generally underrepresented in higher education (e.g., low income, Hispanic), and achieving institutional excellence (i.e., student degree completion, social mobility of graduates, and university research productivity). During the early years, the institution followed a traditional approach to planning. For example, in 1991, UTEP established the El Paso Collaborative for Academic Excellence (comprised of UTEP, El Paso Community College, regional K–12 school districts, and stakeholders) to improve the K–16 pipeline. Additionally, during the 1990s, UTEP leveraged several major grants from the National Institutes of Health and the National Science Foundation to develop the institution’s research infrastructure, advance research activity, train future researchers, and increase the number of underrepresented minorities in STEM programs. By the early 2000s, UTEP began to be recognized for its tactical systems.

The El Paso Collaborative gained attention as outcomes in the K–12 educational system improved dramatically; the region, despite being one of the poorest in Texas, had the highest percentage of high school graduates completing the college curriculum, and was among the highest in the enrollment of students in higher education immediately after graduation. In 2003, renowned educational author George Kuh and his colleagues identified UTEP as one of the 20 exemplary institutions that had created the conditions for student success. In 2004, the Washington Advisory Group identified UTEP as one of the Texas public institutions that had the potential to become a national research university.

Surprisingly, despite the successful investments in effective tactical systems, the institution did not see a dramatic increase in key outcomes (degrees awarded, for example). As we considered why this was the case, unexpected inspiration came from an unlikely place. Moneyball, the book by Michael Lewis, provided the impetus for a refinement in UTEP’s planning approach; the book described a professional baseball team’s use of analytics to develop a strategic advantage against more affluent teams. The Oakland A’s, the baseball team referenced in the book, did not have sufficient resources to utilize established strategies (i.e., hire the prolific home run hitters and pitchers) to compete against more prosperous teams. Instead, the Oakland baseball team developed a strategic advantage through analytics: It developed a systems understanding of winning games and identified undervalued players that optimized the process. Metrics (rather than subjective expert opinion) provided an efficient way to understand the system and to identify undervalued players.

The Oakland A’s approach appealed to Natalicio, an ardent baseball fan. In 2005, she helped the Center for Institutional Evaluation, Research and Planning (CIERP) secure a grant from the Lumina Foundation for Education to develop the analytics infrastructure, with the specific focus of developing a systems understanding of student success. The insights generated by analytics helped to concentrate interventions at critical points of students’ transition into higher education (e.g., first term). In addition, the understandings allowed us to improve administrative activities (e.g., term-to-term retention, proactive degree audit, and advising) and make curricular adjustments that would allow for new pathways to degree completion. Those efforts had a dramatic impact on outcomes.
Figure 1 Evidence of Innovation
[Line chart, FY 1996–97 through FY 2016–17, comparing actual annual outcomes against expected growth based on a linear trend from FY 1994 to FY 2000.]
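The “expected growth” baseline in Figure 1 is a simple linear extrapolation from a base period. As a rough sketch of that idea (the numbers below are invented for illustration, not UTEP’s actual data), a trend fitted to the base years can be projected forward and compared with later actuals:

```python
# Illustrative only: fit a linear trend to a base period (FY 1994-2000),
# extrapolate it, and compare actual outcomes against the expected value.
# All figures below are hypothetical.

def linear_trend(years, values):
    """Ordinary least-squares slope and intercept for a simple trend line."""
    n = len(years)
    mean_x = sum(years) / n
    mean_y = sum(values) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(years, values))
             / sum((x - mean_x) ** 2 for x in years))
    return slope, mean_y - slope * mean_x

base_years = list(range(1994, 2001))             # FY 1994-2000 base period
base_outcomes = [1600, 1620, 1650, 1660, 1680, 1690, 1695]

slope, intercept = linear_trend(base_years, base_outcomes)

# Compare hypothetical later actuals against the extrapolated baseline.
for year, actual in [(2010, 2600), (2016, 3100)]:
    expected = slope * year + intercept
    print(f"FY {year}: expected ~{expected:.0f}, actual {actual}, "
          f"above trend by ~{actual - expected:.0f}")
```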
Soon after, we began to incorporate analytics into other institutional planning efforts. For example, by 2007, the State of Texas identified UTEP as one of the emerging research institutions in Texas. In response, UTEP began the planning process to become a Tier 1 research institution. The initial internal skepticism about the institution’s ability to achieve the research results was overcome using analytics. We exercised our understanding of the system to develop scenario models based on growth in enrollment, faculty, research productivity of faculty, and resources.

The 2010 Strategic Plan for Research laid out UTEP’s long-range plan to achieve top-tier status based on those analytics models. By 2018, metrics and analytics became a prominent feature of UTEP’s administrative culture, and the Moneyball approach was recognized as having an important role in advancing outcomes. However, despite the recognition realized by the university, many administrators on campus were not clear on the specific details associated with implementing the metrics-based planning approach.
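The article does not detail the mechanics of those scenario models, but the underlying pattern is compounding assumed growth rates through the components that drive an outcome. A minimal sketch, with invented starting values and growth rates:

```python
# Minimal scenario-model sketch (all numbers invented for illustration):
# project total research output from assumed growth in faculty size and
# per-faculty research productivity, then compare scenarios.

def project(years, faculty, productivity, faculty_growth, productivity_growth):
    """Apply compound annual growth to faculty count and per-faculty output."""
    for _ in range(years):
        faculty *= 1 + faculty_growth
        productivity *= 1 + productivity_growth
    return faculty * productivity  # projected annual research expenditures

scenarios = {
    "baseline":   dict(faculty_growth=0.01, productivity_growth=0.02),
    "aggressive": dict(faculty_growth=0.03, productivity_growth=0.05),
}

for name, assumptions in scenarios.items():
    total = project(years=10, faculty=600, productivity=100_000, **assumptions)
    print(f"{name}: ~${total / 1e6:.1f}M in annual research after 10 years")
```

Running several such scenarios side by side is what lets planners see which assumptions (enrollment, hiring, productivity, resources) the long-range targets are most sensitive to.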
The generally accepted definition of planning is that it refers to deliberations that are undertaken to advance the goals of an organization (or its entities). Edward Banfield, in 1955, advanced the generic steps (generally described as the rational planning model) that are associated with planning. In the late 1950s, Martin Meyerson and Melville Branch Jr. articulated the domains of planning in organizations (generally described as the comprehensive planning framework). Many theories (of how planning works) and procedures for planning have emerged over the last century, but a hurdle for us in planning was in translating those theoretical concepts into insights that were useful for planning practice.

In fact, efforts by scholars to establish a connection between formal planning (e.g., making plans) and positive outcomes have been problematic, and that has led to cynicism about the value of the process. Yet our review of a century of associated literature revealed that planning theory and practice have not disappeared. Our research confirmed that conceptual elements of organizational planning, first articulated nearly 60 years ago, have been affirmed and refined. The five elements of comprehensive planning, in contemporary terms, are: (1) plan making: development of mission, vision, goals, and key performance indicators; (2) tactical analysis: management of systems and processes to improve operations; (3) strategic analysis: development and adjustment of strategies based on comparative advantages relative to competitors or peers; (4) policy analysis: management of environmental factors that impact an organization’s ability to advance its goals; and (5) evaluation: assessment of progress.

The field of big data has evolved over time, and our experience with data has followed a similar path. The early focus of big data was on the volume, variety, and the dynamic rate at which data are available. By the early 2000s, UTEP had large sets of data, but they resided in databases across offices and had differing levels of quality and accuracy. In 2004, we began the process of creating a comprehensive database that was built for reporting and institutional research. We formalized the data governance structure, secured large storage capacities, and developed a multilayered process to review, validate, and correct data. Over time, the focus evolved and changed to managing and analyzing data. We had a similar shift at UTEP. We developed standardized reports and tools to track progress on institutional outcomes, and also created tools that allowed for efficient statistical analysis.

The next step in big data evolution was advanced analytics, which focuses on using data to explain changes with a systems perspective. We used the approach to understand student success, and created new tools that provided actionable insights to units across the campus. Today, Big Data Analytics (BDA) is used to describe multiple concepts that emphasize infrastructure (e.g., warehousing, reporting, processing), analyses (e.g., statistical analysis, data mining, network analysis, systems analysis), and use (decision support, continuous improvement, artificial intelligence, planning). Our approach has also evolved. We are currently in the process of developing predictive analytics tools to support planning and operations. A decade after we began our efforts, CIERP has developed an advanced analytics infrastructure that includes more than 6,000 metrics and 300 analytics tools to support institutional planning. To ensure efficient tracking of that large volume of information, we also formalized a hierarchy of metrics.
Supporting Planning with Metrics

Tracking and using 6,000 metrics is a daunting task. Over the last 15 years, we have classified data into four different categories to support planning: key performance metrics, contextual metrics, control metrics, and policy metrics.

Figure 2 Hierarchy of Data Metrics Used for Planning

[Diagram of the four categories: key performance metrics, contextual metrics, control metrics, and policy metrics.]

Key performance metrics are a very limited set of measures that show progress on organizational goals. Total enrollment, degrees awarded, research expenditures, and total revenue are examples of key performance metrics.

Contextual metrics provide a framework to understand changes associated with key performance indicators. There are four different types of contextual metrics. Component metrics describe the elements that make up the key performance indicators. Diagnostic metrics explain why change has occurred in each of the components. Leading metrics are another group of contextual indicators that provide early warnings about future change. Intermediate outcome metrics provide indicators of progress related to specific initiatives designed to have an indirect impact on key performance indicators.

Control metrics are data that are used to assess the impact of tactical and strategic action. Specifically, control metrics allow for comparison of outcomes between initiatives, groups, or organizations, and they are used to assess productivity (for assessment of tactics) and relative performance (for assessment of strategy).

Policy metrics are public measures that are generally of interest to policy makers and stakeholders, including federal and state agencies. Policy metrics can be an existing metric or a new measure created by an external agency.

How We Used Metrics

In the following sections, we briefly describe each element of comprehensive planning, identifying how we used metrics to assess and improve planning.

Plan making is a periodic activity that allows an organization to reaffirm and align its mission, values, and goals, which is a critical factor in achieving the desired outcomes. The process of making plans requires assessing the trajectory of the organization, determining conditions that are necessary to sustain progress, and identifying further adjustments that are needed to achieve specific outcomes. There are specific types of analysis and data that are essential to the plan-making process.

We used key performance metrics to assess the progress of goals; that analysis enabled us to estimate future outcomes under historic conditions. We also developed outcome scenarios, which explored changes in component metrics and other contextual metrics in the recent past, generally within five to ten years. Another related analysis would be a focus on assessing the relative contribution of subunits (e.g., colleges and departments within an institution). For that type of analysis, we considered changes in component metrics, including intermediate outcome metrics and diagnostic metrics. That analysis helped us to identify emergent strategic advantages and organizational inefficiencies. Once those internal assessments were completed, we focused on external comparative analysis. Key performance metrics and control metrics were used to assess organizational trends (relative to peers or competitors) and to identify the level of outcomes that were needed to keep pace or close gaps. That analysis allowed us to determine reasonable targets based on institutional mission, resources, comparative advantages, emergent changes, and expected outcomes. The final plan document built on those internal and external analyses and provided a clarification of the organization's mission, goals, vision, and strategies that guided action during the next planning period.
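To make the hierarchy concrete, the sketch below models the four categories, and the four contextual subtypes, as a small metric catalog. The category names follow the article; the example metrics and the catalog structure itself are hypothetical illustrations, not CIERP's actual implementation.

from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Category(Enum):
    KEY_PERFORMANCE = "key performance"   # progress on organizational goals
    CONTEXTUAL = "contextual"             # component, diagnostic, leading,
                                          # and intermediate outcome metrics
    CONTROL = "control"                   # assess tactical/strategic impact
    POLICY = "policy"                     # measures watched by external bodies

@dataclass
class Metric:
    name: str
    category: Category
    subtype: Optional[str] = None         # only contextual metrics carry one

catalog = [
    Metric("total enrollment", Category.KEY_PERFORMANCE),
    Metric("degrees awarded", Category.KEY_PERFORMANCE),
    Metric("first-year retention", Category.CONTEXTUAL, "component"),
    Metric("course drop rate", Category.CONTEXTUAL, "diagnostic"),
    Metric("applications received", Category.CONTEXTUAL, "leading"),
    Metric("advising contacts", Category.CONTEXTUAL, "intermediate outcome"),
    Metric("cost per completed credit", Category.CONTROL),
    Metric("state accountability graduation rate", Category.POLICY),
]

# e.g., pull the short list of measures reviewed during plan making:
kpis = [m.name for m in catalog if m.category is Category.KEY_PERFORMANCE]
print(kpis)

Tagging every measure this way is what makes a 6,000-metric inventory navigable: each planning activity queries only the categories it needs.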
Strategic analysis, the deliberate set of activities that are designed to move
organizations to an ideal position relative to their competitors, is one of the most
common aspects of planning practice. We knew that organizations could
advance outcomes through operational improvements, maturation effects, and
changes in the external environment (e.g., changes in funding, new technology,
innovation). However, we also recognized that those structural gains would not
be sufficient to stay ahead of peers, and organizations would need to harness
comparative advantages to maintain strategic advantage in the long term.
The data we used for strategic analysis and systems analysis were
similar. The primary difference was that strategic analysis focused on
assessing progress beyond structural gains (or gains expected across
all institutions). We used this analysis to generate early signals about
the need for strategic adjustments to maintain long-term gains relative
to peers. For example, we analyzed what adjustments were needed in
research productivity to move to the top quartile of baseline peers and
the top half of aspirational peers. We also recognized that innovation was
a likely source of strategic advantage. Thus, a critical focus of analysis
was on identifying emergent innovation at the subunit level (college and
departments) that could be scaled to have an impact at the institutional
level. We relied on intermediate outcome metrics and other leading
metrics to identify early signals about emergent innovation in areas such
as retention, degree completion, and research productivity.
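The peer-target analysis described above amounts to locating percentile cut points in peer distributions. A minimal sketch, assuming hypothetical peer values for a single metric (say, research expenditures in millions):

import statistics

# Hypothetical peer-group values for one metric.
baseline_peers = [41, 48, 55, 60, 63, 70, 77, 85]
aspirational_peers = [120, 150, 175, 210, 260, 320]

def percentile_target(values, pct):
    """Value at the given percentile of a peer group (inclusive method)."""
    return statistics.quantiles(values, n=100, method="inclusive")[pct - 1]

# Outcome needed to reach the top quartile of baseline peers and the
# top half of aspirational peers.
print(f"Baseline top-quartile target:   {percentile_target(baseline_peers, 75):.1f}")
print(f"Aspirational top-half target:   {percentile_target(aspirational_peers, 50):.1f}")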
Policy analysis focuses on monitoring and managing emergent changes in the environment. Environmental scanning is a common technique used to identify best practices to improve operations. Those scanning activities are well documented in the literature. The missing element in the literature is the use of environmental scanning to identify emergent changes in the policy environment. That type of analysis focuses on assessing and managing policies.

Our efforts in that area included analyses of actions (e.g., rule changes) by administrative agencies, assessment of proposed metrics (e.g., accountability measures), and analysis of tacit and explicit policy positions (e.g., reports on rankings and institutional effectiveness) generated by external groups and agencies. Our primary focus behind those analyses was to determine how each proposed or emergent policy would affect institutional goals and outcomes. The insights that emerged from those analyses were used by our senior administrators to shape policy discussions and mitigate impact.

Evaluation refers to the assessment of outcomes to ensure that adjustments are made to advance the goals of an organization. We undertook three levels of assessment. The primary level of assessment focused on ensuring continuous improvement. The second level of assessment focused on ensuring that the organization was managing external and internal change. The third level of assessment focused on ensuring that the organization was achieving strategic progress.

The first level of analysis was associated with continuous improvement and took place at the end of each operational year to assess the change in key performance metrics. The second level of analysis (tactical analysis) took place during the operational year and at the end of the operational year. During the operational year, we assessed changes in leading metrics, which provided insights about operational adjustments needed to achieve expected annual outcomes. At the end of the operational year, we assessed changes in key performance indicators relative to our expected outcomes, and explained deviations using component metrics and diagnostic metrics (a sketch of this calculation follows the lists below). The end-of-year operational analysis also considered positive deviations relative to internal and external peers, which yielded insights about emergent innovations. Finally, we assessed structural gains using key performance indicators to determine if adjustments needed to be made at the unit (or division) level and ensured that the organization was maintaining the long-term trajectory to preserve a competitive advantage relative to peers.

What Worked
• Identifying key performance indicators and emphasizing interventions that had a marked impact on outcomes.
• Using tools that provided actionable insights for emergent issues (e.g., term-to-term retention instruments that identify each student who has not reenrolled).
• Developing a better understanding of the institution's system and selecting incremental actions that would improve key outcomes.

What Didn't
• Providing too much data, which led to actions that did not affect key outcomes.
• Employing generic analytics.
• Implementing best practice solutions without considering the specific needs of the local context.
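As promised above, the end-of-year comparison (actual versus expected key performance indicators, with deviations explained through component metrics) reduces to simple arithmetic. A minimal sketch, with hypothetical figures:

# End-of-year evaluation sketch: compare a KPI against its expected value,
# then attribute the deviation to component metrics. All figures are
# hypothetical placeholders.

expected_degrees = 4_200                 # target set during plan making
actual_degrees = 4_450

components_expected = {"bachelor's": 3_300, "master's": 750, "doctoral": 150}
components_actual = {"bachelor's": 3_520, "master's": 770, "doctoral": 160}

deviation = actual_degrees - expected_degrees
print(f"Deviation on KPI 'degrees awarded': {deviation:+,}")

# Attribute the deviation to each component metric.
for name, expected in components_expected.items():
    delta = components_actual[name] - expected
    share = delta / deviation
    print(f"  {name:<11} {delta:+5,} ({share:6.1%} of the deviation)")

A large positive share concentrated in one component would then prompt a look at the diagnostic metrics behind that component to learn what changed.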
… the last 30 years revealed that comprehensive planning had an important role in producing extraordinary success. Moneyball inspired us to use metrics to improve the effectiveness of planning. We established a hierarchy of metrics and used data in a structured way to improve efficiency. It's our contention that conditions for metrics-based planning exist in all higher education institutions, because all institutions undertake comprehensive planning and have access to large volumes of data. We also believe that metrics-based planning is the next step in the evolution of comprehensive planning, one that is not replaced by big data and analytics—but enhanced by it.

References

Banfield, Edward C. "Supplement: Notes on a Conceptual Scheme." In Politics, Planning, and the Public Interest, edited by Martin Meyerson and Edward C. Banfield, 303–329. Glencoe, IL: Free Press, 1955.

Bowen, William G., and Derek Bok. The Shape of the River: Long-Term Consequences of Considering Race in College and University Admissions. Princeton: Princeton University Press, 1998.

Branch Jr., Melville C. "Comprehensive Planning: A New Field of Study." Journal of the American Institute of Planners 25, no. 3 (1959): 115–120.

The Equality of Opportunity Project. Accessed August 14, 2019: www.equality-of-opportunity.org/education.

Lewis, Michael. Moneyball: The Art of Winning an Unfair Game. New York: W.W. Norton, 2003.

… associate at the Center for Metrics Based Planning at the University of Texas at El Paso. Prior to her appointment, she served as the team lead for the Research and Communications Group at UTEP's Center for Institutional Evaluation, Research, and Planning. Bonilla-Martin earned a doctorate in rhetoric, and her research interest centers on the impact of planning documents and literature on institutional effectiveness.

Erick Gonzalez is a graduate research assistant at the Center for Metrics Based Planning at the University of Texas at El Paso. He has earned an undergraduate degree in accounting, and is a graduate student in education. Gonzalez is currently studying change in outcomes for 3,000 higher education institutions in the United States.

Roy Mathew, PhD, is associate vice president for planning at the University of Texas at El Paso. His current academic research focuses on refining and articulating the distinctive features of organizational planning, and explaining the usefulness of metrics-based planning in advancing outcomes in higher education and other complex contexts. Mathew earned a doctorate in public policy analysis and planning from the University of Illinois at Chicago.

Daniel Santana is a doctoral research assistant at the Center for Metrics Based Planning at the University of Texas at El Paso. He is a doctoral candidate in history, and has an extensive background in researching primary source materials. Santana is completing a survey of literature related to planning in higher education.

Engage with the Authors

To comment on this article or share your own observations, email rmathew@utep.edu, emartin3@utep.edu, dsantana2@miners.utep.edu, or ejgonzalez11@miners.utep.edu.
FEATURE ARTICLE
The Value of Higher Education Academic Makerspaces for Accreditation and Beyond

by Vincent Wilczynski, PhD, Aubrey Wigner, PhD, Micah Lande, PhD, and Shawn Jordan, PhD
Institutions of higher education are
incorporating makerspaces and skills on their
campuses in support of institutional goals and
accreditation requirements.
University and college campuses are constantly evolving, adding new facilities,
resources, and programs to best serve students, faculty, and staff. Over the last
decade many institutions have added academic makerspaces to their campuses,
a development that allows individuals from across the university to come
together to collaborate, design, fabricate, and learn in shared spaces. First popular
in engineering departments, higher education academic makerspaces now have
expanded to support multidisciplinary learning across all aspects of the university.
We use the term “academic makerspace” to describe the facility, staff, resources,
and associated community that support creating, learning, and fabricating in an
academic setting. Recognizing that elementary and high schools, as well as other
This article first appeared in Planning for Higher Education, Issue V46N1, October–December 2017
… backgrounds to meet and work together. It is estimated that there are more than 150 makerspaces on university campuses, with the number growing each year (Barrett et al. 2015; Bryne and Davidson 2015).

A distinction of higher education academic makerspaces is found in the culture and community that form within. The underlying culture of makerspaces, both in academic and nonacademic environments, is one of collaboration, sharing, and additive innovation (Jordan and Lande 2016). Sharing one's work with others creates an open community and collaborative culture in which members are excited to assist one another and willingly exchange design knowledge. The diversity of users creates opportunities for members to work with and learn from others who have unique experiences and skills. The existence of these spaces and focused programs to integrate members has led to many unique collaborations among colleagues who may not have otherwise had the opportunity to work together, including the development of multidisciplinary courses (Ali et al. 2016). The open nature of these spaces promotes an intentional collision of random ideas, a design structure that has benefited many industries (Gertner 2012).

… labs has also contributed. In an exploration of the future of engineering education, Smith et al. (2005) identified project-based learning as a growing pedagogical approach to the teaching of future engineers. Through the (renewed) emphasis on hands-on, project-based learning, collaborative spaces have emerged to help transform undergraduate engineering education.

Influenced by these factors, higher education academic makerspaces developed from the growing need for widely accessible technology and the increasing availability (and affordability) of design tools, including hardware and software. Given this context, some of the first higher education academic makerspaces were housed in schools of engineering. In the past several years, many university libraries have launched makerspaces with design and fabrication tools for patrons to use while relying on in-house, on-campus, and digital resources for training, facilitation, and support. Examples exist where libraries administer checkout processes for tools and equipment, similar to their traditional role in doing so for print material and other media. This development illustrates the wide spectrum of the higher education academic makerspace movement on university and college campuses.
As further examples of the scope of this movement, engineering and other
discipline professionals have joined together to share knowledge and explore
best practices related to higher education academic makerspaces. For example,
in 2014 Arizona State University hosted a symposium focused on this topic,
and the MakeSchools (n.d.) alliance was formed to catalyze academic making.
In 2016 the White House convened a meeting on higher education academic
makerspaces in conjunction with the 2016 National Week of Making and
the National Maker Faire. International symposiums devoted to academic
makerspaces were held in 2016 and 2017, with each event attracting hundreds
of participants from across the world and over 100 papers written (ISAM 2017
Papers, Presentations, and Videos 2017; Proceedings of the 1st International
Symposium on Academic Makerspaces 2016).
… higher education academic makerspace provides a rich pool of quantitative and qualitative data that can be used to demonstrate compliance with accreditation criteria.

Documenting student experiences is common practice for most higher education academic makerspaces. These experiences are frequently archived as videos, photographs, and articles that are accessible through a space's web portal. Many makerspaces even offer live video streaming of their activity space. Video data can provide insight into how a space is used, what hours are busiest, etc. These records help others learn what can be accomplished in the facility, including new members who are exploring the space, administrators who are evaluating the impact of the space, and potential contributors who are considering investing in the space. These records are also a valuable accreditation resource as they provide (readymade) narratives that can be grouped to demonstrate institutional or programmatic accomplishments related to specific accreditation standards.

It is also common for higher education academic makerspaces to collect a large amount of quantitative data, in part motivated by an inherent need to monitor and enforce safe operating practices. For example, most spaces have databases identifying the individuals who are authorized to use the space, with that information often including the name, gender, status (student, faculty, staff), and departmental affiliation of each user who has been trained and provided access to work in the space. Similar records frequently exist that record the enrollment in makerspace courses and programs (such as evening workshops). In addition, most spaces host academic groups, such as design-affiliated student associations, for meetings and work sessions, often logging these activities into a master planning schedule. Collectively, these records form a valuable database of information that can be applied as evidence of alignment with accreditation standards (an illustrative sketch of such a summary appears at the end of this section).

For example, such quantitative data is important evidence in documenting an institution's commitment to creating multidisciplinary education facilities that accommodate a variety of learning styles. Higher education academic makerspaces favor a form of active learning focused on both individual drive and community-based problem solving. User demographics and frequency-of-use data provide valuable documentation of an institution's commitment to fostering personal discovery, professional development, and lifelong learning—attributes frequently evaluated by institutional and program accreditation organizations.

Both forms of accreditation also review curriculum-related aspects of students' education, typically by allowing each institution or program to establish discipline-specific educational outcomes and measurement mechanisms to evaluate individual attainment of these outcomes, which often include academic and disciplinary knowledge, skills, and competencies. Higher education academic makerspaces provide venues in which to increase knowledge, skills, and competencies, with this topic explored in more detail in the following section.

Spaces and Learning

Makerspaces are academically interesting in two ways: (1) enhancing teaching objectives and (2) enhancing student outcomes. It is worth noting that while these two concepts are similar, they are not identical in terms of modern accreditation standards. Teaching objectives can be seen as a measure of how specific skills are passed on from teachers to students. For example, if students leave a fluid dynamics course with a mathematical understanding of fluid flows and qualities, then the teaching objectives are met. In contrast, student outcomes in engineering, as defined by ABET (n.d.), include more nebulous and difficult-to-measure qualities such as the development of lifelong learning skills and effective communication skills or the ability to function on multidisciplinary teams and use modern engineering tools necessary for engineering practice. These broader student outcomes encompass experiences and learning that occur throughout a program of study rather than merely within one class. Makerspaces can play a role in both of these areas. Teaching objectives can be met via project-based assignments completed in a makerspace. Student outcomes can be enhanced by providing a community of practice where students can learn from peers, engage in self-directed learning, and be exposed to mind-sets that foster the more nebulous qualities, such as those of a lifelong learner and effective communicator. Makerspaces and their influence on both student outcomes and teaching objectives are explored below within the context of accreditation.
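Returning to the access and usage records described earlier: as an illustration of how they might be rolled up into the demographics and frequency-of-use evidence an accreditation self-study could cite, consider the following sketch. The record layout and all values are hypothetical, not any particular institution's database.

from collections import Counter

# Hypothetical entry-log records: (user_id, status, department, hour_of_entry)
entry_log = [
    ("u01", "student", "Mechanical Engineering", 14),
    ("u02", "student", "Art & Design", 15),
    ("u03", "faculty", "Electrical Engineering", 10),
    ("u01", "student", "Mechanical Engineering", 19),
    ("u04", "staff", "Library", 11),
    ("u02", "student", "Art & Design", 19),
]

# Summaries of the kind cited as accreditation evidence: who uses the
# space, from which units, and when it is busiest.
by_status = Counter(status for _, status, _, _ in entry_log)
by_department = Counter(dept for _, _, dept, _ in entry_log)
busiest_hours = Counter(hour for _, _, _, hour in entry_log).most_common(2)

print("Visits by status:    ", dict(by_status))
print("Visits by department:", dict(by_department))
print("Busiest hours:       ", busiest_hours)

A department breakdown spanning engineering, art, and the library is exactly the kind of tabulation that documents a multidisciplinary facility.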
Making in Spaces

While engineering departments were the pioneers in the recent expansion of high-tech higher education academic makerspaces, spaces for making things have been an integral part of university facilities for decades. Studio art spaces, for example for sculpture, contain many of the same tools as makerspaces, from 3-D printers to laser cutters and electronics stations. Much like higher education academic makerspaces, studio art spaces serve as places for students to learn and practice skills, explore creatively and freely, and collaborate with and learn from their fellows. Art and design have a long history of "critical making," which is the learning that occurs via the experience of creating and interacting with the physical through iterative processes and social feedback (Somerson and Hermano 2013). In studio spaces, instruction often takes place in the same shared workspace in which others quietly, or not so quietly, work on their own projects for different courses. The community is formed around growth in making art and integrates making into class and non-class time, both for assignments and for personal gratification or curiosity. However, these studio art spaces are generally walled away from the rest of the university and strictly disciplinary in nature. Engineering likewise houses computer labs dedicated to the simulation of industrial processes, circuit labs dedicated to the exploration of electronics, etc. In these spaces, both peer learning and coursework take place. Like studio art spaces, engineering labs are for insiders only, but unlike art studios, playful exploration is generally discouraged. Peer learning in makerspaces offers the possibility of increasing the diversity of work and people students encounter during their time in higher education.

Academic makerspaces can be viewed as places where interdisciplinary technology can be focused on training, work, and play. In the context of ABET accreditation for engineering programs, such a space could be one of the only areas on campus where the explicit goal of training engineers to function on multidisciplinary teams could be met. Where once a dedicated circuits lab, for example, would provide access to the tools and materials needed for students to complete their coursework and prepare for a real-world work experience, today's engineering career ecosystem is much more likely to require input from multiple disciplines in a rapidly changing technological landscape. To emulate the real environment, a sort of "circuits in context" lab is required, one where traditional parts and tools (e.g., resistors, capacitors, soldering irons) exist side by side with programmable microprocessors like the Arduino and the tools (e.g., 3-D printers, laser cutters, sewing machines, craft implements) needed to create the products the circuits might exist within. Here, students and student teams can explore not just what circuits are, but what they mean within a broader societal context—how circuits interface with people in real terms.
References

ABET. n.d. Criteria for Accrediting Engineering Programs, 2017–2018. Accessed December 14, 2017: www.abet.org/accreditation/accreditation-criteria/criteria-for-accrediting-engineering-programs-2017-2018/.

Ali, P. Z., M. Cooke, M. L. Culpepper, C. R. Forest, B. Hartmann, M. Kohn, and V. Wilczynski. 2016. The Value of Campus Collaboration for Higher Education Makerspaces. Proceedings of the 1st International Symposium on Academic Makerspaces, Paper no. 48. Accessed December 14, 2017: http://seas.yale.edu/sites/default/files/imce/other/ISAM%20Campus%20Collaboration.pdf.

Barrett, T. W., M. C. Pizzico, B. Levy, R. L. Nagel, J. S. Linsey, K. G. Talley, C. R. Forest, and W. C. Newstetter. 2015. A Review of University Maker Spaces. Paper presented at the 2015 ASEE Annual Conference & Exposition, Seattle, WA, June 14–17. Accessed December 14, 2017: www.asee.org/public/conferences/56/papers/13209/view.

Bryne, D., and C. Davidson. 2015. MakeSchools Higher Education Alliance: State of Making Report. Accessed December 14, 2017: http://make.xsead.cmu.edu/week_of_making/report.

Gertner, J. 2012. The Idea Factory: Bell Labs and the Great Age of American Innovation. New York: Penguin Press.

Instructables. n.d. Home page. Accessed December 14, 2017: www.instructables.com.

ISAM 2017 Papers, Presentations, and Videos. 2017. Accessed December 26, 2017: https://drive.google.com/drive/mobile/folders/0B4ZIatyugWjJNXlxVW9iR0ZFVjQ?usp=sharing.

Jordan, S., and M. Lande. 2016. Additive Innovation in Design Thinking and Making. International Journal of Engineering Education 32 (3): 1438–44.

MakeSchools. n.d. Home page. Accessed December 14, 2017: http://make.xsead.cmu.edu/.

New England Association of Schools and Colleges. n.d. Commission on Institutions of Higher Education: Standards. Accessed December 14, 2017: https://cihe.neasc.org/standards-policies/standards-accreditation/standards-effective-july-1-2016#.

Papavlasopoulou, S., M. N. Giannakos, and L. Jaccheri. 2017. Empirical Studies on the Maker Movement, a Promising Approach to Learning: A Literature Review. Entertainment Computing 18 (January): 57–78.

Proceedings of the 1st International Symposium on Academic Makerspaces. 2016. Accessed December 14, 2017: http://jrom.ece.gatech.edu/wp-content/uploads/sites/528/2017/07/ISAM_2016-Proceedings-I.pdf.

Rosenbaum, L. F., and B. Hartmann. 2017. Where Be Dragons? Charting the Known (and Not So Known) Areas of Research on Academic Makerspaces. Paper presented at ISAM 2017, International Symposium on Academic Makerspaces, Cleveland, OH, September 24–27.

Smith, K. A., S. D. Sheppard, D. W. Johnson, and R. T. Johnson. 2005. Pedagogies of Engagement: Classroom-Based Practices. Journal of Engineering Education 94 (1): 87–101.

Somerson, R., and M. L. Hermano, eds. 2013. The Art of Critical Making: Rhode Island School of Design on Creative Practice. Hoboken, NJ: John Wiley & Sons.

Vossoughi, S., and B. Bevan. 2014. Making and Tinkering: A Review of the Literature. National Research Council Committee on Out of School Time STEM, 1–55.

Wigner, A., M. Lande, and S. S. Jordan. 2016. How Can Maker Skills Fit In with Accreditation Demands for Undergraduate Engineering Programs? Paper presented at the 2016 ASEE Annual Conference and Exposition, New Orleans, LA, June 26–29.

Wilczynski, V., J. Zinter, and L. Wilen. 2016. Teaching Engineering Design in a Higher Education Makerspace: Blending Theory and Practice to Solve Client-based Problems. Paper presented at the 2016 ASEE Annual Conference and Exposition, New Orleans, LA, June 26–29.

Author Biographies

Vincent Wilczynski, PhD, has served as the deputy dean of the Yale School of Engineering & Applied Science since 2010. He is the James S. Tyler Director of the Yale Center for Engineering Innovation and Design and oversees all aspects of Yale's academic makerspace. Previously, he was the dean of engineering at the U.S. Coast Guard Academy, where he served as a captain in the U.S. Coast Guard. He is a former vice president of the American Society of Mechanical Engineers, and his teaching skills have been cited by the Carnegie Foundation for the Advancement of Teaching.

Aubrey Wigner, PhD, is an assistant professor at the Eli Broad College of Business at Michigan State University, where he teaches courses in the university-wide minor in entrepreneurship and innovation. He is a chemical engineer turned interdisciplinary social scientist whose dissertation explores the connections between the maker movement, higher education, and the future of work. Currently, he is working to expand MSU's course offerings on interdisciplinary design and collaboration for entrepreneurship and innovation.

Micah Lande, PhD, is an assistant professor in the engineering and manufacturing engineering programs and Tooker Professor at the Polytechnic School in the Ira A. Fulton Schools of Engineering at Arizona State University. He teaches human-centered engineering design, design thinking, and design innovation project courses. He researches how technical and non-technical people learn and apply design thinking and making processes to their work.

Shawn Jordan, PhD, is an associate professor of engineering education in the Ira A. Fulton Schools of Engineering at Arizona State University. He teaches context-centered electrical engineering and embedded systems design courses and studies the use of context in both K–12 and undergraduate engineering design education. He was named one of ASEE PRISM's "20 Faculty Under 40" in 2014 and received a Presidential Early Career Award for Scientists and Engineers from President Obama in 2017.