Managing Sustainable Performance and Governance in Higher Education Institutions


System Dynamics for Performance Management & Governance, Volume 5

Series Editor
Carmine Bianchi, CED4—System Dynamics Group, University of Palermo, Palermo, Italy

Scientific Committee for System Dynamics for Performance Management & Governance
Luca Anselmi, University of Pisa, Italy—Professor of Public Administration
David Birdsell, Baruch College/CUNY, USA—Dean, School of Public Affairs
Elio Borgonovi, Bocconi University, Milan, Italy—Professor of Economics and
Management of Public Administration
Tony Bovaird, University of Birmingham, UK—Professor of Public Management
and Policy
John Bryson, University of Minnesota, USA—McKnight Presidential Professor
Emeritus, Hubert H. Humphrey School of Public Affairs
Dario Cavenago, Bicocca University, Milan, Italy—Professor of Public Management
Denita Cepiku, University of Rome Tor Vergata, Italy—Associate Professor of
Public Management
Lino Cinquini, Scuola Superiore Sant’Anna, Pisa, Italy—Professor of Business
Administration
Paal I. Davidsen, University of Bergen, Norway—Professor of System Dynamics,
Chair of the System Dynamics Group
Scott Douglas, Utrecht School of Governance, The Netherlands—Associate
Professor
Giuseppe Grossi, Kristianstad University (Sweden) and Nord University
(Norway)—Professor of Public Management & Accounting
Jeremy L. Hall, University of Central Florida (USA)—Professor of Public Administration and Director of the Ph.D. Program in Public Affairs
John Halligan, University of Canberra, Australia—Emeritus Professor of Public Administration and Governance
Roger E. Hartley, University of Baltimore (USA)—Dean, College of Public Affairs
David Lane, Henley Business School, UK—Professor of Informatics
Manuel London, State University of New York at Stony Brook, USA—Distinguished Professor of Management
Roula Masou, ESSCA School of Management, France—Associate Professor of
Performance Management
Luciano Marchi, University of Pisa, Italy—Professor of Planning & Control
Systems
Marco Meneguzzo, Università della Svizzera Italiana, Lugano, Switzerland, and University of Rome Tor Vergata, Italy—Professor of Public Management
Donald P. Moynihan, McCourt Chair at the McCourt School of Public Policy, Georgetown University (USA)
Riccardo Mussari, University of Siena, Italy—Professor of Public Management
Stephen P. Osborne, University of Edinburgh Business School—Scotland,
Director of the Centre for Service Excellence (CenSE), Chair of International Public
Management
Guy Peters, University of Pittsburgh, USA—Maurice Falk Professor of American
Government, Department of Political Science, President of the International Public
Policy Association
Angelo Riccaboni, University of Siena, Italy—Professor of Planning & Control
Systems
William C. Rivenbark, University of North Carolina at Chapel Hill, USA, School
of Government—Professor of Public Administration and Government
Etienne Rouwette, Nijmegen School of Management, The Netherlands—Associate
Professor of Research Methodology and System Dynamics
Khalid Saeed, Worcester Polytechnic Institute, USA—Professor of System
Dynamics
Markus Schwaninger, University of St Gallen, Switzerland—Professor of
Management
Carlo Sorci, University of Palermo, Italy—Professor of Business Management
Jürgen Strohhecker, Frankfurt School of Finance & Management, Germany—
Professor for Business Administration, Operations and Cost Management
Jarmo Vakkuri, University of Tampere, Finland—Professor of Local Government
Accounting & Finance
Wouter Van Dooren, University of Antwerp, Belgium—Associate Professor of
Public Management
David Wheat, University of Bergen, Norway—Professor in System Dynamics
Jiannan Wu, Shanghai Jiao Tong University, China—Dean of the School of
International and Public Affairs, and Executive Vice Director of the China Institute
for Urban Governance
Federico Cosenz

Managing Sustainable Performance and Governance in Higher Education Institutions
A Dynamic Performance Management Approach

University of Palermo, Palermo, Italy

ISSN 2367-0940 ISSN 2367-0959 (electronic)
System Dynamics for Performance Management & Governance
ISBN 978-3-030-99316-0 ISBN 978-3-030-99317-7 (eBook)
https://doi.org/10.1007/978-3-030-99317-7

© The Editor(s) (if applicable) and The Author(s), under exclusive license to Springer Nature Switzerland
AG 2022
This work is subject to copyright. All rights are solely and exclusively licensed by the Publisher, whether
the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of
illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and
transmission or information storage and retrieval, electronic adaptation, computer software, or by
similar or dissimilar methodology now known or hereafter developed.
The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication
does not imply, even in the absence of a specific statement, that such names are exempt from the relevant
protective laws and regulations and therefore free for general use.
The publisher, the authors and the editors are safe to assume that the advice and information in this
book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or
the editors give a warranty, expressed or implied, with respect to the material contained herein or for any
errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional
claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG
The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland
To my Mom
Preface

This book covers about a decade of research activities devoted to university management, exploring its specific organizational complexity and adopting systemic approaches to managing its performance generation mechanisms. It also draws on the field experience I gained as the academic delegate for scientific support to strategic planning, management control, performance evaluation, and statistical reporting at the University of Palermo, Italy. This work is included in the series "System Dynamics for Performance Management & Governance."

The fast-changing evolution of global higher education systems systematically poses new challenges related to the appearance of innovative elements, leading academic governing bodies to question current managerial structures and methods. In response, theory and practice have gathered multiple contributions and experiences over the past few decades to support and further develop this evolutionary pathway.
In the same vein, this book draws on this flourishing debate on higher education policy and management and explores an innovative systemic perspective to design and implement sustainable performance management systems for academic institutions. The conditions for the success of universities, the critical issues underlying the creation of academic value, the dynamic complexity characterizing academic governance settings, the pluralistic audience of stakeholders and their expectations, and the causal interplays between organizational performance variables are some of the central themes around which this work is developed.

More specifically, the book suggests and discusses the adoption of a dynamic performance management (DPM) approach to frame the inherent organizational complexity of higher education institutions, thus supporting a strategic learning perspective in designing and implementing relevant performance measures. This approach originates from the combination of conventional performance management and system dynamics modeling. Many research and practice contributions show that this methodological combination can boost the understanding and interpretation of value creation processes by identifying and exploring the causal connections among strategic resource allocation and consumption, the corresponding performance drivers, and the emerging outputs and outcomes.
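To give a concrete flavor of this combination, the following minimal Python sketch simulates a deliberately simplified DPM causal chain: strategic resource stocks, a performance driver compared against a benchmark, and output and outcome flows that feed back on the resources. All variable names, benchmark ratios, and parameter values are invented for illustration and are not taken from the models discussed in this book.

```python
# Minimal, illustrative sketch of a DPM causal chain:
# strategic resources (stocks) -> performance driver -> output/outcome
# flows that feed back on the resources. All names, benchmark ratios,
# and parameter values are hypothetical.

DT = 0.25        # integration time step (years)
HORIZON = 10     # simulated horizon (years)

faculty = 500.0       # strategic resource (stock): academic staff
reputation = 50.0     # strategic resource (stock): 0-100 index
students = 10_000.0   # stock: enrolled students

for _ in range(int(HORIZON / DT)):
    # Performance driver: staffing adequacy measured against a benchmark
    # of one lecturer per 18 students.
    staffing_driver = (faculty / students) / (1.0 / 18.0)

    # Output flow: yearly graduations improve with the staffing driver.
    graduations = students * 0.20 * min(staffing_driver, 1.5)

    # Outcome feedback: reputation attracts enrollments; strong output
    # performance (relative to an aspiration of 2,000 graduates) builds it.
    enrollments = 2_000.0 * (reputation / 50.0)

    # Euler integration: flows accumulate into the stocks over DT.
    students += (enrollments - graduations) * DT
    reputation += (graduations / 2_000.0 - 1.0) * 2.0 * DT
    reputation = max(0.0, min(100.0, reputation))

print(f"students = {students:,.0f}, reputation = {reputation:.1f}")
```

The point of the sketch is not the numbers but the structure: results build or deplete strategic resources, which in turn shape the drivers of future performance. This accumulation-and-feedback logic is what the DPM approach makes explicit.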

The contribution to such an endeavor focuses on the following critical research questions:
• What are the main challenges facing higher education institutions today in terms of performance management and governance?
• How can a DPM approach support academic managers and decision-makers in framing and measuring organizational performance?
• How can DPM systems be implemented in higher education institutions?
• What are the advantages, risks, and limitations of using DPM in such a complex domain?
• How can such a systemic approach be extended to include third mission activities and the associated collaborative governance settings?
For these purposes, the book is designed for university managers and public management scholars interested in higher education institutions. As such, the practical applications of the proposed method are oriented to foster the interest of university administrators in implementing a more systemic approach to framing and managing academic performance. Namely, its purpose is to develop university managers' capabilities to design and use performance management systems by adopting a system dynamics perspective. The theoretical sections discuss the state of the art of the current debate on performance management in higher education and the approaches and paradigms used to face the specific complexity of university management.
The book is divided into four chapters. The first chapter describes the complexity of managing today's higher education settings in terms of organizational traits and related challenges. It also reviews the prevailing paradigmatic changes and approaches to public administration development—e.g., from new public management to new public governance and public value management—that have globally inspired the reforms of the higher education sector during the past few decades. Based on this, the chapter provides an introductory perspective on framing performance management systems in universities. Chapter 2 illustrates a systemic perspective for designing performance management mechanisms in higher education institutions. A qualitative approach is adopted to explain the potential of applying the three views of DPM—i.e., the subjective, objective, and instrumental views—to structuring and connecting strategic planning, process analysis, and the design of key performance indicators. Chapter 3 further develops this methodological investigation by introducing the support of system dynamics modeling. Illustrative examples and models are shown to outline how DPM works in practice. By recognizing the role played by universities in their socioeconomic context, Chapter 4 extends the scope of DPM to comprehend the impacts generated at the local and regional levels. Such a broader performance management view aims at framing third mission activities within collaborative governance settings, with the intent to support more effective strategic coordination between the university's stakeholders in pursuit of sustainable outcomes. Finally, the book outlines its main findings and contributions in a concluding section.

In the hope of having contributed to the knowledge of this research field, there are
many people I feel obliged to thank at the end of this work. Mentioning some of them
makes me fear that I might forget others. However, among the many people who have been particularly close to me in my research, I would like to express my deepest gratitude to Carmine Bianchi, who has encouraged my interest in developing performance management and governance studies since the beginning of my academic pathway.
Special thanks are due to Carlo Sorci, Corrado Vergara, and Marcantonio Ruisi
for representing a valuable reference for the human and professional growth of
strategic management scholars at the University of Palermo.
I would also like to extend my sincere thanks to Enzo Bivona, Guido Noto,
Milton M. Herrera, Vinícius Picanço Rodrigues, and Francesco Rosati, who over the
years shared with me significant and inspiring research themes and academic
adventures. In addition, I am indebted to Antonello Miranda, Claudia Giurintano,
Salvatore Casabona, and other department colleagues, whose guidance strongly
contributed to my academic development. I hope I can always count on their
remarkable support and friendship.
Finally, I wish to express my warmest thanks to my family: my parents, my
three sisters, and my lovely wife Annachiara. Their love and emotional support have
been crucial to overcoming everyday difficulties and facing life challenges. Among
them, however, I do recognize the particular virtues of my mother. Although she did
not directly contribute to this work, her love and care guided me throughout its
development. This book is for her.

Palermo, Italy Federico Cosenz


Contents

1 Performance Systems in Higher Education Institutions (1)
2 Developing Performance Management Systems in Higher Education Institutions (37)
3 Designing Dynamic Performance Management Systems in Higher Education Institutions (85)
4 University's "Third Mission" Assessment Through Outcome-Based Dynamic Performance Management (133)
5 Conclusions (169)
About the Author

Federico Cosenz is currently an Associate Professor of Business and Public Management at the University of Palermo, Italy. There, he teaches several master-level courses, such as performance management in the public sector and business strategy. He is a board member of the international doctoral program in "Model Based Public Planning, Policy Design, and Management" run at the Department of Political Sciences and International Relations.
Federico Cosenz held the role of university delegate for scientific support to
planning, management control, performance evaluation, and statistical reporting at
the University of Palermo from April 2020 to November 2021. In such a role, he
coordinated the working group for designing the Strategic Plan 2021–2023, as well
as the working group for the implementation of management control.
He has been a visiting scholar at the University of Bergen (Norway), the Radboud
University of Nijmegen (Netherlands), the Universidad Militar Nueva Granada of
Bogota (Colombia), and the IESE Business School of Barcelona (Spain).
Federico Cosenz has authored several articles in academic and professional journals. He has also presented his research contributions at numerous national and international
scientific conferences. In 2013, he won the “Best Paper” award at the AIDEA—
Italian Academy of Management annual conference, with a contribution titled
“Designing Performance Management Systems in Academic Institutions: a Dynamic
Performance Management View.”
His scientific and applied research interests include the following areas: perfor-
mance management and governance for public and private organizations, strategic
planning and management control, business model design for start-up companies
and SMEs, business model innovation also from a sustainability perspective, man-
agement information systems, and development of simulation models to support
strategic learning and decision-making. He also carries out training and scientific
support activities on the same topics through external collaborations (e.g., with the
Italian National Research Council—CNR).

Chapter 1
Performance Systems in Higher Education Institutions

1.1 Introduction

Global competitiveness and economic and social growth are driven worldwide by
knowledge and innovation (De Witte & López-Torres, 2017; Hughes & Kitson,
2012; Salmi, 2009). In this context, Higher Education Institutions (HEIs) play a
crucial role as they primarily contribute to knowledge transfer and development and,
as a result, foster regional development, employment, and economic wealth (Bianchi
& Caperchione, 2022; Deiaco et al., 2012; Shattock, 2010). They are seen as professional organizations whose academics perform one of the oldest professions in human history. The relevance of this role leads Universities to face major challenges nowadays, one of which is managing their performance from a sustainability perspective (Camilleri & Camilleri, 2018; Noordegraaf, 2015; Shin & Harman, 2009; Baldwin, 2009; Jongbloed et al., 2008).
The extant literature on the use of Performance Management (PM) mechanisms in the public sector suggests that a multiplicity of factors may affect how organizations manage and measure their performance and the degree of success they achieve (Franco-Santos et al., 2012; Ferreira & Otley, 2009). However, little research on PM applied to HEIs has been developed, since the use of these systems in Universities is relatively recent (Sousa et al., 2010). As a result, there is a clear need for further investigation into designing, implementing, and using PM systems tailored to the organizational features of Universities (Camilleri, 2021; Grossi et al., 2020; Agasisti, 2017; Bianchi, 2016).
Over the last 20 years, governments worldwide have undertaken several legal reforms that have deeply changed the way HEIs are run. These reforms of the HE sector mainly aim to promote more effective use of public resources and to improve the supply of academic services (Pollifroni, 2015; Rebora & Turri, 2011). For these purposes, the policies adopted have mainly been oriented to increasing the autonomy of Universities in decision-making and management.


Such reforms are basically driven by three phenomena that have highlighted the unsustainability of the outdated management systems used to steer academic institutions:
(1) The enduring effects of an economic crisis that governments are still facing, producing severe cuts in budgets and public expenditures, further worsened by the Covid-19 outbreak
(2) The rise of competitiveness—perceived at both national and international levels—in the HE sector, pushing Universities to move from a static and protected system to a dynamic and complex one made up of multiple players competing for resources, prestige, and reputation
(3) The increased complexity characterizing contemporary local and regional areas, where HEIs play a more proactive role by participating in collaborative governance structures with other stakeholders to foster socio-economic development and community well-being
In particular, the economic crisis has pushed governments to improve the allocation of investments across all public sectors (e.g., education, healthcare, infrastructure). This has involved a significant cut in financial resource transfers from central bodies to local authorities and has also delayed the enforcement of national development plans (Cepiku et al., 2016; Cepiku & Bonomi Savignon, 2012).
The critical condition of public finance has sped up the implementation of reforms. In this regard, given the increasing tightening of public funds, the reforms have aimed at allocating public funds according to a performance-based ranking among the institutions operating in the same sector. Such a mechanism—which simultaneously seeks to increase public organizations' performance—has thus implied a rise in competitiveness among Universities at the national level. Therefore, Universities now have to focus on the design and use of effective PM mechanisms to improve both the quality of the academic products/services they supply and the rationalization of their expenditures (Cappiello & Pedrini, 2017; Marginson & van der Wende, 2009; Adler & Harzing, 2008; Saravanamuthu & Tinker, 2002).
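As a rough, purely illustrative sketch of how such a performance-based allocation mechanism can work, consider the following Python fragment; the institutions, indicator weights, and scores are all invented, and real funding formulas are considerably more elaborate.

```python
# Illustrative sketch of performance-based allocation of public funds
# among universities. Institutions, indicator weights, and scores are
# hypothetical; real funding formulas are far more elaborate.

budget = 100_000_000  # total public funds to distribute (EUR)

# Composite score: weighted mix of normalized (0-1) indicators.
weights = {"research": 0.5, "teaching": 0.3, "third_mission": 0.2}
scores = {
    "University A": {"research": 0.8, "teaching": 0.6, "third_mission": 0.5},
    "University B": {"research": 0.5, "teaching": 0.9, "third_mission": 0.4},
    "University C": {"research": 0.6, "teaching": 0.5, "third_mission": 0.9},
}

composite = {
    uni: sum(weights[k] * value for k, value in indicators.items())
    for uni, indicators in scores.items()
}
total = sum(composite.values())

# Funds are shared in proportion to the composite score, so improving
# relative performance directly increases an institution's funding.
for uni, score in sorted(composite.items(), key=lambda item: -item[1]):
    print(f"{uni}: score = {score:.2f}, funds = {budget * score / total:,.0f} EUR")
```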
As for the second phenomenon, the growing competition among Universities has determined a "marketization" of HE and, as a result, Universities are now seen as "business-focused organizations" (Camilleri, 2019; Parker, 2011; Etzkowitz et al., 2000). Amaral and Magalhães (2002, p. 6) remark that "education is no longer seen as a social right; it has become a service." Students are seen as customers or clients, while Universities are viewed as service providers that want to meet their clients' needs and expectations (Meek, 2003). As far as University "marketization" is concerned, the metaphor of the "Ivory Tower" by Powell and Owen-Smith (1998) appears meaningful. According to this metaphor, "as [Universities] are gradually identified with commercial richness, they also lose their uniqueness in society. They are no longer seen as the ivory towers of intellectual activity and true thought, but rather as enterprises run by arrogant people aiming at capturing as much money and influence as possible." In such a context, the poor competitiveness of most Universities has become noticeable, particularly in comparison with world best practices (e.g., the USA, Great Britain, Finland).
Indeed, the rapid development of an HE "market" has highlighted several critical issues related to managing academic institutions, which, in most cases, were unprepared for the challenges introduced by a competitive environment (Neely, 1999; De Boer & Goedegebuure, 2001). This phenomenon concerns improving prestige and reputation, which in turn attract new investments in research and educational activities. In this respect, the past substantial investments of public resources in the HE sector have not resulted in an equivalent quality of Research and Teaching (Bleiklie, 2001).

In addition, the competitiveness of an HE system reflects the competitiveness of its own country. In fact, by focusing efforts on the interaction between Research, Education, and professional training, a national economic system may refine those assets and strengths that allow its "production systems" to compete with rival economies. In this respect, innovation, technology, and professional competencies are unanimously considered the only driving forces capable of facing global challenges in the long term, specifically in well-developed economies where competition is no longer based on the cost of inputs or economies of scale (Noordegraaf, 2015; Czarniawska & Genell, 2002).

Therefore, innovative changes within HE systems are leading academic decision-makers to question their current, outdated managerial systems to ensure survival over time (Guarini et al., 2020).

1.2 Mission, Governance, and Stakeholder Groups of Higher Education Institutions

Universities keep playing a key role in developing human capital for countries' social and economic development (Jalaliyoon & Taherdoost, 2012). Nowadays, more than ever, the role played by Universities across the world has become more prominent in the wide process of developing socio-economic settings at the local, national, and supranational levels (Grossi et al., 2020; Pollifroni, 2016; European Commission, 2003; Hölttä, 2000; Charles, 2003).

In recent years, Universities have been enjoying greater decisional autonomy. Still, they are also affected by pressures such as accountability demands, an unstable environment, and global competition, as well as by current trends like marketization and the changing roles of governments (Narayan et al., 2017; Deshmukh et al., 2010; Decramer et al., 2007, 2008; Johnes & Taylor, 1987).
HEIs are defined as multiproduct organizations that produce two main outputs, i.e., Research and Teaching, using multiple input factors. From a managerial perspective, they are intended as dynamically complex organizational systems whose value creation processes are characterized by a wide plurality of factors and variables interacting with each other to deliver public services/products related to knowledge transfer, professional training, and research development (Cosenz, 2014).
The UK National Committee of Inquiry into HE (1997) remarked that “the aim of
higher Education should be to sustain a learning society. The four main purposes
which make up this aim are:
• to inspire and enable individuals to develop their capabilities to the highest
potential levels throughout life, so that they grow intellectually, are well equipped
for work, can contribute effectively to society and achieve personal fulfillment;
• to increase knowledge and understanding for their own sake and to foster their
application to the benefit of the economy and society;
• to serve the needs of an adaptable, sustainable, knowledge-based economy at
local, regional and national levels;
• to play a major role in shaping a democratic, civilized, inclusive society.”
Therefore, the overall mission of HEIs is quite articulated. First, it refers to the supply of undergraduate, graduate, and continuing education learning opportunities that increase students' knowledge of the subject matter, strengthen their critical thinking, and prepare them to enter the labor market as skilled and knowledgeable actors (i.e., Education). Second, it aims at conducting sponsored and unsponsored research that advances and applies knowledge in the disciplines (i.e., Research). Third, Universities must commit to engaging with the surrounding community by producing impacts on their socio-economic context through Education and Research activities (i.e., the so-called Third Mission).

Examples of formal statements describing the three missions of Universities are reported in Table 1.1.

Table 1.1 Statements describing the three missions of HEIs

Education: Improving the quality and innovation of the educational offer with respect to the current needs of people, society, and the labor market, to foster the human and professional development of profiles able to compete in national and international contexts.

Research: Supporting basic and applied research activities, by promoting multidisciplinary perspectives, to enhance the discovery, development, use, and dissemination of scientific knowledge.

Third Mission: Promoting a proactive and farsighted role of the University in the socio-economic, cultural, and innovative development of its regional area, through the creation of qualified and enduring collaborations with the multiple social actors of the knowledge economy.
The HE organizational setting is characterized by goal diversity, uncertainty, and a fragmented decision-making structure, where the principles of profit-making and cost minimization are generally only partially taken into account (Lindsay, 1981). Due to this, HEIs must make a context of "limited economic resources" consistent with the attainment of their missions. This requires a strong commitment to meeting the increasing needs of knowledge transfer and development in terms of performance quality and scope.
Governance in HEIs refers to how they are institutionally organized and managed. In particular, the OECD (2006, p. 112) states that "governance is concerned with the determination of values inside Universities, their systems of decision-making and resource allocation, their mission and purposes, the patterns of authority and hierarchy, and the relationship of Universities as institutions to the different academic worlds within and the worlds of government, business and communities, without."
Academic governance includes all the structures, roles, mechanisms, and processes that are institutionally responsible for executing activities, management, and organization in the University, as well as its interaction with the outer context (Frost et al., 2016; Paradeise et al., 2009). It goes beyond the activities carried out by the rector and the academic boards, including governing rules and regulations, institutional roles, and the expectations and requirements of external and internal stakeholders. For this reason, the governance setting affects the University's organizational structures, decision-making processes, resource negotiation, and competition within and between faculties and departments (Moon et al., 2018; Christopher, 2012).
For instance, departments use standing faculty committees to review and recommend departmental changes to curricula, tenure and promotion, teaching effectiveness, and professional development. They also use ad hoc committees to search for new faculty, develop new programs, and conduct research projects. Departmental faculty approve the committee's recommendations before the chairperson forwards them to the college's dean for approval, who, in turn, forwards them to the administrative head and the rector for final approval.

Most matters regarding the research center go directly to the administrative chair or the rector's delegate for Research. The departmental technical/administrative staff members are represented by departmental representatives who serve on all major University-wide committees (Miller, 2007).
External and internal governance is concerned with how stakeholders or governing bodies plan and influence Universities (Melo et al., 2010). As remarked by Gillies (2011), the main academic stakeholders include:
(1) Students, who expect to receive a high-quality Education that complies with labor market expectations
(2) Organizations and institutions (private, public, nonprofit, etc.), which seek well-educated employees
(3) Academic staff (i.e., teaching and research staff, including Ph.D. students) and technical/administrative staff, who provide their services and, in turn, expect professional and financial rewards
(4) The scientific community, which is interested in research product development
There are also alumni, parents and families, the broader community, and its environment, among other stakeholders. Such an ecosystem does not just represent the community of the local area where the University is placed, but the broader global community, which may benefit from its products and is supposed to foster cultural and scientific promotion (de la Torre et al., 2019; Falqueto et al., 2020).
The creation of social and competitive value in HEIs is, on the one hand, equivalent to meeting stakeholders' needs for training and knowledge enhancement. Stakeholders are also directly or indirectly involved and interested in the growth of the socio-economic value of Universities, since these are deemed valuable investments for the community as a whole. On the other hand, knowledge creation, development, and transfer toward the community require Universities to express high professional competence to tackle the growing competition in the HE sector. The system of social groups in the academic ecosystem is quite complex because of the uniqueness of its organizational setting: several categories of key-players coexist in Universities, and their hierarchical order is quite difficult to understand compared to other public or private organizations (Moon et al., 2018).

1.3 Main Organizational Features of Higher Education Institutions

In their study on government-University relationships, Neave and Van Vught (1991) identify two different public University models. The first is a "state control model," in which the government directly rules and controls national Universities. In the second model—called the "state supervising model"—the government limits its decisional authority, merely supervising Universities' performance.
Regardless of the typology of government-University model, the strategic coordination between central and peripheral structures stands for one of the most significant criticalities in running Universities, especially as far as the allocation of financial and human resources is concerned. Figure 1.1 depicts the University's organizational structure, as characterized by Mintzberg (1979).

The strategic apex includes the central structures in charge of designing the strategy to adopt (i.e., rector, academic senate, board of directors). They establish strategic dialogue and coordination with the peripheral bodies (i.e., faculties, departments, laboratories, libraries), which act as middle-line units in applying the implemented strategies. Departments employ scholars, while faculties are often loosely organized around the scientific fields within the multiple heterogeneous disciplines that form educational curricula (Weick, 1976; Reponen, 1999). The department's faculty vote to determine who will serve as departmental chairperson. Rotating chairpersons are usually tenured, senior members of the department's faculty who serve fixed terms. The chairperson serves as the department's official spokesperson to the University community and regularly interacts and directly reports to the dean.
Both central and peripheral structures are supported by the work of administrative and technical staff. Academic professionals—such as scholars and lecturers—produce and deliver education and research outputs (e.g., publications, educational programs) in the operating sphere.

Fig. 1.1 Outlining the University's organizational structure
Based on their missions, Universities can be virtually divided into different
organizational areas encompassing the multiple interdependent processes leading
to value creation. The main organizational areas resulting from the University’s
mission are (1) education, (2) research (applied and scientific research), and
(3) administration. In addition, the so-called Third Mission is included since it
encompasses those outcomes emerging from Research and Education that have a
major impact on the socio-economic development of the region where the University
operates (e.g., new patents, academic spin-offs, unemployment reduction, etc.). All
the other operations and activities are intended to support these areas (e.g., career
guidance for students, job placement, etc.).
In particular, Education includes all those activities aimed at knowledge transfer
and developing professional competencies, teaching, and learning. Research relates
to the evolution of the multiple branches of knowledge. Its activities are conducted in
the departments, according to a logic of homogeneous clusters of disciplines. In
terms of research outcomes, it seems advisable to distinguish between scientific and applied Research: the former is oriented to formulating theories, sciences, and doctrines, whereas the latter aims at the practical application of such theories, and its outcomes are usually more tangible (e.g., the development of new drugs or design patents). Supporting activities include a range of additional auxiliary services to Education and Research. Finally, the administration is responsible for executing all the procedural duties and legal obligations underlying the provision of Education and Research products/services from an administrative viewpoint. Figure 1.2 summarizes the main organizational areas of Universities.

Fig. 1.2 Core organizational areas of HEIs
Therefore, HEIs are characterized by the coexistence of two staff categories that are called to collaborate, although often in conflict: research/teaching staff and technical/administrative staff. People working in University administration view the HEI as a bureaucratic organization that must pursue its goals according to predefined procedures and responsibilities (Villarreal, 2001). People in the scholarly community are predominantly focused on the outcomes achievable through Education and Research activities, with a limited propensity to be subjected to formal procedures and to adopt a balanced perspective between input consumption and output achievements (Dobija et al., 2019; Newton, 1992). Although these two coexisting staff categories are crucial for the organizational success of HEIs, they often complicate strategic planning, performance assessment, coordination among organizational units, and institutional innovation. This amplifies the need to adopt robust PM systems to foster strategic alignment and coordination between these groups (Bianchi, 2016; Guarini et al., 2020; Deshmukh et al., 2010).
Each organizational area is associated with one or more organizational structures. For instance, schools and faculties are the structures where educational activities are carried out, while Research is developed in departments. Both administrative and supporting activities involve multiple organizational units from the central to the peripheral system (e.g., rectorate, schools, faculties, departments), as they are intended to support education and research activities. Academic incubators and job placement offices are devoted to pursuing the Third Mission of HEIs. Table 1.2 associates the main organizational areas with related structures, inputs, processes, and outputs/outcomes.

Table 1.2 Main organizational areas, structures, inputs, processes, and outputs/outcomes of HEIs

Education. Structures: schools, faculties. Inputs: academic staff, students, facilities (e.g., classrooms). Processes: teaching, learning. Outputs/outcomes: graduations, employment rate.

Research. Structures: departments. Inputs: academic staff, research funds, facilities (e.g., laboratories, databases). Processes: researching, experimenting, developing. Outputs/outcomes: publications, patents, citations, other research products.

Administration. Structures: schools, faculties, departments, rectorate, administrative headquarters. Inputs: administrative staff, academic staff, students, facilities (e.g., technical equipment). Processes: administrative processes. Outputs/outcomes: administrative products and services.

Third Mission. Structures: academic incubators, job placement office. Inputs: academic staff, graduates, sponsors. Processes: developing, investing, networking, applying knowledge. Outputs/outcomes: start-up firms, spin-offs, employment rate, socio-economic development.
According to an "input-process-output" perspective (Talbot, 2007), Education is realized through teaching and learning activities that employ resources such as the academic staff, students, and facilities (e.g., classrooms and other teaching equipment). Its main output corresponds to graduated students. Research is aimed at producing publications, patents, and other research outputs (e.g., software, innovations, new drugs, and so on) through processes related to analyzing, researching, experimenting, and developing knowledge. It is conducted by the academic staff (including Ph.D. students) through research facilities (e.g., bibliographic databases, laboratories), often supported by private/public funding. The administration is carried out by the administrative staff with the intent to support students and academic staff. Administrative processes aim to produce administrative outputs, such as enrollment and graduation certifications, payslips, diplomas, learning agreements, etc. Finally, academic staff, graduated students, and sponsors contribute to performing Third Mission activities that result in new start-ups, increased employment, and socio-economic development.
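Read this way, the input-process-output view of Table 1.2 lends itself to simple indicator construction. The hypothetical Python sketch below (all figures and field names are invented) derives one output indicator and one outcome indicator for the Education area.

```python
# Hypothetical sketch of the "input-process-output" view for one
# organizational area (Education), with two elementary indicators
# derived from it. All figures and field names are invented.

from dataclasses import dataclass, field

@dataclass
class AreaPerformance:
    area: str
    inputs: dict = field(default_factory=dict)    # resources consumed
    outputs: dict = field(default_factory=dict)   # products delivered
    outcomes: dict = field(default_factory=dict)  # impacts generated

education = AreaPerformance(
    area="Education",
    inputs={"academic_staff": 600, "enrolled_students": 12_000},
    outputs={"graduates": 2_300},
    outcomes={"graduates_employed_1y": 1_600},
)

# Output indicator: graduates produced per member of academic staff.
productivity = education.outputs["graduates"] / education.inputs["academic_staff"]

# Outcome indicator: employment rate one year after graduation.
employment_rate = (education.outcomes["graduates_employed_1y"]
                   / education.outputs["graduates"])

print(f"graduates per staff member: {productivity:.2f}")
print(f"one-year employment rate: {employment_rate:.1%}")
```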

1.4 Interdependent Levels in Academic Strategy Formulation

Knowledge transfer and enhancement must be framed within a broader social system where Universities intersect their activities with other institutions and public/private services.
To meet education and research needs through an economic, social, and competitive public service provision, the decisions made by academic managers must abide by two basic management levels: (1) a strategic level focused on long-term planning and (2) a technical level aimed at turning strategic planning into decisions and related actions for daily management (Broadbent, 2007).
One of the most significant criticalities for strategy design in HEIs is the weak relationship between internal units (e.g., central vs. peripheral, administration vs. departments) and with other external institutions (e.g., municipal administration, public and private organizations). These relationships are based on strategic coordination to create value by providing an aggregated output to users and clients (Camilleri, 2021; Pitman, 2000). With the intent to meet education and research needs, Universities are called to interact with other institutions—generally autonomous and playing different social roles—to pursue a shared, strategic, and interinstitutional perspective for generating value in the local/regional ecosystem (Jongbloed et al., 2008; Charles, 2003; Amaral & Magalhães, 2002). Such a perspective must be based on joint learning and a strategic dialogue shared among the involved institutions, so as to understand the complexities of the social and competitive system they jointly manage and coordinate (Rajala et al., 2018; Bianchi et al., 2017; Cosenz, 2022).
Figure 1.3 summarizes the two main strategy formulation and implementation levels in the University setting. The political level defines the identity of the University in institutional terms, i.e., the hidden and invisible part of the strategic formula that underlies the concrete decisions made explicit in the visible strategic profile (Coda, 2010). Such an identity is the "reason for being" of the HEI and identifies its mission within the community. In other words, it represents the underlying strategic vision made up of guidelines, values, beliefs, and basic attitudes, which becomes rooted in key-players' behavior (James et al., 2020).

Fig. 1.3 Multiple levels for strategy formulation in Universities
The visible strategic profile—deriving from the underlying strategic vision—represents the executive formula outlined by the academic management at an operational level. It includes the multiplicity of decisions and actions affecting the organizational key-players in pursuing predefined goals (Broadbent, 2007). From a time-based perspective, its validity is more limited than that of the underlying strategic vision, as it changes depending on the academic performance assessment over time (Frost et al., 2016).
Strategy development aims to identify and understand the organizational system's opportunities, threats, and complexities, and to carry out timely performance adjustments through strategy reformulation processes that influence the factors driving performance. Hence, the capability to understand and manage the complexity characterizing the organizational setting of HEIs, and to envisage its possible evolutions, is a decisive factor in strategic learning and management in terms of management decisions to cope with future external events (Noordegraaf, 2015; Sporn, 1999; De Bruijn, 2002).
In particular, strategic decisions concern the definition of:
• Primary goals, or macro-objectives, to be achieved in the long term as a result of pursuing the institutional mission
• Objectives, or targets, which are defined as technical and operational expressions of the primary goals
• Strategic resources, objective-oriented coordination, processes and activities, and performance management tools
The first are set at a political level and, therefore, reflect the ability to perceive the community's needs and to put forward efficient solutions in terms of long-range planning. The other two are defined at an operational level and require specific managerial skills for combining political decisions with management choices by allocating and coordinating available resources (both human and instrumental) to achieve goals through actions (Dooris et al., 2004).
For instance, a major theme in many strategic plans is to improve academic programs and curricula. Each institution has its own perspective on what is relevant in terms of academic programs, and these statements usually reflect an institutionally driven viewpoint. One academic institution might aim to ensure that programs and curricula fit the educational needs of its student population. In contrast, another institution may be more interested in improving its curriculum by expanding its graduate and research programs. These are quite general objectives and might best be called strategic goals, themes, or even directions. However, the specific actions to improve academic programs could range from ensuring all educational programs offer an internship option for students looking for real-world experience to setting target enrollments for particular graduate programs. These types of action fit the definition of an objective more closely because they can be measured (Hinton, 2012; Broadbent, 2007).
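A minimal sketch of this goal-to-objective decomposition is given below; the goal, actions, measures, baselines, and targets are hypothetical and only illustrate the difference between a broad strategic goal and measurable objectives.

```python
# Hypothetical decomposition of a broad strategic goal into measurable
# objectives. Goal, actions, measures, baselines, and targets are invented.

strategic_goal = "Improve academic programs and curricula"

objectives = [
    {"action": "Offer an internship option in every educational program",
     "measure": "share of programs with an internship option",
     "baseline": 0.55, "target": 1.00},
    {"action": "Expand graduate enrollment in selected programs",
     "measure": "enrolled graduate students",
     "baseline": 1_200, "target": 1_500},
]

print(strategic_goal)
for obj in objectives:
    gap = obj["target"] - obj["baseline"]  # distance left to the target
    print(f"- {obj['action']}: {obj['measure']} "
          f"(baseline {obj['baseline']}, target {obj['target']}, gap {gap:+g})")
```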

1.5 Framing the Relationships Among Organizational, Institutional, and Political Systems

The adoption of a strategic perspective in analyzing the conditions for the success of HEIs may prove inapplicable or limited if the systemic relationships among academic strategy formulation, the institutional system, and the political system are not properly set. Likewise, each change occurring in the institutional system through juridical innovations must be related to strategic management principles. This means that organizational goals must be set according to both the institutional and political frameworks within which a University operates. The institutional system identifies the rules, procedures, roles, and behavioral models that academic actors must adhere to (Borgonovi, 1996; James et al., 2020).
Therefore, a strategic change in HEIs also involves a change in their institutional system (Noordegraaf, 2015; Osborne, 2002). The recent introduction of reforms in the HE sector—mainly oriented to establishing competitiveness and market philosophies in this system—implies significant changes in the approach to University management (Broucker et al., 2016).
In particular, as Borgonovi (1996) maintains, the institutional system of HEIs
includes:
• The stakeholders who are directly or indirectly involved in the multiple academic
activities (Education, Research, administration, technical support)
• The contributions—in terms of outputs—provided by these stakeholders in
executing their activities
• The rewards (economic, professional, etc.) received by the stakeholders
according to the contributions they provide
• The institutional structures and mechanisms through which the above elements
are kept in equilibrium
Decision-making processes of HEIs must take into account the relationships
among the organizational, institutional, and political systems, as illustrated in
Fig. 1.4.
According to the needs of the external socio-economic environment, these three systems are called to establish a strategic dialogue to jointly foster the decision-making processes of HEIs and other social actors, based on their specific responsibilities.
It appears worth remarking that political scientists are often inclined to include the institutional system within the political sphere, thereby neglecting the different roles played by these interconnected settings (Peters, 2019; Bryson et al., 2017; March & Olsen, 1984, 2005). In particular, the institutional system is expected to maintain an equilibrium between the formal powers and the set of rights/obligations related to the national and local communities. The political system should guarantee a proper balance between the values, interests, and expectations of multiple social groups. Finally, according to a sustainable development perspective, the organizational system is responsible for fostering an equilibrium between social needs and available resources, and between public service demand and supply (Borgonovi, 1996).

Fig. 1.4 Relationships among organizational, institutional, and political systems to foster strategic change processes in HEIs (adapted from Borgonovi, 1996, p. 105)
The reform movement has inspired the adoption of a strategic approach to public sector management called New Public Management, together with its complementary evolutionary paradigms, i.e., New Public Governance and Public Value Management. They are analyzed in the following sections.

1.6 Reforming Higher Education Institutions According to New Public Management Principles

Reforms applied to HE are deeply rooted in the New Public Management (NPM) paradigm (Hood, 1995; Osborne & Gaebler, 1992). The intensity of NPM development varied across countries but involved public sector organizations throughout the world (Pollitt, 2005; Guthrie & English, 1997; Christensen & Yoshimi, 2001; Holzer & Yang, 2004; Broadbent, 2007; Pavan et al., 2014).

NPM reflects a wide range of different managerial principles and ideologies that have been applied to public sector organizations, leading to the development of a variety of initiatives and technologies adopted to improve public services. Although NPM is not a straightforward concept, it generally relates to the idea that private sector practices, business philosophies, techniques, and entrepreneurial values can improve public sector performance (Hood, 1995). This perspective assumes that implementing private sector techniques in the public sector may lead to improved performance over time. In particular, the literature provides a set of pillars on which NPM is grounded (Ferlie et al., 2008; Osborne, 2006; Gruening, 2001; Pollitt & Bouckaert, 2011). They are:
(1) A focus on lesson-drawing from private sector management
(2) The presence of hands-on management and organizational distance between policy implementation and policymaking
(3) Input and output controls, evaluation, and performance management
(4) The disaggregation of public services to their most basic units, with a focus on their cost management and strategic coordination
(5) The growth of competition and contracts for resource allocation and service delivery within public services
(6) Treating service users as customers
(7) Transforming rigid bureaucratic institutions into more efficient and results-oriented public sector organizations
Encouraged by the emergence of the knowledge society, budgetary restrictions, economic crises, increased competition, and demographic evolutions (Dobbins et al., 2011; Powell & Snellman, 2004), national governments worldwide have been inspired by NPM principles in seeking new ways to steer the HE sector (De Boer & File, 2009). As a result, NPM principles have—to a large extent—been introduced in HE sectors all over the world. Nowadays, HEIs are considered organizations, rather than bureaucratic public entities, with the "enterprise" as an ideal model leading the direction of governance reforms (Kretek et al., 2013; Tahar & Boutellier, 2013). Therefore, NPM reforms have been introduced to transform state-dependent institutions into agile organizations where aspects such as identity, strategy, coordination, and economic rationality may coexist (Brunsson & Sahlin-Andersson, 2000). As a result, new University organizational models—such as the "Entrepreneurial University" (Clark, 1998; Etzkowitz et al., 2000) and the "adaptive University" (Sporn, 1999)—emerged.
The introduction of NPM principles into the HE sector has differed from country to country. An interesting cross-national comparison reporting distinctions and similarities in the adoption of NPM principles has been conducted by Broucker et al. (2016). In this study, the authors also analyze the main NPM characteristics implemented in HEIs and classify them into four broad areas (Table 1.3): (1) market-based reforms; (2) budgetary reforms; (3) autonomy, accountability, and performance; (4) new management style and new management techniques.

In particular, across the four areas, Marginson (2009) identifies an expansion of the role of private institutions, encouragement of commercial business activities in Research, the creation of competition for parcels of government-provided resources, growth in student fee-charging, output modeling, and corporatization reform. In a study promoted by the OECD, Hénard and Mitterle (2006) emphasize leadership principles, incentives, and competition between public sector agencies and private entities to enhance organizational outputs and the cost-efficiency of public services. In the same vein, Bleiklie and Michelsen (2013) stress budgetary constraints, the formalization of evaluation, hierarchization, and increased autonomy for institutions. Finally, Ferlie et al. (2008) highlight the stimulation of competition for students and funding and the encouragement of private sector providers; the development of real prices for student fees and research contracts; the development of audit and checking systems; and vertical steering with stronger and more overt managerialism.

Table 1.3 NPM areas in HEIs (Broucker et al., 2016)

Market-based reforms. Marginson (2009): role expansion of private institutions, encouragement of commercial activity, competition creation. Hénard and Mitterle (2006): competition between public agencies and private entities. Ferlie et al. (2008): competition for students and funding, market entrance encouragement and failure acceptability.

Budgetary reforms. Marginson (2009): growth in student fee-charging. Hénard and Mitterle (2006): financial incentives. Bleiklie and Michelsen (2013): budgetary constraints. Ferlie et al. (2008): value for money, development of real prices and introduction of higher student fees, hardening of soft budgetary constraints.

Autonomy, accountability, and performance. Marginson (2009): output modeling. Hénard and Mitterle (2006): incentives. Bleiklie and Michelsen (2013): formalization of evaluation, more autonomy. Ferlie et al. (2008): performance measurement and monitoring, audit and checking systems, vertical steering.

New management style and new management techniques. Marginson (2009): corporatization reform. Hénard and Mitterle (2006): leadership principles. Bleiklie and Michelsen (2013): hierarchization. Ferlie et al. (2008): development of strong executive and managerial roles, reduction in faculty representation, reduction of local government influence.

1.6.1 Critiques of NPM and Evolution Toward New Public Governance and Public Value Management

NPM became the dominant model inspiring the management and delivery of public services in the 1980s and 1990s. In short, based on concerns with previous government failures, it aimed to promote trust in the efficacy and efficiency of markets, a belief in economic rationality, and a push away from large, rigid, and centralized government authorities toward power devolution and privatization (Bryson et al., 2014).

However, many weaknesses have emerged following more than two decades of NPM experimentation. Failures of NPM applications can be traced back to a narrow scope of intervention, limited to specific managerial and organizational aspects of public action (Moynihan, 2006; Moynihan et al., 2011; Moore, 2014). Concurrently, new societal conditions and restrictions have emerged, thus amplifying the need for new paradigms focusing on how to govern—not just manage—in increasingly diverse and complex communities facing increasingly complex phenomena. For instance, these phenomena include failures of large parts of the economy, deepening inequality, historical distrust, a world pandemic outbreak, unevenly effective health care and Education systems, a stagnant middle class, and bankrupt communities (Pollitt & Bouckaert, 2011; Christensen & Laegreid, 2007; Kettl, 2002; Osborne, 2010).
These greater challenges gave rise to a paradigmatic evolution aimed at embrac-
ing a broader perspective to public administration development, thereby including
aspects related to public governance, partnerships, networks, active citizenship,
public services, and values (Bojang, 2020; Bryson et al., 2015; Chouinard & Milley,
2015; Denhardt & Denhardt, 2011; Osborne, 2010; Bozeman, 2007; Stoker, 2006).
In this debate, New Public Governance (NPG) and Public Value Management
(PVM) have emerged as approaches to overcome NPM shortcomings. As described
by Liddle (2018, p. 978), “PVM and NPG are part of a set of ideas arguing about the
extant shift from government to governance, and though not without critics, the key
contribution they aim at is an ambitious attempt to bring together literature from
governance, strategy, operations management and service industry to substantiate
the argument that the business of government is not just about delivering ‘products’
in a top-down, mechanistic, hierarchical, manufactured fashion, rather it is about
how intangible public goods are part of supply chain ‘processes’, and all the
linkages must be ‘adding public value’ at every juncture (Osborne et al., 2013).
For these authors, current public management theory is not fit for purpose, nor
indeed has it ever been. In their ‘public service dominant approach’, they argue that
current theory has two flaws (1) the focus on intra-organisational processes at a
time when the reality of public services delivery is inter-organisational, and (2) the
reliance on management theory derived from the experience of the manufacturing
sector and which ignores the reality of public services as “services”.”
In the same vein, Benington and Moore (2011) stress that public administration must be grounded in a set of values that drive its behaviors and collaborative networking in the conduct of its actions. A major role is given to involving citizens in
policymaking. According to this perspective, public values apply to multiple ele-
ments characterizing public administration, such as the contribution of the public
sector to the development of society, the relationship among public administration,
policy, citizens, and the environment, as well as the behavior of public sector
employees, shared powers, and the internal organizational aspects (Dickinson,
2016; Noordegraaf, 2015). In addition, Stoker (2006, p. 56) argues that “Public
value management does offer a new paradigm and a different narrative of reform. Its
strength lies in its redefinitions of how to meet the challenges of efficiency, account-
ability, and equity and in its ability to point to a motivational force that does not rely
on rules or incentives to drive public service reform. It rests on a fuller and rounder
vision of humanity than does either traditional public administration or new public
management.”
Table 1.4 summarizes the evolutionary pathway from NPM to PVM and NPG by
addressing the most relevant elements in this paradigm shift.
The NPG approach is relevant for framing Third Mission activities of Universities
as it supports a more organic and collaborative perspective for exploring coordina-
tion of academic stakeholders. These stakeholders interact in an ecosystem charac-
terized by a wider, more complex, and often nonlinear supply chain of public service
delivery (Douglas & Ansell, 2020; Bryson et al., 2017; Crosby et al., 2017; Osborne,
2010).
HEIs carry out Third Mission activities through social engagement, networking,
and collaboration with other stakeholders to generate sustainable community out-
comes (Pinto et al., 2016). This implies increasing complexity and new challenges
for University management. These activities are developed through a fragmented
value creation chain where multiple actors—entailing additional performance vari-
ables—intervene at political/strategic and operational levels (Noordegraaf, 2015).
Thus, NPG—as an approach to strengthening collaborative public governance and
inter-organizational coordination—is explicitly required to foster sustainable Third
Mission performance in the socio-economic context where the HEI operates (Doug-
las & Ansell, 2020; Broucker et al., 2017).

Table 1.4 Comparing NPM with PVM and NPG (adapted from O’Flynn, 2007, p. 361)

Dimension | New Public Management | Public Value Management and New Public Governance
Characterization | Post-bureaucratic, competitive government | Post-competitive, collaborative government
Dominant focus | Results | Relationships, collaboration, and networking
Managerial goals | Achieve agreed performance targets | Multiple goals, including responding to citizen/user preferences, renewing mandate and trust through quality services, and steering the network
Definition of the public interest | Individual preferences are aggregated | Collective preferences are expressed
Performance objective | Management of inputs and outputs to ensure economy and responsiveness to consumers | Multiple objectives are pursued, including service outputs, cohesion, coordination, satisfaction, outcomes, trust, and legitimacy
Dominant model of accountability | Upward accountability via performance contracts; outward to customers via market mechanisms | Multiple accountability systems, including citizens as overseers of government and co-creators, customers as users, and taxpayers as funders

1.6.2 University Performance-Based Ranking and Funding Systems

Based on NPM, NPG, and PVM principles, the rules concerning University public financing and ranking formulation have changed over the last 20 years and, as a result, have strongly modified the way Universities are managed. For example, in the
UK, Research Assessment Exercises (RAEs) have been used as a criterion for
allocating resources to Universities for over two decades (Ashton et al., 2009).
Similar procedures for measuring the quality of research output are now used in
several other countries (Dill & Soo, 2005).
Following UK’s pilot scheme, several changes have taken place all over Europe
and beyond, trying to harmonize the different academic systems to face education
globalization challenges aiming at achieving both higher competitiveness and an
increase in customer satisfaction (Aversano et al., 2018; Jongbloed & Vossensteyn,
2001; Dill & Soo, 2005). As a result, the external assessment and audit of teaching
and research performance are now well-established management routines in many
countries.
In particular, the adoption of these criteria has been oriented to a decentralization of power from the National Ministry of Research and Education toward Universities, by (1) enlarging their financial and decision-making autonomy, (2) pro-
moting accountability in internal and external communication processes, and
(3) making decision-makers aware of their responsibilities along the hierarchical
scale. This increase in decision-making autonomy and accountability has placed a
greater emphasis on designing performance measurement and management systems
in Universities (Mussari & Ruggiero, 2010). Therefore, decision-making autonomy
and performance measurement have been introduced as complementary aspects of a
radical reform-oriented process.
In particular, the increase in decisional autonomy has involved a significant
overhaul of the public University funding system. Public funding (i.e., transfers
from the Ministry of Research and Education to HEIs) represents the most important
source of financing for most public Universities. In this regard, reforms have aimed
to link the financial resource allocation system to the performance measurement of
each University. This mechanism rewards those institutions deemed “virtuous,” i.e., those whose overall performance is aligned with ministerial standards, by allocating them more public funds (Kivistö & Kohtamäki, 2016; Tiscini & Martiniello, 2011).
In the past, the public financing system distributed resources to Universities to accomplish a widespread “welfare state” task of ensuring a satisfactory and homogeneous performance level in educational activities. This financing system did not
consider academic institutions’ efficiency, effectiveness, and quality levels in pro-
viding educational services and research outputs to end users.
Nowadays, in most countries, Universities operate in a new context characterized
by strong competitiveness as a result of the new public financing system that
allocates resources by virtue of a performance-based ranking. In other words, each University’s performance is assessed yearly by the Ministry of Research and Education, which subsequently distributes the largest part of public funds to top-ranked
Universities. Formally, such a mechanism is based on a meritocratic principle of
resource allocation, and, at the same time, its application encourages a performance
alignment among all national academic institutions in terms of education quality,
research output, and management effectiveness (Jongbloed & Vossensteyn, 2001;
Kivistö & Kohtamäki, 2016).
Academic competitiveness is based on the performance level that each University can reach and on the resulting capability to obtain more funds (i.e., the so-called performance-based funding system). This means that the adoption of a rewarding system aims at putting public Universities in competition not only to obtain financial resources but also to improve the performance of educational services toward end users, which forms the ground on which international academic rankings are drawn up (Francesconi & Guarini, 2018; Keenoy & Reed, 2008).
The Ministry of Research and Education measures academic performance
through indicators that consider Research and Education activities and other critical
processes, e.g., internationalization, managing strategic resources, and funding by
external financing bodies.
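
To make this mechanism concrete, the sketch below shows how a composite performance score could drive a proportional allocation of a national fund. It is a minimal illustration under stated assumptions: the indicator names, weights, and figures are hypothetical and do not reproduce any actual ministerial formula.

```python
# Illustrative sketch of a performance-based funding allocation.
# Indicator names, weights, and data are hypothetical assumptions,
# not an actual ministerial formula.

WEIGHTS = {
    "research_quality": 0.40,      # e.g., research assessment score
    "teaching_quality": 0.30,      # e.g., credits gained, graduation rates
    "internationalization": 0.15,
    "external_funding": 0.15,      # funds raised from external bodies
}

def composite_score(indicators: dict[str, float]) -> float:
    """Weighted sum of indicator values normalized to the 0-1 range."""
    return sum(WEIGHTS[name] * value for name, value in indicators.items())

def allocate_fund(total_fund: float,
                  universities: dict[str, dict[str, float]]) -> dict[str, float]:
    """Distribute the fund in proportion to each University's composite score."""
    scores = {u: composite_score(ind) for u, ind in universities.items()}
    total = sum(scores.values())
    return {u: total_fund * s / total for u, s in scores.items()}

data = {
    "Univ A": {"research_quality": 0.8, "teaching_quality": 0.7,
               "internationalization": 0.5, "external_funding": 0.6},
    "Univ B": {"research_quality": 0.6, "teaching_quality": 0.9,
               "internationalization": 0.4, "external_funding": 0.3},
}
print(allocate_fund(100_000_000, data))
```

Because each share depends on relative scores, any indicator a University can improve cheaply raises its share at the expense of the others, which anticipates the gaming risk discussed in the next section.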
1.6.3 Critical Issues on the Performance-Based Ranking and Funding Systems: The Need for Introducing Performance Management in Higher Education Institutions

University performance indicators adopted by an external evaluator and funder, such as the Ministry of Research and Education, are based on “macro” measures. They provide limited information, making efforts to understand and diagnose academic performance highly ambiguous and partial. With the intent to generate incentives and competition for funding and international prestige among Universities, the Ministry of Research and Education essentially focuses on individual financial measures and other isolated statistical data as surrogates of “good performance”
(Jongbloed & Vossensteyn, 2001; Kivistö & Kohtamäki, 2016). For instance,
indicators to measure teaching quality often refer to the credits gained by students through the exams passed within their curriculum of study. However, an unintended result of the exclusive use of this measure as an indicator of “good performance” in teaching is that universities may adopt loose student evaluation schemes to get more funds and improve their ranking (Cosenz, 2014; Cosenz & Bianchi, 2013). Though in the short term this policy might work in terms of additional cash flows, in the long run it might compromise both educational quality and the University’s reputation.
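
A toy two-scenario calculation, with entirely hypothetical numbers, illustrates this trade-off between short-term cash flows and educational quality:

```python
# Toy illustration (hypothetical numbers) of the gaming risk described
# above: if funding is tied to credits earned, loosening exam standards
# raises short-term funds while a quality proxy deteriorates.

def yearly_funds(credits_earned: int, rate_per_credit: float = 10.0) -> float:
    """Funds received when allocation is proportional to credits earned."""
    return credits_earned * rate_per_credit

scenarios = {
    "strict evaluation": {"credits": 50_000, "quality_proxy": 0.90},
    "loose evaluation":  {"credits": 60_000, "quality_proxy": 0.75},
}

for label, s in scenarios.items():
    print(f"{label}: funds = {yearly_funds(s['credits']):,.0f}, "
          f"quality proxy = {s['quality_proxy']}")
```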
Ministerial parameters focus on outputs rather than outcome measures (Ammons,
2001, pp. 12–14) and related value creation processes. Such a myopic and bounded view may result in a simplistic performance assessment, leading to dysfunctional or misguided short-term evaluations when observed from a sustainable development perspective.
Potential risks of inconsistency in ministerial assessment may regard the follow-
ing issues (Cosenz, 2014):
• Allocating more funds to Universities that have shown a better performance is
likely to weaken the competitiveness of other Universities. Consequently, it may
enlarge the imbalance in the quality of the latter’s academic activities compared
to the former. Though this rule can be accepted as a principle to encourage
improved performance, it might be questioned if one considers that “knowledge”
is a public good.
• The outcome indicators, used by the Ministry of Research and Education to
measure the ratio between the quality of training and the employment rate of
graduates from each University, do not take into account the contextual socio-economic conditions of the regional areas where Universities are located; this may widen socio-economic imbalances in regional development.
• The ministerial effort to increase competitiveness in the HE sector, thereby
leading Universities toward higher performance levels in education and research
activities, should be complemented by a parallel action aimed to promote the
streamlining of both bureaucratic procedures and supporting activities carried out
by administrative back-office units.
• The ministerial performance measurement system mainly focuses on the short
term, and, therefore, it may not be consistent with the broader goals of sustainable
academic development.
• Scholars have to shift their research fields and methods to better meet the criteria of excellence set by the ministerial performance measurement system (Harley &
Lee, 1997).
• This measurement system negatively affects departmental research collaboration
due to the individual focus of research performance evaluations, damaging
innovation and creativity in Research (Martin & Whitley, 2010).
Even though the above issues reveal a limited and incomplete assessment frame-
work of academic performance by the Ministry of Research and Education, the
design of effective PM systems cannot overlook ministerial guidelines and criteria.
In fact, excluding ministerial parameters from the set of performance measures
adopted by Universities runs the risk of diverting academic decision-makers’ atten-
tion from those measures that may lead to increased funding from the State.
Although their rationale and usefulness are at the core of a heated debate, both
national and international rankings of Universities play an increasingly important
role in the management mechanisms of most Universities (Brooks, 2005; Dill & Soo,
2005; Adler & Harzing, 2008). The importance of such rankings has increased
mainly due to higher competition between Universities, both nationally and inter-
nationally, as well as the more general internationalization of the HE sector. As a
result, considerable attention is now given to the performance of Universities and the
way to manage it.

1.7 Designing Performance Management Systems in Public Sector Institutions

Performance Management (PM) is an established aspect of public sector management nowadays, with journals producing special issues on the subject (Borst et al.,
2014; Ferlie & Steane, 2002). Inspired by NPM, NPG, and PVM paradigms
(Broucker et al., 2017), the use of PM in the public sector aims at progressively
moving away from the traditional bureaucratic approach to introduce managerial
mechanisms for tracking the outputs and outcomes of service delivery (De Bruijn,
2002; Hoque, 2008; Johnson, 2005). When NPM was first introduced, efforts to assess public sector performance focused on evaluating value for money, usually conducted by external auditors. Progressively, the focus shifted toward identifying examples of good and poor resource usage through the adoption of performance measures and indicators (Talbot, 1999). Since the mid-1980s, increasing
attention has been given to the introduction of PM in public sector organizations
seen as “an integrated set of planning and review procedures which cascade down
through the organization to provide a link between each individual and the overall
strategy of the organization” (Rogers, 1990, p. 16).
Recently, the adoption of PM systems in the public sector has been associated with the establishment of standards to be achieved (Ammons & Rivenbark, 2008; Jääskeläinen & Laihonen, 2014), mostly related to resource consumption, with few attempts at measuring outputs and a focus on single organizational units, mainly in front-office positions. Evaluation results against set criteria are generally published in social reports and used as a vehicle for external accountability of the organization’s performance. However, these social reports often turn out to be mere formal fulfillments of a legal obligation. This is because of the huge amount of data, which makes understanding quite complex; a lack of focus on different stakeholder expectations toward performance disclosure; and a persistent bureaucratic approach that hides relevant data rather than making them explicit (Boland & Fowler, 2000).
Neely et al. (2002) define a PM system as a balanced and dynamic system that supports the decision-making process by gathering, elaborating, and analyzing information.
Such a system adopts different measures and perspectives to provide a holistic view
of the organization and its values (Sorci, 2007). According to Kueng (2000), it serves
as a management information system that (1) gathers relevant performance data
through a set of indicators, (2) compares the current values against historical or
planned values, and (3) disseminates the results to the process actors and managers.
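
Kueng’s three functions can be sketched as a minimal data structure; everything below (indicator names, values, the variance rule) is an illustrative assumption rather than part of the cited framework.

```python
# Minimal sketch of the three functions of a performance measurement
# information system (Kueng, 2000): (1) gather indicator data,
# (2) compare current against planned values, (3) disseminate results.
# Indicator names and figures are illustrative assumptions.

from dataclasses import dataclass, field

@dataclass
class Indicator:
    name: str
    current: float
    planned: float
    history: list[float] = field(default_factory=list)

    def variance(self) -> float:
        """(2) Compare: deviation of the current value from plan."""
        return self.current - self.planned

def gather() -> list[Indicator]:
    """(1) Gather: in practice, data pulled from operational systems."""
    return [
        Indicator("graduates_per_year", current=950, planned=1000,
                  history=[880, 910]),
        Indicator("avg_time_to_degree_years", current=4.6, planned=4.0,
                  history=[4.9, 4.7]),
    ]

def disseminate(indicators: list[Indicator]) -> None:
    """(3) Disseminate: report results to process actors and managers."""
    for ind in indicators:
        print(f"{ind.name}: current={ind.current}, plan={ind.planned}, "
              f"variance={ind.variance():+.1f}")

disseminate(gather())
```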
PM includes both strategic planning and performance measurement according to
a circular logic underlying policymaking and decision-making processes, as well as
organizational coordination with the intent to improve public service delivery
sustainably (Van Dooren et al., 2010; Verbeeten, 2008). Such logic is synthesized in Fig. 1.5, where «inputs-throughputs-outputs-outcomes» are placed in a framework of general and operational needs, as well as embedded policy objectives associated with the economy and the environment (Bouckaert & Halligan, 2008).
Namely, policies are defined according to the needs of a given community. The
objectives set into policies are characterized by a qualitative approach that implies
the pursuit of long-term goals. Then, policies are translated into operational objec-
tives that quantify the goals to be reached in the short term. Based on these
objectives, each institution operating in the same sector allocates the available
resources—also affected by the region’s economy—to the different organizational
units responsible for specific processes and underlying activities leading to outputs.
Value creation processes require efforts to manage and coordinate the multiple
resources (e.g., human and instrumental) and related consumption to pursue the
organization’s sustainable development (Bianchi, 2010). Creating value allows
public organizations to better respond to community needs and feed up their resource
endowment over time. In the long term, the aggregated contribution in terms of
outputs offered by the different organizations involved in the same value creation
process produces a set of outcomes generating impacts on both the environment and
the economy. If timely and adequately measured through sound performance indi-
cators, outputs may influence the redefinition of operational objectives. In turn,
outcomes should be considered for fostering policymaking. Therefore, outcomes
1.7 Designing Performance Management Systems in Public Sector Institutions 23

Fig. 1.5 Performance management logics in a public context (adapted from Bouckaert & Halligan,
2008, p. 33)

are here intended as final organizational results that cross the institutional boundaries
of the single organization (Bouckaert & Halligan, 2008).
Performance is relevant for managing single organizations, substantive policy
areas, and the macro governance settings of countries (Bryson et al., 2014). The
main focus of PM is on linking resources with processes/activities and outputs, and on capturing the aggregated contributions of outputs to the outcomes of public organizations and their policies in a given local or regional area.
In addition, this framework implies the definition of multiple performance mea-
sures that capture the value generated by public institutions, focusing on the different
elements within their organizational boundaries. As such, the capability to translate
community needs into policy objectives is measured in terms of relevance. Similarly,
measures of operational relevance are used to evaluate the alignment between
policies and operational objectives (i.e., qualitative vs. quantitative goals). Effi-
ciency compares results with the strategic resources used to achieve them. It can
be measured internally, i.e., outputs vs. resources, and externally, i.e.,
outcomes vs. resources. The aggregated contribution in terms of outputs is assessed
through outcome-based indicators (Bianchi et al., 2017). Results associated with
objectives generate effectiveness measures, which refer to a single organization (outputs vs. operational objectives → internal effectiveness) as well as to the overall sector (outcomes vs. policy objectives → external effectiveness).
Utility measures aim to capture the impact produced by outcomes for fulfilling
community needs.
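
Under the simplifying assumption that outputs, outcomes, resources, and objectives can each be proxied by a single number, these measure families reduce to ratios, as in the following sketch (real systems would use vectors of indicators per dimension):

```python
# Sketch of the measure families described above, using scalar proxies;
# the single-number inputs are a simplifying assumption.

def internal_efficiency(outputs: float, resources: float) -> float:
    """Outputs vs. resources (e.g., graduates per unit of budget)."""
    return outputs / resources

def external_efficiency(outcomes: float, resources: float) -> float:
    """Outcomes vs. resources."""
    return outcomes / resources

def internal_effectiveness(outputs: float, operational_objective: float) -> float:
    """Outputs vs. operational objectives (single organization)."""
    return outputs / operational_objective

def external_effectiveness(outcomes: float, policy_objective: float) -> float:
    """Outcomes vs. policy objectives (overall sector)."""
    return outcomes / policy_objective

# Example: 950 graduates against an operational target of 1,000
print(internal_effectiveness(950, 1000))  # 0.95
```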
At each level (political, operational, unit level), responsibility for performance
may directly determine who is accountable for those results (Bouckaert & Halligan,
2008).
In comparison to private sector organizations, managing performance in public
institutions is more challenging due to additional critical issues.
These criticalities characterize public service delivery toward the community,
e.g., weak interaction between different institutions involved in the same value
creation process and the need to be more accountable. They also affect the strategy
design process encompassing multiple key-actors (e.g., policymakers, decision-
makers, public managers, unit managers). Namely, as maintained by Smith (1993),
public sector management is conventionally affected by the following phenomena:
(1) Tunnel vision: PM systems are centered on particular areas to exclude others.
(2) Sub-optimization: public managers focus on their own narrow objectives, to the detriment of broader strategic coordination among the different organizational units (or institutions) interacting to reach common goals.
(3) Myopia: strategies are designed and implemented to produce their effects in the
short term, and do not consider possible long-term undesired effects.
(4) Convergence: an emphasis on not being exposed as an outlier on any perfor-
mance indicator rather than a desire to be outstanding.
(5) Ossification: cultural resistance to changes resulting in a disinclination to exper-
iment with innovative approaches.
(6) Gaming: defining unchallenging objectives to obtain temporary advantages and
incentives without effort.
(7) Misrepresentation: performance reports are drawn up showing useless or even
counterfeit results.
(8) Data manipulation: information on organizational results is misrepresented or
altered to report an erroneous perspective about performance and, possibly,
achieve undeserved consensus or rewards.
The use of PM systems in the public sector also requires increased promptness and flexibility in responding to external changes (e.g., social, economic, and jurisdictional changes) by
adopting a systemic view involving different governance levels (Bianchi, 2015;
Bianchi et al., 2017). This is essential to tackle the adverse effects emerging from
the so-called wicked problems (Head & Alford, 2015). These problems are issues that are difficult to detect and manage, related to social pluralism (multiple interests and
values of stakeholders), institutional complexity (the context of inter-organizational
cooperation and multilevel governance), and scientific uncertainty (fragmentation
and gaps in reliable knowledge).
To this end, performance measures must focus on outputs and outcomes to
support each key-actor in finding better inter-institutional coordination and improv-
ing value creation processes. This focus allows them to adopt sustainable integrated
strategies that may effectively rationalize resource allocation and consumption,
ensuring higher public service provision standards. Such measures cannot be
based exclusively on a financial dimension of performance. Instead, monetary and
nonmonetary indicators must coexist within an inclusive PM framework tailored to
public sector characteristics. In other words, a systemic perspective of PM is
required to enhance the strategic learning processes of public managers through
effective diagnostic tools able to track down the causes of deviations between actual and expected results (Bianchi, 2012, 2016).
Traditional PM frameworks—predominantly based on financial measures alone—often lack a systemic perspective and are too static to frame value creation processes and manage public sector organizations in contexts characterized by complexity and dynamism (Bianchi, 2012; Cosenz, 2014; Sloper et al., 1999; Linard, 1996). Therefore, additional methodological support is needed to allow PM schemes to overcome these shortcomings.

1.8 Focusing on Performance Management Systems in Higher Education Institutions

Over the last 20 years, HEIs have faced increasing pressure to improve their performance and accountability. This pressure has mainly been driven by budget shortages, information technology implementation, and external requirements for improved efficiency and effectiveness (Chae & Poole, 2005). To cope with this pressure and related
expectations, HEIs should adopt PM mechanisms to improve academic results in
terms of efficiency, effectiveness, and stakeholder satisfaction. The conventional
tools for managing, controlling, and monitoring performance—mainly based on a
bureaucratic approach—appear outdated and require innovative methods for man-
aging and measuring academic performance. As a result, Universities started a
renewal process to design and implement PM systems to address this concern and
tackle their poor performance (Aversano et al., 2018; Deshmukh et al., 2010;
Cosenz, 2013).
In particular, research focusing on PM applied to the HE sector is classified into two main streams: the first adopts a “macro” perspective aimed at designing PM systems according to a ministerial viewpoint. This perspective suggests reviewing
academic performance to comply with the ministerial criteria for public fund allo-
cation. The second one uses a “micro” perspective looking at the results emerging
from internal processes and practices according to an organizational viewpoint
(Franco-Santos et al., 2014). Both perspectives are relevant to assessing academic
performance. While the “macro” perspective focuses on an overall and synthetic
performance assessment, the “micro” one further analyzes those internal mecha-
nisms underlying decision-making and performance measurement processes. As such, it is worth using a PM approach that includes both perspectives in a unique framework able to capture the multiple interdependencies between the key elements underlying academic value creation processes (Cosenz, 2014).

Fig. 1.6 Applying performance management logics to education

For this purpose, the model suggested by Bouckaert and Halligan (2008),
depicted in Fig. 1.5, can be applied at different levels (e.g., strategic, institutional,
or operational level). This model is based on systems theory and provides a holistic
perspective for framing academic performance since it acknowledges the existence
of a closed loop between the following actions: policymaking, strategy design,
performance measuring, formulation of corrective actions, and outcome response
(Melo et al., 2010; Boland & Fowler, 2000).
This model has been calibrated to the different institutional and organizational features of Education and Research (see Figs. 1.6 and 1.7, respectively).
In these frameworks, the HE system is framed as a process to transform inputs
(e.g., enrolled students, academic staff, facilities, research funds) into outputs (e.g.,
graduated students, publications, patents) by carrying out specific activities, such as
teaching and learning, researching, experimenting, and developing.
While education focuses on the value added offered through the execution of
training and educational programs corresponding to any increment in the knowledge
of students, research takes account of any increase in knowledge generated by the
HEI, in the form of publications or patents, to provide an example (Cave & Hanney,
1992; Melo et al., 2010). In the long term, this process produces an impact on the
broader socio-economic contexts. Such an impact relates to the outcomes intended
as the long-term results of a University, which include—for instance—improving
the regional economy, reducing unemployment, and starting new academic spinoffs (Boland & Fowler, 2000).

Fig. 1.7 Applying performance management logics to research

In this context, the design of PM systems should include a wider range of
indicators compared to the restricted range of ministerial parameters set according
to a “macro” perspective. Based on these indicators, academic decision-makers may
evaluate the progress resulting from the adoption of a given strategy, as well as
emerging problems that require proper analysis/diagnosis and reaction (Bouckaert &
Halligan, 2008). This implies that Universities need a systemic and selective
approach in identifying a balanced mix of indicators to support strategy design/
implementation and performance measurement (Boland & Fowler, 2000).
For instance, sound performance indicators that Universities could use to assess
their organizational results may refer to:
• Quality of education, research, administration, and supporting activities
(Dearlove, 1998). On the one hand, quality should be measured by comparing
the delivered “products/services” with end user’s expectations (e.g., availability
and professionalism of front-office workers, exhaustiveness of teaching contents,
relevance of publications, size of classrooms). On the other hand, quality should
be assessed by considering the efficiency level reached in administrative
processes (e.g., mistakes in handling the workload, waste of consumption mate-
rials, equipment breakdowns).
• Time, referred to both end user’s expectations on academic service provision
(e.g., average waiting time in University administrative offices, delays in class
schedules, delays in updating curricula) and production processes related to
efficiency levels (e.g., time to fulfill administrative procedures, waiting time for
payments and reimbursement, salary payment delays).
• Productivity, considered as the ratio between achieved outputs or outcomes and resource consumption over time (e.g., the average number of publications per research staff member or department, per year); a minimal computation is sketched after this list.
• Flexibility, highlighting the organization’s ability to quickly adapt to external changes with a minimum waste of resources (e.g., the average time to implement new administrative procedures, educational programs and syllabi, or student assessment systems).
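
As an example of how two of these indicators could be operationalized, the following minimal sketch computes the productivity and flexibility measures mentioned above; the function names and figures are illustrative assumptions.

```python
# Sketch of two indicator computations from the list above.
# Function names and data are illustrative assumptions.

from statistics import mean

def productivity(publications: int, research_staff: int, years: int = 1) -> float:
    """Average number of publications per research staff member per year."""
    return publications / (research_staff * years)

def flexibility(days_to_implement: list[float]) -> float:
    """Average time (in days) to implement new procedures or programs;
    lower values indicate a more flexible organization."""
    return mean(days_to_implement)

print(productivity(publications=240, research_staff=80))  # 3.0 per person-year
print(flexibility([120.0, 90.0, 150.0]))                  # 120.0 days
```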
Designing a PM system entails not only a nominal definition of expected results
but also their measurement through adequate indicators (Propper & Wilson, 2003).
This system may represent a fundamental tool to support decision-makers in Uni-
versity management (Cave et al., 1997; Neely et al., 2004). It also acts as a
coordinating mechanism for supporting organizational units to better interact with
other units located at different hierarchical levels (Alford & Yates, 2014).
Therefore, PM is an integral part of a wider strategic management activity to
reach sustainable development in the academic service delivery (what value is
offered) and its underlying processes (how value is created). High-quality academic performance and sustainable development cannot be conceived merely as the outcome of legislative reforms. Instead, their achievement depends on the regular use of
strategic PM tools tailored to the needs of academic institutions and their specific
organizational attributes. This means that performance assessment must be oriented
to support the enhancement of those critical success factors creating value in
academic activities (Van de Walle & Van Dooren, 2010).

1.9 Closing Remarks

Remarks made in this chapter underline the basic organizational features of HEIs and
the new institutional and political conditions in which they operate nowadays. This
analysis identifies a fast-changing and complex ecosystem formed by multiple
actors, values, needs, and conditions that must be considered in fulfilling the
institutional missions of HEIs and their underlying management philosophies. Thus, the significant changes taking place worldwide entail a remarkable evolution of the traditional organizational, operational, institutional, and governance patterns of managing HEIs.
Under these conditions, the design and adoption of more effective PM mecha-
nisms in Universities are gaining increasing interest among public management
scholars and practitioners. Inspired by NPM, NPG, and PVM paradigms, these
mechanisms need to be calibrated to the specific organizational features of Univer-
sities, including both a “macro” and a “micro” perspective of analysis.
To this end, moving from the framework by Bouckaert and Halligan (2008), the
next chapter will explore how to develop PM systems applied to HEIs for managing
the challenging complexity of these specific public sector organizations.

References

Adler, N. J., & Harzing, A. W. (2008). When knowledge wins: Transcending the sense and
nonsense of academic rankings. Academy of Management Learning and Education, 8(1), 1–24.
Agasisti, T. (2017). Management of Higher Education Institutions and the evaluation of their
efficiency and performance. Tertiary Education and Management, 23(3), 187–190.
Alford, J., & Yates, S. (2014). Mapping public value processes. International Journal of Public
Sector Management, 27(4), 334–352.
Amaral, A., & Magalhães, A. (2002). The emergent role of external stakeholders in European
higher education governance. In A. Amaral, V. L. Meek, & I. M. Larsen (Eds.), Governing
higher education: National Perspectives on institutional governance (pp. 1–21). Kluwer Aca-
demic Publishers.
Ammons, D. N. (2001). Municipal benchmarks. Sage.
Ammons, D. N., & Rivenbark, W. C. (2008). Factors influencing the use of performance data to
improve municipal services: Evidence from the North Carolina benchmarking process. Public
Administration Review, 68(2), 304–318.
Ashton, D., Beattie, V., Broadbent, J., Brooks, C., Draper, P., Ezzamel, M., Gwilliam, D.,
Hodgkinson, R., Hoskin, K., Pope, P., & Stark, A. (2009). British research in accounting and
finance (2001–2007): The 2008 research assessment exercise. British Accounting Review, 41(4),
199–207.
Aversano, N., Manes, R. F., & Tartaglia Polcini, P. (2018). Performance measurement systems in
Universities: A critical review of the Italian system. In E. Borgonovi, E. Anessi-Pessina, &
C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 269–288).
Springer.
Baldwin, J. F. (2009). Current challenges in higher education administration and management.
Perspectives: Policy and Practice in Higher Education, 13(4), 93–97.
Benington, J., & Moore, M. H. (2011). Public value in complex and changing times. In J. Benington
& M. H. Moore (Eds.), Public value: Theory and practice (pp. 1–30). Palgrave Macmillan.
Bianchi, C. (2010). Improving performance and fostering accountability in the public sector
through system dynamics modelling: From an ‘external’ to an ‘internal’ perspective. Systems
Research and Behavioral Science, 27(4), 361–384.
Bianchi, C. (2012). Enhancing performance management and sustainable organizational growth
through system dynamics modeling. In S. N. Groesser & R. Zeier (Eds.), Systemic management
for intelligent organizations: Concepts, model-based approaches, and applications
(pp. 143–161). Springer.
Bianchi, C. (2015). Enhancing joined-up government and outcome-based performance manage-
ment through system dynamics modelling to deal with wicked problems: The case of societal
ageing. Systems Research and Behavioral Science, 32, 502–505.
Bianchi, C. (2016). Dynamic performance management. Springer.
Bianchi, C., & Caperchione, E. (2022). Performance management and governance in public
universities: Challenges and opportunities. In E. Caperchione & C. Bianchi (Eds.), Governance
and performance Management in Public Universities (pp. 1–14). Springer.
Bianchi, C., Bovaird, T., & Loeffler, E. (2017). Applying a dynamic performance management
framework to wicked issues: How coproduction helps to transform young People’s Services in
Surrey County Council, UK. International Journal of Public Administration, 40(10), 833–846.
Bleiklie, I. (2001). Towards European convergence of higher education policy? Higher Education
Management, 13(3), 9–29.
Bleiklie, I., & Michelsen, S. (2013). Comparing HE policies in Europe. Structures and reform
outputs in eight countries. Higher Education, 65, 113–133.
Bojang, M. B. S. (2020). Beyond new public management paradigm: The public value paradigm
and its implications for public sector managers. Journal of Public Value and Administrative
Insight, 3(2), 1–10.
Boland, T., & Fowler, A. (2000). A systems perspective of performance management in public
sector organisations. International Journal of Public Sector Management, 13(5), 417–446.
Borgonovi, E. (1996). Principi e sistemi aziendali per le amministrazioni pubbliche. Etas.
Borst, R., Lako, C., & de Vries, M. (2014). Is performance measurement applicable in the public
sector? A comparative study of attitudes among Dutch officials. International Journal of Public
Administration, 37(13), 922–931.
Bouckaert, G., & Halligan, J. (2008). Managing performance. International comparison.
Routledge.
Bozeman, B. (2007). Public values and public interest: Counterbalancing economic individualism.
Georgetown University Press.
Broadbent, J. (2007). If you Can’t measure it, how can you manage it? Management and gover-
nance in higher educational institutions. Public Money & Management, 27(3), 193–198.
Brooks, R. L. (2005). Measuring university quality. The Review of Higher Education, 29(1), 1–21.
Broucker, B., De Wit, K., & Leisyte, L. (2016). Higher education system reform: A systematic
comparison of ten countries from a new public management perspective. In R. M. O. Pritchard,
A. Pausits, & J. Williams (Eds.), Positioning higher education institutions. From here to there
(pp. 19–40). Sense Publishers.
Broucker, B., De Wit, K., & Verhoeven, J. C. (2017). Higher education research: Looking beyond
new public management, theory and method in higher education research (Vol. 3, pp. 21–38).
Emerald Publishing Limited.
Brunsson, N., & Sahlin-Andersson, K. (2000). Constructing organisations: The example of public
sector reform. Organization Studies, 21(4), 721–746.
Bryson, J. M., Crosby, B. C., & Bloomberg, L. (2014). Public value governance: Moving beyond
traditional public administration and the new public management. Public Administration
Review, 74(4), 445–456.
Bryson, J. M., Crosby, B. C., & Bloomberg, L. (2015). Public value and public administration.
Georgetown University Press.
Bryson, J. M., Sancino, A., Benington, J., & Sørensen, E. (2017). Towards a multi-actor theory of
public value co-creation. Public Management Review, 19(5), 640–654.
Camilleri, M. A. (2019). Higher education marketing: Opportunities and challenges in the digital
era. Academia, 1(16–17), 4–28.
Camilleri, M. A. (2021). Using the balanced scorecard as a performance management tool in higher
education. Management in Education, 35(1), 10–21.
Camilleri, M. A., & Camilleri, A. C. (2018). The performance management and appraisal in higher
education. In C. Cooper (Ed.), Driving productivity in uncertain and challenging times. British
Academy of Management.
Cappiello, G., & Pedrini, G. (2017). The performance evaluation of corporate universities. Tertiary
Education and Management, 23(3), 304–317.
Cave, M., & Hanney, S. (1992). Performance indicators. In B. Clark & G. R. Neave (Eds.), The
encyclopedia of higher education (pp. 1411–1423). Pergamon Press.
Cave, M., Hanney, S., Henkel, M., & Kogan, M. (1997). The use of performance indicators in
higher education. The challenge of the quality movement. Jessica Kingsley Publishers.
Cepiku, D., & Bonomi Savignon, A. (2012). Governing cutback management: Is there a global
strategy for public administrations? International Journal of Public Sector Management,
25(6–7), 428–436.
Cepiku, D., Mussari, R., & Giordano, F. (2016). Local governments managing austerity:
Approaches, determinants and impact. Public Administration, 94(1), 223–243.
Chae, B., & Poole, M. (2005). Enterprise system development in higher education. Journal of Cases
on Information Technology, 7(2), 82–101.
Charles, D. (2003). Universities and territorial development: Reshaping the regional role of UK
universities. Local Economy, 18(1), 7–20.
Chouinard, J. A., & Milley, P. (2015). From new public management to new political governance:
Implications for evaluation. Canadian Journal of Program Evaluation, 30(1), 1–22.
Christensen, T., & Laegreid, P. (2007). Transcending new public management: The transformation
of public sector reform. Ashgate.
Christensen, M., & Yoshimi, H. (2001). A two country comparison of public sector performance
reporting: The tortoise and hare? Financial Accountability & Management, 17(3), 271–289.
Christopher, J. (2012). Governance paradigms of public universities: An international comparative
study. Tertiary Education and Management, 18(4), 335–351.
Clark, B. (1998). Creating entrepreneurial universities. Pergamon.
Coda, V. (2010). Entrepreneurial values and strategic management. Essays in Management theory.
Palgrave Macmillan.
Cosenz, F. (2013). The “entrepreneurial university”: A preliminary analysis of the main managerial
and organisational features towards the design of planning & control systems in European
Academic Institutions. Management Research & Practice, 5(4), 19–36.
Cosenz, F. (2014). A dynamic viewpoint to design performance management systems in Academic
Institutions: Theory and practice. International Journal of Public Administration, 37(13),
955–969.
Cosenz, F. (2022). Adopting a dynamic performance governance approach to frame
interorganizational value generation processes into a university third mission setting. In
E. Caperchione & C. Bianchi (Eds.), Governance and performance Management in Public
Universities (pp. 87–108). Springer.
Cosenz, F., & Bianchi, C. (2013). Improving performance measurement/management in academic
institutions: A dynamic resource-based view. Insights from a field project. Paper presented at
the ASPA (American Society of Public Administration) Annual Conference for the Center for
Accountability and Performance (CAP) Symposium, Baltimore (USA), March 12.
Crosby, B., t’Hart, P., & Torfing, J. (2017). Public value creation through collaborative innovation.
Public Management Review, 19, 655–669.
Czarniawska, B., & Genell, K. (2002). Gone shopping? Universities on their way to the market.
Scandinavian Journal of Management, 18, 455–474.
De Boer, H., & File, J. (2009). Higher education governance reforms across Europe. ESMU.
De Boer, H., & Goedegebuure, L. (2001). On limitations and consequences of change: Dutch
university governance in transition. Tertiary Education and Management, 7, 163–180.
De Bruijn, H. (2002). Performance measurement in the public sector: Strategies to cope with the
risks of performance measurement. International Journal of Public Sector Management, 15(7),
578–594.
de la Torre, E. M., Rossi, F., & Sagarra, M. (2019). Who benefits from HEIs engagement? An
analysis of priority stakeholders and activity profiles of HEIs in the United Kingdom. Studies in
Higher Education, 44(12), 2163–2182.
De Witte, K., & López-Torres, L. (2017). Efficiency in education: A review of literature and a way
forward. Journal of the Operational Research Society, 68, 339–363.
Dearlove, J. (1998). The deadly dull issue of university “administration”? Good governance,
managerialism and organising academic work. Higher Education Policy, 11, 59–79.
Decramer, A., Christiaens, J., & Vanderstraeten, A. (2007). Individual performance management in
higher education institutions. Dilemmas in higher education. EAIR.
Decramer, A., Christiaens, J., & Vanderstraeten, A. (2008). Implementation dynamics of perfor-
mance management in higher education. In 21st EIASM Workshop on Strategic Human
Resource Management. Birmingham: 21st EIASM.
Deiaco, E., Hughes, A., & Mckelvey, M. (2012). Universities as strategic actors in the knowledge
economy. Cambridge Journal of Economics, 36, 525–541.
Denhardt, J. V., & Denhardt, R. B. (2011). The new public service: Serving, not steering.
M.E. Sharpe.
Deshmukh, A. M., Sharma, S., & Ramteke, A. Y. (2010). Performance management practices in
higher education. Excel India Publisher.
Dickinson, H. (2016). From new public management to new public governance: The implications
for a ‘New Public Service’. In B. John & G. David (Eds.), The three sector solution: Delivering
public policy in collaboration with not-for-profits and business (pp. 41–61). Australian National
University Press.
Dill, D. D., & Soo, M. (2005). Academic quality, league tables, and public policy: A cross-national
analysis of university ranking systems. Higher Education, 49(4), 495–533.
Dobbins, M., Knill, C., & Vögtle, E. M. (2011). An analytical framework for the cross-country
comparison of higher education governance. Higher Education, 62(5), 665–683.
Dobija, D., Górska, A. M., Grossi, G., & Strzelczyk, W. (2019). Rational and symbolic uses of
performance measurement: Experiences from polish universities. Accounting, Auditing &
Accountability Journal, 32(3), 750–781.
Dooris, M. J., Kelley, J. M., & Trainer, J. F. (2004). Strategic planning in higher education. New
Directions for Institutional Research, 123, 5–11.
Douglas, S., & Ansell, C. (2020). Getting a grip on the performance of collaborations: Examining
collaborative performance regimes and collaborative performance summits. Public Administra-
tion Review, 81(5), 951–961.
Etzkowitz, H., Webster, A., Gebhardt, C., & Terra, B. R. C. (2000). The future of the university and
the university of the future: Evolution of ivory tower to entrepreneurial paradigm. Research
Policy, 29(2), 313–330.
European Commission. (2003). The role of the universities in the Europe of knowledge. Commu-
nication from the European Commission.
Falqueto, J. M. Z., Hoffmann, V. E., Gomes, R. C., & Mori, S. S. O. (2020). Strategic planning in
higher education institutions: What are the stakeholders’ roles in the process? Higher Education,
79, 1039–1056.
Ferlie, E., & Steane, P. (2002). Changing developments in NPM. International Journal of Public
Administration, 25(12), 1459–1469.
Ferlie, E., Musselin, C., & Andresani, G. (2008). The steering of higher education systems – A
public management perspective. Higher Education, 56(3), 325–348.
Ferreira, A., & Otley, D. (2009). The design and use of performance management systems: An
extended framework for analysis. Management Accounting Research, 20(4), 263–282.
Francesconi, A., & Guarini, E. (2018). Performance-based funding and internal resource allocation:
The case of Italian universities. In E. Borgonovi, E. Anessi-Pessina, & C. Bianchi (Eds.),
Outcome-based performance management in the public sector (pp. 289–306). Springer.
Franco-Santos, M., Lucianetti, L., & Bourne, M. (2012). Contemporary performance measurement
systems: A review of their consequences and a framework for research. Management Account-
ing Research, 23(2), 79–119.
Franco-Santos, M., Rivera, P., & Bourne, M. (2014). Performance management in UK higher
education institutions: The need for a hybrid approach. Leadership Foundation for Higher
Education.
Frost, J., Hattke, F., & Reihlen, M. (2016). Multi-level governance in universities. Strategy,
structure, control. Springer International Publishing.
Gillies, M. (2011). University governance: Questions for a new era. Retrieved from http://www.hepi.ac.uk/wp-content/uploads/2014/02/UniversityGovernance.pdf
Grossi, G., Kallio, K. M., Sargiacomo, M., & Skoog, M. (2020). Accounting, performance
management systems and accountability changes in knowledge-intensive public organizations:
A literature review and research agenda. Accounting, Auditing & Accountability Journal, 33(1),
256–280.
Gruening, G. (2001). Origin and theoretical basis of new public management. International Public
Management Journal, 4, 1–25.
Guarini, E., Magli, F., & Francesconi, A. (2020). Academic logics in changing performance
measurement systems: An exploration in a university setting. Qualitative Research in Account-
ing & Management, 17(1), 109–142.
Guthrie, J., & English, L. (1997). Performance information and programme evaluation in the
Australian public sector. International Journal of Public Sector Management, 10(3), 154–164.
Harley, S., & Lee, F. S. (1997). Research selectivity, managerialism, and the academic labor
process: The future of nonmainstream economics in U.K. universities. Human Relations,
50(11), 1427–1460.
Head, B. W., & Alford, J. (2015). Wicked problems implications for public policy and manage-
ment. Administration & Society, 47(6), 711–739.
Hénard, F., & Mitterle, A. (2006). Governance and quality guidelines in higher education. A review
on governance arrangements and quality assurance guidelines. OECD.
Hinton, K. E. (2012). A practical guide to strategic planning in higher education. Society for
College and University Planning.
Hölttä, S. (2000). From ivory towers to regional networks in Finnish higher education. European
Journal of Education, 35(4), 460–474.
Holzer, M., & Yang, K. (2004). Performance measurement and improvement: An assessment of the
state of the art. International Review of Administrative Sciences, 70(1), 15–31.
Hood, C. (1995). The “new public management” in the 1980s: Variations on a theme. Accounting, Organizations and Society, 20(2/3), 93–109.
Hoque, Z. (2008). Measuring and reporting public sector outputs/outcomes. Exploratory evidence
from Australia. International Journal of Public Sector Management, 21(5), 468–493.
Hughes, A., & Kitson, M. (2012). Pathways to impact and the strategic role of universities: New
evidence on the breadth and depth of university knowledge exchange in the UK and the factors
constraining its development. Cambridge Journal of Economics, 36, 723–750.
Jääskeläinen, A., & Laihonen, H. (2014). A strategy framework for performance measurement in
the public sector. Public Money & Management, 34(5), 355–362.
Jalaliyoon, N., & Taherdoost, H. (2012). Performance evaluation of higher education; a necessity.
Procedia - Social and Behavioral Sciences, 46, 5682–5686.
James, O., Leth Olsen, A., Moynihan, D., & Van Ryzin, G. (2020). Behavioral public performance:
How people make sense of government metrics. Cambridge University Press.
Johnes, J., & Taylor, J. (1987). Degree quality: An investigation into differences between UK
universities. Higher Education, 16(5), 581–602.
Johnson, A. (2005). What does 25 years of experience tell us about the state of performance
measurement in public policy and management? Public Money and Management, 25(1), 9–17.
Jongbloed, B., & Vossensteyn, H. (2001). Keeping up performances: An international survey of
performance-based funding in higher education. Journal of Higher Education Policy and
Management, 23(2), 127–145.
Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Intercon-
nections, interdependencies and a research agenda. Higher Education, 56(3), 303–324.
Keenoy, T., & Reed, M. I. (2008). Managing modernization: Introducing performance management
in British universities. In C. Mazza, P. Quattrone, & A. Riccaboni (Eds.), European universities
in transition: Issues, models and cases (pp. 188–204). Edward Elgar.
Kettl, D. F. (2002). The transformation of governance: Public Administration for Twenty-First
Century America. Johns Hopkins University Press.
Kivistö, J., & Kohtamäki, V. (2016). Does performance-based funding work? Reviewing the
impacts of performance-based funding on higher education institutions. In R. M. O. Pritchard,
A. Pausits, & J. Williams (Eds.), Positioning higher education institutions. From here to there
(pp. 215–226). Sense Publisher.
Kretek, P. M., Dragsic, Z., & Kehm, B. M. (2013). Transformation of university governance: On the
role of university board members. Higher Education, 65, 39–58.
Kueng, P. (2000). Process performance measurement system: A tool to support process-based
organizations. Total Quality Management, 11(1), 67–85.
Liddle, J. (2018). Public value management and new public governance: Key traits, issues and
developments. In E. Ongaro & S. Van Thiel (Eds.), The Palgrave handbook of public admin-
istration and management in Europe. Palgrave Macmillan.
Linard, K. (1996). Public sector performance management now and for the future. Paper presented at the Asia Business Forum – Performance Management in the Public Sector, Kuala Lumpur.
Lindsay, A. (1981). Assessing institutional performance in higher education. A managerial per-
spective. Higher Education, 10(6), 687–706.
March, J., & Olsen, J. (1984). The new institutionalism: Organizational factors in political life. The
American Political Science Review, 78(3), 734–749.
March, J., & Olsen, J. (2005). Elaborating the “new institutionalism”. In S. A. Binder,
R. A. W. Rhodes, & B. A. Rockman (Eds.), The Oxford handbook of political institutions.
Oxford University Press. https://doi.org/10.1093/oxfordhb/9780199548460.003.0001
Marginson, S. (2009). The limits of market reform in higher education. In Paper presented at
Research Institute for Higher Education (RIHE), Hiroshima University, Japan, August 17th.
Marginson, S., & van der Wende, M. (2009). Europeanisation, international rankings and faculty
mobility: Three cases in higher education globalization. In Centre for educational research and
innovation, higher education to 2030, Vol. 2: Globalisation (pp. 109–144). Paris: OECD.
Martin, B. R., & Whitley, R. (2010). The UK research assessment exercise: A case of regulatory
capture? In R. Whitley, J. Glaser, & L. Engwall (Eds.), Reconfiguring knowledge production:
Changing authority relationships in the sciences and their consequences for intellectual
innovation (pp. 51–81). Oxford: Oxford University Press.
Meek, V. L. (2003). Introduction. In A. Amaral, V. L. Meek, & I. M. Larsen (Eds.), The higher
education managerial revolution? (pp. 1–29). Kluwer Academic Publishers.
Melo, A. I., Sarrico, C. S., & Radnor, Z. (2010). The influence of performance management systems
on key actors in universities: The case of an English university. Public Management Review,
12(2), 233–254.
Miller, B. (2007). Assessing organizational performance in higher education. Wiley.
Mintzberg, H. (1979). The structuring of organizations: A synthesis of the research. Prentice-Hall.
Moon, C. J., Walmsley, A., & Apostolopoulos, N. (2018). Governance implications of the UN
higher education sustainability initiative. Corporate Governance, 18(4), 624–634.
Moore, M. (2014). Public value accounting: Establishing the philosophical basis. Public Adminis-
tration Review, 74(4), 465–477.
Moynihan, D. P. (2006). Managing for results in state government: Evaluating a decade of reform.
Public Administration Review, 66(1), 77–89.
Moynihan, D. P., Fernandez, S., Kim, S., LeRoux, K. M., Piotrowski, S. J., Wright, B. E., & Yang,
K. (2011). Performance regimes amidst governance complexity. Journal of Public Administra-
tion Research and Theory, 21, 141–155.
Mussari, R., & Ruggiero, P. (2010). Public managers’ performance evaluations systems and public
value creation: Behavioral and economic aspects. International Journal of Public Administra-
tion, 33, 541–548.
Narayan, A. K., Northcott, D., & Parker, L. (2017). Managing the accountability-autonomy of
universities. Financial Accountability and Management, 33(4), 335–355.
Neave, G., & van Vught, F. A. (Eds.). (1991). Prometheus bound the changing relationship
between government and higher education in Western Europe. Pergamon.
Neely, A. (1999). The performance measurement revolution: Why now and what next? Interna-
tional Journal of Operations & Production Management, 19(2), 205–228.
Neely, A., Adams, C., & Kennerly, M. (2002). The performance prism: The scorecard for
measuring and managing stakeholder relationship. Prentice Hall.
Neely, A., Kennerly, M., & Waters, A. (2004). Performance measurement and management:
Public and private. Centre for Business Performance, Cranfield University, Cranfield.
Newton, R. (1992). The two cultures of academe: An overlooked planning hurdle. Planning for
Higher Education, 21(1), 8–14.
Noordegraaf, M. (2015). Public management: Performance, professionalism and politics. Palgrave
Macmillan.
O’Flynn, J. (2007). From new public management to public value: Paradigmatic change and
managerial implications. The Australian Journal of Public Administration, 66(3), 353–366.
OECD. (2006). Reviews of national policies for education – tertiary education in Portugal.
Organisation for Economic Co-operation and Development.
Osborne, S. (2002). Public management: A critical perspective. Routledge.
Osborne, S. (2006). The new public governance? Public Management Review, 8(3), 377–387.
Osborne, S. (2010). The new public governance? Emerging perspectives on the theory and practice
of public governance. Routledge.
Osborne, D., & Gaebler, T. (1992). Reinventing government. Penguin Press.
Osborne, S., Radnor, Z., & Nasi, G. (2013). A new theory for public service management? Toward
a (public) service-dominant approach. The American Review of Public Administration, 43,
135–158.
Paradeise, C., Reale, E., Bleiklie, I., & Ferlie, E. (2009). University governance. Western European
comparative perspectives. Springer.
Parker, L. (2011). University corporatisation: Driving redefinition. Critical Perspectives on
Accounting, 22(4), 434–450.
Pavan, A., Reginato, E., & Fadda, I. (2014). The implementation gap of NPM reforms in Italian
local government. An empirical analysis. Franco Angeli.
Peters, G. B. (2019). Institutional theory in political science: The new institutionalism. Edward
Elgar Publishing.
Pinto, H., Cruz, A. R., & de Almeida, H. (2016). Academic entrepreneurship and knowledge
transfer networks: Translation process and boundary organizations. In L. Carvalho (Ed.),
Handbook of research on entrepreneurial success and its impact on regional development
(pp. 315–344). IGI Global.
Pitman, T. (2000). Perceptions of academics and students as customers: A survey of administrative
staff in higher education. Journal of Higher Education Policy and Management, 22(2),
165–175.
Pollifroni, M. (2015). La valutazione della performance dell’azienda universitaria nell’integrazione
tra i modelli di accounting e quelli reputazionali. In AA.VV (Ed.), Food & Heritage:
sostenibilità economico-aziendale e valorizzazione del territorio (pp. 457–474). Giappichelli.
Pollifroni, M. (2016). Valori e principi di fondo dell’istituzione universitaria. In AA.VV., La
Rendicontazione Sociale negli Atenei Italiani. Valori, Modelli, Misurazioni (pp. 25–39).
Milan: Franco Angeli.
Pollitt, C. (2005). Performance management in practice: A comparative study of executive agen-
cies. Journal of Public Administration Research and Theory, 16, 25–44.
Pollitt, C., & Bouckaert, G. (2011). Public management reform: A comparative analysis—New
public management, governance, and the neo-Weberian state. Oxford University Press.
Powell, W. W., & Owen-Smith, J. (1998). Universities and the market for intellectual property in
the life science. Journal of Policy Analysis and Management, 17(2), 253–277.
Powell, W. W., & Snellman, K. (2004). The knowledge economy. Annual Review of Sociology,
30(1), 199–220.
Propper, C., & Wilson, D. (2003). The use and usefulness of performance measures in the public
sector. Oxford Review of Economic Policy, 19(2), 250–267.
Rajala, T., Laihonen, H., & Haapala, P. (2018). Why is dialogue on performance challenging in the
public sector? Measuring Business Excellence, 22(2), 117–129.
Rebora, G., & Turri, M. (2011). Critical factors in the use of evaluation in Italian universities.
Higher Education, 61(5), 531–544.
Reponen, T. (1999). Is leadership possible at loosely coupled organizations such as universities?
Higher Education Policy, 12(3), 237–244.
Rogers, S. (1990). Performance management in local government. Longman.
Salmi, J. (2009). The challenge of establishing world-class universities. World Bank Report.
Saravanamuthu, K., & Tinker, T. (2002). The university in the new corporate world. Critical
Perspective on Accounting, 13(5/6), 545–554.
Shattock, M. (2010). Managing successful universities. McGraw-Hill Education.
Shin, J. C., & Harman, G. (2009). New challenges for higher education: Global and Asia-Pacific
perspectives. Asia Pacific Education Review, 10(1), 1–13.
Sloper, P., Linard, K., & Paterson, D. (1999). Towards a dynamic feedback framework for public
sector performance management. In: Proceedings of the 17th International System Dynamics
Conference, Wellington.
Smith, P. (1993). Outcome-related performance indicators and organizational control in the public
sector. British Journal of Management, 4, 135–151.
Sorci, C. (2007). Lo sviluppo integrale dell’azienda. Giuffrè.
Sousa, C., de Nijs, W., & Hendriks, P. (2010). Secrets of the beehive: Performance management in
university research organisations. Human Relations, 63(9), 1439–1460.
Sporn, B. (1999). Adaptive university structures: An analysis of adaptation to socioeconomic
environments of US and European universities. Jessica Kingsley.
Stoker, G. (2006). Public value management: A new narrative for networked governance? Amer-
ican Review of Public Administration, 36(1), 41–57.
Tahar, S., & Boutellier, R. (2013). Resource allocation in higher education in the context of NPM.
Public Management Review, 15(5), 687–711.
Talbot, C. (1999). Public performance—Towards a new model? Public Policy and Administration,
14(3), 15–32.
Talbot, C. (2007). Performance management. In E. Ferlie, J. L. E. Lynn, & C. Pollitt (Eds.), The
Oxford handbook of public management (pp. 491–517). Oxford University Press.
Tiscini, R., & Martiniello, L. (2011). Public sector reforms and the role of public managers: The
culture of performance and merit. Available at SSRN: https://ssrn.com/abstract=2065277
United Kingdom National Committee of Inquiry into Higher Education. (1997). Report of the
National Committee of Inquiry into Higher Education (also known as the Dearing report).
Crown Copyright.
Van de Walle, S., & Van Dooren, W. (2010). How is information used to improve performance in
the public sector? Exploring the dynamics of performance information. In K. Walshe,
G. Harvey, & P. Jas (Eds.), Connecting knowledge and performance in public services
(pp. 33–54). Cambridge University Press.
Van Dooren, W., Bouckaert, G., & Halligan, J. (2010). Performance management in the public
sector. Routledge.
Verbeeten, F. H. M. (2008). Performance management practices in public sector organizations.
Impact on performance. Accounting, Auditing & Accountability Journal, 21(3), 427–454.
Villarreal, E. (2001). Innovation, organisation and governance in Spanish universities. Tertiary
Education and Management, 7, 181–195.
Weick, K. E. (1976). Educational organizations as loosely coupled systems. Administrative Science
Quarterly, 21, 1–19.
Chapter 2
Developing Performance Management Systems in Higher Education Institutions

2.1 Introduction

Based on recent reforms and institutional changes introduced globally in the HE
sector, the design and use of PM systems tailored to University organizational
settings have become quite challenging (Lapsley & Miller, 2004). However, despite
a shared belief that University management must evolve, PM systems are still
rarely adopted. These systems aim at supporting academic decision-making processes
and academic performance assessment (Angiola et al., 2018; Cosenz, 2014;
Broadbent, 2007). Obstacles to their adoption stem not only from technical,
cultural, and political constraints but also from the difficulty of adapting such
systems to the organizational attributes of Universities.
The complexity of University management requires prompt, specific solutions
tailored to the HE system. These solutions essentially refer to the development of
PM systems that allow academic decision-makers to cope in a timely manner with
potential risks, challenges, and opportunities by defining corrective actions and
prioritizing the goals to pursue according to the endowment of available resources
(Otley, 1999; Talbot, 2005). Otherwise, Universities that pay little attention to both
financial equilibrium and the quality of outputs, in terms of education and research,
may run the risk of damaging the provision of academic activities due to a lack of
strategic resources. In this regard, a result-oriented perspective, which implies a
focus on organizational results rather than on compliance with procedures,
discourages the adoption of formal, mechanistic approaches to managing HEIs.
Rather, such a perspective promotes the use of effective PM mechanisms consistent
with a sustainable development viewpoint. Such mechanisms include two main
interrelated phases: (1) strategic planning and (2) performance measurement
(or management control), which represent a relevant component of a broader PM
framework aimed at coordinating strategic resources to improve organizational
performance.

PM systems may play a different role according to the design perspective adopted
by each University. The design perspective defines the criteria to introduce strategic
planning and performance measurement tools in the academic organizational setting
and their connections with the organizational structure and the other operational
mechanisms (Chenhall & Langfield-Smith, 2007). Indeed, applying a standardized
PM model is not possible, even one that has proven successful in other organizations
(e.g., hospitals, police departments, public utilities). This is because the design of PM
must be calibrated to Universities’ organizational and institutional characteristics
(Kotler & Murphy, 1981; Cosenz, 2014). In this regard, some years ago, the European
Commission failed to develop a standardized set of performance indicators
applicable to each European HE system, due to the heterogeneous academic
performance evaluation mechanisms adopted by European countries.1
As described in Chap. 1, Universities are characterized by a multiplicity of
stakeholders, a highly articulated organizational structure, different governance levels,
and values. A wide range of tangible and intangible factors determine their perfor-
mance (e.g., knowledge, intellectual capital, know-how). Therefore, PM systems
should be tailored to these elements (Lindsay, 1994; Guthrie & Neumann, 2007).
In addition, as Gerrish (2016) remarks, strategic planning and performance
measurement must not be considered two different activities. Rather, they are
inseparable phases within a unique and broader organizational information system
to support the strategic learning processes of academic decision-makers and other
key-actors (e.g., unit managers, departmental chiefs, and so on).
Flamholtz (1996) defines the use of PM systems as a process aimed at controlling
and influencing people’s behavior, seen as members of a formal organization, to
increase the probability that they may achieve the planned goals. In this vein, the
proposed approach suggests that PM systems lead academic decision-makers to steer
the dynamic complexity characterizing the renewed HE sector through strategic
learning and organizational coordination according to a shared vision of perfor-
mance drivers and their implications (Bianchi, 2016; Kloot, 1997).

1 For instance, Denmark, Sweden, and the Netherlands use academic performance evaluation
approaches mainly related to organizational issues and carried out by independent foreign experts;
in Spain evaluation procedures and criteria are now set by a committee (charged by the Govern-
ment) that scales down the role played by the national HE agency (ANEP); in Austria HEIs jointly
define academic evaluation programs which each University may agree on in order to be included
among the HEIs legally recognized by the national Government.
2.2 The Role of Performance Management Systems in Meeting Specific Information Needs of Higher Education Institutions

The main aim of PM systems is to support decision-making processes and related
performance appraisal to effectively and efficiently reach results aligned with
planned objectives (Anthony, 1965; Maciariello, 1984; Lorange et al., 1986;
Merchant, 1997; Bryson, 2004; Fitzgerald, 2007; Flamholtz, 1996). To this end,
HEIs are called to develop a new attitude toward the definition of PM systems, one
that places academic value creation at the core of educational, research, and
administrative activities. This value represents the contribution delivered by HEIs
to their communities and regional areas in terms of service provision. In this regard,
Moore (1995) maintains that institutions must identify public value according to
the benefit emerging from the alignment between public service demand and the
actual supply.
To analyze public value provision and foster accountability in the public sector,
Moore (2013) also stresses the necessity to frame public value creation processes
according to the scheme displayed in Fig. 2.1. This scheme highlights the causal
connections starting with the flow of assets controlled by an organization, through a
set of policies, programs, and activities, to outputs, transactions with the
organization’s clients, and ultimately the social outcomes produced by the organization.

Fig. 2.1 The public value chain (adapted from Moore, 2013)
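To make the logic of this chain easier to inspect, a minimal Python sketch follows; the stage names restate Moore's scheme as described above, while the list representation and the printout are illustrative assumptions rather than part of Moore's framework.

```python
# The public value chain of Fig. 2.1 as an ordered pipeline of stages
# (Moore, 2013); the list structure itself is an illustrative assumption.
PUBLIC_VALUE_CHAIN = [
    "assets controlled by the organization",
    "policies, programs, and activities",
    "outputs",
    "transactions with clients",
    "social outcomes",
]

# Walk the causal connections from assets to outcomes
for upstream, downstream in zip(PUBLIC_VALUE_CHAIN, PUBLIC_VALUE_CHAIN[1:]):
    print(f"{upstream} -> {downstream}")
```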
Managing Universities according to a value creation perspective requires three
basic steps:
1. Analyzing the current situation to identify possible opportunities for restructuring
2. Understanding how to take advantage of these opportunities through transactions,
new investments, asset trading, and organizational restructuring
3. Developing a new culture within the entire organizational system to foster
sustainable value creation processes
Adopting such a perspective takes time to be fully shared in organizations
traditionally managed according to a bureaucratic viewpoint. To facilitate the tran-
sition from a bureaucratic to a managerial mindset, the coexistence of the following
factors is highly recommended (Kallio et al., 2020):
Table 2.1 Applying PM to public value creation

Strategic planning aimed at supporting management activities affecting academic performance:
(1) Definition of performance objectives at the organizational and unit level
(2) Allocation of strategic resources based on the performance objectives

Academic performance measurement:
(3) Management of programs and actions to achieve objectives
(4) Performance assessment and management control
(5) Analysis and diagnosis of discrepancies (results vs. objectives)

Academic stakeholder engagement:
(6) Internal performance reporting and communication
(7) External performance reporting and communication

• Spreading accountability among the multiple units and departments involved in academic value creation processes
• Involving and training each unit manager to agree on common organizational goals and to intervene promptly in the redefinition of core activities and processes
• Using coherent performance measurement criteria to influence behaviors and support managers in achieving goals
• Rewarding high performance at both the individual and the organizational level
• Setting up internal and external communication processes to update stakeholders on the results and related resource consumption.
The value creation purpose is ultimately associated with the concept of performance,
which summarizes the results an organization has achieved over time and compares
them with the objectives and the means through which they have been achieved.
Table 2.1 lists the main phases and related activities of a PM system based on a
value creation perspective.
The above phases are sequentially connected and require the design of specific
performance indicators tailored to HEIs (Cave et al., 1997). These indicators are
used to frame and interpret how value is generated, thus supporting a systematic
feedback mechanism for designing more effective strategies over time.
Figure 2.2 displays this feedback process between strategic planning and perfor-
mance measurement within a PM system (Gerrish, 2016; Moynihan, 2008;
Riccaboni & Leone, 2010).
With the intent to foster the implementation of such indicators, a methodological
pathway leading to the formulation of a PM approach focused on HEIs is illustrated
and discussed in the following sections.
Fig. 2.2 Relationships between strategic planning and performance measurement in a performance management setting

2.3 Framing Organizational Performance in Higher Education Institutions

Organizational performance conventionally refers to the results an organization
reaches over time, depending on the products/services it offers to meet its
customers’ needs (Wooldridge & Floyd, 1990; McGivern & Tvorik, 1997). Such a
concept has been differently defined and used according to the context under
observation (Stainer, 1999; Miller, 2007).
In management research, it is used to define the achievement of pre-set organi-
zational goals emerging from carrying out value creation processes, with the intent to
meet some societal needs (Coda, 2010). Stankard (2002) noted that organizational
performance results from interactions of different parts or units inside and outside
the organization. It refers to the outcomes of multiple organizational processes that
occur during its daily operations.
Organizational performance in the HE system is specifically related to its core
organizational areas: Education, Research, Administration, and Third Mission. Each
of them implies the achievement of different expected results:
• Education concerns vocational training. Its performance can be measured in
terms of training quality and teaching effectiveness.
• Research aims at knowledge development in its multiple branches. In this case,
performance is assessed in terms of research quality, innovation (with its double
meaning of originality and validity of scientific-applicative outcomes), and scientific productivity.
• Administration concentrates on allocating and managing academic resources
(including the financial ones) and is intended to support the other organizational
areas, focusing on cost-effectiveness.
• Third Mission aims to contribute to the social and economic development of the
outer context (local, regional, national) through the long-term impact of education
and research outputs and networking and collaborations, sustainability, and social
engagement. Outcome measures—such as employment rate, academic spin-offs,
and citations—provide useful information on Third Mission activities carried out
by a University.
A prevailing view of organizational performance has traditionally focused on the
financial balance between expenditures and collections with the intent to pursue a
financial equilibrium (Fitzgerald, 2007; Sporn, 2003; Modell, 2001; Pendlebury &
Algaber, 1997). To a large extent, this was due to the difficulty of measuring and
quantifying organizational results in terms of effectiveness, efficiency, and impact
on the outside world. However, such a perspective today seems too narrow.
Though financial equilibrium is a fundamental principle to observe in any
organization, defining performance in HEIs also requires a focus on other
perspectives related to the quality of programs and the outcomes of undertaken
policies (Kallio et al., 2020; Chenhall & Langfield-Smith, 2007; Miller, 2007;
Bianchi, 2016). Therefore, financial balance and value creation (Moore, 1995) for a
wide range of stakeholders should be the building blocks of a sustainable
organizational model applied to HEIs (Guthrie & Neumann, 2007; Parmenter, 2007;
Cave et al., 1997).
Academic value creation includes other performance dimensions focused on
University competitiveness and its capability to satisfy those social needs (Coda,
2010; Hamann et al., 2013) mainly related to education and research development.
As a result, academic performance refers to three main dimensions (see Fig. 2.3):
(1) a competitive, (2) a financial, and (3) a social dimension.
The competitive dimension is oriented to satisfy the needs of the HE market, i.e.,
providing better education and research products/services to students and other
academic stakeholders compared to competitors’ ones. The financial dimension
aims to increase the financial resources of a University to support future investments.
The social dimension focuses on ensuring a balance between stakeholders’ contri-
butions (e.g., academic staff, administrative staff, students, providers, research
funders) and the associated rewards that the University provides them (e.g., work
motivation and wages, education and research equipment, sense of belonging).
Each performance dimension includes a set of strategic resources whose acquisition
and deployment in a synergic way make it possible to generate results over time.
For instance, the University’s image refers to the competitive dimension, liquidity
to the financial one, and student satisfaction to the social one.

Fig. 2.3 Academic performance dimensions (adapted from Coda, 2010)
Such a multidimensional perspective of academic performance also highlights
close connections among the three mentioned dimensions. Thus, these dimensions
must be embodied into a systemic framework where resource depletion/accumulation processes and related results of one dimension affect the performance of the
other two. For instance, an increase in liquidity (financial dimension) allows the
University to hire more academic staff (social dimension), who may improve
education quality and, as a result, increase students’ enrollments (competitive
dimension). This means that the organizational success of a University depends on
a consistent balance among these performance dimensions.
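The interplay just described can be read as a reinforcing feedback loop, in the spirit of the System Dynamics approach adopted in this book. The minimal Python sketch below simulates it under purely illustrative assumptions: every figure (initial liquidity, salary, tuition, hiring share, the cap on enrollment growth) is invented for the example, not drawn from any University.

```python
# Minimal sketch of the reinforcing loop linking the three performance
# dimensions: liquidity (financial) funds staff (social), staff quality
# attracts enrollments (competitive), and tuition replenishes liquidity.
# All parameter values are illustrative assumptions.

liquidity = 1_000_000.0   # financial dimension (assumed, in EUR)
staff = 100.0             # social dimension (academic staff units, assumed)
enrollments = 2_000.0     # competitive dimension (students, assumed)

SALARY = 50_000.0         # yearly cost per staff member (assumed)
TUITION = 3_000.0         # yearly revenue per enrolled student (assumed)
HIRE_SHARE = 0.10         # share of liquidity spent on new hires (assumed)

for year in range(1, 6):
    # Financial dimension: tuition inflow minus salary outflow
    liquidity += enrollments * TUITION - staff * SALARY
    # Social dimension: part of the available liquidity funds new hires
    if liquidity > 0:
        staff += (liquidity * HIRE_SHARE) / SALARY
    # Competitive dimension: a better staff/student ratio drives enrollments
    quality = staff / enrollments            # crude proxy for education quality
    enrollments *= 1 + min(quality, 0.05)    # capped growth effect (assumed)
    print(f"year {year}: liquidity={liquidity:,.0f} "
          f"staff={staff:.0f} enrollments={enrollments:.0f}")
```

Running the sketch shows the loop gaining momentum over the five simulated years; with other assumed parameters (e.g., lower tuition), the same structure would instead deplete liquidity, which is precisely the balance among dimensions the text warns about.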
In addition, perspectives related to “time” and “space” must be considered when
defining organizational performance in HEIs. Concerning the “space” perspective,
balanced development of academic activities emerges from the search for consis-
tency between the multiple outputs offered by different organizational units, depart-
ments, and faculties of a University. An unbalanced development rate could be
associated with either a size increase or an improvement in operations in one unit, to
the detriment of another one. For instance, the performance of an academic depart-
ment could be improved by diverting the resources invested in another department of
the same University.
Regarding the “time” perspective, an improvement in short-term performance
should not be obtained to the prejudice of long-term results. Too often, unbalanced
time-horizon strategies result in an evanescent short-term improvement that hides
worse undesired effects in the long run. Balancing the short with the long term in
planning and decision-making implies the need to adopt a strategic view of perfor-
mance management and measurement. A strategic perspective is related to matching
short- and long-term planning and the analysis of the outcomes of current and often
inertial decisions on the change in both organizational structures and external
environmental conditions (Grossi et al., 2020; Hamel & Prahalad, 1994; Mastilak
et al., 2012) (Fig. 2.4).
As such, PM systems must include a balanced mix of indicators that allows
academic decision-makers to focus on the trade-offs between individual and static
performance measures related to competitive, financial, and social dimensions, and
between short- and long-term effects of adopted policies. Indicators also relate to the
trade-off between the aggregated contributions offered by the different organizational
units or departments operating within a University. Thereby, academic performance
refers to the contribution originating from a heterogeneous plurality of
interdependent organizational activities, often distant from each other in space and
time but consistent with the broader mission of HEIs (i.e., meeting education and
research enhancement needs).

Fig. 2.4 Framing academic performance according to a systemic view: the search for consistency between the success, time, and space perspectives (adapted from Bianchi, 2016, p. 53)

2.4 Strategic Planning in Higher Education Institutions

A strategy is generally defined as the way an organization plans to achieve its desired
goals over time (Mintzberg, 1994; Mintzberg & Quinn, 1996; Anthony, 1965;
Lorange et al., 1986; Porter, 1990; Andrews, 1971). By applying this concept to a
University setting, strategic planning is described as an activity aimed at (1) framing
the future role, mission, and vision of an HEI operating in a dynamic environment,
(2) defining strategic and operational objectives, (3) translating them into measurable
targets, and (4) using the outcomes of performance measurement to promptly adopt
corrective actions (Anthony, 1965; Lorange et al., 1986; Mintzberg, 1994; Simon,
1960; Porter, 1990; Dooris et al., 2004).
Strategic planning requires a participative approach to negotiate and allocate
resources among the multiple organizational areas of the HEI following its strategic
and operational aims and targets (Mintzberg & Quinn, 1996; Bryson, 1988; Rowley
& Sherman, 2001). Planning is intended as the outcome of decision-making pro-
cesses as it defines the actions to put in place for achieving goals in a given time
interval (Lorange & Vancil, 1976; Bryson, 2004; Rowley et al., 1997). Therefore,
planning defines the goals to achieve and suggests the consequent actions to
undertake—i.e., the ways to reach the targets—depending on the institutional
mission of the University (Conway et al., 1994; Kotler & Murphy, 1981).
Since the 1980s, Colleges and Universities have taken a closer look at strategic
planning, emphasizing its use as a management tool for the institution’s methodical
and systematic development. In this context, linear approaches and rational models
flourished, featuring a cognitive process of functions: identifying and prioritizing
stakeholders, environmental scanning, contextual and positioning analysis, specifi-
cation of core competencies, strategy formulation, goal setting, and evaluative
feedback loops (Birnbaum, 2000; Dooris et al., 2004). According to this perspective,
as shown in Fig. 2.5, the strategy design process in HEIs consists of five
interdependent steps:
1. An external analysis aimed at identifying the opportunities and threats existing in the
outer context where the University operates (i.e., exogenous variables affecting
academic performance)
2. An internal analysis oriented to investigate the HEI strengths and weaknesses in
its inner setting (i.e., endogenous variables and resources fueling academic value
creation processes)
3. Strategy formulation, i.e., designing strategies that build and foster competitive
advantages by matching both strengths and weaknesses with the external oppor-
tunities and threats
4. Strategy execution, i.e., implementing the designed strategies by putting into
action the decisions made to achieve desired goals
5. Strategic evaluation, i.e., measuring performance and making corrections to plans,
through feedforward mechanisms, when the strategies are not producing the
desired outcomes.

Fig. 2.5 The strategy design process in HEIs

2.4.1 Internal and External Analysis to Support Strategy Formulation

As previously illustrated, the strategy design process begins with a comprehensive
analysis of those endogenous and exogenous factors that must be considered to
frame the University’s current status—especially in terms of resource availability
and consistency. As a result, this allows one to formulate more effective strategies
for meeting academic stakeholders’ expectations.
Table 2.2 An example of SWOT analysis applied to HEIs

Strengths:
– National and international reputation of the HEI
– High number of Double Degree agreements
– High number of international students

Weaknesses:
– Poor quality of services for students
– Lack of Alumni network
– Lack of research infrastructures

Opportunities:
– Research grants funded by supranational institutions
– Introduction of new digital technologies
– Possibilities to cooperate with emerging countries

Threats:
– Decreasing public funding for the HE sector
– Uncertainty in the international scenario
– Oligopoly of the big publishers

In particular, internal and external analyses support academic decision-makers in
understanding the HEI’s strategic positioning in the national and regional context.
This analysis is usually carried out through a SWOT analysis (Hill & Westbrook,
1997), whose matrix lists the strengths, weaknesses, opportunities, and threats
identified in the internal and external contexts, respectively. Therefore, such a tool
provides a starting ground for setting the University’s development strategy.
Table 2.2 displays an example of a SWOT analysis applied to HEIs.
As depicted in Table 2.2, the SWOT analysis offers a situational perspective that
enables decision-makers to check whether the undertaken actions are still adequate,
thus providing a static picture on which to base possible corrective interventions to
the strategy. It allows academic decision-makers to contextualize the results of
strategic plans and validate both the characteristics of innovation programs and the
overall strategic planning process.
More specifically, this preliminary analysis explores the scenario where the HEI
interacts with its stakeholders and identifies the core challenges it has to take on in
terms of opportunities and threats. These challenges are faced by formulating
strategic directions based on a systematic matching process between the endogenous
and exogenous variables identified in the SWOT analysis. Although this analysis
could be characterized by the uncertainty underlying the recognition of internal and
external factors, academic decision-makers may foresee a range of potential strate-
gies. These strategies can be further investigated with the intent of making the most
of their resources to meet new unexplored opportunities and counteract contextual
threats. For this purpose, Table 2.3 illustrates a framework for combining strengths
and weaknesses with opportunities and threats, thus providing support to formulate
more effective strategies in HEIs.
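Before turning to the framework itself, the matching logic summarized in Table 2.3 below can be sketched as a small data structure. In the following Python fragment, the factor labels are abbreviated from the SWOT example of Table 2.2, while the dictionary layout and the four strategy-family descriptions are illustrative assumptions, not the author's formal model.

```python
from itertools import product

# SWOT factors abbreviated from Table 2.2 (illustrative)
swot = {
    "S": ["international reputation", "Double Degree agreements"],
    "W": ["no Alumni network", "weak research infrastructure"],
    "O": ["supranational research grants", "new digital technologies"],
    "T": ["decreasing public funding", "international uncertainty"],
}

# Each internal/external pairing suggests a strategy family (assumed labels)
strategy_family = {
    ("S", "O"): "use strength to take advantage of opportunity",
    ("W", "O"): "overcome weakness to take advantage of opportunity",
    ("S", "T"): "use strength to defend against threat",
    ("W", "T"): "mitigate weakness exposed by threat",
}

# Enumerate every candidate strategic direction from the matching process
for (internal, external), family in strategy_family.items():
    for i_factor, e_factor in product(swot[internal], swot[external]):
        print(f"[{internal}{external}] {family}: "
              f"match '{i_factor}' with '{e_factor}'")
```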
This analysis can be further developed by identifying the critical success factors2
related to the HE sector. These factors are defined as attributes, characteristics,
conditions, or variables that directly influence a University’s effectiveness,
efficiency, and viability (Coda, 2010; See et al., 1980; Leidecker & Bruno, 1984;
Volery & Lord, 2000). Thus, generating and maintaining distinctive and defendable
competencies linked to these factors, through the use of available resources, will
allow the academic institution to gain competitive advantages in the HE market
(Porter, 1990; Aaker, 1989; Dierickx & Cool, 1989; Hall, 1993).

Table 2.3 Matching endogenous and exogenous factors to support strategy design in HEIs

Opportunities × Strengths: strategies that use strengths to take advantage of opportunities (e.g., using international reputation to start new cooperation agreements)
Opportunities × Weaknesses: strategies that eliminate or overcome weaknesses in order to take advantage of opportunities (e.g., investing in research infrastructures by acquiring new digital technologies)
Threats × Strengths: strategies that use strengths to defend the HEI from threats (e.g., exploiting the international reputation to compete for research grants funded by supranational institutions)
Threats × Weaknesses: strategies that address one or more weaknesses to mitigate the impact of a threat on results (e.g., starting an Alumni network to better face uncertainty in the international scenario)

2 According to Kim (2005), critical success factors are also defined as “factors of competition,” i.e., variables identifying those drivers that affect rivalry among organizations operating in the same market sector (or niche).

To better frame the strategic
positioning of a University into the HE sector by exploring how it influences the
reference critical success factors, the Value Curve model offers a valuable tool
enabling a comparative analysis among different academic institutions (Kim,
2005). Such a model measures the value provided by the selected HEIs for each
critical success factor on which competition is played. An example of the Value
Curve model applied to Universities is shown in Fig. 2.6.
Among others, this example includes some critical success factors related to
the three academic missions, such as (1) innovativeness of educational programs,
(2) job placement quality, (3) research productivity, (4) effectiveness of administra-
tive support, and (5) scope of stakeholder network.3 The emerging comparison
allows academic decision-makers to position the University into the HE competitive
arena by identifying possible value gaps with its competitors. It also supports the
identification of those critical success factors on which to focus future strategic
directions and investment efforts for improving academic performance and, conse-
quently, pursuing a sustainable competitive advantage in the HE sector.
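As a rough numerical illustration of this comparison, the sketch below scores three hypothetical Universities on the five critical success factors just listed and reports, for each factor, the gap between a focal institution and the best performer. All scores are invented; in practice, each factor would be fed by measured indicators such as those suggested in footnote 3.

```python
# Hypothetical Value Curve scores (1-10) on the five critical success
# factors named in the text; every figure is invented for illustration.
FACTORS = ["innovativeness of educational programs", "job placement quality",
           "research productivity", "effectiveness of administrative support",
           "scope of stakeholder network"]

scores = {  # one value per factor, in the order of FACTORS (assumed)
    "University A": [7, 5, 8, 4, 6],
    "University B": [5, 8, 6, 7, 5],
    "University C": [6, 6, 7, 6, 8],
}

focal = "University A"
for i, factor in enumerate(FACTORS):
    best_uni, best = max(((u, s[i]) for u, s in scores.items()),
                         key=lambda pair: pair[1])
    gap = best - scores[focal][i]
    note = "leader" if gap == 0 else f"gap of {gap} vs {best_uni}"
    print(f"{factor:40s} {focal}: {scores[focal][i]}  ({note})")
```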

3 For instance, job placement quality can be measured as the number of graduates with a job 3 years
after graduation, while research productivity in terms of number of published articles per researcher
per year.
Fig. 2.6 An example of the Value Curve model to compare and position Universities in the HE sector

2.4.2 A “Subjective” View of Performance Management to Design and Implement Strategies in Higher Education Institutions

Academic decision-makers need to challenge managerial assumptions and consider
radical changes in their organizational structures and processes in response to
changes in the social, economic, and political spheres. For this reason, in recent
years, strategic planning in HEIs has increasingly focused on strategic learning,
critical thinking, and flexibility to external changes. Indeed, flexibility is key to
organizational success in today’s HEIs (Hussey, 1999; Melo & Figueiredo, 2020).
In particular, strategic planning is intended as a predetermination, i.e., an ex ante
identification and assessment of University objectives. This does not mean that
planning should predict future events; rather, it offers a “perspective view” of the
estimated performance that a University can reach according to its available stock
of resources and the actions undertaken to use them in pursuit of organizational
results. In this regard, academic managers have to deal with the setting of medium-
and long-term objectives and then define short-term targets in a “flexible” way, i.e.,
being aware that these may change depending on the outcomes emerging from
performance monitoring processes.
Conventionally, Universities focused on planning to collect information with the
intent to support predictable decision-making processes rather than to adopt a
perspective to outline a long-term strategic setting from which to benefit. As a result,
Universities reacted to external pressures, avoiding actions aimed at influencing
their socio-economic context through innovative development strategies.
Strategic planning is based on a circular process including goal setting and
resource stewardship and applies to the overall institution and each organizational
area. Viable strategic planning implies that objectives must be (Doran, 1981;
Conzemius & O’Neill, 2006; Poister, 2003; a minimal illustration follows this list):
• Specific, i.e., they must focus on one particular area for improvement
• Measurable, i.e., they should be expressed through specific units of measurement
• Assignable, i.e., they are attributable to a responsible staff or unit
• Realistic, i.e., they describe what results can realistically be achieved, given
available resources
• Time-related, i.e., their achievement is delimited within a predefined term
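As announced above, here is a minimal sketch of how the five properties could be operationalized as a check on planned objectives; the record fields, the sample objective, and its values are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class Objective:
    area: str           # Specific: one particular area for improvement
    indicator: str      # Measurable: unit of measurement
    owner: str          # Assignable: responsible staff or unit
    target: float       # Realistic: value judged reachable with resources
    deadline: str       # Time-related: predefined term

def is_smart(obj: Objective) -> bool:
    """Every SMART field must be filled in for the objective to be viable."""
    return all([obj.area, obj.indicator, obj.owner,
                obj.target is not None, obj.deadline])

# Hypothetical objective for an "Education" unit (all values assumed)
goal = Objective(area="student services",
                 indicator="avg. satisfaction score (1-5)",
                 owner="Education unit", target=4.2, deadline="2025-12-31")
print("SMART objective" if is_smart(goal) else "objective needs revision")
```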
Knowing the conditions that characterize the organizational setting and its inter-
actions with the external environment is essential for planning objectives and related
actions to achieve them (Bianchi, 2010). Planning is based on the rationality
underlying decision-making processes: a decision is rational if it is consistent
with its objectives and well matched with the available resources and the
institutional constraints of the system (Mintzberg & Quinn, 1996).
In particular, as argued by Bianchi (2016), setting goals and objectives within a
PM context requires the adoption of a “subjective” view of performance (Fig. 2.7).
Such a perspective identifies the linkages between results and objectives in terms of
operations. As such, goals and objectives are determined by taking into account the
expected results (i.e., performance drivers and end-results) and the associated
actions—i.e., processes and activities—to undertake. Thereby, objectives can be
defined for each organizational unit (single process or activity) and the overall
organization (macro processes).

Fig. 2.7 The “subjective” view of performance management (Bianchi, 2016, p. 136)
This view of PM—borrowed from the work of Bianchi (2010, 2012, 2016)—is
particularly valuable in supporting strategic planning and decision-making pro-
cesses. As Bianchi (2016, p. 135) remarks, “activities and the processes to which
they are related can be associated with corresponding objectives and performance
measures, in a consistent action plan, from which resources are assigned in an
organization, available policy levers for each decision area are made explicit, and
responsibility for expected results is focused.” The possibility of identifying the
interplays between objectives and results throughout processes and activities makes
this perspective consistent with performance measurement. Measuring academic
results is carried out through a set of performance indicators that are classified in
monetary and nonmonetary measures (e.g., concerning volume or quantity, effi-
ciency, outcomes, and productivity). Each indicator is coupled with an expected
target value (or benchmark), according to which academic decision-makers have
outlined the actions to reach it by taking into consideration the available resources
and means (Gerrish, 2016).
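The coupling of each indicator with a target value can be illustrated with a short variance computation. In the sketch below, the indicator names, figures, and the "higher is better" flags are assumptions invented for the example.

```python
# Hypothetical indicators, each coupled with a target (benchmark) value.
indicators = [
    # (name, actual, target, unit, higher_is_better) - all assumed
    ("graduates employed within 3 years", 68.0,   75.0, "%",   True),
    ("articles per researcher per year",   1.4,    1.2, "n",   True),
    ("cost per enrolled student",       6200.0, 5800.0, "EUR", False),
]

for name, actual, target, unit, higher_better in indicators:
    gap = actual - target
    # Whether a positive gap is good depends on the indicator's polarity
    on_track = gap >= 0 if higher_better else gap <= 0
    status = "on track" if on_track else "corrective action needed"
    print(f"{name:36s} {actual:>8.1f}{unit} vs {target:>8.1f}{unit} "
          f"({gap:+.1f}) -> {status}")
```

The polarity flag matters because, as the text notes, variance analysis must distinguish gaps that call for corrective actions from those that simply reflect a target already met.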
Following this perspective, a crucial issue relates to understanding the
relationships existing between “actions/operations” and “results” or, in other
words, the “organizational model” adopted in the University. These “organizational
models” are often implemented according to a standardized logic based on formal
management tools, which are far from providing effective support to
decision-making processes (Flamholtz, 1996). In the past, the use of these models
has resulted in an excessive bureaucratization of operations and management and,
at the same time, created the illusion of influencing results through formal plans
that should have supported the alignment between
“objectives ↔ resources ↔ processes ↔ outcomes,” as well as the strategic
coordination among different organizational areas.
To effectively support academic decision-makers, strategic planning in HEIs
should focus on a systematic strategic learning process (Kloot, 1997),
characterized by:
• A selective, rather than overall and analytic, approach to frame academic value
creation processes oriented to identify and manage the main variables and related
interdependencies affecting performance
• A conceptualization of planning as a system of decisional logic, rather than a list
of activities and tasks
• Involvement of the whole organizational system
• A focus on the social role that the University must play in the future according to
its institutional mission

2.5 Performance Measurement in Higher Education Institutions

Strategic planning is an activity closely related to performance measurement,
intended as a mechanism to coordinate the processes, activities, and operations
carried out within an organization and its connections with the outer environment
(Merchant, 1997; Maciariello, 1984). Measuring performance is an activity intended to
(1) support strategic learning and decision-making, (2) empower internal stake-
holders, (3) influence their behavior, and (4) motivate and evaluate them at all levels
of the organization hierarchy.
In management research, performance measurement is defined as a process aimed
at collecting, analyzing, and reporting information regarding the performance of an
organization (Ouchi, 1979; Simons, 2000; Otley, 1999; Merchant, 1997;
Maciariello, 1984). As such, it represents an essential activity for managing an
HEI to achieve its goals according to efficiency- and effectiveness-based perspec-
tives, i.e., by pursuing a sustainable use of resources and compliance between results
and objectives in the short, medium, and long term. The alignment between results
and objectives is measured at a strategic level by comparing long-term results (i.e.,
outcomes) with political goals and at an operational level by comparing short-term
results (i.e., outputs) with operational objectives. While the former mostly implies a
qualitative assessment of performance, the latter aims at quantifying the possible
variances between actual and expected results (Kallio et al., 2017; Bouckaert &
Halligan, 2008).
In addition, performance measurement entails the analysis of the causes under-
lying the variances between actual and target values to promptly undertake correc-
tive actions oriented to fill these performance gaps. Variances or gaps may depend
on (1) external factors which require an adjustment of both objectives and actions
concerning the changes that occurred in the external context and/or (2) internal
factors which rely on the complexities arising within the internal system.
Measuring performance in HEIs should not be intended as an activity intermittently
executed or carried out once a year (feedback mechanism). Rather, it should
be systematically conducted during ongoing management processes to allow
academic decision-makers to promptly revise, where necessary, the planning
contents, as well as to undertake timely corrective actions for attaining the pre-set
goals (feedforward mechanism). From this perspective, performance measurement
becomes relevant to support the analysis and the diagnosis of value.
Specifically, a performance measurement system includes three main phases
(Maciariello, 1984; Brunetti, 1979):
1. Framing the academic value chain (or academic process analysis) as an activity to
map the organizational structure underlying specific value creation processes. In
particular, the organizational structure identifies products/services, clients, roles,
functions, processes and activities, and associated responsibilities in terms of
management and operations. It represents the organizational articulation of a
University and makes explicit each organizational area intervening alongside its
value creation processes. These areas correspond to the organizational units,
departments, faculties, laboratories, and libraries, operating and interacting within
the HEI. Framing the organizational structure of Universities allows academic
decision-makers to make staff units accountable for the achievement of planned
objectives and better identify those organizational units that require more support
in their operations. The core elements that characterize an organizational unit are:
• One or more unit manager(s) in charge of managing the area by assigning
sub-targets and duties, allocating resources, and controlling the execution of
core processes
• Operational levers which unit managers may influence to pursue specific
planned objectives
• Input factors which are the available strategic resources used and depleted in
value creation processes
• Standard operational conditions that in each organizational area define the
relation between the need for input and its expected consumption with the
intent to produce a given output
• Interplays with other organizational units leading to the final output of a
specific value creation process
2. The design of key performance indicators to support strategic learning and
decision-making processes of academic key-actors. These indicators include not
only conventional monetary measures emerging from the accounting system and
oriented to evaluate the financial performance of HEIs but also nonmonetary
measures related to the competitive and social performance of Universities
(Guthrie & Neumann, 2007; Broadbent, 2007; Agyemang & Broadbent, 2015;
Cave et al., 1997).
3. A feedback process aimed at connecting key performance indicators with the
organizational structure of the University (Maciariello, 1984). Based on perfor-
mance reporting, this mechanism allows organizational areas to satisfy their
information needs related to current performance trends and, as a result, to timely
undertake corrective actions oriented to improve results. It also aims to foster
strategic coordination between the multiple organizational areas that interact
alongside the academic value creation processes (a minimal illustration follows this list).
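The sketch announced above illustrates the third phase under the assumption that each key performance indicator is mapped to the organizational unit accountable for it, so that variance reports can be routed back to the right unit. All KPI names, figures, and unit names are hypothetical.

```python
# Hypothetical mapping of KPIs to accountable organizational units
# (phase 3: the feedback process routing results back to units).
kpi_owner = {
    "enrollment growth":          "Education unit",
    "grant income":               "Research office",
    "days to issue transcripts":  "Registrar",
}

# Assumed actual and target values for each KPI
actuals = {"enrollment growth": 0.02, "grant income": 1.8e6,
           "days to issue transcripts": 12.0}
targets = {"enrollment growth": 0.04, "grant income": 1.5e6,
           "days to issue transcripts": 5.0}

# Route each variance to the unit accountable for closing it
for kpi, unit in kpi_owner.items():
    variance = actuals[kpi] - targets[kpi]
    print(f"report to {unit:16s} {kpi}: variance {variance:+g}")
```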
2.5.1 An “Objective” View of Performance Management to Frame Academic Value Creation Processes

HEIs are part of a larger chain of activities leading to the production of academic
services and products, and this chain must be considered in the design and
implementation of their performance measurement models. These models entail
deconstructing HE sector activities into elements and identifying strategic
interdependencies among them and, in some cases, with other public/private institutions.
performance in Universities requires a preliminary activity aimed at mapping the
academic value creation processes by identifying the multiple actors which directly
contribute to the provision of a final product, both inside and outside the organiza-
tion (Cosenz & Bianchi, 2013). This activity aims at:
1. Identifying the organizational units interacting within the academic value chain
leading to the production of a product/service
2. Highlighting the contribution generated by each organizational unit in terms of
academic value creation
3. Defining key-actors and associated responsibilities
4. Supporting strategic dialogue and coordination among the multiple organiza-
tional units involved in academic value creation processes
5. Measuring both unit and organizational performances through key performance
indicators tailored to the organizational setting of the process
As such, it encompasses the complete sequence of operations from input down-
stream to final products after several stages of adding value (Handfield & Nichols,
1999; Tiwari et al., 2013).
In particular, a value creation process is defined as a set of structured and
interrelated activities/operations aimed at producing a specific output to benefit a
client. Each process requires an endowment of input factors to deploy and consume
while executing the programmed operations to deliver this output. These operations,
which are organized according to a logical order in terms of space and time, are
classified into:
1. Physical, i.e., activities based on transforming material factors (e.g., collecting,
filing, manufacturing, storing).
2. Informational, i.e., activities focused on information (e.g., elaborating, memorizing, demanding).

In addition, processes are executed according to specific procedures that define the
operating methods, roles and responsibilities (i.e., who does what), duties, and
obligations to respect.
Figure 2.8 shows the core elements that identify an organizational unit responsible for executing an academic value creation process.

Fig. 2.8 Identifying an organizational unit alongside academic value creation processes
Internal actors include all organizational units operating inside the University
(e.g., departments, faculties, administrative offices, libraries, laboratories). In con-
trast, the external ones refer to those public or private organizations that interact with
the University to provide academic services or products. On the internal side,
organizational units and their interactions are distinguished to show front-stage
interaction with the client, separated from the back-stage service production (Alford
& Yates, 2014).
Adopting an “objective” view of PM is recommended to frame academic value
creation processes (Bianchi, 2016). Such a perspective implies that the products
generated by the fulfillment of administrative processes are made explicit (Fig. 2.9).

Fig. 2.9 The “objective” view of performance management (Bianchi, 2016, p. 121)
According to a value creation perspective, identifying “products” and “clients”
provides an important key to outlining an interdepartmental approach to affect
academic performance and strategic coordination. It is worth remarking that, if one
refers to academic services, a “product” identifies a result generated by the fulfill-
ment of a process or a combination of operations in favor of a given “client.” A
“client” identifies an entity (either an individual, a group of people, a front-/back-office
organizational unit, or another institution) that benefits from a given
“product” delivered by administrative processes. An administrative “product” may take a
different connotation as a function of the “client” to whom it is delivered (Pitman,
2000). In fact, by focusing on an external client (i.e., on those people/institutions/
stakeholders operating outside the University), it is possible to identify a final
“product” or, more frequently, a package of final products that are demanded. For
instance, a bachelor’s degree is a final product that, for a student (seen as a
“client”), presupposes the provision of a package of final products logically and
sequentially related to each other. Other examples of final products include the issue
of a student identification code and its related certificate, the issue of a certificate of
enrollment to a new academic year or of the approved syllabus, and the transcript of
records.
As for supplying a final product to an external client, back-office units are
expected to deliver a set of “instrumental” products to their internal clients. Internal
clients are those back-office units that receive products/services from another unit
located upstream in the value chain. They will also process these products/services to
make progress in supplying services to an external client.
Therefore, delivering an “instrumental” product affects the internal “client’s”
performance. This, in turn, influences the performance of the other internal “clients”
which are sequentially located along the academic value chain leading to the
delivery of the “final” product. Competitive advantages and end users’ satisfaction
mostly depend on both quality and efficiency resulting from developing “instrumen-
tal” products.
Based on this perspective, if one refers to a given “final” product, it is possible to
identify a system of products resulting from the fulfillment of administrative pro-
cesses by each organizational unit whose only “clients” are internal in a given
University. Such a top-down approach—which gradually moves from synthesis to
analysis—implies a more selective search of relevant data to track academic perfor-
mance.4 Moving backward in such an analysis, i.e., from final to instrumental
“products,” allows academic decision-makers to (1) frame the academic value chains
leading to “final” products, (2) make performance drivers explicit, and (3) promptly
undertake corrective actions focusing on a specific organizational unit which shows
weaknesses in its operations and related results.
This approach to academic process analysis also focuses on identifying the causal
interplays between the multiple organizational units operating and interacting within

4 A top-down approach contrasts with a bottom-up approach which, starting from analytical
elements, moves toward synthesis. This latter emphasizes the use of statistical methods designed
to collect data and information which, starting from the analysis of different organizational units’
performance, are meant to reach an overall measurement system able to express the University
global performance. Nevertheless, this data acquisition would lead to a random collection of
information characterized by a lack of both selectivity and systemic perspective on performance
achievement processes. Consequently, such an approach may limit academic decision-makers in
steering Universities according to a sustainable development perspective.
a given value creation process (Cosenz, 2014). In particular, such identification is
based on a coordination mechanism where the output (i.e., the product) generated by
one organizational unit forms the input of another unit operating downstream in the
same academic value chain. The emerging causal maps result in diagrams in which
outputs, products, clients/units, and actions are framed, with arrows indicating how
one element is linked to another (Bryson et al., 2004).
The design of a causal map adopts a “backward mapping perspective” (Elmore,
1980), in which the final output is defined first. The steps and interconnections that
might lead to it are traced, starting with the most proximate ones and working
backward. This perspective characterizes the processual steps moving from out-
comes to (final and instrumental) outputs and associated inputs (Funnell & Rogers,
2009; Alford & Baird, 1997). Figure 2.10 depicts the identification of “clients” and
“products” within the academic value chain. It also points out a distinction between
the decision-making and the performance measurement perspectives. While the former
aims to support the outline of a causal map on which to implement performance
measurement systems oriented to feed the search for variance gaps, the latter focuses
on the logical sequence of activities and operations which, based on the undertaken
decisions, allow the University to deliver its final products. The two perspectives
complement each other.

Fig. 2.10 The identification of “clients” and “products” within the academic value chain
For instance, the “undergraduate student prospectus” is a final output that a
University delivers to its potential enrolled students (i.e., external clients) as a result
of multiple administrative steps underlying critical factors influencing the provision
of this product. Such critical factors can be identified by focusing on internal clients
and corresponding “instrumental” products and the processes carried out by
back-office units located along the value chain. Instrumental products associated
with the delivery of the “undergraduate student prospectus” are course syllabus
outline delivered by departments to faculty boards, study programs offered by
faculty boards to the academic senate, curriculum proposal approved by the aca-
demic senate and sent to the “education” organizational unit, and “undergraduate
student prospectus” duly recorded and delivered by the “education” organizational
unit to potential enrolled students.
For each administrative “product” delivered to external and internal “clients,” the identification of factors affecting related academic performance requires an effort to map the following (a short illustrative sketch follows the list):
1. Processes and activities
2. Organizational units
3. Policy levers and resources
4. Performance indicators.
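
Such a mapping can also be recorded as a simple data structure to keep the chain inspectable. The following Python fragment is a minimal, purely illustrative sketch based on the prospectus example above; the field names and indicator formulas are invented for this illustration and are not prescribed by the approach.

# Illustrative encoding of the product/client mapping described above.
academic_value_chain = [
    {
        "product": "course syllabus outline",
        "supplier_unit": "departments",
        "client": "faculty boards",          # internal client
        "policy_levers": ["teaching staff allocation"],
        "indicators": ["syllabi delivered on time / syllabi due"],
    },
    {
        "product": "study programs",
        "supplier_unit": "faculty boards",
        "client": "academic senate",         # internal client
        "policy_levers": ["program portfolio design"],
        "indicators": ["programs approved / programs proposed"],
    },
    {
        "product": "undergraduate student prospectus",
        "supplier_unit": "education unit",
        "client": "prospective students",    # external client
        "policy_levers": ["publication schedule"],
        "indicators": ["actual / planned publication date"],
    },
]

# Walking the chain in reverse, from the final to the instrumental products,
# mirrors the backward mapping perspective described in the text.
for step in reversed(academic_value_chain):
    print(f"{step['product']}: {step['supplier_unit']} -> {step['client']}")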

2.5.2 An “Instrumental” View of Performance Management to Design Performance Measures in Higher Education Institutions

An “instrumental” view of PM explores how a University has achieved its results by analyzing and measuring critical drivers leading to performance (Bianchi, 2016). It
contributes to designing performance indicators focused on the combination of the
consumption of strategic resources (inputs) with the associated results (outputs).
Such a perspective is defined as “instrumental” since it contributes to making the
relation between resource accumulation/depletion and corresponding end-results
explicit by identifying performance drivers—linked to critical success factors—on
which academic decision-makers may act to affect results. In other words, it inves-
tigates how results are achieved in terms of resource allocation and consumption, as
well as how these results, in turn, create (or destroy) the associated resources.
Performance indicators are designed to measure both intermediate and end-results. While end-results refer to the provision of final products originating from a combination of multiple processes, intermediate results define the value created through the execution of simple operations within these processes. Therefore, end-results are measured to highlight the overall value generated by an HEI.
Measuring intermediate results provides a deeper understanding of how this value is generated along the academic value chain, since it focuses on the coordination mechanism between the various organizational units. These intermediate results represent the main factors driving organizational performance. Their assessment provides an understanding of how a University generates value in correspondence with those critical success factors able to affect end-results. Based on the academic value processes previously framed according to PM’s “objective” view, the drivers of performance are made explicit for assessing the results of each organizational unit interacting along these processes. Thereby, the results of an upstream organizational unit form the input of the downstream unit. This systemic perspective allows
academic decision-makers to frame the overall value chain and promptly identify
those “weakest links” on which to intervene to improve end-results. To use a
metaphor, while the end-results represent the speed of an organization’s perfor-
mance, the performance drivers denote performance acceleration. On the other hand,
strategic resources can metaphorically be depicted as the forces upon which
decision-makers act to affect the acceleration rate, and through it, the speed at
which an organization is traveling (Bianchi et al., 2015).
As a result, the “instrumental” view of PM is primarily concerned with identify-
ing both end-results and related drivers. Academic decision-makers must build up,
preserve, and deploy an adequate endowment of interdependent strategic resources
to affect such drivers. This also implies that decisions made by different unit
managers upon interdependent strategic resources should be coordinated according
to a systemic view (Cosenz, 2013). Particularly, each strategic resource should provide the basis to sustain and foster the others in the same system. For instance, lecturers and education facilities (e.g., laboratories, computers, classrooms) provide teaching capacity, which affects the perceived education quality. This impacts the University’s image, which, in turn, influences student satisfaction. A change in student satisfaction will affect enrollments and, perhaps, the stock of available financial resources (e.g., enrollment fee payments), which can then be reinvested to increase teaching capacity and education quality.
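
This reinforcing loop can be sketched numerically. The following Python fragment is a deliberately minimal illustration rather than a validated model: every coefficient and initial value is invented, and the functional forms merely instantiate the causal chain just described.

# Invented parameters: a stylized capacity-quality-image-enrollment loop.
lecturers, facilities = 100.0, 80.0     # strategic resources
image, liquidity = 0.6, 1_000_000.0     # image on a 0-1 scale
fee = 2_000.0                           # enrollment fee per student

for year in range(1, 6):
    teaching_capacity = 0.7 * lecturers + 0.3 * facilities
    education_quality = min(1.0, teaching_capacity / 120.0)  # normalized driver
    image += 0.1 * (education_quality - image)               # image adjusts slowly
    satisfaction = 0.5 * education_quality + 0.5 * image
    enrollments = 1_500 * satisfaction                       # end-result
    liquidity += enrollments * fee - 2_500_000               # fees in, costs out
    lecturers += max(0.0, liquidity) * 0.00001               # reinvestment in staff
    print(f"year {year}: enrollments={enrollments:.0f}, "
          f"image={image:.2f}, lecturers={lecturers:.1f}")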
Therefore, understanding how strategic resources influence and are influenced by
achieved results becomes a key issue to manage performance in HEIs, whose
organizational setting is characterized by dynamic complexity (Pucciarelli &
Kaplan, 2016).
Figure 2.11 illustrates how end-results provide an endogenous source inside an
HEI for the accumulation and depletion processes that affect those strategic
resources that cannot be directly purchased from the market. These are the resources
generated by management routines (e.g., University’s image and reputation, orga-
nizational climate, academic staff burnout), equity, and liquidity.
In particular, drivers are classified according to the main performance dimen-
sions, i.e., competitive, social, and financial.
Competitive performance drivers are associated with critical success factors in the competitive academic system. They can be measured in relative terms, as a ratio between the organizational performance perceived by clients (e.g., students) and a benchmark or target value.5 Such a denominator must be gauged by considering past performances, clients’ expectations, or even (if relevant) competitor Universities’ performance.

5 Neely et al. (1995) argue that benchmarking is used as a means of identifying improvement opportunities as well as monitoring the performance of competitors. They also cite Camp (1989) as offering the most comprehensive description of benchmarking: the search for organizational best practices that lead to superior performance. In terms of performance management, however, Neely et al. quote Oge and Dickinson (1992), who suggest that organizations should adopt closed-loop performance management systems combining periodic benchmarking with ongoing monitoring. Such systems are able not only to provide the measure related to the performance of each organizational unit but also to explain how its distinctive performance contributes to the overall academic result.

Fig. 2.11 The “instrumental” view of performance management (Bianchi, 2016, p. 73)
Social performance drivers can be measured in terms of ratios between strategic assets and a target, which can mostly be expressed in terms of stakeholders’ expectations or perceived past organizational performance. For instance, a social performance driver could refer to unemployment reduction, measured as the ratio between the actual and the planned number of graduated students with a job.
Financial performance drivers must also be measured in relative terms. For
instance, the debt-to-total investment ratio often affects the change in University
solvency perceived by external investors. Efficiency measures affecting operational
costs can be gauged in terms of ratios as well.
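
A short numerical sketch may help fix these three driver types. The Python fragment below uses invented figures; each formula simply instantiates the ratio logic (an actual value over a benchmark or target) described in the text.

# Competitive driver: performance perceived by clients vs. a benchmark.
perceived_quality, competitor_quality = 0.78, 0.82
competitive_driver = perceived_quality / competitor_quality      # < 1: lagging

# Social driver: strategic asset vs. stakeholders' target, e.g., employed graduates.
employed_graduates, planned_employed_graduates = 430, 500
social_driver = employed_graduates / planned_employed_graduates  # 0.86

# Financial driver: debt over total investments, read against a solvency target.
debt, total_investments = 12_000_000, 40_000_000
financial_driver = debt / total_investments                      # 0.30

print(competitive_driver, social_driver, financial_driver)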
Combining the “instrumental” with the “objective” view forms the methodological background for designing performance measurement mechanisms to frame academic performance throughout the multiple value creation processes in HEIs. As a result, the emerging process analysis and associated performance indicators support academic decision-makers in undertaking corrective actions and plans to improve performance in each of the observed processes.
As described above, PM’s “instrumental” view provides a method to design performance drivers and end-result indicators to measure academic performance. These measures are subsequently placed along the academic value creation processes to evaluate the results achieved by the corresponding organizational units operating within the same process. As displayed in Fig. 2.12, measuring academic performance through the “instrumental” view may support academic managers in undertaking actions upon strategic resources, thereby improving performance in a timely manner.

Fig. 2.12 Academic performance improvement vs. process innovation
The “objective” view adopts a selective approach to explore and outline academic value creation processes. Process mapping begins from the end, i.e., by identifying the final product the University delivers to its external clients, and gradually moves back across the value chain, thus spotting the instrumental outputs and the related organizational units interacting therein.
In this way, such a process mapping approach enables selectively detecting and eliminating the activities, operations, and associated organizational units that do not add value to the process under observation. In a different—although complementary—perspective, PM’s “objective” view may also significantly boost academic performance in terms of process innovation.

2.5.2.1 Strategic Resources in Higher Education Institutions

According to a sustainable development perspective, effective goal setting and decision-making require a preliminary analysis of the available resources (e.g., in
terms of size and consistency) and means that a University owns and may use in
academic value creation processes. Such an analysis is fundamental to define
objectives and related actions and allocate a proper asset endowment—coherent
with the assigned objectives—to each organizational unit located along the academic
value chain. These assets are represented by tangible and intangible resources whose
consumption allows the University to achieve its results over time.
In particular, strategic resources are classified into (Bianchi, 2016):
• Physical resources (e.g., staff, academic furnishings, inventory) which HEIs can
purchase on the market
• Capacity resources (e.g., research productivity, knowledge transfer capacity)
which define critical operational factors in delivering academic products and
services
• Information resources (e.g., expected enrollment demand, management percep-
tions, new labor market needs) which support decision-making as intangible
factors related to “coded” and “non-coded” information
• Resources generated by internal management routines (e.g., knowledge, aca-
demic reputation, research, and teaching quality) which cannot be purchased on
the market since they result from the reinforcement of organizational procedures
and habits over time
• Financial resources (e.g., bank balances, equity, asset value) reported in the
University’s balance sheet.
Intangible resources and, specifically, intellectual capital in Universities are
widely recognized as crucial factors in creating and preserving sustainable compet-
itive advantages, especially nowadays in the so-called knowledge-based economy
(Cañibano & Sánchez, 2009; Sánchez et al., 2009; Sveiby, 2001). As remarked by the European Commission (2003), the main goals of Universities relate to the production, diffusion, and transfer of knowledge. Namely, managing intellectual capital in HEIs has become critical mainly because their most important investments are in research and human resources (Elena, 2004). As such, both academic inputs and outputs are mainly intangible. Intellectual capital includes three basic interrelated
components: human capital, structural capital, and relational capital (Ramírez et al.,
2007; Cañibano & Sánchez, 2004; Stewart, 1997; Edvinsson & Malone, 1997). In
HEIs, such components are characterized as follows (Ramírez Córcoles, 2013):
• Human capital corresponds to the set of both explicit and tacit knowledge of the academic staff (e.g., professors, lecturers, scholars), acquired through formal and informal education and updating processes and embodied in their activities.
• Structural capital is the explicit knowledge related to the internal processes of
dissemination, communication, and management of scientific and technical find-
ings within the University; it can be both “organizational” (i.e., the operating
environment resulting from the interplay between research, management and
organizational processes, technology, and culture), and “technological” (e.g.,
patents, licenses, proprietary software, databases).
• Relational capital gathers the wide set of economic, political, and institutional
relationships developed and maintained by Universities (e.g., collaborations with
firms and institutions, joint funding projects’ involvement).
However, measuring intangibles is not an easy task (European Commission,
2003; Leitner & Warden, 2003; Cañibano & Sánchez, 2004). The OECD (1996)
remarked that current intellectual capital indicators fail to capture the fundamental
aspects of developing new economies, leading to erroneous economic policy design.
Therefore, adequate measurement of intangibles becomes essential to understand
what is happening in modern economies. As Foray (2004) asserts, relevant obstacles
affect knowledge assessment. These are related to the following reasons:

Table 2.4 The strategic matrix by the Observatory of European Universities (2005)

Columns (thematic dimensions): Funding | Human resources | Academic outcomes | Third Mission | Governance
Rows (transversal issues): Autonomy | Strategic capabilities | Attractiveness | Differentiation profile | Territorial embedding
Each cell of the matrix collects the key questions and indicators for the corresponding dimension–issue pair.

1. An important part of knowledge is implicit.
2. The different elements of knowledge are heterogeneous.
3. Knowledge is not observable; the terms and magnitude of the relation between
the creation of knowledge, its diffusion, and economic growth are not known.
In addition, the OECD (1999) argued that it is more complicated to design
comparable indicators for intangibles than for tangibles. Although it is generally
acknowledged that intangibles create added value, this cause-and-effect chain is
complex to quantify (Lev, 2000).
Attempts to measure intangibles in HEIs have been inspired by well-known performance measurement frameworks, such as the Balanced Scorecard by Kaplan and Norton (1992) or the Intangible Assets Monitor by Sveiby (1997). According to a Resource-based view (Wernerfelt, 1984; Peteraf, 1993; Dierickx & Cool, 1989), these frameworks aim to maintain an appropriate consistency between the different assets to foster sustainable development. For instance, in 2005, a project promoted by the Obser-
vatory of European Universities developed a framework of analysis—called “stra-
tegic matrix”—oriented to build performance indicators to measure and compare the
intangible elements related to research activities. As reported in Table 2.4, such a
framework is organized through five thematic dimensions and five transversal
questions (Sánchez & Elena, 2006). The thematic dimensions are:
1. Funding (i.e., all budget elements, analyzing revenues and expenses)
2. Human resources (i.e., administrative staff, research and teaching staff, Ph.Ds.)
3. Academic productions (i.e., results from research activities in all fields, such as
scientific publications)
4. Third Mission (i.e., all the activities and relations between Universities and
non-academic partners, such as firms, public and nonprofit organizations, local
government, and society as a whole)
5. Governance (i.e., the process by which the University turns its inputs, such as
funding and human resources, into research outputs, e.g., academic outcomes and
Third Mission results emerging from research activities)
The transversal issues include:

1. Autonomy (i.e., the decisional autonomy of a University to allocate resources or to invest funds)
2. Strategic capabilities (i.e., the concrete aptitude of a University to implement its
decisions)
3. Attractiveness (i.e., the University’s ability to attract resources within a context of
scarcity, such as money, people, equipment, and collaboration)
4. Differentiation profile (i.e., the core competencies of a University that distinguish
it from its competitors)
5. Territorial embedding (i.e., the strategic alliances that a University maintains
within its regional area)
Although it combines relevant performance dimensions within a unique matrix, the above framework shows some methodological shortcomings. It does not clearly separate inputs and outputs in the value creation process. Leitner (2004, p. 133) remarked that it is necessary to separate inputs, processes, and outputs to detect the weaker organizational areas on which to concentrate managerial efforts in terms of resource stabilization. Moreover, this framework only focuses on research activities, whereas it should also be extended to education (Sánchez & Elena, 2006) and related results. Finally, measurements are mainly based on intellectual capital elements without considering the effect of tangibles on the achievement of strategic goals. As a result, academic decision-makers do not have enough information about how to manage each element to enhance overall performance (González-Loureiro & Teixeira, 2012).
Another remarkable attempt to measure intangibles in HEIs has been developed by Fazlagic (2005) with a focus on the Poznan University of Economics in Poland. In this framework—shown in Table 2.5—both human and structural capital are measured through a set of indicators gauged to specific resource categories, the activities undertaken to reinforce them, and the associated results. Although limited to specific resources, it uses an approach that identifies a system of interdependencies between “resources, activities, and results.”
This perspective should be extended to Universities’ education, research, and Third Mission activities to generate a systemic framework of causal interplays between key resources. This would foster a deeper understanding of how to manage the accumulation and depletion processes of resources to improve asset consistency. There is still not enough information about the impact that input management has on the overall performance of Universities. In addition, it is advisable to investigate more deeply both what to achieve and how to achieve it, as these are the crucial elements that most influence overall performance in the development of the threefold mission of HEIs (González-Loureiro & Teixeira, 2012).
Both the coordination and the management of strategic resources allow Universities to pursue results that support the academic value creation processes by feeding strategic resources back. For example, increasing research products (e.g., publications) depends on intellectual capital consumption. This result will enhance the image of the University, which, in turn, is likely to attract external investors willing to sponsor new research activities. As a result, external funds can be used to improve the University’s intellectual capital (e.g., research training) according to a virtuous feedback logic.

Table 2.5 The intellectual capital measurement matrix by Fazlagic (2005)

Human capital
• What is there? (resources): number of researchers; share of researchers in total employment; average age of a researcher; women in science (share of women in the workforce); inbreeding (share of researchers who are graduates of the University); average number of publications per researcher
• What has been invested? (activities): research spending per employee; ICT spending per employee; time spent in internal seminars per employee
• Which objectives have been achieved? (results): number of newly recruited staff; number of contracts turned down with regret; staff satisfaction; staff turnover; added value per employee; composite employee satisfaction index

Structural capital
• What is there? (resources): share of women occupying managerial positions; number of chairs (departments); average employment in a chair (department); number of PCs per employee; number of research projects underway; average number of publications per chair (department)
• What has been invested? (activities): total investment in research infrastructure; success ratio in project acquisition; research spending per chair (department); participation in international conferences (number of conferences attended, number of researchers attending conferences)
• Which objectives have been achieved? (results): number of international students; share of international staff; name recognition and reputation (based on press ranking lists); student satisfaction index; number of students; number of courses

To enhance the acquisition of enduring competitive advantages over competing Universities, the use of strategic resources should be associated with the development of the core competencies linked to the critical success factors for education, research, and Third Mission activities (Aaker, 1989; Dierickx & Cool, 1989; Coda, 2010). It is worth underlining that resource measures should not be confused with performance measures. While the former provide information on asset consistency, volume, quality, etc., to achieve goals and objectives, the latter focus on the performance drivers affected through the exploitation of resources and on the corresponding end-results achieved in the long term. Measuring
resources is indeed essential to set achievable, though challenging, goals and
objectives.

2.5.2.2 End-Results and Intermediate Results

Once goals and objectives are defined according to the available strategic resources, the associated results are made explicit. As previously stressed, it is worth
distinguishing “end” from “intermediate” results. End-results refer to what the
University can achieve from interacting with the external environment based on
academic products and services provided. They can be expressed in financial terms
(e.g., income, cash flow), in competitive terms (e.g., newly enrolled students, new
publications), and in social terms (e.g., change in University’s image, change in
student satisfaction). In the short term, academic decision-makers cannot directly influence end-results, as these emerge from the combined effect of different operations, processes, and activities leading to the last segment of the academic value chain. For instance, the rector or academic boards cannot directly affect a change in the University’s image or student satisfaction. Rather, the effects of decisions made upon the introduction of new educational programs more responsive to current labor market needs may, in the long term, result in a change of such critical resources.
As Bianchi (2016) remarks, end-results are measured over a sequential chain and
classified in multiple interrelated layers (Fig. 2.13). The first layer’s end-results most
synthetically capture the overall value generated by the University. The creation of
this value involves a change in the endowment of those strategic resources that
cannot be purchased in the market since they depend on the results emerging from
internal management routines, equity, and liquidity. Equity and liquidity are
increased by, respectively, income and cash flow.

Fig. 2.13 An example of multiple academic end-result layers



Fig. 2.14 End-results affecting strategic resources

In the example shown in Fig. 2.13, net income and net cash flow are influenced by
current income and current cash flow, respectively. These second layer end-results
are also affected by other results classified in the third layer, such as earnings,
depreciation charges, and cash outflow payments. In this case, earnings depend on
students’ enrollment fees, which, in turn, are influenced by students’ enrollment.
Likewise, depreciation charges and cash outflow payments are affected by the
change in academic assets (e.g., classroom furniture, laboratory facilities, etc.)
purchased because of new enrollments. Students’ enrollments and purchased academic assets can be classified in a final layer of end-results. It is worth underlining that this approach implies the coexistence of different performance dimensions in a unique influence diagram. Thus, a competitive result—such as students’ enrollments—can affect a financial result, i.e., earnings and, consequently, income.
Following this example, Fig. 2.14 displays how end-results affect strategic
resources, i.e., equity and liquidity.
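
To make the layering concrete, the chain of Figs. 2.13 and 2.14 can be worked through with invented numbers. The Python sketch below is illustrative only; what matters is how last-layer results (enrollments and purchased assets) cascade into third-, second-, and first-layer end-results, which finally feed the stocks of equity and liquidity.

# Invented figures tracing the layered end-result chain.
enrollments = 1_200                        # last-layer competitive end-result
fee = 2_000.0
new_assets_cost = 300_000.0                # academic assets bought for new intakes

earnings = enrollments * fee               # third layer
depreciation = new_assets_cost / 10        # third layer (10-year straight line)
current_income = earnings - depreciation   # second layer
net_income = current_income - 150_000      # first layer, after other net charges

cash_outflows = new_assets_cost + 150_000  # payments for assets and other charges
net_cash_flow = earnings - cash_outflows   # first layer, cash perspective

equity, liquidity = 5_000_000.0, 800_000.0
equity += net_income                       # end-results feed strategic resources
liquidity += net_cash_flow
print(net_income, net_cash_flow, equity, liquidity)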
In addition, end-results can be divided into outputs and outcomes. Outputs refer to
those quantitative results that a University can achieve in the short term, such as the
number of graduations in the last academic year, number of new enrollments in the
last academic year, number of articles published in scientific journals in the past
semester, etc. Outcomes correspond to the long-term impact of such outputs on a broader context, e.g., the University’s region (Bianchi et al., 2019).
In particular, this impact arises from the aggregated contribution of multiple
institutions (private, public, nonprofit) which interact with the University and, in
doing so, extend the public value chain aiming at fostering their regional socio-
economic development. These long-term effects encompass quantitative and quali-
tative results (Kallio et al., 2017), such as new firms started in academic incubators,
new academic spin-offs, reduction in the regional unemployment rate, new research
agreements with private/public organizations, etc. Third Mission activities
implemented by HEIs specifically focus on outcomes. Lack of focus on both kinds
of end-results may imply the risk of adopting a myopic PM approach that would
prevent Universities from pursuing their sustainable development.
Based on a systemic perspective, end-results are affected by intermediate results. These are outputs preparatory to achieving end-results, i.e., they drive their fulfillment. This type of result is extremely relevant in designing and using PM systems, as it reflects how a University develops and exploits its core competencies to create value (i.e., end-results). Namely, core competencies are related to the critical success factors of the HE sector (Coda, 2010; Hamel & Prahalad, 1994), and their use can be associated with a range of value drivers that allow the University to manage effective organizational changes to improve its performance.
Measuring these value drivers enables academic decision-makers to understand
better the causal determinants and related interplays that significantly impact
end-results. Consequently, desired corrective actions applied to such drivers can
be promptly undertaken in the short term. In fact, unlike end-results, academic
decision-makers may directly influence these drivers to pursue organizational
goals effectively. Value drivers can be affected through a different allocation of
strategic resources among organizational units set by decision-makers over time.
According to the “instrumental” view of PM, the identification of end-results,
intermediate results, and related strategic resources can focus on the whole organi-
zation (macro level), as well as on specific organizational units or processes (micro
level).
Figure 2.15 shows an example of how to apply the “instrumental” view of PM at a
macro level. In this framework, the first layer end-result is cash flow influenced by
other end-results, such as expenditures, earnings arising from new enrollments, and
the funds allocated by the Ministry of Education and Research to Universities
according to the performance-based funding system. The change in the University’s
image is also included among the end-results.
According to a systemic perspective, these end-results are influenced by a set of intermediate results. For instance, the relative image—measured by comparing the University’s image with its competitors’—affects new enrollments: if the University’s image is stronger than its competitors’, students will be more inclined to enroll in its educational programs. Likewise, education and research quality, together with resource efficiency, generate an effect on the social and competitive performance of the University, which, in turn, will affect its ranking
in the ministerial allocation of public funds (i.e., HE performance-based funding mechanism). Resource efficiency also generates an effect on expenditures.

Fig. 2.15 Applying the “instrumental” view of PM at a macro level
Intermediate results are influenced by strategic resources, which, in turn, change through end-results. For instance, an increase in the University’s image will likely improve the relative image, as previously defined. Similarly, an increase in the University’s liquidity due to its cash flow may involve higher investments in capacity (e.g., more effective research equipment, leaner administrative programs and procedures, online education, etc.), improving research quality and resource efficiency. Liquidity may also be used to hire more academic staff, who will develop intellectual capital and, as a result, may expand both research and education quality.
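
Since such a macro-level map is essentially a directed graph, its feedback loops can be surfaced mechanically. The Python sketch below encodes a simplified version of the map described above (node names paraphrase the text and are not exhaustive) and uses a small depth-first search to list each loop once.

# Simplified, illustrative causal map: node -> nodes it influences.
causal_map = {
    "image": ["relative image"],
    "relative image": ["new enrollments"],
    "new enrollments": ["cash flow"],
    "cash flow": ["liquidity"],
    "liquidity": ["capacity investments", "academic staff"],
    "capacity investments": ["research quality", "resource efficiency"],
    "academic staff": ["intellectual capital"],
    "intellectual capital": ["research quality", "education quality"],
    "research quality": ["ministerial funds"],
    "resource efficiency": ["expenditures", "ministerial funds"],
    "ministerial funds": ["cash flow"],
    "education quality": ["image"],
    "expenditures": [],
}

def find_loops(graph):
    loops, path = set(), []
    def dfs(node):
        if node in path:                        # a feedback loop has closed
            cycle = tuple(path[path.index(node):])
            i = cycle.index(min(cycle))         # canonical rotation: report once
            loops.add(cycle[i:] + cycle[:i])
            return
        path.append(node)
        for nxt in graph.get(node, []):
            dfs(nxt)
        path.pop()
    for start in graph:
        dfs(start)
    return loops

for loop in sorted(find_loops(causal_map)):
    print(" -> ".join(loop + (loop[0],)))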
PM’s “instrumental” view can also be applied to a micro level of analysis. Two examples of such an application to research and education are displayed in Fig. 2.16. In particular, new patents launched on the market are conventional end-results pursued by HEIs. Their commercialization depends on the new patents designed (but not yet tested) by the research staff, identified as intermediate results. The main strategic resources affecting patent design are the number of research staff and the sponsored research projects in which they are involved. Both depend on the stock of available financial resources, which may be increased through the earnings gained by patent sales.
As for education, new graduations depict relevant end-results of this area. These are influenced by the number of new theses developed by students about to graduate. These intermediate results depend on the number of enrolled students and the number of teaching staff supervising thesis development. Eventually, graduations may affect the University’s reputation, which, in turn, contributes to attracting newly enrolled students and retaining talented teaching staff.

Fig. 2.16 Applying the “instrumental” view of PM at a micro level

Once end-results, value drivers, and related interplays have been identified, an
effective PM approach requires the design of performance indicators to measure how
the University is pursuing its goals. Therefore, the following section illustrates how
to build measures that capture the combined effect of end-results and related
performance drivers within a PM system to support academic decision-making
processes.

2.5.2.3 Designing Performance Drivers and End-Results in Higher Education Institutions

Setting an effective PM system implies the identification of end-results and related value drivers (i.e., intermediate results), as well as the design of valuable indicators to support the diagnosis of the achieved results and the search for emerging value discrepancies. This may provide academic decision-makers with a better understanding of the critical factors affecting performance, which facilitates the identification of weaknesses and the associated organizational units on which to focus corrective actions.
In particular, indicators should be designed taking into account the multiple
dimensions of academic performance, i.e., financial, competitive, and social. Exces-
sive attention paid only to financial aspects would lead to partial performance
measurement. It would not fully capture the effects of all the other variables (e.g.,
time, quality, volume) that are equally relevant for the fulfillment of academic
activities.
Therefore, it is necessary to integrate the system of financial indicators with other performance parameters able to complete the academic performance measurement framework by analyzing other key variables that create value. In this regard, indicators may be expressed in monetary terms (i.e., currency), physical terms (e.g., number of publications, number of students, teaching hours), and dimensionless terms by using value scales ranging between zero and one (e.g., student satisfaction, University’s image and reputation).

As previously stressed, performance indicators can be measured in relative terms, i.e., as a ratio between the actual performance and a benchmark or target value. A
significant distinction between performance indicators and performance indexes
must be considered when designing and using performance measurement systems.
According to PM’s “instrumental” view, both are expressed as ratios. However,
indicators are used to measure performance drivers affecting end-results or other
performance drivers. As such, they strongly support academic decision-makers in
identifying dysfunctionalities and related causes and undertaking adequate correc-
tive actions.
Conversely, indexes form synthetic measures of the quality or state of a system (e.g., ROI, return on investment) and do not affect any specific performance
measure. As Bianchi (2016, p. 83) maintains, “while performance drivers are
relevant measures for performance management, performance indexes can be
relevant for performance measurement only.”
As with goal and objective setting (see Sect. 2.4), effective criteria for building performance indicators suggest that they should be specific, measurable, assignable, realistic, and time-related (Doran, 1981; Conzemius & O’Neill, 2006; Poister, 2003).
In addition, performance drivers and end-result measures can be associated with factors related to the following (a numerical sketch of some of these ratios follows the list):
• Volume, which measures the value generated by processes and underlying activ-
ities in terms of output quantity (e.g., the ratio between actual and desired
publications, the ratio between actual and expected graduations)
• Quality, which analyzes the excellence of academic outputs (e.g., % of publica-
tions on top-ranked journals, the number of citations per publication)
• Efficiency, which measures the aptitude to saturate academic resource capacity
and to maximize throughput (e.g., the ratio between supervised theses and
teaching staff, the ratio between publications and research staff)
• Time, which assesses relevant time intervals to produce academic outputs (e.g.,
the ratio between actual and expected time to issue degree certificates, the ratio
between actual and expected research project completion time)
• Productivity, which expresses the aptitude of available resources to process the
workload over a given period (e.g., tutored students per tutors in the last semester,
completed research projects per research staff in the previous academic year)
• Flexibility, which determines the aptitude of the University to promptly fine-tune its operations, processes, and management according to external and internal changes with the minimum waste of resources (e.g., the average time to face an increase in enrollments by allocating more teaching staff per educational program, or the average time to implement new procedures following a ministerial decree, to introduce new educational programs, or to adopt new student evaluation systems)
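
The Python fragment below instantiates some of these categories with invented figures, showing that each indicator reduces to a ratio between an actual value and a capacity, benchmark, or expected value.

# Volume: actual vs. desired publications.
volume = 185 / 200                     # 0.925

# Quality: share of publications in top-ranked journals.
quality = 52 / 185                     # about 0.28

# Efficiency: supervised theses per teaching staff member.
efficiency = 640 / 80                  # 8 theses per supervisor

# Time: actual vs. expected days to issue degree certificates.
time_driver = 12 / 10                  # > 1 means slower than expected

# Productivity: completed research projects per researcher in the last year.
productivity = 36 / 120                # 0.3 projects per researcher

print(volume, quality, efficiency, time_driver, productivity)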
Following the examples shown in the previous section, the design of performance drivers and end-result measures has been applied to research and education. Figure 2.17 displays how to build performance indicators associated with the launch of new patents on the market. In the same framework, causal connections

Fig. 2.17 Designing performance drivers and end-result measures for research

have been identified to trace the critical factors affecting performance. Two
interconnected indicators have been built to measure end-results, i.e., (1) the ratio
between new patents in the market and designed patents and (2) the ratio between
sold patents and patents in the market. While the latter influences the stock of
financial resources, the former is affected by a set of interconnected performance
drivers. These are (1) the ratio between tested and designed patents, (2) the ratio
between designed and expected patents, and (3) the ratio between research staff and
research projects aimed at inventing new patents. The linkages make causal relation-
ships explicit according to a sequential order. These performance drivers can be
affected by strategic resources, such as research staff and sponsored research projects
which, in turn, depend on available financial resources.
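
The indicator chain of Fig. 2.17 can be traced with a few invented numbers. The Python sketch below chains the measures in the sequential order given in the text, from strategic resources through performance drivers to end-results, and closes the loop by feeding patent sales back into financial resources.

# Invented values for the patent example of Fig. 2.17.
financial_resources = 4_000_000.0
research_staff, research_projects = 60, 15

staff_per_project = research_staff / research_projects   # driver (3)
expected_patents, designed_patents = 10, 8
design_driver = designed_patents / expected_patents      # driver (2)
tested_patents = 6
testing_driver = tested_patents / designed_patents       # driver (1)

patents_in_market = 5
market_ratio = patents_in_market / designed_patents      # end-result (1)
patents_sold = 3
sales_ratio = patents_sold / patents_in_market           # end-result (2)

financial_resources += patents_sold * 250_000            # sales replenish funds
print(staff_per_project, design_driver, testing_driver,
      market_ratio, sales_ratio, financial_resources)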
Similarly, Fig. 2.18 portrays an example of how to build indicators for a conventional education output, i.e., graduations. In this context, end-results are measured by (1) the ratio between new and expected graduations and (2) the enrollment fees paid by enrolled students. New graduations are affected by two interconnected performance drivers: (1) the ratio between developed and supervised theses and (2) the ratio between teaching staff and enrolled students (those about to graduate). These drivers, in turn, are influenced by strategic resources, such as teaching staff and enrolled students. Enrollment fees can increase financial resources, which can be invested in hiring new teaching staff. On the other hand, new graduations are likely

Fig. 2.18 Designing performance drivers and end-result measures for Education

to influence student satisfaction, affecting the University’s reputation. Ultimately, a high academic reputation can attract new students and teaching staff.
Based on the analysis of the three complementary views of PM (i.e., “subjective,”
“objective,” and “instrumental”), the following section illustrates how to combine
each of them into an integrated performance framework. This integration aims to
support the coordination between the multiple organizational units intervening in the
academic value chain and to foster a strategic learning approach to University
management.

2.6 Combining a “Subjective,” “Objective,” and “Instrumental” View for Enhancing Coordination and Consistency in HEI’s Administration and Performance Measurement

Operationalizing the analysis described here adopts a three-dimensional framework. As previously remarked, three complementary views are relevant to manage academic performance (Bianchi, 2016; Cosenz, 2014):

Fig. 2.19 Three complementary views for designing PM systems in HEIs (Bianchi, 2016, p. 137)

1. An “objective” view
2. An “instrumental” view
3. A “subjective” view
To sum up, the “objective” view implies that products generated by the fulfill-
ment of administrative processes are made explicit. This approach requires a back-
ward analysis to identify final/instrumental products and related external/internal
clients.
The “instrumental” view allows academic decision-makers to identify end-results
and performance drivers, as well as to understand how strategic resource allocation
may affect performance. It also explores how strategic resources are in turn
influenced (i.e., increased or depleted) by end-results. This perspective aims to
define a set of measures regarding both performance drivers and end-results. Possi-
ble examples of performance drivers used in HEIs can be those which measure the
promptness in updating educational programs, the effectiveness of academic equip-
ment (e.g., number of breakdowns), and the productivity of research staff. Each
organizational unit is expected to build up, preserve, and deploy an adequate
endowment of strategic resources (Ewell, 1999).
The “subjective” view provides a synthesis of the previous two perspectives since
it makes explicit—as a function of pursued results—processes and activities to be
undertaken, together with related objectives and performance targets to be included
in the budgets of each organizational unit. This view requires that performance
measures associated with academic service delivery are made explicit and linked
to the goals and objectives set by decision-makers operating in different organiza-
tional units.
Figure 2.19 provides a synthesis of the three complementary views of PM, as
described above.

Fig. 2.20 Synergies among the three views of PM to foster strategic coordination in University
management

Thereby, the synergic interplay between the three views of PM allows academic
decision-makers to frame value creation processes. This approach does not claim to
solve all the critical issues that typically affect public sector management and its
value creation processes (see these issues in Sect. 1.7—e.g., sub-optimization,
gaming, data manipulation, etc.). Counteracting such aspects additionally requires ethical conduct, professionalism, and a pervasive organizational culture rooted in a robust vocational inclination toward a systematic improvement of performance
related to public service delivery. Under these premises, University administrators
may leverage this systemic perspective of PM to take advantage of a continuous
strategic learning process to explore the specific complexity of academic institutions
(Moynihan, 2005; Moynihan & Landuyt, 2009). To this end, this approach contributes to tracing and outlining the causes that have led to the achievement of specific results and, consequently, to detecting those critical factors or units on which to focus management efforts. As such, it may effectively support a strategic learning
approach to manage Universities, mainly when they operate in a context character-
ized by dynamic complexity and unpredictability.
In addition, the possibility to frame the interactions between the organizational
units operating in the same academic value creation processes may foster strategic
coordination in decision-making, resource allocation, and performance measure-
ment. PM systems must focus on these crucial interactions—where the output of a unit forms the input of another one located downstream in the value chain—to support strategic coordination, particularly between central (e.g., rectorate) and peripheral (e.g., departments, faculties) academic structures (Fig. 2.20).
Figure 2.21 illustrates an example of how to apply the three views of PM in HEIs.
This example is excerpted from Cosenz (2014). Particularly, such a framework shows a set of measures related to both performance drivers and end-results regarding four administrative products, i.e., “publications,” “enrollments,” “graduations,” and “partnerships and recruitments.”

Fig. 2.21 An example of strategic coordination in HEIs through a PM perspective (adapted from Cosenz, 2014)
Both academic organizational units and areas are identified concerning each
product. Specifically, publications are research products developed in departments.
Central offices and faculties are responsible for enrollments and graduations in the
education and administration areas. Partnerships with stakeholders (e.g., enterprises,
public organizations) and recruitments are part of the Third Mission activities with
the support of the University’s administration. The placement office mainly
coordinates them.
Moreover, a systemic perspective of the academic value creation processes has
been adopted to underline how end-results of a given product provision contribute to
enhancing (or depleting) strategic resources of another one located down the value
chain.
Performance drivers are identified in correspondence to end-results, whereas
related strategic resources are made explicit for each selected product. Performance
drivers are ratios between a current state of a resource and a benchmark. Benchmark
values may come from competitor standards or reference targets defined in internal
strategic planning processes (Ammons, 2000, 2001; Gerrish, 2016).
The design of the above framework started with identifying end-results related to
each selected product. Respectively, end-results are:
• Change in submitted articles, change in publications, and change in University’s
image and cash flow, as a result of the publication process
• Change in enrolled students and change in University’s image, enrollment fees,
and cash flow, as a result of the enrollment process
• Change in graduated students, change in University’s image, change in teaching
equipment capacity, and change in graduates’ skills, as a result of the graduation
process
• Change in stakeholders’ network, graduates’ recruiting rate, and change in the
University’s image due to partnerships and recruiting strategies
Subsequently, the design of performance drivers has focused on those factors
affecting the above end-results.
To produce a change in “publications,” some ratios influence the change in
submitted articles. They are the ratio between submitted articles per researcher per
year and its reference target, the ratio between research staff involved in developing
research projects and its reference target, and the ratio between actual and reference
time to develop research projects. These indicators aim to measure research staff
productivity and efficiency. The more articles submitted, the higher the possibility of publishing research outputs in academic journals. However, this is not sufficient. To publish in high-ranked journals, it is also worth measuring research quality against competitors’ quality, so as to monitor the quality of research activities and related outputs. A higher number of publications, particularly in high-ranked journals, is likely to affect the University’s image and its cash flow, since such a parameter commonly influences the ministerial criteria used to allocate public funding toward Universities.

The described performance drivers, in turn, can be influenced by a set of strategic resources, such as research staff and their skills (or intellectual capital), libraries and
electronic databases, working papers, research projects, and liquidity. The
end-results generated in this area increase (or decrease) these resources (e.g., liquid-
ity) and support coordination with other organizational areas responsible for fulfill-
ing related processes. For instance, increasing publications may build up the stock of
educational materials (i.e., publications and textbooks) used by students to complete
their curricula and graduate. Likewise, a change in image and cash flows affects the
corresponding endowment of resources (i.e., image and liquidity) used to attract and
enroll students in educational programs.
Besides the University’s image and liquidity, multiple strategic resources are involved in the process aimed at increasing “enrollments.” They include teaching and administrative staff, prospective students looking to enhance their knowledge through attending academic courses, and the educational programs offered by the University. These resources influence a set of performance drivers. In particular,
the ratio between University’s and its competitors’ image, the ratio between inno-
vative educational programs and competitors’ ones,6 the ratio between academic
staff involved in promoting educational programs and its reference target, and the
ratio between the time to promote educational programs and its reference target
influence the change in enrolled students. The ratio between innovative educational
programs and competitors’ ones also affects the change in the University’s image.
As a result, a change in enrolled students may generate more cash flows through
enrollment fee payments (end-results).
The students enrolled in educational programs are fundamental resources in the
“graduation” process. Other relevant resources in this context correspond to teaching
staff and related skills, teaching equipment capacity (e.g., classrooms, computers,
laboratories, technical instruments, etc.), and publications and textbooks. These
resources are connected to four main performance drivers compared with a reference
target. Namely, they aim at measuring the number of teaching staff per student,
teaching equipment capacity, quality of teaching, and teaching and tutoring time.
These measures generate an effect on the corresponding end-results, such as changes
in teaching equipment capacity, graduates’ skills, University image, and graduate
students.
End-results of this area once more influence the resource endowment of the area responsible for partnerships and recruitments. Graduate students, the University’s image, and graduates’ skills—together with research, teaching, and administrative staff, and networks with stakeholders—form the main assets to secure more partnerships and, consequently, to facilitate graduates’ recruitment. These resources affect
performance drivers related to the University’s image, the quality and size of
stakeholders’ network, the graduates’ skills, and the percentage of recruited gradu-
ates on the total. In particular, the graduates’ recruiting rate depends on the

6 Innovative educational programs are here meant as those curricula which better respond to current labor market needs.

University’s image, the quality and size of networks, and the graduates’ skills. The University’s image mainly influences the change in stakeholders’ networks. In turn, the change in the University’s image is influenced by the quality and size of stakeholders’ networks, the percentage of recruited graduates on the total, and their skill level. Again, these outputs are likely to affect their corresponding strategic resources, such as the University’s image and networks with stakeholders.
Establishing new strategic partnerships with stakeholders may also influence the
possibility of funding new research projects that feed into the research area devoted
to producing new publications.

2.7 Closing Remarks

This chapter has outlined a systemic approach to designing PM systems in HEIs. Such an approach has been tailored to the organizational features of Universities and their social roles in the contemporary competitive context, as described in Chap. 1.
In particular, PM systems encompass two main interconnected activities: strategic
planning and performance measurement. Both have been focused on the definition
of organizational performance tailored to academic institutions. Namely, its
multidimensional conceptualization refers not only to what competitive, social and
financial results are achieved by Universities (e.g., income, change in image or
student satisfaction) but also to how these results are achieved. In this respect, PM
becomes a valuable strategic tool to support academic decision-makers to (1) under-
stand how to affect performance according to a systemic perspective, (2) communi-
cate results inside and outside the organization, (3) trace and analyze their causes,
and (4) identify the policy levers on which to set corrective actions to pursue the
sustainable development of HEIs.
Such an approach is necessary to tackle possible undesired effects that may stem
from a bounded view in designing university management information systems (e.g.,
exclusively focused on financials). By using a value creation perspective, this
chapter has addressed the need to design PM systems that may balance short- and
long-term, and support better coordination between front-office and back-office
units, as well as central and peripheral structures.
It has also emphasized how mapping feedback relationships between end-results,
performance drivers, and strategic resources may support academic decision-makers
in managing and measuring the performance of academic institutions. In addition,
the intent to link back-office to front-office units in performance evaluation has
contributed to remarking how crucial it is to identify administrative products, map
the underlying processes, and match them to key responsibility areas. Identifying
processes, internal clients and related products, available resources, policy levers,
and responsibility areas provides the backbone for the effective implementation of
performance improvement programs in academic institutions.
The dynamic complexity and unpredictability characterizing contemporary HE systems are the main cause of the inadequate performance shown by Universities.

To capture the dynamic complexity of academic decision-making, PM systems should also consider a range of relevant factors influencing organizational perfor-
mance. Such factors can be associated with delays, nonlinearity, intangibles, and the
unintended consequences on human perceptions and behavior caused by a superfi-
cial or mechanistic approach in setting performance targets (Bianchi, 2016; Sloper
et al., 1999; Linard & Dvorsky, 2001).
A “dynamic” perspective in designing and implementing PM systems is particularly valuable in this context. It implies the identification and analysis of end-results, value drivers, and related strategic resource accumulation/depletion processes according to a “cause-and-effect” perspective. A feedback analysis may allow academic decision-makers to better frame the relevant structure underlying performance and, consequently, to design and assess alternative strategies to affect the system structure according to the desired performance behavior. Understanding the dynamic relationships between past, current, and future events is an important outcome of a deep learning process.
Namely, simulation-based learning is a process where academic decision-makers
and other University key-actors design and use models to evaluate current and past
results. They can also understand how the organizational system reacts in terms of
feedback structure, and apply a given strategy or test management routines in a
controlled and protected environment. As a large body of research and practice in strategic management shows, the methodological support provided by simulation-based techniques—such as System Dynamics—is particularly recommended to model and analyze social systems characterized by dynamic complexity and uncertainty. Such support also allows one to experiment with the models to design strategies for management and change (Forrester, 1958; Sterman, 2000; Cosenz & Noto, 2016).
For this purpose, the next chapter will illustrate how to combine PM with System
Dynamics modeling to provide better support to academic strategic planning and
performance measurement according to a deeper strategic learning perspective.

References

Aaker, D. A. (1989). Managing assets and skills: The key to a sustainable competitive advantage.
California Management Review, 31, 91–106.
Agyemang, G., & Broadbent, J. (2015). Management control systems and research management in
universities: An empirical and conceptual exploration. Accounting, Auditing and Accountability
Journal, 28(7), 1018–1046.
Alford, J., & Baird, J. (1997). Performance monitoring in the Australian public service: A
government-wide analysis. Public Money and Management, 17(2), 49–58.
Alford, J., & Yates, S. (2014). Mapping public value processes. International Journal of Public
Sector Management, 27(4), 334–352.
Ammons, D. N. (2000). Benchmarking as a performance management tool: Experiences among
municipalities in North Carolina. Journal of Public Budgeting, Accounting & Financial Man-
agement, 12(1), 106–124.
Ammons, D. N. (2001). Municipal benchmarks. Sage.
Andrews, K. R. (1971). The concept of corporate strategy. Irwin.

Angiola, N., Bianchi, P., & Damato, L. (2018). Performance management in public universities:
Overcoming bureaucracy. International Journal of Productivity and Performance Management,
67(4), 736–753.
Anthony, R. (1965). Planning and control systems: A framework for analysis. Harvard Business
School Division of Research.
Bianchi, C. (2010). Improving performance and fostering accountability in the public sector
through system dynamics modelling: From an ‘external’ to an ‘internal’ perspective. Systems
Research and Behavioral Science, 27(4), 361–384.
Bianchi, C. (2012). Enhancing performance management and sustainable organizational growth
through system-dynamics modelling. In S. N. Grösser & R. Zeier (Eds.), Systemic management
for intelligent organizations (pp. 143–161). Springer.
Bianchi, C. (2016). Dynamic performance management. Springer.
Bianchi, C., Cosenz, F., & Marinkovic, M. (2015). Designing dynamic performance management
systems to foster SME competitiveness according to a sustainable development perspective.
Empirical evidences from a case-study. International Journal of Business Performance Man-
agement, 16(1), 84–108.
Bianchi, C., Bereciartua, P., Vignieri, V., & Cohen, A. (2019). Enhancing urban brownfield
regeneration to pursue sustainable community outcomes through dynamic performance gover-
nance. International Journal of Public Administration. https://doi.org/10.1080/01900692.2019.
1669180.
Birnbaum, R. (2000). Management fads in higher education: Where they came from, what they do,
why they fail. Jossey-Bass.
Bouckaert, G., & Halligan, J. (2008). Managing performance: International comparisons.
Routledge.
Broadbent, J. (2007). If you can’t measure it, how can you manage it? Management and governance
in Higher Educational institutions. Public Money and Management, 27(3), 193–198.
Brunetti, G. (1979). Il controllo di gestione in condizioni ambientali perturbate. Franco Angeli.
Bryson, J. M. (1988). A strategic planning process for public and non-profit organizations. Long
Range Planning, 21(1), 73–81.
Bryson, J. M. (2004). Strategic planning for public and nonprofit organizations. Jossey-Bass.
Bryson, J., Ackermann, F., Eden, C., & Finn, B. (2004). Visible thinking: Unlocking causal
mapping for practical business results. Wiley.
Camp, R. C. (1989). Benchmarking: The search for best practices that lead to superior perfor-
mance. ASQC Quality Press.
Cañibano, L., & Sánchez, P. (2004). Measurement, management and reporting on intangibles. State
of the art. In L. Cañibano & P. Sánchez (Eds.), Readings on intangibles and intellectual capital
(pp. 81–113). AECA.
Cañibano, L., & Sánchez, M. P. (2009). Intangibles in universities: Current challenges for
measuring and reporting. Journal of Human Resource Costing & Accounting, 13(2), 93–104.
Cave, M., Hanney, S., Henkel, M., & Kogan, M. (1997). The use of performance indicators in
Higher Education. The challenge of the quality movement. Jessica Kingsley Publishers.
Chenhall, R. H., & Langfield-Smith, K. (2007). Multiple perspectives of performance measures.
European Management Journal, 25(4), 266–282.
Coda, V. (2010). Entrepreneurial values and strategic management. Essays in Management theory.
Palgrave Macmillan.
Conway, T., Mackay, S., & Yorke, D. (1994). Strategic planning in Higher Education: Who are the
customers. International Journal of Educational Management, 8(6), 29–36.
Conzemius, A., & O’Neill, J. (2006). The power of SMART goals: Using goals to improve student
learning. Solutions Trees.
Cosenz, F. (2013). The “entrepreneurial university”: A preliminary analysis of the main managerial
and organisational features towards the design of planning & control systems in European
Academic Institutions. Management Research & Practice, 5(4), 19–36.

Cosenz, F. (2014). A dynamic viewpoint to design performance management systems in Academic
Institutions: Theory and practice. International Journal of Public Administration, 37(13),
955–969.
Cosenz, F., & Bianchi, C. (2013). Improving performance measurement/management in academic
institutions: A dynamic resource-based view. Insights from a field project. Paper presented at the
ASPA (American Society of Public Administration) Annual Conference for the Center for
Accountability and Performance (CAP) Symposium, Baltimore, March 12.
Cosenz, F., & Noto, G. (2016). Applying system dynamics modelling to strategic management: A
literature review. Systems Research and Behavioral Science, 33(6), 703–741.
Dierickx, I., & Cool, K. (1989). Asset stock accumulation and sustainability of competitive
advantage. Management Science, 35(12), 1504–1511.
Dooris, M. J., Kelley, J. M., & Trainer, J. F. (2004). Strategic planning in Higher Education. New
Directions for Institutional Research, 123, 5–11.
Doran, G. T. (1981). There’s a S.M.A.R.T. way to write management’s goals and objectives.
Management Review, 70(11), 35–36.
Edvinsson, L., & Malone, M. S. (1997). Intellectual capital: Realizing your company’s true value
by finding its hidden brainpower. Harper Business.
Elena, S. (2004). Knowledge management and intellectual capital in European universities. In
Proceedings of the Workshop organised by the Graduate Programme “Entering the Knowledge
Society” and the Institute for Science and Technology Studies, Bielefeld University.
Elmore, R. (1980). Backward mapping: Implementation research and policy decisions. Political
Science Quarterly, 94(4), 601–616.
European Commission. (2003). The role of universities in the Europe of knowledge. Communica-
tion from the Commission of 5 February 2003.
Ewell, P. T. (1999). Linking performance measures to resource allocation: Exploring unmapped
terrain. Quality in Higher Education, 5(3), 191–208.
Fazlagic, A. (2005). Measuring the intellectual capital of a University. In Proceedings of the
Conference on Trends in the Management of Human Resources in Higher Education. OECD.
Fitzgerald, L. (2007). Performance measurement. In T. Hopper, D. Northcott, & R. W. Scapens
(Eds.), Issues in management accounting (3rd ed., pp. 223–241). FT Prentice Hall.
Flamholtz, E. (1996). Effective organizational control: A framework, applications, and implica-
tions. European Management Journal, 14(6), 596–611.
Foray, D. (2004). The economics of knowledge. Massachusetts Institute of Technology Press.
Forrester, J. W. (1958). Industrial dynamics—A major breakthrough for decision-makers. Harvard
Business Review, 36(4), 37–66.
Funnell, S., & Rogers, P. (2009). Purposeful program theory: Effective use of theories of change
and logic models. Jossey-Bass.
Gerrish, E. (2016). The impact of performance management on performance in public organiza-
tions: A meta-analysis. Public Administration Review, 76(1), 48–66.
González-Loureiro, M., & Teixeira, A. M. (2012). Intellectual capital in public Universities: A
performance-oriented approach to manage intangible. International Journal of Engineering and
Industrial Management, 3, 95–125.
Grossi, G., Dobija, D., & Strzelczyk, W. (2020). The impact of competing institutional pressures
and logics on the use of performance measurement in hybrid universities. Public Performance &
Management Review, 43(4), 818–844.
Guthrie, J., & Neumann, R. (2007). Economic and non-financial performance indicators in univer-
sities. Public Management Review, 9(2), 231–252.
Hall, R. (1993). A framework linking intangible resources and capabilities to sustainable compet-
itive advantage. Strategic Management Journal, 14, 607–618.
Hamann, P. M., Schiemann, F., Bellora, L., & Guenther, T. W. (2013). Exploring the dimensions of
organizational performance: A construct validity study. Organizational Research Methods,
16(1), 67–87.
Hamel, G., & Prahalad, C. K. (1994). Competing for the future. Harvard Business School Press.

Handfield, R., & Nichols, E. (1999). Introduction to supply chain management. Prentice-Hall.
Hill, T., & Westbrook, R. (1997). SWOT analysis: It’s time for a product recall. Long Range
Planning, 30(1), 46–52.
Hussey, D. (1999). Strategy and planning: A manager’s guide. Wiley.
Kallio, K., Kallio, T., & Grossi, G. (2017). Performance measurement in universities: Ambiguities
in the use of quality vs. quantity in performance indicators. Public Money & Management,
37(4), 293–300.
Kallio, T., Kallio, K., & Blomberg, A. (2020). From professional bureaucracy to competitive
bureaucracy – Redefining universities’ organization principles, performance measurement
criteria, and reason for being. Qualitative Research in Accounting & Management, 17(1),
82–108.
Kaplan, R. S., & Norton, D. P. (1992). The balanced scorecard: Measures that drive performance.
Harvard Business Review, 70(1), 71–79.
Kim, W. C., & Mauborgne, R. (2005). Blue ocean strategy: From theory to practice. California
Management Review, 47(3), 105–121.
Kloot, L. (1997). Organizational learning and management control systems: Responding to envi-
ronmental change. Management Accounting Research, 8, 47–73.
Kotler, P., & Murphy, P. E. (1981). Strategic planning for higher education. The Journal of Higher
Education, 52(5), 470–489.
Lapsley, I., & Miller, P. (2004). Transforming universities: The uncertain, erratic path. Financial
Accountability and Management, 20(2), 103–106.
Leidecker, J. L., & Bruno, A. V. (1984). Identifying and using critical success factors. Long Range
Planning, 17(1), 23–32.
Leitner, K. (2004). Intellectual capital reporting for universities: Conceptual background and
application for Austrian Universities. Research Evaluation, 13(2), 129–140.
Leitner, K. H., & Warden, C. (2003). Managing and reporting knowledge-based resources and
processes in research organisations: Specifics, lessons learned and perspectives. Management
Accounting Research, 15(1), 33–51.
Lev, B. (2000). Intangibles: Management, measurement and reporting. Retrieved from www.
baruch-lev.com
Linard, K., & Dvorsky, L. (2001). People - not human resources: The system dynamics of human
capital accounting. Paper presented at the operations research society conference, University
of Bath.
Lindsay, A. (1994). Quality and management in universities. Journal of Tertiary Education
Administration, 16(1), 55–68.
Lorange, P., & Vancil, R. (1976). How to design a strategic planning system. Harvard Business
Review, 54, 75–81.
Lorange, P., Scott Morton, M. F., & Ghoshal, F. (1986). Strategic control. West Publishing.
Maciariello, J. A. (1984). Management control systems. Prentice Hall.
Mastilak, C., Matuszewski, L., Miller, F., & Woods, A. (2012). Evaluating conflicting performance
on driver and outcome measures: The effect of strategy maps. Journal of Management Control,
23(2), 97–114.
McGivern, M. H., & Tvorik, S. J. (1997). Determinants of organizational performance. Manage-
ment Decision, 35(6), 417–435.
Melo, A. I., & Figueiredo, H. (2020). Performance management and diversity in higher education:
An introduction. Tertiary Education Management, 26, 247–254.
Merchant, K. (1997). Modern management control systems. Prentice Hall.
Miller, B. A. (2007). Assessing organizational performance in Higher Education. Jossey-Bass.
Mintzberg, H. (1994). The rise and fall of strategic planning. Prentice Hall.
Mintzberg, H., & Quinn, J. B. (1996). The strategy process: Concepts, contexts, cases.
Prentice Hall.
Modell, S. (2001). Performance measurement and institutional processes: A study of managerial
responses to public sector reform. Management Accounting Research, 12, 437–464.

Moore, M. (1995). Creating public value: Strategic management in government. Harvard Univer-
sity Press.
Moore, M. (2013). Recognizing Public Value. Harvard University Press.
Moynihan, D. (2005). Goal-based learning and the future of performance management. Public
Administration Review, 65(2), 203–216.
Moynihan, D. P. (2008). The dynamics of performance management. Georgetown University Press.
Moynihan, D., & Landuyt, N. (2009). How do public organizations learn? Bridging cultural and
structural perspectives. Public Administration Review, 69(6), 1097–1105.
Neely, A., Gregory, M., & Platts, K. (1995). Performance measurement system design, a literature
review and research agenda. International Journal of Operations & Production Management,
15(4), 80–116.
Observatory of European Universities. (2005). Strategic management for University Research.
Second University Panel Session. Madrid: Observatory of the European University.
OECD. (1996). The knowledge-based economy. OCDE/GD(96)102, Paris.
OECD. (1999). The knowledge-based economy: A set of facts and figures, meeting of the
Committee for Scientific and Technological Policy at Ministerial Level, 22–23 June, Paris.
Oge, C., & Dickinson, H. (1992). Product development in the 1990s – New assets for improved
capability (pp. 132–144). Economist Intelligence Unit, Japan Motor Business.
Otley, D. T. (1999). Performance management: A framework for management control systems
research. Management Accounting Research, 10(4), 363–382.
Ouchi, W. (1979). A conceptual framework for the design of organizational control mechanisms.
Management Science, 25(9), 833–848.
Parmenter, D. (2007). Key performance indicators. Wiley.
Pendlebury, M., & Algaber, N. (1997). Accounting for the cost of central support services in UK
Universities: A note. Financial Accountability & Management, 13(3), 281–288.
Peteraf, M. A. (1993). The cornerstones of competitive advantage: A resource-based view. Strategic
Management Journal, 14, 179–191.
Pitman, T. (2000). Perceptions of academics and students as customers: A survey of administrative
staff in higher education. Journal of Higher Education Policy and Management, 22(2), 165–
175.
Poister, T. (2003). Measuring performance in public and nonprofit organizations. Jossey-Bass.
Porter, M. E. (1990). The competitive advantage of nations. Harvard Business Review, 68(2),
73–93.
Pucciarelli, F., & Kaplan, A. (2016). Competition and strategy in higher education: Managing
complexity and uncertainty. Business Horizons, 59(3), 311–320.
Ramírez Córcoles, Y. (2013). Intellectual capital management and reporting in European higher
education institutions. Intangible Capital, 9(1), 1–19.
Ramírez, Y., Lorduy, C., & Rojas, J. A. (2007). Intellectual capital management in Spanish
Universities. Journal of Intellectual Capital, 8(4), 732–748.
Riccaboni, A., & Leone, E. L. (2010). Implementing strategies through management control
systems: The case of sustainability. International Journal of Productivity and Performance
Management, 59(2), 130–144.
Rowley, D. J., & Sherman, H. (2001). From strategy to change: Implementing the plan in Higher
Education. Jossey-Bass.
Rowley, D. J., Lujan, H. D., & Dolence, M. G. (1997). Strategic change in colleges and univer-
sities: Planning to survive and prosper. Jossey-Bass.
Sánchez, M. P., & Elena, S. (2006). Intellectual capital in Universities. Improving transparency and
internal management. Journal of Intellectual Capital, 7(4), 529–548.
Sánchez, M. P., Elena, S., & Castrillo, R. (2009). Intellectual capital dynamics in universities: A
reporting model. Journal of Intellectual Capital, 10(2), 307–324.
See, M., Munro, C., & Wheeler, B. R. (1980). Planning critical success factors, and management’s
information requirements. MIS Quarterly, 4(4), 27–38.
Simon, H. (1960). The new science of management decision. Harper & Row.

Simons, R. (2000). Performance measurement & control systems for implementing strategy.
Prentice Hall.
Sloper, P., Linard, K., & Paterson, D. (1999). Towards a dynamic feedback framework for public
sector performance management. In Proceedings of the 17th International System Dynamics
Conference, Wellington.
Sporn, B. (2003). Management in Higher Education: Current trends and future perspectives in
European colleges and universities. In R. Begg (Ed.), The dialogue between Higher Education
research and practice (pp. 97–108). Kluwer Academic Publishers.
Stainer, A. (1999). Productivity, performance and paradise. Management Services, 43(6), 8–11.
Stankard, M. F. (2002). Management systems and organizational performance: The search for
excellence beyond ISO9000. Greenwood Publishing Group.
Sterman, J. D. (2000). Business dynamics: Systems thinking and modeling for a complex world.
McGraw-Hill.
Stewart, T. A. (1997). Intellectual capital: The wealth of organizations. Doubleday/Currency.
Sveiby, K. E. (1997). The intangible assets monitor. Journal of Human Resource Costing &
Accounting, 2(1), 73–97.
Sveiby, K. E. (2001). A knowledge-based theory of the firm to guide in strategy formulation.
Journal of Intellectual Capital, 2(4), 344–358.
Talbot, C. (2005). Performance management. In E. Ferlie, L. E. Lynn, & C. Pollitt (Eds.), The
Oxford handbook of public management. Oxford University Press.
Tiwari, M., Mahanty, B., Sarmah, P., & Jenamani, M. (2013). Modelling of responsive supply
chain. Taylor and Francis.
Volery, T., & Lord, D. (2000). Critical success factors in online education. International Journal of
Educational Management, 14(5), 216–223.
Wernerfelt, B. (1984). A resource-based view of the firm. Strategic Management Journal, 5,
171–180.
Wooldridge, B., & Floyd, S. W. (1990). The strategy process, middle management involvement,
and organizational performance. Strategic Management Journal, 11, 231–241.
Chapter 3
Designing Dynamic Performance
Management Systems in Higher Education
Institutions

3.1 Introduction

Chapter 2 focused on the design of PM systems calibrated on the peculiarities and requirements characterizing HEI organizational settings. According to a sustainable development perspective, the proposed logic and mechanisms have revolved around the concept of “academic performance” as a critical element on which to develop strategic planning and performance measurement processes, with the purpose of meeting the conditions for the competitive and social success of a University (Coda, 2010; Mio, 2013). Unlike the traditional bureaucratic approach, this concept widens
the conventional boundaries of controlling legitimacy, formal procedures, and legal
compliance, focusing on the broader and more complex academic organizational
system (Deem & Brehony, 2008).
Strategic planning and performance measurement help academic decision-makers
to redesign their operational, strategic, informational, and decision-making struc-
tures to successfully compete in the current socio-economic scenario (Gerrish, 2016;
Deiaco et al., 2012). However, the recent changes in the HE legislation and the
socio-economic context pose more complex challenges to Universities that are now
called to develop an attitude that is no longer simply adaptive. Rather, it aims to
foster unceasing innovation and improve the services offered to stakeholders (e.g.,
students, academic and administrative staff, enterprises, public organizations, scien-
tific communities). In this context, research and education are becoming global
services delivered by quasi-companies in an ever-more complex and competitive
knowledge marketplace (Parker, 2011; Saravanamuthu & Tinker, 2002; Etzkowitz
et al., 2000). To cope with these challenges grounded in a more complex and
dynamic context, HEIs need additional methodological support to manage their
performance and successfully compete worldwide. This chapter aims to contribute
to this discussion by introducing System Dynamics (SD) modeling in designing and
implementing PM systems in HEIs.


Based on the analysis of the main critical issues hindering viable University
management in a complex, uncertain, and dynamic setting, this chapter explores how
applying SD methodology to the design and use of PM systems in HEIs may provide
greater support to academic decision-making and performance measurement. Such
an integrated approach is named Dynamic Performance Management (DPM)
(Bianchi, 2016). By adopting a systemic design perspective, this approach has
proved to facilitate strategic learning processes in Universities, endowing academic
decision-makers with deeper cognitive and informative supports, aptly integrated
with the PM systems and characterized by the requirements of relevance, articula-
tion, and selectivity (Bianchi, 2016, 2012; Cosenz, 2014, 2018; Zaini et al., 2016;
Barnabè, 2003, 2004; Cosenz & Noto, 2016). A systemic and selective perspective
enables to identification and investigation of the main cause-and-effect relations
among the forces influencing academic performance into a consistent PM frame-
work. As such, DPM allows academic decision-makers to take advantage of an
innovative management information tool to explore and govern complexity rather
than ignore it.
With the intent to evaluate the effectiveness of this approach, this chapter also
presents and discusses multiple examples related to real academic settings (e.g.,
departments, organizational units, whole academic institutions).

3.2 Challenges Related to Performance Management Design and Implementation in Higher Education Institutions

In the past decade, the changes that have occurred in the HE sector have led Universities to operate in a global marketplace, offering their educational programs and research products to a larger crowd of stakeholders (Pucciarelli & Kaplan, 2016). Motivated by the financial crisis and the consequent need to cut the public expenditures financing HEIs, these changes initially affected HE legislation at the country level. For instance, many national Governments introduced performance-based funding systems to distribute the available public funds to Universities according to a comparative evaluation of the education and research results they achieve (Guarini et al., 2020; Hicks, 2012).
HE reforms have also concerned other substantial mechanisms, such as those regulating academic recruitment and promotion, which nowadays require, among others, international research collaborations and publications in international journals. As a result, Universities have gradually moved from a safe and static setting, characterized by the reliable support of national Governments and the refusal of performance monitoring systems, to a global, articulated, and dynamic environment in which they compete for survival. This has prompted a wide debate among strategic management scholars on the complex role HEIs nowadays have to play and on the managerial logic and tools to adopt in these peculiar organizations (Teixeira et al., 2019; Rajala et al., 2018; Guarini et al., 2020; Head & Alford, 2015; Etzkowitz et al., 2000).
Table 3.1 SWOT analysis of current key trends affecting Higher Education (Pucciarelli & Kaplan, 2016, p. 313)

Strengths
• Essential source for a society’s talent and innovativeness
– Institutionalized public service with a societal mission
– Important provider of knowledge and innovation
• National driver and global ambassadors
– HE as domestic resource, engine of growth and economic recovery
– International expansion and global knowledge dissemination

Weaknesses
• Substantial delay in entrance of business practices into HE
– Tradition of being a public service financed and protected by the State
– Resistance of faculty/department members, which are often organized in strong public sector unions
• Low responsiveness to changes within the corporate world
– Little adaptation of programs and curricula to recruiters’ needs and job expectations
– Myopic “publish-or-perish” research strategies leading to purely academic publications without consideration of other stakeholders

Opportunities
• Fast-evolving HE environment through ICT
– Development of new markets, potential productivity gains, and branding possibilities
– Advancement of both general knowledge and network society
• Rapid transformation encouraged by socio-demographics
– Millennials seeking augmented educational experience
– Growing and changing student population

Threats
• Continuous decrease in public funding
– Necessity for external fund-raising and increased self-financing
– Need for the marketization of HE, potentially lowering academic standards and quality
• Increasingly competitive environment
– Domestic deregulation leading to new market entrants
– Globalization broadening competition to an international scale

There is a widespread consensus among scholars that the future of HEIs is and
will be complicated, challenging, and uncertain. In exploring the key trends under-
lying the contextual uncertainty and complexity of the current HE environment,
Pucciarelli and Kaplan (2016) developed a SWOT analysis, displayed in Table 3.1.
Among the strengths, such an analysis recognizes the significant role that Universities still play in developing societies in terms of economy and growth (European Commission, 2013), carried out by disseminating knowledge and fostering innovativeness on a global scale (de Boer et al., 2002; Altbach et al., 2009). As remarked by
Välimaa and Hoffman (2008), in a globalized world, knowledge, research, and
innovation are important resources, and their development is influencing the societal
role of HEIs. In this context, the capability to commercialize knowledge and
innovative research outputs becomes crucial for ensuring every university’s future
survival, namely, in terms of acquiring external funds and strengthening collabora-
tive networks with key stakeholders (e.g., alumni associations).

The prevailing literature agrees on the use of business philosophies and commercial practices in the HE sector, emphasizing the need to adapt marketing logic to the University setting. Such philosophies and practices also allow academic decision-makers to develop competitive strategies, assess drivers of change, and formulate strategic guidelines for adequately responding to such changes (Gibbs & Murphy, 2009).
However, this business-oriented approach to managing HEIs has encountered structural resistance in its concrete implementation. Such resistance is mainly associated with an outdated perspective that considers HE a public good financed and secured by national governments. Consequently, concepts such as autonomy and accountability are relatively new to University management. In addition, the inherent tendency of faculty members to organize into strong public sector unions further prevents HEIs from adapting to business logic and commercial practices, thus fueling a dangerous distance from the demands of a changing society (Jongbloed et al., 2008). For instance, today’s job market requires students to have skills and competencies that differ from those needed in the past. As a result, several calls for redesigned educational curricula have been put forward by trade associations, practitioners, and Governments alike, often remaining without adequate responses (European Commission, 2013).
Similarly, due to the HE promotion systems adopted by several countries, professors and scientists are forced to focus on purely academic research products publishable in top-ranked journals mainly read by other academics. These myopic “publish-or-perish” research strategies neglect the expectations of stakeholders whose partnership and support are crucial for acquiring new resources (Cotton & Stewart, 2013).
A complex and uncertain context also implies opportunities and threats which
University management must be aware of. For instance, the fast advancement of
Information and Communication Technologies (ICT) provides new channels for
expanding educational programs worldwide (e.g., distance learning, massive open
online courses) and, consequently, contributes toward cost reduction, as the transi-
tion from physical to digital knowledge transfer solutions can enhance efficiency
(Friga et al., 2003; EPRS, 2014). ICT simplifies the creation of academic and
professional networks, giving rise to expansion and reconfiguration of collaboration
inside HEIs and among stakeholders, thus overcoming the limitations of traditional
forms of cooperation (James et al., 2020; Noordegraaf, 2015; Castells, 2011).
Another opportunity for HEIs stems from recent sociocultural and demographic trends, which highlight the increasing presence of tech-savvy candidates (digital natives) acting as rational and informed customers when selecting Universities (Temple & Shattock, 2007) and often seeking educational programs characterized by digital content, interactive learning, and social networking (McHaney, 2011; Budde-Sung, 2011).
Finally, the main threats are associated with national reforms aimed at decreasing public funding for HEIs (Altbach, 2004) and, above all in Europe, at increasing Universities’ autonomy, self-organization, and accountability (Pilonato & Monfardini, 2020; Hoecht, 2006). As a result, these reforms have gradually transformed a safe and financially protected sector into a competitive market by lowering the barriers to potential entrants (e.g., private e-universities), thus leading Universities to engage in fund-raising activities and improve performance management to secure self-financing. A globalized HE market further widens the competition among Universities to an international scale (Schofield et al., 2013).
The strengths, weaknesses, opportunities, and threats analysis suggests that HEIs must cope with the new challenges posed by a complex and uncertain environment by adopting more effective PM systems. To this end, applying SD modeling to PM in Universities may enhance the organizational learning processes of academic decision-makers, as its methodological support provides a consistent approach to exploring academic value creation according to a systemic perspective (Cosenz, 2014). A systemic view of PM—supported by simulation models—makes it possible to trace the causal connections among strategic resources, drivers, and results, forming closed feedback structures for testing the reaction of the academic institution to the adoption of alternative strategies (Zaini et al., 2016). Applying SD modeling to PM identifies Dynamic Performance Management as a more robust methodological approach to support strategic planning and performance measurement in complex organizational systems, as characterized by Bianchi (2010, 2012, 2016).

3.3 Implementing Dynamic Performance Management Through System Dynamics Modeling

Dynamic Performance Management (DPM) is a methodological approach grounded in the combination of the three views of PM—i.e., the “subjective,” “objective,” and “instrumental” views depicted in Chap. 2—and SD modeling (Bianchi, 2010,
2012, 2016). In particular, DPM uses SD models to enhance the effectiveness of PM
systems with the intent to better frame the dynamic complexity characterizing
today’s organizational settings and, as a result, foster their sustainable development
over time. As argued by Bianchi (2016, p. 71), “Designing a Planning & Control
system to support decision makers to assess performance under a sustainability
perspective is the core of DPM. It requires a selective and sequential method of
inquiry. DPM is an approach that enables organization decision makers to frame the
causal mechanisms affecting organizational results over time. Such a field of
research and practice is based on two converging methods of inquiry: Performance
Management and SD modeling.”
After outlining the core features of SD modeling and its relevance to management
science, the following sections introduce SD modeling in the design and implemen-
tation of PM in Universities.

3.4 System Dynamics Modeling: Principles, Implementation Criteria, and Purposes

SD is a methodological approach—developed during the 1950s by Jay Wright Forrester—for modeling and simulating complex physical and social systems and
experimenting with the models to design policies for management and change
(Forrester, 1958). It provides a perspective and a set of conceptual tools to frame
the structure and dynamics of complex, nonlinear, multi-loop feedback systems
(Forrester, 1961; Meadows, 1980; Richardson, 1999; Sterman, 2000).
In conceptualizing this approach, Forrester combined concepts from several sources, such as control engineering, cybernetics, and organizational theory (Meadows, 1980). Such a methodology was initially applied to industrial company
problems, e.g., inventory management, falling market share, instability of labor
force, etc. Subsequently, it has been successfully applied to various social domains
and areas of scientific interest, such as public policy design, economics, environ-
mental science, and management (Meadows, 1980; Richardson, 1999).
In particular, SD is a robust methodology for analyzing the dynamic tendencies of
complex systems—i.e., what kind of behavioral patterns they may generate over
time. The main assumption of the SD paradigm is that these patterns arise from the causal structure of the system under observation, seen as a closed boundary, i.e., one embodying all the main relevant variables related to the phenomenon investigated.
Causal structures are determined by physical or social constraints, goals, rewards,
and pressures that make a system’s agents behave in a certain way (Meadows, 1980).
The SD modeling process is portrayed in Fig. 3.1. The basic rationale is that, if the
process structure determines the system behavior and the system behavior deter-
mines the organizational performance (Davidsen, 1991; Richardson & Pugh, 1981;
Sterman, 2000), then the key to developing sustainable strategies to optimize
performance is understanding the relationship between processes and behaviors,
and managing the leverage points (Ghaffarzadegan et al., 2011; Bianchi et al., 2012).
SD models offer an operational methodology to support strategic planning and
decision-making (Torres et al., 2017). In practical terms, decision-makers can use
these models to simulate alternative scenarios and explore what might have hap-
pened—or what could happen—under various past and future assumptions and
across alternative decision choices (Sterman, 2000). When real experimentation is
too costly—and this may be the case of the HE sector where public funding is
constantly decreasing—simulation becomes a valuable tool to discover how com-
plex systems work and where high leverage points may lie. Simulation can compress or expand time and space, allowing learners to simulate years or decades in the life of an organization.
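In code terms, this kind of exploration amounts to re-running the same model under alternative assumption sets. The toy Python sketch below is a minimal illustration, with all names and values hypothetical: it compares how a single funds stock evolves over a decade under three assumed annual funding cuts.

```python
# Toy scenario comparison: one simple model re-run under alternative
# assumptions about an annual funding cut. All values are illustrative.
def run_scenario(annual_cut: float, years: int = 10) -> float:
    funds = 100.0                    # initial stock (in millions)
    for _ in range(years):
        funds += 8.0 - annual_cut    # fixed yearly income minus the cut
    return funds

for cut in (4.0, 8.0, 12.0):         # mild, break-even, severe scenarios
    print(f"cut={cut:4.1f} -> funds after 10 years: {run_scenario(cut):6.1f}")
```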
Mental models play a central role in SD efforts to improve learning and decision-
making in complex systems. SD methodology can be generally described as a
feedback process in which mental models of managers and strategy designers are
used to develop a simulation model. These models are used to explain social
phenomena which, in turn, create new learning opportunities, thus improving the

Fig. 3.1 The system dynamics modeling process (adapted from Richardson & Pugh, 1981, p. 17)

accuracy, coherence, and complexity of such mental models. The relevance of explanatory models in decision-making processes underlines the importance of
surfacing mental models in designing and using strategic planning and performance
measurement mechanisms. If mental models are considered as “deeply ingrained assumptions, generalizations, or even pictures or images that influence how we understand the world and how we take action” (Senge, 1990, p. 8), it is clear how SD models can both be influenced by and be used to challenge mental models. This is particularly relevant since long-term success often depends on the
process through which management teams modify and improve the shared mental
models of their organizations, their markets, and their competitors (Senge, 1990;
Argyris, 1992; De Geus, 1999; Micheli & Mari, 2014).
Compared to other methodologies (e.g., agent-based modeling), the use of SD
modeling may lead to remarkable results in framing and analyzing both determinants
and implications of a given phenomenon. Model structures are realized by
connecting those relevant variables that determine the observed system’s behavior
over time. In these connections, feedback loops are the building blocks for articu-
lating the dynamics of these models, and their interactions can explain the system
behavior. As such, SD methodology identifies the complex interactions among the
feedback loops, rejects notions of linear cause and effect, and requires the strategy
analyst to view a complete system of relationships whereby the cause might also be
affected by the effect (Bianchi, 2010). This means that a variable, ceteris paribus,
influences another variable (1) positively (i.e., an increase of the one corresponds to
an increase of the other, and vice versa), (2) negatively (i.e., an increase of the one

Fig. 3.2 Adopting a feedback approach to perceive policy resistance effects generated by bounded
policies in a dynamic complex system (adapted from Sterman, 2000, p. 11)

corresponds to a decrease of the other, and vice versa), and (3) according to a
nonlinear relation between them. If such relations form closed circuits, these are
defined as feedback loops and are responsible for the system behavior. If each relationship forming the loop is positive, or the number of negative relationships is even, the feedback loop is defined as “reinforcing.” These loops tend to produce exponential growth/decay behaviors. On the other hand, “balancing” feedback loops are characterized by an odd number of negative relationships. They tend to counteract any trajectory disturbance and move the system toward an equilibrium point (Meadows, 1980).
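To make these two loop behaviors concrete, the following minimal Python sketch integrates two elementary first-order structures; the parameter values, variable names, and one-period time step are illustrative assumptions, not taken from the text.

```python
# Minimal sketch of the two elementary feedback structures described above.
def simulate(steps: int = 20, dt: float = 1.0):
    enrollments = 100.0       # stock driven by a reinforcing loop
    quality = 0.2             # stock driven by a balancing loop
    growth_fraction = 0.1     # fraction of enrollments added per period
    quality_target = 1.0      # equilibrium the balancing loop seeks
    adjustment_time = 4.0     # periods needed to close the quality gap

    for t in range(steps):
        # Reinforcing loop: net inflow proportional to the stock itself
        # -> exponential growth (or decay, for a negative fraction).
        enrollments += growth_fraction * enrollments * dt
        # Balancing loop: net inflow proportional to the remaining gap
        # -> goal-seeking trajectory toward quality_target.
        quality += (quality_target - quality) / adjustment_time * dt
        print(f"t={t + 1:2d}  enrollments={enrollments:8.1f}  quality={quality:.3f}")

simulate()
```

Running the sketch reproduces the two behavior modes described above: a stock that grows by a constant fraction of itself each period, and a stock that closes a fixed share of its remaining gap to a target.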
By framing the dynamics of a complex organizational system—including inter-
actions among key-actors, actions, organizational structures, and processes—man-
agers can better decide how to reinforce positive factors or diminish the negative
pressures exerted upon them. Hence, as shown in Fig. 3.2, SD allows them to link
strategy to action and better perceive the interdependencies between organizational
units and functions, and between the organization and its environment where other
players behave in the pursuit of their own goals (Sterman, 2000).

3.5 Relevance of System Dynamics Modeling in Management Sciences

Strategic management frameworks and approaches are useful to guide decision-makers in designing competitive strategies and measuring the resulting outcomes. They have been mainly applied to private, public, and nonprofit organizations.
Over the years, however, many authors have raised several critical issues related
to the development, implementation, and use of these approaches. One of the major
criticisms is related to the lack of a perspective that may capture the dynamic
complexity of managerial decision-making (Bisbe & Malagueño, 2012). Too
often, traditional strategic management approaches fail to consider many relevant factors influencing both the planning and the measurement of organizational performance. Such
factors are primarily associated with delays, nonlinearity, intangibles, and the
unintended consequences on human perceptions and behavior caused by a superfi-
cial or mechanistic approach in setting performance targets. Due to this, as Sloper
et al. (1999) and Linard and Dvorsky (2001) claim, traditional approaches to
management sciences may limit decision-makers’ strategic learning processes.
Bianchi (2012) found that the lack of learning-oriented Planning and Control
systems represents a major cause of crises in implementing growth strategies. A
learning-oriented approach to Planning and Control implies the perception of the
organization’s dynamic complexity. Misperceiving the relevant system’s boundaries
and the dynamic relationships between the system’s feedback structure and behavior
often leads managers to make their decisions according to an excessively linear,
static, and bounded point of view in terms of time horizon and systemic scope.
Additionally, Bianchi et al. (2015) found a general inclination to frame organizational performance from too static a point of view. This does not allow one to
properly assess the outcomes of policies by considering the trade-offs between
short- vs. long-term effects (time) and results related to different interdependent
organizational units (space).
For his part, Vennix (1996) remarked on the necessity to improve the techniques suggested by the Stakeholder Theory by promoting the adoption of a participatory approach based on the elicitation of key-actors’ tacit knowledge of a given phenomenon, so as to generate consensus on the underlying dynamics that regulate a social system.
Likewise, Morecroft (1997, 2007, 2013) and Warren (2002, 2008) focused on the
constraints related to the Resource-based view. Besides noting the lack of a dynamic viewpoint on resource management, they emphasized the need to better understand and
identify the causal relations between resource acquisition and depletion processes
according to a systemic perspective, with the intent to foster organizational results.
Several authors have also remarked on the same issues (Kim & Park, 2006;
Adamides & Pomonis, 2009; Qi et al., 2009).
Even one of the most innovative frameworks, the Balanced Scorecard (Kaplan & Norton, 1992), displays some conceptual and structural shortcomings (Norreklit, 2000). Linard et al. (2002) asserted that the Balanced Scorecard fails to translate strategy into a coherent set of measures and objectives
because it lacks a rigorous methodology for selecting metrics and establishing the
relationship between metrics and firm strategy. In addition, Sloper et al. (1999) remarked that the Balanced Scorecard is too static an approach to frame performance
dynamics. Although Kaplan and Norton stressed the importance of feedback rela-
tionships between Balanced Scorecard variables for describing the trajectory of a
given strategy, the cause-and-effect chain is always conceived as a bottom-up
causality, which ignores feedbacks, thereby confining attention only to the effect
of variables in the lower perspectives (Linard & Dvorsky, 2001). Misperceiving the
dynamic relationships between the system’s feedback structure and behavior
(Davidsen, 1996; Sterman, 2000, pp. 107–133) often leads managers to make
decisions according to a linear, static, and bounded viewpoint in terms of time
horizon and interplay between variables.
In particular, the Balanced Scorecard approach does not help one to understand
(Bianchi, 2012):
• How strategic resource accumulation and depletion processes are triggered by
different policy levers affecting performance drivers
• How performance drivers influence outcome indicators
• How outcomes will affect the strategic asset accumulation and depletion
processes
• How to align key performance measures to strategic objectives (Melnyk et al.,
2013)
SD modeling has been suggested and used to overcome the above limitations and
improve strategic management approaches.
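As a rough illustration of the feedback chain that the list above finds missing in a static scorecard, the hypothetical Python sketch below closes the loop: a policy lever accumulates a strategic resource, the resource shapes a performance driver, the driver yields an end-result, and the end-result flows back into the resource net of depletion. All names and values are invented for illustration only.

```python
# Hypothetical sketch of the resource -> driver -> end-result feedback chain.
reputation = 50.0        # strategic resource (stock)
benchmark = 100.0        # reference level used to normalize the driver
policy_lever = 2.0       # e.g., yearly investment in faculty development

for year in range(1, 11):
    attractiveness = reputation / benchmark      # performance driver (ratio)
    new_enrollments = 500.0 * attractiveness     # end-result (flow per year)
    # Accumulation: lever plus outcome feedback, net of a depletion term.
    reputation += policy_lever + 0.02 * new_enrollments - 0.05 * reputation
    print(f"year {year:2d}: driver={attractiveness:.2f}, "
          f"enrollments={new_enrollments:5.1f}, reputation={reputation:5.1f}")
```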
Since the 1980s, SD applications have characterized significant methodological
advances in interactive simulation games. Namely, “management flight simulators”
represented the door opener to introduce SD methodology to business management
(Forrester, 2007a, b). In the early applications of SD to management sciences, SD
scholars and practitioners approached problem-solving by adopting the “consultant” mode: they would analyze a business, build a model without involving any key-actors inside the organization, and then come back with recommendations. During those years, SD contributions to rational strategic management mainly focused on improving strategy formulation processes (Morecroft, 1984, 1985, 1988) and corporate planning (Hall & Menzies, 1983; Narchal, 1988; Kumar & Vrat, 1989).
Starting from the 1990s, simultaneously with the rise of the Stakeholder Theory, a
new approach, called “Group Model Building,” emerged within the SD application
domain (Vennix, 1996). By using this technique, managers and key-actors started to
increase their involvement in the modeling process to internalize lessons about
dynamic feedback behaviors (Forrester, 2007a, b). Group Model Building uses a
collaborative approach whereby key-actors map out their own perspectives of the
systems and then slowly build on this with other stakeholders to foster a broader
agreement on the system’s structure. Such a collaborative process can be subject to political conditioning in public policy design, and it is the responsibility of the SD facilitator to resist the possible negative implications that this can entail (Größler, 2007). In the same years, “Systems Thinking” became a well-known term in
management sciences, thanks to the book The Fifth Discipline by Peter Senge
(1990). This book represented an outstanding contribution to strategic management
science since it defined the concept of “learning organization.”
In recent years, SD modeling has been combined with PM systems, proving to be effective in fostering strategic learning processes and, as a result, supporting decision-making and performance improvement according to a systemic perspective
(Bianchi, 2012, 2016; Bianchi et al., 2015; Cosenz & Noto, 2016). Combining PM
with SD aims at supporting decision-making through better coordination between
strategy design and performance measurement reporting. The application of SD to
PM helps strategy analysts to trace both causes and drivers leading to a given
performance level over time. In doing so, such an application contributes to enhanc-
ing a systematic diagnostic process. This process enables managers to design
corrective actions to fill the gap between the actual and the target performance. As
such, SD may support managerial approaches and applications in different ways. For
instance, it has been used to investigate and frame the organizational structure of
systems (e.g., an organization as a whole, a strategic business unit, a subsystem, and
so on), and support decision-making processes by simulating and testing the effects
of alternative strategies on performance development under certain conditions (i.e.,
scenario analysis). In addition, SD has been combined with existing strategic management frameworks to better understand phenomena occurring in complex and dynamic domains (Cosenz, 2017; Cosenz & Noto, 2018a, b; Bivona & Montemaggiore, 2010). Finally, SD modeling has also served as a tool for engaging stakeholders from both the internal and external environment.
From this brief historical overview of SD development, it is possible to conclude
that this methodology provided important and unique contributions to management
sciences (Gary et al., 2008). The following section presents a literature review of the
main SD applications to HE management.

3.6 A Literature Review of System Dynamics Applications to University Management

Over the past 20 years, the retrieved literature indicates that SD modeling has been
applied to HE management by numerous authors, with different purposes in terms of
research focus. In the attempt to categorize these contributions, Kennedy (2011)
proposed a taxonomy based on matching specific areas of concern (e.g., Corporate
Governance; Planning, Resourcing, and Budgeting; Human Resource Management
Dilemmas; Teaching Quality; Teaching Practice; Microworlds; Enrollment
Demand; Fund-Raising Research; Research Quality) with different levels of hierar-
chy (e.g., National, Regional, University, Faculty, School/Department). Such a
taxonomy—portrayed in Table 3.2—has been updated and expanded by adding
the most recent contributions on this topic.
Table 3.2 Classification of System Dynamics research in Higher Education Management (adapted from Kennedy, 2011), matching specific areas of concern with hierarchical levels (National, Regional, University, Faculty, School/Department):
• Corporate Governance: Saeed (1996, 1998); Kennedy and Clare (1999)
• Planning, Resourcing, and Budgeting: Galbraith (1982); Galbraith and Carss (1989); Bell et al. (2000); Frances (1995, 2000); Galbraith (1989, 1998a, b, c); Barlas and Diker (1996a, b, 2000); Kennedy and Clare (1999); Vahdatzad and Mojtahedzadeh (2000); Szelest (2003); Kim and Rehg (2018); Trailer (2012)
• Human Resource Management Dilemmas: Kersbergen et al. (2016)
• Teaching Quality: Meadows (1999); Ghaffarzadegan et al. (2017); Kennedy (1998a, b); Eftekhar and Strong (2005); McKeachie (1990); Schneider Fuhrmann and Grasha (1994a, b); Fincher (1994)
• Teaching Practice: Richardson and Andersen (1979, 1980); Forrester (1974); Saeed (1990, 1993, 1997); Frances (2000); Arndt (2007); Friedman et al. (2007); Perez Salazar et al. (2007); Potash and Heinbokel (2005); Meadows (1999); Roberts (1978); Runge (1977); Shaffer (1976); Senge (1988); Morecroft and Sterman (1992); Sterman (1992); Anderson and Sosniak (1994); Nodenof et al. (2004)
• Microworlds: Maier and Größler (2000); Barlas and Diker (1996a, b, 2000); Sterman (1992); Virtual University (2005a, b); Blumenstyk (2000); Dekkers and Donatti (1981); Sawyer (2002)
• Enrollment Demand: Jordan (1992); Frances et al. (1994); Frances (2000); Zaini et al. (2016)
• Fund-Raising Research: Oyo et al. (2008); Cosenz (2014, 2018)
• Research Quality: Küçük et al. (2008); Onsel and Barlas (2011); Cosenz (2014, 2018)

Regardless of the specific research focus, the above contributions frame universities and their organizational structures (e.g., faculties, departments) as complex learning systems whose organizational processes are characterized by an articulated
network of key-players and stakeholders. For instance, by simulating competition
dynamics between different schools belonging to a faculty with limited funds,
Galbraith (1989, 1998a, b, c) investigated the effect of organizational policies on
HE institutional performance in Australian Universities, with a focus on the time
delays between policy change and associated results. In his studies, he found that
drawing up separate plans for faculties, departments, and schools fosters the pursuit
of individual goals, thus undermining the achievement of general institutional goals.
In addition, Galbraith (2010) explored the HE decision-making processes through
an SD model to inspire changes by providing incentives based on faculty staffing and
budgeting. In his modeling perspective, the allocation of funds is affected by
enrollment growth and grants allocation per faculty as a function of academic
research output (Zaini et al., 2016). Similarly, Kennedy and Clare (1999) identified
those factors generating a critical impact on the support of policy analysis regarding
HE resource management issues. Moving away from a static and linear perspective, they argued that SD models are valuable tools for understanding the dynamic
interactions between the input and output factors to pursue organizational process
improvements.
Saeed (1990, 1993, 1997) examined the role of SD in supporting teaching
practices in several academic disciplines, including social sciences. In these contri-
butions, Saeed suggested incorporating experimental learning into the teaching of
social sciences, “since experimentation with relationships, whether in a laboratory or
a studio, helps not only to corroborate theories and create robust designs but also to
develop the reflective process critical to the creation of innovation in various pro-
fessions” (Saeed, 1990).
Barlas and Diker (1996a, b, 2000) developed an interactive dynamic simulation
model, called “UNIGAME,” addressing a range of problems concerning the aca-
demic aspects of a university management system. By highlighting the systemic
nature of academic decision-making, such a “Microworld” allows stakeholders to
realize that individual decisions—made in stand-alone conditions—lead to counter-
intuitive outcomes when not coordinated with other players’ decisions. This inter-
active game also includes a set of academic performance measures. Likewise, within
a US University field project, Zaini et al. (2016) developed an SD model where four
sectors (i.e., students, faculty, quality, and facility) interact with the intent to explore
University growth. The engagement of stakeholders allowed the authors to elicit and
translate their mental models into an SD model, thus remarking the benefit of
adopting a holistic perspective in academic decision-making processes. Further, Meadows (1999) explored learning strategies in Universities by developing several SD-based games, including the “Quality College” game. Experimenting with these
games revealed the need to create sound and simple modeling designs to ensure
effectiveness in learning processes.
Inspired by Ghaffarzadegan et al. (2017), Kim and Rehg (2018) examined the
dynamic complexity in University contexts focusing on academic strategic planning.
By conducting participatory systems mapping sessions, they explored how innovative attempts to improve educational activities’ quality and efficiency influence
faculty morale, including relevant factors such as the unintended implications of
faculty hiring decisions and salary inequity leading to poor performances and
financial problems. An SD model was also developed by Szelest (2003) to evaluate
alternative enrollment management theories and to address potential conflicts of
interest between education and research strategies in terms of financial resource
allocation.
The above contributions prove the relevance of adopting SD modeling in University management and attest to a long research tradition in this context. Nevertheless, few
attempts to combine PM systems and SD have been identified in the prevailing
literature on SD applied to University management. To fill this gap, the following
sections aim to illustrate how to support PM system design in HEIs with SD
modeling.

3.7 Conceptual, Insight, and Full-Fledged Simulation Modeling

SD models are developed by constructing feedback structures that identify the causal
connections among the relevant variables of the system under observation. These
feedback structures explain the rationale underlying the behavior of the variables
forming the loops by highlighting drivers and strategic levers to influence the
system’s current state (Sterman, 2000). As such, similarly to other modeling
approaches supporting decision-making in organizations, SD models serve as cog-
nitive tools to explore the operational processes affecting organizational perfor-
mance and better understand how the organization reacts to strategic interventions
in terms of performance. Based on the cognitive purpose they serve, these models can be classified into conceptual (or qualitative), stock-and-flow simulation (or quantitative), and insight architectures (Wolstenholme, 1999).
Conceptual SD models draw attention to the causal relations among variables and
the polarity of the feedback loops (i.e., reinforcing or balancing loops) they form.
Since they cannot include quantitative data or simulate system behaviors, these causal maps are mainly used as simple channels to communicate and share an understanding of the fundamental dynamics underlying the observed system
(Cosenz, 2017). They are also adopted as a tool to facilitate Group Model Building processes among actors inexperienced in SD (Vennix, 1996) or Collaborative Conceptual
Modeling (Newell, 2012; Newell & Proust, 2009). Figure 3.3 shows an example of
conceptual SD modeling where a reinforcing and a balancing feedback loop affect
financial resources in a University setting. Namely, the reinforcing loop illustrates
how the use of financial resources to improve education quality through academic
staff hiring may increase student satisfaction, leading to new enrollments and
associated fees that ultimately fuel the University’s financial resources. On the
other hand, these resources are decreased by increasing salaries for new staff, thus producing a balancing effect on the financial assets of the HEI.

Fig. 3.3 An example of a conceptual SD model

Stock-and-flow simulation models aim to quantify the causal relations among the
system’s variables to simulate its behavior over time (Größler et al., 2008). This
modeling mode also accounts for time delays. These models include four kinds of variables (a minimal code sketch follows the list):
• Stocks or levels: they represent an endowment of a strategic resource at a given
time, which can change through accumulation and depletion processes. Both
tangible and intangible resources (e.g., laboratories, academic workforce, biblio-
graphic databases, equity, human capital, relational capital, etc.) can be modeled
as stock variables. As portrayed in Fig. 3.4, they are graphically identified with a
rectangular shape.
• Flows: these variables determine a change (increase or decrease) of a stock from
one period to another due to implemented strategies. They can be modeled as
inflows, outflows, or even net-flows affecting a given stock (Fig. 3.4). These
variables correspond to the end-results achieved by the organization and are
depicted as arrow-shaped symbols connected to stocks.
• Inputs: these are exogenous parameters, such as contextual constraints or strategic
levers allowing modelers to translate policies into model variables. They are
graphically identified as diamond-shaped symbols (Fig. 3.4), and their value
remains constant throughout the simulation interval.
• Auxiliaries: these variables are used to add intermediate calculations within the
model and contribute to improving the understanding of the system. Performance
drivers are modeled as auxiliary variables whose shape corresponds to a circle
(Fig. 3.4).
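
To make these four variable typologies concrete, the following minimal Python sketch shows how each of them would enter a single integration step of a stock-and-flow model. All names and numerical values are hypothetical and serve only to illustrate the bookkeeping; they are not drawn from the models discussed in this chapter.

# One Euler integration step of a minimal stock-and-flow model.
# All names and values are illustrative assumptions.

DT = 0.25  # simulation time step (years)

# Input: exogenous parameter, constant over the simulation interval
AVERAGE_SALARY = 45_000.0  # euro per academic staff unit per year

def step(financial_resources, academic_staff, new_enrollments):
    """Advance the stock of financial resources by one time step."""
    # Auxiliary: intermediate calculation improving model readability
    total_salaries = academic_staff * AVERAGE_SALARY  # euro per year

    # Flows: rates of change (end-results) affecting the stock
    inflow = new_enrollments * 1_500.0  # enrollment fees, euro per year
    outflow = total_salaries

    # Stock: accumulates the net flow over the time step
    return financial_resources + (inflow - outflow) * DT

# Example: 1,000,000 euro, 20 staff units, 600 new enrollments per year
print(step(1_000_000.0, 20.0, 600.0))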
Fig. 3.4 An example of a full-fledged simulation model

The same example shown in Fig. 3.3 is translated into a stock-and-flow simula-
tion model in Fig. 3.4.
It is worth remarking that a reinforcing loop produces an exponential behavior in
the model variables due to the effect of a virtuous or vicious cycle, thus involving
growth or dysfunctional dynamics. In turn, a balancing loop reveals the presence of a draining or adjusting process influencing a given resource. Considering the example depicted in Fig. 3.3, the simulation result related to the stock of financial
resources displays an exponential growth in the first phase of the period due to the
dominance of the reinforcing loop. Subsequently, such a trend is followed by a goal-
seeking trajectory caused by increased academic staff salaries, which refers to a
dominance shifting toward the balancing feedback loop (Fig. 3.5).
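
The dominance shift described above can be reproduced in a few lines of code. The following Python sketch is a stylized rendering of the example in Figs. 3.3–3.5; every parameter value and functional form is a hypothetical assumption chosen only to make the interplay between the reinforcing (R) and balancing (B) loops visible.

# Stylized two-loop simulation: fees reinforce growth (R), salaries
# balance it (B). Parameters are illustrative assumptions only.

DT, END = 0.25, 20.0                 # time step and horizon (years)
FEE, SALARY = 2_000.0, 25_000.0      # euro per student / per staff unit
DESIRED_STAFF, ADJUST_TIME = 48.0, 2.0

resources, staff, satisfaction = 500_000.0, 20.0, 0.5
t = 0.0
while t < END:
    quality = min(staff / 40.0, 1.0)               # education quality driver
    satisfaction += (quality - satisfaction) * DT  # adjusts with a delay
    enrollments = 600.0 * satisfaction             # new enrollments per year
    # Hiring is limited both by money (R loop) and by the staffing goal (B):
    hiring = min(max(resources, 0.0) * 1e-5,
                 (DESIRED_STAFF - staff) / ADJUST_TIME)
    staff += hiring * DT
    cash_flow = enrollments * FEE - staff * SALARY  # fees in, salaries out
    resources += cash_flow * DT
    t += DT

print(f"after {END:.0f} years: staff = {staff:.1f}, "
      f"financial resources = {resources:,.0f} euro")

With these illustrative values, financial resources rise at an increasing rate while the reinforcing loop dominates and then flatten as the growing salary bill pushes the cash flow toward zero, mirroring the growth-then-goal-seeking pattern of Fig. 3.5.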
To improve the readability and comprehension of SD models, insight models are
also used (Bianchi et al., 2019; Lyneis, 1999). This modeling mode is a middle
ground between the two modes described above, i.e., conceptual and stock-and-flow simulation
models. It aims to enhance conceptual SD models, or simplify stock-and-flow
structures, by framing a qualitative feedback architecture that identifies the variable
typologies forming the system (e.g., stocks, flows, etc.). As Bianchi (2016) argued,
insight (or policy-based) models pave the way for more extensive quantitative modeling at a later stage of SD application when designing PM systems in an organization that is new to this method. Figure 3.6 replicates the above example
according to an insight modeling perspective.
Fig. 3.5 Simulating financial resource behavior through full-fledged simulation models

Fig. 3.6 An example of an insight SD model

The following section illustrates how to combine PM systems and SD modeling with the intent to better support academic decision-making processes and performance reporting. Such an integrated approach is named Dynamic Performance Management—or, in short, DPM (Bianchi, 2010, 2012, 2016).

3.8 A Dynamic Performance Management Approach to University Management

The previous sections of this chapter introduced the use of SD modeling as a method
to overcome those critical issues affecting PM design and implementation in HEIs
(as described in Sect. 3.2), as well as to improve the effectiveness of PM systems in
managing the inherent dynamic complexity of this sector (Alexander et al., 2018).
SD models may support academic decision-makers in (i) framing the causal relation-
ships among strategic resources, performance drivers, and end-results,
(ii) determining the time delays between causes and associated effects in policy
implementation, and (iii) showing the behavior of critical variables through simula-
tion scenarios.
Although there may appear to be an inconsistency between conventional PM systems—including accounting, budgeting, and reporting systems—and SD models, their synergy may enhance the connection between strategic planning and performance measurement. In this way, such a synergy nurtures a rapid adoption of corrective actions and
strategy reversals (see Fig. 2.3). Unlike traditional PM approaches where organiza-
tional performance is measured according to a bounded dimensional perspective,1
integrating SD and PM systems enables one to capture causal connections among the
key variables of multiple performance dimensions within a unique PM framework
(Santos et al., 2002).
The complementarity between PM and SD is based on a circular logic to improve
and refine key-actors’ mental models according to a strategic learning perspective.
Such a logic encompasses the following interconnected phases: (i) observation;
(ii) reflection, knowledge elicitation, and communication; (iii) diagnosing and sharing a common understanding of phenomena; and (iv) decision-making and correc-
tive actions (Kolb, 1984).
Crossing the above phases through the use of SD models—i.e., sharing and
formalizing through SD models a common understanding of the causal structure
underlying the functioning of the HEI—forms the basis on which academic
decision-makers may implement a “double-loop learning” process, as characterized
by Argyris (2002). Double-loop learning involves the modification of goals or
decision-making rules in the light of experiences that one can gain by simulating
and testing organizational performance. Since strategic decisions are generated from
mental models, these are intended as the main leverage points for enhancing decision-making processes.

1 Traditional reporting systems usually include separate sections respectively dedicated to (1) economic and financial results (e.g., financial statement, cash flow analysis), (2) social and environmental impacts (e.g., CSR activity reports, environmental reports, customer satisfaction reports), and (3) competitive performances (e.g., market share analysis, benchmarking reports).

Fig. 3.7 Complementarity between PM and SD models grounded around a circular double-loop learning process (Bianchi, 2016, p. 40)

In a complex and dynamic environment such as the HE sector, the observation of
strategic actions and related results is interpreted by the mental models of academic
managers who formulate corrective actions based on their managerial experience
(Martins, 2017). In turn, single-loop learning occurs when such observations of the
organizational model, especially the outcomes of previous strategic actions, lead to
changes in decision-making. When double-loop learning occurs, these observations
produce a stronger effect, modifying academic key-actors’ mental models (Kim
et al., 2013; Cosenz & Noto, 2018b).
As depicted in Fig. 3.7, such a strategic learning process based on the use of SD
models represents the cornerstone around which to set up the circular interplay
between strategic planning and performance measurement. This PM setting is
supported by SD modeling according to the following phases: (i) mapping (i.e.,
framing the organizational model), (ii) planning, (iii) implementing decisions/oper-
ations, and (iv) measuring/evaluating results, and undertaking corrective actions.
In particular, these phases allow academic decision-makers to jointly perceive the
current state of the organizational system and outline—through a shared mapping
process—a conceptual SD model capturing the feedback structure underlying such a
system and explaining its behavior over time. Subsequently, this shared model
enables the formulation of goals and objectives to anticipate the system’s desired
state through strategic planning. After implementing the plan, the model allows
academic decision-makers to measure associated results and undertake prompt
corrective actions through feedback and feedforward control mechanisms.
Fig. 3.8 Modeling the “instrumental view” of PM through SD methodology (Bianchi, 2016, p. 73)

As previously described, the combination of PM and SD modeling is named
Dynamic Performance Management (DPM). This approach defines the synergy
between PM and SD models by converting PM’s “instrumental view” into SD
methodology terms. Figure 3.8 displays how to translate strategic resources, perfor-
mance drivers, and end-results, into SD variables. Namely, resources are modeled as
stock variables whose accumulation and depletion processes are regulated respec-
tively through in- and outflows. Flow variables correspond to the end-results
achieved by the University over time and, as such, are used to capture the value
created (or destroyed) by operations and management processes. For instance, the
stock of students enrolled in a Master’s program (i.e., a strategic resource) may
change due to the flows related to new enrollments and graduations (i.e.,
end-results). Similarly, the stock of publications (i.e., a strategic resource) may
change due to the flow of papers submitted to scientific journals (i.e., an end-result).
Operations and actions undertaken to allocate and use strategic resources influ-
ence critical success factors that are measured as performance drivers and modeled
as auxiliary variables. For instance, the average time to graduate can be measured by
comparing graduates and enrolled students (i.e., a performance driver). This indica-
tor provides a relevant piece of information on the education capacity of a University
for undergraduate students searching for a Master’s program, thus affecting new
enrollments (i.e., an end-result).

In the DPM approach, the feedback architecture underlying the dynamics of the
strategic resources implies that the flows influencing such resources are measured
over a time interval using performance drivers. In this context, the use of SD
enhances the understanding of how time delays affect strategic resources and results
throughout value creation processes, thus becoming valuable methodological sup-
port for managing performance in dynamic complex systems.
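
Such delays are typically modeled in SD as first-order (exponential) adjustment processes. The hypothetical Python sketch below shows how a perceived value (e.g., the reputation external audiences attribute to a University) lags behind an actual improvement, so that the results of an action materialize only gradually.

# First-order information delay: a perceived value closes the gap
# with the actual value at a rate set by the delay. Illustrative only.

DT, DELAY = 0.25, 3.0   # time step and average perception delay (years)
perceived = 1.0

for step in range(40):  # simulate 10 years
    t = step * DT
    actual = 1.0 if t < 2.0 else 1.5   # actual image jumps at t = 2
    perceived += (actual - perceived) / DELAY * DT

print(f"actual = {actual:.2f}, perceived after 10 years = {perceived:.2f}")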
The focus on the conversion of the “instrumental view” of PM into SD models
aims to underline the capability of DPM to enhance the design of key performance
metrics in HEIs. However, PM’s “subjective” and “objective” views must also be
considered complementary perspectives to support strategic planning, performance
measurement, and process analysis. Figure 3.9 shows the complementarity among
the three PM views and highlights their causal interplays according to an SD
approach, thus identifying the DPM. DPM aims to strengthen—through a systemic
perspective—the connection between strategic planning and performance measure-
ment in organizational settings characterized by dynamic complexity, thus providing
a strategic learning tool to decision-makers (Cosenz & Bianchi, 2013).
In particular, to support the strategic learning processes of academic decision-
makers aimed at managing organizational performance according to a sustainable
development perspective, the logical pathway for applying the DPM approach to
HEIs encompasses five main developmental stages. As shown in Fig. 3.10, the
process begins with identifying objectives and goals and designing associated
performance indicators. These indicators are ratios that compare actual results with
objectives (or expected results or benchmarks) to measure effectiveness and
relevance.
Subsequently, adopting PM’s “objective view” enables decision-makers to frame and place performance indicators throughout the academic value chain (or specific organizational processes) characterized by the connection between outputs, operations, and (internal or external) clients. This phase is of particular relevance, as an effective evaluation of the factors underlying the connections among the organizational units (e.g., input/output quality and size, time delays, procedural errors, substantial errors, etc.) contributes to detecting the critical areas requiring prompt corrective
actions (Noto & Cosenz, 2021). In addition, it improves the strategic coordination
among the units. Specifically, corrective actions are undertaken by intervening on
specific strategic levers (e.g., reallocating resources from one unit to another) of the
value creation process under observation.
Eventually, identifying the above elements allows academic decision-makers to
outline the cause-and-effect relationships among them, thus generating causal loop
structures explaining the performance dynamics of the system.
Figure 3.11 focuses on an example of using the “instrumental view” of PM
to explore the dynamics of enrollments, teaching staff management, and educational
programs in a University setting. This representation helps identify and frame
strategic resources, performance drivers, end-results, and the causal interplays
among these factors.

Fig. 3.9 Implementing Dynamic Performance Management through System Dynamics modeling (Bianchi, 2010, 2012, 2016)

Fig. 3.10 Developmental stages of DPM models

The model displayed in Fig. 3.12 exemplifies the conversion of the framework depicted in Fig. 3.11 into a stock-and-flow DPM structure and highlights the role
played by each variable. More specifically, input variables identify the external
constraints that academic decision-makers cannot influence, such as the time to
approve new educational programs, the obsolescence rate of existing educational
programs, and the retirement time of teaching staff. The same variable type is used to define the policy lever setting the desired number of teaching staff. Conversely, this
policy lever can be used by academic decision-makers to affect the stock of teaching
staff working in educational programs. A benchmark related to the educational programs of competitor Universities is also depicted as an input variable. It serves as a reference value for detecting a possible gap in the educational programs offered.
The stock variables identify the main strategic resources within the model. This
example refers to educational programs, teaching staff, and enrolled students. These
resources change over time due to associated in- and outflows embedding the effects
of decisions and end-results. Namely, teaching staff retirements and recruitments
influence the stock of the workforce, which changes over time as a result of the
combined effect of the recruitment policy and the average time to retire.
Likewise, approving new educational programs and removing the obsolete ones
lead to a change in the stock of extant programs. In this case, eliminating programs
depends on an external constraint, i.e., the obsolescence imposed by the labor
market. Conversely, the implementation of new programs is endogenously ruled
by the performance driver measuring the discrepancy between the current educational programs offered by the University and those of its competitors.

Fig. 3.11 An example of the “instrumental view” of PM applied to Enrollments and Graduations

Fig. 3.12 Converting the “instrumental view” of PM into a DPM stock-and-flow model to explore Enrollment and Graduation dynamics

Once again, enrolled students are affected by the inflow of new enrollments and
the outflow of graduations seen as the main end-results of the observed system.
Performance drivers influence these results. In particular, while new enrollments depend on the difference between the educational programs of the University and
those offered by its competitors, the teaching staff per student ratio generates an
effect on the time to complete the programs which, in turn, affects graduations.
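
A minimal, runnable rendering of the structure in Fig. 3.12 might look as follows. All parameter values, and the two "effect" relationships in particular, are hypothetical placeholders; the sketch only illustrates how constraints, the policy lever, the benchmark, stocks, flows, and performance drivers fit together in one simulation loop.

# Sketch of the Fig. 3.12 stock-and-flow structure. Parameter values
# and the two "effect" functions are illustrative assumptions.

DT, END = 0.25, 10.0  # time step and horizon (years)

# Inputs: constraints, policy lever, and benchmark
APPROVAL_TIME = 2.0        # years to approve new educational programs
OBSOLESCENCE_RATE = 0.05   # fraction of programs cancelled per year
RETIREMENT_TIME = 30.0     # average teaching career length (years)
DESIRED_STAFF = 500.0      # policy lever set by decision-makers
BENCHMARK_PROGRAMS = 60.0  # competitor University educational programs

# Stocks: strategic resources
programs, staff, students = 40.0, 400.0, 8_000.0

t = 0.0
while t < END:
    # Performance drivers
    gap_in_programs = max(BENCHMARK_PROGRAMS - programs, 0.0)
    staff_per_students = staff / students
    # Hypothetical effects: a richer offer attracts students; a better
    # staffing ratio shortens the time to complete a program.
    effect_on_enrollments = 1.0 + 0.5 * (programs / BENCHMARK_PROGRAMS)
    time_to_complete = 5.0 / min(max(staff_per_students / 0.05, 0.5), 1.5)

    # Flows: decisions and end-results
    new_programs = gap_in_programs / APPROVAL_TIME
    cancelled_programs = programs * OBSOLESCENCE_RATE
    recruitments = max(DESIRED_STAFF - staff, 0.0) / 5.0
    retirements = staff / RETIREMENT_TIME
    new_enrollments = 1_500.0 * effect_on_enrollments
    graduations = students / time_to_complete

    # Stock updates (Euler integration)
    programs += (new_programs - cancelled_programs) * DT
    staff += (recruitments - retirements) * DT
    students += (new_enrollments - graduations) * DT
    t += DT

print(f"programs = {programs:.0f}, staff = {staff:.0f}, "
      f"students = {students:,.0f}")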
To further test the effectiveness of DPM in managing academic performance, the following subsections illustrate applications of DPM to real academic contexts.

3.8.1 Applying Dynamic Performance Management to Fund-Raising Processes for Research in Academic Departments

The following case study provides a field experiment illustrating how to design and adopt the DPM approach in an academic department. Case-based research offers relevant methodological advantages in terms of analysis scope and inductive logic, enabling a more reliable interpretation of data (Yin, 2009).
The case study focuses on an academic department of a public University in Italy,
where a performance-based funding system has been recently introduced at a
national level to regulate the distribution of public funding among HEIs
(Francesconi & Guarini, 2016; Biondi & Cosenz, 2017). This case is excerpted
from Cosenz (2018). A web survey collected the data describing the general department profile and its institutional structure. Subsequently, semi-structured interviews with academic and administrative staff were conducted, using standard questions adapted to the different interviewees.
Academic departments are the main organizational areas devoted to research
activities within the overall institutional mission of Universities. The observed
department embraces several scientific domains related to political, social, manage-
ment, and law sciences. Its mission relates to “the development of different research
areas that contribute to defining the cognitive frameworks underlying the process of
European integration and creating an innovative international system based on
universal principles.” More specifically, the main departmental aim is to promote research activities at both national and international levels, identifying as key clients the scientific community, Ph.D. students, third parties, and funding
institutions. As for the governance structure, the departmental Director is supported
by the Council composed of ten members. The department counts 46 academic staff
units (assistant, associate, and full professors) and 9 technical-administrative
employees. An international Ph.D. program is also run at this department, where
16 Ph.D. students are currently enrolled. Research products correspond to scientific
and empirical contributions, e.g., monographs, journal articles, conference papers,
and posters.
This case study focuses on the dynamics related to Research fund-raising pro-
cesses, which imply the development of research projects commissioned and spon-
sored by external stakeholders, such as enterprises and public sector institutions.
Research fund-raising is a process of particular relevance in this context since its
outcomes—intended as the acquisition of external resources—influence the
performance-based ranking of the University within the national funding system,
thereby contributing to an increase in the share of public funds allocated to the
University.
Adopting the “instrumental view” of PM, the strategic resources mostly affecting
performance in managing such a process are the following:
• Liquidity to invest in the acquisition of other strategic assets
• Research staff, who are responsible for developing research projects and creating
networks with external stakeholders to foster research collaboration and
partnerships
• Research staff skills to be acquired and deployed in developing research projects
• Research equipment (e.g., laboratories, libraries, software, etc.) and bibliographic
databases to endow scholars with adequate instruments to increase their knowl-
edge and skills applicable to conduct research activities
• Image intended as the reputation of the department to attract external investors
• Submitted papers, i.e., research outputs sent to academic journal boards whose
evaluation may result in publications (or rejections)
• Publications intended as research papers accepted for publication in a selected
academic journal
• Citations which count how many times other scholars have quoted a publication
and, consequently, highlight the impact of the research outcomes on a specific
scientific community
• Research projects requested and sponsored by external stakeholders, such as
SMEs, large-sized companies, and public sector organizations
Figure 3.13 shows PM’s “instrumental view” applied to the research fund-raising process.

Fig. 3.13 An “instrumental view” of PM applied to research fund-raising processes in academic departments

The proper use and coordination of such strategic resources allow the department
heads to affect a set of performance drivers that influence associated end-results. In
this framework, the departmental actors agreed upon the design of the following
drivers (a sketch of their computation in code follows the list):
• The relative research productivity defined as the ratio between submitted papers
and research staff per year compared with a benchmark
• The fraction of published papers measured as a ratio between published and
submitted articles
• The relative citations, which compare the yearly citations collected by the depart-
ment research staff with those of its competitors
• The relative research quality as a driver to increase publications of department
research staff
• The relative time to develop research projects, which measures the discrepancy between the actual and the planned time for completing a research product
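
Since each driver above is a ratio, its computation is straightforward. The Python sketch below uses hypothetical yearly figures, not data from the case study, to compute four of the five drivers; relative research quality would follow the same pattern once actual and expected research quality are quantified.

# Departmental performance drivers computed as ratios.
# All input figures are hypothetical, not case-study data.

submitted, published, research_staff = 60, 40, 46
benchmark_per_staff = 1.5                     # competitor benchmark
own_citations, competitor_citations = 700, 650
actual_time, planned_time = 14.0, 12.0        # months per research product

relative_productivity = (submitted / research_staff) / benchmark_per_staff
published_fraction = published / submitted
relative_citations = own_citations / competitor_citations
relative_development_time = actual_time / planned_time

for name, value in [("relative research productivity", relative_productivity),
                    ("fraction of published papers", published_fraction),
                    ("relative citations", relative_citations),
                    ("relative time to develop", relative_development_time)]:
    print(f"{name}: {value:.2f}")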
These drivers support department heads in evaluating how the strategic decisions applied to strategic resources affect the critical success factors of the research fund-raising process and, in turn, generate an effect on the end-results, producing a change
in the corresponding stock of resources. Namely, the framework detects the follow-
ing end-results:
• Cash flow which feeds back into liquidity and incorporates the twofold transfer
from external stakeholders and national Government
• Research staff recruitments increasing the scientific productivity of departmental
actors
• Change in research staff skills affecting the research quality and, consequently,
the possibility to publish in high-ranked scientific journals
• Change in research equipment and bibliographic databases supporting the scien-
tific productivity of research staff
• Newly submitted papers, i.e., the number of completed research products ready
for being evaluated for publication purposes
• New publications, i.e., submitted papers accepted for publication as a result of
peer-review processes carried out by academic journal boards
• New citations measuring the impact of publications within the scientific
community
• Change in department image intended as the reputation gained (or lost) by the
department for its research outcomes
• Change in research quality depending on the skills acquired by research staff
• Newly commissioned research projects by external stakeholders relying on the
department actors’ research activities
Fig. 3.14 A conceptual DPM model of research fund-raising processes in academic departments

Identifying and placing the core elements of PM’s “instrumental view” inside the respective layers (i.e., strategic resources, performance drivers, end-results) is instrumental in outlining the DPM model related to fund-raising processes. Figure 3.14
depicts the emerging conceptual DPM model, highlighting the fund-raising system’s
main feedback loops. Particularly, the department heads outlined the reinforcing
loop R1, which illustrates how an improvement of the department image positively
influences—other conditions being equal—the acquisition of new research projects
commissioned by external funders, which in turn may foster a further image improvement.
Loop R2 shows how an increase in liquidity directly affects the investments in
research staff skills. For instance, researchers develop their skills by attending
training courses, international conferences, and workshops or by strengthening
joint research partnerships. This may improve the research quality indicator—i.e.,
the ratio between actual and expected research products—and, consequently,
increase new publications in academic journals. Such an increase in publications
directly affects the number of yearly citations reached by the research staff. In turn,
citations positively influence the ratio between the department and its competitors’
citations, affecting its image. Again, this may imply new research agreements with
external stakeholders providing more funds to develop joint research projects. In
addition, external funds invested in research activities positively affect the
performance-based ranking indicator used by the national Government—i.e., the
ratio between research funds from external sponsors and total funds allocated by the University itself for research activities, thus securing a more significant share of public funding and, as a result, more liquidity.
Department heads may also allocate financial investments for research staff
recruitment and new equipment purchases. As shown in loops R3 and R4, this
investment will likely increase the research staff and equipment, affecting the
research productivity indicator. Consequently, fostering research productivity may
result in new papers to submit to academic journals. In this case, by submitting more
papers to scientific journals, researchers are likely to obtain new publications. On the
one hand, publications directly affect the number of citations, as well as the
corresponding driver—i.e., the ratio between the department and its competitors’
citations—while, on the other, they may improve the ratio between published and
submitted articles, as illustrated in loop R5. Both drivers influence the department
image, whose improvement may imply new research agreements with external funders, thus attracting more public funding and increasing liquidity.
Similarly, the identification of loop R6 outlines that, by using its ‘relational
capital’, research staff may directly contribute to establishing new research contracts
with external stakeholders, thereby raising more public funding for fueling liquidity.
As for loop B1, an increase in liquidity allows the department to recruit more
research staff, which implies an increase in the total salaries paid by the University
and, consequently, a decrease in liquidity. Loop B2 illustrates that the department
also invests its liquidity in financing research activities. This investment may reduce
the ratio between external and internal funds invested in research that directly affects
public funding allocation to the University, feeding back into liquidity.
Loop B3 compares the department and its competitors in terms of research
performance. An increase in research contracts between the department and external
funders weakens the performance-based ranking of competitor Universities.
According to the national performance-based funding system, this dynamic may
decrease public resource allocation to such Universities. Therefore, a lower ranking
of competitors is likely to produce a stronger reaction to counteract the loss of public
funding. In other words, competitor Universities will be more focused on adopting
counteracting policies to be more competitive. As a result, their counteracting
policies may imply a reduction in the public funding allocation to the University.
This reaction may determine a decrease in liquidity and, hence, a reduction in
research skills development investments. The impoverishment of research skills
directly influences the quality of research, thus limiting the improvement of the
department image, which feeds back into research contracts with external funders.
By exploring loop B4, department heads highlight that investments in new
research equipment and research staff recruitments influence the overall research
productivity, affecting the number of papers submitted for publication. More sub-
mitted papers negatively affect the ratio between published and submitted articles
which, in turn, directly influences the department image. Improvement of the image
causes an increase in new research agreements with external funders whose contri-
bution once again may lead to accumulating more liquidity to be reinvested.
Loop B5 implies that an increase in the department image may involve new
research contracts with external funders and, as a result, more external funds for
research activities, thereby increasing public funding. This dynamic generates a
negative effect on the performance-based ranking of competitors, which, in turn,
will produce a more significant effort to implement counteracting policies to increase
their citations.
The conceptual DPM structure—portrayed in Fig. 3.14—has been converted into
a DPM stock-and-flow simulation model to explore the potential of using DPM.
Figure 3.15 illustrates the emerging DPM stock-and-flow simulation model.

Fig. 3.15 A DPM full-fledged simulation model of research fund-raising processes in academic departments

According to a systemic perspective, the model highlights strategic resources,
performance indicators, results, and related causal interplays that create closed
feedback loops. Department heads emphasize three main end-results to measure the
department’s performance in raising external funds. These are (1) earnings from
commissioned research projects, (2) new publications resulting from the develop-
ment of these research projects, and (3) the increase in research staff citations.
In particular, an increase in research staff positively influences research produc-
tivity. Consequently, this may result in a rise of papers submitted to academic
journals, thus positively affecting publications. The research productivity indicator
generates a positive effect on new publications. A second performance indicator
relates to the ratio between publications and submitted papers as an expression of
quality and relevance of research. Both research quality and relevance are essential
strategic resources that may depend on the attendance of research staff at international conferences, workshops, and training courses. The research quality indi-
cator defines a third performance driver that positively affects the department image
and the number of citations. The stock of publications also influences the latter. An
increase in citations represents an end-result evaluating the impact of research out-
puts on the scientific community. In addition to the research quality indicator, the
citation indicator positively affects the department image, which, in turn, affects the
new research projects commissioned and sponsored by external stakeholders.
Likewise, the ratio between completed and commissioned research projects
measures the average research development time. Compared to the same parameter
of competitors, it represents a critical success factor that negatively influences the
departmental image. Research completion also depends on the ratio between the
number of projects to be developed and the available research staff.
Commissioned research projects feed the associated earnings, which department heads may invest to boost the strategic resources: in this case, mainly research staff recruitment, research equipment and scientific databases, and research skills.
This modeling approach enables the simulation of the behavior of the main
variables linked to departmental performance. Specifically, the simulation interval
is equal to 5 years (2015–2019), and the emerging simulation scenario aims to assess
the strategies related to fund-raising processes.
Figure 3.16 shows the behaviors of performance indicators over time. In partic-
ular, assuming that competitors’ yearly citations equal 650, the citation indicator
displays a gradual decrease until mid-2015. From then on, it grows exponentially until 2017, reaching a value of 1.2. Subsequently, this trend slows down due to a balancing effect that stabilizes its value around 1. Accordingly, the research quality indicator shows a growing behavior, reaching a value of 1.2 in 2016. Successively, it gradually decreases, seeking an equilibrium close to 0.9.
The indicator that measures the research development time remains stable during
the simulation period, indicating no critical issues related to the time variable. Meanwhile, the ratio between publications and submitted papers shows a growing trend, reaching 0.66 in 2019.
The research productivity indicator gradually increases to 0.76 in 2019, assuming 5 yearly publications per research staff unit in competitor departments. The ratio between commissioned research projects and research staff grows from 0.10 to 0.20,
revealing higher participation of scholars in the activities oriented to develop more research projects commissioned by external funders.

Fig. 3.16 A simulation scenario of performance drivers in research fund-raising processes

Fig. 3.17 A simulation scenario of end-results in research fund-raising processes

Figure 3.17 illustrates the behaviors of the three main end-results. Specifically,
new publications begin at 93 in 2015 and grow to 96 in 2019. Conversely, new citations decrease in the first phase and then reach 800 in 2017. From 2018, a balancing process stabilizes yearly citations at around 700 by 2019. Eventually, the earnings from commissioned research projects show robust growth, moving from €10,000 to €30,000 in 2019.
This case study provides a synthetic illustration of how to adopt a DPM approach
to frame academic performance according to a systemic perspective. Based on the
described model structure, academic decision-makers may evaluate alternative strat-
egies related to a different allocation of strategic resources to test—through simula-
tion—how the system reacts (i.e., “what-if” analysis), thus selecting the most sustainable scenario in terms of performance improvement.

3.8.2 A University Dynamic Performance Management Simulator to Explore the Synergies Between Research and Education

Drawing on the above DPM models applied to Education and Research, this section
illustrates a comprehensive University DPM simulator to explore the synergies
between these two core academic functions. This simulator is intended as a strategic
learning tool for academic decision-makers. To this end, the emerging model aims to
frame the specific organizational complexity of HEIs, thus providing a supportive
tool for managing the trade-offs between the strategies—and associated outputs—
applied to both Research and Education.
In particular, given a limited amount of financial resources, the DPM simulator
enables alternative investment decisions designed to allocate the available budget to
(i) purchase new research assets fostering the research productivity (e.g., scientific
databases, laboratory instruments, etc.), (ii) hire new research and teaching staff, and
(iii) improve the research skills of professors (e.g., more training activities, funding
visiting periods abroad, etc.). In addition, competitor Universities may adopt differ-
ent reaction policies to counteract the success of the HEI under observation. Namely,
these counteracting policies include aggressive, moderate, and limited reactions by
competitors.
Figure 3.18 shows the DPM simulator model. The upper section reports the
investment dynamics depending on the available liquidity and the effects of such
investment decisions on the selected strategic resources (i.e., research assets, pro-
fessors, and research skills). Academic decision-makers further develop the model in
Fig. 3.15 to highlight performance drivers and end-results associated with research
processes. In their view, while research assets and skills are resources that
ultimately affect research productivity and quality resulting in new publications and
citations, the number of professors influences the research outputs and the main
results of educational activities such as new enrolled students and graduates.
The lower section of the model illustrates an aging chain archetype related to a
5-year educational program offered by the University, thus providing an example of
a core process within the Education function. In this model section, the ratio between
lecturers and students acts as a performance driver affecting the education quality
whose measurement supports academic decision-makers in designing policies more
consistent with the expectations of students—seen as “clients” according to an
“Entrepreneurial University” perspective (Etzkowitz et al., 2000). Other significant
performance drivers influencing Education assess the ratio between newly enrolled and total students, the ratio between graduated and total students, and the average percentage of students passing from one academic year to the next. The latter is particularly relevant since it enables the detection of barriers and critical issues in students’ educational pathways year by year, thereby supporting more accurate corrective actions.
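
The aging chain can be sketched as a pipeline of cohort stocks, one per year of study, linked by passing flows and drained by drop-out flows. The Python sketch below uses hypothetical rates and initial values throughout and advances the chain one academic year at a time.

# Aging chain for a 5-year educational program: one stock per year
# of study. All rates and initial values are hypothetical.

PASS_RATE = [0.80, 0.85, 0.88, 0.90, 0.92]  # fraction advancing each year
DROP_RATE = [0.10, 0.06, 0.05, 0.03, 0.02]  # fraction dropping out each year
NEW_ENROLLMENTS = 1_000

cohorts = [1_000.0, 900.0, 820.0, 760.0, 700.0]  # students in years 1..5

for year in range(10):  # simulate 10 academic years
    passing = [c * p for c, p in zip(cohorts, PASS_RATE)]
    dropouts = [c * d for c, d in zip(cohorts, DROP_RATE)]
    graduates = passing[-1]  # students completing the 5th year
    staying = [c - p - d for c, p, d in zip(cohorts, passing, dropouts)]
    # New enrollments refill year 1; each passing flow feeds the next year
    cohorts = [NEW_ENROLLMENTS + staying[0]] + \
              [passing[i] + staying[i + 1] for i in range(4)]

print(f"graduates in the last simulated year: {graduates:,.0f}")
print("cohort sizes:", [round(c) for c in cohorts])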
The core results emerging from Research and Education processes jointly flow
into the change in University image, whose effect generates influence on those
results associated with a change in financial resources. Again, such results affect
Research since improving the University image may increase research contracts and
new external funds. They also influence Education given that the University image
affects new enrollments, implying more tuition fees paid by students to attend educational programs.

Fig. 3.18 A DPM simulator exploring the synergies between Research and Education
The emerging simulation scenarios display the behaviors related to the strategic
resources (Fig. 3.19) and performance drivers (Fig. 3.20), as identified in the model
structure. Simulation runs cover 6 years (2012–2017). By analyzing these
trends, academic decision-makers may take advantage of a quantitative perspective
showing how the HE system reacts according to the specific strategies and manage-
ment decisions adopted in the simulation settings.

Fig. 3.19 Simulations of strategic resources

Fig. 3.20 Simulations of performance drivers

In addition, University administrators can better understand the underlying causes
of the observed behaviors by turning back to the model structure and exploring the
quantified interactions between the variables. This also enables them to identify
those strategy levers to be modified in the pursuit of better organizational results and,
consequently, retest alternative policies through simulation. Such a process evolves
according to an evolutionary pathway—based on model refinements and strategy
reversals—that supports the strategic learning of academic decision-makers
over time.
The advantages and limitations of adopting DPM in HEIs are discussed and
analyzed in the following section.

3.9 Advantages and Limitations of Using Dynamic Performance Management in Higher Education Institutions

The DPM approach—based on the combination of conventional PM frameworks and SD methodology—provides academic decision-makers with an insightful, effective, relatively rapid, and inexpensive method to frame and assess operations,
activities, and processes in terms of strategic planning and performance measure-
ment (Cosenz, 2018; Groesser & Jovy, 2016). The use of a simulation-based
technique may provide a deeper and more integrated understanding of the complex-
ity characterizing University management through the qualitative and quantitative
exploration of its systemic interdependencies (Ter Bogt & Scapens, 2012).
In addition, DPM is a flexible approach to frame the interplays in the system and
remodel the underlying organizational structure to adapt it to external changes in the
environment. Namely, the modeling process begins with a simple model structure
and continuously improves in an evolutionary way through rapid adaptations to
contextual changes. Consequently, this process of elaboration and calibration pro-
duces a robust and purpose-oriented model (Morecroft, 2007; Groesser &
Schwaninger, 2012). Thus, the modeling process involves the academic key-actors
operating within the organizational system and, as a result, may effectively support
their strategic learning processes over time.
The use of SD modeling emphasizes a continuous perspective (Sterman, 2000)
that strives to look beyond single events to analyze the dynamic patterns underlying
them in the short and long term. Then, by identifying these patterns, simulation
scenarios provide an understanding of the causes of current critical issues and
support decision-makers in tackling them promptly (Groesser & Jovy, 2016). Therefore,
as remarked by Groesser and Jovy (2016), the possibility of experimenting with
different scenarios and strategic initiatives by using simulation techniques can
reduce erroneous management decisions and identify disregarded factors and pat-
terns that could become relevant in the future.
The disadvantages of DPM applied to University management are mainly related
to the “ossification” process, which affects the opportunity to encourage cultural
evolutions in managing HEIs when external changes occur, and different managerial
perspectives are required. “Ossification” processes are frequently observed in public
sector institutions traditionally characterized by loosely coupled organizational units
demonstrating strong resistance to change and an idiosyncratic perspective on facing
shared administrative complexities (Osborne, 2002). Such a setting may hamper the
use of SD applied to PM mechanisms as a tool promoting collaboration, shared
understanding of problematic management issues, and participatory decision-
making.
In this context, it is not simple to involve the multiple academic key-actors—operating and interacting at different levels and/or in different organizational areas—in
the modeling process (Rouwette et al., 2011). When they adopt a too bounded
perspective on the goals to achieve and come into conflict when negotiating resources, DPM may fail to produce the necessary cohesion to enhance the strategic coordi-
nation between the organizational units and institutions. As a result, there might be
no improvement in their strategic learning processes, and the empirical evidence
about the learning outcomes of simulation methods and their effectiveness may
appear inadequate (Karakul & Qudrat-Ullah, 2008; Sterman, 2010; Qudrat-Ullah,
2014). Thus, effective SD modeling requires a collaborative and shared learning
perspective among key-actors toward common goals and related strategies.
Another criticism related to SD modeling is the erroneous belief that its adoption
may perfectly replicate real dynamics observed in complex organizational or inter-
organizational settings (Sterman, 2000; Forrester, 2007a, b). The aim of SD models
is not to perfectly simulate reality or precisely reproduce problematic behavior
patterns. Building a model that replicates the actual system and its behavior faith-
fully is not the purpose of SD. As Meadows et al. (1972, p. 21) underline, “the model
we have constructed is, like every other model, imperfect, oversimplified, and
unfinished,” and, as such, it is challenging to compare emergent simulations with
historical data.
Conversely, the goal of SD modeling is to support decision-makers to understand
and explore the systemic structure of a complex social system that drives behavior
over time (Forrester, 1961; Senge, 1990; Sterman, 2000). Far from being considered
a deterministic approach (Keys, 1990), as Forrester (1961) emphasizes, generating
the model and learning about the system produces greater advantages than the model
itself. The modeling process provides more significant learning about the internal
and external causes and implications of systemic structure than the model on its own
would (Featherson & Doolan, 2012). In this perspective, the work of
Ghaffarzadegan et al. (2011) recommends the use of small SD models to better
support the exploration of decision-making and performance measurement processes
in public organizations.
Indeed, real social systems are difficult to understand, and human
beings often experience difficulties in identifying the causes of certain systemic
behaviors due to factors such as time and spatial separation of cause and effect, as
well as incorrect or limited information (Featherson & Doolan, 2012; Sterman,
2002). However, SD models are designed to provide decision-makers with interpre-
tive lenses for exploring complex ecosystems and support their systematic strategic
learning process (Sterman, 2000).

3.10 Closing Remarks

This chapter contributed to the debate on University management by illustrating how
SD modeling may provide additional methodological support in designing and using
PM systems in HEIs. In particular, after an introduction discussing the critical issues affecting PM design and implementation in HEIs, the SD methodology and its modeling principles have been illustrated, focusing on their application to
strategic management research. Indeed, SD modeling may offer valuable methodo-
logical support to PM due to its inner attributes, namely, simulation, systemic view,
explicit link between structure and behavior of the system, as well as an effective
visual representation of the causal interplays among relevant variables affecting
organizational performance (Sterman, 2000; Bianchi, 2016; Größler et al., 2008).
Limitations, methodological shortcomings, and possible unintended implications
originating from the use of SD modeling have also been pointed out and debated. In
addition, the chapter highlighted the extant research adopting SD for supporting
University management. Then, drawing on the work developed by Bianchi (2010,
2012, 2016), the combination of PM with SD modeling has been proposed to
improve strategic planning and performance measurement in HEIs according to a
sustainable development perspective. Specifically, such a combination results in a
more robust methodological approach named Dynamic Performance Management—
DPM.
DPM is particularly effective for supporting management decisions in complex
organizations like HEIs. Time disjunctions between actions and results, and
nonlinear feedback relationships affecting outputs, limit academic decision-makers’
understanding of the structure and behavior of the system in which they implement
their strategies. Such an approach helps them to manage possible risks associated
with unintended implications of strategies which, although they may appear consis-
tent from a static and sectorial perspective, may fail in the long term due to a lack of
coordination or lack of flexibility (Bianchi, 2016; Ghaffarzadegan et al., 2011). In
particular, DPM can support academic decision-making through better coordination
between strategy design and performance measurement reporting. Such coordination
enables tracing both causes and drivers leading to organizational performance over
time. In addition, it contributes to enhancing the diagnostic process for promptly
undertaking corrective actions aimed at filling the gap between the actual and the target
performance.
To test the effectiveness of DPM, the chapter reported two cases describing how
to apply this approach to explore the research fund-raising process in an academic
department and the synergies between Research and Education activities, respec-
tively. These cases highlighted the contribution of SD modeling to conducting a
deeper analysis of how strategies affect organizational performance in HEIs through
the use of simulation scenarios, thus fostering strategic learning processes of aca-
demic decision-makers.
Building on the insights related to the adoption of DPM in the organizational
setting of Universities, the next chapter will aim to widen the scope of DPM
application from the assessment of Research and Education toward Third Mission
activities in HEIs. This will imply additional challenges and methodological efforts
for structuring supportive PM mechanisms shifting the focus from a single Univer-
sity to the broader academic network that influences the performance of the local/
regional area where the HEI operates.

References

Adamides, E. D., & Pomonis, N. (2009). The co-evolution of product, production and supply chain
decisions, and the emergence of manufacturing strategy. International Journal of Production
Economics, 121, 301–312.
Alexander, A., Kumar, M., & Walker, H. (2018). A decision theory perspective on complexity in
performance measurement and management. International Journal of Operations & Production
Management, 38(11), 2214–2244.
Altbach, P. G. (2004). Globalisation and the university: Myths and realities in an unequal world.
Tertiary Education and Management, 10(1), 3–25.
Altbach, P. G., Reisberg, L., & Rumbley, L. E. (2009). Trends in global higher education: Tracking
an academic revolution. Report prepared for the UNESCO 2009 World Conference on Higher
Education. UNESCO.
Anderson, L. W., & Sosniak, L. A. (1994). Bloom’s taxonomy: A forty-year retrospective. Ninety-third yearbook of the National Society for the Study of Education (NSSE). University of Chicago Press.
Argyris, C. (1992). On Organizational Learning. Blackwell Publishing.
Argyris, C. (2002). Double-Loop Learning, Teaching, and Research. Academy of Management
Learning & Education, 1(2), 206–218.
Arndt, H. (2007). Using system dynamics-based learning environments to enhance system thinking.
Proceedings of the 2007 International Conference of the System Dynamics Society, July 29–
August 2, Boston, Massachusetts, USA.
Barlas, Y., & Diker, G. D. (1996a). Decision support for strategic university management: A
dynamic interactive game. Proceedings of the 14th International System Dynamics Conference,
Boston, Massachusetts, USA.
Barlas, Y., & Diker, G. D. (1996b). An interactive dynamic simulation model of a university
management system. Proceedings of the 1996 ACM Symposium on Applied Computing.
Available from http://portal.acm.org/citation.cfm?id=331119.331162
Barlas, Y., & Diker, V. (2000). A Dynamic Simulation Game for Strategic University Management.
Simulation and Gaming, 31(3), 331–358.
Barnabè, F. (2003). La managerializzazione dell’università italiana. Le potenzialità della System
Dynamics [The managerialization of the Italian university: The potential of System Dynamics].
Barnabè, F. (2004). From ivory towers to learning organizations: The role of system dynamics in
the “Managerialisation” of academic institutions. Paper presented at the XXII International
Conference of the System Dynamics Society, Oxford, UK, June 25–29.
Bell, G., Cooper, M., Kennedy, M., & Warwick, J. (2000). The development of the holon planning
and costing framework for higher education management. Proceedings of the 18th System
Dynamics Conference, Bergen, Norway.
Bianchi, C. (2010). Improving performance and fostering accountability in the public sector
through system dynamics modelling: From an ‘external’ to an ‘internal’ perspective. Systems
Research and Behavioral Science, 27(4), 361–384.
Bianchi, C., Winch, G., & Cosenz, F. (2012). Sustainable strategies for small companies compet-
ing against multinational giants. Paper presented at the ACERE Diana Conference,
Fremantle–Perth, Australia.
Bianchi, C. (2012). Enhancing performance management and sustainable organizational growth
through system-dynamics modelling. In S. N. Grösser & R. Zeier (Eds.), Systemic management
for intelligent organizations (pp. 143–161). Springer-Verlag.
Bianchi, C. (2016). Dynamic performance management. Springer.
Bianchi, C., Bereciartua, P., Vignieri, V., & Cohen, A. (2019). Enhancing urban brownfield
regeneration to pursue sustainable community outcomes through dynamic performance gover-
nance. International Journal of Public Administration. https://doi.org/10.1080/01900692.2019.
1669180

Bianchi, C., Cosenz, F., & Marinković, M. (2015). Designing dynamic performance management
systems to foster SME competitiveness according to a sustainable development perspective:
Empirical evidences from a case-study. International Journal of Business Performance Man-
agement, 16(1), 84–108.
Biondi, L., & Cosenz, F. (2017). La misurazione della performance accademica: un’analisi
applicata al “costo standard per studente in corso” [Measuring academic performance: An
analysis applied to the “standard cost per on-track student”]. RIREA, 3, 357–376.
Bisbe, J., & Malagueño, R. (2012). Using strategic performance measurement systems for strategy
formulation: Does it work in dynamic environments? Management Accounting Research, 23(4),
296–311.
Bivona, E., & Montemaggiore, G. B. (2010). Understanding short-and long-term implications of
“myopic” fleet maintenance policies: A system dynamics application to a city bus company.
System Dynamics Review, 26(3), 195–215.
Blumenstyk, B. (2000). A computer game lets you manage the university. The Chronicle of Higher
Education, 46(28), A51. Retrieved September 8, 2002 from the World Wide Web: http://www.
chronicle.com
Budde-Sung, A. E. K. (2011). The increasing internationalization of the international business
classroom: Cultural and generational considerations. Business Horizons, 54(4), 365–373.
Castells, M. (2011). The rise of network society. Wiley-Blackwell.
Coda, V. (2010). Entrepreneurial values and strategic management. Essays in Management theory.
Cosenz, F. (2014). A dynamic viewpoint to design performance management systems in academic
institutions: Theory and practice. International Journal of Public Administration, 37(13),
955–969.
Cosenz, F. (2017). Supporting start-up business model design through system dynamics modelling.
Management Decision, 55(1), 57–80.
Cosenz, F. (2018). Supporting public sector management through simulation-based methods: A
dynamic performance management approach. International Review of Public Administration,
23(1), 20–36.
Cosenz, F., & Bianchi, C. (2013). Improving performance measurement/management in Academic
Institutions: A dynamic resource-based view. Insights from a field project. Paper presented at the
ASPA (American Society of Public Administration) Annual Conference for the Center for
Accountability and Performance (CAP) Symposium, Baltimore (USA), March 12.
Cosenz, F., & Noto, G. (2018a). A dynamic business modelling approach to design and experiment
new business venture strategies. Long Range Planning, 51(1), 127–140.
Cosenz, F., & Noto, G. (2018b). Fostering entrepreneurial learning processes through Dynamic
Start-up business model simulators. International Journal of Management Education, 16(3),
468–482.
Cosenz, F., & Noto, G. (2016). Applying system dynamics modelling to strategic management: A
literature review. Systems Research and Behavioral Science, 33(6), 703–741.
Cotton, J. L., & Stewart, A. (2013). Evaluate your business school’s writing as if your strategy
matters. Business Horizons, 56(3), 323–331.
Davidsen, P. (1991). The structure-behavior graph. System Dynamics Group, Massachusetts
Institute of Technology.
Davidsen, P. (1996). Educational features of the system dynamics approach to modeling and
simulation. Journal of Structural Learning, 12(4), 269–290.
De Boer, H., Huisman, J., Klemperer, A., van der Meulen, B., Neave, G., Theisens, H., & van der
Wende, M. (2002). Academia in the 21st century. An analysis of trends and perspectives in
higher education and research. AWT-Achtergrondstudie 28. The Hague: Adviesraad voor het
Wetenschaps- en Technologiebeleid.
De Geus, A. (1999). The living company: Growth learning and longevity in business. Nicholas
Brealey Publishing.
Deem, R., & Brehony, K. J. (2008). Management as ideology: The case of ‘new managerialism’ in
higher education. Oxford Review of Education, 31(2), 217–235.

Deiaco, E., Hughes, A., & Mckelvey, M. (2012). Universities as strategic actors in the knowledge
economy. Cambridge Journal of Economics, 36, 525–541.
Dekkers, J., & Donatti, S. (1981). The integration of research studies on the use of simulation as an
instructional strategy. Journal of Educational Research, 74, 424–427.
Eftekhar, N., & Strong, D. R. (2005). Dynamic modelling of a learning process. The International
Journal of Engineering Education. Available from http://www.ijee.dit.ie/articles/999995/
article.htm
EPRS. (2014). Digital opportunities for education in the EU. European Parliamentary Research
Service.
Etzkowitz, H., Webster, A., Gebhardt, C., & Cantisano Terra, B. R. (2000). The future of the
university and the university of the future: Evolution of ivory tower to entrepreneurial paradigm.
Research Policy, 29(2), 313–330.
European Commission. (2013). Modernisation of higher education. Publications Office of the
European Union.
Featherston, C. R., & Doolan, M. A. (2012). A critical review of the criticisms of system dynamics.
In Proceedings of the 30th International Conference of the System Dynamics Society, St. Gallen,
Switzerland, 22–26 July 2012.
Fincher, C. (1994). Learning theory and research. In K. A. Feldman & M. B. Paulsen (Eds.),
Teaching and learning in the college classroom. ASHE Reader Series, Ginn Press.
Forrester, J. W. (1958). Industrial dynamics–a major breakthrough for decision makers. Harvard
Business Review, 36(4), 37–66.
Forrester, J. W. (1961). Industrial dynamics. Massachusetts Institute of Technology.
Forrester, J. W. (1974). Educational implications of responses to system dynamics models D-2021.
In J. W. Forrester (Ed.), MIT system dynamics group literature collection DVD. System
Dynamics Society.
Forrester, J. W. (2007a). System dynamics–a personal view of the first fifty years. System Dynamics
Review, 23, 345–358.
Forrester, J. W. (2007b). System dynamics-the next fifty years. System Dynamics Review, 23(2/3),
359–370.
Frances, C. (1995). Using system dynamics technology to improve planning for human resource
development. Presentation at the Congress of Political Economists, International 6th Annual
Conference, Seoul, Korea.
Frances, C. (2000). Using system dynamics as a tool for decision making in higher education
management- U.S. experience. In M. Kennedy (Ed.), Selected papers presented at an interna-
tional seminar on ‘Using System Dynamics as a Tool for Decision Making in Higher Education
Management’, held in June 1999 at the Royal Society. London and London South Bank
University, under the auspices of the Society for Research into Higher Education, London
South Bank University Technical Report SBU-CISM-12-00.
Frances, C., Van Alstyne, M., Ashton, A., & Hochstettler, T. (1994). Using system dynamics
technology to improve planning higher education: Results in Arizona and Houston, Texas. In
Proceedings of the 13th International System Dynamics Conference, Stirling, Scotland,
pp. 444–453.
Francesconi, A., & Guarini, E. (2016). Performance-based funding and internal resource allocation:
The case of Italian universities. In E. Borgonovi, E. Anessi Pessina, & C. Bianchi (Eds.),
Outcome-based performance management in the public sector (pp. 289–306). Springer.
Friedman, S., Cavaleri, S., & Raphael, M. (2007). Individual learning style and systems tool
preferences. Proceedings of the 2007 International Conference of the System Dynamics Society,
July 29–August 2, Boston, Massachusetts, USA.
Friga, P. N., Bettis, R. A., & Sullivan, R. S. (2003). Changes in graduate management education and
new business school strategies for the 21st century. Academy of Management Learning and
Education, 2(3), 233–249.
Galbraith, P.L. (1982). Forecasting futures for higher education in Australia: An application of
dynamic modelling. Ph.D. Thesis, The University of Queensland.

Galbraith, P. L. (1989). Strategies for institutional resource allocation: Insights from a dynamic
model. Higher Education Policy, 2(2), 31–38.
Galbraith, P. L. (1998a). When strategic plans are not enough: Challenges in university manage-
ment. System Dynamics: An International Journal of Policy Modelling, X(1 and 2), 55–84.
Galbraith, P. L. (1998b). System dynamics and university management. System Dynamics Review,
10(2), 69–84.
Galbraith, P. L. (1998c). Are universities learning organisations? In P. K. J. Mohapatra (Ed.),
Systems thinking in management (pp. 70–87). Department of Industrial Engineering and
Management.
Galbraith, P. L. (2010). System dynamics: A lens and scalpel for organisational decision making.
OR Insight, 23(2), 96–123.
Galbraith, P. L., & Carss, B. W. (1989). Strategies for institutional resource allocation: Insights
from a dynamic model. Higher Education Policy, 2(2), 31–36.
Gary, M. S., Kunc, M., Morecroft, J. D. W., & Rockart, S. F. (2008). System dynamics and strategy.
System Dynamics Review, 24, 407–429.
Gerrish, E. (2016). The impact of performance management on performance in public organiza-
tions: A meta-analysis. Public Administration Review, 76(1), 48–66.
Ghaffarzadegan, N., Larson, R., & Hawley, J. (2017). Education as a complex system. Systems
Research and Behavioral Science, 34, 211–215.
Ghaffarzadegan, N., Lyneis, J., & Richardson, G. P. (2011). How small system dynamics models
can help the public policy process. System Dynamics Review, 27(1), 22–44.
Gibbs, P., & Murphy, P. (2009). Implementation of ethical higher education marketing. Tertiary
Education and Management, 15(4), 341–354.
Groesser, S. N., & Jovy, N. (2016). Business model analysis using computational modeling: A
strategy tool for exploration and decision-making. Journal of Management Control, 27(1),
61–88.
Groesser, S. N., & Schwaninger, M. (2012). Contributions to model validation: Hierarchy, process,
and cessation. System Dynamics Review, 28(2), 157–181.
Größler, A. (2007). System dynamics projects that failed to make an impact. System Dynamics
Review, 23(4), 437–452.
Größler, A., Thun, J. H., & Milling, P. (2008). System dynamics as a structural theory in operations
management. Production and Operations Management, 17(3), 373–384.
Guarini, E., Magli, F., & Francesconi, A. (2020). Academic logics in changing performance
measurement systems: An exploration in a university setting. Qualitative Research in Account-
ing & Management, 17(1), 109–142.
Hall, R. I., & Menzies, W. B. (1983). A corporate system model of a sports club: Using simulation
as an aid to policy making in a crisis. Management Science, 29(1), 52–64.
Head, B. W., & Alford, J. (2015). Wicked problems: Implications for public policy and manage-
ment. Administration & Society, 47(6), 711–739.
Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2),
251–261.
Hoecht, A. (2006). Quality assurance in UK higher education: Issues of trust, control, professional
autonomy and accountability. Higher Education, 51(4), 541–563.
James, O., Leth Olsen, A., Moynihan, D., & Van Ryzin, G. (2020). Behavioral public performance:
How people make sense of government metrics. Cambridge University Press.
Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Intercon-
nections, interdependencies and a research agenda. Higher Education, 56(3), 303–324.
Jordan, S. M. (1992). Enrolment demand in Arizona policy choices and social consequences. New
Direction for Institutional Research, no. 76.
Kaplan, R. S., & Norton, D. P. (1992). The balanced scorecard: Measures that drive performance.
Harvard Business Review, 70(1), 71–79.

Karakul, M., & Qudrat-Ullah, H. (2008). How to improve dynamic decision making? Practice and
promise. In H. Qudrat-Ullah, J. M. Spector, & P. Davidsen (Eds.), Complex decision making
(pp. 3–24). Springer.
Kennedy, M. (1998a). A pilot system dynamics model to capture and monitor quality issues in
higher education institutions experiences gained. Proceedings of the 16th System Dynamics
Conference, Quebec City, Canada.
Kennedy, M. (1998b). Some issues in system dynamics model building to support quality monitor-
ing in higher education. Proceedings of the 16th System Dynamics Conference, Quebec City,
Canada.
Kennedy, M. (2011). A taxonomy of system dynamics models of educational policy issues. Pro-
ceedings of the 29th International Conference of the System Dynamics Society, System
Dynamics Society.
Kennedy, M., & Clare, C. (1999). Some issues in building system dynamics model for improving the
resource management process in higher education. Proceedings of the 17th International
System Dynamics Conference, Wellington, New Zealand.
Kersbergen, R. J. V., Daalen, C. E. V., Meze, C. M. C., & Horlings, E. (2016). The impact of career
and funding policies on the academic workforce in the Netherlands: A system dynamics based
promotion chain study. Proceedings of the 34th International Conference of the System Dynam-
ics Society, Delft, the Netherlands.
Keys, P. (1990). System dynamics as a systems-based problem-solving method. Systems Practice,
3(5), 479–493.
Kim, B., & Park, K. (2006). Dynamics of industry consolidation and sustainable competitive
strategy: Is birthright irrevocable? Long Range Planning, 39, 543–566.
Kim, H., & Rehg, M. (2018). Faculty performance and morale in higher education: A systems
approach. Systems Research and Behavioral Science, 35(3), 308–323.
Kim, H., MacDonald, R. H., & Andersen, D. F. (2013). Simulation and managerial decision
making: A double-loop learning framework. Public Administration Review, 73, 291–300.
Kolb, D. (1984). Experiential learning. Prentice Hall.
Küçük, B., Güler, N., & Eskici, B. (2008). A dynamic simulation model of academic publications
and citations. Proceedings of the 26th International Conference of the System Dynamics
Society, Athens, Greece.
Kumar, R., & Vrat, P. (1989). Using computer models in corporate planning. Long Range Planning,
22(2), 114–120.
Linard, K., & Dvorsky, L. (2001). People–not human resources: The system dynamics of human
capital accounting. Proceedings of the Operations Research Society Conference, Bath, Univer-
sity of Bath.
Linard, K., Flemin, C., & Dvorsky, L. (2002). System dynamics as the link between corporate vision
and key performance indicators. Proceedings of the 2002 International System Dynamics
Conference, Palermo, System Dynamic Society.
Lyneis, J. M. (1999). System dynamics for business strategy: A phased approach. System Dynamics
Review, 15(1), 37–70.
Maier, F. H., & Größler, A. (2000). What are we talking about? A taxonomy of computer
simulations to support learning. System Dynamics Review, 16(2), 135–148.
Martins, H. (2017). Strategic organizational learning–using system dynamics for innovation and
sustained performance. The Learning Organization, 24(3), 198–200.
McHaney, R. (2011). The new digital shoreline: How Web 2.0 and millennials are revolutionizing
higher education. Stylus Publishing, LLC.
McKeachie, W. J. (1990). Research on college teaching: The historical background. Journal of
Educational Psychology, 82(2), 189–200.
Meadows, D. L. (1999). Learning to be simple: My odyssey with games. Simulation & Gaming,
30(3), 342–351.
Meadows, D. H., Meadows, D. L., Randers, J., & Behrens, W. W., III. (1972). Limits to growth.
Universe Books.

Meadows, D. L. (1980). The unavoidable a priori. In J. Randers (Ed.), Elements of the system
dynamics method (pp. 161–240). Waltham, MA.
Melnyk, S. A., Bititci, U., Platts, K., Tobias, J., & Andersen, B. (2013). Is performance measure-
ment and management fit for the future? Management Accounting Research, 25, 173–186.
Micheli, P., & Mari, L. (2014). The theory and practice of performance measurement. Management
Accounting Research, 25, 147–156.
Mio, C. (2013). Towards a sustainable university: The Ca’ Foscari experience. Palgrave
Macmillan.
Morecroft, J. D. W. (1984). Strategy support models. Strategic Management Journal, 5(3),
215–229.
Morecroft, J. D. W. (1985). Rationality in the analysis of behavioral simulation models. Manage-
ment Science, 31(7), 900–916.
Morecroft, J. D. W. (1988). System dynamics and microworlds for policymakers. European
Journal of Operational Research, 35, 301–320.
Morecroft, J.D.W. (1997). The rise and fall of people express: A dynamic resource-based view.
Proceedings of the 1997 International System Dynamics Conference, Istanbul, System Dynamic
Society.
Morecroft, J. D. W. (2007). Strategic modeling and business dynamics. Wiley.
Morecroft, J. D. W. (2013). Modelling and simulating firm performance with system dynamics. In
J. Gotze & A. Jensen-Waud (Eds.), Beyond Alignment (pp. 453–475). College Publications.
Morecroft, J. D. W., & Sterman, J. D. (1992). Modelling for learning. European Journal of
Operations Research, 59.
Narchal, R. M. (1988). A simulation model for corporate planning in a steel plant. European
Journal of Operational Research, 34, 282–296.
Newell, B. (2012). Simple models, powerful ideas: Towards effective integrative practice. Global
Environmental Change, 22, 776–783.
Newell, B., & Proust, K. (2009). I see how you think: Using influence diagrams to support dialogue.
In Working Paper, The Fenner School of Environment and Society, College of Medicine,
Biology and Environment. The Australian National University.
Nodenof, T., Latorcade, P., Marquesuzaa, C., & Sallaberry, C. (2004). Model based engineering of
learning for adaptive web based educational systems. Proceedings of the 13th international
World Wide Web conference. Available from http://0portal.acm.org.lispac.lsbu.ac.uk/citation.
cfm?id=1013385&coll=Portal&dl=GUIDE&CFID=39697420&CFTOKEN=40723049
Noordegraaf, M. (2015). Public management: Performance, professionalism and politics. Palgrave
Macmillan.
Norreklit, H. (2000). The balanced scorecard. A critical analysis of some of its assumptions.
Management Accounting Research, 11(1), 65–88.
Noto, G., & Cosenz, F. (2021). Introducing a strategic perspective in lean thinking applications
through system dynamics modelling: The dynamic value stream map. Business Process Man-
agement Journal, 27(1), 306–327.
Onsel, N., & Barlas, Y. (2011). Modeling the dynamics of academic publications and citations.
Proceedings of the 29th International Conference of the System Dynamics Society,
Washington, DC.
Osborne, S. (2002). Public management: A critical perspective. Routledge.
Oyo, B., Williams, D., & Barendsen, E. (2008). A system dynamics tool for higher education
funding and quality policy analysis. Proceeding of the 26th International Conference of the
System Dynamics Society, Athens, Greece.
Parker, L. (2011). University corporatisation: Driving redefinition. Critical Perspectives on
Accounting, 22(4), 434–450.
Perez Salazar, G., Scheel, C., & Martinez Medina, M. (2007). Achievements teaching systems
thinking and systems dynamics to graduate students through e-learning. Conference Proceed-
ings of the 2007 International Conference of the System Dynamics Society, July 29–August
2, Boston, Massachusetts, USA.

Pilonato, S., & Monfardini, P. (2020). Performance measurement systems in higher education: How
levers of control reveal the ambiguities of reforms. The British Accounting Review, 52, 100908.
Potash, P., & Heinbokel, J. (2005). Unleashing the revolutionary implications of a system dynamics
education. Paper presented at the 23rd International Conference of the System Dynamics
Society, July 17–21, Boston MA, USA.
Pucciarelli, F., & Kaplan, A. (2016). Competition and strategy in higher education: Managing
complexity and uncertainty. Business Horizons, 59(3), 311–320.
Qi, J., Li, L., & Ai, H. (2009). A system dynamics approach to competitive strategy in mobile
telecommunication industry. Systems Research and Behavioral Science, 26, 155–168.
Qudrat-Ullah, H. (2014). Yes we can: Improving performance in dynamic tasks. Decision Support
Systems, 61(1), 23–33.
Rajala, T., Laihonen, H., & Haapala, P. (2018). Why is dialogue on performance challenging in the
public sector? Measuring Business Excellence, 22(2), 117–129.
Richardson, G. P. (1999). System dynamics. In S. Gass & C. Harris (Eds.), Encyclopedia of
operations research and management science. Kluwer Academic Publishers.
Richardson, G.P., & Andersen, D.F. (1979). Combining teaching and research in system dynamics.
Dynamica, Summer 1979.
Richardson, G. P., & Andersen, D. F. (1980). Toward a pedagogy of system dynamics. System
Dynamics, TIMS Studies in the Management Sciences, 14, 91–106.
Richardson, G. P., & Pugh, A. I. (1981). Introduction to system dynamics modeling with DYNAMO.
Productivity Press.
Roberts, N. H. (1978). System simulation of student performance in the elementary classroom. In
E. B. Roberts (Ed.), Management applications of system dynamics. Productivity Press.
Rouwette, E. A. J. A., Korzilius, H., Vennix, J., & Jacobs, E. (2011). Modeling as persuasion: The
impact of group model building on attitudes and behavior. System Dynamics Review, 27(1),
1–21.
Runge, D. (1977). A note on teaching system dynamics D-2653. In J. W. Forrester (Ed.), (2004)
MIT system dynamics group literature collection DVD, system dynamics society. Albany.
Saeed, K. (1990). Bringing practicum to learning: A system dynamics modelling approach. Higher
Education Policy, 3(2).
Saeed, K. (1993). Bringing experimental learning to the social sciences: A simulation laboratory on
economic development. System Dynamics Review, 9(2), 153–164.
Saeed, K. (1996). The dynamics of collegial systems in the developing countries. Higher Education
Policy, 9(2), 75–86.
Saeed, K. (1997). System dynamics as a technology for new liberal education. Worcester Poly-
technic Institute, Report No. 3, Worcester, Mass. USA.
Saeed, K. (1998). Maintaining professional competence in innovation organisations. Human
Systems Management, 17, 69–87.
Santos, S. P., Belton, V., & Howick, S. (2002). Adding value to performance measurement by using
system dynamics and multicriteria analysis. International Journal of Operations and Produc-
tion Management, 22(11), 1246–1272.
Saravanamuthu, K., & Tinker, T. (2002). The University in the new corporate world. Critical
Perspective on Accounting, 13(5/6), 545–554.
Sawyer, B. (2002). Serious games: Improving public policy through game-based learning and
simulation. Foresight and Governance Project, Woodrow Wilson International Center for
Scholars, Publication 2002–2001. http://wwics.si.edu/subsites/game/index.htm
Schneider Fuhrmann, B., & Grasha, A. F. (1994a). The past, present, and future in college teaching:
Where does your teaching fit? In K. A. Feldman & M. B. Paulsen (Eds.), Teaching and learning
in the college classroom. ASHE Reader Series, Ginn Press.
Schneider Fuhrmann, B., & Grasha, A. F. (1994b). Toward a definition of effective teaching. In
K. A. Feldman & M. B. Paulsen (Eds.), Teaching and learning in the college classroom. ASHE
Reader Series, Ginn Press.

Schofield, C., Cotton, D., Gresty, K., Kneale, P., & Winter, J. (2013). Higher education provision in
a crowded marketplace. Journal of Higher Education Policy and Management, 35(2), 193–205.
Senge, P. (1990). The fifth discipline: The art & practice of the learning organization. Doubleday.
Senge, P. M. (1988). New system dynamics learning tools for management education and training
D-3999. In J. W. Forrester (Ed.), (2004) MIT system dynamics group literature collection DVD,
system dynamics society. Albany.
Shaffer, W. A. (1976). An initial concept for organizing system dynamics curriculum D-2438. In
J. W. Forrester (Ed.), (2004) MIT system dynamics group literature collection DVD, system
dynamics society. Albany.
Sloper, P., Linard, K., & Paterson, D. (1999). Towards a dynamic feedback framework for public
sector performance management. In Proceedings of the 1999 International System Dynamics
Conference. System Dynamic Society.
Sterman, J. D. (1992, October). Teaching takes off. OR/MS Today.
Sterman, J. D. (2010). Does formal system dynamics training improve people’s understanding of
accumulation? System Dynamics Review, 26(4), 316–334.
Sterman, J. D. (2000). Business dynamics: Systems thinking and modeling for a complex world.
McGraw-Hill.
Sterman, J. D. (2002). All models are wrong: Reflections on becoming a systems scientist. System
Dynamics Review, 18(4), 501–531.
Szelest, B. P. (2003). A system dynamics assessment of organization strategic goal realization:
Case study of a public research university (dissertation). State University of New York at
Albany.
Temple, P., & Shattock, M. (2007). What does branding mean in higher education? In B. Stensaker
& V. d’Andrea (Eds.), Branding in higher education: Exploring an emerging phenomenon
(pp. 73–82). EAIR.
Teixeira, P. N., Veiga, A., Pires da Rosa, M. J. M., & Magalhães, A. (2019). Under pressure: Higher
education institutions coping with multiple challenges. Brill.
Ter Bogt, H. J., & Scapens, R. W. (2012). Performance management in universities: Effects of the
transition to more quantitative measurement systems. European Accounting Review, 21(3),
451–497.
Torres, J. P., Kunc, M., & O'Brien, F. (2017). Supporting strategy using system dynamics.
European Journal of Operational Research, 260, 1081–1094.
Trailer, J. (2012). Strategic planning model & tools for a state university college. Proceedings of the
International Conference of the System Dynamics Society. St. Gallen, Switzerland.
Vahdatzad, M. A., & Mojtahedzadeh, M. T. (2000). Some issues in the strategic management in a
fast-growing academic institution: The case of the University of Yazd. Proceedings of the 18th
International Conference of the System Dynamics Society, 6–10 August 2000, Bergen,
Norway.
Välimaa, J., & Hoffman, D. (2008). Knowledge society discourse and higher education. Higher
Education, 56, 265–285.
Vennix, J. A. M. (1996). Group model building: Facilitating team learning using system dynamics.
Wiley.
Virtual University. (2005a). Virtual university. http://virtual-u.org
Virtual University. (2005b). Game simulations for educational leadership & visualization: Virtual U
and beyond. http://virtual-u.org/conference-invite.asp
Warren, K. (2002). Competitive strategy dynamics. Wiley.
Warren, K. (2008). Strategic management dynamics. Wiley.
Wolstenholme, E. (1999). Qualitative vs. quantitative modelling: The evolving balance. Journal of
the Operational Research Society, 50(4), 422–428.
Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Sage.
Zaini, R. M., Pavlov, O. V., Saeed, K., Radzicki, M. J., Hoffman, A. H., & Tichenor, K. R. (2016).
Let’s talk change in a university: A simple model for addressing a complex agenda. Systems
Research and Behavioral Science, 34(3), 250–266.
Chapter 4
University’s “Third Mission” Assessment
Through Outcome-Based Dynamic
Performance Management

4.1 Introduction

HEIs are currently facing major challenges as their role in society changes
constantly. To adequately take on these challenges and benefit from possible oppor-
tunities related to such a changing role, innovative and more comprehensive
approaches to University management are required (Bianchi & Caperchione,
2022). As previously described, managing academic institutions in the contemporary
global context is a complex task that requires adopting effective PM mechanisms to
fulfill the basic missions (i.e., Education and Research) such organizations exist for.
In this regard, the previous chapters focused on a micro viewpoint aimed to explore
and discuss how PM tools can comply with the organizational features of HEIs. They
also investigated the role played by SD modeling in offering methodological support
to design, implement, and use such tools through a DPM perspective (Bianchi, 2016;
Cosenz, 2014). Research on University management has traditionally adopted an
internally focused viewpoint to explore and experiment with how PM tools in HEIs
enable academic decision-makers to frame and assess value generation processes
related to Education, Research, and their underlying administrative operations. In
this way, they facilitate the pursuit of strategies and action plans oriented to primarily
improve the organizational performance of Universities (Broadbent, 2007; Cave
et al., 1997; Angiola et al., 2018; Miller, 2007; Bianchi & Cosenz, 2013; Cosenz,
2014; Guthrie & Neumann, 2007). This chapter is an enhanced and extended version
of Cosenz (2022).
In recent years, building on the “Entrepreneurial University” model (Etzkowitz
et al., 2000; Gibb & Hannon, 2006; Gulbrandsen & Slipersaeter, 2007; Guerrero
et al., 2016; Thorp & Goldstein, 2013), the literature on Higher Education policy and
management identified and debated a third crucial role played by Universities in the
contemporary socio-economic context. It complements knowledge transfer and
development (i.e., Education and Research) and further emphasizes the
entrepreneurial vocation of these institutions (Ricci et al., 2019; Lombardi et al.,
2019). This role acknowledges HEIs as fundamental engines for the socio-economic
growth and innovation of a given local/regional area where the so-called knowledge
economy¹ characterizes the developmental conditions of its economic sectors.

¹ Powell and Snellman (2004, p. 199) defined the “knowledge economy” as “production and
services based on knowledge-intensive activities that contribute to an accelerated pace of technical
and scientific advance, as well as rapid obsolescence. The key component of a knowledge economy
is a greater reliance on intellectual capabilities than on physical inputs or natural resources.”
In this context, by collaborating with other local/regional players into policy
networks (Rhodes, 1990, 2017), HEIs are gradually taking a proactive role in
long-term value generation processes through the commercialization of knowledge
and engagement in entrepreneurial activities (Urbano & Guerrero, 2013). Such a
role—nowadays widely known as Third Mission (or Third Stream/Function/Role/
Leg)—also found an institutional endorsement by both the European Commission
and OECD (2012, p. 1), whose guiding framework declares: “higher education is
facing unprecedented challenges in the definition of its purpose, role, organization,
and scope in society and the economy. The information and communication tech-
nology revolution, the emergence of the knowledge economy, the turbulence of the
economy and consequent funding conditions have all thrown new light and new
demands on higher education systems across the world.” As such, pursuing the
Third Mission poses additional challenges, opportunities, and threats for Universi-
ties which are now called to foster partnerships, networking, and collaborations with
other local and regional players, as well as sustainability and economic and social
engagement (Laredo, 2007; Gulbrandsen & Slipersaeter, 2007; Trencher et al.,
2014; Perkmann et al., 2013; Rippa & Secundo, 2019).
In terms of PM system design, the Third Mission of Universities implies adopting
a macro perspective for assessing the long-term results generated to enhance the
socio-economic development of the surrounding environment (Bianchi, 2010). This
endeavor must consider the collaborative governance settings and associated per-
formance emerging from the aggregated contribution of the single academic insti-
tution and its non-academic partners (e.g., public, private, and nonprofit
organizations, utilities, agencies, local authorities, civil participants, etc.) involved
in specific value creation processes (Douglas & Ansell, 2020; Ansell & Gash, 2008;
Bianchi et al., 2019; Bianchi & Vignieri, 2020; Mazza et al., 2006). Nevertheless, the
design of PM systems according to such a macro perspective is
hampered by the greater complexity of a longer and fragmented value creation
chain where multiple actors—entailing additional performance variables—intervene
at both political/strategic and operational levels (Moynihan et al., 2011;
Noordegraaf, 2015; Cepiku et al., 2012).
Third Mission activities are carried out through inter-institutional coordination,
which focuses on the strategic and operational interdependences between the Uni-
versity and its non-academic partners. This coordination mechanism generates
long-term results aimed at fostering the socio-economic evolution of the local area
and its community living standards (Cosenz, 2022; Bianchi et al., 2019; Xavier &
Bianchi, 2019; Bianchi & Vignieri, 2020; Rasmussen & Borch, 2010; EENEE,
2014). Particularly, managing those value creation processes operationalized
through inter-organizational coordination requires exploring the collaborative gov-
ernance mechanisms—formally or informally—established by the University with
its network partners (Choi & Moynihan, 2019; Aversano et al., 2018). As argued by
Bianchi et al. (2019), this perspective differs from the evaluation of the organiza-
tional performance focused on the outputs intended as short-term results exclusively
produced by a single HEI (e.g., graduations, enrollments, publications). Collabora-
tive governance performance is measured through outcome indicators conceived as
long-term results—or impacts—generated by the aggregated contribution of multi-
ple organizations, including the University (e.g., academic partnerships, lower
unemployment rate, academic spin-offs).
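To fix this distinction in operational terms, the fragment below offers a minimal sketch. It is purely illustrative: the class names, fields, and example figures are assumptions, not an established measurement scheme from the literature. It encodes the idea that output indicators are attributable to a single HEI, whereas outcome indicators explicitly carry the set of network partners whose aggregated contribution produces them.

```python
# Illustrative sketch only: one way to encode the output/outcome distinction
# drawn in the text. Names, fields, and figures are hypothetical.
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class OutputIndicator:
    """Short-term result exclusively produced by a single HEI."""
    name: str
    value: float

@dataclass(frozen=True)
class OutcomeIndicator:
    """Long-term impact generated by the aggregated contribution of a network."""
    name: str
    value: float
    contributors: Tuple[str, ...]  # organizations co-producing the result

outputs = [
    OutputIndicator("graduations per year", 1250),
    OutputIndicator("publications per year", 430),
]
outcomes = [
    OutcomeIndicator("local unemployment rate (%)", 7.8,
                     ("university", "municipality", "industry partners")),
    OutcomeIndicator("active academic spin-offs", 12,
                     ("university", "incubator", "regional agency")),
]
```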
With the intent to fuel the scientific debate on University Third Mission and those
methodological approaches to evaluate its outcomes, this chapter aims to explore
and discuss how an outcome-based DPM may support collaborative governance
settings (Xavier & Bianchi, 2019; Bianchi et al., 2017, 2019; Bianchi, 2016). Such
support focuses on managing their inter-organizational coordination and emerging
performance from the HEI perspective. The principles of New Public Governance
(NPG) have inspired this outcome-based approach (Osborne, 2006, 2010; Douglas &
Ansell, 2020; Broucker et al., 2017).
This approach is oriented to broaden the scope of traditional PM systems applied
to Universities by focusing on the sustainable community outcomes generated by
Third Mission activities, thereby exploring how HEIs and their local/regional part-
ners collaborate and implement network policies. To this end, the chapter initially
provides an overview of Third Mission definitions and the methods proposed by the
literature for evaluating associated results. Then, Third Mission evaluation practices
and initiatives in the European Union and OECD countries are reported and debated.
Subsequently, drawing on the insights and shortcomings emerging from the litera-
ture review, the outcome-based DPM approach is described and applied—through
an example—to Third Mission activities characterizing academic network perfor-
mance, policy design settings, and inter-organizational value generation processes.
Finally, the chapter summarizes the main findings related to the use of DPM in
evaluating Third Mission activities and concludes with future research perspectives.

4.2 Defining Third Mission Activities in Higher Education Institutions

In recent years, the definition of the University’s Third Mission has attracted the
interest of many scholars, who have proposed multiple notions of the multifaceted
and complex activities carried out by HEIs under this moniker.
every attempt to define these activities converges toward the ultimate impact they
should generate on the surrounding socio-economic context. However, Third Mission
remains an ambiguous concept as it depends on the structure of academic
activities, its role in its geographical area (and associated socio-economic condi-
tions), and the country’s institutional framework (Laredo, 2007). In some countries,
Third Mission activities are widely recognized. They are an integral part of the
ministerial evaluation mechanism to distribute public funding to national HEIs (e.g.,
Finland, Sweden, UK, Spain, USA, Canada, Australia). In others, these activities are
partially included—or even not yet identified—in the broader mission of
Universities.
Pinto et al. (2016, p. 315) argue that the Third Mission “refers to an additional
function of the universities in the context of knowledge society. The university is not
only responsible for qualifying the human capital (Education—the first mission) and
for producing new knowledge (Research—the second mission). Universities must
engage with societal needs and market demands by linking the university’s activity
with its own socio-economic context. Today universities develop their strategies
around these three missions. Academics debate negative effects and the effective
integration of these missions in a coherent institutional framework. Governments
develop third mission policies allocating funding to this role while policy-makers
and experts are implementing specific indicators.”
In line with the above statement, two main approaches for defining the
University’s Third Mission have been proposed. The first one is the “Triple Helix”
model, which revolves around the University-Industry-Government interdepen-
dencies intended as the key actors affecting the level of knowledge within a given
socio-economic context (Etzkowitz, 2008). Figure 4.1 portrays the Triple Helix
framework.

Fig. 4.1 The “Triple Helix” framework (Etzkowitz, 2008)
The second approach is suggested by Molas-Gallart et al. (2002, p. 2), who define
the Third Mission as “all activities concerned with the generation, use, application
and exploitation of knowledge and other university capabilities outside academic
environments.”

Building on the Triple Helix model and the rise of the knowledge-based society
paradigm, approximately since 2000, the Third Mission theory received greater
attention from intellectual capital (IC) scholars. They developed an evolutionary
process made up of consequential stage perspectives in IC theory advance related to
the role of Universities in today’s society (Secundo et al., 2019; Secundo & Elia,
2014; Esposito et al., 2013). At the end of this process, the fourth-stage IC perspec-
tive incorporates knowledge development and sharing into an ecosystem (national,
regional, or local). Thus, it encourages HEIs to create and strengthen relationships
with their local communities with the intent to contribute to the enhancement of their
socio-economic development through a knowledge sharing-based paradigm
(Secundo et al., 2015; Dumay & Garanina, 2013; Sanchez & Elena, 2006; Paoloni
et al., 2020; Frondizi et al., 2019).
Since its conceptualization (Leydesdorff & Etzkowitz, 1996), the Triple Helix
model also experienced significant innovations that enlarged the set of stakeholders
intervening—at different levels—in the generation of social and economic
value (Crosby et al., 2017; Miller et al., 2014; Amaral & Magalhães, 2002). Together
with the original helixes (i.e., University-Industry-Government), a Quadruple Helix
model included “media-based and culture-based public” and the “civil society”
(Carayannis & Campbell, 2012; Miller et al., 2016; McAdam & Debackere, 2018),
thereby acknowledging the relevance of value coproduction (or co-creation) pro-
cesses in the knowledge society (Bovaird, 2007; Alford, 2009; Pestoff & Brandsen,
2008; Osborne et al., 2016; Sicilia et al., 2016; McAdam et al., 2017).
Aiming to embrace a focus on sustainability issues, Carayannis et al. (2012) also
added a fifth helix corresponding to the “natural environments of society.” These
helixes should support a systemic view in the design of those policies oriented to
enhance the effectiveness of regional innovation processes conducted by the Uni-
versity with its non-academic partners, thus operationalizing the Third Mission
approach at all institutional levels (Frondizi et al., 2019).
With the intent to shed light on the new role played by Universities in the
knowledge society, Frondizi et al. (2019) have recently conducted a literature review
exploring the multiple definitions of this novel academic mission. Table 4.1 sum-
marizes the main results that emerged from this review.
Based on the above background, examples of Third Mission activities include
University-Industry-Government relations, technology transfer, academic entrepre-
neurship, knowledge commercialization, collaborative research, and academic con-
sulting. Each of the above examples implies the collaboration and coordination
between the University and other non-academic stakeholders for generating value
(Manes Rossi et al., 2018; Guerrero et al., 2019).
Table 4.1 Main definitions of the new role played by HEIs (Frondizi et al., 2019, p. 7)

• Molas-Gallart et al. (2002). Source: Science and Technology Policy Research. Concept: Third
  Stream; Third Leg. Definition: Third Stream/Leg activities “are concerned with the generation,
  use, application and exploitation of knowledge and other university capabilities outside
  academic environments” (p. iv).
• Gunasekara (2006). Source: Journal of Technology Transfer. Concept: Third Role; University
  engagement. Definition: “Third role [is] performed by universities in animating regional
  economic and social development” (p. 102). “The universities engagement approach points to a
  developmental role performed by universities in regional economic and social development
  that centres on the intersection of learning economies and the regionalisation of production and
  regulation” (p. 103).
• Pilbeam (2006). Source: Journal of Higher Education Policy and Management. Concept: Third
  stream income. Definition: “Revenues from the commercial exploitation of university
  intellectual assets (third stream income)” (p. 297).
• Business/Higher Education Round Table (2006). Source: –. Concept: Community Engagement;
  Third mission. Definition: “Communities engagement has a broad vista that extends beyond
  business and economic aspects. Universities have a wider view of engagement which includes
  social, economic, environmental and cultural dimensions of capacity building” (p. 3). “Third
  Mission activities of universities seek to generate, apply and use knowledge and other
  university capabilities outside academic environments” (p. 4).
• HEFCE (2008). Source: –. Concept: Third Stream. Definition: “Third Stream refers to work to
  increase the impact of higher education on economic development and the strength and vitality
  of society as a third stream of activity alongside, and complementary to, teaching and research”
  (p. 26).
• Webber and Jones (2011). Source: Journal of Higher Education Policy and Management.
  Concept: Third constituent of higher education. Definition: “Third constituent of higher
  education can be described as consisting of universities’ relations with and contributions to
  other sectors of society” (p. 17).
• Bornmann (2013). Source: Journal of the American Society for Information Science and
  Technology. Concept: Societal impact of research. Definition: “Societal impact of research is
  concerned with the assessment of social, cultural, environmental, and economic returns
  (impact and effects) from results (research output) or products (research outcome) of publicly
  funded research” (p. 217).
• Sánchez-Barrioluengo (2014). Source: Research Policy. Concept: Social and Business
  Engagement. Definition: “Social and Business Engagement is seen as reflecting the changing
  nature of scientific knowledge and the natural tendency for academia to adapt in response to
  social changes” (p. 2).
• Watson and Hall (2015). Source: International Journal of Academic Research in Management.
  Concept: Third Stream. Definition: “Third Stream agenda is a critical strategy in the pursuit of
  enriched learning, enhancing student employability and much needed revenues” (p. 48).
• Guerrero et al. (2015). Source: Research Policy. Concept: Third Mission. Definition: “The
  entrepreneurial university serves as a conduit of spillovers contributing to economic and social
  development through its multiple missions of teaching, research, and entrepreneurial
  activities” (p. 748).

In recent years, alongside the institutionalization of Third Mission activities
strengthening the entrepreneurial orientation of Universities within the knowledge-
based society, the need for measuring the value generated by carrying out these
activities strongly emerged in many countries (Olcay & Bulu, 2017; Barnabè &
Riccaboni, 2007). Spanning the boundaries of the University organizational setting
toward the analysis of a broader value creation ecosystem—wherein multiple
institutions interact and collaborate for generating outcomes at a local and regional
level—implies additional challenges and complexities in terms of performance
management, policy design, and implementation. In this regard, several attempts
to propose methodological approaches, tools, and models for evaluating Third
Mission have been undertaken by HE policy and management scholars and supra-
national bodies, national Ministries of Research and Education, and specific aca-
demic institutions. The following section offers an overview of the main practices,
methodological proposals, and ongoing experiences reported in different contexts
(e.g., European Union, OECD countries, etc.), thus further framing state of the art
related to Third Mission evaluation.

4.3 The Evaluation of University Third Mission: Conceptual Frameworks,
Adopted Practices, and Ongoing Experiences²

² Sections 4.3, 4.3.1, and 4.3.2 originate from a joint research activity conducted within the AIDEA
(Italian Academy of Management) work group by Luca Brusati and Silvia Iacuzzi (University
of Udine, Italy) and Carmine Bianchi and Federico Cosenz (University of Palermo, Italy).

During the past 20 years, two main factors contributed to questioning and redefining
the governance model of HEIs on a global scale. They are (1) the emergence of the
“knowledge society” and (2) the adoption of international ranking systems intro-
ducing greater competitiveness among HEIs at both national and international levels.
On the one hand, focusing on the “knowledge-based society” has become a source
for obtaining political, social, and economic power in the era of globalization, thus
identifying the HE sector as “the engine of development in the new world economy”
(Castells, 1994, p. 14). The promotion of a “knowledge-based society” is considered
crucial in the government strategy for growth, innovation, and economic develop-
ment, as proved by international research and studies which show a strong correla-
tion between education and training, on the one hand, and the pursuit of economic
growth and social cohesion, on the other (EU Commission, 2003; OECD, 2009).
Enhancing the capability of HEIs to contribute to socio-economic development
requires a greater level of transparency and accountability. It allows different
stakeholders—be they students, other public institutions, businesses, local authori-
ties, or nongovernmental bodies—to deepen their knowledge about Third Mission
activities. As a result, they can better collaborate in designing participatory action
plans for coproducing value in the socio-economic context (Cooke, 2002; McAdam
et al., 2016; Nicolò et al., 2020). This call for greater transparency and accountability
about Third Mission outcomes is also stressed by the emergence of global evaluation
systems in the leading North American, European, and Asian Universities. These
systems anchor the perceived attractiveness of a country to the ability of its HEIs to
attract talents and develop new knowledge (Hazelkorn, 2012).
The growing spread of worldwide academic ranking systems has led to the publication
of Anglo-American supranational rankings—such as the Times HE World University
Ranking—or Asian rankings, such as the Shanghai Jiao Tong Academic Ranking of
World Universities. In particular, in the North American system, Third Mission
activities are divided into two dimensions: (1) those linked to technological innova-
tion and (2) those related to engagement in society. As for the first dimension, the
evaluation of Third Mission activities is annually carried out by the Association of
University Technology Managers (AUTM). In this case, the aim is to check the
number of patents and licenses issued to American and Canadian Universities as a
criterion for allocating specific funds for research by philanthropic bodies and
private organizations.
In addition, in the USA, the Technology Licensing Office and some Universities
regularly collect complementary data to understand the entrepreneurial activities
related to those patents and licenses issued to HEIs. In Canada, the Association of
Universities and Colleges of Canada regularly publishes a Survey of Intellectual
Property Commercialization in the Higher Education Sector, which describes the
number and type of patents issued to Universities and explores topics related to
intellectual property management.
Regarding the activities of social engagement, the reference point in the USA is
the Carnegie classification system, which in 2005 introduced the Community
Engagement Classification, enabling American Universities to document their
efforts in terms of socio-economic impact on the community. Beginning in
1970, the Carnegie Commission for Higher Education developed a classification of
US Colleges and Universities based on Education and Research performance met-
rics. The first Carnegie classification was published in 1973 using empirical data on
Colleges and Universities. It was subsequently updated in 1976, 1987, 1994, 2000,
2005, and 2010, and ultimately in 2015 when the responsibility for drawing up the
Carnegie classification was transferred from the Carnegie Commission to the Center
for Postsecondary Research of the Indiana University School of Education. This
classification has been widely used in the evaluation of HE to assess Colleges and
Universities and design research projects oriented to ensure adequate representative-
ness of samples of HEIs, or their students or academic staff. Since 2005, the
classification includes an evaluation of the “community engagement” defined as
“the collaboration between institutions of higher education and their larger com-
munities (local, regional/state, national, global) for the mutually beneficial
exchange of knowledge and resources in a context of partnership and reciprocity”
(Driscoll, 2008, p. 38). While the original Carnegie classification covers all HEIs, the
“community engagement” classification is drawn up voluntarily and managed by the
New England Resource Center for Higher Education. Its last classification dates
back to 2015 and incorporates the evaluation of 240 Colleges and Universities.
The Australian system followed the model adopted in the UK to evaluate Third
Mission activities. Since 2005, the Australian University Community Engagement
Alliance (AUCEA) has conducted field projects to develop indicators for supporting
the Australian University Quality Agency (AUQA) in assessing the performance of
Universities and rewarding them accordingly. The AUCEA approach focuses on
evaluating how much the activities carried out by HEIs contribute to achieving local,
regional, and national policy objectives. Nevertheless, the project is still at an early
developmental stage due to the difficulties encountered in articulating an accurate
and precise definition of Third Mission activities and the insufficient support pro-
vided by the central government for these initiatives.
The following subsections illustrate a review of the main practices and methods
for evaluating Third Mission activities, focusing on the European Union and OECD
experiences.

4.3.1 Third Mission Evaluation Practices in the European Union

Over the past 20 years, the European Union (EU) has promoted the “knowledge-
based society” as a guiding paradigm for its socio-economic development but, at the
same time, has become aware of the shortcomings in terms of innovation and social
impact of European HE sectors. Too few European Universities appear at the
top of international rankings, and often not even among the top 500 (Hazelkorn,
2012, p. 340). In its recommendations issued in September 2005 (Mazza et al.,
2006), the Council of Europe underlined that HE systems cannot reach satisfactory
quality levels without the involvement of all stakeholders and without pursuing the
dictates of reforms such as the “Bologna Process.” Since 2000, this reform has
highlighted the need for greater convergence between European national systems
and proposed creating a coherent HE system at the European level capable of
competing worldwide. It also ensured the free movement of students and academic
staff, whose skills are acknowledged all around Europe (Reichert, 2009).
The EU aimed to become one of the most competitive and dynamic economies in
the world by 2010. Still, in 2011 the European Commission itself recognized that
“the potential of European higher education institutions to fulfil their role in society
and contribute to Europe's prosperity remains underexploited; Europe is no longer
setting the pace in the global race for knowledge and talent, while emerging
economies are rapidly increasing their investment in higher education” (European
Commission, 2011, p. 2). In the following years, the improvement of transparency,
performance, and competitiveness has been recognized by the European Commis-
sion as integral not only to the success of the European HE Area (EHEA)
and the European Research Area (ERA) but also to achieving the objectives of both
the Lisbon Strategy and the Europe 2020 Strategy (Hazelkorn, 2012).
In response to global pressures calling for innovation and for aligning its HE
system with more advanced ones (e.g., USA, UK, Australia), the EU attempted to
fill the gaps in the HE sectors of the member countries by promoting greater
transparency, accountability, and comparability of Education, Research, and Third
Mission activities. A key challenge in pursuing this improvement is
creating a system to evaluate and compare results among European HEIs. Such a
system should be easy to understand, harmonious, and not anchored to specific
domestic logics but consistent with European and extra-European experiences, thus
encouraging the attraction of both talent and investment from an international
ecosystem (Ricci et al., 2019). Therefore, since 2000 the EU has promoted the devel-
opment of supranational evaluation methodologies—alternative to the North Amer-
ican and Asian systems—to enhance transparency and competitiveness and to review
the conceptual frameworks and methods of international rankings. In particular, within the
Fifth Framework Program (1998–2002), the EU started and sponsored the following
projects:
• The Proton project (http://cordis.europa.eu/project/rcn/71628_en.html), aimed at
developing the economic and commercial potential of publicly funded research by
improving the professional skills of scholars, which resulted in the creation of
ASTP-Proton (http://www.astp-proton.eu/), the European association of experts in
“knowledge transfer” between University and industry.
• The project promoted by the Austrian Ministry of Economy and Labor, oriented to
evaluating the relationships between industry and science in terms of R&D, staff
mobility, training, and marketing of research products (Polt et al., 2001).
In the following years, the EU began to consider the Third Mission as activities
beyond partnerships with industry. In 2003, the Prime-OEU project sponsored a
preliminary assessment of all University activities (Schoen et al., 2007). Subse-
quently, in 2005 a mapping process of academic activities—named U-Map pro-
ject—was also funded (Van Vught, 2009). In 2009, a new consortium developed a
multidimensional ranking based on the results of the U-Map project, while the
development of the U-Multirank ranking was sponsored in 2011. The latter—
officially launched in 2014—supports students in choosing their future HEI by
comparing different Universities in terms of educational offer, research fields, and
scholarship availability. During the same years, the AUBR project (Assessing
Europe’s University-Based Research) started with the intent to propose a
multidimensional approach to research evaluation (AUBR, 2010), alongside the
E3M project (European Third Mission), aimed at identifying, measuring, and
comparing Third Mission activities according to a shared assessment framework.

4.3.1.1 Prime-OEU Project

In 2003, in the context of the PRIME Network of Excellence, the European
University Observatory (Prime-OEU) project was promoted to examine the strategic
positioning of European Universities by measuring the different activities character-
izing their Education, Research, and Third Mission (Schoen et al., 2007). The latter
was intended as the University’s relationships with industry, public administrations,
and society. Therefore, from a conceptualization exclusively anchored to an eco-
nomic and commercial perspective, the characterization of the Third Mission moved
toward including both contributions to policy design and involvement in the social
and cultural life of a community.
The Prime-OEU project developed a matrix and a radial map in which two
dimensions—the economic and the societal—are each measured according to
four criteria. Namely, the economic dimension focuses on human resources, intel-
lectual property, academic spin-offs, and partnerships with private organizations. In
contrast, the societal dimension embraces a public understanding of science, social
and cultural life participation, decision-making processes, and partnerships with
public institutions.
Twelve universities participated in the project, contributing to the testing and cali-
bration of both qualitative and quantitative performance indicators. As for the Third
Mission, 36 indicators were developed, eight of which included quantitative mea-
sures. However, the Prime-OEU project did not lead to a complete scoreboard and
was stopped at an early developmental stage. Its main results were
overcoming the one-dimensional conception of the Third Mission and developing
a strategic matrix including a set of questions on those dimensions relevant
to the evaluation of Third Mission activities (Jonbloed, 2008).

4.3.1.2 U-Map

The U-Map is a measurement mechanism based on the Carnegie system used in the
USA (Jonbloed, 2008). It aimed to map the different characteristics of the European
HE sectors. The evaluation dimensions included curricula, learning methodologies,
number of students, investment in research projects, international inclination, and
involvement in the local context.
The U-Map classification was an important step in recognizing the complexity of
HE and criticizing the narrowness of existing international rankings. Nevertheless,
the U-Map was geared toward mapping rather than comparing and classifying the
characteristics of HEIs. As a result, it did not generate a real classification system
incorporating systematic and comparable indicators.

4.3.1.3 U-Multirank

The next step after U-Map was the creation of an online database for comparing
Universities according to a multidimensional classification (http://www.umultirank.org).
Then, the U-Multirank was launched in 2014 with data and measures on over 1200 HEIs,
1800 departments, and 7500 study programs in 80 countries. This project aimed to counter
North American and Asian rankings by offering a more concrete system reflecting the
plurality of activities carried out by European HEIs.
Using some techniques developed by the Center for Higher Education, the
U-Multirank is based on four design principles (Hazelkorn, 2012, p. 350):
1. It is a user-driven system where each user can classify the HEIs included in the
database according to different performance dimensions.
2. It is a multidimensional database with information on five key processes:
research, education, learning, international inclination, and knowledge transfer.
3. It aims to ensure comparability among HEIs.

4. It permits the execution of multilevel analysis, i.e., Universities can be evaluated
in terms of specific departments or disciplines.
However, the U-Multirank methodology does not explicitly include specific
indicators for measuring Third Mission activities, whose evaluation is partially
subsumed under the categories “Research—Expenditure on research” and
“Knowledge transfer—Income from private sources.”

4.3.1.4 AUBR

In 2008, by undertaking the Assessing Europe’s University-Based Research
(AUBR) project, the European Commission appointed a group of experts to develop a
multidimensional methodology for evaluating University Research.
The project concluded that a one-dimensional method could not exist because
there is no single set of indicators capable of capturing the complexity of Research
activities and their evaluation (AUBR, 2010, p. 12). To do justice to the complexity of
Research in its various fields, an effective evaluation system should (AUBR, 2010,
p. 13):
1. Include qualitative and quantitative indicators (e.g., peer-group evaluation and
bibliometric indexes)
2. Provide information on the effect of Research on Education
3. Provide for a self-assessment within the overall evaluation process
4. Measure the social impact and associated advantages
5. Adopt a multilevel approach focusing on evaluating academic scholars, depart-
ments and organizational units, and the HEI as a whole
In addition, the results of the AUBR project warned against incentive-based
systems generating perverse effects, thus encouraging the role of strategic planning
in anticipating possible future dysfunctions and, consequently, in adopting prompt
strategic countermeasures. The AUBR project was therefore mainly concerned with
offering research evaluation guidelines rather than developing an effective assessment
system incorporating Education and Third Mission activities.

4.3.1.5 E3M

The Life-Long Learning E3M (European indicators and ranking methodology for
University Third Mission) project ran from 2008 to 2012. Under the
guidance of the Universitat Politècnica de València (Spain), this project involved
eight HEIs from different European countries. The main goal was to allow public
and private sponsors to evaluate Universities by focusing on Third Mission activities
(European Commission, 2012). The methodological framework implied three steps:
1. A literature review aimed at defining the macro dimensions of the Third Mission
2. Delphi interviews, consisting of three rounds with 30 experts, to develop variables
and indicators for each identified macro dimension
3. Validation of the indicators, conducted in six European HEIs that were not project
partners
Following the above methodological pathway, three macro dimensions were
identified for measuring Third Mission activities: (1) technology transfer and inno-
vation, (2) continuing education, and (3) social commitment. For each dimension,
multiple indicators—both qualitative and quantitative—were developed, for a total
of 54 measures. Although it represented an interesting attempt to assess Third
Mission activities, the E3M evaluation system found few HEIs willing to adopt it.
The causes probably relate to its limited benefits compared to its complexity and to
the high cost of monitoring a wide range of indicators at frequent intervals. After the
publication of the last online report in 2012, there is no mention of the E3M
assessment framework in the official guidelines of European governmental agencies
(e.g., ENQA, the European Association for Quality Assurance in Higher Education),
and its institutional website is now offline.

4.3.1.6 Intellectual Capital-Based Performance Framework for Measuring Third Mission

In recent years, drawing on the reported EU experiences and research projects,
Secundo et al. (2017) proposed adopting an intellectual capital-based performance
framework for measuring those results emerging from the execution of Third
Mission activities. Specifically, this framework considers the Third Mission as a
complementary function of Research and Education and, as such, focuses on three
interconnected areas (or goals) which provide an extension of the first two academic
missions. Namely, they are (1) Research intended in terms of technology transfer
and innovation (i.e., management of intellectual property, spin-off creation, and
R&D network development), (2) Education related to lifelong learning and continu-
ing education (i.e., education for entrepreneurial competencies, talent attraction, and
incubation), and (3) social engagement in line with regional, national, and interna-
tional development.
Fig. 4.2 An intellectual capital-based performance framework for assessing the Third Mission of
University (Secundo et al., 2017, p. 235)

As shown in Fig. 4.2, the above areas identifying Third Mission goals are
associated with the three IC components. Thus, while continuing education is
directly connected with human capital, technology transfer and innovation relate
more to organizational capital, and social engagement to social capital
(Secundo et al., 2017). In addition, the framework highlights the impact of each
Third Mission activity on the multiple levels of the academic structure, moving from
its internal dimensions (Departmental and University level) toward the external
ecosystem (community level). In particular, the impact of Third Mission activities
is relevant in terms of quality assurance and assessment at the Departmental level. In
turn, the University level aims to measure such an impact according to its mission
and goals. Finally, regional development provides evidence of their impact at
the community level (Elena-Perez et al., 2014).
As proposed by Secundo et al. (2017, p. 235), applying the IC-based performance
framework leads to identifying a set of performance indicators to measure and report
the results generated by the HEI for each Third Mission goal. These indicators are
reported in Table 4.2. By adopting an organizational perspective focused on the
institutional setting of the University, they offer a plurality of measures divided into
the three IC dimensions (i.e., human, organizational, and social capital).
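For illustration purposes only, the indicator set in Table 4.2 below lends itself to a simple hierarchical encoding (goal, process, IC dimension, indicators), which a reporting tool might use to aggregate measures by IC dimension. The following sketch is a hypothetical data structure with a few abbreviated entries; it is not part of the Secundo et al. (2017) framework itself:

```python
# Hypothetical encoding of (part of) Table 4.2: goal -> process -> IC dimension
# -> indicator names. Entries are abbreviated; the nested structure, not the
# naming, is the point of this sketch.
THIRD_MISSION_INDICATORS = {
    "Technology transfer and innovation": {
        "R&D network development": {
            "human capital": ["joint publications with non-academic authors"],
            "organizational capital": ["success rate in R&D project applications"],
            "social capital": ["joint international R&D projects"],
        },
    },
    "Continuing education": {
        "Talent attraction and incubation": {
            "human capital": ["staff attending continuing training courses"],
            "organizational capital": ["staff employed for talent attraction"],
            "social capital": ["% staff/students qualified abroad"],
        },
    },
}

# Example query: collect every human-capital indicator across goals/processes.
human_capital_indicators = [
    indicator
    for processes in THIRD_MISSION_INDICATORS.values()
    for dimensions in processes.values()
    for indicator in dimensions["human capital"]
]
print(human_capital_indicators)
```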

Table 4.2 Measuring Third Mission of universities by adopting an intellectual capital perspective
(Secundo et al., 2017, p. 235)

Technology transfer and innovation
– Intellectual property and spin-off
  – Human capital: No. of staff involved in creative commons and social innovation projects; No. of start-ups/spin-offs founded by graduates/Higher Education Institution employees; No. of staff funded by competitively funded R&D projects
  – Organizational capital: No. of incubators co-owned by the University; No. of patents, licenses, trademarks co-owned by the University
  – Social capital: No. of international awards received; No. of consortiums
– R&D network development
  – Human capital: No. of joint publications with non-academic authors; No. of postgraduate students and postdoctoral researchers directly funded by private business
  – Organizational capital: Success rate in R&D project applications; No. of shared (open access) laboratories or buildings
  – Social capital: No. of joint international R&D projects; No. of (new) partnerships in R&D projects; No. of companies co-funding research or education activities carried out by the University

Continuing education
– Continuing education for entrepreneurial competence
  – Human capital: No. of staff delivering CE with experience in launching start-ups/spin-offs; % of staff teaching in CE programs; % of staff with entrepreneurship experience
  – Organizational capital: No. of active CE programs; No. of ECT (European Credit Transfer) credits of the delivered CE programs
  – Social capital: No. of corporate clients co-funding education of their staff; No. of international students in CE programs
– Talent attraction and incubation
  – Human capital: No. of HEI staff who attended continuing training courses; No. of research fellows (scientific staff funded by scholarships)
  – Organizational capital: No. of staff employed for talent attraction and incubation (e.g., external cooperation)
  – Social capital: % of staff/students with qualifications obtained abroad

Social engagement
– Social engagement with the community
  – Human capital: No. of academic staff involved in volunteering advisory; No. of media appearances on public issues; No. of academic staff involved in regional planning; No. of citizens attending workshops and scientific events; No. of external stakeholders (managers, policymakers, etc.) involved in curriculum design and delivery
  – Organizational capital: No. of events open to community/public; No. of research initiatives with direct impact on the community; No. of museum centers managed or co-managed by the structure
  – Social capital: No. of partners (academic/non-academic) in projects that do not generate income; No. of institutions involved in formal agreements with the University
– Internationalization
  – Human capital: No. of scientific staff who stayed abroad for at least 5 days; No. of faculty presentations at scientific conferences
  – Organizational capital: No. of scientific journals with university staff serving on editorial boards
  – Social capital: No. of partner institutions delivering joint degree programs; % of students engaged in inward and outward international mobility

4.3.2 The OECD’s Initiatives for Evaluating Third Mission Activities

At the turn of the twentieth and twenty-first centuries, the OECD was very active in
identifying and developing partnerships between industry and science, thereby
stimulating the more commercial side of Third Mission activities (Molas-Gallart
et al., 2002). In 2002, the OECD sponsored a research project on academic spin-off
development. Since 2001, it has engaged in an initiative to collect and analyze data
on the role and relevance of intellectual property rights originating from public
institutions, such as Universities.
Together with the EU, the OECD also funded several projects to analyze,
measure, and encourage Third Mission activities. In this regard, a recent example
is the Entrepreneurial Universities project (OECD, 2012). It involved European and
North American Universities in developing a framework for measuring entrepre-
neurship in HEIs, i.e., their ability to offer a plurality of innovative, creative, and
pragmatic approaches characterizing their specific entrepreneurial orientation. The
framework consists of seven dimensions:
1. Leadership and governance
2. Organizational skills, people, and incentives
3. Development of entrepreneurship in teaching and learning
4. Executive programs for entrepreneurs
5. University-Industry: external relations for knowledge exchange
6. The Entrepreneurial University model as an international institution
7. Measures of the impact of the Entrepreneurial University
For each of the above dimensions, five to seven qualitative indicators have been
designed to ensure that each University can measure its entrepreneurial orientation
and associated impact on the community. Thus, the framework does not
constitute a system for evaluating and classifying Third Mission activities, but rather a
different approach to exploring the complexity of HE activities so as to encourage a more
entrepreneurial orientation of Universities worldwide.
Based on the above review of the literature and the current initiatives undertaken
by different countries, the following section introduces an outcome-based DPM
approach as a method for exploring and managing the Third Mission activities of
Universities within a broader value creation ecosystem, thus providing an
inter-organizational perspective on policy coordination and collaborative
governance aimed at fostering sustainable socio-economic development and better
community outcomes (Bianchi et al., 2019; Bianchi & Vignieri, 2020; Bryson
et al., 2020; Herrera et al., 2019; Torfing & Ansell, 2017; Moynihan, 2008; Bovaird,
2007).

4.4 Applying Dynamic Performance Management to Enhance Third Mission Activities

Outcome-based DPM—also known as Dynamic Performance Governance (Bianchi,
2021; Bianchi et al., 2017, 2019; Bianchi & Salazar Rua, 2020)—is an approach
“able to support policy networks to pursue sustainable community outcomes”
(Bianchi et al., 2019, p. 2). It fits into the evolutionary path of methods suggested
in the NPG research (Osborne, 2010, 2006). This approach uses the same method-
ological perspective of DPM for spanning the organizational boundaries of a single
institution to foster both consistency and learning in policy design, implementation,
and inter-organizational coordination at a policy network level (Cosenz, 2022). For
this reason, such a method can be valuable to explore the value generation mecha-
nisms related to Third Mission activities. These activities are characterized by
complex operational interactions between the HEI and other community stake-
holders who form a collaborative ecosystem to pursue sustainable development
goals (Klijn, 2008; Klijn & Koppenjan, 2000). In this view, the long-term results
associated with sustainable development goals are intended as the outcomes emerg-
ing from the aggregated contribution of those institutions collaborating into the
academic network system.
According to outcome-based DPM, the pursuit of sustainable development
implies focusing on a multidimensional view of Third Mission performance which
entails a balance among the “academic network success,” “time,” and “space”
perspectives. This view crosses the single organizational system boundaries—as
depicted in Fig. 2.4—to embrace a larger ecosystem wherein the University is one of
the multiple stakeholders involved in a policy network to create value locally and
nationally.
As shown in Fig. 4.3, the “academic network success” depends on three main
dimensions: competitive, financial, and social/environmental (Coda, 2010).

Fig. 4.3 Complementary dimensions of the academic network performance (adapted from Bianchi et al., 2019)

The competitive and the social/environmental dimensions should be oriented
to sustain satisfying financial results in the long term by better fulfilling those
community needs associated with the role of HEIs in the knowledge society
(Coda, 2010; Guthrie & Neumann, 2007). In addition, perspectives related to
“time” and “space” must be considered when framing and exploring the Third
Mission performance of HEIs. Regarding the “time” perspective, an improvement
in short-term performance should not be obtained to the prejudice of long-term
results. Too often, unbalanced time-horizon policies result in a transitory short-term
improvement that hides worse undesired effects in the long run. Thus, balancing the
short- and long-term performance in policy design and implementation requires
adopting a consistent methodological approach to performance management and
measurement. Such an approach is related to balancing short- and long-term goals
and measuring the outcomes of current and often inertial policies. These policies
relate to the change in organizational structures and external contextual conditions
affecting the local/regional ecosystem where the University and its non-academic
partners (co-)operate (Mastilak et al., 2012).
In this context, while the organizational performance of a single University can be
gauged through the use of output measures, e.g., graduations, enrollments, and
publications, the academic network performance focuses on outcome indicators.
Outcomes represent long-term impacts generated by the aggregated contribution of
multiple partner organizations, including the University, e.g., academic spin-offs,
University-Industry-Government partnerships, and technology transfer. In this
regard, the outputs produced by a single HEI affect the endowment of its strategic
resources (e.g., enrolled students, research outputs, academic spin-offs, citations).
Conversely, the outcomes generated by a plurality of local/regional actors—includ-
ing the University—are likely to influence those shared strategic resources of the
community (e.g., employment rate, human capital, local businesses, social capital,
pollution). Concerning the “space” perspective, a sustainable development based on
Third Mission activities emerges from the search for consistency and mutual depen-
dencies between not only the multiple outputs offered by different organizational
units, departments, and faculties of a University but also the University’s network
and its local area performance. As Bianchi et al. (2019, p. 4) remarked, “this can be
defined as an ‘interorganizational’ (or multi-agency) perspective. The focus of this
view is on local area performance, i.e., the aptitude of stakeholders in a region (e.g.,
a city neighborhood) to collaborate for developing common goods that may gener-
ate public value, which may provide better conditions for local organizations to
pursue sustainable development.”
This outcome-focused approach applies the DPM perspective to enhance perfor-
mance governance, thus supporting policymaking through the adoption of interpre-
tive lenses for exploring how and why performance measures change over time, as
an effect of undertaken policies, stakeholders’ actions, and external conditions
(Bianchi, 2016, 2021; Herrera et al., 2020; Bianchi et al., 2019). As for Third
Mission activities in Universities, the use of outcome-based DPM supports a sys-
temic and inter-organizational perspective to frame academic network value creation
processes (see Fig. 4.4) according to a collaborative policy design mechanism. This
mechanism primarily focuses on the sustainable development of the local area and,
subsequently, of the single University. It allows collaborative governance partici-
pants to build consensus and accountability around the policies and associated
actions for pursuing sustainable development goals, as well as to nurture the
implementation of value co-creation programs and citizens’ participation in policy
design settings (Douglas & Ansell, 2020; Alford, 2009; Bovaird, 2007; Becerra
Fernández et al., 2020; Osborne et al., 2016).

Fig. 4.4 A systemic perspective to frame academic network value creation processes

The outcome-based DPM approach applied to Third Mission activities explores
how an academic network achieves its results by analyzing and measuring those
critical drivers leading to performance. It contributes to designing performance
indicators that combine shared and unshared strategic resources with the associated
community outcomes. While unshared resources are owned and managed by a single
network organization (e.g., the University), shared resources include common goods
and assets for the use of the community as a whole (e.g., employment rate, human
capital, local businesses). Although they can only be influenced by a plurality of
actions by multiple stakeholders and, hence, are complex to manage, the latter are
quite relevant for building and supporting the performance of the local area
according to a sustainable development perspective.

Fig. 4.5 An outcome-based dynamic performance management perspective (Bianchi, 2010, 2016, 2021; Bianchi et al., 2019)

As Fig. 4.5 shows, this approach aims to highlight the relation between resource
accumulation/depletion processes and corresponding outcomes. This is possible by
identifying the performance drivers linked to those critical factors leading to the
outcomes of the local area. Thus, academic network decision-makers may act on
these drivers to affect end-results (Zhang et al., 2020). Namely, it investigates how
results are achieved at the policy network level in terms of resource allocation and
consumption, as well as how these results, in turn, create value fueling the
corresponding resources over time. Measuring performance drivers provides a
deeper understanding of how this value is generated along the academic network
value chain. It focuses on the coordination mechanism between the various stake-
holders acting and collaborating into the network. These drivers represent the main
factors driving performance governance. Their assessment also provides an under-
standing of how a University and its non-academic partners are generating value in
correspondence with those critical success factors able to affect community out-
comes. They are measured as ratios between the current and the desired strategic
resource levels. Assessing these value drivers enables academic network decision-
makers to identify causal determinants and related interplays significantly affecting
end-results. Consequently, desired corrective actions applied to such drivers can be
promptly undertaken in the short term. In fact, unlike end-results, academic network
decision-makers may directly influence these drivers to effectively pursue sustain-
able development goals.
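In computational terms, such a driver reduces to a simple ratio. The following minimal sketch (a hypothetical function and invented values, not drawn from the cited models) illustrates how a gap between the current and the desired level of a strategic resource surfaces as a driver value below 1:

```python
def performance_driver(current_level: float, desired_level: float) -> float:
    """Performance driver as the ratio between the current and the desired
    level of a strategic resource; values below 1 flag a performance gap."""
    if desired_level <= 0:
        raise ValueError("desired_level must be positive")
    return current_level / desired_level


# Hypothetical example: skilled graduates available in the local area versus
# the level desired by the academic network partners.
driver = performance_driver(current_level=1200, desired_level=1500)
print(f"human capital driver = {driver:.2f}")  # 0.80, i.e., a 20% gap
```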
The emerging outcome-based DPM models are developed by constructing feed-
back structures—used in System Dynamics modeling (Morecroft, 2015; Sterman,
2000)—able to frame the causal interdependences among the relevant variables (i.e.,
strategic resources, performance drivers, and end-results) of the governance struc-
ture under observation. As described in Chap. 3, these feedback structures explain
the rationale underlying the behavior of the variables forming the loops by highlight-
ing both drivers and policy levers to influence the current state of the system
(Sterman, 2000). Consequently, like other modeling approaches supporting
decision-making, outcome-based DPM models serve as cognitive tools to explore
the value creation processes affecting community outcomes. They also foster a better
understanding of how the stakeholder network reacts to implemented policies in
terms of performance governance (Douglas & Ansell, 2020).
This approach uses insight (or policy-based) SD models that have proven to
support a descriptive perspective in policy analysis and performance management.
These models contribute to communicating and sharing an understanding of the
causes and implications underlying the governance system among network partici-
pants (Bianchi & Vignieri, 2020; Bryson et al., 2016).

Fig. 4.6 Using outcome-based DPM for enhancing collaborative governance into the academic network

Figure 4.6 synthesizes the circular logic through which outcome-based DPM
offers methodological support to analyze the performance of the local area where the
academic network operates to create value. This systemic approach provides ade-
quate levels of breakdown, enabling further investigations to uncover cause-and-effect
relationships, thus offering more effective support to academic decision-making
processes.
By highlighting how value is generated and the corresponding policy levers to
affect outcomes, such an analysis provides network stakeholders with an interpretive
framework aimed at (1) facilitating the diagnosis of performance gaps, (2) fostering a
strategic and organizational learning process of network participants, (3) supporting
policy (re)design, and (4) improving the strategic coordination among network
participants. Then, these actions form the backbone to set up a robust collaborative
governance structure for nurturing policy outcomes over time (Bryson et al., 2020).
In the next section, an example applying outcome-based DPM to Third Mission
activities—namely, University-Industry-Government partnerships—is shown and
discussed to provide explanatory evidence on the use of this approach in this specific
policy network context.

4.5 Applying Dynamic Performance Management to Enhance Third Mission Activities (Cont’d): An Example Focused on University-Industry-Government Partnerships

This section aims to provide an example of how to apply the outcome-based DPM
approach to Third Mission activities. Drawing on the Triple Helix model (Etzkowitz,
2008; Leydesdorff & Etzkowitz, 1996), this example frames the role played by HEIs
in establishing University-Industry-Government (UIG) partnerships with the intent
to strengthen the socio-economic ecosystem of a given local area according to a
sustainable development perspective (Etzkowitz, 2003, 2008).
In particular, UIG partnerships support the entrepreneurial vocation of the Uni-
versity, thus fostering its capability to generate new academic spin-offs. This
capability is likely to increase the number of local businesses operating in the same local area
(Fuster et al., 2019; Carree et al. 2014; Graham, 2014; Guerrero et al., 2016; Hayter,
2013; McAdam et al., 2016; Van Looy et al., 2003). In this case, the academic
network focuses on implementing entrepreneurial activities by graduates and spe-
cific professional profiles trained and supported by the University. They collaborate
with local businesses, R&D centers, science and technological parks, governmental
bodies, and other public organizations. As such, UIG partnerships aim to generate a
supportive ecosystem for the academic community and its surroundings, developing,
sharing, grasping, and using new knowledge that can give rise to academic spin-offs
(Perkmann et al., 2011). As argued by Fuster et al. (2019, p. 219), academic spin-offs
“are an important vehicle of knowledge transfer from Universities that take advan-
tage of innovations and creating new high-quality employment and accelerating the
productivity of regional economies. Policymakers are increasingly investing in
universities to foster the creation of innovative start-ups in the hope of producing
areas of economic growth and the resulting initiatives are predicated on the idea,
using successful well-known examples such as Silicon Valley, that a well-structured
entrepreneurial university ecosystem automatically leads to the emergence of suc-
cessful business ecosystems.”
In the vein depicted by Lubik et al. (2013) and Guerrero et al. (2019), the
proposed example assumes that, besides the University, the academic network is
formed by its graduates, local businesses, local governmental bodies, and residents.
These community actors are called to implement inter-organizational coordination at
a policy network level, thus forming collaborative governance settings aimed to
jointly affect the value creation processes leading to the growth of their socio-
economic ecosystem (Jongbloed et al., 2008). While the economic development
focuses on starting academic spin-offs resulting in new local businesses, the social
growth depends on the capability of these new firms to hire and retain resident
talents, thus fostering a lower unemployment rate in the local area (Hayter, 2013;
Jonbloed, 2008).

Fig. 4.7 Applying outcome-based DPM to University-Industry-Government partnerships into a Third Mission setting

Figure 4.7 illustrates the emerging outcome-based DPM model. Noticeably, it
identifies the main factors framing how UIG partnerships affect the socio-economic
performance of the local area and divides them into the corresponding layers (i.e.,
strategic resources, performance drivers, end-results) according to a systemic per-
spective. Such a perspective entails the identification of the causal interdependences
among these factors, thus providing academic network decision-makers with proper
interpretive lenses to understand how the system works and reacts to implemented
policies. The feedback connections between end-results and strategic resources—
underlying this specific inter-organizational value creation process—are emphasized
through gray-colored variables.
In particular, implementing UIG partnerships calls for a more explicit request of
specific professional profiles to be educated and trained by the University. These
professional profiles may depend on the search for peculiar skills and abilities
required by non-academic partners (i.e., local businesses and governmental bodies
or involved public institutions) to meet those new requirements of the current local
labor market. The University may then make efforts to introduce and
implement innovative curricula of study to fulfill these requirements, thereby
increasing the number of graduates holding these professional skills. In this context,
a performance driver—i.e., the graduates’ ratio—helps measure the fraction of
graduates over the resident population, influencing the local area’s human capital
employability. With the joint support of local businesses and public spending
policies, increased human capital may give rise to new UIG opportunities evaluated
through another performance driver, i.e., the UIG opportunity ratio. Namely, this
driver aims to gauge the match between the human capital educated by the Univer-
sity and the professional requirements of its non-academic partners, thus resulting in
new opportunities for starting UIG partnerships. It affects the change in new UIG
partnerships whose corresponding stock is evaluated through the UIG partnership
ratio. The latter is a performance driver assessing the aptitude of the aca-
demic network to establish new UIG agreements oriented to start academic spin-offs
and support their development in the local area.
In the medium/long term, the rise in academic spin-offs is likely to foster the
area’s economic development by increasing the stock of local businesses operating
therein and producing value for the benefit of the community. This value may
imply new job opportunities for the residents of the local area, whose assessment is
carried out by matching the local businesses’ job offerings with the resident popu-
lation living in the area. Other conditions being equal, an improvement of this per-
formance driver leads to a decrease in the local area’s unemployment rate, which, in
turn, implicitly enhances its attractiveness toward nonresidents in terms of working
standards, business opportunities, and career prospects. Eventually, an increase in
local businesses and resident population may strengthen the local taxpayers’ capac-
ity to contribute to public finance, thereby feeding back into public funding to keep
improving the UIG partnership system and, as a result, the socio-economic devel-
opment of the local area.
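A minimal quantitative sketch of this causal chain, with invented parameters and simplified functional forms (it is not the model in Fig. 4.7, only an illustration of its feedback logic), might read as follows:

```python
# Invented parameters throughout; the point is the loop: graduates feed human
# capital, matching with partner demand opens UIG opportunities, partnerships
# spawn spin-offs, spin-offs become businesses that hire residents and enlarge
# the tax base, which feeds public funding back into new partnerships.
graduates, residents = 2_000.0, 100_000.0
uig_partnerships, local_businesses = 10.0, 500.0
public_funding = 1.0                     # normalized funding level

for year in range(10):
    graduates_ratio = graduates / residents                   # driver 1
    human_capital = residents * graduates_ratio
    uig_opportunity_ratio = min(1.0, human_capital / 3_000.0) * public_funding
    new_partnerships = 5.0 * uig_opportunity_ratio            # driver 2 effect
    uig_partnerships += new_partnerships
    local_businesses += 0.3 * new_partnerships                # spin-off creation
    jobs_offered = 110.0 * local_businesses
    labor_force = 0.6 * residents
    unemployment_rate = max(0.0, 1.0 - jobs_offered / labor_force)
    tax_base = local_businesses + residents / 1_000.0
    public_funding = 0.8 * public_funding + 0.002 * tax_base  # feedback loop
    graduates += 150.0                                        # yearly cohort

print(f"partnerships = {uig_partnerships:.0f}, "
      f"businesses = {local_businesses:.0f}, "
      f"unemployment = {unemployment_rate:.1%}")
```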
The application of outcome-based DPM to frame the collaborative governance
structure affecting UIG partnerships for pursuing sustainable public value generation
enables academic network decision-makers to share and nurture a common view
over the factors and associated mechanisms leading to local area performance
(Douglas & Ansell, 2020; Alford & Yates, 2014). This view facilitates the identifi-
cation of those policy levers on which—depending on the specific role played in
the network—the HEI and its non-academic partners may act to influence commu-
nity outcomes. Such policy levers provide for more effective allocation and con-
sumption of the shared and unshared strategic resources—identified in the DPM
model—under the control of the academic network actors. For instance, involved
local businesses may address their efforts, on the one side, toward a greater engage-
ment of governmental bodies and public institutions in establishing UIG projects
aimed to foster the local economy and competitiveness and, on the other, toward a
more explicit request of specific professional profiles to the University. By acknowl-
edging these requests, governmental bodies and public institutions are called to
proactively engage in UIG partnerships and allocate adequate public funding to
develop new academic spin-offs.
However, this policy should not conflict with the pursuit of other community
outcomes. It should rather highlight the need to manage a critical trade-off between
pursuing a too rapid and boundless increase in new businesses in the short term and
the associated rise in pollution harmful to the local natural environment. In the same
way, the HEI may innovate and adapt its educational programs to the specific
requirements of UIG partners, thereby training professional profiles which can be
easily employed in the local labor market. Once this common view has been
acknowledged and shared among network participants, assessing performance
drivers and outcomes allows academic network decision-makers to overcome poten-
tial conflicts, enhance inter-organizational coordination, and promote a shared learn-
ing process in policy design and implementation.
The following section highlights the main findings emerging from adopting
outcome-based DPM in assessing Third Mission activities by drawing on the
example described above. It also proposes possible avenues for contributing to the
development of this research stream.

4.6 Closing Remarks

Assessing Third Mission activities of HEIs is quite a novel topic in the HE policy
and management literature. There is a widespread and open debate about what
methodological approaches and indicators are more suitable for measuring the
impact produced by Third Mission activities on the community and socio-economic
context where the University operates. This chapter introduced and described an
outcome-based DPM approach to manage and affect the academic network value
generation processes underlying Third Mission activities in Universities. Such an
approach supports the academic network—formed by the HEI and other local/
regional stakeholders—to measure and assess the outcomes produced by collabora-
tive policies oriented to foster the socio-economic development of a local area
according to a sustainability-based perspective. These outcomes represent the
end-results originating from Third Mission activities.
The chapter has initially provided a review of the literature reporting a plurality of
definitions applied to the “Third Mission” concept. Although there is no conver-
gence toward a fully accepted definition (Frondizi et al., 2019), the review empha-
sized that this concept embraces all the activities in which a University plays a
proactive role in collaboration with other local/regional partners (e.g., public and
private organizations, governmental bodies, agencies, and utilities). These activities
aim to generate a long-term impact on the local/regional area, its community,
economy, and environment. As such, the core attributes identifying Third Mission
activities include (1) a proactive role of the University in the local stakeholder
network, (2) a collaborative governance structure formed by the academic network
partners involved in policy design and implementation processes, and (3) the gen-
eration of long-term impacts (i.e., outcomes) as a result of implemented network
policies affecting the social, economic, and environmental conditions of the area.
Building on these attributes characterizing the development of Third Mission
activities, the chapter then focused on the scientific debate related to the methods and
approaches for managing and measuring the outcomes emerging from these specific
activities. To this end, adopting the DPM approach focused on Third Mission
outcomes has been proposed as it may provide a valuable methodological contribu-
tion to the design of performance measurement systems applied to complex gover-
nance structures.
The proposed approach uses a systemic perspective—focused on identifying
causal interdependencies between critical factors affecting results over time—to
support policy networks in the pursuit of sustainable community outcomes. Such a
perspective—which inspires the design of outcome-based DPM models—is oriented
to enhance the inter-organizational coordination among network participants by
offering a shared view of the different roles and corresponding responsibilities
played by each stakeholder in the generation of sustainable community outcomes.
In addition, this approach provides the academic network decision-makers with a
cognitive framework highlighting shared and unshared strategic resources, output,
and outcome measures, as well as those performance drivers affecting them. Partic-
ularly, these remarks have been drawn up by applying the described approach to a
well-recognized Third Mission example, i.e., UIG partnerships (Etzkowitz, 2008).
Such an application adopted a qualitative modeling perspective to depict the perfor-
mance governance setting related to UIG partnerships, which involved identifying
strategic resources, performance drivers, and end-results, as well as their causal
interplays.
Building on the insights emerging from the described example of UIG partner-
ships, more research can be developed to investigate the effectiveness of DPM in
supporting academic networks dealing with Third Mission activities. In particular,
future research perspectives may deepen the analysis related to the use of DPM for
assessing Third Mission activities by designing, exploring, and testing quantitative
System Dynamics models (Sterman, 2000; Morecroft, 2015). These models focus on
quantifying the causal interplays among the variables of the observed system to
simulate their behavior over time (Sterman, 2000). In this way, these simulation
models—and associated simulation scenarios—may provide a deeper and more
rigorous analysis in terms of performance management and policymaking
(Wolstenholme, 1999). Since simulation modeling requires detailed data, this
research avenue also entails developing field analyses and case studies to be
conducted in real-world contexts, thereby leading to more empirical evidence of
the effectiveness of this approach to support academic networks in evaluating Third
Mission activities. In addition, the suggested approach may be applied to other Third
Mission examples (e.g., knowledge commercialization, collaborative research pro-
jects, academic consulting), which, unlike UIG partnerships, pose different chal-
lenges and requirements in terms of performance governance and policy network
structure. These future research perspectives may ultimately contribute to the current
debate on exploring and assessing Third Mission activities in HEIs.
Nevertheless, although Third Mission activities’ evaluation criteria are crucial for
academic decision-makers and policymakers as they affect regional socio-economic
development and competitiveness, the definition of effective and acknowledged
performance measures remains quite complex. In particular, limitations originate
from the difficulty of identifying and collecting comprehensive data on Third
Mission activities and related outcomes. Data collection imposes the necessity to
establish robust mechanisms for sharing relevant information on the operations and
associated results of each academic network participant, who typically acts at a
different institutional level (e.g., national, regional, local). In addition, data related
to long-term results are often unquantifiable, unreliable, unavailable, or informally
provided and, as a result, difficult to track and measure. Therefore, there remains the
need for designing and validating more effective performance measurement mech-
anisms applied to Third Mission activities, which poses additional challenges for HE
management scholars.

References

Alford, J. (2009). Engaging public sector clients: From service-delivery to co-production. Palgrave
Macmillan.
Alford, J., & Yates, S. (2014). Mapping public value processes. International Journal of Public
Sector Management, 27(4), 334–352.
Amaral, A., & Magalhães, A. (2002). The emergent role of external stakeholders in European
higher education governance. In A. Amaral, V. L. Meek, & I. M. Larsen (Eds.), Governing
higher education: National perspectives on institutional governance (pp. 1–21). Kluwer Aca-
demic Publishers.
Angiola, N., Bianchi, P., & Damato, L. (2018). Performance management in public universities:
Overcoming bureaucracy. International Journal of Productivity and Performance Management,
67(4), 736–753.
Ansell, C., & Gash, A. (2008). Collaborative governance in theory and practice. Journal of Public
Administration Research and Theory, 18(4), 543–571.
AUBR – EU Expert Group on Assessment of University-based Research. (2010). Assessing
Europe’s university-based research. EUR 24187 EN, DG Research, European Commission,
Brussels.
Aversano, N., Manes Rossi, F., & Tartaglia Polcini, P. (2018). Performance measurement systems
in Universities: A critical review of the Italian system. In E. Borgonovi, E. Anessi-Pessina, &
C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 269–288).
Springer.
Barnabè, F., & Riccaboni, A. (2007). Which role for performance measurement systems in higher
education? Focus on quality assurance in Italy. Studies in Educational Evaluation, 33(3–4),
302–319.
Becerra Fernández, M., Cosenz, F., & Dyner, I. (2020). Modeling the natural gas supply chain for
sustainable growth policy. Energy, 205, 118018.
Bianchi, C. (2010). Improving performance and fostering accountability in the public sector
through system dynamics modelling: From an ‘external’ to an ‘internal’ perspective. Systems
Research and Behavioral Science, 27(4), 361–384.
Bianchi, C. (2016). Dynamic performance management. Springer.
Bianchi, C. (2021). Fostering sustainable community outcomes through policy networks: A
dynamic performance governance approach. In J. W. Meek (Ed.), Handbook of collaborative
public management (pp. 333–356). Edward Elgar Publishing.
Bianchi, C., & Caperchione, E. (2022). Performance management and governance in public
universities: Challenges and opportunities. In E. Caperchione & C. Bianchi (Eds.), Governance
and performance management in Public Universities (pp. 1–14). Springer.
Bianchi, C., & Cosenz, F. (2013). Designing performance management systems in academic
institutions: A dynamic performance management view. In Proceedings of ASPA 2013 Annual
Conference, New Orleans, LA, 19–23 September. ASPA.
Bianchi, C., & Salazar Rua, R. (2020). A feedback view of behavioural distortions from perceived
public service gaps at ‘street-level’ policy implementation: The case of unintended outcomes in
public schools. Systems Research and Behavioral Science, 1–22. https://doi.org/10.1002/sres.2771
Bianchi, C., & Vignieri, V. (2020). Dealing with “abnormal” business growth by leveraging local
area common goods: An outside-in stakeholder collaboration perspective. International Journal
of Productivity and Performance Management. https://doi.org/10.1108/IJPPM-07-2019-0318
Bianchi, C., Bovaird, T., & Loeffler, E. (2017). Applying a dynamic performance management
framework to wicked issues: How coproduction helps to transform young people’s services in
Surrey County Council, UK. International Journal of Public Administration, 40(10), 833–846.
Bianchi, C., Bereciartua, P., Vignieri, V., & Cohen, A. (2019). Enhancing urban brownfield
regeneration to pursue sustainable community outcomes through dynamic performance gover-
nance. International Journal of Public Administration. https://doi.org/10.1080/01900692.2019.1669180
Bornmann, L. (2013). What is societal impact of research and how can it be assessed? A literature
survey. Journal of the American Society for Information Science and Technology, 64(2),
217–233.
Bovaird, T. (2007). Beyond engagement and participation: User and community coproduction of
public services. Public Administration Review, 67(5), 846–860.
Broadbent, J. (2007). If you can’t measure it, how can you manage it? Management and governance
in Higher Educational institutions. Public Money and Management, 27(3), 193–198.
Broucker, B., De Wit, K., & Verhoeven, J. C. (2017). Higher education research: Looking beyond
new public management, theory and method in Higher Education research (Vol. 3, pp. 21–38).
Emerald Publishing.
Bryson, J. M., Ackermann, F., & Eden, C. (2016). Discovering collaborative advantage: The
contributions of goal categories and visual strategy mapping. Public Administration Review,
76(6), 912–925.
Bryson, J. M., Crosby, B. C., & Seo, D. (2020). Using a design approach to create collaborative
governance. Policy & Politics, 48(1), 167–189.
Business/Higher Education Round Table. (2006). Universities’ third Mission: Communities
engagement, B-HERT. Position paper. B-HERT.
Carayannis, E., & Campbell, D. (2012). Mode 3 knowledge production in quadruple helix innova-
tion systems. Springer Briefs in Business.
Carayannis, E., Barth, T., & Campbell, D. (2012). The quintuple helix innovation model: Global
warming as a challenge and driver for innovation. Journal of Innovation and Entrepreneurship,
1, 1–12.
Carree, M., Della Malva, A., & Santarelli, E. (2014). The contribution of universities to growth:
Empirical evidence for Italy. The Journal of Technology Transfer, 39(3), 393–414.
Castells, M. (1994). Technopoles of the world: The making of 21st century industrial complexes.
Routledge.
Cave, M., Hanney, S., Henkel, M., & Kogan, M. (1997). The use of performance indicators in
higher education. The challenge of the quality movement. Jessica Kingsley Publishers.
Cepiku, D., Mussari, R., Poggesi, S., & Reichard, C. (2012). Special issue on governance of
networks: Challenges and future issues from a public management perspective editorial. Journal
of Management and Governance, 18, 1–7.
Choi, I., & Moynihan, D. (2019). How to foster collaborative performance management? Key
factors in the US federal agencies. Public Management Review, 2(1), 1–22.
Coda, V. (2010). Entrepreneurial values and strategic management. Essays in management theory.
Palgrave Macmillan.
Cooke, P. (2002). Knowledge economies: Clusters, learning and cooperative advantage. Routledge.
Cosenz, F. (2014). A dynamic viewpoint to design performance management systems in Academic
Institutions: Theory and practice. International Journal of Public Administration, 37(13),
955–969.
Cosenz, F. (2022). Adopting a dynamic performance governance approach to frame
interorganizational value generation processes into a university third mission setting. In
E. Caperchione & C. Bianchi (Eds.), Governance and performance management in Public
Universities (pp. 87–108). Springer.
Crosby, B., t’Hart, P., & Torfing, J. (2017). Public value creation through collaborative innovation.
Public Management Review, 19, 655–669.
Douglas, S., & Ansell, C. (2020). Getting a grip on the performance of collaborations: Examining
collaborative performance regimes and collaborative performance summits. Public Administra-
tion Review, 81(5), 951–961.
Driscoll, A. (2008). Carnegie’s community-engagement classification: Intentions and insights.
Change: The Magazine of Higher Learning, 40(1), 38–41.
Dumay, J., & Garanina, T. (2013). Intellectual capital research: A critical examination of the third
stage. Journal of Intellectual Capital, 14(1), 10–25.
Elena-Perez, S., Leitner, K. H., Secundo, G., & Martinaitis, Ž. (2014). Shaping new managerial
models in European universities: The impact of reporting and managing IC. In: P. Ordonez De
Pablos & L. Edvinsson (Eds.), Intellectual capital in organizations: Non-financial reports and
accounts (pp. 150–165). Routledge.
Esposito, V., De Nito, E., Iacono, M. P., & Silvestri, L. (2013). Dealing with knowledge in the
Italian public universities. The role of performance management systems. Journal of Intellectual
Capital, 14(3), 431–450.
Etzkowitz, H. (2003). Innovation in innovation: The triple helix of university-industry-government
relations. Social Science Information, 42(3), 293–337.
Etzkowitz, H. (2008). The triple helix: University-industry-government innovation in action.
Routledge.
Etzkowitz, H., Webster, A., Gebhardt, C., & Terra, B. (2000). The future of the university and the
university of the future: Evolution of ivory tower to entrepreneurial paradigm. Research Policy,
29(2), 313–330.
European Commission. (2003). The role of the Universities in the Europe of knowledge. Retrieved
from http://eur-lex.europa.eu/legal-content/PL/TXT/?uri=celex:52003DC0058
European Commission. (2011). Explanatory Memorandum to COM(2011)567 – Supporting growth
and jobs – An agenda for the modernisation of Europe’s higher education systems. Retrieved
from: https://www.eumonitor.eu/9353000/1/j4nvhdfdk3hydzq_j9vvik7m1c3gyxp/visyrzh2fxzx
European Commission. (2012). E3M needs and constraints analysis of the three dimensions of third
Mission activities. Belgium.
European Commission and OECD. (2012). A guiding framework for entrepreneurial
universities. OECD.
European Expert Network on Economics of Education (EENEE). (2014). The contribution of
universities to innovation, (regional) growth and employment. In R. Veugelers, & E. Del Rey
(Eds.), EENEE Analytical Report No. 18 prepared for the European Commission. Retrieved
from http://www.eenee.de/eeneeHome/EENEE/Analytical-Reports.html
Frondizi, R., Fantauzzi, C., Colasanti, N., & Fiorani, G. (2019). The evaluation of universities’ third
mission and intellectual capital: Theoretical analysis and application to Italy. Sustainability,
11(12), 3455.
Fuster, E., Padilla-Meléndez, A., Lockett, N., & del-Águila-Obra, A.R. (2019). The emerging role
of university spin-off companies in developing regional entrepreneurial university ecosystems:
The case of Andalusia. Technological Forecasting and Social Change, 141, 219–231.
164 4 University’s “Third Mission” Assessment Through Outcome-Based. . .

Gibb, A. A., & Hannon, P. (2006). Towards the entrepreneurial university? International Journal of
Entrepreneurship Education, 4, 73–110.
Graham, R. (2014). Creating university-based entrepreneurial ecosystems: Evidence from emerg-
ing world leaders. Massachusetts Institute of Technology.
Guerrero, M., Cunningham, J. A., & Urbano, D. (2015). Economic impact of entrepreneurial
universities’ activities: An exploratory study of the United Kingdom. Research Policy, 44(3),
748–764.
Guerrero, M., Urbano, D., Fayolle, A., Klofsten, M., & Mian, S. (2016). Entrepreneurial universi-
ties: Emerging models in the new social and economic landscape. Small Business Economics,
47(3), 551–563.
Guerrero, M., Herrera, F., & Urbano, D. (2019). Strategic knowledge management within
subsidised entrepreneurial university-industry partnerships. Management Decision, 57(12),
3280–3300.
Gulbrandsen, M., & Slipersaeter, S. (2007). The third mission and the entrepreneurial university
model. In A. Bonaccorsi & C. Daraio (Eds.), Universities and strategic knowledge creation:
Specialization and performance in Europe (pp. 112–143). Edward Elgar Publishing.
Gunasekara, C. (2006). Reframing the role of universities in the development of regional innovation
system. The Journal of Technology Transfer, 31(1), 101–113.
Guthrie, J., & Neumann, R. (2007). Economic and non-financial performance indicators in univer-
sities. Public Management Review, 9(2), 231–252.
Hayter, C. S. (2013). Harnessing university entrepreneurship for economic growth factors of
success among university spin-offs. Economic Development Quarterly, 27(1), 18–28.
Hazelkorn, E. (2012). European transparency instruments: Driving the modernisation of European
higher education. In S. Curaj, P. Scott, L. Vlasceanu, & L. Wilson (Eds.), Between the Bologna
process and national reform (Vol. 1, pp. 339–360). Springer.
Herrera, M., Cosenz, F., & Dyner, I. (2019). How to support energy policy coordination? Findings
from the Brazilian wind industry. The Electricity Journal, 32(8), 106636.
Herrera, M., Cosenz, F., & Dyner, I. (2020). Blending collaborative governance and dynamic
performance management to Foster policy coordination in renewable energy supply chains. In
C. Bianchi, L. F. Luna-Reyes, & E. Rich (Eds.), Enabling collaborative governance through
systems modeling methods. Public Policy Design and Implementation. Understanding Complex
Systems Series (pp. 237–261). Springer.
Higher Education Funding Council for England. (2008). Strategic plan 2006–11. HEFCE.
Jonbloed, B. (2008). Indicators for mapping university-regional interactions. Paper presented at the
ENID-PRIME Indicators Conference, Oslo, Norway, May 26–28.
Jongbloed, B., Enders, J., & Salerno, C. (2008). Higher education and its communities: Intercon-
nections, interdependencies and a research agenda. Higher Education, 56(3), 303–324.
Klijn, E. H. (2008). Governance and governance networks in Europe. Public Management Review,
10(4), 505–525.
Klijn, E. H., & Koppenjan, J. F. M. (2000). Public management and policy networks. Public
Management: An International Journal of Research and Theory, 2(2), 135–158.
Laredo, P. (2007). Revisiting the third mission of universities: Toward a renewed categorization of
university activities? Higher Education Policy, 20(4), 441–456.
Leydesdorff, L., & Etzkowitz, H. (1996). Emergence of a triple helix of university-industry-
government relations. Science and Public Policy, 23(5), 279–286.
Lombardi, R., Massaro, M., Dumay, J., & Nappo, F. (2019). Entrepreneurial universities and
strategy: The case of the University of Bari. Management Decision, 57(12), 3387–3405.
Lubik, S., Garnsey, E., Minshall, T., & Platts, K. (2013). Value creation from the innovation
environment: Partnership strategies in university spin-outs. R&D Management, 43(2), 136–150.
Manes Rossi, F., Nicolò, G., & Tartaglia Polcini, P. (2018). New trends in intellectual capital
reporting: Exploring online intellectual capital disclosure in Italian universities. Journal of
Intellectual Capital, 19(4), 814–835.
References 165

Mastilak, C., Matuszewski, L., Miller, F., & Woods, A. (2012). Evaluating conflicting performance
on driver and outcome measures: The effect of strategy maps. Journal of Management Control,
23(2), 97–114.
Mazza, C., Quattrone, P., & Riccaboni, A. (2006). Verso l’Università bifronte? Complessità interne
e semplificazione dei rapporti con l’esterno. In C. Mazza, P. Quattrone, & A. Riccaboni (Eds.),
L’Università in cambiamento fra mercato e tradizione. Il Mulino, Bologna.
McAdam, M., & Debackere, K. (2018). Beyond “triple helix” toward “quadruple helix” models in
regional innovation systems: Implications for theory and practice. R&D Management, 48(1),
3–6.
McAdam, M., Miller, K., & McAdam, R. (2016). Situated regional university incubation: A multi-
level stakeholder perspective. Technovation, 50-51, 69–78.
McAdam, M., Miller, K., & McAdam, R. (2017). University business models in disequilibrium–
engaging industry and end users within university technology transfer processes. R&D Man-
agement, 47(3), 458–472.
Miller, B. A. (2007). Assessing organizational performance in higher education. Jossey-Bass.
Miller, K., McAdam, M., & McAdam, R. (2014). The changing university business model: A
stakeholder perspective. R&D Management, 44(3), 265–287.
Miller, K., McAdam, R., Moffett, S., Alexander, A., & Puthusserry, P. (2016). Knowledge transfer
in university quadruple helix ecosystems: An absorptive capacity perspective. R&D Manage-
ment, 46(2), 383–399.
Molas-Gallart, J., Salter, A., Patel, P., Scott, A., & Duran, X. (2002). Measuring third stream
activities. Final Report to the Russell Group Universities. SPRU, University of Sussex.
Morecroft, J. D. W. (2015). Strategic modelling and business dynamics: A feedback systems
approach. Wiley.
Moynihan, D. P. (2008). The dynamics of performance management. Georgetown University Press.
Moynihan, D. P., Fernandez, S., Kim, S., LeRoux, K. M., Piogrowski, S. J., Wright, B. E., & Yang,
K. (2011). Performance regimes amidst governance complexity. Journal of Public Administra-
tion Research and Theory, 21, 141–155.
Nicolò, G., Manes Rossi, F., Christiaens, J., & Aversano, N. (2020). Accountability through
intellectual capital disclosure in Italian universities. Journal of Management and Governance.
https://doi.org/10.1007/s10997-019-09497-7
Noordegraaf, M. (2015). Public management: Performance, professionalism and politics. Palgrave
Macmillan.
OECD. (2009). Education at a glance. OCSE.
OECD. (2012). A guiding framework for entrepreneurial universities. Retrieved from https://www.
oecd.org/site/cfecpr/EC-OECD%20Entrepreneurial%20Universities%20Framework.pdf
Olcay, G. A., & Bulu, M. (2017). Is measuring the knowledge creation of universities possible? A
review of university rankings. Technological Forecasting and Social Change, 123, 153–160.
Osborne, S. (2006). The new public governance? Public Management Review, 8(3), 377–387.
Osborne, S. (2010). The new public governance? Emerging perspectives on the theory and practice
of public governance. Routledge.
Osborne, S., Radnor, Z., & Strokosch, K. (2016). Co-production and the co-creation of value in
public services: A suitable case for treatment? Public Management Review, 18(5), 639–653.
Paoloni, P., Modaffari, G., & Mattei, G. (2020). Knowledge resources in the university context: An
overview of the literature. Journal of Intellectual Capital. https://doi.org/10.1108/JIC-01-
2020-0010
Perkmann, M., Neely, A., & Walsh, K. (2011). How should firms evaluate success in university–
industry alliances? A performance measurement system. R&D Management, 41(2), 202–216.
Perkmann, M., Tartari, V., McKelvey, M., Autio, E., Broström, A., D’Este, P., Fini, R., Geuna, A.,
Grimaldi, R., & Hughes, A. (2013). Academic engagement and commercialisation: A review of
the literature on university–industry relations. Research Policy, 42, 423–442.
Pestoff, V., & Brandsen, T. (2008). Co-production: The third sector and the delivery of public
services. Routledge.
166 4 University’s “Third Mission” Assessment Through Outcome-Based. . .

Pilbeam, C. (2006). Generating additional revenue streams in UK universities: An analysis of


variation between disciplines and institutions. Journal of Higher Education Policy and Man-
agement, 28(3), 297–311.
Pinto, H., Cruz, A. R., & de Almeida, H. (2016). Academic entrepreneurship and knowledge
transfer networks: Translation process and boundary organizations. In L. Carvalho (Ed.),
Handbook of research on entrepreneurial success and its impact on regional development
(pp. 315–344). IGI Global.
Polt, W., Rammer, C., Gassler, H., Schibany, A., & Schartinger, D. (2001). Benchmarking industry-
science relations: The role of framework conditions. Science and Public Policy, 28(4), 247–258.
Powell, W. W., & Snellman, K. (2004). The knowledge economy. Annual Review of Sociology,
30(1), 199–220.
Rasmussen, E., & Borch, O. J. (2010). University capabilities in facilitating entrepreneurship: A
longitudinal study of spin-off ventures at mid-range universities. Research Policy, 39, 602–612.
Reichert, S. (2009). Using the classification in the European higher education area. In F. A. Van
Vught (Ed.), Mapping the higher education landscape. Towards a European classification of
higher education (pp. 105–122). Springer.
Rhodes, R. A. W. (1990). Policy networks: A British perspective. Journal of Theoretical Politics,
2(3), 293–317.
Rhodes, R. A. W. (2017). Network governance and the differentiated polity. Oxford University
Press.
Ricci, R., Colombelli, A., & Paolucci, E. (2019). Entrepreneurial activities and models of advanced
European science and technology universities. Management Decision, 57(12), 3447–3472.
Rippa, P., & Secundo, G. (2019). Digital academic entrepreneurship: The potential of digital
technologies on academic entrepreneurship. Technological Forecasting and Social Change,
146, 900–911.
Sanchez, P., & Elena, S. (2006). Intellectual capital in universities. Journal of Intellectual Capital,
7(4), 529–548.
Sánchez-Barrioluengo, M. (2014). Articulating the ‘three-missions’ in Spanish universities.
Research Policy, 43(10), 1760–1773.
Schoen, A., Laredo, P., Bellon, B., & Sanchez, P. (2007). Observatory of European University.
PRIME position paper. Retrieved from http://www.prime-noe.org/conference-presentations,92.
html.
Secundo, G., & Elia, G. (2014). A performance measurement system for academic entrepreneur-
ship. Measuring Business Excellence, 18(3), 23–37.
Secundo, G., Perez, S. E., Martinaitis, Z., & Leitner, K. H. (2015). An intellectual capital maturity
model (ICMM) to improve strategic management in European universities. Journal of Intellec-
tual Capital, 16(2), 419–442.
Secundo, G., Perez, S. E., Martinaitis, Z., & Leitner, K. H. (2017). An intellectual capital
framework to measure universities’ third mission activities. Technological Forecasting and
Social Change, 123, 229–239.
Secundo, G., Ndou, V., Del Vecchio, P., & De Pascale, G. (2019). Knowledge management in
entrepreneurial universities: A structured literature review and avenue for future research
agenda. Management Decision, 57(12), 3226–3257.
Sicilia, M. F., Guarini, E., Sancino, A., Andreani, M., & Ruffini, R. (2016). Public services
management and co-production in multi-level governance settings. International Review of
Administrative Sciences, 82(1), 8–27.
Sterman, J. (2000). Business dynamics: Systems thinking and modeling for a complex world. Irwin/
McGraw-Hill.
Thorp, H., & Goldstein, B. (2013). Engines of Innovation: The entrepreneurial university in the
twenty-first century. UNC Press Books.
Torfing, J., & Ansell, C. (2017). Strengthening political leadership and policy innovation through
the expansion of collaborative forms of governance. Public Management Review, 19(1), 37–54.
References 167

Trencher, G., Yarime, M., McCormick, K. B., Doll, C. N. H., & Kraines, S. B. (2014). Beyond the
third mission: Exploring the emerging university function of co-creation for sustainability.
Science and Public Policy, 41(2), 151–179.
Urbano, D., & Guerrero, M. (2013). Entrepreneurial universities: Socioeconomic impacts of
academic entrepreneurship in a European context. Economic Development Quarterly, 27,
40–55.
Van Looy, B., Debackere, K., & Andries, P. (2003). Policies to stimulate regional innovation
capabilities via university-industry collaboration: An analysis and an assessment. R&D Man-
agement, 33(2), 209–229.
Van Vught, F. A. (2009). Mapping the higher education landscape. Towards a European classi-
fication of higher education. Springer.
Watson, D., & Hall, L. (2015). Addressing the elephant in the room: Are universities committed to
the third stream agenda. International Journal of Academic Research in Management, 4(2),
48–76.
Webber, R., & Jones, K. (2011). Re-positioning as a response to government higher education
policy development–an Australian case study. Journal of Higher Education Policy and Man-
agement, 33(1), 17–26.
Wolstenholme, E. (1999). Qualitative vs. quantitative modelling: The evolving balance. Journal of
the Operational Research Society, 50(4), 422–428.
Xavier, J. A., & Bianchi, C. (2019). An outcome-based dynamic performance management
approach to collaborative governance in crime control: Insights from Malaysia. Journal of
Management and Governance. https://doi.org/10.1007/s10997-019-09486-w
Zhang, Z., Bivona, E., Qi, J., & Yan, H. (2020). Applying dynamic performance management to
Foster collaborative governance in higher education: A conceptual framework. In C. Bianchi,
L. Luna-Reyes, & E. Rich (Eds.), Enabling collaborative governance through systems modeling
methods (pp. 317–333). Springer.
Chapter 5
Conclusions

This book addressed the issue of organizational and inter-organizational complexity in HEIs by proposing and exploring a systemic performance management perspective. This study retraced a decade of research in this field, thus providing a more comprehensive and homogeneous contribution to the ongoing heated debate on HE policy and management.
Tracing their evolutionary pathway, the research framed HEIs as complex, fast-evolving organizational systems that play a fundamental role in society through their multiple interrelated activities and values. Such a role—defined to some extent
as “entrepreneurial”—makes them responsible for contributing to the socio-
economic development of their regional area. Other community actors and stake-
holders share the same responsibility, thus giving rise to collaborative performance routines within a broader academic value creation process. These routines lead HEIs and
their regional partners to jointly make decisions on crucial socio-economic aspects
by clarifying goals, exchanging performance information, evaluating progress, and
exploring actions. As remarked by several scholars, these broader governance
structures further increase the complexity of managing HEIs due to critical issues
such as lack of strategic coordination, fragmentation in public service delivery,
unclear organizational boundaries, and information asymmetry.
The book argued that increased complexity and the relevance of this role imply
greater challenges in terms of performance management and governance for those HEIs aiming to compete successfully from a sustainability perspective. This
book suggested the adoption of a Dynamic Performance Management (DPM)
approach to frame this specific organizational and inter-organizational complexity
of HEIs by providing a systemic view of value creation processes and multi-actor
performance governance.
The proposed method draws on the integration of conventional performance management with System Dynamics modeling. By adopting a systemic design perspective, this combined approach facilitates strategic learning processes in Universities by endowing academic decision-makers with deeper cognitive and informational support. A systemic perspective helps identify and investigate the causal relationships among the forces influencing academic performance within a consistent performance management framework.
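To make this systemic view more tangible, the causal structure of a DPM chart can be encoded as a set of signed links among strategic resources, performance drivers, and end results. The short Python sketch below is purely illustrative: the variable names, link polarities, and resulting loop classifications are assumptions chosen for exposition, not a model proposed in this book.

# Illustrative only: a qualitative causal structure among academic
# performance variables, encoded as signed links ("+" = change in the
# same direction, "-" = change in the opposite direction).
causal_links = {
    ("research_output", "academic_reputation"): "+",
    ("academic_reputation", "student_enrollment"): "+",
    ("student_enrollment", "available_funds"): "+",
    ("available_funds", "research_output"): "+",   # closes a reinforcing loop
    ("student_enrollment", "teaching_load"): "+",
    ("teaching_load", "research_output"): "-",     # closes a balancing loop
}

def loop_polarity(cycle):
    """A feedback loop is reinforcing if it contains an even number of
    negative links, balancing otherwise."""
    signs = [causal_links[(cycle[i], cycle[(i + 1) % len(cycle)])]
             for i in range(len(cycle))]
    return "reinforcing" if signs.count("-") % 2 == 0 else "balancing"

print(loop_polarity(["research_output", "academic_reputation",
                     "student_enrollment", "available_funds"]))  # reinforcing
print(loop_polarity(["research_output", "academic_reputation",
                     "student_enrollment", "teaching_load"]))    # balancing

Even at this qualitative stage, making loop polarities explicit helps groups of academic actors agree on which feedback mechanisms they believe drive performance before any number is attached to them.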
In particular, qualitative DPM modeling supports joint model building, commu-
nication, and shared understanding of the organizational system’s structure among
University administrators and other key actors at all governance levels. As such, it
enables academic stakeholders to develop consensus and accountability around the
policies and associated actions for pursuing sustainable development goals. The variables of the emerging models and their causal interplays are then quantified, thus generating alternative simulation scenarios and allowing decision-makers to monitor (un)desired effects under different assumptions and across alternative decisions. This approach applies to both the organizational and inter-organizational dynamics of HEIs. Throughout its chapters, the book showed and discussed a wide variety of illustrative examples, cases, and
practices to investigate how DPM works in academic settings.
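As a stylized illustration of this quantification step, the sketch below turns a causal structure like the one above into a small stock-and-flow simulation (simple Euler integration, in the System Dynamics tradition) and compares two policy scenarios. All equations, parameter values, and the "outreach effort" lever are hypothetical assumptions introduced for exposition, not a calibrated model of any HEI.

# A minimal, hypothetical stock-and-flow model: two stocks (reputation,
# enrollment) linked by the feedback structure sketched earlier and
# integrated with a simple Euler scheme. All parameters are illustrative.
def simulate(outreach_effort, years=10, dt=0.25):
    reputation = 50.0      # stock: academic reputation (index points)
    enrollment = 1000.0    # stock: enrolled students
    for _ in range(int(years / dt)):
        # auxiliary: research output rises with enrollment-driven resources,
        # amplified by the third-mission outreach policy lever
        research_output = 0.02 * enrollment * (1.0 + outreach_effort)
        # flows: reputation builds from research and erodes over time;
        # enrollment adjusts toward a reputation-driven target
        reputation_change = 0.1 * research_output - 0.05 * reputation
        enrollment_change = 0.3 * (20.0 * reputation - enrollment)
        reputation += reputation_change * dt
        enrollment += enrollment_change * dt
    return reputation, enrollment

# compare two hypothetical policy scenarios over a 10-year horizon
for label, effort in [("baseline", 0.0), ("stronger outreach", 0.5)]:
    rep, enr = simulate(effort)
    print(f"{label}: reputation={rep:.1f}, enrollment={enr:.0f}")

Running the scenarios side by side makes the (un)desired effects of a policy lever visible before any real-world commitment, which is precisely the strategic learning role that quantified DPM models are meant to play.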
The book contended that DPM does not claim to solve the critical issues, dysfunctions, and challenges characterizing performance management and governance in Universities. Rather, this method provides academic decision-makers with interpretive lenses to better understand and explore the systemic structure of the complex organizational system that drives performance over time. For this reason, these interpretive lenses foster a systematic strategic learning process among key academic actors, thus offering them shared decision support to improve organizational performance, multilevel coordination, and collaborative governance. Under
these premises, this approach promotes a shared use of performance information not
only within HEIs but also between HEIs and their regional stakeholders, coordinat-
ing joint efforts to produce sustainable community outcomes.
This book hopes to create momentum for the further study of performance management and collaborative governance in HEIs from a sustainable development perspective. Future research should continue to explore how the role
of HEIs will evolve worldwide and how performance management and governance
systems will adapt to emerging challenges over time. Developing a better under-
standing of this evolution, its complexity, and associated challenges is crucial for
both scholars and practitioners seeking to support HEIs in generating more sustain-
able outcomes in their socio-economic contexts.
