
Medical Teacher
ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/imte20

Continuous enhancement of educational quality – fostering a quality culture: AMEE Guide No. 147

Renée E. Stalmeijer, Jill R. D. Whittingham, Guy W. G. Bendermacher, Ineke H. A. P. Wolfhagen, Diana H. J. M. Dolmans & Carolin Sehlbach

To cite this article: Renée E. Stalmeijer, Jill R. D. Whittingham, Guy W. G. Bendermacher, Ineke
H. A. P. Wolfhagen, Diana H. J. M. Dolmans & Carolin Sehlbach (2023) Continuous enhancement
of educational quality – fostering a quality culture: AMEE Guide No. 147, Medical Teacher, 45:1,
6-16, DOI: 10.1080/0142159X.2022.2057285

To link to this article: https://doi.org/10.1080/0142159X.2022.2057285

© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group. Published online: 25 Apr 2022.


MEDICAL TEACHER
2023, VOL. 45, NO. 1, 6–16
https://doi.org/10.1080/0142159X.2022.2057285

AMEE GUIDE

Continuous enhancement of educational quality – fostering a quality culture: AMEE Guide No. 147

Renée E. Stalmeijer, Jill R. D. Whittingham, Guy W. G. Bendermacher, Ineke H. A. P. Wolfhagen, Diana H. J. M. Dolmans and Carolin Sehlbach
School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, Maastricht, the Netherlands

ABSTRACT
Internal quality assurance (IQA) is one of the core support systems on which schools in the health professions rely to ensure the quality of their educational processes. Through IQA they demonstrate being in control of their educational quality to accrediting bodies and continuously improve and enhance their educational programmes. Although its need is acknowledged by all stakeholders, creating a system of quality assurance has often led to establishing a ‘tick-box’ exercise overly focusing on quality control while neglecting quality improvement and enhancement. This AMEE Guide uses the concept of quality culture to describe the various dimensions that need to be addressed to move beyond the tick-box exercise. Quality culture can be defined as an organisational culture which consists of a structural/managerial aspect and a cultural/psychological aspect. As such this AMEE Guide addresses tools and processes to further an educational quality culture while also addressing ways in which individual and collective awareness of and commitment to educational quality can be fostered. By using cases within health professions education of both formal and informal learning settings, examples will be provided of how the diverse dimensions of a quality culture can be addressed in practice.

KEYWORDS
Internal quality assurance; quality culture; programme evaluation

Introduction

Practice points
• Fostering a quality culture requires attention to structural/managerial aspects and cultural/psychological aspects.
• Different stakeholders may have a different take on ‘what the goal of internal quality assurance is’ and ‘what quality is’. Addressing these questions will help to provide guidance to quality assurance processes.
• Addressing the structural/managerial aspect of a quality culture requires a cyclical process in which responsibilities are clearly defined, evaluation instruments are informed by educational theory, and all relevant stakeholders get a voice.
• Addressing the cultural/psychological aspect of a quality culture requires leadership, faculty development, enabling Communities of Practice, reflection and dialogue.

Aims of internal quality assurance

Internal quality assurance (IQA) can be described as the set of activities and processes implemented by educational organisations to control, monitor, improve and enhance educational quality. IQA is not only essential to the day-to-day practices of educational management and organisation, but also provides evidence towards accrediting bodies that the educational practices within an organisation are up to standard (i.e. external quality assurance). This AMEE Guide specifically focuses on IQA practices.

Many different concepts are used when discussing the aims of IQA within educational organisations. These concepts are often used interchangeably without noting the different connotations and definitions attached to them. When the aim of IQA is quality control, educational organisations want to check whether the outcomes of an educational programme conform to predetermined standards (Harvey 2004–2021). Monitoring educational quality refers more to the procedures that an educational organisation has in place to ensure the quality of the education provided (Harvey 2004–2021). Quality improvement often refers to processes in place to ensure that something is ‘up to standard’ (where it previously was below a certain standard) (Williams 2016). Williams (2016) suggests that the term quality enhancement describes the deliberate and continuous process of augmenting students’ learning experiences (QAA 2003). Quality control, monitoring, improvement and enhancement are all important goals of IQA. However, each concept comes with its own standards and processes. Therefore, an educational organisation needs to consider which goals of IQA it is striving towards and whether the right processes are in place to attain them.

CONTACT Renée E. Stalmeijer r.stalmeijer@maastrichtuniversity.nl Universiteitssingel 60, 6229 ER Maastricht, the Netherlands
© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.

Figure 1. Quality culture practices. [Diagram: ‘Quality Culture’ combines a structural/managerial aspect (design and triangulate instruments; involve multiple stakeholders; ensure a cyclic evaluation process) and a cultural/psychological aspect (stimulate dialogue and reflection amongst stakeholders; offer faculty development and CoP; invest in leadership), joined by ‘discuss and determine the goals and values underlying IQA’.]

Adverse effects of good intentions – seeking balance

‘Being in control’ of educational quality and having a sound IQA system are important areas of attention for accreditation bodies (EUA 2009). As a consequence, however, quality assurance runs the risk of being considered a ‘beast that requires feeding’: a list of bureaucratic boxes that needs to be ticked (Newton 2000). Harvey and Stensaker (2008) signalled that the continuous generation of quality control data has often been disconnected from staff and students’ needs, which actually inhibited educational improvement and enhancement. Ultimately, both quality control and improvement are needed, as also voiced by teachers participating in a study by Kleijnen et al. (2014) who acknowledged that enhancement should be at the core of IQA without neglecting the importance of monitoring educational quality. For brevity purposes, this guide will from here on out speak of monitoring and enhancement, where monitoring covers both the control and monitoring goals, and enhancement covers the improvement and enhancement goals of IQA.

Building a quality culture

The European University Association (EUA) coined the concept of quality culture as ‘an organisational culture that intends to enhance quality permanently and is characterised by two distinct elements: a cultural/psychological element of shared values, beliefs, expectations and commitment concerning quality and a structural/managerial element with defined processes that enhance quality and aim at coordinating individual efforts’ (EUA 2006, p. 10). The notion of quality culture captures the intention to nurture shared educational values, flexibility, openness, a sense of ownership, and a collective commitment – alongside IQA systems and processes (Sursock 2011; Bendermacher et al. 2020). A quality culture is considered to help coordinate individual improvement efforts, shape mutual expectations, and stimulate a collective responsibility (EUA 2006). It is through a created synergy between these elements and evaluation programmes that continuous educational enhancement becomes embedded in everyday teaching practice (Ehlers 2009; Blouin 2019).

This AMEE guide

An earlier AMEE Guide, number 29 (Goldie 2006), looked into the history of programme evaluation and described various sets of questions one needs to ask when designing a system of evaluation, including ethical questions. Guide number 67 (Frye and Hemmer 2012) built on the suggestions by Goldie and provided an overview of common evaluation models which can be used to determine whether an educational programme (ranging from the level of course to curriculum) has brought about change, i.e. focusing on the outcomes of an educational programme.

This AMEE Guide focuses on building a comprehensive approach to IQA of education by using the concept of quality culture (EUA 2006; Ehlers 2009; Blouin 2019). We describe practices (see Figure 1) that educational organisations can use to optimise the processes and systems informing IQA while simultaneously fostering the awareness of and commitment to continuous enhancement of educational quality (Bendermacher et al. 2017). Throughout the following paragraphs, we will use cases from the field of health professions education to illustrate how quality culture development in health professions education can be nurtured. It is important to note that the practices we describe are not hierarchical in nature and that there is no real linearity to the process of employing them. Defining and creating an organisational quality culture requires simultaneous attention to all aspects.

Our perspective – reflexivity

This guide is written from the perspective of an expert group on IQA at Maastricht University, Faculty of Health, Medicine and Life Sciences (FHML), entitled ‘task force programme evaluation’. The task force is, in collaboration with relevant stakeholders, responsible for monitoring and enhancing the educational quality of all educational programmes within FHML. This responsibility is executed by (1) ensuring valid and reliable data collection through standardized instruments, (2) executing in-depth studies on educational quality through a mix of quantitative and qualitative research methodologies, (3) providing faculty development on educational quality assurance to relevant stakeholders, and (4) stimulating a dialogue towards continuous quality enhancement of the programmes within FHML. The authors of this AMEE Guide are all researchers within the School of Health Professions Education and have backgrounds in educational sciences (RS, IW, DD), educational psychology (JW), health sciences (CS), and

policy and management (GB). In addition, one of the authors (GB) is a policy advisor who specializes specifically in the concept of quality culture and, through his research, has further explored the concept of quality culture. We are aware that not all health professions education schools employ similar task forces and that sometimes the task of programme evaluation may rest on the shoulders of a single individual. Nevertheless, we hope that our experiences, research and insights provide the necessary guidance and support to anyone involved in educational quality assurance.

Fostering a quality culture – what are we striving for?

Discuss and determine underlying goals and values

What is quality?
When considering what goals IQA has, an important question to address is ‘what do we mean when we say quality?’; what is often forgotten is that quality lies in the eye of the beholder. This may result in multiple stakeholders within a single organisation having different conceptions when it comes to the question ‘what is quality?’. In their seminal work Defining Quality, Harvey and Green (1993) discerned five different conceptions of quality: excellence, perfection/consistency, fit-for-purpose, value for money, and transformation. Seeing quality as striving for ‘fit-for-purpose’ requires, for example, ensuring that educational goals are attained and that graduates pass a certain standard. Through the lens of quality as transformation, however, IQA would be striving to determine to what extent ‘value has been added’ to the student and that students have been empowered. Although it is outside of the scope of this AMEE Guide to go into depth on each definition, we encourage readers to familiarise themselves with these conceptions of quality. Familiarity with these conceptions will provide a vocabulary through which stakeholders within an educational organisation can discuss which definition(s) they value and foreground. We believe that educational organisations will always be aiming to attain different types of quality simultaneously. Imagine ensuring that the finances of a programme are in order versus ensuring the ranking of an educational organisation. Therefore, we strongly advise readers to discuss with relevant stakeholders in their own organisation what quality means to them. Because, just like defining the aims of IQA within your organisation, the conception(s) of quality that are held will provide direction and guidance on how to design instruments and procedures for IQA, which stakeholders to involve and how to shape a quality culture. Misalignment in perceptions about goals of IQA and conceptions of quality may thwart the process of building a fruitful quality culture, as the processes that inform quality as fit-for-purpose are different to those informing quality as transformation.

Which quality culture?
It is important to realise that a quality culture is not something that institutions have to develop from scratch and also that there is no such thing as ‘the ideal’ quality culture. That is, each institution already possesses a certain kind of quality culture which determines how that institution reacts to external developments and attempts to foster internal collaborations (Bendermacher 2021). Moreover, a quality culture is influenced by the organisation’s context and its developmental phase in dealing with quality management (Harvey and Stensaker 2008). For instance, educational organisations who are in the early stages of setting up their IQA system would probably benefit most from first focusing on monitoring and controlling educational quality (i.e. to install or fine-tune procedures and standards). Those organisations who already have established a robust system which monitors whether minimum criteria are met can shift their focus towards continuous enhancement while addressing changes in the organisation’s environment. Reflection exercises like discussing the Organisational Culture Assessment Index (see Box 1) can aid stakeholders to disclose their core values. This discussion can help shape strategies and solutions that lead to a better alignment between these values and organisational procedures (Berings and Grieten 2012). Such reflections and dialogues can also help to alter the emphasis from ‘whether things are being done well’ to ‘whether the right things are being done’ (Cartwright 2007).

Building a quality culture – structural & managerial components

No system of IQA can exist without a regular influx of data that helps to address the goals of IQA and provides input for stakeholders to monitor and enhance the programme’s quality. To ensure that this system contributes to strengthening the quality culture, the following aspects need to be considered: the design and triangulation of instruments, involvement of stakeholders, and the cyclical nature of IQA.

Design & triangulation of instruments

Incorporating educational principles in design
IQA should be geared towards the key components that make up a curriculum, such as educational activities or courses, and should build evaluation instruments firmly grounded in theories and the literature on learning and teaching (Frick et al. 2010). In other words, the items of an evaluation instrument must be derived from what the literature has shown determines the quality of an educational programme. If the design of IQA instruments is based on these educational principles, this will augment the chances of the outcomes of data collection being actionable by the different stakeholders (Bowden and Marton 1999; Dolmans et al. 2011). Educational principles can be defined as a way of thinking about how and why an educational program is effective (Figure 2). Depending on the nature of the curriculum or the programme you want to evaluate, multiple educational principles might apply. In Box 2, we provide two examples: (1) principles of constructivist learning to evaluate problem-based learning, and (2) cognitive apprenticeship to evaluate clinical teaching.

In addition to the fact that educational principles are important to keep in mind when designing an instrument, it is essential to also consider the fact that education is highly context-specific, i.e. what might work, based on theoretical principles, might differ depending on the particular

Box 1. The organisational culture assessment index

A well-known questionnaire to conceptualise organisational values is the Organisational Culture Assessment Index (OCAI). The OCAI is based on a competing values framework which includes two dimensions. The first dimension encompasses an internal versus an external orientation. The second dimension concerns a control versus a flexibility orientation. Organisations can strive to invest in different orientations at the same time: to be structured and stable (internal/control orientation), to be a collaborative community (internal/flexible orientation), to be proactive and innovative (external/flexible orientation), and to be goal-oriented and efficient (external/control orientation). The different combinations of value orientations result in four organisational culture models which reflect the means and ends an organisation can target (the internal process, human relations, open system, and rational goal models). The OCAI sketches both the current and the preferred value orientation of organisational members and therewith can be used to identify the direction in which the organisation needs to develop (Cameron and Quinn 1999).
Figure 2. Competing values model (Kleijnen 2012 – based on Quinn and Rohrbaugh 1983).

Box 2. Educational principles translated to IQA instrument-design


Evaluating Problem Based Learning (PBL) – Constructivist Learning Principles
At FHML, we apply a PBL approach, which fits well with four educational principles: contextual, constructive, collaborative, and self-directed learning
(Dolmans 2019). These principles are aligned with the vision of our institute in which it is stated that our educational programs are perceived to
have high quality if they are well aligned with these four educational principles. These principles have been defined in the educational literature,
empirically tested and are known to be effective in educational practice. Therefore, we have used the educational principles underlying PBL as the
backbone for a standardized questionnaire used to evaluate all Bachelor courses at FHML. This questionnaire is administered after each course and
filled out by students who followed the course. Results are reported back to all stakeholders involved.

Example items (Likert scale 1 = fully disagree; 5 = fully agree) are:


 Contextual: The course encouraged me to apply what I learned to other cases
 Constructive: The educational activities encouraged me to actively engage with the course content
 Collaborative: The educational activities stimulated collaboration with fellow students
 Self-directed: I was encouraged to study various resources during the course
Providing feedback to clinical teachers – Cognitive Apprenticeship
Another example is an approach we took to evaluating clinical teaching during clinical rotations by building on the teaching methods as described
in Cognitive Apprenticeship (Collins et al. 1989). In order to provide feedback to individual physicians tasked with clinical teaching, we constructed
items out of the teaching methods (modelling, coaching, scaffolding, articulation, reflection, exploration) which could be filled out by medical students
(Stalmeijer et al. 2008; 2010b). The feedback produced by this instrument was thought to be highly relevant and actionable by clinical teachers
(Stalmeijer et al. 2010a; 2010b).
Example items (Likert scale 1 = fully disagree; 5 = fully agree) are, The clinical teacher … :
 Modelling: … created sufficient opportunities for me to observe them
 Coaching: … gave useful feedback during or immediately after direct observation of my patient encounters
 Articulation: … asked me questions aimed at increasing my understanding
 Exploration: … encouraged me to formulate learning goals
For the full instrument see: Stalmeijer et al. (2010a).

aims and target groups as well as contextual differences (Hodges et al. 2009; Kikukawa et al. 2017). As a consequence, the design of instruments, which is preferably based on theoretical guidelines, remains a matter of customisation. It is important to continuously adapt and align your instruments, depending on what your aims are and the context in which the instruments are being used.

Triangulation of instruments
Depending on the aims of IQA, the aspects of the programme you are trying to evaluate and the stakeholders whose input you need, requirements for instrument design change. When the primary aim is to control and monitor quality for accountability purposes, collecting valid and reliable data for a standardised set of factors from the majority of the population takes precedence (Dolmans et al. 2011). This goal warrants a quantitative approach to data collection. When the focus is on improving and enhancing educational quality, the information that is needed should be rich, descriptive and more qualitative in nature so that clear and specific input can be found on how to improve and enhance education. Since educational quality is multidimensional and learning environments are complex, it is recommended to combine both quantitative and qualitative methods to measure it (Cashin 1999). Questionnaires, interviews, classroom observations, document analysis and focus groups are all commonly used (Braskamp and Ory 1994; Seldin 1999). For instance, one could first evaluate a course by administering a large-scale quantitative student questionnaire to bring into focus the course’s overall strengths and weaknesses. Subsequent qualitative focus groups can then elaborate on these data and place them in a clearer perspective. Building a quality culture requires triangulation of different instruments and procedures to provide a holistic overview of the quality of an educational programme and to collect data that may inform either monitoring or enhancement purposes.

Involving all relevant stakeholders

Identifying stakeholders
When designing a system of continuous quality enhancement, various stakeholders need to be involved. Since ‘quality is in the eye of the beholder’ and we need to consider the context in which evaluations take place, each stakeholder should be invited to add their perspective on educational quality to render a holistic programme of evaluation (Harvey and Stensaker 2008). To identify stakeholders one may consider questions like ‘Who are the recipients of this education?’, ‘Who are involved in teaching and managing this education?’ and ‘Who can judge the effectiveness of this education?’. Once relevant stakeholders are identified, the question arises of how and through which instruments they can be involved and what aspects and level of education they can best evaluate, followed by the question of the best timing of involving and approaching them. Depending on context and timing, each stakeholder might have their own situated knowledge and lived experience, as well as a contribution to offer in the process of enhancing educational quality (Kikukawa et al. 2021). All these questions need to be considered in the light of the local context in which stakeholders will be involved, as appropriate involvement may take different shapes depending on institutional, accreditation or even cultural standards.

Involving key stakeholders
Students have traditionally been subjected to course evaluations, for, as the consumers of education, they are considered to have a crucial stake in the QA process (Coates 2005; Marsh 2007). As ‘experienced experts’, they can provide detailed information on distinct educational entities such as a course, workshop or teachers’ performance. This data can be collected both quantitatively through validated questionnaires and qualitatively through interviews and focus groups. Although different data might be generated (numeric and narrative, generic and specific), as a whole, involving students’ perspectives can provide insight into the way the curriculum is put into practice compared to how it has been designed, including perceived strong points and points for improvement. To strengthen a quality culture, students can also be involved in education and institutional decision-making. They can, for instance, serve on a student evaluation committee (SEC) (Stalmeijer et al. 2016) or, as student representatives, join faculty and programme coordinators on management bodies (Elassy 2013; Healey et al. 2016). These students often have already gained content expertise, developed a shared language with faculty, and learned how to overcome potential power relations. There is a strong movement towards more deliberate involvement of students in the design, implementation and decision-making regarding educational practices (e.g. Bovill et al. 2016). This is reflected in different approaches to student participation like design-based research, participatory design, co-creation, co-design, student voice, student–staff partnership, students as change agents, student engagement, and student empowerment (Seale 2009; Anderson and Shattuck 2012; Bovill et al. 2016). By intensifying the active engagement of students in the educational (design) processes, the aim is to simultaneously improve teaching and learning of both faculty and students (Bovill et al. 2016). It is important to stress that this approach goes beyond just listening to student voices; the focus is on empowering students to actively collaborate with teachers and educational management (Bovill et al. 2011). Through this approach students and staff can form partnerships, co-create, and (re-)design education (Martens et al. 2019; 2020).

Not only students’ input can be insightful: inviting faculty into the evaluation process can give great insight into practical teaching and organizational aspects and simultaneously increase buy-in and commitment. While faculty are mostly the users of evaluation data to improve courses and training programmes, they can also become evaluation respondents and thereby contribute to IQA. Teaching staff could also be included structurally during or at the end of an educational activity to evaluate observed pitfalls or organizational challenges. As an example, consider involving PBL tutors in the evaluation of PBL cases, lectures and workshop providers on the alignment of their educational activities, or mentors in fulfilling their roles in programmatic assessment (see Box 3). On a meta-level, faculty can also be engaged in the evaluation of programme-evaluation

procedures. Considering the absence of formal evaluation data on course coordinators’ performance, course coordinators could actively invite faculty to engage in feedback dialogues and reflection or debriefing sessions. Moreover, by empowering teaching staff to co-design (part of the) IQA approaches, their involvement in these practices and their required follow-up might be further increased (Bendermacher 2021).

The future employers of students are also important stakeholders in IQA: how do representatives from the work field feel about the graduates? Are students joining the work field well-prepared and equipped with the required skills, knowledge and attitudes? Does the curriculum reflect the needs of the workforce? Employers could be consulted ad hoc, in dialogue or through questionnaires, when designing a new curriculum, or on a more regular basis to discuss students’ performance during internships or curriculum or alumni evaluations.

Alumni can provide valuable information through evaluation questionnaires about areas they missed in the programme and the extent to which the competences taught and trained have helped them in their career (choice). This can be particularly insightful when facing a curriculum redesign, indicating that involvement of alumni is not required on a structural level.

In a similar vein, clients or patients can shed light on students’ preparedness to enter the field or on educational quality improvement from their perspective (Romme et al. 2020). This can either be in the context of designing education but also through inviting their feedback on students’ performance in a simulated setting or during clinical rotations. This, however, requires the development and use of validated questionnaires, awareness of power dynamics and respect for individual patient preferences concerning the provision of performance evaluation (Sehlbach et al. 2020).

Box 3. Evaluating the experience of being a mentor in programmatic assessment
Multiple-Role Mentoring (Meeuwissen et al. 2019)
The Master in Medicine at FHML is designed according to principles of competency-based education and programmatic assessment. Students’ learning is supported by physician-mentors whose role is to both coach the student and provide advice on their assessment. Using semi-structured interviews, the experience of the mentors with ‘multiple-role mentoring’ was explored. Based on the results it was concluded that mentors chose different approaches to fulfilling their mentor role and sometimes struggled. The results gave rise to revision of the faculty development directed towards mentors.

A cyclical approach to IQA – systematic, structural and integrated

To ensure that the data generated by IQA practices result in continuous improvement of educational quality, IQA requires a cyclic process characterised by practices that are (1) systematic, (2) structural and (3) integrated (Dolmans et al. 2003) (Figure 3).

Systematic implies that all important educational design elements of a curriculum are addressed and covered by a variety of evaluation instruments and procedures (Dolmans et al. 2003). Systematic also applies to the involvement of stakeholders. For example, evaluating a PBL curriculum requires information about various aspects of each course: lectures, practicals, tutorial group meetings, the quality of PBL tutors, the quality of the assessment and the performance of students. This information should be provided by the stakeholders involved in the course: students, lecturers, teachers, PBL tutors, course coordinators.

Structural points to the importance of periodic evaluation at regular intervals (Dolmans et al. 2003). The frequency of these intervals should be determined based on the importance of the aspect being evaluated (Figure 3). For instance, if after two academic years the course quality has stabilised, one may decide to evaluate parts of the programme in depth only once in 2 years to avoid evaluation fatigue (Svinicki and Marilla 2001). It is important that these practices are clear and agreed upon. Policy documents should communicate the different activities (e.g. questionnaire, focus groups etc.), purposes (e.g. monitor versus enhance), frequency of activities (e.g. regular intervals, avoid ad-hoc evaluations), levels of curriculum (e.g. course level versus curriculum level) and relevant stakeholders (e.g. responsibilities) involved in IQA.

Finally, IQA practices should be integrated, meaning that relevant stakeholders are aware of and give shape to their responsibilities within monitoring and enhancing educational

Systemac

Measure

Improve Judge

Integrated Structural
Figure 3. Cyclic process of IQA – Inspired by Dolmans et al. 2003.
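The importance-weighted scheduling described above (evaluating stabilised courses in depth only once every two years, unstable ones every year) reduces to a simple rule that can be written down explicitly. Below is a minimal, hypothetical sketch; the course names, the stabilised flag and the function are invented for illustration and do not describe FHML's actual tooling:

```python
# Hypothetical sketch: deciding which courses are due for an in-depth
# evaluation this year, following the principle that stabilised courses
# are evaluated in depth only once every two years (to avoid evaluation
# fatigue), while courses whose quality has not stabilised are reviewed
# every year.

def courses_due_for_in_depth_review(courses, current_year):
    """Return names of courses due for in-depth evaluation in current_year."""
    due = []
    for course in courses:
        if not course["stabilised"]:
            # Quality not yet stable: keep evaluating in depth every year.
            due.append(course["name"])
        elif current_year - course["last_in_depth"] >= 2:
            # Stabilised: an in-depth review only once every two years.
            due.append(course["name"])
    return due

courses = [
    {"name": "Anatomy", "stabilised": True, "last_in_depth": 2021},
    {"name": "PBL Tutorials", "stabilised": False, "last_in_depth": 2022},
    {"name": "Clinical Skills", "stabilised": True, "last_in_depth": 2022},
]

print(courses_due_for_in_depth_review(courses, 2023))
# → ['Anatomy', 'PBL Tutorials']
```

Writing the rule down like this is one way to make the "clear and agreed upon" scheduling policy explicit rather than ad hoc.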
12 R. E. STALMEIJER ET AL.

Box 4. Stimulating reflection through IQA – two examples

Providing relative evaluation data on quality of clerkships

Students doing their clerkships through FHML are asked to evaluate each clerkship using a questionnaire informed by theories on workplace learning (Strand et al. 2013). As part of the report, clerkship coordinators and affiliated hospitals are presented with two sets of relative evaluation data: comparison between clerkships provided within the same hospital and comparison between the same clerkship provided by different affiliated hospitals.
Experience tells us that especially this contrasting data helps clerkship coordinators get a sense of how they perform and where to improve.

Comparison clerkships within one hospital (Hospital X in Y)

Rotation            Organisation      Learning effect   Supervision       Learning climate
                    N    M    SD      N    M    SD      N    M    SD      N    M    SD
Internal Medicine   132  7.5  1.3     132  7.8  1.3     132  7.5  1.8     132  7.8  1.6
Surgery             137  7.6  1.4     137  8.2  1.4     137  7.6  1.6     137  8.1  1.4
Paediatrics         36   7.8  1.4     36   8.4  1.0     36   8.1  1.3     36   8.0  1.2
Etc.
1 = very poor; 10 = outstanding

Comparison clerkship between hospitals (Internal Medicine rotations, affiliated hospitals FHML)

Location            Organisation      Learning effect   Supervision       Learning climate
                    N    M    SD      N    M    SD      N    M    SD      N    M    SD
Location A          130  7.6  1.3     130  7.8  1.4     130  7.7  1.6     130  7.8  1.4
Location B          119  7.5  1.0     119  8.0  1.1     119  7.1  1.5     119  7.6  1.5
Location C          89   7.3  1.5     89   7.8  1.4     89   7.7  1.4     89   7.8  1.5
Etc.
1 = very poor; 10 = outstanding
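A report of this kind boils down to grouping raw ratings and computing N, mean and standard deviation per rotation so that scores can be contrasted. The following is a minimal sketch with made-up ratings on the 1–10 scale used above; the function name and data are invented for illustration and this is not the FHML reporting pipeline:

```python
# Hypothetical sketch: computing the N / mean / SD cells of a relative
# evaluation report from raw student ratings, grouped by rotation.

from statistics import mean, stdev

def summarise(ratings_by_rotation):
    """Map each rotation to (N, mean, sample SD) of its ratings (1-10 scale)."""
    return {
        rotation: (len(scores), round(mean(scores), 1), round(stdev(scores), 1))
        for rotation, scores in ratings_by_rotation.items()
    }

# Invented example data: supervision ratings for two rotations in one hospital.
supervision_ratings = {
    "Internal Medicine": [7, 8, 9, 6, 8, 7],
    "Surgery": [8, 8, 7, 9, 8, 9],
}

for rotation, (n, m, sd) in summarise(supervision_ratings).items():
    print(f"{rotation:20s} N={n}  M={m}  SD={sd}")
```

Placing such summaries for several rotations (or several hospitals) side by side yields exactly the contrasting view that, in our experience, prompts coordinators to reflect on where to improve.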

quality (Dolmans et al. 2003). Educational organisations should enable active involvement of stakeholders by giving them a voice in the IQA process. Furthermore, organisations need to ensure that improvement plans are developed, implemented and evaluated, and that this process is discussed regularly and transparently. Overall, when IQA activities are integrated in the organisation's regular work patterns, this will contribute to a continuous cyclical process and continuous enhancement of educational quality.

Building a quality culture – cultural & psychological components

If we want to ensure that the data collected for IQA purposes will indeed feed into a quality culture in which all stakeholders are aware of and feel commitment towards continuously enhancing educational quality, several components require attention: the extent to which reflection and dialogue are stimulated, supporting stakeholders through faculty development and enabling communities of practice, and leadership that fosters a quality culture.

Stimulating reflection and dialogue regarding educational quality

Stimulating reflection
Having data on educational quality does not automatically lead to enhancement of educational quality (Richardson and Placier 2001; Hashweh 2003). Although grounding the design of your evaluation in theoretical principles will generate rich data that is theoretically grounded, context-specific and more actionable (Bowden and Marton 1999), research also demonstrates that the recipients of that data may need help translating the data into activities that will enhance educational quality (Stalmeijer et al. 2010b; Boerboom et al. 2015; van Lierop et al. 2018). There may be a long road between paper and action. Translating data into action may be stimulated in different ways. Action may be spurred by providing contrasting evaluation data. For example, when reporting evaluation results of a course, one can consider reporting relative evaluation data (e.g. comparison with other courses, or with the course results of the previous year) next to the evaluation results of the course itself (see Box 4). Similarly, in the case of providing feedback to individual teachers, asking teachers to fill out a self-assessment and presenting these results next to actual student evaluations could provide the teacher in question with added insight into areas to improve. Clinical teachers interviewed about the effect of this self-assessment indicated that especially negative discrepancies between their self-assessment and the evaluations of students were experienced as a strong impetus for change (Stalmeijer et al. 2010b). However, in the same study, clinical teachers indicated needing help to translate certain aspects of the feedback into action. For example, they had a hard time helping students to formulate learning goals and requested additional coaching. Furthermore, there is evidence that evaluation data may cause emotional reactions like denial or defensiveness (DeNisi and Kluger 2000; Sargeant et al. 2008; Overeem et al. 2009), making discussion of the data essential. van Lierop et al. (2018) experimented with coaching for clinical teachers by introducing peer group reflection meetings in which clinical teachers would discuss their self-assessments and student evaluations. The study found that the peer group reflection meetings assisted clinical teachers in formulating plans for improvement.

Dialogue
To ensure continuous enhancement of educational quality, it is important that the data which is being collected

informs improvement initiatives and is discussed by all layers and structures of the educational organisation. The organisational dialogue should be just as structurally embedded within the organisational process as quality assurance itself. Otherwise it may run the risk of becoming a jungle of information without appropriate steps being taken to act on the evaluation results. To do so, evaluation should be a recurring topic on the agenda at the different levels of the curriculum (e.g. course, clerkship, year, bachelor). This open dialogue about the evaluation activities should be informed by action plans formulated by coordinators at the different levels of the educational organisation. Implementation and evaluation of these action plans should be discussed regularly. Results must be discussed openly so that they can be used effectively to improve education continuously (Dolmans et al. 2011; Bendermacher et al. 2020). That is, avoid a judgemental, assessing way of discussing results and performances; instead, focus on constructive language supporting improvement. In this way, dialogue will enhance the use of evaluative data for improvement purposes (Kleijnen et al. 2014). Creating opportunities for this dialogue should take on a structural character, meaning that these discussions are organised with a set frequency and involve all relevant stakeholders. For example, course coordinators can discuss evaluation reports with teachers and student representatives during reflection sessions (van der Leeuw et al. 2013), formulate specific plans for action and attach a timeline to implementation and evaluation of these plans on a structural basis. On a larger scale, students can be made active owners of educational quality by presenting evaluation results and plans for improvement during (online) lectures (Griffin and Cook 2009). Making them part of the discussion will provide students with an extra incentive to participate in evaluation procedures if they see the results of their participation and are afforded the opportunity to be heard by faculty and exchange views (Griffin and Cook 2009; Healey et al. 2015).

Supporting stakeholders through faculty development and communities of practice

Faculty development on quality assurance
Providing faculty development for stakeholders actively involved within the process of quality assurance is another strategy that may be employed to ensure that data on educational quality can be effectively translated into enhancement of educational quality. Depending on the role that a certain (group of) stakeholder(s) has within the process of quality assurance, different aspects can be addressed during faculty development workshops. For example, at FHML we provide yearly workshops for members of the student evaluation committees (SEC) (Stalmeijer et al. 2016) and the educational programme committees (EPC). An SEC usually comprises 10–12 students, whose goal is to generate qualitative data on educational activities and to discuss outcomes with teaching teams. In a yearly workshop, members of the SEC are provided with guidelines on how to effectively fulfil their role (Box 5). Another yearly workshop is provided to members of the EPC. In the Netherlands, each university programme is mandated by law to have an EPC. An EPC consists of a mix of staff and student representatives and is tasked with overall monitoring of educational quality at the programme level. The EPC can give advice, either on request or on its own initiative, on education matters like examination regulation and its implementation, education budget, educational innovation, and the system of IQA. During the yearly workshop (see Box 5) EPC members are trained jointly and involved in an active discussion about their role in the process of educational quality assurance.

Box 5. Faculty development examples

Workshop topics for Student Evaluation Committee (SEC) members (Stalmeijer et al. 2016)
Next to highlighting the importance of the quality assurance cycle and the role of students therein, students learn how to evaluate specific educational activities of the curriculum, their alignment, practical organisation and their learning effect according to the PBL criteria. Students are then offered ample opportunity in the form of role play to practice how to provide effective feedback to course coordinators and members of the planning group.

Workshop topics for Educational Programme Committee (EPC) members
The following topics are incorporated in the training:
• Defining educational quality
• Defining educational quality assurance, its facets and determinants
• Monitoring vs enhancing educational quality
• Moving from quality assurance to quality enhancement
• Analysing a case related to EQA aimed at understanding what is required for a balanced quality assurance system
• Using the concept of Quality Culture to see how FHML can improve its EQA approaches

Communities of practice
A Community of Practice (CoP) can be defined as a 'persistent, sustaining, social network of individuals who share and develop an overlapping knowledge base, set of beliefs, values, history, and experiences focused on a common practice and/or mutual enterprise' (Barab et al. 2002, p. 495). Establishing communities of practice (CoPs) can further educational enhancement as they foster an exchange of expertise and ideas for innovation (de Carvalho-Filho et al. 2020). CoPs facilitate staff in gaining new perspectives and opportunities to improve, e.g. by means of sharing good practice. CoPs are specifically relevant for the nurturing of the psychological dimension of a quality culture; they form an environment in which the valuing of teaching and learning, reflection on daily work experiences and challenges, and teacher identity building is central (Cantillon et al. 2016). Constructive peer feedback processes in CoPs, and the offering of mutual professional and social support, help to balance the relation between developing a sense of ownership and feeling accountable for educational enhancement (Bendermacher et al. 2020). An open and longitudinal character of CoPs is essential to their success; they should go beyond the mere establishment of a temporary group of those already highly involved and committed to education. At FHML we organise several activities aimed at stimulating CoPs, like journal clubs, monthly meetings in which educational innovations are

presented and discussed, and a longitudinal leadership training for course coordinators in which participants are explicitly invited to share experiences with their colleagues. We recommend the twelve tips by Carvalho-Filho and colleagues (2020) for inspiration on how to implement a CoP for faculty development purposes.

Leadership that fosters a quality culture
To value the voice of students and staff members, educational leaders can make a difference by facilitating debate about the quality of education (Sursock 2011), but also by listening and being responsive to others within the organisation (Knight and Trowler 2000). A review conducted by Bland et al. indicated that, in addition to creating an open communication climate, favourable leadership behaviours for successful curriculum development concern assertive, participative and cultural/value-influencing behaviours (Bland et al. 2000). That is, successful leaders promote collaboration, share values in the light of the envisioned change, build trust and facilitate involvement (Bland et al. 2000).
Bendermacher et al. (2021) highlighted that in their efforts to nurture a quality culture, leaders acting at different levels within the organisation face various challenges. At the higher management level, leaders typically deal with 'political', 'strategic' and 'structural' issues, which concern the rules, policies, responsibilities and accountability needed for a quality culture to take root. Educational leadership entails a balancing of present structures and systems with staff values, and requires coalition building, negotiation and mediating for resources (Bolman and Deal 2003). To this end, educational leaders should work to build strong relationships and networks within the institution and stimulate collaboration and interaction (O'Sullivan and Irby 2011). Leaders working at the organisational meso or micro level appear to be more engaged in the direct improvement of educational content and focus more on facilitating team learning. Complex organisational structures and emerging trends of interdisciplinary collaboration in health professions education cause the influence of leaders to be exerted more and more in indirect ways (Meeuwissen et al. 2020; Bendermacher et al. 2021). The influence of leaders on educational quality enhancement is increasingly being manifested through shared, collaborative and distributed approaches (e.g. McKimm and Lieff 2013; Sundberg et al. 2017; Sandhu 2019). Hence, instead of attributing responsibility for educational quality enhancement to 'strong' leaders, leaders are expected to be motivators, mentors, and facilitators, and leadership in health professions education is changing from individual staff supervision, guidance, and support to a focus on the broader collective.
In the knowledge-intensive setting of health professions education institutes, educational leaders and other teaching staff members co-construct meaning and solutions to organisational issues (Tourish 2019). As leadership in health professions education is multi-layered, within the reality of (medical) school hierarchies one might be a leader and a follower at the same time (McKimm and O'Sullivan 2016). Moreover, instead of being mere leadership recipients, academics can steer the leadership of others and impact organisational developments through co-creation (Uhl-Bien et al. 2014).
Interventions that can aid educational leaders to make the most out of quality culture development and IQA include: training leaders to stimulate reflection, expertise, and knowledge sharing; teaching leaders to foster safety and trust relations in teacher teams; and strengthening leaders' situational sensitivity (Hill and Stephens 2005; Edmondson et al. 2007; Nordquist and Grigsby 2011). In order to continuously enhance, health professions education is best served with leaders who are able to combine multiple styles and who are able to coalesce different stakeholder goals and ambitions (Lieff and Albert 2010).

Conclusion
Using the concept of 'Quality Culture', this AMEE Guide has described various practices that can aid educational organisations in moving beyond the 'tick-box' exercises that IQA practices often evoke. We are not claiming that this is an easy task. Continuous enhancement of educational quality is a veritable team effort requiring the contribution of many. By addressing the practices needed to create the systematic/managerial and cultural/psychological aspects of a quality culture (see Figure 1), we hope this AMEE Guide provides inspiration and direction for all those with a stake in educational quality.

Disclosure statement
No potential conflict of interest was reported by the author(s).

Funding
The author(s) reported there is no funding associated with the work featured in this article.

Notes on contributors
Renée E. Stalmeijer, PhD, is an Assistant Professor at the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.
Jill R. D. Whittingham, PhD, is an Assistant Professor at the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.
Ineke H. A. P. Wolfhagen, PhD, is an Associate Professor at the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.
Diana H. J. M. Dolmans, PhD, is a Professor at the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.
Carolin Sehlbach, PhD, is an Assistant Professor at the School of Health Professions Education, Faculty of Health, Medicine and Life Sciences, Maastricht University, the Netherlands.

ORCID
Renée E. Stalmeijer http://orcid.org/0000-0001-8690-5326
Jill R. D. Whittingham http://orcid.org/0000-0001-9869-2719

Guy W. G. Bendermacher http://orcid.org/0000-0002-7804-4594
Diana H. J. M. Dolmans http://orcid.org/0000-0002-4802-1156
Carolin Sehlbach http://orcid.org/0000-0001-9732-1377

References

Anderson T, Shattuck J. 2012. Design-based research: a decade of progress in education research? Educational Researcher. 41(1):16–25.
Barab SA, Barnett M, Squire K. 2002. Developing an empirical account of a community of practice: characterizing the essential tensions. Journal of the Learning Sciences. 11(4):489–542.
Bendermacher GWG. 2021. Navigating from quality management to quality culture [dissertation]. Maastricht: Ipskamp Printing.
Bendermacher GWG, De Grave WS, Wolfhagen IHAP, Dolmans DHJM, Oude Egbrink MGA. 2020. Shaping a culture for continuous quality improvement in undergraduate medical education. Acad Med. 95(12):1913–1920.
Bendermacher GWG, Dolmans DHJM, de Grave WS, Wolfhagen IHAP, Oude Egbrink MGA. 2021. Advancing quality culture in health professions education: experiences and perspectives of educational leaders. Adv Health Sci Educ Theory Pract. 26(2):467–487.
Bendermacher GWG, Oude Egbrink MGA, Wolfhagen IHAP, Dolmans DHJM. 2017. Unravelling quality culture in higher education: a realist review. High Educ. 73(1):39–60.
Berings D, Grieten S. 2012. Dialectical reasoning around quality culture. 7th European Quality Assurance Forum (EQAF) of the European University Association (EUA), Tallinn.
Bland CJ, Starnaman S, Wersal L, Moorehead-Rosenberg L, Zonia S, Henry R. 2000. Curricular change in medical schools: how to succeed. Acad Med. 75(6):575–594.
Blouin D. 2019. Quality improvement in medical schools: vision meets culture. Med Educ. 53(11):1100–1110.
Boerboom TBB, Stalmeijer RE, Dolmans DHJM, Jaarsma DADC. 2015. How feedback can foster professional growth of teachers in the clinical workplace: a review of the literature. Stud Educ Eval. 46:47–52.
Bolman LG, Deal TE. 2003. Reframing organizations: artistry, choice, and leadership. 3rd ed. San Francisco (CA): Jossey-Bass.
Bovill C, Cook-Sather A, Felten P. 2011. Students as co-creators of teaching approaches, course design, and curricula: implications for academic developers. Int J Acad Devel. 16(2):133–145.
Bovill C, Cook-Sather A, Felten P, Millard L, Moore-Cherry N. 2016. Addressing potential challenges in co-creating learning and teaching: overcoming resistance, navigating institutional norms and ensuring inclusivity in student–staff partnerships. High Educ. 71(2):195–208.
Bowden J, Marton F. 1999. The university of learning. Education + Training. 41(5):ii–ii.
Braskamp LA, Ory JC. 1994. Assessing faculty work: enhancing individual and institutional performance. San Francisco (CA): Jossey-Bass.
Cameron KS, Quinn RE. 1999. Diagnosing and changing organizational culture based on the competing values framework. Reading (MA): Addison-Wesley.
Cantillon P, D'Eath M, De Grave W, Dornan T. 2016. How do clinicians become teachers? A communities of practice perspective. Adv Health Sci Educ Theory Pract. 21(5):991–1008.
Cartwright MJ. 2007. The rhetoric and reality of "quality" in higher education. Quality Assurance in Education. 15(3):287–301.
Cashin WE. 1999. Learner ratings of teaching: uses and misuses. In: Seldin P, editor. Changing practices in evaluating teaching. Bolton (MA): Anker Publishing Co., Inc; p. 25–44.
Coates H. 2005. The value of student engagement for higher education quality assurance. Qual Higher Educ. 11(1):25–36.
Collins A, Brown JS, Newman SE. 1989. Cognitive apprenticeship: teaching the crafts of reading, writing, and mathematics. In: Knowing, learning, and instruction: essays in honor of Robert Glaser. Hillsdale (NJ): Lawrence Erlbaum Associates, Inc; p. 453–494.
de Carvalho-Filho MA, Tio RA, Steinert Y. 2020. Twelve tips for implementing a community of practice for faculty development. Med Teach. 42(2):143–149.
DeNisi AS, Kluger AN. 2000. Feedback effectiveness: can 360-degree appraisals be improved? AMP. 14(1):129–139.
Dolmans DHJM. 2019. How theory and design-based research can mature PBL practice and research. Adv Health Sci Educ Theory Pract. 24(5):879–891.
Dolmans DHJM, Stalmeijer RE, Van Berkel HJM, Wolfhagen IHAP. 2011. Quality assurance of teaching and learning: enhancing the quality culture. In: Dornan T, Mann K, Scherpbier A, Spencer J, editors. Medical education theory and practice. Edinburgh: Elsevier; p. 257–264.
Dolmans DHJM, Wolfhagen HAP, Scherpbier AJJA. 2003. From quality assurance to total quality management: how can quality assurance result in continuous improvement in health professions education? Educ Health (Abingdon). 16(2):210–217.
Edmondson AC, Dillon JR, Roloff KS. 2007. Three perspectives on team learning: outcome improvement, task mastery, and group process. ANNALS. 1(1):269–314.
Ehlers UD. 2009. Understanding quality culture. Qual Assur Educ. 17(4):343–363.
Elassy N. 2013. A model of student involvement in the quality assurance system at institutional level. Qual Assur Educ. 21(2):162–198.
EUA. 2006. Quality culture in European universities: a bottom-up approach. Brussels: European University Association.
EUA. 2009. ENQA report on standards and guidelines for quality assurance in the European higher education area. Brussels: ENQA.
Frick TW, Chadha R, Watson C, Zlatkovska E. 2010. Improving course evaluations to improve instruction and complex learning in higher education. Educ Tech Res Dev. 58(2):115–136.
Frye AW, Hemmer PA. 2012. Program evaluation models and related theories: AMEE guide no. 67. Med Teach. 34(5):e288–e299.
Goldie J. 2006. AMEE Education Guide no. 29: evaluating educational programmes. Med Teach. 28(3):210–224.
Griffin A, Cook V. 2009. Acting on evaluation: twelve tips from a national conference on student evaluations. Med Teach. 31(2):101–104.
Harvey L. 2004–2021. Analytic quality glossary. http://www.qualityresearchinternational.com/glossary/
Harvey L, Green D. 1993. Defining quality. Assess Eval Higher Educ. 18(1):9–34.
Harvey L, Stensaker B. 2008. Quality culture: understandings, boundaries and linkages. Eur J Educ. 43(4):427–442.
Hashweh MZ. 2003. Teacher accommodative change. Teach Teach Educ. 19(4):421–434.
Healey M, Bovill C, Jenkins A. 2015. Students as partners in learning. In: Enhancing learning and teaching in higher education: engaging with the dimensions of practice. Maidenhead: Open University Press; p. 141–172.
Healey M, Flint A, Harrington K. 2016. Students as partners: reflections on a conceptual model. Teach Learn Inq. 4(2):8–20.
Hill F, Stephens C. 2005. Building leadership capacity in medical education: developing the potential of course coordinators. Med Teach. 27(2):145–149.
Hodges BD, Maniate JM, Martimianakis MA, Alsuwaidan M, Segouin C. 2009. Cracks and crevices: globalization discourse and medical education. Med Teach. 31(10):910–917.
Kikukawa M, Stalmeijer R, Matsuguchi T, Oike M, Sei E, Schuwirth LWT, Scherpbier AJJA. 2021. How culture affects validity: understanding Japanese residents' sense-making of evaluating clinical teachers. BMJ Open. 11(8):e047602.
Kikukawa M, Stalmeijer RE, Okubo T, Taketomi K, Emura S, Miyata Y, Yoshida M, Schuwirth LWT, Scherpbier AJJA. 2017. Development of a culture-sensitive clinical teacher evaluation sheet in the Japanese context. Med Teach. 39(8):844–850.
Kleijnen J. 2012. Internal quality management and organisational values in higher education: conceptions and perceptions of teaching staff [dissertation]. Maastricht: School of Health Professions Education; Heerlen: Zuyd University of Applied Sciences.
Kleijnen J, Dolmans DHJM, Willems J, van Hout H. 2014. Effective quality management requires a systematic approach and a flexible organisational culture: a qualitative study among academic staff. Quality in Higher Education. 20(1):103–126.
Knight PT, Trowler PR. 2000. Department-level cultures and the improvement of learning and teaching. Stud Higher Educ. 25(1):69–83.
Lieff SJ, Albert M. 2010. The mindsets of medical education leaders: how do they conceive of their work? Acad Med. 85(1):57–62.

Marsh HW. 2007. Students' evaluations of university teaching: dimensionality, reliability, validity, potential biases and usefulness. In: Perry RP, Smart JC, editors. The scholarship of teaching and learning in higher education: an evidence-based perspective. Dordrecht: Springer Netherlands; p. 319–383.
Martens SE, Meeuwissen SNE, Dolmans DHJM, Bovill C, Könings KD. 2019. Student participation in the design of learning and teaching: disentangling the terminology and approaches. Med Teach. 41(10):1203–1205.
Martens SE, Wolfhagen IHAP, Whittingham JRD, Dolmans DHJM. 2020. Mind the gap: teachers' conceptions of student-staff partnership and its potential to enhance educational quality. Med Teach. 42(5):529–535.
McKimm J, Lieff SJ. 2013. Medical education leadership. In: Dent J, Harden RM, Hunt D, editors. A practical guide for medical teachers. London: Churchill Livingstone-Elsevier; p. 343–351.
McKimm J, O'Sullivan H. 2016. When I say … leadership. Med Educ. 50(9):896–897.
Meeuwissen SN, Stalmeijer RE, Govaerts MJB. 2019. Multiple-role mentoring: mentors' conceptualisations, enactments and role conflicts. Med Educ. 53(6):605–615.
Meeuwissen SNE, Gijselaers WH, Wolfhagen IHAP, Oude Egbrink MGA. 2020. How teachers meet in interdisciplinary teams: hangouts, distribution centers, and melting pots. Acad Med. 95(8):1265–1273.
Newton J. 2000. Feeding the beast or improving quality?: academics' perceptions of quality assurance and quality monitoring. Quality in Higher Education. 6(2):153–163.
Nordquist J, Grigsby RK. 2011. Medical schools viewed from a political perspective: how political skills can improve education leadership. Med Educ. 45(12):1174–1180.
O'Sullivan PS, Irby DM. 2011. Reframing research on faculty development. Acad Med. 86(4):421–428.
Overeem K, Wollersheim H, Driessen EW, Lombarts K, Van De Ven G, Grol R, Arah O. 2009. Doctors' perceptions of why 360-degree feedback does (not) work: a qualitative study. Med Educ. 43(9):874–882.
QAA. 2003. Handbook for enhancement-led institutional review: Scotland. Gloucester: Quality Assurance Agency for Higher Education.
Quinn RE, Rohrbaugh J. 1983. A spatial model of effectiveness criteria: towards a competing values approach to organizational analysis. Manag Sci. 29(3):363–377.
Richardson V, Placier P. 2001. Teacher change. In: Richardson V, editor. Handbook of research on teaching. Washington (DC): American Educational Research Association; p. 905–947.
Romme S, Bosveld MH, Van Bokhoven MA, De Nooijer J, Van den Besselaar H, Van Dongen JJJ. 2020. Patient involvement in interprofessional education: a qualitative study yielding recommendations on incorporating the patient's perspective. Health Expect. 23(4):943–957.
Sandhu D. 2019. Healthcare educational leadership in the twenty-first century. Med Teach. 41(6):614–618.
Sargeant J, Mann K, Sinclair D, Van der Vleuten CPM, Metsemakers J. 2008. Understanding the influence of emotions and reflection upon multi-source feedback acceptance and use. Adv Health Sci Educ Theory Pract. 13(3):275–288.
Seale A. 2009. A contextualised model for accessible e-learning in higher education: understanding the students' perspective. Berlin (Heidelberg): Springer Berlin Heidelberg.
Sehlbach C, Govaerts MJB, Mitchell S, Teunissen TGJ, Smeenk FWJM, Driessen EW, Rohde GGU. 2020. Perceptions of people with respiratory problems on physician performance evaluation – a qualitative study. Health Expect. 23(1):247–255.
Seldin P. 1999. Changing practices in evaluating teaching: a practical guide to improved faculty performance and promotion/tenure decisions. Vol. 10. San Francisco (CA): Jossey-Bass.
Stalmeijer RE, Dolmans DHJM, Wolfhagen IHAP, Muijtjens AMM, Scherpbier AJJA. 2008. The development of an instrument for evaluating clinical teachers: involving stakeholders to determine content validity. Med Teach. 30(8):e272–e277.
Stalmeijer RE, Dolmans DHJM, Wolfhagen IHAP, Muijtjens AMM, Scherpbier AJJA. 2010a. The Maastricht Clinical Teaching Questionnaire (MCTQ) as a valid and reliable instrument for the evaluation of clinical teachers. Acad Med. 85(11):1732–1738.
Stalmeijer RE, Dolmans DHJM, Wolfhagen IHAP, Peters WG, van Coppenolle L, Scherpbier AJJA. 2010b. Combined student ratings and self-assessment provide useful feedback for clinical teachers. Adv Health Sci Educ Theory Pract. 15(3):315–328.
Stalmeijer RE, Whittingham JRD, de Grave WS, Dolmans DHJM. 2016. Strengthening internal quality assurance processes: facilitating student evaluation committees to contribute. Assess Eval Higher Educ. 41(1):53–66.
Strand P, Sjöborg K, Stalmeijer R, Wichmann-Hansen G, Jakobsson U, Edgren G. 2013. Development and psychometric evaluation of the undergraduate clinical education environment measure (UCEEM). Med Teach. 35(12):1014–1026.
Sundberg K, Josephson A, Reeves S, Nordquist J. 2017. Power and resistance: leading change in medical education. Stud Higher Educ. 42(3):445–462.
Sursock A. 2011. Examining quality culture part II: processes and tools – participation, ownership and bureaucracy. Brussels: EUA Publications. https://wbc-rti.info/object/document/7543/attach/Examining_Quality_Culture_Part_II.pdf
Svinicki MD, Marilla D. 2001. Encouraging your students to give feedback. New Dir Teach Learn. 2001(87):17–24.
Tourish D. 2019. Is complexity leadership theory complex enough? A critical appraisal, some modifications and suggestions for further research. Organ Stud. 40(2):219–228.
Uhl-Bien M, Riggio RE, Lowe KB, Carsten MK. 2014. Followership theory: a review and research agenda. Leadersh Quart. 25(1):83–104.
van der Leeuw RM, Slootweg IA, Heineman MJ, Lombarts MJMH. 2013. Explaining how faculty members act upon residents' feedback to improve their teaching performance. Med Educ. 47(11):1089–1098.
van Lierop M, de Jonge L, Metsemakers J, Dolmans DHJM. 2018. Peer group reflection on student ratings stimulates clinical teachers to generate plans to improve their teaching. Med Teach. 40(3):302–309.
Williams J. 2016. Quality assurance and quality enhancement: is there a relationship? Qual Higher Educ. 22(2):97–102.
