
Nurse Education in Practice 56 (2021) 103190


Using a station within an objective structured clinical examination to assess interprofessional competence performance among undergraduate nursing students
Juan Luis González-Pascual a,*, Inmaculada López-Martín a, Elena María Saiz-Navarro a,b, Óscar Oliva-Fernández a,c, Francisco Javier Acebedo-Esteban a,d, Marta Rodríguez-García a

a Universidad Europea de Madrid, Faculty of Biomedical and Health Sciences, Department of Nursing, Spain
b 12 de Octubre Hospital, Madrid, Spain
c General de Villalba Hospital, Spain
d Emergency Prehospital Service, SAMUR-PC, Madrid, Spain

A R T I C L E I N F O

Keywords:
Education, Interdisciplinary
Interprofessional relations
Nursing education
OSCEs
Communication skills

A B S T R A C T

Aim/Objective: To describe and analyse the use of a station within an OSCE to assess interprofessional competence performance in undergraduate nursing students. The specific objectives were:
- To measure the students’ level of competence performance in relation to the interprofessional competences Roles and Responsibilities, Communication and Teamwork.
- To determine inter-observer concordance in the assessment of the interprofessional competences.
Background: Teamwork competencies are key to improving patient safety and avoiding medical errors. Today, healthcare professionals work in interdisciplinary teams. To foster a culture of safety, some of the measures that can be taken at the individual, team and organisational levels include fostering clear communication among team members, knowledge of respective roles and functions, and deepening team functioning through respect and trust in judgement and capabilities. The World Health Organization recommends starting to develop these competencies in university studies, through interprofessional education. There are numerous programmes in universities all over the world, but more research is needed on the assessment of interprofessional education activities, preferably through objective methods. Competency performance can be assessed by an external evaluator, in a simulated environment, with the Objective Structured Clinical Examination, which is widely used in nursing.
Design: Cross-sectional study.
Methods: 63 second-year nursing undergraduate students completed an interprofessional competencies station within an 8-station OSCE. Communication, Roles and Responsibility and Teamwork competences were assessed. The Interprofessional Collaborator Assessment Rubric (ICAR) was used as a model to assess the performance of students. Inter-observer concordance analysis was performed using the kappa coefficient and the concordance rate.
Results: 92.1% of students reached a good level in communication competence, 88.9% in roles and responsibility competence, and 55.6% in teamwork competence. The global concordance rate was 83.8%, and the kappa coefficient was 0.67.
Conclusions: Most students have demonstrated interprofessional competence performance at a good level. However, the inter-observer concordance obtained for some of the items was not as expected. The assessment of interprofessional competencies, as it deals mainly with relational and communicative aspects, requires greater preparation both in terms of the specification of assessment items and in agreement between examiners.

* Correspondence to: Universidad Europea de Madrid, c/ Tajo s/n. 28670, Villaviciosa de Odón, Madrid, Spain.
E-mail address: juanluis.gonzalez2@universidadeuropea.es (J.L. González-Pascual).

https://doi.org/10.1016/j.nepr.2021.103190
Received 15 June 2020; Received in revised form 20 August 2021; Accepted 31 August 2021
Available online 4 September 2021
1471-5953/© 2021 Elsevier Ltd. All rights reserved.

1. Introduction

1.1. Interprofessional education (IPE)

Patient safety is a global issue (World Health Organization (WHO), 2008). In the United States, medical errors could be the third leading cause of death, according to the study by Makary and Daniel (2016). This study, although controversial due to its methodology (Gianoli, 2016), follows the line of the report “To err is human”, which estimates that between 44,000 and 98,000 patients die each year as a result of medical errors (Institute of Medicine (IoM), 2000). Errors can occur due to multiple factors and circumstances (World Health Organization (WHO), 2008). They are more frequent in complex systems such as healthcare, which involve the interaction of multiple professionals as well as technology (Institute of Medicine (IoM), 2000).

Errors cannot be blamed solely on individuals. To prevent errors and build a culture of patient safety, it is necessary to understand that, although there may be an individual at the end of the error chain, multiple latent conditions are necessary to arrive at that error. These latent conditions may relate to individuals, equipment, technology, processes and organisations (Donaldson, 2008). Today, healthcare professionals work in interdisciplinary teams. To foster a culture of safety, some of the measures that can be taken at the individual, team and organisational levels include fostering clear communication among team members, knowledge of respective roles and functions, and deepening team functioning through respect and trust in judgement and capabilities, as well as being sensitive to each other’s safety concerns (Donaldson, 2008). The acquisition of teamwork skills is one of the essential keys to promoting healthcare quality and patient safety (Institute of Medicine (IoM), 2000).

For these reasons, the World Health Organization (2010) published the “Framework for Action on Interprofessional Education and Collaborative Practice” (World Health Organization (WHO), 2010), which concludes that, in order to achieve adequate teamwork among health professionals, students must develop interprofessional skills during their university education.

Interprofessional education is in place when students from two or more health professions learn from each other, in an interactive way, to facilitate effective collaboration and improve health outcomes (Centre for the Advancement of Interprofessional Education (CAIPE), 2016). There is a consensus that interprofessional education should form part of the health sciences undergraduate syllabus (Barr, 2015).

There is no empirical evidence on the interprofessional competencies that students should develop, but there is considerable consistency in the core domains of all the available interprofessional frameworks: Role understanding, Interprofessional communication, Interprofessional values, Coordination and collaborative decision-making, Reflexivity and Teamwork (Rogers et al., 2016).

1.2. Assessment of interprofessional education

Numerous interprofessional education initiatives are carried out in universities around the world (Barr, 2015). However, more research is needed on the assessment of interprofessional education activities (Reeves et al., 2013). An essential principle of competency-based education is the ability to objectively assess the achievement of competencies (Bandiera et al., 2006). The assessment of competencies is useful for providing feedback, identifying needs to improve interventions, and identifying their effects on attitudes, knowledge, skills and collaborative behaviour (Reeves et al., 2016).

The use of Kirkpatrick’s (2004) model allows for measuring change in interprofessional education (Reeves et al., 2016) and it is implemented internationally in the field of interprofessional education (Billings and Halstead, 2020). The model has four levels (Fig. 1): a first level that assesses student satisfaction; a second level that assesses the self-perception of interprofessional competency acquisition; a third level that assesses the performance of these competencies; and a fourth level that assesses the impact of these competencies on the results in patients and the health system. Each upper level provides deeper and more precise measurement and requires a slower and more stringent assessment.

Fig. 1. Levels of assessment according to the Kirkpatrick Model highlighting the third level (from the base upwards: Satisfaction, Self-perception, Performance, Results in patients).

Most of the interprofessional education experiences published at international level correspond to levels one and two of Kirkpatrick’s classification (Ehlers et al., 2017; Gordon et al., 2018). This could be because each level involves more complexity in measurement, and more time. Measuring satisfaction after an activity is immediate. Measuring self-perceived competence acquisition takes slightly longer but is also relatively quick and easy with a validated questionnaire; what is assessed at these levels is knowledge, attitudes and perceived skills. In contrast, demonstrating the ability to integrate that knowledge, those attitudes and skills by performing a task is more complex, and requires assessment in a real or simulated environment. Finally, assessing the impact of these competencies on patients is a long-term task that also involves considering the other factors that may influence outcomes. However, few of the assessment tools designed for levels one and two (satisfaction questionnaires, self-perception tests) allow for objective assessment of learning (Brashers et al., 2016). Even validated questionnaires are not completely objective but reflect students’ self-perception or intention.

1.3. Objective Structured Clinical Examination as an instrument to assess interprofessional competence performance

Competency performance (Fig. 1) can be assessed by an external evaluator, in a simulated environment, with the Objective Structured Clinical Examination (OSCE) (Khan et al., 2013a, 2013b).

The OSCE can be defined as “An assessment tool based on the principles of objectivity and standardisation, in which the candidates move through a series of time-limited stations in a circuit for the purposes of assessment of professional performance in a simulated environment” (Khan et al., 2013a, 2013b, e1440). The circuit is the configuration of stations through which students flow. The OSCE incorporates simulation technologies into the case method, recreating real-world assumptions, such as the presence of a physical environment that serves as a context and allows one to situate oneself, the interaction with other people such as patients, relatives or staff (actors and/or simulators) in real time, as well as the need to integrate different knowledge and skills to give an answer not only in a theoretical but also in a practical way (Khan et al., 2013a, 2013b). However, the OSCE does not replace the assessment of real-world performance, but only allows extrapolation of the simulation results to reality. This is because performance in the same task varies considerably depending on the context in which it is performed (Khan et al., 2013a, 2013b).

Assessment using the OSCE focuses on cognitive, psychomotor and affective skills. OSCEs can be used to assess competences that integrate knowledge, attitudes and skills in a holistic way, and which cannot be assessed by exams or written cases, or by merely observing an isolated technical skill: for example, the performance of a patient assessment or a procedure (Khan et al., 2013a, 2013b, e1443). OSCEs are widely used in nursing worldwide as a valid and reliable method of assessing competencies. However, they can cause stress and anxiety for students, and they are a costly method of assessment in terms of human and material resources (Goh et al., 2020). Regarding interprofessional competence performance, OSCEs have been used to assess competencies related to patient safety (Ginsburg et al., 2015; Stroud and Vidyarthi, 2015), interprofessional competencies such as clarification of professional roles among team members, shared decision-making, problem-solving capacity and the handling of unexpected situations (Sharma et al., 2015), interprofessional collaborative practice (Oza et al., 2015) and effective collaborative work (Brashers et al., 2016).

There are three ways in which the OSCE can be used to assess interprofessional competencies: 1) Uniprofessional OSCEs with all stations focusing on interprofessional competencies (iOSCE) (Morison and Stewart, 2005; Simmons et al., 2011), 2) Uniprofessional OSCEs with only one station focused on interprofessional competencies (Oza et al., 2015), 3) OSCEs in which a team of students from different professions is assessed in the scenarios (TOSCE) (Cullen et al., 2003; Hall et al., 2011).

OSCEs that are conducted individually, regardless of whether they have one or more stations dedicated to interprofessional competencies, allow the student’s ability to demonstrate these competencies in a simulated environment to be assessed. TOSCE has been used in homogeneous groups of postgraduate professionals (Gordon et al., 2013), allowing the assessment of team performance beyond individual interprofessional competences. If there are no formal teams that have been working together for some time, it seems better to assess competences on an individual basis, rather than putting students together and assessing their performance when they have not had time to really form a team. This way, a single station for evaluating interprofessional competencies within an OSCE seems to be a very efficient option. In the study by Oza et al. (2015), the interprofessional collaborative practice of a uniprofessional station with third-year medical students was observed by using actors in the roles of patients and nurses.

However, there is still a lack of experience where the uniprofessional station within an OSCE has been applied as a method for assessing interprofessional competencies in nursing undergraduate students.

1.4. Objectives

This study aimed to describe and analyse the use of a station within an OSCE to assess interprofessional competence performance.
The specific objectives were:
- To measure the students’ level of competence performance in relation to the interprofessional competences Roles and Responsibilities, Communication and Teamwork.
- To determine inter-observer concordance in the assessment of the interprofessional competences.

2. Methodology

2.1. Setting

This study was conducted in Spain. The nursing programme (Bachelor of Science in Nursing) consists of 240 ECTS (European Credit Transfer System) credits taken over 4 years.

2.2. Participants and data collection

All students enrolled in the second year of the university nursing degree (N = 86) completed an Objective Structured Clinical Examination (OSCE) in June 2018. The OSCE was mandatory for all the students as part of their standard assessment at the end of the academic year.

An interprofessional competencies station was included in the 2018 OSCE to assess three interprofessional competencies: Roles and Responsibility; Communication; and Teamwork. After reviewing the interprofessional competency frameworks at international level (Interprofessional Education Collaborative – IPEC, Canadian Interprofessional Health Collaborative – CIHC and others) and the characteristics of nursing practice in Spain, these three competencies have been chosen by our university as the interprofessional competency framework for all students of biomedical and health sciences.

Students were informed before starting the OSCE by one researcher about the specific objectives of the research, that their participation would consist of allowing their results at the interprofessional competencies station to be used in the research in addition to filling in a general data form, that their participation was voluntary, that refusing to participate would not affect their OSCE score, and that they could withdraw from the study at any time they wished. Students signed a written consent form.

After being informed, 63 students (73.3%) agreed to participate. After finishing the OSCE, none of the students withdrew from the study.

2.3. Description of the OSCE

The OSCE consisted of 8 stations, as described in Table 1. The duration of each station was 10 min for each student, and students had 2 min to move from one station to another and read the context of the station. In total, the OSCE lasted 96 min from start to finish for each student.
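As an editorial aside, the circuit timing described above can be reproduced with a minimal sketch (Python; the figures are taken directly from the text, the script itself is not part of the original study):

# Circuit timing described in Section 2.3: 8 stations, 10 min per station,
# plus 2 min to move between stations and read the context of the next one.
N_STATIONS = 8
STATION_MIN = 10
TRANSITION_MIN = 2
total_minutes = N_STATIONS * (STATION_MIN + TRANSITION_MIN)
print(total_minutes)  # 96 min from start to finish for each student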


Table 1
Description of the 8 stations of the OSCE.

Station | Contextualisation | Objectives | Type of station
1 | Patient with elimination problems | Assessment and identification of problems | Standardised patient
2 | Asthma patient | Identification of problems and proposal of risk prevention activities | Written
3 | Hip replacement patient | Mobilisation and risk prevention | Standardised patient
4 | Rest | – | –
5 | COPD patient | Interprofessional competences: Roles and responsibility, Communication, Teamwork | Standardised patient; Standardised doctor
6 | Out-of-hospital situation, person in cardiorespiratory arrest | Basic cardiopulmonary resuscitation and use of semiautomatic external defibrillator | Mannequin
7 | Patient with arterial hypertension, who presents chest pain | Assessment and advice; Electrocardiogram performance and interpretation | Standardised patient
8 | Patient with pneumonia | Peripheral venous canalisation and administration of antibiotics | Mannequin

2.4. Interprofessional competencies station (station 5)

Station 5 began with the following contextualisation provided to the students: You are on a hospital ward. 50-year-old patient with COPD. Today he is being discharged. Doctor’s discharge report with treatment: Atrovent® (ipratropium bromide), 4 inhalations every 6 h + Terbasmin® (terbutaline), 2 inhalations every 6 h. Ventolin® (salbutamol) is suspended. You go into the room to give him the inhalers and give him the standard discharge recommendations.

Two actors took part in the station, one as a standardised patient and the other as a standardised doctor.

The standardised patient was instructed to refuse to use Terbasmin® because he did not feel the effect of the medication as he did when using Ventolin®.

The standardised doctor was instructed to, at first, insist on the original prescription. After that, he was instructed to take part in a conversation with the nursing student about the treatment and change the prescription from Terbasmin® to Ventolin®.

The student was expected to: Deliver the inhalers and explain the course of treatment after leaving hospital. Recognise that the patient had a problem with one of the inhalers and investigate it, using empathy and active listening. Be aware of his or her own role and that of the doctor in this matter. Explain the situation to the doctor in an assertive way. Participate in a dialogue with the doctor, considering the patient’s opinion. Provide healthcare education for the patient on the use of the inhalers. Give discharge recommendations.

The scenario was designed by the research team together with the OSCE design team following the stages described by Kaneko and Lopes (2019): planning, objectives, simulation structure and format, case description, evaluation, material and resources. A station was sought that reflected a typical situation for a second-year nursing student in Spain. The contents studied that year included, among others, discharge care of the COPD patient as well as pharmacology and use of inhalers. In addition, the station allowed the assessment of communication, both with the patient and the doctor, the ability to know and transmit one’s own role and to know and respect the role of the doctor, and the ability to work in a team in the sense of sharing information, participating in the debate on what to do and managing conflicts.

Following the recommendations described in the scientific literature (Goh et al., 2020), prior to the development of the OSCE, a trial was conducted with the two actors (standardised patient and standardised doctor), a 3rd-year nursing student, and members of the research team as evaluators. The trial confirmed that the scenario fitted the 10-minute duration of the station; errors in the role of the actors were corrected, and evaluator scoring was compared.

2.5. Instruments

2.5.1. General data
Students completed an ad-hoc form with general data for the demographic description of the sample: age (years), sex (male, female, other).

2.5.2. Interprofessional competence performance assessment
The Interprofessional Collaborator Assessment Rubric (ICAR) (Curran et al., 2013) was used as a model to assess the performance of students in the interprofessional competencies station (station 5). This tool was selected after reviewing other options (Oates and Davidson, 2015; Havyer et al., 2016) because it was observer-based rather than self-reporting, and also covered several interprofessional competencies rather than focusing on just one.

8 assessment items were selected. They were rewritten in a few cases to fit the context more specifically. Table 3 shows the original wording and that used in this research.

The assessment scale was also changed to suit the needs of an OSCE. The checklist makes the examination more transparent and more objectively scored (Harden et al., 2016) and allows for greater observer agreement (Khan et al., 2013a, 2013b). Given these advantages, each of the 8 station assessment items was dichotomously scored as Not Performed or Performed.

On the day of the OSCE, the interprofessional competencies station was assessed for each student simultaneously and independently by 2 professors belonging to the research team, through a one-way mirror, and with earphones enabling them to hear the conversations.

2.6. Statistical analysis

General data were expressed as means (standard deviation) in the case of age. Sex was expressed by absolute and relative frequencies.

The 8 station interprofessional assessment items in the assessment of each of the two evaluators were expressed by relative frequencies. Inter-observer concordance analysis was performed using the kappa coefficient and the concordance rate. The kappa coefficient is a statistic that calculates the agreement between 2 observers considering that agreement or disagreement between them may be due to chance. The result is a number that theoretically can be between −1 and 1, but in practice is between 0 and 1 (McHugh, 2012). It is interpreted using the Landis and Koch (1977) classification: < 0 poor, 0.01–0.20 slight, 0.21–0.40 fair, 0.41–0.60 moderate, 0.61–0.80 substantial, 0.81–1.00 almost perfect. The concordance rate reflects the number of times the two evaluators agreed on their rating of the item divided by the total number of ratings. The result is a relative frequency (%). The higher the number, the more times both observers agreed on an item (McHugh, 2012).

Competency level in each of the 3 competencies (Roles and responsibility, Communication and Teamwork) was calculated from the items related to each of them. Three competency levels (poor, adequate and good) were established ad hoc. For the Communication and Teamwork competencies (with 3 associated items), a poor level meant that 0 or 1 of the associated items had been performed, an adequate level that 2 of the associated items had been performed, and a good level that the 3 items had been performed. For the Roles and Responsibility competence (with 2 associated items), a poor level meant that none of the associated items had been performed, an adequate level that 1 of the associated items had been performed, and a good level that the 2 associated items had been performed. Competency level in the assessment of each of the two evaluators was expressed through relative frequencies.

The software used was SPSS, version 23 (IBM Corp., 2015).
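To make the two agreement measures concrete, here is a minimal illustrative sketch (Python rather than the SPSS used by the authors; the ratings shown are hypothetical) that computes the concordance rate and the kappa coefficient for one dichotomously scored item:

# Illustrative computation of the inter-observer measures described in Section 2.6.
# Ratings are hypothetical: 1 = Performed, 0 = Not performed.
from collections import Counter

def concordance_rate(r1, r2):
    # proportion of students on whom both evaluators gave the same rating
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def kappa(r1, r2):
    n = len(r1)
    observed = concordance_rate(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # agreement expected by chance, from each evaluator's marginal frequencies
    expected = sum((c1[k] / n) * (c2[k] / n) for k in (0, 1))
    if expected == 1:
        return float("nan")  # undefined when chance agreement is total
    return (observed - expected) / (1 - expected)

evaluator_1 = [1, 1, 1, 0, 1, 1, 0, 1, 1, 1]
evaluator_2 = [1, 1, 1, 0, 1, 0, 0, 1, 1, 1]
print(concordance_rate(evaluator_1, evaluator_2))  # 0.9
print(round(kappa(evaluator_1, evaluator_2), 2))   # 0.74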

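The ad hoc mapping from performed items to competency levels can likewise be written out explicitly; this is only an editorial sketch of the rule stated above, not the authors’ code:

# Ad hoc competency levels (Section 2.6), derived from the number of associated
# checklist items scored as Performed. total_items is 3 for Communication and
# Teamwork and 2 for Roles and Responsibility.
def competency_level(items_performed, total_items):
    if items_performed == total_items:
        return "good"
    if items_performed == total_items - 1:
        return "adequate"
    return "poor"

print(competency_level(3, 3))  # good (all three Communication items performed)
print(competency_level(2, 3))  # adequate
print(competency_level(1, 3))  # poor
print(competency_level(1, 2))  # adequate (Roles and Responsibility)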

2.7. Ethics

All participants signed an informed consent. A favourable report was obtained from the University’s Research Committee (reference CIPI/18/098).

3. Results

Table 2 presents the description of the general data of the 63 students. Their average age was 24.77 years, with a range between 19 and 49 years. 73% of them were women.

Table 2
Characteristics of students participating in the OSCE (n = 63).
Average age (SD): 24.77 (6.47)
Sex, n (%): Male 17 (27%); Female 46 (73%); Other 0 (0%)

Table 3 presents the students’ scores in each of the interprofessional assessment items, according to each of the evaluators, as well as inter-observer concordance expressed through the kappa coefficient and the concordance rate. In the Communication competence, more than 95% of the students have expressed the behaviours assessed in one of the items in the opinion of the two evaluators, and the level of concordance was, at least, 0.80. In the Roles and Responsibility competence, at least 90% of the students have expressed the behaviours assessed in one of the items in the opinion of the two evaluators, and the level of concordance, when test requirements mean this can be calculated, was 0.54. In the Teamwork competence, at least 60% have expressed the behaviours assessed in one of the items in the opinion of the two evaluators, and the level of concordance ranged between 0 and 0.76.

Overall, the concordance rate was 83.8%, and the kappa coefficient was 0.67.

Table 4 presents the competence level of the students. In the Communication and Roles and Responsibility competencies, almost 90% of students reached a good level. In the Teamwork competence, less than 10% of students demonstrated a poor level, approximately one third an adequate level, and more than half a good level.

4. Discussion

The student sample shows that many more women than men study nursing at our university, reflecting the feminisation of the nursing profession (Shannon et al., 2019), and that not all students are 19 years old, as would be the case if they came from high school; there are also older students, who study nursing after having studied vocational or technical education, or who return to study after years in the professional world.

4.1. Results of station 5 within the OSCE: interprofessional competence performance and inter-observer concordance

Globally, more than 50% of students demonstrate good levels of interprofessional competencies relevant to their course and the professional practice of a nurse in Spain, assessed by a station within an OSCE. The level in the Roles and Responsibility (over 88.9%) and Communication (over 92.1%) competencies is higher than in the Teamwork (over 55.6%) competency. Compared to other studies assessing interprofessional competencies in university students through an OSCE, although other non-ICAR assessment instruments were used, it is agreed that students demonstrated competencies at a good level: very good (interprofessional collaborative practice 79.6 on a scale of 0–100) (Oza et al., 2015), or borderline satisfactory (2 on a scale of 0–4) (Ginsburg et al., 2015).

Table 3
Results in each item of the checklist. Inter-observer concordance through kappa coefficient and concordance rate.

Communication (ICAR 2013: Communication)
- The student communicates with the patient in an assertive and respectful manner (ICAR 2013: Communicates with others in a confident, assertive, and respectful manner). Evaluator 1: 98.4%; Evaluator 2: 98.4%; Kappa coefficient: 1; Concordance rate: 100%.
- The student uses paraverbal and non-verbal communication correctly (ICAR 2013: Uses communication strategies (verbal & non-verbal) appropriately in a variety of situations). Evaluator 1: 96.8%; Evaluator 2: 98.4%; Kappa coefficient: 0.80; Concordance rate: 98.4%.
- The student communicates with the doctor in an assertive and respectful manner (ICAR 2013: Communicates with others in a confident, assertive, and respectful manner). Evaluator 1: 95.2%; Evaluator 2: 96.8%; Kappa coefficient: 0.89; Concordance rate: 98.4%.

Roles and responsibility (ICAR 2013: Roles and responsibility)
- The student explains/shows that he/she knows that health education/patient information is part of his/her professional role (ICAR 2013: Describes one’s own roles and responsibilities in a clear manner with the team/patient/family). Evaluator 1: 95.2%; Evaluator 2: 88.9%; Kappa coefficient: 0.54; Concordance rate: 90.5%.
- The student considers that prescribing medication is part of the role of the doctor (ICAR 2013: Describes one’s own roles and responsibilities in a clear manner with the team/patient/family). Evaluator 1: 100%; Evaluator 2: 100%; Kappa coefficient: cannot calculate (constant); Concordance rate: 100%.

Teamwork (ICAR 2013: Team functioning, Collaboration) (ICAR original: Collaborative Patient/Client-Family Centred Approach)
- The student shares information about the patient’s problem (inhaler use, preferences, etc.) with the doctor (ICAR 2013: Shares information with other providers that is useful for the delivery of patient/client care) (ICAR original: Integration of Patient/Client Beliefs and Values and Patient Advocacy in Decision-Making). Evaluator 1: 87.3%; Evaluator 2: 84.1%; Kappa coefficient: 0.76; Concordance rate: 90.5%.
- The student participates in the interprofessional debate (ICAR 2013: Contributes to interprofessional team discussions). Evaluator 1: 63.5%; Evaluator 2: 69.8%; Kappa coefficient: 0.73; Concordance rate: 80.9%.
- The student negotiates with the doctor to facilitate conflict resolution (ICAR 2013: Uses appropriate conflict resolution strategies to manage and/or resolve conflict) (ICAR original: Conflict management). Evaluator 1: 98.4%; Evaluator 2: 96.8%; Kappa coefficient: 0; Concordance rate: 95.2%.


Table 4
Level of performance in the interprofessional competencies of the students.

Competence and level | Evaluator 1 | Evaluator 2
Communication: Poor | 1.6% | 0%
Communication: Adequate | 6.3% | 6.3%
Communication: Good | 92.1% | 93.7%
Roles and responsibility: Poor | 0% | 0%
Roles and responsibility: Adequate | 4.8% | 11.1%
Roles and responsibility: Good | 95.2% | 88.9%
Teamwork: Poor | 6.3% | 7.9%
Teamwork: Adequate | 38.1% | 31.7%
Teamwork: Good | 55.6% | 60.3%

Global inter-observer concordance measured through the kappa coefficient was substantial according to the classification of Landis and Koch (1977): < 0 poor, 0.01–0.20 slight, 0.21–0.40 fair, 0.41–0.60 moderate, 0.61–0.80 substantial, 0.81–1.00 almost perfect. However, two items stand out due to their lesser degree of concordance: one in the Roles and Responsibility competency, with moderate concordance, and the other in the Teamwork competency, with poor concordance. This could call into question the results obtained in terms of the competence level of the students. However, concordance rates among observers were high, and the statistical properties of the kappa coefficient must also be considered. The coefficient is influenced by the prevalence of the event to be observed: if the event is very frequent or very infrequent, low kappa coefficients can be obtained, even with high concordance rates, as in the case of the item on teamwork with 0 on the kappa index (Vandelle, 2017). In addition, if one of the observers indicates that the event has always or never occurred in all the people observed, it is considered to be a constant, and the kappa coefficient cannot be calculated.
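A small worked example (hypothetical ratings chosen to match the frequencies reported for the conflict-resolution item; this sketch is an editorial illustration, not the authors’ analysis) shows how a very frequent behaviour can yield high agreement yet a kappa near zero:

# Prevalence effect on kappa: two evaluators rate 63 students on one item
# (1 = Performed, 0 = Not performed). Evaluator 1 marks 62/63 as Performed,
# evaluator 2 marks 61/63, and the few "Not performed" ratings never coincide.
from collections import Counter

def agreement_and_kappa(r1, r2):
    n = len(r1)
    observed = sum(a == b for a, b in zip(r1, r2)) / n   # concordance rate
    c1, c2 = Counter(r1), Counter(r2)
    expected = sum((c1[k] / n) * (c2[k] / n) for k in (0, 1))
    kappa = (observed - expected) / (1 - expected) if expected < 1 else float("nan")
    return round(observed, 3), round(kappa, 3)

evaluator_1 = [1] * 60 + [1, 1, 0]   # 62 of 63 rated Performed
evaluator_2 = [1] * 60 + [0, 0, 1]   # 61 of 63 rated Performed
print(agreement_and_kappa(evaluator_1, evaluator_2))  # (0.952, -0.022): ~95% agreement, kappa near 0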
4.2. Reflections about the interprofessional competence performance assessment process

Inter-observer concordance was lower than expected by the researchers, considering the work carried out prior to the completion of the station. A level above 0.8 was expected after having conducted the station 5 trial prior to the OSCE, as previously described in the methodology section. This has led to a literature review being carried out of the aspects that could have in some way had an influence on the interprofessional competence performance assessment process, and the following was found.

Firstly, the ICAR instrument was not specifically designed for use in an OSCE, but as a rubric for assessment, used by an external observer through multiple interactions and repeated observation of a learner (Shrader et al., 2017). In the modified version from 2013, the rubric was transformed into a checklist with a Likert scale, and validated in real clinical environments (Curran et al., 2013). Interprofessional work is very much influenced by an enormous amount of relational, procedural, organisational and contextual factors (Reeves et al., 2013), which can be different in a simulation than in a real environment (Leung et al., 2012). The complete ICAR instrument was not used, but rather an ad hoc rewrite and selection was made, adapted to the characteristics of undergraduate nursing students in a specific context. In addition, the measurement scale was modified from a rating scale to a binomial two-point scale. Although the use of an existing behaviour assessment scale (observer-based) may be desirable to standardise the assessment of interprofessional competencies, adaptation to each context (at the level of professional practice in each country and the competencies students must acquire) is necessary, and the differences between performance in a simulation and in clinical environments must be taken into account.

Secondly, the wording of the ICAR items includes the assessment of complex behaviours, which imply a judgement of the observer beyond the observation of a simple act. There is some debate over the best way to assess complex behaviours. Sakurai et al. (2014) proposed that, in order to improve inter-observer concordance in an OSCE, items related to the same mix of contents should be subdivided. Furthermore, global rating scales have been proposed to assess skills where the quality with which a behaviour is performed is as important as performing it at all, like some interprofessional competences (Khan et al., 2013a, 2013b). However, checklists make the examination more transparent and more objectively scored (Harden et al., 2016) and are the most widely used assessment tools in OSCEs in nursing (Goh et al., 2020).

Finally, some authors raise questions about the level of objectivity in competency-based assessment (Ten Cate and Regehr, 2019). In our review of the objectivity of the OSCE, considered through inter-observer concordance, we have found in an OSCE among nursing students in our environment (Spain) that competencies related to aspects of interpersonal relationship and communication clearly show a lower inter-observer concordance than that obtained when assessing technical aspects (Castro-Yuste et al., 2018). Likewise, inter-observer concordance in some OSCEs where aspects of interpersonal relationship and communication were assessed was generally lower (Lau et al., 2007; Sakurai et al., 2014; Saraiva et al., 2016; Setyonugroho et al., 2016) than in other OSCEs where more procedural or clinical aspects were assessed (Battistone et al., 2017; Falcone et al., 2011; Pernar et al., 2012; Garg et al., 2015; Noureldin et al., 2016). In this sense, we must not lose sight of the fact that examiners are human, and that there is a thought process behind each assessment that normally remains hidden, without being made explicit (Chahine et al., 2016), which can be influenced by first impressions (Wood et al., 2017). Behavioural and communication assessment, key aspects of interprofessional competencies, may have a subjective component related to preconceived ideas and to the socially and culturally local expectations of examiners, rather than the assessment of purely clinical aspects (Lau et al., 2007; Sakurai et al., 2014).

5. Conclusions

The assessment of interprofessional competencies through a station within an OSCE, as it deals mainly with relational and communicative aspects, requires greater preparation both in terms of the specification of assessment items and in agreement between examiners. Assessment items should reflect each step of the behaviour in a measurable way. To improve inter-observer agreement, it would be useful to review the evaluation items between the evaluators and the standardised actors, to pre-rehearse with the actors and evaluators, and to review the recordings with reflective commentary on the score given by each evaluator.

With this preparation, in our view an interprofessional competencies station within an OSCE could be an efficient method for assessing the interprofessional competencies of university students, by demonstrating competency performance in simulated environments.

Funding source

This work was supported by the Europea de Madrid University under Grant 2018/UEM38 (1750 euros).

Ethical approval

All participants signed an informed consent. A favourable report was obtained from the University’s Research Committee (reference CIPI/18/098).

CRediT authorship contribution statement

Juan Luis González-Pascual: Conceptualization, Methodology, Formal analysis, Supervision, Writing – original draft, Writing – review & editing. Inmaculada López-Martín: Conceptualization, Funding acquisition, Writing – original draft, Writing – review & editing. Elena María Saiz-Navarro: Conceptualization, Investigation, Writing – review & editing. Óscar Oliva-Fernández: Conceptualization, Investigation, Writing – review & editing. Francisco Javier Acebedo-Esteban: Conceptualization, Investigation, Writing – review & editing. Marta Rodríguez-García: Conceptualization, Methodology, Writing – original draft, Writing – review & editing.


Conflict of Interest

None declared.

References

Bandiera, G., Sherbino, J., Frank, J.R., 2006. The CanMEDS Assessment Tools Handbook: An Introductory Guide to Assessment Methods for the CanMEDS Competencies. The Royal College of Physicians and Surgeons of Canada, Ottawa.
Barr, H., 2015. Interprofessional Education. The Genesis of a Global Movement. CAIPE, UK. 〈https://www.caipe.org/resources/publications/barr-h-2015-interprofessional-education-genesis-global-movement〉 (Accessed 3 May 2019).
Battistone, M.J., Barker, A.M., Beck, J.P., Tashjian, R.Z., Cannon, G.W., 2017. Validity evidence for two objective structured clinical examination stations to evaluate core skills of the shoulder and knee assessment. BMC Med. Educ. 17 (1), 13.
Billings, D.M., Halstead, J.A., 2020. Teaching in Nursing: A Guide for Faculty. Elsevier, USA.
Brashers, V., Erickson, J.M., Blackhall, L., Owen, J.A., Thomas, S.M., Conaway, M.R., 2016. Measuring the impact of clinically relevant interprofessional education on undergraduate medical and nursing student competencies: a longitudinal mixed methods approach. J. Interprof. Care 30 (4), 448–457.
Castro-Yuste, C., García-Cabanillas, M.J., Rodríguez-Cornejo, M.J., Carnicer-Fuentes, C., Paloma-Castro, P., Moreno-Corral, L.J., 2018. A student assessment tool for standardized patient simulations (SAT-SPS): psychometric analysis. Nurse Educ. Today 64, 79–84.
Centre for the Advancement of Interprofessional Education (CAIPE), 2016. Definition. 〈https://www.caipe.org/about-us〉 (Accessed 3 May 2019).
Chahine, S., Holmes, B., Kowalewski, Z., 2016. In the minds of OSCE examiners: uncovering hidden assumptions. Adv. Health Sci. Educ. 21 (3), 609–625.
Cullen, L., Fraser, D., Symonds, I., 2003. Strategies for interprofessional education: the Interprofessional Team Objective Structured Clinical Examination for midwifery and medical students. Nurse Educ. Today 23 (6), 427–433.
Curran, V., Hayward, M., Curtis, B., Murphy, S., 2013. Interprofessional Collaborator Assessment Rubric, Modified version 2013. 〈https://www.med.mun.ca/CCHPE/Faculty-Resources/Interprofessional-Collaborator-Assessment-Rubric.aspx〉 (Accessed 9 April 2019).
Donaldson, M.S., 2008. An overview of To Err is Human: re-emphasizing the message of patient safety. In: Hughes, R.G. (Ed.), Patient Safety and Quality: An Evidence-Based Handbook for Nurses. Agency for Healthcare Research and Quality, Rockville, MD.
Ehlers, J.P., Kaap-Fröhlich, S., Mahler, C., Scherer, T., Huber, M., 2017. Analysis of six reviews on the quality of instruments for the evaluation of interprofessional education in German-speaking countries. GMS J. Med. Educ. 34 (3), 36.
Falcone, J.L., Schenarts, K.D., Ferson, P.F., Day, H.D., 2011. Using elements from an acute abdominal pain Objective Structured Clinical Examination (OSCE) leads to more standardized grading in the surgical clerkship for third-year medical students. J. Surg. Educ. 68 (5), 408–413.
Garg, A., Biello, K., Hoot, J.W., Reddy, S.B., Wilson, L., George, P., Robinson-Bostom, L., Belazarian, L., Domingues, E., Powers, G., Besen, J., Geller, A.C., 2015. The Skin Cancer Objective Structured Clinical Examination (SCOSCE): a multi-institutional collaboration to develop and validate a clinical skills assessment for melanoma. J. Am. Acad. Dermatol. 73 (6), 959–965.
Gianoli, G.J., 2016. Medical error epidemic hysteria. Am. J. Med. 129 (12), 1239–1240.
Ginsburg, L.R., Tregunno, D., Norton, P.G., Smee, S., de Vries, I., Sebok, S.S., VanDenKerkhof, E.G., Luctkar-Flude, M., Medves, J., 2015. Development and testing of an objective structured clinical exam (OSCE) to assess socio-cultural dimensions of patient safety competency. BMJ Qual. Saf. 24 (3), 188–194.
Goh, H.S., Zhang, H., Lee, C.N., Wu, X., Wang, W., 2020. The value of nursing objective structured clinical examinations: a scoping review. Nurse Educ. 44 (5), E1–E6.
Gordon, M., Hill, E., Stojan, J.N., Daniel, M., 2018. Educational interventions to improve handover in health care: an updated systematic review. Acad. Med. 93 (8), 1234–1244.
Gordon, M., Uppal, E., Holt, K., Lythgoe, J., Mitchell, A., Hollins-Martin, C., 2013. Application of the team objective structured clinical encounter (TOSCE) for continuing professional development amongst postgraduate health professionals. J. Interprof. Care 27 (2), 191–193. https://doi.org/10.3109/13561820.2012.725232.
Hall, P., Marshall, D., Weaver, L., Boyle, A., Taniguchi, A., 2011. A method to enhance student teams in palliative care: piloting the McMaster-Ottawa Team Observed Structured Clinical Encounter. J. Palliat. Med. 14 (6), 744–750.
Harden, R.M., Lelley, P., Patricio, M., 2016. The Definitive Guide of the OSCE. The Objective Structured Clinical Examination as a Performance Assessment. Elsevier, London.
Havyer, R.D., Nelson, D.R., Wingo, M.T., Comfere, N.I., Halvorsen, A.J., McDonald, F.S., Reed, D.A., 2016. Addressing the interprofessional collaboration competencies of the Association of American Medical Colleges: a systematic review of assessment instruments in undergraduate medical education. Acad. Med. 91 (6), 865–888.
IBM Corp., 2015. IBM SPSS Statistics for Windows, Version 23.0. IBM Corp, Armonk, NY.
Institute of Medicine (IoM), 2000. To Err Is Human: Building a Safer Health System. The National Academies Press, Washington, DC. https://doi.org/10.17226/9728.
Kaneko, R.M.U., Lopes, M.H.B.M., 2019. Realistic health care simulation scenario: what is relevant for its design? Rev. da Esc. de Enferm. da USP 30 (53), 03453. https://doi.org/10.1590/S1980-220X2018015703453.
Khan, K.Z., Ramachandran, S., Gaunt, K., Pushkar, P., 2013a. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part I: an historical and theoretical perspective. Med. Teach. 35 (9), e1437–e1446.
Khan, K.Z., Gaunt, K., Ramachandran, S., Pushkar, P., 2013b. The Objective Structured Clinical Examination (OSCE): AMEE Guide No. 81. Part II: organisation & administration. Med. Teach. 35 (9), e1447–e1463.
Kirkpatrick, D.L., 2004. Evaluación de acciones formativas. Los cuatro niveles [Evaluation models. The four levels]. EPISE, Barcelona.
Landis, J., Koch, G., 1977. The measurement of observer agreement for categorical data. Biometrics 33, 159–174.
Lau, E., Dolovich, L., Austin, Z., 2007. Comparison of self, physician, and simulated patient ratings of pharmacist performance in a family practice simulator. J. Interprof. Care 21 (2), 129–140.
Leung, K., Wang, W., Chen, Y., 2012. Multi-source evaluation of interpersonal and communication skills of family medicine residents. Adv. Health Sci. Educ. 17 (5), 717–726.
Makary, M.A., Daniel, M., 2016. Medical error – the third leading cause of death in the US. BMJ 353, i2139.
McHugh, M.L., 2012. Interrater reliability: the kappa statistic. Biochem. Med. 22 (3), 276–282.
Morison, S.L., Stewart, M.C., 2005. Developing interprofessional assessment. Learn. Health Soc. Care 4, 192–202.
Noureldin, Y.A., Elkoushy, M.A., Aloosh, M., Carrier, S., Elhilali, M.M., Andonian, S., 2016. Objective Structured Assessment of Technical Skills for the Photoselective Vaporization of the Prostate Procedure: a pilot study. J. Endourol. 30 (8), 923–929.
Oates, M., Davidson, M., 2015. A critical appraisal of instruments to measure outcomes of interprofessional education. Med. Educ. 49 (4), 386–398.
Oza, S.K., Boscardin, C.K., Wamsley, M., Sznewajs, A., May, W., Nevins, A., Srinivasan, M., Hauer, K.E., 2015. Assessing 3rd year medical students’ interprofessional collaborative practice behaviors during a standardized patient encounter: a multi-institutional, cross-sectional study. Med. Teach. 37 (10), 915–925.
Pernar, L.I.M., Shaw, T.J., Pozner, C.N., Vogelgesang, K.R., Lacroix, S.E., Gandhi, T.K., Peyre, S.E., 2012. Using an objective structured clinical examination to test adherence to Joint Commission National Patient Safety Goal-Associated Behaviors. Jt. Comm. J. Qual. Patient Saf. 38 (9), 414–418.
Reeves, S., Perrier, L., Goldman, J., Freeth, D., Zwarenstein, M., 2013. Interprofessional education: effects on professional practice and healthcare outcomes (update). Cochrane Database Syst. Rev. (3), CD002213.
Reeves, S., Fletcher, S., Barr, H., Birch, I., Boet, S., Davies, N., McFadyen, A., Rivera, J., Kitto, S., 2016. A BEME systematic review of the effects of interprofessional education: BEME Guide No. 39. Med. Teach. 38 (7), 656–668.
Rogers, G.D., Thistlethwaite, J.E., Anderson, E.S., Dahlgren, M.A., Grymonpre, R.E., Moran, M., Samarasekera, D.D., 2016. International consensus statement on the assessment of interprofessional learning outcomes. Med. Teach. 39 (4), 347–359.
Sakurai, H., Kanada, Y., Sugiura, Y., Motoya, I., Wada, Y., Yamada, M., Tomita, M., Tanabe, S., Teranishi, T., Tsujimura, T., Sawa, S., Okanishi, T., 2014. Reliability of the OSCE for physical and occupational therapists. J. Phys. Ther. Sci. 26 (8), 1147–1152.
Saraiva, M.D., de-Melo-Paulo, M.L., Avelino-Silva, T.J., Gil-Junior, L.A., Kikuchi, E.L., Farias, L.L., Alves, R.L., Suzuki, G.S., Olivieri, F.C., Aranha, V.C., da-Costa-Lopes, L., Passarelli, M.C., Moriguti, J.C., Ferrioli, E., Wen, C.L., Apolinário, D., Jacob-Filho, W., 2016. Evaluating communication skills of geriatrics fellows: interrater agreement of an objective structured clinical examination. J. Am. Geriatr. Soc. 64 (1), 206–207.
Setyonugroho, W., Kropmans, T., Kennedy, K.M., Stewart, B., van Dalen, J., 2016. Calibration of communication skills items in OSCE checklists according to the MAAS-Global. Patient Educ. Couns. 99 (1), 139–146.
Shannon, G., Minckas, N., Tan, D., Haghparast-Bidgoli, H., Batura, N., Mannell, J., 2019. Feminisation of the health workforce and wage conditions of health professions: an exploratory analysis. Hum. Resour. Health 17 (1), 1–16. https://doi.org/10.1186/s12960-019-0406-0.
Sharma, M.K., Chandra, P.S., Chaturvedi, S.K., 2015. Team OSCE: a teaching modality for promotion of multidisciplinary work in mental health settings. Indian J. Psychol. Med. 37, 327–329.
Shrader, S., Farland, M.Z., Danielson, J., Sicat, B., Umland, E.M., 2017. A systematic review of assessment tools measuring interprofessional education outcomes relevant to pharmacy education. Am. J. Pharm. Educ. 81 (6), 119.
Simmons, B., Egan-Lee, E., Wagner, S.J., Esdaile, M., Baker, L., Reeves, S., 2011. Assessment of interprofessional learning: the design of an interprofessional objective structured clinical examination (iOSCE) approach. J. Interprof. Care 25 (1), 73–74.
Stroud, L., Vidyarthi, A.R., 2015. Assessing patient safety competencies using Objective Structured Clinical Exams: a new twist on an old tool. BMJ Qual. Saf. 24 (3), 179–181.
Ten Cate, O., Regehr, G., 2019. The power of subjectivity in the assessment of medical trainees. Acad. Med. 94 (3), 333–337.
Vandelle, S., 2017. Comparing dependent Kappa coefficients obtained on multilevel data. Biom. J. 59 (5), 1016–1034.
Wood, T.J., Chan, J., Humphrey-Murto, S., Pugh, D., Touchie, C., 2017. The influence of first impressions on subsequent ratings within an OSCE station. Adv. Health Sci. Educ. 22 (4), 969–983.
World Health Organization (WHO), 2008. Global Priorities for Research in Patient Safety. 〈https://www.who.int/publications/i/item/WHO-IER-PSP-2008.13〉 (Accessed 3 August 2021).
World Health Organization (WHO), 2010. Framework for Action on Interprofessional Education & Collaborative Practice. 〈https://www.who.int/publications/i/item/framework-for-action-on-interprofessional-education-collaborative-practice〉 (Accessed 3 August 2021).
