Medical Teacher. 2012; 34: e242–e250

WEB PAPER

Using an objective structured video exam to identify differential understanding of aspects of communication skills

DANIELLE A. BARIBEAU(1), ILYA MUKOVOZOV(1), THOMAS SABLJIC(2), KEVIN W. EVA(3) & CARL B. DELOTTINVILLE(2)

(1) University of Toronto, Canada; (2) McMaster University, Canada; (3) University of British Columbia, Canada

DOI: 10.3109/0142159X.2012.660213

Abstract
Background: Effective communication in health care is associated with patient satisfaction and improved clinical outcomes.
Professional schools increasingly incorporate communication training into their curricula. The objective structured video exam
(OSVE) is a video-based examination that provides an economical way of assessing students’ knowledge of communication skills.
This study presents a scoring strategy that enables blueprinting of an OSVE to consensus guidelines, to determine which aspects
of communication skills create the most difficulty for students to understand and to what degree understanding improves through
experiential communication skills training.
Methods: Five interactions between a healthcare professional and client were scripted and filmed using standardized patients.
The dialogues were mapped onto the Kalamazoo consensus statement by having five communication experts view each video and
identify effective and ineffective use of communication skills. Undergraduate students enrolled in a communications course
completed an OSVE on three occasions.
Results: A total of 79 students completed at least one testing session. The scores assigned supported the validity of the scoring
strategy as an indication of knowledge growth. Considerable variability was observed across Kalamazoo sub-domains.
Conclusion: With further refining, this scoring approach may prove useful for educators to tailor their education and assessment
practices to specific consensus guidelines.

Correspondence: C. B. deLottinville, Bachelor of Health Sciences (Honours) Program, MDCL-3316, Faculty of Medicine, McMaster University, 1200 Main Street West, Hamilton, Ontario L8N 3Z5, Canada. Email: carl.delottinville@learnlink.mcmaster.ca

Danielle A. Baribeau, Ilya Mukovozov and Thomas Sabljic contributed equally to this article.

Practice points

- Knowledge of communication skills increases through an experiential approach to learning.
- The OSVE is a cost-effective way of measuring relative strengths and weaknesses in knowledge of communication skills.
- Training for novices should focus on the development of patient-centered communication skills and on how to recognize and improve ineffective communication.

Introduction

Training in communication skills has been shown to improve clinical competence and interviewing skills in a variety of health care disciplines (Aspegren 1999; Yedidia et al. 2003; Haak et al. 2008). That being said, the intangible nature of communication skills creates a challenge for curriculum development and competence assessment. Evaluations of student communication skills during Objective Structured Clinical Examinations (OSCEs) have shown that communication skills vary on a case-by-case basis, thus necessitating that a series of observations be collected (Kroboth et al. 1992; Hodges et al. 1996; Boulet et al. 1998; Guiton et al. 2004). Repeated evaluations using the OSCE format place a high demand on financial resources.

An offshoot of the OSCE, the objective structured video exam (OSVE), is a video-based written examination that provides an efficient and economical way of assessing a student's knowledge of communication skills in a classroom setting (Humphris & Kaney 2000). With this technique, students are typically asked to watch a series of interactions on video between a doctor and a patient. The videos are followed by written questions designed to assess the students' ability to identify, understand, or critique the communication skills portrayed in the video. Evaluation forms have taken on a variety of formats, including multiple choice and short answer questioning (Hulsman et al. 2006; Simpson et al. 2006). A variety of programs have incorporated an OSVE into their curricula (e.g., Simpson et al. 2006), but the validity of the tool as an assessment instrument has not been studied extensively (Humphris & Kaney 2000, 2001; Humphris 2002; Hulsman et al. 2006).

With respect to curriculum development, the literature supports an experiential approach to communication skills training, in which students learn by interacting with real or standardized patients (SPs) while receiving feedback from instructors (Knowles et al. 2001; Shaw 2006; Von Fragstein et al. 2008). Curriculum development can now be guided by expert consensus and written guidelines that describe specific communication skills. For example, in 1999, leaders in the field of medical education agreed upon a set of essential communication skills and tasks, entitled The Kalamazoo consensus statement, which has since served as a framework for communication skills training (Makoul 2001). In short, the framework highlights the following seven essential elements or tasks as being fundamental to clinical communication: (1) building a relationship, (2) opening the discussion, (3) gathering information, (4) understanding the patient's perspective, (5) sharing information, (6) reaching agreement on problems and plans, and (7) providing closure. Specific skill sets are identified for each of the above tasks (Figure 1; Makoul 2001).

Figure 1. Kalamazoo consensus statement tasks (bold face) and skills adapted for the undergraduate level, with assigned coding numbers for use in OSVE scoring.

The skills described under tasks 1, 2, 4, and 6 focus on enhancing the subjective patient experience and relate more to the interpersonal interaction; we have termed these "patient-centered tasks." The skills described under tasks 3, 5, and 7 outline the organizational tasks necessary to structure a medical encounter and acquire or give information; we have termed these "information or organization tasks." There is considerable evidence that both categories are essential for doctor–patient communication (Barry et al. 2000; Ward et al. 2003; Windish et al. 2005; Ruiz-Moral et al. 2006). Little et al. (2001), for example, showed that, when surveyed, patients equally valued both patient-centered tasks (e.g. being listened to, having their concerns understood, establishing a relationship, reaching agreement on plans) and information or organization tasks (e.g. providing information, clear explanations, suggestions for illness prevention).

With respect to education and training, the literature suggests that medical students and physicians alike tend to excel at organization or information-based tasks but struggle with patient-centered tasks. Many studies have quantified the high frequency with which physicians miss opportunities to provide emotional support or empathy during clinical encounters (Levinson et al. 2000; Morse et al. 2008). Aspegren and Lonberg-Madsen (2005) showed that medical students and seasoned physicians alike were experts in content or information-based skills but lacked process and perceptual skills related to building rapport and developing the doctor–patient relationship.

Communication skills training has repeatedly been shown to enhance patient-centered communication. For example, Perera et al. (2009) showed that medical students who received additional feedback on their communication improved significantly in understanding the patient and building the relationship. No improvement was noted with respect to information/organization tasks like sharing information and closing the discussion. Back et al. (2011) found that, prior to communication skills training, oncology fellows missed opportunities to respond to emotional cues, but this improved with an experiential workshop. Bylund et al. (2010) showed that, at baseline, physicians frequently used
organization skills but were relatively weaker at patient-centered skills like negotiating an agenda, checking for understanding, or sharing in decision making. With training, significant improvements were made in two patient-centered tasks: negotiating the agenda and understanding the patient's perspective. Similarly, Butow et al. (2008) showed that frequency of skill use was variable at baseline, but that training increased the physicians' ability to reach agreement in decision making and elicit patient emotions.

It is unknown whether this pattern, in which the organizational aspects of communication are performed better than patient-centered aspects, represents a basic communication tendency, or alternatively whether it is the result of knowledge gained through professional training. Furthermore, it is unknown whether one first requires a knowledge base on the structure of clinical interviewing in order to capitalize on training focused on patient-centered communication. Determining whether or not the same patterns as described in the preceding paragraphs exist in a pre-clinical sample of students would help address both of these issues, thereby yielding further insight into the appropriate timing and strategies for communication skills training at varying levels of experience.

In this study, we use a quasi-experimental, pre-test–post-test design to test which specific communication elements described in the Kalamazoo consensus pose the greatest difficulty to pre-clinical, non-professional undergraduate students, and to what extent knowledge of these skills may be differentially learned in an introductory communication skills course. We demonstrate a way to map an OSVE scoring key onto current consensus guidelines and then present multiple videotaped scenarios to students that represented a variety of professional contexts. We use a response form that requires students to identify effective and ineffective aspects of the skills demonstrated in the videos. To guard against the possibility that any increase in scores could be attributed to general maturation or learning derived from the pre-test, a pair of pre-tests was used for a portion of the sample. Retention of knowledge was assessed with a 4-month follow-up test for the other half of the sample. Extrapolating from the literature, we hypothesize that students will be overall more effective at recognizing information or organization tasks. With training, we anticipate that patient-centered task recognition should improve, particularly with respect to understanding the patient's perspective and reaching agreement.

Methods

Part A: Design of the OSVE

Design of video vignettes. Five interactions between a health care professional and client were scripted to contain 5 min of dialogue. Each of these scenarios was written deliberately to contain many elements of effective and ineffective communication skills based on the Kalamazoo consensus statement. The videos were intended to be realistic and were not focused on a particular subset of skills. SPs were recruited from the Standardized Patient Program at the Centre for Simulation-Based Learning at the McMaster University Medical Centre. A total of five videos were filmed using SPs, and each was cropped to a duration of 4 to 5.5 min.

OSVE response forms. The dialogue in each video was transcribed verbatim into a response form, adjacent to two blank columns (Figure 2). Participants were instructed to identify and evaluate the interviewer's communication skills by commenting in the blank columns adjacent to the area in the script where they perceived communication skills to be employed. The responses were requested in "free text" form, in that the comments were not restricted to any particular aspect of the dialogue. Participants recorded comments on effective communication skills in column A, whereas comments on ineffective communication skills or missed opportunities to use communication skills were recorded in column B.

Figure 2. Sample OSVE response form. The students are instructed to describe the communication skills or tasks that were effective (column A) or ineffective (column B), adjacent to the line number where these skills occur in the dialogue.

Part B: Development of a criterion standard

Acquiring expert responses. It was important to determine which communication skills were portrayed by the SPs in the videos in an identifiable form and, therefore, could be evaluated on the scoring key (criterion standard). To this end, rather than relying solely on the scenario author's opinion or intention, five communication skills experts were recruited to create a criterion standard. Each expert had at least 15 years of experience teaching communication skills and was very familiar with the Kalamazoo consensus statement. Experts followed a testing regimen identical to that which would subsequently be applied to the student participants (i.e., an initial viewing of a video scenario followed by 9 min to complete the response form and a second viewing). In one testing session, the experts were instructed to identify and comment on communication skills portrayed in all five video scenarios.

Interpreting expert responses. The Kalamazoo consensus statement was used as a template for interpreting expert responses. Numeric codes were assigned to the seven communication categories or (where possible) to each of the 24 more specific communication skills that reside within specific categories, for a total of 31 possible codes (Figure 1). Carl Delottinville, Ilya Mukovozov, Thomas Sabljic, and Danielle Baribeau then independently matched the expert responses to the most appropriate items on the template. Each author applied the template codes to expert responses for all five video scenarios and for all five experts. Agreement between authors with respect to the coding scheme was then assessed. Using the binomial probability theorem, it was determined that the probability of at least three out of four authors agreeing on one of 31 possible codes by chance alone was 0.01%. Using this strict criterion for consensus resulted in the inclusion of only expert responses that could be clearly attributed to one communication category or skill.

Comparing responses between experts. The responses that were similarly coded by at least three authors were then compared across experts for each line of the dialogue. Where two or more of the five experts identified the same specific task or communication category in the same line range, the line

range was marked and the response was retained as a component of the criterion standard. This relatively liberal
criterion was adopted because, with roughly 100 lines of text
per case on average (range 80–182), two response columns
and 32 possible codes (the 31 codes in Figure 1 plus the
chance that a line of text would not be coded) the likelihood
that at least two experts would select the same line of dialogue
and assign the same code to that dialogue by chance alone
is only 0.03%. Figure 3 illustrates a sample portion of the
criterion standard for a single video.
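The two chance-agreement thresholds quoted in this section (0.01% for author coding agreement and 0.03% for expert agreement on a line and code) can be reproduced under a simple uniform-guessing model. The sketch below is only one plausible reading of the calculation; the exact probability model is not spelled out in the text, so the independence and uniformity assumptions here are ours.

```python
from math import comb

# Assumption: each of the 4 coding authors independently assigns one of the
# 31 Kalamazoo codes uniformly at random to a given expert response.
n_codes = 31
p = 1 / n_codes

# Probability that at least 3 of the 4 authors land on a particular code by
# chance: exactly three do (the fourth picks something else), or all four do.
p_author_agreement = comb(4, 3) * p**3 * (1 - p) + p**4
print(f"P(>=3 of 4 authors agree on a code) = {p_author_agreement:.3%}")   # ~0.013%

# Assumption: a second expert commenting at random lands on one of ~100
# transcript lines and one of 32 codes (the 31 codes plus "not coded").
n_lines, n_codes_or_none = 100, 32
p_expert_agreement = (1 / n_lines) * (1 / n_codes_or_none)
print(f"P(same line and same code by chance) = {p_expert_agreement:.3%}")  # ~0.031%
```

Under these assumptions the computed values match the 0.01% and 0.03% figures reported above to the precision quoted.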

The criterion standard. The protocol described in Part B generated five OSVE marking keys, each containing between
26 and 36 scorable responses distributed between both
columns. Some skills were represented more frequently
than others. Table 1 demonstrates the number of times each skill was portrayed in each video scenario and in which column the skills were identified, as per the criterion standard.

Figure 3. Sample OSVE marking scheme. The communication skills or tasks and their line ranges, identified by a panel of experts, are indicated on the marking scheme. The bracketed number corresponds to the order of the responses and the subsequent double-digit number indicates the code that corresponds with the specific behaviors presented in Figure 1.

Part C: Student population and testing

The student population. We recruited from a population of 87 non-professional undergraduate students enrolled in a third-year Communication Skills course for the 2007/2008 academic year. Ethics approval was granted by McMaster University's Faculty of Health Sciences Research Ethics Board. On agreeing to participate in the study, students were randomly allocated an anonymous participant number, thus blinding the authors to the participant's identity, semester of enrollment, and test administration protocol.

Table 1. Frequency, type (according to Kalamazoo consensus) and quality of communication skill (A: effective vs. B: ineffective/missed) listed in the criterion scoring sheet.

                                                  Video A   Video B   Video C   Video D   Video E
                                                   A   B     A   B     A   B     A   B     A   B    Total
10. Builds and sustains a trusting relationship   10   3     5   6     4   4     7   4     5   6      54
20. Opens the discussion                            1   2     1   0     1   0     1   0     0   0       6
30. Gathers information                             3   1     1   0     7   0     1   0     2   1      16
40. Understands the interviewee's perspective       7   1     1   5     4   6     2   4     2   4      36
50. Shares information                              2   0     2   2     0   0     1   0     5   0      12
60. Reaches agreement                               3   0     0   3     1   0     1   3     0   2      13
70. Provides closure                                2   1     0   1     0   1     0   2     0   1       8
Total                                              28   8    10  17    17  11    13  13    14  14     145

Figure 4. Study design. A cross-over study design permitted simultaneous testing for communication skills knowledge
acquisition, knowledge retention, as well as learning effects from repeated testing.

The communication skills course. The communication skills course, as the educational intervention, consisted of 3 class hours per week for 12 weeks. This course used an experiential approach to communication skills training, with SP interactions lasting 2 h each week accompanied by immediate tutor and group feedback. Students were also given an opportunity to observe and reflect on their own interviews, which were video recorded. The students completed weekly journal reflections on their communication skills as well as two written projects incorporating evidence-based literature specific to communication skills.

Testing protocol. Half of the students (n = 45) were enrolled in the semester 1 course (September to December), while the other half (n = 42) were enrolled in semester 2 (January to April). Students in both semesters were requested to attend three separate testing sessions, the first in September, the second in December/January, and the third in April. Mean scores were compared before and after the educational intervention, to measure the effect of communication skills training and to control for inter-student variability. Additionally, semester 1 students were tested 4 months after completing the course, to measure knowledge retention. Semester 2 students were tested 4 months prior to, and directly prior to, the educational intervention, to measure the learning effect derived from repeated OSVE participation. As such, each semester of students was requested to attend one testing session outside of allotted classroom time. See Figure 4 for an illustration of the research design.

Testing sessions. The sequence of video administration was planned such that each student viewed three video scenarios during each of the three testing sessions. During the second and third testing sessions, they viewed two previously viewed scenarios and one new scenario. The order of video administration varied by testing session, such that each student viewed each video no more than twice, and each video was represented across all testing sessions. As described above, the students were tasked to identify and comment on the effective use of communication skills portrayed by the interviewer in column A, as well as areas of ineffective use of communication skills or missed opportunities for communication skills in column B. A pilot testing session was conducted with a sample group of student volunteers (n = 4). From this, it was determined that 9 min spaced between two video viewings provided adequate time for students to complete the OSVE response form for each video vignette. Viewing three different videos and completing three response forms in this way required a total testing time of approximately 1 h.

Marking student response forms. Danielle Baribeau, Thomas Sabljic, and Ilya Mukovozov subsequently compared student responses to those on the criterion standard developed from the expert responses. Where a student provided the same response as an expert in the same column and line range, one mark was allocated. Marks were not removed for incorrect responses. During the marking of student responses, specific communication skills were interpreted as dependent on the
communication categories. Where a communication category was considered a correct response on the criterion standard,
a student could receive a mark for commenting on a specific
skill within that category. This decision was made to avoid
being overly narrow in interpreting student-expert alignment,
and was consistently applied to all testing sessions.
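To make the marking rule concrete, the sketch below awards one mark when a student comment sits in the same column and line range as a criterion-standard item and names either the same code or a specific skill within an accepted category; nothing is deducted for unmatched responses. The data structures, the code-numbering convention (categories as multiples of 10, specific skills sharing the leading digit), and the example values are hypothetical illustrations, not the authors' actual marking materials.

```python
from dataclasses import dataclass

@dataclass
class Response:
    column: str        # "A" = effective, "B" = ineffective/missed opportunity
    lines: range       # transcript line range the comment is attached to
    code: int          # Kalamazoo code, e.g. 40 (category) or 42 (specific skill)

def code_matches(student_code: int, key_code: int) -> bool:
    """Exact match, or a specific skill when the key accepts the whole category."""
    if student_code == key_code:
        return True
    key_is_category = key_code % 10 == 0
    return key_is_category and (student_code // 10) * 10 == key_code

def mark(student: Response, key: list[Response]) -> int:
    """One mark per student response that matches any criterion-standard item."""
    for item in key:
        same_place = (student.column == item.column
                      and set(student.lines) & set(item.lines))
        if same_place and code_matches(student.code, item.code):
            return 1
    return 0  # marks are not removed for incorrect responses

# Hypothetical key item: lines 12-15, column A, category 40 ("understands the
# interviewee's perspective"); the student comments on line 13 with skill 42.
key = [Response("A", range(12, 16), 40)]
print(mark(Response("A", range(13, 14), 42), key))  # 1
print(mark(Response("B", range(13, 14), 42), key))  # 0 (wrong column)
```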

Part D: Analysis
Student responses were analyzed with respect to the percent correct (i.e., the number of correct items noted by participants / the number included in the answer key generated by the expert panel), as well as the percent accuracy (the number of correct items noted by participants / the total number of responses noted by participants). Percentage scores were used as opposed to absolute scores given that each video scenario contained a variable number of potentially correct items under the different skill sub-domains. This avoided skewing the relative contribution of student scores toward OSVE vignettes with more items. It also enabled a ready comparison across communication skill category sub-domains. The percent correct score for each sub-domain was averaged across all five videos. Mixed design analysis of variance (ANOVA) was performed on this variable, treating the term in which participants received the educational intervention (Fall vs Winter) as a between-subjects factor and test administration (September, January, or April) as a repeated measure. Planned comparison t-tests were used to determine the source of any significant effects. ANOVA was similarly used to determine which specific aspects of communication skills students had the greatest difficulty identifying.
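The following is a minimal sketch of the two aggregate scores defined above; the function and variable names are ours, not the authors'. The per-sub-domain percent correct would then be averaged across the five videos before the mixed design ANOVA (for which a routine such as pingouin's mixed_anova, with semester as the between-subjects factor and session as the repeated measure, is one option).

```python
def percent_correct(n_matched: int, n_key_items: int) -> float:
    """Correct items noted by the student / items on the expert answer key."""
    return 100.0 * n_matched / n_key_items

def percent_accuracy(n_matched: int, n_responses: int) -> float:
    """Correct items noted by the student / total responses the student gave."""
    return 100.0 * n_matched / n_responses

# Hypothetical single-video example: the key holds 30 scorable items, the
# student wrote 14 comments, 5 of which matched the key.
print(f"percent correct:  {percent_correct(5, 30):.1f}%")   # 16.7%
print(f"percent accuracy: {percent_accuracy(5, 14):.1f}%")  # 35.7%
```

Working in percentages rather than raw counts keeps vignettes with more scorable items from dominating the averages, as noted above.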
Results

In total, 79 students (90.1% of all eligible participants) contributed data to this study, although the sample size varied across administrations (n = 61, 69, and 39 for tests 1, 2, and 3, respectively). A total of 27/79 students attended all three testing sessions. Mean scores for students who completed all three sessions were compared to those who completed only one or two testing sessions. Pre-intervention scores were slightly higher for students who attended all three testing sessions with respect to percent correct (18.6% vs 15.3%, p = 0.02), but were not significantly different with respect to percent accuracy (39.1% vs 37.5%, p = 0.56) or with respect to either measure post-intervention (percent accuracy 51.3 vs 54.6, p = 0.27; percent correct 28.5 vs 31.2, p = 0.08).

Authors had a high level of agreement with respect to scoring the student responses. The inter-rater reliability calculations with respect to scoring agreement for each video scenario were: Scenario A = 0.95; B = 0.95; C = 0.90; D = 0.92; and E = 0.95.

Both percent accuracy and percent correct scores increased following the communication skills training. The total number of responses provided by each student also increased. Overall mean percent correct scores by term and testing session are illustrated in Figure 5. For percent accuracy, Term 1 mean scores increased from 39.2% to 51.3% after the intervention. Term 2 percent accuracy mean scores increased from 36.4% to 55.8%. The total number of given responses increased from a mean of 12.7 (T1) and 12.9 (T2) to 17.2 (T1) and 15.2 (T2). A mixed design ANOVA performed on the percent correct scores revealed no main effect of semester (F < 1, p > 0.4), a significant effect of session (F = 11.1, p < 0.001), and a semester × session interaction that bordered on significance (F = 2.8, p < 0.08). Planned post hoc comparisons revealed significant differences in the places that would be expected given the above description: in semester 1 students, pre-test scores were significantly lower than post-test scores (t = 5.6, p < 0.001) and post-test scores did not differ from retention scores (t = 0.9, p > 0.3). In semester 2 students, both pre-test scores were significantly lower than post-test scores (t = 4.6 and 5.8, p < 0.001).

Figure 5. Student scores on the OSVE. T1 (Term 1): Results shown before and after taking a communication skills course, as well as when tested for retention after 4 months. T2 (Term 2): Results shown from repeated pre-intervention testing, as well as before and after taking a communication skills course.

To flesh out which aspects of communication skills gave students the greatest difficulty, sub-scores were created for both the "effectively used" items from the answer key and the "missed opportunity/ineffective" items, as well as for each of the sub-domains included in the Kalamazoo consensus statement (Figure 1). ANOVA revealed that students were significantly better at correctly identifying aspects of communication skills that were effectively used by individuals in the OSVE videos (mean = 24.2% correct, 95% CI = 22.3–26.1%) relative to identifying missed opportunities or ineffective communication (mean = 18.6% correct, 95% CI = 17.0–20.1%; F = 31.8, p < 0.001). The correlation between percent correct on "effectively used" items and "missed opportunity/ineffective" items was r = 0.36, p < 0.01.

With respect to the Kalamazoo sub-domains, ANOVA similarly revealed that statistically meaningful differences exist regarding students' capacity to identify different aspects of communication skills. A main effect of sub-domain was observed (F = 17.5, p < 0.001), with mean percent correct ranging from a low of 9.6% (95% CI = 7.9–11.3%) for the "Reaches Agreement" sub-domain to a high of 29.6% (95% CI = 25.4–33.9%) for the "Shares Information" sub-domain. Table 2 illustrates the mean percent correct for each of the Kalamazoo consensus statement sub-domains, along with a breakdown of these scores pre- vs post-test indicating the extent to which students' knowledge in each sub-domain increased as a result of the learning enabled by the educational intervention.
Table 2. Percent correct as a function of Kalamazoo consensus statement sub-scores.

Skill sub-type                                    Mean percent correct across all     Pre-test mean     Post-test mean    p-Value comparing
                                                  testing sessions and all videos     percent correct   percent correct   pre-test to post-test
                                                  (95% confidence interval)
10. Builds and sustains a trusting relationship   17.7% (16.0–19.5)                   21.4              22.1              0.75
20. Opens the discussion                          20.6% (16.5–24.8)                   24.0              16.0              0.24
30. Gathers information                           23.4% (19.6–27.4)                   14.1              36.3              <0.01
40. Understands the interviewee's perspective     23.7% (21.5–25.9)                   16.9              33.0              <0.001
50. Shares information                            29.6% (25.4–33.9)                   26.0              30.1              0.55
60. Reaches agreement                              9.6% (7.9–11.4)                     5.8              18.2              <0.001
70. Provides closure                              27.6% (23.0–32.2)                   24.0              40.0              <0.05

Notes: The overall mean percent correct is based on the total set of data collected, whereas the pre-test and post-test means are calculated based on only those individuals who contributed data before and after the educational intervention.

In four of the seven domains, statistically significant gains in performance were achieved: "Gathers Information," "Understands the Interviewee's Perspective," "Reaches Agreement," and "Provides Closure." Cronbach's alpha examining the consistency with which sub-domain scores differentiate between candidates was found to be 0.54.

Discussion

In this study, we designed and tested a novel OSVE response form and marking protocol, based on consensus guidelines, to assess knowledge of communication skills. The "free text" design of the response form encouraged students to identify, qualify, and comment on the communication skills they perceived in each video scenario. This approach varies from that which has been reported in other OSVE testing protocols, and represents a method of accommodating the subjective nature of clinical interactions.

The development of the criterion standard using expert consensus revealed that experts were more likely to identify communication skills related to the patient experience of the interpersonal interaction (i.e., building the relationship and understanding the patient's perspective; Table 1). The pre-professional students included in this sample, in contrast, were more likely to identify aspects of communication skills that corresponded with organizational tasks (i.e., sharing information and providing closure; Table 2). This is in line with what has been reported in the literature: health care students and professionals who receive additional training are more likely to employ patient-centered communication skills.

That said, training of these relative novices via an experiential communication skills course led to improvement in four out of seven sub-domains, including both patient-centered tasks and concrete/organizational tasks. Previous studies have shown improvement primarily in the patient-centered sub-domains. We hypothesize that this difference may relate to the nature of the non-professional undergraduate student population, who had no prior exposure to either the structure or the skills required for clinical interviewing. As a result, training permitted not only development of patient-centered skills, but also a basic introduction to the structure and format of a clinical encounter.

Students were relatively better at identifying communication skills that were correctly demonstrated in a video, and relatively weaker at identifying missed opportunities or generating ideas on how the communication skills demonstrated could be improved. These results, in combination, suggest that pre-clinical training in communication should focus on patient-centered communication skills, such as relationship building, understanding the patient's perspective, and reaching agreement. Training should be specifically aimed at helping students to identify common mistakes and develop ways to improve the interaction.

It is important to note that the communication skills course was not built around the consensus guidelines or the OSVE scenarios used. Students acquired their knowledge through experiential learning, journal reflections, using simulated patients, peer feedback, and by exploring the literature. As such, we consider the results to be reasonably representative of what can be expected of the broader population of pre-clinical students rather than being the specific result of this particular communication skills course. We specify "pre-clinical" because, while a high proportion of students in the Bachelor of Health Sciences Program at McMaster University enter medical school upon graduation, this specific course involved a non-professional undergraduate population, meaning that the students did not have clinical knowledge to incorporate into their communication skills training.

Unfortunately, student participation rates for testing sessions held outside of class time were relatively low, creating a potential selection bias toward more dedicated or interested students. That being said, mean scores for students who completed all three sessions were for the most part not significantly different from those who completed only one or two testing sessions. Further, our data suggest that repeated OSVE administration or maturation throughout the academic year did not in itself improve student knowledge, as a pair of pre-tests revealed similar performance, as did the post-test vs retention comparison. Further work is required to determine whether or not this OSVE scoring technique could be used to enable tailoring of feedback to individual students or to identify those in need of remediation.

Of note, the students achieved relatively low mean percent correct scores, both before and after the study intervention. The low scores may be attributable to the protocol used to generate the criterion standard. Five experts contributed to the
marking key, but consensus was only required between two experts on a particular item for it to be included in the criterion standard. As a result, each individual expert contributed only 20% of the criterion standard, approximating the percent correct scores achieved by students. A more stringent protocol regarding expert consensus would be expected to raise the mean percent correct by limiting the answer key to those items that are more absolutely observable. This was not done in the context of this study as the relative accuracy across sub-domains was of dominant interest. For those looking to reproduce this type of evaluation for educational purposes, an open discussion among educators during the generation of the marking scheme would be more straightforward and likely appropriate for nonresearch-based endeavors.

A final limitation of the study arises from lack of clarity regarding the significance of communication skills knowledge. There is limited data available to enable confident claims that greater knowledge of communication portrayed on OSVEs is associated with performance (Humphris 2002). More research is needed to determine if training and assessment of communication skills knowledge translates to performance in clinical encounters.

Conclusions

An OSVE based on consensus guidelines with respect to clinical communication was capable of tracking increases in student knowledge of communication skills following an educational intervention. The students' ability to identify communication skills varied depending on the skill sub-type. Students were better at identifying information or organization-based tasks such as sharing information and closing the discussion. They were weaker at recognizing patient-centered tasks such as building the relationship and reaching agreement with the patient. Communication skills training resulted in improved recognition of some but not all types of communication skills. Educators in the field of clinical communication may find it useful to evaluate knowledge acquisition of specific communication skill sub-types using OSVEs, to enable tailoring of feedback and further curriculum development to the specific deficiencies observed. Pre-clinical training in communication should focus on recognizing opportunities to improve communication skills that enhance the subjective patient experience.

Acknowledgments

The authors thank Jennifer Gallé, Osama Khan, and Jayant Ramakrishna for their preliminary work on this project. Gratitude is extended to the Bachelor of Health Sciences Program at McMaster University for providing funding and technical support. This research was conducted at McMaster University, Hamilton, Ontario, Canada.

Declaration of interest: The authors report no declarations of interest.

Notes on contributors

DANIELLE A. BARIBEAU, BHSc, is a medical student at the University of Toronto, Ontario, Canada. She completed a Bachelor's degree in Health Sciences in 2008 at McMaster University, Hamilton, Canada.

ILYA MUKOVOZOV, MSc, is a student in the MD/PhD Program at the University of Toronto, enrolled in the Institute of Medical Science and working as a research associate in the Department of Cell Biology at the Hospital for Sick Children, Toronto, Canada.

THOMAS SABLJIC, MSc, is a doctoral student in the Medical Sciences Program at McMaster University, Hamilton, Canada. He completed a Bachelor's degree in Health Sciences at McMaster University in 2008.

KEVIN W. EVA, PhD, is a senior scientist in the Centre for Health Education Scholarship, associate professor and director of Educational Research and Scholarship in the Department of Medicine at the University of British Columbia, Vancouver, Canada.

CARL B. DELOTTINVILLE, MSW, is an instructor in the Honours Bachelor of Health Sciences Program and an associate clinical professor in the Department of Psychiatry and Behavioral Neurosciences at McMaster University, Hamilton, Canada.

References

Aspegren K. 1999. BEME guide no. 2: Teaching and learning communication skills in medicine - a review with quality grading of articles. Med Teach 21(6):563–570.
Aspegren K, Lonberg-Madsen P. 2005. Which basic communication skills in medicine are learnt spontaneously and which need to be taught and trained? Med Teach 27(6):539–543.
Back A, Arnold R, Baile W, Fryer-Edwards K, Alexandre S, Barley G, Gooley T, Tulsky J. 2011. Efficacy of communication skills training for giving bad news and discussing transition to palliative care. Arch Intern Med 167:453–460.
Barry C, Bradley C, Britten N, Stevenson F, Barber N. 2000. Patients' unvoiced agendas in general practice consultations: Qualitative study. BMJ 320(7244):1246–1250.
Boulet JR, David B, Friedman M, Ziv A, Burdick WP, Curtis M, Peitzman S, Gary N. 1998. High-stakes examinations: What do we know about measurement? Using standardized patients to assess the interpersonal skills of physicians. Acad Med 73(10):S94–S96.
Butow P, Cockburn J, Girgis A, Bowman D, Schofield P, D'Este C, Stojanovski E, Tattersall M. 2008. Increasing oncologists' skills in eliciting and responding to emotional cues: Evaluation of a communication skills training program. Psychooncology 17(3):209–218.
Bylund C, Brown R, Gueguen J, Diamond C, Bianculli J, Kissane D. 2010. The implementation and assessment of a comprehensive communication skills training curriculum for oncologists. Psychooncology 19(6):583–593.
Guiton G, Hodgson CS, Delandshere G, Wilkerson L. 2004. Communication skills in standardized-patient assessment of final-year medical students: A psychometric study. Adv Health Sci Educ 9(3):179–187.
Haak R, Rosenbohm J, Koerfer A, Obliers R, Wicht MJ. 2008. The effect of undergraduate education in communications skills: A randomized controlled clinical trial. Eur J Dent Educ 12(4):213–218.
Hodges B, Turnbull F, Cohen R, Bienenstock A, Norman G. 1996. Evaluating communication skills in the objective structured clinical examination format: Reliability and generalizability. Med Educ 30:38–43.
Hulsman RL, Mollema ED, Oort FJ, Hoos AM, de Haes JCJM. 2006. Using standardized video cases for assessment of medical communication skills: Reliability of an objective structured video examination by computer. Patient Educ Couns 60(1):24–31.
Humphris G. 2002. Communications skills knowledge, understanding and OSCE performance in medical trainees: A multivariate prospective study using structural equation modeling. Med Educ 36(9):842–852.
Humphris G, Kaney S. 2000. The objective structured video exam for assessment of communication skills. Med Educ 34(11):939–945.
Humphris G, Kaney S. 2001. Assessing the development of communication skills in undergraduate medical students. Med Educ 35(3):225–231.
Knowles C, Kinchington F, Erwin J, Peters B. 2001. A randomized controlled trial of the effectiveness of combining video role play with traditional methods of delivering undergraduate medical education. Sex Transm Infect 77:376–380.
Kroboth F, Hanusa BH, Parker S, Coulehan JL, Kapoor WN, Brown FH, Karpf M, Levey G. 1992. The inter-rater reliability and internal consistency of a clinical evaluation exercise. J Gen Intern Med 7(2):174–179.
Levinson W, Gorawara-Bhat R, Lamb J. 2000. A study of patient clues and physician responses in primary care and surgical settings. J Am Med Assoc 284(8):1021–1027.
Little P, Everitt H, Williamson I, Warner G, Moore M, Gould C, Ferrier K, Payne S. 2001. Preferences of patients for patient centered approach to consultation in primary care: Observational study. BMJ 322(7284):468–472.
Makoul G. 2001. Essential elements of communication in medical encounters: The Kalamazoo consensus statement. Acad Med 76(4):390–393.
Morse D, Edwardsen E, Gordon H. 2008. Missed opportunities for interval empathy in lung cancer communication. Arch Intern Med 168(17):1853–1858.
Perera J, Mohamadou G, Kaur S. 2009. The use of Objective Structured Self-Assessment and Peer Feedback (OSSP) for learning communication skills: Evaluation using a controlled trial. Adv Health Sci Educ 15(2):185–193.
Ruiz-Moral R, Perez Rodriguez E, Perula de Torres LA, de la Torre J. 2006. Physician-patient communication: A study on the observed behaviours of specialty physicians and the ways their patients perceive them. Patient Educ Couns 64(1–3):242–248.
Shaw J. 2006. Four core communication skills of highly effective practitioners. Vet Clin North Am Small Anim Pract 36(2):385–396.
Simpson D, Gehl S, Helm R, Kerwin D, Drewniak T, Bragg D, Ziebert M, Denson S, Brown D, Gleason H, et al. 2006. Objective Structured Video Examinations (OSVEs) for geriatrics education. Gerontol Geriatr Educ 26(4):7–24.
Stewart MA. 1995. Effective physician-patient communication and health outcomes: A review. CMAJ 152(9):1423–1433.
The Royal College of Physicians and Surgeons of Canada. 2008–2009. CanMeds best practice submissions. Ottawa: [Published 2010 June 20]. Available from: http://rcpsc.medical.org/canmeds/bestpractices/index.php
Von Fragstein M, Silverman J, Cushing A, Quilligan S, Salisbury H, Wiskin C. 2008. UK council for clinical communication skills teaching in undergraduate medical education. Med Educ 42(11):1100–1107.
Ward M, Sundaramurthy S, Lotstein D, Bush TM, Neuwelt CM, Street Jr RL. 2003. Participatory patient-physician communication and morbidity in patients with systemic lupus erythematosus. Arth Rheum 49(6):810–818.
Windish DM, Price EG, Clever SL, Magaziner JL, Thomas PA. 2005. Teaching medical students the important connection between communication and clinical reasoning. J Gen Intern Med 20(12):1108–1113.
Yedidia M, Gillespie CC, Kachur E, Schwartz MD, Ockene J, Chepaitis AE, Snyder CW, Lazare A, Lipkin Jr M. 2003. Effect of communications training on medical student performance. J Am Med Assoc 290(9):1157–1165.
