An Evaluation of e-OSCE Software For Implementation in The Clinical Skills and Simulation Unit




Introduction
The Objective Structured Clinical Examination (OSCE) was introduced by Harden
and colleagues in 1975 and is an assessment in which medical students rotate
around a series of stations (Fig. 1) [1]. At each station students are asked to
perform a specific procedure that is assessed against a checklist. The student's
final mark is calculated by totalling the scores on each checklist. Because
students are assessed at each station on the same content, this method of
assessment can be seen as objective, valid and reliable [1-3]. Since its
introduction this type of assessment has been implemented and regularly used at
medical schools around the world [2].
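The scoring described above can be sketched in a few lines; this is a minimal illustration with hypothetical station names and item scores, not the eOSCE software's actual data model:

```python
# Hypothetical checklists: each station holds a list of item scores
# (1 = item performed correctly, 0 = not performed).
checklists = {
    "station_1": [1, 0, 1, 1],
    "station_2": [1, 1, 0, 1],
    "station_3": [0, 1, 1, 1],
}

# A station score is the total of its checklist items; the final mark
# is the sum over all stations, as described in the text.
station_scores = {name: sum(items) for name, items in checklists.items()}
final_mark = sum(station_scores.values())

print(station_scores)  # {'station_1': 3, 'station_2': 3, 'station_3': 3}
print(final_mark)      # 9
```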

Fig. 1. An OSCE consists of a circuit of stations where each station starts with a
student; students rotate between stations while being assessed by an examiner at
each station on a different procedure.

The main drawback of OSCEs is that they are time- and resource-consuming both
during and after the assessments. The paper assessments must be verified to
ensure that they contain no missing data. The marks then have to be typed into
an Excel spreadsheet for further processing, and this needs to happen within 24
hours so that students can receive their marks.
Summing up: the OSCE is a time- and resource-intensive form of examination (e.g.
[2], [4]). This unsatisfactory process has motivated us to look for an
alternative way of capturing data for the OSCE, one that is more effective and
increases the validity of marks.
Clickers were previously trialled for this purpose, but proved less effective
owing to infrastructure, training and other constraints. We therefore looked at
alternative e-assessment options that offer better functionality; several
software options were evaluated on price, functionality, effectiveness,
efficiency and support. We opted for the eOSCE software programme developed by
the University of Bern.

The eOSCE team developed their software in several stages and tested earlier
software that allowed assessment on PDAs or tablets. During this phase they
found that users complained that the display was too small for the intended
content. They also observed that, with larger devices, assessors complained
about the ease and ergonomics of use, in particular the weight and heat
emission; for this reason they switched to Apple iPads.
eOSCE comprises three components: OSCE-Eval, used on the tablet for assessment
during the OSCE, replacing the paper-based checklist; OSCE-Track, used to track
the progress of the examination and to assist with early identification of
problems that may occur at stations; and thirdly, the web-based exam management
component, OSCE-Editor. All of these components have undergone various
laboratory and field tests and have been successfully implemented at many other
universities [10].

The software was designed to be user friendly, taking into account that the
examiners experience a high cognitive load during the exam.
Furthermore, a major focus of the project is to improve the unsatisfactory data
quality of the paper-based evaluations. The digital checklists have the potential
to reduce or eliminate input errors by preventing missing, unreadable or
ambiguous data input.
The change from paper checklists to a digital evaluation process is challenging
for the OSCE examiners as it is a previously unused system at SU; we are
interested in how well they will accept the new system in comparison to the
paper version. Computer anxiety, which is the increased stress that a person
experiences when using digital devices, could be evoked when using
OSCE-Eval [14-15].
Research Question
Does OSCE-Eval outperform the pen-and-paper checklist system with regard
to usability, mark processing, data errors, scoring results and preference ratings?

Aims and objectives


To determine whether eOSCE is more efficient than the pen-and-paper method
To determine the correlation between results from the paper-based version and the e-assessment
To evaluate the examiners' feelings regarding the use of eOSCE

Research design and methods


The research will include making observations during the OSCE of the examiners'
behaviour while using the tablets, and of any problems that may occur. Paper
checklists will also be available should there be any problem with using the
electronic checklist. Immediately after the assessment the participants will be
given a short survey to determine the examiners' subjective perceptions of the
usability of the device and software. A focus group with participants will be
held after the marks have been entered, and a comparison of the marks between
the paper checklist and the electronic checklist will also be done. The
participants will be the assessors of the OSCE as well as the administrators in
the Clinical Skills and Simulation Unit.
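The planned comparison of marks between the two checklist types could, for example, be expressed as a Pearson correlation. The sketch below uses entirely hypothetical marks to show the computation; the actual analysis tool is not specified in the proposal:

```python
import math

# Hypothetical marks for the same six candidates, captured on paper
# and via OSCE-Eval.
paper_marks = [62, 75, 58, 81, 69, 73]
eval_marks = [60, 77, 55, 83, 70, 71]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

r = pearson(paper_marks, eval_marks)
print(round(r, 3))  # a value close to 1 would indicate strong agreement
```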
Participants will receive training one or two days before the OSCE to make sure
they know how to use the device. During the training session, before the exam, a
questionnaire will be handed out. It will contain questions regarding their
gender, age and handedness as well as their experience with iPads, iPhones and
touch screens in general. We also want to know their level of experience as
OSCE examiners (all on 7-point rating scales from 1 = not experienced at all to
7 = very experienced).
Thirty minutes before the assessment the examiners will receive the iPads with
the OSCE assessments preloaded. Only half of the examiners will use the tablets
for the assessment, while the other half will assess using the paper checklist.
Each assessor will evaluate the performance of candidates using one checklist
type. After lunch the checklist types will be swapped, so that each examiner
evaluates the remaining candidates using the other system. After the OSCE they
will fill out the questionnaires on usability satisfaction.
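The counterbalanced assignment described above can be sketched as follows; the examiner roster is hypothetical and the actual allocation procedure is left open by the proposal:

```python
# Hypothetical examiner roster.
examiners = ["A", "B", "C", "D", "E", "F"]

# Morning: half of the examiners start on the tablet, half on paper.
half = len(examiners) // 2
morning = {name: ("tablet" if i < half else "paper")
           for i, name in enumerate(examiners)}

# Afternoon: conditions are swapped after lunch, so every examiner
# uses both checklist types over the day.
afternoon = {name: ("paper" if cond == "tablet" else "tablet")
             for name, cond in morning.items()}

print(morning)    # {'A': 'tablet', 'B': 'tablet', 'C': 'tablet', ...}
print(afternoon)  # {'A': 'paper', 'B': 'paper', 'C': 'paper', ...}
```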
Materials
The checklists used will feature the same items in the same order. Assessors
using OSCE-Eval will use Apple iPad (2010) devices, preferably the same model;
if not, this will be noted.
Measurements
Subjective Usability.
To test which checklist type is perceived as more usable, the examiners will be
asked a series of questions, including Likert-scale questions ranging from 1 =
strongly disagree to 7 = strongly agree.
Missing Data.
To test which checklist type contains less missing data, or fewer counting
errors, we will manually count the unanswered evaluation items for the paper
checklists and for OSCE-Eval.
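For the paper checklists this count is manual; for the digital data the same count could be automated. A minimal sketch, assuming each completed checklist is stored as a dict of item to response with None marking an unanswered item (field names are hypothetical):

```python
# Hypothetical completed checklists; None marks an unanswered item.
paper_checklist = {"item_1": 1, "item_2": None, "item_3": 0, "item_4": None}
eval_checklist = {"item_1": 1, "item_2": 1, "item_3": 0, "item_4": 1}

def count_missing(checklist):
    """Count unanswered evaluation items in one checklist."""
    return sum(1 for response in checklist.values() if response is None)

print(count_missing(paper_checklist))  # 2
print(count_missing(eval_checklist))   # 0
```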
Preference Ratings.

After using both OSCE-Eval and paper checklists, assessors will be asked to
choose between the two checklists with a view to future usage.

Assumptions and limitations


We expect the iPad devices to be inviting (playful) tools for the examiners and
to decrease the likelihood of computer anxiety occurring. We anticipate that the
assessors will feel comfortable using eOSCE, and that its use will in turn
decrease the administrative load and error rate of assessments.

Anticipated benefits

It will be a more cost-effective solution in the long term
It will save on the time required for processing marks
It will reduce human error
It will minimize the induced cognitive load for assessors

Ethical considerations
All the examiners of the OSCE will be asked to participate. They will be
informed that their participation is voluntary and that they can indicate if
they do not want the information to be used for research purposes. For the focus
groups, the participants will be made aware that the sessions are recorded, and
their permission to do so will be asked. Participants will also be made aware
that there will be complete anonymity and that no personal information will be
used. Participants are also free to withdraw from the investigation at any
point, even after they have agreed to take part, by means of an e-mail to the
researcher. No incentives will be given for participation.

Conclusion
The mandate of the Health Science Unit for Learning Technologies is to
implement the use of learning technologies within the FMHS and to evaluate
whether these implementations make a considerable difference in the success of
modules. The aim of introducing the eOSCE system is therefore to improve
assessment quality and to reduce the workload while also reducing costs.

Evaluating the effectiveness of the eOSCE system is critical in order to
determine the benefits and possible weaknesses of using it. Therefore, research
has to be done, with the results of these investigations used to develop a
better assessment strategy for the FMHS. Furthermore, we need to enhance the
scholarship of teaching and learning, while contributing to the understanding of
how learning technology initiatives can contribute to health professions
education, which is critical for better curriculum design. The future of
educational technology in medical education should not be limited to the
solitary use of technology for instruction; instead, systematic approaches
should be developed that facilitate teaching, learning and assessment by
selecting the best technological processes.

Addendum A: Focus group

Dear Participant
You are invited to participate in this focus group, which will help us gain
valuable insight into the use of electronic OSCE assessments, particularly your
preferences for its use. The results will be used to implement the use of
electronic OSCE assessments if the trial is a success. Your responses will be
kept confidential and anonymous.

ETHICAL CLEARANCE REFERENCE NUMBER:


PRINCIPAL INVESTIGATOR: Mr Janus van As
ADDRESS: Centre for Health Professions Education, PO Box 19063, Tygerberg, 7505
CONTACT DETAILS: janus@sun.ac.za

PURPOSE OF THE RESEARCH


This information will be gathered to evaluate the implementation of an electronic
OSCE assessment platform. The results of the study will provide insight on
whether eOSCE enhances the assessment experience of assessors. The results
will also be used to make recommendations for the ongoing educational
activities and for any necessary changes to the undergraduate curriculum and
the training of academic staff.

CONSENT
I hereby give consent to participate in the study subject to the understanding
that my responses will be coded and that no personal information will be
acquired or used in this study and that I am free to withdraw at any stage.

______________________________
SIGN

______________________
DATE

You can contact the Health Research Ethics Committee at 021-938 9207
if you have any concerns.
Focus group questions
Guided questions; these may change depending on how the focus group
progresses.
1. What was your experience like, using the tablets for assessment?
2. Was it easy to learn how to use the eOSCE software?
3. Did the training help?
4. What did you think about the ergonomics of using the devices (weight and heat
emission) ?
5. Was it easy to use the device and software for assessment?
6. Did it have any impact on the way you assess? If so how?
7. Which method of assessment do you prefer and why?
8. What did you like about using the eOSCE app?
9. What didn't you like?
10. What impact did the use of this app have on administration?
11. What is your perception on the overall experience?
12. What recommendations would you make?

Addendum B: OSCE observation


ETHICAL CLEARANCE REFERENCE NUMBER:
PRINCIPAL INVESTIGATOR: Mr Janus van As
ADDRESS:
Centre for Health Professions Education, PO Box 19063, Tygerberg, 7505
CONTACT DETAILS: janus@sun.ac.za
This study has been approved by the Health Research Ethics Committee at Stellenbosch
University and will be conducted according to the ethical guidelines and principles of the international
Declaration of Helsinki, South African Guidelines for Good Clinical Practice and the Medical Research
Council (MRC) Ethical Guidelines for Research.

PURPOSE OF THE RESEARCH


This information will be gathered to evaluate the implementation of an eOSCE app in the FMHS. The
results of the study will provide insight into how eOSCE can be used during assessment. The results
will be used to make recommendations for the ongoing assessment activities in the Clinical Skills and
Simulation Unit and the training of academic staff.
CONSENT
I hereby consent to being observed during the OSCE assessment. This implies that I consent to
participate in the study subject to the understanding that my interaction with the tablet and OSCE
app will be observed and recorded, that no personal information will be acquired or used in this
study, and that I am free to withdraw at any stage.
You can contact the Health Research Ethics Committee at 021-938 9207 if you have any concerns.

Declaration by participant
I have read this information and had a chance to ask questions. I understand that taking part in
this study is voluntary and that I may choose to leave the study at any time or refuse to answer
any question.

I agree to being observed

_______________________
Signed

Observations:
1. The lecturers found it easy to use. Yes / No
2. Were any problems observed? Yes / No
3. What type of problems arose during the assessment?
4. Were the problems technology related? Explain.
5. Could the problems be solved? How?
6. What was the average time it took to score students using the tablet?
7. Did lecturers find it easy to sync the marks to the cloud?
8. Was the information always secure?
9. Was any information lost?

_____________________
Date
