
Nurse Education Today 31 (2011) 145–151

Best Practice Guidelines for use of OSCEs: Maximising value for student learning
D.D. Nulty a,⁎, M.L. Mitchell b, C.A. Jeffrey c, A. Henderson d, M. Groves e

a Griffith Institute for Higher Education, Griffith University, Mount Gravatt, Brisbane, Queensland 4111, Australia
b School of Nursing and Midwifery, Nathan Campus, Brisbane, Queensland 4111, Australia
c School of Nursing and Midwifery, Griffith University, Logan Campus, University Drive, Meadowbrook, Queensland 4131, Australia
d Nursing Practice Development Unit, Princess Alexandra Hospital, Ipswich Road, Woolloongabba, Queensland 4102, Australia
e Faculty of Health Sciences, University of Queensland, Edith Cavell Building, Royal Brisbane and Women's Hospital, Herston, Qld. 4029, Australia

⁎ Corresponding author. E-mail addresses: d.nulty@griffith.edu.au (D.D. Nulty), marion.mitchell@griffith.edu.au (M.L. Mitchell), c.jeffrey@griffith.edu.au (C.A. Jeffrey), Amanda_Henderson@health.qld.gov.au (A. Henderson), m.groves@uq.edu.au (M. Groves).

Article history: Accepted 4 May 2010

Keywords: Evaluation; Assessment; Clinical performance; Nursing; Competency; OSCE

Summary

Objective structured clinical examinations (OSCEs) are a regular component of Bachelor of Nursing (BN) programs within Australia and internationally. OSCEs are a valuable strategy to assess ‘fitness to practice’ at the students' expected level of clinical practice within a nursing context where the importance of accurate patient assessment is paramount. This report discusses the integration of seven proposed ‘Best Practice Guidelines’ (BPG) into an undergraduate BN program in Queensland, Australia. A range of learning and assessment strategies was introduced in accordance with the adoption of these guidelines to maximise student engagement. There is some evidence that these strategies have directly assisted in enhanced student confidence around clinical practice and provide preliminary evidence of the effectiveness of BPG for OSCEs within nursing programs internationally.

© 2010 Elsevier Ltd. All rights reserved.

Introduction

In the health professions, professional skills and attributes are taught, learnt and assessed in a variety of ways. In nursing, students' ability to demonstrate in action that they have acquired these skills and attributes is assessed formatively and summatively whilst on clinical practicum and via on-campus clinical laboratory summative assessments such as objective structured clinical examinations (OSCEs). OSCEs are a valuable strategy to assess clinical skill acquisition and ‘fitness to practice’ as long as they are applied at the students' expected level of clinical practice. For example, for first year students, being able to perform accurate assessments of a healthy adult is an important level of achievement to assess.

OSCEs were developed in the 1970s for the assessment of medical students (Harden and Gleeson, 1979). They are designed to assess a range of professional skills and knowledge involved in clinical practice and are defined as “an approach to the assessment of clinical competence in which the components of competence are assessed in a well planned or structured way with attention being paid to objectivity” (Harden, 1988, p.19). In medical programs, where their use is now widespread, they commonly consist of a series of short tasks or “stations” designed to test discrete clinical skills (typically, ten stations of 5–8 min each). In this format, they have proven to be a valid and reliable method of assessment (Roberts et al., 2006). OSCEs are also increasingly being adapted for use in other health professional programs, commonly nursing and allied health, with many of the variants focussing on integrated, rather than isolated, assessment tasks which are considered to more accurately reflect real-life clinical settings (Major, 2005).

OSCEs are gaining in popularity in undergraduate nursing programs throughout the western world (Joy and Nickless, 2008), and recent evaluations in Ireland identified a positive impact on all stakeholders (Brosnan et al., 2006). As reported by Mitchell et al. (2009, p. 399):

“An OSCE requires each student to demonstrate specific skills and behaviours in a simulated work environment with standardized patients. It typically consists of a series of short assessment tasks (stations), each of which is assessed by an examiner using a predetermined, objective marking scheme (Bartfay et al., 2004; Major, 2005; Ward and Barratt, 2005)…. It is increasingly being used as a method of assessment in nursing and allied health curricula (Bartfay et al., 2004; Wessel et al., 2003).”

While there is considerable debate around this way of assessing students' clinical competence, particularly in respect of reliability and validity, evidence suggests that it “offers particular strengths in terms of assessor objectivity and parity... for all students [and that, notwithstanding some limitations,] achieving high levels of reliability and validity is possible” (Rushforth, 2007, p.488). Rushforth (2007) also emphasises that research “makes very clear that each new OSCE should be subject to rigorous scrutiny and piloting to ensure that the reliability and validity of that particular assessment is maximized” (Rushforth, 2007, p.488).
Although OSCEs are expensive and labour intensive, there is evidence that they enhance the quality of health professional education (Mitchell et al., 2009). What is not well reported is how they may best be integrated into the curriculum to support quality outcomes. This paper begins to explore how these assessments can demonstrate procedural, professional and educational efficiency and effectiveness. For example, the concepts of “alignment” of curricular components (i.e. teaching, learning and assessment) and “authenticity” (especially in assessment) are critical to good curriculum design and effective learning, particularly with adult learners, because both promote student engagement in behaviours that correlate with the achievement of deep learning outcomes (Biggs, 2003; Cohen, 1987; Meyers and Nulty, 2009; Toohey, 1999). Simulated clinical situations such as OSCEs are intrinsically aligned and authentic, and should also promote student engagement and the achievement of desired learning outcomes. This justifies their use as a learning/assessment tool.

This paper reports on preliminary evidence of the effectiveness of teaching and assessment practices that were modified in accordance with the guidelines proposing the best use of OSCEs.

The contribution of OSCEs

Increasingly, workforce issues within the healthcare sector mean that some approaches to clinical education and assessment (e.g. clinical placements) are becoming difficult to sustain (Rischbieth, 2006). This is exacerbated by simultaneously escalating student numbers in some instances. These changes significantly limit the ability of universities to provide high quality clinical training for nursing students, thereby increasing the imperative to develop alternative and innovative learning opportunities.

Universities need to respond to these challenges and limitations in a way that is acceptable both to the health professions and to other stakeholders (Feingold et al., 2004; Tanner, 2006). Better use of simulated learning has the capacity to reduce clinical placement demands and improve the preparation of new graduates, as controlled pre-set learning can expose students to quality clinical material not always possible in hospital settings (Tanner, 2006; The Nurse Policy Branch of the Victorian Government of Human Services, 2006).

To date, learning through simulated clinical experiences has been limited, with OSCEs being used almost exclusively for assessment purposes. However, we argue that OSCEs present one viable educational strategy to promote student engagement and the achievement of desired learning outcomes, notably including clinical competence, and thereby to meet the challenges arising from the restricted access to clinical areas just described (Tanner, 2006). In this way all students can experience learning, practice and assessment of core skills prior to clinical placement.

The importance of feedback for performance improvement has long been recognised, originally as ‘The law of effect’ (Thorndike, 1911). Learners' motivation has also been noted as of critical importance (McKeachie, 1974, 1990). McKeachie (1974) also argued that, for feedback to improve students' learning, it should have an impact on students' motivation, carry information about the learners' performances, and carry information about ways in which these performances can be improved. In this more sophisticated form, the value of feedback on learning has substantial contemporary support (Brown and Knight, 1994; Race, 2005, 2008). As such, it has been argued that feedback is more effective when provided with minimal delay (Marton et al., 1984; Childs and Sepples, 2006). In the authors' experience and elsewhere, feedback during or immediately following student performance is not always structured into each student's assessment time due to the associated staffing costs (White et al., 2009).

In this paper we argue that the OSCE can be viewed as a method which intrinsically offers the opportunity for feedback (at the very least, through self-reflection) by virtue of the in vivo self-assessment which is implicit in participation, as students are required to ‘talk through’ how their knowledge is applied to the situation. This is another way in which student engagement is boosted, and higher quality learning outcomes achieved.

Specifically, the research literature is unequivocal on the point that assessment activities are both a way for and a driver of students' study behaviours (Biggs, 2002, 2006; Ramsden, 2003; Rowntree, 1977; Sadler, 1987). In this respect, we argue that the participant-observer nature of students' involvement in OSCEs axiomatically develops their skills in self-reflection — and thus, engagement in learning.

As with other types of assessment, OSCEs need to be rigorous, with standardisation and objectivity being principal aims (Ebbert and Connors, 2004). Checklist-style marking has been found to be useful when OSCEs are used for formative assessment (Regehr et al., 1998): checklists can provide specific feedback and reduce inter-rater variability. However, the same research suggests that in summative assessment global rating scales are more valid and reliable (Regehr et al., 1998). This is because global or ‘holistic’ judgements reflect the nature of human judgement and the appraisal of complex tasks for which it is inappropriate to specify components as if they were discrete (Sadler, 2009).

Development of Best Practice Guidelines for OSCEs

Extensive literature reviews (Mitchell et al., 2009), the trialling and piloting of local initiatives, and consideration by experienced nurses, educators, and academics resulted in the development of seven Best Practice Guidelines (BPG) for OSCEs. The BPG recommendations following this comprehensive process were that the OSCE should:

BPG1 Focus on aspects of practice related directly to delivery of safe client/patient care;
BPG2 Focus on aspects of practice which are most relevant and likely to be commonly encountered;
BPG3 Be judged via a holistic marking guide to enhance both the rigor of assessment and reliability. (This allows judgements of students' performance to be related to clinical practice as a whole, rather than as a collection of discrete independent actions.);
BPG4 Require students to perform tasks in an integrated rather than piecemeal fashion by combining assessments of discrete skills in an authentic manner;
BPG5 Be structured and delivered in a manner which aligns directly with mastery of desired knowledge and skill. This alignment should be both internal to the course and (in keeping with #2) aligned prospectively with clinical tasks likely to be encountered;
BPG6 Be appropriately timed in the sequence of students' learning to maximise assimilation and synthesis of disparate course content and to minimise the potential for students to adopt a piecemeal, superficial learning approach;
BPG7 Allow for ongoing practice of integrated clinical assessment and intervention skills, thereby also ensuring the appropriate and timely use of feedback to guide students' development.

From these guidelines, the first year OSCE was revised with the aim not only of enhancing students' awareness of the importance of patient assessment (Clark et al., 2009) but also of increasing their ability to effectively undertake a preliminary assessment during the initial patient interaction. The ‘patients’ in the context of the OSCEs were a mixture of the assessed student's family, friends or a fellow student. The restrictions placed on this choice were that the patient had to be 16 years of age or over and in relatively good health; and, if the student was assessing a fellow student (i.e. students using each other as patients), these assessments were required to take place at least 1 h apart.
Adaptation of the OSCE based on the Best Practice Guidelines

BPG1 Focus on aspects of practice related directly to delivery of safe client/patient care.

The patient assessment, rather than being focussed on a single body system, reflected the real clinical work of a holistic review of an individual. Specifically, the OSCE was changed to the comprehensive, integrated assessment of a healthy adult that incorporated all aspects relevant to general nursing practice. The revised OSCE commenced with an initial client assessment designed to evaluate students' observational skills. Students were encouraged to discuss gait, posture, height, weight, appearance and vital signs, including any obvious deviations from the norm. The manner in which the student interacted with the client was also part of the OSCE and covered communication, consent, confidentiality and privacy issues.

BPG2 Focus on aspects of practice which are most relevant and likely to be commonly encountered.

A complete patient assessment was deemed most relevant as it is a task that nurses undertake many times each day during normal clinical practice. It is also increasingly being recognised as the task that is most poorly performed, thereby increasing risks to patients, particularly when associated with poor communication (Clark et al., 2009).

BPG3 Be judged via a holistic marking guide to enhance both the rigor and reliability of the assessment. This allows judgements of students' performance to be related to clinical practice as a whole, rather than as a collection of discrete independent actions.

The previous marking guide revealed an overly prescriptive approach, with allocation of marks to each item of each area being assessed. The revised marking guide allocated marks to each area, allowing a global judgement for each, thereby improving both construct and concurrent validity (Regehr et al., 1998) [refer Appendix A]. Inter-rater reliability was supported by organising discussions among assessors prior to, throughout and on completion of the OSCE examination period.

BPG4 Require students to perform tasks in an integrated rather than piecemeal fashion by combining assessments of discrete skills in an authentic manner.

This was achieved through the nature of the selected assessment, through the marking guide, and through preparation of the examiners. Combined, these factors ensured that the task was approached and marked in an integrative fashion.

BPG5 Be structured and delivered in a manner which aligns directly with mastery of desired knowledge and skill. This alignment should be both internal to the course and (in keeping with #2) aligned prospectively with clinical tasks likely to be encountered.

While the same course content was taught as in previous years, the timing of the release of information about possible topics for assessment was changed: students were told after nine weeks of teaching, instead of at the beginning of the course. Students were then encouraged to practice the OSCE in the remaining three weeks of semester. This strategy was implemented to allow students to develop a sound understanding of the discipline content prior to the release of possible topics designed to encourage them to integrate and synthesise their learning.

BPG6 Be appropriately timed in the sequence of students' learning to maximise assimilation and synthesis of disparate course content and to minimise the potential for students to adopt a piecemeal, superficial learning approach.

It was considered that the early release of information about the OSCE may detract from what was otherwise an integrated sequence of constructing meaning for students one ‘piece’ at a time. This revised approach facilitated integration and synthesis as students were now provided with the ‘big picture’ and were able to see how the information fitted into it.

BPG7 Allow for ongoing practice of integrated clinical assessment and intervention skills, thereby also ensuring the appropriate and timely use of feedback to guide students' development.

The availability of extra guided sessions during the course was deemed essential to support students' learning through continuous practice. Informal feedback during these sessions highlighted students' strengths and weaknesses and areas for improvement.

Value of the adaptations derived from student feedback

Our university, like many other educational providers worldwide, seeks student feedback on both subject content and assessment. The standard student evaluation form (which asks students for feedback on their subjects), when compared across two years (before the adaptations and afterwards), indicated that those students who were repeating the subject for whatever reason (some may have failed or withdrawn from the subject the previous year) perceived that the new format of the course and its assessment better suited their needs to prepare them for practice. Student feedback is given below according to the BPG to which it has the greatest relevance.

Fifty-eight students (45% of the cohort, n = 150) responded anonymously to the standard questionnaire — the Student Evaluation of Course (SEC) questionnaire — and to a custom questionnaire used as an adjunct. The latter asked specific questions about the impact of the OSCE on students and is reported below. (The SEC consists of 10 standard items which students rate on a seven point Likert scale. The scale has the following descriptors as anchors: 1 = “unacceptable”, 2 = “very poor”, 3 = “poor”, 4 = “average”, 5 = “good”, 6 = “very good” and 7 = “excellent”.)

Quantitative comparison of scores from the previous and current year's SEC suggests that students regard the revised OSCE as an improvement over previous practice. Of the ten items in that survey, the three listed below are relevant to the aims of this project. In each case a substantial increase was seen between the two years' evaluations.

Question 5 How effective were the assessment tasks in this course in helping you with what you were expected to learn? (increased by 0.9 from 5.4 to 6.3)
Question 6 How clear were the assessment guidelines at explaining how assessment tasks would be marked? (increased by 0.8 from 5.4 to 6.2)
Question 10 Overall, how effective was this course in helping you to learn? (increased by 0.9 from 5.5 to 6.4)

The additional customised set of nine questions specific to the adaptation of the OSCE is outlined in Appendix B. The student responses were number coded for reporting purposes. Ethical approval was granted by the university ethics committee.
The following students' comments indicate improvements around developing safe (BPG1) and relevant (BPG2) practice.

“Compared to last year's OSCE,… this was a much better & fairer… I felt confident that I knew all assessments & the reason for doing the assessment, rather than just memorising everything.” Student 10.

“I didn't feel I was expected to know stuff that would rarely/never use in practice.” Student 23.

These comments are important because they identify that the changes around increased safety and relevancy have arguably improved the meaningfulness of the activity to the student rather than implicitly supporting students' engagement in the form of simply memorising.

Evidence around the value of the revised marking guide (BPG3 and BPG4) was demonstrated as follows:

“The un-ambiguity of the study guide and the fact that we knew in advance that it covered all we needed to know.” Student 3.

BPG5 was concerned with delivery that would facilitate maximum performance by the student. In part, success in this area was evidenced through comments such as:

“The way the OSCA was set out, i.e. head to toe, made it much easier to assess the person as the systematic approach was easy to remember.” Student 45.

The findings that only a small percentage of students (12%) thought that being anxious helped, and that nearly 80% felt put at ease by their examiner (Question 4, see Fig. 1), indicated that the focus on practice and developing mastery, as opposed to demonstrating perfection, had a positive effect on student anxiety. Despite this finding, 88% of students indicated that their nervousness was one of the worst aspects of the OSCE (Question 5, Fig. 2), suggesting that removing anxiety from the examination process is difficult even when additional practice and feedback sessions are incorporated.

Fig. 1. The best aspects of the OSCE.

Fig. 2. The worst aspects of the OSCE.

The impact of BPG5, 6 and 7 was better addressed through the customised questions (Appendix B). These questions focussed on students' study behaviours and their reactions to the OSCE (i.e. student engagement). BPG5, 6 and 7 were focused on facilitating students' development of study practices through establishing a process whereby they were more likely to engage in the learning content. Questions 1 and 2 of the customised questions asked students about the study behaviours they ‘should’ enact when preparing for the OSCE and those they said they ‘did’. The purpose here was to determine if students realised the merit of engaging in study behaviours that were geared toward deeper understanding, and if they were subsequently able and motivated to do so. Results (Fig. 3) revealed that students' actual study behaviour was always less than they thought they ‘should’ do, but that this discrepancy was smallest for those behaviours that students most favoured. Thus, 88% of students indicated they should ‘use the marking criteria and notes regarding the OSCE as a guide’, and almost all of these (77%) said they did. At the opposite end of the spectrum, 42% of students said they should ‘read through lecture and lab notes to memorise as much as possible’ but relatively few (21%) did so.

Fig. 3. Preparation strategies used.

That the implementation of the OSCE succeeded in its aim to improve learning and assessment practices was evidenced in comments where students made the connection between the performance of this OSCE and nursing practice, thereby increasing their confidence:

“The labs were great and supported the content of the course. The end of semester OSCE was relevant to the requirements of a nurse.” Student 55.

“Doing this OSCA has increased my confidence for what may come next year…” Student 12.

“I feel more confident because I can actually demonstrate the basic assessment skills, which will become my grounding for second year courses. I feel this course & OSCA above other courses has prepared me more for my 2nd year & also for my 1st year clinical placement.” Student 18.

Furthermore, although an admittedly crude indicator, the final question in the additional questions explored whether participation in the OSCE affected students' confidence. After the OSCE, 58% of students reported feeling more confident about entering second year.

Overview

The ways students are assessed is widely regarded as the single most important aspect of the curriculum in determining what, and how, students learn (i.e. student engagement) (Biggs, 2003; Ramsden, 1992; Rowntree, 1987). It has been shown that the assessment regime can give students de facto ideas about curriculum content and learning requirements, and that these may not match what is stated elsewhere (Snyder, 1971) — the so-called ‘hidden curriculum’. Related ideas have been expressed by Rowntree (1977), who argued that it is through assessment, rather than by any other means, that we know our students.
Related to this is the concept of ‘backwash’ (Biggs, 2002, 2006). This idea is that students' engagement in study behaviours, and the nature of their cognitions, is influenced by messages they perceive from their interpretation of the assessment requirements. These messages may not be the messages we wish to convey, and may be contrary to what is conveyed by other parts of the curriculum. This is partly why Biggs (2006) argued for constructive alignment of all curricular components and for all curricular components to be regarded as parts of an integrated system. It is also why clarity and authenticity in assessment purposes is important — it relates directly to the students' patterns of engagement, and the quality of their learning outcomes.

Notwithstanding questions about the intrinsic authenticity of OSCEs compared with the use of real patients, and ongoing concern about maximising the predictive validity of OSCEs in relation to the clinical practice of nurses (e.g. Rushforth, 2007), this modification and trial application of BPG for OSCEs, derived from consideration of the research literature and the considered views of an expert advisory panel, provides affirmative evidence that makes a provisional case for the effectiveness of these guidelines.

However, consistent with Rushforth's (2007) cautionary notes, this exercise shows equally that the application of the BPG alone does not guarantee positive student learning outcomes (many other variables, including the students' ability, are inevitably at work). Accordingly, further research is required to determine the relationship between the revised OSCE, student learning outcomes and clinical practice performance, as well as to explore the applicability of the BPG to OSCEs in other institutions or health programs internationally with different contextual and cultural features. Such work could also explore more directly the potential for OSCEs to aid learning rather than to be used purely for assessment.

Conclusion

Generic education strategies in the form of BPG for OSCEs have been proposed that are potentially applicable across a range of clinically based health professional education programs. Further research will support refinement of the guidelines, and provide examples of successful cases from other institutions to help guide others in the reform and enhancement of their assessment practices.

Appendix A. Excerpt of marking guides

Excerpt of prescriptive marking guide — 2006

Assessment test                                    Marks possible   Marks awarded   Comments
Assess abdomen
  Inspect abdomen                                  4
  Auscultate abdomen in 4 quadrants
    (provide rationale)                            4
  Percuss abdomen — 4 quadrants                    4
  Percuss liver (indicate size) and stomach        4
  Palpate abdomen in 4 quadrants                   4
  Palpate liver                                    4
  Palpate for aortic pulsation                     2
Total marks

Excerpt of global marking guide — 2007

Assessment test                                                    Comments and marks awarded
Digestive system
  Auscultation of abdomen (4 quadrants in correct order)
  States frequency and character of sounds heard in abdomen
  States rationale for sequence: not to disturb bowel and to
    avoid causing more pain if already tender
  Percussion lower border (RUQ) of liver, finding by describing
    difference in sounds from tympany to dullness
  Percussion over stomach (LUQ), moves over gastric bubble
    describing sounds
Total marks                                                        /15

Appendix B. Custom evaluation of OSCE survey

Question 1

Thinking about the course as a whole, the impression I got about what I should be doing to prepare myself for the OSCE was:

Please tick as many as apply to you; don't worry about what you actually did — that's the next question.

□ To rely mostly on the extra “practice sessions”
□ To use the marking criteria and notes regarding the OSCE as a guide
□ To read through my lecture and lab notes to memorise as much as possible
□ To consult with the lecturer in regard to review of OSCE skills
□ To rely mostly on the labs which are conducted each week
□ To practice skills by using the labs at times which are additional to the timetabled sessions
□ To think about the theory behind the skills while practicing them
□ To draw on the experiences gained through my clinical prac.

Question 2

We recognize that many factors impact on the choice of the actual study behaviour you would engage in. Please select the top three study methods from the list below which most closely match what you actually did to help you prepare for the OSCE.

What I actually did to prepare for the OSCE was:

□ To rely mostly on the extra “practice sessions”
□ To use the marking criteria and notes regarding the OSCE as a guide
□ To read through my lecture and lab notes to memorise as much as possible
□ To consult with the lecturer in regard to review of OSCE skills
□ To rely mostly on the labs which are conducted each week
□ To practice skills by using the labs at times which are additional to the weekly labs
□ To think about the theory behind the skills while practicing them
□ To draw on the experiences gained through your clinical prac.
□ Other? E.g. practice skills on friends, family, etc? (Please specify below)

Question 3

Please explain the main reason for any difference between your answers to questions 1 and 2. (e.g. you may have been too busy with other courses to use the strategies)

Question 4

Thinking back to the OSCE, which of the following were the best aspects? Please tick as many as apply to you.

□ I finished in the allocated time
□ I had the chance to show my knowledge
□ The OSCE was just as I had expected it to be
□ I really felt it all came together for me
□ My nerves actually helped me
□ The examiner made me feel at ease
□ Other? (Please specify below)

Question 5

Thinking back to the OSCE, which of the following were the worst aspects? Please tick as many as apply to you.

□ I did not have enough time to finish
□ I was very nervous
□ I did not feel well prepared
□ It was nothing like I thought it would be
□ I did not feel I was able to show my knowledge
□ The examiner did not make me feel at ease
□ Other? (Please specify below)

Question 6

We would like to know whether you think that you were given enough information about the OSCE and the manner in which you would be marked. Please circle one number on the scale below to indicate your answer.

The amount of information I was given about the OSCE and the manner in which I would be marked was:

1 Unacceptable   2 Very poor   3 Poor   4 Average   5 Good   6 Very good   7 Excellent

Question 7

Thinking back to the OSCE, do you have anything else you would like to tell us or any suggestions? Please explain your answer.

Question 8

Finally, although you do not know your OSCE result, we are interested to find out if completing the OSCE has had any effect on how confident you feel about entering the second year. Please indicate by ticking one of the following.

□ Lower confidence   □ No effect   □ Higher confidence

Question 9

Please explain your answer to Question 8.

References

Bartfay, W.J., Rombough, R., Howse, E., LeBlanc, R., 2004. The OSCE approach in nursing education: objective structured clinical examinations can be effective vehicles for nursing education and practice by promoting the mastery of clinical skills and decision-making in controlled and safe learning environments. The Canadian Nurse 100 (3), 18–25.
Biggs, J.B., 2002. Aligning the Curriculum to Promote Good Learning. Accessed 21/11/02. <http://www.ltsn.ac.uk/application.asp?app=resources.asp&process=full_record&id=167>.
Biggs, J.B., 2003. Teaching for Quality Learning at University, 2nd edition. Open University Press, Maidenhead.
Biggs, J.B., 2006. Teaching for Quality Learning at University, 2nd edition. Open University Press, Maidenhead.
Brosnan, M., Evans, W., Brosnan, E., Brown, G., 2006. Implementing objective structured clinical skills evaluation (OSCE) in nursing registration programmes in a centre in Ireland: a utilisation focused evaluation. Nurse Education Today 26 (2), 115–122.
Brown, S., Knight, P., 1994. Assessing Learners in Higher Education. Kogan Page, London.
Childs, J.C., Sepples, S., 2006. Clinical teaching by simulation: lessons learnt from a complex patient care scenario. Nursing Education Perspectives 27 (3), 154–158.
Clark, E., Squire, S., Heyme, A., Mickle, M.E., Petrie, E., 2009. The PACT project: improving communication at handover. Medical Journal of Australia 190 (Suppl. 11), 125–127.
Cohen, S.A., 1987. Instructional alignment: searching for a magic bullet. Educational Researcher 16 (8), 16–20.
Ebbert, D.W., Connors, H., 2004. Standardised patient experiences: evaluation of clinical performance and nurse practitioner student satisfaction. Nursing Education Perspectives 25 (1), 12–15.
Feingold, C.E., Calaluce, M., Kallen, M.A., 2004. Computerized patient model and simulated clinical with baccalaureate nurses. Journal of Nursing Education 43 (4), 156–163.
Harden, R.M., 1988. What is an OSCE? Medical Teacher 10 (1), 9–22.
Harden, R.M., Gleeson, F.A., 1979. Assessment of clinical competence using an objective structured clinical examination (OSCE). Medical Education 13 (1), 41–54.
Joy, R., Nickless, L., 2008. Revolutionising assessment in a clinical skills environment — a global approach: the recorded assessment. Nurse Education in Practice 8 (5), 352–358.
Major, D., 2005. OSCEs — seven years on the bandwagon: the progress of an objective structured clinical evaluation programme. Nurse Education Today 25 (6), 442–454.
Marton, F., Hounsell, D., Entwistle, A., 1984. The Experience of Learning. Scottish Academic Press, Edinburgh.
McKeachie, W.J., 1974. The decline and fall of the laws of learning. Educational Researcher 3 (3), 7–11.
McKeachie, W.J., 1990. Learning, thinking and Thorndike. Educational Psychologist 25 (2), 127–141.
Meyers, N.M., Nulty, D.D., 2009. How to use (five) curriculum design principles to align authentic learning environments, assessment, students' approaches to thinking, and learning outcomes. Assessment and Evaluation in Higher Education 34 (5), 565–577.
Mitchell, M.L., Henderson, A., Groves, M., Dalton, M., Nulty, D.D., 2009. The Objective Structured Clinical Examination (OSCE): optimising its value in the undergraduate nursing curriculum. Nurse Education Today 29 (4), 398–404.
Race, P., 2005. Making Learning Happen. Sage Publications, London.
Race, P., 2008. Compendium of my Writings on Assessment. <http://phil-race.co.uk/wp-content/plugins/download-monitor/download.php?id=67>.
Ramsden, P., 1992. Learning to Teach in Higher Education. Routledge, London.
Ramsden, P., 2003. Learning to Teach in Higher Education, 2nd Edition. Routledge Falmer, London and New York.
Regehr, G., MacRae, H., Reznick, R.K., Szalay, D., 1998. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Academic Medicine 73 (9), 993–997.
Rischbieth, A., 2006. Matching nurse skill with patient acuity in the intensive care units: a risk management mandate. Journal of Nursing Management 14 (5), 397–404.
Roberts, C., Newble, D., Jolly, B., Reed, M., Hampton, K., 2006. Assuring the quality of high-stakes undergraduate assessments of clinical competence. Medical Teacher 28 (6), 535–543.
Rowntree, D., 1977. Assessing Students: How Shall We Know Them? Harper and Row, London.
Rowntree, D., 1987. Assessing Students: How Shall We Know Them? Kogan Page, London.
Rushforth, H.E., 2007. Objective structured clinical examination (OSCE): review of the literature and implications for nursing education. Nurse Education Today 27 (5), 481–490.
Sadler, D.R., 1987. Specifying and Promulgating Achievement Standards. Ministerial Advisory Committee on Curriculum Development Newsletter 3 (1).
Sadler, D.R., 2009. Indeterminacy in the use of preset criteria for assessment and grading. Assessment and Evaluation in Higher Education 34 (2), 159–179.
Snyder, B.R., 1971. The Hidden Curriculum. Knopf, New York.
Tanner, C.A., 2006. The next transformation: clinical education. Journal of Nursing Education 45 (4), 99–100.
The Nurse Policy Branch of the Victorian Government of Human Services, 2006. Accessed 11/09/07. <http://www.health.vic.gov.au/nursing>.
Thorndike, E.L., 1911. Animal Intelligence: Experimental Studies. Macmillan, New York.
Toohey, S., 1999. Designing Courses for Higher Education. SRHE and Open University Press, Buckingham.
Ward, H., Barratt, J., 2005. Assessment of nurse practitioner advanced clinical practice skills: using the objective structured clinical examination (OSCE). Primary Health Care 15 (10), 37–41.
Wessel, J., Williams, R., Finch, E., Gemus, M., 2003. Reliability and validity of an objective structured clinical examination for physical therapy students. Journal of Allied Health 32 (4), 266–269.
White, C.B., Ross, P.T., Gruppen, L.D., 2009. Remediating students' failed OSCE performances at one school: the effects of self-assessment, reflection, and feedback. Academic Medicine 84 (5), 651–654.

Dr. Nulty is Senior Lecturer in the Griffith Institute for Higher Education at Griffith University, Queensland, Australia. He has more than a decade of experience in teaching, course and program evaluation obtained in several large universities. He has also conducted many educational evaluation consultancies in Australia and overseas.

Dr. Mitchell is a Senior Research Fellow at Griffith University and Princess Alexandra Hospital. Her previous appointment was Deputy Head of School, Logan Campus, Griffith University, Brisbane, Australia. She is instrumental in the development and implementation of innovative educational initiatives designed to improve educational outcomes for graduate nurses from Griffith University.

Carol Jeffrey is in her third year as an Associate Lecturer in the Undergraduate Nursing Degree program at Logan Campus, teaching across the three years of the degree. Carol comes from an extensive clinical background and is enrolled in a PhD.

Professor Henderson is the Nursing Director (Education) at the Princess Alexandra Hospital, Brisbane, Australia, where she is actively involved in maximising clinical learning opportunities and practices for both undergraduate students and nurses within the workplace. She has an extensive background in clinical nursing, education and research, and holds a clinical title of professor at Griffith University, Queensland, Australia.

A/Prof Michele Groves is Associate Dean, Teaching & Learning for the Faculty of Health Sciences at the University of Queensland. She has wide experience in medical education with particular emphasis on clinical reasoning, problem-based learning and medical student selection. Other research interests include inter-professional education, work-based learning and assessment.
