
Journal of Medical Education and Curricular Development

Open Access: Full open access to this and thousands of other papers at http://www.la-press.com.

Progress Testing for Medical Students at The University of Auckland: Results from The First Year of Assessments
Steven Lillis1, Jill Yielder2, Vernon Mogol2, Barbara O’Connor2, Kira Bacal3, Roger Booth4
and Warwick Bagg5
1 Director of Assessment, Medical Programme Directorate, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand.
2 Medical Programme Directorate, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand.
3 Phase 2 Director, Medical Programme Directorate, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand.
4 Phase 1 Director, Medical Programme Directorate, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand.
5 Head of the Medical Programme, Faculty of Medical and Health Sciences, University of Auckland, Auckland, New Zealand.

ABSTRACT
BACKGROUND: Progress testing is a method of assessing longitudinal progress of students using a single best answer format pitched at the standard of
a newly graduated doctor.
AIM: To evaluate the results of the first year of summative progress testing at the University of Auckland for Years 2 and 4 in 2013.
SUBJECTS: Two cohorts of medical students from Years 2 and 4 of the Medical Program.
METHODS: A survey was administered to all involved students. Open text feedback was also sought. Psychometric data were collected on test performance, and indices of reliability and validity were calculated.
RESULTS: The three tests showed increased mean scores over time. Reliability of the assessments was uniformly high. There was good concurrent validity. Students believe that progress testing assists in integrating science with clinical knowledge and improves learning. Year 4 students reported improved knowledge retention and deeper understanding.
CONCLUSION: Progress testing has been successfully introduced into the Faculty for two separate year cohorts and results have met expectations. Other
year cohorts will be added incrementally.
RECOMMENDATION: Key success factors for introducing progress testing are partnership with an experienced university, multiple and iterative briefings with staff and students, and demonstrating the usefulness of progress testing by providing students with detailed feedback on performance.

KEY WORDS: assessment, progress testing, undergraduate

CITATION: Lillis et al. Progress Testing for Medical Students at The University of Auckland: Results from The First Year of Assessments. Journal of Medical Education and Curricular
Development 2014:1 41–45 doi:10.4137/JMECD.S20094.
RECEIVED: September 11, 2014. RESUBMITTED: October 17, 2014. ACCEPTED FOR PUBLICATION: October 21, 2014.
ACADEMIC EDITOR: Steven R. Myers, Editor in Chief
TYPE: Original Research
FUNDING: Authors disclose no funding sources.
COMPETING INTERESTS: WB discloses that he is Chair of the Board of Examiners at the University of Auckland. Other authors disclose no competing interests.
COPYRIGHT: © the authors, publisher and licensee Libertas Academica Limited. This is an open-access article distributed under the terms of the Creative Commons
CC-BY-NC 3.0 License.
CORRESPONDENCE: slillis@wave.co.nz
Paper subject to independent expert blind peer review by minimum of two reviewers. All editorial decisions made by independent academic editor. Upon submission manuscript was subject to anti-plagiarism scanning. Prior to publication all authors have given signed confirmation of agreement to article publication and compliance with all applicable ethical and legal requirements, including the accuracy of author and contributor information, disclosure of competing interests and funding sources, compliance with ethical requirements relating to human and animal study participants, and compliance with any copyright requirements of third parties. This journal is a member of the Committee on Publication Ethics (COPE).

Introduction

Progress testing can be defined as "a form of longitudinal examination which, in principle, samples at regular intervals from the complete domain of knowledge considered a requirement for medical students on completion of the undergraduate programme".1 All progress tests are written assessments using single best answer format.2

Progress testing is different from more traditional tests. Historically, assessment in medical schools has occurred soon after a particular curriculum subject has been taught


and has been limited to that subject. The intent of developing progress testing as a concept was to sever the direct relationship between education in a particular subject and its assessment.3 Items used in progress tests are commonly clinically based, require knowledge of the sciences underpinning medicine to be integrated with clinical knowledge, and cover the entire medical school curriculum. The test is set at the level of knowledge expected of a first year house officer.

The major advantage of progress testing is that rote memorization is discouraged in favor of integrated, higher order knowledge.4–6 Other recognized benefits of progress testing include the capability of improved feedback to students and to academic departments on student attainment and teaching,7 fostering of deep learning strategies,8 change in study habits,9 reduced stress from spreading assessment across several events,10 and the early identification of both high achieving and struggling students.11 For these reasons, the method is increasingly being used in medical schools internationally.8 A more detailed history of progress testing and its advantages for creating a learning environment can be found in the AMEE guide on progress testing.2

The University of Auckland has worked closely with Peninsula College of Medicine and Dentistry (PCMD) in developing policies for delivering, analyzing, and reporting progress tests. The framework chosen at this university is a 3-hour test of 125 items which students sit three times per year irrespective of year level. A large databank of 4000 items is available to ensure that items are not repeated within a three-year timeframe. Formula scoring is used, where 0.25 of a mark is deducted for an incorrect answer. There is a "Don't know" option that results in an item score of zero. The blueprint used is based on the Professional and Linguistic Assessments Board (PLAB) blueprint.12 All items are reviewed by a moderation group to check validity, wording, and answer. After each test, the results are reviewed for items that did not perform as expected using point-biserial correlation, analysis of distractors, and index of difficulty. These items are then discussed by the moderation group, which decides if the item should be kept or excluded from analysis. In progress tests, it is usual that all but the final year of the program are norm-referenced because of difficulties in applying criterion-referenced methods to multiple class cohorts.13,14

Although progress testing is being introduced incrementally at the University of Auckland, with Year 2 and Year 4 participation in 2013, by 2015 all students from Year 2 to Year 6 will be required to complete these tests. In anticipation of Years 2 and 4 of the medical program having to sit progress testing in 2013, the Year 3 cohort of 2012 was invited to take a pilot test. In 2013, students in Years 2 and 4 of the program were required to take three progress tests. For Year 2, the first was formative and the remaining two were summative with limited weighting, in addition to end of module testing. For Year 4, all three were summative. This paper reports on the evaluation of the three tests from 2013 and the results of Year 2 and Year 4 student feedback on progress testing, and suggests a possible framework for assessing progress testing.

Method

The University of Auckland Human Participants Ethics Committee approved the study. Aggregated data for the separate cohorts of Years 2 and 4 were collected to assess the progress of student cohorts from test to test. For Year 2, the number of students sitting the tests was 241, 238, and 240 for each test, respectively, and for Year 4, the numbers were 203, 201, and 202. Analysis of reliability was undertaken on each test using the Kuder–Richardson coefficient.

For calculating concurrent validity, the performance of Year 4 students in the domain of Clinical and Communication Skills was compared with their performance in progress tests. In the calculation, progress test scores in all three tests for each student were averaged. The Clinical and Communication Skills domain uses a combination of 10 separate assessments: two Mini-CEX, one short OSCE in musculoskeletal medicine, an end-of-year 6-station OSCE, and an assessment in Drug and Alcohol. Grades of Distinction, Pass, Borderline, and Fail are possible for all these clinical assessments. For the purpose of this analysis, all the assessments were equally weighted, and numerical scores of 0, 1, 2, and 3 were substituted for the grades of Fail, Borderline, Pass, and Distinction, respectively.

End of year 2013 feedback was sought from Year 4 students by way of a questionnaire using Likert scales. The questions asked were: Did use of PTs enhance retention of knowledge? Did use of PTs enhance learning? Did PTs deepen understanding of clinical medicine? Did PTs help to integrate science with clinical medicine? Was the student confident in preparing for PTs? Responses were received from 115 (57%) of the Year 4 cohort after they had taken three summative progress tests. The results of "Significantly Disagree" and "Moderately Disagree" were combined, "Slightly Disagree" and "Slightly Agree" were combined, as were "Moderately Agree" and "Strongly Agree".

Feedback was sought from Year 2 students in the form of a survey. The questions sought feedback on the use of progress tests to guide learning, enhance retention of knowledge, deepen understanding, integrate learning in the taught modules, and integrate basic science theory with clinical practice. Responses were received from 190 out of 242 students, representing 79% of the cohort.

Results

Psychometrics. The KR-20 scores for each test are given in Table 1.

Measurements of test reliability show a KR-20 of either 0.84 or 0.85 for all six measurements. Test results showed an increase in mean score for each cohort over the three tests. The P25 (score below which 25% of the class achieved) and
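The formula-scoring rule used in these tests (0.25 of a mark deducted for an incorrect answer, zero for "Don't know") can be sketched in a few lines. This is an illustrative sketch only: the function name is ours, and the assumption that a correct answer earns one full mark is not stated explicitly in the paper.

```python
def formula_score(responses):
    """Score a test under the negative-marking rule described above.

    Marks per item: +1.0 for a correct answer (an assumed value, not
    stated in the paper), -0.25 for an incorrect answer, and 0.0 for
    the "Don't know" option.

    `responses` is a list of strings: "correct", "incorrect", or
    "dont_know".
    """
    marks = {"correct": 1.0, "incorrect": -0.25, "dont_know": 0.0}
    return sum(marks[r] for r in responses)


# A hypothetical candidate on a 125-item test:
# 80 correct, 30 incorrect, 15 "Don't know".
score = formula_score(["correct"] * 80 + ["incorrect"] * 30 + ["dont_know"] * 15)
# 80 * 1.0 - 30 * 0.25 = 72.5
```

The scheme illustrates why the "Don't know" option matters: guessing carries a penalty, while admitting uncertainty does not.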


Table 1. KR-20 for progress tests.

         PT1    PT2    PT3
Year 2   0.84   0.84   0.85
Year 4   0.84   0.84   0.85
Items    123    123    121

P75 (score above which 25% of the class achieved) are given in Figure 1 and showed a corresponding increase.

Figure 1. Results of Year 2 and Year 4 across three 2013 progress tests. Mean, P25 and P75.

Concurrent validity. The correlation of progress test scores with the domain of CCS assessments shows a Pearson's r of 0.54, indicating moderate to strong correlation. The coefficient of determination (R²) value is 0.29, indicating that 29% of the variability seen in the performance of students in the CCS domain is accounted for by their performance in progress tests. A scatter plot confirms that the relationship is linear (Fig. 2).

Figure 2. Scatterplot of CCS score and progress test results.

Year 4 end of year survey. The questions are detailed in the Methods section. The results are provided in Figure 3. Although there was some diffidence about preparing for progress tests, it is also clear that students considered the tests assisted in knowledge retention, deepened understanding of clinical medicine, assisted with integrating knowledge, and enhanced their learning.

Figure 3. Year 4 survey.

The qualitative section of the survey revealed more about why students gave the responses above. For many students, progress testing was seen as requiring a different and more clinically based approach to learning:

"I liked the progress tests because they motivated me to learn as much as I could in the clinical environment, and they motivated me to learn by seeing patients."

"They were fun to do from the point of view of being clinical and making me think in a clinical fashion."

The issue of preparation for progress tests revealed that many found revision of material for progress tests to be dissimilar to their prior experience of studying for examinations:

"Study for progress tests and study relevant to clinical medicine was quite different to study for pre-clinical years and took quite a while to adapt to this."

"Progress tests require approximately zero preparation—it feels like all the preparation to be done is through the clinical portion of the course."

"Throughout the year I worked hard to learn and relearn material which I was facing on the wards and I believe this showed in my progress testing."

However, not all comments were favorable about changes in study habits:

"For me they actually inhibit my learning as there is no external motivation to study particular topics in depth."

"Progress tests did not work for my learning, feel I have actually lost overall knowledge this year."

Several students commented on the reduced stress levels associated with progress testing:

"I had initial reservations about the progress test, but I feel now that it is an appropriate test. It has caused less stress and anxiety and I am now preparing better for the tests."
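The link between the Pearson's r reported above and the coefficient of determination is direct: R² = r², so r = 0.54 gives R² = 0.54² ≈ 0.29. A minimal sketch of the calculation, using made-up paired scores rather than the study's data:

```python
import math


def pearson_r(xs, ys):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    sd_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs))
    sd_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys))
    return cov / (sd_x * sd_y)


# Illustrative (invented) data: averaged progress test score vs. CCS score.
pt_means = [55.0, 62.5, 48.0, 70.2, 66.1]
ccs_scores = [2.0, 2.5, 1.5, 3.0, 2.0]

r = pearson_r(pt_means, ccs_scores)
r_squared = r ** 2  # proportion of CCS variance associated with PT scores

# With the study's reported r of 0.54: 0.54 ** 2 = 0.2916, i.e. about 29%.
```

Squaring r is what justifies the "29% of the variability" statement in the text.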


"Progress tests have reduced anxiety levels towards assessments for me this year."

Year 2 feedback. Year 2 feedback is presented in Figure 4. Students perceive that progress testing assists in integrating science and clinical medicine as well as integrating learning; they were ambivalent about the ability of the tests to promote knowledge retention or deepen understanding, and were slightly negative about the ability of the tests to guide learning.

Figure 4. Year 2 feedback.

Positive open comments from Year 2 students focused on the clinical nature of progress tests and the integrated learning they require:

"Progress Tests helped me focus on the importance of thinking about what I've learnt in the modules in a clinical context."

"Progress tests are a good chance to see practical application of our knowledge and they are a challenge/motivating factor."

"Might need to change HOW course is taught. Progress test is very clinically focused whereby lecture content is not."

Negative comments were mostly around the difficulties of sitting a clinical test so early in the education of the students:

"At this stage of our learning, progress tests are not a good assessment of our learning. It's all about guessing and doesn't reflect how much you've learnt from modules."

"The Q's of the progress test are not at all suited to what we have learnt this year."

Discussion

This evaluation of progress testing incorporated a number of facets: psychometric evaluation, key success factors, and student feedback.

Psychometric analysis of the progress tests revealed a KR-20 of 0.84–0.85. For this assessment approach, the Kuder–Richardson Formula 20 (KR-20) was chosen as the most appropriate measure of internal consistency due to the dichotomous nature of the variables, and it provides the same information as Cronbach's alpha. A KR-20 score of ≥0.7 is acceptable and a KR-20 score of ≥0.8 is preferable.15 Scores of ≥0.9 may indicate possible redundancy of items or an unnecessarily long assessment.16 It is of note that the three tests reported in this paper had satisfactory reliability. The KR-20 is reported on a little under 125 items because a post-moderation process undertaken on the raw data excludes items that have performed poorly (usually 2–3 items). These items are excluded before feedback on performance is given to students.

A measure of concurrent validity provides insight into student performance on progress tests in comparison to clinical assessments, where the correlation found is moderate to strong. The r value is high considering that the clinical assessments used in this study were a combination of different methods, including Mini-CEX and OSCE-styled assessments. A reasonable correlation is expected on the basis of other research that examines the relationship between written and clinical assessments.17

It is reassuring that the mean score for both Years 2 and 4 increased across the three tests. However, historical data from PCMD indicate that there is not a consistent incremental increase in mean scores for cohorts, and some tests show a decrease. This represents a challenge in test equating, where individual tests may vary in difficulty. Thus, it is unlikely that all progress tests in the program will continue to show such incremental increases.

Overall, the feedback indicates students perceive that progress testing has considerable educational value. Students would appear to agree that the objectives of encouraging deeper and more integrated medical knowledge have been largely met. Preparation for progress tests is an issue for some students. This is unsurprising, as the nature of the assessment does not lend itself to short, intense periods of study on isolated topics, an approach that may have been successful with more traditional testing techniques. Rather, preparation that focuses on clinically relevant information drawn from multiple preclinical and clinical sources and across all medical disciplines is required. Of particular note was the strong indication from students that the test had a clinical or real-world focus. Alongside this was an understanding that preclinical knowledge alone was of limited value in preparing for the tests.

The issue of competitiveness due to the norm-referencing method of standard setting was seldom mentioned by the Year 4 group. There was also little concern over negative marking and the stress of the assessment process (issues that appeared problematic in student feedback prior to the summative tests). Indeed, it would appear that the Year 4 cohort considered that progress testing would reduce stress.

The partnership with a university experienced in running progress tests was a critical success factor in the introduction of progress testing. As stated by two authors with considerable experience in progress testing, "introducing progress testing
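The KR-20 statistic discussed above can be computed directly from a matrix of dichotomous (0/1) item scores. The sketch below follows the standard definition; the function name and data layout are ours, and, as the Limitations section notes, the "Don't know" option means these tests are not strictly dichotomous.

```python
def kr20(item_matrix):
    """Kuder-Richardson Formula 20 for dichotomously scored items.

    `item_matrix` is a list of rows, one per student, each a list of
    0/1 item scores.  KR-20 = k/(k-1) * (1 - sum(p_i * q_i) / var),
    where k is the number of items, p_i is the proportion of students
    answering item i correctly, q_i = 1 - p_i, and var is the
    (population) variance of the students' total scores.
    """
    n = len(item_matrix)           # number of students
    k = len(item_matrix[0])        # number of items
    totals = [sum(row) for row in item_matrix]
    mean_total = sum(totals) / n
    var_total = sum((t - mean_total) ** 2 for t in totals) / n
    sum_pq = 0.0
    for i in range(k):
        p = sum(row[i] for row in item_matrix) / n
        sum_pq += p * (1 - p)
    return (k / (k - 1)) * (1 - sum_pq / var_total)
```

When items discriminate consistently the coefficient approaches 1; when responses are independent of each other it falls toward 0. The 0.84–0.85 values reported here sit in the "preferable" range cited from reference 15.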


involves not only a change in thinking about assessment but also an academic cultural change".18 The partnership assisted with developing policies, processes, analytical capability, and item sharing. The combination of psychometric data, student feedback that includes the acceptability of the assessment, and the performance of key success factors such as partnership with experienced progress test providers may have promise as a framework for an assessment instrument for progress tests.

Limitations. Progress testing is still a relatively new part of the assessment suite, and student experience so far is limited to three summative tests over a single year. The long-term benefits of progress testing, such as encouraging clinical learning in the early years of the medical program and providing a clear vision of the depth of clinical knowledge that is needed by the end of the medical program, may not yet be apparent to students.

The Kuder–Richardson coefficient assumes dichotomous data. However, the formula scoring structure of these tests introduces a third option ("Don't know"). This may have a small impact on the coefficient.

The two surveys used different questions and different Likert scales. The format of the questions and scale used for the different years is historical and allows longitudinal comparison of student feedback over many years, but does make direct comparison problematic.

Conclusions

Progress testing has been introduced as a significant part of the assessment suite in the medical program at the University of Auckland. Important success factors include partnership with an experienced university, moderation of items both before and after tests, and running a pilot test. The combination of multiple and iterative briefing and feedback opportunities, coupled with the experience of a pilot test with no academic consequences, has proved valuable in gaining student support for progress testing. Student feedback after three summative tests indicated an understanding of the positive aspects of this assessment method and decreasing levels of concern as students gain experience with the method. The 2013 testing data demonstrate a progressively improving standard for student cohorts. The reliability of the tests was acceptable and a measure of concurrent validity reassuring. Progress testing will continue to be implemented for all years of the medical program. Possible frameworks for the systematic evaluation of progress tests have been suggested.

Author Contributions

Conceived and designed the experiments: SL, WB. Analyzed the data: SL, VM, JY. Wrote the first draft of the manuscript: SL. Contributed to the writing of the manuscript: SL, JY, KB, RB, WB. Agree with manuscript results and conclusions: SL, JY, WB, VM, RB, BO'C, KB. Jointly developed the structure and arguments for the paper: SL, JY. Made critical revisions and approved final version: JY, WB, VM, RB, BO'C, KB. All authors reviewed and approved the final manuscript.

REFERENCES
1. McHarg J, Bradley P, Chamberlain S, Ricketts C, Searle J, McLachlan JC. Assessment of progress tests. Med Educ. 2005;39(2):221–227.
2. Wrigley W, van der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012;34(9):683–697.
3. Van der Vleuten C, Verwijnen G, Wijnen W. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teach. 1996;18(2):103–109.
4. Norman G, Neville A, Blake JM, Mueller B. Assessment steers learning down the right road: impact of progress testing on licensing examination performance. Med Teach. 2010;32(6):496–499.
5. Van Berkel HJM, Nuy HJP, Geerlings T. The influence of progress tests and block tests on study behaviour. Instructional Sci. 1994;22(4):317–333.
6. Blake JM, Norman GR, Smit EKM. Report card from McMaster: student evaluation at a problem-based medical school. Lancet. 1995;345(8):899–902.
7. Coombes L, Ricketts C, Freeman A, Stratford J. Beyond assessment: feedback for individuals and institutions based on the progress test. Med Teach. 2010;32(6):486–490.
8. Freeman A, Van Der Vleuten C, Nouns Z, Ricketts C. Progress testing internationally. Med Teach. 2010;32(6):451–455.
9. Vleuten CVD, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teach. 1996;18(2):103–109.
10. Freeman A, Ricketts C. Choosing and designing knowledge assessments: experience at a new medical school. Med Teach. 2010;32:578–581.
11. Finucane P, Flannery D, Keane D, Norman G. Cross-institutional progress testing: feasibility and value to a new medical school. Med Educ. 2010;44:184–186.
12. GMC PLAB test blueprint. Available from: http://www.gmcuk.org/PLAB___Blueprint_overarching_statement___DC2714.pdf_48527961.pdf. Accessed December 19, 2013.
13. Ricketts C, Freeman AC, Coombes LR. Standard setting for progress tests: combining external and internal standards. Med Educ. 2009;43(6):589–593.
14. Schuwirth LW, van der Vleuten CP. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30.
15. Tavakol M, Dennick R. Making sense of Cronbach's alpha. Int J Med Educ. 2011;2:53–55.
16. Streiner DL. Starting at the beginning: an introduction to coefficient alpha and internal consistency. J Pers Assess. 2003;80(1):99–103.
17. Ram P, van der Vleuten C, Rethans JJ, Schouten B, Hobma S, Grol R. Assessment in general practice: the predictive value of written-knowledge tests and a multiple-station examination for actual medical performance in daily practice. Med Educ. 1999;33(3):197–203.
18. Muijtjens AM, Hoogenboom RJ, Verwijnen GM, Van Der Vleuten CP. Relative or absolute standards in assessing medical knowledge using progress tests. Adv Health Sci Educ Theory Pract. 1998;3(2):81–87.