Progress Testing for Medical Students at The University of Auckland: Results from The First Year of Assessments
ABSTRACT
BACKGROUND: Progress testing is a method of assessing the longitudinal progress of students using a single-best-answer format pitched at the standard of a newly graduated doctor.
AIM: To evaluate the results of the first year of summative progress testing at the University of Auckland for Years 2 and 4 in 2013.
SUBJECTS: Two cohorts of medical students from Years 2 and 4 of the Medical Program.
METHODS: A survey was administered to all involved students. Open text feedback was also sought. Psychometric data were collected on test perfor-
mance, and indices of reliability and validity were calculated.
RESULTS: The three tests showed increased mean scores over time. Reliability of the assessments was uniformly high, and there was good concurrent validity. Students believe that progress testing assists in integrating science with clinical knowledge and improves learning. Year 4 students reported improved knowledge retention and deeper understanding.
CONCLUSION: Progress testing has been successfully introduced into the Faculty for two separate year cohorts and results have met expectations. Other
year cohorts will be added incrementally.
RECOMMENDATION: Key success factors for introducing progress testing are partnership with an experienced university, multiple and iterative
briefings with staff and students as well as demonstrating the usefulness of progress testing by providing students with detailed feedback on performance.
CITATION: Lillis et al. Progress Testing for Medical Students at The University of Auckland: Results from The First Year of Assessments. Journal of Medical Education and Curricular
Development 2014:1 41–45 doi:10.4137/JMECD.S20094.
RECEIVED: September 11, 2014. RESUBMITTED: October 17, 2014. ACCEPTED FOR PUBLICATION: October 21, 2014.
ACADEMIC EDITOR: Steven R. Myers, Editor in Chief
TYPE: Original Research
FUNDING: Authors disclose no funding sources.
COMPETING INTERESTS: WB discloses that he is Chair of the Board of Examiners at the University of Auckland. Other authors disclose no competing interests.
COPYRIGHT: © the authors, publisher and licensee Libertas Academica Limited. This is an open-access article distributed under the terms of the Creative Commons
CC-BY-NC 3.0 License.
CORRESPONDENCE: slillis@wave.co.nz
Paper subject to independent expert blind peer review by minimum of two reviewers. All editorial decisions made by independent academic editor. Upon submission manuscript
was subject to anti-plagiarism scanning. Prior to publication all authors have given signed confirmation of agreement to article publication and compliance with all applicable
ethical and legal requirements, including the accuracy of author and contributor information, disclosure of competing interests and funding sources, compliance with ethical require-
ments relating to human and animal study participants, and compliance with any copyright requirements of third parties. This journal is a member of the Committee on Publication
Ethics (COPE).
and has been limited to that subject. The intent of developing progress testing as a concept was to sever the direct relationship between education in a particular subject and its assessment.3 Items used in progress tests are commonly clinically based, require integration of the sciences underpinning medicine with clinical knowledge, and cover the entire medical school curriculum. The test is set at the level of knowledge expected of a first year house officer.

The major advantage of progress testing is that rote memorization is discouraged in favor of integrated, higher order knowledge.4–6 Other recognized benefits of progress testing include the capability of improved feedback to students and to academic departments on student attainment and teaching,7 fostering of deep learning strategies,8 change in study habits,9 reduced stress from spreading assessment across several events,10 and the early identification of both high achieving and struggling students.11 For these reasons, the method is increasingly being used in medical schools internationally.8 A more detailed history of progress testing and its advantages for creating a learning environment can be found in the AMEE guide on progress testing.2

The University of Auckland has worked closely with Peninsula College of Medicine and Dentistry (PCMD) in developing policies for delivering, analyzing, and reporting progress tests. The framework chosen at this university is a 3-hour test of 125 items which students sit three times per year irrespective of year level. A large databank of 4000 items is available to ensure that items are not repeated within a three-year timeframe. Formula scoring is used, where 0.25 of a mark is deducted for an incorrect answer; a "Don't know" option results in an item score of zero. The blueprint used is based on the Professional and Linguistic Assessments Board (PLAB) blueprint.12 All items are reviewed by a moderation group to check validity, wording, and answer. After each test, the results are reviewed for items that did not perform as expected using point-biserial correlation, analysis of distractors, and index of difficulty. These items are then discussed by the moderation group, which decides whether the item should be kept or excluded from analysis. In progress tests, it is usual that all but the final year of the program are norm-referenced because of difficulties in applying criterion-referenced methods to multiple class cohorts.13,14

Although progress testing is being introduced incrementally at the University of Auckland, with Year 2 and Year 4 participation in 2013, by 2015 all students from Year 2 to Year 6 will be required to complete these tests. In anticipation of Years 2 and 4 of the medical program having to sit progress testing in 2013, the Year 3 cohort of 2012 was invited to take a pilot test. In 2013, students in Years 2 and 4 of the program were required to take three progress tests. For Year 2, the first was formative and the remaining two were summative with limited weighting, in addition to end of module testing. For Year 4, all three were summative. This paper reports on the evaluation of the three tests from 2013 and results of Years 4 and 2 student feedback on progress testing from 2013, and suggests a possible framework for assessing progress testing.

Method
The University of Auckland Human Participants Ethics Committee approved the study. Aggregated data for the separate cohorts of Years 2 and 4 were collected to assess the progress of student cohorts from test to test. For Year 2, the number of students sitting the tests was 241, 238, and 240 for each test, respectively, and for Year 4, the numbers were 203, 201, and 202. Analysis of reliability was undertaken on each test using the Kuder–Richardson coefficient.

For calculating concurrent validity, the performance of Year 4 students in the domain of Clinical and Communication Skills was compared with their performance in progress tests. In the calculation, progress test scores in all three tests for each student were averaged. The Clinical and Communication Skills domain uses a combination of 10 separate assessments: two Mini-CEX, one short OSCE in musculoskeletal medicine, an end-of-year 6-station OSCE, and an assessment in Drug and Alcohol. Grades of Distinction, Pass, Borderline, and Fail are possible for all these clinical assessments. For the purpose of this analysis, all the assessments were equally weighted and numerical scores of 0, 1, 2, and 3 were substituted for the grades of Fail, Borderline, Pass, and Distinction, respectively.

End of year 2013 feedback was sought from Year 4 students by way of a questionnaire using Likert scales. The questions asked were: Did use of PTs enhance retention of knowledge? Did use of PTs enhance learning? Did PTs deepen understanding of clinical medicine? Did PTs help to integrate science with clinical medicine? Was the student confident in preparing for PTs? Responses were received from 115 (57%) of the Year 4 cohort after they had taken three summative progress tests. The results for "Significantly Disagree" and "Moderately Disagree" were combined, "Slightly Disagree" and "Slightly Agree" were combined, as were "Moderately Agree" and "Strongly Agree".

Feedback was sought from Year 2 students in the form of a survey. The questions sought feedback on the use of progress tests to guide learning, enhance retention of knowledge, deepen understanding, integrate learning in the taught modules, and integrate basic science theory with clinical practice. Responses were received from 190 out of 242 students, representing 79% of the cohort.

Results
Psychometrics. The KR-20 scores for each test are given in Table 1. Measurements of test reliability show a KR-20 of either 0.84 or 0.85 for all six measurements. Test results showed an increase in mean score for each cohort over the three tests. The P25 (score below which 25% of the class achieved) and P75 (score above which 25% of the class achieved) are given in Figure 1 and showed a corresponding increase.
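The reliability analysis above uses the Kuder–Richardson formula (KR-20). As an illustration only, the coefficient can be computed from a matrix of dichotomous (1 = correct, 0 = incorrect) item scores; the data in the comments and test below are synthetic, not the study's.

```python
def kr20(responses):
    """KR-20 reliability for a list of per-student lists of 0/1 item scores."""
    n_students = len(responses)
    n_items = len(responses[0])
    # Sum of p*q over items, where p is the proportion answering correctly
    pq_sum = 0.0
    for j in range(n_items):
        p = sum(r[j] for r in responses) / n_students
        pq_sum += p * (1 - p)
    # Population variance of the students' total scores
    totals = [sum(r) for r in responses]
    mean = sum(totals) / n_students
    var = sum((t - mean) ** 2 for t in totals) / n_students
    return (n_items / (n_items - 1)) * (1 - pq_sum / var)
```

On a synthetic 4-student, 3-item matrix such as [[1,1,1],[1,1,0],[1,0,0],[0,0,0]] this yields 0.75; the study's six measurements returned 0.84–0.85. Note that, as the Limitations section discusses, formula scoring introduces a third response outcome ("Don't know"), so applying KR-20 requires dichotomizing the item scores.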
Concurrent validity. The correlation of progress test scores with the domain of CCS assessments shows a Pearson's r of 0.54, indicating moderate to strong correlation. The coefficient of determination (R²) value is 0.29, indicating that 29% of the variability seen in the performance of students in the CCS domain is accounted for by their performance in progress tests. A scatter plot confirms that the relationship is linear (Fig. 2).

Figure 2. Scatterplot of CCS score and progress test results.

Year 4 end of year survey. The questions are detailed in the Methods section. The results are provided in Figure 3. Although there was some diffidence about preparing for progress tests, it is also clear that students considered the tests assisted in knowledge retention, deepened understanding of clinical medicine, assisted with integrating knowledge, and enhanced their learning.

The qualitative section of the survey revealed more about why students gave the responses above. For many students, progress testing was seen as requiring a different and more clinically based approach to learning:

"I liked the progress tests because they motivated me to learn as much as I could in the clinical environment, and they motivated me to learn by seeing patients."

"They were fun to do from the point of view of being clinical and making me think in a clinical fashion."

The issue of preparation for progress tests revealed that many found revision of material for progress tests to be dissimilar to their prior experience of studying for examinations:

"Study for progress tests and study relevant to clinical medicine was quite different to study for pre-clinical years and took quite a while to adapt to this."

"Progress tests require approximately zero preparation—it feels like all the preparation to be done is through the clinical portion of the course."

"Throughout the year I worked hard to learn and relearn material which I was facing on the wards and I believe this showed in my progress testing."

However, not all comments were favorable about changes in study habits:

"For me they actually inhibit my learning as there is no external motivation to study particular topics in depth."

"Progress tests did not work for my learning, feel I have actually lost overall knowledge this year."

Several students commented on the reduced stress levels associated with progress testing:

"I had initial reservations about the progress test, but I feel now that is an appropriate test. It has caused less stress and anxiety and I am now preparing better for the tests."

Figure 1. Results of Year 2 and Year 4 across three 2013 progress tests. Mean, P25 and P75.

Figure 3. Year 4 survey.
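A sketch of the concurrent-validity calculation described in the Method: clinical grades are mapped to numerical scores 0–3, each student's three progress test scores are averaged, and Pearson's r (with R² = r²) is computed between the two series. The grade mapping follows the paper; the student data below are synthetic illustrations only, not the study's data.

```python
from math import sqrt

# Grade-to-score substitution used for the equally weighted CCS assessments
GRADE_SCORE = {"Fail": 0, "Borderline": 1, "Pass": 2, "Distinction": 3}

def ccs_score(grades):
    """Total numerical score for one student's list of clinical grades."""
    return sum(GRADE_SCORE[g] for g in grades)

def mean(xs):
    return sum(xs) / len(xs)

def pearson_r(xs, ys):
    """Pearson product-moment correlation between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Per student: average of the three progress test scores vs. CCS total
pt_avg = [mean(t) for t in [(55, 60, 65), (40, 45, 50), (70, 72, 74)]]
ccs = [ccs_score(g) for g in [["Pass", "Pass"],
                              ["Borderline", "Fail"],
                              ["Distinction", "Pass"]]]
r = pearson_r(pt_avg, ccs)
r_squared = r ** 2  # share of CCS-score variance associated with progress tests
```

With real data this is where the paper's r = 0.54 and R² = 0.29 would come from; the tiny synthetic sample here is only meant to show the mechanics of the mapping and the correlation.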
involves not only a change in thinking about assessment but also an academic cultural change".18 The partnership assisted with developing policies, processes, analytical capability and item sharing. The combination of psychometric data, student feedback that includes the acceptability of the assessment, and the performance of key success factors such as partnership with experienced progress test providers may have promise as a framework for an assessment instrument for progress tests.

Limitations. Progress testing is still a relatively new part of the assessment suite and student experience so far is limited to three summative tests over a single year. The long-term benefits of progress testing, such as encouraging clinical learning in the early years of the medical program and providing a clear vision of the depth of clinical knowledge that is needed by the end of the medical program, may not yet be apparent to students.

The Kuder–Richardson coefficient assumes dichotomous data. However, the formula scoring structure of these tests introduces a third option ("Don't know"). This may have a small impact on the coefficient.

The two surveys used different questions and different Likert scales. The format of the questions and scale used for the different years is historical and allows longitudinal comparison of student feedback over many years, but does make direct comparison problematic.

Conclusions
Progress testing has been introduced as a significant part of the assessment suite in the medical program at the University of Auckland. Important success factors include partnership with an experienced university, moderation of items both before and after tests, and running a pilot test. The combination of multiple and iterative briefing and feedback opportunities, coupled with the experience of a pilot test with no academic consequences, has proved valuable in gaining student support for progress testing. Student feedback after three summative tests indicated an understanding of the positive aspects of this assessment method and decreasing levels of concern as students gain experience with the method. The 2013 testing data demonstrate a progressively improving standard for student cohorts. The reliability of the tests was acceptable and a measure of concurrent validity reassuring. Progress testing will continue to be implemented for all years of the medical program. Possible frameworks for the systematic evaluation of progress tests have been suggested.

Author Contributions
Conceived and designed the experiments: SL, WB. Analyzed the data: SL, VM, JY. Wrote the first draft of the manuscript: SL. Contributed to the writing of the manuscript: SL, JY, KB, RB, WB. Agree with manuscript results and conclusions: SL, JY, WB, VM, RB, BO'C, KB. Jointly developed the structure and arguments for the paper: SL, JY. Made critical revisions and approved final version: JY, WB, VM, RB, BO'C, KB. All authors reviewed and approved of the final manuscript.

REFERENCES
1. McHarg J, Bradley P, Chamberlain S, Ricketts C, Searle J, McLachlan JC. Assessment of progress tests. Med Educ. 2005;39(2):221–227.
2. Wrigley W, van der Vleuten CP, Freeman A, Muijtjens A. A systemic framework for the progress test: strengths, constraints and issues: AMEE Guide No. 71. Med Teach. 2012;34(9):683–697.
3. Van der Vleuten C, Verwijnen G, Wijnen W. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teach. 1996;18(2):103–109.
4. Norman G, Neville A, Blake JM, Mueller B. Assessment steers learning down the right road: impact of progress testing on licensing examination performance. Med Teach. 2010;32(6):496–499.
5. Van Berkel HJM, Nuy HJP, Geerlings T. The influence of progress tests and block tests on study behaviour. Instructional Sci. 1994;22(4):317–333.
6. Blake JM, Norman GR, Smit EKM. Report card from McMaster: student evaluation at a problem-based medical school. Lancet. 1995;345(8):899–902.
7. Coombes L, Ricketts C, Freeman A, Stratford J. Beyond assessment: feedback for individuals and institutions based on the progress test. Med Teach. 2010;32(6):486–490.
8. Freeman A, Van Der Vleuten C, Nouns Z, Ricketts C. Progress testing internationally. Med Teach. 2010;32(6):451–455.
9. Vleuten CVD, Verwijnen GM, Wijnen WHFW. Fifteen years of experience with progress testing in a problem-based learning curriculum. Med Teach. 1996;18(2):103–109.
10. Freeman A, Ricketts C. Choosing and designing knowledge assessments: experience at a new medical school. Med Teach. 2010;32:578–581.
11. Finucane P, Flannery D, Keane D, Norman G. Cross-institutional progress testing: feasibility and value to a new medical school. Med Educ. 2010;44:184–186.
12. GMC PLAB test blueprint. Available from: http://www.gmcuk.org/PLAB___Blueprint_overarching_statement___DC2714.pdf_48527961.pdf. Accessed December 19, 2013.
13. Ricketts C, Freeman AC, Coombes LR. Standard setting for progress tests: combining external and internal standards. Med Educ. 2009;43(6):589–593.
14. Schuwirth LW, van der Vleuten CP. The use of progress testing. Perspect Med Educ. 2012;1(1):24–30.
15. Tavakol M, Dennick R. Making sense of Cronbach's alpha. Int J Med Educ. 2011;2:53–55.
16. Streiner DL. Starting at the beginning: an introduction to coefficient alpha and internal consistency. J Pers Assess. 2003;80(1):99–103.
17. Ram P, van der Vleuten C, Rethans JJ, Schouten B, Hobma S, Grol R. Assessment in general practice: the predictive value of written-knowledge tests and a multiple-station examination for actual medical performance in daily practice. Med Educ. 1999;33(3):197–203.
18. Muijtjens AM, Hoogenboom RJ, Verwijnen GM, Van Der Vleuten CP. Relative or absolute standards in assessing medical knowledge using progress tests. Adv Health Sci Educ Theory Pract. 1998;3(2):81–87.