Topics in Training
Learning Assessment Toolkit
By Peter G. de Boer, MA, FRCS, Rick Buckley, MD, FRCS(C), Pascal Schmidt, Robert Fox, EdD, and Jesse Jupiter, MD
Modern surgical education is evolving with changes in the academic environment. Surgeons, surgical educators, and administrative associations need to know what to teach, how to teach it, and whether the teaching has succeeded. Whereas surgeons have many tools to help them see what is needed for better patient care and to measure patient progress, surgical education is just beginning to develop techniques for interpreting the needs of students and assessing the outcomes of learning in their practice1,2. The absence of surgical education assessment techniques has made it necessary to guess what is needed to improve education. We have estimated the success or failure of teaching by attempting to score faculty performance rather than by obtaining more valid data. This has led surgical educators to rely on mistaken assumptions based on their personal experience or on reasoning without evidence3.

Surgical education should be based on needs assessment, efficient program planning, and a strong curriculum. The motivation of learners should be high, and the learners' needs should be met. This depends on measurable outcomes with objective data demonstrating knowledge acquisition that meets the needs of surgical learners4.

In the last ten years, the providers of educational resources (governments, training boards, charitable foundations, and commercial companies) have become increasingly interested in whether surgical education, as it is delivered, has had a measurable effect5-7.

Through a series of pilot-tested steps, we developed a set of instruments providing insights into the effectiveness of surgical education in the field of orthopaedic trauma. We termed the instrument the Learning Assessment Toolkit. It was developed to supplement the judgment of surgical educators before and after a teaching event with real evidence of need, motivation, and outcomes of educational programs in orthopaedic trauma surgery.

The primary goal of this report is to outline the elements of the Learning Assessment Toolkit and its developmental steps. A secondary goal is to show how its use can change the nature and content of surgical educational events to improve learning outcomes. This report also suggests what further instruments are necessary to achieve the desired end result: education that substantially improves patient care.

Materials and Methods
The educational event used to design and test the assessment toolkit is a fracture course based on the AO principles of operative fracture management. The course is aimed at residents in their first few years of training. The participant groups are homogeneous in North America and Western Europe but heterogeneous in the developing world in terms of their needs and experience. The course has evolved over the past forty-nine years, but it has been taught in a standardized form for the past decade. The "key competencies" guide the development of the Learning Assessment Toolkit and the teaching and learning in the course (Table I). A key competency was defined as a piece of knowledge and/or skill that educators expected course participants to know or be able to do after the course8,9. Key competencies are statements describing the behaviors necessary to address problems related to successfully providing the gold standard of patient care. These competency statements were collected from a panel of experts: three experienced course chairmen, each with more than ten years of teaching experience.

How the Learning Assessment Toolkit Works (Table II)
Precourse Assessment
Two weeks before the educational event, course participants were contacted online. They were presented with the fourteen key competencies and were asked two questions about each competency. The first question was: "How important is this competency to you in your daily practice?" Participants were asked to score from 1 to 5 (with 5 indicating the highest importance) on a Likert scale. They were then asked to evaluate their own ability relating to the individual competencies using the same scoring system.
Disclosure: The authors did not receive any outside funding or grants in support of their research for or preparation of this work. Neither they nor a
member of their immediate families received payments or other benefits or a commitment or agreement to provide such benefits from a commercial
entity.
Fig. 1
A sample question from the precourse and postcourse objective assessment used to assess Competency 1: Assess fracture-
related soft-tissue injuries and apply strategies to treat these injuries.
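Fig. 1 shows the kind of objective question delivered both before and after the course. As a rough sketch of how such items might be represented and scored (an invented illustration, not the authors' actual toolkit software; the question text, options, and helper names are assumptions):

```python
# Sketch of how precourse and postcourse objective assessments (as in
# Fig. 1) might be represented and scored. All names and data here are
# invented illustrations, not the AO toolkit itself.

from dataclasses import dataclass, field

@dataclass
class Question:
    text: str
    options: list[str]
    answer_index: int  # index of the correct option

@dataclass
class Competency:
    statement: str
    questions: list[Question] = field(default_factory=list)

def score(competencies, responses):
    """Fraction of questions answered correctly; `responses` maps a
    competency statement to the option indices a participant chose."""
    total = correct = 0
    for c in competencies:
        for q, pick in zip(c.questions, responses.get(c.statement, [])):
            total += 1
            correct += int(pick == q.answer_index)
    return correct / total if total else 0.0

toolkit = [
    Competency(
        "Assess fracture-related soft-tissue injuries",
        [Question(
            "Which classification grades soft-tissue injury in closed fractures?",
            ["Tscherne", "Garden", "Salter-Harris"],
            0,
        )],
    ),
]

# The same items are delivered before and after the course, so any
# change in score reflects knowledge gained at the event.
precourse_score = score(toolkit, {"Assess fracture-related soft-tissue injuries": [2]})
postcourse_score = score(toolkit, {"Assess fracture-related soft-tissue injuries": [0]})
print(precourse_score, postcourse_score)  # 0.0 1.0
```

Because each question is tied to a specific competency, the precourse-to-postcourse change can be reported per competency rather than only as an overall test score.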
provides evidence on which effective future changes can be made.

This evaluation system provides a useful guide to enable educators to design appropriate educational offerings to meet the needs of surgeons. However, if individuals learn knowledge and skills in an educational event but cannot put their skills into practice after the event, then the event clearly has not been successful. Assessment of the barriers to knowledge implementation after a course is therefore critical. We are presently conducting a study to identify the barriers that doctors encounter in implementing what they have learned at courses and how educators can help them to overcome these barriers20.

A Learning Assessment Toolkit can be used by any group of educators. All that is required is the creation of a list of competencies for the educational event and the test questions designed to test knowledge of these competencies. Many software packages exist to allow online testing. The assessment toolkit described in this paper is not subject to copyright, and the authors would be pleased to assist any groups interested in setting up their own assessment program.

NOTE: The authors thank Dr. Laurent Audigé, DVM, PhD, Manager, Methodology, AO Clinical Investigation and Documentation, for his invaluable help in the statistical analysis within this paper.

Peter G. de Boer, MA, FRCS
Pascal Schmidt
AO Education, Stettbachstrasse 6,
CH-8600 Dübendorf, Switzerland.
E-mail address for P.G. de Boer: Piet.deboer@aofoundation.org
E-mail address for P. Schmidt: Pascal.schmidt@aofoundation.org

Rick Buckley, MD, FRCS(C)
Foothills Medical Centre, AC 144A,
1403-29th Street NW, Calgary,
Alberta T2N 2T9, Canada.
E-mail address: buckclin@ucalgary.ca

Robert Fox, EdD
1001 Olde Oak Court,
Norman, OK 73026.
E-mail address: drrdfox@me.com

Jesse Jupiter, MD
Department of Orthopaedic Surgery,
Massachusetts General Hospital,
Yawkey Building 2100,
55 Parkman Street,
Boston, MA 02114.
E-mail address: JJupiter1@partners.org
References
1. Spencer JA, Jordan RK. Learner centered approaches in medical education. BMJ. 1999;318:1280-3.

2. Davis DA, Barnes BE, Fox RD. The continuing professional development of physicians: from research to practice. Chicago: AMA Press; 2003.

3. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance. A systematic review of the effect of continuing medical education strategies. JAMA. 1995;274:700-5.

4. Wilkes M, Bligh J. Evaluating educational interventions. BMJ. 1999;318:1269-72.

5. Casebeer L, Raichle L, Kristofco R, Carillo A. Cost-benefit analysis: review of an evaluation methodology for measuring return on investment in continuing education. J Contin Educ Health Prof. 1997;17:224-7.

6. Regnier K, Kopelow M, Lane D, Alden E. Accreditation for learning and change: quality and improvement as the outcome. J Contin Educ Health Prof. 2005;25:174-82.

7. Bennett NL, Davis DA, Easterling WE Jr, Friedmann P, Green JS, Koeppen BM, Mazmanian PE, Waxman HS. Continuing medical education: a new vision of the professional development of physicians. Acad Med. 2000;75:1167-72.

8. Norman GR, Davis DA, Lamb S, Hanna E, Caulford P, Kaigas T. Competency assessment of primary care physicians as part of a peer review program. JAMA. 1993;270:1046-51.

9. Ward J, Macfarlane S. Needs assessment in continuing medical education. Its feasibility and value in a seminar about skin cancer for general practitioners. Med J Aust. 1993;159:20-3.

10. Fox RD, Miner C. Motivation and the facilitation of change, learning, and participation in educational programs for health professionals. J Contin Educ Health Prof. 1999;19:132-41.

11. Ebel RL. Essentials of educational measurement. 3rd ed. Englewood Cliffs, NJ: Prentice-Hall; 1979.

12. Ward M, MacRae H, Schlachta C, Mamazza J, Poulin E, Reznick R, Regehr G. Resident self-assessment of operative performance. Am J Surg. 2003;185:521-4.

13. Dunning D, Heath C, Suls JM. Flawed self-assessment. Implications for health, education and the workplace. Psychol Sci Public Interest. 2004;5:69-106.

14. Davis DA, Mazmanian PE, Fordis M, Van Harrison R, Thorpe KE, Perrier L. Accuracy of physician self-assessment compared with observed measures of competence: a systematic review. JAMA. 2006;296:1094-102.

15. Green JS, de Boer PG. AO principles of teaching and learning. Stuttgart: Georg Thieme Verlag; 2005.

16. Coats W, Smidchens U. Audience recall as a function of speaker dynamism. J Educ Psychol. 1966;57:189-91.

17. Russell IJ, Hendricson WD, Herbert RJ. Effects of lecture information density on medical student achievement. J Med Educ. 1984;59:881-9.

18. ACCME. http://www.accme.org. Accessed 2010 Jan.

19. Sachdeva AK. The new paradigm of continuing education in surgery. Arch Surg. 2005;140:264-9.

20. Mazmanian PE, Daffron SR, Johnson RE, Davis DA, Kantrowitz MP. Information about barriers to planned change: a randomized controlled trial involving continuing medical education lectures and commitment to change. Acad Med. 1998;73:882-6.