Concepts and Commentary

Exploring Validation and Verification


How They Differ and What They Mean to Healthcare Simulation

John Jacob Barnes, III, MD; Mojca Remskar Konia, MD, PhD, MACM

Summary Statement: The healthcare simulation (HCS) community recognizes the importance of quality management, because many novel simulation devices and techniques include some sort of description of how they tested and assured their simulation's quality. Verification and validation play a key role in quality management; however, the literature published on HCS has many different interpretations of what these terms mean and how to accomplish them. The varied use of these terms leads to varied interpretations of how the verification process differs from the validation process. In this article, we set out to explore the concepts of verification and validation by reviewing the current psychometric science description of the concepts and by exploring how other communities relevant to HCS, such as medical device manufacturing, aviation simulation, and the fields of software and engineering, which are building blocks of technology-enhanced HCS, use the terms, with a focus on trying to clarify the process of verification. We also review the current literature available on verification, as compared with validation, in HCS and, finally, offer a working definition and concept for each of these terms with the hope of facilitating improved communication within, and with colleagues outside, the HCS community.
(Sim Healthcare 13:356-362, 2018)

Key Words: Verification, validation, healthcare simulation, medical education, quality assurance.

From the Department of Anesthesiology, University of Minnesota, Minneapolis, MN.
Reprints: John Jacob Barnes III, MD, Anesthesiology, B515 Mayo Bldg, 420 Delaware St SE, Minneapolis, MN 55455 (e-mail: BarnesJ@umn.edu).
The authors declare no conflict of interest.
This study is attributed to the Department of Anesthesiology, University of Minnesota School of Medicine.
Copyright © 2018 Society for Simulation in Healthcare
DOI: 10.1097/SIH.0000000000000298

For the last two decades, healthcare simulation (HCS) has grown exponentially and expanded its role as a valuable tool for training medical professionals. There has been a 10-fold increase in HCS literature from the 1990s to the 2000s,1,2 and the search term "healthcare simulation training" in PubMed returned 909 peer-reviewed articles in 2016, compared with 300 in 2006 and 57 in 1996, reflecting exponential growth. Although this produces exciting possibilities, it also comes with some growing pains. Efforts have been made to establish consensus definitions and concepts in HCS, notably by the Society for Simulation in Healthcare and the textbook Defining Excellence in Simulation Programs by Palaganas et al3,4; however, the varied use of terms and concepts remains a challenge in the field and its subsequent research publications. One of these concepts, which needs further defining, is how to ensure that an HCS is built according to set specifications and fulfills the intended training needs. Psychometric science will serve as the foundation of our understanding of verification and validation. Then, we will explore the varied use of these terms in distinct industries.

Simulation itself is no longer in its infancy, even if its use and application in medical training are. The aviation industry is a good example of how simulation has grown and evolved for nearly one century; it recognized the importance of standards for validation, resulting in established, very specific guidelines through the Federal Aviation Administration (FAA). Software and engineering, both of which heavily rely on verification and validation to ensure their products work and function in their intended use, are the foundational building blocks of technology-enhanced HCS. Medicine itself has stressed the importance of verification and validation in the medical device and pharmaceutical industries with guidance documents from the Food and Drug Administration (FDA). It is time for HCS training to learn from its abutting industries and establish clear definitions for verification and validation.

The exercise of exploring verification and validation in psychometric science, software, engineering, aviation simulation, and medicine offers insight into and understanding of the concepts, but it is important to note that one must be cautious when drawing conclusions from comparing verification and validation in these distinct fields to HCS, because these concepts are heavily dependent on context.5 The decision to discuss certain fields of study in this article is meant to highlight the depth and breadth of use of verification and validation in arenas that would be both familiar and relevant to the audience interested in HCS and medical education using simulation. The disciplines included are not meant to be all-inclusive and were not selected in a scientific manner.

Validation in Psychometric Science

For the past 60 years, psychometric science has made significant advances in the understanding and use of validation, particularly in educational assessment. In 1990, Messick6 wrote about viewing validation as a unified framework with varying degrees (not all-or-nothing) and with constituent evidence (not different types of validation). This view of validation was adopted by the American Educational Research Association, the American Psychological Association, and the National Council on Measurement
in Education in 1999 with the publication of the Standards for Educational and Psychological Testing and continues to be present in the 2014 edition.7 Before this time, validity was considered to have different types, such as face validity, construct validity, etc. More recent work in psychometric science by Kane and Service8 further develops the idea of validation as an argument-based approach to support claims that go beyond observed performances. In the more modern approach to validation, there are five sources of evidence that support the argument for validation of assessments: content, response process, internal structure, relationship to other variables, and consequences.

The Standards for Educational and Psychological Testing does not include a dedicated chapter on verification, as it does for validation. The chapter on validation is arranged in the first part of the book under "Foundations," which also includes a chapter titled "Reliability/Precision and Errors of Measurement." Publications discussing HCS verification efforts use both reliability and errors of measurement concepts as parts of verification.9,10

How Nonmedical Fields Use These Terms

Because it is difficult to find or establish consensus definitions and techniques for verification and validation in HCS, it is useful to look at how the foundational sciences behind technology-enhanced HCS, software and engineering, use the terms and how the aviation industry has developed established guidelines for these concepts. Before delving into these three fields and their use and implementation of verification and validation, it is worth noting that scientific or theoretical knowledge and understanding on a technical level play a fundamental role in these fields and their verification and validation efforts. Oberkampf et al11 provide an excellent review of verification and validation in scientific fields, discussing specific and technical aspects that likely are not applicable or relevant to HCS development and quality control. The utility of exploring these fields lies in growing and developing a better understanding of what verification and validation mean, how they are used in other sectors, and how this applies to HCS.

In software, the notion of verification and validation has been around since the 1960s.12 Wallace and Fujii12 summarized verification and validation as a way to "comprehensively analyze and test software to determine that it performs its intended functions correctly, to ensure it performs no unintended functions, and to measure its quality and reliability" in their overview of the concepts. Boehm13 differentiated the terms in the 1980s by defining verification as answering the question, "Are we building the product right?" and validation as "Are we building the right product?" With regard to software testing, designers focus on making sure that the software can be verified by being functional, without bugs, and that the product is smooth, built according to specifications, and has consistent output. Software validation can be less objective, because it often relies on opinions when answering the question, "Are we building the right product?" To answer this question, one must test the product to see whether it indeed serves its intended purpose. This can be achieved with beta testing, where subjects evaluate usability and determine whether the software or product achieves the intended goal, or through a more structured analysis of the computer-human interaction of software through usability engineering, which focuses on ease of learning, ease of use, and user satisfaction. Carefully planning validation that can be integrated into the software development process, before actual development occurs, can address some of the challenges of subjectivity and facilitate more effective validation.14
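To make Boehm's distinction concrete for an HCS audience, the following minimal Python sketch (our illustration, not drawn from any cited source) contrasts the two questions for a hypothetical technology-enhanced trainer: a verification-style check tests whether the output of a simulated heart rate module stays within its written specification, whereas a validation-style check summarizes whether end users judged that the trainer met the intended training need. The specification values, the simulate_heart_rate stub, and the learner ratings are invented placeholders.

# A minimal sketch contrasting verification ("built right") with validation
# ("the right product"). All names, values, and data are hypothetical.
import statistics

SPEC = {"hr_min": 58, "hr_max": 62}  # hypothetical design specification: resting heart rate of 60 +/- 2 beats/min

def simulate_heart_rate(seconds: int) -> list[float]:
    """Stand-in for a simulator module that should emit a resting heart rate near 60 beats/min."""
    return [60.0 + 0.5 * ((i % 4) - 1.5) for i in range(seconds)]

def verify_against_spec(samples: list[float]) -> bool:
    """Verification: 'Are we building the product right?' The output stays within the specified band."""
    return all(SPEC["hr_min"] <= s <= SPEC["hr_max"] for s in samples)

def validate_against_user_need(ratings: list[int], threshold: float = 4.0) -> bool:
    """Validation: 'Are we building the right product?' End users rate (1-5) whether
    the simulator met the intended training need; the mean must reach a preset threshold."""
    return statistics.mean(ratings) >= threshold

if __name__ == "__main__":
    output = simulate_heart_rate(30)
    print("Verification (within specification):", verify_against_spec(output))
    print("Validation (meets user need):", validate_against_user_need([5, 4, 4, 5, 3]))

The point of the sketch is only that the first check can be automated against a written specification, whereas the second ultimately rests on human judgment about the intended use.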
Verification and validation are also important in the field of engineering. The Institute of Electrical and Electronics Engineers (IEEE) publishes standards for and definitions of verification and validation in its guidebook for project managers:

"Verification. The evaluation of whether or not a product, service, or system complies with a regulation, requirement, specification, or imposed condition. It is often an internal process. Contrast with validation."15

"Validation. The assurance that a product, service, or system meets the needs of the customer and other identified stakeholders. It often involves acceptance and suitability with external customers. Contrast with verification."15

The IEEE definition partially overlaps with the software industry's definitions of verification and validation, but there is also a hardware or physical component that requires testing, verification, and validation. As the IEEE oversees multiple industries, the definitions it provides are broader and further reaching than those offered by the software industry.

Looking at how the fundamental sciences behind HCS define verification and validation is helpful, as is looking at how simulation in other fields, such as aviation, uses the terms. It is common to compare HCS with simulation in aviation, and one review article by Aebersold16 focuses on this comparison.17 Aebersold16 discusses the Link trainer's origins in the 1920s, often cited as the first flight simulator, and how 50 years later, in the 1970s, most pilot training was conducted in simulators. The author highlights that simulation in healthcare, specifically in the nursing field, is being used to develop technical skills and "crew resource management" skills in a similar manner to the aviation industry but has failed to meet the milestone set in the 1970s of having most training done in the safety of the simulation umbrella.16 As aviation simulation has matured as a field, so have its methods of validating the quality of the experience. In the early stages of aircraft flight simulators, there was a reliance on an experienced pilot to validate a simulator based on its fidelity to the actual performance and handling of an aircraft.18 Although logical, these evaluations are prone to subjectivity and fraught with vulnerability to human error. Because of these susceptibilities, the industry began using more quantitative validation of simulators in addition to the opinions of experienced pilots. The FAA has developed many requirements to validate and maintain certification of flight simulators used to qualify commercial aircraft pilots in its guidance documents under the title Flight Simulator Training Qualification. Because of these regulations, the time and cost of validating flight simulator training devices are tremendous, with estimates of US $15 million in monetary cost and enormous data sets required to be collected and analyzed for validation. One advantage the aviation
industry has is the ability to compare objective information from mathematical models and data collected by the electronics of actual aircraft in flight with the output data produced by simulators to form complex validation techniques.19

Validation tests are used to compare objectively flight training device data and airplane data (or other approved reference data) to assure that they agree within a specified tolerance. Function tests provide a basis for evaluating flight training device capability to perform for a typical training period and to verify correct operation of the controls, instruments, and systems.18

A common technique for validation of aviation simulation is to test individual components in a quantitative way at four levels: first, individual modules tested individually; second, interrelated modules tested together; third, models tested for dynamic response; and, finally, a pilot using the model.18
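The tolerance-based comparison described in the quoted guidance can be illustrated with a short sketch in the same spirit; the signal, data, and tolerance below are hypothetical and greatly simplified relative to actual FAA qualification testing, which covers many maneuvers, signals, and formally specified tolerances.

# Hedged illustration of a tolerance-band validation test: reference aircraft data
# and simulator output are compared sample by sample, and the test passes only if
# every deviation stays within the specified tolerance. The data are invented.

def within_tolerance(reference: list[float], simulator: list[float], tolerance: float) -> bool:
    """Return True if every simulator sample agrees with the reference within +/- tolerance."""
    if len(reference) != len(simulator):
        raise ValueError("reference and simulator traces must be the same length")
    return all(abs(r - s) <= tolerance for r, s in zip(reference, simulator))

if __name__ == "__main__":
    # Hypothetical pitch-angle trace (degrees) recorded from an aircraft and from the simulator.
    aircraft_pitch = [0.0, 1.2, 2.5, 3.9, 5.0, 5.8]
    simulator_pitch = [0.1, 1.1, 2.7, 3.8, 5.2, 5.9]
    print("Validation test passes:", within_tolerance(aircraft_pitch, simulator_pitch, tolerance=0.5))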
It is important to note that aviation, based on physics principles, lends itself nicely to modeling and simulation. The principles of aviation are more fundamentally established and have extensive and accurate mathematical models, which have been vetted through experimental research over the years. Mathematical models based on physics laws and principles can accurately predict how an aircraft will behave under variable circumstances and actions. This same luxury is not afforded to HCS, because human physiology is highly variable within and between subjects20 and running experimental research on human subjects is limited by ethical considerations. It is conceivable that HCS and modeling focusing on specific tasks, such as deploying an endovascular stent, could approach the precision found in aviation simulation; however, HCS attempting to emulate a "whole patient," as in the use of technology-enhanced mannequins, is never likely to achieve the pinpoint accuracy of modeling used in aviation simulation based on physics. The differences between simulation in aviation and healthcare are important when discussing verification and validation, as the concepts are highly context dependent. For example, the verification and validation activities for a highly mathematical model of aviation physics would rightly look much different than those for an HCS mannequin used as a whole-patient trainer.

Verification and Validation in Medicine

Verification and validation are not foreign concepts to medicine or healthcare. As healthcare is an industry with tight regulations and restrictions, overseeing organizations, such as the United States FDA, have defined the terms verification and validation and how these concepts apply to determining the safety of products used. Although HCS for training purposes is not currently under the umbrella of an overseeing organization, such as the FDA, the industry can learn from the use of these terms in other areas of medicine.

The FDA publishes guidelines for verification and validation of medical devices, now including computational modeling.21-23 The FDA relies on making clear definitions and identifying broad concepts that can be applied to each unique medical device.21 The FDA specifically notes that it is common to group verification and validation together as "V&V" but specifies that it considers them separate and distinct terms in its guidelines for software development in medical devices. It defines software verification as "objective evidence that design outputs…meet all of the specified requirements."21 It suggests software testing, static and dynamic analysis, and code inspections as ways of completing this task. The FDA defines software validation in bold text as:

"Validation is the confirmation by examination and provision of objective evidence that software specifications conform to user needs and intended uses, and that the particular requirements implemented through software can be consistently fulfilled."22

It also recognizes that validation depends on verification tasks and recommends testing the functionality of the software as a prototype in a simulated use environment, and later in limited patient use, as key components to ensure validity. The FDA recommends using verification and validation throughout the design process.

The pharmaceutical industry, also regulated by the FDA, puts a strong emphasis on process validation, defined as "the collection and evaluation of data, from the process design stage through commercial production, which establishes scientific evidence that a process is capable of consistently delivering quality product."24 The FDA details this process in its document "Process Validation: General Principles and Practices," accompanied by its current recommendations. Process validation breaks down into three stages, with continued process verification as the final stage, in which "ongoing assurance is gained during routine production that the process remains in a state of control (the validated state)."24 Manufacturers of pharmaceutical goods go through a stringent validation process proving that they "consistently produce drug products…relating to identity, strength, quality, purity, and potency"24 and then must continue to verify that the process does not depart from its approved and validated state by collecting and analyzing product and process data. As the pharmaceutical industry markets and intends its products for human consumption, the process is scrupulous and overseen by the FDA. Although HCS for training does not report to the FDA, the broad concepts of validation through scrupulously checking for quality and reproducibility at various points in production, as well as continued verification, are important and practical for application to HCS.

The act of validating and verifying biomechanical products for use in medicine incorporates several concepts similar to those used by the nonbiological engineering fields but has its own unique challenges because of the products' design for and use by the medical community and patients. Hicks et al25 and Stanford University colleagues published recommendations for validating and verifying biomechanical products. They recommend verification of computational models and underlying algorithms of simulated physiological or physical movement by comparing them with known standards, which they state is more feasible if done individually on components and then on aggregates of components, and validation through comparison of whole-model simulation data with patient data from independent experiments or other published model data sets. Hicks et al25 mention the limited criterion standard data sets, due to the difficulty of measuring desired output in humans, and the inherent interpatient variability, both of which make verification and validation more challenging than in nonmedical fields.
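The component-then-whole-model workflow recommended by Hicks et al25 can be sketched roughly as follows; the functions, data, and thresholds are hypothetical stand-ins for a comparison of one component against a known analytic standard (verification) and of whole-model output against independent patient measurements (validation).

# Simplified sketch, not code from Hicks et al: verify a single component against a
# known standard, then validate whole-model output against patient data by checking
# that the root-mean-square error stays below an acceptance threshold.
import math

def component_output(t: float) -> float:
    """Stand-in for one computational component, e.g., a joint-angle trajectory."""
    return math.sin(t)

def analytic_standard(t: float) -> float:
    """Known standard that the component is expected to reproduce."""
    return math.sin(t)

def verify_component(n_points: int = 100, tol: float = 1e-9) -> bool:
    """Verification: the component agrees with the established standard within a numerical tolerance."""
    return all(abs(component_output(i / 10) - analytic_standard(i / 10)) <= tol for i in range(n_points))

def rmse(model: list[float], observed: list[float]) -> float:
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model, observed)) / len(model))

def validate_whole_model(model: list[float], patient_data: list[float], max_rmse: float) -> bool:
    """Validation: whole-model predictions agree with independent patient data."""
    return rmse(model, patient_data) <= max_rmse

if __name__ == "__main__":
    print("Component verified:", verify_component())
    model_knee_angle = [10.0, 25.0, 40.0, 30.0, 12.0]     # hypothetical model output (degrees)
    measured_knee_angle = [11.0, 24.0, 42.0, 29.0, 13.0]  # hypothetical patient measurements (degrees)
    print("Whole model validated:", validate_whole_model(model_knee_angle, measured_knee_angle, max_rmse=3.0))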

Fields of study, or areas, within medicine address verification and validation in similar but diverse ways, as is evident from the examples provided. The FDA makes a point of differentiating between verification and validation, and the medical device, pharmaceutical, and biomechanical industries perform different activities to accomplish these two distinct tasks. For verification, the medical device industry uses tools such as static and dynamic analysis of software and code inspection, the pharmaceutical industry uses ongoing data collection and analysis of product to ensure the approved manufacturing process has not altered course, and the biomechanical field compares individual components of a model to established principles. For validation, the medical device industry tests devices in simulated environments and limited patient sets, the pharmaceutical industry analyzes product at various stages of production for quality, and the biomechanical field compares novel devices to other published models or experimental data on patients. Healthcare simulation shares many of the challenges in successfully performing verification and validation, and reviewing how these fields face those challenges may help develop these concepts in HCS.

Healthcare Simulation Validation

There is much more information published on validation of simulation use in healthcare compared with verification. Validation has many different definitions, and it is important to ask, "What are we really validating?" Validation of tools for educational assessment has made significant advancements in the last 60 years, and Downing26 and Cook et al have made efforts to bring these ideas into medical education and HCS, respectively.27,28 Downing provides a review of the contemporary use of validity as a unitary concept with multiple facets of evidence and discusses this in the context of assessments (standardized examinations) in medical education. In a systematic review of validation evidence in 417 simulation studies used for healthcare professionals, Cook et al27 discovered that only 3% used the unified view of validation discussed here and accepted by the Standards for Educational and Psychological Testing, 35% used the older framework focusing on "types" of validation, and 24% used no validation framework at all. It seems that validation is underused and not well understood in medical education and HCS research.

The European Association of Endoscopic Surgeons (EAES) saw simulation as a valuable tool for training and noted the lack of consensus guidelines or uniform assessment of available simulators. In 2005, it formed a subgroup, the Work Group for Evaluation and Implementation of Simulators and Skills Training Programmes, at its annual meeting to help define validation of surgical simulators and to evaluate current products on the market based on its definition of validation and the available research. It defined validity as "the extent to which an assessment instrument measures what it was designed to measure." It argued that to do this, it was necessary to look at validity; however, it used the former understanding of validity, focusing on validity types rather than the unified validity framework. The subgroup noted "face validity" as the degree of resemblance between the simulator and the real activity and considered this one facet of validation. It also examined "content validity," or how well the simulation covers the subject matter of the real activity; "construct validity" or "contrast validity," the ability of the simulation to differentiate between ability or experience levels; and "predictive validity," the ability to correlate performance on the simulator with an established assessment of a skill or attribute in patient care. It argued that the most powerful evidence for validation it found, or the most important aspect of validation for the group, is predictive validity.9

One of the most common validation questions addressed is "Does a scoring of performance in this simulator correlate with a measure of experience or ability?" Grouping "novice" users, such as medical students or residents, and comparing their performance with that of "experts," such as attending physicians at teaching hospitals or principal investigators, is a common way of validating that the simulation in question can test construct validity, or differentiate users based on experience and skill levels.29-31 A similar technique is to divide the novice group into a control group and a group that receives training on a simulator and then to compare their skills with those of experts. This is a way to validate a simulator in teaching a skill that translates into patient use.32 In the updated unified framework of validation, these techniques are considered "relationship with other variables" evidence, and Cook et al27 note that this was by far the most commonly used evidence for validation in their systematic analysis, with 73% of the studies using this method and 33% using it as their only evidence to support validity. Expert-novice comparison does demonstrate a simulation's sensitivity, or ability to discriminate between two groups, but Cook et al27 describe this as a "relatively weak" form of validation evidence and consider the overreliance on expert-novice comparison for validation unfortunate.
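As a hedged illustration of this "relationship with other variables" evidence, the sketch below compares invented novice and expert checklist scores and reports the probability that a randomly chosen expert outscores a randomly chosen novice (equivalent to the area under the receiver operating characteristic curve); a real study would add a formal hypothesis test and, as Cook et al27 caution, should not rely on this comparison alone.

# Expert-novice discrimination sketch with invented scores. The index equals 0.5 when
# the score cannot separate the groups and 1.0 when every expert outscores every novice.
from statistics import mean

def discrimination_index(novice: list[float], expert: list[float]) -> float:
    """P(expert score > novice score), counting ties as 0.5."""
    wins = sum(1.0 if e > n else 0.5 if e == n else 0.0 for e in expert for n in novice)
    return wins / (len(expert) * len(novice))

if __name__ == "__main__":
    novice_scores = [42, 55, 47, 60, 51]  # hypothetical checklist scores, medical students
    expert_scores = [78, 85, 74, 90, 81]  # hypothetical checklist scores, attending physicians
    print(f"Novice mean: {mean(novice_scores):.1f}, expert mean: {mean(expert_scores):.1f}")
    print(f"Discrimination index (AUC): {discrimination_index(novice_scores, expert_scores):.2f}")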
Other publications focus on the face validity of an HCS. The idea of face validity is to analyze a simulation and decide how closely it resembles or feels like the actual activity. This can be more closely examined and compartmentalized into categories such as tissue or anatomic quality. For example, one could examine the anatomy of an airway simulation trainer and compare their experience with typical human anatomy. Classifying or "proving" face validity often relies on the subjective ratings of subject matter experts or those with some predefined experience in the simulated subject matter.1,10,33

These techniques struggle with some form of subjectivity, which is part of the intrinsic nature of validation. If you are answering the question, "Did I build the right product?" or "Are the user needs and intended uses met?" a subjective answer is required. Users may have different ideas of what the intended use is and may have different opinions on whether this need was met. One method to make validation more objective is to incorporate validation into the developmental cycle so that there are agreed-upon ways to establish validity before the design process ever begins (Fig. 1). This can prevent bias from coming in after a design is complete or a product is ready for testing. This is common in medical device manufacturing as part of the entire process termed "design control activities."21

FIGURE 1. Adapted from FDA figure of Design Control Guidance for Medical Device Manufacturers. A pictorial representation of how verification and validation can be incorporated into the design control process (developmental cycle) of HCS as it is in medical device manufacturing.

Healthcare Simulation Verification

Publications in HCS focus less on verification. This may result from the "bundling" of the individual concepts of
verification and validation into one concept and a lack of clear separation in the individual activities around verification and validation. Some studies use the term verification but do not specify what the word means or how it was determined, whereas others use different terms, such as reliability or repeatability, when discussing quality management in HCS.9,10,34 The previously mentioned EAES subgroup on evaluating surgical simulators also noted the importance of "reliability," or the "ability to provide consistent results with minimal errors in measurement," and named test-retest reproducibility and internal consistency as commonly used methods to estimate this.9 The software and engineering fields perform these tests as a form of verification.12,15 In the EAES's work on establishing consensus guidelines for rating the simulation equipment available at the time, it noted the paucity of data available for verification and decided to focus its attention solely on validation. Because different authors in different fields use the words reliability and verification to describe similar concepts and tasks, it can be confusing. It seems that verification is the preferred word in the software, engineering, and medical device industries for testing and demonstrating test-retest reproducibility, internal consistency, and accuracy, whereas some HCS publications address similar tasks and discuss them under the terms repeatability and reliability.10,29,30,34,35

Gallagher and colleagues10 also discuss how reliability relates to the medical field. Their article focused on creating validation systems for evaluating surgical training and assessment devices. They argued that reliability is a crucial part of a total validation system (ie, that it is mathematically consistent, reproducible, and dependable) rather than using the term verification. Specifically, the group explored split-half and test-retest methods to establish reliability in assessment tools.10 In contrast, Hicks et al25 set out to perform verification in the field of biomechanical devices by determining the accuracy of musculoskeletal models and simulations of movements. They defined verification as determining the agreement between more complex novel computational models of musculoskeletal movement and the underlying established mathematical model of physical movement.25 In this case, the two authors are using different words to describe similar mathematical concepts when discussing quality management within HCS. Gallagher and colleagues,10 who use the word reliability in their article, caution that readers may misconstrue the word reliable, because it carries favorable connotations and is synonymous with "good" or "worthwhile" in common speech, whereas in scientific writing, reliable means consistent, or that a tool yields the same results when used repeatedly under similar conditions. Further confounding matters, Cumin et al34 studied the consistency of physiologic response in various HCSs and used the term repeatability. It seems that reliability and repeatability are not equivalent to verification; rather, it may be more appropriate to consider them individual components of the larger verification effort. Furthermore, Lampotang,36 in an article discussing HCS repeatability, notes that patient variability constitutes an inherent challenge in HCS but also argues that variability within a design framework of HCS "fortuitously relaxes modeling requirements," as the repeatability of a simulation should only need to fall within a predetermined range of plausible patient variation. With inconsistent use of terms, it is important for authors to state and describe with clarity what they aim to accomplish and how this comprises a component of quality control or assurance.
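Two of the constituent activities named above can be illustrated with a brief sketch using invented numbers: test-retest reliability estimated as the correlation between scores from two administrations of the same assessment, and repeatability expressed as the coefficient of variation of a physiologic output across repeated runs of the same scenario. Neither calculation is a complete verification effort on its own; each is one component of a larger program.

# Test-retest reliability and run-to-run repeatability with invented data.
from statistics import mean, pstdev

def pearson_r(x: list[float], y: list[float]) -> float:
    """Pearson correlation between paired measurements (test vs. retest)."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

def coefficient_of_variation(samples: list[float]) -> float:
    """Repeatability of repeated runs: standard deviation expressed as a fraction of the mean."""
    return pstdev(samples) / mean(samples)

if __name__ == "__main__":
    test = [70, 82, 65, 90, 75]    # hypothetical assessment scores, first administration
    retest = [72, 80, 68, 88, 77]  # hypothetical assessment scores, second administration
    print(f"Test-retest reliability (r): {pearson_r(test, retest):.2f}")

    peak_heart_rates = [118, 121, 119, 122, 120]  # same scenario run five times on one simulator
    print(f"Repeatability (coefficient of variation): {coefficient_of_variation(peak_heart_rates):.3f}")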
Sweet et al37 have proposed integration of validation and verification into the developmental and evaluation process of simulation tools. In their article, they describe the technique as the Center for Research in Education and Simulation Technologies (CREST) process, which is based on the earlier "backward design principles" of Wiggins and McTighe38 in their book on curriculum development. Using backward design principles, Sweet et al37 used verification and validation techniques throughout the development process and applied them to early prototypes rather than the final product. The authors noted that using this method helped refine their models, made verification and validation easier to complete, and improved their final product. Focusing on verification and validation throughout the developmental process, rather than concentrating validation and verification efforts after design, is an interesting way to confirm that results are consistent (verification) and that the individual parts perform their specified function (validation).

Summary

Looking at the use of verification and validation in HCS provides a glimpse into how these techniques are used and how broadly they are defined, and it establishes them as unique concepts with interdependence and overlapping constituent activities (Fig. 2). As pointed out in the articles by Gallagher and colleagues10 and Sweet,35 the educated reader of scientific publications in this field must be aware of the broad implications and use of the techniques and be careful to draw their own conclusions about what has been "verified" or "validated." It is again worth noting the importance of using caution when comparing highly context-dependent concepts such as verification and validation across the distinct fields covered in this review. Many of the techniques for verification and validation in highly scientific or technical fields would not be practical or have any utility in the evaluation of the applied science of HCS, but the exploration of these topics helps develop an understanding of the purpose of verification and validation in quality assurance. Some fields rely on regulatory agencies to provide strict definitions of and laws governing verification and validation, such as aviation (FAA) and medicine (FDA). We would suggest that this is too stringent for the current state of HCS and that the field would instead benefit from further refinement of commonly used terms and guidelines for activities such as verification, validation, and their constituent elements, in a manner comparable to the fundamental sciences of software and engineering. Efforts have been made to do so, as evidenced by the 2016 publication of a dictionary by Simulation in Healthcare, which included the terms simulation reliability and simulation validity.

FIGURE 2. The relationship between verification and validation in HCS showing tasks unique to each and overlapping tasks.

Borrowing from those fields, a hybrid definition for verification and validation in HCS could be the following:

Verification: the act of confirming that the simulator was built according to its specifications

Validation: the act of confirming that the HCS accurately portrays what it is modeling and produces the desired outcome

These definitions differentiate verification and validation and provide a foundation on which authors can precisely communicate their findings and readers can better understand the concepts and more effectively draw their own conclusions. We agree that verification and validation should be integral parts of the design process rather than tasks completed after the completion of the process, as has been previously suggested. The industries explored in this article all stress the importance of integrating verification and validation into the design process, as do Sweet and colleagues37 for use in HCS education. Formulating a way to answer the questions from Boehm's13 early work with software, "Did we build the right product?" and "Did we build the product right?", is much easier, and may more appropriately reflect true validation and verification, before the design process is initiated rather than after the fact. In the end, the burden of proving a product or technique valid or verified falls on the author, and the burden of drawing those conclusions falls on the reader. Having a clear definition of these separate but overlapping concepts ensures that we are effectively communicating important ideas for quality management in the rapidly emerging and exciting realm of HCS.

REFERENCES

1. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach 2005;27(1):10-28.
2. Rosen KR. The history of medical simulation. J Crit Care 2008;23(2):157-166.
3. Loreiato J, Downing D, Gammon W, et al; The Terminology & Concepts Working Group. Healthcare Simulation Dictionary. Rockville, MD: Agency for Healthcare Research and Quality; 2016.
4. Palaganas JC, Maxworthy J, Epps CA, Mancini ME. Defining Excellence in Simulation Programs. Philadelphia, PA: Lippincott Williams & Wilkins; 2014.
5. Spiegel M, Reynolds PF Jr, Brogan DC. A case study of model context for simulation composability and reusability. Paper presented at: Proceedings of the 37th Conference on Winter Simulation; December 4, 2005.
6. Messick S. Validity of test interpretations and use. ETS Research Report Series 1990;1:1487-1495.
7. American Educational Research Association, American Psychological Association, National Council on Measurement in Education, Joint Committee on Standards for Educational and Psychological Testing. Standards for Educational and Psychological Testing. Washington, DC: American Educational Research Association; 2014.
8. Kane MT, Service ET. Validating the interpretations and uses of test scores. J Educ Meas 2017;50(1):1-73.
9. Carter FJ, Schijven MP, Aggarwal R, et al. Consensus guidelines for validation of virtual reality surgical simulators. Surg Endosc 2005;19(12):1523-1532.
10. Gallagher AG, Ritter EM, Satava RM. Fundamental principles of validation, and reliability: rigorous science for the assessment of surgical education and training. Surg Endosc 2003;17(10):1525-1529.
11. Oberkampf WL, Trucano TG, Hirsch C. Verification, validation, and predictive capability in computational engineering and physics. Appl Mech Rev 2004;57(5):345-384.
12. Wallace DR, Fujii RU. Software verification and validation: an overview. IEEE Softw 1989;6(3):10-17.
13. Boehm B. Software risk management: principles and practices. IEEE Softw 1991;8(1):32-41.
14. Rosson MB, Carroll JM. Usability Engineering: Scenario-Based Development of Human-Computer Interaction. San Francisco, CA: Morgan Kaufmann; 2002.

15. IEEE Draft Guide: Adoption of the Project Management Institute (PMI) Standard: A Guide to the Project Management Body of Knowledge (PMBOK Guide)-2008. 4th ed. IEEE; 2011.
16. Aebersold M. The history of simulation and its impact on the future. AACN Adv Crit Care 2016;27(1):56-61.
17. Abrahamson S, Denson JS, Wolf RM. Effectiveness of a simulator in training anesthesiology residents. J Med Educ 1969;44(6):515-519.
18. National Research Council. Appendix F: Validation of aircraft flight simulators. In: Webster WC, ed. Shiphandling Simulation: Application to Waterway Design. Washington, DC: National Academies Press; 1992.
19. Dillard AE. Validation of advanced flight simulators for human-factors operational evaluation and training programs. Foundations for Verification and Validation Workshop. Johns Hopkins University Applied Physics Laboratory; 2002.
20. Lipsitz LA, Goldberger AL. Loss of 'complexity' and aging. Potential applications of fractals and chaos theory to senescence. JAMA 1992;267(13):1806-1809.
21. US Food and Drug Administration. Design Control Guidance for Medical Device Manufacturers. Rockville, MD: Center for Devices and Radiological Health; 1997.
22. US Food and Drug Administration. General principles of software validation. In: Final Guidance for Industry and FDA Staff. Rockville, MD: Center for Devices and Radiological Health; 2002:6.
23. US Food and Drug Administration. Reporting of computational modeling studies in medical device submissions. In: Guidance for Industry and Food and Drug Administration Staff. Rockville, MD: Center for Devices and Radiological Health; 2016.
24. US Food and Drug Administration. Process validation: general principles and practices. In: Guidance for Industry. Rockville, MD: Center for Drug Evaluation and Research; 2011.
25. Hicks JL, Uchida TK, Seth A, Rajagopal A, Delp SL. Is my model good enough? Best practices for verification and validation of musculoskeletal models and simulations of movement. J Biomech Eng 2015;137(2):020905.
26. Downing SM. Validity: on meaningful interpretation of assessment data. Med Educ 2003;37(9):830-837.
27. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med 2013;88(6):872-883.
28. Cook DA, Hatala R, Brydges R, et al. Technology-enhanced simulation for health professions education: a systematic review and meta-analysis. JAMA 2011;306(9):978-988.
29. Devitt JH, Kurrek MM, Cohen MM, et al. Testing the raters: inter-rater reliability of standardized anaesthesia simulator performance. Can J Anaesth 1997;44(9):924-928.
30. Devitt JH, Kurrek MM, Cohen MM, et al. Testing internal consistency and construct validity during evaluation of performance in a patient simulator. Anesth Analg 1998;86(6):1160-1164.
31. Ullah W, Hunter RJ, Finlay M, et al. Validation of a high-fidelity electrophysiology simulator and development of a proficiency-based simulator training program. Simul Healthc 2017;12(1):41-46.
32. Morris E, Kesser BW, Peirce-Cottler S, Keeley M. Development and validation of a novel ear simulator to teach pneumatic otoscopy. Simul Healthc 2012;7(1):22-26.
33. Berwick RJ, Mercer SJ, Groom P. Evaluating the fidelity of a novel part-task trainer for emergency front of neck access training. BMJ Simul Technol Enhanc Learn 2017. doi: 10.1136/bmjstel-2017-000208.
34. Cumin D, Chen C, Merry AF. Measuring the repeatability of simulated physiology in simulators. Simul Healthc 2015;10(6):336-344.
35. Sweet RM. The CREST simulation development process: training the next generation. J Endourol 2017;31(S1):S69-S75.
36. Lampotang S. Unlike history, should a simulator not repeat itself? Simul Healthc 2015;10(6):331-335.
37. Sweet RM, Hananel D, Lawrenz F. A unified approach to validation, reliability, and education study design for surgical technical skills training. Arch Surg 2010;145(2):197-201.
38. Wiggins G, McTighe J. Understanding by Design. 2nd ed. Alexandria, VA: ASCD; 2005.

