Practicing Healthcare Professionals' Evidence-Based Practice Competencies: An Overview of Systematic Reviews
Lynn Gallagher-Ford, PhD, RN, DPFNAP, NE-BC, FAAN ● Tarja Kvist, PhD, RN ●
Katri Vehviläinen-Julkunen, PhD, RN, FEANS
Cumulative Index for Nursing and Allied Health Literature (CINAHL), Scopus, and Cochrane Library for primary empirical studies and reviews published between January 1, 2012, and July 31, 2017 (i.e., for a period of approximately the last 5 years), without any language restrictions. With the expert assistance of a university librarian, keywords and search terms related to the various healthcare disciplines, EBP, and competencies were first searched independently and then in combination, with appropriate modifications made for the various databases (e.g., MeSH terms in PubMed). The term “research utilization” was not used, as the aim of this overview of systematic reviews was to focus on healthcare professionals’ EBP competencies (i.e., their EBP knowledge, skills, attitudes, beliefs, and implementation). Moreover, research utilization focuses on the retrieval, critique, and use of the research results from a single primary study, whereas EBP is commonly considered to be a much broader concept that includes research utilization and the integration of summarized and translated best evidence from several well-defined studies into clinical practice (Melnyk & Fineout-Overholt, 2011). In addition to the database searches, authors of the included reviews were contacted for any missing key information, the included reviews were reference-chased, and the tables of contents of the following peer-reviewed journals were hand-searched for the years 2012–2017: Worldviews on Evidence-Based Nursing, Journal of Advanced Nursing, BMC Health Services Research, BMC Medical Education, BMJ Open, Physiotherapy, and British Journal of Occupational Therapy. These journals were selected because they had published the majority of the reviews on healthcare professionals’ EBP competencies yielded by the systematic literature searches conducted for this overview.

Inclusion and Exclusion Criteria
The inclusion and exclusion criteria for systematic reviews are listed in Table S1. Systematic reviews were defined as reviews that had clearly stated aims or objectives, predetermined inclusion criteria, searched at least three databases, performed data extraction, provided a synthesis of data, and performed a quality appraisal of the included studies. To be eligible for inclusion in this overview, reviews were required to (a) focus on one or more of the outcomes of interest (i.e., EBP competencies of healthcare professionals), (b) fulfill the definition of a systematic review, (c) meet the inclusion and exclusion criteria, and (d) meet the benchmark set for the methodological quality of the reviews. Before undertaking this overview of systematic reviews, the Cochrane Library and the Joanna Briggs Institute Library of Systematic Reviews were searched; no published or in-progress systematic reviews or overviews of systematic reviews on this topic were found.

Search Results and Data Evaluation
The database searches yielded a total of 3,932 publications, and 15 additional publications were identified through other sources. Titles were screened, and duplicates as well as records not clearly indicating a focus on practicing healthcare professionals’ EBP competencies were excluded. All remaining abstracts (n = 407) were screened against the purpose and inclusion criteria before being selected for further appraisal. After eliminating a total of 392 records that did not meet one or more inclusion criteria, the second screening resulted in 12 reviews. Three reviews were added through reference-chasing and hand-searching the tables of contents of the selected peer-reviewed journals, resulting in a total of 15 full-text reviews, which were assessed for eligibility. Four full-text reviews were excluded from the overview because they contained no critical appraisal of methodological quality and therefore did not meet the definition of a systematic review outlined for this overview. As a result, data were extracted from 11 systematic reviews. Figure S1 details the stages of searching and selecting reviews for inclusion or exclusion using the PRISMA flow diagram (Moher et al., 2009).

Data Extraction
The following data were extracted for each of the 11 reviews and organized in a data matrix, using a standardized data extraction form developed according to the guidance of the PRISMA statement (Moher et al., 2009): author(s), country, year of publication, types of participants, settings, study design(s) included, EBP aspects reviewed, quality appraisal(s) performed, main findings, and authors’ conclusions. The data were extracted by one reviewer and independently checked for accuracy and consistency by two other reviewers to ensure rigor and reproducibility. Any differences in opinion between the three researchers were discussed until a mutual agreement was formed. All 11 reviews were included in the critical appraisal of methodological quality.

Critical Appraisal of Methodological Quality
The overall quality of, and the differences in quality between, the included reviews were compared and contrasted in order to help interpret the results of the reviews synthesized in this overview. The overall quality of the reviews was not used as a criterion for inclusion, as the reviews included in this overview were already required to meet the definition of a systematic review and the specific inclusion criteria, and to pass a critical appraisal of methodological quality, the main purpose of which was to ensure that the included reviews conformed to usual research norms.
The criteria used by the three independent reviewers for evaluating methodological quality were those in the Rapid Critical Appraisal (RCA) tool for systematic reviews (OSUCN, 2017).
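The extract-then-verify workflow described under Data Extraction can be sketched as a record-plus-verification structure. This is only an illustrative sketch: the field names follow the extraction form listed in the text, but the class, function, and any example values are hypothetical, not taken from the included reviews.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class ReviewRecord:
    """One row of the data matrix; fields follow the PRISMA-guided
    extraction form named in the text. All values are placeholders."""
    authors: str
    country: str
    year: int
    participants: str
    settings: str
    study_designs: List[str] = field(default_factory=list)
    ebp_aspects: List[str] = field(default_factory=list)
    quality_appraisals: str = ""
    main_findings: str = ""
    authors_conclusions: str = ""


def independently_verified(extracted: ReviewRecord,
                           checks: List[ReviewRecord]) -> bool:
    """One reviewer extracts; the record counts as verified only when
    every independent re-check matches it exactly. Any mismatch would
    be discussed until mutual agreement, as described in the text."""
    return all(extracted == check for check in checks)
```

Dataclass equality compares records field by field, so a single discrepant field (e.g., a differently recorded setting) is enough to flag a record for discussion.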
All of the included systematic reviews focused on practicing healthcare professionals, but four of the 11 (36%) systematic reviews also contained small subsamples of healthcare students in some of their source studies. The clinical settings of the source studies were poorly identified, with only general statements such as “various settings” or “any clinical setting,” or the settings were not described at all in the majority (n = 7, 64%) of the included reviews. However, some of the included reviews did disclose containing source studies from hospital, primary care, and community care settings.

Outcomes Measured and Overlap Between the Included Reviews
Outcomes measured in the included reviews varied considerably, with several reviews containing other outcomes in addition to those related to healthcare professionals’ EBP competencies. Moreover, the instruments used to measure the outcomes also varied considerably. Healthcare professionals’ EBP competencies were measured by using self-report assessments in the source studies of all 11 included reviews (i.e., perceived EBP competencies were measured, instead of using more objective measures of actual performance, such as EBP knowledge tests). A total of 204 source studies were contained in the 11 reviews included in this overview. There was substantial overlap across the included reviews in terms of their source studies, as the 11 included reviews with a total of 204 source studies referred to a total of 133 separate studies, of which 48 were included in more than one review. An effort was made to avoid double counting, which might lend extra weight to those study results that had been included in more than one review. A summary of the main findings from the source studies can be found in the fuller version of this overview published online. Table S3 summarizes the EBP competency outcomes of healthcare professionals from the included reviews.

Overall Quality and Completeness of Reporting in the Included Systematic Reviews
The overall quality of the included reviews was appraised using guidance from the Cochrane Collaboration (Becker & Oxman, 2011). All of the reviews met the definition of systematic reviews as outlined for this overview. Interestingly, although two of the 11 included reviews were characterized as a “scoping review” or a “systematic scoping review,” they nevertheless included a critical appraisal of the methodological quality of their source studies, which reflects the wide variety of terms that are used, sometimes inconsistently, to describe the various types of reviews published in the international literature.
The critical appraisal of methodological quality conducted by the three reviewers with the RCA tool (OSUCN, 2017) revealed a broad range of strength of evidence among the included reviews. The benchmark for the strength of evidence indicating acceptable methodological quality was set at 34% (i.e., a total minimum score of at least 5 out of a total of 15 critical appraisal criteria fulfilled). All 11 included reviews met this minimum standard for acceptable scientific rigor, with 10 of the 11 reviews appraised at moderate quality. The median score (on a 0–15 scale) was 8 (moderate), with scores ranging from 5 to 10. Only one of the 11 included reviews barely attained a high score (i.e., a score of at least 10 out of the 15 appraisal criteria fulfilled).
The pronounced heterogeneity in the source studies of the included reviews in terms of their study designs, practice settings, outcome measures, outcomes of interest, and educational interventions, combined with poor and inconsistent reporting quality (e.g., not reporting source study settings) and missing or incomplete data (e.g., only one of the 11 reviews reported effect sizes for the source studies, and few reported p-values or confidence intervals), prompted the results of this overview to be summarized narratively. This also precluded any comparisons of EBP competencies across healthcare disciplines. In particular, there was considerable variation in the outcome measures used in the source studies of the reviews, including unpublished, not theoretically based, and not psychometrically tested instruments, which were inconsistently or incompletely described. Moreover, many assertions were made in the reporting of the source studies, but few were backed up by actual data in the reviews. Furthermore, although the educational interventions may have had a positive effect on EBP competencies, the impact of the improved EBP competencies on patient outcomes or practice changes remains unclear, as healthcare professionals’ improved EBP competencies may not necessarily have influenced practice in any way.
On the other hand, although the vast majority of the source studies in the included reviews used nonprobability sampling methods and cross-sectional survey, pretest–posttest intervention, or qualitative study designs, it is important to acknowledge that seven (64%) of the 11 reviews contained at least one RCT or cluster RCT as a source study. In total, the 11 reviews contained 33 RCTs or cluster RCTs as source studies, some of which were included in more than one review. These results are consistent with the findings of Young, Rohwer, Volmink, and Clarke (2014), who found that, despite the commonly held perception that experimental study designs such as RCTs are relatively rare in some healthcare disciplines, the reviews included in their overview nevertheless included a total of 25 RCTs. In summary, the overall quality and completeness of evidence in the included reviews of this overview was low to moderate at best, as the majority of the reviews did not contain a comprehensive literature search, report on both
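The double-counting check and the RCA score banding described above can be sketched as follows. Only the thresholds are taken from the text (scores from 0 to 15, with at least 5 counting as acceptable/moderate and at least 10 as high); the review labels and study identifiers in the example are invented purely for illustration.

```python
from collections import Counter
from typing import Dict, List, Tuple


def overlap_summary(sources_per_review: Dict[str, List[str]]) -> Tuple[int, int, int]:
    """Count total vs. unique source studies across reviews, and how many
    studies appear in more than one review (the double-counting risk)."""
    counts = Counter(study
                     for studies in sources_per_review.values()
                     for study in studies)
    total = sum(counts.values())                      # with double counting
    unique = len(counts)                              # separate studies
    shared = sum(1 for c in counts.values() if c > 1) # in >1 review
    return total, unique, shared


def quality_band(rca_score: int) -> str:
    """Band a total RCA score (0-15 criteria fulfilled):
    >= 10 is high, >= 5 is moderate (the overview's ~34% acceptability
    benchmark, i.e., 5 of 15 criteria), anything lower is low."""
    if rca_score >= 10:
        return "high"
    if rca_score >= 5:
        return "moderate"
    return "low"
```

On the overview's own figures, this kind of tally is what yields 204 total source studies resolving to 133 separate studies, 48 of them contained in more than one review, and a median RCA score of 8 falling in the moderate band.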
First, the strengths of this overview include a comprehensive and systematic literature search, with search terms modified appropriately for the various databases. In addition, we searched for ongoing systematic reviews prior to undertaking this overview, reference-chased the systematic reviews included in this overview, and hand-searched the tables of contents of the peer-reviewed scientific journals in which the majority of the systematic reviews on healthcare professionals’ EBP competencies had been published. As hand-searching the tables of contents did not result in additional searches, we believe that our search strategy would effectively capture most of the relevant systematic reviews published on this topic between January 2012 and July 2017. However, as in any review, it is possible that some relevant systematic reviews were not identified.
Second, three reviewers independently used a study design-specific critical appraisal tool to evaluate the methodological quality of each included review, with any discrepancies and differences discussed until a mutual agreement was formed, which increased the reliability of the data. In addition, all of the included reviews, originating from 10 different countries worldwide, had passed international peer review and had been published in high-quality scientific journals. As the majority (n = 6, 55%) of the included reviews originated from non-English-speaking countries representing six different languages, publication and language biases, although possible, are unlikely.
Third, self-reported assessments were used to measure healthcare professionals’ EBP competencies in all 11 included reviews (i.e., perceived EBP competencies were assessed, instead of using more objective measures of actual performance, such as EBP knowledge tests). Because of the lack of congruence between self-reported and more objectively measured knowledge and ability, especially when measuring complex tasks such as EBP implementation (Saunders, Vehviläinen-Julkunen, et al., 2016; Scurlock-Evans et al., 2014; Wonder et al., 2017), using self-reports may result in bias (through participants giving more socially acceptable responses than nonrespondents) and in overestimation of some EBP competencies, such as EBP knowledge, for which more objective measures are available.
Fourth, the search term “research utilization” was not used for our overview of systematic reviews, as the aim was to focus on the EBP competencies that practicing healthcare professionals need to successfully integrate translated best evidence into daily clinical practice. However, we acknowledge that it is not uncommon for research utilization to be used in studies as if it were an alternative term for EBP, and therefore, we are aware that some published systematic reviews may have been missed by our search. Fifth, the modest methodological quality of the identified systematic reviews and the relatively low quality of reporting of the results in the systematic reviews may have affected the results of this overview. Finally, effect sizes were reported in only one of the included systematic reviews. Therefore, the generalizability of the results is limited, and the results of this overview should be extrapolated with caution.

IMPLICATIONS FOR PRACTICE AND RESEARCH
Evidence-based practice competencies are essential for all practicing healthcare professionals in guiding their integration of best evidence into their clinical decision-making and thus enabling them to provide higher-quality care and produce better patient outcomes. However, as EBP is a shared competency and the steps of EBP implementation are universal, there is an urgent need for the collaborative development, implementation, and evaluation of an EBP competency set for all healthcare professionals (i.e., an interprofessional set of EBP competencies that can be used by all practicing healthcare professionals from any healthcare discipline). Recently, the first set of such interprofessional core competencies in EBP for all healthcare professionals was published as a consensus statement based on a systematic review and Delphi survey (Albarqouni et al., 2018); it contained 68 core competencies in EBP applicable to all healthcare professionals. This type of interprofessional core competency set should be the focus of future research studies, as the EBP competencies will guide the development of interprofessional EBP competency measures (via self-ratings or actual performance) as well as joint EBP curricula for practicing healthcare professionals; thus, their subsequent uptake, adoption, and use in clinical practice should be a high priority for all practicing healthcare professionals. In addition, addressing the widespread misconceptions and misunderstandings about the basic concepts of EBP that currently exist among large proportions of healthcare professionals is crucially important for increasing their engagement in EBP implementation and for attaining improved care quality and patient outcomes.
Nursing and some allied health disciplines, such as physical therapy and occupational therapy, have traditionally relied on measuring competencies through self-report assessments even when the constructs of interest, such as EBP knowledge, ability, or competence, could be assessed through more objective measures. Therefore, future studies should focus on developing and using actual (i.e., performance-based), validated outcome measures for EBP competencies, using rigorous study and review methodologies and robust reporting practices. Although EBP is a shared competency, implementation of EBP is a complex process requiring multifaceted educational interventions that contain interacting components; thus, it should be investigated whether differences in healthcare professionals’ primary roles, in educational backgrounds across disciplines, and in contextual factors may influence the effects of EBP educational interventions.
• Future research studies should also focus on developing and using actual (i.e., performance-based), validated outcome measures for assessing nurses’ EBP competencies, instead of continuing to evaluate perceived (i.e., self-rated) competencies via self-assessments, even when the constructs of interest, such as EBP knowledge and ability, could be assessed through more objective, performance-based measures.

REFERENCES
Dawes, M., Summerskill, W., Glasziou, P., Cartabellotta, A., Martin, J., Hopayian, K., … Osborne, J. (2005). Second International Conference of Evidence-Based Health Care Teachers and Developers: Sicily statement on evidence-based practice. BMC Medical Education, 5, 1. https://doi.org/10.1186/1472-6920-5-1
DiCenso, A., Cullum, N., & Ciliska, D. (1998). Implementing evidence-based nursing: Some misconceptions. Evidence-Based Nursing, 1, 38–40. https://doi.org/10.1136/ebn.1.2.38
Häggman-Laitila, A., Mattila, L.-R., & Melender, H.-L. (2016). Educational interventions on evidence-based nursing in clinical practice: A systematic review with qualitative analysis. Nurse Education Today, 43, 50–59. https://doi.org/10.1016/j.nedt.2016.04.023
Halm, M. A. (2018). Evaluating the impact of EBP education: Development of a modified Fresno test for acute care nursing. Worldviews on Evidence-Based Nursing, 15, 272–280. https://doi.org/10.1111/wvn.12291
Hecht, L., Buhse, S., & Meyer, G. (2016). Effectiveness of training in evidence-based medicine skills for healthcare professionals: A systematic review. BMC Medical Education, 16, 103. https://doi.org/10.1186/s12909-016-0616-2
Ilic, D., Nordin, R. B., Glasziou, P., Tilson, J. K., & Villanueva, E. (2013). Implementation of a blended learning approach to teach evidence-based practice: A protocol for a mixed-method study. BMC Medical Education, 13, 170. https://doi.org/10.1186/1472-6920-13-170
Laibhen-Parkes, N., Kimble, L. P., Melnyk, B. M., Sudia, T., & Codone, S. (2018). An adaptation of the original Fresno test to measure evidence-based competence of pediatric bedside nurses. Worldviews on Evidence-Based Nursing, 15(3), 230–240. https://doi.org/10.1111/wvn.12289
McCluskey, A., & Bishop, B. (2009). The adapted Fresno test of competence in evidence-based practice. Journal of Continuing Education in the Health Professions, 29(2), 119–126.
Melnyk, B. M., & Fineout-Overholt, E. (2007). The evidence-based practice beliefs scale. Gilbert, AZ: ARCC llc Publishing.
Melnyk, B. M., & Fineout-Overholt, E. (Eds.). (2011). Evidence-based practice in nursing and healthcare: A guide to best practice (2nd ed.). Philadelphia, PA: Lippincott, Williams & Wilkins.
Melnyk, B., Fineout-Overholt, E., Gallagher-Ford, L., & Kaplan, L. (2012). The state of evidence-based practice among U.S. nurses. Journal of Nursing Administration, 42, 410–417. https://doi.org/10.1097/NNA.0b013e3182664e0a
Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11(1), 5–15. https://doi.org/10.1111/wvn.12021
Melnyk, B. M., Gallagher-Ford, L., & Fineout-Overholt, E. (2016). Implementing the EBP competencies in healthcare: A practical guide for improving quality, safety, & outcomes. Indianapolis, IN: Sigma Theta Tau International.
Melnyk, B. M., Gallagher-Ford, L., Zellefrow, C., Tucker, S., Thomas, B., Sinnott, L. T., & Tan, A. (2018). The first U.S. study on nurses’ evidence-based practice competencies indicates major deficits that threaten healthcare quality, safety, and patient outcomes. Worldviews on Evidence-Based Nursing, 15(1), 16–25. https://doi.org/10.1111/wvn.12269
Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA statement. PLoS Medicine, 6(6), e1000097. https://doi.org/10.1371/journal.pmed.1000097
Mota da Silva, T., da Cunha Menezes Costa, L., Narciso Garcia, A., & Oliveira Pena Costa, L. (2015). What do physical therapists think about evidence-based practice? A systematic review. Manual Therapy, 20, 388–401. https://doi.org/10.1016/j.math.2014.10.009
Ohio State University College of Nursing. (2017). Rapid Critical Appraisal (RCA) tools for systematic reviews & meta-analyses of quantitative studies and literature reviews. Columbus, OH: The Helene Fuld National Institute for Evidence-Based Nursing & Healthcare.
Sackett, D., Rosenberg, W., Gray, J., Haynes, R., & Richardson, W. (1996). Evidence based medicine: What it is and what it isn’t. BMJ, 312, 71–72. https://doi.org/10.1136/bmj.312.7023.71
Saunders, H., Stevens, K. R., & Vehviläinen-Julkunen, K. (2016). Nurses’ readiness for evidence-based practice at Finnish university hospitals: A national survey. Journal of Advanced Nursing, 72, 1863–1874. https://doi.org/10.1111/jan.12963
Saunders, H., & Vehviläinen-Julkunen, K. (2015). The state of readiness for evidence-based practice among nurses: An integrative review. International Journal of Nursing Studies, 56, 128–140.
Saunders, H., Vehviläinen-Julkunen, K., & Stevens, K. R. (2016). Effectiveness of an education intervention to strengthen nurses’ readiness for evidence-based practice: A single-blind randomized controlled study. Applied Nursing Research, 31, 175–185. https://doi.org/10.1016/j.apnr.2016.03.004
Scurlock-Evans, L., Upton, P., & Upton, D. (2014). Evidence-based practice in physiotherapy: A systematic review of barriers, enablers and interventions. Physiotherapy, 100, 208–219. https://doi.org/10.1016/j.physio.2014.03.001
Spurlock, D., & Wonder, A. H. (2015). Validity and reliability evidence for a new measure: The evidence-based practice knowledge assessment in nursing. Journal of Nursing Education, 54, 605–613. https://doi.org/10.3928/01484834-20151016-01
Stevens, K. R. (2009). Essential evidence-based practice competencies in nursing (2nd ed.). San Antonio, TX: Academic Center for Evidence-Based Practice, University of Texas Health Science Center at San Antonio.
Tilson, J. K. (2010). Validation of the modified Fresno test: Assessing physical therapists’ evidence-based practice knowledge and skills. BMC Medical Education, 10, 38. https://doi.org/10.1186/1472-6920-10-38
Turnbull, C., Grimmer-Somers, K., Kumar, S., May, E., Law, D., & Ashworth, E. (2009). Allied, scientific and complementary health professionals: A new model for Australian allied health. Australian Health Review, 33(1), 27–37. https://doi.org/10.1071/AH090027
Ubbink, D. T., Guyatt, G. H., & Vermeulen, H. (2013). Framework of policy recommendations for implementation of evidence-based practice: A systematic scoping
SUPPORTING INFORMATION
Additional supporting information may be found in the online version of this article at the publisher’s web site:
Figure S1. The modified PRISMA Flow diagram (Moher et al., 2009): Identification, screening and selection of systematic
reviews for inclusion in the overview.
Table S1. Inclusion and Exclusion Criteria for the Overview of Systematic Reviews.
Table S2. Characteristics of Included Systematic Reviews in the Overview.
Table S3. Summary Table of EBP Outcomes in the Systematic Reviews Included in the Overview.