
National Institutes of Health Stroke Scale Certification Is Reliable Across Multiple Venues

Patrick Lyden, MD; Rema Raman, PhD; Lin Liu, PhD; Marian Emr; Margo Warren; John Marler, MD

Background and Purpose—National Institutes of Health Stroke Scale certification is required for participation in modern
stroke clinical trials and as part of good clinical care in stroke centers. A new training and demonstration DVD was
produced to replace existing training and certification videotapes. Previously, this DVD, with 18 patients representing
all possible scores on 15 scale items, was shown to be reliable among expert users. The DVD is now the standard for National
Institutes of Health Stroke Scale training, but the videos have not been validated among general (ie, nonexpert) users.
Methods—We sought to measure interrater reliability of the certification DVD among general users using methodology
previously published for the DVD. All raters who used the DVD certification through the American Heart Association
web site were included in this study. Each rater evaluated one of 3 certification groups.
Results—Responses were received from 8214 raters overall, 7419 raters using the Internet and 795 raters using other
venues. Among raters from other venues, 33% of all responses came from registered nurses, 23% from emergency
department MD/other emergency department/other physicians, and 44% from neurologists. Half (51%) of raters were
previously National Institutes of Health Stroke Scale-certified and 93% were from the United States/Canada. Item
responses were tabulated, scoring performed as previously published, and agreement measured with unweighted kappa
coefficients for individual items and an intraclass correlation coefficient for the overall score. In addition, agreement in
this study was compared with the agreement obtained in the original DVD validation study to determine if there were
differences between novice and experienced users. Kappas ranged from 0.15 (ataxia) to 0.81 (Item 1c, Level of
Consciousness-commands [LOCC]). Of 15 items, 2 showed poor, 11 moderate, and 2 excellent agreement
based on kappa scores. Agreement was slightly lower than that obtained from expert users for LOCC, best gaze, visual
fields, facial weakness, motor left arm, motor right arm, and sensory loss. The intraclass correlation coefficient for total
score was 0.85 (95% CI, 0.72 to 0.90). Reliability scores were similar among specialists and there were no major
differences between nurses and physicians, although scores tended to be lower for neurologists and trended higher
among raters not previously certified. Scores were similar across various certification settings.
Conclusions—The data suggest that certification using the National Institute of Neurological Disorders and Stroke
DVDs is robust and surprisingly reliable for National Institutes of Health Stroke Scale certification across multiple
venues. (Stroke. 2009;40:2507-2511.)
Key Words: clinimetrics ■ reliability ■ scales ■ stroke

Neurologists who care for patients with stroke are required to certify in use of the National Institutes of Health Stroke Scale (NIHSS) now that Disease-Specific Specialty Designation as a Primary Stroke Center is available from the Joint Commission.1,2 The NIHSS is a widely used stroke deficit assessment tool used in nearly all large clinical stroke trials to document baseline and outcome severity.3–5 A training and certification process exists to assure that raters use the NIHSS in a uniform manner6,7; videotapes were used for training and certification from 1988 to 2006. To update the training and certification process, the National Institute of Neurological Disorders and Stroke produced a DVD in 2006 that is distributed widely by the American Academy of Neurology, the American Heart Association, and the National Stroke Association. Originally the DVD was validated in 3 select stroke centers to obtain a best-case impression of how the DVD patients should be scored among expert users.8 The DVD was designed, however, for a nonexpert single user to view at home or in an office, and the use among nonexperts has not been validated. In addition, the DVD certification in group settings is not validated. Also, scores may not be generally applicable when novice users view the training DVD and then attempt certification. Hence, we collected scores from

Received July 23, 2008; accepted August 8, 2008.


From the Departments of Neurosciences (P.L., R.R.) and Family and Preventive Medicine (R.R., L.L.), University of California–San Diego School of
Medicine, San Diego, Calif; the Department of Neurology (P.L.), Veterans Administration Medical Center, San Diego, Calif; and the National Institute of
Neurological Disorders and Stroke (M.E., M.W., J.M.), Bethesda, Md.
Correspondence to Patrick Lyden, UCSD Stroke Center, OPC Third Floor, Suite #3, 200 W Arbor Drive, San Diego, CA 92103. E-mail: plyden@ucsd.edu
© 2009 American Heart Association, Inc.
Stroke is available at http://stroke.ahajournals.org DOI: 10.1161/STROKEAHA.108.532069


single use, group use, and a web site to determine the reliability of the DVD certification outside of experienced centers and across multiple venues.

Methods
The training DVD includes 18 patients divided into 3 groups balanced for severity and stroke side. Raters were asked to certify using one of the 3 patient groups. Details on the DVD and the certification method have been described.8

We obtained certification scores from users in the following venues: single user (home or desktop), small groups, large groups, and a web site. Single users took the DVD home or to an office, watched the training video, and then watched the certification video cases. Small group certifications occurred at single sites where the training video was shown and then no more than 12 users watched the certification video and marked score sheets individually. Large group certifications occurred at meetings of trial investigators participating in a variety of clinical trials; the training video was shown and then certification patients were shown. In the large group settings, each user marked their own score sheet without discussion among other users. From all venues, score sheets were faxed to the University of California–San Diego Stroke Clinical Trial Coordinating Center for scoring using the published algorithm.7 The training/certification web site is sponsored by the American Heart Association. Users were encouraged to watch the training video over the Internet before certifying on one of the 3 certification groups; scores were recorded on the web site and then raw data were transmitted to the University of California–San Diego.

Descriptive analysis was performed on all data in the data set. The number of raters who certified using this DVD was tabulated by setting (individual, small group, investigator meeting, and web site) as well as specialty (RN, emergency department MD, neurology, other emergency department, other), prior certification status (yes, no), and country (US/Canada, others), if collected. Summaries of the individual item score as well as the total NIHSS were generated.

Reliability was assessed for the individual items of the NIHSS as well as the overall score. Scores of the individual items were tabulated. Agreement for the individual items among raters was assessed using the unweighted kappa statistic (κ) for multiple raters9 with a 95% CI obtained using the bootstrap resampling technique with 1000 replicates. The methods used here are similar to the methods used in the original DVD validation study to allow comparison between the 2 studies.8 In this study, the bootstrap technique was used instead of the jackknife technique because there are several instances when the jackknife technique was not appropriate.10 Agreement between this study and the original DVD study was considered to be statistically different if the estimated κ in the original study did not fall into the 95% CI for κ in this study. Using similar methods, reliability of the individual items was assessed separately for the subgroups of patients by setting as well as specialty, certification status, and country, if available. Comparison of κ statistics across subgroups was done using the bootstrap technique for correlated data.11 Ninety-five percent CIs for differences in κ between 2 subgroups were calculated. The Bonferroni correction was used to adjust for multiple comparisons within each subgroup comparison. In addition, the scatterplot of the item scores for each subject was used to visually compare and confirm the reliability graphically and the consistency of item score by group.

Agreement on the overall total NIHSS was assessed with an intraclass correlation coefficient (ICC) obtained using a one-way random effects model for repeated measurements with continuous outcomes (with ratings nested within patients).12 The bootstrap resampling technique was used to obtain 95% CIs for the ICC. There are 2 comparisons that are of interest in this study: (1) ICC in the current study with that obtained in the DVD validation study; and (2) ICC in this study among the subgroups. The first was assessed by determining if the 95% CI for the ICC in this study contained the ICC from the DVD validation study. If true, there was no evidence to indicate a difference in ICC between the 2 studies. ICCs in the present study were compared between subgroups for setting, specialty, prior certification status, and country by calculating the 95% CI for the difference in ICC for correlated data between 2 subgroups. If zero is included in the CI, there is no evidence to indicate a difference. To compare ICC among the 3 groups of patients (A, B, and C), the Fisher's Z transformation for comparison of independent ICCs was used. In both instances, the Bonferroni correction was applied to adjust for multiple comparisons. Similar to item score, the scatterplot of the total NIHSS for each subject was used to visualize the variability of scores by subgroups.

To assess the mean effect of the covariates on the total NIHSS, a random intercept mixed effects regression model was fit to the data.

Results
We received score sheets from 379 single users, 178 small group users, 238 large group users, and 7419 web users. Among the 49 284 expected responses (8214 × 6), we received 49 272 ratings (99.9% completion rate). Responses were received from 8214 individual raters (4796 raters scored patients in Group A, 2762 in Group B, and 656 in Group C) who each rated between 3 and 6 patients. As a result, each patient had somewhere between 655 and 4796 ratings (unequal cluster sizes). Among the raters who provided demographic information, 33% of all responses came from registered nurses, 23% from emergency department/other physicians, and 44% from neurologists. Most of the raters (93%) were from the United States and half of the raters (51%) were previously NIHSS-certified. Item responses were tabulated, scoring performed as described previously, and agreement measured with unweighted κ coefficients for individual items and an ICC for the overall score.

Table 1 indicates the range of values obtained on each item over all 18 patients. The mean NIHSS total score was 8.0 ± 6.6 (median, 7; range, 0 to 41). The spread of responses in individual items and total scores appeared similar among the subgroups, namely, sites, specialties, and prior NIHSS certification status.

Table 2 compares the agreement obtained using the unweighted κ from the current data set with that of the original DVD study.8 The agreements ranged from 0.15 (ataxia) to 0.81 (Item 1c, Level of Consciousness-commands [LOCC]) using the current data set. The agreements obtained from this group of raters were similar to those of the original DVD study on all items of the NIHSS except for 7 items with lower agreement (LOCC, best gaze, visual fields, facial weakness, motor left arm, motor right arm, and sensory loss).

Among all 18 certification patients, the agreement was similar across all subgroups and among all venues. Results were remarkably similar to the results in the original DVD validation study except for some small inconsistent differences across certain subgroups (data not shown). Agreement in 4 fields (LOCQ, LOCC, visual fields, and motor left leg) was higher in other countries compared with the United States/Canada. Among specialties, emergency department MDs had higher agreement in motor right leg compared with nurses; in LOCC, motor right leg, and sensory loss compared with neurologists; and in motor left leg and motor right leg compared with other specialties; nurses showed greater agreement in dysarthria compared with neurologists and in motor left arm and motor left leg when compared with other specialties. Agreement in LOCQ was higher in noncertified

Table 1. Distribution of Responses by NIHSS Item*

Item | Total Responses on This Item | 0 | 1 | 2 | 3 | 4
1a LOC | 49 272 | 43 564 (88) | 3627 (7.4) | 1210 (2.5) | 871 (1.8) | —
1b LOC questions | 49 272 | 28 395 (58) | 12 699 (26) | 8178 (17) | — | —
1c LOC command | 49 272 | 45 815 (93) | 869 (1.8) | 2588 (5.3) | — | —
2 Gaze | 49 271 | 43 908 (89) | 3225 (6.5) | 2138 (4.3) | — | —
3 Visual fields | 49 269 | 39 378 (80) | 4845 (9.8) | 4836 (9.8) | 210 (0.4) | —
4 Facial weakness | 49 272 | 23 114 (47) | 19 263 (39) | 5439 (11) | 1456 (3) | —
5a Motor left arm | 49 258 | 34 310 (70) | 4958 (10) | 3770 (7.7) | 2695 (5.5) | 3525 (7.2)
5b Motor right arm | 49 261 | 36 828 (75) | 6574 (13) | 236 (0.5) | 699 (1.4) | 4924 (10)
6a Motor left leg | 49 261 | 27 665 (56) | 13 477 (27) | 5007 (10) | 3063 (6.2) | 49 (0.1)
6b Motor right leg | 49 264 | 27 086 (55) | 10 952 (22) | 6859 (14) | 1637 (3.3) | 2730 (5.5)
7 Ataxia | 49 243 | 29 828 (61) | 12 715 (26) | 6700 (14) | — | —
8 Sensory | 49 264 | 4645 (9.4) | 38 305 (78) | 6314 (13) | — | —
9 Aphasia | 49 264 | 27 877 (57) | 12 752 (26) | 6148 (13) | 2487 (5) | —
10 Dysarthria | 49 256 | 27 222 (55) | 17 188 (35) | 4846 (10) | — | —
11 Extinction | 49 256 | 33 012 (67) | 10 193 (21) | 6051 (12) | — | —

LOC indicates level of consciousness.
*The 15 items of the NIHSS and the level of responses (N [%]) to each item by 8214 raters. The unequal total responses to items are due to missing values, and the percentages do not add to 100 due to rounding. Some items have 3 and others 4 or 5 possible responses; "—" marks responses that are not possible.
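The per-item agreement reported in this study uses the unweighted kappa for multiple raters described in the Methods, with a percentile bootstrap CI over subjects. The sketch below is a minimal illustration of that computation under simplifying assumptions of our own (equal raters per subject and toy data; the published analysis used 1000 replicates and handled unequal cluster sizes):

```python
import random
from collections import Counter

def fleiss_kappa(ratings):
    """Unweighted kappa for multiple raters (Fleiss-style).

    ratings: one list per subject, each holding the category label
    assigned by every rater; assumes equal raters per subject.
    """
    n = len(ratings)             # subjects
    m = len(ratings[0])          # raters per subject
    cats = sorted({c for row in ratings for c in row})
    # counts[i][j] = number of raters placing subject i in category j
    counts = [[Counter(row)[c] for c in cats] for row in ratings]
    # observed agreement, averaged over subjects
    p_bar = sum((sum(x * x for x in row) - m) / (m * (m - 1))
                for row in counts) / n
    # chance agreement from the marginal category proportions
    totals = [sum(row[j] for row in counts) for j in range(len(cats))]
    p_e = sum((t / (n * m)) ** 2 for t in totals)
    return 1.0 if p_e == 1 else (p_bar - p_e) / (1 - p_e)

def bootstrap_ci(ratings, reps=1000, alpha=0.05, seed=0):
    """Percentile bootstrap CI for kappa, resampling whole subjects."""
    rng = random.Random(seed)
    stats = sorted(fleiss_kappa([rng.choice(ratings) for _ in ratings])
                   for _ in range(reps))
    return stats[int(reps * alpha / 2)], stats[int(reps * (1 - alpha / 2)) - 1]
```

Perfect agreement on every subject yields κ = 1; agreement no better than chance yields κ near 0, the scale against which the poor-to-excellent item ranges in this study are judged.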

raters than that in certified raters. Comparing venues, individual users showed higher agreement in extinction/neglect compared with the large group setting and higher agreement in visual fields and motor left arm compared with web users; in the large group setting, scores showed lower agreement in extinction/neglect compared with the web setting; the small group setting showed higher agreement in motor left arm than web users. There is no significant difference in agreement across the 3 certification groups.

Table 3 lists the intraclass correlation coefficient for the overall total NIHSS score and total NIHSS by subgroup. There continues to be very good agreement in the total NIHSS score across all venues and subgroups (overall ICC of 0.85; 95% CI, 0.72 to 0.90). There are no statistically

Table 2. Interobserver Agreement for NIHSS Items*

Scale Item | Original DVD | Current DVD, Overall | Individual | Small Group | Investigator Meeting | Web
1a LOC | 0.46 (0.39–0.53) | 0.43 (0.01–0.51) | 0.62 (0–0.69) | 0.52 (0.31–0.70) | 0.43 (0.04–0.50) | 0.43 (0.01–0.51)
1b LOC questions | 0.77 (0.64–0.90) | 0.77 (0.66–0.84) | 0.85 (0.69–0.91) | 0.73 (0.49–0.86) | 0.70 (0.31–0.92) | 0.77 (0.66–0.84)
1c LOC command | 0.92 (0.75–1.0) | 0.81 (0–0.86) | 0.92 (0–0.98) | 0.78 (0–1.00) | 0.93 (0–0.98) | 0.79 (0–0.85)
2 Gaze | 0.70 (0.39–1.0) | 0.45 (0.03–0.63) | 0.51 (0.04–0.73) | 0.48 (0.04–0.69) | 0.72 (0.04–0.83) | 0.44 (0.03–0.63)
3 Visual fields | 0.72 (0.57–0.87) | 0.57 (0.27–0.62) | 0.71 (0.45–0.78) | 0.71 (0.28–0.90) | 0.69 (0.27–0.80) | 0.56 (0.20–0.61)
4 Facial weakness | 0.38 (0.27–0.49) | 0.25 (0.14–0.32) | 0.29 (0.16–0.38) | 0.29 (0.17–0.35) | 0.18 (0.07–0.26) | 0.25 (0.14–0.32)
5a Motor left arm | 0.65 (0.51–0.79) | 0.52 (0.21–0.62) | 0.63 (0.32–0.75) | 0.62 (0.36–0.75) | 0.69 (0.20–0.77) | 0.51 (0.18–0.61)
5b Motor right arm | 0.72 (0.54–0.90) | 0.51 (0.28–0.65) | 0.52 (0.29–0.76) | 0.52 (0.30–0.65) | 0.61 (0.23–0.74) | 0.51 (0.26–0.65)
6a Motor left leg | 0.64 (0.51–0.77) | 0.66 (0.64–0.73) | 0.70 (0.53–0.80) | 0.67 (0.51–0.79) | 0.56 (0.19–0.69) | 0.66 (0.53–0.74)
6b Motor right leg | 0.64 (0.53–0.75) | 0.59 (0.49–0.64) | 0.63 (0.51–0.70) | 0.56 (0.39–0.67) | 0.56 (0.28–0.69) | 0.59 (0.49–0.64)
7 Ataxia | 0.21 (0.12–0.30) | 0.15 (0.06–0.22) | 0.18 (0.06–0.27) | 0.32 (0.10–0.49) | 0.17 (0.06–0.25) | 0.15 (0.06–0.22)
8 Sensory | 0.73 (0.53–0.93) | 0.54 (0.17–0.68) | 0.62 (0.30–0.79) | 0.60 (0.25–0.80) | 0.65 (0.28–0.89) | 0.53 (0.16–0.69)
9 Aphasia | 0.64 (0.53–0.75) | 0.58 (0.37–0.71) | 0.58 (0.30–0.77) | 0.58 (0.32–0.74) | 0.60 (0.23–0.71) | 0.59 (0.36–0.72)
10 Dysarthria | 0.56 (0.39–0.73) | 0.46 (0.28–0.58) | 0.43 (0.28–0.54) | 0.37 (0.20–0.48) | 0.56 (0.22–0.68) | 0.46 (0.27–0.59)
11 Extinction | 0.57 (0.40–0.74) | 0.60 (0.49–0.64) | 0.56 (0.43–0.67) | 0.55 (0.38–0.66) | 0.38 (0.01–0.42) | 0.61 (0.49–0.65)

LOC indicates level of consciousness.
*The agreement (unweighted κ and 95% CI) among all raters for 15 NIHSS items using the new training and certification DVD. For comparison, the agreement on individual items in the original DVD validation study8 is given. To assess the effect of subgroups, we used a pairwise comparison with a Bonferroni adjustment to account for the multiple comparisons.
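Total-score reliability in this study is an ICC from a one-way random-effects model with ratings nested within patients and unequal cluster sizes (Methods; results in Table 3). As a hedged sketch of that statistic, the following computes ICC(1) from the one-way ANOVA mean squares with the standard adjusted average cluster size k0; the function name and data are illustrative, not taken from the study:

```python
def icc_oneway(groups):
    """ICC(1): one-way random-effects model, ratings nested in patients.

    groups[i] holds every rating given to patient i; cluster sizes
    may differ, as they did in this study.
    """
    n = len(groups)                        # patients
    sizes = [len(g) for g in groups]
    N = sum(sizes)                         # total ratings
    grand = sum(sum(g) for g in groups) / N
    means = [sum(g) / len(g) for g in groups]
    # between- and within-patient mean squares from one-way ANOVA
    msb = sum(k * (m - grand) ** 2 for k, m in zip(sizes, means)) / (n - 1)
    msw = sum(sum((x - m) ** 2 for x in g)
              for g, m in zip(groups, means)) / (N - n)
    # average cluster size adjusted for imbalance (k0)
    k0 = (N - sum(k * k for k in sizes) / N) / (n - 1)
    return (msb - msw) / (msb + (k0 - 1) * msw)
```

When every rater gives a patient the same total score, the within-patient mean square is 0 and the ICC is 1; values near the 0.85 reported here indicate strong but imperfect agreement on the total score.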


significant differences in mean NIHSS scores by country and prior NIHSS certification status. There was a statistically significant interaction between specialty and setting in mean NIHSS scores (P=0.046); however, there were no clinically significant differences. Although there were slight differences in ICC across covariates, in all cases, the agreement still remained very high. Agreement was lower among raters from the United States/Canada compared with the raters from other countries. The ICC was slightly lower among neurologists compared with the nurses, emergency department MDs, other MDs, and other physicians. Similarly, the raters with prior certification had slightly lower agreement than those who were not certified previously. The ICC was slightly lower in the case of the small group setting as compared with individual, investigator meeting, or web users. The ICCs for certification Groups A and B were slightly lower than Group C.

Table 3. ICC for NIHSS Total Score*

Subgroup | No. of Raters | ICC | 95% CI
Overall | 49 200 | 0.85 | (0.72–0.90)
Country: USA/Canada | 4416 | 0.86 | (0.72–0.90)
Country: Others | 311 | 0.94 | (0.81–0.97)
Specialty: Nurse | 1533 | 0.94 | (0.80–0.97)
Specialty: ED MD | 184 | 0.96 | (0.84–0.99)
Specialty: Neurology | 2085 | 0.79 | (0.63–0.89)
Specialty: Other MD | 364 | 0.92 | (0.64–0.98)
Specialty: Other specialties | 561 | 0.92 | (0.70–0.97)
Certification: Yes | 2416 | 0.82 | (0.70–0.90)
Certification: No | 1414 | 0.94 | (0.80–0.97)
Setting: Individual | 2251 | 0.94 | (0.79–0.97)
Setting: Small group | 1053 | 0.71 | (0.53–0.91)
Setting: Investigator meeting | 1423 | 0.87 | (0.65–0.93)
Setting: Web | 44 473 | 0.85 | (0.72–0.90)
Group: A | 28 722 | 0.83 | (0.50–0.89)
Group: B | 16 562 | 0.84 | (0.21–0.86)
Group: C | 3916 | 0.93 | (0.40–0.96)

*The ICCs for total score overall and by country, specialty, certification status, setting, and group. To assess the effect of subgroups, we used a pairwise comparison with a Bonferroni adjustment to account for the multiple comparisons. The ICCs by country, specialty, and certification did not include web data because there is no related information available.

Discussion
Our data show that NIHSS training and certification using the DVD is valid and reliable among general users. The certification process showed remarkable consistency across widely differing venues, including single users, small groups, large groups, and certification data from the American Heart Association web site. The individuals in this study included novice users, who viewed the training video and then attempted certification, as well as previously certified users. The reliability assessments of this certification DVD among these novice users were similar to what was found using the experienced stroke centers, indicating that the DVD is a surprisingly valid and reliable replacement for the previous videotapes. The agreement among the items was similar whether it was used by a single user or in a group setting.

We found no differences in the ICC of the total NIHSS when the DVD was used by neurologists, emergency department physicians, and nurses, suggesting that the NIHSS may be appropriate for use in clinical research trials as well as in daily communication among healthcare providers. Agreement among those identifying themselves as neurologists was slightly lower than among individuals identifying themselves as registered nurses, emergency department/other MDs, or other specialties, but the results were statistically similar and generally excellent. Agreement across various settings was similar and generally moderate to excellent.

The DVD format has some advantages over videotape. The digital images can be loaded onto a web site, and the American Heart Association successfully implemented a web-based training campus using our images. This web site allows raters to view the training and certification patient videos online. The DVD technology is more widely available now than videotapes, so NIHSS certification should be possible for many more years, even if videotapes become obsolete.

This study contains certain limitations, the most important of which is that most of the raters were from the United States and Canada. We were able to determine that the scoring sheet works well for novice as well as experienced users in North America. However, these scores may not be generally applicable for non-English speakers or raters in other countries. Therefore, we continue to collect scores from the web site to determine if the same scoring sheet generally works well outside of North America. Another inherent limitation is that video technology is a poor substitute for direct examination. In the absence of widespread proctored certification, however, no other option is available. Video certification is now widely used in many disciplines with reasonable validity and reliability.2 It is likely that web-based video training and certification will become more widespread, because the cost efficiencies are significant. Finally, the web site does not require viewing of the training video before attempted certification, so an unknown number of novice users could have tried to certify without proper training.

Due to the unbalanced group sizes, small cells for item scores, and a crossed study design, we did not use weighted κ statistics. Unweighted κ scores may underestimate agreement, yet in this study, the unweighted κ scores were comparable to the unweighted scores obtained in the primary DVD study and the weighted scores obtained in previous videotape studies. Therefore, the agreement among the viewers was at least as good and likely better than that seen previously with the videotapes. Agreement using the DVD continues to be surprisingly good and consistent among experienced as well as novice users.

Acknowledgments
We acknowledge the diligent effort and expertise of Ms Alyssa Chardi and Karen Rapp, RN.


Sources of Funding
This work was supported by National Institute of Neurological Disorders and Stroke P50 NS044148 and the Veterans Affairs Medical Research Service.

Disclosures
None.

References
1. Alberts MJ, Hademenos G, Latchaw RE, Jagoda A, Marler J, Mayberg MR, Starke RD, Todd HW, Viste KM, Girgus M, Shephard T, Emr M, Shwayder P, Walker MD. Recommendations for the establishment of primary stroke centers. Brain Attack Coalition. JAMA. 2000;283:3102.
2. Mohammad YM, Divani AA, Jradi H, Hussein HM, Hoonjan A, Qureshi AI. Primary stroke center: basic components and recommendations. South Med J. 2006;99:749–752.
3. Lyden P, Lu M, Jackson C, Marler J, Kothari R, Brott T, Zivin J. Underlying structure of the National Institutes of Health Stroke Scale: results of a factor analysis. NINDS tPA Stroke Trial Investigators. Stroke. 1999;30:2347–2354.
4. Goldstein LB, Samsa GP. Reliability of the National Institutes of Health Stroke Scale. Stroke. 1997;28:307.
5. Goldstein LB, Bartels C, Davis JN. Interrater reliability of the NIH Stroke Scale. Arch Neurol. 1989;46:660.
6. Albanese MA, Clarke WR, Adams HP Jr, Woolson RF. Ensuring reliability of outcome measures on multicenter clinical trials of treatments for acute ischemic stroke: the program developed for the Trial of ORG 10172 in Acute Stroke Treatment (TOAST). Stroke. 1994;25:1746.
7. Lyden P, Brott T, Tilley B, Welch KM, Mascha EJ, Levine S, Haley EC, Grotta J, Marler J. Improved reliability of the NIH Stroke Scale using video training. NINDS tPA Stroke Study Group. Stroke. 1994;25:2220–2226.
8. Lyden P, Raman R, Liu L, Grotta J, Broderick J, Olson S, Shaw S, Spilker J, Meyer B, Emr M, Warren M, Marler J. NIHSS training and certification using a new digital video disk is reliable. Stroke. 2005;36:2446–2449.
9. Fleiss JL. Statistical Methods for Rates and Proportions. New York: John Wiley and Sons; 1981.
10. Efron B, Tibshirani RJ. An Introduction to the Bootstrap. New York: Chapman & Hall/CRC; 1993.
11. McKenzie DP, Mackinnon AJ, Peladeau N, Onghena P, Bruce PC, Clarke DM, Harrigan S, McGorry PD. Comparing correlated kappas by resampling: is one level of agreement significantly different from another? J Psychiatr Res. 1996;30:483.
12. Zar JH. Biostatistical Analysis. 4th ed. Princeton, NJ: Prentice Hall; 1999:390–392.

National Institutes of Health Stroke Scale Certification Is Reliable Across Multiple Venues
Patrick Lyden, Rema Raman, Lin Liu, Marian Emr, Margo Warren and John Marler

Stroke. 2009;40:2507-2511; originally published online June 11, 2009; doi: 10.1161/STROKEAHA.108.532069
Stroke is published by the American Heart Association, 7272 Greenville Avenue, Dallas, TX 75231
Copyright © 2009 American Heart Association, Inc. All rights reserved.
Print ISSN: 0039-2499. Online ISSN: 1524-4628

The online version of this article, along with updated information and services, is located on the
World Wide Web at:
http://stroke.ahajournals.org/content/40/7/2507
