

PLOS Currents Disasters


Performance Assessment of Communicable Disease Surveillance in Disasters: A Systematic
Review
February 24, 2015 Research Article

Citation
Babaie J, Ardalan A, Vatandoost H, Goya MM, Akbarisari A. Performance Assessment of Communicable Disease Surveillance in Disasters: A Systematic Review. PLOS Currents Disasters. 2015 Feb 24. Edition 1. doi: 10.1371/currents.dis.c72864d9c7ee99ff8fbe9ea707fe4465.

Authors
Javad Babaie

Department of Disaster Public Health, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran; Department of Disaster and Emergency Health, National Institute of Health Research, Tehran University of Medical Sciences, Tehran, Iran.

Ali Ardalan

Department of Disaster Public Health, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran; Department of Disaster and Emergency Health, National Institute of Health Research, Tehran University of Medical Sciences, Tehran, Iran; Harvard Humanitarian Initiative, Harvard University, Cambridge, USA.

Hasan Vatandoost

Department of Medical Entomology and Vector Control, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran.

Mohammad Mehdi Goya

Centre for Communicable Disease Management, Ministry of Health and Medical Education, Tehran, Iran.

Ali Akbarisari

Department of Health Management and Economics, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran.

Abstract
Background: This study aimed to identify the indices and frameworks that have been used to assess the performance of communicable disease surveillance (CDS) in response to disasters and other emergencies,
including infectious disease outbreaks.

Method: In this systematic review, the PubMed, Google Scholar, Scopus, ScienceDirect, and ProQuest databases, together with grey literature, were searched for records published up to the end of 2013. All retrieved titles were examined against the inclusion criteria. Abstracts of the relevant titles were reviewed, and eligible abstracts were included in a list for data abstraction. Finally, the study variables were extracted.

Results: Sixteen articles and one book were found relevant to our study objectives. In these articles, 31 criteria and 35 indicators were used or suggested for the assessment/evaluation of the performance of surveillance systems in disasters. The Centers for Disease Control and Prevention (CDC) updated guidelines for the evaluation of public health surveillance systems were the most widely used framework.

Conclusion: Despite the importance of performance assessment in improving CDS in response to disasters, there is a lack of clear and accepted frameworks. There is also no agreement on the use of existing criteria and indices. The only relevant framework is the CDC guideline, which is a common framework for assessing public health surveillance systems as a whole. There is an urgent need to develop appropriate frameworks, criteria, and indices specifically for assessing the performance of CDS in response to disasters and other emergencies, including infectious disease outbreaks.

Key words: Disasters, Emergencies, Communicable Diseases, Surveillance System, Performance Assessment

Funding Statement
This study has been funded and supported by I.R. Iran's National Institute of Health Research (NIHR), Tehran University of Medical Sciences, Contract No. 241/M/91375. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.

Introduction
Disasters, whether natural or man-made, are common events worldwide 1. These events kill and injure people, destroy health facilities, and disrupt health systems and lifelines 2. Disasters displace populations and interrupt routine communicable disease management (CDM) programs, including surveillance systems and immunization programs 3. In the conditions that develop after disasters, populations are highly vulnerable to outbreaks of communicable diseases, and there are many examples of communicable disease outbreaks after disasters 4. These include cholera in Haiti after the 2010 earthquake, malaria after floods in Brazil, dengue fever after floods in the Dominican Republic, and acute diarrhea after the 2005 Pakistan earthquake 5. Accordingly, CDM has become one of the most important components of health care programs in disaster response management 4. The most urgent task in CDM is the establishment of a surveillance system (SS) for the timely detection of any increase in disease occurrence and the introduction of rapid control measures. Almost all health systems establish an SS in response to disasters 6,7.

The establishment of an effective SS in a disaster or emergency setting is a complex and difficult process requiring substantial resources, including personnel, equipment, and administrative facilities. To determine whether CDS meets its target goals, it is necessary to measure the performance of these SSs 8.

An appropriate assessment system, tailored to the characteristics of the disaster, is required for monitoring an SS. An effective assessment system should include appropriate indices and should be applied using sound methods. Such assessments can also improve the health system response 9: they identify shortcomings and provide guidance for successful future responses.

Management experts believe that what cannot be measured cannot be managed. Therefore, the measurement of performance is considered one of the most important components of efficacy and effectiveness 10. In
addition, according to current management knowledge, lack of an appropriate performance assessment is an important sign of weakness in a program or organization 9 .

With regard to the importance of performance assessment and its role in improving the performance of communicable disease SS in response to disasters, there are 2 main questions:

1. What kind of performance assessment frameworks, indices, and criteria currently exist for CDS systems in response to disasters and other emergencies, including infectious disease outbreaks?

2. What are the characteristics of these frameworks, indices, and criteria?

The aim of this systematic literature review was to find the answers to these questions.

Materials and Methods


Before starting this systematic review, a written review manual was developed covering the search strategy and the study inclusion and exclusion criteria. The review was then conducted according to this manual. We have reported our review according to the PRISMA guidelines.

Research questions

The review aimed to answer the following questions:

1. What kind of performance assessment frameworks currently exist for CDS systems in disasters and other emergencies, including infectious diseases outbreaks?

2. What criteria and indices are used in the performance assessment of CDS systems in response to disasters and emergencies?

3. What are the characteristics of the articles on the performance assessment of CDS systems in terms of article type, study approach (qualitative or quantitative), study setting, results, hazard type, geographical
location, and country or affiliation of first/corresponding authors?

Definitions

For the purpose of this study, a communicable disease is a disease caused by living (infectious) agents, or their products, that can be transmitted from one person to another (synonym: infectious disease) 11.
In the literature, surveillance is the ongoing, systematic collection, analysis, interpretation, and dissemination of data regarding communicable diseases for use in developing preventive actions to reduce morbidity and mortality and to improve health 12.
A disaster is a man-made or natural event that disrupts the functioning of the affected community and results in widespread losses that exceed the community's resources 13.
An emergency is a condition that needs urgent attention and could become a disaster if not managed effectively. In this study, both natural and man-made disasters and emergencies were included.
Smith defines performance assessment as a systematic process that seeks to monitor, evaluate, and communicate the extent to which various aspects of a system meet its key objectives 14.

Inclusion criteria

The following criteria were used to select relevant studies:

Articles that were published in peer-reviewed journals and had addressed the performance assessment of CDS in response to disasters and emergencies (as defined above).
Articles in any format including editorials, case reports, reviews, and original research.

Exclusion criteria

Search strategy (Data sources and literature search)

We searched 5 electronic databases: PubMed, Scopus, Google Scholar, ScienceDirect, and ProQuest. The databases were searched for articles published up to the end of 2013. In addition, grey literature[1] was searched through the New York Academy of Medicine Grey Literature Report 15. The websites of the CDC and WHO were searched for relevant guidelines. We also reviewed the references of retrieved studies to identify additional articles.

[1] Grey literature definition: That which is produced on all levels of government, academics, business and industry in print and electronic formats, but which is not controlled by commercial publishers.

We chose key terms and developed a search strategy based on the National Library of Medicine's Medical Subject Headings (MeSH).

The following search strategy was applied in the PubMed database: (disasters [Title/Abstract]) AND surveillance [Title/Abstract], (disasters [Title/Abstract]) AND communicable diseases [Title/Abstract], (emergencies
[Title/Abstract]) AND surveillance [Title/Abstract], (emergencies [Title/Abstract]) AND communicable diseases [Title/Abstract].

To search the other databases, the PubMed search strategy was adopted. We limited our search to titles and abstracts of articles.
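The four queries above were run in the databases' own search interfaces. Purely as an illustrative sketch, and not as part of the authors' method, the same PubMed queries could be reproduced programmatically through the NCBI E-utilities, for example with Biopython's Entrez module; the e-mail address below is a placeholder, and the date limit mirrors the review's end-of-2013 cutoff:

```python
from Bio import Entrez  # Biopython: pip install biopython

Entrez.email = "reviewer@example.org"  # placeholder; NCBI requests a contact address

queries = [
    "disasters[Title/Abstract] AND surveillance[Title/Abstract]",
    "disasters[Title/Abstract] AND communicable diseases[Title/Abstract]",
    "emergencies[Title/Abstract] AND surveillance[Title/Abstract]",
    "emergencies[Title/Abstract] AND communicable diseases[Title/Abstract]",
]

for query in queries:
    # Count PubMed records published up to the end of 2013, as in the review.
    handle = Entrez.esearch(db="pubmed", term=query, datetype="pdat",
                            mindate="1900/01/01", maxdate="2013/12/31", retmax=0)
    result = Entrez.read(handle)
    handle.close()
    print(f"{query} -> {result['Count']} records")
```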

Study screening process

First, the selected key words were entered into the database search boxes, and the search was limited to abstracts and titles. The results of the keyword search were reviewed by a member of the review team (JB). If a study met the inclusion criteria, it was included in the review. If there was any doubt about whether a study met the inclusion criteria, a decision was made by consensus of the review team. Articles unrelated to the aim of the present study were excluded. The remaining titles were entered into an Excel spreadsheet and sorted, and duplicates were excluded. Next, the abstracts of the related titles were screened for their precise relevance to the aims of the present study. If an abstract met the inclusion criteria, it was included in the review; abstracts that were not precisely relevant were excluded. The remaining papers were included in the review, and their full texts were downloaded from the databases. If an article was not available free of charge, we paid for access. Two papers had no full text accessible to us, and their study variables could not be extracted from the abstracts, so they were excluded from the review.
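The screening bookkeeping described above was done in an Excel spreadsheet. As a sketch only (the file name and column name are hypothetical), an equivalent sorting and de-duplication step in code would look like this:

```python
import pandas as pd

# Hypothetical export of the retrieved records, one row per title.
records = pd.read_csv("retrieved_titles.csv")

# Sort alphabetically and discard duplicate titles, mirroring the Excel step;
# case and surrounding whitespace are ignored when comparing titles.
records["title_key"] = records["title"].str.strip().str.lower()
unique_titles = (records.sort_values("title_key")
                        .drop_duplicates(subset="title_key")
                        .drop(columns="title_key"))

print(f"{len(records) - len(unique_titles)} duplicates removed; "
      f"{len(unique_titles)} titles go forward to abstract screening")
unique_titles.to_csv("titles_for_abstract_screening.csv", index=False)
```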

Data analysis

The papers included in the final list were evaluated by a member of the review team (AA) using a data abstraction sheet developed by the research team. This sheet included the study variables: name of the journal; name of the first author; number of authors; publication year; type of potential hazard; model/framework used for the CDS performance assessment; indices/criteria and tools used for the CDS performance assessment; study approach; and study location. In extracting the CDS performance assessment criteria, indices, and study approach, our first priority was the authors' own statements. If the criteria, indices, and study approach were stated in the article, they were entered in our data abstraction sheet; if not, the review team used a consensus approach to decide whether the data should be included.

Ethics and dissemination

Ethical approval was not required for this literature review.

Results
Literature search

The initial search strategy resulted in a total of 3928 articles/documents (3902 from database searching and 26 from grey literature and website searching). Of these, 3698 titles did not fulfill the inclusion criteria and were excluded, leaving 230 articles/documents that were considered potentially relevant. These papers/documents were entered into an Excel spreadsheet and sorted alphabetically. Duplicates (93 titles) were discarded. In the second phase, the abstracts of the remaining 137 articles/documents were examined. In this step, 114 irrelevant abstracts were excluded, and 23 papers/documents were considered for analysis. The full texts of two articles were not accessible and their abstracts were not informative enough, so they were excluded; four other potentially relevant documents were also not accessible. In total, 16 papers and one book were included in the final review list for data extraction. Figure 1 outlines the literature search and the study selection process.
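The selection arithmetic behind Figure 1 can be laid out explicitly as follows; all counts are those reported above, and the variable names are ours:

```python
# Records identified
database_hits = 3902
grey_literature_hits = 26
total_retrieved = database_hits + grey_literature_hits   # 3928

# Screening steps
potentially_relevant = total_retrieved - 3698             # 230 titles kept
unique_titles = potentially_relevant - 93                  # 137 after de-duplication
considered_for_analysis = unique_titles - 114              # 23 after abstract screening
included = considered_for_analysis - 2 - 4                 # 17 = 16 papers + 1 book

print(total_retrieved, potentially_relevant, unique_titles,
      considered_for_analysis, included)
```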

Fig. 1: Description of included papers


Sixteen papers and one book were included in the final review list. A total of 97 authors contributed to these 17 papers/documents, giving a mean of 5.7 authors per article (SD = 3.07). The first/corresponding authors were affiliated with the USA (n = 6, 37.5%), France (n = 2, 12.5%), the UK (n = 2, 12.5%), Australia (n = 2, 12.5%), Brazil (n = 1, 6.2%), the Netherlands (n = 1, 6.2%), and Turkey (n = 1, 6.2%). The 16 papers were published in 13 different peer-reviewed journals.

The earliest article was published in 2007; one article was published in each of 2007 and 2008. The number of published studies of CDS system performance assessment increased from 2009: for example, 3 articles were published in each of 2010 and 2011, and 5 were published in 2012. The 16 studies addressed 6 specific hazard types: epidemics (n = 4, 25%), hurricanes/cyclones (n = 2, 12.5%), heat waves (n = 2, 12.5%), mass gatherings (n = 2, 12.5%), complex emergencies (n = 1, 6.2%), and floods (n = 1, 6.2%); the remaining 4 articles (25%) and the book covered all hazards.

The studies used a quantitative approach (n = 10, 62.5%), a qualitative approach (n = 1, 6.2%), or a mixed approach (n = 3, 18.7%); two (12.5%) studies were reviews. Six studies (37.5%) were conducted in the USA, 2 (12.5%) in France, and 2 (12.5%) in Australia. One study each was conducted in Brazil, Chad, Poland, the UK, and Turkey. The location of 1 (6.2%) study was not determined.

Results of included studies

The 16 articles that were finally selected for review were divided into 5 groups according to the theme of the study. These 5 themes were: performance assessment of syndromic surveillance systems (31.2%);
mortality/morbidity SS (25.0%); public health/disease surveillance (12.5%); the applications of cost analysis, efficacy, effectiveness, and usefulness in performance assessment of SS (25.0%); and the review of
performance assessment indicators (6.2%).

The relevant book addresses communicable disease control in emergencies and has a specific section on CDS in disasters.

In the reviewed articles and book, there was no specific performance assessment framework for SS in disasters. However, the CDC updated guidelines for public health surveillance system evaluation 12 were used exclusively in 3 studies. In the performance assessment of mortality, morbidity, and CDS systems 16,17,18,19, the CDC guidelines were also used as part of the assessment 12. The CDC guidelines are based on 9 criteria: simplicity, flexibility, data quality, sensitivity, positive predictive value (PPV), timeliness, acceptability, representativeness, and stability.

Of the CDC public health surveillance evaluation attributes, the most widely applied was timeliness, which was used in 7 studies. Flexibility was used in 5 studies, data quality in 4, simplicity in 3, stability in 3, and
usefulness and representativeness in 2 studies. In all cases, timeliness, data quality, sensitivity, specificity, PPV, cost, and representativeness were calculated quantitatively. Flexibility, usefulness, simplicity, and
acceptability were calculated in a qualitative manner. Stability was calculated both quantitatively and qualitatively.
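For reference, the standard epidemiological definitions that underlie the quantitative attributes named above can be written in terms of true positives (TP), false positives (FP), true negatives (TN), and false negatives (FN); the reviewed studies operationalize these counts differently depending on the surveillance data source, so the expressions below are offered only as a conventional baseline:

\[
\text{Sensitivity} = \frac{TP}{TP + FN}, \qquad
\text{Specificity} = \frac{TN}{TN + FP}, \qquad
\text{PPV} = \frac{TP}{TP + FP}
\]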

In the reviewed book 20, 10 indicators were suggested for the performance assessment of SS, including zero reporting, completeness, timeliness, the number of cholera cases for which samples were confirmed by the laboratory, the number of malaria cases confirmed by blood smear, the date of onset of the first case, the date of reporting using the outbreak alert form, the date of investigation, and the date of response. These findings are presented in Table 1.

Overall, in the 16 articles and one book that were included, 31 criteria/measures and 35 indicators were used or suggested for the assessment/evaluation of the performance of CDS systems in response to disasters
and emergencies.


Table 1: Summary of the reviewed articles in terms of framework/method, indicators/criteria, and studied hazard for the performance assessment of communicable disease surveillance systems

Study | Framework/method | Indicator/Criteria | Hazard
Josseran L, Fouillet A, Caillère N, Brun-Ney D, Ilef D, Brucker G, Medeiros H, Astagneau P 23 | CDC's updated guidelines for surveillance system evaluation | Data quality, cost, flexibility, stability, timeliness, effectiveness, sensitivity, specificity, positive predictive value | Heat wave
Zielinski A 27 | Review study | Cost minimization, cost-effectiveness analysis, cost utility, cost benefit | Heat wave
Cinti S, Haas K, Paliani P, Newton D, Wright C, Zalewski C, et al. 28 | Comparing with data from regional and national surveillance reports | Percentage of visits by established SS and national surveillance, percentage of samples with positive results | Pandemics
Elliot AJ, Hughes HE, Hughes TC, Locker TE, Shannon T, Heyworth J, et al. 19 | CDC's updated guidelines for surveillance system evaluation (incomplete) | Sensitivity, specificity, timeliness, data quality | Mass gathering
Hope KG, Merritt TD, Durrheim DN, Massey PD, Kohlhagen JK, Todd K, D'Este, et al. 16 | CDC's updated guidelines for surveillance system evaluation (incomplete) | Usefulness, flexibility, acceptability | Mass gathering
Schnall AH, Wolkin AF, Noe R, Hausman LB, Wiersma P, Soetebier K, Cookson ST 29 | Comparing diagnoses recorded on the developed form with ED discharge diagnoses | Agreement between discharge diagnoses and the developed form | Natural hazards
Choudhary E, Zane DF, Beasley C, Jones R, Rey A, Noe RS, Martin C, Wolkin AF, Bayleyegn TM 17 | CDC's updated guidelines for surveillance system evaluation | Usefulness, simplicity, flexibility, data quality, acceptability, representativeness, timeliness, stability, sensitivity, positive predictive value | Hurricane
Teixeira MG, Costa MCN, Souza LPF, Nascimento EMR, Barreto ML, Barbosa N, et al. 30 | Comparing Brazil's public health SS with the International Health Regulations | Structure (legal framework; financial, human, and physical resources), surveillance procedures (capacity to detect, assess, notify), response (investigate, intervene, and communicate) | Public health emergencies (reemergence of infectious disease)
Farag NH, Rey A, Noe R, Bayleyegn T, Wood AD, Zane D 18 | CDC's updated guidelines for surveillance system evaluation | Simplicity, flexibility, acceptability, timeliness, stability, data quality, sensitivity, positive predictive value, representativeness | Hurricane
Bowden S, Braker K, Checchi F, Wong S 31 | | Simplicity, flexibility, appropriateness, timeliness, dissemination of data | Complex emergencies
Potter MA, Sweeney P, Luliano AD, Allswede MP 32 | Practice, process, outcomes (review of published records) | Process indicators (first clinical observation, accurate diagnosis, laboratory confirmation, identification of exposure source, report to public health authority, report to law enforcement authority, initiation of emergency operation plan, initiation of risk-mitigation activities, initiation of post-exposure prophylaxis, initiation of public health education activities, initiation of risk advice to health care workers, last reported new case); outcome indicators (primary cases, total cases, secondary cases, HCWs[1] infected) | Outbreaks
Josseran L, Caillere N, Brun-Ney D, Pottner J, Filleul L, Brucker G, et al. 33 | Comparing the frequency of visits by heat-related patients in on-alert periods (ONAP) with off-alert periods (OFAP) | Percentage of emergency department visits, percentage of hospitalizations, incidence of heat-related diseases | Heat wave
Stoto MA 34 | Triangulation approach | Validity, utility | Pandemics
Hope K, Merritt T, Eastwood K, Main K, Durrheim DN, Muscatello D, Todd K, Zheng W 35 | Comparing traditional SS with syndromic SS | Timeliness | Natural disasters
Rosenkötter N, Ziemann A, Riesgo LG, Gillet JB, Vergeiner G, Krafft T, Brand H 24 | CDC's updated guidelines for surveillance system evaluation (incomplete) | Validity, timeliness | Pandemics
Akgun D 36 | Case report | Water sanitation, immunization, organization of health services, public education | Flood
Conolly MA (Editor) 20 | | Zero reporting, completeness, timeliness, the number of cholera cases for which samples were confirmed by the laboratory, the number of malaria cases confirmed by blood smear, date of onset of the first case, date of reporting using the outbreak alert form, date of investigation, date of response | All hazards

[1] Health Care Workers (HCW)

Discussion
The aims of this research were to review and extract the frameworks and indices that have been used for assessing the performance of CDS systems in response to disasters and emergencies. An extensive
systematic literature review was conducted using 5 popular international databases. Our initial search resulted in 3902 titles. Finally, 17 studies/documents met the study inclusion criteria and were considered in the
data abstraction list.

The first disaster-related article was published in 1945, but the first article related to the performance assessment of CDS in disasters was not published until 2007. Surprisingly, there was a 62-year gap between the 2
publications.

Searching for the term disaster in PubMed resulted in more than 60,261 papers, but we could only find 16 papers about CDS systems performance assessment in response to disasters and emergencies. Therefore,
performance assessment in this field is a new research focus, although in recent years there has been a growing interest in CDS system performance assessment in response to disasters. With a recent increase in the
number of disasters, many countries have spent millions of dollars improving the response of their health care systems. Consequently, the assessment of the efficacy and effectiveness of the response to disasters has
become a focus of investigation, to ensure that resources are used in the most efficient and effective way 9,10,11,12,13,14,15,16 .

According to current management knowledge, lack of an appropriate performance assessment system is a symptom of weakness and disorder in an organization or program 9. For this reason, there has been growing interest in the performance assessment of organizations and programs in recent years 21. Although several performance assessment frameworks are documented in the literature, in our review we were unable to find any established framework for the assessment of this specific field (SS in disasters). The US-based CDC has developed guidelines for the evaluation of public health surveillance systems, but only 3 studies used those guidelines exclusively; a number of other studies used this framework in part 22. This suggests that there is no consensus on the application of the CDC guidelines for the performance assessment of CDS in disasters.

Furthermore, some economic assessment criteria have been suggested for use in the performance assessment of SS in disasters, but these criteria have not been used in practice.

The selection of appropriate criteria and indices for CDS assessment is essential to meaningful research. In the included studies, 31 criteria/indices were used, and the attributes suggested by the CDC were the most widely used criteria. Some studies used other criteria and indices in addition to those of the CDC 17,22,23. This is another indication that there is no clear agreement on the application of the CDC guidelines for the performance assessment of CDS in disasters.

Different definitions have been proposed for the CDC criteria, and different scientific methods (qualitative and quantitative) have been applied for measurements. For example, Josseran divided the timeliness criterion
into 4 steps and defined it as the time that it takes for data to be collected, processed, analyzed, and publicized 23 . Choudhary defined timeliness as the average time that a death is reported by a surveillance
system 17 , while Farag defined the same term as the speed of data transmission between surveillance system steps 18 . In the article by Josseran, sensitivity, specificity, and PPV were suggested as components of
the effectiveness of surveillance systems 23 , while in another study the same indices were used for validity 24 . In the study by Farag, validity was based on data quality 18 . These discrepancies show that there is a
lack of consistency in the assessment of this specific field.
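One way to make such stepwise definitions explicit is to write overall timeliness as the sum of the delays accumulated at each stage of the surveillance cycle; this formulation is ours, offered only as an illustration of Josseran's four-step reading, and is not a formula reported in the reviewed papers:

\[
T_{\text{total}} = t_{\text{collection}} + t_{\text{processing}} + t_{\text{analysis}} + t_{\text{dissemination}}
\]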

Despite the reasonably widespread use of the CDC guidelines for the performance assessment of systems responding to disasters and emergencies, experts have emphasized that there are no generally accepted metrics 3,9,25. Many scientists, universities, and research centers have identified this deficiency, and it has been a research priority for many years 1,2,26. However, there is still an urgent need for the development of universal frameworks, methodologies, metrics, and criteria for the performance assessment of CDS systems in response to disasters and emergencies.

Limitations

This review has some limitations. During the study period, the Web of Science (ISI) was not available to us and was therefore excluded from the search process. Studies were only included if their texts or abstracts were available in English. Therefore, there is a bias in the selection of studies. However, the database search was designed for high sensitivity, yielding 3928 titles.

Another limitation was the identification of criteria and indices. We addressed this problem by using a consensus approach between reviewers; nevertheless, some errors in the determination of criteria and indicators may remain.

Finally, there was limited access to the full texts of some papers, and four potentially relevant documents were not accessible. We were unable to extract the study variables from 2 articles, and those results were therefore not included in the review.

Conclusion
Performance assessment is an integral component of the management of all organizations and the lack of performance assessment is considered a significant sign of weakness in an organization. Therefore, the lack
of accepted mechanisms for the assessment of the performance of CDS systems in response to disasters is an important weakness. Although some studies attempted to assess surveillance systems, the results of this
systematic literature review suggest that there is no clear and comprehensive assessment framework in place. While the CDC framework has been used in some studies, it is not specific for SS in response to
disasters. Some criteria and indices were widely applied, but were not universally accepted, and researchers used different definitions and scientific approaches. Further research is required to identify and develop a
universally accepted framework for the assessment of CDS systems in response to disasters and emergencies.

Correspondence
Ali Ardalan, MD, PhD. Department of Disaster Public Health, School of Public Health, Tehran University of Medical Sciences, Tehran, Iran.

Email: aardalan@tums.ac.ir

Acknowledgements
This study is part of a PhD thesis supported by Tehran University of Medical Sciences.

Appendix 1

PRISMA Checklist

References
1. Liu FF, Teng YH, Lai CH. The disaster response performance of hospitals in Taiwan: Evaluation and classification. Qual Quant. 2011; 45(3):495-511.

2. Ardalan A, Mowafi H, Khoshsabeghe HY. Impacts of natural hazards on primary health care facilities of Iran: a 10-year retrospective survey. PLoS Curr. 2013; 5. doi:pii: ecurrents.dis.ccdbd870f5d1697e4edee5

3. Lazer JE, Cagliuso NV, Gebbie KM. Need for performance metrics in hospital emergency management. Disaster Med Public Health Prep. 2008; 3(1):1-5.

4. Chretich JP, Lewis SL, Burkon HS, Glass JS, Lombardo JS. Evaluating pandemic influenza surveillance and response systems in developing countries: Framework and pilot application. Advances in Disease Surveillance. 2007; 2:146.

5. Kouadio IK, Aljunid S, Kamigaki T, Hammad K, Oshitani H. Infectious diseases following natural disasters: prevention and control measures. Expert Rev Anti Infect Ther. 2012; 10(1):95-104.

6. Sabatinelli G, Kakar SR, Rahim Khan M, Malik M, et al. Early Warning Disease Surveillance after a Flood Emergency Pakistan 2010. MMWR. 2012; 61(49):1002-7.

7. Yan G, Mei X. Mobile device-based reporting system for Sichuan earthquake-affected areas infectious diseases reporting in China. Biomed Environ Sci. 2012; 25(6):724-9.

8. Tohma K, Suzuki A, Otani K, Okamoto M, et al. Monitoring of influenza virus in the aftermath of the Great East Japan earthquake. Jpn J Infect Dis. 2012; 65:542-4.

9. Sheikhzadeh R. Developing of performance evaluation and management model for health system of Iran. Management researches. 2010; 3(10):83-108.

10. Boatright CJ, Brewster PW. Public health and emergency management systems. In: Koenig KL, Schultz CH. Disaster medicine: Comprehensive principles and practices. Cambridge, Cambridge university press;
2010. Pp 133-51.

11. UCLA School of Public Health. Definitions. [Online]. 2014 [Cited 2014 February 22]. Available from: http://www.ph.ucla.edu/epi/bioter/anthapha_def_a.html

12. Centers for Diseases Control and Prevention (CDC). Updated guidelines for evaluating public health surveillance systems. MMWR. 2001; 50(RR-13):1-51.

13. International Strategy for Disaster Reduction (ISDR). Terminology on disaster risk reduction. 2009. [Online]. 2013 [Cited 2013 September 22]. Available from: http://www.unisdr.org/files/7817_UNISDR/TerminologyEnglish.pdf

14. Smith PC, Mossialos E, Papanicolas I. Performance measurement for health system improvement: experiences, challenges and prospects. [Online]. 2008 [Cited 2014 April 11]. Available from: http://www.euro.who.int/__data/assets/pdf_file/0003/84360/E93697.pdf

Q Rep. 2010; 34(3):310-8.

17. Choudhary E, Zane DF, Beasley C, Jones R, Rey A, Noe RS, Martin C, Wolkin AF, Bayleyegn TM. Evaluation of active mortality surveillance system data for monitoring hurricane-related deaths-Texas, 2008.
Prehosp Disaster Med. 2012; 27(4):392-7.

18. Farag NH, Rey A, Noe R, Bayleyegn T, Wood AD, Zane D. Evaluation of the American Red Cross Disaster-Related Mortality Surveillance System Using Hurricane Ike Data--Texas 2008. Disaster Med Public
Health Prep. 2012; 7(01):13-9.

19. Elliot AJ, Hughes HE, Hughes TC, Locker TE, Shannon T, Heyworth J, Wapling A, Catchpole M, Ibbotson S, McCloskey B, Smith GE. Establishing an emergency department syndromic surveillance system to support the London 2012 Olympic and Paralympic Games. Emerg Med J. 2012; 29(12):954-60.

20. Conolly MA. Communicable disease control in emergencies: A field manual. [Online]. 2004 [Cited 2014 November 8]. Available from: http://www.who.int/diseasecontrol_emergencies/publications/9241546166/en

21. Nilson H, Vikstrom T, Jonson CC. Performance indicators for initial regional medical response to major incidents: a possible quality control tool. Scand J Trauma Resusc Emerg Med. 2012; 20(81). doi:
10.1186/1757-7241-20-81

22. Teixeira MG, Costa MC, Souza LP, Nascimento EM, Barreto ML, Barbosa N, Carmo EH. Evaluation of Brazil's public health surveillance system within the context of the International Health Regulations (2005).
Rev Panam Salud Publica. 2012; 32(1):49-55.

23. Josseran L, Fouillet A, Caillère N, Brun-Ney D, Ilef D, Brucker G, Medeiros H, Astagneau P. Assessment of a syndromic surveillance system based on morbidity data: results from the Oscour network during a heat wave. PLoS One. 2010; 5(8):e11984. doi: 10.1371/journal.pone.0011984.

24. Rosenkötter N, Ziemann A, Riesgo LG, Gillet JB, Vergeiner G, Krafft T, Brand H. Validity and timeliness of syndromic influenza surveillance during the autumn/winter wave of A(H1N1) influenza 2009: results of emergency medical dispatch, ambulance and emergency department data from three European regions. BMC Public Health. 2013; 13:905. doi: 10.1186/1471-2458-13-905.

25. UNCHS. Guidelines for the evaluation of post disaster programs: A resource guide. 2012 [Online]. [Cited 23 January 2012]. Available at: www.unhabitat.org/content.asp?cid=1264&catid=286&typeid.

26. Altevogt BM, Pope AM, Hill MN, Shine KI. Research priorities in emergency preparedness and response for public health systems. 2008 [Online]. [Cited 05 April 2013]. Available from: http://www.nap.edu/catalog/12136.html

27. Zielinski A. Cost analysis of adjustments of the epidemiological surveillance system to mass gathering. Przeglad Epidemiologiczny. 2011; 65(1):5-8.

28. Cinti S, Haas K, Paliani P, Newton D, Wright C, Zalewski C, et al. Efficacy of surveillance for 2009 H1N1 pandemic within a health system. Infect Control Hosp Epidemiol. 2011; 32(8):811-4.

29. Schnall AH, Wolkin AF, Noe R, Hausman LB, Wiersma P, Soetebier K, Cookson ST. Evaluation of a standardized morbidity surveillance form for use during disasters caused by natural hazards. Prehosp Disaster Med. 2011; 26(2):90-8.

30. Teixeira MG, Costa MCN, Souza LPF, Nascimento EMR, Barreto ML, Barbosa N, et al. Evaluation of Brazil's public health surveillance system within the context of the International Health Regulations (2005). Rev Panam Salud Publica. 2012; 32(1):49-55.

31. Bowden S, Braker K, Checchi F, Wong S. Implementation and utilization of community-based mortality surveillance: a case study from Chad. Confl Health. 2012; 6(1). doi: 10.1186/1752-1505-6-11.

32. Potter MA, Sweeney P, Luliano AD, Allswede MP. Performance indicators for response to selected infectious disease outbreaks: a review of the published record. J Public Health Manag Pract. 2007; 13(5):510-8.

33. Josseran L, Caillere N, Brun-Ney D, Pottner J, Filleul L, Brucker G, et al. Syndromic surveillance and heat wave morbidity: a pilot study based on emergency departments in France. BMC Med Inform Decis Mak.
2009; 9(14):doi:10.1186/1472-6947-9-14.

34. Stoto MA. The effectiveness of U.S. public health surveillance systems for situational awareness during the 2009 H1N1 pandemic: A retrospective analysis. PLoS ONE. 2012; 7(8): e40984.
doi:10.1371/journal.pone.0040984.

35. Hope K, Merritt T, Eastwood K, Main K, Durrheim DN, Muscatello D, Todd K, Zheng W. The public health value of emergency department syndromic surveillance following a natural disaster. Commun Dis Intell Q
Rep. 2008; 32(1):92-4.

36. Akgun D. Evaluation of diseases surveillance studies after flood disaster in Batman province. Dicle Medical Journal. 2009; 36(1):1-7.

