Assessing Methodological Validity of Epidemiologic Studies

Fessahaye Alemseged (MD, MPHE)

Session objectives

• Explain the purpose of evaluating evidence
• Discuss how to evaluate the methodological validity of epidemiologic studies

References
• CASP (Critical Appraisal Skills Programme) checklists, UK
• JBI (Joanna Briggs Institute) checklists, Australia
• STROBE reporting guidelines
• CONSORT reporting guidelines
• Gliner et al. Research Methods in Applied Settings, 2017.

Evidence
• Evidence is used in
– Policy making
– Program planning
– Service provision
– Teaching
– Research
• Types of evidence
– Expert opinion, qualitative study, quantitative study,
systematic review, meta-analysis

Evaluation of evidence
• Quality (fitness for purpose) of evidence is evaluated
based on its
– relevance, validity and usefulness
• Guidelines for evaluating evidence
– Appraisal guidelines: CASP, JBI, GRADE framework
– Reporting guidelines: STROBE, CONSORT

Relevance and applicability of evidence

• Relevance: does the study address the issue of interest?
– Identify the focus of the study and evaluate its appropriateness for the intended use
– E.g. disease burden, health risks, health needs, cost-effectiveness of interventions
• Applicability: can we intervene using the evidence and achieve an impact?
– Evaluate the strength of recommendations and their applicability, considering timeliness, context, cost and acceptability

Validity of evidence
• Validity: correctness of findings
• Appraising validity of research:
– Level of evidence: systematic reviews and meta-analyses are in general stronger than single studies
– Credibility of the report
– Methodological validity
• Considering susceptibility to sampling error, bias and confounding

Assessing Methodological Validity

• Assess the degree of error from different sources
• Key methods sections to be assessed
– design
– sampling
– measurement
– analysis
• What questions to raise in each section?
Is the design appropriate to the research objective?

• Identify the focus/objective of the study
• Assess
– appropriateness of the design to the objectives
– strengths and limitations of the design
Is the sample size adequate?

• Identify the sample size determination method and its assumptions
• Assess the adequacy of the sample size, considering
– appropriateness of assumptions
– non-response rate
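
A reviewer can recompute the reported sample size from the stated assumptions. Below is a minimal sketch, assuming the common single-proportion formula n = z²·p(1−p)/d² and inflating for an anticipated non-response rate (the numbers are illustrative, not from any particular study):

```python
from scipy.stats import norm

def single_proportion_n(p, d, confidence=0.95, nonresponse=0.10):
    """Sample size to estimate a single proportion.

    p: anticipated proportion (e.g. expected prevalence)
    d: desired margin of error (absolute precision)
    nonresponse: expected non-response rate used to inflate n
    """
    z = norm.ppf(1 - (1 - confidence) / 2)    # 1.96 for 95% confidence
    n = (z ** 2 * p * (1 - p)) / d ** 2       # crude sample size
    return int(round(n / (1 - nonresponse)))  # inflate for non-response

# Most conservative assumption p = 0.5, margin of error 5%, 10% non-response
print(single_proportion_n(0.5, 0.05))  # -> 427
```

If the paper's stated assumptions reproduce its reported n, the calculation is at least internally consistent.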
Is there selection bias?

• Assess the representativeness of the selected sample, considering
– Randomness of the sampling technique employed
– Inclusion and exclusion criteria
– Characteristics of non-respondents
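
One concrete check for selection bias is to compare available characteristics of respondents and non-respondents. A sketch with hypothetical counts, using a chi-square test of independence:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows = respondents / non-respondents,
# columns = urban / rural residence
table = [[320, 180],
         [ 40,  60]]

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.4f}")
# A small p-value means respondents and non-respondents differ on this
# characteristic, signalling the achieved sample may not be representative.
```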
Is there measurement error?

• Assess the validity of
– operational definitions
– data collection formats
– data collection instruments
– data collection procedures
– data entry and analysis
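
Measurement consistency can be quantified when two data collectors rate the same subjects. A sketch using Cohen's kappa (the ratings below are hypothetical):

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical classifications of the same 12 subjects by two data collectors
rater_a = ["case", "case", "control", "case", "control", "control",
           "case", "control", "case", "control", "case", "control"]
rater_b = ["case", "control", "control", "case", "control", "control",
           "case", "control", "case", "case", "case", "control"]

print(f"Cohen's kappa = {cohen_kappa_score(rater_a, rater_b):.2f}")
# Kappa near 1 indicates strong inter-rater agreement; values near 0
# indicate agreement no better than chance.
```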
Is there confounding?

• Assess susceptibility to confounding, considering
– whether prevention techniques are employed during design
– whether confounders are controlled during analysis
– presence of unmeasured confounders
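
A standard analytic check is to compare the crude measure of association with a confounder-adjusted one. A sketch with hypothetical stratified 2x2 tables, using Mantel-Haenszel pooling from the statsmodels package:

```python
import numpy as np
from statsmodels.stats.contingency_tables import StratifiedTable, Table2x2

# Hypothetical 2x2 tables (rows: exposed/unexposed, columns: diseased/healthy),
# stratified by a potential confounder
stratum1 = np.array([[50, 50], [10, 10]])   # confounder present
stratum2 = np.array([[10, 90], [50, 450]])  # confounder absent

crude = Table2x2(stratum1 + stratum2)
adjusted = StratifiedTable([stratum1, stratum2])

print(f"crude OR    = {crude.oddsratio:.2f}")            # ~3.29
print(f"adjusted OR = {adjusted.oddsratio_pooled:.2f}")  # ~1.00
# The crude and adjusted estimates diverge, so the crude association
# is largely explained by the confounder.
```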
Are the statistical analyses appropriate?

• Assess
– Appropriateness of the analytic methods
– Fulfillment of the requirements (assumptions) of the statistical tests
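
As one instance of checking test requirements: the chi-square test's usual rule of thumb is that all expected cell counts are at least 5; a 2x2 table that violates it can fall back to Fisher's exact test. A sketch with hypothetical counts:

```python
from scipy.stats import chi2_contingency, fisher_exact

table = [[8, 2],
         [1, 9]]  # hypothetical 2x2 table with small counts

chi2, p, dof, expected = chi2_contingency(table)
if (expected < 5).any():
    # Chi-square approximation is unreliable here; use an exact test instead
    odds_ratio, p = fisher_exact(table)
    print(f"Fisher's exact test: p = {p:.4f}")
else:
    print(f"Chi-square test: chi2 = {chi2:.2f}, p = {p:.4f}")
```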
Results and discussion
• Strength of associations and their statistical
significance
• Dose-response relationship
• Consistency with findings of other studies
• Plausibility of associations
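
To judge the strength of an association and its statistical significance, a reviewer can recompute the effect estimate with its confidence interval. A minimal sketch for an odds ratio with the log-based (Woolf) 95% CI, using hypothetical counts:

```python
import math

# Hypothetical 2x2 table: exposed cases/non-cases, unexposed cases/non-cases
a, b, c, d = 60, 40, 30, 70

or_ = (a * d) / (b * c)                # odds ratio
se = math.sqrt(1/a + 1/b + 1/c + 1/d)  # standard error of ln(OR)
lo = math.exp(math.log(or_) - 1.96 * se)
hi = math.exp(math.log(or_) + 1.96 * se)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f} to {hi:.2f}")  # OR = 3.50, CI 1.95 to 6.29
# A confidence interval excluding 1 indicates statistical significance.
```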

Judging the evidence
• Assess the strength of evidence, considering the degree of
– Sampling error
– Selection bias
– Measurement error and
– Confounding
• Assess practical significance considering
– strength of effect and prevalence of exposure
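
Practical significance depends on both effect size and how common the exposure is. A sketch using the standard population attributable fraction, PAF = p(RR − 1) / [1 + p(RR − 1)], with hypothetical numbers:

```python
def attributable_fraction(p, rr):
    """Population attributable fraction: the share of cases in the
    population attributable to the exposure.

    p: prevalence of exposure in the population
    rr: relative risk of disease given exposure
    """
    return p * (rr - 1) / (1 + p * (rr - 1))

# A modest effect on a common exposure can matter more in practice
# than a strong effect on a rare one:
print(f"{attributable_fraction(0.60, 1.5):.1%}")  # common exposure, RR 1.5 -> 23.1%
print(f"{attributable_fraction(0.01, 4.0):.1%}")  # rare exposure,  RR 4.0 -> 2.9%
```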

Critical appraisal assignment

• Focus of appraisal: methodological validity
• 4 groups
– Gp 1: cross-sectional study
– Gp 2: case-control study
– Gp 3: cohort study
– Gp 4: experimental study
