
Use of Models in Risk Assessment & Regulatory Decision Making
Elsie Sunderland
Harvard School of Public Health
Department of Environmental Health
www.epa.gov/crem

“Build the black box out of plexiglass.”
Models in Environmental Regulatory Decision Making
http://books.nap.edu/catalog.php?record_id=11972

• 2007: NRC Recommendations for Best Practices for Model Development, Application and Evaluation
• 2009: EPA Final Guidance on Environmental Models
EPA Charge to NRC
A National Research Council committee will provide advice
concerning the development of guidelines and a vision for the
selection and use of models at EPA. The committee will consider
cross-discipline issues related to model use, performance
evaluation, peer review, uncertainty, and quality
assurance/quality control. The objective …. [is] to provide a
report that will serve as a fundamental guide for the selection
and use of models in the regulatory process. As part of its work,
the committee will need to carefully consider the realities of
EPA's regulatory mission so as to avoid an overly prescriptive and
stringent set of guidelines.
Today’s Discussion
• Politics of modeling at EPA
• Scientific vs. policy uncertainty
• Model evaluation: What does this encompass?
• How to choose a good model?
• Future directions
Challenge: Regulatory Decision-Making Despite Model Uncertainty
“Consider…counting six million of anything. …no court of law can determine, with sufficient accuracy, what the true count actually was.”
— Op-Ed by mathematician John Paulos on the Bush–Gore elections, NY Times, Nov. 22, 2000
Challenge: Regulatory Decision-Making Despite Model Uncertainty
“The (model) …produces much larger errors than … the natural noise of the data. That is a simple test of whether or not a model is valid…and both of those models fail.”
— Patrick Michaels, Professor of Environmental Sciences at the University of Virginia, quoted in Petition to Cease Dissemination of the National Assessment on Climate Change, submitted February 2003
WHAT IS A MODEL?
A system of postulates, data, and inferences presented as a mathematical description of an entity or state of affairs.
— Webster’s New Collegiate Dictionary, 1981
What is a Regulatory Model?
“A computational model used to inform the environmental regulatory process. The approaches can range from single parameter linear relationship models to models with thousands of separate components and many billions of calculations.”
Source: NRC, 2007. “Models in Environmental Regulatory Decision-Making”


Measurement Models
Statistical or data-based models (empirical)
[Figure: dose–response curve (response vs. dose). Image by MIT OpenCourseWare.]


Process Models
Process-based or “mechanistic” models, e.g. the HYSPLIT model for air quality
© source unknown. All rights reserved. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse.
A System of Models and Data: Example Air Quality Models
Inputs (each supplied as data, a model, or both):
• Meteorology
• Source emissions
• Initial and boundary conditions
• Land use and geospatial
These feed the AIR QUALITY MODEL (processes: transport, diffusion, deposition, gas-phase chemistry, aqueous chemistry, heterogeneous chemistry, aerosol dynamics and thermodynamics, radiation/photolysis).
Transportation Models Used by DOT: Combined Empirical/Mechanistic Models
Inputs: demographic data, land use data, transportation network
Model chain: trip generation → trip distribution → mode split → trip assignment, with feedback between steps
Outputs feed emissions models and alternatives evaluation.
Third Example of Empirical/Mechanistic Models: IEUBK Model for Blood Lead in Children
[Figure: IEUBK biokinetic compartment diagram. Intake routes (inhalation of air Pb; ingestion of dust Pb, soil Pb, water Pb, diet Pb, and other Pb) enter via the lung and gut; absorbed lead moves through plasma + extracellular fluid to red blood cells, cortical and trabecular bone, liver, kidney, and other soft tissues; elimination occurs via urine, feces, sweat, hair, and nails. Stages: environment → exposure/intake → absorption/uptake → blood and tissues (biokinetics) → elimination → blood lead. Image by MIT OpenCourseWare.]
Why Use Models?
• Incomplete measurement data (cannot measure all things and all systems)
[Figure: natural ecosystems are represented by mechanistic models; observations (sample data) support empirical models. Image courtesy of laszlonian on Flickr.]
Why Use Models?
• Policies need to integrate socio-economic information and physical/biological data
[Figure: environment, economy, and society as separate circles (“stove pipe decisions”) vs. interconnected circles yielding better decisions (“pillars of sustainability”).]


Why focus on models?
Emerging Technologies
• Object-oriented software
• Moore’s law
• Data from advanced monitoring technologies
• Genomics/Proteomics

Data Quality Act: Models should be useful, objective, transparent, reproducible. Models used for risk assessment should rely on best available, practicable science.

Courts’ “hard look”: EPA’s use of a model is arbitrary if the model bears no rational relationship to the reality it purports to represent.
Data Quality Act (2001)
• Two-sentence rider in a spending bill
• “Procedural guidelines” for information quality and dissemination
• Administrative mechanism: a “petitions process” targeted at EPA and models
• Used to remove all climate research information from the EPA website in 2003
CREM Background
Directives from the EPA Administrator:
May 2002: … to improve the Agency’s policy-making process to better integrate the highest quality science … revitalize the Council for Regulatory Environmental Modeling (CREM).
February 2003: CREM to work on:
• Guidance to develop, evaluate, and use models
• Web-based Knowledge Base of EPA models
• National Academy of Sciences report on modeling
Models in Environmental Decision Making
• Imperfect representations of reality
• Aid understanding of complex systems under current or future conditions
• Assist in formulating alternative courses of action
• Subject to uncertainties and limitations
(Pascual, 2004)
© Pascual, 2004. All rights reserved. This content is excluded from our Creative Commons license. For more information, see http://ocw.mit.edu/fairuse.
[Figure: Human activities and natural system processes drive emissions; emission → fate and transport → exposure → human health/environment response → non-economic and economic impacts. Image by MIT OpenCourseWare. Source: NRC, 2007]
CREM Draft Guidance on Environmental Modeling
Purpose: To provide a road map on how to develop, evaluate, and apply quality models for EPA decisions.
Model Evaluation: Process for generating information to determine if a model and its results are of sufficient quality to serve as the basis for a decision.
Proprietary Models: Source code is not universally shared. Recommendations:
• Preference for non-proprietary models
• Document that models adhere to guidance
• To the extent practicable, provide means to replicate model results
CREM Guidance: Proprietary Models
• Use proprietary models only when:
  • There is no practicable alternative
  • Owners show effort to protect code and demonstrate competitive loss
• Document that models adhere to guidance
• To the extent practicable, provide means to replicate model results
Today’s Discussion
• Politics of modeling at EPA
• Scientific vs. policy uncertainty
• Model evaluation: What does this encompass?
• How to choose a good model?
• Future directions
Model Uncertainty
Given the model y = f(x1, x2), how best to decide whether y₀ exceeds some legal limit, given uncertainties…
• …in input data (x1, x2)
• …in model structure
• …in differences between modelled and observed results
[Figure: modelled vs. observed y as a function of inputs x1 and x2.]
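A minimal sketch of how the exceedance question can be framed quantitatively, here as Monte Carlo propagation of input-data uncertainty. The model f, the input distributions, and the 25 ppb limit are all illustrative assumptions, not from the slides; structural uncertainty would additionally require comparing alternative model forms.

```python
import random

def f(x1, x2):
    """Toy regulatory model: predicted concentration (ppb). Illustrative only."""
    return 2.0 * x1 + 0.5 * x2

random.seed(42)
LIMIT = 25.0        # hypothetical legal limit (ppb)
N = 100_000         # Monte Carlo sample size

exceedances = 0
for _ in range(N):
    x1 = random.gauss(8.0, 1.5)   # uncertain input 1 (mean, sd assumed)
    x2 = random.gauss(10.0, 3.0)  # uncertain input 2 (mean, sd assumed)
    if f(x1, x2) > LIMIT:
        exceedances += 1

p_exceed = exceedances / N
print(f"P(y > {LIMIT} ppb) ≈ {p_exceed:.3f}")
```

Instead of a single yes/no answer, the decision-maker gets a probability of exceedance, which is exactly the kind of range that the slides later note can frustrate administrators.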
Uncertainty in Environmental Modeling: Modeler’s Perspective
Uncertainty spans a spectrum from limited knowledge to inherent variability:
• Uncertainty due to variability (the system): technological surprise, societal randomness, behavioral diversity, value diversity (sociopolitical), inherent randomness of nature
• Uncertainty due to limited knowledge (the model), in order of increasing degree of uncertainty: inexactness and unreliability (structural uncertainty), lack of observations & measurements, practically immeasurable, conflicting evidence, reducible ignorance, indeterminacy, irreducible ignorance
Adapted from ICIS (2000), Uncertainty in Integrated Assessment, and Walker et al. (2003), Defining Uncertainty. Image by MIT OpenCourseWare.
Two Major Categories of Uncertainty
• Stochasticity/Variability: natural variability in ecological processes; measurement errors; influences model parameterization
• Epistemic: things we do not know; structural errors in model formulation
Uncertain Uncertainty: we do not know what we do not know.
Biogenic VOC Emissions (32 km US domain, 28 Jun–9 Jul 1999)
[Figure: bar chart of domain-total emissions (moles C/day, up to ~3.0×10¹⁰) by species (MEOH, ETH, ALD2, CO, XYL, PAR, FORM, TOL, ISOP, ETOH, OLE) for BEIS2, BEIS3.09, and BEIS3.10. Image by MIT OpenCourseWare.]
Changes in biogenic VOC emission estimates: BEIS2 (ca. 1996) increased isoprene emissions by 3× over BEIS1. With BEIS3, recently “discovered” oxygenated VOCs have been added to the inventory. These uncertainties are largely unknown and unquantifiable.
Uncertainty in Modeling Analysis
Data limitations + Science limitations + Cost & time constraints = Model uncertainty
Uncertainty
“…science has always informed decisions at EPA, but science is not always exact. A big part of my frustration was that scientists would give me a range. And I would ask, ‘Please just tell me at which point you are safe, and we can do that.’ But they would give a range, say, from 5 to 25 parts per billion (ppb). And that was often frustrating.”
— Christine Todd Whitman, former EPA Administrator, quoted in Environmental Science & Technology Online, April 20, 2005
Uncertainty in Environmental Modeling: Decision-Maker’s Perspective
• Model & monitoring uncertainty: uncertainty in agenda setting
• Action uncertainty: uncertainty about costs and benefits of alternative options
• Yield uncertainty
• Goal uncertainty: uncertainty about goals and preferences
• Political uncertainty
Adapted from ICIS (2000), Uncertainty in Integrated Assessment
Operational Decision Process
Situational Awareness → Decision Support → Robust Response
Modeling considerations:
• Creating science useful to decisions requires a deliberate, “use-inspired” research approach
• Understand decision needs: Who are the users? Why do they use it? How do they use it?
• Information criteria: quality/integrity, relevance, timeliness, clarity, visualization
Decision-making considerations:
• Are the assumptions reasonable?
• Does the model make sense based on the best scientific knowledge?
• Is the model credible?
• How uncertain are the results?
• What are the assumptions and limitations?
• How do uncertainties affect decision making?
Today’s Discussion
• Politics of modeling at EPA
• Scientific vs. policy uncertainty
• Model evaluation: What does this encompass?
• How to choose a good model?
• Future directions
What is a verified and validated model?
Oreskes et al. (1994):
“Verification and validation of numerical models of natural systems is impossible…natural systems are never closed and model results are always non-unique. Models can be confirmed by the demonstration of agreement between observation and prediction, but confirmation is inherently partial.”
Is the model valid?
• Proven to correspond exactly to reality
• Demonstrated through experimental tests to make consistently accurate predictions
Source: Suter, 1993
CREM Guidance: Model Evaluation
A process to determine whether the model and its results are of sufficient quality to be used as a basis for a decision, through peer review of models, data quality assessment, model corroboration, and sensitivity and uncertainty analysis.
[Figure: f(x) → y, annotated with four questions: Does it perform the specified task? Does it reasonably approximate the real system? Is it supported by the quality and quantity of available data? Are principles of sound science addressed? Image courtesy of laszlonian on Flickr.]
What is Evaluation Information?
[Figure: the same f(x) → y diagram with the four evaluation questions. Adapted from Beck, 2002. Image courtesy of laszlonian on Flickr.]
Model Evaluation
• Qualitative: model formulation review, expert elicitation, peer review
• Quantitative: statistical uncertainty and sensitivity analysis, corroboration with field observations
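The quantitative side, corroboration with field observations, often begins with simple paired error statistics such as mean bias and RMSE. A minimal sketch; the paired observed/predicted values below are made-up illustrative data:

```python
import math

# Made-up paired samples: field observations vs. model predictions
observed  = [12.1, 15.3, 9.8, 20.4, 17.6, 11.2]
predicted = [11.4, 16.8, 10.5, 18.9, 19.0, 12.7]

n = len(observed)
residuals = [p - o for p, o in zip(predicted, observed)]

mean_bias = sum(residuals) / n                       # systematic over/under-prediction
rmse = math.sqrt(sum(r * r for r in residuals) / n)  # overall error magnitude
nrmse = rmse / (sum(observed) / n)                   # RMSE normalized by mean observation

print(f"mean bias = {mean_bias:.2f}, RMSE = {rmse:.2f}, NRMSE = {nrmse:.1%}")
```

A positive mean bias flags systematic over-prediction even when RMSE looks acceptable, which is why both statistics are usually reported together.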
NRC: Model Use in the Regulatory Process
• Models are always constrained by computational limitations, assumptions, and knowledge gaps
• Environmental models can never be completely “validated” in the traditional sense, but they can be “evaluated”
• Regulatory models are typically used to describe important, complex, and poorly characterized problems
• Models in the regulatory process are best seen as tools providing inputs, as opposed to “truth-generating machines”
Source: NRC, 2007
Implications for Regulatory Model Use
• Evaluating regulatory models requires different tradeoffs than evaluating research models
• The question is not simply how well model estimates match observations, but also how reproducible, transparent, and useful a model is to the regulatory decision
• Regulatory models must be managed so that they are updated in a timely manner and help users and others understand their conceptual basis, assumptions, input data requirements, and life history
Source: NRC, 2007


Life-cycle Model Evaluation
Problem Identification:
• What is the goal of the model?
• Who will use it?
• What types of decisions will it support?
• What data are available to support the model?
Conceptual Model:
• Available and alternative theories
• Assumptions
• Uncertainties
• Peer review
Constructed Model:
• Spatial/temporal resolution
• Algorithm choices
• Assumptions
• Data availability/software tools
• Quality assurance/quality control
• Test scenarios
• Corroboration with observation
• Uncertainty/sensitivity
• Peer review
Model Use:
• Appropriateness of model for problem
• Assumptions
• Model extrapolation
• Input data quality
• Comparison with observation
• Uncertainty/sensitivity analysis
• Peer review
Life-cycle Model Evaluation
• EPA should pay special attention to the peer review of models throughout their life-cycle
• EPA could consider promulgating a generic rule for the process of evaluation and adjustment of models
Source: NRC, 2007


NRC Recommendations (Cont’d)
• Retrospective analysis: EPA should conduct and document the results of retrospective analyses of regulatory models, not only for individual models but also for classes of models.
• What are potential problems with this?
Today’s Discussion
• Politics of modeling at EPA
• Scientific vs. policy uncertainty
• Model evaluation: What does this encompass?
• How to choose a good model?
• Future directions
“An ounce of ex ante explanation is worth a pound of post hoc rationalization.”
— Wagner & McGarity (2003), Legal Aspects of the Regulatory Use of Environmental Modeling (an analysis of 30 years of judicial challenges to EPA’s use of models)
Challenges in Model Selection
• Simple vs. complex?
Source: EPA Guidance on Environmental Models, 2009 (Environmental Protection Agency, US Federal Government)


Example for Water Quality Models
Watershed/loading models:
• Simple: PLOAD, STEPL, Simple Method
• Mid-range: GWLF, P8, SWAT
• Detailed: HSPF, SWMM
Receiving water quality models:
• Steady state: QUAL2E, QUAL2K
• Lake, simplified: BATHTUB, EUTROMOD
• Water quality: EFDC, WASP, CE-QUAL-W2
Standard of Review
“Any model is an abstraction from and simplification of the real world… we must defer to the [agency on balancing] the cost and complexity of a more elaborate model against the oversimplification of a simpler model.”
— U.S. Court of Appeals
[Image courtesy of laszlonian on Flickr.]
Parsimony and Transparency
• Occam’s razor: “entities should not be multiplied unnecessarily”
• A model should capture all essential processes for the system under consideration, but no more
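One common way to make the parsimony tradeoff quantitative is an information criterion such as AIC, which rewards goodness of fit but penalizes parameter count. A sketch under invented assumptions: the sample size, residual sums of squares, and parameter counts below are illustrative, not from any EPA model comparison.

```python
import math

def aic(n, rss, k):
    """AIC for least-squares fits (up to a constant): n*ln(RSS/n) + 2k,
    where k is the number of fitted parameters. Lower is preferred."""
    return n * math.log(rss / n) + 2 * k

n = 50  # assumed number of observations
candidates = {
    "simple (2 params)":    {"rss": 40.0, "k": 2},
    "mid-range (5 params)":  {"rss": 30.0, "k": 5},
    "complex (12 params)":   {"rss": 28.0, "k": 12},
}

scores = {name: aic(n, m["rss"], m["k"]) for name, m in candidates.items()}
best = min(scores, key=scores.get)
for name, s in sorted(scores.items(), key=lambda kv: kv[1]):
    print(f"{name}: AIC = {s:.1f}")
```

With these numbers the mid-range model wins: the complex model fits slightly better but not enough to justify seven extra parameters, which is Occam's razor in arithmetic form.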
Model Selection Process
I. Select Performance Measure or Indicator
II. Select Land, Water, and Management Features
III. Identify Special or Innovative Analyses
IV. Evaluate Practical Issues and User Preferences
V. Provide Ranking and Scores of Recommended Models
Model Selection Process — Step I: Select Performance Measure or Indicator
Measures relevant to management concerns:
• Flow
• Load of pollutant
• Concentration of pollutant
• Magnitude, duration, frequency
Model Selection Process — Performance Measure (Step I)
A: Performance measure — TP load, TP conc., TN load, TN conc., NH4 load, NH4 conc., storm, …
B: Duration/frequency — hourly or less, daily, seasonal, annual, steady state, …
• Select the performance measure and associated duration/frequency
• Using a performance measure matrix (Model #1, Model #2, Model #3, …), identify all models that meet the minimum or better requirement and exclude those that cannot simulate the selected performance measure
• Models with a shorter time step can be aggregated to longer periods (hourly to daily, daily to annual)
• The simplest model that meets the needs of the analysis is initially the best
Scoring: hourly (or less) = 5; daily = 4; annual = 3; storm = 2; steady state = 1
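The Step I screening logic — score candidate models by time step, exclude those that fall below the requirement, and initially prefer the simplest eligible model — can be sketched as follows. The model names and their assumed capabilities are invented for illustration; this is not EPA's actual matrix.

```python
# Scoring taken from the slide: finer time resolution scores higher
TIME_STEP_SCORE = {"hourly": 5, "daily": 4, "annual": 3, "storm": 2, "steady": 1}

# Finest time step each candidate can simulate (assumed for illustration)
models = {
    "Model #1": "steady",
    "Model #2": "daily",
    "Model #3": "hourly",
}

def eligible(model_step, required_step):
    """A model qualifies if its time step scores at least as high as required,
    since finer steps can be aggregated to coarser ones (hourly -> daily -> annual)."""
    return TIME_STEP_SCORE[model_step] >= TIME_STEP_SCORE[required_step]

required = "daily"  # e.g. performance measure: TP concentration, daily

candidates = {name: TIME_STEP_SCORE[step]
              for name, step in models.items()
              if eligible(step, required)}
print("Eligible:", candidates)

# Among eligible models, the simplest (lowest score) is initially the best pick
best = min(candidates, key=candidates.get)
print("Initial pick:", best)
```

Here the steady-state model is screened out, and the daily model beats the hourly one because it meets the requirement with less complexity.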
Model Selection Process — Step II: Select Land, Water, and Management Features
• Waterbody type: river, lake, reservoir, tidal estuary/embayment
• Watershed: rural, urban, forest
• Management techniques: stormwater ponds, operations, flow management
Model Selection Process — Step IV: Evaluate Practical Issues and User Preferences
Model-specific features:
• Ease of use (time to apply, difficulty)
• Availability
• Data needs
• Software capabilities
User-specific preferences:
• Trained staff
• Available experts
• Available data
• Time/schedule
• Resources/financial considerations
Model Selection
[Figure: funnel narrowing from all available models, to models appropriate for the application, to the most appropriate model — the selected model.]
Today’s Discussion
• Politics of modeling at EPA
• Scientific vs. policy uncertainty
• Model evaluation: What does this encompass?
• How to choose a good model?
• Future directions
CREM Strategic Goals
• Strategic Goal 1: To develop and implement activities to ensure consistency in the quality of model development, evaluation and application throughout the Agency;
• Strategic Goal 2: To improve communication among the model development and user groups, including enhancing communication between IT experts (developing the underlying IT and IM infrastructure) and domain technologists (e.g. those who develop air quality forecasting tools) and between domain technologists and users;
• Strategic Goal 3: To improve access to, and information on, EPA’s models and modeling tools;
• Strategic Goal 4: To bridge disciplines and foster a more integrated, joined-up approach to modeling in environmental management;
• Strategic Goal 5: To foster continuous improvement and innovation in model development and application and disseminate developments on the use of emerging modeling technologies and techniques;
• Strategic Goal 6: To enhance the management of uncertainty in model-based environmental decision making.
Meta-Modeling
Integrating individual models into large modeling systems: “plug and play.”
[Figure: component models arranged by spatial and temporal scale.]
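A minimal sketch of the “plug and play” idea: give every component model the same interface (a dict of named state variables in, an augmented dict out), so models can be chained into a larger system and swapped out individually. The three toy models and their coefficients are illustrative placeholders, not any EPA module.

```python
# Each component model: dict in -> dict out, adding its own output variable.

def emissions_model(state):
    # activity level -> emission rate (illustrative linear factor)
    return {**state, "emission": state["activity"] * 0.3}

def transport_model(state):
    # emission diluted into a mixing volume -> ambient concentration
    return {**state, "concentration": state["emission"] / state["mixing_volume"]}

def exposure_model(state):
    # concentration times intake rate -> dose
    return {**state, "dose": state["concentration"] * state["intake_rate"]}

def run_chain(models, state):
    """Run each component model in sequence, passing the growing state dict along."""
    for model in models:
        state = model(state)
    return state

chain = [emissions_model, transport_model, exposure_model]
result = run_chain(chain, {"activity": 100.0,
                           "mixing_volume": 50.0,
                           "intake_rate": 2.0})
print(f"dose = {result['dose']:.2f}")
```

Because every model only reads and writes named state variables, an alternative transport model can be dropped in without touching the others; reconciling the time and space scales those variables are defined on is where the integration challenges arise.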
Example: 3MRA — Multimedia, Multipathway, Multireceptor Exposure and Risk Assessment Model
• Developed as a predictive risk assessment tool to support the delisting of chemicals under RCRA
• Many potential uses
• 17 modules that estimate chemical fate, transport, exposure, and risk
• Multiple databases for waste-management sites nationwide
• Uses the FRAMES (Framework for Risk Analysis in Multimedia Environmental Systems) software platform developed by PNNL
• Currently near completion of external peer review by EPA’s Science Advisory Board (SAB)
Benefits of Meta-Modeling / Integrated Modeling
• Potentially greater fidelity with the real world
• Dynamic linkages among media to capture real-world phenomena
• New classes of problems can be analyzed, e.g., cumulative risk assessment
• Integration between diverse models and data
Challenges of Integrated Modeling
• How to combine models that are based on different scales of time and space?
• How to corroborate models with no measurable real-world equivalents?
• How does uncertainty propagate through a multimedia model?
• How to make large and computationally intensive models more computationally efficient? IT solutions?
MIT OpenCourseWare
http://ocw.mit.edu

ESD.864 Modeling and Assessment for Policy


Spring 2011

For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
