
An Evaluation System for Environmental

Conflict Resolution Processes

Environmental Evaluator’s Networking Forum


Panel D: Developing a Common Vocabulary and Methodology
for Evaluating Collaborative Action
June 14, 2007
Presentation Outline

ECR evaluation design

Case-specific evaluation products

Aggregate level evaluation products

Available tools and resources


U.S. Institute for Environmental
Conflict Resolution

What: an independent federal program established by Congress in 1998

Why: to assist parties in building consensus or resolving conflicts on environmental, natural resource, and public lands issues

How: a staff of 12 works with private neutrals throughout the U.S.

Where: part of the Morris K. Udall Foundation in Tucson, AZ


Background and Design
Evolution of the Evaluation System
 A collaborative process – begun in 1999
 Program managers, practitioners, evaluators, researchers, and funders
 Over 300 evaluation criteria from the ECR literature
 Approved by OMB; used for several years and by several agencies
The system is used to evaluate
collaborative processes to:

Settle disputes
Enforce rules
Issue licenses
Issue permits
Site facilities
Negotiate rules
Develop plans
Set policy
Identify priorities

The feedback is used for performance
reporting, learning, and improvement
Audiences

Sponsors
Participants
Providers
Trainers
Evaluation Framework
Desired Process Conditions → Expected Process Dynamics → End of Process Outcomes → Impacts

Desired Process Conditions:
 ECR is determined to be appropriate
 Appropriate participants are engaged in the process; they have the time, skills, and resources to participate
 An appropriate facilitator is engaged to guide the process
 Relevant, high-quality, and trusted information is effectively incorporated into the process

Expected Process Dynamics:
 Participants are effectively engaged (i.e., participants communicate and collaborate, understand each other's views and perspectives, and their understanding of the issues improves)
 Facilitator skills and practices add value

End of Process Outcomes:
 Agreement is reached on workable solutions, the agreement is of high quality, and participants expect the agreement to last

Impacts:
 Participants' collective capacity to manage and resolve this issue or conflict is improved
Questionnaires

Participant and facilitator questionnaires are administered at the end of a process.

They can be administered using a paper or an online survey system.
Rating Scale
Mapping Questions to the Model
Desired Condition: Relevant, high-quality and trusted information is incorporated into the process.

Evaluation Questions (Participants):
 We worked collaboratively to identify information needs.
 All participants had full access to relevant information they needed in order to participate effectively in this collaborative process.
 All of the important information and data used in this collaborative process were understandable.

Evaluation Questions (Mediator/Facilitator):
 If needed, resources were available to obtain expertise/technical information for this case.
 Scientific and/or technical expertise was used to educate on issues related to this case.
 All participants had equal access to the technical input related to this case.
 In general, the technical input related to this case was understood by the participants (i.e., it was not too complex).
Case-Specific
Evaluation Products
Case Report (sample)
Case Evaluation Reports
Barry M. Goldwater Range: Military Training
and Protection of Endangered Species

The Barry M. Goldwater Range is one of the premier combat aviation training ranges available to the Department of Defense and will remain critical to the military readiness of the armed services into the foreseeable future. At the same time, the Range comprises 42 percent of the current U.S. habitat for the endangered Sonoran pronghorn and is necessary to the recovery of the species.
Aggregate Level Evaluation
Products
Selecting Cases for MAES II
ECR cases must:

Focus on an environmental, natural resource, or public lands issue

Be agreement-seeking

Involve an independent, third-party facilitator or mediator

Multi-Agency ECR Evaluation Study
(MAES II)
Supported by the Hewlett Foundation
In collaboration with federal agencies, ADR programs, and many individuals
52 cases
523 respondents
Analytic Methods: Dual Purpose

 Descriptive statistics to characterize the process and the outcomes.

 Statistical methods to test relationships among variables, e.g., what's most important for success?
Agreement Outcome (52 Cases)

Case Outcome                               Average per Participants   Per the Facilitators
Agreement or progress                                92%                     94%
Agreement (on some, most or all issues)              79%                     84%
Progress but no agreement                            13%                     10%
No agreement and little progress                      8%                      6%
Social Capital: Relationships
among parties improved (n=523)

71% of participants reported relationships among parties improved.
29% did not report an improvement.
Participants Endorse ECR

0–10 rating scale: 0 (not at all) to 10 (very much so), grouped as Not at all, Weakly, Moderately, and Very much to mostly so.

Statement                                                  Not at all   Weakly   Moderately   Very much   (Top two)
My first choice would be to use this type of
process again for similar situations.                          11%        14%       29%          46%         75%
I would recommend this type of process to others
in a similar situation without hesitation.                     10%        14%       31%          45%         76%
We could not have progressed as far as we did
using any other process of which I am aware.                   15%        15%       30%          40%         70%
Available Tools & Resources
Tools and Resources:
 Questionnaires

 Case Report

 Case Briefing

 Overview of the MAES Study

 Confidentiality Protocols

 Inventory of 300 ECR Performance Indicators

Available at: www.ecr.gov


Acknowledgements
The evaluation model described in this document was created collaboratively in
several stages over the past few years by specialists from several agencies. In addition
to U.S. Institute staff, particular thanks is due to key design contributors: Mike Niemeyer
from the Oregon Department of Justice; Will Hall from the Environmental Protection
Agency (Conflict Prevention and Resolution Center); Elena Gonzalez and Kathy Lynne
from the Department of the Interior (Office of Collaborative Action and Dispute Resolution);
Chris Pederson from the Florida Conflict Resolution Consortium; Kasha Helget from the
Federal Energy Regulatory Commission; Susan Brody from the National Policy
Consensus Center; and Chris Carlson from the Policy Consensus Initiative. Evaluation
consultants Kathy McKnight and Lee Sechrest from the University of Arizona have
guided this effort since 2004; Tom Miller of the National Research Center helped with
preliminary data analysis; and Andy Rowe of GHK International guided the development
of the original model and provided input on the later revisions. The U.S. Institute would
also like to acknowledge the many researchers and practitioners, particularly Bernie Mayer of
CDR Associates and Julie Macfarlane of the University of Windsor, for their contributions
along the way. Special thanks are also due to the William and Flora Hewlett Foundation
for its financial support over the years.
For more information visit our website:
www.ecr.gov

Or contact the MAES II Management Team:

Patricia Orr
Program Manager for Evaluation & MAES II Technical Lead
U.S. Institute for Environmental Conflict Resolution
130 South Scott, Tucson, Arizona 85701
Telephone (520) 670-5658 or e-mail orr@ecr.gov

Dale Keyes
Senior Program Manager & MAES II Project Manager
U.S. Institute for Environmental Conflict Resolution
130 South Scott, Tucson, Arizona 85701
Telephone (520) 670-5299 or e-mail keyes@ecr.gov

Kirk Emerson
Director
U.S. Institute for Environmental Conflict Resolution
130 South Scott, Tucson, Arizona 85701
Telephone (520) 670-5299 or e-mail emerson@ecr.gov
