
Impact Assessment and Project Appraisal

ISSN: (Print) (Online) Journal homepage: https://www.tandfonline.com/loi/tiap20

Strategic environmental assessment monitoring: the enduring forgotten sibling

Ainhoa González

To cite this article: Ainhoa González (2022) Strategic environmental assessment monitoring:
the enduring forgotten sibling, Impact Assessment and Project Appraisal, 40:2, 168-176, DOI:
10.1080/14615517.2022.2031552

To link to this article: https://doi.org/10.1080/14615517.2022.2031552

© 2022 The Author(s). Published by Informa UK Limited, trading as Taylor & Francis Group.

Published online: 31 Jan 2022.
IMPACT ASSESSMENT AND PROJECT APPRAISAL
2022, VOL. 40, NO. 2, 168–176
https://doi.org/10.1080/14615517.2022.2031552

Strategic environmental assessment monitoring: the enduring forgotten sibling

Ainhoa González
School of Geography, University College Dublin, Dublin, Ireland

ABSTRACT
Monitoring in Strategic Environmental Assessment (SEA) is central to determining whether the assessment outcomes and associated mitigation measures have had the desired effect and the environment has been protected on the ground. Despite the critical role this procedural stage plays, and the international attempts to advance practice in this area, monitoring continues to be poorly performed 20 years on from the implementation of the European SEA Directive. Several frameworks and approaches can be found in the academic literature, but many are conceptual and arguably fail to provide pragmatic, unsophisticated solutions that are readily implementable in practice. This paper attempts to address this by first outlining common issues reported in the academic literature in the last two decades, then highlighting ongoing issues, as identified in two recent Irish research projects, such as poor definition of monitoring indicators and lack of integration of monitoring commitments into plans/programmes. It then puts forward a set of recommendations intended to foster a practical and sensible approach to kick monitoring into action.

ARTICLE HISTORY
Received 30 June 2021
Accepted 10 January 2022

KEYWORDS
SEA; monitoring; follow-up; good practice

1. Introduction

Strategic Environmental Assessment (SEA) monitoring is a mandatory requirement under European and certain international law (EC 2001; Dalal-Clayton and Sadler 2005). It is recognised as a good practice principle (IAIA 2002) and essential for assessment accountability and learning (Jiricka-Pürrer et al. 2021; Persson and Nilsson 2007). SEA monitoring can be defined as the process undertaken post-assessment and during plan/programme implementation to understand the actual – as opposed to predicted – outcomes of the plan/programme and SEA. It relies on observations to detect, understand and evaluate changes in the physical environment (Azcárate et al. 2013).

Monitoring is considered a key component of SEA 'follow-up', defined as the 'monitoring and evaluation of the impacts of a project or plan (. . .) for management of, and communication about, the environmental performance of that project or plan' (Morrison-Saunders and Arts 2004, p.4). It is the ultimate procedural step, and a cornerstone for both SEA and follow-up, that helps check whether SEA is tangibly effective (Figure 1). The prime function of monitoring in SEA follow-up is to provide data and information on the actual effects of implementing a plan/programme to help determine whether the SEA predictions were correct and potential significant effects have been mitigated (Morrison-Saunders and Arts 2004; Gachechiladze-Bozhesku and Fischer 2012).

Monitoring data can be used to check whether SEA recommendations and mitigation measures have been implemented and how effective these are at protecting the environment. Monitoring can help identify unforeseen effects (and support timely remedial action), address assessment uncertainties and fill data gaps identified during the SEA – at planning level but also at project implementation level (Morrison-Saunders and Arts 2004; Gachechiladze-Bozhesku and Fischer 2012; Azcárate et al. 2013). In other words, without monitoring, it is not possible to know and indeed understand the consequences of SEA. Without monitoring, it is very difficult to satisfy the substantive dimension of SEA effectiveness and establish whether SEA has resulted in environmental protection. It is not possible either to ascertain whether tiering in environmental assessment has been effective, that is, whether SEA's role in informing planning decisions and indeed project development and associated Environmental Impact Assessments (EIAs) has been fulfilled (González and Therivel 2021), and therefore whether any causal links can be made from 'decision' to 'implementation' (Fischer and Retief 2021; Therivel and González 2021). Similarly, without monitoring, it is difficult to learn from experience to continue enhancing assessment processes and outcomes (Jiricka-Pürrer et al. 2021; Partidário and Fischer 2004; Persson and Nilsson 2007; Therivel and González 2019).

CONTACT Ainhoa González ainhoa.gonzalez@ucd.ie University College Dublin, Belfield, Dublin, D04 F6X4, Ireland.
This is an Open Access article distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License (http://creativecommons.org/licenses/by-nc-nd/4.0/), which permits non-commercial re-use, distribution, and reproduction in any medium, provided the original work is properly cited, and is not altered, transformed, or built upon in any way.
Figure 1. Monitoring as a cornerstone in Strategic Environmental Assessment (SEA) and SEA follow-up.

Figure 2. Year of publication of the SEA monitoring and follow-up publications identified in a targeted Scopus search.

Arguably, there is little point in SEA without monitoring, as the effectiveness of the assessment in protecting the environment remains unknown and learning is halted. Yet, monitoring continues to be the forgotten sibling in the environmental assessment family of steps and tools – forgotten yet essential. This paper aims to revisit the importance of monitoring, acknowledging the shortcomings identified in the international literature and in recent practice reviews. The goal of the paper is to update the state of the art, and thus to continue to foster discussions in this area by attempting to fill in certain knowledge gaps. Perhaps more importantly, the ultimate purpose of the paper is to put forward a series of reasonable pragmatic measures to foster action on monitoring that can also serve as the basis for future research on monitoring performance.

2. SEA monitoring in international research

Monitoring in environmental assessment has gained significant international attention in the academic literature, but more so on EIA than SEA. When applying the search string ['Strategic Environmental Assessment' OR 'SEA'] AND ['monitoring' OR 'follow-up' OR 'follow up'], with no other search limitations, Scopus renders 195 results, which when filtered further for their actual linkage to SEA renders 30 results. A thorough review of these papers identified 22 relevant manuscripts and two milestone books: one specific to environmental assessment follow-up by Morrison-Saunders and Arts (2004), which is mostly focused on EIA but does include chapters on SEA follow-up, and one on SEA by Sadler et al. (2012) that touches upon SEA monitoring. The identified publications date from 2003 to 2021. Interestingly, particularly intensive attention to SEA monitoring and follow-up is observed between 2007 and 2013, when 68% of the peer-reviewed journal articles were published (Figure 2). Since 2013, SEA monitoring and follow-up seem to have lost research momentum; indeed, the later publications refer to SEA monitoring as part of wider SEA effectiveness reviews and there is a dearth of targeted research on the topic – albeit two recent papers bring some attention back to it (i.e. Gutierrez et al. 2021; Jiricka-Pürrer et al. 2021).

Three papers explore specific monitoring case studies (e.g. Azcárate et al. 2013; Gachechiladze-Bozhesku 2012; Jiricka-Pürrer et al. 2021). Others present conceptual frameworks, including:

● conformance, performance, management and dissemination approach for each planning tier proposed by Partidário and Fischer (2004);
● indicator-based plan monitoring developed by Mascarenhas et al. (2012);
● linking planning and programming processes through monitoring by Wallgren et al. (2011);
● micro- and macro-evaluations by Sadler (2004);
● multi-track approach with different levels of follow-up suggested by Partidário and Arts (2005); and
● tool kit including analytical and deliberative tools proposed by Nilsson et al. (2009).

Most of the papers, however, report on SEA monitoring performance itself as portrayed in a range of good practice case studies and SEA effectiveness reviews (e.g. Gachechiladze et al. 2009; Söderman and Kallio 2009; Lundberg et al. 2010; Wallgren et al. 2011; Gachechiladze-Bozhesku and Fischer 2012; Lamorgese et al. 2015; Hadi et al. 2019; Gutierrez et al. 2021). In these publications, there is a general agreement that monitoring is weak.

3. Enduring challenges in monitoring practice

The international literature continues to report on monitoring shortcomings such as the absence of allocation of monitoring responsibilities (Polido et al. 2016) and questionable adequacy of monitoring measures (Gutierrez et al. 2021). Monitoring measures in SEA environmental reports are typically not linked to identified impacts, demonstrating 'a lack of understanding as to the very purpose of monitoring' (Söderman and Kallio 2009, p. 17). Monitoring generally focuses on
observing whether the SEA recommendations and mitigation measures are implemented (Lundberg et al. 2010; Wallgren et al. 2011) and on very rare occasions they measure actual impacts on the physical environment (Azcárate et al. 2013). When they do, measurements and observations are commonly based on existing monitoring systems, following from the SEA Directive's recommendation to use ongoing monitoring arrangements where appropriate, with a view to avoiding duplication of monitoring (EC 2001). Where existing monitoring systems are applied, these do not always cover the indicators and information needs of SEA (Hanusch and Glasson 2008; Gachechiladze et al. 2009).

Monitoring has been advocated as a means to fill in data gaps and address uncertainties, but only on very rare occasions has it been reported to be applied with that goal in mind (e.g. Azcárate et al. 2013; Polido et al. 2016). The large majority of the reviewed manuscripts (86%) highlight the need to nurture and strengthen monitoring and follow-up on SEA.

The above findings echo those of the recent European SEA REFIT programme (EC 2019), which also pointed to monitoring requirements being poorly implemented, including the identification of appropriate monitoring indicators. The REFIT report is unclear on whether Member States undertake monitoring systematically and on the frequency of monitoring. Although some Member States claim that monitoring reports are submitted 'regularly' for certain plans, the REFIT report does not state which countries cultivate such good monitoring practice. The findings also suggest a reliance on standard indicators (e.g. guidance-defined and/or associated with ongoing monitoring measures), with case-by-case monitoring indicators more often defined at lower planning tiers.

The REFIT study concludes that 'there are challenges in the quality of monitoring the environmental effects of the implementation of plans or programmes, especially when it comes to identifying unforeseen effects and undertaking remedial action (. . .) environmental effects are not adequately monitored (. . .) and poor monitoring hinders the Directive's success.' (EC 2019, p. 46). Recommendations to address ongoing challenges include 'a more explicit link between the SEA requirements of an individual plan or programme and existing monitoring activities, in order to avoid unnecessary duplication of these actions (e.g. by establishing an open national/regional database of environmental monitoring activities)' and provision of 'examples of successful [monitoring] implementation approaches, including strategies for ensuring [their] long-term sustainability' (EC 2019, pp. 47 & 159).

3.1. Monitoring performance in Irish SEA practice

A research project looking at SEA effectiveness in Ireland (EPA 2018; González et al. 2019) included reviewing current monitoring practice in Irish SEAs to support the development of monitoring guidance. The review looked at 15 good practice SEA case studies across sectors and planning hierarchies (Table 1), and interviewed 30 national SEA practitioners and 13 international SEA experts to gather information on benefits/limitations of current SEA practice not captured in SEA Environmental Reports (SEA ERs), and to seek expert opinion on how to improve current practice.

The key procedural challenges identified in this effectiveness review were similar to those experienced in an earlier review (EPA 2012), notably the consideration of alternatives and monitoring. As with SEA practice internationally, monitoring remains the most significant gap in Irish SEA practice. A degree of 'informal' monitoring takes place in land-use planning, where the mandatory planning requirement to review development plans every 6 years and to formulate interim reports after 2 years forces planners to take stock of environmental changes – but this is not in a formal SEA monitoring sense. The current generally systemic lack of SEA monitoring hinders a comprehensive evaluation of impact avoidance and sustainable
Table 1. Case studies reviewed in the second review of SEA effectiveness in Ireland (EPA 2018).
Name of Plan/Programme Sector Hierarchy
Clare County Development Plan 2017–2023 Land-use County
Dublin City Development Plan 2011–2017 Land-use County
EirGrid Grid25 Implementation Programme 2011–2016 Energy, onshore National
Fingal County Development Plan 2011–2017 Land-use County
FoodWise 2025 Agriculture National
Greater Dublin Area Transport Strategy 2011–2030 Transport Regional
National Forestry Programme 2014–2020 Forestry National
National Planning Framework 2040 Land-use National
National River Basin Management Plan 2018–2021 Water management National
Nitrates Action Programme 2017–2021 Agriculture, water National
Offshore Renewable Energy Development Plan 2014 Energy, offshore National
Shannon Catchment Flood Risk Assessment and Management Plan 2015–2021 Water management Regional
Southern Regional Waste Management Plan Waste Management Regional
Strategic Integrated Framework Plan for the Shannon Estuary 2013–2020 Water management, industry Regional
Wild Atlantic Way Operational Programme 2015–2019 Tourism Regional
development due to SEA, even in cases where mitigation has been integrated into the final plan. The review unveiled the following ongoing monitoring deficiencies:

● Monitoring typically focuses on plan/programme implementation (e.g. whether the plan/programme policies and actions have been realised within the planning period), rather than on the environmental impacts and/or changes resulting from plan/programme implementation, as per SEA requirements.
● Monitoring indicators are often based on assessment objectives; reusing the indicators and targets from the assessment (i.e. baseline or impact assessment criteria) as part of the proposed monitoring programme presents one of the key inadequacies in current SEA practice, as these may not capture key issues or may be too broad to inform monitoring at the local level.
● Monitoring is regularly used as a form of mitigation (i.e. 'monitor and manage'). This approach is appropriate for filling data gaps but should only be used as a mitigation measure of last resort as, if applied as such, it would allow impacts to become significant before they are identified.
● The opportunity to use monitoring to specifically address identified data gaps (and thus help with the assessment of subsequent iterations of the plan/programme, and project-level assessments) is generally missed.
● Monitoring periodicity, thresholds or remedial actions are commonly missing from the SEA monitoring programme.
● Monitoring responsibilities lack clarity. Monitoring data tend to come from third-party bodies undertaking systematic monitoring of key indicators (e.g. the Irish Environmental Protection Agency – EPA – monitors water quality nationally). However, it is the plan-making authority that is responsible for collating and synthesising the relevant information and reporting on it in relation to their plan/programme, which is often not captured in SEA ERs.
● Monitoring data from one round of plan-making do not seem to inform the next round of plan-making; SEA ERs jump straight into the baseline of the current plan, with no reference to what has happened since the adoption of the previous plan/programme.
● Monitoring reports are not prepared and/or made publicly available. While some monitoring may be taking place, there is a lack of understanding of its occurrence and effectiveness as the findings are rarely made available.

To compound the issues above, consulted experts and practitioners highlighted several significant monitoring challenges, including the fact that national legislation does not specify reporting requirements or assign any third-party authority with oversight/enforcement functions in relation to SEA monitoring. They also noted that responsibility for monitoring and remedial action is particularly difficult to assign where issues cross several agencies' responsibilities. For example, in the case of water quality, the monitoring is undertaken by the EPA, wastewater upgrades are delivered through Irish Water, and the pressure sources may ultimately be a combination of diffuse agricultural pollution, land-uses, industrial effluent and wastewater. How this is captured through a development plan/programme's monitoring measures, and remediated, is not straightforward.

Consultees observed that knowledge exchange on monitoring, or on whether monitoring is taking or has taken place, is affected by personnel changes between planning cycles: institutional memory, for instance on where to find data or indeed the need to collate and analyse them, can be rapidly lost with changes of planning personnel. A lack of resources to carry out monitoring following the SEA/plan adoption, and the absence of guidance on monitoring, were also considered key difficulties hindering practice. In particular, consultees recommended that guidance is needed to:

(1) clarify the actual objective of SEA monitoring (e.g. is it to monitor the plan/programme or the environment, and to what effect?);
(2) provide recommendations on the evidence needed to track changes resulting from a plan/programme; and
(3) showcase approaches that help address the complexity of interactions that make it difficult to determine whether any environmental changes occur as a result of a given plan/programme (due to multiple factors influencing overall environmental quality, and to amorphous links between planning hierarchies and sectors, where a plan/programme will influence plans/programmes/projects downstream but also across parallel sectors).

The guidance should also provide recommendations on the nature and level of monitoring. For instance, where plans/programmes lack geographic specificity or contain only high-level strategic objectives, monitoring should focus on national indicators to examine environmental trends; where plans/programmes contain detailed actions, monitoring should focus on cause-effect models to measure environmental effects more directly.

Table 2. Case studies reviewed in the tiering in environmental assessment research project (EPA 2021).
SEA/EIA (a) Sector Hierarchy
SEA – Clare Wind Energy Strategy 2017–2023 Energy County
EIA – Knocknalough Wind Farm 2012 Wind
EIA – Cahermurphy-Kilmihil Wind Farm 2014 Wind
EIA – Glenmore Wind Farm 2014 Wind
SEA – Cherrywood SDZ Masterplan 2010–2016 Land-use Local
EIA – Mixed-use Town Centre Development 2017 Mixed-use
SEA – Eastern and Midlands Region Waste Management Plan 2015–2021 Waste Regional
EIA – KMK Metal Recycling Ltd. in Kilbeggan 2017 WEEE, metal recovery
SEA – Waterford City Development Plan 2013–2019 Land-use City
SEA – Kerry County Development Plan 2015–2021 Land-use County
EIA – N69 Listowel Bypass Proposed Road Development 2017 Transport
EIA – N70 Sneem to Blackwater Bridge Road Project 2019 Transport
SEA – North Quays Strategic Development Zone (SDZ) 2018 Land-use
EIA – Knockboy Residential Strategic Housing Development 2019 Housing
EIA – Kilbarry Residential Scheme 2018 Housing
EIA – North Quays Development 2019 Housing
SEA – Shannon Strategic Integrated Framework Plan 2013–2020 Land- and marine-use Regional
SEA – Shannon-Foynes Port Development Masterplan 2013–2020 Port
EIA – Shannon-Foynes Port Development Expansion – Strategic Infrastructure Development 2018 Port
SEA – Ulster Canal Restoration Plan 2016–2022 Recreation Regional
EIA – Ulster Canal Restoration Upper Lough Erne to Clones 2011 Recreation
(a) Shaded SEAs refer to the higher-tier plan; no shading refers to the lower-tier SEAs and EIAs under the higher-tier plan.

3.2. SEA and EIA monitoring links in Irish SEA practice

A recent project on the influence of SEA on EIA, or tiering in environmental assessment (EPA 2021; González and Therivel 2021), reviewed a separate set of nine SEAs and 12 associated EIAs (Table 2). Specific review criteria were included to evaluate monitoring links between SEA and EIA (e.g. checking whether the higher-tier SEA refers to monitoring data from previous strategic actions, the SEA monitoring refers to individual projects/EIAs, and the lower-tier EIA/SEA refers to the higher-tier SEA and its monitoring). The review was supplemented with interviews of 14 national and 14 international SEA practitioners, plan-makers, development control planners and academics. The interviews aimed to identify links between SEA and EIA, good practice, and suggestions for fostering and strengthening these links.

The review findings revealed again that, overall, tiering links between SEA and EIA monitoring measures are weak. Only one of the reviewed seven SEAs (i.e. Waterford City Development Plan 2013–2019 SEA ER) relates its monitoring programme to specific projects, with the rest keeping monitoring at the strategic level. Some have vague project-level monitoring references, such as the Clare Wind Energy Strategy SEA ER, which recommends that monitoring information should be placed on Geographic Information Systems (GIS) and updated as data become available, for instance from EIA Reports (EIARs). In a number of cases, SEA ERs refer to monitoring as a means to fill in data gaps. For example, the Shannon Strategic Integrated Framework Plan 2013–2020 states that 'the most significant data gaps which should be prioritised are bird surveys (. . .) together with cetacean monitoring (. . .) In order to supplement biodiversity data gaps, additional data gathering to be subsequently used during the plan review or at project level should be undertaken' (SIFP SEA ER, p. 426). Of the 12 EIARs, only two have EIA monitoring measures that are influenced by monitoring at the higher tier. For example, the Cherrywood project (Mixed-use Town Centre Development 2017) EIAR includes monitoring measures in each chapter, and these are linked to the SEA. These shortcomings further emphasise the poor performance of the monitoring stage in environmental assessment practice.

The interviewed practitioners and experts attested to the importance of monitoring at both SEA and EIA levels and highlighted the role of monitoring in linking different tiers of assessments – that is, in supporting tiering in impact assessment. Strategic monitoring indicators can be brought down to the project level to follow up on the implementation of SEA mitigation measures, fill data gaps and identify unforeseen adverse effects. The monitoring information at EIA level can accumulate back up to inform the strategic monitoring indicator. However, there was also an acknowledgment that 'monitoring programmes are included in the SEA ER and then almost forgotten about (. . .) Monitoring is the exception rather than the norm' and that 'monitoring is often insufficiently detailed or clear to inform EIA'. Similarly, a planner noted that SEA monitoring provisions may not be followed up because of lack of resources or, as an EIA consultant observed, because they are not 'stitched into the policy requirements of the relevant plan'.

Two interviewees observed that collecting EIA information on sensitive issues that may have arisen at the project stage (e.g. extracting relevant information from the EIARs) to inform future SEAs would be a resource-intensive task. One of them noted that 'while EIAs contain a wealth of information in terms of dedicated long-term and seasonal surveys, analysis of baseline and
historical monitoring information, much of this is not captured or collated in any coherent manner which can be made available for use in SEAs'.

None of the international interviewees gave examples of effective monitoring of either SEA or EIA. SEA monitoring, where carried out, seems to be of the environmental baseline – bringing together data that already exist elsewhere – rather than monitoring of the effectiveness of SEA mitigation (or of plan implementation). It was considered that, in Europe at least, this lacuna seems to be because there is no legal requirement for anyone to check the results of SEA or EIA monitoring, except in limited cases such as licensed facilities (regulated, for example, under Integrated Pollution Prevention and Control licensing). Yet, it was observed that monitoring will become mandatory under ongoing changes in climate, so 'developing climate change indicators that can be measured at various tiers could give SEA better footing.'

4. Discussion

On top of the recognition of its legal mandate, the importance and benefits of monitoring have been widely acknowledged (e.g. Azcárate et al. 2013; Gachechiladze-Bozhesku and Fischer 2012; Jiricka-Pürrer et al. 2021; Morrison-Saunders and Arts 2004; Persson and Nilsson 2007). In addition, monitoring is considered the stage that could best link SEA and EIA procedures; tiering through monitoring can enhance both assessment types (González and Therivel 2021). However, the latest peer-reviewed publications, the findings from the European REFIT review and recent Irish research studies all point to the same issue: monitoring remains the forgotten sibling of environmental assessment practice. The findings of the Irish reviews, for example, echo previous studies. The inadequacy of monitoring indicators observed in Irish practice was also observed by Polido et al. (2016). The missed opportunity of monitoring to fill data gaps, address uncertainty and capture unforeseen adverse effects is a shortcoming that was also noted by Azcárate et al. (2013) and Partidário and Fischer (2004). The lack of resources to support SEA monitoring implementation remarked on by Irish practitioners aligns with the findings from a previous online international survey (Gachechiladze-Bozhesku and Fischer 2012) and the more recent SEA REFIT (EC 2019). This particular limitation could potentially be addressed by participative monitoring approaches that rely on citizen science, such as those reviewed by Stepenuck and Green (2015) and Carton and Ache (2017). Such approaches enable distributed data collection over long periods and foster public input and engagement in decision-making by increasing awareness of local issues/concerns to influence decisions (González and Gazzola 2019). Nevertheless, the complexities of determining whether any environmental changes occur as a result of a given plan/programme remain, as previously emphasised in the international literature (e.g. Arts 1998; Partidário and Fischer 2004; Cherp et al. 2012; Fischer and Retief 2021). Similarly, the absence of reporting or communication of SEA monitoring results in practice has been repeatedly noted (e.g. Gachechiladze et al. 2009; Lundberg et al. 2010; González et al. 2019).

The international literature has identified issues additional to those highlighted by the Irish case studies. For example, SEA monitoring that relies on existing environmental observation systems is often inappropriate and/or insufficient due to problems with data collection frequencies, scales and compatibilities (Hanusch and Glasson 2008; Gachechiladze et al. 2009). Previous studies have also highlighted shortcomings in current monitoring practice with regard to the implementation of mitigation measures and the identification/evaluation of unforeseen, emerging and external issues (Hanusch and Glasson 2008; Gachechiladze et al. 2009; Lundberg et al. 2010; Wallgren et al. 2011).

Interestingly, many of the practical inefficiencies encountered in other assessment stages (e.g. baseline environment, alternatives, mitigation) are being slowly but surely addressed. This is partially a result of guidance and ongoing practice, and partially the result of a build-up of legal challenges that have fostered improvements in SEA practice. A review of Court of Justice of the EU (CJEU) case law highlights that SEA and EIA legal challenges to date have mostly focused on the failure to carry out assessments and comply with legal requirements concerning the environmental report (EU 2020; ECGF 2021). Monitoring does not seem to have 'hit' the CJEU yet, which, in itself, is a sign that this procedural stage and its implementation continue to be overlooked and neglected.

5. Recommendations for good monitoring practice

In an attempt to support better monitoring practice, and in order to foster further discussion on this critical environmental assessment sibling, Table 3 puts forward a set of pragmatic recommendations. These have been gleaned from the literature, interviews with planners and SEA experts, and case studies.

6. Conclusion

Monitoring is a key part of SEA, but it remains the forgotten sibling to the other SEA steps that are generally improving worldwide. The enduring poor performance of SEA monitoring significantly reduces the ability of the impact assessment research and practice community to determine whether SEA is resulting in sustainable

Table 3. Recommendations to improve SEA (and EIA) monitoring practice.


For SEA practitioners – responsible for developing the monitoring measures
1. Definition of Monitoring a. Identify a suitably small set of highly relevant indicators to meaningfully monitor the environmental effects of
Indicators plan/programme implementation.
b. The monitoring indicators should measure:
● Identified/likely environmental impacts of the plan;
● Unforeseen adverse impacts of the plan, notably where assumptions underlying the SEA impact predictions do not hold in practice (e.g. changes in travel behaviour in response to a new bus service);
● The implementation of mitigation measures, including their effectiveness – to allow this, mitigation measures should be clearly described and measurable; and
● Data gaps that need to be filled before the next iteration of the plan (where the plan is cyclical) or the next phase of plan implementation.
c. Align indicators with the scope and nature of the plan (e.g. several water quality indicators may be needed for
a water resources management plan but only one such indicator, if any, may be relevant to a transport
plan).
d. Use existing monitoring programmes and initiatives appropriate to the issues and scale of the plan (e.g. water
quality and biodiversity reporting datasets for populating indicators relevant to the regional and local levels;
Sustainable Development Goal (SDG) indicators for national assessment monitoring). However, depending on
the level of detail required, more local-level and/or bespoke monitoring may be needed. Larger-scale datasets
may help identify whether more local-level monitoring is needed (e.g. if there is a particular problem at the
local level) or not.
e. Link the monitoring indicators to measurable international and national thresholds/standards (e.g.
achievement of World Health Organisation air quality standards, or the SDGs). This allows the normative
effectiveness of SEA to be tested.
2. Definition of Monitoring Arrangements
a. Identify the data sources that will be used for monitoring each indicator.
b. Identify the organisations with relevant monitoring responsibilities. While the plan-maker is, in all cases,
responsible for plan-implementation monitoring, different third-party organisations may be responsible for
monitoring specific indicators (e.g. water or air quality).
c. Define the level of detail of monitoring required – see 1d. above.
d. Define the monitoring frequency and reporting requirements (e.g. who should be notified). This facilitates
communication between agencies and resource allocation in case of environmental problems.
e. Define remedial actions if the monitoring data identifies significant adverse unforeseen impacts, and the level
of impact at which the remedial actions would be implemented. This could include the application of
additional technologies (e.g. to reduce point source pollutants or improve energy efficiency if stated air quality
or energy targets are not met), additional management measures (e.g. closure of certain areas to recreational
activities if recreational activities are increasing traffic congestion or affecting the populations of certain
species), and not permitting a subsequent planned phase of development (e.g. phase 2 of a wind farm if phase
1 leads to an x% reduction in bird or bat populations).
f. Define the responsibility for remedial actions. A clear definition of responsibilities would provide accountability
for remedial action where environmental targets are not achieved.
3. Monitoring in the SEA ER
a. Include a specific recommendation in the SEA ER for the monitoring programme to be incorporated into the
plan/programme or added as a chapter so that this aspect is not ‘lost’ as part of plan/programme
implementation reviews.
b. Refer to previous plans/programmes’ monitoring frameworks and data in the next round of plan SEA. In
particular, the next SEA should identify from the monitoring data any unforeseen adverse impacts of the
previous plan, and attempt to explain why these occurred. Monitoring can provide a robust and up-to-date
baseline for future SEAs and, more importantly, a better understanding of the implications of certain plans/
programmes – which can, ultimately, contribute to better planning and more informed decisions.
For plan-makers – responsible for interrogating data and reporting on monitoring
4. Implementing Monitoring
a. Include a specific commitment in the plan/programme for environmental monitoring and reporting, as
identified in the SEA.
b. Earmark resources, including a defined budget, within the plan-making organisation to collate and report on
the relevant monitoring information.
c. Devise a monitoring approach within the organisation that feeds into plan/programme implementation review
and related reporting. This allows uncertainties and data gaps to be addressed, provides a more robust baseline
and better informs future planning. Moreover, plans at different levels could use the same methodological
framework with regard to core indicators, monitoring frequency, etc. to help internally align all monitoring
efforts in plan-making.
d. Consider the issue of monitoring as part of scoping or alternatives workshops during the preparation of a plan/
programme. This is particularly useful for cyclical plans, since it would ensure that previous monitoring data
inform the new SEA, and that monitoring of the new plan learns from any issues arising from monitoring of the
old plan.
e. Make monitoring data and reports available online and, where suitable, link monitoring data to existing GIS
databases. This can enhance understanding and knowledge across planning hierarchies and sectors, and avoid
institutional memory loss.
f. Where monitoring of individual areas or projects is required, include these requirements in a GIS, so that
planners can interactively click on a site to find out required measures and refer to them at the project level
(this is currently done, for example, in Clare County Council’s land-use planning system, and in the preparation
of Irish River Basin Management Plans).
5. Populating and Evaluating Indicators
a. Document the monitoring data on a regular (e.g. annual) basis, and make the monitoring results public.
b. Test the plan/programme impacts in relation to established targets and thresholds.
c. Use spatial/GIS data for spatial plans to ‘zoom in’ on areas to undertake more detailed appraisal of changes in
environmental indicators, and identify where remedial actions may be required.
For EIA consultants – responsible for filling data gaps and undertaking detailed assessments
6. Carrying out SEA Monitoring
a. Check the relevant SEA/plan/GIS system, and carry out project-specific monitoring requirements resulting from
SEA. This will allow the EIA to effectively address data gaps and strategic issues and uncertainties (e.g.
cumulative impacts, carbon emissions, habitat fragmentation) for a robust assessment.
b. Publish the monitoring data from a. so that the information accumulates back up to the strategic level, where it can inform strategic-level indicators and future SEAs.
Strategic recommendations for policy-makers
7. Monitoring Requirements
Amend guidance and, where appropriate, legislation/regulations to require monitoring findings to be made publicly available on the plan-makers' website alongside the plan/programme and SEA-related documentation.
8. Monitoring Guidance and Training
Develop guidance and training, including good practice examples, to foster good monitoring practice, taking the points above into account. In particular, showcase approaches that help to address the complexity of
interactions, which makes it difficult to identify the environmental changes that result from a given plan/
programme. For example, showcase approaches that distinguish between the environmental actions of an
individual plan (e.g. energy efficient housing) and the actual aims (e.g. zero carbon by 2040).
9. Dedicated Resources
Create a dedicated team (probably at the national level, possibly comprising representatives from the statutory
environmental bodies) to ensure compliance with the statutory requirements for SEA monitoring, follow up on
environmental trends and engage with planners during plan-making. This advisory role would also enhance
environmental integration in plan-making and develop more sustainable plans/programmes.
10. Monitoring Data Gathering and Centralisation
Coordinate the gathering and provision of centralised access to relevant environmental monitoring (including spatial) data across planning hierarchies and sectors. This would present a rapid and systematic way of addressing ongoing monitoring limitations and would reduce the costs of evidence gathering for the next round of SEAs.
11. Technology Support
Use technology and innovation to encourage monitoring implementation and to tap into currently available but
under-used sources of data gathering and sharing (web-based services, citizen science, remote sensing, etc.).
Support citizen science initiatives and empower the public by giving them a voice, and allow plan-making
authorities to tap into local knowledge and data sources and volunteered reporting of environmental changes.
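To make recommendations 1e, 2e and 5b in Table 3 concrete, the threshold test and remedial-action trigger could be sketched in code along the following lines. This is purely illustrative: all indicator names, threshold values and remedial actions below are hypothetical placeholders, not values drawn from any actual monitoring programme or standard.

```python
# Illustrative sketch of recommendations 1e, 2e and 5b: test monitoring
# indicators against thresholds/standards and flag the remedial action
# defined in advance in the SEA when a threshold is breached.
# All names and values are hypothetical.
from dataclasses import dataclass

@dataclass
class Indicator:
    name: str
    value: float           # latest monitored value
    threshold: float       # standard/target the indicator is tested against
    higher_is_worse: bool  # True for pollutants; False for e.g. species counts
    remedial_action: str   # pre-defined response if the threshold is breached

def exceedances(indicators):
    """Return (name, remedial_action) for every indicator that breaches
    its threshold, so the responsible organisation can be notified."""
    flagged = []
    for ind in indicators:
        breached = (ind.value > ind.threshold) if ind.higher_is_worse \
            else (ind.value < ind.threshold)
        if breached:
            flagged.append((ind.name, ind.remedial_action))
    return flagged

# Example monitoring round with invented values:
monitoring_round = [
    Indicator("PM2.5 annual mean (ug/m3)", 12.0, 10.0, True,
              "apply additional point-source controls"),
    Indicator("breeding bird index (% of baseline)", 84.0, 90.0, False,
              "defer phase 2 of the development"),
    Indicator("river nitrate (mg/l)", 20.0, 50.0, True,
              "none required"),
]

for name, action in exceedances(monitoring_round):
    print(f"{name}: threshold breached -> {action}")
```

The same pattern extends naturally to the phased-consent example under recommendation 2e, for instance withholding a subsequent phase of development where a monitored population falls by more than a stated percentage.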

outcomes and preventing adverse effects on the environment – and thus determine the performance of the pluralist, substantive, normative and knowledge/learning dimensions of SEA effectiveness.

This paper highlights the continuing need for stronger measures to efficiently implement monitoring. The proposed good practice recommendations put forward have an Irish practice focus but are also relevant and transferable to other SEA and planning contexts, and should be useful for improved SEA monitoring in any jurisdiction. At the heart of it is the need for practitioners to focus efforts on monitoring and for plan-makers to commit to implementing monitoring programmes if future plan/programme cycles are to benefit from properly understanding assessment outcomes and environmental pressures and consequences. The establishment of appropriate monitoring systems that respond to monitoring requirements across multiple sectors and planning hierarchies will require investment, but should deliver net benefits in due course by informing assessments that provide for better plans with a reduced risk of unforeseen impacts and need for remedial action.

Disclosure statement

No potential conflict of interest was reported by the author(s).

Funding

This work was supported by the Environmental Protection Agency (Ireland) [2017-NCMS-8 and 2019-SE-DS-21].

ORCID

Ainhoa González http://orcid.org/0000-0002-9334-3066

References

Arts J. 1998. EIA follow-up: on the role of ex post evaluation in environmental impact assessment. Groningen: Geo Press.
Azcárate J, Balfors B, Bring A, Destouni G. 2013. Strategic environmental assessment and monitoring: arctic key gaps and bridging pathways. Environmental Research Letters. 8(044033):9. doi:10.1088/1748-9326/8/4/044033.
Carton L, Ache P. 2017. Citizen-sensor-networks to confront government decision-makers: two lessons from The Netherlands. Journal of Environmental Management. 196:234–251. doi:10.1016/j.jenvman.2017.02.044.
Cherp A, Partidário MR, Arts J. 2012. From formulation to implementation: strengthening SEA through follow-up. In: Sadler B, Dusik J, Fischer T, Partidario M, Verheem R, Aschemann R, editors. Handbook of Strategic Environmental Assessment. UK: Earthscan. p. 515–534.
Dalal-Clayton B, Sadler DB. 2005. Strategic Environmental Assessment: a Sourcebook and Reference Guide to International Experience. London: Earthscan.
EC. 2001. Directive 2001/42/EC of the European Parliament and of the Council of 27 June 2001 on the assessment of the effects of certain plans and programmes on the environment. Official Journal of the European Communities. L 197/30, 21.7.2001.
EC. 2019. REFIT evaluation of the SEA Directive. European Commission. [accessed 2022 Jan 24]. https://ec.europa.eu/environment/eia/sea-refit.htm.
ECGF. 2021. Compilation of CJEU Case Law on Member States' Obligation to Remedy Failures to Carry out Environmental Assessments. European Commission: DG ENV – Environmental Compliance and Governance Forum. [accessed 2022 Jan 24]. https://circabc.europa.eu/ui/group/cafdbfbb-a3b9-42d8-b3c9-05e8f2c6a6fe/library/cd912549-8639-4591-aaf5-f5f250a465f3/details.
EPA. 2012. Review of Effectiveness of SEA in Ireland. Ireland: Environmental Protection Agency. [accessed 2022 Jan 24]. https://www.epa.ie/publications/monitoring–assessment/assessment/SEA-EFFECTIVENESS-REVIEW-MAIN-REPORT-2012.pdf.
EPA. 2018. Second Review of Strategic Environmental Assessment Effectiveness in Ireland. Ireland: Environmental Protection Agency. [accessed 2022 Jan 24]. https://www.epa.ie/publications/monitoring–assessment/assessment/second-review-of-sea-effectiveness-in-ireland.php.
EPA. 2021. Tiering of Environmental Assessment – the Influence of SEA on Project Level EIA. Ireland: Environmental Protection Agency. [accessed 2022 Jan 24]. https://www.epa.ie/publications/research/epa-research-2030-reports/Research_Report_391.pdf.
EU. 2020. Environmental Assessment of Projects and Plans and Programmes – rulings of the Court of Justice of the European Union. Luxembourg: Publications Office of the European Union. [accessed 2022 Jan 24]. https://ec.europa.eu/environment/eia/pdf/EIA_rulings_web.pdf.
Fischer TB, Retief F. 2021. Does strategic environmental assessment lead to more environmentally sustainable decisions and action? Reflections on substantive effectiveness. In: Fischer TB, González A, editors. Handbook on Strategic Environmental Assessment. Cheltenham: Edward Elgar. p. 114–125.
Gachechiladze M, Noble BF, Bitter BW. 2009. Following-up in strategic environmental assessment: a case study of 20-year forest management planning in Saskatchewan, Canada. Impact Assessment and Project Appraisal. 27(1):45–56. doi:10.3152/146155109X430362.
Gachechiladze-Bozhesku M, Fischer TB. 2012. Benefits of and barriers to SEA follow-up – theory and practice. Environmental Impact Assessment Review. 34:22–30. doi:10.1016/j.eiar.2011.11.006.
González A, Bullock C, Gaughran A, Watkin-Bourne K. 2019. Towards a better understanding of SEA effectiveness in Ireland. Impact Assessment and Project Appraisal. 37(3–4):233–243. doi:10.1080/14615517.2019.1580475.
González A, Gazzola P. 2019. Untapping the potential of technological advancements in Strategic Environmental Assessment. Journal of Environmental Planning and Management. 63(4):585–603.
González A, Therivel R. 2021. Raising the game in environmental assessment: insights from tiering practice. Environmental Impact Assessment Review. 92:106695. doi:10.1016/j.eiar.2021.106695.
Gutierrez M, Bekessy SA, Gordon A. 2021. Biodiversity and ecosystem services in strategic environmental assessment: an evaluation of six Australian cases. Environmental Impact Assessment Review. 87:106552. doi:10.1016/j.eiar.2021.106552.
Hadi SP, Purnaweni H, Prabawani B. 2019. The powerless of Strategic Environmental Assessment (SEA): case studies of North Kendeng Mountain Area, Central Java, Indonesia. E3S Web of Conferences; Aug 7–8; Semarang, Indonesia.
Hanusch M, Glasson J. 2008. Much ado about SEA/SA monitoring: the performance of English regional spatial strategies, and some German comparisons. Environmental Impact Assessment Review. 28(8):601–617. doi:10.1016/j.eiar.2007.12.001.
IAIA. 2002. Strategic Environmental Assessment: performance criteria. Fargo (USA): International Association for Impact Assessment.
Jiricka-Pürrer A, Wanner A, Hainz-Renetzeder C. 2021. Who cares? Don't underestimate the values of SEA monitoring! Environmental Impact Assessment Review. 90:106610. doi:10.1016/j.eiar.2021.106610.
Lamorgese L, Geneletti D, Partidario MR. 2015. Reviewing strategic environmental assessment practice in the oil and gas sector. Journal of Environmental Assessment Policy and Management. 17(2):1550017. doi:10.1142/S1464333215500179.
Lundberg K, Balfors B, Folkeson L, Nilsson M. 2010. SEA monitoring in Swedish regional transport infrastructure plans – improvement opportunities identified in practical experience. Environmental Impact Assessment Review. 30(6):400–406. doi:10.1016/j.eiar.2009.12.002.
Mascarenhas A, Ramos TB, Nunes L. 2012. Developing an integrated approach for the strategic monitoring of regional spatial plans. Land Use Policy. 29(3):641–651. doi:10.1016/j.landusepol.2011.10.006.
Morrison-Saunders A, Arts J. 2004. Assessing impact: handbook of EIA and SEA follow-up. London: Routledge.
Nilsson M, Wiklund H, Finnveden G, Jonsson DK, Lundberg K, Tyskeng S, Wallgren O. 2009. Analytical framework and tool kit for SEA follow-up. Environmental Impact Assessment Review. 29(3):186–199. doi:10.1016/j.eiar.2008.09.002.
Partidário MR, Arts J. 2005. Exploring the concept of strategic environmental assessment follow-up. Impact Assessment and Project Appraisal. 23(3):246–257. doi:10.3152/147154605781765481.
Partidário MR, Fischer TB. 2004. Follow-up in current SEA understanding. In: Morrison-Saunders A, Arts J, editors. Assessing Impact: handbook of EIA and SEA Follow-up. London: Routledge. p. 224–247.
Persson Å, Nilsson M. 2007. Towards a framework for SEA follow-up: theoretical issues and lessons from policy evaluation. Journal of Environmental Assessment Policy and Management. 9(4):473–496. doi:10.1142/S1464333207002901.
Polido A, João E, Ramos TB. 2016. Strategic Environmental Assessment practices in European small islands: insights from Azores and Orkney Islands. Environmental Impact Assessment Review. 57:18–30. doi:10.1016/j.eiar.2015.11.003.
Sadler B. 2004. On evaluating the success of EIA and SEA. In: Morrison-Saunders A, Arts J, editors. Assessing Impact: handbook of EIA and SEA Follow-up. London: Routledge. p. 248–285.
Sadler B, Dusik J, Fischer T, Partidario M, Verheem R, Aschemann R. 2012. Handbook of Strategic Environmental Assessment. London: Earthscan.
Söderman T, Kallio T. 2009. Strategic environmental assessment in Finland. Journal of Environmental Assessment Policy and Management. 11(1):1–28. doi:10.1142/S1464333209003269.
Stepenuck KF, Green L. 2015. Individual- and community-level impacts of volunteer environmental monitoring: a synthesis of peer-reviewed literature. Ecology and Society. 20(3):19. doi:10.5751/ES-07329-200319.
Therivel R, González A. 2019. Introducing SEA effectiveness. Impact Assessment and Project Appraisal. 37(3–4):181–187. doi:10.1080/14615517.2019.1601432.
Therivel R, González A. 2021. "Ripe for decision": tiering in Environmental Assessment. Environmental Impact Assessment Review. 87:106520. doi:10.1016/j.eiar.2020.106520.
Wallgren O, Nilsson M, Jonsson DK, Wiklund H. 2011. Confronting SEA with real planning: the case of follow-up in regional plans and programmes in Sweden. Journal of Environmental Assessment Policy and Management. 13(2):229–250. doi:10.1142/S1464333211003869.
