Statement On The Term Hydrological Normal HCP
During the Fifteenth Session of the WMO Commission for Hydrology (CHy, 2015), the
Commission was asked to develop a statement on the definition of hydrological normal. The
request came at a time when the WMO Commission for Climatology (CCl) was preparing an
updated set of Guidelines on the calculation of a climate normal. For perspective, a normal in
a climatic context has historically been described as the average or “normal” weather over a
30-year period. The choice of a 30-year averaging period reflects the view that 30 years
provides enough time to smooth out random weather components and to establish a constant
mean. The 30-year period is then used to provide a benchmark against which climatic
“anomalies” (i.e., monthly or annual differences from the mean) would be calculated. The
use of the 30-year period is so deeply established within the atmospheric sciences community
that the term “climatology” is frequently used to refer to a variable’s 30-year average or
“normal” condition.
A major intent of climate “normals” is to enable comparison among observations from around
the world. Throughout the 20th century, the standard normal was defined as the most recent
continuous non-overlapping 30-year period (1901-1930, 1931-1960, 1961-1990). At the
Seventeenth World Meteorological Congress (WMO, 2015), however, the definition was
changed to the most recent 30-year period finishing in a year ending with 0 (zero). Thus, in
2015, for example, the new 30-year normal period became 1981-2010, whereas under the
old definition it was 1961-1990.
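The two definitions differ only in how the end year of the 30-year window is chosen. As a minimal sketch (Python is used here purely for illustration, and the function name is our own, not WMO terminology):

```python
def normal_period(year: int, post_2015: bool = True) -> tuple[int, int]:
    """Return (start, end) of the WMO standard 30-year normal period
    in effect for the given year.

    post_2015=True  -- most recent complete 30-year period ending in a
                       year ending with 0 (Seventeenth Congress, 2015).
    post_2015=False -- most recent complete non-overlapping 30-year
                       block: 1901-1930, 1931-1960, 1961-1990, ...
    """
    if post_2015:
        # Last complete year ending in 0 strictly before `year`.
        end = ((year - 1) // 10) * 10
    else:
        # Block ends fall on years congruent to 1930 modulo 30.
        end = (year - 1) - ((year - 1) - 1930) % 30
    return end - 29, end
```

For 2015 this yields 1981-2010 under the new definition and 1961-1990 under the old one, matching the example above.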
As the Commission for Hydrology had not previously addressed the concept of a hydrological
normal, the newly installed Hydrological Coordination Panel (HCP) decided at its First Session
in December 2019 to develop a statement on the definition of hydrological normal, to be
shared among its members for use in the Hydrological Status and Outlook System
(HydroSOS), by the end of 2020. This Statement is the result. It begins by considering the
concept of, and issues affecting, a normal period in hydrology; then reviews specific
applications that require the use of “normal” statistics; considers the representativeness of
hydrological data; and finally assesses the approaches most appropriate to determining
hydrological normals for the most common applications. The Statement concludes with a
summary of recommended best practices for computing a hydrological normal rather than a
single definition, because the complexity of hydrological regimes and the disparate nature of
the applications argue against any one formulation.
Initially, it is worth reviewing how the term hydrological “normal” is defined in authoritative
hydrological references. The WMO-UNESCO International Glossary of Hydrology (2012), for
example, contains no explicit definition of the term “hydrological normal” or “hydrological
normal period.” It does, however, define the more general term “normals” as “moving
averages calculated over a uniform and relatively long period covering at least three
consecutive decades.” Other hydrological glossaries similarly provide no greater specification
of the term. Thus, in the absence of an established definition or similar characterization of
the term, it is necessary to review the hydrological components relevant to such a definition.
The Multiple Drivers of Hydrological Change
Climate Sensitivity
The issue of human influences has long been recognized in hydrology as researchers have
sought to understand “natural” hydrologic variability in comparison with variations that reflect
the activities of humans within the catchment. In recent decades, researchers have focused
increased attention on the hydrologic impacts of climatic variability and change and, in so
doing, have looked for streamgauging records that reflect conditions minimally affected by
such confounding influences as urbanization, regulation of the watercourse, etc. In 2004, the
WMO Commission for Hydrology requested all National Hydrological Services to identify
streamgauging stations within their jurisdictions that were minimally affected by human
activities, i.e., for which variations primarily reflected variations in climate, and that had
complete records of at least 40 years. In response, 1200 gauging stations were identified
worldwide, a small fraction of the total number of streamgauging stations in operation. In
addition, a number of countries have developed listings of climate sensitive streamgauging
stations that meet specific criteria. These listings could be updated and linked to the WMO
Recognition Mechanism for Long-Term Observing Stations, approved at the first session
(Part II, 2021) of the Commission for Weather, Climate, Water and Related Environmental
Services and Applications (SERCOM). Moreover, the WMO Unified Policy for the International
Exchange of Earth System Data, approved by the Commission for Observation, Infrastructure
and Information Systems (INFCOM-1, Part III, 2021) and to be submitted to the WMO
Congress, suggests establishing global hydrological reference stations that would support
climate analyses, among other functions. Such stations could support the determination of
hydrological normals.
One example of the produced listings is the U.S. Geological Survey’s Hydro-Climatic Data
Network 2009 (HCDN-2009) composed of 743 stations that meet the following criteria: (1)
they are identified as being in a “reference” condition: i.e., having minor anthropogenic
modification of the catchment based on a) GIS‐derived variables; b) a visual inspection of
every streamgauge and drainage basin from recent high‐resolution imagery and topographic
maps, and c) information about man‐made influences from annual water data reports; (2)
they have at least 20 years of complete and continuous discharge record through water year
2009; (3) they have less than 5 percent impervious surface area as measured using the
Federal National Land Cover Data; and (4) they were verified as meeting the above criteria
by the USGS State Water Science Centers. Other countries have produced similar data sets
using analogous criteria (for example, the Australian Hydrological Reference Station Network
and the UK Benchmark Network). In situations where hydrologic records primarily reflect
climatic variations, analysts have more flexibility in choosing a normal period based on their
own investigative objectives.
The vast majority of river basins, particularly those that contain streamgauging stations, have
undergone varying degrees of human development. Some have undergone minimal
development, and some have been dramatically transformed by urbanization, non-urban
commercial development, suburbanization, and agriculture. As a result, very few gauges are
located in truly pristine or natural conditions over multidecadal periods. This means that the
statistical characterization of hydrological conditions within the basin cannot be assumed to
be a simple reflection of climatic variations, particularly precipitation, across the catchment.
Computed statistics will necessarily reflect numerous confounding variations that render
questionable any assessment, particularly one attempting to relate variations in one
catchment to those in another. This is true regardless of the time period used because the timing and
extent of development can vary significantly, even across adjacent basins.
The major source of human-induced variation in streamflow stems from the presence of dams
and reservoirs, levees, and various diversions within catchments. Dams and reservoirs
supporting water supply, hydropower, agriculture, flood control, drought mitigation,
environmental requirements, etc., can significantly alter the characteristics of the hydrological
normal in a river basin. Consider, for example, the impact on a 60-year record of hydrological
data where a large dam began operating in the middle of the record. The second half of the
record would be characterized by a period of reservoir filling (when the flows would be lower
than during the previous period), which could last a decade in the case of a very large dam.
Flood peaks would also be reduced as the dam would be operated in order to mitigate flood
damages. Similarly, more water would be released from reservoirs during droughts and low-
flow seasons in order to maintain flows for water supply, irrigation, environmental
requirements and for other legally mandated issues. Thus, low flows would likely increase
during the second half of the record. A clear example of this has occurred in the Upper Odra
River system of the Czech Republic, where water demand decreased by 60 percent in recent
decades due to socio-economic changes and a reduction in coal-based heavy industry. As a
consequence of these changes, low flows increased in rivers downstream of reservoirs.
Selected Applications
As a result of the multiple drivers of hydrological variability outlined in the previous section,
different approaches to the determination of a hydrological “normal” may be required
depending on the application. This section outlines key considerations for some typical
applications.
Hydro-Climatic Analysis
When assessing the behavior of hydrologic conditions through time, the notion of “normal” or
“normal period” in hydrology has broad similarities to that in climatology. There are numerous
examples in the literature where variations and changes in hydrological conditions have been
linked with those in climate, e.g., coupling river flow trends with those found in precipitation
records. In these instances, the use of similar statistical approaches and averaging
timeframes is essential. Thus, if one wishes to assess the impact of precipitation variations
on streamflow over some multidecadal period, and the precipitation normals were calculated
using the 30-year period 1981-2010, then that same 30-year period should be used for
determining the streamflow normals.
The concept of hydrological ‘normals’ provides a benchmark against which recent or current
conditions can be compared from a diagnostic perspective both at a single location and across
multiple locations. It is therefore often used in the delivery of hydrological services that
provide information about the current conditions. Similarly, from a prognostic perspective, a
normal period provides an important context for investigating likely future conditions within
a catchment and is therefore employed in the delivery of hydrological forecasts and outlooks.
Such services often aim to deliver consistent information across a catchment/region/country,
allowing users to understand hydrological conditions across space and time. In such
circumstances, it can be important to set a common normalization period across sites that
takes into account the length of record available and the different factors impacting the
hydrology of the area (as outlined above).
Water Resources Engineering
For the purposes of this Statement, the discussion of water resources engineering is limited
to the use of data for the design and operation of riverine structures (e.g., reservoirs,
floodways, dams, levees, spillways, channels, water intakes); the analysis of surface runoff;
and extreme event analysis (e.g., flood frequency analysis, low-flow frequency analysis, etc.).
These are the subdisciplines of water resources engineering that make most use of the
concept of a hydrological normal or normal period. The statistical methods most commonly
used in these applications require long data records in order to reduce the uncertainty in
estimates, particularly when dealing with events having long return periods. Consider flood
frequency analysis, for example, where reliable and robust estimation of the magnitude and
frequency of floods is fundamental for infrastructure design, risk assessment, and decision-
making. Traditional engineering practice ties together flood risk assessment and at-site flood
frequency analysis, allowing one to estimate flood magnitudes at given gauged locations for
return periods beyond the available data record. Flood frequency analysis consists of the
identification of a statistical distribution that is able to model the probability of exceedance of
extreme flood peaks (e.g., Generalized Extreme Value, log-Pearson Type III). Traditionally,
both the statistical distribution class and the parameters describing it are derived from past
data records. The standard approach is based on the annual maximum series, which involves
the statistical modelling of the largest peak flows observed in each year, a series of data that
can be easily and unambiguously extracted.
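The standard approach can be sketched in a few lines. The following illustration fits a GEV distribution to an annual maximum series by maximum likelihood using `scipy.stats.genextreme` and reads off a flood quantile for a chosen return period; the data here are synthetic and the function name is our own, so this is a sketch of the technique, not a prescribed implementation:

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(42)
# Synthetic annual maximum series (m^3/s), one peak flow per year,
# standing in for an observed 70-year record.
ams = genextreme.rvs(c=-0.1, loc=500.0, scale=120.0, size=70,
                     random_state=rng)

# Fit the GEV distribution by maximum likelihood (scipy's default fit).
# Note that scipy's shape parameter `c` follows its own sign convention.
c, loc, scale = genextreme.fit(ams)

def flood_quantile(T: float) -> float:
    """Flood magnitude exceeded with probability 1/T in any year,
    i.e. the (1 - 1/T) quantile of the fitted distribution."""
    return float(genextreme.ppf(1.0 - 1.0 / T, c, loc, scale))

q100 = flood_quantile(100.0)  # estimated 100-year flood
```

The same pattern applies with other distribution families (for example, fitting a log-Pearson Type III to the logarithms of the annual maxima), which is why the choice of distribution and estimation method matters for the resulting uncertainty.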
A number of different approaches have been developed and employed over the years to
accommodate the use of short records. However, these tend to yield results with significantly
greater uncertainty. For example, if a flood frequency estimate based on the annual
maximum series for a 70-year period is treated as a reference, it has been found that the
uncertainty associated with having only a 35-year record can increase by as much as 50
percent. In the case of having only a 20-year record, it can increase by as much as 100
percent. Moreover, results can change based on the distribution used. In general, the
generalized extreme value distribution combined with maximum likelihood estimation is
associated with the largest uncertainty, while the log-Pearson Type III exhibits comparable
bias and smaller uncertainty. Recent work also indicates that applying the partial duration
series approach to 20-year records does not provide any significant advantage. So, although
methods do exist for utilizing shorter records in engineering applications, longer,
period-of-record statistics typically produce less uncertain results.
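The growth of uncertainty with shorter records can be demonstrated with a simple Monte Carlo experiment. The sketch below (our own illustration, not the method of any study cited above) repeatedly draws records of 70, 35, and 20 years from a known Gumbel population, refits the distribution each time, and compares the spread of the resulting 100-year flood estimates:

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(0)

def q100_spread(n: int, reps: int = 200) -> float:
    """Std. dev. of the 100-year quantile estimate across `reps`
    synthetic records of length n drawn from a Gumbel population."""
    estimates = []
    for _ in range(reps):
        sample = gumbel_r.rvs(loc=500.0, scale=120.0, size=n,
                              random_state=rng)
        loc, scale = gumbel_r.fit(sample)            # MLE refit
        estimates.append(gumbel_r.ppf(0.99, loc, scale))
    return float(np.std(estimates))

spreads = {n: q100_spread(n) for n in (70, 35, 20)}
```

Under these assumptions the spread of the estimate grows steadily as the record shortens, consistent with the qualitative pattern described above.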
A somewhat similar dichotomy exists with respect to measures of variability. The sample
variance and its square root, the sample standard deviation, are the classical measures of
variability. Like the mean, they are strongly influenced by extreme values. Because these
measures are computed from the squares of deviations of the data from the sample mean,
extreme values influence their magnitudes even more than they do the mean. When extreme values
are present, these measures are unstable and inflated, and may give the impression of much
greater variability than is indicated by the majority of the dataset. Alternatively, the
interquartile range (IQR) provides an effective and resistant measure of variability. It
measures the range of the central 50 percent of the data and is not influenced at all by the
25 percent on either end. Noting both the robust and resistant qualities of the median and
interquartile range in characterizing the “typical” value and “spread” in a set of hydrological
observations, the case is made for using percentiles rather than the mean in describing the
representativeness of hydrological conditions. It is further suggested that, as a standard
practice, percentiles be calculated at each decile (together with the 5th, 25th, 75th, and 95th
percentiles), from the annual minimum flow to the annual maximum flow. In this way, a comprehensive and
consistent assessment of the entire flow distribution at an observing station is obtained.
Moreover, this practice should be applied regardless of the normal period used.
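The suggested percentile set is straightforward to compute. The sketch below uses synthetic, lognormal-like flows purely to illustrate the behavior described above: the mean is pulled upward by the skewed tail, while the median and interquartile range resist it (the variable names are our own):

```python
import numpy as np

rng = np.random.default_rng(1)
# Synthetic daily streamflow (m^3/s) over ~30 years; the lognormal
# shape mimics the positive skew typical of flow records.
flows = rng.lognormal(mean=3.0, sigma=0.8, size=30 * 365)

# Each decile, together with the 5th, 25th, 75th, and 95th percentiles.
levels = sorted({5, 25, 75, 95, *range(10, 100, 10)})
percentiles = dict(zip(levels, np.percentile(flows, levels)))

median = percentiles[50]
iqr = percentiles[75] - percentiles[25]  # resistant measure of spread
mean = float(flows.mean())               # inflated by the skewed tail
```

With skewed data of this kind the mean exceeds the median, which is the behavior motivating the recommendation below.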
Recommended Best Practices
• Use the median rather than the mean to describe average streamflow.
Given the inherent skew in the distribution of streamflow values, the median provides a
more representative measure of the average condition.